|
EdEddnEddy posted:The absolute latest ones are safe, but apparently there is a bug when the Monitor/PC sleeps and windows move around. I don't let my PC sleep because there's a 1-in-5 chance it won't wake up due to the OC lol, but I've had no major issues with it otherwise. It was their semi-polished VR driver and it has been good so far. drat, and here I was losing my mind trying to gently caress with various power configurations to try to keep it from that silent sleep death. Good to know where the problem actually stems from.
|
# ? Apr 14, 2016 00:43 |
|
Paul MaudDib posted:But again, as I mentioned DX12 and Vulkan have the potential to change that situation. They're both rumored to be highly DX12/Vulkan compliant (so better efficiency over time as software takes more advantage of the hardware), be more flexible, more powerful, lots o' VRAM, and both are going to be using a 14/16nm process that will likely be around for a fair amount of time in leading GPUs (3-4 years or so). AMD's stuff tends to age more gracefully than nV's but I don't see any reason why they both couldn't stick around for a long time in a competitive sense. Only thing where that might be not true is if VR or super stupid high res (8K) displays take off. But for 4K or less I'd expect Polaris or Pascal to do very well for a long time.
|
# ? Apr 14, 2016 02:08 |
|
PC LOAD LETTER posted:Is it wrong that I'm expecting the new GPU's that are coming from nV and AMD are going to have exceptionally long viability for most people? Yeah, it's probably going to be a major jump with minor incremental releases for a while after. Think of it like the first-gen Keplers or GCN 1.0s: the GTX 680 lived on for a long-rear end time as the 770, and the 7950 as the R9 280, which are both pretty OK cards even now. I ran a 7850 for 2.5 years (switched to an R9 280 in January 2015) and it was still surprisingly acceptable for most stuff even at the end.
|
# ? Apr 14, 2016 02:33 |
|
PC LOAD LETTER posted:Is it wrong that I'm expecting the new GPU's that are coming from nV and AMD are going to have exceptionally long viability for most people? I wouldn't be surprised; both sets of cards should have a lot of the factors that made the 7970/7950 and 290(X) probably the longest-lived graphics cards to date, and that gave most of the rest of the cards around them quite good lifespans by usual standards. I don't think there's going to be a much better time for someone who wants to run a card into the ground anytime soon, especially since I doubt there's going to be any stupid mistakes like AMD putting a hair dryer on their card instead of a cooler.
|
# ? Apr 14, 2016 03:00 |
|
xthetenth posted:I doubt there's going to be any stupid mistakes like AMD putting a hair dryer on their card instead of a cooler.
|
# ? Apr 14, 2016 05:51 |
|
The nice thing is that 8k monitors are practically indistinguishable from 4k monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards become like headphones: stuff at a couple-hundred-bucks price point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles.
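The viewing-distance claim is easy to sanity-check with a little trigonometry. A rough sketch — the screen size and distance here are my own assumed numbers, not anything from the post:

```python
import math

# Pixels per degree of visual angle for a monitor; ~60 px/deg is the
# figure commonly cited for 20/20 acuity, so resolution beyond that
# is hard to tell apart from "more pixels".
def pixels_per_degree(h_res, screen_width_in, distance_in):
    # Horizontal field of view the screen subtends, in degrees
    fov = 2 * math.degrees(math.atan((screen_width_in / 2) / distance_in))
    return h_res / fov

# Assumption: a 27" 16:9 panel (~23.5" wide) viewed from 30"
for h_res in (3840, 7680):
    print(f"{h_res} px wide: {pixels_per_degree(h_res, 23.5, 30.0):.0f} px/deg")
```

Both modes land well above 60 px/deg at that distance, which is the argument in a nutshell: 4K is already past the acuity limit on a desktop, so 8K mostly buys nothing you can see.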
|
# ? Apr 14, 2016 06:58 |
|
Zero VGS posted:The nice thing is that 8k monitors are practically indistinguishable from 4k monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards becomes like headphones; stuff at a couple hundred bucks price-point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles. I really don't want the dumb NV30 vs R300 series fights over image quality to be the defining factor of GPU selection.
|
# ? Apr 14, 2016 07:38 |
|
Zero VGS posted:The nice thing is that 8k monitors are practically indistinguishable from 4k monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards becomes like headphones; stuff at a couple hundred bucks price-point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles. No, there's always room to increase the refresh rate, push out LODs, crank up draw distance, supersample, or use techniques that are currently just not feasible. Hell, if the hardware gets strong enough for it, then neural networks, especially with asynchronous compute, offer a ton of potential for real improvements that could drive graphics hardware well past the foreseeable future. Supplement game AI with neural networks to behave more organically: render the world from the viewpoint of a guard and have a neural network decide if the guard can see you in that shadow or not. For pure graphics I'm not sure what neural network enhancement would look like, but I'm confident there's a lot of room there too. We're going to hit fundamental physical limits long before we run out of ways to use that processing power.
|
# ? Apr 14, 2016 07:55 |
|
Until VR porn looks photorealistic, there will always be a market for more GPU power.
|
# ? Apr 14, 2016 08:20 |
|
Zero VGS posted:The nice thing is that 8k monitors are practically indistinguishable from 4k monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards becomes like headphones; stuff at a couple hundred bucks price-point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles. 4K pegged at 120Hz+ so some strobe-backlight feature can be enabled seems like it will be more easily distinguishable than high-end headphones. I think there is a ton of headroom that won't be filled right away now that higher resolutions and refresh rates are entering the market.
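For what it's worth, the raw bandwidth numbers back up the headroom argument. A quick sketch, assuming uncompressed 24-bit pixels and ignoring blanking overhead (real link requirements are a bit higher):

```python
# Uncompressed video bandwidth for a display mode, in Gbit/s. For scale,
# DisplayPort 1.2's total link rate is ~21.6 Gbit/s and DP 1.3's is
# ~32.4 Gbit/s (usable payload is lower still).
def gbit_per_s(w, h, hz, bits_per_pixel=24):
    return w * h * hz * bits_per_pixel / 1e9

for w, h, hz in [(2560, 1440, 144), (3840, 2160, 60), (3840, 2160, 120)]:
    print(f"{w}x{h} @ {hz}Hz: {gbit_per_s(w, h, hz):.1f} Gbit/s")
```

4K at 120Hz works out to roughly 24 Gbit/s before overhead, already past what DP 1.2 can carry, so "pegged at 120Hz+" is genuinely next-generation territory for the whole display chain, not just the GPU.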
|
# ? Apr 14, 2016 13:44 |
|
21:9 4K 120 Hz HDR OLED quadfire gooooo!
|
# ? Apr 14, 2016 14:48 |
|
I'm just hoping that we can get some Freesync panels with a variable refresh range wider than loving 15Hz in some cases.
|
# ? Apr 14, 2016 15:33 |
|
Shimrra Jamaane posted:Absolute latest meaning full release or beta release? 364.72 on their site currently. They don't show any beta ones out right now. Might be something leaked on Guru3D though.
|
# ? Apr 14, 2016 16:43 |
|
xthetenth posted:21:9 4K 120 Hz HDR OLED quadfire gooooo! Interposers are going to be an increasingly important component of GPU design going forward. Here's an interesting question: could AMD skip the expensive interposer and slap two P11 chips together without them being recognized as two separate GPUs? I'm not clear on whether GPU MCM is possible outside of an interposer.
|
# ? Apr 14, 2016 16:56 |
|
How's this look for a stable OC on a GTX970? I was following a guide I found somewhere and got some crashes (screen going blank, hard freeze, had to manually restart, almost poo poo my pants the first time it happened), and started lowering the GPU boost clock until I could play a couple hours without issues. Said guide had the memory clock at like 7800MHz, but I remember someone mentioning here that I should get the GPU clock stable before fiddling with other settings, so I'm aiming for that. I've been playing Dark Souls 3, Dirt Rally and The Division, and so far I haven't gotten that "blank screen" crash again; I did get a freeze in The Division earlier, but it was different: the screen just froze and the ambient music kept playing, but I couldn't ctrl-alt-del out of it. I thought it may be a game issue and not the card settings, but maybe I'm going overboard with some of the numbers?
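The trial-and-error loop being described is basically a linear search downward through clock offsets. A sketch of the procedure with hypothetical helper names — nothing here talks to a real GPU; the stress test stands in for a couple hours of gaming:

```python
# Walk the core-clock offset down until a stress test passes.
# run_stress_test(offset) should return True only if no crashes,
# blank screens, or artifacts were seen at that offset.
def find_stable_offset(start_mhz, step_mhz, run_stress_test):
    offset = start_mhz
    while offset > 0:
        if run_stress_test(offset):
            return offset
        offset -= step_mhz  # back off and try again
    return 0

# Toy example: pretend anything at or below +120 MHz is stable
print(find_stable_offset(180, 15, lambda mhz: mhz <= 120))  # prints 120
```

Once you have both a known-passing and a known-failing offset you could bisect between them instead of stepping, but with multi-hour stability tests most people just walk down in small increments like this.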
|
# ? Apr 14, 2016 16:56 |
|
FaustianQ posted:Interposers are going to be an interestingly important component of GPU design herein. Here is an interesting question, can AMD not use an expensive interposers and slap two P11 chips together without them being recognized as two separate GPUs? I'm not clear on whether GPU MCM is possible outside of an interposer. My suspicion has been that interposers, at least in the HBM context, aren't expensive because of the silicon, since those can be run on any old process. IIRC, they presently are being run on 40nm, and that's dirt cheap. They're expensive because of the HBM-spec micro bumps. I say this with the knowledge of a press release post-Fury launch from a company manufacturing specifically just the microbumps announcing that they were pleased to be continuing their working relationship with AMD, which heavily implies if not outright indicates that the microbumps are not silicon. Even if it's more a process that they're licensing rather than actual parts that get press-fit into the interposer, if they aren't using HBM bumps on a given product (presumably GDDR5X in the R7 460 - R9 480 range), I don't see why they couldn't just wire the active silicon down to the interposer conventionally. SwissArmyDruid fucked around with this message at 17:17 on Apr 14, 2016 |
# ? Apr 14, 2016 17:09 |
|
SwissArmyDruid posted:My suspicion has been that interposers, at least in the HBM context, aren't expensive because the silicon, since those can be run on any old process. IIRC, they presently are being run on 40nm, and that's dirt cheap. They're expensive because of the HBM-spec micro bumps. I say this, with the knowledge of a press release post-Fury launch from a company manufacturing specifically just the microbumps announcing that they were pleased to be continuing their working relationship with AMD, which heavily implies if not outright indicates that the microbumps are not silicon. I'm pretty sure that the sheer size of a big interposer can be painful to work around as well. SwissArmyDruid posted:I'm just hoping that we can get some Freesync panels with a variable refresh rate larger than loving 15Hz in some cases. There's a decent selection, and honestly I'm more interested in freesync becoming a pervasive thing, and that means minimal cost, so I'm kind of torn on whether to get mad at narrow ranges.
|
# ? Apr 14, 2016 17:16 |
|
Edmond Dantes posted:How's this look for a stable OC on a GTX970? Just keep lowering it until it's stable. It doesn't sound like it will be much lower than this. If you're in the 1400s you're doing just fine. Yes, there are some that get into the 1500s, but don't sweat it. The Division is an excellent stress test for overclocking, especially if you crank it all up. You have a lot more to go on memory, but you know that. Temps are really excellent. edit: Didn't notice your voltage was stock, not used to GPU Tweak. You can start bumping up voltage for stability if you like instead of lowering clocks.
|
# ? Apr 14, 2016 17:18 |
|
xthetenth posted:There's a decent selection, and honestly I'm more interested in freesync becoming a pervasive thing, and that means minimal cost, so I'm kind of torn on whether to get mad at narrow ranges. You'll get your pervasive with Ice Lake. Once Intel does it, there won't be anyone who doesn't support it.
|
# ? Apr 14, 2016 17:22 |
|
Is it going to keep being tied to GPU brand though? I'm not getting a new monitor for a long time (I'm content at 768p like some kind of monster) but it'd be nice if when I did get one I didn't have to bet on the right GPU horse to get to keep using variable refresh 5 or 10 years later.
|
# ? Apr 14, 2016 17:34 |
|
Desuwa posted:No, there's always room to increase the refresh rate, push out LODs, crank up draw distance, supersample, I would love to see LOD/draw distance start getting pushed more. Granted, RAM is going to need to start bumping up a lot, but even recent games still pop in a lot and reducing it would be nice. Supersampling would be good too. I tried Dark Souls 3 at 2x DSR and it was really gorgeous, but my 970 started to pant a little. I'm still impressed it was keeping up 30ish fps though (max detail too).
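The DSR numbers explain the panting: Nvidia's DSR factor multiplies the total pixel count rendered, not each axis, so 4.00x at 1080p is full 4K internally. Quick arithmetic as a sketch:

```python
# Pixels shaded per frame at 1080p under a given DSR factor; fill-rate
# and shading cost scale roughly linearly with this count.
def dsr_pixels(w, h, factor):
    return int(w * h * factor)

for factor in (1.0, 2.0, 4.0):
    print(f"DSR {factor:.2f}x: {dsr_pixels(1920, 1080, factor):,} px/frame")
# DSR 4.00x at 1080p is the same pixel count as native 3840x2160
```

So even "only" 2x DSR doubles the per-frame shading work, which is plenty to push a 970 down into the 30fps range in a heavy game at max detail.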
|
# ? Apr 14, 2016 17:39 |
|
HMS Boromir posted:Is it going to keep being tied to GPU brand though? I'm not getting a new monitor for a long time (I'm content at 768p like some kind of monster) but it'd be nice if when I did get one I didn't have to bet on the right GPU horse to get to keep using variable refresh 5 or 10 years later. The official VESA spec is Adaptive Sync. This is variable refresh. Freesync is AMD's name for interfacing with the Adaptive Sync spec. As they are the only brand name there that's not G-Sync, which is its own, entirely different thing, Freesync has come to mean "variable refresh that doesn't require an Nvidia-branded scaler." Intel is going to adopt Adaptive Sync into their iGPU products. The earliest that this can occur is Ice Lake, as that is a tock, and not a tick. That's some time in 2017/2018.
|
# ? Apr 14, 2016 18:02 |
|
Right, but I'm never going to be on an iGPU for longer than it takes to replace the presumably bricked dGPU that's making me use it, so I guess this doesn't mean anything as far as G-Sync vs Freesync (now Everyone Else) tribalism. I guess it'll make Freesync effectively the default, though, so the issue is more nVidia wanting to be in its own special little corner to the detriment of the consumer. HMS Boromir fucked around with this message at 18:23 on Apr 14, 2016 |
# ? Apr 14, 2016 18:07 |
|
HMS Boromir posted:Right, but I'm never going to be on an iGPU for longer than it takes to replace the presumably bricked dGPU that's making me use it, so I guess this doesn't mean anything as far as G-Sync vs Freesync (now Everyone Else) tribalism. If you time your card purchases well, you can buy only one company's cards and be really close to optimal.
|
# ? Apr 14, 2016 18:16 |
|
That's fair, I'm probably patient enough with my upgrades that I could make it work as long as I paid enough attention. It's just... sort of unpleasant to me, and I was hoping Intel iGPUs supporting variable refresh might mean something on that front.
|
# ? Apr 14, 2016 18:22 |
|
HMS Boromir posted:Right, but I'm never going to be on an iGPU for longer than it takes to replace the presumably bricked dGPU that's making me use it, so I guess this doesn't mean anything as far as G-Sync vs Freesync (now Everyone Else) tribalism. Correct. There is nothing, aside from Nvidia being obstinate, stopping them from adopting Adaptive Sync and calling it, oh, I don't know, "G-Sync Lite." I suspect that they will continue to wall themselves off from it so long as it continues to be an optional part of the spec.
|
# ? Apr 14, 2016 18:28 |
|
My MSI Gaming 970 seems to boost at 1455 right out of the box, is that about as good as it gets or would I be able to OC it?
|
# ? Apr 14, 2016 18:42 |
|
There is usually a good bit of headroom over the stock boost.
|
# ? Apr 14, 2016 18:48 |
|
THE DOG HOUSE posted:Just keep lowering it until its stable. It doesn't sound like it will be much lower than this. If you're in the 1400's you're doing just fine. Yes there are some that get into the 1500's but dont sweat it. The division is an excellent stress test for overclocking especially if you crank it all up. The temps really surprised me, especially considering I haven't touched the fan curve yet; it's running on auto, which kicks the fans in at ~60°C. My peak temp so far has been 71°C with 40% fan usage. beejay posted:My MSI Gaming 970 seems to boost at 1455 right out of the box, is that about as good as it gets or would I be able to OC it?
|
# ? Apr 14, 2016 19:20 |
|
Edmond Dantes posted:The voltage is actually a wee bit over stock, stock is 1.175, and I have it a 1.200 right now. I may bump it to 1.210 and see how it goes, but I think I'm pretty much in stable territory. Now let's take a look at that memory clock... Maxwell kind of works on the more-is-less principle: adding more voltage can actually cause instability at higher clocks. My 980 Ti can do ~1512 at stock voltage; I never really did any memory overclocking. I find memory overclocks provide less performance than core clocks.
|
# ? Apr 14, 2016 19:48 |
|
Edmond Dantes posted:How's this look for a stable OC on a GTX970? Unlock your temp target from power target and drop the temp target down as low as possible. That'll make it try to keep the card cooler, helping with keeping your OC stable.
|
# ? Apr 14, 2016 19:55 |
|
SlayVus posted:Maxwell kind of works on the more is less principle. Adding more voltage can actually cause instability at higher clocks. My 980 Ti can do ~1512 at stock voltage, never really did any memory overclocking. I find memory provides less performance than core clocks. Rukus posted:Unlock your temp target from power target and drop the temp target down as low as possible. That'll make it try to keep the card cooler, helping with keeping your OC stable. Every time I think I'm figuring something out, some new info comes along. (thanks!)
|
# ? Apr 14, 2016 20:37 |
|
Edmond Dantes posted:Every time I think I'm figuring something out, some new info comes along. (thanks!) Raising temp target will raise the throttle point
|
# ? Apr 14, 2016 21:33 |
|
Anime Schoolgirl posted:I just want a new single slot half-height card, there really hasn't been anything since the 7750 If they ever do that, here is the perfect case: http://www.aliexpress.com/item/HTPC...4e-afa26d08fa37
|
# ? Apr 14, 2016 21:49 |
|
Don Lapre posted:Raising temp target will raise the throttle point Yeah, makes sense. Thanks.
|
# ? Apr 14, 2016 21:55 |
|
In other news, Nvidia dropped a Beta Hotfix Driver for the DOOM Beta this weekend
|
# ? Apr 14, 2016 23:19 |
|
I'll be interested to see if my computer matches the dismal results from the alpha. Also I read that the 60 fps cap is not actually true; it's 60, 120, and 144 Hz at least. And there are tons of graphics settings evidently, which was good to read. All that being said, the game itself doesn't look very good lol
|
# ? Apr 14, 2016 23:25 |
|
Some not terrible deals here, they had a 970 for $230 not too long ago. http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%204809&IsNodeId=1&bop=And&Order=PRICED&PageSize=30
|
# ? Apr 15, 2016 00:02 |
|
SwissArmyDruid posted:My suspicion has been that interposers, at least in the HBM context, aren't expensive because the silicon, since those can be run on any old process....that's dirt cheap. This article is a bit old now but it mentions that interposers, by themselves, would cost around 5x memory in 2015. HBM itself is supposed to be around 2x the cost of LPDDR3, which isn't the cheapest stuff around but isn't insanely expensive either. I have no idea what the exact costs would be for full package assembly and testing, but I'm sure it's anything but cheap. SwissArmyDruid posted:which heavily implies if not outright indicates that the microbumps are not silicon. SwissArmyDruid posted:I don't see why they couldn't just wire the active silicon down to the interposers conventionally.
|
# ? Apr 15, 2016 02:26 |
|
Latest rumors have more solid grounding on P11 and P10: apparently P11 is the 470 and sub-50W (which should mean low-profile single slot), P10 is the 480 (also running at a more familiar base clock of 1050) and TDP appears to be 110-135W. For what we know about performance for both, I'm impressed with the 470 and am not too sure about the 480. Prices may be good, but unless the 480 convincingly beats the Fury it's going to draw comparisons to the 970/980. Still, if AMD beats expectations then they have a sub-50W card which duels with a 960, and a sub-150W card which duels the 980 Ti, with silicon at 123mm² and 232mm². I'm thinking AMD poured everything they could into the P11 as their minimum VR card now. This kind of leaves open what the 490 and 460 are, though I still want to lean on the 490 and 490X being the dumping ground for defective Fury silicon to get back some money from HBM2 expenditure.
|
# ? Apr 15, 2016 02:34 |