DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

EdEddnEddy posted:

The absolute latest ones are safe, but apparently there is a bug where windows get moved around when the monitor/PC sleeps. I don't let my PC sleep because there's a 1-in-5 chance it won't wake up due to the OC, lol, but I've had no major issues with it otherwise. It was their semi-polished VR driver and it has been good so far.

Damn, and here I was losing my mind trying to fuck with various power configurations to try to keep it from that silent sleep death. Good to know where the problem actually stems from.

PC LOAD LETTER
May 23, 2005
WTF?!

Paul MaudDib posted:

But again, as I mentioned DX12 and Vulkan have the potential to change that situation.

Is it wrong that I'm expecting the new GPUs coming from nV and AMD to have exceptionally long viability for most people?

They're both rumored to be highly DX12/Vulkan compliant (so better efficiency over time as software takes more advantage of the hardware), more flexible, more powerful, with lots o' VRAM, and both are going to be on a 14/16nm process that will likely be around in leading GPUs for a fair amount of time (3-4 years or so). AMD's stuff tends to age more gracefully than nV's, but I don't see any reason why they both couldn't stick around for a long time in a competitive sense. The only place that might not hold is if VR or super stupid high-res (8K) displays take off. But for 4K or less I'd expect Polaris or Pascal to do very well for a long time.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

PC LOAD LETTER posted:

Is it wrong that I'm expecting the new GPUs coming from nV and AMD to have exceptionally long viability for most people?

Yeah, it's probably going to be a major jump with minor incremental releases for a while after. Think of it like the first-gen Keplers or GCN 1.0 cards: the GTX 680 lived on for a long-ass time as the 770, and the 7950 as the R9 280, and both are still pretty OK cards even now. I ran a 7850 for 2.5 years (switched to an R9 280 in January 2015) and it was still surprisingly acceptable for most stuff even at the end.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

PC LOAD LETTER posted:

Is it wrong that I'm expecting the new GPUs coming from nV and AMD to have exceptionally long viability for most people?

They're both rumored to be highly DX12/Vulkan compliant (so better efficiency over time as software takes more advantage of the hardware), more flexible, more powerful, with lots o' VRAM, and both are going to be on a 14/16nm process that will likely be around in leading GPUs for a fair amount of time (3-4 years or so). AMD's stuff tends to age more gracefully than nV's, but I don't see any reason why they both couldn't stick around for a long time in a competitive sense. The only place that might not hold is if VR or super stupid high-res (8K) displays take off. But for 4K or less I'd expect Polaris or Pascal to do very well for a long time.

I wouldn't be surprised. Both sets of cards should have a lot of the factors that made the 7970/7950 and 290(X) probably the longest-lived graphics cards to date, and that gave most of the cards around them quite good lifespans by usual standards. I don't think there's going to be a much better time for someone who wants to run a card into the ground anytime soon, especially since I doubt there's going to be any stupid mistakes like AMD putting a hair dryer on their card instead of a cooler.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

xthetenth posted:

I doubt there's going to be any stupid mistakes like AMD putting a hair dryer on their card instead of a cooler.
Oh ye of little faith...

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
The nice thing is that 8K monitors are practically indistinguishable from 4K monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards become like headphones: stuff at a couple-hundred-dollar price point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles.
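
As a back-of-the-envelope on where a "two-thirds" figure could come from (the panel size, fovea share, and peripheral shading density below are illustrative assumptions, not numbers from any shipping headset):

code:
# Rough foveated-rendering savings estimate.
# Assumptions (mine): a Rift-class 2160x1200 panel, a 1.4x per-axis
# render-target scale, a fovea covering ~15% of the image at full
# shading density, and the periphery shaded at 1/4 density.
panel_w, panel_h = 2160, 1200
supersample = 1.4
full_pixels = panel_w * panel_h * supersample ** 2

fovea_area_frac = 0.15
periphery_density = 0.25
foveated_pixels = full_pixels * (fovea_area_frac
                                 + (1 - fovea_area_frac) * periphery_density)

print(f"full:     {full_pixels / 1e6:.1f} Mpix/frame")
print(f"foveated: {foveated_pixels / 1e6:.1f} Mpix/frame")
print(f"savings:  {1 - foveated_pixels / full_pixels:.0%}")  # ~64%, roughly two-thirds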

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Zero VGS posted:

The nice thing is that 8K monitors are practically indistinguishable from 4K monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards become like headphones: stuff at a couple-hundred-dollar price point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles.

I really don't want the dumb NV30 vs R300 series fights over image quality to be the defining factor of GPU selection.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

Zero VGS posted:

The nice thing is that 8K monitors are practically indistinguishable from 4K monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards become like headphones: stuff at a couple-hundred-dollar price point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles.

No, there's always room to increase the refresh rate, push out LODs, crank up draw distance, supersample, or use techniques that are currently just not feasible.

Hell, if the hardware gets strong enough, then neural networks, especially with asynchronous compute, offer a ton of potential for real improvements that could drive graphics hardware well past the foreseeable future. Supplement game AI with neural networks to behave more organically: render the world from the viewpoint of a guard, say, and have a neural network decide whether the guard can see you in that shadow or not. For pure graphics I'm not sure what neural-network enhancement would look like, but I'm confident there's a lot of room there too.

We're going to hit fundamental physical limits long before we run out of ways to use that processing power.
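
To put toy numbers on that headroom, assuming shading work scales roughly with pixels times frames per second (a fill-bound simplification, not a benchmark):

code:
# How per-second shading work scales with resolution and refresh rate,
# relative to 1080p60, under a cost ~ pixels * fps assumption.
targets = {
    "4K60":           (3840, 2160, 60),
    "4K144":          (3840, 2160, 144),
    "8K60":           (7680, 4320, 60),
    "4K144 + 4xSSAA": (3840 * 2, 2160 * 2, 144),  # supersampling doubles each axis
}

base = 1920 * 1080 * 60
for name, (w, h, fps) in targets.items():
    print(f"{name:>15}: {w * h * fps / base:5.1f}x the work of 1080p60")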

Josh Lyman
May 24, 2009


Until VR porn looks photorealistic, there will always be a market for more GPU power.

fozzy fosbourne
Apr 21, 2010

Zero VGS posted:

The nice thing is that 8K monitors are practically indistinguishable from 4K monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards become like headphones: stuff at a couple-hundred-dollar price point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles.

4K pegged at 120Hz+ so some strobe-backlight feature can be enabled seems like it will be more easily distinguishable than high-end headphones. I think there is a ton of headroom that won't be filled right away, now that higher resolutions and refresh rates are entering the market.
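
For a sense of why a pegged 4K120 was a big ask for the display links of the day, here is a quick check against the standard DisplayPort payload rates (blanking overhead ignored, so the real requirement is somewhat higher):

code:
# Uncompressed 4K120 pixel data vs. DisplayPort payload bandwidth.
w, h, fps, bpp = 3840, 2160, 120, 24  # 4K, 120 Hz, 8-bit RGB

pixel_gbps = w * h * fps * bpp / 1e9  # ~23.9 Gbit/s

dp12 = 4 * 5.4 * 8 / 10  # DP 1.2: 4 lanes of HBR2, 8b/10b coding -> 17.28 Gbit/s
dp13 = 4 * 8.1 * 8 / 10  # DP 1.3: 4 lanes of HBR3, 8b/10b coding -> 25.92 Gbit/s

print(f"4K120 pixel data: {pixel_gbps:.1f} Gbit/s")
print(f"DP 1.2 payload:   {dp12:.2f} Gbit/s -> {'fits' if pixel_gbps < dp12 else 'does not fit'}")
print(f"DP 1.3 payload:   {dp13:.2f} Gbit/s -> {'fits' if pixel_gbps < dp13 else 'does not fit'}")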

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

21:9 4K 120 Hz HDR OLED quadfire gooooo!

SwissArmyDruid
Feb 14, 2014

by sebmojo
I'm just hoping that we can get some Freesync panels with a variable refresh range wider than fucking 15Hz in some cases.
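
For anyone wondering why a narrow window hurts: AMD's low framerate compensation reportedly needs the panel's max refresh to be roughly double its min, so out-of-range frames can be repeated back into the window. A quick sketch using the commonly cited 2:1 ratio (exact behavior is driver-dependent):

code:
# Which variable-refresh windows allow low framerate compensation (LFC)?
def lfc_capable(lo_hz, hi_hz, ratio=2.0):
    """LFC is commonly cited as needing max >= ~2x min refresh."""
    return hi_hz / lo_hz >= ratio

for lo, hi in [(48, 60), (40, 75), (30, 90), (48, 144)]:
    verdict = "LFC possible" if lfc_capable(lo, hi) else "no LFC below range"
    print(f"{lo:>3}-{hi:<3} Hz (span {hi - lo:>3} Hz): {verdict}")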

EdEddnEddy
Apr 5, 2012



Shimrra Jamaane posted:

Absolute latest meaning full release or beta release?

364.72 is on their site currently. They don't even show any beta ones out right now. Might be something leaked on Guru3D, though.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

xthetenth posted:

21:9 4K 120 Hz HDR OLED quadfire gooooo!

Interposers are going to be an increasingly important component of GPU design from here on. Here's an interesting question: can AMD skip the expensive interposer and slap two P11 chips together without them being recognized as two separate GPUs? I'm not clear on whether GPU MCM is possible outside of an interposer.

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL
How's this look for a stable OC on a GTX970?



I was following a guide I found somewhere and got some crashes (screen going blank, hard freeze, had to manually restart, almost shit my pants the first time it happened), and started lowering the GPU boost clock until I could play a couple hours without issues. Said guide had the memory clock at like 7800MHz, but I remember someone here mentioning I should get the GPU clock stable before I start fiddling with other settings, so I'm aiming for that.

I've been playing Dark Souls 3, Dirt Rally, and The Division, and so far I haven't gotten that "blank screen" crash again. I did get a freeze in The Division earlier, but it was different: the screen just froze while the ambient music kept playing, and I couldn't ctrl-alt-del out of it. It may be a game issue and not the card settings, but maybe I'm going overboard with some of the numbers?

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

Interposers are going to be an increasingly important component of GPU design from here on. Here's an interesting question: can AMD skip the expensive interposer and slap two P11 chips together without them being recognized as two separate GPUs? I'm not clear on whether GPU MCM is possible outside of an interposer.

My suspicion has been that interposers, at least in the HBM context, aren't expensive because of the silicon, since those can be run on any old process. IIRC, they're presently being run on 40nm (edit: 65nm!), and that's dirt cheap. They're expensive because of the HBM-spec microbumps. I say this based on a post-Fury-launch press release from a company that manufactures just the microbumps, announcing that they were pleased to be continuing their working relationship with AMD, which heavily implies, if not outright indicates, that the microbumps are not silicon.

Even if it's more a process they're licensing rather than actual parts that get press-fit into the interposer, if they aren't using HBM bumps on a given product (presumably GDDR5X in the R7 460 - R9 480 range), I don't see why they couldn't just wire the active silicon down to the interposer conventionally.

SwissArmyDruid fucked around with this message at 17:17 on Apr 14, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

SwissArmyDruid posted:

My suspicion has been that interposers, at least in the HBM context, aren't expensive because of the silicon, since those can be run on any old process. IIRC, they're presently being run on 40nm, and that's dirt cheap. They're expensive because of the HBM-spec microbumps. I say this based on a post-Fury-launch press release from a company that manufactures just the microbumps, announcing that they were pleased to be continuing their working relationship with AMD, which heavily implies, if not outright indicates, that the microbumps are not silicon.

If they aren't using HBM bumps on a given product (presumably GDDR5X in the R7 460 - R9 480 range), I don't see why they couldn't just wire the active silicon down to the interposer conventionally.

I'm pretty sure that the sheer size of a big interposer can be painful to work around as well.


SwissArmyDruid posted:

I'm just hoping that we can get some Freesync panels with a variable refresh range wider than fucking 15Hz in some cases.

There's a decent selection, and honestly I'm more interested in Freesync becoming a pervasive thing, which means minimal cost, so I'm kind of torn on whether to get mad at narrow ranges.

penus penus penus
Nov 9, 2014

by piss__donald

Edmond Dantes posted:

How's this look for a stable OC on a GTX970?



I was following a guide I found somewhere and got some crashes (screen going blank, hard freeze, had to manually restart, almost shit my pants the first time it happened), and started lowering the GPU boost clock until I could play a couple hours without issues. Said guide had the memory clock at like 7800MHz, but I remember someone here mentioning I should get the GPU clock stable before I start fiddling with other settings, so I'm aiming for that.

I've been playing Dark Souls 3, Dirt Rally, and The Division, and so far I haven't gotten that "blank screen" crash again. I did get a freeze in The Division earlier, but it was different: the screen just froze while the ambient music kept playing, and I couldn't ctrl-alt-del out of it. It may be a game issue and not the card settings, but maybe I'm going overboard with some of the numbers?

Just keep lowering it until it's stable. It doesn't sound like it will be much lower than this. If you're in the 1400s you're doing just fine; yes, there are some that get into the 1500s, but don't sweat it. The Division is an excellent stress test for overclocking, especially if you crank it all up.

You have a lot more to go on memory, but you know that.

Temps are really excellent

edit: Didn't notice your voltage was stock; I'm not used to GPU Tweak. You can start bumping up voltage for stability if you like, instead of lowering clocks.
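
That "lower it until it's stable" routine is basically a binary search over the clock offset. A sketch of the logic, where set_core_offset and passes_stress_test are hypothetical stand-ins (in practice you're clicking around in GPU Tweak or Afterburner and playing The Division for a couple of hours, not calling an API):

code:
# Binary search for the highest core-clock offset that survives a
# stress test. Purely illustrative; the two callables are hypothetical.
def find_stable_offset(lo_mhz, hi_mhz, set_core_offset, passes_stress_test,
                       step=13):  # GPU Boost moves in ~13 MHz bins
    """Return the highest offset in [lo_mhz, hi_mhz] that tested stable."""
    best = lo_mhz  # assume the low end is a known-stable baseline
    while hi_mhz - lo_mhz > step:
        mid = (lo_mhz + hi_mhz) // 2
        set_core_offset(mid)
        if passes_stress_test():      # e.g. a couple hours of The Division
            best, lo_mhz = mid, mid   # stable: push higher
        else:
            hi_mhz = mid              # crashed: back off
    return best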

SwissArmyDruid
Feb 14, 2014

by sebmojo

xthetenth posted:

There's a decent selection, and honestly I'm more interested in Freesync becoming a pervasive thing, which means minimal cost, so I'm kind of torn on whether to get mad at narrow ranges.

You'll get your pervasive adoption with Ice Lake. Once Intel does it, there won't be anyone who doesn't support it.

HMS Boromir
Jul 16, 2011

by Lowtax
Is it going to keep being tied to GPU brand, though? I'm not getting a new monitor for a long time (I'm content at 768p like some kind of monster), but it'd be nice if, when I did get one, I didn't have to bet on the right GPU horse to keep using variable refresh 5 or 10 years later.

Panty Saluter
Jan 17, 2004

Making learning fun!

Desuwa posted:

No, there's always room to increase the refresh rate, push out LODs, crank up draw distance, supersample,

I would love to see LOD/draw distance start getting pushed more. Granted, RAM is going to need to bump up a lot, but even recent games still have a lot of pop-in, and reducing it would be nice.

Supersampling would be good too. I tried Dark Souls 3 at 2x DSR and it was really gorgeous, but my 970 started to pant a little. I'm still impressed it was keeping up 30-ish fps though (at max detail, too).

SwissArmyDruid
Feb 14, 2014

by sebmojo

HMS Boromir posted:

Is it going to keep being tied to GPU brand, though? I'm not getting a new monitor for a long time (I'm content at 768p like some kind of monster), but it'd be nice if, when I did get one, I didn't have to bet on the right GPU horse to keep using variable refresh 5 or 10 years later.

The official VESA spec is Adaptive-Sync. This is variable refresh.

Freesync is AMD's name for interfacing with the Adaptive-Sync spec. As it's the only brand name there that's not G-Sync, which is its own, entirely different thing, Freesync has come to mean "variable refresh that doesn't require an Nvidia-branded scaler."

Intel is going to adopt Adaptive-Sync in their iGPU products. The earliest that can occur is Ice Lake, as that is a tock, not a tick. That's some time in 2017/2018.

HMS Boromir
Jul 16, 2011

by Lowtax
Right, but I'm never going to be on an iGPU for longer than it takes to replace the presumably bricked dGPU that's making me use it, so I guess this doesn't mean anything as far as G-Sync vs Freesync (now Everyone Else) tribalism goes.

I guess it'll make Freesync effectively the default, though, so the issue is more nVidia wanting to be in its own special little corner to the detriment of the consumer.

HMS Boromir fucked around with this message at 18:23 on Apr 14, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

HMS Boromir posted:

Right, but I'm never going to be on an iGPU for longer than it takes to replace the presumably bricked dGPU that's making me use it, so I guess this doesn't mean anything as far as G-Sync vs Freesync (now Everyone Else) tribalism goes.

If you time your card purchases well, you can buy only one company's cards and be really close to optimal.

HMS Boromir
Jul 16, 2011

by Lowtax
That's fair, I'm probably patient enough with my upgrades that I could make it work as long as I paid enough attention. It's just... sort of unpleasant to me, and I was hoping Intel iGPUs supporting variable refresh might mean something on that front.

SwissArmyDruid
Feb 14, 2014

by sebmojo

HMS Boromir posted:

Right, but I'm never going to be on an iGPU for longer than it takes to replace the presumably bricked dGPU that's making me use it, so I guess this doesn't mean anything as far as G-Sync vs Freesync (now Everyone Else) tribalism goes.

I guess it'll make Freesync effectively the default, though, so the issue is more nVidia wanting to be in its own special little corner to the detriment of the consumer.

Correct. There is nothing, aside from Nvidia being obstinate, stopping them from adopting Adaptive-Sync and calling it, oh, I don't know, "G-Sync Lite." I suspect they will continue to wall themselves off from it so long as it remains an optional part of the spec.

beejay
Apr 7, 2002

My MSI Gaming 970 seems to boost to 1455 right out of the box. Is that about as good as it gets, or would I be able to OC it?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
There is usually a good bit of headroom over the stock boost.

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL

THE DOG HOUSE posted:

Just keep lowering it until it's stable. It doesn't sound like it will be much lower than this. If you're in the 1400s you're doing just fine; yes, there are some that get into the 1500s, but don't sweat it. The Division is an excellent stress test for overclocking, especially if you crank it all up.

You have a lot more to go on memory, but you know that.

Temps are really excellent

edit: Didn't notice your voltage was stock; I'm not used to GPU Tweak. You can start bumping up voltage for stability if you like, instead of lowering clocks.
The voltage is actually a wee bit over stock: stock is 1.175, and I have it at 1.200 right now. I may bump it to 1.210 and see how it goes, but I think I'm pretty much in stable territory. Now let's take a look at that memory clock...

The temps really surprised me, especially considering I haven't touched the fan curve yet; it's running on auto, which kicks the fans in at ~60°C. My peak temp so far has been 71°C at 40% fan.

beejay posted:

My MSI Gaming 970 seems to boost to 1455 right out of the box. Is that about as good as it gets, or would I be able to OC it?
Mine is stable at 1475 right now (in-game frequency, not the boost clock number). The last time it crashed was at 1486, but I had it at stock voltage at the time, so it may actually be stable at 1486 now that I've bumped that up, but I think I'm happy with it.

SlayVus
Jul 10, 2009
Grimey Drawer

Edmond Dantes posted:

The voltage is actually a wee bit over stock: stock is 1.175, and I have it at 1.200 right now. I may bump it to 1.210 and see how it goes, but I think I'm pretty much in stable territory. Now let's take a look at that memory clock...

Maxwell kind of works on a more-is-less principle: adding more voltage can actually cause instability at higher clocks. My 980 Ti can do ~1512 at stock voltage. I never really did any memory overclocking; I find memory provides less performance than core clocks.

Rukus
Mar 13, 2007

Hmph.

Edmond Dantes posted:

How's this look for a stable OC on a GTX970?



I was following a guide I found somewhere and got some crashes (screen going blank, hard freeze, had to manually restart, almost shit my pants the first time it happened), and started lowering the GPU boost clock until I could play a couple hours without issues. Said guide had the memory clock at like 7800MHz, but I remember someone here mentioning I should get the GPU clock stable before I start fiddling with other settings, so I'm aiming for that.

I've been playing Dark Souls 3, Dirt Rally, and The Division, and so far I haven't gotten that "blank screen" crash again. I did get a freeze in The Division earlier, but it was different: the screen just froze while the ambient music kept playing, and I couldn't ctrl-alt-del out of it. It may be a game issue and not the card settings, but maybe I'm going overboard with some of the numbers?

Unlock your temp target from the power target and drop the temp target down as low as possible. That'll make it try to keep the card cooler, helping keep your OC stable.

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL

SlayVus posted:

Maxwell kind of works on a more-is-less principle: adding more voltage can actually cause instability at higher clocks. My 980 Ti can do ~1512 at stock voltage. I never really did any memory overclocking; I find memory provides less performance than core clocks.
Every time I think I'm figuring something out, some new info comes along. :argh: (thanks!)

Rukus posted:

Unlock your temp target from the power target and drop the temp target down as low as possible. That'll make it try to keep the card cooler, helping keep your OC stable.
Out of curiosity, what does that do? Does it start throttling the clocks to keep the temps/consumption down, or is it just fan control?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Edmond Dantes posted:

Every time I think I'm figuring something out, some new info comes along. :argh: (thanks!)

Out of curiosity, what does that do? Does it start throttling the clocks to keep the temps/consumption down, or is it just fan control?

Raising the temp target will raise the throttle point.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

Anime Schoolgirl posted:

I just want a new single slot half-height card, there really hasn't been anything since the 7750

If they ever do that, here is the perfect case: http://www.aliexpress.com/item/HTPC...4e-afa26d08fa37

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL

Don Lapre posted:

Raising the temp target will raise the throttle point.

Yeah, makes sense. Thanks.

EdEddnEddy
Apr 5, 2012



In other news, Nvidia dropped a beta hotfix driver for the DOOM beta this weekend.

penus penus penus
Nov 9, 2014

by piss__donald
I'll be interested to see if my computer matches the dismal results from the alpha.

Also, I read that the 60 fps cap is not actually true; it's 60, 120, and 144 Hz at least. And there are evidently tons of graphics settings, which was good to read.

All that being said, the game itself doesn't look very good lol

wargames
Mar 16, 2008

official yospos cat censor
Some not-terrible deals here; they had a 970 for $230 not too long ago.

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%204809&IsNodeId=1&bop=And&Order=PRICED&PageSize=30

PC LOAD LETTER
May 23, 2005
WTF?!

SwissArmyDruid posted:

My suspicion has been that interposers, at least in the HBM context, aren't expensive because of the silicon, since those can be run on any old process... that's dirt cheap.
They're expensive because they're gigantic: still essentially a reticle-maxing chip, 600-700mm² in size or so. That, and it's still really difficult to mount all the necessary components to one and have everything work right. Failure rates are probably still high enough that it's barely practical to produce anything, even at a high price, in the consumer market.

This article is a bit old now, but it mentions that interposers, by themselves, would cost around 5x memory in 2015. HBM itself is supposed to be around 2x the cost of LPDDR3, which isn't the cheapest stuff around but isn't insanely expensive either. I have no idea what the exact costs would be for full package assembly and testing, but I'm sure it's anything but cheap.
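
As a back-of-the-envelope on why a reticle-sized die is pricey per unit even on an old, depreciated node (the wafer cost and defect density here are my illustrative guesses, not figures from the article):

code:
# Few candidates per wafer + area-driven yield = expensive big dies.
from math import pi, exp, sqrt

die_area = 650.0        # mm^2, roughly the reticle-class size mentioned above
wafer_d = 300.0         # mm
defect_density = 0.001  # defects per mm^2 (assumed for a mature node)
wafer_cost = 500.0      # USD per wafer (assumed)

# Standard dies-per-wafer approximation, then a Poisson yield model.
dies = pi * (wafer_d / 2) ** 2 / die_area - pi * wafer_d / sqrt(2 * die_area)
yield_frac = exp(-die_area * defect_density)
good = dies * yield_frac

print(f"candidates/wafer: {dies:.0f}, yield: {yield_frac:.0%}, good dies: {good:.0f}")
print(f"silicon cost per good die: ${wafer_cost / good:.0f} "
      "(before microbumps, assembly, and test)")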

SwissArmyDruid posted:

which heavily implies if not outright indicates that the microbumps are not silicon.
It's been known for a while that they're not. They couldn't be and still make an electrical connection between the dies. I don't think they're standard solder either, though, and getting them that small with consistent quality is tough.

SwissArmyDruid posted:

I don't see why they couldn't just wire the active silicon down to the interposers conventionally.
Do you mean wire bonding? I don't believe that is practical for HBM or they would've done it already.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
The latest rumors have more solid grounding on P11 and P10: apparently P11 is the 470 and sub-50W (which should mean low-profile single slot :neckbeard:), while P10 is the 480 (running at a more familiar base clock of 1050) with a TDP that appears to be 110-135W. For what we know about performance on both, I'm impressed with the 470 and not too sure about the 480. Prices may be good, but unless the 480 convincingly beats the Fury it's going to draw comparisons to the 970/980. Still, if AMD beats expectations, they'd have a sub-50W card that duels with a 960 and a sub-150W card that duels the 980 Ti, with silicon at 123mm² and 232mm². I'm thinking AMD poured everything they could into P11 as their minimum VR card now.

This kind of leaves open what the 490 and 460 are, though. I still want to lean toward the 490 and 490X being the dumping ground for defective Fury silicon, to get back some money from the HBM2 expenditure.
