HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

exquisite tea posted:

On the subject of the GTX 970, is it still the best price:performance investment for current-gen if all I wanna do is play at max settings @ 1080 with minimal overclock finagling?

The 390 is arguably the better card for the near future, but the 970 is also a solid choice if you prefer NVIDIA or have a weaker CPU (AMD's dx11 driver overhead being a possible concern).


Captain Hair
Dec 31, 2007

Of course, that can backfire... some men like their bitches crazy.
I use RivaTuner for frame limiting, since it's already installed for me to monitor things via Afterburner. Works great, can't say it's ever failed me
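For anyone curious what a frame limiter actually does under the hood, it mostly boils down to sleeping away the leftover frame time. A toy sketch of the general idea (this is not how RivaTuner is actually implemented, just the concept):

```python
import time

def frame_limited(render, fps_cap=60):
    """Call render() at most fps_cap times per second.

    render() should return False when the loop should stop.
    """
    target = 1.0 / fps_cap
    while True:
        start = time.perf_counter()
        if render() is False:
            break
        elapsed = time.perf_counter() - start
        # Sleep off whatever is left of this frame's time budget.
        if elapsed < target:
            time.sleep(target - elapsed)
```

Real limiters do the same thing with higher-precision timers (and sometimes busy-waiting the last millisecond) to avoid the jitter of a plain sleep.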

Ichabod Tane
Oct 30, 2005

A most notable
coward, an infinite and endless liar, an hourly promise breaker, the owner of no one good quality.


https://youtu.be/_Ojd0BdtMBY?t=4
Is there any reason to uninstall drivers completely and reinstall from scratch? I've been doing express install with Nvidia for years

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


HalloKitty posted:

The 390 is arguably the better card for the near future, but the 970 is also a solid choice if you prefer NVIDIA or have a weaker CPU (AMD's dx11 driver overhead being a possible concern).

I'm on an i5-4570 and probably will be until I put together an entirely new system. I've favored nvidia in the past just for the convenience and shadowplay features, although I don't have some crazy allegiance.

How does the 980ti compare at the top end, if I wanted to pay a little extra?

Truga
May 4, 2014
Lipstick Apathy
980Ti is 970 SLI in both price and performance, but without all the SLI bullshit that comes with it.


What I'm saying is, it's a really good deal. e: And unlike some other nvidia products with just enough ram to run current games, it might actually hold up at max details a bit longer than most due to its 6gb of ram.

Truga fucked around with this message at 14:54 on Mar 7, 2016

kxZyle
Nov 7, 2012

Pillbug

Blacktoll posted:

Is there any reason to uninstall drivers completely and reinstall from scratch? I've been doing express install with Nvidia for years

I've read reports of increased performance following a complete cleanup in Safe Mode using something like DDU.

It hasn't happened to me personally but it can't hurt, especially after years' worth of express installs.

penus penus penus
Nov 9, 2014

by piss__donald

sauer kraut posted:

A 390 is really close if you have a 600W+ PSU to run it, that 8GB is looking better every month.
Also the FreeSync vs Gsync thing.

^^yeah

Wait how is a 390 (or any card on the market) supposed to use 8 gb of ram without running at 7.2 fps again

like I feel like a puppet but there is endless proof of this for years now

https://www.youtube.com/watch?v=t4uUgIkFa8o&t=309s

The 8gb 390 and 390x were an in-your-face marketing insult that only took one year to finally trickle through as a "good thing". Guys, it's not, it's not a selling point, it never will be, it never can be, please don't tell people to buy a 390 because it has 8gb of ram.

penus penus penus fucked around with this message at 16:40 on Mar 7, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

THE DOG HOUSE posted:

Wait how is a 390 (or any card on the market) supposed to use 8 gb of ram without running at 7.2 fps again

That sweet sweet 100% scaling in Tomb Raider?

Seriously though, the 8 GB is nice because it's bigger than 4. We know that Rise of the Tomb Raider uses more than 3.5 GB on highest textures because the 970 gets stuttering issues. We know it's less than 4 because the 290 is fine (and I'm pretty sure the 980 is too, but those are rarer and I haven't seen any evidence either way), but NV's already saying to avoid that setting in that game with a 4 GB card. It's like the 4 GB 960 and 380X. They can't use all that memory, but the 2 GB they'd have otherwise isn't enough.

penus penus penus
Nov 9, 2014

by piss__donald

xthetenth posted:

That sweet sweet 100% scaling in Tomb Raider?

Seriously though, the 8 GB is nice because it's bigger than 4. We know that Rise of the Tomb Raider uses more than 3.5 GB on highest textures because the 970 gets stuttering issues. We know it's less than 4 because the 290 is fine (and I'm pretty sure the 980 is too, but those are rarer and I haven't seen any evidence either way), but NV's already saying to avoid that setting in that game with a 4 GB card. It's like the 4 GB 960 and 380X. They can't use all that memory, but the 2 GB they'd have otherwise isn't enough.

If you click the video I linked, it goes to the relevant spot. Specifically for the 290 and 390, if you use over 4 gb it doesn't matter because that's not where it bottlenecks. It's very likely that the settings you're talking about use more than 4gb

JacksAngryBiome
Oct 23, 2014

snuff posted:

Personally I would rate the R9 390 a 8/8 and the GTX 970 a 3.5/4.

I wasn't expecting to laugh in this thread. Well played.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

THE DOG HOUSE posted:

If you click the video I linked, it goes to the relevant spot. Specifically for the 290 and 390, if you use over 4 gb it doesn't matter because that's not where it bottlenecks. It's very likely that the settings you're talking about use more than 4gb

Bottlenecks can change, but are you saying that the 290 somehow does better with its ram than the 980? That'd be pretty weird honestly, although with the Fury using a weird rear end caching in VRAM backed by main memory for overflow scheme, that may be the case.


Also bandicam is cool and good and does exactly what I wanted as a frame rate limiter.

xthetenth fucked around with this message at 18:11 on Mar 7, 2016

penus penus penus
Nov 9, 2014

by piss__donald
And for more vram data and perhaps some interesting cases of when it is a good idea to get double the vram (for current lineup offerings)

http://www.techspot.com/review/1114-vram-comparison-test/

But in the case of a 390, 8gb is not usable.

quote:

The 390 and 390X are really graphics cards we never wanted. At the time of their release the Radeon R9 290 and 290X were exceptional buys. The 290X cost just $330, while today the 390X costs around $100 more for no additional performance and it is no different with the 290 and 390.
We see plenty of gamers claiming that the 390 and 390X are excellent buys due to their 8GB frame buffer ensuring that they are "future proofed," and well, that simply isn’t the case, as neither GPU has the horsepower to efficiently crunch that much data. Perhaps the only valid argument here is that the larger frame buffer could support Crossfire better, but we haven’t seen any concrete evidence of this yet.

xthetenth posted:

Bottlenecks can change, but are you saying that the 290 somehow does better with its ram than the 980? That'd be pretty weird honestly, although with the Fury using a weird rear end caching in VRAM backed by main memory for overflow scheme, that may be the case.

Bottlenecks can change but I'm not going to bet on this one.

I have no idea if the 290 does better with its ram than a 980. 290s certainly were king of memory intensive tasks when they were up against Kepler cards, but the actual amount of vram starts taking a back seat in that comparison versus the architecture differences - which I don't know enough about to even pretend to understand.

penus penus penus fucked around with this message at 18:06 on Mar 7, 2016

Anime Schoolgirl
Nov 28, 2002

The 290x cost 530 dollars on release though

but still, 4gb gddr5 isn't worth that much of a price hike for the same card. the only saving grace the 390s have is that the 290s are out of production

Anime Schoolgirl fucked around with this message at 18:06 on Mar 7, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

The 390s are really priced pretty much where the cards belong if it weren't for the 290's reputation being based on that godawful blower. I'm pretty sure that post 390 launch their price went up too.

penus penus penus
Nov 9, 2014

by piss__donald

Anime Schoolgirl posted:

The 290x cost 530 dollars on release though

but still, 4gb gddr5 isn't worth that much of a price hike for the same card. the only saving grace the 390s have is that the 290s are out of production

Taken out of context that quote reads strangely, but it means at the time of the 390 release rather than the msrp of the 290 which was definitely higher.

The Slack Lagoon
Jun 17, 2008



I have a 3760k with a 970 and I've been getting driver crashes in just about every game, from the graphically simple to the complex. It's not heat, the card runs sub-70C even under heavy load.

Latest drivers installed, tried uninstalling GeForce Experience, and no difference.

Anything I can try?

The drivers always recover in game, no ctd.

Potato Salad
Oct 23, 2014

nobody cares


Win10?

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Truga posted:

980Ti is 970 SLI in both price and performance, but without all the SLI bullshit that comes with it.

What I'm saying is, it's a really good deal. e: And unlike some other nvidia products with just enough ram to run current games, it might actually hold up at max details a bit longer than most due to its 6gb of ram.

I ante'd up and bought the 980ti because a) I'm rich, why not and b) this is pretty much gonna be my last upgrade on this system before I build a new one anyway, might as well go all in.

Rastor
Jun 2, 2001

Hey, kids! Is your PC some crappy officeputer with a tiny power supply and no 6pin PCIe auxiliary power connectors? Well, now you can put a GTX 950 in it!

afkmacro
Mar 29, 2009



exquisite tea posted:

I ante'd up and bought the 980ti because a) I'm rich, why not and b) this is pretty much gonna be my last upgrade on this system before I build a new one anyway, might as well go all in.

this is mostly a fake brag post but I quoted you because I felt the same way (part B not as much A though).

I put together a brand new tower last night:

screen: acer predator XB271HU bmiprz
mobo: asus rog hero alpha
cpu: i5-6600K
cooler: h100i v2
ram: corsair dominator ddr4 16gb (2x8)
psu: corsair ax860i
case: corsair 450d
ssd: 950 pro 256gb

but I kept the old graphics card - an r280x - instead of getting a 980 ti because I feel like the next gen is right around the corner.

Durinia
Sep 26, 2014

The Mad Computer Scientist

xthetenth posted:

I'm pretty sure that Cray's made a statement that their second half revenues are going to be tied to Pascal availability. So that very strongly counterindicates first half big Pascal (and consumer big Pascal in general).

Second half starts July 1, and Cray doesn't realize revenue on a system until it has been installed and gone through acceptance testing, which can take weeks/months. If you put padding in for that, and then add in lead times for assembly/manufacturing - you could be anywhere from next week to late October as a window for GPU delivery.
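The delivery-window arithmetic above can be sketched with dates; the padding values here are my own guesses for the unspecified "weeks/months", so treat the numbers as illustrative only:

```python
from datetime import date, timedelta

# Cray has to book the revenue inside H2 2016, i.e. by December 31.
h2_end = date(2016, 12, 31)

# Hedged guesses for the paddings named in the post.
acceptance = timedelta(weeks=6)            # post-install acceptance testing
install_and_assembly = timedelta(weeks=3)  # assembly, shipping, install

# Latest GPU delivery that still lands revenue inside H2:
latest_delivery = h2_end - acceptance - install_and_assembly
```

With those guesses the latest workable delivery lands in late October, which is roughly the far end of the window in the post; the near end is simply "whenever the chips exist".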

That said, the previous comment made about compute consumers getting first dibs on GP100 until stocks stabilize is dead on. At least it better be, or some of the large account Nvidia sales guys are going to need to go into hiding.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Durinia posted:

Second half starts July 1, and Cray doesn't realize revenue on a system until it has been installed and gone through acceptance testing, which can take weeks/months. If you put padding in for that, and then add in lead times for assembly/manufacturing - you could be anywhere from next week to late October as a window for GPU delivery.

That said, the previous comment made about compute consumers getting first dibs on GP100 until stocks stabilize is dead on. At least it better be, or some of the large account Nvidia sales guys are going to need to go into hiding.

Yeah, it's true that that means the chips could be out for compute customers earlier; as a consumer I'm much more concerned with the implication that they're going to be buying as many as they can.

afkmacro posted:

this is mostly a fake brag post but I quoted you because I felt the same way (part B not as much A though).

I put together a brand new tower last night:

screen: acer predator XB271HU bmiprz
mobo: asus rog hero alpha
cpu: i5-6600K
cooler: h100i v2
ram: corsair dominator ddr4 16gb (2x8)
psu: corsair ax860i
case: corsair 450d
ssd: 950 pro 256gb

but I kept the old graphics card - an r280x - instead of getting a 980 ti because I feel like the next gen is right around the corner.

Top of the line 1440p on a 280X? Man after my own heart.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Durinia posted:

Second half starts July 1, and Cray doesn't realize revenue on a system until it has been installed and gone through acceptance testing, which can take weeks/months. If you put padding in for that, and then add in lead times for assembly/manufacturing - you could be anywhere from next week to late October as a window for GPU delivery.

That said, the previous comment made about compute consumers getting first dibs on GP100 until stocks stabilize is dead on. At least it better be, or some of the large account Nvidia sales guys are going to need to go into hiding.

In your opinion, how much market share in the compute sector would AMD need for stabilization, and for comedy purposes, for heads to roll at Nvidia? Time frame and length of time for AMD to take advantage of any opening?

The Slack Lagoon
Jun 17, 2008




Windows 7 ultimate

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

xthetenth posted:

Top of the line 1440p on a 280X? Man after my own heart.

Not just top of the line, but a G-Sync monitor with a 280x.

What's the chance of a monitor ever coming out with both G-Sync and Freesync supported on the same model? Would NVidia even allow that?

SwissArmyDruid
Feb 14, 2014

by sebmojo
Maybe in a few years. Dual scalers don't make much sense from a cost perspective, but Nvidia can't refuse to move off of DP 1.2/1.3 forever. 1.4 just got published last week. But since the adaptive refresh bits are an industry standard, and indeed, Nvidia is already using the VBLANK bits for mobile G-Sync (VBLANK coming from eDP, which was the basis upon which the VESA-standardized Adaptive-Sync was built), I bet you that Nvidia will cook up some "G-Sync, now on ALL monitors!" marketing campaign and claim that they were on board all along.

I don't expect it to happen before Intel start shipping variable refresh-enabled iGPUs, though. (They have already committed to this, in fact.)

SwissArmyDruid fucked around with this message at 01:15 on Mar 8, 2016

Spiritus Nox
Sep 2, 2011

Just finished a system restore after nVidia hosed up trying to install new drivers for my 770. Have the latest drivers been weird for anyone else? I thought I'd check to see if the drivers were the problem before I tried updating again.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Spiritus Nox posted:

Just finished a system restore after nVidia hosed up trying to install new drivers for my 770. Have the latest drivers been weird for anyone else? I thought I'd check to see if the drivers were the problem before I tried updating again.

I've heard bad things about them.



Zero VGS posted:

Not just top of the line, but a G-Sync monitor with a 280x.

What's the chance of a monitor ever coming out with both G-Sync and Freesync supported on the same model? Would NVidia even allow that?


Not sure. I think it's more likely that NV supports Freesync than a monitor doing both. That way GSync remains a lock-in for them.

All I know is I want the bloody XR341CK to stop being so ridiculously priced compared to November and December.

Heavy Metal
Sep 1, 2014

America's $1 Funnyman

xthetenth posted:

I've heard bad things about them.

Where do I go to hear about stuff like this? For the record I have the latest drivers installed.

repiv
Aug 13, 2009

Spiritus Nox posted:

Just finished a system restore after nVidia hosed up trying to install new drivers for my 770. Have the latest drivers been weird for anyone else? I thought I'd check to see if the drivers were the problem before I tried updating again.

The new installer can freak out if you have multiple monitors for some reason. NV are saying to disconnect all but one monitor until it's installed, then you're fine to plug others back in.

Sabriel
May 21, 2006

"Does the walker choose the path, or does the path choose the walker?"

Spiritus Nox posted:

Just finished a system restore after nVidia hosed up trying to install new drivers for my 770. Have the latest drivers been weird for anyone else? I thought I'd check to see if the drivers were the problem before I tried updating again.

Yep. Walked out of the room during my update and came back to a disaster. Had to roll them back since the monitor would turn off, then the PC, when booting up.

Spiritus Nox
Sep 2, 2011

repiv posted:

The new installer can freak out if you have multiple monitors for some reason. NV are saying to disconnect all but one monitor until it's installed, then you're fine to plug others back in.

This seemed to work. We'll see if I end up regretting this upgrade.

afkmacro
Mar 29, 2009



Zero VGS posted:

Not just top of the line, but a G-Sync monitor with a 280x.

What's the chance of a monitor ever coming out with both G-Sync and Freesync supported on the same model? Would NVidia even allow that?

You know the monitor works with ati cards right? I'm just waiting for the new nvidia cards. Couldn't rationalize buying something just for a few months.

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe
I kind of made the mistake of jumping on board with a Strix GTX 960, which is a little weaker than I'd like, at least if I ever go on board with VR equipment. I'm currently listing my Rift DK2, which I won't link here because that would be advertising, but I may try listing it in SA Mart instead.

Anyway, I'm wondering if I should get a Strix 970, or go all the way for a 980 instead, and keep the 960 as a PhysX card. I may also note that a lot of my gaming ends up being streamed using Steam In Home Streaming over a GigE link to my iMac, which shares the desk with my secondary monitor, which also happens to have the PC connected to two of its inputs.

I also wonder if it's even worth keeping the machine's i7 3770 integrated graphics enabled, since some things I've tried to stream inexplicably end up streaming through QuickSync instead of the NVEnc, such as Mercury's Timeless demo.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Mea culpa on anything I've said about AMD taking this next gen by the horns; we probably won't be seeing Big Polaris anytime soon. Apparently SK Hynix won't even be making HBM2 until Q3? (I don't know the reliability of golem.de, so please monitor your sodium intake accordingly)

Since Samsung already started production on HBM2 back in January (https://news.samsung.com/global/samsung-begins-mass-producing-worlds-fastest-dram-based-on-newest-high-bandwidth-memory-hbm-interface), Nvidia might actually be first to market with HBM2 parts. :stonk:

http://www.golem.de/news/high-bandwidth-memory-sk-hynix-produziert-4-gbyte-stapel-ab-dem-dritten-quartal-1603-119580.html

SwissArmyDruid fucked around with this message at 10:58 on Mar 8, 2016

Anime Schoolgirl
Nov 28, 2002

that assumes that samsung doesn't also have a contract for HBM with AMD, which i'd figure was part of that fab contract

SwissArmyDruid
Feb 14, 2014

by sebmojo
Maybe I'm just being too doom and gloom, because of course Nvidia getting an HBM2 part out before AMD would be par for the course. I can see a production offload agreement between GloFo, Samsung, and AMD having been inked, but suspect that the HBM2 might need to be sourced from Hynix. Like, another one of those "we will acquire a minimum amount X of HBM2 from you guys" kinds of agreements that had AMD's balls stapled to TSMC... last year? The year before?

I am entirely happy to be wrong, though. Someone either in this thread or the AMD thread (Sorry, I forgot your name!) made the argument that Samsung is ultimately the best to handle HBM-enabled part manufacture, as they are so vertically integrated and can handle production of the CPU/GPU, the HBM, the interposers, the TSVs, and then put them all together into a completed product to ship out the door to Sapphire or whatever.

SwissArmyDruid fucked around with this message at 15:04 on Mar 8, 2016

sauer kraut
Oct 2, 2004

kuroshi posted:

I kind of made the mistake of jumping on board with a Strix GTX 960, which is a little weaker than I'd like, at least if I ever go on board with VR equipment. I'm currently listing my Rift DK2, which I won't link here because that would be advertising, but I may try listing it in SA Mart instead.

Anyway, I'm wondering if I should get a Strix 970, or go all the way for a 980 instead, and keep the 960 as a PhysX card. I may also note that a lot of my gaming ends up being streamed using Steam In Home Streaming over a GigE link to my iMac, which shares the desk with my secondary monitor, which also happens to have the PC connected to two of its inputs.

I also wonder if it's even worth keeping the machine's i7 3770 integrated graphics enabled, since some things I've tried to stream inexplicably end up streaming through QuickSync instead of the NVEnc, such as Mercury's Timeless demo.

Don't buy a current gen (2013-14 tech) card for VR, new die shrinks are coming out in a few months.

Not sure if PhysX cards are even a thing anymore. The last I heard of it was the early Batman Arkham games where accelerated PhysX was so crashy that you had better turn it off, and Mafia II I guess?
Most games I played (XCOM EU, Styx, off the top of my head) use PhysX in CPU-only mode.

sauer kraut fucked around with this message at 14:46 on Mar 8, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

SwissArmyDruid posted:

Maybe I'm just being too doom and gloom, because of course Nvidia getting an HBM2 part out before AMD would be par for the course. I can see a production offload agreement between GloFo, Samsung, and AMD having been inked, but suspect that the HBM2 might need to be sourced from Hynix. Like, another one of those "we will acquire a minimum amount X of HBM2 from you guys" kinds of agreements that had stapled their balls to TSMC... last year? The year before?

I am entirely happy to be wrong, though. Someone either in this thread or the AMD thread (Sorry, I forgot your name!) made the argument that Samsung is ultimately the best to handle HBM-enabled part manufacture, as they are so vertically integrated and can handle production of the CPU/GPU, the HBM, the interposers, the TSVs, and then put them all together into a completed product to ship out the door to Sapphire or whatever.

That looks like it matches with NV going in on a big chip about as early as possible and AMD waiting till later because they can't rely on corporate budgets to give them enough money to make eating those awful yields worthwhile. Or maybe they've got Samsung ram ready to blow our minds (it's almost definitely the former, they're racing to have a big chip before NV has big chips they can't sell for compute).


Incidentally what in the everloving hell is going on with The Division. The loving 380X@1020 MHz is beating a 970@1330 MHz by an eye-watering 20% in 1440, 9% in 3440x1440, and losing by 13% at 4K. Thing is it's losing at 1080p by 16%. The 970 gets dropped out of a plane with concrete overshoes at anything over 1080.

Good god, my 290 is beating the 970 I had by 30% at my resolution.
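Those percentage gaps are just fps ratios; for anyone sanity-checking benchmark write-ups like that, a quick helper (the example numbers below are made up, not taken from the Division benchmarks):

```python
def pct_faster(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage of B's rate."""
    return (fps_a / fps_b - 1.0) * 100.0

# e.g. a card doing 60 fps vs one doing 50 fps is 20% faster,
# not "10 fps better" -- the baseline matters.
```

Worth keeping in mind that "A is 20% faster than B" and "B is 20% slower than A" aren't the same number, which is how the same review can read very differently depending on which card is the baseline.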


Panty Saluter
Jan 17, 2004

Making learning fun!
I thought the 970 was generally an awesome 1080 card but not for anything much higher resolution anyway.
