khy
Aug 15, 2005

BIG HEADLINE posted:

Even if your 970 is out of warranty, you'd be able to replace the stock HSF assembly with an aftermarket solution.

I was already looking at it, but since the heatsink can't really 'fail' the way a fan can, there's not much point in replacing it. The fans are what fail, and you can easily pick those up for ~$12 a fan. I can replace all 3 fans for less than one of the Arctic HSF assemblies.


GRINDCORE MEGGIDO
Feb 28, 1985


If an Accelero S3 fits, you'd get great temps with very little noise from a single 120mm fan, if you have the space. They're cheaper than $36.

But to fit a fan you need the fan mount from the turbo module or cable ties.

GRINDCORE MEGGIDO fucked around with this message at 16:26 on Jul 19, 2018

cowofwar
Jul 30, 2002

by Athanatos
How is this two year old tech still the best and so expensive? I miss the 2000s.

khy
Aug 15, 2005

GRINDCORE MEGGIDO posted:

If an Accelero S3 fits, you'd get great temps with very little noise from a single 120mm fan, if you have the space. They're cheaper than $36.

But to fit a fan you need the fan mount from the turbo module or cable ties.

I might have the space in the new case; the old one is iffy. One of the fans doesn't need to be replaced at all, so the other two alone are $24 and should be just fine.

Turns out that Gigabyte puts a shitload of paste on; it had leaked over all four sides of the GPU and was a bitch to clean off. They also have these super fragile lovely cloth cable management bits under the shroud to keep the fan power cables tucked away. One, predictably, came free, so I'm epoxying it back into place. Hooray for waiting for glue to dry. But post-cleaning, the fan does appear to be spinning better, and I got tons of dust and crap off the heatsink (and off the parts of the card where dust had settled under the heatsink), so hey, minor victories.

New fans have been ordered, so now all I gotta do is wait for the epoxy to cure (it's just a tiny, tiny dab of it), put the shroud back on the heatsink, Arctic Silver up the GPU, and I'm good.

khy fucked around with this message at 17:26 on Jul 19, 2018

SYSV Fanfic
Sep 9, 2003

by Pragmatica

cowofwar posted:

How is this two year old tech still the best and so expensive? I miss the 2000s.

Because nvidia is willing to go ET on its Pascal surplus.

Enos Cabell
Nov 3, 2004


There aren't exactly a lot of games out there creating a demand for more GPU power either. My 1080 still crushes content at 1440p, so I'm in no rush to upgrade.

ColTim
Oct 29, 2011
Do conductive pastes like Arctic silver work on gpu dies? I've repasted some gpus but always used non conductive ones

LRADIKAL
Jun 10, 2001

Fun Shoe
You should always use non-conductive paste unless you really need that one degree of extra cooling.

craig588
Nov 19, 2005

by Nyc_Tattoo
Arctic Silver isn't conductive either. Use whatever paste you have unless you're going for those last 2 degrees.

I think it's Kryonaut that's conductive and will get you the last 5 degrees, but it seems like more trouble than those degrees are worth.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

ColTim posted:

Do conductive pastes like Arctic silver work on gpu dies? I've repasted some gpus but always used non conductive ones

It works, but if you over-apply you run the risk of shorting something out. I've been using Arctic MX-4 for both CPU & GPU; it's non-conductive & actually gets better temps than Arctic Silver.

SwissArmyDruid
Feb 14, 2014

by sebmojo

craig588 posted:

Arctic Silver isn't conductive either. Use whatever paste you have unless you're going for those last 2 degrees.

I think it's Kryonaut that's conductive and will get you the last 5 degrees, but it seems like more trouble than those degrees are worth.

Kryonaut is not conductive.

Conductonaut is the liquid metal.

Stickman
Feb 1, 2004

Arctic Silver isn't conductive, but it is slightly capacitive so you still want to avoid getting it on pins and traces.

Craptacular!
Jul 9, 2001

Fuck the DH
I think Nvidia is waiting to see whether higher resolution or higher refresh rate wins the demand, while quietly cheerleading refresh rate. Gsync 4K is out of most people's budgets for some time to come, and it's been increases in resolution (640x480 to 800x600 to 1024x768 to 720p to 1080p to 1440p) that have pushed the need for newer generations of hardware.

A customer who buys a 4K monitor at 60hz may someday switch to AMD once they reach 1080ti performance. A customer who buys Gsync at 1440p is likely Nvidia’s through good launches and bad for some time. They’re also content with what’s already out, while any 4K owner likely sees the 1080ti as a minimum.
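For anyone curious, here's the rough math behind those resolution jumps: just raw pixel counts, ignoring refresh rate entirely (the resolution labels are the standard ones, nothing from any particular benchmark):

```python
# Pixel counts for the common resolution steps; each jump is roughly
# what pushed GPU demand in its era.
resolutions = {
    "640x480": 640 * 480,
    "800x600": 800 * 600,
    "1024x768": 1024 * 768,
    "1280x720": 1280 * 720,
    "1920x1080": 1920 * 1080,
    "2560x1440": 2560 * 1440,
    "3840x2160": 3840 * 2160,  # 4K
}
base = resolutions["1920x1080"]
for name, px in resolutions.items():
    print(f"{name}: {px:,} px ({px / base:.2f}x 1080p)")
```

4K is 2.25x the pixels of 1440p and 4x the pixels of 1080p, which is why each resolution step has historically needed a new hardware generation to drive it.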

khy
Aug 15, 2005

Post-clean: GPU is hovering around 80C, fan speed won't top 2000 RPM. Not sure how good or bad that figure is, but it seems OK.

SlayVus
Jul 10, 2009
Grimey Drawer
Gelid GC Extreme and Kryonaut are probably the two best pastes you can use for just about any thermal application. That GPU should definitely be running lower than 80C, though. Maybe not enough thermal paste?

khy
Aug 15, 2005

SlayVus posted:

Gelid GC Extreme and Kryonaut are probably the two best pastes you can use for just about any thermal application. That GPU should definitely be running lower than 80C, though. Maybe not enough thermal paste?

I'm blaming the fans and the case. Poor airflow combined with fans that are barely spinning. But at least 80 isn't dangerous, since the GPU is rated to 97C, so it'll do until next week (new fans & case arrive).
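If anyone wants to keep an eye on temps and fan speed without staring at Afterburner, here's a quick sketch around nvidia-smi's query flags (the `--query-gpu` fields are real nvidia-smi flags; the helper function and its name are just mine):

```python
import subprocess

QUERY = "temperature.gpu,fan.speed"

def read_gpu_stats(csv_line=None):
    """Return (temp_c, fan_pct) from one line of nvidia-smi CSV output.

    If csv_line is None, shell out to nvidia-smi for a live reading.
    """
    if csv_line is None:
        csv_line = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={QUERY}",
             "--format=csv,noheader,nounits"],
            text=True,
        ).splitlines()[0]
    temp, fan = (int(x.strip()) for x in csv_line.split(","))
    return temp, fan

# Example with a captured output line instead of a live query:
print(read_gpu_stats("80, 66"))  # (80, 66)
```

Stick the live call in a loop with a sleep and you've got a poor man's temp logger.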

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
80C at full load on a 970 is high, but maybe 2000 RPM for those fans is low--how high can you crank them? A lot of people who get <60C temps are doing so with their fans cranked way up into jet-engine territory.

But, yeah, giving it a once-over to ensure you used enough paste and then properly seated the heatsink back is probably not a bad plan. With non-conductive paste, while you don't need to go crazy, ensuring you have enough is better than not having enough--especially on a GPU which doesn't have a heatspreader.

Either way, 80C ain't gonna hurt anything.

khy
Aug 15, 2005

DrDork posted:

80C at full load on a 970 is high, but maybe 2000 RPM for those fans is low--how high can you crank them? A lot of people who get <60C temps are doing so with their fans cranked way up into jet-engine territory.

But, yeah, giving it a once-over to ensure you used enough paste and then properly seated the heatsink back is probably not a bad plan. With non-conductive paste, while you don't need to go crazy, ensuring you have enough is better than not having enough--especially on a GPU which doesn't have a heatspreader.

Either way, 80C ain't gonna hurt anything.

Since the fans are severely problematic, I can't really crank them up. One was seized ENTIRELY last night, 0 RPM. Cleaning got it spinning, but not spinning fast. The middle fan is spinning but obviously has some extra friction, so it ain't spinning like it should be.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Pro Hack: Take the side panel off, remove the fans (if able), and point a standard box fan at the thing and see what happens.

Cavauro
Jan 9, 2008

That happened to my MSI card and I ordered a couple of new fans. If you've already taken them out of the shroud and such, you should be able to find some identifying marks on the fans themselves; going by that, and making sure you know how many pins they use, you should be able to order replacements.

khy
Aug 15, 2005

Cavauro posted:

That happened to my MSI card and I ordered a couple of new fans. If you've already taken them out of the shroud and such, you should be able to find some identifying marks on the fans themselves; going by that, and making sure you know how many pins they use, you should be able to order replacements.

Already bought, will arrive monday.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Craptacular! posted:

I think Nvidia is waiting to see whether higher resolution or higher refresh rate wins the demand, while quietly cheerleading refresh rate. Gsync 4K is out of most people's budgets for some time to come, and it's been increases in resolution (640x480 to 800x600 to 1024x768 to 720p to 1080p to 1440p) that have pushed the need for newer generations of hardware.

A customer who buys a 4K monitor at 60hz may someday switch to AMD once they reach 1080ti performance. A customer who buys Gsync at 1440p is likely Nvidia’s through good launches and bad for some time. They’re also content with what’s already out, while any 4K owner likely sees the 1080ti as a minimum.

4K gsync (eg XB271HK) is not really any more expensive than 1440p (XB271HU). We're talking maybe a $50 difference on a $600 monitor. It's the high-refresh 4K monitors that have the insane pricing.

SlayVus
Jul 10, 2009
Grimey Drawer

Paul MaudDib posted:

4K gsync (eg XB271HK) is not really any more expensive than 1440p (XB271HU). We're talking maybe a $50 difference on a $600 monitor. It's the high-refresh 4K monitors that have the insane pricing.

The problem is the BOM of just one of the components is over $500. The FPGA chip plus the 8GB or more of RAM for the chip supposedly costs over $500 at bulk pricing.

queef anxiety
Mar 4, 2009

yeah
i want to buy a new pc nvidia pls give me the sweet release

SlayVus
Jul 10, 2009
Grimey Drawer

khy posted:

Since the fans are severely problematic, I can't really crank them up. One was seized ENTIRELY last night, 0 RPM. Cleaning got it spinning, but not spinning fast. The middle fan is spinning but obviously has some extra friction, so it ain't spinning like it should be.

If the card isn't idling at around 28-34C, then the paste may not have been applied properly. Guru3D's review of the Gigabyte G1 Gaming shows it hitting max temps of around 60-65C. The reason I suggest checking idle temps is that the fans shouldn't even need to be near 1k RPM to hit the idle temps you should be seeing.

SYSV Fanfic
Sep 9, 2003

by Pragmatica

Enos Cabell posted:

There aren't exactly a lot of games out there creating a demand for more GPU power either. My 1080 still crushes content at 1440p, so I'm in no rush to upgrade.

It's going to be interesting to see what the PS5/Navi will be targeting. 4K@30fps seems like a real stretch, so I'm betting 1080p@60fps will be the native resolution/refresh.

I think we've hit the point of diminishing returns on eye candy. I was wanting to buy a 1080 on fire sale until I realized I was really hard-pressed to notice a difference between high settings at 1080p and low settings at 1440p while actually playing. Everyone talks about a night-and-day difference between 60Hz and 144Hz, but I can't tell any difference on a FreeSync monitor once my frame rate reaches about 45.
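There's simple arithmetic behind that, for what it's worth: the per-frame time saved by each extra Hz shrinks fast, so the jump from 45 to 60 already buys most of the smoothness. A quick sketch (nothing here is from any benchmark, just frame-time math):

```python
# Frame time budget in milliseconds at common refresh/frame rates.
def frame_time_ms(hz):
    return 1000 / hz

for hz in (30, 45, 60, 144):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per frame")
```

Going 30 to 45 fps cuts frame time by about 11 ms; 60 to 144 only cuts another 10 ms total, spread over 84 extra frames per second, which is part of why the difference gets harder to perceive.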

craig588
Nov 19, 2005

by Nyc_Tattoo
If it's still on the broken 1.5 fans I can easily see the 80C. If it's 80C after replacing the bad ones there's a problem.

khy
Aug 15, 2005

SlayVus posted:

If the card isn't idling at around 28-34C, then the paste may not have been applied properly. Guru3D's review of the Gigabyte G1 Gaming shows it hitting max temps of around 60-65C. The reason I suggest checking idle temps is that the fans shouldn't even need to be near 1k RPM to hit the idle temps you should be seeing.

It was idling at 30 earlier. 80 was under load. I probably wasn't clear about that before.

I quit the game I was playing, and it's fun to watch the temp drop like 10 degrees in five seconds. It dropped down to 50, but the case I have has poor ventilation and the inside was pretty toasty; I took the side off my case and used a desk fan to air it out a bit, and it's down to 38. With the side of the case back on, it's hovering at around 40. I'm fairly confident the paste is on there good; it's just a lovely situation with the 1.5 fans and the poorly ventilated case. I'm very excited for the new case, positive airflow woo.

I'm going to be careful and keep a very close eye on it until the new fans arrive. Worst comes to worst, I can always just avoid gaming and play on the PS4 if I really feel the urge to vidya.

khy fucked around with this message at 04:06 on Jul 20, 2018

SlayVus
Jul 10, 2009
Grimey Drawer

craig588 posted:

If it's still on the broken 1.5 fans I can easily see the 80C. If it's 80C after replacing the bad ones there's a problem.

But if it's idling at like 50C with 1.5 fans, then something else is wrong, like the heatsink not being mounted 100%, air bubbles in the thermal paste, or just not enough paste at all. The G1 Gaming supposedly idles at a noise level that shouldn't be perceptible above standard case noise, which would mean the fans should be barely spinning.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

SlayVus posted:

The problem is the BOM of just one of the components is over $500. The FPGA chip plus the 8GB or more of RAM for the chip supposedly costs over $500 at bulk pricing.

That's only for the 4K144 panels. The 4K60 panels use the same module as the 1440p panels. You can pick up a whole monitor for like $600.

The reason 4K GSync hasn't taken off is because it's a bad set of tradeoffs vs high-refresh, not because of price or because NVIDIA is somehow "cheerleading" for high-refresh. 60 Hz sucks for gaming, and the high-refresh 4K panels were vaporware for years, cost 4-5x as much as a good 1440p panel, and still have major shortcomings. In contrast, 1440p is enough resolution for at least 27", lets you get up to 165 Hz, is mature, and can actually be driven with hardware that exists today, and 1080p144 is dirt cheap (although it doesn't make sense to buy a Gsync 1080p panel).

4K makes sense for productivity panels, since it's practically the same price as a decent 1440p panel. But 4K60 doesn't make sense for gaming in general, let alone a high-end gaming product like GSync. 4K144 is going to be cool when it comes down 50% in price and fixes some of the shortcomings like haloing and 8-bit panels, but right now it's an early-adopter product at best.
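To put numbers on the tradeoff, here's the raw pixel throughput each config demands (pixels per second, ignoring everything else a GPU does per frame; the config list is just my pick of common panels, not from any spec sheet):

```python
def pixel_rate(w, h, hz):
    """Pixels per second a GPU must fill at a given resolution/refresh."""
    return w * h * hz

configs = [
    ("1080p144", 1920, 1080, 144),
    ("1440p165", 2560, 1440, 165),
    ("4K60",     3840, 2160, 60),
    ("4K144",    3840, 2160, 144),
]
base = pixel_rate(1920, 1080, 144)
for name, w, h, hz in configs:
    r = pixel_rate(name and w, h, hz) if False else pixel_rate(w, h, hz)
    print(f"{name}: {r / 1e6:.0f} Mpx/s ({r / base:.2f}x 1080p144)")
```

4K144 is 4x the raw fill of 1080p144 and roughly double 1440p165, which is why the hardware to actually drive it keeps not existing.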

Paul MaudDib fucked around with this message at 04:46 on Jul 20, 2018

Stickman
Feb 1, 2004

Paul MaudDib posted:

That's only for the 4K144 panels. The 4K60 panels use the same module as the 1440p panels. You can pick up a whole monitor for like $600.

The reason 4K GSync hasn't taken off is because it's a bad set of tradeoffs vs high-refresh, not because of price or because NVIDIA is somehow "cheerleading" for high-refresh. 60 Hz sucks for gaming, and the high-refresh 4K panels were vaporware for years, cost 4-5x as much as a good 1440p panel, and still have major shortcomings.

I think that was the point. 4K/144 Hz would be a good performance upgrade target for selling the next generation of NVidia GPUs, but not if 4K/144 Hz GSync monitors cost more than a high-end PC. Without that, 1080 Tis work just fine for 1440p/144 Hz or 4K/60 Hz, so there's not really a performance shortfall to use to sell new cards.

It's possible that NVidia is banking on RTX to sell Volta, but we'll see whether they can get some launch games where RTX both makes a sufficient visual difference to impress gamers and also keeps graphical artifacts to a minimum.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
RTX is going to be a gimmick for this generation. The only game coming out in the next year or two that supports it is Metro, so are you buying a whole upgrade just for one game?

I wouldn't get too excited until the generation after Volta. The games industry moves slowly, even if NVIDIA immediately does a big push there won't be much of a library for 3 years or so.

As always, if you need to upgrade then upgrade, and having RTX support is nice, but the generation after Volta will surely have massive improvements in RTX performance as NVIDIA tweaks their hardware. First-gen technology is inevitably poo poo and depending on it to be future-proof is a fool's errand.

Speaking of which, did any games ever end up using NVIDIA's fancy dual-viewport thingy for VR?

Paul MaudDib fucked around with this message at 05:06 on Jul 20, 2018

Stickman
Feb 1, 2004

That was my impression, too, though I didn't know that Metro was the only game coming out with RTX support.

So that pretty much leaves cost and efficiency as the only reasons to upgrade to Volta, which worked okay for Pascal, but for two generations in a row? I can see why they're holding off the release to sell backstock.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
12nm isn't even going to be that big an efficiency improvement since it's really just a 16+. Yeah you can run the die sizes up and then clock down, but as far as enthusiasts are concerned if you're running the chip near the edge of its performance then it's going to be pretty close to the same. I mean, that probably only puts them back to Maxwell-level power consumption, but still.

I was reaallllly disappointed when I heard they weren't going with 7nm. 12nm doesn't make much sense as a performance/efficiency increment over 16nm, so their gains are largely going to be coming from uarch improvements and running up the die size.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Paul MaudDib posted:

I was reaallllly disappointed when I heard they weren't going with 7nm. 12nm doesn't make much sense as a performance/efficiency increment over 16nm, so their gains are largely going to be coming from uarch improvements and running up the die size.

Yeah, same. We've already waited forever; might as well wait a bit longer and get a significantly better node.

Captain Yossarian
Feb 24, 2011

All new "Rings of Fire"
I would buy new technology for a Metro game; they are really good.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
I thought the last word was they were going for 7nm this generation? If not, then at least we shouldn't have to wait as long for the subsequent generation.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
...unless AMD continues to drop the ball and provide basically no pressure on NVidia to push products out quickly.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

ConanTheLibrarian posted:

I thought the last word was they were going for 7nm this generation? If not, then at least we shouldn't have to wait as long for the subsequent generation.

A couple weeks ago, an NVIDIA employee leaked a picture of a Turing development board (:rip:) with a GX102-style configuration (384-bit bus), and if you look at the bumpout it's about 671mm^2, which is definitely not something you'd do on 7nm. So it looks like they are going 12nm this gen.

This either means it's going to be a weird, short generation, or that NVIDIA is going to let AMD take a node advantage on them next year. I really don't get the logic of a 12nm card this late in the game.


repiv
Aug 13, 2009

Paul MaudDib posted:

A couple weeks ago, an NVIDIA employee leaked a picture of a Turing development board (:rip:) with a Gx102-style configuration (384-bit bus) and if you look at the bumpout it's about 671mm^2 which is definitely not something you'd do on 7nm. So it looks like they are going 12nm this gen.

This either means it's going to be a weird, short generation, or that NVIDIA is going to let AMD take a node advantage on them next year. I really don't get the logic of a 12nm card this late in the game.

Didn't we establish that the 671mm^2 estimate probably isn't accurate? It was based on the assumption that the center area of the bumpout exactly matches the die area, which isn't the case with Pascal:
