Demostrs
Mar 30, 2011

by Nyc_Tattoo

Yeah, that would work fine.

Drakhoran
Oct 21, 2012


That should work, unless… I believe there was a driver bug that prevented Nvidia cards from idling if you connected more than two monitors. Anyone know if that is still a thing? If the answer is yes, you may want to go with an RX 570 instead.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!
The PS5 dev kit has a lot of venting on top, so I expect the production hardware to draw a fair bit of power.

eames
May 9, 2009

AVeryLargeRadish posted:

The PS5 dev kit has a lot of venting on top, so I expect the production hardware to draw a fair bit of power.

I was about to mention that. Looks like a smart design if it has two heatpipe towers inside with the intakes inside the V exhausting outwards (or vice versa).
If the internal heatpipes connect to a baseplate in the middle, it's basically an NH-D15 but stretched, symmetric, angled, and probably even more efficient due to the spacing between towers.

Indiana_Krom
Jun 18, 2007
Net Slacker

Drakhoran posted:

That should work, unless… I believe there was a driver bug that prevented Nvidia cards from idling if you connected more than two monitors. Anyone know if that is still a thing? If the answer is yes, you may want to go with an RX 570 instead.

Just checked: still a thing. 3 or more outputs at once cause my 1080 to idle at 1290 MHz instead of the normal 135 MHz, on the latest Nvidia Game Ready drivers.
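
If anyone wants to reproduce the check on their own box, here's a minimal sketch in Python that polls the reported core clock, memory clock, and power draw. It assumes the nvidia-smi CLI that ships with the driver is on your PATH; the field names are the standard query-gpu ones.

import subprocess
import time

# Standard nvidia-smi --query-gpu fields: graphics clock, memory clock, board power draw.
QUERY = "clocks.gr,clocks.mem,power.draw"

def read_gpu_state(index=0):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=" + QUERY,
         "--format=csv,noheader,nounits", "-i", str(index)],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    core_mhz, mem_mhz, watts = (float(v) for v in out.split(","))
    return core_mhz, mem_mhz, watts

if __name__ == "__main__":
    # Leave the desktop idle while this runs; a card stuck around 1290 MHz
    # with 3+ monitors attached reproduces the behaviour described above.
    for _ in range(5):
        core, mem, power = read_gpu_state()
        print(f"core {core:4.0f} MHz | mem {mem:4.0f} MHz | {power:5.1f} W")
        time.sleep(2)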

repiv
Aug 13, 2009

AVeryLargeRadish posted:

The PS5 dev kit has a lot of venting on top, so I expect the production hardware to draw a fair bit of power.

On the other hand Sony devkits are always desk-filling monoliths with tons of ventilation :v:

The PS5 probably will be power hungry but the devkit isn't a great indicator of what retail hardware will be like.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Turns out MS just licensed the design of the trashcan Mac Pro for the XBSX (Xbox 4? Is there an abbreviation for this thing yet?) and put a different shroud on it.

Dominoes
Sep 20, 2007

Drakhoran posted:

That should work, unless… I believe there was a driver bug that prevented Nvidia cards from idling if you connected more than two monitors. Anyone know if that is still a thing? If the answer is yes, you may want to go with an RX 570 instead.

Much appreciated; going to pull the trigger tomorrow. I'm hesitant to go with ATI; IIRC they have a history of not handling weird/edge cases as well as Nvidia (Linux, etc.). I consider this a weird edge case because I haven't found any guide, anecdote, or source confirming whether 3x 4K displays from a laptop will work.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Steve savaging AMD over the 5600 XT being a disaster in almost every way.

I honestly think this has the potential to be a really bad moment for AMD. In terms of next-gen GPUs, if AMD is first to market, nobody's going to trust them. If Nvidia is first to market, everyone's last memory of AMD is a loving disaster, so everyone will just buy Nvidia instead of waiting. It's lose/lose and they did it to themselves and deserve every bit of it. It's also drawing more attention to how poo poo the Navi drivers are in general (which is no surprise, AMD in particular tends to have a lot of trouble with drivers for fundamentally new GPUs).

K8.0 fucked around with this message at 05:09 on Jan 27, 2020

Maxwell Adams
Oct 21, 2000

T E E F S

ItBreathes posted:

(Xbox 4? Is there an abbreviation for this thing yet?)

xboxse.cx

Rabid Snake
Aug 6, 2004



K8.0 posted:

Steve savaging AMD over the 5600 XT being a disaster in almost every way.

I honestly think this has the potential to be a really bad moment for AMD. In terms of next-gen GPUs, if AMD is first to market, nobody's going to trust them. If Nvidia is first to market, everyone's last memory of AMD is a loving disaster, so everyone will just buy Nvidia instead of waiting. It's lose/lose and they did it to themselves and deserve every bit of it. It's also drawing more attention to how poo poo the Navi drivers are in general (which is no surprise, AMD in particular tends to have a lot of trouble with drivers for fundamentally new GPUs).

It's not that bad; people tend to forget by next gen if the product is good enough. The GTX 970 3.5GB thing was so bad Nvidia ended up sending out settlement checks.

Dr. Despair
Nov 4, 2009


39 perfect posts with each roll.

Indiana_Krom posted:

Just checked: still a thing. 3 or more outputs at once cause my 1080 to idle at 1290 MHz instead of the normal 135 MHz, on the latest Nvidia Game Ready drivers.

I had forgotten that this was a thing and had 3 monitors hooked up after rebuilding my desktop... no wonder my cat has been hanging out next to the bottom where the GPU radiator vents out.

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



Indiana_Krom posted:

Just checked: still a thing. 3 or more outputs at once cause my 1080 to idle at 1290 MHz instead of the normal 135 MHz, on the latest Nvidia Game Ready drivers.

This can easily be fixed though by installing Nvidia Inspector and using the multi monitor power save mode, throttling the card down as long as the GPU usage level is below a certain threshold.
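
For the curious, the behaviour described there (throttle while GPU usage stays below a threshold) can be sketched by hand. Below is a rough, illustrative Python version of that idea using pynvml (the nvidia-ml-py package) for the utilization readout and nvidia-smi's clock-lock switches for the clamping; the threshold and clock values are made up for illustration, -lgc/-rgc need admin rights and a driver/GPU that supports clock locking, and this is not a stand-in for Inspector's actual power-save feature.

import subprocess
import time

import pynvml  # pip install nvidia-ml-py

# Illustrative values only -- not Inspector's defaults.
IDLE_UTIL_PCT = 10     # treat the GPU as idle below this utilization
IDLE_CLOCK_MHZ = 135   # the normal idle clock quoted earlier in the thread
POLL_SECONDS = 5

def set_clock_cap(max_mhz):
    # Requires admin rights and a driver/GPU that supports clock locking.
    subprocess.run(["nvidia-smi", "-lgc", f"0,{max_mhz}"], check=False)

def clear_clock_cap():
    subprocess.run(["nvidia-smi", "-rgc"], check=False)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        if util < IDLE_UTIL_PCT:
            set_clock_cap(IDLE_CLOCK_MHZ)   # clamp down while the desktop is idle
        else:
            clear_clock_cap()               # hand boost behaviour back to the driver
        time.sleep(POLL_SECONDS)
finally:
    clear_clock_cap()
    pynvml.nvmlShutdown()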

mcbexx fucked around with this message at 07:08 on Jan 27, 2020

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



Quote is not edit.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

ItBreathes posted:

Turns out MS just licensed the design of the trashcan Mac Pro for the XBSX (Xbox 4? Is there an abbreviation for this thing yet?) and put a different shroud on it.

I know you jest, but the cuboid micro-ATX design has been around for years.

My only surprise is that they didn't do it for the PS4 and Xbone; I guess they thought they could try and force the traditional console design once more, but as everyone with a PS4 Pro knows, those designs have their limits when it comes to cooling.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

mcbexx posted:

This can easily be fixed though by installing Nvidia Inspector and using the multi monitor power save mode, throttling the card down as long as the GPU usage level is below a certain threshold.

I can attest that this works, despite NVidia Inspector itself not having been updated since sometime in 2016. The newer Profile Inspector doesn't seem to have the same power saving option as far as I can find.

e; though for me (1080Ti w/3 monitors) I find that enabling the Activate Full GPU By Threshold causes it to gently caress up and send my monitors into a grumpy state. Leaving it with the simple Power Saver enabled and nothing else specified works well, though. Knocks about 40W off my idle power consumption.

DrDork fucked around with this message at 19:35 on Jan 27, 2020

Shaocaholica
Oct 29, 2002

Fig. 5E
Is a custom cooler 1070 worth $200 or should I get something else? This will go into a SFF 4770K system with a 500W PSU. Can't upgrade the PSU.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Shaocaholica posted:

Is a custom cooler 1070 worth $200 or should I get something else? This will go into a SFF 4770K system with a 500W PSU. Can't upgrade the PSU.

I think that's a good price for that card, but it's not, like, a great card right now. Are you doing VR, what's your monitor, etc.? If you're looking at mostly 1080p60, it's probably fine.

Shaocaholica
Oct 29, 2002

Fig. 5E

Lockback posted:

I think that's a good price for that card, but it's not, like, a great card right now. Are you doing VR, what's your monitor, etc.? If you're looking at mostly 1080p60, it's probably fine.

No VR. Looking to play modern games at 1080p/1440p.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Shaocaholica posted:

No VR. Looking to play modern games at 1080p/1440p.

So I have a 1070 in 1440p and I am definitely ready to upgrade, though I am being stupidly stubborn and holding out. You're going to have to turn down settings to get to ~60fps on new AAA titles and it'll jump around. I have a G-Sync monitor so it's fine, but not what I'd really want. If you're looking to play, like, Civ 6 or Disco Elysium or whatever you'll be fine but Metro Exodus/Division 2/etc are going to be in the 40s with everything turned on.

1080p it'll still hold up for a while.

Shaocaholica
Oct 29, 2002

Fig. 5E
So what's the logical upgrade from a 1070 then? Will my delided 4770K keep up?

Mindblast
Jun 28, 2006

Moving at the speed of death.


Maybe the next gen has something for ya? I mean the current gen has been around for a bit..

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
1440p I think the 2070S is the obvious answer unless you catch a screaming deal on a 1080 Ti (like, below $400) and don't mind having an older card that is starting to shed performance in newer titles.

1080p I think you're looking for something in the 1070 Ti, 1080, 2060, or 2060S range. Since you already have a 1070, presumably towards the top of that range.

I wouldn't say a 1070 is a bad card for 1080p even today so maybe you do need to think about upgrading that 4770K as well? 4770K+1070 is an overall well-matched pair so it sounds like you're after higher framerate?

That or wait for the next gen later this year, yeah.

Regrettable
Jan 5, 2010



Lockback posted:

So I have a 1070 in 1440p and I am definitely ready to upgrade, though I am being stupidly stubborn and holding out. You're going to have to turn down settings to get to ~60fps on new AAA titles and it'll jump around. I have a G-Sync monitor so it's fine, but not what I'd really want. If you're looking to play, like, Civ 6 or Disco Elysium or whatever you'll be fine but Metro Exodus/Division 2/etc are going to be in the 40s with everything turned on.

1080p it'll still hold up for a while.

Same. I'm just waiting for Nvidia to announce their new lineup at this point.

Shaocaholica
Oct 29, 2002

Fig. 5E

Paul MaudDib posted:

1440p I think the 2070S is the obvious answer unless you catch a screaming deal on a 1080 Ti (like, below $400) and don't mind having an older card that is starting to shed performance in newer titles.

1080p I think you're looking for something in the 1070 Ti, 1080, 2060, or 2060S range. Since you already have a 1070, presumably towards the top of that range.

I wouldn't say a 1070 is a bad card for 1080p even today so maybe you do need to think about upgrading that 4770K as well? 4770K+1070 is an overall well-matched pair so it sounds like you're after higher framerate?

That or wait for the next gen later this year, yeah.

I can wait forever. This isn't a daily and my daily isn't gaming either, so OK, I'll wait and in the meantime get everything else set up.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

K8.0 posted:

If you look at the case of the high end Xbox, it's clear that cooling drove the design much more than previous consoles. The PS5/base Xbox will probably be more like 5600 XT level,

lol no

sauer kraut
Oct 2, 2004

Shaocaholica posted:

Is a custom cooler 1070 worth $200 or should I get something else? This will go into a SFF 4770K system with a 500W PSU. Can't upgrade the PSU.

It brings pretty much the exact same performance as a 1660 Super (~$230 new) but with 2GB more VRAM.
Seems fair; if you're not afraid of some odd screws, I'd consider removing the cooler to apply fresh thermal paste and give it a good dust blowout though.

VelociBacon
Dec 8, 2009

Anyone else have the EVGA AIO aftermarket cooler and mind sharing the results they're getting? I feel like it doesn't keep it as cool as I'd have thought (I hit 69C under load with a reasonable fan curve; with fans at 100% I hit 65C on a 2080 Ti XC Ultra).

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

VelociBacon posted:

Anyone else have the EVGA AIO aftermarket cooler and mind sharing the results they're getting? I feel like it doesn't keep it as cool as I'd have thought (I hit 69C under load with a reasonable fan curve; with fans at 100% I hit 65C on a 2080 Ti XC Ultra).

I’ve always held the belief that a 120mm AIO is too weak for the heat a 2080 ti spits out. I don’t have the EVGA aio but for reference I have a 240mm alphacool AIO on my 2080 ti and it hits ~55C with push/pull A12x25 fans and a 330W power limit.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Shaocaholica posted:

I can wait forever. This isn't a daily and my daily isn't gaming either, so OK, I'll wait and in the meantime get everything else set up.

Then $200 for a 1070 and upgrading later isn't a real bad call. Otherwise I agree with Paul and would go for a 2070S or splurging on a 2080S. It kinda depends on what you're willing to live with.

VelociBacon
Dec 8, 2009

B-Mac posted:

I’ve always held the belief that a 120mm AIO is too weak for the heat a 2080 ti spits out. I don’t have the EVGA aio but for reference I have a 240mm alphacool AIO on my 2080 ti and it hits ~55C with push/pull A12x25 fans and a 330W power limit.

I think you're probably right - I'd love to see a liquid temp sensor though because I feel like it's not really saturating the loop with the temps at the die.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

VelociBacon posted:

I think you're probably right - I'd love to see a liquid temp sensor though because I feel like it's not really saturating the loop with the temps at the die.

I’m hoping EVGA plops a 240mm AIO on some of their next gen cards besides the kingpin now that they have experience doing it.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Big 7nm GPUs are going to be very interesting. 7nm pushes power density far enough that transfer efficiency between the chip and the cooler matters a lot more than it has previously, and we all know how loving lazy and bad a lot of GPU designs are in that regard. I'm prepared for some truly awful thermal performances among the first round of third-party 3000 series/big navi GPUs.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

B-Mac posted:

I’ve always held the belief that a 120mm AIO is too weak for the heat a 2080 ti spits out. I don’t have the EVGA aio but for reference I have a 240mm alphacool AIO on my 2080 ti and it hits ~55C with push/pull A12x25 fans and a 330W power limit.

Maybe there's some installation issue there, then? By comparison, I have a 1080Ti SC2 Hybrid, so a factory-installed 120mm AIO, and even with actual power draw of ~300W, it's never gone above about 47C, and that's with the single fan on it set to basically inaudible. I know the 1080Ti and 2080Ti aren't the same, but given they have the same base TDP and the 2080Ti die is substantially larger, you'd think it'd actually be easier to cool--or at least easier to get the heat out of the chip and into the heatsink.
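
To put numbers on the "bigger die should be easier to cool" point: GP102 (1080 Ti) is roughly 471 mm² and TU102 (2080 Ti) roughly 754 mm², both at the same 250W reference TDP, so the raw power density actually drops on the bigger chip. A quick back-of-the-envelope in Python:

# Rough power density comparison; die areas are the published GP102/TU102
# figures and 250 W is the reference TDP for both cards.
DIES_MM2 = {
    "1080 Ti (GP102)": 471.0,
    "2080 Ti (TU102)": 754.0,
}
TDP_WATTS = 250.0

for name, area in DIES_MM2.items():
    print(f"{name}: {TDP_WATTS / area:.2f} W/mm^2")
# -> 1080 Ti (GP102): 0.53 W/mm^2
# -> 2080 Ti (TU102): 0.33 W/mm^2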

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

B-Mac posted:

I’ve always held the belief that a 120mm AIO is too weak for the heat a 2080 ti spits out. I don’t have the EVGA aio but for reference I have a 240mm alphacool AIO on my 2080 ti and it hits ~55C with push/pull A12x25 fans and a 330W power limit.

If you look at the temps on a 295X2, which pushes 450-500W through a 120mm, I don’t believe the rad is the bottleneck. I’d be curious to see a liquid temp sensor too, I think liquid cooling performance is primarily limited by the coldplate and bigger rads don’t help as much as you might think.

Besides, 55C is perfectly fine anyway. Absolute best case you might see 45C under load.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

DrDork posted:

Maybe there's some installation issue there, then? By comparison, I have a 1080Ti SC2 Hybrid, so a factory-installed 120mm AIO, and even with actual power draw of ~300W, it's never gone above about 47C, and that's with the single fan on it set to basically inaudible. I know the 1080Ti and 2080Ti aren't the same, but given they have the same base TDP and the 2080Ti die is substantially larger, you'd think it'd actually be easier to cool--or at least easier to get the heat out of the chip and into the heatsink.

I tried two different SC2 Hybrids when I was looking at 1080 Tis and both of them reached 60C with the fan blasting at 100%. I'm not calling you a liar, but my experience had it nowhere close to 50C no matter the setup and orientation. I chalked the first one up to maybe a bad pump, but after the second did it I just assumed the 120mm was too anemic for it. I still think 240mm is the way to go for cards with a TDP of 250W or above.

Paul MaudDib posted:

If you look at the temps on a 295X2, which pushes 450-500W through a 120mm, I don’t believe the rad is the bottleneck. I’d be curious to see a liquid temp sensor too, I think liquid cooling performance is primarily limited by the coldplate and bigger rads don’t help as much as you might think.

Besides, 55C is perfectly fine anyway. Absolute best case you might see 45C under load.

Oh I’m not complaining about the temps. The fans run at a fairly low speed and I only lose 30 MHz due to heat from the gpu boost stepping. I did it mostly for just shits and giggles and will most likely grab an air cooled card again unless one of the companies releases a 240mm AIO that also lets you control pump speed.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
What case are you using? I have an abnormally large one, so there's tons of empty space in there to allow for excellent airflow. Wonder if that's part of our different experiences?

I also cannot remember if I cracked it open and repasted it with some better goop. It's absolutely something I could have done, but I simply do not recall.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!
Y'all really need to be talking about delta-T over ambient if you're going to compare temperatures.
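
In other words, subtract the room temperature before comparing numbers. A trivial sketch, using GPU temps from the posts above and assumed room temperatures for illustration:

# Delta-T over ambient = GPU temp minus room temp. The ambient values here
# are assumptions; the GPU temps are the ones quoted in the thread.
def delta_t(gpu_temp_c, ambient_c):
    return gpu_temp_c - ambient_c

print(delta_t(69.0, 22.0))  # 47.0 C over ambient (120mm AIO example above)
print(delta_t(55.0, 20.0))  # 35.0 C over ambient (240mm AIO example above)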

VelociBacon
Dec 8, 2009

Assuming we all are in rooms around 22C!

e: I wonder how worthwhile it would be to go to a push/pull config on the 120mm with some static pressure Corsair fans. I can always toss something there for now just to see. Honestly, temps of 69C under load aren't bothering me; I just hoped for a little better.

VelociBacon fucked around with this message at 11:47 on Jan 29, 2020

fuf
Sep 12, 2004

haha

Shaocaholica posted:

So what's the logical upgrade from a 1070 then?

I'm in the same situation and it's annoying but I think the answer is wait until the next generation.

I was thinking about upgrading to a 2070S but I looked at some benchmarks for games I care about and in some cases it was only like 15-20 extra FPS. It didn't seem worth it for like $400.
