teagone
Jun 10, 2003

That was pretty intense, huh?

Is mankind ever going to reach a point where GPUs and CPUs will operate under full load at room temperature, barring the need for any kind of cooling whatsoever? Or will all humans die off before that happens.

VelociBacon
Dec 8, 2009

teagone posted:

Is mankind ever going to reach a point where GPUs and CPUs will operate under full load at room temperature, barring the need for any kind of cooling whatsoever? Or will all humans die off before that happens.

That will happen but unfortunately only once we have hosed our climate so much that room temperature becomes 85C.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

teagone posted:

Is mankind ever going to reach a point where GPUs and CPUs will operate under full load at room temperature, barring the need for any kind of cooling whatsoever? Or will all humans die off before that happens.

We are there already; look at phones, tablets, and the MacBook.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

teagone posted:

Is mankind ever going to reach a point where GPUs and CPUs will operate under full load at room temperature, barring the need for any kind of cooling whatsoever? Or will all humans die off before that happens.

Yes, once we enter a black hole and the laws of physics break down.

Twerk from Home posted:

We are there already; look at phones, tablets, and the MacBook.

None of those things run at room temperature. You can't be colder than ambient without outside assistance and power will always turn into heat.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Don Lapre posted:

Yes, once we enter a black hole and the laws of physics break down.


None of those things run at room temperature. You can't be colder than ambient without outside assistance and power will always turn into heat.

I misunderstood the question, I thought he meant without fans.

SwissArmyDruid
Feb 14, 2014

by sebmojo

teagone posted:

Is mankind ever going to reach a point where GPUs and CPUs will operate under full load at room temperature, barring the need for any kind of cooling whatsoever? Or will all humans die off before that happens.

Maybe not, but I absolutely want to get my hands on a Greenland-based Fury with Sapphire's Tri-X, you know, the one where the cooler hangs an extra 150mm off the tail end of the shorty-short board? The present version is kind of ridiculous in how well it cools; I am licking my chops over a version with a lower-TDP chip.

Mozi
Apr 4, 2004

Forms change so fast
Time is moving past
Memory is smoke
Gonna get wider when I die
Nap Ghost
Assuming you could superconduct the heat away using quantum entanglement, I don't see any reason it's not possible.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

teagone posted:

Is mankind ever going to reach a point where GPUs and CPUs will operate under full load at room temperature, barring the need for any kind of cooling whatsoever? Or will all humans die off before that happens.

That depends entirely on what you mean, since something like Atom SoCs do a good job of this, as do most ARM SoCs. VIA CPUs are known to operate for lengthy periods at full load while not breaking room temperature, and if you undervolt Kabini it doesn't even need the supplied heatsink.

If you mean maximum desktop-class performance for a given generation, I don't think it'll happen due to the nature of desktops.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Twerk from Home posted:

I misunderstood the question, I thought he meant without fans.

Even so, those products thermally throttle like heeeeeellll. Linus did a couple of videos about water cooling cell phones and unibody MacBooks; the MacBooks are especially bad, even with their super-low TDPs.

edit: water COOLING. Stupid autocorrect.

SwissArmyDruid fucked around with this message at 22:27 on Nov 3, 2015

Sininu
Jan 8, 2014

https://www.youtube.com/watch?v=_jJpbmxGnss
Nvidia will turn this off, won't they?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

Maybe not, but I absolutely want to get my hands on a Greenland-based Fury with Sapphire's Tri-X, you know, the one where the cooler hangs an extra 150mm off the tail end of the shorty-short board? The present version is kind of ridiculous in how well it cools; I am licking my chops over a version with a lower-TDP chip.

That's the other good thing about HBM improving GPU form factor: you get WAAAY better blowers and cooling setups.

SwissArmyDruid posted:

Even so, those products thermally throttle like heeeeeellll. Linus did a couple of videos about water cooling cell phones and unibody MacBooks; the MacBooks are especially bad, even with their super-low TDPs.

Makes me wonder how hard you could push a Cherrytrail Atom with proper cooling.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
If you compare the performance difference over time between a mobile chip of a certain TDP and the best desktop chip available at the time, you'll see that the gap is getting smaller as time goes on. We're getting a lot better at extracting performance per watt at lower TDPs, so while lower-power chips aren't going to be as fast as your desktop, they'll be increasingly "close enough" for most people.

teagone
Jun 10, 2003

That was pretty intense, huh?

FaustianQ posted:

That depends entirely on what you mean, since something like Atom SoCs do a good job of this, as do most ARM SoCs. VIA CPUs are known to operate for lengthy periods at full load while not breaking room temperature, and if you undervolt Kabini it doesn't even need the supplied heatsink.

If you mean maximum desktop-class performance for a given generation, I don't think it'll happen due to the nature of desktops.

Yeah, I meant desktop-class performance, e.g., running an AAA big-budget PC gaming title at some ridiculous resolution with the GPUs/CPUs sans coolers not even breaking a sweat.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

teagone posted:

Yeah, I meant desktop-class performance, e.g., running an AAA big-budget PC gaming title at some ridiculous resolution with the GPUs/CPUs sans coolers not even breaking a sweat.

We used to, actually. Maybe if things keep going the way they are, we'll be deep enough in the land of diminishing returns we can go back to it without losing much. That's a pretty big bet though, lithography is becoming an ever increasing pain.

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

SinineSiil posted:

https://www.youtube.com/watch?v=_jJpbmxGnss
Nvidia will turn this off, won't they?

Not if they want to stay fully DX12 compliant.

EoRaptor
Sep 13, 2003

by Fluffdaddy

xthetenth posted:

We used to, actually. Maybe if things keep going the way they are, we'll be deep enough in the land of diminishing returns we can go back to it without losing much. That's a pretty big bet though, lithography is becoming an ever increasing pain.

GPU process finally moving to 18/14nm should help a bunch, things have been stuck at 28nm for a long time. The other thing that will happen is we will eventually top out at a certain resolution, where there is no more ability to perceive increases in pixels (either through a huge screen or some sort of future VR headset), and this will let GPU manufacturers begin to optimize for other things, though I'd expect this is some ways in the future.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

EoRaptor posted:

GPU process finally moving to 18/14nm should help a bunch, things have been stuck at 28nm for a long time. The other thing that will happen is we will eventually top out at a certain resolution, where there is no more ability to perceive increases in pixels (either through a huge screen or some sort of future VR headset), and this will let GPU manufacturers begin to optimize for other things, though I'd expect this is some ways in the future.

Going to 14nm just means more transistors packed on. They aren't going to keep existing designs indefinitely and just make them smaller.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

EoRaptor posted:

GPU process finally moving to 18/14nm should help a bunch, things have been stuck at 28nm for a long time. The other thing that will happen is we will eventually top out at a certain resolution, where there is no more ability to perceive increases in pixels (either through a huge screen or some sort of future VR headset), and this will let GPU manufacturers begin to optimize for other things, though I'd expect this is some ways in the future.

Yeah. And come to think of it, I don't think we'll ever see GPU performance in the passive-cooler range catch up to bigger, hotter chips. That's a CPU thing, because CPUs are trying to get blood from a stone by wringing out tiny gains in IPC and clock speeds aren't getting that much faster, while GPUs can just go wider. None of the trends really matter though, because before we get too much farther, atoms are just going to be too drat big.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

EoRaptor posted:

GPU process finally moving to 18/14nm should help a bunch, things have been stuck at 28nm for a long time. The other thing that will happen is we will eventually top out at a certain resolution, where there is no more ability to perceive increases in pixels (either through a huge screen or some sort of future VR headset), and this will let GPU manufacturers begin to optimize for other things, though I'd expect this is some ways in the future.

It's already pretty hard to tell the difference between 1080p and 1440p, I don't see any point in going beyond 4k really on a 27-32" monitor. Picture quality means a hell of a lot more than resolution past 1080p IMHO.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

FaustianQ posted:

It's already pretty hard to tell the difference between 1080p and 1440p, I don't see any point in going beyond 4k really on a 27-32" monitor. Picture quality means a hell of a lot more than resolution past 1080p IMHO.

The half frame every second I get from my Surface looks really nice though.

The main thing about resolution these days is that it's the only thing that really helps with subpixel aliasing, and that is ugly as hell. Better solutions for that would be nice.

Overall though I agree somewhat.

dissss
Nov 10, 2007

I'm a terrible forums poster with terrible opinions.

Here's a cat fucking a squid.

teagone posted:

Yeah, I meant desktop-class performance, e.g., running an AAA big-budget PC gaming title at some ridiculous resolution with the GPUs/CPUs sans coolers not even breaking a sweat.

It's hardly a modern game now, but New Vegas will barely make the fans turn on on my Asus 970 - they do sometimes spin up, but only at quite a low RPM.

VelociBacon
Dec 8, 2009

FaustianQ posted:

It's already pretty hard to tell the difference between 1080p and 1440p, I don't see any point in going beyond 4k really on a 27-32" monitor. Picture quality means a hell of a lot more than resolution past 1080p IMHO.

I have to say that I personally notice a large difference between the two resolutions. I'm also really 'into' images, though, and spend a lot of time looking at medical imaging, games, photographs, etc., so I'm probably not representative. I can't tell the difference between FLAC and .mp3, so I guess it's just what you're into.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

FaustianQ posted:

It's already pretty hard to tell the difference between 1080p and 1440p, I don't see any point in going beyond 4k really on a 27-32" monitor. Picture quality means a hell of a lot more than resolution past 1080p IMHO.

Have you ever seen or used a 27" 1080p monitor? 1080p @ 27" has a pretty high dot pitch and looks bad. 2560x1440 is pretty drat necessary for monitors 27" and larger, which are used at a 3 foot or less viewing range.
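
For what it's worth, the dot pitch gap is easy to put numbers on. Here's a minimal back-of-the-envelope sketch in Python (the panel sizes are just example configurations, not anything official):

# Quick pixels-per-inch (PPI) check for a few common monitor configs.
# PPI = diagonal resolution in pixels / diagonal size in inches.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch for a display of the given resolution and diagonal.
    return hypot(width_px, height_px) / diagonal_in

configs = [
    ("1080p @ 24in", 1920, 1080, 24),
    ("1080p @ 27in", 1920, 1080, 27),
    ("1440p @ 27in", 2560, 1440, 27),
    ("4K    @ 40in", 3840, 2160, 40),
]

for name, w, h, d in configs:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")

# Roughly 92, 82, 109 and 110 PPI respectively - 1080p stretched to 27"
# is noticeably coarser than the same panel size at 1440p.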

SwissArmyDruid
Feb 14, 2014

by sebmojo

SinineSiil posted:

https://www.youtube.com/watch?v=_jJpbmxGnss
Nvidia will turn this off, won't they?

They're late to the party; Anandtech already did all this: http://www.anandtech.com/show/9740/directx-12-geforce-plus-radeon-mgpu-preview

I guess they just wanted to see for themselves. My money is on Nvidia doing the same thing they did when they killed off buying a cheapo card purely for PhysX while primarily running an ATI card for your main graphics: make the driver just self-terminate entirely if it detects a discrete AMD card in the system at all.

Fumble
Sep 4, 2006
Probation
Can't post for 25 days!

Dead Goon posted:

At the top end of your budget, you could get a GTX 970.

http://uk.pcpartpicker.com/part/zotac-video-card-zt9010110p

Thanks, just ordered one.

Verizian
Dec 18, 2004
The spiky one.

SwissArmyDruid posted:

They're late to the party; Anandtech already did all this: http://www.anandtech.com/show/9740/directx-12-geforce-plus-radeon-mgpu-preview

I guess they just wanted to see for themselves. My money is on Nvidia doing the same thing they did when they killed off buying a cheapo card purely for PhysX while primarily running an ATI card for your main graphics: make the driver just self-terminate entirely if it detects a discrete AMD card in the system at all.

If they do that, MS will pull their DX12 driver certification, and the same goes for Khronos and the new OpenGL. Brand-agnostic multi-GPU and VRAM sharing support is a key feature of the next-gen APIs; break it and you're off spec. Nvidia's response is NVLink to boost connections between multiple Pascal GPUs, and then having various sites throw around rumours of a 16GB 1080Ti/32GB TitanNext, both using 2nd-gen HBM2.

1gnoirents
Jun 28, 2014

hello :)

FaustianQ posted:

It's already pretty hard to tell the difference between 1080p and 1440p

:saddowns:

SlayVus
Jul 10, 2009
Grimey Drawer

FaustianQ posted:

It's already pretty hard to tell the difference between 1080p and 1440p, I don't see any point in going beyond 4k really on a 27-32" monitor. Picture quality means a hell of a lot more than resolution past 1080p IMHO.

40" 4k is similar to dpi of a 24" 1080p. 40" 4k would also probably be preferable because you don't have to use display scaling.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Verizian posted:

They do that, MS will pull their DX12 driver certification, same for Khronos and the new openGL. Brand agnostic multi-GPU and vRAM sharing support is a key feature of the next gen API's, break it and you're off spec. Nvidia's response is NVlink to boost connections between multiple Pascal GPU's then have various sites throw around rumours of a 16GB 1080Ti/32GB TitanNext both using 2nd gen HBM2.

Does this mean I can use a $50 Radeon GPU or APU to drive a FreeSync monitor with a 980ti to actually crunch the shiny?

Yaoi Gagarin
Feb 20, 2014

I have a 280X that's recently started overheating. It idles at 53C in a room that's probably about 20C ambient. The idle frequency is 500 MHz, the load frequency is 1020 MHz, and under load the temperature keeps rising until eventually it hits 99C and the card throttles back to 500. There's no way the chip could suddenly start drawing more power, right? This is probably just a cooler problem? The card overheats even if I up the fan speed to 100% in Afterburner or CCC.

repiv
Aug 13, 2009

Zero VGS posted:

Does this mean I can use a $50 Radeon GPU or APU to drive a FreeSync monitor with a 980ti to actually crunch the shiny?

Yep, if the game developers go out of their way to implement a display passthrough mode in their engine.

That's the elephant in the room with DX12 mixed-adapter modes - you're buying into a setup that leaves you completely SOL if a game doesn't implement DX12 mixed-adapter mode or perform well on your special snowflake combination of hardware. There's no waiting for nVidia/AMD's driver wizards to fix it by force; it will simply never work unless the developers can be convinced to care about your niche use case.

Truga
May 4, 2014
Lipstick Apathy
Ideally, you'd get a 24-30" 8K screen and toss antialiasing since it won't be needed any more at above 300dpi from 2 feet.

VelociBacon
Dec 8, 2009

VostokProgram posted:

I have a 280X that's recently started overheating. It idles at 53C in a room that's probably about 20C ambient. The idle frequency is 500 MHz, the load frequency is 1020 MHz, and under load the temperature keeps rising until eventually it hits 99C and the card throttles back to 500. There's no way the chip could suddenly start drawing more power, right? This is probably just a cooler problem? The card overheats even if I up the fan speed to 100% in Afterburner or CCC.

Cooler problem, is it absolutely full of dust or cat hair? Are your case intake/exhaust fans working?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Twerk from Home posted:

Have you ever seen or used a 27" 1080p monitor? 1080p @ 27" has a pretty high dot pitch and looks bad. 2560x1440 is pretty drat necessary for monitors 27" and larger, which are used at a 3 foot or less viewing range.

I currently have a 27" 1440p monitor, one of the best on the market as well, so I kind of skipped that. However my point wasn't that 1440p was unnecessary rather most people gaming won't go much beyond 32" and there are diminishing returns for higher resolution beyond 4K. Nvidia, AMD, and Intel focusing on optimization for 4k makes sense rather than chasing 8k and beyond, sans VR I suppose. Professional imaging, art and TVs might demand higher though.


Is there any real noticeable difference in 1080p vs 1440p below 27"? Consider viewing distance as well.

SlayVus posted:

40" 4k is similar to dpi of a 24" 1080p. 40" 4k would also probably be preferable because you don't have to use display scaling.

40" might be a bit too big for being only 2ft away.

Kazinsal
Dec 13, 2011


VelociBacon posted:

Cooler problem, is it absolutely full of dust or cat hair? Are your case intake/exhaust fans working?

And for that matter, are the card's fans working?

Yaoi Gagarin
Feb 20, 2014

VelociBacon posted:

Cooler problem, is it absolutely full of dust or cat hair? Are your case intake/exhaust fans working?

Case fans are working. I'll pull the card out and take a look, you're probably right about the dust. I've been using this card for about a year and the last case didn't have any filters.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

xthetenth posted:

The main thing about resolution these days is that it's the only thing that really helps with subpixel aliasing, and that is ugly as hell. Better solutions for that would be nice.

People love to freak out about how 1080p should be enough for any honest man, but the topic of "how much resolution is too much" isn't a novel one to science; it's a basic question that gets asked any time anyone does signal transforms (e.g. A2D or D2A conversion). It makes sense to increase display resolution to at least the Nyquist rate (twice what you can actually resolve, with the units here being PPI). Otherwise, just like with video, you get aliasing. Potentially go further than that, since it also makes sense to do SSAA even beyond your display resolution, of course. That works out to (I think) at least 8K on your basic 27-32" desktop monitor.
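
To put rough numbers on that (a sketch only - the ~1 arcminute acuity figure and the 30" viewing distance are assumptions I'm plugging in, not measurements):

# Back-of-the-envelope Nyquist argument for display resolution.
# Assumes ~20/20 acuity (about 1 arcminute per resolvable feature)
# and a 30-inch viewing distance; both numbers are illustrative.
from math import radians, tan, hypot

ACUITY_ARCMIN = 1.0   # smallest resolvable feature, in arcminutes
VIEW_DIST_IN = 30.0   # viewing distance, in inches

# Size of one resolvable feature on the screen, in inches.
feature_in = VIEW_DIST_IN * tan(radians(ACUITY_ARCMIN / 60.0))
resolvable_ppi = 1.0 / feature_in    # what the eye can resolve
nyquist_ppi = 2.0 * resolvable_ppi   # sample at twice that rate

print(f"resolvable ~{resolvable_ppi:.0f} PPI, Nyquist target ~{nyquist_ppi:.0f} PPI")

# Compare with an 8K (7680x4320) panel at desktop sizes:
for diag_in in (27, 32):
    panel_ppi = hypot(7680, 4320) / diag_in
    print(f"8K @ {diag_in}in: ~{panel_ppi:.0f} PPI")

# Comes out to ~115 PPI resolvable / ~229 PPI Nyquist, versus ~326 and
# ~275 PPI for 8K at 27" and 32", which is where the 'at least 8K'
# ballpark comes from.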

Paul MaudDib fucked around with this message at 02:21 on Nov 4, 2015

1gnoirents
Jun 28, 2014

hello :)

FaustianQ posted:

Is there any real noticeable difference in 1080p vs 1440p below 27"? Consider viewing distance as well.

Oh, below 27"... probably not. 27" apples to apples, I can't go back.

My phone has a 1440p screen and it's totally worthless, for an extreme example, but I think there is a significant difference between 1080p and 1440p at reasonable sizes. But it looks like I'm reading this all out of context.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Twerk from Home posted:

Have you ever seen or used a 27" 1080p monitor? 1080p @ 27" has a pretty high dot pitch and looks bad. 2560x1440 is pretty drat necessary for monitors 27" and larger, which are used at a 3 foot or less viewing range.

To echo this, the dot pitch of a 1080p @ 27" desktop monitor is Pretty loving Big. I freely admit that I am a guy who likes lots of screen real estate for doing programming and poo poo, but 1440p is really what's appropriate for 27" desktop monitors. Given that you can pick up cheap Korean panels for $200 or less, you really have to ask yourself what you're actually saving versus your user experience.

Panty Saluter
Jan 17, 2004

Making learning fun!

VostokProgram posted:

I have a 280X that's recently started overheating. It idles at 53C in a room that's probably about 20C ambient. The idle frequency is 500 MHz, the load frequency is 1020 MHz, and under load the temperature keeps rising until eventually it hits 99C and the card throttles back to 500. There's no way the chip could suddenly start drawing more power, right? This is probably just a cooler problem? The card overheats even if I up the fan speed to 100% in Afterburner or CCC.

If you don't mind spending some money and disassembling your card, Arctic Cooling makes some neato coolers that work really well. I had a 4850 with an awful tiny cooler that was mediocre when clean but loved to trap dust, leading to super high temperatures. A fully passive Accelero S1 kept my load temps lower than the stock cooler's idle temps (the card would idle at 70-75C with the stock cooler).

Also make sure you have enough space in your case :v:
