craig588
Nov 19, 2005

by Nyc_Tattoo
I bet a Zotac would. Worst fans I can remember seeing on a GPU.

Three-Phase
Aug 5, 2006

by zen death robot
I got a GTX 1060 3GB today at Microcenter and installed it on my new i5 8600k system.

DayZ Standalone is still a massive garbage fire of a game, but now it is an incredibly smooth, detailed, and beautiful garbage fire. (Upgraded from a late-2011 i7-920 and a 750.) I think my framerate went from about 15 FPS to well over 30 FPS at much, much higher graphics settings.

Three-Phase fucked around with this message at 17:40 on Nov 24, 2017

1gnoirents
Jun 28, 2014

hello :)

Three-Phase posted:

I got a GTX 1060 3GB today at Microcenter and installed it on my new i5 8600k system.

DayZ Standalone is still a massive garbage fire of a game, but now it is an incredibly smooth, detailed, and beautiful garbage fire. (Upgraded from a late-2011 i7-920 and a 750.) I think my framerate went from about 15 FPS to well over 30 FPS at much, much higher graphics settings.

You're going to see a buttload of actual CPU-related improvements as well. Savor them, as they are rare.

Three-Phase
Aug 5, 2006

by zen death robot

1gnoirents posted:

You're going to see a buttload of actual CPU-related improvements as well. Savor them, as they are rare.

Well, how many “generations” is going from an i7-920 to an i5-8600? Like five?

SlayVus
Jul 10, 2009
Grimey Drawer

Three-Phase posted:

Well, how many “generations” is going from an i7-920 to an i5-8600? Like five?

Nehalem, Westmere, Sandy Bridge, Ivy Bridge, Haswell, Broadwell, Skylake, Kaby Lake, Coffee Lake. Each one provided 3-10% more IPC than the previous.

Single-threaded Passmark score for the 920 is 1164. The i5-8600 gets 2533.

SlayVus fucked around with this message at 21:33 on Nov 24, 2017
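A quick back-of-the-envelope on that in Python (illustrative only - the per-generation gains were uneven, and Passmark single-thread scores mix IPC with clock speed, so this is just a rough bracket):

# Nehalem -> Coffee Lake is eight generational steps per the list above
steps = 8
low, high = 1.03 ** steps, 1.10 ** steps
print(f"compounded uplift: {low:.2f}x to {high:.2f}x")    # ~1.27x to ~2.14x
print(f"Passmark i5-8600 / i7-920: {2533 / 1164:.2f}x")   # ~2.18x

The measured ratio lands near the top of that range, largely because clock speeds also climbed over those generations.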

dissss
Nov 10, 2007

I'm a terrible forums poster with terrible opinions.

Here's a cat fucking a squid.

SlayVus posted:

Nehalem, Westmere, Sandy Bridge, Ivy Bridge, Haswell, Broadwell, Skylake, Kaby Lake, Coffee Lake. Each one provided 3-10% more IPC than the previous.

Single-threaded Passmark score for the 920 is 1164. The i5-8600 gets 2533.

Sandy Bridge was the big jump - a 2600 scores 1921 single-threaded on Passmark.

The 920 is a good place to be coming from. I still can't justify moving on from Haswell, even with the extra cores modern CPUs have.

Three-Phase
Aug 5, 2006

by zen death robot
I downloaded the trial of Hitman from Steam and, despite being limited to medium graphics (3GB card), I was still like "Ooooh, ahhh, pretty".

GRINDCORE MEGGIDO
Feb 28, 1985


dissss posted:

Sandy Bridge was the big jump - a 2600 scores 1921 single-threaded on Passmark.

The 920 is a good place to be coming from. I still can't justify moving on from Haswell, even with the extra cores modern CPUs have.

Is there any change from Skylake to Coffee Lake at all?

E - clocks prolly

SlayVus
Jul 10, 2009
Grimey Drawer

GRINDCORE MEGGIDO posted:

Is there any change from Skylake to Coffee Lake at all?

E - clocks prolly

i5-6600K to i5-8600K goes from 2149 to 2533. The i5-7600 (non-K) gets 2311 and the K version gets 2397. The 6600K and 7600 run a 3.5 GHz base clock, the 7600K does 3.8, and the 8600K does 3.6.
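Dividing those scores by clock speed is a rough way to separate IPC from frequency. Passmark's single-thread test runs at turbo rather than base clock, so this sketch assumes the published turbo clocks (3.9/4.1/4.2/4.3 GHz, from Intel's spec pages, not from the numbers above):

# Points per turbo GHz as a crude IPC proxy; only the scores come from above
chips = {
    "i5-6600K": (2149, 3.9),
    "i5-7600":  (2311, 4.1),
    "i5-7600K": (2397, 4.2),
    "i5-8600K": (2533, 4.3),
}
for name, (score, turbo_ghz) in chips.items():
    print(f"{name}: {score / turbo_ghz:.0f} points per turbo GHz")

The per-clock spread comes out under 10%, which is consistent with Skylake through Coffee Lake being mostly the same cores run at higher clocks.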

GRINDCORE MEGGIDO
Feb 28, 1985


Oh that's surprising. I thought they were the same cores.

Three-Phase
Aug 5, 2006

by zen death robot

SlayVus posted:

i5-6600K to i5-8600K goes from 2149 to 2533. The i5-7600 (non-K) gets 2311 and the K version gets 2397. The 6600K and 7600 run a 3.5 GHz base clock, the 7600K does 3.8, and the 8600K does 3.6.

I have gotten 4.2 GHz with that automatic boosting.

Three-Phase
Aug 5, 2006

by zen death robot
So I did some testing earlier in, of all things, STALKER: Clear Sky. My video card temperature stabilized around 74C. That seems pretty high; the fan didn't seem to kick into higher gear until I had been playing for a while. No noticeable video problems. The X-Ray engine is, in my opinion, an extremely resource-hungry engine in terms of video cards. It's also worth noting this card (EVGA 3GB 1060) is probably considered a "small form factor" video card.

Is 75C an acceptable temperature for this sort of video card? I saw people who got higher than 80C with the 1060s.

Three-Phase fucked around with this message at 01:43 on Nov 25, 2017

Ihmemies
Oct 6, 2012

Risky Bisquick posted:

3GB will mean you will need to run low-med textures for some newish games. When you consider you can basically flip from low to ultra textures at nearly no performance hit, it doesn't make sense to save the money. You can lower all the other settings to improve fps but still keep great-looking textures, so your game doesn't look like CS 1.6.

There are some benefits to CS 1.6 textures, though. At very low settings I find enemies don't blend into the terrain as well, making them easier to spot.

E: also finally ordered an Arctic Twin Turbo III and some Kryonaut for my GTX 970. Man, the Asus Strix cooler is a noisy piece of garbage. Even at steady RPM the fans make kind of a wowowowowo sound. Oh, and the fans are so loving loud anyway.

I have a Twin Turbo II on my HTPC's HD 6970 and it is so much quieter when gaming. Originally the GTX 970's lovely cooling didn't bother me that much because my case fans were a lot louder. After upgrading to an 8700K, the noisiest part now is my GPU.

I'm never going to pay extra for "better" coolers from now on. This is like the 4th time I've had to swap the cooler, so I might as well buy those bottom-barrel stock fan design models.

Ihmemies fucked around with this message at 01:51 on Nov 25, 2017

craig588
Nov 19, 2005

by Nyc_Tattoo
Below 80C is fine. It probably speeds up to audible when it hits like 78 and then slows down at like 76. There's some amount of hysteresis so it's not constantly changing speeds. If you're really interested you can dump the BIOS with NVFlash and get the exact values; they're different for every card.
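In rough Python terms, the hysteresis being described looks like this (the 76/78C numbers are the guesses from the post, not values dumped from any real BIOS):

SPEED_UP_AT  = 78   # C - assumed ramp-up threshold
SLOW_DOWN_AT = 76   # C - assumed ramp-down threshold

def next_fan_state(temp_c, currently_loud):
    """Return True if the fan should be in its louder state."""
    if temp_c >= SPEED_UP_AT:
        return True           # crossed the upper threshold: speed up
    if temp_c <= SLOW_DOWN_AT:
        return False          # dropped below the lower threshold: slow down
    return currently_loud     # inside the dead band: hold the current state

The two-degree dead band is what keeps the fan from hunting back and forth while the temperature hovers near the threshold.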

Three-Phase
Aug 5, 2006

by zen death robot
So what's interesting is I did a little more testing. Highest temperature was 72C. But there were a few performance limits that did assert themselves during testing. HWInfo lists these limiting parameters:

Power
Thermal
Reliability Voltage
Max Operating Voltage
Performance Limit - Utilization
Performance Limit - SLI GPUBoost Sync

Three of those did "assert" themselves during testing of the card. Not sure if that's normal or not. Nothing funny happened; the game ran just fine.

D3D usage peaked at 100%
Fan usage peaked at 68% (average 50%)
Total power peaked at 104% (average 64%)

trash person
Apr 5, 2006

Baby Executive is pleased with your performance!
Hello friends, I have a dumb question.

My gaming PC is using an 8GB RX480. I game on my monitor at 1080p.

I'm getting a new 4K TV this week, and rearranged my living room in a way that by chance allows me to easily run an HDMI cable from my computer to my TV.

I know the RX480 is not 'for' 4K gaming so I'm not going to try to get it to do that, but would I be able to leave my PC/games at 1080p settings and plug it into my new 4K and have it not look like poo poo/blow everything up?

ItBurns
Jul 24, 2007

dissss posted:

Sandy Bridge was the big jump - a 2600 scores 1921 single-threaded on Passmark.

The 920 is a good place to be coming from. I still can't justify moving on from Haswell, even with the extra cores modern CPUs have.

I'm in the same boat. By the time Intel has 8 cores in the mainstream, RAM will be $25/GB and GPUs will have eclipsed whatever gains the extra 2 cores would have brought by 50%. Arguably that day is today, relative to a year ago (albeit with AMD instead of Intel), but the sentiment is the same.

craig588
Nov 19, 2005

by Nyc_Tattoo

Three-Phase posted:

So what's interesting is I did a little more testing. Highest temperature was 72C. But there were a few performance limits that did assert themselves during testing. HWInfo lists these limiting parameters:

Power
Thermal
Reliability Voltage
Max Operating Voltage
Performance Limit - Utilization
Performance Limit - SLI GPUBoost Sync

Three of those did "assert" themselves during testing of the card. Not sure if that's normal or not. Nothing funny happened; the game ran just fine.

D3D usage peaked at 100%
Fan usage peaked at 68% (average 50%)
Total power peaked at 104% (average 64%)

The power percentages are also arbitrary and different for every card, but if you have the power supply and case airflow for it you can max that out no problem; the limit of what it'll allow you to set is predefined by the manufacturer, so it's safe to max out. (And just to get even more confusing, all the reported values are lies fed from the drivers to look good, but you can still benefit from maxing out the power limit.)

trash person posted:

Hello friends, I have a dumb question.

My gaming PC is using an 8GB RX480. I game on my monitor at 1080p.

I'm getting a new 4K TV this week, and rearranged my living room in a way that by chance allows me to easily run an HDMI cable from my computer to my TV.

I know the RX480 is not 'for' 4K gaming so I'm not going to try to get it to do that, but would I be able to leave my PC/games at 1080p settings and plug it into my new 4K and have it not look like poo poo/blow everything up?

It depends on your TV's scaler. Integer scaling should look great, but it might try to interpolate anyway because that's the cheaper and lazier way to implement a scaler.
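Concretely, 3840x2160 is exactly 2x 1920x1080 on both axes, so an integer scaler just maps each source pixel to a clean 2x2 block with no blending. A quick sketch with numpy standing in for the TV's scaler:

import numpy as np

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)     # placeholder frame
frame_4k = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)  # 2x2 pixel doubling
assert frame_4k.shape[:2] == (2160, 3840)

A bilinear/bicubic scaler blends neighbouring pixels instead, which is what makes 1080p look soft on a lazy 4K TV.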

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

trash person posted:

Hello friends, I have a dumb question.

My gaming PC is using an 8GB RX480. I game on my monitor at 1080p.

I'm getting a new 4K TV this week, and rearranged my living room in a way that by chance allows me to easily run an HDMI cable from my computer to my TV.

I know the RX480 is not 'for' 4K gaming so I'm not going to try to get it to do that, but would I be able to leave my PC/games at 1080p settings and plug it into my new 4K and have it not look like poo poo/blow everything up?

I have a Sony X900E and 1080p scales well enough for me and looks pretty good. Can't say for certain it's the case for every TV.

redeyes
Sep 14, 2002

by Fluffdaddy
I just decided to play some Doom again. Have an RX 480 Strix model. Getting 50-60 FPS at 4K in Vulkan now. That is slightly amazing to me.

trash person
Apr 5, 2006

Baby Executive is pleased with your performance!
The reviews on the TV I bought say the scaling is pretty good. I'm excited, thanks for the info, guys.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

redeyes posted:

I just decided to play some Doom again. Have an RX 480 Strix model. Getting 50-60 FPS at 4K in Vulkan now. That is slightly amazing to me.

Doom is crazy fast in general and also uses all the vendor-specific tricks on AMD cards.

redeyes
Sep 14, 2002

by Fluffdaddy

Paul MaudDib posted:

Doom is crazy fast in general and also uses all the vendor-specific tricks on AMD cards.

I'd say it's as fast as they can possibly go. AMD cards, that is.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

redeyes posted:

I'd say it's as fast as they can possibly go. AMD cards, that is.

Wolfenstein II: TNC is on the same engine, throws even more stuff into async compute, and uses FP16 on Vega, so if anything it's another notch faster on AMD cards. Doom is faster than a 1080 on a Vega 64, but Wolf2:TNC is actually competitive with a 1080 Ti on Vega 64.

idTech6 does all sorts of crazy stuff with megatexturing. Everything is a texture, and in fact one giant texture. I'm not sure if this is a great idea or a terrible idea - but either way this is one thing that didn't make the cut in Wolf2:TNC.

Dunno if anyone else is into engine teardowns, but I just dug up this one for Doom 2016. Another site I am really fond of is Fabien Sanglard's teardowns of classic game engines, e.g. Duke Nukem 3D/Build Engine. Oddly, his navigation links are at the top of the page, so don't get confused and think it's only one page :v:

Paul MaudDib fucked around with this message at 03:16 on Nov 25, 2017

GRINDCORE MEGGIDO
Feb 28, 1985


Paul MaudDib posted:

Wolfenstein II: TNC is on the same engine, throws even more stuff into async compute, and uses FP16 on Vega, so if anything it's another notch faster on AMD cards. Doom is faster than a 1080 on a Vega 64, but Wolf2:TNC is actually competitive with a 1080 Ti on Vega 64.

idTech6 does all sorts of crazy stuff with megatexturing. Everything is a texture, and in fact one giant texture. I'm not sure if this is a great idea or a terrible idea - but either way this is one thing that didn't make the cut in Wolf2:TNC.

Dunno if anyone else is into engine teardowns, but I just dug up this one for Doom 2016. Another site I am really fond of is Fabien Sanglard's teardowns of classic game engines, e.g. Duke Nukem 3D. Oddly, his navigation links are at the top of the page, so don't get confused and think it's only one page :v:

It's clever as hell how Doom applies motion blur in a really simple, realistic way. I like engine teardowns, but I don't understand much of it at all. :allears: Carmack.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Quake is another interesting/relevant one, since it survives largely intact today in the form of GoldSrc and the Source engine :v: The netcode in particular is virtually exactly the same as what TF2 and CS:S run, and really the whole "game is a sequence of verb commands that get executed by a script" format is the same too. I can't really say about CS:GO, but I wouldn't be surprised if Source 2 was the same; Quake was modular as hell, and while most of the modules have changed, the overall structure is pretty drat similar to what we had 20 years ago.

And of course who could forget Quake3 :v:

(again, there are multiple pages on both those links, see the navbar at the top)

Paul MaudDib fucked around with this message at 03:38 on Nov 25, 2017

repiv
Aug 13, 2009

GRINDCORE MEGGIDO posted:

It's clever as hell how Doom applies motion blur in a really simple, realistic way. I like engine teardowns, but I don't understand much of it at all. :allears: Carmack.

Carmack had almost nothing to do with IdTech6; he moved to Oculus a long time ago :shittydog:

His original idea for IT6 was nuts: he wanted to double down on the Megatexture concept by extending it to geometry, using a sparse voxel representation and raytracing everything. The guys who actually ended up building IT6 went in the opposite direction, though, reducing the scope of Megatexture to the point that it's more of an implementation detail than the paradigm shift Carmack thought it would be.

Three-Phase
Aug 5, 2006

by zen death robot

craig588 posted:

The power percentages are also arbitrary and different for every card, but if you have the power supply and case airflow for it you can max that out no problem; the limit of what it'll allow you to set is predefined by the manufacturer, so it's safe to max out. (And just to get even more confusing, all the reported values are lies fed from the drivers to look good, but you can still benefit from maxing out the power limit.)

Huh. So the card hitting those limits is OK, then?

Anime Schoolgirl
Nov 28, 2002

idtech6 is really cryengine 4 or something because that's who joined up at ID after carmack left

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Three-Phase posted:

Huh. So the card hitting those limits is OK, then?

These days the cards are hyper-smart, almost too smart. Unless you flash a replacement VBIOS, nothing you do to the card can possibly hurt it, and the intelligence of the power-management system is almost a barrier.

Basically, at these nodes, the biggest danger is now "pushing too much power/voltage through the card" (electromigration). So now the card will basically watch its temperature, how much power it's pulling, etc. in literal realtime (faster than you can poll the card, in fact) and decide the maximum boost clock it's willing to go to (beyond what the nominal rated boost clock is, in fact).

Pascal is actually very different from Kepler or GCN 1.0/1.1 cards in terms of overclocking. The biggest limit is now thermals: the card will actually begin locking out boost bins at about 60C, starting from 2100+ MHz. By 70C you will have a noticeable reduction in performance (more or less down to nominal boost clocks); by 80C you are down to base clocks. On top of that, there is the power limit. This is essentially the "max sprint" that can be pushed through the card to handle "tough frames". Increasing this is the #2 thing you can do to increase performance... behind keeping the chip cool. A voltage increase is actually usually a negative now, since it makes you draw more power and runs the temperature up. The card will mostly push its max core clock by itself; overclocking the core rarely yields anything anymore.

So in terms of maximizing performance on a Pascal card: step 1 is maxing out your fan speed. Step 2 is maxing out your power limit and making sure your temps are still under control. Then you can play with core overclock and undervolting... but usually that's not worth it. If you really want to max out the clock and you're already at a 1080 or 1080 Ti (i.e. the point where thermals may be a problem), the best option is to go to a liquid-cooling kit - EVGA makes one, and NZXT makes a kit that allows you to clamp any cooler onto a card. I sit at 47C under a heavy mining load on my 1080.

Paul MaudDib fucked around with this message at 03:56 on Nov 25, 2017
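If you want to watch those numbers while you tune, here's a minimal monitoring loop using NVIDIA's NVML Python bindings (assumes the pynvml package is installed; Afterburner or GPU-Z show the same data):

import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

for _ in range(10):  # sample once a second for ~10 seconds
    temp  = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
    clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
    power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000           # watts
    limit = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000   # watts
    fan   = pynvml.nvmlDeviceGetFanSpeed(gpu)                    # percent
    print(f"{temp} C  {clock} MHz  {power:.0f}/{limit:.0f} W  fan {fan}%")
    time.sleep(1)

pynvml.nvmlShutdown()

If the clock steps down as the temperature crosses ~60C while power stays under the limit, that's the boost-bin lockout described above.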

CaptainSarcastic
Jul 6, 2013



I can say that my personal experience is that both DOOM and Wolfenstein TNC play way better on my aging desktop than I'd expect. I'm running a GTX 1060 6GB card and easily maintaining 60 FPS at 1920x1080 at High or better settings, even on my 6-year-old CPU. I built this machine with longevity in mind, but it's been holding up way better than I hoped for.

Maxwell Adams
Oct 21, 2000

T E E F S

Ihmemies posted:

I'm never going to pay extra for "better" coolers from now on. This is like the 4th time I've had to swap the cooler, so I might as well buy those bottom-barrel stock fan design models.

If you plan on getting an aftermarket cooler, it's not a bad idea to get an MSI Gaming X card. The VRM heatsink is a separate piece, and you can just leave it on after you take off the main heatsink.

Three-Phase
Aug 5, 2006

by zen death robot
Thanks for the explanation on this, Paul. I am not used to electrical/electronic equipment that is this forgiving.

Sniep
Mar 28, 2004

All I needed was that fatty blunt...



King of Breakfast
Anyone have experience with cooling in a Phanteks Enthoo Evolv ITX case? (Mini-ITX)

Where it's at now: the GPU is face-down on top of the PSU cage and, despite perforations and positive air pressure from 2x 140mm fans on the front of the case, is still hitting its thermal limit and throttling down to about 75% power to stay pegged at 81°C.

I have a new card on the way with a "hybrid" closed-loop water cooler to try as a replacement, but I'm not sure where in the case to even put it. Also, it's the Ti model, so it's going to have more heat to deal with on top of the better built-in cooling.

Current problematic setup:



If I wanted to have it exhaust out the back of the case, I'd need to replace the CPU air cooler, even though temps on the CPU (i7-8700 non-K) aren't a problem. Do I absolutely have to water-cool the CPU too, or can the GPU radiator live at the top of the case, out of the way of the current CPU air cooler HSF?

What's the correct way to approach this, since the case design seems pretty flawed to me?

Sniep fucked around with this message at 21:41 on Nov 25, 2017

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Raise the throttle point. Also try removing the front of the case. The Evolv cases have terrible airflow.

Sniep
Mar 28, 2004

All I needed was that fatty blunt...



King of Breakfast

Don Lapre posted:

Raise the throttle point. Also try removing the front of the case. The Evolv cases have terrible airflow.

OK, so here's a big question then: what's a safe temp to run at?

Right now it's running pegged at 81°C, and various people say everything from "eh, that's fine, it will just shorten the life of the card" to "idk, mine runs in the 60s to 70s".

I'd like to be in the latter camp?

Running with the case taken apart is not an option.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Use an overclocking utility and crank the temp and power targets as high as they can go. You cannot damage the card.

You can also limit your fps to your monitor's refresh rate and it will lower heat output (rough sketch of why below).

Your case basically has zero airflow with the front cover on. It's so bad you can buy precut front panels now.

Don Lapre fucked around with this message at 22:08 on Nov 25, 2017
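The fps-cap tip works because the GPU spends the leftover slice of each frame idle instead of immediately starting the next one. A toy illustration in Python (real games use RTSS, a driver-level cap, or vsync, not a sleep loop):

import time

TARGET_FPS   = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    time.sleep(0.005)          # stand-in for ~5 ms of real render work

for _ in range(600):           # ~10 seconds of capped "frames"
    start = time.monotonic()
    render_frame()
    leftover = FRAME_BUDGET - (time.monotonic() - start)
    if leftover > 0:
        time.sleep(leftover)   # idle time = less power drawn = less heat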

Ihmemies
Oct 6, 2012

Maxwell Adams posted:

If you plan on getting an aftermarket cooler, it's not a bad idea to get an MSI Gaming X card. The VRM heatsink is a separate piece, and you can just leave it on after you take off the main heatsink.

Actually, I looked at the Asus Strix 970 disassembly and that card too has a separate VRM heatsink. Basically I unscrew 4 screws around the GPU and the whole heatsink comes off. Then I swap the thermal goop and slap on the new cooler. I could glue on some memory heatsinks too, but I don't think they'll be of any use.

craig588
Nov 19, 2005

by Nyc_Tattoo
As someone with an Asus Strix 980, I don't think you need to change the heatsink. It has good fans and a lot of material; even modded to draw 250 watts, it wasn't getting overwhelmed enough to turn the fans up to any significant level. I don't remember exactly how fast they got, as I'm on a 1080 now, but I was impressed enough with the 980's cooler for the life of the card that I never changed it, and I've changed coolers on virtually every other card I've owned. If it's overheating, it might be clogged with dust.

Reviewing other posts, I guess you got damaged fans. They were the part that impressed me the most; lots of manufacturers get the "huge mass of metal" bit right on heatsinks and then put on garbage noisy or low-flow fans.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Don Lapre posted:

Your case basically has zero airflow with the front cover on. It's so bad you can buy precut front panels now.

You can test this by getting out a candle or some incense or something and seeing how much smoke is getting pulled around.

It's so :stonklol: that this is a problem nowadays... YOU HAD ONE JOB, case makers! :argh:
