BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Endymion FRS MK1 posted:

I'll test without, I've noticed people not doing anything. I'd prefer not having to take it apart again, I'm always paranoid about detaching/reattaching coolers :/

Well yeah, especially with it being bare die.


Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Christ I hate computers. Got everything together, turned it on, temps seemed high. Booted up Outer Worlds, it turned into a slideshow, then crashed. Here I am checking connections when I realize I never reset my case fan speeds. The connector the pump was on was set to stay off until the CPU hit a certain temp. So I was literally running with zero pump.

Reset all of my fan curves and now I'm idling at 26C and running Unigine Heaven at ~44C

Edit: Changed the fan to something quieter, and after a mild +100/500 OC I'm still only at 50-55C in Heaven. Buying this was easily the best decision I ever made, drat

Also pic for reference:

Endymion FRS MK1 fucked around with this message at 08:18 on Oct 31, 2019

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Looks good to me man nice

ItBurns
Jul 24, 2007

Endymion FRS MK1 posted:


Also pic for reference:


You need a tiny Enterprise model to put in front of it. That does look really good though.

Mozi
Apr 4, 2004

Forms change so fast
Time is moving past
Memory is smoke
Gonna get wider when I die
Nap Ghost
So I have a 2070 Super reference card and the most recent beta for MSI Afterburner (4.6.2 beta 4) says "Added voltage control for reference design NVIDIA GeForce RTX 20x0 SUPER series graphic cards", but I'm not able to actually enable the core voltage slider regardless of my settings. Anybody have any thoughts?

Cavauro
Jan 9, 2008

Have you checked 'unlock voltage control' and monitoring in the settings menu? It should be under a gear icon. Unless my version of Afterburner is outdated. I guess you already said that you did this and I should have read the post properly. I don't know, friend

Mozi
Apr 4, 2004

Forms change so fast
Time is moving past
Memory is smoke
Gonna get wider when I die
Nap Ghost
Yeah, I did make sure to enable those. Ah well, not a big deal. It’ll probably start working at some point in the future.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

ItBurns posted:

You need a tiny Enterprise model to put in front of it. That does look really good though.

Lol, one of my systems is in a cube and appropriately named as such



Now the Enterprise idea is perfect, probs gonna steal that

SwissArmyDruid
Feb 14, 2014

by sebmojo

Endymion FRS MK1 posted:

Christ I hate computers. Got everything together, turned it on, temps seemed high. Booted up Outer Worlds, it turned into a slideshow, then crashed. Here I am checking connections when I realize I never reset my case fan speeds. The connector the pump was on was set to stay off until the CPU hit a certain temp. So I was literally running with zero pump.

Reset all of my fan curves and now I'm idling at 26C and running Unigine Heaven at ~44C

Edit: Changed the fan to something quieter, and after a mild +100/500 OC I'm still only at 50-55C in Heaven. Buying this was easily the best decision I ever made, drat

Also pic for reference:


The duality of man in one post:

"Christ, this was stupid, why the hell did I do this"

"THIS WAS THE BEST IDEA EVER"

Sininu
Jan 8, 2014

Endymion FRS MK1 posted:

Christ I hate computers. Got everything together, turned it on, temps seemed high. Booted up Outer Worlds, it turned into a slideshow, then crashed. Here I am checking connections when I realize I never reset my case fan speeds. The connector the pump was on was set to stay off until the CPU hit a certain temp. So I was literally running with zero pump.

Reset all of my fan curves and now I'm idling at 26C and running Unigine Heaven at ~44C

Edit: Changed the fan to something quieter, and after a mild +100/500 OC I'm still only at 50-55C in Heaven. Buying this was easily the best decision I ever made, drat

Also pic for reference:


After seeing this I really wanna do the same thing. How audible would you rate this GPU cooler to be under load?

I've got a Gigabyte 2070S Gaming OC 3X and a Meshify C case. The cooler on that card may be one of the quietest out there according to reviews, but it's still very audible and annoying to me under load, so I want something better, similar to the noise levels my CPU cooling makes.
I think the cooler would fit fine after eyeballing things, but does my GPU have a standard PCB or some weird one?

Arivia
Mar 17, 2011

ItBurns posted:

You need a tiny Enterprise model to put in front of it. That does look really good though.

I just want you to know that reference is spot on and I appreciate it.

LRADIKAL
Jun 10, 2001

Fun Shoe

Sininu posted:

After seeing this I really wanna do the same thing. How audible would you rate this GPU cooler to be under load?

I've got a Gigabyte 2070S Gaming OC 3X and a Meshify C case. The cooler on that card may be one of the quietest out there according to reviews, but it's still very audible and annoying to me under load, so I want something better, similar to the noise levels my CPU cooling makes.
I think the cooler would fit fine after eyeballing things, but does my GPU have a standard PCB or some weird one?

I have a very similar setup. It is very quiet. I'm not sure I ever hear my 1070 spin up.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

Sininu posted:

After seeing this I really wanna do the same thing. How audible would you rate this GPU cooler to be under load?

I've got a Gigabyte 2070S Gaming OC 3X and a Meshify C case. The cooler on that card may be one of the quietest out there according to reviews, but it's still very audible and annoying to me under load, so I want something better, similar to the noise levels my CPU cooling makes.
I think the cooler would fit fine after eyeballing things, but does my GPU have a standard PCB or some weird one?

It's definitely quiet. I set the fans to run at a little over half speed and I just hear the soft hum of them and my case fans. Your card has a good cooler though, I had it for a few days before I returned it due to coil whine. Either way your temps will drop. I could probably slow the fan down to half speed (like 800 or 900 rpm) and sacrifice a tiny bit of cooling for near silence

eames
May 9, 2009

FWIW, Nvidia's latest driver will now automatically cap the framerate below the monitor's max refresh rate if you enable G-Sync, the low latency mode (I only tested it on "Ultra"), and VSync in the driver control panel. In my case that caps the framerate at 157 FPS on a 165 Hz screen.

People are still trying to figure out what exactly it does and how the different modes affect latency and frametimes but so far it looks like a decent solution for people who just want to set the right settings once for every game.

https://www.blurbusters.com/nvidia-releases-new-geforce-drivers-adds-nvidia-ultra-low-latency-aka-n-u-l-l/


I personally can't tell a difference between the "old" and "new" method but maybe it's more obvious with slower screens.
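For what it's worth, the 157 FPS cap on a 165 Hz screen lines up with the formula the Blur Busters crowd has floated for the driver's automatic cap: refresh minus refresh squared over 3600, rounded down. A quick sketch — the formula is community-inferred from observed caps, not anything NVIDIA documents:

```python
def auto_fps_cap(refresh_hz):
    """Community-inferred formula for NVIDIA's automatic G-Sync
    framerate cap: refresh - refresh^2 / 3600, rounded down."""
    return int(refresh_hz - refresh_hz * refresh_hz / 3600)

# spot-check against commonly reported caps
for hz in (120, 144, 165, 240):
    print(hz, "->", auto_fps_cap(hz))
```

Plugging in 165 gives 157, matching the cap reported above; 144 Hz panels land at 138, which is also the cap people commonly see.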

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

eames posted:

FWIW, Nvidia's latest driver will now automatically cap the framerate below the monitor's max refresh rate if you enable G-Sync, the low latency mode (I only tested it on "Ultra"), and VSync in the driver control panel. In my case that caps the framerate at 157 FPS on a 165 Hz screen.

People are still trying to figure out what exactly it does and how the different modes affect latency and frametimes but so far it looks like a decent solution for people who just want to set the right settings once for every game.

https://www.blurbusters.com/nvidia-releases-new-geforce-drivers-adds-nvidia-ultra-low-latency-aka-n-u-l-l/


I personally can't tell a difference between the "old" and "new" method but maybe it's more obvious with slower screens.

Out of curiosity what's the game in that sample image?

Stickman
Feb 1, 2004

Paul MaudDib posted:

Out of curiosity what's the game in that sample image?

Looks like Remnant: From the Ashes.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
It's an extremely dull & derivative souls clone

Burno
Aug 6, 2012

I set up the ultra low latency mode, G-Sync, and VSync last night and it was really nice. Everything just seemed to work, and it was running between 115 and 119 fps on my 120 Hz display. Much less involved than setting up RTSS and investigating what vsync/framerate caps each game has.

I am not going to bother actually measuring latency, but it seems to work as well as RTSS did.

MikeC
Jul 19, 2004
BITCH ASS NARC
If anyone bought the XFX THICC II:

https://www.techpowerup.com/260696/xfx-revises-rx-5700-xt-thicc-ii-cooler-offers-replacements-to-current-owners

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
What's the easiest method to tie H55 fan speed to GPU temp? When gaming I can't hear it and it gives great temps, but when browsing the web I'd like it to run a little slower. I'll mess around with tying it to CPU temp in the BIOS, so it'll go to "gaming" speeds when my CPU goes over 50C or so, but if I can get a simple GPU fan curve I'd be happier

eames
May 9, 2009

Endymion FRS MK1 posted:

What's the easiest method to tie H55 fan speed to GPU temp? When gaming I can't hear it and it gives great temps, but when browsing the web I'd like it to run a little slower. I'll mess around with tying it to CPU temp in the BIOS, so it'll go to "gaming" speeds when my CPU goes over 50C or so, but if I can get a simple GPU fan curve I'd be happier

If your motherboard supports an extra temperature sensor, you can connect one, attach it to the cooler's cold plate, and set a fan curve tied to it in the BIOS.
That's how I did it with my aftermarket air cooler. The indicated temp is always 2C below the core GPU temp. I like this approach because it is OS-agnostic and more reliable than software solutions.

e: since you use an AIO it would probably make more sense to attach the sensor to a part that reflects coolant temperature, like a part of the radiator.
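The fan curve being described here is just piecewise-linear interpolation between (temperature, duty) points, which is what BIOS fan-curve editors do under the hood. A rough sketch for anyone who wants to prototype a curve before committing it to BIOS — the curve points are made up for illustration:

```python
def fan_duty(temp_c, points):
    """Linearly interpolate fan duty (%) between (temp, duty) points,
    clamping below the first point and above the last."""
    points = sorted(points)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

# example curve: near-silent until 40C, full blast by 75C
curve = [(30, 20), (40, 30), (60, 60), (75, 100)]
print(fan_duty(50, curve))  # halfway between the 40C and 60C points
```

Remember that if the sensor is reading coolant rather than die temperature, the whole curve shifts: coolant warms slowly, so the breakpoints sit much lower than the GPU core temps you are used to.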

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

eames posted:

If your motherboard supports an extra temperature sensor, you can connect one, attach it to the cooler's cold plate, and set a fan curve tied to it in the BIOS.
That's how I did it with my aftermarket air cooler. The indicated temp is always 2C below the core GPU temp. I like this approach because it is OS-agnostic and more reliable than software solutions.

e: since you use an AIO it would probably make more sense to attach the sensor to a part that reflects coolant temperature, like a part of the radiator.

My board is an MSI Z370 Pro Carbon, how would I find out if it supports that?

eames
May 9, 2009

Endymion FRS MK1 posted:

My board is an MSI Z370 Pro Carbon, how would I find out if it supports that?

Check the user manual for something like “aux temp”

iospace
Jan 19, 2038


TheFluff posted:

But there is an opt-out - if you use GFE. Without GFE there is no opt-out because there is no data collection in the first place. The claim that the driver itself has a telemetry service was outdated.

If you can show that there is tracking with no opt-out there are some EU regulatory agencies that would be very interested in hearing from you.

SwissArmyDruid, looking for any reason, no matter how flimsy, to poo poo on anything not AMD? Why I never.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

eames posted:

Check the user manual for something like “aux temp”

Will do, thanks

eames
May 9, 2009

Endymion FRS MK1 posted:

Will do, thanks

I downloaded the manual and from what I can tell your board doesn’t support extra thermal sensors. :(

Sacred Cow
Aug 13, 2007

Endymion FRS MK1 posted:

What's the easiest method to tie H55 fan speed to GPU temp? When gaming I can't hear it and it gives great temps, but when browsing the web I'd like it to run a little slower. I'll mess around with tying it to CPU temp in the BIOS, so it'll go to "gaming" speeds when my CPU goes over 50C or so, but if I can get a simple GPU fan curve I'd be happier

I had the same problem with a G12/H55, so I bought 2 Noctua fans, mounted one on each side of the radiator, and manually set them to run at their max inaudible speed all the time. Keeps my GPU around 65C max under heavy load.

Stickman
Feb 1, 2004

Another way to do it would be to get an adapter to plug the fan directly into the graphics card and then control it using your card's fan curve. You'd need a PWM fan, though, and I believe the H55's fan is 3-pin voltage-controlled. It'll also be tied to the direct GPU temperature rather than liquid temperature, but if installing a liquid temp sensor doesn't work out, that's probably what you'd be doing anyway!

E: There's also Speedfan and Argus Monitor, which will let you control fans based on GPU temperature. Speedfan is free, but it hasn't been updated in a while, so it may or may not support your motherboard.

EE: Since the fan is 3-pin voltage-controlled, you'd need to set the motherboard header to "voltage controlled" or "DC controlled" instead of "PWM" if you want to control its speed.

Stickman fucked around with this message at 23:20 on Nov 1, 2019

eames
May 9, 2009

There's also the option of using something like SpeedFan to control the fan speed off GPU temperature in software; just be aware that there can be edge cases with catastrophic results (i.e. fan control software crashes during an unattended sustained load, the fan/pump stops, the cooling system heats up and melts plastic, leading to water leaks, etc.). This is unlikely, but it has happened before.

e: this is another option, though expensive for just this purpose.


https://shop.aquacomputer.de/product_info.php?language=en&products_id=3773

eames fucked around with this message at 23:27 on Nov 1, 2019
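The failure-mode warning above is the important part of any software approach: if the control process dies or its sensor stops answering, the fans must not stay parked at idle. A sketch of the fail-safe pattern — the sensor and the duty setter here are stand-ins, since real code would talk to nvidia-smi, the motherboard's Super I/O chip, or a fan-hub controller:

```python
FAILSAFE_DUTY = 100  # percent: pin fans to full if anything goes wrong

def duty_for(temp_c):
    # crude two-point curve: 30% below 40C, ramping to 100% at 75C
    if temp_c <= 40:
        return 30
    if temp_c >= 75:
        return 100
    return 30 + 70 * (temp_c - 40) / (75 - 40)

def control_loop(read_temp, set_duty, cycles):
    """Poll the sensor and set fan duty; on ANY error, fail safe
    to full speed instead of leaving the last (possibly low) duty."""
    try:
        for _ in range(cycles):
            set_duty(duty_for(read_temp()))
    except Exception:
        set_duty(FAILSAFE_DUTY)

# simulate a sensor that drops out mid-run
readings = [35, 60]
def flaky_sensor():
    if not readings:
        raise RuntimeError("sensor dropped out")
    return readings.pop(0)

duties = []
control_loop(flaky_sensor, duties.append, cycles=5)
print(duties)
```

Note the fail-safe only covers errors the process can catch; a hard crash or OS hang leaves the fans wherever they were, which is exactly why the BIOS-level sensor approach is more robust.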

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
I've tried Speedfan before; it sadly doesn't support my motherboard. As for that adapter, it wouldn't work with a DC fan, right? I'm used to adjusting curves based on voltage, all of my case fans work that way anyway. Maybe I have a PWM fan lying around, but I doubt it.

Edit: CPU temp might work, playing some MW now and the CPU averages around 50C, so setting a more aggressive curve at 45C or so might work

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Sacred Cow posted:

I had the same problem with a G12/H55 so I bought 2 Noctua fans, mounted one on each side of the radiator and manually set the fans to run at its max inaudible speed all the time. Keeps my GPU around 65c max under heavy load.

This is the best option. The "set the fans to run at a just-inaudible speed all the time" and accept that the resultant load temps might be in the 60s instead of the 50s part, not necessarily the "get 2 Noctua fans" part. Those things are made to hit the 90s without issue, so it's not like they care about 55C vs 65C. All the other options either cost goofy money for what they are, or are awkward kludges.

CPU temp can sorta work, but you'll end up spinning the fans up whenever Chrome decides it wants to be stupid for 5 seconds or whatever unless you stick the trip threshold pretty high.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

DrDork posted:

CPU temp can sorta work, but you'll end up spinning the fans up whenever Chrome decides it wants to be stupid for 5 seconds or whatever unless you stick the trip threshold pretty high.

I did it before when I had everything on zero fan until ~40C with my 1080. I have a decent feel for when my system decides to be stupid outside of games

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Endymion FRS MK1 posted:

I've tried Speedfan before; it sadly doesn't support my motherboard. As for that adapter, it wouldn't work with a DC fan, right? I'm used to adjusting curves based on voltage, all of my case fans work that way anyway. Maybe I have a PWM fan lying around, but I doubt it.

Edit: CPU temp might work, playing some MW now and the CPU averages around 50C, so setting a more aggressive curve at 45C or so might work

I use Argus monitor to control the Noctua fans on my 5700XT, they have an option to set a curve based on GPU temp.

SwissArmyDruid
Feb 14, 2014

by sebmojo

iospace posted:

SwissArmyDruid, looking for any reason, no matter how flimsy, to poo poo on anything not AMD? Why I never.

Don't pussyfoot around it, it's not becoming.

I hate Nvidia.

I merely resent a lot of things; Nvidia I actively hate.

I resent not having a choice when it came to buying computer parts during the worst of the shitcoin boom.
I resent that it was actually cheaper for me to buy a gaming laptop to plug a monitor into and use as a desktop, instead of building my own.
I resent that the best bang-for-buck laptop at the time was the 7700HQ-1060 that I'm on now.
I resent that it was a Dell.
I resent that it has taken AMD over five years to have their Maxwell.
I resent that AMD hasn't done more to make inroads into the notebook space.
I resent that AMD APUs are monolithic chips made on the tech that's just about to get phased out, instead of building on the new poo poo.
I resent that Navi still ain't here yet.
I resent that Intel sat on its rear end for more than a decade, not caring to do anything except 5% YOY off 14nm process improvements with marketing razzle-dazzle.

But Nvidia?

I hate NVidia's current product stack.
I hate how the pricing for NVidia's current product stack is exponential.
I hate Nvidia's current product stack on professional GPUs.
I hate the roadblocks Nvidia puts in my way when I try to pass through GPU into a VM.
I hate how erratic performance from driver release to release has got me checking driver reviews for regressions before I download new drivers.
I hate the words "driver reviews".
I hate the IDEA of "driver reviews".
I hate the telemetry.
I hate the datamining.
I hate features being locked behind accepting to be datamined.
I hate making logins for pointless poo poo.
I hate driver-level features being locked behind a login in GFE.

SwissArmyDruid fucked around with this message at 01:28 on Nov 2, 2019

GRINDCORE MEGGIDO
Feb 28, 1985


GPU Megat[H]read - I actively hate Nvidia.

Sacred Cow
Aug 13, 2007

DrDork posted:

This is the best option. The "set the fans to run at a just-inaudible speed all the time" and accept that the resultant load temps might be in the 60s instead of the 50s part, not necessarily the "get 2 Noctua fans" part. Those things are made to hit the 90s without issue, so it's not like they care about 55C vs 65C. All the other options either cost goofy money for what they are, or are awkward kludges.

The only reason I went with 2 was because my card would naturally stop boosting when it hit 75C. If I used Afterburner to go past that, things would get crashy around 78C with even the slightest overclock on the stock cooler.

The biggest downside is I’ll never be able to put this in any modern case with a glass side panel because it’s ugly as all gently caress.

GRINDCORE MEGGIDO
Feb 28, 1985


All computers are beautiful.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

GRINDCORE MEGGIDO posted:

GPU Megat[H]read - I actively hate Nvidia.

Honestly I have 4 of their cards in my house right now and, gently caress it, same

Cygni
Nov 12, 2005

raring to post

SwissArmyDruid posted:

Don't pussyfoot around it, it's not becoming.

I hate Nvidia.

I merely resent a lot of things; Nvidia I actively hate.

I resent not having a choice when it came to buying computer parts during the worst of the shitcoin boom.
I resent that it was actually cheaper for me to buy a gaming laptop to plug a monitor into and use as a desktop, instead of building my own.
I resent that the best bang-for-buck laptop at the time was the 7700HQ-1060 that I'm on now.
I resent that it was a Dell.
I resent that it has taken AMD over five years to have their Maxwell.
I resent that AMD hasn't done more to make inroads into the notebook space.
I resent that AMD APUs are monolithic chips made on the tech that's just about to get phased out, instead of building on the new poo poo.
I resent that Navi still ain't here yet.
I resent that Intel sat on its rear end for more than a decade, not caring to do anything except 5% YOY off 14nm process improvements with marketing razzle-dazzle.

But Nvidia?

I hate NVidia's current product stack.
I hate how the pricing for NVidia's current product stack is exponential.
I hate Nvidia's current product stack on professional GPUs.
I hate the roadblocks Nvidia puts in my way when I try to pass through GPU into a VM.
I hate how erratic performance from driver release to release has got me checking driver reviews for regressions before I download new drivers.
I hate the words "driver reviews".
I hate the IDEA of "driver reviews".
I hate the telemetry.
I hate the datamining.
I hate features being locked behind accepting to be datamined.
I hate making logins for pointless poo poo.
I hate driver-level features being locked behind a login in GFE.

please dont get mad about toys


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Please don't kill nvidia :(
