|
Endymion FRS MK1 posted:I'll test without, I've noticed people not doing anything. I'd prefer not having to take it apart again, I'm always paranoid about detaching/reattaching coolers :/ Well yeah, especially with it being bare die.
|
# ? Oct 31, 2019 06:01 |
|
|
|
Christ I hate computers. Got everything together, turned it on, temps seemed high, booted up Outer Worlds, and it turned into a slideshow then crashed. Here I am checking connections and I realize I never reset my case fan speeds. The connector the pump was on was set to be off unless the CPU hit a certain temp, so I was literally running zero pump. Reset all of my fan curves and now I'm idling at 26C and running Unigine Heaven at ~44C. Edit: Changed the fan to something quieter and after a mild +100/500 OC I'm still only at 50-55C in Heaven. Buying this was easily the best decision I ever made, drat. Also pic for reference: Endymion FRS MK1 fucked around with this message at 08:18 on Oct 31, 2019
# ? Oct 31, 2019 07:27 |
|
Looks good to me man nice
|
# ? Oct 31, 2019 10:19 |
|
Endymion FRS MK1 posted:
You need a tiny Enterprise model to put in front of it. That does look really good though.
|
# ? Oct 31, 2019 12:44 |
|
So I have a 2070 Super reference card and the most recent beta for MSI Afterburner (4.6.2 beta 4) says "Added voltage control for reference design NVIDIA GeForce RTX 20x0 SUPER series graphic cards", but I'm not able to actually enable the core voltage slider regardless of my settings. Anybody have any thoughts?
|
# ? Oct 31, 2019 12:49 |
|
Have you checked 'unlock voltage control' and monitoring in the settings menu? It should be a gear icon. Unless my version of Afterburner is outdated. I guess you already said that you did this and I should have read the post properly. I don't know, friend.
|
# ? Oct 31, 2019 15:37 |
|
Yeah, I did make sure to enable those. Ah well, not a big deal. It’ll probably start working at some point in the future.
|
# ? Oct 31, 2019 16:15 |
|
ItBurns posted:You need a tiny Enterprise model to put in front of it. That does look really good though. Lol, one of my systems is in a cube and appropriately named as such. Now the Enterprise idea is perfect, probs gonna steal that
|
# ? Oct 31, 2019 17:32 |
|
Endymion FRS MK1 posted:Christ I hate computers. Got everything together, turned it on, temps seemed high, booted up Outer Worlds, and it turned into a slideshow then crashed. Here I am checking connections and I realize I never reset my case fan speeds. The connector the pump was on was set to be off unless the CPU hit a certain temp, so I was literally running zero pump. The duality of man in one post: "Christ, this was stupid, why the hell did I do this" "THIS WAS THE BEST IDEA EVER"
|
# ? Oct 31, 2019 19:10 |
|
Endymion FRS MK1 posted:Christ I hate computers. Got everything together, turned it on, temps seemed high, booted up Outer Worlds, and it turned into a slideshow then crashed. Here I am checking connections and I realize I never reset my case fan speeds. The connector the pump was on was set to be off unless the CPU hit a certain temp, so I was literally running zero pump. After seeing this I really wanna do the same thing. How audible would you rate this GPU cooler to be under load? I've got a Gigabyte 2070S Gaming OC 3X and a Meshify C case. The cooler on that card may be one of the quietest out there according to reviews, but it's still very audible and annoying to me under load, so I want something better, similar to the noise levels my CPU cooling makes. I think the cooler would fit fine after eyeballing things, but does my GPU have a standard PCB or some weird one?
|
# ? Oct 31, 2019 19:33 |
|
ItBurns posted:You need a tiny Enterprise model to put in front of it. That does look really good though. I just want you to know that reference is spot on and I appreciate it.
|
# ? Oct 31, 2019 19:46 |
|
Sininu posted:After seeing this I really wanna do the same thing. How audible would you rate this GPU cooler to be under load? I have a very similar setup. It is very quiet. I'm not sure I ever hear my 1070 spin up.
|
# ? Oct 31, 2019 19:56 |
|
Sininu posted:After seeing this I really wanna do the same thing. How audible would you rate this GPU cooler to be under load? It's definitely quiet. I set the fans to run at a little over half speed and I just hear the soft hum of them and my case fans. Your card has a good cooler though; I had it for a few days before I returned it due to coil whine. Either way your temps will drop. I could probably slow the fan down to half speed (like 800 or 900 rpm) and sacrifice a tiny bit of cooling for near silence.
|
# ? Oct 31, 2019 20:05 |
|
FWIW Nvidia's latest driver will now automatically cap the framerate below the max monitor refresh rate if you enable G-Sync, the low latency mode (I only tested it on "Ultra"), and VSync in the driver control panel. In my case that caps the framerate at 157 FPS on a 165 Hz screen. People are still trying to figure out exactly what it does and how the different modes affect latency and frametimes, but so far it looks like a decent solution for people who just want to set the right settings once for every game. https://www.blurbusters.com/nvidia-releases-new-geforce-drivers-adds-nvidia-ultra-low-latency-aka-n-u-l-l/ I personally can't tell a difference between the "old" and "new" method but maybe it's more obvious with slower screens.
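Since nobody has confirmed the exact formula the driver uses, here's just a guess that happens to reproduce the numbers reported in this thread: it looks like the cap targets a frametime of roughly one refresh interval plus ~0.28 ms (1/3600 s) of headroom. Treat this as a hypothetical sketch, not Nvidia's actual logic:

```python
# Hypothetical reconstruction of the NULL + G-Sync + VSync framerate cap.
# The real formula is unconfirmed; this one just matches the data points above:
# 165 Hz -> 157 FPS, and ~116 FPS on a 120 Hz display.

def null_fps_cap(refresh_hz: float) -> int:
    """Guess at the FPS cap for a given monitor refresh rate."""
    # Target frametime = one refresh interval plus ~0.28 ms of slack
    frametime_s = 1.0 / refresh_hz + 1.0 / 3600.0
    return int(1.0 / frametime_s)  # truncate to whole FPS

print(null_fps_cap(165))  # 157, matching the 165 Hz report above
print(null_fps_cap(120))  # 116, within the 115-119 range reported later for 120 Hz
```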
|
# ? Oct 31, 2019 21:49 |
|
eames posted:FWIW Nvidia's latest driver will now automatically cap the framerate below the max monitor refresh rate if you enable G-Sync, the low latency mode (I only tested it on "Ultra") and enable VSync in the driver control panel. In my case that caps the framerate at 157 FPS on a 165 Hz screen. Out of curiosity what's the game in that sample image?
|
# ? Nov 1, 2019 00:02 |
|
Paul MaudDib posted:Out of curiosity what's the game in that sample image? Looks like Remnant: From the Ashes.
|
# ? Nov 1, 2019 00:53 |
|
It's an extremely dull & derivative souls clone
|
# ? Nov 1, 2019 09:32 |
|
I set up the ultra low latency, G-Sync, and VSync last night and it was really nice. Everything just seemed to work, and it was running between 115 and 119 fps on my 120hz display. Much less involved than setting up RTSS and investigating what vsync/framerate caps each game has. I'm not going to bother actually measuring latency, but it seems to work as well as RTSS did.
|
# ? Nov 1, 2019 20:14 |
|
If anyone bought the XFX Thicc II: https://www.techpowerup.com/260696/xfx-revises-rx-5700-xt-thicc-ii-cooler-offers-replacements-to-current-owners
|
# ? Nov 1, 2019 22:43 |
|
What's the easiest method to tie H55 fan speed to GPU temp? When gaming I can't hear it and it gives great temps, but when browsing the web I'd like it to run a little slower. I'll mess around with tying it to CPU temp in the BIOS, so it'll go to "gaming" speeds when my CPU goes over 50 or so, but if I can get a simple GPU fan curve I'd be happier.
|
# ? Nov 1, 2019 22:55 |
|
Endymion FRS MK1 posted:What's the easiest method to tie H55 fan speed to GPU temp? When gaming I can't hear it and it gives great temps, but when browsing the web I'd like it to run a little slower. I'll mess around with tying it to CPU temp in the BIOS, so it'll go to "gaming" speeds when my CPU goes over 50 or so, but if I can get a simple GPU fan curve I'd be happier. If your motherboard supports an extra temperature sensor you can connect one of those, attach it to the cold plate of the cooler, and set a fan curve tied to that in the BIOS. That's how I did it with my aftermarket air cooler. The indicated temp is always 2C below the GPU core temp. I like this approach because it is OS-agnostic and more reliable than software solutions. e: since you use an AIO it would probably make more sense to attach the sensor to a part that reflects coolant temperature, like a part of the radiator.
|
# ? Nov 1, 2019 22:59 |
|
eames posted:If your motherboard supports extra temperature sensor you can connect one of those, attach it to the cold plate of the cooler and set a fan curve tied to that in the BIOS. My board is an MSI Z370 Pro Carbon, how would I find out if it supports that?
|
# ? Nov 1, 2019 23:04 |
|
Endymion FRS MK1 posted:My board is an MSI Z370 Pro Carbon, how would I find out if it supports that? Check the user manual for something like “aux temp”
|
# ? Nov 1, 2019 23:06 |
|
TheFluff posted:But there is an opt-out - if you use GFE. Without GFE there is no opt-out because there is no data collection in the first place. The claim that the driver itself has a telemetry service was outdated. SwissArmyDruid, looking for any reason, no matter how flimsy, to poo poo on anything not AMD? Why I never.
|
# ? Nov 1, 2019 23:08 |
|
eames posted:Check the user manual for something like “aux temp” Will do, thanks
|
# ? Nov 1, 2019 23:10 |
|
Endymion FRS MK1 posted:Will do, thanks I downloaded the manual and from what I can tell your board doesn’t support extra thermal sensors.
|
# ? Nov 1, 2019 23:12 |
|
Endymion FRS MK1 posted:What's the easiest method to tie H55 fan speed to GPU temp? When gaming I can't hear it and it gives great temps, but when browsing the web I'd like it to run a little slower. I'll mess around with tying it to CPU temp in the BIOS, so it'll go to "gaming" speeds when my CPU goes over 50 or so, but if I can get a simple GPU fan curve I'd be happier. I had the same problem with a G12/H55, so I bought 2 Noctua fans, mounted one on each side of the radiator, and manually set the fans to run at their max inaudible speed all the time. Keeps my GPU around 65C max under heavy load.
|
# ? Nov 1, 2019 23:14 |
|
Another way to do it would be to get an adapter to plug the fan directly into the graphics card and then control it using your card's fan curve. You'd need a PWM fan, though, and I believe the H55's fan is 3-pin voltage-controlled. It'll also be tied to the direct GPU temperature rather than liquid temperature, but if installing a liquid temp sensor doesn't work out that's probably what you'd be doing anyway! E: There's also Speedfan and Argus Monitor, which will let you control fans based on GPU temperatures. Speedfan is free, but it hasn't been updated in a while so it may or may not support your motherboard. EE: Since the fan is 3-pin voltage-controlled, you'd need to set the motherboard header to "voltage controlled" or "DC controlled" instead of "PWM" if you want to control its speed. Stickman fucked around with this message at 23:20 on Nov 1, 2019
# ? Nov 1, 2019 23:17 |
|
There’s also the option of using something like Speedfan to control the fan speed off the GPU temperature in software, just be aware that there can be edge cases with catastrophic results (i.e. fan control software crashing during an unattended sustained load, fan/pump stops, cooling system heats up and melts plastic leading to water leaks, etc). This is unlikely but it has happened before. e: this is another option, though expensive for just this purpose. https://shop.aquacomputer.de/product_info.php?language=en&products_id=3773 eames fucked around with this message at 23:27 on Nov 1, 2019
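For what it's worth, the curve logic these tools apply is basically just linear interpolation between (temp, duty) points. A toy sketch of that math below; the curve points are made-up examples, not a recommendation, and actually reading the GPU temp and driving the fan header is hardware-specific (that part is what Speedfan/Argus Monitor handle for you):

```python
# Toy sketch of a software fan curve: linearly interpolate fan duty (%) between
# (temperature C, duty %) points, clamping below the first and above the last point.
# Example points only; tune for your own hardware and noise tolerance.

CURVE = [(30, 20), (45, 35), (60, 60), (75, 100)]

def duty_for_temp(temp_c: float, curve=CURVE) -> float:
    """Return the fan duty (%) for a given temperature along the curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]  # below the curve: idle duty
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # above the curve: full duty

print(duty_for_temp(52.5))  # 47.5, halfway between the 45C and 60C points
```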
# ? Nov 1, 2019 23:22 |
|
I've tried Speedfan before; it sadly doesn't support my motherboard. As for using that adapter, it wouldn't work with a DC fan, right? I'm used to adjusting curves based on voltage; all of my case fans work that way anyway. Maybe I have a PWM fan laying around, but I doubt it. Edit: CPU temp might work. Playing some MW now and the CPU averages around 50C, so setting a more aggressive curve at 45C or so might work.
|
# ? Nov 2, 2019 00:05 |
|
Sacred Cow posted:I had the same problem with a G12/H55 so I bought 2 Noctua fans, mounted one on each side of the radiator and manually set the fans to run at their max inaudible speed all the time. Keeps my GPU around 65C max under heavy load. This is the best option. The "set the fans to run at a just-inaudible speed all the time" and accept that the resultant load temps might be in the 60s instead of the 50s part, not necessarily the "get 2 Noctua fans" part. Those things are made to hit the 90s without issue, so it's not like they care about 55C vs 65C. All the other options either cost goofy money for what it is, or are an awkward kludge. CPU temp can sorta work, but you'll end up spinning the fans up whenever Chrome decides it wants to be stupid for 5 seconds or whatever, unless you stick the trip threshold pretty high.
|
# ? Nov 2, 2019 00:22 |
|
DrDork posted:CPU temp can sorta work, but you'll end up spinning the fans up whenever Chrome decides it wants to be stupid for 5 seconds or whatever unless you stick the trip threshold pretty high. I did it before when I had everything on zero fan until ~40C with my 1080. I have a decent feel for when my system decides to be stupid outside of games
|
# ? Nov 2, 2019 00:29 |
|
Endymion FRS MK1 posted:I've tried Speedfan before; it sadly doesn't support my motherboard. As for using that adapter, it wouldn't work with a DC fan, right? I'm used to adjusting curves based on voltage; all of my case fans work that way anyway. Maybe I have a PWM fan laying around, but I doubt it. I use Argus Monitor to control the Noctua fans on my 5700XT; it has an option to set a curve based on GPU temp.
|
# ? Nov 2, 2019 01:01 |
|
iospace posted:SwissArmyDruid, looking for any reason, no matter how flimsy, to poo poo on anything not AMD? Why I never. Don't pussyfoot around it, it's not becoming. I hate Nvidia. I merely resent a lot of things; Nvidia I actively hate.

I resent not having a choice when it came to buying computer parts during the worst of the shitcoin boom. I resent that it was actually cheaper for me to buy a gaming laptop to plug a monitor into and use as a desktop, instead of building my own. I resent that the best bang-for-buck laptop at the time was the 7700HQ-1060 that I'm on now. I resent that it was a Dell. I resent that it has taken AMD over five years to have their Maxwell. I resent that AMD hasn't done more to make inroads into the notebook space. I resent that AMD APUs are monolithic chips made on the tech that's just about to get phased out, instead of building on the new poo poo. I resent that Navi still ain't here yet. I resent that Intel sat on its rear end for more than a decade, not caring to do anything except 5% YOY off 14nm process improvements with marketing razzle-dazzle.

But Nvidia? I hate Nvidia's current product stack. I hate how the pricing for Nvidia's current product stack is exponential. I hate Nvidia's current product stack on professional GPUs. I hate the roadblocks Nvidia puts in my way when I try to pass a GPU through into a VM. I hate how erratic performance from driver release to release has got me checking driver reviews for regressions before I download new drivers. I hate the words "driver reviews". I hate the IDEA of "driver reviews". I hate the telemetry. I hate the datamining. I hate features being locked behind agreeing to be datamined. I hate making logins for pointless poo poo. I hate driver-level features being locked behind a login in GFE. SwissArmyDruid fucked around with this message at 01:28 on Nov 2, 2019
# ? Nov 2, 2019 01:22 |
|
GPU Megat[H]read - I actively hate Nvidia.
|
# ? Nov 2, 2019 01:27 |
|
DrDork posted:This is the best option. The "set the fans to run at a just-inaudible speed all the time" and accept that the resultant load temps might be in the 60s instead of the 50s part, not necessarily the "get 2 Noctua fans" part. Those things are made to hit the 90s without issue, so it's not like they care about 55C vs 65C. All the other options either cost goofy money for what it is, or are an awkward kludge. The only reason I went with 2 was because my card would naturally stop boosting when it hit 75C. If I used Afterburner to go past that, things would get crashy around 78C with even the slightest overclock on the stock cooler. The biggest downside is I'll never be able to put this in any modern case with a glass side panel because it's ugly as all gently caress.
|
# ? Nov 2, 2019 01:29 |
|
All computers are beautiful.
|
# ? Nov 2, 2019 01:35 |
|
GRINDCORE MEGGIDO posted:GPU Megat[H]read - I actively hate Nvidia. Honestly I have 4 of their cards in my house right now and, gently caress it, same
|
# ? Nov 2, 2019 01:35 |
|
SwissArmyDruid posted:Don't pussyfoot around it, it's not becoming. please dont get mad about toys
|
# ? Nov 2, 2019 02:22 |
|
|
|
Please don't kill nvidia
|
# ? Nov 2, 2019 03:00 |