|
I moved the radiator fans off the AIO (and liquid-temp control) onto a different fan controller (and CPU-temp control), and I regret it a bit. That's my AIO story.
|
# ? Sep 15, 2019 14:20 |
|
Indiana_Krom posted:Otherwise Corsair makes a fan controller that can do it, but an aquacomputer is the ultimate high end + brute force solution (completely independent programmable computerized multi-channel fan controller).
|
# ? Sep 15, 2019 15:58 |
|
Aquacomputer also makes a D5 pump with their controller electronics built in.
|
# ? Sep 15, 2019 18:42 |
|
It really bugs me that all of these fan control softwares only work with their own hardware. I don't believe at all that Corsair, for example, are doing some proprietary magic that makes it impossible for NZXT to read the water temps or pump speed reported by the H150i cooler and build fan curves based on them. Especially when a program like HWiNFO can read that information.
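For what it's worth, the curve logic these vendors gatekeep is trivial once anything exposes the coolant temp (and HWiNFO shows the data is readable): it's just linear interpolation between breakpoints. A sketch in Python; the breakpoint table here is made up for illustration:

```python
# Hypothetical fan curve: map a coolant temperature (°C) to a fan duty
# cycle (%) by linear interpolation between user-chosen breakpoints.
CURVE = [(25, 20), (30, 30), (35, 55), (40, 100)]  # (temp °C, duty %)

def fan_duty(temp_c, curve=CURVE):
    """Interpolate a duty cycle; clamp below/above the curve's ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

fan_duty(32.5) comes out to 42.5% with this table; the hard part is the vendor-locked sensor access, not the math.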
|
# ? Sep 15, 2019 19:11 |
|
uhhhhahhhhohahhh posted:It really bugs me that all of these fan control softwares only work with their own hardware. I don't believe at all that Corsair, for example, are doing some proprietary magic that makes it impossible for NZXT to read the water temps or pump speed reported by the H150i cooler and build fan curves based on them. Especially when a program like HWiNFO can read that information. I like to imagine the CEOs of all the hardware companies meet in a smokey restaurant and agree to mutual incompatibility and compete to see who can make the worst LED software.
|
# ? Sep 15, 2019 22:56 |
|
monsterzero posted:I like to imagine the CEOs of all the hardware companies meet in a smokey restaurant and agree to mutual incompatibility and compete to see who can make the worst LED software. In my case, a unified way to disable it. I can't turn a drat light off on my MSI GPU without some MSI garbageware installed and running. That's awful. At least motherboards let you disable it in bios. Khorne fucked around with this message at 04:21 on Sep 16, 2019 |
# ? Sep 16, 2019 04:19 |
|
Khorne posted:I'm really hoping someone, whether it's AMD, nvidia, microsoft, or intel - I really don't care, throws their weight around and forces all vendors to be compliant with some software they're offering. Once one company does it the flood gates will open and we'll have a unified, hopefully open source, tool to control all this junk. I can turn off the lights on any device with this one simple trick. It's called "a good pair of diagonal cutters".
|
# ? Sep 16, 2019 04:52 |
|
Khorne posted:In my case, a unified way to disable it. I can't turn a drat light off on my MSI GPU without some MSI garbageware installed and running. That's awful. At least motherboards let you disable it in bios. The way I set it up only seems to work if I directly run the executable with admin privileges, but then I can kill the process manually and it keeps the lights disabled across reboots. It doesn't seem to trigger properly after a cold boot, though, so the red and light blue logo blare at full brightness and I have to re-run the tool if I shut off the PC.
|
# ? Sep 16, 2019 05:00 |
|
Better yet, put a hardware toggle switch on the gear so the rest of us non-edgelords can all turn off these goddamn lights without equally stupid RGB software. Other stuff that's also dumb: non-front-facing I/O ports on cases, and hardwired molex connectors on case fans instead of a separate 3-pin/molex adapter. Finally, I have seen at least two guys firsthand in a local DIY PC store who were more interested in their RGBs than perf/$. Palladium fucked around with this message at 05:22 on Sep 16, 2019 |
# ? Sep 16, 2019 05:17 |
|
I’m not one to seek out the LEDs on PC stuff, but if it has them I don’t mind either. The HSFs the Ryzens come with, with the changing colours, are actually pretty cool.
|
# ? Sep 16, 2019 06:10 |
|
priznat posted:I’m not one to seek out the LEDs on pc stuff but if it has it I don’t mind either. *immediately replaces with a brown Noctua* Edit: Actually, I might have used the stock cooler with the 3600 in my node 202, but it's too tall, so I'm using a Noctua nh-l9a with a 25mm thick af-a9. In reality, I was always going Noctua anyway HalloKitty fucked around with this message at 09:16 on Sep 16, 2019 |
# ? Sep 16, 2019 09:12 |
|
Many GPUs can have the LEDs disabled non-permanently by unplugging the LED power cable from the board; you may get lucky and be able to do that without removing the heatsink.
|
# ? Sep 16, 2019 09:27 |
|
Palladium posted:Better yet, put in a hardware toggle switch on the gear so the rest of us non-edgelords can all turn off these goddamn lights without equally stupid RGB software. That's actually a thing on Asrock's 5700xt. Hopefully other manufacturers follow suit. Also HalloKitty posted:*immediately replaces with a brown Noctua*
|
# ? Sep 16, 2019 10:36 |
|
SwissArmyDruid posted:I can turn off the lights on any device with this one simple trick. I got a case without a window 🤷♂️
|
# ? Sep 16, 2019 15:48 |
|
Munkeymon posted:I got a case without a window 🤷♂️ Fractal Design Define S with a solid side panel, I don’t care what it looks like inside as long as it performs well.
|
# ? Sep 16, 2019 16:19 |
|
Munkeymon posted:I got a case without a window 🤷♂️ Even then, some light usually bleeds out of the back, and if the back's against a white wall it can be quite visible in the dark. Less so under a desk. Really, it's no big deal, but if you want no lights at all you should be able to just turn them off, and a hardware switch would be much better than having to install software (and probably not even that expensive for manufacturers). (my case is chock full of gaudy leds and on top of my desk )
|
# ? Sep 16, 2019 16:43 |
|
HalloKitty posted:*immediately replaces with a brown Noctua* Yah, Noctua rules. But we have a lot of the 3700x systems at work in open cases, and since we don’t care about overclocking, using the default fans is fine and they make the lab much more colourful! It’s funny seeing a tired-out engineer just stop and stare at a shelf with 8 of those HSFs all swirling their colours; it’s kind of hypnotic.
|
# ? Sep 16, 2019 18:40 |
|
mdxi posted:CPU deaths are so rare that -- in general -- diagnosing them is like the old saw in computer science: "No, it's not a compiler bug. It's never a compiler bug." Just swapped the CPU and it booted and installed Windows just fine. Updated the drivers, now have it running memtest. So far so good!
|
# ? Sep 17, 2019 03:07 |
|
Paul MaudDib posted:AMD confirms "not enough public pressure" to force them to port RIS from GCN 4.0 and 6.0 back to 5.0. This seems...dumb as gently caress? They're going to have 3 generations of iGPU using Vega now, and Vega has native FP16 so it'll basically run with no performance impact. Also it'd be kinda dumb not to at least port it to their Carrizo/Bristol/Stoney poo poo, as that's still getting sold as well.
|
# ? Sep 17, 2019 03:15 |
|
EmpyreanFlux posted:This seems...dumb as gently caress? They're going to have 3 generations of iGPU using Vega now, and Vega has native FP16 so it'll basically run with no performance impact. yeah reddit is stuck on "well they're not selling Vega 56/64 anymore" but sharp upscaling seems like a selling point for APUs that can't run native resolutions at super great fps, and all of those are Vega based. Including what has been announced/rumored so far on the 3000 series. also literally flaunting it like "we don't want to. c'mon, try and make us!" is... weird. Part of what you're paying for with a GPU (a large part) is not just hardware, it's ongoing software support. It's not like Vega is super antiquated technology, they literally sold Vega as a flagship up until like 2 months ago. Not that this is anything new for AMD, Fury owners splashed out for a flagship product then got turbofucked too, but c'mon. Is that really an angle AMD wants to push? maybe this is a thing where RTG itself wants to support it but AMD management is telling them no, and someone is doing a "please tell management how wrong they are"? It's loving weird. Paul MaudDib fucked around with this message at 08:24 on Sep 17, 2019 |
# ? Sep 17, 2019 08:18 |
|
Does anyone have a good source to consult for which games run better with SMT disabled? For some reason certain games also seem to benefit from turning "full screen optimization" off. How is that even a thing?
|
# ? Sep 17, 2019 10:10 |
|
Media Bloodbath posted:Does anyone have a good source to consult for which games run better with SMT disabled? GamersNexus and Hardware Unboxed (Techspot) did SMT on/off in their Ryzen 3000 testing. The difference with SMT off is either nothing or such a slight increase that, in general, it doesn't seem worth disabling. Windows' "Full screen optimization" is a misleading name. From what I remember, it's some Microsoft half-fullscreen, half-windowed implementation to allow their DVR and new notifications/overlays to work. Disabling it in Battlefield seems to give me more consistent frames where I'm GPU limited.
|
# ? Sep 17, 2019 11:23 |
|
uhhhhahhhhohahhh posted:GamersNexus and Hardware Unboxed (Techspot) did SMT on/off in their Ryzen 3000 testing. The difference with SMT off is either nothing or such a slight increase that, in general, it doesn't seem worth disabling. Most 8700Ks (9-series might too, not sure) overclock 100-200 MHz higher with HT disabled, so I often wonder if it would make more sense to leave it off, also in light of the recent security bugs. It would be a pretty big hit in synthetic or production workloads like rendering. I leave it on because
|
# ? Sep 17, 2019 13:54 |
|
Palladium posted:Better yet, put in a hardware toggle switch on the gear so the rest of us non-edgelords can all turn off these goddamn lights without equally stupid RGB software. I bought a case without windows Hardware toggle
|
# ? Sep 17, 2019 14:35 |
|
All right, ran memtest on the new machine overnight and had no errors. Installed Ryzen Master and it's reporting a CPU temp between 45-50C. RAM is set to XMP profile 1. Ran Windows Update, installed the relevant drivers for the motherboard; everything seems to be functioning and stable. Is there anything else I should be checking before calling it good and putting in the Windows key? Are there any other tips for Ryzen builds and setup? I remember someone saying something about setting the Ryzen Balanced plan in the power options, but not the specifics. Edit: this is the build in question: https://pcpartpicker.com/list/TT3Zx6 The Rat fucked around with this message at 19:35 on Sep 17, 2019 |
# ? Sep 17, 2019 15:35 |
|
Statutory Ape posted:I bought a case without windows If you can't see the LEDs, do they still boost your framerate though?
|
# ? Sep 17, 2019 16:30 |
|
Palladium posted:Better yet, put in a hardware toggle switch on the gear so the rest of us non-edgelords can all turn off these goddamn lights without equally stupid RGB software. considering how whiny the anti-rgb brigade is, i think you got the edgelord part backwards tbh
|
# ? Sep 17, 2019 16:49 |
|
Paul MaudDib posted:yeah reddit is stuck on "well they're not selling Vega 56/64 anymore" but sharp upscaling seems like a selling point for APUs that can't run native resolutions at super great fps, and all of those are Vega based. Including what has been announced/rumored so far on the 3000 series. I'm just not getting the issue with making RIS available on literally all AMD products, native FP16 or not. It's a goddamn sharpening filter shader, how the gently caress is it not ISA agnostic? Goddamn LumaSharpen works on old VLIW to my knowledge. Just release a complete driver update to get every single AMD GPU on the same page. Yeah, optimize the old loving fixed-pipeline poo poo too; it's really irritating that AMD GPUs are best for vintage computing/gaming but the drivers are an absolutely frustrating mess. I know the older poo poo is mostly a niche case and that's money sunk where it probably doesn't need to be, but it'd probably be a good PR thing to do, and there is still no excuse not to make RIS available for everything, as far as I know anyway.
|
# ? Sep 17, 2019 20:51 |
|
Palladium posted:Finally, I have seen at least two guys firsthand who were more interested in their RGBs than perf/$ in a local DIY PC store. Nothing wrong with this. I’m quite an amateur RGB person figuring all this out, but other than graphics cards I’m not going to buy parts that are incompatible with both Corsair and MSI lighting engines. I HAVE SPENT 20% more for similar performing parts that have RGB when I bought memory at the height of the cartel monopoly. And I avoid anything from NZXT that requires their software because I don’t want to install it. I'm presently trying to find some piece of Razer equipment that doesn’t make me vomit so that I can use their game-themed color profiles. The idea of the stuff in my glassy PC case changing colors when I swap Overwatch heroes (and put in my piddling bronze tier performance) seems pretty cool. I will probably buy Corsair’s CPU cooler, but integration with the Razer thing makes the Thermaltake product compelling, I gotta admit. Thermaltake is pretty much the only brand doing Razer Chroma on their fans and cooler mounts. There’s no reason to shame this stuff. I used to in the old days, but that’s because people were drilling holes into sheet metal to fit acrylic windows and put aftermarket lighting in there to make their PCs look like the Matrix. Since it stopped being an aftermarket modder thing and became part of stock hardware I’ve really gotten into it. I can have a “cool PC” and not have to break warranties. What’s not to like. Paul MaudDib posted:yeah reddit is stuck on "well they're not selling Vega 56/64 anymore" but sharp upscaling seems like a selling point for APUs that can't run native resolutions at super great fps, and all of those are Vega based. Including what has been announced/rumored so far on the 3000 series. I think you maybe just figured it out. Why give APU users any assistance towards playable gaming when AMD would rather they add in a graphics card? 
They cheese off Vega card owners to do it, maybe they’re banking on there not being that many. Craptacular! fucked around with this message at 22:54 on Sep 17, 2019 |
# ? Sep 17, 2019 22:47 |
|
Craptacular! posted:I think you maybe just figured it out. Why give APU users any assistance towards playable gaming when AMD would rather they add in a graphics card? You're thinking too desktop-oriented; where RIS matters is the mobile space. Getting close to 720p performance with roughly 1080p image quality is loving huge, and 540p performance with 720p image quality basically means an R7 3700U will play drat near anything on low. I don't know why AMD is waffling on it, and it'd be so loving dumb not to push this just as Intel is moving to 64 EUs as standard for iGPU (and showing they have fairly similar performance to AMD, ALU for ALU). So what if a Vega 11 in an R5 3400G starts pushing much better frames than a GT 1030 with RIS; it's a goddamn GT 1030, and mobile matters way more.
|
# ? Sep 17, 2019 23:46 |
|
Craptacular! posted:Nothing wrong with this. I’m quite an amateur RGB person figuring all this out, but other than graphics cards I’m not going to buy parts that are incompatible with both Corsair and MSI lighting engines. I HAVE SOENT 20% more for similar performing parts that have RGB when I bought memory at the height of the cartel monopoly. And I avoid anything from NZXT that requires their software because I don’t want to install it. If you’re already using Corsair software you can set per-game profiles in there.
|
# ? Sep 18, 2019 03:15 |
|
EmpyreanFlux posted:You're thinking too desktop oriented, where RIS matters is the mobile space. Getting close to 720p performance while about 1080p image quality is loving huge, and 540p performance to 720p image quality basically means an R7 3700U will play drat near anything on low. Yup, or you could clock down the GPU core and squeeze more battery life out of a given mobile part (eg with FRTC ala Radeon Chill). Performance and power consumption can be traded pretty freely in a mobile design (eg MaxQ) and that translates into longer runtimes (or thinner laptops, sigh). Since this isn't just a normal sharpen but actually a contrast-adaptive sharpen that takes sharpness across the whole image into account, I wonder if it needs some kind of compute shader. So it might not be completely plug-and-play across architectures. But all GCN should have generally similar capabilities, and Vega's should certainly be a superset of Polaris's. There's no technical reason for this, and it should probably literally just compile and run fine. And it's just so weird that AMD is literally like daring people to get mad at them in as many words, it does feel like a dev-team or interdepartmental squabble that got written up in a testy post or something. Manager/department head says they don't have money/staff and nobody will care, team wants to do it anyway, manager says no, team puts out a testy statement about "if you care show But yeah, all of the modern architectures should be able to do this, and everyone else will probably be following suit (including Intel). It's a fantastic idea, one of those ideas that in hindsight is hilarious that nobody thought of trying that before, but it's not going to differentiate AMD for too long. A hardware-agnostic ~30% gain in performance with relatively minimal quality hit, and can be easily injected into legacy titles from the driver stack. That's pretty cool, doesn't happen too often where someone figures out a real way to download more RAM. 
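The compute-shader musing above is about right that CAS is cheap: per pixel you take a 3x3 neighborhood, measure local contrast from its min/max, and scale the sharpening down where contrast is already high so edges don't halo. A toy NumPy illustration of that idea (my own simplified weighting, not AMD's actual shader math):

```python
import numpy as np

def cas_sharpen(img, strength=0.5):
    """Toy contrast-adaptive sharpen for a 2D float image in [0, 1].

    The per-pixel sharpening amount shrinks as local 3x3 contrast grows,
    which is the trick that lets this kind of filter sharpen without
    ringing at edges. Simplified illustration only, not AMD's shader.
    """
    h, w = img.shape
    p = np.pad(img, 1, mode='edge')
    # Stack the nine 3x3-neighborhood shifts: index y*3+x, center is n[4].
    n = np.stack([p[y:y + h, x:x + w] for y in range(3) for x in range(3)])
    lo, hi = n.min(axis=0), n.max(axis=0)
    # Headroom: how far the neighborhood sits from clipping at 0 or 1.
    d = np.minimum(lo, 1.0 - hi)
    # Adaptive amount: large in low-contrast areas, small at hard edges.
    amount = strength * np.sqrt(np.clip(d / np.maximum(hi - lo, 1e-6), 0, 1))
    # Cross-shaped unsharp mask: center vs. its 4-neighborhood (up/left/right/down).
    cross = n[1] + n[3] + n[5] + n[7]
    out = img + amount * (4 * img - cross) / 4
    return np.clip(out, 0.0, 1.0)
```

On a flat region the mask term is zero so nothing changes, and at a hard edge the adaptive amount shrinks, which is exactly the no-halo property being discussed; any GCN part can run math like this, which is the point.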
Paul MaudDib fucked around with this message at 03:35 on Sep 18, 2019 |
# ? Sep 18, 2019 03:20 |
|
Backyarr posted:They're the same other than the CAD Bus resistances, which I haven't played around with and left at 24-24-24-24. Notice the difference in voltages. Maybe that's the important factor for lower voltages? Forgive me for bringing this back up (it's a couple pages back), but I did just get the newest ABBA BIOS and decided to try the 1.6.2 DRAM Calculator settings, and I was able to tighten timings even further with lower voltage. It does seem like the CAD Bus resistances play a major part in that, so the timings and voltage seem to work, even though it might seem crazy.
|
# ? Sep 18, 2019 03:39 |
|
For the first time this week I actually did some work that really required my Threadripper, as opposed to it being a nice thing to have. I had to find 3 broken commits in an 11k-diverged vendor Linux kernel. Let me tell you about the joys of make -j25: every build was under 2 minutes, and as it got closer to the target it was under 30 seconds. Doing the same on my i7-7700k ran me ~8 minutes per full build. The Threadripper 2920X cut 4 hours off this job today alone. Things not pleasing me: loving Yocto Project being the go-to for vendor trash. Goddamnit, you snowflakes, just make your changes to the kernel and u-boot and let someone competent make the actual ARM distro, because I'm going to throw yours away immediately anyway. Just getting their poo poo out of Yocto into a standalone build environment takes up too much time.
|
# ? Sep 18, 2019 07:54 |
|
New build with a 2700. Rock solid for hours of stress testing with Unigine Singularity. Solid for an hour of Prime95. Passes Memtest86. Fuckin hard crashed with a screeching sound while watching Bob's Burgers on Hulu. This is going to be a fun one to track down. I have a weird feeling it's in the XMP profile. It was really unstable at 3200 before a BIOS update on the B450 Tomahawk. The BIOS update made it seem stable at 3200, but I have my doubts.
|
# ? Sep 18, 2019 16:28 |
|
Laranzu posted:New build with a 2700. Rock solid for hours of stress testing with Unigine Singularity. Solid for an hour of Prime95. Passes Memtest86 Are you using an AMD video card with a driver from 2019? Were you using Chrome? AMD's driver crashes my entire computer if I have video acceleration on in Chrome and view the menu in Netflix. This happened on my 2500k and my 3700x; I've even replaced the graphics card (with another AMD card, the 5700 is just a really good value right now). Google turns up other people having this exact same issue, and I'm confused as to why AMD hasn't fixed it in over a year. If you roll back to a really old driver it works perfectly, but you don't want that, and if you actually have a newer card you can't.
|
# ? Sep 18, 2019 16:34 |
|
Edit: I was able to diagnose the issue: Edge Insider was the culprit; the latest Canary build released yesterday no longer crashes the system. The Netflix app standalone doesn't crash, Edge was always involved. That's kinda fun. I have a 3800X and an Nvidia 2070 Super: my system crashes with regularity (about every three episodes) whenever I watch Netflix using the Edge Beta or the Netflix app (Chrome doesn't seem to crash it). I'm getting a 5700 XT as a replacement on Monday, I'll report on what happens with that card. Ran RealBench CPU testing for hours, Memtest86 for ~12 hours, and 3DMark: the hardware seems fine otherwise, must be some kind of weird driver issue. Lambert fucked around with this message at 10:55 on Sep 19, 2019 |
# ? Sep 18, 2019 16:39 |
|
pixaal posted:Are you using an AMD video card with a driver from 2019? Were you using Chrome? AMD's driver crashes my entire computer if I have video acceleration on in Chrome and view the menu in Netflix. This happened on my 2500k and my 3700x; I've even replaced the graphics card (with another AMD card, the 5700 is just a really good value right now). Google turns up other people having this exact same issue, and I'm confused as to why AMD hasn't fixed it in over a year. If you roll back to a really old driver it works perfectly, but you don't want that, and if you actually have a newer card you can't. 1660 Ti, Hulu in Chrome. It streamed the entirety of a movie perfectly fine after the lockup too, so it's not easily reproducible. It might have been the Ryzen Master gaming profile, if there is any instability there. I'll either leave it stock or run 3900 MHz @ 1.3625 V in the future. I need to reinstall Windows anyway because it's littered with all my testing software and weird RGB control software from messing around with all the terrible lights.
|
# ? Sep 18, 2019 16:41 |
|
Lambert posted:That's kinda fun, I have a 3800X and an Nvidia 2070 Super: My system crashes with regularity (about every three episodes) whenever I watch Netflix using the Edge Beta or the Netflix app (Chrome doesn't seem to crash it). I'm getting a 5700 XT as a replacement on Monday, I'll report on what happens with that card. Netflix works fine for me in Edge, so if you start crashing, switch browsers for streaming; it's not like it really matters what browser you use to run a fullscreen video.
|
# ? Sep 18, 2019 16:49 |
|
|
pixaal posted:Netflix works fine for me in Edge, so if you start crashing switch browsers for streaming not like it really matters what browser you are using to run a fullscreen video. While you're right, especially with this pointing to a driver issue that's likely to be fixed in the future, if the 5700 XT turns out to work fine I'll just stick with it over the 2070 Super.
|
# ? Sep 18, 2019 16:51 |