uhhhhahhhhohahhh
Oct 9, 2012
i moved the radiator fans off the aio and liquid temp control onto a different fan controller and cpu temp control and i regret it a bit. thats my aio story

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Indiana_Krom posted:

Otherwise Corsair makes a fan controller that can do it, but an aquacomputer is the ultimate high end + brute force solution (completely independent programmable computerized multi-channel fan controller).
I have the Corsair controller. Mostly because it came in a fancy case that you could just tape anywhere and not be visually offensive. The aquacomputer either ships in a 5.25" drive bay thing, which are becoming increasingly rare in newer cases, or as plain PCB. Aquasuite is way more flexible than iCUE, tho.

Wiseblood
Dec 31, 2000

Aquacomputer also makes a D5 pump with their controller electronics built in.

uhhhhahhhhohahhh
Oct 9, 2012
It really bugs me that all of these fan control programs only work with their own hardware. I don't believe at all that Corsair, for example, is doing some proprietary magic that makes it impossible for NZXT to read the water temps or pump speed reported by the H150i cooler and build fan curves based on them. Especially when a program like HWiNFO can read that information.
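Mechanically there's nothing hard about what these suites do, which is the poster's point. Here's a minimal sketch of the water-temp-to-fan-duty part, assuming the coolant reading comes from something vendor-agnostic like HWiNFO or liquidctl; the curve points and function name are made up for illustration:

```python
# Hypothetical sketch of the kind of fan curve these tools build:
# piecewise-linear interpolation from coolant temperature to fan duty.
from bisect import bisect_right

# (coolant temp in C, fan duty in percent), sorted by temperature
CURVE = [(25.0, 20), (30.0, 35), (35.0, 60), (40.0, 100)]

def fan_duty(coolant_temp: float) -> float:
    """Return fan duty (%) for a coolant temperature via linear interpolation."""
    temps = [t for t, _ in CURVE]
    if coolant_temp <= temps[0]:
        return float(CURVE[0][1])
    if coolant_temp >= temps[-1]:
        return float(CURVE[-1][1])
    i = bisect_right(temps, coolant_temp)
    (t0, d0), (t1, d1) = CURVE[i - 1], CURVE[i]
    frac = (coolant_temp - t0) / (t1 - t0)
    return d0 + frac * (d1 - d0)
```

Everything after that is just writing the duty to whatever PWM channel the vendor exposes, which is exactly the part they keep proprietary.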

monsterzero
May 12, 2002
-=TOPGUN=-
Boys who love airplanes :respek: Boys who love boys
Lipstick Apathy

uhhhhahhhhohahhh posted:

It really bugs me that all of these fan control programs only work with their own hardware. I don't believe at all that Corsair, for example, is doing some proprietary magic that makes it impossible for NZXT to read the water temps or pump speed reported by the H150i cooler and build fan curves based on them. Especially when a program like HWiNFO can read that information.

I like to imagine the CEOs of all the hardware companies meet in a smoky restaurant and agree to mutual incompatibility and compete to see who can make the worst LED software.

Khorne
May 1, 2002

monsterzero posted:

I like to imagine the CEOs of all the hardware companies meet in a smoky restaurant and agree to mutual incompatibility and compete to see who can make the worst LED software.
I'm really hoping someone - whether it's AMD, Nvidia, Microsoft, or Intel, I really don't care - throws their weight around and forces all vendors to be compliant with some software they're offering. Once one company does it, the floodgates will open and we'll have a unified, hopefully open-source, tool to control all this junk.

In my case, a unified way to disable it. I can't turn a damn light off on my MSI GPU without some MSI garbageware installed and running. That's awful. At least motherboards let you disable it in the BIOS.

Khorne fucked around with this message at 04:21 on Sep 16, 2019

SwissArmyDruid
Feb 14, 2014

by sebmojo

Khorne posted:

I'm really hoping someone - whether it's AMD, Nvidia, Microsoft, or Intel, I really don't care - throws their weight around and forces all vendors to be compliant with some software they're offering. Once one company does it, the floodgates will open and we'll have a unified, hopefully open-source, tool to control all this junk.

In my case, a unified way to disable it. I can't turn a damn light off on my MSI GPU without some MSI garbageware installed and running. That's awful. At least motherboards let you disable it in the BIOS.

I can turn off the lights on any device with this one simple trick.

It's called "a good pair of diagonal cutters".

some dillweed
Mar 31, 2007

Khorne posted:

In my case, a unified way to disable it. I can't turn a damn light off on my MSI GPU without some MSI garbageware installed and running. That's awful. At least motherboards let you disable it in the BIOS.
I don't know which GPU you're using, but back around the time I bought my GTX 1060 Gaming X, someone named Vipeax started releasing this MSI LED Tool: https://github.com/Vipeax/MSI-LED-Tool. It looks like it still only supports Pascal, Maxwell, and Polaris cards, so if you have something like a Turing or Navi of some sort then you're probably still out of luck.

The way I set it up only seems to work if I directly run the executable with admin privileges, but then I can kill the process manually and it keeps the lights disabled across reboots. It doesn't seem to trigger properly after a cold boot, though, so the red and light blue logo blares at full brightness and I have to re-run the tool if I shut off the PC.
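One hedged workaround for the run-as-admin part (not the cold-boot quirk): register the tool as a Scheduled Task that runs elevated at logon. This sketch only builds the schtasks command line; the task name and exe path are placeholders, and you'd run the result once from an elevated prompt:

```python
# Sketch: build a schtasks command that registers a program to run
# elevated at every logon. Task name and exe path are placeholders.

def schtasks_command(task_name: str, exe_path: str) -> list:
    """Return a schtasks invocation (as an argument list) that runs
    exe_path with highest privileges whenever the user logs on."""
    return [
        "schtasks", "/Create",
        "/TN", task_name,   # name shown in Task Scheduler
        "/TR", exe_path,    # program to launch
        "/SC", "ONLOGON",   # trigger: at logon
        "/RL", "HIGHEST",   # run elevated, no UAC prompt at trigger time
        "/F",               # overwrite an existing task of the same name
    ]

# e.g. subprocess.run(schtasks_command("LedOff", r"C:\tools\LedTool.exe"), check=True)
```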

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Better yet, put in a hardware toggle switch on the gear so the rest of us non-edgelords can all turn off these goddamn lights without equally stupid RGB software.

Other stuff that's also dumb: non-front-facing I/O ports on cases, and hardwired Molex connectors on case fans instead of a separate 3-pin/Molex adapter.

Finally, I have seen at least two guys firsthand who were more interested in their RGBs than perf/$ in a local DIY PC store.

Palladium fucked around with this message at 05:22 on Sep 16, 2019

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
I’m not one to seek out the LEDs on pc stuff but if it has it I don’t mind either.

The HSFs the ryzens come with, with the changing colours, are pretty cool actually.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

priznat posted:

I’m not one to seek out the LEDs on pc stuff but if it has it I don’t mind either.

The HSFs the ryzens come with, with the changing colours, are pretty cool actually.

*immediately replaces with a brown Noctua*

Edit:
Actually, I might have used the stock cooler with the 3600 in my Node 202, but it's too tall, so I'm using a Noctua NH-L9a with a 25mm-thick NF-A9. In reality, I was always going Noctua anyway

HalloKitty fucked around with this message at 09:16 on Sep 16, 2019

GRINDCORE MEGGIDO
Feb 28, 1985


Many GPUs can have their LEDs disabled non-permanently by unplugging the LED power cable from the board; you may get lucky and be able to do that without removing the heatsink.

Arzachel
May 12, 2012

Palladium posted:

Better yet, put in a hardware toggle switch on the gear so the rest of us non-edgelords can all turn off these goddamn lights without equally stupid RGB software.

That's actually a thing on Asrock's 5700xt. Hopefully other manufacturers follow suit.

Also

HalloKitty posted:

*immediately replaces with a brown Noctua*

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



SwissArmyDruid posted:

I can turn off the lights on any device with this one simple trick.

It's called "a good pair of diagonal cutters".

I got a case without a window 🤷‍♂️

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Munkeymon posted:

I got a case without a window 🤷‍♂️

Fractal Design Define S with a solid side panel, I don’t care what it looks like inside as long as it performs well.

TorakFade
Oct 3, 2006

I strongly disapprove


Munkeymon posted:

I got a case without a window 🤷‍♂️

Even then, some light usually bleeds out of the back, and if the back is against a white wall it can be quite visible in the dark. Less so under a desk. Really, it's no big deal, but if you want no lights at all you should be able to just turn them off, and a hardware switch would be much better than having to install software (and not even that expensive for manufacturers, I believe).

(my case is chock full of gaudy leds and on top of my desk :pcgaming: )

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

HalloKitty posted:

*immediately replaces with a brown Noctua*

Edit:
Actually, I might have used the stock cooler with the 3600 in my Node 202, but it's too tall, so I'm using a Noctua NH-L9a with a 25mm-thick NF-A9. In reality, I was always going Noctua anyway

Yah, Noctua rules. But we have a lot of 3700X systems at work in open cases, and since we don't care about overclocking, using the default fans is fine, and they make the lab much more colourful! :pcgaming:

It's funny seeing a tired-out engineer just stop and stare at a shelf with 8 of those HSFs all swirling their colours; it's kind of hypnotic.

The Rat
Aug 29, 2004

You will find no one to help you here. Beth DuClare has been dissected and placed in cryonic storage.

mdxi posted:

CPU deaths are so rare that -- in general -- diagnosing them is like the old saw in computer science: "No, it's not a compiler bug. It's never a compiler bug."

Except that very, very occasionally, it actually is a compiler bug. You just have to impress upon the newbies the overwhelmingly more probable case that the problem is something they have done.

In this case it sounds like you've done the due diligence in checking every other subsystem you can. Also, anecdotally, I've bought 2 dead AMD CPUs in the past 3 months myself: one near the bottom of the range (2200GE) and one at the very top (3900X). The 2200GE was just plain dead; the 3900X exhibited freakish, randomized, won't-POST/will-POST/boots-halfway/won't-POST behavior.

Swap it.

Just swapped the CPU and it booted and installed Windows just fine. Updated the drivers, now have it running memtest. So far so good!

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

This seems... dumb as fuck? They're going to have 3 generations of iGPU using Vega now, and Vega has native FP16, so it'll basically run with no performance impact.

Also it'd be kinda dumb not to at least port it to their Carrizo/Bristol/Stoney shit, as that's still getting sold as well.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

EmpyreanFlux posted:

This seems... dumb as fuck? They're going to have 3 generations of iGPU using Vega now, and Vega has native FP16, so it'll basically run with no performance impact.

Also it'd be kinda dumb not to at least port it to their Carrizo/Bristol/Stoney shit, as that's still getting sold as well.

yeah reddit is stuck on "well they're not selling Vega 56/64 anymore" but sharp upscaling seems like a selling point for APUs that can't run native resolutions at super great fps, and all of those are Vega based. Including what has been announced/rumored so far on the 3000 series. :shrug:

also literally taunting people like "we don't want to. c'mon, try and make us!" is... weird. Part of what you're paying for with a GPU (a large part) is not just hardware; it's ongoing software support. It's not like Vega is super antiquated technology; they literally sold Vega as a flagship up until like 2 months ago. Not that this is anything new for AMD, Fury owners splashed out for a flagship product and then got turbofucked too, but c'mon. Is that really an angle AMD wants to push?

maybe this is a thing where RTG itself wants to support it but AMD management is telling them no, and someone is doing a "please tell management how wrong they are"? It's fucking weird.

Paul MaudDib fucked around with this message at 08:24 on Sep 17, 2019

Media Bloodbath
Mar 1, 2018

PIVOT TO ETERNAL SUFFERING
:hb:
Does anyone have a good source to consult for which game runs better with SMT disabled?

For some reason certain games also seem to profit from turning "full screen optimization" off. How is that even a thing?

uhhhhahhhhohahhh
Oct 9, 2012

Media Bloodbath posted:

Does anyone have a good source to consult for which game runs better with SMT disabled?

For some reason certain games also seem to profit from turning "full screen optimization" off. How is that even a thing?

GamersNexus and Hardware Unboxed (TechSpot) did SMT on/off in their Ryzen 3000 testing. The difference with SMT off is either nothing or such a slight increase that, in general, it doesn't seem worth disabling.

Windows "Full screen optimization" is a misleading name. From what I remember, it's some Microsoft half-fullscreen, half-windowed implementation to let their DVR and new notifications/overlays work. Disabling it in Battlefield seems to give me more consistent frames where I'm GPU limited.

eames
May 9, 2009

uhhhhahhhhohahhh posted:

GamersNexus and Hardware Unboxed (TechSpot) did SMT on/off in their Ryzen 3000 testing. The difference with SMT off is either nothing or such a slight increase that, in general, it doesn't seem worth disabling.

Most 8700Ks (9-series might too, not sure) overclock 100-200 MHz higher with HT disabled, so I often wonder if it would make more sense to leave it off, also in light of the recent security bugs. It would be a pretty big hit in synthetic or production workloads like rendering, though. I leave it on because :effort:

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Palladium posted:

Better yet, put in a hardware toggle switch on the gear so the rest of us non-edgelords can all turn off these goddamn lights without equally stupid RGB software.

Other stuff that's also dumb: non-front-facing I/O ports on cases, and hardwired Molex connectors on case fans instead of a separate 3-pin/Molex adapter.

Finally, I have seen at least two guys firsthand who were more interested in their RGBs than perf/$ in a local DIY PC store.

I bought a case without windows

Hardware toggle

The Rat
Aug 29, 2004

You will find no one to help you here. Beth DuClare has been dissected and placed in cryonic storage.

All right, ran memtest on the new machine overnight and had no errors. Installed Ryzen master and it's reporting a CPU temp between 45-50C. RAM is set to XMP profile 1. Windows update, installed the relevant drivers for the motherboard, everything seems to be functioning and stable.

Is there anything else I should be checking before calling it good and putting in the Windows key?

Are there any other tips for Ryzen builds and setup? I remember someone saying something about setting Ryzen balanced in the power options, but not the specifics.

Edit: this is the build in question: https://pcpartpicker.com/list/TT3Zx6

The Rat fucked around with this message at 19:35 on Sep 17, 2019

Aphrodite
Jun 27, 2006

Statutory Ape posted:

I bought a case without windows

Hardware toggle

If you can't see the LEDs, do they still boost your framerate though?

Cygni
Nov 12, 2005

raring to post

Palladium posted:

Better yet, put in a hardware toggle switch on the gear so the rest of us non-edgelords can all turn off these goddamn lights without equally stupid RGB software.

considering how whiny the anti-rgb brigade is, i think you got the edgelord part backwards tbh

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Paul MaudDib posted:

yeah reddit is stuck on "well they're not selling Vega 56/64 anymore" but sharp upscaling seems like a selling point for APUs that can't run native resolutions at super great fps, and all of those are Vega based. Including what has been announced/rumored so far on the 3000 series. :shrug:

also literally taunting people like "we don't want to. c'mon, try and make us!" is... weird. Part of what you're paying for with a GPU (a large part) is not just hardware; it's ongoing software support. It's not like Vega is super antiquated technology; they literally sold Vega as a flagship up until like 2 months ago. Not that this is anything new for AMD, Fury owners splashed out for a flagship product and then got turbofucked too, but c'mon. Is that really an angle AMD wants to push?

maybe this is a thing where RTG itself wants to support it but AMD management is telling them no, and someone is doing a "please tell management how wrong they are"? It's fucking weird.

I'm just not... getting the issue with making RIS available on literally all AMD products, native FP16 or not. It's a goddamn sharpening filter shader; how the fuck is it not ISA agnostic? Goddamn LumaSharpen works on old VLIW to my knowledge. Just release a complete driver update to get every single AMD GPU on the same page. Yeah, optimize the old fucking fixed-pipeline shit too; it's really irritating that AMD GPUs are best for vintage computing/gaming but the drivers are an absolute frustrating mess.

I know for the older shit it's all niche cases mostly, and that's money sunk where it probably doesn't need to be, but it'd probably be a good PR thing to do, and there's still no excuse not to make RIS available for everything, as far as I know anyway.

Craptacular!
Jul 9, 2001

Fuck the DH

Palladium posted:

Finally, I have seen at least two guys firsthand who were more interested in their RGBs than perf/$ in a local DIY PC store.

Nothing wrong with this. I'm quite an amateur RGB person figuring all this out, but other than graphics cards I'm not going to buy parts that are incompatible with both the Corsair and MSI lighting engines. I have spent 20% more for similar-performing parts that have RGB when I bought memory at the height of the cartel monopoly. And I avoid anything from NZXT that requires their software, because I don't want to install it.

I'm presently trying to find some piece of Razer equipment that doesn't make me vomit so that I can use their game-themed color profiles. The idea of the stuff in my glassy PC case changing colors when I swap Overwatch heroes (and put in my piddling bronze-tier performance) seems pretty cool. I will probably buy Corsair's CPU cooler, but integration with the Razer thing makes the Thermaltake product compelling, I gotta admit. Thermaltake is pretty much the only brand doing Razer Chroma on their fans and cooler mounts.

There's no reason to shame this stuff. I used to in the old days, but that's because people were drilling holes into sheet metal to fit acrylic windows and putting aftermarket lighting in there to make their PCs look like the Matrix. Since it stopped being an aftermarket modder thing and became part of stock hardware, I've really gotten into it. I can have a "cool PC" and not have to break warranties. What's not to like?

Paul MaudDib posted:

yeah reddit is stuck on "well they're not selling Vega 56/64 anymore" but sharp upscaling seems like a selling point for APUs that can't run native resolutions at super great fps, and all of those are Vega based. Including what has been announced/rumored so far on the 3000 series. :shrug:

I think you maybe just figured it out. Why give APU users any assistance towards playable gaming when AMD would rather they add in a graphics card?

They cheese off Vega card owners to do it, maybe they’re banking on there not being that many.

Craptacular! fucked around with this message at 22:54 on Sep 17, 2019

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Craptacular! posted:

I think you maybe just figured it out. Why give APU users any assistance towards playable gaming when AMD would rather they add in a graphics card?

They cheese off Vega card owners to do it, maybe they’re banking on there not being that many.

You're thinking too desktop-oriented; where RIS matters is the mobile space. Getting about-1080p image quality at close to 720p performance is fucking huge, and 720p image quality at 540p performance basically means an R7 3700U will play damn near anything on low.

I don't know why AMD is waffling on it, and it'd be so fucking dumb not to push this just as Intel is moving to 64 EUs as standard for iGPU (and showing they have fairly similar performance to AMD, ALU for ALU). So what if a Vega 11 in an R5 3400G starts pushing much better frames than a GT 1030 with RIS; it's a goddamn GT 1030, and mobile matters way more.

GutBomb
Jun 15, 2005

Dude?

Craptacular! posted:

Nothing wrong with this. I'm quite an amateur RGB person figuring all this out, but other than graphics cards I'm not going to buy parts that are incompatible with both the Corsair and MSI lighting engines. I have spent 20% more for similar-performing parts that have RGB when I bought memory at the height of the cartel monopoly. And I avoid anything from NZXT that requires their software, because I don't want to install it.

I'm presently trying to find some piece of Razer equipment that doesn't make me vomit so that I can use their game-themed color profiles. The idea of the stuff in my glassy PC case changing colors when I swap Overwatch heroes (and put in my piddling bronze-tier performance) seems pretty cool. I will probably buy Corsair's CPU cooler, but integration with the Razer thing makes the Thermaltake product compelling, I gotta admit. Thermaltake is pretty much the only brand doing Razer Chroma on their fans and cooler mounts.

There's no reason to shame this stuff. I used to in the old days, but that's because people were drilling holes into sheet metal to fit acrylic windows and putting aftermarket lighting in there to make their PCs look like the Matrix. Since it stopped being an aftermarket modder thing and became part of stock hardware, I've really gotten into it. I can have a "cool PC" and not have to break warranties. What's not to like?


I think you maybe just figured it out. Why give APU users any assistance towards playable gaming when AMD would rather they add in a graphics card?

They cheese off Vega card owners to do it, maybe they’re banking on there not being that many.

If you’re already using Corsair software you can set per-game profiles in there.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

EmpyreanFlux posted:

You're thinking too desktop-oriented; where RIS matters is the mobile space. Getting about-1080p image quality at close to 720p performance is fucking huge, and 720p image quality at 540p performance basically means an R7 3700U will play damn near anything on low.

I don't know why AMD is waffling on it, and it'd be so fucking dumb not to push this just as Intel is moving to 64 EUs as standard for iGPU (and showing they have fairly similar performance to AMD, ALU for ALU). So what if a Vega 11 in an R5 3400G starts pushing much better frames than a GT 1030 with RIS; it's a goddamn GT 1030, and mobile matters way more.

Yup, or you could clock down the GPU core and squeeze more battery life out of a given mobile part (e.g. with FRTC, a la Radeon Chill). Performance and power consumption can be traded pretty freely in a mobile design (e.g. Max-Q), and that translates into longer runtimes (or thinner laptops, sigh).

Since this isn't just a normal sharpen but actually a contrast-adaptive sharpen that takes sharpness across the whole image into account, I wonder if it needs some kind of compute shader. So it might not be completely plug-and-play across architectures. But all GCN should have generally similar capabilities, and Vega's should certainly be a superset of Polaris's. There's no technical reason for this, and it should probably literally just compile and run fine.

And it's just so weird that AMD is literally daring people to get mad at them in as many words; it does feel like a dev-team or interdepartmental squabble that got written up in a testy post or something. Manager/department head says they don't have money/staff and nobody will care, team wants to do it anyway, manager says no, team puts out a testy statement like "if you care, show us"? Otherwise it's bizarre that something like that got past PR.

But yeah, all of the modern architectures should be able to do this, and everyone else will probably be following suit (including Intel). It's a fantastic idea, one of those where in hindsight it's hilarious that nobody tried it before, but it's not going to differentiate AMD for too long.

A hardware-agnostic ~30% gain in performance with a relatively minimal quality hit, which can be easily injected into legacy titles from the driver stack. That's pretty cool; it doesn't happen too often that someone figures out a real way to download more RAM.

Paul MaudDib fucked around with this message at 03:35 on Sep 18, 2019
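For what it's worth, the contrast-adaptive idea is simple enough to sketch without any shader at all. This toy numpy version is my own simplification, nothing like AMD's actual CAS math: it just scales an unsharp mask down wherever the local neighborhood already has high contrast, which is why it doesn't ring on hard edges:

```python
# Toy contrast-adaptive sharpen on a 2D image: strong in flat regions,
# backing off where the local min/max spread (contrast) is already high.
import numpy as np

def adaptive_sharpen(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Sharpen a 2D float image in [0, 1]; strength backs off at edges."""
    p = np.pad(img, 1, mode="edge")
    c = p[1:-1, 1:-1]                      # center
    n = p[:-2, 1:-1]; s = p[2:, 1:-1]      # north / south neighbors
    w = p[1:-1, :-2]; e = p[1:-1, 2:]      # west / east neighbors
    neigh = np.stack([c, n, s, w, e])
    contrast = neigh.max(axis=0) - neigh.min(axis=0)
    amount = strength * (1.0 - contrast)   # adaptive: weak where contrast is high
    blur = neigh.mean(axis=0)              # cheap local blur estimate
    return np.clip(c + amount * (c - blur), 0.0, 1.0)
```

The real shader does this per-pixel in one pass with cleverer weighting; the point is only that the "adaptive" part is a few lines of arithmetic, not deep magic.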

Burno
Aug 6, 2012

Backyarr posted:

They're the same other than the CAD Bus resistances, which I haven't played around with and left at 24-24-24-24. Notice the difference in voltages. Maybe that's the important factor for lower voltages?
1.6.0.3: https://imgur.com/DFwWPGP.png
1.6.2: https://imgur.com/e7uT7yP.png

Forgive me for bringing this back up - it's a couple pages back - but I did just get the newest ABBA BIOS and decided to try the 1.6.2 DRAM Calculator settings, and I was able to tighten timings even further with lower voltage. It does seem like the CAD Bus resistances play a major part in that, so the timings and voltage seem to work, even though it might seem crazy.

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop
For the first time this week I actually did some work that really required my threadripper, as opposed to it being a nice thing to have. I had to find 3 broken commits in an 11k-diverged vendor linux kernel.

Let me tell you about the joys of make -j25: every build was under 2 minutes, and as it got closer to the target it was under 30 seconds. Doing the same on my i7-7700K ran me ~8 minutes per full build. The Threadripper 2920X cut 4 hours off this job today alone.

Things not pleasing me: fucking Yocto being the go-to for vendor trash. Goddammit, you snowflakes, just make your changes to the kernel and u-boot and let someone competent make the actual ARM distro, because I'm going to throw yours away immediately anyway. Just getting their shit out of Yocto into a standalone build environment takes up too much time.

Laranzu
Jan 18, 2002
New build with a 2700. Rock solid for hours of stress testing with Unigine Singularity. Solid for an hour of Prime95. Passes MemTest86.

Fuckin' hard crashed with a screeching sound while watching Bob's Burgers on Hulu.

This is going to be a fun one to track down. Weird feeling it's in the XMP profile. It was really unstable at 3200 before a BIOS update on the B450 Tomahawk. The BIOS update made it seem stable at 3200, but I have my doubts.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Laranzu posted:

New build with a 2700. Rock solid for hours of stress testing with Unigine Singularity. Solid for an hour of Prime95. Passes MemTest86.

Fuckin' hard crashed with a screeching sound while watching Bob's Burgers on Hulu.

This is going to be a fun one to track down. Weird feeling it's in the XMP profile. It was really unstable at 3200 before a BIOS update on the B450 Tomahawk. The BIOS update made it seem stable at 3200, but I have my doubts.

Are you using an AMD video card with a driver from 2019? Were you using Chrome? AMD's driver crashes my entire computer if I have video acceleration on in Chrome and view the menu in Netflix. This happened on my 2500K and my 3700X; I've even replaced the graphics card (with another AMD card; the 5700 is just a really good value right now). Google turns up other people having this exact same issue, and I'm confused as to why AMD hasn't fixed it in over a year. If you roll back to a really old driver it works perfectly, but you don't want that, and if you have a newer card you can't.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
Edit: I was able to diagnose the issue: Edge Insider was the culprit; the latest Canary build released yesterday no longer crashes the system. The standalone Netflix app doesn't crash; Edge was always involved.

That's kinda fun, I have a 3800X and an Nvidia 2070 Super: My system crashes with regularity (about every three episodes) whenever I watch Netflix using the Edge Beta or the Netflix app (Chrome doesn't seem to crash it). I'm getting a 5700 XT as a replacement on Monday, I'll report on what happens with that card.

Ran Realbench CPU testing for hours, Memtest86 for ~12 hours and 3Dmark: The hardware seems fine otherwise, must be some kind of weird driver issue.

Lambert fucked around with this message at 10:55 on Sep 19, 2019

Laranzu
Jan 18, 2002

pixaal posted:

Are you using an AMD video card with a driver from 2019? Were you using Chrome? AMD's driver crashes my entire computer if I have video acceleration on in Chrome and view the menu in Netflix. This happened on my 2500K and my 3700X; I've even replaced the graphics card (with another AMD card; the 5700 is just a really good value right now). Google turns up other people having this exact same issue, and I'm confused as to why AMD hasn't fixed it in over a year. If you roll back to a really old driver it works perfectly, but you don't want that, and if you have a newer card you can't.

1660 Ti, Hulu in Chrome. It streamed the entirety of a movie perfectly fine after the lock too, so it's not easily reproducible.

It might have been the Ryzen Master gaming profile, if there's any instability there. I'll either leave it stock or run 3900 MHz @ 1.3625 V in the future.

I need to reinstall Windows anyway because it's littered with all my testing software and weird RGB control software from messing around with all the terrible lights.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Lambert posted:

That's kinda fun, I have a 3800X and an Nvidia 2070 Super: My system crashes with regularity (about every three episodes) whenever I watch Netflix using the Edge Beta or the Netflix app (Chrome doesn't seem to crash it). I'm getting a 5700 XT as a replacement on Monday, I'll report on what happens with that card.

Ran Realbench CPU testing for hours, Memtest86 for ~12 hours and 3Dmark: The hardware seems fine otherwise, must be some kind of weird driver issue.

Netflix works fine for me in Edge, so if you start crashing, just switch browsers for streaming; it's not like it really matters which browser you use to run a fullscreen video.


Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib

pixaal posted:

Netflix works fine for me in Edge, so if you start crashing, just switch browsers for streaming; it's not like it really matters which browser you use to run a fullscreen video.

While you're right, especially with this pointing to a driver issue that's likely to be fixed in the future, if the 5700 XT turns out to work fine I'll just stick with it over the 2070 Super.
