Dr. Video Games 0031
Jul 17, 2004

Ryzens run hotter than Intel CPUs, but not that hot. I would definitely inspect your cooling setup.


Dr. Video Games 0031
Jul 17, 2004

Lord Stimperor posted:

TLDR the AIO is borked, I think. I'm gonna order an air cooler tonight, I've had it with this bitch that takes up so much space and where you can't see whether it's working.


Long story:
I've pulled out and cleaned most of the components. I think that I used a thermal pad when I put on the CPU cooler the first time. Unfortunately it only covered about 2/3 of the heatspreader, which is a bit disappointing. I put on some fresh paste.

I also changed the location of the radiator. Previously, it was located at the top of the case, as I reasoned hot air would naturally want to travel upwards. The downside is that the pump also had to force the water upwards. Now I've mounted the radiator in the only other possible location in the case, the back panel. The in- and outputs of the radiator are now below the CPU. If I understand correctly, this arrangement should make it easier for the water to flow.

The thing is, the radiator stays cold all the time. The CPU is running at 80+ degrees. The tube and cold plate casing get hot. But the radiator remains at the same temperature as the case, cool to the touch. That means it's hosed, right?

There are some suboptimal ways to install an AIO, but the only outright bad/potentially damaging way is if the pump (which in AIOs is almost always on the CPU block) is at the top of the loop. That wasn't a problem in your first configuration though, and if there's a part of your radiator that's still above the pump then it's fine in your second setup too. It seems pretty likely that your pump is to blame here (which makes the AIO a total loss if so), but maybe you can try wiggling your tubes around a bit to dislodge any air pockets that might be obstructing flow.

Lord Stimperor posted:

Yeah that's what I figured in my last post, CPU being hot but radiator cold says to me that the water isn't circulating. I'm currently going back and forth between a CoolerMaster Hyper 212 with RGB bling, a be quiet! Dark Rock, and a Noctua NH-L12. For the first two it comes down to performance; what speaks for the latter is that I just don't like big towers very much. That was the reason I wanted the AIO in the first place.

I have a big dual-tower cooler (Dark Rock Pro 4), and it is indeed a pain in the rear end to install. Or rather, it's pretty easy to mount onto your motherboard, but fitting it into your case and plugging everything in around it is the real pain unless you have a super spacious case. Honestly, if you're not gonna be overclocking or anything, then there's nothing wrong with a good-quality low-profile cooler. If even stock coolers are often good enough in non-overclocked systems, then a Noctua low-profile cooler will certainly be. (I'd probably spend the extra $10 on the taller variant, but that's just me.) Your case will need better airflow with an air cooler though, so maybe pick up an extra case fan to go in one of those empty front slots while you're at it.

Dr. Video Games 0031
Jul 17, 2004

ChazTurbo posted:

My 3600 is weird. It idles at around 40c but never goes over 70ish when under load.

Fan curve also has a lot to do with this. It's definitely possible to have a fan curve that's designed to be very quiet at idle but ramps up quickly under load.
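To put rough numbers on that idea, here's a minimal sketch of such a curve (the temperature/duty points below are made up purely for illustration, not taken from any real BIOS or fan controller):

```python
# Toy fan curve: (CPU temp in C, fan duty in %) points, linearly interpolated.
# Quiet at idle, aggressive once a real load shows up. Numbers are made up.
CURVE = [(30, 20), (50, 25), (65, 60), (75, 85), (85, 100)]

def fan_duty(temp_c: float) -> float:
    """Return the fan duty (%) for a given CPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(40))  # ~22.5% -> near-silent, so the chip idles warm
print(fan_duty(70))  # ~72.5% -> ramps hard under load, so load temps look fine
```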

Dr. Video Games 0031
Jul 17, 2004

Yeah, trying to determine the best value for thermal paste makes no sense to me. Just buy the best poo poo that's on the market. It's a negligible addition to your costs. edit: within the consumer-grade realm, i mean

Dr. Video Games 0031 fucked around with this message at 10:31 on Jun 15, 2021

Dr. Video Games 0031
Jul 17, 2004

K8.0 posted:

GN has done testing on them. They're not good. Sure, they will work on a low-power system - but you're spending more money for worse performance, in order to gain what? More convenient repeated removal and reinstallation of HSF on a low-power system where you have no reason to be doing that? I can't see a reason to ever use them unless it's for some really bizarre application where environmental factors would cause thermal paste to age extremely prematurely.

You're misrepresenting GN's verdict. They decided that thermal pads were just fine, actually. There's no problem with using them unless you're a hardcore overclocker. The flip side to this is that there's no obvious reason to use them for your home builds, either. You may as well just use the paste your cooler came with, because it's not like the pad will be any better.

The main reason to use pads is for things you expect to have a long life and wish to service as little as possible. Laptops, grandma's computer, etc. Or in professional environments where you're swapping out components regularly and having perfect thermals doesn't matter.

Dr. Video Games 0031
Jul 17, 2004

Lord Stimperor posted:

BIOS is (now) updated to the most recent version. All chipset drivers were already the newest. In fact, the windows installation was about a day old when I made the first post, couldn't have been much cleaner.

Windows Event Viewer always says something along the lines of error 6008 -> error 41. To my knowledge that roughly means "There was a crash, had to reboot".

I've made a bit more progress, but it hasn't been super elucidating. Crashes also occur with every possible combination of RAM sticks and slots. On the software side, I've used memtest pro, testmem, and prime95 to stress the memory and CPU. The computer can run these for hours on end without problems. But when I interact with it, it's a matter of moments to minutes until it crashes. And by 'interact' I mean something as simple as writing in Notepad. You leave it, it runs. You touch it, it breaks.

That doesn't rule out a hardware problem of course, but as a result I'm looking at Windows itself again. I'm now on an Ubuntu live disk and lo and behold, I can type messages, browse YouTube, and dick around without anything happening. Windows itself was already complaining during the installation procedure, and crashed from the start.

If Ubuntu works the rest of the evening, I believe I'll just wipe the C:\ drive and start over.

I'd also consider the possibility of not just a corrupted install, but a drive failure. I don't remember, have you run any drive scanning software yet?

Dr. Video Games 0031
Jul 17, 2004

I've become "good" at computers by simply being very bad at building them and thus having to constantly troubleshoot all of my dumbass mistakes. Nobody knows how to gently caress up a PC build quite like me. I'm rather proud of this.

Dr. Video Games 0031
Jul 17, 2004

CaptainSarcastic posted:

My most recent forehead-smacker was being mad about my Internet speed being lower than what I paid for, and doing all sorts of tests, removing a filter from the coax, checking my wireless router, and cursing my ISP in general. Then I finally thought to change the cable between modem and router, and lo and behold it turned out to be old ethernet that would only handle 100Mbps. New ethernet cable and I'm getting better speed than I pay for. :doh:

My latest massive blunder came when I was trying to figure out a system stability issue earlier this year. My computer was just randomly restarting--no blue screens or error logs or anything. For the first month of use it was fine, actually, and then the problem crept in and slowly got worse. I could play a demanding game for hours and it would be fine, but then it would restart while watching a youtube video. My computer is sort of in an annoying spot to deal with under my desk, so I try to pull it out and open it up as little as possible. Due to the sporadic nature of the restarts and lack of time available on my part, troubleshooting progress was slow. Then a week or two after the problem started, my computer wouldn't boot at all. I had almost no spare components to test with, so I ordered some things from Amazon with the intent to return them if they weren't the issue.

First up was the PSU, since I suspected this was power related. As I was unplugging things... one of the 6-pin connectors just pops out of the graphics card without having to squeeze the clip. The cable was loose. That was it. The plastic bits meant to line up the two connectors weren't properly aligned, so the second connector didn't go in all the way when I initially plugged it in. It just took some time to gradually come loose after that, which explains why it worked fine at first. I'm not sure I ever felt so dumb before. It was a good reminder to double check all my cable connections, I guess.

At least I was right about it being power related. :)

Dr. Video Games 0031
Jul 17, 2004

Yeah, my guess would be that the faulty AIO let the CPU cook and it's now damaged. What were your results from upping the voltage?

Dr. Video Games 0031
Jul 17, 2004

LRADIKAL posted:

Wait a minute. Under what conditions would a modern CPU actually overheat itself and cause damage? Ryzen throttles to stay at 90, and there's a shutdown at some higher temperature that I don't know. Right?

Yes, but considering he had a non-functional cooler for a time, it does make me wonder if that has something to do with it. When I see a water cooler failure leading to an overheating CPU and then subsequent system instability even under better cooling conditions, I get suspicious. But maybe my mentality is stuck in the past, back when this was a much bigger risk.

Dr. Video Games 0031
Jul 17, 2004

Shipon posted:

Aww way to ruin it, it was much funnier to think that he pivoted from making up CPU poo poo to COVID denier

No, the Freethinker channel BurritoJustice posted is his actual side channel. Those vids are his too.

Dr. Video Games 0031
Jul 17, 2004


Thank you, this actually answers some questions I've had since upgrading from an older intel chip earlier this year. I've been wondering why my 5600x can reach 4.6GHz in single-core loads but not on all-core loads, and if it was a sign of bad thermals (I wasn't near thermal throttling limits but idk how the PBO algorithm works). Turns out it's just a feature of the PBO algorithm so you don't send excessive currents through your chip?

How safe are bios settings like Asus Performance Enhancement, which I guess messes with the PBO algorithm to let your CPU draw more current?

Dr. Video Games 0031
Jul 17, 2004

Okay. I did some testing, and with Asus Performance Enhancement off my 5600x maxes out at 4.1GHz during a multi-threaded Cinebench test, while it maxes out at 4.4GHz with the option turned on. The CPU also only reaches 55C in Cinebench with APE off and 70C to 75C with APE on (Tctl/Tdie in HWiNFO). It does seem strange that it leaves so much performance on the table with a ton of thermal headroom using default settings, but APE potentially takes things too far. And ultimately the difference in results isn't all that much (a 15% better score with APE, though synthetics usually inflate differences beyond what you see in real-world applications).

In single-threaded loads, it's the same on and off, with default PBO more than willing to push one or two cores to 4.65GHz. And considering this is what most real-world applications will do, I see no reason to turn on that feature. Asus made it sound like some kind of free performance switch which is what made me wonder about it, but I guess nothing's that simple.

Dr. Video Games 0031 fucked around with this message at 07:04 on Jun 29, 2021

Dr. Video Games 0031
Jul 17, 2004

Malloc Voidstar posted:

Multi-display is eternally hosed up with regards to cards going to idle state, at least with Nvidia

Man, that's disappointing. I don't get why multi-monitor support is still so janky across the board. Microsoft is still playing catch-up with multi-monitor support in Windows too. And there are still new games being released without a monitor selection option in their in-game menu. It's 2021, who the hell's still using single monitors?

I guess this is the one and only thing AMD can point to in their drivers as being better than Nvidia's. AMD's multi-monitor support is generally pretty decent. I haven't had any weird power or refresh rate related issues with them.

Dr. Video Games 0031
Jul 17, 2004

fast cars loose anus posted:

I feel like I need a sanity check on my CPU temps. I've got a newish build (basically 3 weeks old) with a Ryzen 9 3900x on an MSI x570 gaming edge. Using the Wraith that came with it, in a case with 3 front fans and a back fan, room temp generally ~24c. When idling I'm usually between 45-50c and playing hitman 3 with settings jacked usually between 70-80c. Running prime 95 in the highest stress gets me to 75-80 degrees at 3700 Mhz (monitoring done with AMD Ryzen Master). In the most recent one, 10 seconds after I stopped Prime95 it had dropped from 78c to 52c and was down to 47c 20 seconds after that.

Anyway, I put in all that (probably too much) detail to ask: when I open this thing up later in the week to add my new NVMe drive, should I reseat the Wraith with new thermal paste? (I have some Arctic Silver lying around but just put it on with the stock paste that was on it out of the box when I built it.) I hear Ryzens run hot, but going back several pages in this thread I get the idea that mine might be running hotter than it should be even taking that into consideration. Thanks for your help!

While I'm not familiar with the 3900x's exact characteristics, clock speeds are usually not boosted by much during intense all-core torture tests. And 80 degrees should be a fair bit below the thermal throttle limit, so that seems fine. The idle temps are warmer than I'd expect, though. The fact that idle thermals are warm and load thermals appear fine is probably a result of your fan curve: your fans are barely running at idle, letting your CPU get warm, and then heavily ramping up under load. This is mostly fine; I don't think a CPU can be damaged by idling at warm temperatures. If you'd like to tinker with this, your motherboard BIOS probably has a feature that lets you adjust fan speeds at different temperature levels, or maybe MSI has some software that lets you do so in Windows.

Dr. Video Games 0031 fucked around with this message at 09:26 on Jul 6, 2021

Dr. Video Games 0031
Jul 17, 2004

Ihmemies posted:

Some of the coolers are not flat. They are convex for whatever reason. So only the "tip" touches the CPU properly and the coolers rotate easily because the contact area is so small - even when properly tightened. Personally I like flat coolers more, although some claim the convex shape increases cooling performance 🙄

This is due to slight differences in CPU surface shapes between Intel and AMD. For a while, any universal cooler was designed around Intel's CPU shape specifically due to their dominance in the consumer CPU market. The convex cooler plate really did help with Intel, but those coolers were less efficient with AMD CPUs as a result. AM4-specific coolers should make better contact with a Ryzen chip, with less wiggle room. Still, the curvatures we're talking about here are microscopic. For the most part, thermal paste fills in the gaps well enough to not lose much efficiency.

Dr. Video Games 0031 fucked around with this message at 07:52 on Jul 8, 2021

Dr. Video Games 0031
Jul 17, 2004

edit: wrong thread, lol

Dr. Video Games 0031
Jul 17, 2004

I mean, if all you want is quiet, then any mesh-fronted case with noise-efficient PWM fans on a quiet fan curve will get the job done. If you're picking that case, it's specifically to have a Mac Pro copycat.

edit:

CoolCab posted:

i think LTT used it in a build and, while it's been a minute (the series was called something like "windows mac pro killer"), i don't remember them being overwhelmingly impressed by the case. they mostly get the aesthetic right but i remember a bunch of nitpicks about actually using it.

Just looked at these, and the Mac Pro Killer video uses a completely different, very futuristic gamer-y looking case. In the actual video about the Mac Pro copycat case he's pretty positive about it, though he ran into some issues with cooler clearances (and half of the video was him troubleshooting unrelated system issues).

Dr. Video Games 0031 fucked around with this message at 10:48 on Jul 15, 2021

Dr. Video Games 0031
Jul 17, 2004

So these are mostly the same as their regular X570 boards except with shitloads of USB 3.2 ports on the rear I/O?

Dr. Video Games 0031
Jul 17, 2004

Crunchy Black posted:

I continue to lol at goons goonin' over something so spergy and stupid as an extra small, controllable fan.

Do you all have your computers situated 4 inches from your head and never wear headphones?

Getting derisive toward people who are sensitive to different sound frequencies than you is definitely one of the weirdest things I've seen someone get hostile and ableist about on here, so congrats.

Dr. Video Games 0031
Jul 17, 2004

I believe Silverstone pioneered the single-bottom-fan chimney design with the FT-03 and the FT-03 Mini.

Dr. Video Games 0031
Jul 17, 2004

Max recommended operating temperature for the X570 chipset is apparently 90°C. 60°C should be safe.

Dr. Video Games 0031
Jul 17, 2004

Alternatively, reconfigure your airflow situation to make sure there's a case fan blowing directly over the chipset, if there isn't one already.

Dr. Video Games 0031
Jul 17, 2004

Yeah, the 5600X and 5800X have been below MSRP in America for the last month too. The 5800X has been reduced in price to $400 in many stores, which seems like a better price for that chip. It was always weird how the price gap between those two chips was bigger than the price gap between the 5800X and 5900X.

Dr. Video Games 0031 fucked around with this message at 23:16 on Jul 27, 2021

Dr. Video Games 0031
Jul 17, 2004

SourKraut posted:

Are those dual rank or single rank?

They're single rank. Dual rank 8GB sticks are rare and are never going to be advertised anyway. It's not going to matter to most users.

CoolCab posted:

i still don't understand this despite watching the LTT video on it. it's a quality separate from the speed, timings and advertised capacity, loving somehow?

If you're thinking of the same LTT video I am, then that wasn't even about ranks really, but density or something. Here's a Buildzoid video on the topic:

https://www.youtube.com/watch?v=w2bFzQTQ9aI

Dr. Video Games 0031
Jul 17, 2004

Palladium posted:

I have been seeing 5600X and 5900X going below MSRP, guess this has to do with the stupid GPU prices somehow

Isn't that just normal CPU price behavior?

Dr. Video Games 0031
Jul 17, 2004

ARRGHPLEASENONONONO posted:

I just got a 5900x running on an x570 and some DDR4 rated at 3600 but the motherboard was defaulting to a lower clock speed (2666 iirc). Is this normal? I went into the BIOS and forced it up to 3600 and everything seems stable, but shouldn't the MB be detecting the speed automatically?

That is indeed normal. 2666 will be that stick of RAM's JEDEC profile. JEDEC is the standards body that dictates memory speeds and all that, and they have been rather slow about authorizing out-of-the-box clock rates as high as what most DDR4 kits are advertised at. So you technically have to overclock them to reach their advertised speeds. This is what XMP or DOCP profiles are for. I would undo whatever BIOS config you did and enable XMP instead, which will load a more complete profile with potentially better timings.

edit: beaten to hell and back.

Dr. Video Games 0031
Jul 17, 2004

After ignoring it for 8 months, I've started messing with the curve optimizer on my 5600X and am pretty happy with the results. I tried to take it slow and careful by going in increments of -5, one day at a time, since I heard that instability commonly occurs in low-workload or idle situations and can only be exposed through normal use. For me, instability happened immediately on startup once I upped the offset to -25. Many of my desktop icons wouldn't load, and there was some unresponsiveness. I tried to restart the system and the start menu wasn't registering my clicks. So yeah, I force restarted, set the offset back to -20, and it's been stable for the couple of weeks since then. Temps have been better during lightly threaded workloads, and I'm hitting higher boost frequencies (+150MHz or so) during all-core workloads. This is at the default power limit (PPT maxing out at around 75W). I guess it's possible to go further with the cores Ryzen Master marks as your best, but I'm just sticking with an all-core offset for now.

This seems like something that's probably worth experimenting with for most Zen 3 owners. It's a fairly low-risk way of improving thermals and eking out some extra performance since this is really just an undervolt. (if anything, doing this is better for your CPU's health, no?)
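To spell out that procedure as a sketch (it's all done by hand in the BIOS, of course; apply_offset and stable_after_a_day_of_use below are hypothetical placeholders for "reboot and change the setting" and "use the PC normally for a day", not real APIs):

```python
# Sketch of the stepwise curve optimizer search described above. The two helper
# functions are made-up stand-ins for manual steps, not real APIs.

def apply_offset(offset: int) -> None:
    # Stand-in for: reboot, set the all-core Curve Optimizer offset in the BIOS.
    print(f"BIOS: all-core offset set to {offset}")

def stable_after_a_day_of_use() -> bool:
    # Stand-in for: a day of normal desktop use, watching for hangs, icons not
    # loading, an unresponsive start menu, and so on.
    return True

def find_offset(step: int = 5, limit: int = -30) -> int:
    last_good = 0
    while last_good > limit:
        candidate = last_good - step        # 0 -> -5 -> -10 -> ...
        apply_offset(candidate)
        if not stable_after_a_day_of_use():
            apply_offset(last_good)         # back off to the last stable offset
            return last_good
        last_good = candidate
    return last_good

print(find_offset())  # the placeholder stub just runs to the -30 limit; by hand, mine stopped at -20
```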

Dr. Video Games 0031
Jul 17, 2004

Not 100% sure, but with Zen 3 I think "Precision Boost 2" is the normal opportunistic boost, while "Precision Boost Overdrive" is basically the "power limits off, go nuts" mode. I don't fully get it, but I had to set PBO to "on" in my BIOS to do curve optimization, and it still defaults to off in Windows unless I enable a PBO profile in Ryzen Master.

Dr. Video Games 0031
Jul 17, 2004

The Scythe Fuma 2 can have issues coping with the higher heat loads the 5900X is capable of putting out, though it should be okay if you keep the CPU strictly at stock settings. It'd at least be better than the stock cooler.

The Dark Rock Pro 4 is Be Quiet's current premier cooler. I have one and it's a seriously chonky boy, but it is indeed quiet. I can't vouch for its performance at higher levels of load, but it's rated for 250W, which covers the 5900X. Alternatively, there's the venerable NH-D15, which will never let you down. Its fans are even quieter than Be Quiet's at equivalent levels of airflow, but with a higher max RPM, so they can really spin up when needed.

Dr. Video Games 0031 fucked around with this message at 22:47 on Aug 20, 2021

Dr. Video Games 0031
Jul 17, 2004

NoDamage posted:

I mean, it's within a couple degrees of the D15 at nearly half the price. If you want to do some heavy overclocking, then sure, pony up for the D15 or get an AIO. But I'm guessing most people don't want to spend $100 on a cooler and at stock speeds it's more than adequate, and pretty good value.

The 100% fan speed tests tell the tale better than noise-normalized tests in terms of maximum possible heat dissipation. I still think the Fuma 2 could let the 5900X throttle under max load if you have higher ambient temperatures or worse case airflow than these test benches, but you're right that it wouldn't be an issue under normal circumstances.

Dr. Video Games 0031
Jul 17, 2004

Cabbages and Kings posted:

Any suggestions for quiet coolers with addressable rgb? I would hate to ruin my case aesthetics :laugh:

If not I will probably get the dr4 or noctua.

What CPU? Everyone's favorite bargain-bin overachiever, the Cooler Master Hyper 212, has an RGB version, though it's not addressable and it's not suitable for higher-TDP CPUs. You could also always buy a dual-tower cooler and then replace the front fan with an ARGB one. (edit: poo poo, I totally missed that you were the 5900X guy. Sorry lol)

Most of the RGB, though, seems to be in the AIOs. The Corsair iCUE series of AIOs is popular, but I think that's a proprietary RGB thing? Cooler Master has an ARGB AIO though. And EK, a popular maker of open-loop watercooling supplies, made an addressable RGB AIO. It's rather plain, but I like it. Just a big, dumb, light-up brick sitting on your CPU. (There's a 360mm one too.)

Dr. Video Games 0031 fucked around with this message at 14:27 on Aug 21, 2021

Dr. Video Games 0031
Jul 17, 2004

Zedsdeadbaby posted:

I'm wary of buying six cores, since the new generation of games is using 8c/16t CPUs as a baseline.
I'm sure the 5600 is more than fast enough for a couple years yet but you never know.

https://www.youtube.com/watch?v=RQ6k-cQ94Rc&t=928s
https://www.youtube.com/watch?v=VkfjICH1VqY

The 5800X has virtually identical performance to the 5600X in almost every gaming benchmark, and I suspect it will stay that way for many years to come. Games being coded for 8-core consoles has little relevance to the discussion, as it's not like cores/threads can't do multiple things at once. What matters most is overall performance, and the 5600X is pretty far ahead of the console CPUs in that department. Getting a 5800X just to have two more cores for gaming would be a complete waste of money. By saving that money and upgrading your GPU instead, you'll get way more performance out of your PC in both the short and long term.

Dr. Video Games 0031
Jul 17, 2004

Subjunctive posted:

I thought “thread” was the unit of “doing things at once” — what do you mean here by a thread doing multiple things at once? Can they split and schedule the execution units more finely than at thread granularity? I feel like I’m missing something big!

"Thread" in this case is the logical threads on a multi-threaded CPU core. Yeah, sure, each one can only do one thing in any given cycle, but threads are clearly capable of sharing their compute time among many different tasks at once. How do you think we did things before multi-core CPUs and hyperthreading?

The point I was trying to make (which was also made in the videos I linked) is that CPUs aren't as rigid as a lot of people seem to imagine. It's the software that tends to be too rigidly coded, if anything. You can have many tasks assigned to one logical thread, and you can split tasks between multiple threads if the software is coded well—modern schedulers are quite flexible in that way. What matters most in games is that you have enough computational capacity to handle those tasks before a frame is drawn by the renderer, so as not to bottleneck performance. It just so happens that adding extra cores is the easiest way of adding extra "capacity." A six-core CPU won't limit game performance just because the game was originally coded for eight cores, so long as those six cores are more powerful. I also doubt that most games on modern consoles are going to be maxing out their CPUs often to begin with.
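Here's a toy illustration of the "more tasks than threads is fine, as long as the threads keep up" idea (made-up task counts and timings, nothing to do with how any real engine is structured):

```python
# Toy example: eight independent "game tasks" per frame scheduled onto a
# six-worker pool. Task counts and timings are invented for illustration.
from concurrent.futures import ThreadPoolExecutor
import time

FRAME_BUDGET_MS = 1000 / 60  # ~16.7ms per frame at 60fps

def game_task(task_id: int) -> str:
    time.sleep(0.002)  # pretend this is ~2ms of physics/AI/audio work
    return f"task {task_id} done"

with ThreadPoolExecutor(max_workers=6) as pool:  # "six cores"
    start = time.perf_counter()
    results = list(pool.map(game_task, range(8)))  # eight tasks, six workers
    elapsed_ms = (time.perf_counter() - start) * 1000

# Two of the tasks simply wait for a free worker; as long as everything still
# fits in the frame budget, having "only" six workers doesn't hold the frame up.
print(f"{len(results)} tasks in {elapsed_ms:.1f}ms (budget {FRAME_BUDGET_MS:.1f}ms)")
```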

note: i'm definitely no expert on this, this is just my understanding.

Dr. Video Games 0031 fucked around with this message at 14:24 on Aug 22, 2021

Dr. Video Games 0031
Jul 17, 2004

Arzachel posted:

Said 2500k got obsoleted because of an 8-core netbook CPU :v:

It's hard to tell how things will shake out. Realistically, one of the cores on consoles is going to be pinned for the OS, and we have no idea whether anyone will actually use all 14 threads available. 6/12 vs 7/14 is a much smaller difference than 4/4 vs 8/8, but the cores themselves are much closer in power, etc. If you are going to be sticking with the system for a long time, getting a 5800x is probably not a bad choice despite the awkward pricing.

Spending that extra $100 to $150 on a better GPU will position you even better for the future than a CPU that doesn't provide even a single percent improvement right now. It seems insane to suggest a 5800X as a means of future proofing unless it's to a person who is buying a 3080 or higher and thus has nothing to upgrade to cheaply there.

Though I find it very hard to believe there will be a meaningful difference at all between the two over the course of this console gen. Stuff like all the extra L3 cache AMD will be adding to their chips will make a far bigger difference, and I expect the 5600X and 5800X to age and fall behind at almost the exact same pace.

Dr. Video Games 0031 fucked around with this message at 15:01 on Aug 23, 2021

Dr. Video Games 0031
Jul 17, 2004

NVidia supposedly requires laptop manufacturers to include GPU TDP in their laptop spec sheets, but these don't appear on store pages most of the time so consumers have to go hunting for them. And some brands such as HP seemingly ignore that requirement outright.

Dr. Video Games 0031
Jul 17, 2004

Have either of you messed with the curve optimizer at all? That could at least partially explain the difference.

Dr. Video Games 0031
Jul 17, 2004

SourKraut posted:

How much performance loss could be expected in gaming if I have a 5900X and went with CL16 3200MHz DDR4 over CL18 3600MHz? Both would be 2x DIMMs.

None. Those should perform at almost the exact same level. There may be slight differences depending on the subtimings, but nothing perceptible.
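The quick back-of-the-envelope math, looking only at first-word CAS latency (subtimings and Infinity Fabric clock behavior are ignored here):

```python
# CAS latency in nanoseconds: cycles (CL) divided by the actual memory clock,
# which is half the DDR transfer rate. Subtimings and fabric clock ignored.
def cas_latency_ns(cl: int, transfer_rate_mt_s: int) -> float:
    memory_clock_mhz = transfer_rate_mt_s / 2
    return cl / memory_clock_mhz * 1000

print(cas_latency_ns(16, 3200))  # 10.0 ns
print(cas_latency_ns(18, 3600))  # 10.0 ns -- identical absolute latency
```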

Dr. Video Games 0031
Jul 17, 2004

K8.0 posted:

Don't ever enable PBO. If you are a serious OCer you do it manually. If you aren't, you don't OC Zen3. If you want to do something, optimize your memory frequency & timings. It'll have a much larger impact than anything you can do with the CPU clock.

I've been hearing everyone else say "Don't OC manually, just use PBO" this whole time. Why are we now saying the opposite?


Dr. Video Games 0031
Jul 17, 2004

Oh yeah, if he means the curve optimizer then I can see that, though it's still a part of PBO (or PBO2 I guess to be more specific). That made a difference for me even when staying within stock power limits.
