Dr. Video Games 0031
Jul 17, 2004

Nvidia supposedly requires laptop manufacturers to include GPU TDP in their laptop spec sheets, but those numbers rarely appear on store pages, so consumers have to go hunting for them. And some brands, such as HP, seemingly ignore the requirement outright.


Gwaihir
Dec 8, 2009
Hair Elf

Paul MaudDib posted:

would a 5900X- or 5950X-haver mind seeing what Cinebench R23 scores you get / what clocks you get when setting a PPT of 60W, then again at 80W? or if you will only do one then give me the 60W number.

(I'm looking for 60W/80W actual total-package-power, so if your board is setting cTDP and then letting it boost above that, adjust accordingly.)

I had this idea of upgrading my NAS to a 5900X (or whatever 6900X comes with the Zen 3 refresh), but it'd need a low-profile cooler like this mini-box build, and 70W is realistically about what I can get it to dissipate even when sucking air directly from outside.

Temps are a completely meaningless number of course since I will have a much smaller cooler, but I can use the 60-70W power dissipation target to see what performance is like (ish - obviously boost considers temperatures and lower temps will boost higher, but it'll be close enough).

Clocking down under severe sustained multi-threaded AVX loads isn't the end of the world, and I will have very adequate airflow as far as the drives are concerned; I'm just curious how it performs within tight power envelopes. The multi-die configurations likely do pull some extra power for IF links etc., but that's probably not the end of the world either. AMD already did a similar configuration with the 3900 non-X, which is 65W TDP/88W PPT like the 5700G, and that's alright-ish within these constraints.

(or - if anyone has a Noctua L9a on a 5900X or a 5950X I'd be grateful to hear your numbers!)

I have a 5900X, at 60w PPT I got 10774 in R23. That was showing as about 35w on the cpu and 5w on the soc.

80w gave 16580 points, with about 1GHz higher clocks on all cores.

160w gave 23580 points, with another 1GHz on all cores.

e: Actual Screenshots



Gwaihir fucked around with this message at 01:27 on Sep 1, 2021

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug
System76, the Linux laptop maker, just announced a Ryzen 5700U-based model, exciting

https://twitter.com/arstechnica/status/1433152939980431360?s=20

Cygni
Nov 12, 2005

raring to post

Gwaihir posted:

I have a 5900X, at 60w PPT I got 10774 in R23. That was showing as about 35w on the cpu and 5w on the soc.

80w gave 16580 points, with about 1GHz higher clocks on all cores.

160w gave 23580 points, with another 1GHz on all cores.


with a 5950X, i got:

13991 at 71W PPT (5950x minimum is 71, cant even set 60 in Ryzen Master)
16858 at 80W
26778 at 160W
28793 uncorked

my SoC power draw was way higher than yours, in the 14w range regardless of PPT setting or load. wonder whats up with that? im running 3600 CL16 with coupling forced and geardown disabled, but untuned subtimings. could be a board setting as well i guess, but i havent manually set any SoC voltages or anything.
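
for the curious, heres the rough points-per-watt math from the scores in this thread. just a quick python sketch treating the PPT setting as the actual package draw, which isnt quite right (see the package-power discussion below), but close enough for eyeballing the scaling:

code:
# rough points-per-watt from the R23 scores posted above
# (assumes the PPT setting equals the real package draw, which it only roughly does)
r23 = {
    "5900X": {60: 10774, 80: 16580, 160: 23580},
    "5950X": {71: 13991, 80: 16858, 160: 26778},
}

for chip, runs in r23.items():
    for watts, score in runs.items():
        print(f"{chip} @ {watts:>3}W PPT: {score} pts, {score / watts:.0f} pts/W")

efficiency peaks around the 80W setting (at least among these data points) and falls off hard by 160W.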

Gwaihir
Dec 8, 2009
Hair Elf
Yeah, I've done no ram tuning either. Just DDR4-3200 using XMP settings. I'm using an MSI MEG Unify, too.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

https://www.theregister.com/2021/09/01/cloudflare_picks_amd_again/

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gwaihir posted:

I have a 5900X, at 60w PPT I got 10774 in R23. That was showing as about 35w on the cpu and 5w on the soc.

80w gave 16580 points, with about 1GHz higher clocks on all cores.

160w gave 23580 points, with another 1GHz on all cores.

e: Actual Screenshots




Hmm. Ryzen Master still doesn't appear to have 5700G support yet (same with the linux graphics drivers - ah, the AMD early adopter experience), so I can't be 100% sure that I'm looking at the same numbers as you. I went into BIOS and explicitly switched PBO from "auto" to "disabled" (did not seem to make any performance/power consumption difference so I guess the board was "auto" deciding to keep it disabled) so I assume I am running at the normal 65W TDP (which I believe is 88W PPT).
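
(For reference, the stock AM4 limits follow a simple rule of thumb, PPT is roughly 1.35x TDP, which is where the 65W -> 88W figure comes from. Quick sanity-check sketch in Python, nothing AMD-official about the code, just the published pairs:)

code:
# AM4 rule of thumb: stock PPT is about 1.35x the rated TDP
# (matches the published 65W -> 88W and 105W -> 142W pairs)
def ppt_from_tdp(tdp_watts):
    return round(tdp_watts * 1.35)

for tdp in (65, 105):
    print(f"{tdp}W TDP -> ~{ppt_from_tdp(tdp)}W PPT")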

Could I trouble you for one more minute? Could you set that 80W PPT limit, then start Cinebench (it doesn't need to run to completion), and then see what HWiNFO (v7.06 or similar) says your "CPU package power" is, and specifically whether it's the same as Ryzen Master's "PPT" measurement? That number is capping out at about 82.5-83W for me even at the start of the run, before the CPU is thermally saturated. I might be losing a couple watts to the iGPU even at idle, of course. In this instance the numbers added up to roughly 88W, but they don't always (of course that can be a sampling thing as well).

(edit: actually looking at this now I see there is a "CPU PPT Limit" meter at 93% so yeah, that's 81W actual power and 88W PPT limit I think.)



Your numbers seem low to me. Obviously running two dies pulls more IF/uncore power than one die, but you're also low compared to Cygni with their 5950X.

At ~80-82W "CPU package power" (which I think is most likely PPT) I get around 13300, and at 70-72W I get around 13000. That's with 2133 memory (yeah yeah) and everything stock - and obviously Cezanne has a smaller cache and IPC should be lower.

As far as Cygni's numbers go, I guess that makes sense, but it's a little disappointing. Wonder if the numbers would be better at 71W and 80W with a 5800X instead, being single die and all. I'd have hoped for more given the higher IPC, but I guess monolithic APUs have their merits too. It certainly is interesting how steep the perf/watt curve gets at the end - going from 72W to 82W is a 14% power increase that translates into a 2% performance gain (in CB obviously, other tasks may be different).

Thanks for your help all.

Paul MaudDib fucked around with this message at 17:58 on Sep 2, 2021

Gwaihir
Dec 8, 2009
Hair Elf
Certainly, happy to fuss with more stuff. I probably have some bios settings that are Not Quite Right after messing around with undervolts and other various sundry things over time. I'll give it a full reset and see what happens there.

Gwaihir
Dec 8, 2009
Hair Elf
Here's 80w PPT


Here's bone rear end stock other than turning on XMP after a fresh BIOS update:


And here's 180w PPT w/ -20 all core offset undervolt


Max single core speed is 5.074GHz with the settings from the last screenshot. So I don't think it's a perfectly dialed-in OC or w/e.

Gwaihir fucked around with this message at 20:01 on Sep 2, 2021

Cygni
Nov 12, 2005

raring to post

Gwaihir posted:

Here's 80w PPT


I tried to match your settings in every way (minus RAM because i was too lazy to reboot and fiddle), and here is what im getting:



The delta in the SoC power is still there, and the frequency difference is bigger than I expected. More cores fighting for the same watts, sure, but dang thats a big frequency drop off in comparison. I'm running an MSI board too, a B550 Gaming Carbon. Innnnnteresting.

Dr. Video Games 0031
Jul 17, 2004

Have either of you messed with the curve optimizer at all? That could at least partially explain the difference.

Gwaihir
Dec 8, 2009
Hair Elf
I do use the curve optimizer for my 24/7 settings (it makes a big difference in multi-core clocks imo) but in that screenshot Cygni quoted I wasn't. I did a bios update, reset everything to defaults, and just changed those two settings (XMP and 80w PPT)

Cygni
Nov 12, 2005

raring to post

Tried flashing to the latest BIOS MSI posted this month, which completely bricked my system for a good 10 hours there, but we're back! Not the first time this computer has gone to Haunted Mode on a BIOS reflash, extremely annoying. Booted once, let me in the BIOS, and then would only boot 1/10th of the time after that. Also the Clear CMOS button on the IO shield seems to be purely suggestive, which is neat. :v:

So with the latest flash and memory set to match Gwaihir's and nothing else in the BIOS touched, my clocks went... down more?



Ryzen is truly a mystery man. IO die using a little less power this time, but still more than Gwaihir is getting. I dunno dont ask me man stop yelling at me i dont know!!!!

redeyes
Sep 14, 2002

by Fluffdaddy
Prolly dealing more with MSI poo poo than Ryzen.

Gwaihir
Dec 8, 2009
Hair Elf
Maybe on X570 there's more going on in the discrete chipset portion of the board compared to B550, so the IO die inside the chip in a B550 board has to do more? I'm coming up empty, too. I also have all 3 m.2 slots on my board populated, including one PCIe gen4 ssd, so it's definitely using the CPU lanes from the IO die.

WhyteRyce
Dec 30, 2001

Finally updated my old rear end Kabylake Blue Iris and Plex server to a 5600G and holy poo poo I went from being at a constant 50-60% utilization with motion triggers maxing me out to a constant low teens with no spikes at all during a motion triggered recording. I even got frisky and bumped the live feed of my cameras to 20fps from 1fps and it didn’t make a difference

Why you make me wait so long for a good APU AMD?!?!

My Virtualbox images are BSODing when I load them though, any known issues with that?

Klyith
Aug 3, 2007

GBS Pledge Week

WhyteRyce posted:

My Virtualbox images are BSODing when I load them though, any known issues with that?

AMD-V (the VM hardware acceleration stuff) turned on in the BIOS? Lots of consumer & gamer mobos default to that (and CPU-integrated TPM) being off.
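
If you want to check from inside the OS before digging through the BIOS: on Windows the Task Manager CPU panel has a "Virtualization" line, and on a Linux box something like this quick Python sketch shows whether the CPU is at least advertising AMD-V (a BIOS toggle can still block it, in which case the kernel complains when loading KVM):

code:
# check whether the CPU advertises AMD-V (the "svm" flag) on Linux
# note: this only reflects the CPUID bit; a BIOS setting can still disable it
with open("/proc/cpuinfo") as f:
    flags = f.read().split()

if "svm" in flags:
    print("CPU advertises AMD-V (svm)")
else:
    print("no svm flag - virtualization unsupported or hidden")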

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

WhyteRyce posted:

Finally updated my old rear end Kabylake Blue Iris and Plex server to a 5600G and holy poo poo I went from being at a constant 50-60% utilization with motion triggers maxing me out to a constant low teens with no spikes at all during a motion triggered recording. I even got frisky and bumped the live feed of my cameras to 20fps from 1fps and it didn’t make a difference

Why you make me wait so long for a good APU AMD?!?!

My Virtualbox images are BSODing when I load them though, any known issues with that?

Was that 2 core or 4 core Kaby Lake that you were coming from? I'm sitting on a Haswell doing server duty and eyeing a 5600G. I don't want a GPU in the machine.

WhyteRyce
Dec 30, 2001

Twerk from Home posted:

Was that 2 core or 4 core Kaby Lake that you were coming from? I'm sitting on a Haswell doing server duty and eyeing a 5600G. I don't want a GPU in the machine.

4 core. Pretty happy with the upgrade although if it was only running Plex and not an ip cam setup then I probably wouldn’t need to upgrade.

I’ve been sitting on the mobo since last year so being able to update the bios without a cpu or memory installed was a nice surprise

Klyith posted:

AMD-V (the VM hardware acceleration stuff) turned on in the BIOS? Lots of consumer & gamer mobos default to that (and CPU-integrated TPM) being off.

Didn’t consider that, assumed it was just on by default like my previous board so will check

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



How much performance loss could be expected in gaming if I have a 5900X and go with CL16 3200MHz DDR4 over CL18 3600MHz? Both would be 2x DIMMs.

Dr. Video Games 0031
Jul 17, 2004

SourKraut posted:

How much performance loss could be expected in gaming if I have a 5900X and go with CL16 3200MHz DDR4 over CL18 3600MHz? Both would be 2x DIMMs.

None. Those should perform at almost the exact same level. There may be slight differences depending on the subtimings, but nothing perceptible.

movax
Aug 30, 2008

Dr. Video Games 0031 posted:

None. Those should perform at almost the exact same level. There may be slight differences depending on the subtimings, but nothing perceptible.

Yeah — that’s a pretty good / damned low latency, which makes up for a lot. Might as well try to OC it if you want, might get lucky, but almost certainly would have to relax timings which negates the point of the lower latency.
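
For the curious, the napkin math on why those two kits land in the same place (first-word latency only, ignoring subtimings and fabric clock):

code:
# first-word latency in ns = CAS cycles / memory clock, where clock = data rate / 2
def cas_latency_ns(cl, data_rate_mts):
    return cl / (data_rate_mts / 2) * 1000

for cl, rate in ((16, 3200), (18, 3600)):
    print(f"CL{cl} @ DDR4-{rate}: {cas_latency_ns(cl, rate):.1f} ns")

Both work out to 10 ns flat, so any difference comes down to bandwidth and subtimings rather than latency, hence the "almost the exact same level" answer above.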

Otakufag
Aug 23, 2004
Can a 5600x with stock cooler get damaged long term by having "asus performance enhancer" activated in the bios?

Klyith
Aug 3, 2007

GBS Pledge Week

Otakufag posted:

Can a 5600x with stock cooler get damaged long term by having "asus performance enhancer" activated in the bios?

a. In general it is best to avoid automatic motherboard overclocking or "performance enhancer" stuff because knowing exactly what it's doing is difficult. Mobo companies will use the same branding / name for a feature, but what that feature actually does will be different on different boards.

I see stuff about the Asus performance enhancer that has settings for On/Off or multiple levels. The On/Off one and levels 1 & 2 of the multi-level one seem to just be turning on PBO, which is pretty safe. Levels 3 & 4 apply extra voltage, which will degrade the CPU faster than stock voltage. But it might be different on your board!


b. The specifics of the cooler don't have a lot of impact on how safe or damaging an overclock (or particularly an over-voltage) will be. They have a lot to do with how successful it is & how much real performance you gain. Throw a bunch of extra juice on something cooled by a stock cooler and it'll either crash or insta-throttle, while a big liquid cooler would run the same settings. But that doesn't mean that the liquid system is *safe*.


c. Any OCing is 100% pointless with the stock cooler on a 5600x. You are already performance-limited by the crappy wraith stealth -- with a better heatsink the CPU will boost higher and longer with basic Precision Boost.

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord
Alternatively, the wraith that comes with the 5600x is perfectly fine, your cpu will last far beyond its service life, and doing nothing more than check-marking the OC box in the bios won't make it blow up.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Even on an 8-core the performance scaling above 60W is so bad it’s pretty much not worth doing. Going from 70W to 82W is a 300 point increase in cinebench (13000 to 13300) on my 5700G. Even at 60W the performance loss was practically negligible.

The noise thing is legit though, the Stealth is a lovely cooler in that aspect by all accounts. But if you limit the chip so it runs at a lower TDP you can control that without costing much performance from what I’m seeing.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Don't ever enable PBO. If you are a serious OCer you do it manually. If you aren't, you don't OC Zen3. If you want to do something, optimize your memory frequency & timings. It'll have a much larger impact than anything you can do with the CPU clock.

Dr. Video Games 0031
Jul 17, 2004

K8.0 posted:

Don't ever enable PBO. If you are a serious OCer you do it manually. If you aren't, you don't OC Zen3. If you want to do something, optimize your memory frequency & timings. It'll have a much larger impact than anything you can do with the CPU clock.

I've been hearing everyone else say "Don't OC manually, just use PBO" this whole time. Why are we now saying the opposite?

Klyith
Aug 3, 2007

GBS Pledge Week

Freedom Trails posted:

your cpu will last far beyond its service life, and doing nothing more than check-marking the OC box in the bios won't make it blow up.

While it's true that most people don't care whether their CPU lasts 15 years or 30 years, and whatever over-voltage a mobo's one-click "OC for dummies" applies will likely be quite modest, the question was not whether it would blow up. The question was whether it would be damaged long term. To which the answer is yes: running voltage over spec is known to shorten the lifespan of chips or cause small, gradually accumulating degradation.

How fast or slow it happens is hugely variable. There are people that ran the 2600k in the +.1 to +.15V range for a full decade without problems. But there are also people who have found that a dialed-in OC & OV combination becomes unstable after a few years.

Freedom Trails posted:

Alternatively, the wraith that comes with the 5600x is perfectly fine

:wrong: The wraith stealth loving sucks. It's adequate to make the CPU functional. It leaves performance on the table at stock CPU settings.

Here's a test that found 5-15% performance differences depending on the task, on a 3600. And that testing was against a deepcool gammaxx, the cheapest grottiest tower heatsink your lack of money can buy.


Dr. Video Games 0031 posted:

I've been hearing everyone else say "Don't OC manually, just use PBO" this whole time. Why are we now saying the opposite?

There's very little to gain from PBO. Basically AMD has got the normal precision boost so well tuned, and their manufacturing so consistent, that the CPUs are operating very close to their peak without it. Like, why even bother for a trivial 1% difference? And 7nm is really affected by internal temperature since each core is so dang tiny, so pushing more watts is kinda counterproductive.

So now the new hotness is fiddling with the PBO2 curves to try to shift the frequency/voltage curve, such that the CPU asks for less V per MHz. Effectively an undervolt -- but not a constant one because it's a curve. If you win the silicon lottery, you get an OC because the chip boosts harder. So still using PBO, but very different.

Klyith fucked around with this message at 22:12 on Sep 13, 2021

Dr. Video Games 0031
Jul 17, 2004

Oh yeah, if he means the curve optimizer then I can see that, though it's still a part of PBO (or PBO2 I guess to be more specific). That made a difference for me even when staying within stock power limits.

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord

Klyith posted:

:wrong: The wraith stealth loving sucks. It's adequate to make the CPU functional. It leaves performance on the table at stock CPU settings.

I mean, you’re certainly entitled to your opinion, but the OP was asking if clicking on a manufacturer’s bios setting would harm their cpu, so I don’t think we’re dealing with LN2 cowboy chasing 3% gains. I’ve got the 5600X and the Wraith is fine.

Seamonster
Apr 30, 2007

IMMER SIEGREICH
I wouldn't say it sucks, but something that actually directs CPU-heated air toward an exhaust fan would be better than just spinning it all around inside the case.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

WhyteRyce posted:

Finally updated my old rear end Kabylake Blue Iris and Plex server to a 5600G and holy poo poo I went from being at a constant 50-60% utilization with motion triggers maxing me out to a constant low teens with no spikes at all during a motion triggered recording. I even got frisky and bumped the live feed of my cameras to 20fps from 1fps and it didn’t make a difference

Why you make me wait so long for a good APU AMD?!?!

My Virtualbox images are BSODing when I load them though, any known issues with that?

the 5600G and 5700G are amazing for embedded/HTPC style situations like this; it's the first time 7nm chips with onboard graphics have been available to the general public

I know I'm probably missing out vs a 5900X or something, but for a tiny little HTPC style rig like mine it's pretty much best in class as far as things you can actually slot into this chassis at 60W or whatever

I did have some video capture (relive/OBS) performance problems I'm hoping are resolved with the linux drivers though, planning to move to linux any time now when the new poo poo gets merged. Right now ReLive is only supported if you copy in an older version of the Radeon Settings app (and I don't know any reason that would change, it looks like an official decision not to support 4750G or others either), and relive and OBS both incur massive performance hits to use capture. It's about 50% for ReLive and this is actually lower than OBS, which is about 66% in hardware encoding mode and that same 50% in software mode (but worse frametimes as you'd expect).

Paul MaudDib fucked around with this message at 06:28 on Sep 14, 2021

Dr. Video Games 0031
Jul 17, 2004

In a well-ventilated case, I don't think you'll throttle with the Wraith cooler on a 5600X. You can reach the typical boost clocks for the stock power limit just fine; it'll just be very loud about it. I should know because I did it in a poorly ventilated case. I did get some throttling, though that cleared up once I improved the airflow situation. So it's far from ideal, but it does not leave any performance on the table as long as your PC has good airflow. You'll only run into issues if you have poo poo airflow.

That said, a lot of people have poo poo airflow—just look at how popular cases with solid front panels are on Amazon. The wraith also gives you zero headroom for going beyond stock settings. I replaced my wraith after a few months with the massive, overkill Dark Rock Pro 4 and never looked back. I enjoy the silence more than anything.

Dr. Video Games 0031 fucked around with this message at 06:36 on Sep 14, 2021

Klyith
Aug 3, 2007

GBS Pledge Week
A 5600X having measurable performance differences, at stock settings, due to the wraith stealth being a limitation:



from pcgamer


It was constantly recommended in this very thread for the previous 2 generations, when people asked whether they should get the _600 or _600X, "get the non-X and spend the difference on a heatsink". Because a non-X would equal the X in performance once you put a good heatsink on it that allowed higher or longer boosting.


Is the difference huge? No. But neither is the difference you can get from OCing a Zen 3 with non-exotic cooling. Using PBO or curve editing you can squeeze out an extra 100MHz here or there. That's gonna be a ~2% difference. In those charts you see 3-4% differences.

Dr. Video Games 0031
Jul 17, 2004

Now that's strange. I didn't notice any difference in peak clocks and my cinebench score stayed basically the same when I switched from a well-ventilated Wraith setup to a well-ventilated bigass tower cooler. My testing was far from scientific though so I could've easily missed something.

I also just ran a quick test. Full stock CPU settings gets me a multicore Cinebench R23 score of 10209. Curve optimizer with a -20 all-core undervolt and power settings still at default (76W PPT in Ryzen Master for both this test and the last) gets me a score of 10894. That's 6.7% "free" performance for no additional power draw. Lifting the power limits while curve optimized gets me to 11545, for a total gain of 13%, much higher than the 2% you claim I should get. I'd have to turn on Auto Overclocking to go any higher because I'm hitting the default boost clock limit of 4600 MHz now, but my system isn't terribly stable when I do that so I avoid it (it's otherwise perfectly stable with that turned off). With the power limits lifted, my 5600X caps out at 112W PPT.
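
Quick arithmetic on those runs, in case anyone wants to check my percentages (Python, just the three scores above):

code:
# relative gains from the three Cinebench R23 runs quoted above
stock, curve, curve_unlocked = 10209, 10894, 11545
print(f"curve optimizer only:            +{(curve / stock - 1) * 100:.1f}%")
print(f"curve optimizer + lifted limits: +{(curve_unlocked / stock - 1) * 100:.1f}%")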

And to be clear, I also always recommend that people upgrade from the Wraith cooler. For a lot of people it will hamper performance, and it's loud as hell. And constantly running up against thermal limits will degrade your chip faster. It's just always better to get an aftermarket cooler, even a cheapo tower cooler like a Hyper 212.

Dr. Video Games 0031 fucked around with this message at 16:38 on Sep 14, 2021

Pittsburgh Fentanyl Cloud
Apr 7, 2003


Klyith posted:

A 5600X having measurable performance differences, at stock settings, due to the wraith stealth being a limitation:



from pcgamer


It was constantly recommended in this very thread for the previous 2 generations, when people asked whether they should get the _600 or _600X, "get the non-X and spend the difference on a heatsink". Because a non-X would equal the X in performance once you put a good heatsink on it that allowed higher or longer boosting.


Is the difference huge? No. But neither is the difference you can get from OCing a zen 3 with non-exotic cooling. Using PBO or curve editing you can squeeze out an extra 100mhz here or there. That's gonna be a ~2% difference. In those charts you see 3-4% differences.

I got a 5600X and the first thing I did after assembling it was order a Noctua NH-U12S because the idle temps with the stock cooler were alarming compared to the 3600X it replaced.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
The real story of those graphs is that it is absolutely not worth spending over 100 bucks on an AIO for a 5600X. The difference is minuscule. Just get a cheap air cooler so you don't have to put up with the noise of the wraith.

CaptainSarcastic
Jul 6, 2013



ConanTheLibrarian posted:

The real story of those graphs is that it is absolutely not worth spending over 100 bucks on an AIO for a 5600X. The difference is miniscule. Just get a cheap air cooler so you don't have to put up with the noise of the wraith.

I had my case temps improve overall once I put a tower cooler on my 3600X, because now the air is going out the back of the case instead of blowing down toward the motherboard.


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I'm not super interested in manually cranking down timings, but is this going to be fine for something that just runs at XMP?

https://smile.amazon.com/dp/B07WLCWQV5
