Cygni
Nov 12, 2005

raring to post

Now that AMD has released their last AM4 CPUs (in all likelihood) and basically EOLed AM4, they are officially adding newer CPU support to older boards... after years of fighting the board makers who were doing it themselves, and trying to force people to upgrade lol.

It is still a slightly goofy situation, and with the exception of a few boards from ASUS and maybe a few other scattered high-end options from the other brands, AM4 is not going to be an "every board supports every CPU" platform. Here is the new official chart from AMD:

[image: AMD's official AM4 CPU/chipset support chart]

Beyond the fact that AMD has memory-holed Bristol Ridge completely at this point, it is bizarre that X570 can officially support Zen+ but not Zen 1, and that A320 can support everything except Ryzen 3000? I imagine this is mostly because they don't want to bother doing the qualification work when their folks are surely on to AM5 projects by this point.

It is also still gonna be board vendor specific. I don't have high hopes for some of the older AM4 boards I have which get sporadic updates at best.

School Nickname
Apr 23, 2010

*fffffff-fffaaaaaaarrrtt*
:ussr:
I'm still gonna have to BIOS flash my B450 Tomahawk non-Max for a 5000 series, correct? I recall that Max boards didn't need a flash while non-Max boards did. Currently have a 2600 in it paired with a 3070. I'm tempted to get the 5700X since its TDP is the same as the 2600's, but I'm not sure how that translates to real power consumption and whether my 650W PSU can handle it. Will likely get a 5600 or a 5600X on sale to be honest.

School Nickname fucked around with this message at 14:31 on Mar 16, 2022

v1ld
Apr 16, 2012

Yes, you'll have to flash for compatibility, which may also open you up to the fTPM stutter bug if you're using fTPM. There's a new BIOS image with a fix due in early May, so if you've done extensive BIOS tweaks you may want to hold off till then and do a single upgrade.

Prices have dropped for all of the older parts. Searching /r/buildapcsales shows 5950X prices at $580-590 most places, already below that $599 Best Buy deal. There are Woot listings at $520, though I don't think of that as real pricing.

Consider pricing on the older parts; they may be a better deal than the new ones.

v1ld fucked around with this message at 16:02 on Mar 16, 2022

Arzachel
May 12, 2012

School Nickname posted:

I'm still gonna have to BIOS flash my B450 Tomahawk non-Max for a 5000 series, correct? Currently have a 2600 in it paired with a 3070. I'm tempted to get the 5700X since its TDP is the same as the 2600's, but I'm not sure how that translates to real power consumption and whether my 650W PSU can handle it. Will likely get a 5600 or a 5600X on sale to be honest.

For both Zen 2 and Zen 3, power draw is capped by PPT, which on 65W parts is set to 88W and can be changed through the BIOS. A decent 650W PSU should be more than fine.
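
Quick back-of-the-envelope math on that, assuming the usual ~1.35x TDP-to-PPT ratio and a ballpark 220W for a reference 3070 (everything here besides the PPT cap is a rough guess):

code:
# Rough PSU headroom check for a 65W-TDP Ryzen + RTX 3070 build.
# PPT is what the CPU package can actually pull, not the TDP number.
CPU_PPT = 88          # 65W TDP parts cap at 88W PPT (~1.35x TDP)
GPU_DRAW = 220        # reference RTX 3070 TGP; partner cards can pull more
REST_OF_SYSTEM = 75   # board, RAM, drives, fans -- generous guess

total = CPU_PPT + GPU_DRAW + REST_OF_SYSTEM
print(f"worst-case sustained draw: ~{total}W of 650W ({total / 650:.0%} load)")
# -> ~383W of 650W, ~59% load, plenty of headroom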

School Nickname
Apr 23, 2010

*fffffff-fffaaaaaaarrrtt*
:ussr:

Arzachel posted:

For both Zen 2 and Zen 3, power draw is capped by PPT, which on 65W parts is set to 88W and can be changed through the BIOS. A decent 650W PSU should be more than fine.

Cheers. Looking on amazon.de I see a 5600X for €240 incl. delivery, which is probably as cheap as I'll ever get it in Ireland. I'll wait for 5700X reviews and make my decision then. Never thought I'd be putting an eight-core 5000-series chip in my B450.

Klyith
Aug 3, 2007

GBS Pledge Week
Also, the Tomahawk can handle pretty much any Ryzen chip; it has a decent VRM for a cheap board, which is why it got a lot of recommendations. If you were to put a CPU with more than 8 cores in it I would leave PBO turned off / stock, and if you're doing water cooling maybe look at the case airflow around the VRMs. But it can do it.

kliras
Mar 27, 2021
best summary of the interview:

https://twitter.com/IanCutress/status/1504220601426358279

seems pretty good to a humble ppt undervolter like myself

kliras fucked around with this message at 00:11 on Mar 17, 2022

Anime Schoolgirl
Nov 28, 2002

Cygni posted:

Now that AMD has released their last AM4 CPUs (in all likelihood) and basically EOLed AM4, they are officially adding newer CPU support to older boards... after years of fighting the board makers who were doing it themselves, and trying to force people to upgrade lol.

It is still a slightly goofy situation, and with the exception of a few boards from ASUS and maybe a few other scattered high-end options from the other brands, AM4 is not going to be an "every board supports every CPU" platform. Here is the new official chart from AMD:

[image: AMD's official AM4 CPU/chipset support chart]

Beyond the fact that AMD has memory-holed Bristol Ridge completely at this point, it is bizarre that X570 can officially support Zen+ but not Zen 1, and that A320 can support everything except Ryzen 3000? I imagine this is mostly because they don't want to bother doing the qualification work when their folks are surely on to AM5 projects by this point.

It is also still gonna be board vendor specific. I don't have high hopes for some of the older AM4 boards I have which get sporadic updates at best.
I wish I had bought more ASRock boards, because those have been paying off in spades. Even though I already have a Zen 2 fleet of PCs and upgrading to Zen 3 sounds excessive, they've been really handy due to their willingness to shove the latest CPU support into "unofficial" BIOS updates.

Except the A300 :argh:

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

kliras posted:

best summary of the interview:

https://twitter.com/IanCutress/status/1504220601426358279

seems pretty good to a humble ppt undervolter like myself

Does DRAM tuning get anything from a voltage bump on Ryzen CPUs, like it does for Intel chipsets with VCCSA and VCCIO? I guess this is kind of an overclocking-thread question. I just want to know if locked CPU voltage affects anything about tuning DRAM, since the upcoming Ryzen will be my first one.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Sidesaddle Cavalry posted:

Does DRAM tuning get anything from a voltage bump on Ryzen CPUs, like it does for Intel chipsets with VCCSA and VCCIO? I guess this is kind of an overclocking-thread question. I just want to know if locked CPU voltage affects anything about tuning DRAM, since the upcoming Ryzen will be my first one.

Yeah, and on Ryzen that is controlled by vSOC. Note that some boards do this automatically when you enable XMP/AMP.

Don't go too hog wild; you can damage the IMC over time.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Noted, and I'll try to keep it humble. The point was to avoid the DRAM as much as possible with the V-Cache anyways!

BurritoJustice
Oct 9, 2012

Sidesaddle Cavalry posted:

Does DRAM tuning get anything from a voltage bump on Ryzen CPUs, like it does for Intel chipsets with VCCSA and VCCIO? I guess this is kind of an overclocking-thread question. I just want to know if locked CPU voltage affects anything about tuning DRAM, since the upcoming Ryzen will be my first one.

Yeah, Ryzen needs higher IMC and SoC voltages for higher RAM clocks, especially if you're going for 1:1 with the Infinity Fabric at 3800+.

I'd be surprised if they lock out those voltages too, but yeah, you'd have issues going above 3200 if you're hard-locked at stock on every voltage.
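
For reference, the clock math behind that 1:1 target (a sketch of the usual Zen 2/3 relationship; DDR is double data rate, so the actual memory clock is half the transfer rate, and the decoupled fallback values here are just typical numbers):

code:
# Zen 2/3 memory clock domains: MCLK (DRAM), UCLK (memory controller),
# FCLK (Infinity Fabric). "1:1" means all three run at the same clock.
def clocks_for(ddr4_rating, coupled=True):
    mclk = ddr4_rating / 2                # DDR4-3800 -> 1900 MHz actual clock
    fclk = mclk if coupled else 1800      # decoupled FCLK often sits around 1800
    uclk = mclk if coupled else mclk / 2  # UCLK drops to 2:1 when decoupled
    return mclk, fclk, uclk

print(clocks_for(3800))  # (1900.0, 1900.0, 1900.0) -- needs FCLK stable at 1900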

Arzachel
May 12, 2012

kliras posted:

best summary of the interview:

https://twitter.com/IanCutress/status/1504220601426358279

seems pretty good to a humble ppt undervolter like myself

I'm guessing that means no Curve Optimizer undervolting, right?

BurritoJustice
Oct 9, 2012

Arzachel posted:

I'm guessing that means no Curve Optimizer undervolting, right?

That's the consensus from leakers, yeah. Basically anything touching the clocks or voltages of the cores is locked.

redeyes
Sep 14, 2002

by Fluffdaddy
I bet increasing voltage nukes the extra cache.

kliras
Mar 27, 2021
pretty cool how the l3 outperforms dram at ~24mb

https://twitter.com/IanCutress/status/1505902850467057665

can't wait for the super nerdy 5800x3d benchmarks

v1ld
Apr 16, 2012

The 5900X has 64MB of L3 and should be good competition for the 5800X3D: +50% cores and -33% cache at the same price, or more likely at an appreciable discount, being the old thing vs the new thing.

Even looking only at gaming workloads, I prefer the additional cores for future emulation potential.

E: I just don't see how the X3D is a good deal compared to a 5900X. This is additional L3, not L1/L2 cache. Latencies are high, just 3:1 against DRAM! I'd take the additional 4 cores over another 50% cache every time, when it's L3. Now if that were additional L1, where you'd actually be forestalling pipeline stalls, yeah, for sure. But your CPU is going to stall waiting on that L3 anyway. Better to have more cores for when you need them - in my opinion. All this assuming price parity.

v1ld fucked around with this message at 17:29 on Mar 21, 2022

Pablo Bluth
Sep 7, 2007

I've made a huge mistake.
https://www.phoronix.com/scan.php?page=article&item=amd-epyc-7773x-linux&num=1

Some very impressive gains shown in the Phoronix benchmarks. Cutting 40% off an OpenFOAM run is serious business.

VorpalFish
Mar 22, 2007
reasonably awesome™

v1ld posted:

The 5900X has 64MB of L3 and should be good competition for the 5800X3D: +50% cores and -33% cache at the same price, or more likely at an appreciable discount, being the old thing vs the new thing.

Even looking only at gaming workloads, I prefer the additional cores for future emulation potential.

E: I just don't see how the X3D is a good deal compared to a 5900X. This is additional L3, not L1/L2 cache. Latencies are high, just 3:1 against DRAM! I'd take the additional 4 cores over another 50% cache every time, when it's L3. Now if that were additional L1, where you'd actually be forestalling pipeline stalls, yeah, for sure. But your CPU is going to stall waiting on that L3 anyway. Better to have more cores for when you need them - in my opinion. All this assuming price parity.

For purely gaming workloads, the X3D will probably be faster than the 5900X (and 5950X) for as long as these chips are relevant, so if that's your only demanding CPU workload, it will make sense.

You can say it's better to have more cores when you need them, but with games you basically never need them.

Obviously it's positioned in an awkward spot where there is no "all things to all people, best of the best" SKU anymore, so you have to choose what's most important between gaming performance and content creation / compiling / virtualization, rather than being able to throw money at it and get something that's the best at everything.

v1ld
Apr 16, 2012

Not saying more cache isn't better - it's always better, unless your working set already fits in existing cache, which is unlikely for these workloads.

AMD and most reviewers compare the 5800X3D to the 5800X, when the comparison to the 5900X seems better because of pricing, since the 5900X should be cheaper once these things hit the street.

The 5800X3D has +200% L3 relative to the 5800X and only +50% relative to the 5900X. Going to the 5900X already gets you a cache twice as large / +100% over the 5800X's. And +50% more cores.
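
The arithmetic behind those percentages, spelled out (L3 capacities from the published specs):

code:
# L3 capacity across the relevant Vermeer SKUs, in MB
l3 = {"5800X": 32, "5900X": 64, "5800X3D": 96}

def delta(a, b):
    return f"{b} has {l3[b] / l3[a] - 1:+.0%} L3 vs {a}"

print(delta("5800X", "5800X3D"))  # +200%
print(delta("5900X", "5800X3D"))  # +50%
print(delta("5800X", "5900X"))    # +100%, plus +50% cores (12 vs 8)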

But as said above, for current gaming workloads 8 cores should be sufficient, and the +50% cache should yield some benefit, though nowhere near as much as in the comparison to the 5800X. But if you're looking to emulate PS4/PS5 in the future - a niche need, but one I'm interested in - those extra cores will come in handy, since parallelism will help a lot there.

E: That Phoronix benchmark is neat, but it's showing a +40% bump in perf (very cool) when going from 256MB to 768MB of L3. That's another 3x jump in L3, not 256MB vs 384MB, say. It'd be interesting to know the working-set size of that benchmark; it obviously benefited greatly from the additional lower-latency cache.

v1ld fucked around with this message at 18:20 on Mar 21, 2022

VorpalFish
Mar 22, 2007
reasonably awesome™

v1ld posted:


But as said above, for current gaming workloads 8 cores should be sufficient, and the +50% cache should yield some benefit, though nowhere near as much as in the comparison to the 5800X. But if you're looking to emulate PS4/PS5 in the future - a niche need, but one I'm interested in - those extra cores will come in handy, since parallelism will help a lot there.


If I were a betting man, I'd bet good money future PS4/PS5 emulation is gonna scale well with per-core performance, and not well past 8 cores - in other words, I'd bet on the X3D over the 5900X for that use case, too.

Edit: bear in mind, when you talk about the X3D only having 50% more cache, that half of the 5900X's L3 sits on the other CCD, accessed via Infinity Fabric.

VorpalFish fucked around with this message at 20:09 on Mar 21, 2022

VorpalFish
Mar 22, 2007
reasonably awesome™

Should clarify I'm not suggesting people go out and replace 5900Xs; they're still plenty fast for games. Just that the 5800X3D has a niche in the "my budget is effectively unlimited and my use case is only gaming" category.

All pending a good showing in independent benchmarks, of course.

Klyith
Aug 3, 2007

GBS Pledge Week
Why in god's name would any Ryzen chip have trouble emulating a PS4? A PS4 has 8 slow-ass cat cores; I would expect a 1500X to easily manage the same CPU performance. Might need some sort of extra hyperthread-aware scheduler to keep the heaviest threads from landing on the same core, but even that maybe not.


edit: lmao, fuck, God of War's PC port minimum CPU requirement is a Ryzen 1200 and recommended is a 2400G

LRADIKAL
Jun 10, 2001

Fun Shoe
Emulators are very intensive.

FuturePastNow
May 19, 2014


I am going to upgrade from my 3600 to either a 5900X (my original plan) or 5800X3D. Just waiting for the juicy reviews.

Seamonster
Apr 30, 2007

IMMER SIEGREICH

Klyith posted:

Why in god's name would any Ryzen chip have trouble emulating a PS4? A PS4 has 8 slow-ass cat cores; I would expect a 1500X to easily manage the same CPU performance. Might need some sort of extra hyperthread-aware scheduler to keep the heaviest threads from landing on the same core, but even that maybe not.


edit: lmao, fuck, God of War's PC port minimum CPU requirement is a Ryzen 1200 and recommended is a 2400G

For comparison's sake, it's recommended to run a dual-core CPU faster than 2.0 GHz to emulate a PS2, which had a 300MHz single core. Emulation, in most cases, requires heavy overkill.
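
The raw clock ratio alone makes the point (crude math using the numbers above, ignoring IPC differences entirely):

code:
# Recommended host vs. PS2 guest, clock budget only
host_mhz = 2 * 2000   # dual core at 2.0 GHz
guest_mhz = 300       # the PS2's single ~300 MHz core
print(f"~{host_mhz / guest_mhz:.0f}x the guest's raw clock")  # ~13x, before IPC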

VorpalFish
Mar 22, 2007
reasonably awesome™

Klyith posted:

Why in god's name would any Ryzen chip have trouble emulating a PS4? A PS4 has 8 slow-ass cat cores; I would expect a 1500X to easily manage the same CPU performance. Might need some sort of extra hyperthread-aware scheduler to keep the heaviest threads from landing on the same core, but even that maybe not.


edit: lmao, fuck, God of War's PC port minimum CPU requirement is a Ryzen 1200 and recommended is a 2400G

In theory it was gonna be "really easy" to emulate 'cause it's also x86, and yet here we are...

But yes, the per-core performance of those cat cores was dismal.

denereal visease
Nov 27, 2002

"Research your own experience. Absorb what is useful, reject what is useless, add what is essentially your own."

VorpalFish posted:

Should clarify I'm not suggesting people go out and replace 5900Xs; they're still plenty fast for games. Just that the 5800X3D has a niche in the "my budget is effectively unlimited and my use case is only gaming" category.

All pending a good showing in independent benchmarks, of course.
Really curious to see how the 5800X3D does wrt gaming performance and thermals versus the 5800X; might be doing a sidegrade and riding the 5800X3D into its grave.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

LRADIKAL posted:

Emulators are very intensive.
Emulation? What?

What am I missing? What is there to "emulate" when target and host CPUs are of the same family?! (re: PS4)

CaptainSarcastic
Jul 6, 2013



Combat Pretzel posted:

Emulation? What?

What am I missing? What is there to "emulate" when target and host CPUs are of the same family?! (re: PS4)

Proprietary hardware and OS, I would think. And even if it's the same processor family I'm pretty sure the consoles aren't using off-the-shelf CPUs.

Klyith
Aug 3, 2007

GBS Pledge Week

VorpalFish posted:

In theory it was gonna be "really easy" to emulate 'cause it's also x86, and yet here we are...

It's gonna be really easy to emulate the PS4 & Xbone once people are fully hacking it from the inside, disassembling the software, and figuring out how the security system works. Or like, maybe someone hacks MS or Sony and dumps the source code for the OS onto the internet. At that point it will suddenly become pretty easy.

It's hard in a completely different way from previous console emulation. There would still be plenty of work to make an emulator, but it'll look much closer to Wine/Proton -- a translation layer, which is not hard to make performant (compared with emulating hardware on different hardware).



CaptainSarcastic posted:

Proprietary hardware and OS, I would think. And even if it's the same processor family I'm pretty sure the consoles aren't using off-the-shelf CPUs.

The x86-64 parts of the console processors are totally normal. The GPUs have some extra features that aren't on a normal GCN or RDNA2 GPU, which will need emulation & workarounds. And the integrated security features are unique (possibly not totally unique in the case of MS, since their Pluton thing came from Xbox work).

And it's not the uniqueness that makes it hard; it's that they are really secure systems. The first true jailbreak on PS4 happened just a few months ago. And the Xbone still hasn't been broken.

repiv
Aug 13, 2009

The PS4 OS was already booted on PC hardware under hardware virtualization years ago; the CPU side is basically solved and should be quite fast.

https://github.com/AlexAltea/orbital

https://www.youtube.com/watch?v=IBBHisNM74o

The hard part, as mentioned, is everything besides the CPU: especially getting decrypted software for anything beyond FW 9.0 (the current limit for jailbreaking PS4s), and mapping the GCN-centric graphics API to DX12/Vulkan efficiently.

repiv fucked around with this message at 23:46 on Mar 21, 2022

v1ld
Apr 16, 2012

While the x86 nature of the PS4/PS5 makes emulation easier in many ways, there are going to be weird hardware changes that'll still add overhead. The PS5 has better-than-DirectStorage texture handling with dedicated decompression hardware that bypasses the CPU entirely, for example.

VorpalFish posted:

If I were a betting man, I'd bet good money future PS4/PS5 emulation is gonna scale well with per-core performance, and not well past 8 cores - in other words, I'd bet on the X3D over the 5900X for that use case, too.

Edit: bear in mind, when you talk about the X3D only having 50% more cache, that half of the 5900X's L3 sits on the other CCD, accessed via Infinity Fabric.

The cross-CCD split cache is a very good point, hadn't considered that.

BrainDance
May 8, 2007

Disco all night long!

The original Xbox had a slightly customized PIII, yet look at where we are with Xbox emulation. It's not as straightforward as "same architecture, no need to emulate, easy and no performance penalty."

Inept
Jul 8, 2003

BrainDance posted:

The original Xbox had a slightly customized PIII, yet look at where we are with Xbox emulation. It's not as straightforward as "same architecture, no need to emulate, easy and no performance penalty."

Yeah, and the XGPU was a slightly customized GeForce 3. It was probably the most off-the-shelf console we'll ever get.

repiv
Aug 13, 2009

The Switch uses a literal off-the-shelf SoC with no customizations at all :v:

No surprises that emulation is so far along

repiv
Aug 13, 2009

Regarding the GPU in the original Xbox, the author of the CXBX emulator disputes that it's just a slightly customized GF3:

quote:

3. Emulating any hardware by NVIDIA is not a walk in the park! The Xbox's GPU, the NV2A is often assumed just a GeForce 3. It's not! It's similar but not identical. It has some GeForce 4 capabilities too, so it's more of a cross between an NV20 and NV25. This is by no means easy to emulate either. NVIDIA's GPUs have very large register sets and afaik not even half of them have been discovered, and a large portion of known registers have unknown purposes. There is little to no documentation on how NVIDIA GPUs work. The best thing to do is to look at similar GPUs such as RIVA, TNT, and older GeForce cards. Some registers are similar, but not identical. The best place to look for information is in open source drivers available on the net. Adding to the difficulty is that no one has ever discovered how pixel shaders work on NV2x cards, vertex shaders yes though. The Xbox GPU also has exclusive registers that are not found in other GeForce cards. Information on the NV2A's GPU registers are just now beginning to be discovered a few months ago. And yet, there's still a long way to go. The GeForce 3 series is the most mysterious of all NVIDIA GPUs (G7x and G8x aside) and the NV2A is a lot worse.

Even the vanilla version of NV20 was never fully documented and supported by the open source drivers, so good luck with the weird one-off NV20/NV25 hybrid that was only ever used in the Xbox.

It probably doesn't help that the OG Xbox just doesn't have many notable games that haven't been ported to a better or easier-to-emulate system, which limits the amount of interest in reverse engineering it.

repiv fucked around with this message at 01:03 on Mar 22, 2022

Klyith
Aug 3, 2007

GBS Pledge Week

BrainDance posted:

The original Xbox had a slightly customized PIII, yet look at where we are with Xbox emulation. It's not as straightforward as "same architecture, no need to emulate, easy and no performance penalty."

Microsoft seems to manage just fine though?

Really the OG Xbox situation is that very few people cared; there was no scene. There were almost no exclusives of note on the system. Hobby projects like this need a good number of people who care and are sharing information. At some point a few years ago a scene sprang up, and they made pretty fast progress in that time.

And I didn't say easy, I said easier. Even MS has had trouble with some games and they have all the information.



Anyways, quibbling over the imaginary performance of emulators that don't exist is pointless. Picking a 5900X or a 5800X3D based on wanting to do PS4 emulation is real stupid, because:
1) you can't know what the performance needs of said emulator will be -- maybe it likes cache, maybe it needs cores, who knows
2) you will probably buy a new PC or two before a PS4 emulator is able to play games


e:

repiv posted:

The Switch uses a literal off-the-shelf SoC with no customizations at all :v:

No surprises that emulation is so far along
Also helps that Nintendo is still in the stone age for security and encryption.

Klyith fucked around with this message at 01:09 on Mar 22, 2022

repiv
Aug 13, 2009

Ironically, Nintendo actually did a good job with the Switch security for a change; it was Nvidia who fucked up bigly and blew it completely open by leaving an exploit in the unpatchable bootrom on early Tegra X1 units.

The system's security has held up pretty damn well on units produced after they fixed the bootrom, but of course the damage is done, and every aspect of the system was documented based on those first-gen units getting owned completely.

repiv fucked around with this message at 01:19 on Mar 22, 2022

Dr. Video Games 0031
Jul 17, 2004

repiv posted:

The Switch uses a literal off-the-shelf SoC with no customizations at all :v:

No surprises that emulation is so far along

While the SoC has no customizations, they do apparently use a custom API ("NVN"), which Nintendo has touted as letting them punch above their SoC's weight, though that's probably 90% marketing BS.
