|
Now that AMD has released their last AM4 CPUs (in all likelihood) and basically EOLed AM4, they are officially adding support for older boards... after years of fighting the board makers doing it themselves and trying to force people to upgrade lol. It is still a slightly goofy situation, and with the exception of a few boards from ASUS and maybe a few other scattered high end options from the other brands, AM4 is not going to be an "every board supports every CPU" platform. Here is the new official chart from AMD: Beyond the fact that AMD has memory holed Bristol Ridge completely at this point, it is bizarre that X570 can officially support Zen+ but not Zen 1, and that A320 can support everything except Ryzen 3000. I imagine this is mostly because they don't want to bother doing the qualification work when their folks are surely on to AM5 projects by this point. It is also still gonna be board vendor specific. I don't have high hopes for some of the older AM4 boards I have, which get sporadic updates at best.
|
# ? Mar 16, 2022 01:57 |
|
|
I'm still gonna have to bios flash my b450 tomahawk non-max for a 5000 series, correct? I recall something where max boards didn't need to bios flash and non-max boards had to. Currently have a 2600 in it paired with a 3070. I'm tempted to get the 5700x as its tdp is the same as the 2600, but I'm not sure how that translates to real power consumption and if my 650w PSU can handle it. Will likely get a 5600 or a 5600x on sale to be honest.
School Nickname fucked around with this message at 14:31 on Mar 16, 2022 |
# ? Mar 16, 2022 14:26 |
|
Yes, you'll have to flash for compatibility, which may also open you up to the fTPM bios bug if you're using it. There's a new bios image for that due in early May, so if you've done extensive bios tweaks you may want to hold off till then and do a single upgrade. Prices have dropped for all of the older parts. Searching /r/buildapcsales shows 5950x prices at $580-590 most places, already below that $599 Best Buy deal. There are w00ts at $520, though I don't think of that as real pricing. Consider pricing on the older parts, they may be a better deal than the new ones. v1ld fucked around with this message at 16:02 on Mar 16, 2022 |
# ? Mar 16, 2022 14:34 |
|
School Nickname posted:I'm still gonna have to bios flash my b450 tomahawk non-max for a 5000 series, correct? Currently have a 2600 in it paired with a 3070. I'm tempted to get the 5700x as its tdp is the same as the 2600, but I'm not sure how that translates to real power consumption and if my 650w PSU can handle it. Will likely get a 5600 or a 5600x on sale to be honest. For both Zen 2 and 3 the power draw is capped by PPT, which on 65W parts is set to 88W and can be changed through the bios. A decent 650W PSU should be more than fine.
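To put numbers on the TDP-vs-real-draw question, here's a minimal sketch of AMD's stock package power limits for Zen 2/3 (values are AMD's published defaults - 88W PPT for 65W-TDP parts, 142W for 105W parts, roughly a 1.35x ratio; boards can raise these, so treat them as nominal):

```python
# AMD's stock socket power limits (PPT) by TDP class, in watts.
# These are the published defaults; motherboards can override them.
STOCK_PPT = {65: 88, 105: 142}

def ppt_for_tdp(tdp_watts: int) -> int:
    """Return the nominal package power limit (PPT) for a given TDP class."""
    return STOCK_PPT[tdp_watts]

# A 65W Ryzen 5 2600 and a 65W Ryzen 7 5700X share the same 88W cap,
# which is why a PSU sized for one covers the other.
print(ppt_for_tdp(65))   # 88
print(ppt_for_tdp(105))  # 142
```

So swapping a 2600 for a 5700X leaves the worst-case CPU draw unchanged at stock settings.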
|
# ? Mar 16, 2022 14:42 |
|
Arzachel posted:For both Zen 2 and 3 the power draw is capped by PPT, which on 65W parts is set to 88W and can be changed through the bios. A decent 650W PSU should be more than fine. Cheers. Looking up amazon.de I see a 5600x for €240 incl. delivery, which is probably as cheap as I'll ever get it in Ireland. I'll wait for 5700x reviews and make my decision then. Never thought I'd be putting an eight-core 5000 series chip in my b450.
|
# ? Mar 16, 2022 15:04 |
|
Also the tomahawk can handle pretty much any ryzen chip; it has a decent VRM for a cheap board, which is why it got a lot of recommendations. If you were to put a CPU with more than 8 cores in it I would leave PBO turned off / stock, and if you're doing water cooling maybe look at the case airflow around the VRMs. But it can do it.
|
# ? Mar 16, 2022 16:51 |
|
best summary of the interview: https://twitter.com/IanCutress/status/1504220601426358279 seems pretty good to a humble ppt undervolter like myself kliras fucked around with this message at 00:11 on Mar 17, 2022 |
# ? Mar 17, 2022 00:08 |
|
Cygni posted:Now that AMD has released their last AM4 CPUs (in all likelihood) and basically EOLed AM4, they are officially adding support for older boards... after years of fighting the board makers doing it themselves and trying to force people to upgrade lol. Except the A300
|
# ? Mar 17, 2022 02:27 |
|
kliras posted:best summary of the interview: Does DRAM tuning get anything from a voltage bump on Ryzen CPUs, like it does for Intel chipsets for VCCSA and VCCIO? I guess this is kind of an overclocking thread question. I just want to know if locked CPU voltage affects anything about tuning DRAM since the upcoming Ryzen will be my first one
|
# ? Mar 17, 2022 03:30 |
|
Sidesaddle Cavalry posted:Does DRAM tuning get anything from a voltage bump on Ryzen CPUs, like it does for Intel chipsets for VCCSA and VCCIO? I guess this is kind of an overclocking thread question. I just want to know if locked CPU voltage affects anything about tuning DRAM since the upcoming Ryzen will be my first one Yeah, and on Ryzen that is controlled by vSOC. Note that some boards do this automatically when you enable XMP/AMP. Don't go too hog wild; you can damage the IMC over time.
|
# ? Mar 17, 2022 03:46 |
|
Noted, and I'll try to keep it humble. The point was to avoid the DRAM as much as possible with the V-Cache anyways!
|
# ? Mar 17, 2022 05:33 |
|
Sidesaddle Cavalry posted:Does DRAM tuning get anything from a voltage bump on Ryzen CPUs, like it does for Intel chipsets for VCCSA and VCCIO? I guess this is kind of an overclocking thread question. I just want to know if locked CPU voltage affects anything about tuning DRAM since the upcoming Ryzen will be my first one Yeah, Ryzen needs higher IMC and agent voltages for higher ram clocks, especially if you're going for 1:1 with the Infinity fabric at 3800+. I'd be surprised if they lock out those voltages too, but yeah you'd have issues with going above 3200 if you're hard locked at stock for every voltage.
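The 1:1 point above can be made concrete with a small sketch of the Zen 2/3 clock domains (DDR4 is double data rate, so the memory clock is half the advertised transfer rate; the 3800+ figure in the post corresponds to a 1900 MHz fabric clock):

```python
# Zen 2/3 memory clock domains: "1:1 mode" means the Infinity Fabric
# clock (FCLK) matches the real memory clock (MCLK). DDR4-XXXX is a
# transfer rate in MT/s, so MCLK is half of it.

def mclk_mhz(ddr_rate: int) -> float:
    """Memory clock for a DDR4-<rate> kit, e.g. 3800 MT/s -> 1900 MHz."""
    return ddr_rate / 2

def is_coupled(ddr_rate: int, fclk_mhz: float) -> bool:
    """True if the fabric runs 1:1 with the memory clock (no desync penalty)."""
    return fclk_mhz == mclk_mhz(ddr_rate)

print(mclk_mhz(3800))          # 1900.0
print(is_coupled(3800, 1900))  # True: the sweet spot the post describes
print(is_coupled(4000, 1900))  # False: past the fabric limit, 2:1 mode kicks in
```

Past whatever FCLK the chip can hold (commonly around 1800-1900 MHz on Zen 3), the fabric drops to a 2:1 ratio and latency takes a hit, which is why 3600-3800 is the usual target.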
|
# ? Mar 17, 2022 08:38 |
|
kliras posted:best summary of the interview: I'm guessing that means no curve optimizer undervolting, right?
|
# ? Mar 17, 2022 10:01 |
|
Arzachel posted:I'm guessing that means no curve optimizer undervolting, right? That's the consensus from leakers, yeah. Basically anything touching the clocks or voltages of the cores.
|
# ? Mar 17, 2022 12:11 |
|
I bet increasing voltage nukes the extra cache.
|
# ? Mar 17, 2022 15:30 |
|
pretty cool how the l3 outperforms dram at ~24mb https://twitter.com/IanCutress/status/1505902850467057665 can't wait for the super nerdy 5800x3d benchmarks
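Charts like the one in that tweet come from a pointer-chase latency sweep: follow a chain of dependent loads through buffers of increasing size and watch the cost jump once the working set spills out of each cache level. A heavily hedged Python sketch of the idea (interpreter overhead swamps real cache latencies, so this only shows the shape of the experiment; the real measurement is done in C):

```python
import random
import time

def ns_per_hop(n_elems: int, hops: int = 100_000) -> float:
    """Chase a random cyclic permutation of n_elems indices; return ns/hop."""
    order = list(range(n_elems))
    random.shuffle(order)
    nxt = [0] * n_elems
    for i in range(n_elems):
        nxt[order[i]] = order[(i + 1) % n_elems]  # build one big cycle
    idx = 0
    t0 = time.perf_counter()
    for _ in range(hops):
        idx = nxt[idx]  # each load depends on the previous one
    return (time.perf_counter() - t0) / hops * 1e9

# Sweeping n_elems from a few KB up past the 5800X3D's 96MB is how
# working-set charts like the linked one are produced.
cached = ns_per_hop(1 << 10)   # chain fits easily in cache
spilled = ns_per_hop(1 << 20)  # chain much larger; in C this would miss to DRAM
```

In a compiled version the per-hop cost steps up sharply at each cache boundary, which is exactly the ~24MB crossover the tweet highlights.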
|
# ? Mar 21, 2022 14:52 |
|
The 5900x has 64MB L3 and should be good competition for the 5800x3D: +50% cores and -33% cache, at the same price or most likely an appreciable discount, being the old thing vs the new thing. I prefer the additional cores for their future emulation potential rather than looking only at gaming workloads. E: I just don't see how the x3d is a good deal compared to a 5900x. This is additional L3, not L1/2 cache. Latencies are high, only about 3:1 against DRAM! I'd take the additional 4 cores over another 50% cache every time, when it's L3. Now if that was additional L1, where you'd actually be forestalling pipeline stalls, yeah, for sure. But your CPU is going to stall waiting on that L3 anyways. Better to have more cores for when you need them - in my opinion. All this given price equality. v1ld fucked around with this message at 17:29 on Mar 21, 2022 |
# ? Mar 21, 2022 17:20 |
|
https://www.phoronix.com/scan.php?page=article&item=amd-epyc-7773x-linux&num=1 Some very impressive gains shown in the Phoronix benchmarks. Cutting 40% off an OpenFOAM run is serious business.
|
# ? Mar 21, 2022 17:32 |
|
v1ld posted:The 5900x has 64MB L3 and should be good competition for the 5800x3D. +33% cores and -33% cache at the same or most likely an appreciable discount being the old thing vs the new thing. For purely gaming workloads, x3d will probably be faster than 5900x (and 5950x) for as long as these chips are relevant, so if that's your only demanding CPU workload, it will make sense. You can say it's better to have more cores when you need them, but with games you basically never need them. Obviously it's positioned in an awkward spot where there is no "all things to all people best of the best sku" anymore, so you have to choose what's most important between gaming performance or content creation / compiling / virtualization rather than being able to throw money at it and get something that's the best at everything.
|
# ? Mar 21, 2022 17:43 |
|
Not saying more cache isn't better - it's always better, except if your working set already fits in existing cache, which is unlikely for these workloads. AMD and most comparisons pit the 5800x3D against the 5800x, when the comparison to the 5900x seems better because of pricing, since the 5900x should be cheaper once these things hit the street. The 5800x3D has +200% L3 relative to the 5800x but only +50% relative to the 5900x. You get the effect of a cache twice as large (+100%) as the 5800x's by going to the 5900x. And +50% more cores. But as said above, for current gaming workloads 8 cores should be sufficient and the +50% cache should yield some benefit, though not anywhere near as much as in the comparison to the 5800x. But if you're looking to emulate PS4/PS5 in the future - a niche need, but one I'm interested in - those extra cores will come in handy since parallelism will help a lot there. E: That Phoronix benchmark is neat, but it's showing a +40% bump in perf, very cool, when going from 256MB to 768MB L3. That's a 3x bump in L3, not 256MB vs 384MB, say. It'd be interesting to know the size of the working set of that benchmark; it obviously benefited greatly from the additional lower latency cache. v1ld fucked around with this message at 18:20 on Mar 21, 2022 |
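The percentage comparisons being thrown around here all fall out of the published Zen 3 specs (5800X: 8 cores / 32MB L3, 5900X: 12 / 64MB, 5800X3D: 8 / 96MB), e.g.:

```python
# Published Zen 3 specs: part -> (cores, L3 in MB)
PARTS = {"5800X": (8, 32), "5900X": (12, 64), "5800X3D": (8, 96)}

def pct_more(a: float, b: float) -> int:
    """How much bigger a is than b, as a percent increase."""
    return round((a - b) / b * 100)

l3 = {name: spec[1] for name, spec in PARTS.items()}
print(pct_more(l3["5800X3D"], l3["5800X"]))  # 200: +200% L3 vs the 5800X
print(pct_more(l3["5800X3D"], l3["5900X"]))  # 50: only +50% L3 vs the 5900X
print(pct_more(l3["5900X"], l3["5800X"]))    # 100: the 5900X doubles the 5800X
print(pct_more(12, 8))                       # 50: the 5900X's core advantage
```

One caveat on the raw totals: the 5900X's 64MB is split 32MB per CCD, so a single thread only sees half of it locally, whereas the 5800X3D's 96MB sits on one CCD.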
# ? Mar 21, 2022 18:04 |
|
v1ld posted:
If I was a betting man, I'd bet good money future psx emulation is gonna scale well with per core performance, and not well past 8 cores - in other words, I'd bet on x3d over 5900x for that use case, too. Edit: bear in mind when you talk about the x3d only having 50% more cache that half the 5900x L3 is cross ccd via infinity fabric. VorpalFish fucked around with this message at 20:09 on Mar 21, 2022 |
# ? Mar 21, 2022 20:05 |
|
Should clarify not suggesting people go out and replace 5900x's, still plenty fast for games. Just that the 5800x3d has a niche in the "my budget is effectively unlimited and my use case is only gaming" category. All pending good showing in independent benchmarks of course.
|
# ? Mar 21, 2022 20:28 |
|
Why in gods name would any ryzen chip have trouble emulating a PS4? A PS4 has 8 slow-ass cat cores, I would expect a 1500X would easily manage the same CPU performance. Might need some sort of extra hyperthread-aware scheduler to keep the heaviest threads from landing on the same core, but even that maybe not. edit: lmao fuck, God of War's PC port minimum CPU requirement is a ryzen 1200 and recommended is a 2400G
|
# ? Mar 21, 2022 20:42 |
|
Emulators are very intensive.
|
# ? Mar 21, 2022 20:53 |
|
I am going to upgrade from my 3600 to either a 5900X (my original plan) or 5800X3D. Just waiting for the juicy reviews.
|
# ? Mar 21, 2022 20:54 |
|
Klyith posted:Why in gods name would any ryzen chip have trouble emulating a PS4? A PS4 has 8 slow-ass cat cores, I would expect a 1500X would easily manage the same CPU performance. Might need some sort of extra hyperthread-aware scheduler to keep the heaviest threads from landing on the same core, but even that maybe not. For comparison's sake, it's recommended to run a dual core CPU greater than 2.0 GHz to emulate a PS2, which had a 300MHz single core. Emulation, in most cases, requires heavy overkill.
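A rough back-of-envelope on that overkill factor, using the clock figures quoted above (this ignores IPC differences and the very different architectures, so it's purely illustrative):

```python
# Recommended PS2 emulation host vs the PS2's ~300 MHz Emotion Engine.
host_ghz, host_cores = 2.0, 2
target_ghz, target_cores = 0.3, 1

# Ratio of raw host clock-cycles to target clock-cycles.
overkill = (host_ghz * host_cores) / (target_ghz * target_cores)
print(round(overkill, 1))  # 13.3
```

An order of magnitude of raw cycles just to interpret/recompile someone else's 300 MHz chip is why "same raw specs" never means "emulatable at full speed".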
|
# ? Mar 21, 2022 21:02 |
|
Klyith posted:Why in gods name would any ryzen chip have trouble emulating a PS4? A PS4 has 8 slow-ass cat cores, I would expect a 1500X would easily manage the same CPU performance. Might need some sort of extra hyperthread-aware scheduler to keep the heaviest threads from landing on the same core, but even that maybe not. In theory it was gonna be "really easy" to emulate cause it's also x86, and yet here we are... But yes, the per core performance of those cat cores was dismal.
|
# ? Mar 21, 2022 21:03 |
|
VorpalFish posted:Should clarify not suggesting people go out and replace 5900x's, still plenty fast for games. Just that the 5800x3d has a niche in the "my budget is effectively unlimited and my use case is only gaming" category.
|
# ? Mar 21, 2022 21:48 |
|
LRADIKAL posted:Emulators are very intensive. What am I missing? What is there to "emulate" when target and host CPUs are of the same family?! (re: PS4)
|
# ? Mar 21, 2022 22:09 |
|
Combat Pretzel posted:Emulation? What? Proprietary hardware and OS, I would think. And even if it's the same processor family I'm pretty sure the consoles aren't using off-the-shelf CPUs.
|
# ? Mar 21, 2022 22:14 |
|
VorpalFish posted:In theory it was gonna be "really easy" to emulate cause it's also x86, and yet here we are... It's gonna be really easy to emulate the PS4 & Xbone once people are fully hacking it from the inside and disassembling the software & figure out how the security system works. Or like, maybe someone hacks MS or Sony and dumps the source code for the OS onto the internet. At that point it will suddenly become pretty easy. It's hard in a completely different way from previous emulation of consoles. There would still be plenty of work to make an emulator, but it'll look much closer to wine/proton -- a translation layer, which is not hard to make performant (by comparison with emulating hardware on different hardware). CaptainSarcastic posted:Proprietary hardware and OS, I would think. And even if it's the same processor family I'm pretty sure the consoles aren't using off-the-shelf CPUs. The x86-64 parts of the console processors are totally normal. The GPUs have some extra features that aren't on a normal GCN or RDNA2 GPU, which will need emulation & work-arounds. And the integrated security features are unique (possibly not totally unique in the case of MS, since their Pluton thing came from Xbox work). And it's not the uniqueness that makes it hard, it's that they are really secure systems. The first true jailbreak on PS4 happened just a few months ago. And the xbone still hasn't been broken.
|
# ? Mar 21, 2022 22:59 |
|
The PS4 OS has already been booted on PC hardware under hardware virtualization years ago, the CPU side is basically solved and should be quite fast https://github.com/AlexAltea/orbital https://www.youtube.com/watch?v=IBBHisNM74o The hard part, as mentioned, is everything besides the CPU. Especially getting decrypted software for anything beyond FW 9.0 (the current limit for jailbreaking PS4s) and mapping the GCN-centric graphics API to DX12/Vulkan efficiently. repiv fucked around with this message at 23:46 on Mar 21, 2022 |
# ? Mar 21, 2022 23:39 |
|
While the x86 nature of the PS4/PS5 makes emulation easier in many ways, there's going to be weird hardware changes that'll still add overhead. PS5 has better-than-DirectStorage texture handling with dedicated compression hardware that bypasses the CPU entirely, for example.

VorpalFish posted:If I was a betting man, I'd bet good money future psx emulation is gonna scale well with per core performance, and not well past 8 cores - in other words, I'd bet on x3d over 5900x for that use case, too. The cross-ccd split cache is a very good point, hadn't considered that.
|
# ? Mar 21, 2022 23:44 |
|
The original Xbox had a slightly customized PIII, yet look at where we are with Xbox emulation. It's not as straightforward as "same architecture, no need to emulate, easy and no performance penalty."
|
# ? Mar 22, 2022 00:16 |
|
BrainDance posted:The original Xbox had a slightly customized PIII, yet look at where we are with Xbox emulation. It's not as straightforward as "same architecture, no need to emulate, easy and no performance penalty." Yeah and the xgpu was a slightly customized GeForce 3. It was probably the most off-the-shelf console we'll ever get.
|
# ? Mar 22, 2022 00:21 |
|
The Switch uses a literal off-the-shelf SoC with no customizations at all No surprises that emulation is so far along
|
# ? Mar 22, 2022 00:23 |
|
Regarding the GPU on the original Xbox, the author of the CXBX emulator disputes that it's just a slightly customized GF3quote:3. Emulating any hardware by NVIDIA is not a walk in the park! The Xbox's GPU, the NV2A is often assumed just a GeForce 3. It's not! It's similar but not identical. It has some GeForce 4 capabilities too, so it's more of a cross between an NV20 and NV25. This is by no means easy to emulate either. NVIDIA's GPUs have very large register sets and afaik not even half of them have been discovered, and a large portion of known registers have unknown purposes. There is little to no documentation on how NVIDIA GPUs work. The best thing to do is to look at similar GPUs such as RIVA, TNT, and older GeForce cards. Some registers are similar, but not identical. The best place to look for information is in open source drivers available on the net. Adding to the difficulty is that no one has ever discovered how pixel shaders work on NV2x cards, vertex shaders yes though. The Xbox GPU also has exclusive registers that are not found in other GeForce cards. Information on the NV2A's GPU registers is just now beginning to be discovered a few months ago. And yet, there's still a long way to go. The GeForce 3 series is the most mysterious of all NVIDIA GPUs (G7x and G8x aside) and the NV2A is a lot worse. Even the vanilla version of NV20 was never fully documented and supported by the open source drivers, so good luck with the weird one-off NV20/NV25 hybrid that was only ever used in the Xbox It probably doesn't help that the OG Xbox just doesn't have that many notable games that haven't been ported to a better or easier to emulate system, which limits the amount of interest in reverse engineering it repiv fucked around with this message at 01:03 on Mar 22, 2022 |
# ? Mar 22, 2022 00:41 |
|
BrainDance posted:The original Xbox had a slightly customized PIII, yet look at where we are with Xbox emulation. It's not as straightforward as "same architecture, no need to emulate, easy and no performance penalty." Microsoft seems to manage just fine though? Really the OG Xbox situation is that very few people cared, there was no scene. There were almost no exclusives of note on the system. Hobby projects like this need a good number of people who care and are sharing information. At some point a few years ago a scene sprung up and they made pretty fast progress in that time. And I didn't say easy, I said easier. Even MS has had trouble with some games and they have all the information. Anyways, quibbling over the imaginary performance of emulators that don't exist is pointless. Picking a 5900x or a 5800X3D based on wanting to do PS4 emulation is real stupid, because: 1) you can't know what the performance needs of said emulator will be -- maybe it likes cache, maybe it needs cores, who knows 2) you will probably buy a new PC or two before a PS4 emulator is able to play games e: repiv posted:The Switch uses a literal off-the-shelf SoC with no customizations at all Klyith fucked around with this message at 01:09 on Mar 22, 2022 |
# ? Mar 22, 2022 01:02 |
|
Ironically Nintendo actually did a good job with the Switch security for a change; it was Nvidia who fucked up bigly and blew it completely open by leaving an exploit in the unpatchable bootrom on early Tegra X1 units. The system's security has held up pretty damn well on units produced after they fixed the bootrom, but of course the damage is done and every aspect of the system was documented based on those first gen units getting owned completely. repiv fucked around with this message at 01:19 on Mar 22, 2022 |
# ? Mar 22, 2022 01:16 |
|
|
repiv posted:The Switch uses a literal off-the-shelf SoC with no customizations at all While the SoC has no customizations, they do apparently use a custom API ("NVN"), which nintendo has touted as allowing them to punch above their SoC's weight, though that's probably 90% marketing BS.
|
# ? Mar 22, 2022 01:23 |