gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Inept posted:

If you want sustained high framerates, you want more performance than a console that targets 60fps.

the difference between a 5600X and any of the higher SKUs is already negligible at 1080p, and non-existent at higher resolutions



Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
1 fps between the 5600 and the 5950 using a 3090 at 4k actually :colbert:

Inept
Jul 8, 2003

gradenko_2000 posted:

the difference between a 5600X and any of the higher SKUs is already negligible at 1080p, and non-existent at higher resolutions

Right, but that might not be the case in 2+ years. There was a time when the difference between 4/4 and 6/6 or 4/8 was negligible, but that has changed.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
There are CPU-bound games where that's absolutely not true. Path of Exile at 4K was using 100% CPU on my 5600 and dropping tons of frames even though the GPU capped out around 70%. After swapping the CPU for a 5900X, I haven't been able to hit 100% CPU and it's a world of difference.

Klyith
Aug 3, 2007

GBS Pledge Week
I just ran into this article which shows great evidence that cache matters as much as cores. On the Intel side that affects performance a lot because the higher-tier CPUs have more cache, and that makes benchmark graphs appear to show that a CPU with more cores is better.

On the AMD side that's irrelevant for 5600X vs 5800X, but is a huge reason why a 5600X is better than a 3700X despite being down 2 cores.

Fantastic Foreskin posted:

I'm firmly in the '5600 will be fine' camp, but it's hard to overstate just how terrible the cores were in the last console generation.

Jaguar sucked, but all the ways it sucked should have given developers even more incentive to make their games as multi-core as possible. It didn't do any clock boosting if you had just 1 core loaded. You had suck for single-threaded performance, or suck times 8 if you could use all of them. If we lived in a universe where work was always easily divided over an arbitrary # of threads, Jaguar would have been good (and x86 would have been trounced by RISC 30 years ago).
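
For the "universe where work always divides evenly" point, a quick Amdahl's-law sketch makes it concrete: eight slow cores only catch up to a smaller number of fast cores when nearly all of the work parallelizes. The 2x per-core speed ratio below is a made-up illustration, not a measured Jaguar-vs-desktop figure.

```python
# Toy Amdahl's-law sketch: 8 slow cores vs. 4 hypothetical fast cores.
# The per-core speed ratio is invented for illustration only.

def speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup vs. a single core of the same speed."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

fast_core_ratio = 2.0  # assume each fast core is 2x a slow Jaguar-style core

for p in (0.5, 0.8, 0.95, 0.99):
    slow_8 = speedup(p, 8)                    # 8 slow cores, in slow-core units
    fast_4 = fast_core_ratio * speedup(p, 4)  # 4 fast cores, each 2x as quick
    print(f"parallel={p:.2f}  8 slow cores: {slow_8:.2f}x   4 fast cores: {fast_4:.2f}x")
```

With those numbers the eight slow cores only pull even as the parallel fraction approaches 1, which is exactly the universe we don't live in.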

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord

shrike82 posted:

Just get the most CPU you’re able to afford

The price difference is going to pale in comparison to the various GPU tiers anyway

Best advice re cpu chat right here

Ihmemies
Oct 6, 2012

Intel 10nm desktop CPUs and Zen3+ should release in Q4. Intel has 8+8 cores and Zen3+ has 32MB? of extra cache, which is good especially in games. AMD promises 15% faster gaming on average with the new cache.

I'd like to upgrade to Intel or AMD, depending on which turns out to be faster in games. I could then move my 8700K to my HTPC to replace the ancient i7 920...

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.
It's an extra 64 megs of cache, for a total of 96 megs. They're able to stuff so much on the extra chiplet since they can manufacture it with a variant of the process optimized for memory density instead of high-speed logic.

Ihmemies
Oct 6, 2012

Nice. That is a lot of cache then :v: Hopefully the new Intel won't be too fast, I'd really like to buy an AMD product again!

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Ihmemies posted:

Intel 10nm desktop CPUs and Zen3+ should release in Q4. Intel has 8+8 cores and Zen3+ has 32MB? of extra cache, which is good especially in games. AMD promises 15% faster gaming on average with the new cache.

this is extremely confusing terminology and I've messed it up before but - Zen3+ is Zen3 on DDR5, apparently Rembrandt (mobile/BGA APU with RDNA2) only for now. There were apparently more "Zen3+" chips at one point but they got cancelled, probably due to parts shortages - buying a new mobo just to be an early adopter on a new memory standard with a previous-gen architecture is a tough sell right now, like wow tell me you're a masochist without telling me. Zen3 with VCache is "Zen3 Refresh" in internet terminology, at least as far as I can tell.

second thing, there is no guarantee zen3 refresh is even coming in Q4. AMD will counter launch against Intel eventually but remember, it's almost September (the 3rd month of Q3) and we aren't seeing rumblings of an AMD launch. The Intel launch is imminent, you are starting to see the partners poo poo leaks everywhere (a gigabyte board for alder lake hit a benchmark database a couple days ago) but you don't see that for Zen3 Refresh yet - you will see partners start to leak it in their technical docs (hi again gigabyte) and marketing and their bios notes and benchmark databases and poo poo like that, maybe starting a month before press day and two months before launch or something like that.

there is still a window of "don't know yet" for a Q4 launch, they could totally do like a november launch still, but like, we'll know within the next few weeks here based on how quickly leaks start ramping up. but also...

https://wccftech.com/amd-confirms-zen-3-ryzen-cpus-with-3d-v-cache-stack-chiplet-design-coming-early-next-year-before-zen-4/

Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.
Upgrading from a 3800X to a 5900X and installing an AIO was very easy. For whatever reason I had to go into the BIOS and gently caress with the boot order to get Windows booting again (??), but that was the only odd hiccup. Wiring the RGB on the AIO into my existing controllers was likewise easy.

Needless to say I instantly saw a 160% increase in multicore performance in Cinebench, and some substantial increase in single core. I booted up ACC in VR mode and it seems to be significantly less glitchy, but I need to spend some hours driving in circles to really prove that out.

My 3800x is on SA-Mart if anyone is looking for one.

Thanks for steering me away from keeping my Wraith; installing an AIO was no big deal and now I have even moar RGBz

CaptainSarcastic
Jul 6, 2013



Paul MaudDib posted:

this is extremely confusing terminology and I've messed it up before but - Zen3+ is Zen3 on DDR5, apparently Rembrandt (mobile/BGA APU with RDNA2) only for now. There were apparently more "Zen3+" chips at one point but they got cancelled, probably due to parts shortages - buying a new mobo just to be an early adopter on a new memory standard with a previous-gen architecture is a tough sell right now, like wow tell me you're a masochist without telling me. Zen3 with VCache is "Zen3 Refresh" in internet terminology, at least as far as I can tell.

second thing, there is no guarantee zen3 refresh is even coming in Q4. AMD will counter launch against Intel eventually but remember, it's almost September (the 3rd month of Q3) and we aren't seeing rumblings of an AMD launch. The Intel launch is imminent, you are starting to see the partners poo poo leaks everywhere (a gigabyte board for alder lake hit a benchmark database a couple days ago) but you don't see that for Zen3 Refresh yet - you will see partners start to leak it in their technical docs (hi again gigabyte) and marketing and their bios notes and benchmark databases and poo poo like that, maybe starting a month before press day and two months before launch or something like that.

there is still a window of "don't know yet" for a Q4 launch, they could totally do like a november launch still, but like, we'll know within the next few weeks here based on how quickly leaks start ramping up. but also...

https://wccftech.com/amd-confirms-zen-3-ryzen-cpus-with-3d-v-cache-stack-chiplet-design-coming-early-next-year-before-zen-4/

If that nomenclature is true then I think I will start referring to the "Refresh" as Zen 3 RAGEMODE.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Klyith posted:

If we lived in a universe where work was always easily divided over an arbitrary # of threads, Jaguar would have been good (and x86 would have been trounced by RISC 30 years ago).

Agree with your overall point but bolded makes no sense at all

FuturePastNow
May 19, 2014


thicc cache edition

Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.
Can confirm going from a 3800X to a 5900X instantly fixed problems in AC Competizione

granted, the game is a well-known Performance Hogging Piece of poo poo With A Questionable VR Implementation.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Cabbages and Kings posted:

Can confirm going from a 3800X to a 5900X instantly fixed problems in AC Competizione

granted, the game is a well-known Performance Hogging Piece of poo poo With A Questionable VR Implementation.

an instant fix sounds like you had some software (not necessarily game software) issues before you swapped CPUs, tbh

Theris
Oct 9, 2007

Sidesaddle Cavalry posted:

an instant fix sounds like you had some software (not necessarily game software) issues before you swapped CPUs, tbh

No, it's legit. There's probably nothing else in gaming as perceptibly sensitive to even small improvements in single thread performance as VR in racing sims. iRacing and ACC, anyway.

Pittsburgh Fentanyl Cloud
Apr 7, 2003


fwiw Mordhau used to crash constantly at the title screen if I left it there too long on my 3600X. Upgraded to a 5600X and it hasn't crashed since. Absolutely bizarre.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Weird. This couldn't possibly be a difference in the way Windows handles the two different generations of Ryzens, could it?

Cygni
Nov 12, 2005

raring to post

A lot of the issues that Ryzen experienced, particularly in the first 2 generations, were definitely because Windows, game studios, board makers, peripheral makers, GPU AIB partners, memory producers... basically everyone certified, tested, and considered Intel first, and AMD later if they had the time. Errata and bugs on Intel's implementation were fixed by launch, while AMD's were fixed after launch (if at all).

Frankly, it had been so long since AMD was competitive or had an up-to-date platform that they weren't especially relevant to the market. It is pretty funny to go back and look at the initial launch offering of X370/B350 boards and see how few there were and how cost-cut they were compared to the wave of Z170/Z270 boards they were competing against. Nobody really expected Zen to be as good as it was, and then Zen 2 to be as radical as it was, and then Zen 3 to finally conquer the mountain. Intel slamming its nuts in a panini press certainly helped AMD, too.

When you fast forward to X570/B550, suddenly AMD is ahead in the platform game (Intel didn't even have a PCIe 4 desktop platform to test on), and everyone is certifying and testing on AMD platforms first. This has been a huge deal in the server world, but it's also relevant on the desktop. Stability, compatibility, and the number of annoying quirks or workarounds have absolutely improved massively since I built my first Ryzen system.

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord
So, what happened to Intel?

They did the thing with the panini press.

What thing?

You know, the slamming thing (points downward).

Ah, right. That explains it.

Klyith
Aug 3, 2007

GBS Pledge Week

Cygni posted:

A lot of the issues that Ryzen experienced, particularly in the first 2 generations, were definitely because Windows, game studios, board makers, peripheral makers, GPU AIB partners, memory producers... basically everyone certified, tested, and considered Intel first, and AMD later if they had the time. Errata and bugs on Intel's implementation were fixed by launch, while AMD's were fixed after launch (if at all).

All very true, but for the software side at least the improvements should flow backward to earlier processors. If a game engine had a severe bug on Ryzen systems and it didn't get fixed until Zen 2, I'd still expect Zen 1 & Zen+ to benefit from the fix. So as far as someone having a drop-in 3600X -> 5600X replacement fix a game crash bug, I'd say that sounds like a hardware instability that got fixed.

Memory not 100% solid with the old CPU, the old CPU actually having a flaw that only showed up under particular conditions, or the game menu having uncapped fps that caused whack system stress a la New World. Or just something dumb and coincidental: a system upgrade might also be the time someone would do BIOS updates, driver updates, etc.

wargames
Mar 16, 2008

official yospos cat censor
https://www.youtube.com/watch?v=ItdkUxZYWjA

amd in a thing

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING


Despite knowing (from Numberphile) that Commonwealth English speakers don't read numbers below 10k as nn-hundred, it never occurred to me that product names would also be read out in full. It's so weird to hear someone say "AMD Five Thousand Nine Hundred" :)

wargames
Mar 16, 2008

official yospos cat censor

mdxi posted:

Despite knowing (from Numberphile) that Commonwealth English speakers don't read numbers below 10k as nn-hundred, it never occurred to me that product names would also be read out in full. It's so weird to hear someone say "AMD Five Thousand Nine Hundred" :)

It's regional, I think. Being in the South, "five thousand nine hundred" and "fifty-nine hundo" are both used.

Cygni
Nov 12, 2005

raring to post

https://twitter.com/Locuza_/status/1431679542918356993

I'm surprised how different the designs are coming from the same basic design structures.

Klyith
Aug 3, 2007

GBS Pledge Week

Cygni posted:

https://twitter.com/Locuza_/status/1431679542918356993

I'm surprised how different the designs are coming from the same basic design structures.

The really weird thing is, why does the PS5 only have 1 PCIe block? It's got the extra NVMe port, where's that hooked up? Do they share or something?

The tweet is asking how the compressed CCX deals with thermal density from the FPU, but remember that the PS5 has its crazy liquid metal + direct-die cooling. Seems like an obvious A->B connection: Sony had that ready and so was able to squeeze everything together for more dies/wafer by accepting higher heat density.

The GPU on the PS5 also probably has a more advantageous layout -- it's the same one AMD uses for Navi 10 and 21, with the cache and gubbins in the center of the shader CUs. MS's layout looks far better thermally, since it puts the heat loads on opposite sides of the chip with the cache etc. between them. Again, probably a compromise Sony could just ignore.


So my hypothesis is that the PS5 chip is significantly cheaper to produce than the XSX, and Sony's plan was to have plenty of the $399 no-disc systems available to crush MS on price for the first year. Worked for them last time around. But lol at how utterly cocked that plan got. Still probably very nice for them to have a smaller chip, with fab capacity being what it is.
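
To put rough numbers on the "more dies per wafer" point, here's a back-of-the-envelope sketch using the standard gross-die approximation. The die areas are approximate public figures (PS5 SoC around 300 mm², XSX around 360 mm²), and yield, scribe lines, and edge exclusion are all ignored, so treat the output as illustrative only.

```python
# Back-of-the-envelope dies-per-wafer comparison for the two console SoCs.
# Die areas are rough public figures; yield and edge effects are ignored.
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order estimate of gross die candidates on a round wafer."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

ps5 = dies_per_wafer(300.0)   # assumed ~300 mm^2 PS5 SoC
xsx = dies_per_wafer(360.0)   # assumed ~360 mm^2 XSX SoC
print(f"PS5-sized die: ~{ps5}/wafer, XSX-sized die: ~{xsx}/wafer, "
      f"~{100 * (ps5 - xsx) / xsx:.0f}% more candidates per wafer")
```

Under those assumptions a PS5-sized die gets you on the order of 20% more candidates per wafer than an XSX-sized die, before yield differences are even considered.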

CoolCab
Apr 17, 2005

glem

Klyith posted:

The tweet is asking how the compressed CCX deals with thermal density from the FPU, but remember that the PS5 has its crazy liquid metal + direct-die cooling. Seems like an obvious A->B connection: Sony had that ready and so was able to squeeze everything together for more dies/wafer by accepting higher heat density.


i think steve of GN at one point either implied or stated that he had heard that the esoteric thermal solutions may not have been part of the original plan, particularly given how expensive they are compared to paste for what is normally not really that important of a gain. the ps5 still has some minor thermal problems so i could buy them needing to patch a poor hardware design with overengineered cooling components.

shrike82
Jun 11, 2005

they've managed to sell 50% more PS5s than XSS+XSX (est. 10M versus 6M LTD), so they're doing something right with production

CoolCab
Apr 17, 2005

glem

shrike82 posted:

they've managed to sell 50% more PS5s than XSS+XSX (est. 10M versus 6M LTD), so they're doing something right with production

sure, i mean poo poo, on a per-unit basis buying liquid metal instead of paste and better cooling makes poor financial sense in hindsight, but if they had already started some part of production before noticing the severity of the issue, a dollar or whatever of gallium compound is still cheaper than a billion+ dollar RROD-style scandal.

Wiggly Wayne DDS
Sep 11, 2010



the liquid metal solution was known to be in the pipeline for a few years on paper, and it's not something you can knock together to cover a design thermal issue. the work involved in making this practical at the scale they're operating at has been glossed over by the tech media at large imo

Craptacular!
Jul 9, 2001

Fuck the DH

Klyith posted:

I just ran into this article which shows great evidence that cache matters as much as cores.

Oh, I remember HUB made a video version of this article, which created a small controversy.

I'm pretty happy with a 3700X instead of a 5600X. CPU gains are routinely obvious with fidelity set to low, and the trade-off of not having enough CPU is bottlenecking your graphics card. Except I run 1440p with what is likely the bare minimum hardware to use that resolution, and can't afford anything better than a 3060 Ti for an upgrade even if prices were normalized, which they're not. But I do run multithreaded video encoders for collective days that can make use of the extra cores.

Basically I don’t think I’ll ever have a Rest of the Computer that can make a 5000 series upgrade practical in games, but I spend a significant amount of time heating up my computer with software that only shows a timer in Terminal. YMMV of course.

Craptacular! fucked around with this message at 15:31 on Aug 29, 2021

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Craptacular! posted:

Oh, I remember HUB made a video version of this article, which created a small controversy.

it's an interesting topic but of course, HUB being HUB, they decided to test games that aren't particularly CPU-bound for the most part. like HZD, which is obviously and completely GPU-bound in all configurations, so that test provides no actual useful information about how things would go in an actual CPU-bound situation where CPU performance comes into play.

and even in the games with raytracing (which substantially increases CPU load!) they chose to turn it off :lol:
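
The complaint boils down to a simple model: delivered frame rate is roughly capped by whichever of the CPU or GPU is slower, so a fully GPU-bound test can't separate CPUs at all. A toy sketch, with invented numbers rather than anything from HUB's data:

```python
# Toy model of why a GPU-bound benchmark hides CPU differences.
# All numbers are invented for illustration, not measurements.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is limited by the slower of the CPU and GPU sides."""
    return min(cpu_fps, gpu_fps)

gpu_limit = 90.0  # hypothetical GPU-limited frame rate at 4K ultra
for cpu_name, cpu_limit in (("6-core", 160.0), ("12-core", 200.0)):
    print(cpu_name, "->", delivered_fps(cpu_limit, gpu_limit), "fps")
# Both CPUs report 90 fps, so the test says nothing about what happens once
# the GPU bottleneck is removed (lower res, RT enabled, a faster GPU, etc.)
```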

Paul MaudDib fucked around with this message at 20:27 on Aug 30, 2021

CoolCab
Apr 17, 2005

glem
raytracing impacts cpu load? huh i had no idea

repiv
Aug 13, 2009

current implementations use the CPU for fresh BVH builds, and the GPU for incremental BVH updates AFAIK

it's implementation dependent though

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

CoolCab posted:

raytracing impacts cpu load? huh i had no idea

yeah the BVH trees that the RT units are designed to traverse will change as the game world changes, so there is this additional data structure that the CPU needs to maintain
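
As a conceptual sketch of that per-frame upkeep (not any particular engine's or DXR's real API; all names below are hypothetical), the CPU typically has to decide, per object, between an expensive full rebuild and a cheap refit, plus rebuilding the top-level structure over all instances:

```python
# Conceptual sketch of per-frame acceleration-structure maintenance for RT.
# Hypothetical names only; real engines/APIs differ in the details.
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    topology_changed: bool = False   # geometry added/removed, LOD swap, etc.
    transform_changed: bool = False  # object merely moved/rotated/scaled

def update_acceleration_structures(objects):
    """Decide, per object, between a full BVH rebuild and a cheap refit."""
    work = {"cpu_rebuilds": 0, "gpu_refits": 0, "untouched": 0}
    for obj in objects:
        if obj.topology_changed:
            work["cpu_rebuilds"] += 1   # expensive full bottom-level rebuild, CPU-side
        elif obj.transform_changed:
            work["gpu_refits"] += 1     # cheap incremental refit, often GPU-side
        else:
            work["untouched"] += 1      # static geometry keeps its prebuilt BVH
    # The top-level structure over all instances is typically rebuilt every
    # frame as well, which is part of the extra CPU cost when RT is enabled.
    return work

frame = [SceneObject("terrain"),
         SceneObject("crowd", topology_changed=True),
         SceneObject("car", transform_changed=True)]
print(update_acceleration_structures(frame))
```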

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
would a 5900X- or 5950X-haver mind seeing what Cinebench R23 scores you get / what clocks you get when setting a PPT of 60W, then again at 80W? or if you will only do one then give me the 60W number.

(I'm looking for 60W/80W actual total-package-power, so if your board is setting cTDP and then letting it boost above that, adjust accordingly.)

I had this idea of upgrading my NAS to a 5900X (or whatever 6900X is coming with zen3 refresh) but it'd need a low-profile cooler like this mini-box build and 70W is realistically about what I can get it to dissipate even when sucking air directly from outside.

Temps are a completely meaningless number of course since I will have a much smaller cooler, but I can use the 60-70W power dissipation target to see what performance is like (ish - obviously boost considers temperatures and lower temps will boost higher, but it'll be close enough).

Clocking down under severe sustained multi-threaded AVX loads isn't the end of the world, and I will have very adequate airflow as far as the drives are concerned, I'm just curious how it performs within tight power envelopes. The multi-die configurations likely do pull some extra power for IF links/etc but probably not the end of the world, AMD already did similar configurations with the 3900 non-X, that's a 65W TDP/88W PPT similar to the 5700G and that's alright ish in these constraints.

(or - if anyone has a Noctua L9a on a 5900X or a 5950X I'd be grateful to hear your numbers!)
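
For a rough feel of what a 60-80W PPT costs in clocks, here's a crude sketch using the textbook dynamic-power relation (P roughly proportional to V²f, with V scaling up alongside f, so power grows about cubically with clock near the top of the V/f curve). The 142W figure is the stock PPT for AMD's 105W-TDP parts; the stock all-core clock is an assumed illustration value, not a measured 5900X result, and the model ignores static power, SoC/IF power, and temperature, so real chips should do a bit better than this at low PPT.

```python
# Crude clock-vs-power-cap estimate: assumes f ~ P^(1/3) near the top of the
# V/f curve. Illustration only, not measured 5900X behavior.

STOCK_PPT_W = 142.0        # stock package power target for a 105W-TDP Ryzen
STOCK_ALLCORE_GHZ = 4.1    # assumed stock all-core clock, illustration only

def estimated_allcore_clock(ppt_watts: float) -> float:
    """Crude f ~ P^(1/3) model; ignores static/SoC power and temperature."""
    return STOCK_ALLCORE_GHZ * (ppt_watts / STOCK_PPT_W) ** (1.0 / 3.0)

for ppt in (142.0, 80.0, 60.0):
    f = estimated_allcore_clock(ppt)
    print(f"PPT {ppt:5.0f} W -> ~{f:.2f} GHz all-core "
          f"({100 * f / STOCK_ALLCORE_GHZ:.0f}% of stock clocks)")
```

In other words, a 60W cap costs far less than half the all-core performance even in this pessimistic model, which is why real 60-80W numbers from an actual 5900X/5950X owner would still be the useful data point here.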

Paul MaudDib fucked around with this message at 06:58 on Aug 31, 2021

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Hardware Unboxed reviewed the Ryzen 7 5700U, which is an 8-core/16-thread part that's Zen 2-based, even though it's marketed as being a part of the 5000-series.

https://www.youtube.com/watch?v=RjbbKGG3if0

In summary:

* it delivers far more multi-threaded performance than the Ryzen 7 4700U, which is also an 8-core part on pretty much the same Zen 2 die, because the 4700U does NOT have SMT, so it's 16 threads versus 8 threads

* it delivers near-equal performance to the Ryzen 7 4800U, which is an 8C/16T Zen 2 part, which is to be expected because it's almost exactly the same chip, including SMT, but the 4800U is higher-up on the SKU stack, so presumably it'd be more expensive to get a 4800U compared to a 5700U (which is supposed to be priced like the 4700U)

* it delivers near-equal (within 2%) multi-threaded performance to the Ryzen 7 5800U (8C/16T, Zen 3) at a 15 watt configuration
* a 25-28 watt configuration lets Zen 3 stretch its legs, and the 5800U beats the 5700U by ~6% in multi-threading

* it's well ahead of Intel's i7-1165G7. That's sort of expected since the Intel i7 is a 4C8T part, and anything with more cores from Intel has a higher power configuration (AFAIK)

* in single-threaded performance:
- near-equal to the 4800U, which is to be expected since they're both Zen 2, and the 5700U simply boosts a little higher
- the 5800U delivers 11% more performance, owing to the Zen 3 arch
- the Intel i7 is 7% ahead at 15 watts, and 17% ahead at 25 watts

* gaming is basically a wash because the iGPU is identical between the 4700U, 4800U, and 5700U, and that's the bottleneck, and an Intel laptop is always going to be faster if it has even just an MX-class discrete GPU

None of this is particularly surprising, and while it's certainly nice that the "x700U" tier doubled in thread-count (does pricing across SKU tiers actually scale the way that implies?), it's ultimately misleading that they're including Zen 2 parts in the 5000-series naming. Especially when the reason 4000 was skipped for some product lines was so that everything could be "unified" with all the Zen 3 parts at 5000.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

gradenko_2000 posted:

while it's certainly nice that the "x700U" tier doubled in thread-count (does pricing across SKU tiers actually scale the way that implies?), it's ultimately misleading that they're including Zen 2 parts in the 5000-series naming.

I dunno. I've sort of come around to it I guess. It means that every part has SMT and that the performance across the stack basically increases like you'd expect - previously you had weird jumps like 4600U->4700U where 8C8T is roughly the same thing as 6C12T in terms of actual performance, or at least fails to significantly distinguish itself. Now the 5700U actually is like a lower-performing 5800U and not SMT-disabled.

It would be nicer if AMD didn't rebrandeon the 4800U and just reduced the number of parts in the stack to like 5400U, 5600U, 5800U, but chip makers gotta have 27 different tiers, and it's still a lot simpler than Intel's lineup. Or I guess they could have downclocked the 5400U/5600U/5800U, but that's tricky with the Zen 2 boost algorithm - how do you artificially limit that when the algorithm is designed to seek out the best performance available in a given configuration?

Zen3 isn't all that different from Zen2. You've got the unified CCX and the unified cache that results, but like, does a laptop really care that much about cache topology?

it's certainly annoying from a technical perspective but that's not how laptops get bought. In practice Zen2 is basically dumbed-down Zen3 and that's how it's being presented, as a lower-tier part in its "tier".

idk, yeah it would be nice if 5000 series were all zen3, for sure, but in practice it is what it is, and the old system had its flaws too.

Paul MaudDib fucked around with this message at 18:59 on Aug 31, 2021


Dramicus
Mar 26, 2010
Grimey Drawer
Am I alone in wishing that all companies would stop using the same name for laptop and desktop parts? At least with the CPUs you can tell that a 5800U isn't the same as a 5800X due to the letter, but Nvidia is the worst offender as their laptop parts have exactly the same name as the desktop parts. I've had to explain to several people who had watched some benchmarks of 3070s and then went and bought a laptop with a "3070" and were confused as to why they weren't getting nearly the same performance.

I know this kind of stuff is obvious to anyone who has been buying computers for a while, but there are plenty of uninitiated people who see i7 or 5800 + 3070 and assume they are going to get a certain amount of performance. When you start accounting for other differences like the same processor or gpu being run at different wattages it gets even worse.
