Dr. Video Games 0031
Jul 17, 2004

Subjunctive posted:

I thought “thread” was the unit of “doing things at once” — what do you mean here by a thread doing multiple things at once? Can they split and schedule the execution units more finely than at thread granularity? I feel like I’m missing something big!

"Thread" in this case is the logical threads on a multi-threaded CPU core. Yeah, sure, each one can only do one thing in any given cycle, but threads are clearly capable of sharing their compute time among many different tasks at once. How do you think we did things before multi-core CPUs and hyperthreading?

The point I was trying to make (which was also made in the videos I linked) is that CPUs aren't as rigid as a lot of people seem to imagine. It's the software that tends to be too rigidly coded, if anything. You can have many tasks assigned to one logical thread, and you can split tasks between multiple threads if it's coded well—modern CPU schedulers are quite flexible in that way. What matters most in games is that you have enough computational capacity to handle those tasks before a frame is drawn by the renderer so as not to bottleneck performance. It just so happens that adding extra cores is the easiest way of adding extra "capacity." It will by no means limit game performance if your CPU is six cores just because the game was originally coded for eight so long as your six cores are more powerful. I also doubt that most games on modern consoles are going to be maxing out their CPUs often to begin with.

note: i'm definitely no expert on this, this is just my understanding.
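The many-tasks-per-thread idea is easy to demonstrate: a scheduler (the OS, or a runtime's thread pool) will happily interleave far more tasks than there are hardware threads. A rough sketch in Python, where the task itself is just made-up busywork standing in for per-frame game jobs:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_task(task_id):
    # Stand-in for one unit of per-frame game work (AI, physics, audio...)
    total = 0
    for i in range(10_000):
        total += i
    return task_id, total

# 16 tasks scheduled onto only 4 worker threads: each thread simply
# picks up the next pending task whenever it finishes its current one.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(simulate_task, range(16)))

print(len(results))  # all 16 tasks complete despite only 4 threads
```

The point survives the toy example: nothing about "6 threads" caps you at 6 tasks, it's the total throughput before the frame deadline that matters.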

Dr. Video Games 0031 fucked around with this message at 14:24 on Aug 22, 2021


Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Dr. Video Games 0031 posted:

"Thread" in this case is the logical threads on a multi-threaded CPU core. Yeah, sure, each one can only do one thing in any given cycle, but threads are clearly capable of sharing their compute time among many different tasks at once. How do you think we did things before multi-core CPUs and hyperthreading?

Yeah, I’m familiar with software scheduling. I took “at once” in your statement to be some kind of actual hardware parallelism and not interleaved scheduling by the OS in rapid sequence.

VorpalFish
Mar 22, 2007
reasonably awesome™

The cores in the 5600X should be more than 30% faster than the cores in the console SOC in the best possible scenario for the console.

This should be plenty to make it a faster gaming CPU for the lifetime of both devices.

By all means, if money is no object and more cores makes you feel better get more cores. But if you have a budget for your gaming pc and you're choosing between 5800X / gpu or 5600X / gpu+ (or 11400 / gpu++), the combination that gets you the most gpu is probably the play.

Within reason I guess - don't buy a 3090 to drive a 1080p display or whatever.

VorpalFish fucked around with this message at 16:55 on Aug 22, 2021

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
as I said to the other guy, the questions are what you've got, and how long you intend to be running the build.

6C12T is fine for now but people were already talking years ago about how 6C6T is getting saturated and starting to have impacts on frametimes, and it's hard to imagine that in another couple years 6C12T isn't getting saturated as well. If you're doing a build in late 2021 it seems like the goal would be to make it to at least late 2023 when 2nd-gen DDR5 platforms like Zen5 are available, i.e. at least 2 years. So how is 6C12T going to be doing in another 2 years, is the question.

I think the 5600X is going to make it for another 2 years, it's just a relatively expensive option to get there. Like it's a $300 processor, maybe $280 at a discount. That's a relatively expensive upgrade, and it's hard to imagine a 6C processor having much wind left in its sails in 2024+. And we know there is something better in the works (it's pretty solid at this point that AMD is going to do a VCache refresh to counter Alder Lake, whether that happens in Q4 of this year or Q1 of next year).

2 years is actually a pretty short upgrade cycle even for a mid-tier processor, the goal used to be to get an i5 processor to last 3 years and an i7/i9 to last 5 years. And the 5600X is priced at a higher level than even the i5 processors used to be, and you're getting a very short expected lifespan out of it. The time to buy a 6C12T processor was 2016, 2017, 2018, maybe early 2019 with a healthy discount. Late 2021 you are very late to the 6C party and you certainly should not be paying $300 for it at this stage of the game.

And if you're coming from a 3600 you've already got 6 cores and you've got decent per-thread performance, and you're spending all this money upgrading to get a little higher per-thread performance while not moving up in core count any. In this case I'd at least wait until the Zen3 Refresh comes out and you get the actual best-in-socket architecture, like it makes zero sense to rush an upgrade here right before the final Best In Socket thing comes out, while paying premium pricing for the new (old) thing. It's a very expensive, short term upgrade to begin with (even 6600X will be an expensive, short term upgrade) and then you're not even getting the Actual Best Chip.

If Zen3 had kept prices at Zen2 levels then sure, I guess it'd make sense to upgrade to a $200 processor where you can sell your old chip for $150 or something and recoup most of the expense, but with the price increase it's almost into the price ranges where you used to find i7s, so you are basically stepping up a whole price class for a bit more single-threaded performance. That's not really a good deal.

imo a 3600 owner should absolutely be either waiting for DDR5 platforms (Zen4 next year, or Zen5) or at least waiting for Zen3 Refresh at an absolute minimum if you just want to do an incremental upgrade, unless you've found something the 3600 is just critically failing in and actively need an upgrade immediately. Even as far as Zen3 Refresh you should probably be looking at the 6900X rather than the 6600X or whatever, so you step up in core count as well, because 6C just isn't going to cut it in the 3-5 year window imo.

Yes, Zen3 has a per-thread performance advantage over consoles, but consoles also generally target much lower framerates than PCs typically do. This is changing, but even still consoles do not really target the same ultra-high framerates that PC users do. So when you make the "6 fast cores equals 8 slower ones" argument, you do have to bear in mind that's at console framerates, and PC framerates will eat up another big chunk of CPU as well. Like I said, 6C12T is fine for now, but is it going to be fine in 2024+? People were already whining about frametimes on 6C6T like two years ago, and a couple years from now it seems improbable that 6C12T won't be at the limit as well; SMT will get you a limited performance boost, but not that much.

Paul MaudDib fucked around with this message at 22:40 on Aug 22, 2021

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I agree with Paul, at least to an extent. I think the 5600X will largely be "fine" for this whole console generation, but there will probably be badly designed games here and there that really rely on 8c/16t and have frametime consistency issues. Those probably aren't a large concern, but the overall benefit is questionable. Typically you aren't going to be satisfied with the 30-60 FPS performance of console, you're going to want significantly more, and dropping 8 cores of work on 6 faster cores won't leave room to scale up as we get a few years into this generation and titles start to really load up the CPUs. Zen3 is great, but if you have a CPU that is "good enough" today and you expect you can hang on until the second generation of DDR5, it's probably a better use of your money.

Over the past decade plus, PC gamers have been used to having our CPUs be 2-3x faster than console CPUs and easily not being single thread performance limited when targeting 1.5-3x console performance. That's not going to be the case as we start seeing next-gen exclusives, and if you want to push performance to the extent that you want the upgrade from a 3600X to 5600X, that 5600X is unlikely to truly satisfy you for all that long.

Cygni
Nov 12, 2005

raring to post

Fwiw, AMDs presentation on their chip stacking technology at Hot Chips today seemingly confirmed again that an AM4 Zen 3 + V-cache is happening. Will be interesting to see if AMD can launch before Alder Lake.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Cygni posted:

Fwiw, AMDs presentation on their chip stacking technology at Hot Chips today seemingly confirmed again that an AM4 Zen 3 + V-cache is happening. Will be interesting to see if AMD can launch before Alder Lake.

finally an XT refresh that will mean something!

VorpalFish
Mar 22, 2007
reasonably awesome™

I agree with the sentiment that someone currently on a 3600 or greater Zen2 shouldn't be looking to upgrade at all at this point - I was looking from the perspective of a new build or an upgrade from something much older, maybe Zen or pre-Skylake Intel.

But ultimately, a hypothetical single core that is 8 times as performant as each core of an 8-core CPU will outperform it in all circumstances. The reason we're not still doing that is because we can't. Really high per-core performance can carry you a long way, and the 5600X has a ton of that.

Couple that with how long console generations are and I'd be more than willing to predict the 5600x is golden in games for at least 5 years. Is it gonna be the next 2500k and carry you for a decade? Maybe not. But it will be plenty good enough that I have no compunction telling people to shift additional money to the gpu instead in a gaming build.

I guess if you're a competitive FPS player trying to hold 300 fps or whatever the calculus changes as well. I'm thinking in terms of 120fps targets.

Arzachel
May 12, 2012

VorpalFish posted:

Couple that with how long console generations are and I'd be more than willing to predict the 5600x is golden in games for at least 5 years. Is it gonna be the next 2500k and carry you for a decade? Maybe not. But it will be plenty good enough that I have no compunction telling people to shift additional money to the gpu instead in a gaming build.

Said 2500K got obsoleted because of an 8-core netbook CPU :v:

It's hard to tell how things will shake out. Realistically, one of the cores on consoles is going to be pinned for the OS, and we have no idea whether anyone will actually use all 14 available threads. 6/12 vs 7/14 is a much smaller difference than 4/4 vs 8/8, but the cores themselves are much closer in power. If you are going to be sticking with the system for a long time, getting a 5800X is probably not a bad choice despite the awkward pricing.
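The "one core pinned for the OS" idea has a direct PC analogue in CPU affinity masks. A quick Linux-only sketch with Python's stdlib (the choice of which CPU to reserve is arbitrary here):

```python
import os

# Which logical CPUs can the current process run on?
available = os.sched_getaffinity(0)
print(f"{len(available)} logical CPUs available: {sorted(available)}")

# A console-style reservation: take one CPU out of the "game's" mask,
# leaving it for the OS. (Linux-only; picking the highest-numbered CPU
# is an arbitrary choice for illustration.)
if len(available) > 1:
    reserved = max(available)            # pretend this core belongs to the OS
    game_mask = available - {reserved}
    os.sched_setaffinity(0, game_mask)   # our threads now avoid that core
    print(f"restricted to {sorted(os.sched_getaffinity(0))}")
    os.sched_setaffinity(0, available)   # restore the original mask
```

Consoles enforce this at the platform level rather than per-process, but the mechanism is the same: the scheduler simply never places your threads on the reserved core.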

Dr. Video Games 0031
Jul 17, 2004

Arzachel posted:

Said 2500K got obsoleted because of an 8-core netbook CPU :v:

It's hard to tell how things will shake out. Realistically, one of the cores on consoles is going to be pinned for the OS, and we have no idea whether anyone will actually use all 14 available threads. 6/12 vs 7/14 is a much smaller difference than 4/4 vs 8/8, but the cores themselves are much closer in power. If you are going to be sticking with the system for a long time, getting a 5800X is probably not a bad choice despite the awkward pricing.

Spending that extra $100 to $150 on a better GPU will position you even better for the future than a CPU that doesn't provide even a single percent improvement right now. It seems insane to suggest a 5800X as a means of future proofing unless it's to a person who is buying a 3080 or higher and thus has nothing to upgrade to cheaply there.

Though I find it very hard to believe there will be a meaningful difference at all between the two over the course of this console gen. Stuff like all the extra L3 cache AMD will be adding to their chips will make a far bigger difference, and I expect the 5600X and 5800X to age and fall behind at almost the exact same pace.

Dr. Video Games 0031 fucked around with this message at 15:01 on Aug 23, 2021

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
yeah I have to agree with the good doctor of videogames - the price difference between a 5600X and a 5800X could be better put toward a better GPU than a better CPU (at least in a theoretical sense where MSRP / price scaling properly exists), especially with how much more important the GPU is at higher resolutions

Dramicus
Mar 26, 2010
Grimey Drawer
My "plan" was to go for more cores now, and then gradually up the resolution as better GPUs are released. That way the CPU should outlive the GPU(s) by quite a margin.

Arzachel
May 12, 2012

Dr. Video Games 0031 posted:

Spending that extra $100 to $150 on a better GPU will position you even better for the future than a CPU that doesn't provide even a single percent improvement right now. It seems insane to suggest a 5800X as a means of future proofing unless it's to a person who is buying a 3080 or higher and thus has nothing to upgrade to cheaply there.

Though I find it very hard to believe there will be a meaningful difference at all between the two over the course of this console gen. Stuff like all the extra L3 cache AMD will be adding to their chips will make a far bigger difference, and I expect the 5600X and 5800X to age and fall behind at almost the exact same pace.

Let's be real here, how much GPU will 100 bucks buy you right now?

NewFatMike
Jun 11, 2015

From the GPU thread: Zen 4 looking like it'll have integrated graphics:

https://videocardz.com/newz/amd-zen4-ryzen-cpus-confirmed-to-offer-integrated-graphics

Real interested to see if the mad lads put it in the IO die.

CoolCab
Apr 17, 2005

glem
oh lol i thought i made this post in this thread and i was like :confused: where did it go :confused:

CoolCab posted:

i like it, i have always used a GPU but even a simple lovely iGPU would be very nice for diagnostics or certain goofy linux distros which don't offer good 3000 series support. plus you can use the motherboard display ports, right?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
oh my god all that L3 cache with an iGPU :circlefap:

CoolCab
Apr 17, 2005

glem
one of a handful of "tech pitfalls you hear about" i know for a fact happened to a friend of mine (same as buying either a SATA or NVMe m.2 SSD for the other kind of port) is buying a ryzen chip and intending to buy a gpu after the shortage. after all, how bad could it be, he said, in late 2020, and besides, everyone raves about what good value ryzen is, surely a modern cpu must work on its own.

CoolCab
Apr 17, 2005

glem
oh, lol, that very same aforementioned guy who bought the wrong kind of SSD (and had to sell it at a loss because he lost his receipt) also an unrelated time walked into some retail place and bought high end 7200RPM drives for his NAS, then lost the receipt. maybe he was a special case.

NewFatMike
Jun 11, 2015

gradenko_2000 posted:

oh my god all that L3 cache with an iGPU :circlefap:

3D cache on the IO die for GPU is gonna be pretty whack if that's the way they go.

Ihmemies
Oct 6, 2012

CoolCab posted:

oh, lol, that very same aforementioned guy who bought the wrong kind of SSD (and had to sell it at a loss because he lost his receipt) also an unrelated time walked into some retail place and bought high end 7200RPM drives for his NAS, then lost the receipt. maybe he was a special case.

One could just scan the receipts straight to Google Drive before walking out of the shop...

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Shops should email you the receipt, OCUK and Argos do this for example. If they don't, they really ought to! It's 2021.

Whitest Russian
Nov 23, 2013
Has anyone else run into quirks between a 5600X and a 30-series GPU? I did a clean Windows install before I installed my 3080, but I had some weird freezes/crashes afterwards. I used DDU (per EVGA's instructions) and my initial quirks went away, but I got new ones instead. My drivers/firmware/BIOS are all up to date.

These all describe what was happening at first:

https://www.google.com/search?q=3080+limp+mode
https://www.google.com/search?q=3080+low+gpu+usage
https://www.google.com/search?q=3080+high+framerate+and+locked+fps

I didn't get any recordings of the limp mode in action.


Pre DDU:
https://i.imgur.com/zvCQDD5.mp4

Post DDU:
https://i.imgur.com/YwDY2m3.mp4
https://i.imgur.com/izs3Jwl.mp4
https://i.imgur.com/xEdSWfl.mp4

My monitor also flickers for a few seconds after the computer wakes up.

Whitest Russian fucked around with this message at 20:53 on Aug 23, 2021

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Zedsdeadbaby posted:

Shops should email you the receipt, OCUK and Argos do this for example. If they don't, they really ought to! It's 2021.

yeah I was gonna say too... what kind of store can't look up a receipt in 2021? usually if you give them your card they can look up all the transactions that were done with that card.

Cygni
Nov 12, 2005

raring to post

I don't really have high expectations for the iGPU performance if it is going to be integrated into the full desktop product stack, for all the same reasons that Intel doesn't spend more of their die size/power budget/bandwidth budget on the iGPU for their desktop products.

I do hope we are getting closer to more fun stuff involving chiplets and iGPUs. I know there are lots of limitations there like the cost of making additional masks, wafer limits, bandwidth, marketing, power budgets, latency, etc etc... but the mix-and-match chiplet future does seem to be inevitable on desktop, and could make for a more cost efficient market for CPUs for lots of various niches.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
some interesting digging on the gigabyte leak and the possible implications for Zen4's architecture

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
That section about the TLB and its coverage kind of pisses me off in that Windows still doesn't do large pages, unless specifically enabled during boot and requested by an application. By now this dumb OS should have had something like Linux' transparent huge pages. There's probably quite some performance left on the table because of that poo poo.

--edit:
Interesting. I went down the rabbit hole of the large page poo poo to see if things maybe changed, only to find out it's enabled for my user account. I never touched it in several fresh installs. That's interesting. You can apparently force applications via imagecfg to load their code and data segments in large pages; at least that's something to tinker with.
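For comparison, the Linux transparent-huge-page mechanism mentioned above is just a per-mapping hint. A minimal sketch with Python's stdlib mmap wrapper (whether the kernel actually backs the range with 2 MiB pages depends on its THP configuration):

```python
import mmap

SIZE = 16 * 1024 * 1024  # 16 MiB, several 2 MiB huge pages' worth

# Anonymous private mapping; mmap memory is page-aligned.
buf = mmap.mmap(-1, SIZE)

# Transparent huge pages are requested per-mapping via madvise(MADV_HUGEPAGE).
# (Windows' rough equivalent is VirtualAlloc with MEM_LARGE_PAGES, which is
# the explicit, privilege-gated path the post is complaining about.)
if hasattr(mmap, "MADV_HUGEPAGE"):
    try:
        buf.madvise(mmap.MADV_HUGEPAGE)  # hint: back this range with huge pages
    except OSError:
        pass  # kernel built without THP support; plain 4 KiB pages it is

buf[:8] = b"hugepage"   # touch the memory so pages actually fault in
data = bytes(buf[:8])
buf.close()
print(data)
```

The asymmetry is the point of the complaint: on Linux this is a one-line opt-in (or automatic, depending on `/sys/kernel/mm/transparent_hugepage/enabled`), while Windows requires the lock-pages privilege and explicit application support.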

Combat Pretzel fucked around with this message at 22:30 on Aug 23, 2021

FuturePastNow
May 19, 2014


Putting an iGPU, even a lovely one, in everything is a great idea even if you don't plan on using it. Great for that initial build/setup and for troubleshooting graphics problems later. And there are plenty of computers used for tasks that are computationally intensive but need only basic display output. Obsolete the GT 1030 out of existence.

SwissArmyDruid
Feb 14, 2014

by sebmojo
I mean, wasn't it starting to shape up that rather than having X and non-X SKUs, the G SKUs were moving in to replace the non-X?

Makes total sense to me.

If I am not misremembering, AMD putting the IGP on the IO die makes even more sense at that point: they can just switch between X and G with a different IO chiplet. Assuming they don't do the thing where all IO dies have an IGP, and they fuse it off if it's no good, because that seems like a less-efficient use of silicon.

edit: RING BUS LIVES AGAIIIIIIIIIIIIIIIN

SwissArmyDruid fucked around with this message at 22:46 on Aug 23, 2021

VorpalFish
Mar 22, 2007
reasonably awesome™

Arzachel posted:

Said 2500K got obsoleted because of an 8-core netbook CPU :v:

It took a decade. If you get 10 years out of a $200 CPU you have made about the best move you could make.

Also bear in mind that getting games to scale with thread count is in general gonna get harder for each additional thread you add.

If an 8 core CPU gives you peace of mind and you aren't budget constrained I'm all for it. Full disclosure I'm running a 5800X myself (but I bought at launch, I was gonna buy literally anything in stock). I'm just saying we can probably expect at least 5 good years out of the 5600x and if you can jump a tier gpu wise with the cost difference do that instead.
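The "scaling gets harder with each added thread" point is essentially Amdahl's law. A quick sketch; the 70% parallel fraction is a made-up figure for illustration, not a measured game workload:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Upper bound on speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Suppose 70% of a game's frame work parallelizes (assumed, not measured):
for n in (1, 2, 4, 6, 8, 16):
    print(f"{n:2d} cores -> {amdahl_speedup(0.7, n):.2f}x")

# Even with infinite cores the limit is 1 / 0.3, about 3.33x, which is
# why per-core speed still matters so much: it speeds up the serial 30% too.
```

Under that assumption, going from 6 to 8 cores buys you a few percent, while a faster core speeds up everything, including the serial portion no amount of cores can touch.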

shrike82
Jun 11, 2005

is there any reason why a bios (flash) update would "undo" itself?

my dual boot grub manager stopped working and i noticed the bios reverted to an older, launch version

Nalin
Sep 29, 2007

Hair Elf

shrike82 posted:

is there any reason why a bios (flash) update would "undo" itself?

my dual boot grub manager stopped working and i noticed the bios reverted to an older, launch version

Maybe the normal BIOS failed to initialize for whatever reason and it launched the backup BIOS?

My Gigabyte X570 board often has trouble on first boot, and I would sometimes get dumped to the older backup BIOS. I had to make sure when I flashed the latest BIOS to enable the option to ALSO flash the backup.

BurritoJustice
Oct 9, 2012

CoolCab posted:

oh, lol, that very same aforementioned guy who bought the wrong kind of SSD (and had to sell it at a loss because he lost his receipt) also an unrelated time walked into some retail place and bought high end 7200RPM drives for his NAS, then lost the receipt. maybe he was a special case.

My friend who unironically bought an 11900K also bought two SATA QLC SSDs (Samsung QVO) to run in RAID0, with no m.2

Klyith
Aug 3, 2007

GBS Pledge Week

Zedsdeadbaby posted:

I'm wary of buying six cores, since the new generation of games uses 8c/16t CPUs as a baseline.
I'm sure the 5600 is more than fast enough for a couple years yet but you never know.

The old generation of consoles had 8 cores too, and it took until nearly the end of the lifecycle before you saw games that were unhappy on a 4c/4thread non-HT PC CPU.


Use of many cores in games has not been held back by hardware limits, ever. It's held back by the fact that multithread programming is hard, and extracting more parallelism gets harder the more of it you do. I would bet large amounts of money at high odds that a 5600X and a 5800X will not have major performance differences for gaming for the next 5 years.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Klyith posted:

The old generation of consoles had 8 cores too, and it took until nearly the end of the lifecycle before you saw games that were unhappy on a 4c/4thread non-HT PC CPU.

yeah, it's not about the absolute number of cores (or threads) in a CPU, it's how much work those cores/threads can churn through overall

CaptainSarcastic
Jul 6, 2013



I want to see what Zen with 3D cache looks like, and at what pricing, before I decide on anything. My 3600X handles everything fine right now, and the Be Quiet! Shadow Rock Slim has been great (offhand I can't remember the TDP rating for it, but I think it's around 160W). Performance/price/heat are going to be what I'm looking at, as I figure I'll have this as my machine for a few more years.

Klyith
Aug 3, 2007

GBS Pledge Week

gradenko_2000 posted:

yeah, it's not about the absolute number of cores (or threads) in a CPU, it's how much work those cores/threads can churn through overall

I think that's a really bad way to phrase it because that's still talking about the hardware. A 5800X can definitely churn through more "work" than a 5600X.

What I'm talking about is purely about the software side, where "work" is not an abstract commodity that's fungible across CPU cores. I think in particular it's very difficult for games because the main loop of a game is very latency-sensitive, and most ways of adding threading also add latency because you have to send messages between threads. They've already done most of the easy ones.
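That cross-thread message cost is easy to measure crudely: bounce messages off a worker thread through queues and time the round trip. A sketch (absolute numbers vary wildly by machine and scheduler, but they're microseconds, not nanoseconds, which adds up inside a 16 ms frame budget):

```python
import queue
import threading
import time

def echo_worker(inbox, outbox):
    # Worker thread: bounce each message straight back.
    while True:
        msg = inbox.get()
        if msg is None:  # sentinel: shut down
            break
        outbox.put(msg)

inbox, outbox = queue.Queue(), queue.Queue()
t = threading.Thread(target=echo_worker, args=(inbox, outbox))
t.start()

N = 1000
start = time.perf_counter()
for i in range(N):
    inbox.put(i)    # "main loop" hands work to another thread...
    outbox.get()    # ...and stalls until the reply comes back
elapsed = time.perf_counter() - start

inbox.put(None)
t.join()
print(f"~{elapsed / N * 1e6:.1f} µs per round trip")
```

A real engine would batch work per frame rather than ping-pong single messages like this, but the overhead it's amortizing is exactly this handoff latency.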

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Klyith posted:

I think that's a really bad way to phrase it because that's still talking about the hardware. A 5800X can definitely churn through more "work" than a 5600X.

I already said before that "you want a CPU with as much overall performance [as a console]", rather than simply aiming for an equal number of cores as the console - it didn't seem to stick so I tried saying it another way

but I get what you mean and I'll cop to it being bad phrasing

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Klyith posted:

The old generation of consoles had 8 cores too, and it took until nearly the end of the lifecycle before you saw games that were unhappy on a 4c/4thread non-HT PC CPU.

I'm firmly in the '5600 will be fine' camp, but it's hard to overstate just how terrible the cores were in the last console generation.

shrike82
Jun 11, 2005

Just get the most CPU you’re able to afford

The price difference is going to pale in comparison to the various GPU tiers anyway


Inept
Jul 8, 2003

gradenko_2000 posted:

I already said before that "you want a CPU with as much overall performance [as a console]",

If you want sustained high framerates, you want more performance than a console that targets 60fps.
