gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I feel like AMD had to have known they could time this right. I don't know how much money AMD can make from forcing people to buy new boards, but either you skip Zen 3 entirely and get a new board once we're all on DDR5 / AM5 / Socket 1700, or you switch to Intel, which means buying a new board anyway, or you buy a B550. Since 9th gen has already dead-ended, there's no alternative to keeping up with new releases short of just not partaking at all, because there's effectively no threat from the competition.


HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
They're going to need to go back on this, nobody is going to swap out a board they just bought for a newer one with a loving fan on the chipset, only for it to be the end of the line for AM4, all because of a sodding flash ROM size. Utterly wasteful. The cost of a 16MB vs 32MB chip must be minuscule, yet here we are. If only they were socketed, or AMD had insisted on a decent size from day 1.

I have to assume MSI is annoyed by this, seeing as they have put larger chips on their Max boards, as well as investing in BIOS flashback. If you've got BIOS flashback, there's no reason a single-generation BIOS couldn't be made, making all size worries moot.

HalloKitty fucked around with this message at 08:56 on May 8, 2020
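The flash-size argument being made here can be sketched with toy numbers. None of the sizes below are real AGESA or firmware measurements — they're placeholders chosen only to show the shape of the tradeoff, including the "drop Gen 1" workaround suggested later in the thread:

```python
# Toy model of the BIOS flash-size argument. All sizes are made-up
# placeholders, NOT real AGESA/firmware measurements.
MIB = 1024 * 1024

# Hypothetical budget: non-CPU payload (UEFI core, setup GUI, network
# stack, logos) plus a per-CPU-generation support blob.
base_firmware = 11 * MIB
per_generation = {
    "Summit Ridge (Zen)":    2 * MIB,
    "Pinnacle Ridge (Zen+)": 1 * MIB,
    "Matisse (Zen 2)":       2 * MIB,
    "Vermeer (Zen 3)":       2 * MIB,
}

def fits(rom_bytes, generations):
    """True if base firmware plus the selected generations fit in the ROM."""
    return base_firmware + sum(per_generation[g] for g in generations) <= rom_bytes

all_gens = list(per_generation)
print(fits(16 * MIB, all_gens))       # False: everything doesn't fit in 16 MB
print(fits(32 * MIB, all_gens))       # True: a 32 MB chip has room to spare
print(fits(16 * MIB, all_gens[1:]))   # True: dropping Zen 1 support frees enough
```

Under these assumed numbers, the 16 MB chip only works if a generation gets cut — which is exactly the constraint being argued about.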

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
The number of people who own 3000 series ryzens who intend to upgrade to a 4xxx is probably on the order of 1%. AMD likely know this and are willing to put up with some barking from a tiny segment of their market.


It's not like switching to Intel in disgust improves the situation; they consistently obsolete motherboards after 2 gens.

Truga
May 4, 2014
Lipstick Apathy
well i'm glad i bought a 500 series mobo now, though unless zen3 is some kind of gift from a cyber god i don't see myself upgrading

Regrettable
Jan 5, 2010



ConanTheLibrarian posted:

The number of people who own 3000 series ryzens who intend to upgrade to a 4xxx is probably on the order of 1%. AMD likely know this and are willing to put up with some barking from a tiny segment of their market.


It's not like switching to Intel in disgust improves the situation; they consistently obsolete motherboards after 2 gens.

I am one of those people and I'm really glad I spent the extra money on an x570 motherboard right now.

orcane
Jun 13, 2012

Fun Shoe

HalloKitty posted:

They're going to need to go back on this, nobody is going to swap out a board they just bought for a newer one with a loving fan on the chipset, only for it to be the end of the line for AM4, all because of a sodding flash ROM size. Utterly wasteful. The cost of a 16MB vs 32MB chip must be minuscule, yet here we are. If only they were socketed, or AMD had insisted on a decent size from day 1.

I have to assume MSI is annoyed by this, seeing as they have put larger chips on their Max boards, as well as investing in BIOS flashback. If you've got BIOS flashback, there's no reason a single-generation BIOS couldn't be made, making all size worries moot.
B550 mainboards don't seem to use fans in general. One major problem is that B550 is almost a year late, so a lot of people bought "outdated" mainboards because that was their only affordable option, and AMD never hinted at B450 mainboards being a dead-end for Ryzen 3000 owners until now (at least with Intel you knew from the start you were not going to upgrade to new CPU generations when you bought a Z97, Z270, Z390 etc. platform).

E: Also plenty of X570 mainboards have 16 MB ROMs so AMD isn't even lying very well :v:

orcane fucked around with this message at 10:43 on May 8, 2020

Spatial
Nov 15, 2007

HalloKitty posted:

The cost of a 16MB vs 32MB chip must be minuscule
It's about 75 cents :)

Discussion Quorum
Dec 5, 2002
Armchair Philistine
If this happens then lol at trying to pick up Zen 2 at a discount/used firesale when Zen 3 hits

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I have a bit of a historical/technical question: given that the Bulldozer architecture's "modules" shared a single floating-point unit, which is an argument for why they weren't really "cores" per se, and supposedly contributed to their lackluster performance compared to their Intel contemporaries, what made AMD think that was a good idea?

It gets brought up a lot in retrospectives about FX, and I'm sitting here wondering - if it is or was so obvious to people that designing a processor like that was not a good idea, why did they go through with it? Was there some kind of "Crysis-like" moment where technology/development was moving in a direction that would have made that sort of thing perform better, except the future didn't pan out that way? Did AMD just guess wrong? Are all these retrospectives benefitting from hindsight when they couldn't have known prior to 2011 that the single-shared-FPU module design was going to turn out the way it did?

Travic
May 27, 2007

Getting nowhere fast

Puddin posted:

Try installing the 1usmus power plan. This seemed to iron out any kinks with my 3700x.

Sorry I missed this. I'll give it a try, thanks.

I tried Ryzen Master as well to get a second data point. It said the CPU was running at 800 MHz at idle. Is Task Manager just not very good at measuring things? I just figured out that it can't read DirectX 12 GPU usage, so I wouldn't be surprised. CPU-Z reports 4.2 GHz.

Travic fucked around with this message at 14:28 on May 8, 2020

Crunchy Black
Oct 24, 2017

by Athanatos
I chalk it up to AMD knowing where their bread is *still!* really buttered with the new Xbox/PS launches this year and them trying to focus on that. That said, continued idiocy/malfeasance of this kind is, as has been pointed out upthread, going to piss off the people really driving their value prop at the moment.

Hopefully the folks running the Ryzen team realize that Enterprise REALLY won't tolerate this type of poo poo either.

Truga
May 4, 2014
Lipstick Apathy

Discussion Quorum posted:

If this happens then lol at trying to pick up Zen 2 at a discount/used firesale when Zen 3 hits

it'll still happen, but you'll get a cheapo motherboard too

VorpalFish
Mar 22, 2007
reasonably awesometm

gradenko_2000 posted:

I have a bit of a historical/technical question: given that the Bulldozer architecture's "modules" shared a single floating-point unit, which is an argument for why they weren't really "cores" per se, and supposedly contributed to their lackluster performance compared to their Intel contemporaries, what made AMD think that was a good idea?

It gets brought up a lot in retrospectives about FX, and I'm sitting here wondering - if it is or was so obvious to people that designing a processor like that was not a good idea, why did they go through with it? Was there some kind of "Crysis-like" moment where technology/development was moving in a direction that would have made that sort of thing perform better, except the future didn't pan out that way? Did AMD just guess wrong? Are all these retrospectives benefitting from hindsight when they couldn't have known prior to 2011 that the single-shared-FPU module design was going to turn out the way it did?

I believe their thinking was that integer workloads were much more common in consumer space, so a module would behave like 2 cores most of the time.

And while it is an example of stretching the truth, I don't think that shared logic was the one thing holding the construction cores back; they were slow at everything, including single-threaded performance where the shared FPUs shouldn't matter at all. I think it would be wrong to frame the failings of the uarch as a single bad bet by AMD's design team.

Edit: it also probably would have worked better with earlier OS support; knowing, for example, to schedule on cores 0,3,5,7 then 1,2,4,6 or whatever.

VorpalFish fucked around with this message at 14:49 on May 8, 2020
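The module-aware scheduling idea can be sketched as a pure function: spread work across one core per module first, then fall back to the sibling cores. The module-major core numbering here is a simplifying assumption — real topology enumeration varies by OS and firmware:

```python
def module_aware_order(n_modules, cores_per_module=2):
    """Return a core-scheduling preference for a Bulldozer-style CPU:
    first one core from each module (so no two busy threads contend for
    a module's shared FPU/frontend), then the remaining sibling cores.

    Assumes cores are numbered module-major: module m owns cores
    m*cores_per_module ... m*cores_per_module + cores_per_module - 1.
    """
    order = []
    for slot in range(cores_per_module):   # pass 0: primaries, pass 1: siblings
        for m in range(n_modules):
            order.append(m * cores_per_module + slot)
    return order

# A 4-module / 8-"core" FX part: spread across modules before doubling up.
print(module_aware_order(4))  # [0, 2, 4, 6, 1, 3, 5, 7]
```

Which is the same shape as the "0,3,5,7 then 1,2,4,6" ordering mentioned above, modulo how the cores happen to be numbered.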

Discussion Quorum
Dec 5, 2002
Armchair Philistine

Truga posted:

it'll still happen, but you'll get a cheapo motherboard too

I'm not so sure, I suspect there are a lot of people like me who are on X470/B450 and Ryzen 1000/2000 and have been holding off until Zen 3 to upgrade. I further suspect that many, like me, will just say "gently caress it, Zen 2 and done" rather than buy a new board when the socket/chipset definitively has no further upgrade path.

Eta: to be clear, I am saying I think demand for the supply of used Zen 2 will be very high at Zen 3 launch, not that there won't be any

Truga
May 4, 2014
Lipstick Apathy
didn't construction cores end up being faster than intel equivalents like 10 years later because the extra thread per "real core" bonus is more important in modern workloads and/or game engines than a couple hundred mhz worth of clock and IPC? or was that just a couple cherry picked benchmarks?

not that that excuses the appalling performance at release, nobody buys a cpu because they want it to be better in 10 years' time, but i found it funny.

also it's not like amd has intel-like cash/monopoly to compete fairly, so they keep going for the pie-in-the-sky option. construction cores and, in hindsight, also GCN (started out fine, but then maxwell happened) were bad, but otoh zen is ridiculously good. i still lmao every time i check top and see 32 threads

Discussion Quorum posted:

Eta: to be clear, I am saying I think demand for the supply of used Zen 2 will be very high at Zen 3 launch, not that there won't be any

ah yeah, that makes sense. yeah, probably.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

VorpalFish posted:

I believe their thinking was that integer workloads were much more common in consumer space, so a module would behave like 2 cores most of the time.

And while it is an example of stretching the truth, I don't think that shared logic was the one thing holding the construction cores back; they were slow at everything, including single-threaded performance where the shared FPUs shouldn't matter at all. I think it would be wrong to frame the failings of the uarch as a single bad bet by AMD's design team.

thanks for this :)

Truga posted:

didn't construction cores end up being faster than intel equivalents like 10 years later because the extra thread per "real core" bonus is more important in modern workloads and/or game engines than a couple hundred mhz worth of clock and IPC? or was that just a couple cherry picked benchmarks?

I think what also eventually happened was that Windows (for example) started treating an FX-8100 as a "4 core, 8 thread" CPU. This gets brought up a lot in those "real core" arguments as "proof" that they're therefore not, but the practical effect is that workloads would be assigned per module first, to minimize bumping into the shared FPU, and that helped.

orcane
Jun 13, 2012

Fun Shoe

Discussion Quorum posted:

I'm not so sure, I suspect there are a lot of people like me who are on X470/B450 and Ryzen 1000/2000 and have been holding off until Zen 3 to upgrade. I further suspect that many, like me, will just say "gently caress it, Zen 2 and done" rather than buy a new board when the socket/chipset definitively has no further upgrade path.

Eta: to be clear, I am saying I think demand for the supply of used Zen 2 will be very high at Zen 3 launch, not that there won't be any

The 300 and 400 series chipsets are so similar there's no value in going 300 -> 400 let alone 400 -> "better" 400. And I wouldn't pay big bux for a jump from Ryzen 2000 to 3000, but I could see 1000 to 3000.

E: And yeah upgrading from Zen+/2 on B450 to B550 just for Zen 3 (and one upgrade to Zen 4 or whatever it will be called) is also not very attractive IMO. New computers or upgrades from even older CPUs, sure.

orcane fucked around with this message at 15:08 on May 8, 2020

Klyith
Aug 3, 2007

GBS Pledge Week

gradenko_2000 posted:

Did AMD just guess wrong? Are all these retrospectives benefitting from hindsight when they couldn't have known prior to 2011 that the single-shared-FPU module design was going to turn out the way it did?

It's this. At the time you looked at it and said "that makes a certain amount of sense".

Also the double-int-single-FPU design isn't the entirety of why the performance was so bad. The biggest problem was that it was a low-IPC design optimized for high clockspeed, and they could never hit that target (shades of Pentium 4 there). So single-thread performance was awful, and at the time that was really all that mattered. But the multi-thread performance also looked terrible, because every multi-thread benchmark is pretty much exclusively FPU math. It wasn't until 7-8 years later that workloads like games started to look like a Bulldozer, which is why you can find vids on YouTube about "gaming on an FX in 2019" showing decent performance for a 2013 CPU.

Meanwhile GlobalFlounderies was having constant cash problems and delays to process shrinks, which didn't help at all. For a while there Intel was basically lapping them. By the time GF was ready to move from 32nm to 28nm, Intel was rolling out 22nm.


Would they have done better if they'd gone the other direction and scaled up a Super-Bobcat, rather than Pentium 4 Electric Boogaloo? Hard to say. None of the Bobcat-descended CPUs were ever great performers, even at their designed task of being low-power mobile chips.

PC LOAD LETTER
May 23, 2005
WTF?!
Well, it's pretty lovely of AMD to nix support of Zen 3 across all its AM4 chipsets. That was a big deal to me and others, as enthusiasts, and I don't really understand why they'd do that.

Hopefully the mobo vendors release support anyways but that'll vary widely from vendor to vendor so you can't rely on it.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

orcane posted:

B550 mainboards don't seem to use fans in general. One major problem is that B550 is almost a year late, so a lot of people bought "outdated" mainboards because that was their only affordable option, and AMD never hinted at B450 mainboards being a dead-end for Ryzen 3000 owners until now (at least with Intel you knew from the start you were not going to upgrade to new CPU generations when you bought a Z97, Z270, Z390 etc. platform).

E: Also plenty of X570 mainboards have 16 MB ROMs so AMD isn't even lying very well :v:

B550 doesn’t need a chipset fan because it’s not PCIe 4.0 capable. The chipset runs at 3.0.

The graphics lanes run at 4.0, but the speed the graphics lanes run at has nothing to do with the chipset. AMD just segmented their lineup so that graphics could only run at 4.0 on the 500 series boards.

(The signal integrity requirements aren’t that tight, partners were getting it running on legacy boards until AMD said no, and AMD also refused to even let them release new budget boards with better integrity that were designed for it. It was X570 or nothing for a full year.)

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Truga posted:

didn't construction cores end up being faster than intel equivalents like 10 years later because the extra thread per "real core" bonus is more important in modern workloads and/or game engines than a couple hundred mhz worth of clock and IPC? or was that just a couple cherry picked benchmarks?

not that that excuses the appalling performance at release, nobody buys a cpu because they want it to be better in 10 years' time, but i found it funny.

Yeah, this is either HWUB or AdoredTV.

The FX-8350 eventually beat the 2500K in some “selected” thread-heavy games. The 2600K remains faster in all respects; even splitting a core with SMT, its hyperthreads are still faster than a Bulldozer “core”. But Bulldozer was priced such that the 8350 went up against the 2500K, so that’s semi-fair.

The part that’s a bit hand-wavey is that the 2500K’s contemporary was the 8150, not the 8350. The 8350 was Piledriver, basically Bulldozer+, and cleaned up the worst parts of the Bulldozer tire fire.

I’m also not sure if it’s a case of the 8350 ascending to the i5 level, or the i5 descending to Bulldozer’s level (in those games). I mean... would you really want to actually play on a Bulldozer CPU in 2019?

Paul MaudDib fucked around with this message at 15:56 on May 8, 2020

orcane
Jun 13, 2012

Fun Shoe

Paul MaudDib posted:

B550 doesn’t need a chipset fan because it’s not PCIe 4.0 capable. The chipset runs at 3.0.

The graphics lanes run at 4.0, but the speed the graphics lanes run at has nothing to do with the chipset. AMD just segmented their lineup so that graphics could only run at 4.0 on the 500 series boards.

(The signal integrity requirements aren’t that tight, partners were getting it running on legacy boards until AMD said no, and AMD also refused to even let them release new budget boards with better integrity that were designed for it. It was X570 or nothing for a full year.)
Fans don't have to be PCIe 4.0 related, plenty of manufacturers put them on Z490 boards now (on the VRM, not the chipset) :v:

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

yes, other components generate heat. this is not an insightful thing to point out

GRINDCORE MEGGIDO
Feb 28, 1985


If you want a vision of the future... Imagine a fan blowing on every component, forever

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Klyith posted:

Would they have done better if they'd gone the other direction and scaled up a Super-Bobcat, rather than Pentium 4 Electric Boogaloo? Hard to say. None of the Bobcat-descended CPUs were ever great performers, even at their designed task of being low-power mobile chips.

Fat Cat cores would have been fine if they stole the Excavator frontend and essentially doubled the core width and registers, with corresponding cache increases and an L3. There is nothing wrong with the design, no noticeable bottlenecks, just low throughput due to narrow execution width. Kabini die size is 114mm² on 28nm, so a double-up on cores plus 8MB L3 would have been about ~280mm² as a guess, so only slightly larger than Carrizo/Bristol Ridge itself. I don't think SMT could have been implemented with a cheap and dirty hack like this, so strictly 8C/8T, but heck, if AMD had moved to a larger socket size, 16C/16T quad-channel HEDT would have been possible.

In many respects it'd look like a stripped down Zen, so like take a 1700X, disable SMT and clock the ram at 1866 to get an idea of the performance? Might be optimistic.
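The area guess above, written out as arithmetic. Only the Kabini die size comes from the post; the L3-plus-interconnect figure is an assumed round number, not a measurement:

```python
# Back-of-the-envelope die-area estimate for a hypothetical scaled-up
# cat-core chip. Only kabini_mm2 comes from the post; the L3 figure is
# an assumed placeholder.
kabini_mm2 = 114                  # Kabini die on 28 nm, per the post
doubled_cores = 2 * kabini_mm2    # double the core count (and everything else)
l3_plus_fabric_mm2 = 50           # assumed: 8 MB L3 + extra interconnect on 28 nm

estimate = doubled_cores + l3_plus_fabric_mm2
print(estimate)                   # 278, in the ballpark of the ~280 mm² guess
```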

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Crunchy Black posted:

Hopefully the folks running the Ryzen team realize that Enterprise REALLY won't tolerate this type of poo poo either.

Huh? Enterprise won't give a single poo poo.

Enterprise-y places happily buy servers from Dell and HPE that start in the neighborhood of $15k and go from there. They'll buy whatever fits this year's budget and lets them hit their projected capacity targets. No one working in those environments cares about motherboards at all -- even the "indies" of enterprise buy from Supermicro, so it's someone else's job to worry about the components. All that matters is a bulletproof warranty and RMA experience so that your rear end is covered.

teagone
Jun 10, 2003

That was pretty intense, huh?

Where the hell is desktop Renoir?

orcane
Jun 13, 2012

Fun Shoe
I guess it would make sense to launch it with/shortly after B550 so there are actually APUs for the new chipset.

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.
I would assume they'll only launch desktop Renoir when they build up enough stock of chips that don't meet the bins for mobile parts since they need to build mindshare in the laptop space much more than they need to in the desktop space.

EDIT: Although the poster above me is also probably right that they will need an APU for B550 sooner rather than later, so we'll see.

FAKE EDIT: it would also suck if the chipset lockout also meant Renoir wouldn't work in the Deskmini A300 since I'd love a pint-sized machine with 8 cores and decent integrated graphics.

Mr.Radar fucked around with this message at 23:10 on May 8, 2020

orcane
Jun 13, 2012

Fun Shoe
B550 doesn't support Zen+, which the Ryzen 3000 APUs are using (hence the elaborate sticker saying "only Ryzen 3000 except 3200G/3400G"), so maybe the Zen 2-based Renoir will work on <500 series chipsets.

On the other hand, since the reasoning for which CPU is supported on which chipset is completely fabricated, they could just prevent the old chipsets from running anything named Ryzen 4000 for the sake of it.

SwissArmyDruid
Feb 14, 2014

by sebmojo
They really need to make the APU series numbering match the CPU series numbering; this is only a cluster because of how last-gen APUs use current-gen CPU numbers.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

GRINDCORE MEGGIDO posted:

If you want a vision of the future... Imagine a fan blowing on every component, forever

AMD: "I heard watercooling was all the rage. That's good...that's good...."

WhyteRyce
Dec 30, 2001

teagone posted:

Where the hell is desktop Renoir?

I want one too because my Plex server + ip camera DVR needs a bump :(

Anime Schoolgirl
Nov 28, 2002

Mr.Radar posted:

I would assume they'll only launch desktop Renoir when they build up enough stock of chips that don't meet the bins for mobile parts since they need to build mindshare in the laptop space much more than they need to in the desktop space.

EDIT: Although the poster above me is also probably right that they will need an APU for B550 sooner rather than later, so we'll see.

FAKE EDIT: it would also suck if the chipset lockout also meant Renoir wouldn't work in the Deskmini A300 since I'd love a pint-sized machine with 8 cores and decent integrated graphics.

orcane posted:

B550 doesn't support Zen+, which the Ryzen 3000 APUs are using (hence the elaborate sticker saying "only Ryzen 3000 except 3200G/3400G"), so maybe the Zen 2-based Renoir will work on <500 series chipsets.

On the other hand, since the reasoning for which CPU is supported on which chipset is completely fabricated, they could just prevent the old chipsets from running anything named Ryzen 4000 for the sake of it.
I bought a Deskmini with an Athlon 200GE mostly because I may possibly upgrade it to a Zen 2 APU (did not get a 3400G because it'd run too hot for the coolers I have with it). It'll be really lovely if they locked Zen 2 APUs out of the 300/400-series chipsets.

Anime Schoolgirl fucked around with this message at 03:00 on May 9, 2020

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Supposedly Renoir is not locked out for X370/470 and B350/450, which is even more lol, but it's also from Hassan, so uh, gonna need a microscopic amount of salt.

Still think the growing backlash and bad press is going to force AMD's hand on this. It's not like PCIe 4, which is niche enough to not really bother or affect the majority of users; this substantially affects near drat everyone invested in the AM4 platform.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I would lean more to the “Renoir will be supported because it’s Zen 2” than the “Renoir won’t be supported because it’s Ryzen 4000” side of things. It’s a dumb naming scheme, sure, but I think it’s already supported on B450 mobile platforms (unless those are running in SoC mode?).

Now, the one that is not obvious is Renoir on 300 series boards... with AMD getting persnickety about legacy boards, they may choose to cut that one off. Hopefully it survives, but that’s not as certain as it was two days ago.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Paul MaudDib posted:

I would lean more to the “Renoir will be supported because it’s Zen 2” than the “Renoir won’t be supported because it’s Ryzen 4000” side of things. It’s a dumb naming scheme, sure, but I think it’s already supported on B450 mobile platforms (unless those are running in SoC mode?).

Now, the one that is not obvious is Renoir on 300 series boards... with AMD getting persnickety about legacy boards, they may choose to cut that one off. Hopefully it survives, but that’s not as certain as it was two days ago.

You're still adding like at least 4 SKUs to the BIOS, maybe 5.

R7 4700G (8C/16T, 8CU)
R5 4600G (6C/12T, 7CU)
R5 4400G (4C/8T, 6CU)
R3 4200G (4C/8T, 5CU)
Athlon 4000G (4C/4T, 3CU)

Zen3 itself is likely 6, 8 tops.

R9 4950X
R9 4900X
R7 4800X
R7 4700X
R5 4600X
R5 4600

I dunno, it just seems loving stupid as all hell, especially when you can just do a 2/3/4 BIOS, because Gen 1 is basically not available and unlikely to ever be in demand again (Pinnacle is likely to be the minimum people will consider on the used market). Not one Ryzen user will shed a tear for losing out on Summit.
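One way to see why raw SKU count shouldn't matter much: support tends to be carried per die generation, with each individual SKU adding only a small table entry on top of the shared blob. A toy model — the names come from the speculative lists above, and all sizes are assumptions, not measured BIOS contents:

```python
# Toy model: each CPU generation carries one shared support blob; each
# SKU adds only a tiny table entry. Sizes are illustrative assumptions.
GENERATION_OF = {
    "R7 4700G": "Renoir", "R5 4600G": "Renoir", "R5 4400G": "Renoir",
    "R3 4200G": "Renoir", "Athlon 4000G": "Renoir",
    "R9 4950X": "Vermeer", "R9 4900X": "Vermeer", "R7 4800X": "Vermeer",
    "R7 4700X": "Vermeer", "R5 4600X": "Vermeer", "R5 4600": "Vermeer",
}

BLOB_KIB = {"Renoir": 2048, "Vermeer": 2048}  # one assumed blob per generation
ENTRY_KIB = 1                                 # assumed per-SKU table entry

def bios_cost_kib(skus):
    """Flash cost: one shared blob per generation present, plus per-SKU entries."""
    generations = {GENERATION_OF[s] for s in skus}
    return sum(BLOB_KIB[g] for g in generations) + ENTRY_KIB * len(skus)

all_skus = list(GENERATION_OF)
print(bios_cost_kib(all_skus))      # 4107: two blobs + 11 tiny entries
print(bios_cost_kib(["R9 4950X"]))  # 2049: sibling SKUs after the first are nearly free
```

Under this model the expensive decision is which *generations* a board carries, not how many models each generation has — which is also the question raised in the next post.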

Cygni
Nov 12, 2005

raring to post

Like half of a lot of X370 boards' support lists are taken up by Ryzen PRO models that no DIYer will ever get outside of an Alibaba/eBay auction, and like 10 revisions of the Athlon 200 series. And of course, still lots of A-series APUs, including ones that were never released at retail as far as I know. Even if you humor AMD's BIOS size argument, which you shouldn't, the OEMs seemingly still have fat to cut if they needed it. And like 10 other workarounds. And lots of models with plenty of space.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Is the increment of space usage a particular SKU, or is it a generation of chips (Summit/Pinnacle/Matisse/etc.)?

Like, there probably has to be some map that says the 3800X gets the Matisse microcode, but I’m not sure that map is what uses up a bunch of space, compared to the microcode itself.

Renoir would probably use a separate microcode either way, but I’m not sure stripping Pro support would really improve anything.


gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I guess a question is whether the 3100/3300X needed a BIOS update on boards that could otherwise already run the earlier Zen 2 chips, which I assume not.
