Stickman
Feb 1, 2004

Cygni posted:

Intel sweeps gaming still, which is the point of the 9900KS and the primary reason for sales in DIY PCs, so I would be interested to see if the microcode update hit performance at all.

Sweeps gaming if you're willing to pay a premium for a small performance boost and very high fps in games that are running at CPU-limited frame rates and actually benefit from Intel's single-thread lead. Honestly, that's something only a tiny fraction of the people who buy expensive CPUs that "sweep gaming" will ever benefit meaningfully from.

The marketing definitely helps Intel's dominance in the DIY PC arena, but it's massively overblown. See also, AMD marketing Threadripper to gamers. Or the 3950x, for that matter :v:

Stickman fucked around with this message at 20:46 on Nov 14, 2019

Cygni
Nov 12, 2005

raring to post

Stickman posted:

Sweeps gaming if you're willing to pay a premium for a small performance boost and very high fps in games that are running at CPU-limited frame rates and actually benefit from Intel's single-thread lead. Honestly, that's something only a tiny fraction of the people who buy expensive CPUs that "sweep gaming" will ever benefit meaningfully from.

The marketing definitely helps Intel's dominance in the DIY PC arena, but it's massively overblown. See also, AMD marketing Threadripper to gamers. Or the 3950x, for that matter :v:

Yes, if you want price/performance in gaming, you should probably be getting a 3600. But that wasn't really the comparison we were talking about. If you want to build a top-end gaming computer, one that will scale with future GPU upgrades as shown in CPU-limited scenarios, the 9900K/KF/KS still beat the equally priced 3900X and the much more expensive 3950X.

As for whether the price premium over a 3600 is "worth it"? We are talking about luxury toys here. People weigh the value of those luxuries differently.

And that's why the microcode numbers might be interesting. Maybe it won't be worth it for a lot of people anymore.

Stickman
Feb 1, 2004

Cygni posted:

As for whether the price premium over a 3600 is "worth it"? We are talking about luxury toys here. People weigh the value of those luxuries differently.

That's true, but my point is that almost everyone who opts to spend more for the "top gaming processor" isn't actually getting the benefit they think they're getting. Marketing has severely distorted the importance of top-end processors to gaming and the only real benefit most buyers are getting is the satisfaction of having the processor labeled "the best". People don't weigh luxuries differently in a vacuum - their assessment is heavily influenced by marketing and peer pressure.

E: But yeah, definitely agree on the microcode numbers! I doubt it will severely hurt Intel's reputation as the top gaming cpus, but as you say, it'll be useful for assessment and recommendations.

Stickman fucked around with this message at 21:37 on Nov 14, 2019

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

For a top-end processor to never be bottlenecked by a top-end GPU, one of three things has to be true: you're turning down settings (which people who buy top-end GPUs aren't wont to do), graphics technology has stagnated to the point where nobody ships higher resolutions or more processing-intensive effects, or you're playing a strategy game.

But y'all are right, people who are buying them are doing it just because it has the biggest numbers. Or are pro-tier e-thletes, I guess.

Cygni
Nov 12, 2005

raring to post

Stickman posted:

That's true, but my point is that almost everyone who opts to spend more for the "top gaming processor" isn't actually getting the benefit they think they're getting. Marketing has severely distorted the importance of top-end processors to gaming and the only real benefit most buyers are getting is the satisfaction of having the processor labeled "the best". People don't weigh luxuries differently in a vacuum - their assessment is heavily influenced by marketing and peer pressure.

E: But yeah, definitely agree on the microcode numbers!

Yeah, and the marketing from both companies has reached a fever pitch of pure dogshit. The 9900KS "all 5GHz all the time" can only do that if you manually turn TDP limiting off. The "4.6GHz" boost on the 3950X can only be hit if you aren't actually using it. Both have hosed their own special definitions of TDP to the point where it is completely meaningless. And that's not even getting into the comical benchmarks both of them put out, like that AMD PCIe 4.0 one or the Intel ones using custom, non-public software libraries.

I guess the AMD Bulldozer class action suit actually sent the wrong message. If the penalty is only $12.1m for "stretching the truth", that's fuckin' nothing. Lie all the time!

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Stickman posted:

Sweeps gaming if you're willing to pay a premium for a small performance boost and very high fps in games that are running at CPU-limited frame rates and actually benefit from Intel's single-thread lead. Honestly, that's something only a tiny fraction of the people who buy expensive CPUs that "sweep gaming" will ever benefit meaningfully from.

The marketing definitely helps Intel's dominance in the DIY PC arena, but it's massively overblown. See also, AMD marketing Threadripper to gamers. Or the 3950x, for that matter :v:

"almost nobody" notices it since "almost everyone" is playing at ultra or max graphics settings in every title, right?

Dramicus
Mar 26, 2010
Grimey Drawer
I don't think there are many people running a 9900ks with a 2080ti at medium.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dramicus posted:

I don't think there are many people running a 9900ks with a 2080ti at medium.

didn't RDR2 just come out? :smuggo:

9900k starts pulling away at the 1080 Ti/2070S/5700XT tier of performance. That's a $400 GPU... today. Do I think there are people with $400 GPUs who are lowering settings to push higher framerates? Yeah, probably. If you want to max out your 1440p 144 Hz monitor on a $400 GPU you are lowering some settings.

Like yeah we get it 2080 Ti = expensive but that's not the only situation where the 9900K shows off.

And next generation it will be the $200-300 tier of cards where the 9900K/KS are pulling away. That's next year. 6-9 months away.
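The first-order model behind all this is just: delivered fps = min(CPU fps, GPU fps). A minimal sketch in Python, with made-up placeholder numbers rather than benchmark results:

code:
# First-order bottleneck model: delivered fps is capped by whichever
# side runs out first. All fps numbers here are made-up placeholders.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Achieved frame rate is limited by the slower of CPU and GPU."""
    return min(cpu_fps, gpu_fps)

# At max settings the GPU is the wall, so a faster CPU buys nothing:
print(delivered_fps(cpu_fps=165, gpu_fps=90))   # 90 fps either way
# Lower the settings and the GPU ceiling rises past the CPU ceiling;
# that's where the faster CPU finally shows up on the fps counter:
print(delivered_fps(cpu_fps=165, gpu_fps=200))  # 165 fps, now CPU-bound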

Paul MaudDib fucked around with this message at 22:16 on Nov 14, 2019

Dramicus
Mar 26, 2010
Grimey Drawer

Paul MaudDib posted:

didn't RDR2 just come out? :smuggo:

9900k starts pulling away at the 1080 Ti/2070S/5700XT tier of performance. That's a $400 GPU... today. Do I think there are people with $400 GPUs who are lowering settings to push higher framerates? Yeah, probably.

Like yeah we get it 2080 Ti = expensive but that's not the only situation where the 9900K shows off.

I dunno, if you want 144hz with a mid tier gpu, I don't think you need a 9900k to pull it off. You need to make a case for something that only the 9900k can do, because anything less and AMD suddenly becomes very competitive.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

The 9700k/9900k can do 10-20 more fps than AMD's offerings (or could, as of Zen2's launch), provided you're entirely CPU-limited.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dramicus posted:

I dunno, if you want 144hz with a mid tier gpu, I don't think you need a 9900k to pull it off. You need to make a case for something that only the 9900k can do, because anything less and AMD suddenly becomes very competitive.

There are definitely titles where the 3600/3700X simply can't reach 144 fps at any resolution/settings. The 8700K/9900K/9900KS can't necessarily hit 144 fps either, but a high-clocked Coffee Lake is ~17% faster in gaming (according to GN's 9900KS review) so you're that much closer to doing it.

Yes, the 3600 and the 3700X are good budget+mainstream buys and you shouldn't feel bad for buying them. Does Coffee Lake slot in faster? Yes. Are some people willing to pay an extra $100-200 for 17% more CPU performance? Yes.

Coffee Lake doesn't even have to be super expensive either. Microcenter was running the 8700K for $250. That's $60 more than the 3600. 9900KF was running $415 for a while too.
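Back-of-envelope math on that premium, using the prices above and the ~17% figure from GN's review (purely illustrative):

code:
# Dollars paid per percentage point of extra CPU-bound gaming performance.
# Prices are the Microcenter examples above; ~17% is GN's 9900KS figure.

def premium_per_percent(price_hi: float, price_lo: float, gain_pct: float) -> float:
    """Extra dollars per percentage point of performance gained."""
    return (price_hi - price_lo) / gain_pct

# 8700K at $250 vs 3600 at $190, assuming the ~17% CPU-bound lead:
print(round(premium_per_percent(250, 190, 17), 2))  # ~3.53 dollars per percent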

Paul MaudDib fucked around with this message at 22:28 on Nov 14, 2019

Dramicus
Mar 26, 2010
Grimey Drawer
It sounds like at 60 Hz it doesn't matter which CPU you go for, and a locked 144 Hz is a job for next year's CPU lineups on both sides.

MagusDraco
Nov 11, 2011

even speedwagon was trolled
It would be nice if everyone had a Microcenter closer than a 6-7 hour drive. Oh well, guess most people are stuck with going Zen 2 this year unless they wanna pay more for a 9700k or something.

Stickman
Feb 1, 2004

Dramicus posted:

It sounds like at 60 Hz it doesn't matter which CPU you go for, and a locked 144 Hz is a job for next year's CPU lineups on both sides.

"Yes", and "sort of" because presumably next year's games will also be more demanding.

Paul MaudDib posted:

"almost nobody" notices it since "almost everyone" is playing at ultra or max graphics settings in every title, right?

Pretty much (and at higher resolutions, medium or high settings are still going to be gpu-limited in the sorts of games that have cpu limits below 144Hz). Gamers who will turn down settings to max their fps are very much the minority of DIY PC builders, as are gamers who would actually be able to tell the difference between 120fps and 140fps in a blind test. If that's you then by all means buy what you need to squeeze out top-end performance.

Stickman fucked around with this message at 23:14 on Nov 14, 2019

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Right now the CPUs in the consoles are so terrible that going from 30 FPS on console to >100 FPS on PC can be achieved by almost anything. Next gen consoles are going to have Zen cores. Games designed to run on CPUs that have made the enormous leap from Bulldozer > Zen are going to be MUCH more demanding, and the significant gap between AMD and Intel will be exposed far more often.

There's nothing wrong with lower end AMD CPUs for gaming. There's nothing wrong with the 3900X/3950X if you're a student gunning for HEDT on a budget or something. Buying them for gaming, which is what drives most higher-end desktop spending, is stupid. AMD is still well behind Intel in gaming performance and it's silly when people insist on pretending otherwise.

K8.0 fucked around with this message at 23:27 on Nov 14, 2019

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

MagusDraco posted:

It would be nice if everyone had a Microcenter closer than a 6-7 hour drive. Oh well, guess most people are stuck with going Zen 2 this year unless they wanna pay more for a 9700k or something.

Well, in a month or two here, the 8700K becomes the 10600K and the 9900K becomes the 10700K, and then everyone will be able to buy them at those prices.

I don't disagree that the consumer line needs to be reset in relation to Zen2 in the same way the HEDT lineup was. It's coming.

Intel can still hold a certain premium for their chips. 30% higher price for 15% more performance is viable. Just not 50-100% anymore.

If the 3600 is $190 and the 10600K is $250 that’s fine. If the 3700X is $300 and the 10700K is $350 that’s fine.

Paul MaudDib fucked around with this message at 23:42 on Nov 14, 2019

EdEddnEddy
Apr 5, 2012



I am just happy games finally use more than 1-2 cores/threads. Those were the days when single-thread IPC was such a big deal. Now it seems the lead Intel had has shrunk a good bit, and while benchmarks may show this or that, you really can't go wrong with any buy at the 6+ core level currently. Pair it with the best GPU you can and enjoy, at whatever price point you are aiming for. It sure is a hell of a lot nicer than the 2017 era, when 4 cores was the top of Intel's lineup unless you went HEDT.

Did a comparison of my old E5-1660 v1/2080 vs a 3800X/2080 Super build with the WoT enCore RT demo, and while my comp was beat at the equivalent res we could test at (up to 1440p), it really wasn't the blowout I was expecting. It may be old, but it still leaves me with no pressing need to build a new rig yet. Since I am at 1440p ultrawide 100Hz, I am GPU-limited in most games as it currently stands.

eames
May 9, 2009

I find that CFL really shines with fast, low latency RAM. The same is true for Ryzen but CFL does better in single threaded and/or latency sensitive titles. Unfortunately no benchmark site ever tests with really fast and optimised RAM, arguably rightfully so because most users don’t even turn on XMP.

Something like DDR4-4133 CL16 with tight sub and tertiary timings is a 20% average frame rate increase over the default JEDEC settings at stock core frequencies and when CPU bound. More like 40% at 5.2 GHz.

I’m not going to pretend that that’s anything but a rare edge case in the grand scheme of things but I’m still pleased with my 8700k over AM4 ‘cause upgrading to a 3950X would be a performance loss for muh FPS (and not a small one either). Now if AMD could catch up on single core performance and vendors would start selling DDR4-4000+ binned ECC modules...
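For reference, the first-word latency math behind "fast, low-latency RAM", as a rough sketch (DDR4-2133 CL15 as the JEDEC baseline is just an assumed example):

code:
# First-word CAS latency in ns: CAS cycles divided by the actual clock,
# which is half the transfer rate since DDR moves data twice per clock.

def cas_latency_ns(transfer_rate_mts: float, cas_cycles: int) -> float:
    return cas_cycles * 2000.0 / transfer_rate_mts

print(round(cas_latency_ns(2133, 15), 1))  # ~14.1 ns, assumed JEDEC-ish default
print(round(cas_latency_ns(4133, 16), 1))  # ~7.7 ns, the tuned kit above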

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

There's absolutely no reason to buy a 16 core CPU for gaming anyways, outside of e-peen. It's not an upgrade for anyone, it's just stupid

FlapYoJacks
Feb 12, 2009

ItBreathes posted:

There's absolutely no reason to buy a 16 core CPU for gaming anyways, outside of e-peen. It's not an upgrade for anyone, it's just stupid

The 3950x is cheaper than a 9900x, has 6 more cores, uses less power, and has a higher boost clock.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

ratbert90 posted:

The 3950x is cheaper than a 9900x, has 6 more cores, uses less power, and has a higher boost clock.

By "cheaper" you mean for another week, until the 10900X launches, right? :lol:

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

ratbert90 posted:

The 3950x is cheaper than a 9900x, has 6 more cores, uses less power, and has a higher boost clock.

There's no reason to buy a 9900X for gaming either :shrug:

The guy I was responding to was comparing it to an 8700K, and the overlap in reasonable use cases between those processors is null. If you've got a use for the cores, by all means go for it.

FlapYoJacks
Feb 12, 2009

Paul MaudDib posted:

By "cheaper" you mean for another week, until the 10900X launches, right? :lol:

Ok.

The 3950x is $150 more expensive than a 10900x, has 6 more cores, uses less power, has a higher sustained boost clock, has 45MB more L3 cache, can support ECC memory, and has PCIe 4.0 support.

Cygni
Nov 12, 2005

raring to post

No one is buying a 9900x for gaming.

Well, nobody is buying a 9900x at all anyway, to the point that I would imagine most people don't even know it exists.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Alright we're now hitting the point where people are trying to use AMD's bullshit boost clocks vs Intel's real boost clocks that the CPUs can easily sustain forever as evidence in AMD's favor. Time to hit the brakes and get back to reality people.

BlankSystemDaemon
Mar 13, 2009



ratbert90 posted:

Ok.

The 3950x is $150 more expensive than a 10900x, has 6 more cores, uses less power, has a higher sustained boost clock, has 45MB more L3 cache, can support ECC memory, and has PCIe 4.0 support.
Minor detail, but when you say ECC, what mode are you talking about?
It can be one of these:
  1. The system can go beyond POST with ECC memory plugged in
  2. The memory can correct one- or two-bit errors, but doesn't generate a non-maskable interrupt to notify the system
  3. The system can correct one- or two-bit errors, and does generate a non-maskable interrupt to notify the system
  4. A non-maskable interrupt is generated, and it causes the CPU to reset rather than write corrupt data to disk
1 is fine if you don't actually care about system stability, but then why pay the ~$20-50 premium that ECC costs; 2 is bad because it effectively makes the system behave like it doesn't have ECC once you have enough memory errors; 3 is optimal, as it lets you pick whether to ignore the NMI or panic the system; and 4 is only used in the most critical setups with active/active high availability.
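If you want to check which of those modes you actually got on a Linux box, the EDAC subsystem exposes per-controller error counters in sysfs. A minimal sketch; the paths are the standard EDAC ones, but whether they show up at all depends on the board and firmware actually wiring ECC up:

code:
# Read corrected/uncorrected ECC error counts from Linux EDAC sysfs.
# No mc* directories at all usually means ECC reporting isn't active.
from pathlib import Path

def ecc_error_counts():
    """Return {controller: (corrected, uncorrected)} from EDAC sysfs."""
    counts = {}
    for mc in sorted(Path("/sys/devices/system/edac/mc").glob("mc[0-9]*")):
        ce = int((mc / "ce_count").read_text())
        ue = int((mc / "ue_count").read_text())
        counts[mc.name] = (ce, ue)
    return counts

print(ecc_error_counts() or "no EDAC controllers found; ECC reporting not active")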

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
If you buy the X470D4U or X470D4U-2T it’s 3. I personally would not rely on ECC for any board that is not a server-oriented board.

I have a little heartburn every time someone says ECC is supported. No, AMD didn't go out of their way to disable it, but there's no support there. It's not an official feature, and if it breaks, sucks to be you. Nobody at board partners is actually testing it and there is no guarantee it doesn't silently break (revert back to 1) the next time AMD releases a new AGESA or the new unified AGESA or something.

It's like using undocumented instructions on a Z80 clone or something. Does it work? Yes. Will it continue to work? Probably. Is it Zilog's problem if they do a new stepping and things silently break in new and exciting ways? Nope. That's what unsupported means: not necessarily non-functional.

The exception is the server boards. ASRock Rack is going out of their way to test it on those boards and sells that as an official feature.

Paul MaudDib fucked around with this message at 09:39 on Nov 15, 2019

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
What do you mean by "not an official feature"? If you go to the specs page for the X470 Taichi it says "ECC and non-ECC un-buffered" clear as day next to Memory. The manual makes specific reference to it also, including a data poisoning feature that seems like it would be nonfunctional if the board didn't actually use the ECC functionality of ECC memory.

Khorne
May 1, 2002
I have access to a 9900k & a 3900x, plus a 2070 Super and a 1070. I run games at settings that keep them CPU-bound.

There's no functional difference between the 3900x and 9900k in any title I play. Even in games like Anno 1800, if you set things up properly there are situations where the 3900x gets 3-4 fps higher averages (both CPUs get ~80-90 fps in Anno 1800 on low settings when you have a fully developed world).

I also have a 240Hz monitor and a 165Hz 1440p monitor. Both hit > 240 fps minimums in the same games, both hit >165 fps minimums in the same games.

The difference in gaming is overblown unless you only play StarCraft 2, which heavily uses ints and is hit hard by latency, or you're trying to play new single-player games at competitive fps levels, which is just weird. I suppose there are certain games that benefit from getting higher-than-refresh-rate fps, and in that case the 9900k does have a 6%-12% lead in some titles. There are even games where the Zen 2 processors win in fps, especially if you're willing to play them in Linux. In modded Minecraft, Zen 2 wins by ~5%-8% when run under Linux.

They're roughly equivalent in performance, will remain that way, and one has 4-8 more cores if you need or want those.
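For what it's worth, the "minimums" above are the usual 1%-low calculation over frame times. A quick sketch with invented sample data, not my actual captures:

code:
# 1% low fps: average frame rate over the slowest 1% of frames.
# The frame times below are invented sample data.

def one_percent_low_fps(frame_times_ms):
    """Average fps over the slowest 1% of frames."""
    worst = sorted(frame_times_ms, reverse=True)
    tail = worst[:max(1, len(worst) // 100)]
    return 1000.0 / (sum(tail) / len(tail))

frames = [4.1] * 990 + [7.9] * 10           # mostly ~244 fps, a few slow frames
print(round(one_percent_low_fps(frames)))   # ~127 fps as the 1% low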

Khorne fucked around with this message at 20:18 on Nov 15, 2019

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
The point is that ECC is not a guaranteed feature set of AMD CPUs, but rather up to the individual motherboards to bother implementing (or not), including how they're going to implement it. And if some future microcode change or updated CPU stepping messes ECC up, oh well, AMD wasn't really selling that as a feature so too bad for you. (not that I expect them to actually do this, mind you)

It's not the same as when Intel says something is ECC compatible and you know you're going to get option 3 (or 4, depending) from the post above. Then again, you're not paying extra for the feature with AMD, either.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Eletriarnation posted:

What do you mean by "not an official feature"? If you go to the specs page for the X470 Taichi it says "ECC and non-ECC un-buffered" clear as day next to Memory. The manual makes specific reference to it also, including a data poisoning feature that seems like it would be nonfunctional if the board didn't actually use the ECC functionality of ECC memory.

Asrock isn't actually validating the consumer boards before release though. They can and have broken it before.

If you update your BIOS and it stops working then you better hope you notice it, then you send Asrock an email and hope they fix it. It's not an official feature and they don't have to. If it's AMD who broke it in AGESA they may not even be able to, apart from filing a bug ticket with AMD and hoping they fix it.

It's not supported, it's an undocumented feature that some partners surface for you. That's not the same thing as an ECC supported platform.

The ongoing development on AM4 actually makes this worse. AMD finally finished up with Zen and Zen+ last year, this year they churned everything for Zen2, next year they are releasing the "unified" AGESA and churning everything again. AM4 is never stable and finished, AMD is constantly doing major revisions in their firmware. And every time they do that, Asrock has to go back and change how they hook the BIOS into AGESA (you know, all the fun stuff like boost and so on that HUB was blaming them for). So you would really want someone to actually be paying attention and making sure that ECC actually stays working on every build.

Paul MaudDib fucked around with this message at 20:31 on Nov 15, 2019

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
e: hurf durf, never mind, I don't read good

TheFluff fucked around with this message at 20:35 on Nov 15, 2019

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

TheFluff posted:

He was specifically and explicitly talking about Asrock Rack, which is a separate Asrock brand for "prosumer" boards, where ECC memory is specifically listed on the memory QVL. But keep on keeping on, Paul.

:chloe:

He was talking about the X470 Taichi, as he explicitly linked and discussed in his post, if you bothered to read

I was the one talking about the Asrock Rack, which I advocated as being something that would actually get tested for ECC, since that's a feature that segment actually cares about, as opposed to trusting the consumer stuff

Paul MaudDib fucked around with this message at 20:37 on Nov 15, 2019

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Paul MaudDib posted:

:chloe:

He was talking about the X470 Taichi, as he explicitly linked and discussed in his post, if you bothered to read

I was the one talking about the Asrock Rack, which I advocated as being something that would actually get tested for ECC, since that's a feature that market actually cares about

yes my bad I misread the previous conversation

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Paul MaudDib posted:

Asrock isn't actually validating the consumer boards before release though. They can and have broken it before.

If you update your BIOS and it stops working then you better hope you notice it, then you send Asrock an email and hope they fix it. It's not an official feature and they don't have to. If it's AMD who broke it in AGESA they may not even be able to, apart from filing a bug ticket with AMD and hoping they fix it.

It's not supported, it's an undocumented feature that some partners surface for you. That's not the same thing as an ECC supported platform.

The ongoing development on AM4 actually makes this worse. AMD finally finished up with Zen and Zen+ last year, this year they churned everything for Zen2, next year they are releasing the "unified" AGESA and churning everything again. AM4 is never stable and finished, AMD is constantly doing major revisions in their firmware. And every time they do that, Asrock has to go back and change how they hook the BIOS into AGESA (you know, all the fun stuff like boost and so on that HUB was blaming them for). So you would really want someone to actually be paying attention and making sure that ECC actually stays working on every build.

I don't think the thread you link really supports as pessimistic of a view as you're portraying. The guy saw that the BIOS was reporting ECC as disabled, so he contacted ASRock to ask about it. ASRock said "that's not right, we plan to fix it going forward and in the meantime here's a beta BIOS to get it working again." The very next official release had ECC working again. So - not only did the device give a clear notification that the feature wasn't working instead of masking it, but the manufacturer's first response was to fix the feature pretty much ASAP. What more would you ask for if you reported an issue with any other feature?

Also, why are you calling it 'undocumented'? It's in the spec sheet and the manual, it's definitely documented. Saying "it's not an official feature" is another undefined (meaningless?) statement, and "they don't have to [fix issues]" is true of anything you buy without a support contract. I get not wanting people to get their hopes up but this seriously comes off as FUD.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
The difference between AMD's take on it and Intel's is that with Intel, if you buy a CPU that explicitly supports ECC, and a board that supports ECC, you get the ECC behavior you'd expect, full stop. Upgrades, etc., won't break it. The counter being if the CPU doesn't explicitly support ECC, then it doesn't support it at all, full stop.

With AMD, their consumer line is a grey area where AMD isn't actually advertising their chips as ECC compliant, but some motherboards have made it work anyhow. Since AMD isn't selling it as a feature set the way Intel does, that ECC support isn't guaranteed to work, and as noted, it's entirely up to the motherboard OEM to keep on top of things, which they may or may not do; if they don't, you have fairly little recourse.

It's a nice bonus if you're trying to put together a small server for absolutely the lowest cost, but if it's something I actually cared about I'd spring for a CPU and motherboard that explicitly supported ECC--server grade stuff.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
Sure, so hypothetically AMD could push out an AGESA update that breaks ECC across the whole Ryzen line and tell everyone to gently caress off. In that case ASRock would just have to stop advertising support, and anyone with a working ECC system would have to stop updating or lose that feature.

The reason I'm calling it FUD is that as far as I can tell there's no reason for AMD to do this (and there are some reasons for them not to do it), with nothing indicating that it's likely to happen. The actual reality today is that ECC seems to work just fine on Ryzen platforms which advertise it. The idea that it won't in the future is, again as far as I can tell, based on nothing except "hey they can't get sued if they decide to do this".

Eletriarnation fucked around with this message at 21:26 on Nov 15, 2019

BlankSystemDaemon
Mar 13, 2009



Paul MaudDib posted:

I personally would not rely on ECC for any board that is not a server-oriented board.
Not even that will guarantee anything unless it's from a reputable ODM like Supermicro.

Eletriarnation posted:

Sure, so hypothetically AMD could push out an AGESA update that breaks ECC across the whole Ryzen line and tell everyone to gently caress off. In that case ASRock would just have to stop advertising support, and anyone with a working ECC system would have to stop updating or lose that feature.

The reason I'm calling it FUD is that as far as I can tell there's no reason for AMD to do this (and there are some reasons for them not to do it), with nothing indicating that it's likely to happen. The actual reality today is that ECC seems to work just fine on Ryzen platforms which advertise it. The idea that it won't in the future is, again as far as I can tell, based on nothing except "hey they can't get sued if they decide to do this".
20 years of industry experience has taught me that "ECC supported via (un)registered DIMMs" doesn't mean "the firmware will generate non-maskable interrupts so you get to decide what happens", no matter how much anyone hopes that that's what it means.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

D. Ebdrup posted:

Not even that will guarantee anything unless it's from a reputable ODM like Supermicro.

20 years of industry experience has taught me that "ECC supported via (un)registered DIMMs" doesn't mean "the firmware will generate non-maskable interrupts so you get to decide what happens", no matter how much anyone hopes that that's what it means.

With all due respect to your 20 years of experience what you fundamentally seem to be saying is, irrespective of the technology, "don't trust the marketing materials, we don't really know it works until we see it work". I feel like that's kind of obvious. It's not too difficult though to find reports of people actually trying this, and this one even mentions an observable difference in interrupt behavior between having Data Poisoning on and off.

If there were no information at all out there, I would share your hesitation at going "Huh, it seems likely that this feature works". However, I'm finding a lot of people saying "I tried it and as far as I can tell it works, with this motherboard that claims to support it." I'm sure you know more about ECC and NMIs than I do so maybe you can read these reports and identify something that I and they missed, but the "I haven't personally seen it work so it all could be bullshit" take is kinda useless in my opinion when we're talking about released products that people are actually using and are not actually reporting problems with.

Like, I'm not recommending that people hock their Xeons for Ryzens because "it totally works guys, guaranteed" but the skepticism that it's doing what it is supposed to seems unsupported.

Eletriarnation fucked around with this message at 23:10 on Nov 15, 2019

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

ECC without NMI is unacceptable for server applications, but I do not agree that it matters that much for home/gaming use. Even perfectly healthy DIMMs can bitflip occasionally, and ECC will stop that from impacting you, as well as generally increasing overall reliability. Yeah, if you are losing a DIMM then troubleshooting it will be a pain in the rear end, but a memtest will generally turn it up eventually, and in the meantime ECC reduces the likelihood of system problems being caused by RAM.
