MikeC
Jul 19, 2004
BITCH ASS NARC
For some reason I have this irrational need for one, even though I am the type of gamer that maxes out graphics settings and so am GPU bound anyways.

hobbesmaster
Jan 28, 2008

DLSS and similar may complicate things a lot: you might be at 4K output, but if your internal resolution is 1080p, is more CPU actually going to help more than you think?

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
Interesting that it tops the charts for 1440p alone. Possibly the margins are so small that any apparent separation between the top end CPUs is just noise. Not a bad response to Alder Lake, at least until Zen 4 arrives.

Dr. Video Games 0031
Jul 17, 2004

I don't know what the actual margin of error is for these kinds of tests, but it's gotta be at least 1%, which puts the 5800X3D in a virtual tie with the 12900K and KS at everything above 720p.

Kazinsal
Dec 13, 2011
My 8700K feels like it's starting to get a little long in the tooth, and between the 5800X3D being that juicy and it now being possible to just walk into a store and buy a 3080 Ti in Canada for barely above MSRP, I'm having a hard time not blowing out my savings right about now.

Klyith
Aug 3, 2007

GBS Pledge Week

hobbesmaster posted:

DLSS and similar may complicate things a lot: you might be at 4K output, but if your internal resolution is 1080p, is more CPU actually going to help more than you think?

Yes, using DLSS or FSR or whatever other upscaling will remove GPU constraints on performance and raise framerate to the CPU's limit.

But the thing is, look at the actual framerate numbers of the games in that review. The worst CPUs in the roundup are getting 80 FPS on the worst possible game example. How often will you care about a +50% performance boost if it represents a jump from 120 to 180 FPS?
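To make the framerate point concrete: the FPS you actually see is roughly whichever is lower, the CPU's limit or the GPU's limit, and upscaling mostly just raises the GPU side. A quick Python sketch of that idea, with made-up example numbers rather than figures from the review:

# Illustrative model: effective FPS is capped by whichever of CPU or GPU is slower.
# All numbers below are placeholder assumptions, not measurements from the TPU review.
def effective_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    # The slower of the two limits sets the framerate you see.
    return min(cpu_limit_fps, gpu_limit_fps)

cpu_limit = 180.0           # what the CPU could drive with an infinitely fast GPU
gpu_limit_native_4k = 90.0  # GPU-bound limit when rendering native 4K
gpu_limit_upscaled = 200.0  # GPU-bound limit when rendering 1080p internally (DLSS/FSR)
print(effective_fps(cpu_limit, gpu_limit_native_4k))  # 90  -> GPU bound, CPU barely matters
print(effective_fps(cpu_limit, gpu_limit_upscaled))   # 180 -> upscaling exposes the CPU limit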

(Techpowerup is not showing 1% lows though, which is another good thing to consider. Pretty lame to still be doing average only in 2022. Wait for better reviews.)


Kazinsal posted:

My 8700K feels like it's starting to get a little long in the tooth, and between the 5800X3D being that juicy and it now being possible to just walk into a store and buy a 3080 Ti in Canada for barely above MSRP, I'm having a hard time not blowing out my savings right about now.

I have a hard time thinking a 5800X3D is worth it for a new system build now, as the final CPU for AM4. Like, this is the CPU for gamers who already have an AM4 setup to do an upgrade that will keep them set until we're well into the DDR5 era.

New Zealand can eat me
Aug 29, 2008

:matters:


^ and honestly even that is a hard sell with these 5900/5950X price drops. While I would say it's "just" another ~$100 or so to get the 5950X, that's assuming you're already accounting for a 280mm AIO

Being on a 3950X, I am very tempted. But I keep telling myself to stay strong, that money will be better invested going all out on AM5.

Seamonster
Apr 30, 2007

IMMER SIEGREICH
DDR5 should give some sticker shock. Not quite on the level of 4 figure GPUs, but think what you want to pay for DDR4 and multiply by 3.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Seamonster posted:

but think what you want to pay for DDR4 and multiply by 3.

ummmm wow yeah that's ... a lot, I think I'll keep waiting

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
Are we going to see this 3D stacking in the next Zen, or does this go back into the Disney vault, so to speak?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

incoherent posted:

Are we going to see this 3D stacking in the next Zen, or does this go back into the Disney vault, so to speak?

they almost certainly will use it for their server parts, as they already bragged about how that was going to be their next big thing

whether it'll make it into desktop Ryzen is going to depend on whether there are more teething problems with it - IIRC we were expecting a different base 5000-series CPU with 3D-stacking instead of the 5800X, and that plus the lack of overclocking might point to how it's not a straightforward technology to integrate into their entire product stack, but I'm not sure

Arzachel
May 12, 2012

Klyith posted:

But the thing is, look at the actual framerate numbers of the games in that review. The worst CPUs in the roundup are getting 80 FPS on the worst possible game example. How often will you care about a +50% performance boost if it represents a jump from 120 to 180 FPS?

(Techpowerup is not showing 1% lows though, which is another good thing to consider. Pretty lame to still be doing average only in 2022. Wait for better reviews.)

The other thing is that multiplayer games/modes are usually significantly more CPU bound but can't produce consistent benchmark results, so they usually don't show up in general benchmarks unless someone goes out of their way.

Pablo Bluth
Sep 7, 2007

I've made a huge mistake.
It's definitely not going to be used across all of the stack, but I'll be amazed if there aren't at least some CPUs in the lineup with it. The only question is whether it's there from day 1 or a later refresh.

BurritoJustice
Oct 9, 2012

gradenko_2000 posted:

they almost certainly will use it for their server parts, as they already bragged about how that was going to be their next big thing

whether it'll make it into desktop Ryzen is going to depend on whether there are more teething problems with it - IIRC we were expecting a different base 5000-series CPU with 3D-stacking instead of the 5800X, and that plus the lack of overclocking might point to how it's not a straightforward technology to integrate into their entire product stack, but I'm not sure

Lisa Su originally showed a 5900X3D, but it didn't end up making sense as the 3D cache SKUs lose in general application performance and the 5900x/5950x are more productivity SKUs than gaming SKUs.

You can see in TPU's testing that the 5800X3D consistently loses by ~5% to the 5800x in almost every test before they get to games due to the clock deficit.
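For what it's worth, that ~5% is roughly what the boost-clock gap alone would predict. A back-of-the-envelope Python check, assuming the commonly listed maximum boost clocks (4.7 GHz for the 5800X, 4.5 GHz for the 5800X3D):

# Rough check: how much of the ~5% application deficit do clocks alone explain?
boost_5800x = 4.7    # GHz, advertised max boost of the 5800X
boost_5800x3d = 4.5  # GHz, advertised max boost of the 5800X3D
clock_deficit = 1 - boost_5800x3d / boost_5800x
print(f"{clock_deficit:.1%}")  # ~4.3%, close to the ~5% gap TPU measured outside of games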

Dr. Video Games 0031
Jul 17, 2004

My expectation is that Zen 4 desktop parts won't have 3D v-cache, though supposedly AMD will be following that up with Zen 5 in a much shorter interval this time around, and who knows what that has in store for us. Earlier rumors suggested a die-shrunk Zen 4 mixed with Zen 5 in the same package, and maybe they'll toss in some v-cache too, who knows.

Klyith
Aug 3, 2007

GBS Pledge Week

Arzachel posted:

The other thing is that multiplayer games/modes are usually significantly more CPU bound but can't produce consistent benchmark results, so they usually don't show up in general benchmarks unless someone goes out of their way.

Did you mean less CPU bound? Because most dedicated FPS games produce 100s of FPS when you remove the GPU from the picture.

hobbesmaster
Jan 28, 2008

Klyith posted:

Did you mean less CPU bound? Because most dedicated FPS games produce 100s of FPS when you remove the GPU from the picture.

Except when they don’t. The problem is that 1% and 0.1% lows in multiplayer are when a lot of effects are happening and generally those are actually the most critical seconds between losing and winning.

Klyith
Aug 3, 2007

GBS Pledge Week

hobbesmaster posted:

Except when they don’t. The problem is that 1% and 0.1% lows in multiplayer are when a lot of effects are happening and generally those are actually the most critical seconds between losing and winning.

Many of these games have replay functions. The idea that this is not testable is bogus -- get some replay with everyone from both teams setting off all their ults at once or whatever other crazy poo poo might make the CPU cry. If you don't see the same frame drops as live, the answer is that it's network / server performance not CPU.


I think the answer for why these games are not widely tested in reviews is that the results don't show anything particularly interesting and are not much different from CSGO. And for the vast majority of the audience, a $250 CPU versus a $500 CPU will not make an iota of difference to their W/L ratio.

Meanwhile, 0.1% lows are their own can of worms. If you see a review that includes 0.1% lows and doesn't show error bars, that's a problem. You have to test 10x longer to get the same accuracy as 1% lows. If you are testing CSGO or Overwatch and the game is running at 240 FPS average, you need to play it for 7 minutes to collect just 100 points of 0.1% low data.
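That 7-minute figure is just arithmetic: at 240 FPS only one frame in a thousand lands in the 0.1% tail, so you need a huge number of total frames before that tail is built from more than a handful of samples. A quick Python sketch of the same calculation:

# How long must a benchmark run to collect N frames in a given percentile tail?
def minutes_for_tail_samples(avg_fps: float, tail_fraction: float, wanted_samples: int) -> float:
    total_frames_needed = wanted_samples / tail_fraction  # e.g. 100 / 0.001 = 100,000 frames
    return total_frames_needed / avg_fps / 60             # frames -> seconds -> minutes

print(minutes_for_tail_samples(240, 0.001, 100))  # ~6.9 minutes for 0.1% lows
print(minutes_for_tail_samples(240, 0.01, 100))   # ~0.7 minutes for 1% lows, i.e. 10x shorter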

Arzachel
May 12, 2012

Klyith posted:

Many of these games have replay functions. The idea that this is not testable is bogus -- get some replay with everyone from both teams setting off all their ults at once or whatever other crazy poo poo might make the CPU cry. If you don't see the same frame drops as live, the answer is that it's network / server performance not CPU.


I think the answer for why these games are not widely tested in reviews is that the results don't show anything particularly interesting and are not much different from CSGO. And for the vast majority of the audience, a $250 CPU versus a $500 CPU will not make an iota of difference to their W/L ratio.

Meanwhile, 0.1% lows are their own can of worms. If you see a review that includes 0.1% lows and doesn't show error bars, that's a problem. You have to test 10x longer to get the same accuracy as 1% lows. If you are testing CSGO or Overwatch and the game is running at 240 FPS average, you need to play it for 7 minutes to collect just 100 points of 0.1% low data.

If you ignore the communication overhead, then the CPU doesn't matter, yes. Battlefield games (:dice:), Warzone, Apex (also fairly GPU bound tbf), and FFXIV when there's a ton of characters on the screen will all hit the CPU harder than any singleplayer game I can think of, except maybe Ashes. CSGO and Siege will also put up less rosy numbers in actual games instead of benchmarks. Realistically this doesn't matter for anyone with a 60Hz display, but then why are you contemplating buying a $400 CPU for videogames in the first place?

Agreed that 0.1% lows can be iffy but 1% really should be the standard.

MikeC
Jul 19, 2004
BITCH ASS NARC

Klyith posted:

I have a hard time thinking a 5800X3D is worth it for a new system build now, as the final CPU for AM4. Like, this is the CPU for gamers who already have an AM4 setup to do an upgrade that will keep them set until we're well into the DDR5 era.

Yeah there is no way a new builder should go for the X3D. I am only considering it because I am on a 3600X and can effectively lock in the best gaming CPU for the platform while I wait for the next wave of GPUs next year. Maybe I will smarten up and just get the 5700x and save 200 canuck bucks.

CaptainSarcastic
Jul 6, 2013



MikeC posted:

Yeah there is no way a new builder should go for the X3D. I am only considering it because I am on a 3600X and can effectively lock in the best gaming CPU for the platform while I wait for the next wave of GPUs next year. Maybe I will smarten up and just get the 5700x and save 200 canuck bucks.

:same:

Still waiting to see pricing and availability settle out before completely making up my mind.

hobbesmaster
Jan 28, 2008

That 5800x for $269 with that micro center coupon still looks pretty enticing… (though the 5600x with the same coupon is only $159)

New Zealand can eat me
Aug 29, 2008

:matters:


Seamonster posted:

DDR5 should give some sticker shock. Not quite on the level of 4 figure GPUs but think what you want to pay for DDR4 and multply by 3.

Honestly I'm fine with that*, I ended up buying three different sets of memory when Zen dropped trying to figure out what worked best anyways.

*As long as everything works well the first time around

kliras
Mar 27, 2021
5800X3D embargo is up!

https://twitter.com/VideoCardz/status/1514588772809285636

https://www.youtube.com/watch?v=hBFNoKUHjcg

https://www.youtube.com/watch?v=ajDUIJalxis

Dr. Video Games 0031
Jul 17, 2004

well,

redeyes
Sep 14, 2002

by Fluffdaddy
Looks like Intel wins most stuff except a few games. With 3x more power usage.

kliras
Mar 27, 2021
110w vs 275w, jesus christ lol

Dr. Video Games 0031
Jul 17, 2004

redeyes posted:

Looks like Intel wins most stuff except a few games. With 3x more power usage.

Intel's biggest wins were with a $450 kit of DDR5 while their DDR4 performance is generally worse than AMD's (with CP2077 being the only game hardware unboxed tested that favored Intel with DDR4). More games need to be tested to draw an accurate comparison, though. Everyone's reviews must've been rushed because they're all using like eight-game sample sets. Thankfully, HUB said they're going to be doing more benchmarking, and their benchmarking setup seems pretty robust, testing with two RAM configurations for the 12900K and 5800X3D.

Broose
Oct 28, 2007
Now I'm curious if the 5800X3D's extra cache helps in VR games and if stuff with ray tracing is affected at all.

redeyes
Sep 14, 2002

by Fluffdaddy

Dr. Video Games 0031 posted:

Intel's biggest wins were with a $450 kit of DDR5 while their DDR4 performance is generally worse than AMD's (with CP2077 being the only game hardware unboxed tested that favored Intel with DDR4). More games need to be tested to draw an accurate comparison, though. Everyone's reviews must've been rushed because they're all using like eight-game sample sets. Thankfully, HUB said they're going to be doing more benchmarking, and their benchmarking setup seems pretty robust, testing with two RAM configurations for the 12900K and 5800X3D.

Ah, I missed the DDR5 thing. Looks like I won't be upgrading my 3900X after all. I don't wanna deal with the extra power usage or a new mobo. I don't play games either, so likely 12 real cores is still my jam.

Helter Skelter
Feb 10, 2004

BEARD OF HAVOC

Dr. Video Games 0031 posted:

Thankfully, HUB said they're going to be doing more benchmarking, and their benchmarking setup seems pretty robust, testing with two RAM configurations for the 12900K and 5800X3D.

Their 30+ game spread is in the works, but due to time constraints their Intel numbers from that will either be DDR5-only or DDR4-only (probably DDR4 based on a channel poll they ran a couple days ago).

Klyith
Aug 3, 2007

GBS Pledge Week

Broose posted:

Now I'm curious if the 5800X3D's extra cache helps in VR games and if stuff with ray tracing is affected at all.

VR: probably yes
Ray tracing: no*

*ray tracing does add some CPU load, as some setup for RT is done on CPU. but it's a massive performance limiter on the GPU so that doesn't matter. when you cut your FPS by 50% waiting on the rays, the CPU has 2x as much time to get its jobs done. so unless you are already stupidly CPU-limited, RT doesn't care about CPU.
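One way to see that footnote in numbers is to think in frame times instead of FPS: if RT halves the GPU-bound framerate, the per-frame budget doubles, so the same CPU work takes a smaller share of each frame. A minimal Python sketch where the millisecond figures and the 120/60 FPS split are illustrative assumptions, not measurements:

# Frame-time view of "RT gives the CPU twice as much time per frame".
# All numbers below are illustrative assumptions.
cpu_frame_ms = 7.0                          # CPU work per frame, including RT's extra setup
gpu_frame_ms = {"raster": 1000 / 120,       # GPU manages ~120 FPS without RT
                "ray tracing": 1000 / 60}   # RT halves the GPU-bound framerate
for mode, gpu_ms in gpu_frame_ms.items():
    frame_ms = max(cpu_frame_ms, gpu_ms)    # the slower side sets the frame time
    print(f"{mode}: {1000 / frame_ms:.0f} FPS, CPU busy {cpu_frame_ms / frame_ms:.0%} of the frame")
# raster: 120 FPS, CPU busy 84% of the frame -> close to CPU-limited
# ray tracing: 60 FPS, CPU busy 42% of the frame -> plenty of CPU headroom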

Pablo Bluth
Sep 7, 2007

I've made a huge mistake.
The best source of productivity benchmarks is phoronix's openbenchmarking.org, so hopefully it won't be too long before it shows up on there. The 3D cache version of Epyc looks impressive for HPC uses, so it'll be interesting to see how many productivity use cases benefit.

kliras
Mar 27, 2021
it's pretty amazing that eight cores is just considered "gaming cpu" nowadays and higher is "workstation", really awesome how far we've come

Hughmoris
Apr 21, 2007
Let's go to the abyss!

redeyes posted:

Looks like Intel wins most stuff except a few games. With 3x more power usage.

kliras posted:

110w vs 275w, jesus christ lol

What's the day to day reality of such a high power draw? Will my room become noticeably warmer, or my electricity bill noticeably higher?

Or would most people not be able to tell a difference in power draws if it was a blind test?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
you're probably going to notice the room getting warmer, even immediately if the case is near you - remember that cooling is just removing the heat from the parts, and out of the case, and into the environment, but the heat is "still there"

whether it's going to spike your electricity bill depends on how much you use it - you're not going to be putting a full load on the computer all the time
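For the bill side, the arithmetic is easy to run yourself. A quick Python sketch using the 110 W vs 275 W figures quoted above as a worst case (those are peak-load numbers, not typical gaming draw), with the daily hours and electricity rate as assumptions you'd swap for your own:

# Rough monthly cost difference between a ~110 W and a ~275 W CPU under heavy load.
# Hours per day and price per kWh are assumptions -- plug in your own numbers.
hours_per_day = 3          # assumed daily time at high CPU load
price_per_kwh = 0.15       # assumed electricity price in $/kWh
extra_watts = 275 - 110    # difference under load, from the figures above
extra_kwh_per_month = extra_watts / 1000 * hours_per_day * 30
print(f"{extra_kwh_per_month:.1f} kWh, about ${extra_kwh_per_month * price_per_kwh:.2f}/month")
# roughly 15 kWh and a couple of dollars a month -- more of a room-heat problem than a bill problem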

kliras
Mar 27, 2021
benchmark results like that are mostly for peak load. ryzen's been pretty bad at managing p-states on my end, so i had to run it very high all the time, which is more expensive than intel automatically managing the clock and load of the cpu

computers turning off because the psu can't provide enough power aside, cpu coolers can only dissipate so many watts, so there will be some fun cut-off points where certain coolers might not cut the mustard. i'm not a huge heatsink benchmark person, so i don't know what the listed vs practical limits are, but the good old hyper 212 evo has a listed 150w cap for instance - even though it's a bit more complicated. time to get familiar with fan curves

then again, so much hardware is designed to wring out an extra 5% of performance to look good in benchmarks at the expense of temperature, noise, and sometimes even performance itself due to the throttling this incurs. capping performance through undervolting or some other hard cap goes a long way and should probably be the default for people who aren't engaging in fps pissing matches. 5800X3D having a harder cap on overclocking actually makes it more compelling to me rather than less but ymmv

it's going to be interesting to see what the 40-series will mean for power use and cooling needed; everyone's kinda talking about worst-case scenarios but let's wait and see

e: this chart from hub is instructive:

[Hardware Unboxed chart not reproduced]

keep in mind this is with a 3090ti

kliras fucked around with this message at 17:12 on Apr 14, 2022

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

AMD's results are attractive to me, but even the 12900K - which is incredibly power hungry if left to its own devices and becomes kind of ludicrously more so if overclocked - is actually a great performer at much, much lower wattages: you can get most of the 240W+ performance out of a cap of 180-190W, and a hell of a lot of it out of a cap of 125W, which you can set manual limits to achieve. I reckon they have the thing guzzling electricity to chase the last little bit of % that makes it perform better in some benchmarks versus the top end from AMD, so they can advertise those results.

Cygni
Nov 12, 2005

raring to post

Hughmoris posted:

What's the day to day reality of such a high power draw? Will my room become noticeably warmer, or my electricity bill noticeably higher?

Or would most people not be able to tell a difference in power draws if it was a blind test?

You won’t be able to tell unless you are running a heavy workload for long time frames, or on the edge of power delivery or cooling (like in an SFF case). It doesn’t sit at that power draw playing web browser.

In a gaming rig, you really should be focusing on spending on the GPU over these minor CPU differences… and then your GPU is more likely to be the hottest and most annoying thing to cool anyway. AIB 3070s pull about 275w all by themselves, and it’s only up from there.

E: beaten multiple times

Cygni
Nov 12, 2005

raring to post

There are some bizarre outliers and conflicting results out there in some of the reviews (what the hell is going on in the Far Cry games), but this one is pretty interesting:

[benchmark chart not reproduced]

That’s an insane uplift over the 5800X. SimFolks take note.
