Suburban Dad
Jan 10, 2007


Well what's attached to a leash that it made itself?
The punchline is the way that you've been fuckin' yourself




Struggling real hard with pulling the trigger on a 3700x now versus waiting another month for a 5600x. Both will be light years ahead of the 2500k that I've been rocking forever, but the waiting sucks. I really would like to see some reviews on the new ones before deciding, but instant gratification is a hell of a drug.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

not specifically sniping at you or anyone here, but like, everyone has spent the last 3 years talking up "NObOdy uSeS A x80 TI To pLaY 1080p dO mOrE rEaL wOrLd bEnChMaRks" and the second they were on top that predictably vanished instantly, just as I said it would.

it's a pretty transparent thing to minimize the downsides when you're behind, not a coincidence that Intel has moved to "real world benchmarks!" in laptops for the last year.

if you are benchmarking the CPU itself, not the system, then 1080p with the fastest GPU you can find is pretty objectively the right way to do it. It could even be 720p. If you are still seeing the framerate scale up then you are not fully CPU bottlenecked. That's the goal, to push the bottleneck fully onto the CPU.
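
To make the "push the bottleneck fully onto the CPU" test concrete, here's a toy sketch (Python, all numbers made up): keep dropping the resolution, and the plateau the framerate hits once the GPU stops mattering is your CPU-limited figure.

```python
# Toy version of the resolution-scaling test (hypothetical numbers):
# drop the resolution until the framerate stops climbing; the plateau
# is the CPU-limited framerate.
results = {          # resolution -> measured average FPS, highest res first
    "4K":    71,
    "1440p": 132,
    "1080p": 183,
    "720p":  187,
}

def cpu_bound_fps(results, tolerance=0.05):
    """Return the plateau FPS once dropping resolution stops helping,
    or None if the framerate is still scaling at the lowest res tested."""
    fps = list(results.values())
    for higher_res_fps, lower_res_fps in zip(fps, fps[1:]):
        if (lower_res_fps - higher_res_fps) / higher_res_fps <= tolerance:
            return lower_res_fps   # scaling stopped: fully CPU bottlenecked
    return None                    # still scaling: not CPU bound yet

print(cpu_bound_fps(results))      # -> 187
```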

Paul MaudDib fucked around with this message at 19:43 on Oct 8, 2020

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

Suburban Dad posted:

Struggling real hard with pulling the trigger on a 3700x now versus waiting another month for a 5600x. Both will be light years ahead of the 2500k that I've been rocking forever, but the waiting sucks. I really would like to see some reviews on the new ones before deciding, but instant gratification is a hell of a drug.

At least you can choose. I blew my load exactly 2 months early due to how impatient I was.

PC LOAD LETTER
May 23, 2005
WTF?!

Paul MaudDib posted:

not specifically sniping at you or anyone here, but like, everyone has spent the last 3 years talking up "NObOdy uSeS A x80 TI To pLaY 1080p dO mOrE rEaL wOrLd bEnChMaRks" and the second they were on top that predictably vanished instantly, just as I said it would.

That's kinda BS there, Paul. Your whole schtick was that 1080p benches were not only perfectly valid CPU benches but that they were also a good indicator of how a given CPU would perform in future games.

AMD, Intel, or NV doing some marketing crap, which, surprise surprise, is eye-roll worthy, isn't the vindication you're portraying it as, and 1080p game benches are still poo poo tier at best as indicators of CPU performance.

Riven
Apr 22, 2002
Well, if any of y'all with a 3800 or 3900 upgrade and are looking to sell, I'm interested in upgrading from my 3600 for less than retail.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

PC LOAD LETTER posted:

AMD, Intel, or NV doing some marketing crap, which, surprise surprise, is eye-roll worthy, isn't the vindication you're portraying it as, and 1080p game benches are still poo poo tier at best as indicators of CPU performance.

Not to start this whole argument over again, but yes and no. Is a 1080p bench reasonable for a $300 CPU? Yeah, probably, because there are likely still people using a chip of that tier for 1080p gaming.

A $500+ CPU? I doubt there are more than a handful of people playing at 1080p with that sort of hardware, so it's considerably less meaningful there.

But yeah, every company is going to pick benches that make their stuff look the best they can, regardless of how well said benches actually map to the use-cases the hardware is likely to actually see.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

denereal visease posted:

You did that in BIOS, not using Ryzen Master? I recall hearing something about BIOS and Ryzen Master fighting each other to enforce TDP limits.

BIOS. I'm running Linux, so Ryzen Master (and Windows Power Plans, and wtf-ever else) isn't a thing for me.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

PC LOAD LETTER posted:

That's kinda BS there, Paul. Your whole schtick was that 1080p benches were not only perfectly valid CPU benches but that they were also a good indicator of how a given CPU would perform in future games.

that's indeed what I just said; it's not me who's flip-flopping when it makes the numbers look better

Cygni
Nov 12, 2005

raring to post

Low resolution does, potentially, give you an idea of how a CPU will scale with future GPUs, assuming you are going to keep the platform longer than the GPU like most gamers.

None of that is guaranteed, though. And if you are making a gaming decision today for a certain resolution, obviously looking at the resolution you intend to use is far better information. Especially if you are a turbo nerd like many of us here and only keep your CPU platform for 2 years at best.

e: i guess instead of "scaling", the better way to think about it is "here's how many FPS the CPU could theoretically serve to a GPU, if the GPU could actually crank through that many frames at your chosen resolution, and if your chosen monitor could then display them". There's a few "if" steps in there.
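
That chain of "if"s reduces to a min() over three caps; a sketch with illustrative numbers:

```python
# What you actually see is the smallest of: what the CPU can feed, what
# the GPU can render at your resolution, and what the monitor can show.
# All three numbers below are hypothetical.
cpu_fps_cap = 187   # from a low-res, CPU-bound bench
gpu_fps_cap = 110   # what the GPU manages at your chosen resolution
monitor_hz  = 144   # refresh rate of the display

effective_fps = min(cpu_fps_cap, gpu_fps_cap, monitor_hz)
print(effective_fps)  # -> 110: GPU-limited, so the CPU headroom sits unused
```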

Cygni fucked around with this message at 20:00 on Oct 8, 2020

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


DrDork posted:

Not to start this whole argument over again, but yes and no. Is a 1080p bench reasonable for a $300 CPU? Yeah, probably, because there are likely still people using a chip of that tier for 1080p gaming.

A $500+ CPU? I doubt there are more than a handful of people playing at 1080p with that sort of hardware, so it's considerably less meaningful there.

But yeah, every company is going to pick benches that make their stuff look the best they can, regardless of how well said benches actually map to the use-cases the hardware is likely to actually see.

I know many people that buy 2-3 GPUs per CPU. The idea is to see how much life you can get out of the CPU across future GPU upgrades. It's a future-proofing benchmark; not the best, but it's important.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Cygni posted:

Low resolution does, potentially, give you an idea of how a CPU will scale with future GPUs, assuming you are going to keep the platform longer than the GPU like most gamers.

Eh. If you keep a GPU for a long-rear end time, you're just going to be increasingly GPU limited as games progress in complexity, which will only reduce the impact the CPU has.

If you change out your GPU regularly, you're probably the type to be chasing ULTRA MAXXX settings or high-hz gaming, in which case you're either going to be GPU limited anyhow, or monitor limited.

There aren't a whole lot of situations where you can be meaningfully (>15%) CPU limited these days without some rather head-scratching setups, unless you're sticking with 1080p at >60Hz.

e: It's worth taking a look at something like the 10700k vs 7700k vs 2700X at 1440p. In most cases the 7700k is <10% behind the 10700k despite being 3 years old. The 2700X is usually pretty similar, too, despite being "bad" for gaming. At 1080p there are considerably larger differences, of course, but you don't have to raise the resolution much before the CPU stops mattering too much, and that'll probably continue to be the case.

Wildcard here being buying a 6c chip right before the 8c consoles come out. Those might not age as gracefully.

DrDork fucked around with this message at 20:06 on Oct 8, 2020

Gwaihir
Dec 8, 2009
Hair Elf

Riven posted:

Well, if any of y'all with a 3800 or 3900 upgrade and are looking to sell, I'm interested in upgrading from my 3600 for less than retail.

If stock is decent I'm gonna sell my 3900x to try and grab a 5900x, because, well, I'm a dumb nerd.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Prices and core counts are not as nice as I was hoping; people are even saying the 10700k and Zen 2s are better value for money.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Paul MaudDib posted:

not specifically sniping at you or anyone here, but like, everyone has spent the last 3 years talking up "NObOdy uSeS A x80 TI To pLaY 1080p dO mOrE rEaL wOrLd bEnChMaRks" and the second they were on top that predictably vanished instantly, just as I said it would.

it's a pretty transparent thing to minimize the downsides when you're behind, not a coincidence that Intel has moved to "real world benchmarks!" in laptops for the last year.

if you are benchmarking the CPU itself, not the system, then 1080p with the fastest GPU you can find is pretty objectively the right way to do it. It could even be 720p. If you are still seeing the framerate scale up then you are not fully CPU bottlenecked. That's the goal, to push the bottleneck fully onto the CPU.

The comment you responded to that started this conversation was calling 1080p a lovely benchmark. If you're going to declare victory, at least respond to things people are actually saying.

PC LOAD LETTER
May 23, 2005
WTF?!

DrDork posted:

Not to start this whole argument over again, but yes and no.

Nah, it's pretty much a no.

If you've got to construct specific scenarios to make it a 'yes', the whole argument pretty much falls apart and you're back to "look at benches for resolutions you use" anyways.

Paul MaudDib posted:

that's indeed what I just said; it's not me who's flip-flopping when it makes the numbers look better

It's AMD marketing, which is very different from the conversation in the thread at the time.

spunkshui
Oct 5, 2011



pixaal posted:

I know many people that buy 2-3 GPUs per CPU. The idea is to see how much life you can get out of the CPU across future GPU upgrades. It's a future-proofing benchmark; not the best, but it's important.

My i7 920 @ 4GHz lasted all the way until the 9600k came out.

As much as I love overclocking I would definitely be getting a 5600x.

It'll be fun to see if these things are really redlined or if there's anything left.

Chevy Slyme
May 2, 2004

We're Gonna Run.

We're Gonna Crawl.

Kick Down Every Wall.
My big question mark is how a 5600 compares to a 3800 or 3900, and whether it's worth jumping on any screaming deals on the latter in the next month versus holding out.

ijyt
Apr 10, 2012

So RAM-wise for Ryzen CPUs, do I want 3200 C14 or 3600 C16 sticks? I'll likely be going for 2x16GB sticks.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

ijyt posted:

So RAM-wise for Ryzen CPUs, do I want 3200 C14 or 3600 C16 sticks? I'll likely be going for 2x16GB sticks.

3600 CL16 is the best but the difference is very minor.
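
A quick back-of-the-envelope sketch (Python, standard DDR math) of why the difference is so minor: first-word latency is CAS cycles divided by the real memory clock, which is half the DDR data rate.

```python
# First-word latency in nanoseconds: CAS latency (cycles) divided by the
# actual memory clock (MHz), which is half the DDR transfer rate.
def first_word_latency_ns(data_rate_mt_s: float, cas_latency: int) -> float:
    mem_clock_mhz = data_rate_mt_s / 2     # DDR: two transfers per clock
    return cas_latency / mem_clock_mhz * 1000

print(first_word_latency_ns(3200, 14))    # ~8.75 ns
print(first_word_latency_ns(3600, 16))    # ~8.89 ns
```

Near-identical latency, and the 3600 kit also lets the Infinity Fabric run 1:1 at 1800 MHz, which is the usual Zen sweet spot.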

CaptainPsyko posted:

My big question mark is how a 5600 compares to a 3800 or 3900, and whether it's worth jumping on any screaming deals on the latter in the next month versus holding out.

In current games the 5600X will probably kill them easily.

In software used for actual work that's heavily multithreaded, the higher core count Zen 2 parts will probably still have an advantage.

In future games, MAYBE having "only" 6 cores/12 threads will become a disadvantage, but we just now reached the point where 4 core/4 thread i5s are struggling in some newer stuff so who knows. The new consoles have what amounts to a 3700X without the boost clocks.

e: This depends on what the actual benchmarks say, as opposed to AMD marketing stuff.

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

sean10mm posted:

3600 CL16 is the best but the difference is very minor.


In current games the 5600X will probably kill them easily.

In software used for actual work that's heavily multithreaded, the higher core count Zen 2 parts will probably still have an advantage.

In future games, MAYBE having "only" 6 cores/12 threads will become a disadvantage, but we just now reached the point where 4 core/4 thread i5s are struggling in some newer stuff so who knows. The new consoles have what amounts to a 3700X without the boost clocks.

e: This depends on what the actual benchmarks say, as opposed to AMD marketing stuff.

What would you speculate is the best cooling solution for these CPUs? Would the U14S still be competitive here for the 5900, for example?

ufarn
May 30, 2009
How well is Windows going to know to split the load from a game and livestreaming software onto separate Zen 3 CCXs, instead of mixing things up and making a mess of latency these days?

Otherwise I might just go for a 3800X with one CCX.
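
If you'd rather not trust the scheduler, you can force the split yourself. A minimal sketch using psutil (the process names and core ranges are hypothetical, and logical-core numbering varies with OS and SMT layout, so check yours first):

```python
# Manually pin the game and the streaming software to different core
# groups (roughly, different CCXs/CCDs). Names and ranges are examples;
# may need admin/root rights (psutil.AccessDenied is possible).
import psutil

PINS = {
    "game.exe":  list(range(0, 6)),    # first CCX/CCD
    "obs64.exe": list(range(6, 12)),   # second CCX/CCD
}

for proc in psutil.process_iter(["name"]):
    cores = PINS.get(proc.info["name"])
    if cores:
        proc.cpu_affinity(cores)       # restrict the process to those cores
        print(f"pinned {proc.info['name']} to cores {cores}")
```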

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

AnandTech is saying only the 5600 will have a stock cooler. Not that anyone uses them, but it's another kick in the value department.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

Kraftwerk posted:

What would you speculate is the best cooling solution for these CPUs? Would the U14S still be competitive here for the 5900, for example?

They're only 105W TDP, same as the hotter Zen 2 CPUs. A U14S cools a 105W TDP 3800X the same as a bigger D15, so it has cooling capacity to spare.

Crap, they make a U14S with a modified base to cool Threadrippers; I don't see how a 105W TDP anything is going to bother it particularly.

hobbesmaster
Jan 28, 2008

sean10mm posted:

3600 CL16 is the best but the difference is very minor.

A 3600 CL16 kit can probably run at 3200 CL14, no?

I guess the fabric clock doesn't actually matter if there's only one CCX, right? Is Zen 3 still limited to an 1800 MHz FCLK?

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

sean10mm posted:

They're only 105W TDP, same as the hotter Zen 2 CPUs. A U14S cools a 105W TDP 3800X the same as a bigger D15, so it has cooling capacity to spare.

Crap, they make a U14S with a modified base to cool Threadrippers; I don't see how a 105W TDP anything is going to bother it particularly.

Hopefully not a fake TDP like the Intels, where it boosts to like 300W.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

hobbesmaster posted:

A 3600 CL16 kit can probably run at 3200 CL14, no?

Probably... but why?

The memory controller is apparently identical between Zen 2 and Zen 3, FYI.

As for cooling, this is overclocked 3800X data with a few coolers:

[attached chart: temperatures for an overclocked 3800X under several coolers]

Gwaihir
Dec 8, 2009
Hair Elf

Kraftwerk posted:

Hopefully not a fake TDP like the Intels, where it boosts to like 300W.

They will use more than 105W with boost and whatnot turned on. The current 3900x tops out in actual use at just under 150W fully loaded. Way less than the 320W a 10900k at 5.2GHz all-core can suck down.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

Kraftwerk posted:

Hopefully not a fake TDP like the Intels, where it boosts to like 300W.

I mean, who knows, but AMD hasn't really lied on this front before, and the clock speed with Zen 3 is either the same as stuff they're already doing on Zen 2 or only marginally higher.

The 3800XT already has a 4.7GHz boost clock and AFAIK doesn't have any wacky power draw miles above its spec.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

ufarn posted:

How well is Windows going to know to split the load from a game and livestreaming software onto separate Zen 3 CCXs, instead of mixing things up and making a mess of latency these days?

Otherwise I might just go for a 3800X with one CCX.

CCX problems haven't been a big deal on Zen 2, and that probably won't change on Zen 3; if anything they may lessen due to the bigger CCX. I'm not sure why you are so worried.

or if it is, the 3900X gets enough extra cache to help smooth it over

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


ijyt posted:

So RAM-wise for Ryzen CPUs, do I want 3200 C14 or 3600 C16 sticks? I'll likely be going for 2x16GB sticks.

In most cases you should be able to get either kit to run at the other's settings, so you can very likely just test it yourself. I'd pick up the one rated for 3600, though, since that's going to be your default boot, unless you are running something that loves tight timings (then you can eke a bit more performance out).

ijyt
Apr 10, 2012

Some Goon posted:

AnandTech is saying only the 5600 will have a stock cooler. Not that anyone uses them, but it's another kick in the value department.

Good, those things were just extra garbage to either trash or try and get recycled somehow.

e: ty for the ram advice all!

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

ijyt posted:

The -3% on BFV. :unsmith:

lol what the hell is happening there, anyway?

Gwaihir
Dec 8, 2009
Hair Elf

Farmer Crack-Ass posted:

lol what the hell is happening there, anyway?

:dice: is happening.

hobbesmaster
Jan 28, 2008

ijyt posted:

e: ty for the ram advice all!

Honestly I would just try to figure out what the actual memory modules are by googling the UPC, then punch that into the Ryzen DRAM Calculator and see what it thinks the safe settings for that die/mobo/CPU combo are. Like, if the 3200 CL14 RAM is Samsung B-die and the 3600 is Micron E-die, then the B-die is probably a better performer after you painstakingly OC the module. If you are not going to painstakingly OC the module, just get the higher clock speed and use the XMP profile.
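
If you'd rather skip the spreadsheet, the usual community rule of thumb looks roughly like this toy lookup (the bins are enthusiast folklore, not guarantees; the part-number search above is the reliable route):

```python
# Very rough "what die is this kit?" heuristic based on the rated bin.
# These are community rules of thumb, not guarantees.
def guess_die(data_rate: int, cas_latency: int) -> str:
    if data_rate >= 3200 and cas_latency <= 14:
        return "almost certainly Samsung B-die"
    if data_rate >= 3600 and cas_latency <= 16:
        return "could be B-die, Micron E-die, or Hynix CJR/DJR"
    return "no idea -- go look up the part number"

print(guess_die(3200, 14))
print(guess_die(3600, 16))
```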

Llamadeus
Dec 20, 2005

Some Goon posted:

AnandTech is saying only the 5600 will have a stock cooler. Not that anyone uses them, but it's another kick in the value department.
We'll probably see a 5700 with a Wraith Spire cooler at some point. I think AMD are holding all the value parts back to maximize profit in the window before Rocket Lake launches, at which point they can respond with price cuts and fill in the lower end.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

hobbesmaster posted:

get the higher clock speed and use the XMP profile.

:hmmyes:

Don't Ask
Nov 28, 2002

So if I bought an i7 6700K four and a half years ago, which Zen 3 will match that performance/cost tier? I have never owned an AMD CPU, so I have no idea how to parse the numbering scheme.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Don't Ask posted:

So if I bought an i7 6700K four and a half years ago, which Zen 3 will match that performance/cost tier? I have never owned an AMD CPU, so I have no idea how to parse the numbering scheme.

Assuming you paid the original MSRP of $350 for the 6700k, you'd be looking at either the $300 5600X or, honestly, the $350-$370 10700k. The 5800X is $450, so considerably more expensive.

It'll be up to the benchmarks to show whether the 5800X can justify the price premium over the 10700k, but the 10700k should be a good bit faster than the 5600X in basically all cases, though it uses more power.

Khorne
May 1, 2002

Llamadeus posted:

We'll probably see a 5700 with a Wraith Spire cooler at some point. I think AMD are holding all the value parts back to maximize profit in the window before Rocket Lake launches, at which point they can respond with price cuts and fill in the lower end.
It's a similar level of price gouging to what the 3800x was, and it's likely the 5700x will be the same slaughter of the 5800x that we saw with Zen 2.

As far as other prices go, the launch 5950x price is rough. I wanted to go from a 3900x to a 5950x, but I wanted to do it for closer to $650 than $800. The 5900x seems kind of reasonable, and the 6-core seems kind of rough and hard to justify over just getting a 3600. I'm hoping this is an "MSRP" situation where Microcenter has it for $50 less, with another $25-$50 off if you buy a motherboard.

They're really pushing for money from people who have to have the best, from people who want the last CPU on the socket, and from people who are riding the hype train.

Khorne fucked around with this message at 21:12 on Oct 8, 2020

Worf
Sep 12, 2017

If only Seth would love me like I love him!

wow, it's impressive that DICE managed to make BFV actually passively get worse as time goes on without them even trying
