ijyt
Apr 10, 2012

Plus Zen2 is cheaper.


spunkshui
Oct 5, 2011



ijyt posted:

Plus Zen2 is cheaper.

Smarter than nvidia, who overnight made people not want to buy any 20 series card.

There’s clearly a captive market for the latest hardware in the rona economy.

The Gadfly
Sep 23, 2012

spunkshui posted:

Smarter than nvidia, who overnight made people not want to buy any 20 series card.

Why would nvidia care if people don't want to buy a card that is discontinued and is only being sold used or through third party vendors by now?

The Gadfly fucked around with this message at 17:48 on Oct 11, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

spunkshui posted:

Smarter than nvidia, who overnight made people not want to buy any 20 series card.

Yeah, but that's how GPU generations go. The new gens are either so far ahead that there's no reason to even consider the old cards outside of firesale used prices, or they're not far enough ahead (like Turing vs Pascal) that people lambast them and skip the generation entirely.

AMD would be doing the same thing, normally. This is just a unique case where they're moving up from the "we need to use cut rate pricing to convince people to give us a chance" stage to the "we are now market leaders and pricing accordingly" stage.

VorpalFish
Mar 22, 2007
reasonably awesometm

Is Zen2 even cheaper? At this point R&D is a sunk cost and they're on the same process. I guess we don't know die size, but based on core counts and cache I imagine they're pretty close.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Console APUs are produced by AMD's semicustom division; they are not off-the-shelf parts. IMO you should think of the semicustom group as a little ARM inside of AMD: they sell customers solutions based on the building blocks of AMD's CPU and GPU IP.

Statements made by both Sony and Microsoft over the past year made it clear that this generation's APUs were rather more "custom" than the previous generation's. They both made initial statements (which confused and/or angered many nerds) to the effect that their next-gen GPUs would be "RDNA with RDNA2 elements". Neither of them ever clarified exactly what parts would be based on RDNA and what parts would be RDNA2, and in fact both now simply say "RDNA2 derived" -- a wise choice, given the shrieking of fanboys who believe they're CompEs because their battlestation post got a decent amount of upvotes.

There haven't been any statements made about Zen2 vs Zen3, that I'm aware of. We might be able to infer something from AMD's statements that Zen3 was a full redesign, while Zen2 was a refinement of Zen+, which was in turn a refinement of Zen1. Maybe AMD didn't feel Zen3 was far enough along to offer to such important customers. Maybe Sony and Microsoft felt Zen2 offered perfectly good performance and opted for the very well tested option. Maybe Zen2 and Zen3 were on the table, and maybe AMD had full confidence in the new design, but Zen2 was a better deal, freeing up budget in the BOM to sink into other components.

We won't know for sure until AMD and/or Sony and/or Microsoft talk about their rationale(s).

Theris
Oct 9, 2007

Zen 2 is such a huge leap from the cat cores in the previous gen consoles that I doubt Sony and Microsoft really care that they're missing out on another few percent from Zen 3. Especially if it would have meant delaying their releases by another several months or more.

Cojawfee
May 31, 2006
I think the US is dumb for not using Celsius
If one of them had thought about waiting for Zen3, the other would have just gone with Zen2 and come out first.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It's going to be like the Xbone/PS4, where the GPUs are basically a 7850 IIRC with higher clocks and a few features from newer GCN revisions tacked on. Expect something like a higher clocked and slightly more efficient 5700 for the PS5, and a higher clocked, more efficient, and bigger 5700XT for the Xbox how-many-Xs-can-we-put-in-the-name, both with RT hardware and probably some other features tacked on. The designs needed to be nailed down way in advance; it's not a situation like the RDNA2 GPUs, where there are a variety of chips that start getting produced shortly before launch and yields dictate how things are cut down, etc. Consoles are more about coming up with a design that can be mass produced cheaply and reliably (which also means cheaply).

spunkshui
Oct 5, 2011



The Gadfly posted:

Why would nvidia care if people don't want to buy a card that is discontinued and is only being sold used or through third party vendors by now?

I guess they wouldn't, good point.

I'm certainly not complaining.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

mdxi posted:

There haven't been any statements made about Zen2 vs Zen3, that I'm aware of. We might be able to infer something from AMD's statements that Zen3 was a full redesign, while Zen2 was a refinement of Zen+, which was in turn a refinement of Zen1.

People impute way too much to that statement. Ivy Bridge to Haswell was a “new architecture” too. “New architecture” doesn’t mean you throw everything away and AMD obviously did not do that. Making significant IPC gains from incremental changes to the architecture is the norm, especially in the early stages of an architecture when there is a lot of low hanging fruit to be picked.

Zen1 to Zen+ is a better example of “pretty much the same silicon”, and that’s what Norrod wanted to put to bed w/r/t Zen3, that it was not “Zen2+”.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

K8.0 posted:

It's going to be like the Xbone/PS4, where the GPUs are basically a 7850 IIRC with higher clocks and a few features from newer GCN revisions tacked on. Expect something like a higher clocked and slightly more efficient 5700 for the PS5, and a higher clocked, more efficient, and bigger 5700XT for the Xbox how-many-Xs-can-we-put-in-the-name, both with RT hardware and probably some other features tacked on. The designs needed to be nailed down way in advance; it's not a situation like the RDNA2 GPUs, where there are a variety of chips that start getting produced shortly before launch and yields dictate how things are cut down, etc. Consoles are more about coming up with a design that can be mass produced cheaply and reliably (which also means cheaply).

OG Xbone was way cut down from a 7850, more like a 7750 iirc

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

HalloKitty posted:

OG Xbone was way cut down from a 7850, more like a 7750 iirc

It was between the 7770 and 7790, looking at the specs in the TechPowerUp database. Probably closer to the 7770 since the clocks were a bit lower.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Paul MaudDib posted:

People impute way too much to that statement. Ivy Bridge to Haswell was a “new architecture” too. “New architecture” doesn’t mean you throw everything away and AMD obviously did not do that.
Yeah, I get what they meant. "Substantially revised." Something like:
code:
Zen   --  Tick  
Zen+  --  Tock
Zen2  --  Tock
Zen3  --  Tick

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

mdxi posted:

Yeah, I get what they meant. "Substantially revised." Something like:
code:
Zen   --  Tick  
Zen+  --  Tock
Zen2  --  Tock
Zen3  --  Tick

More like tick tock tick tick; Zen2 is substantially different from Zen1 or Zen+.

Josh Lyman
May 24, 2009


Paul MaudDib posted:

People impute way too much to that statement. Ivy Bridge to Haswell was a “new architecture” too. “New architecture” doesn’t mean you throw everything away and AMD obviously did not do that. Making significant IPC gains from incremental changes to the architecture is the norm, especially in the early stages of an architecture when there is a lot of low hanging fruit to be picked.
I’m a little confused about your analogies here, since Wikipedia says Haswell was only up to 5% faster single threaded and 3% faster on average compared to Ivy Bridge. Meanwhile, Zen 3 has 19% higher IPC than Zen 2, which is substantial, though I guess the impact on gaming at 1440p and 4K is probably less since performance is driven by the GPU.
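For what it's worth, the arithmetic is just multiplicative: single-thread performance scales roughly as IPC times clock. A quick sketch, taking the figures quoted in this thread (AMD's claimed 19% for Zen 3, the ~5% Wikipedia number for Haswell) at face value rather than as independent benchmarks:

```python
# Single-thread performance ~ IPC x clock, so at equal clocks the IPC
# gain is the whole uplift. Numbers are the ones quoted in-thread.
def st_uplift(ipc_gain, clock_ratio=1.0):
    """Overall single-thread speedup as a fraction, e.g. 0.19 == +19%."""
    return (1 + ipc_gain) * clock_ratio - 1

print(f"Zen 2 -> Zen 3:  +{st_uplift(0.19):.0%}")
print(f"Ivy -> Haswell:  +{st_uplift(0.05):.0%}")
```

The clock_ratio knob is there because a "new architecture" that also clocks higher (or lower) compounds with the IPC change, which is part of why gen-over-gen comparisons get muddy.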

Cygni
Nov 12, 2005

raring to post

I don't think the Tick/Tock analogies are all that helpful because AMD isn't working like that.

Zen was a brand new core architecture on a brand new process.
Zen+ was Zen on a lightly tweaked process that allowed small clock increases.
Zen2 was the same core design but with a die shrink and lots of changes to everything else around the core.
Zen3 is the same process but with lots of core changes.
Zen4 is (apparently) a die shrink without core changes.

Khorne
May 1, 2002
AMD has two separate teams and is aiming for tick tick tick. Zen+ was an anomaly and was more or less what they wanted zen1 to be but didn't have the time to do.

I'm not up on zen4/zen5 news, but that was their zen1 through zen3 plan that they executed on.

Llamadeus
Dec 20, 2005

Cygni posted:

Zen4 is (apparently) a die shrink without core changes.
I don't think they would do this: Zen is the name of the architecture, they'd only increment the number if they made changes to the design

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Zen4 was last :airquote: rumored :airquote: to basically be Zen3 but with more L3, 3D stacking and further core/IF optimizations for power. It'd be either memory and/or CPU die stacking, the rumor is that AMD isn't looking to merely double core counts again but go nuts and quadruple them for EPYC and EPYC alone while having a lot of local memory. Ryzen and Threadripper would get a doubling of cores, Threadripper might get a 3D memory, but the real target for Zen4 is 100% server, cloud and datacenter.

I mean if you wanted to beat Sapphire Rapids and Intel, a soul crushing 256C and 512T top end EPYC SKU would probably do that.

EmpyreanFlux fucked around with this message at 03:04 on Oct 12, 2020

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
An 8 core/16 thread part positioned at the Ryzen 3 tier would be hilarious

Cygni
Nov 12, 2005

raring to post

Llamadeus posted:

I don't think they would do this: Zen is the name of the architecture, they'd only increment the number if they made changes to the design

"Without core changes" was probably too strong wording on my part; I meant more "evolutionary" in its changes, to use AMD's own nomenclature. Zen3 is the big core architecture change. Not saying there are zero changes at all for Zen4, but the core itself is not getting large scale changes.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
So... what is the normal gameplan for scoring new CPUs right away? Is it like Ampere was? Do all of the normal retailers like Best Buy sell CPUs?

Is it normally easier or harder to get the top end sku (in this case 5950x?)

Bonus question: what is the best motherboard for a 5950x? I have it narrowed roughly down to:

1) Tomahawk x570, which seems to be a huge favorite, but it doesn't have SLI
2) Aorus Master x570

If there's something better I would love to hear about it. I would prefer to keep SLI in there if possible, since the 3090 has "support" for it and my setup can theoretically, I think, power 2x 3090s. I get that realistically there's no way SLI is going anywhere, but it seems kind of stupid not to leave the option open for the outside chance it pays off. However I'm not married to SLI if there's a much better option without it.

Taima fucked around with this message at 05:14 on Oct 12, 2020

Cojawfee
May 31, 2006
I think the US is dumb for not using Celsius
What the hell are you doing that would require two 3090s? I don't think SLI has ever been worth it.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy
SLI 3090 is a case of either you need it for work making money or you don't need it, end of story.

SLI could not be any more dead than it currently is; the only thing to top it would be Jensen telling you that you're a stupid motherfucker for buying 2 3090s

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
The SLI benchmarks of the 3090 in the games that supported it are really impressive. I just think leaving the door open to it makes sense given that most motherboards one would pair with a 5950x would probably have SLI anyways (I think?)

hobbesmaster
Jan 28, 2008

Taima posted:

So... what is the normal gameplan for scoring new CPUs right away? Is it like Ampere was? Do all of the normal retailers like Best Buy sell CPUs?

Is it normally easier or harder to get the top end sku (in this case 5950x?)

Bonus question: what is the best motherboard for a 5950x? I have it narrowed roughly down to:

1) Tomahawk x570, which seems to be a huge favorite, but it doesn't have SLI
2) Aorus Master x570

If there's something better I would love to hear about it. I would prefer to keep SLI in there if possible, since the 3090 has "support" for it and my setup can theoretically, I think, power 2x 3090s. I get that realistically there's no way SLI is going anywhere, but it seems kind of stupid not to leave the option open for the outside chance it pays off. However I'm not married to SLI if there's a much better option without it.

If you're setting money on fire, put the cash into a really cool looking water cooled display case. If you actually have a use for 3090s in SLI, why aren't you waiting for info on the Threadrippers? Surely you'd need all those CPU PCIe lanes to shovel whatever you're rendering from your PCIe 4 SSDs to the GPUs. There are only 24 CPU lanes on a Zen 2 Ryzen vs 64 on the Zen 2 Threadrippers. Zen 3 will probably be the same.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Guys, please don't get lost in the weeds here. I feel like I was pretty clear that I'm not looking to jam 2x 3090s into anything; it's just an insurance policy, because I think most boards I would consider would have it anyways, so why not leave the option open on the motherboard for some future use case.

Re: water cooling, I find water cooling to be at best a hobby play. Air cooling is whisper quiet and super good. If setting up a water cooling loop makes your dick hard, more power to you though, just not for me.

shrike82
Jun 11, 2005

You’re not going to find a consumer board that can fit two 3090s without having one standing off via a riser anyway, so it's kind of an irrelevant prereq

Cygni
Nov 12, 2005

raring to post

Taima posted:

The SLI benchmarks of the 3090 in the games that supported it are really impressive. I just think leaving the door open to it makes sense given that most motherboards one would pair with a 5950x would probably have SLI anyways (I think?)

There are 14 games ever made that support explicit mGPU on the 3000 series, and none in development as far as I know. The last one released was RDR2 in November of 2019. There have been zero games in the last year.

Nvidia restricting SLI to the $1,500 tier means that there likely will never be any more either. It is dead.

For games with SLI profiles, Nvidia has specifically said that there will be no new profiles. Also, the 3000 series will never officially support those profiles. That is also dead.

If you really want to burn $1500 on your gaming computer, try threadripper. SLI is not even worth considering or thinking about.

hobbesmaster
Jan 28, 2008

shrike82 posted:

You’re not going to find a consumer board that can fit two 3090s without having one standing off via riser anyway so kinda irrelevant prereq

Some of the TRX40 boards look like they might barely fit. They're 3 slots, not 3 slots and change (lol), right?

shrike82
Jun 11, 2005

I have a 3960X on a TRX40 board and can’t fit 2 Titans on it without a stand-off setup, and ended up hooking one up via eGPU for compute

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

I have a 3960X on a TRX40 board .... and ended up hooking ... for compute

you see what happens when you buy AMD, people?

at first you just want a little compute, maybe a 3900x or something, and then you can't control yourself anymore

Kazinsal
Dec 13, 2011


AMD's core counts are nerd smack.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

hobbesmaster posted:

Some of the TRX40 boards look like they might barely fit. They're 3 slots, not 3 slots and change (lol), right?

There certainly are boards where they'll fit, but the two PCIe x16 slots need to be 4 slots apart, since NVidia is only releasing a 4-slot bridge for the 3090. Which, given the 3090's size, does probably mean using risers and a very hosed up setup. Or I guess you could slot it into the bottom PCIe slot on a 3- or 4-slot mobo and enjoy running a 3090 off a PCIe x4 chipset slot, though that'll tank your performance :shrug:

It's still a bad idea, since there is basically no possible world in which you get a single 3090 today for gaming and decide in a year or two that a second one makes any sort of sense for gaming. It just won't happen--either you want it for playing SLI-compatible games today or you don't. It's not an upgrade path at all, NVidia has been very clear about that. Compute, rendering, etc., then sure, that's a totally different story.

Honestly watercooling is probably a better use of the money, especially given the heat put out by the 3090.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Taima posted:

So... what is the normal gameplan for scoring new CPUs right away? Is it like Ampere was? Do all of the normal retailers like Best Buy sell CPUs?

Is it normally easier or harder to get the top end sku (in this case 5950x?)

It's hard to really know at this point. AMD has been talking big like they'll have a ton of inventory at launch, but talk is cheap. The 5950X will likely be easier to get than the others simply because demand for an $800 consumer-grade CPU is not super high. Still, there will likely be stock trackers like there are for the 3080/3090 (Discord, mostly) that would be a good idea to get in on.

Your normal avenues for getting chips in the US are gonna be Amazon, NewEgg, BestBuy (online, not in store), B&H Photo, and MicroCenter.

ijyt
Apr 10, 2012

OcUK has preliminary prices for the 5000 series.

5600X: £289.99 (3600XT: £229.99 / 10600K £248.99)
5800X: £428.99 (3800XT: £359.99 / 10700K £359.99)
5900X: £529.99 (3900XT: £449.99 / 10900K £499.99)
5950X: £749.99 (3950X: £579.99)
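Putting those against their listed predecessors, the gen-over-gen uplift works out like this (a quick sketch using the OcUK numbers above, which are preliminary pre-launch listings, not MSRPs):

```python
# Gen-over-gen price uplift from the preliminary OcUK listings quoted above.
pairs = {
    "5600X vs 3600XT": (289.99, 229.99),
    "5800X vs 3800XT": (428.99, 359.99),
    "5900X vs 3900XT": (529.99, 449.99),
    "5950X vs 3950X":  (749.99, 579.99),
}
for name, (new, old) in pairs.items():
    print(f"{name}: +{(new - old) / old:.0%}")
```

Roughly +18% to +29% over the XT refresh parts, with the 5950X taking the biggest jump.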

ijyt fucked around with this message at 15:42 on Oct 12, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Jesus Christ. AMD marketing the 10700k better than Intel ever could.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Zedsdeadbaby posted:

Jesus Christ. AMD marketing the 10700k better than Intel ever could.

Just like AMD has done before: once Intel releases new CPUs, AMD will cut prices. These new Ryzens are faster single threaded than Intel's; they're premium parts for those willing to pay up right now.


ijyt
Apr 10, 2012

IDK, I'm pretty tempted by the 5900X. If Rocket Lake is still slated for a pre-March launch then I might wait and just focus on getting a 3080.
