VorpalFish
Mar 22, 2007
reasonably awesome™

DrDork posted:

Fair enough, but do you really need 750W? Seasonic has 650's that have been considerably more available of late if you're not thinking of...well I don't really know what you'd do to need 750W, given that even a heavily OC'ed 3080 only hits like 400W + 100W for the 5900X + like 50W for everything else. 750's made sense for OCing 10700/10900k's, but not so much for Zen 3.

Seasonic's units are SFX-L, not SFX, so there are compatibility reasons they aren't always a 1:1 replacement.

Cross-Section
Mar 18, 2009

DrDork posted:

Fair enough, but do you really need 750W? Seasonic has 650's that have been considerably more available of late if you're not thinking of...well I don't really know what you'd do to need 750W, given that even a heavily OC'ed 3080 only hits like 400W + 100W for the 5900X + like 50W for everything else. 750's made sense for OCing 10700/10900k's, but not so much for Zen 3.

I’m just going off the recommended specs on the product pages for all the new GPUs. I might consider downgrading to a lower wattage if literally everything else (GPU and CPU included) comes in for this build, but right now I might as well wait to see if stock shows up.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

VorpalFish posted:

Seasonic's units are SFX-L, not SFX, so there are compatibility reasons they aren't always a 1:1 replacement.

That's true, and I suppose the Corsair SFX line drops from 750 to 600, which is maybe a bit close to the limit. SilverStone has both 650W and 700W SFX ones, though, and EVGA has a 650W one, all in stock at Amazon at reasonable prices.

Cross-Section posted:

I’m just going off the recommended specs on the product pages for all the new GPUs. I might consider downgrading to a lower wattage if literally everything else (GPU and CPU included) comes in for this build, but right now I might as well wait to see if stock shows up.

Quite fair. It just seems that a lot of people are still going by some pretty outdated rules of thumb, like not running it past 70% load or whatever, and way over-buying on capacity. I mean, if it's the same price and all then, sure, I'd go for the bigger one every time. But if it's a question of being able to build it at all or not, I wouldn't worry about it.

DrDork fucked around with this message at 19:43 on Nov 2, 2020
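
Back-of-envelope, DrDork's numbers work out like this (a quick sketch; the wattages are the rough figures quoted above, not measurements from any particular build):

code:
# Rough PSU load check using the ballpark figures quoted upthread.
gpu_w = 400    # heavily OC'ed RTX 3080, worst case
cpu_w = 100    # Ryzen 9 5900X under load
rest_w = 50    # board, RAM, drives, fans

draw = gpu_w + cpu_w + rest_w   # 550W total system draw

for psu_w in (650, 750):
    print(f"{psu_w}W unit: {draw / psu_w:.0%} load at full tilt")

# 650W unit: 85% load at full tilt
# 750W unit: 73% load at full tilt

A quality 650 runs that worst case at about 85% sustained, which it is rated for; the old "stay under 70%" rule mostly guards against units that can't actually deliver their label wattage.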

Klyith
Aug 3, 2007

GBS Pledge Week

Cross-Section posted:

I’m just going off the recommended specs on the product pages for all the new GPUs.

GPU recommended specs have to account for people with lovely low-efficiency PSUs that struggle to put out the watts they claim on the label. High-quality Gold or better units are fairly comfortable delivering their rated watts, and when they put "650" on the box they mean 650 of real 12V, not 580 of 12V and 70 of 5V & 3.3V.

So good 650W PSUs should generally be ok for a 3080 + Ryzen, if the 3080 is not OCed or one of the maximum pre-OCed units. Maxing both a 3080 and 5950 out with OCs could get a bit dicey, depending on the rest of the system.

OTOH if this is an SFX system, the idea of putting 600W of heat inside a shoebox-sized container is an interesting challenge in itself. You also have to look at things like whether your case mounts the PSU drawing fresh outside air (like most regular desktops these days) or if the PSU is using internal air to assist with airflow. Internal air in an SFX PC can make the PSU get quite warm since it's eating hot air, and that calls for more headroom.
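
To put the label-watts-versus-12V-watts point in numbers (the rail split below reuses the 580/70 example from the post as a hypothetical; it is not the spec of any particular unit):

code:
# Label watts vs. usable 12V watts, using the example split above.
system_12v_draw = 550   # GPU + CPU pull nearly everything from 12V

good_12v = 650    # quality unit: the full label is 12V capacity
cheap_12v = 580   # bargain "650W" unit: 580W of 12V, 70W on 5V/3.3V

print(f"good unit 12V headroom:  {good_12v - system_12v_draw}W")   # 100W
print(f"cheap unit 12V headroom: {cheap_12v - system_12v_draw}W")  # 30W

Thirty watts of headroom disappears fast once the PSU is eating warm case air and derating.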

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Klyith posted:

OTOH if this is an SFX system, the idea of putting 600W of heat inside a shoebox-sized container is an interesting challenge in itself. You also have to look at things like whether your case mounts the PSU drawing fresh outside air (like most regular desktops these days) or if the PSU is using internal air to assist with airflow. Internal air in an SFX PC can make the PSU get quite warm since it's eating hot air, and that calls for more headroom liquid cooling everything so internal heat isn't an issue in the first place.

This is the way.

GRECOROMANGRABASS
May 14, 2020

Blorange posted:

The 5600x will be $300 because they can charge that much for it. It's looking like ~30% more performance for 50% more money.

When reviewing the 5600X, SiSoftware found its performance to be ~40% better than the 3600X's.

And you are dead wrong about the price being "because they can charge that much for it." AMD is following the same pricing strategy for the 5000 series as Zen 2. Don't assume AMD has the same business model as Intel.

Intel maintains a gross margin of around 65% on their consumer CPUs, while AMD forecasts gross margins on Zen 3 consumer CPUs of roughly 45%, which is where they have been for a while with Zen. That means AMD intends to price Zen 3 at the same level of per-unit profit as they got with Zen 2. Zen 3 has a lot of R&D to recoup, and is likely to be a more expensive product to produce due to factors such as TSMC capacity being in record-high demand. They could charge a lot more than $299 for this processor.

AMD buying Xilinx in an all-stock deal is a pretty clever move. Prior to the acquisition, AMD had a free cash flow of ~$150 million over the last year, but the combined companies will have nearly $2 billion in free cash flow over the next 12 months.
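
As a sanity check on the margin argument, the arithmetic looks roughly like this (the unit cost is purely hypothetical for illustration; only the ~45% and ~65% margins come from the post above):

code:
# Gross margin -> sticker price.
def price_at_margin(unit_cost, margin):
    """Selling price that yields the given gross margin."""
    return unit_cost / (1 - margin)

unit_cost = 165.0  # hypothetical cost to build and ship a 5600X

print(f"at 45% margin: ${price_at_margin(unit_cost, 0.45):.0f}")  # $300
print(f"at 65% margin: ${price_at_margin(unit_cost, 0.65):.0f}")  # $471

Same hypothetical cost, very different sticker price: holding margin near 45% rather than 65% is what a $299 tag implies.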

Cross-Section
Mar 18, 2009

Cross-Section posted:

Ah, drat. I guess I’m checking the production date on the Asrock B550m-ITX/AC I ordered when it comes in today, because I couldn't find hide nor hair of a BIOS Flashback option on the product page.

Egg on my face for assuming I could just plop a Zen 3 onto a “Supports Future AMD Ryzen Processors” board.

Took a few minutes of looking over this little baby of a board, but...

[image: photo of the board]

yay I got a good mobo

Klyith posted:

GPU recommended specs have to account for people with lovely low-efficiency PSUs that struggle to put out the watts they claim on the label. High-quality Gold or better units are fairly comfortable delivering their rated watts, and when they put "650" on the box they mean 650 of real 12V, not 580 of 12V and 70 of 5V & 3.3V.

So good 650W PSUs should generally be ok for a 3080 + Ryzen, if the 3080 is not OCed or one of the maximum pre-OCed units. Maxing both a 3080 and 5950 out with OCs could get a bit dicey, depending on the rest of the system.

OTOH if this is an SFX system, the idea of putting 600W of heat inside a shoebox-sized container is an interesting challenge in itself. You also have to look at things like whether your case mounts the PSU drawing fresh outside air (like most regular desktops these days) or if the PSU is using internal air to assist with airflow. Internal air in an SFX PC can make the PSU get quite warm since it's eating hot air, and that calls for more headroom.

Good to know. In the end, I caved and got the Silverstone 700W; reviews seem good outside of the apparently-inflexible cabling. And yeah, I wasn't planning on OCing beyond whatever factory settings may already be applied.

As for the cooling aspect, I'm using an NR200 as the case for this build. The PSU mount appears to be out-facing, judging by the mesh (and the, well, mount itself).

Cross-Section fucked around with this message at 22:53 on Nov 2, 2020

Khorne
May 1, 2002

Cross-Section posted:

Good to know. In the end, I caved and got the Silverstone 700W; reviews seem good outside of the apparently-inflexible cabling. And yeah, I wasn't planning on OCing beyond whatever factory settings may already be applied.

Inflexible cabling is annoying when you're building the computer and then completely irrelevant for the rest of time. For most people, at least.

Klyith
Aug 3, 2007

GBS Pledge Week

GRECOROMANGRABASS posted:

and is likely to be a more expensive product to produce due to factors such as TSMC capacity being in record-high demand

Ya know, that's a heck of a good point, and it's unfortunate that nobody's brought it up so far amid all the grouching about the price increases. It sucks that prices are higher for performance chips, but some of the blame has to be pointed at all the other companies that have failed to make a competitive process.


(Remember when GlobalFoundries was part of the TSMC-Samsung-GF "not Intel" 7nm development coalition, and right before it was go time they decided the new fabs were so expensive they weren't worth building? :lol:)

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Klyith posted:

(Remember when GlobalFoundries was part of the TSMC-Samsung-GF "not Intel" 7nm development coalition, and right before it was go time they decided the new fabs were so expensive they weren't worth building? :lol:)

It'd be interesting to see GloFo's books on that one. I mean, Intel hasn't figured out 10/7nm yet, either, but they've had some of the most profitable quarters in their history churning out poo poo on 14nm. Wonder if GloFo has had a similar experience where they've had orders enough to make outsized profits by leveraging mature (and thus very cost-efficient) processes rather than having to recoup the capital expenses of the new foundries.

GRECOROMANGRABASS
May 14, 2020
Yeah, the alliance between TSMC, Samsung, and GF was great, and who knows where we might be with that combined R&D collaboration if GF hadn't dropped the ball. Samsung's 8nm process has a transistor gate pitch equal to or very close to TSMC's 10nm process, so I get the impression they aren't much further ahead of where they were when GF dropped out.

GRECOROMANGRABASS
May 14, 2020
Late-stage capitalism is probably the reason for GF doing what it did. They had short- and mid-term profits guaranteed from their 12 and 14nm process nodes, and with a 7nm process that was nearly ready for risk production and just needed a little love and attention to get up and running, they probably have an exit strategy of being acquired for a king's ransom long before 12nm is obsolete.

terrorist ambulance
Nov 5, 2009
Wish they'd release some AMD 5000 benchmarks before the 5th. Don't really feel like racing to add to cart when it's not clear what you're getting.

Suburban Dad
Jan 10, 2007

Well what's attached to a leash that it made itself?
The punchline is the way that you've been fuckin' yourself

terrorist ambulance posted:

Wish they'd release some AMD 5000 benchmarks before the 5th. Don't really feel like racing to add to cart when it's not clear what you're getting.

You can always return it.

Knot My President!
Jan 10, 2005

Does anyone know what time they'll be dropping? I've heard midnight, 6am, and 9am PST and I reeealllly don't want to rely on Discord to compete with thousands of nerds a la the 3080 launch (but it will probably be that way again)

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Knot My President! posted:

Does anyone know what time they'll be dropping? I've heard midnight, 6am, and 9am PST and I reeealllly don't want to rely on Discord to compete with thousands of nerds a la the 3080 launch (but it will probably be that way again)

Nope. The assumption is 9am EST like with some other launches, but they've not said anything official.

Beef
Jul 26, 2004
GF doesn't need to chase the same aggressive node technology the other foundries are chasing. It makes sense for logic/compute to chase further shrink, but not everything requires that. AMD demonstrated pretty aptly that it's more economical these days to not do everything on a monolithic die. IO has certain physical constraints that make older mature nodes just as good, at a much lower cost. It makes perfect business sense for GF to specialize their process and tools into things like IO. Isn't GF already producing AMD's IO chiplets?

VorpalFish
Mar 22, 2007
reasonably awesome™

Beef posted:

GF doesn't need to chase the same aggressive node technology the other foundries are chasing. It makes sense for logic/compute to chase further shrink, but not everything requires that. AMD demonstrated pretty aptly that it's more economical these days to not do everything on a monolithic die. IO has certain physical constraints that make older mature nodes just as good, at a much lower cost. It makes perfect business sense for GF to specialize their process and tools into things like IO. Isn't GF already producing AMD's IO chiplets?

They are, but iirc AMD is still contractually obligated to purchase some quantity of wafers from them as part of the sale of their interest in the fab, so who knows if they would otherwise. That should run through 2024.

etalian
Mar 20, 2006

A 3600 should be enough for anyone...

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

etalian posted:

A 3600 should be enough for anyone...

I saw a video of a dude who paired an Athlon 200GE with an RTX 2080 and was still getting a good 60+ FPS in a bunch of games at 4K

is gaming performance really that heavily biased towards blowing wads of cash on your GPU, while the CPU can just be whatever?

VorpalFish
Mar 22, 2007
reasonably awesome™

gradenko_2000 posted:

I saw a video of a dude who paired an Athlon 200GE with an RTX 2080 and was still getting a good 60+ FPS in a bunch of games at 4K

is gaming performance really that heavily biased towards blowing wads of cash on your GPU, while the CPU can just be whatever?

Yes, with some exceptions (flight sims), but if you want smooth frame time delivery at high refresh rates, having a fast CPU becomes important.

The bias for gaming builds has always been to spend more on the GPU, though.

etalian
Mar 20, 2006

gradenko_2000 posted:

I saw a video of a dude who paired an Athlon 200GE with an RTX 2080 and was still getting a good 60+ FPS in a bunch of games at 4K

is gaming performance really that heavily biased towards blowing wads of cash on your GPU, while the CPU can just be whatever?

Yup, pretty much: benchmarking, especially at higher resolutions like 4K that roast the GPU, is GPU limited rather than CPU limited.

You tend to see bigger CPU differences only in older games like Counter-Strike that can run at 200+ FPS, or at 1080p.

It's also why going from a 3600 to a 3600XT doesn't yield big performance changes.

Kibner
Oct 21, 2008

Acguy Supremacy

gradenko_2000 posted:

I saw a video of a dude who paired an Athlon 200GE with an RTX 2080 and was still getting a good 60+ FPS in a bunch of games at 4K

is gaming performance really that heavily biased towards blowing wads of cash on your GPU, while the CPU can just be whatever?

More or less. The better gaming CPUs are usually for people who want to run games at 120+ fps and also raise the 1% and 0.1% lows, iirc. Or do software encoding for streaming while playing a game on the same machine.
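
For anyone unfamiliar with the metric, 1% and 0.1% lows are computed from per-frame times roughly like this (a minimal sketch with fake data, not any benchmark tool's exact method):

code:
# 1% / 0.1% lows: average FPS over only the slowest frames.
def percent_low(frame_times_ms, percent):
    """Average FPS of the slowest `percent`% of frames."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * percent / 100))
    return 1000.0 * n / sum(worst[:n])

# Fake capture: mostly ~7ms frames (~144fps) with occasional stutter.
times = [7.0] * 990 + [40.0] * 10

print(f"average: {1000.0 * len(times) / sum(times):.0f} FPS")  # ~136
print(f"1% low:   {percent_low(times, 1.0):.0f} FPS")          # 25
print(f"0.1% low: {percent_low(times, 0.1):.0f} FPS")          # 25

A high average with bad lows is exactly what stutter feels like.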

Suburban Dad
Jan 10, 2007

Well what's attached to a leash that it made itself?
The punchline is the way that you've been fuckin' yourself

I'm running a 2500K at 4.3GHz and a 1070, and some newer games are unplayable at 1440p (144Hz, G-Sync). I'll randomly go from 100+ FPS to 2-3 second freezes and tons of little stutters. Definitely looking forward to a CPU upgrade.

VorpalFish
Mar 22, 2007
reasonably awesome™

Suburban Dad posted:

I'm running a 2500K at 4.3GHz and a 1070, and some newer games are unplayable at 1440p (144Hz, G-Sync). I'll randomly go from 100+ FPS to 2-3 second freezes and tons of little stutters. Definitely looking forward to a CPU upgrade.

Part of that is because 4c/4t is actually becoming a bottleneck - you might not notice as much of that if you were on a 2600K instead.

Not that you didn't get your money's worth and then some out of the 2500K.

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!
That CPU is 9 years old. I think we went through a very rare period where CPUs 7-9 years old were still incredibly viable and offered tons of longevity. This "generation" has seen games really start to utilize multithreading as well, which is why the old i7s sort of held up, and why 4c/4t chips like yours are sputtering now.

I'd almost say that gaming at 1080p is a "solved" problem; I'd never upgrade my CPU if I weren't trying to push 100 frames constantly at 1440p, since FreeSync takes care of the jitters.

ufarn
May 30, 2009
You need at least two cores to run the DRM in Ubisoft games.

Suburban Dad
Jan 10, 2007

Well what's attached to a leash that it made itself?
The punchline is the way that you've been fuckin' yourself

VorpalFish posted:

Part of that is because 4c/4t is actually becoming a bottleneck - you might not notice as much of that if you were on a 2600K instead.

Not that you didn't get your money's worth and then some out of the 2500K.

Not complaining in the least. It's served me really well.

pyrotek
May 21, 2004

Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running roughly a 3700X at slightly lower clocks. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they're using all that power to still target 30 FPS.

Watch Dogs Legion is probably a taste of the future.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

pyrotek posted:

Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running roughly a 3700X at slightly lower clocks. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they're using all that power to still target 30 FPS.

Watch Dogs Legion is probably a taste of the future.

If they're going to target 30 fps it's going to be for graphics reasons. Moreover, they're not going to target 30 fps. On a per-core basis an OC'd Sandy Bridge can keep up with stock Skylake; it's the lack of cores that'll do in older processors, not per-core perf.

E:vvvvv also that. Especially for ubi games.

Fantastic Foreskin fucked around with this message at 16:20 on Nov 3, 2020

hobbesmaster
Jan 28, 2008

pyrotek posted:

Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running roughly a 3700X at slightly lower clocks. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they're using all that power to still target 30 FPS.

Watch Dogs Legion is probably a taste of the future.

Halfassed DRM that crashes every couple of minutes and poor use of ray tracing?

...I think you’re right

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


pyrotek posted:

Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running roughly a 3700X at slightly lower clocks. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they're using all that power to still target 30 FPS.

Watch Dogs Legion is probably a taste of the future.

There's a lot of buzz about 4K60 and 120Hz gaming this gen; I very much doubt they're going to target 30 at any resolution.

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!

pixaal posted:

There's a lot of buzz about 4K60 and 120Hz gaming this gen; I very much doubt they're going to target 30 at any resolution.

I don't think that's going to happen, to the point that I feel comfortable citing any off-the-rack gaming article, because MS and Sony themselves are already pushing those goalposts WAY back.

https://www.windowscentral.com/xbox-series-x-and-ps5-may-not-offer-massive-leap-visuals-initially

Most PS5 demo footage was rendering at 1440p 30fps, and the Xbox Series X is already missing 60fps targets on cross-gen games. As always there will be specific games/developers/progress on this front, and launch games are always a bit janky, but you'd expect the hardware to just blow away a cross-gen game...

Mofabio
May 15, 2003
(y - mx)*(1/(inf))*(PV/RT)*(2.718)*(V/I)

mdxi posted:

I've ordered a B550 mobo (Gigabyte DS3H) and 32G of DDR4-3600 memory, so whenever I can get my hands on a 5900X, I'll be ready to do an upgrade and some testing. I'll (briefly) have Zen, Zen+, Zen 2 and Zen 3 CPUs all in-house. For fun, I'll probably run the same set of benchmarks across all of them.

I'm not gonna go as hard on testing as I did with the 3900X. For instance, I'm not going to do 24-hour runs of each subproject at World Community Grid. I was really driven to quantify what Zen+ vs Zen 2 performance was going to be like, in a lot of contexts. This time around I'm far more comfortable assuming it'll be the 15-20% uplift AMD is claiming, and running some quicker tests to validate that. Plus I just want the upgraded machine to get back to work. I'm thinking:

- Geekbench 5.1.1
- Stockfish chess engine (edit: fantastic for testing ALU parallelism)
- OpenFOAM CFD suite (edit: pure FPU workload; will isolate improvements there)

Though I'm open to requests (so long as it's something that can run headless on Linux).

Yes please!
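
For the curious, a headless run of one of the benchmarks mdxi lists can be scripted in a few lines. A minimal sketch, assuming a stockfish binary is on $PATH; note that Stockfish prints its bench results to stderr in most builds, so the parsing below may need adjusting for yours:

code:
# Minimal headless harness for Stockfish's built-in benchmark.
import re
import subprocess

def stockfish_nps(runs=3):
    """Run `stockfish bench` a few times, return nodes/second figures."""
    results = []
    for _ in range(runs):
        proc = subprocess.run(["stockfish", "bench"],
                              capture_output=True, text=True, check=True)
        m = re.search(r"Nodes/second\s*:\s*(\d+)", proc.stderr + proc.stdout)
        if m:
            results.append(int(m.group(1)))
    return results

nps = stockfish_nps()
print("runs:", nps)
print(f"best: {max(nps):,} nodes/second")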

etalian
Mar 20, 2006

pyrotek posted:

Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running roughly a 3700X at slightly lower clocks. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they're using all that power to still target 30 FPS.

Watch Dogs Legion is probably a taste of the future.

Yeah, plus AMD is really making a killing by successfully pitching themselves as the high-performance GPU/CPU provider for the next PS and Xbox refreshes.

Even though, when they come out, the CPU side will be the Zen 2 architecture, while PC CPUs are getting Zen 3 this winter.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

etalian posted:

Yeah, plus AMD is really making a killing by successfully pitching themselves as the high-performance GPU/CPU provider for the next PS and Xbox refreshes.

Even though, when they come out, the CPU side will be the Zen 2 architecture, while PC CPUs are getting Zen 3 this winter.

Which is still plenty of CPU for games, especially because the consoles are a more controlled environment. This round of consoles feels way less behind at launch than the PS4 / XBone did.

etalian
Mar 20, 2006

Twerk from Home posted:

Which is still plenty of CPU for games, especially because the consoles are a more controlled environment. This round of consoles feels way less behind at launch than the PS4 / XBone did.

Well, it was a much craftier strategy by AMD to focus on performance instead of being seen as the budget CPU company.

Getting the contracts for both Xbox and PS5 hardware was also a good move to go along with their more PC/server-specific products.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

etalian posted:

Well, it was a much craftier strategy by AMD to focus on performance instead of being seen as the budget CPU company.

Getting the contracts for both Xbox and PS5 hardware was also a good move to go along with their more PC/server-specific products.

The rumors have always been that AMD doesn't actually make very much in terms of profits off the console SoCs, and were chosen repeatedly because they're the only ones able to offer a single-SoC solution with reasonable enough CPU + GPU performance and were willing to sell them super cheap. It's what more or less kept the lights on over at AMD HQ for a while, and apparently heavily funded AMD's GPU R&D for years.

I mean, AMD wasn't playing the budget role because they were "crafty" and wanted to--they were playing it because their arch was pretty crap for a long time, and they got dunked on by Intel both in terms of arch performance and in terms of fabs because Intel was ahead of everyone else. If Intel hadn't hosed up 10nm so badly, things might be considerably different now, but instead Intel is still on a node from 2014. It took Intel almost standing still for 4-6 years for AMD to catch up (depending on what workload you're looking at). AMD taking punches at the performance level is something Intel could absolutely have avoided...but they didn't.

On the GPU side of the house they played the budget fiddle because they didn't have the capital to really do much more than warmed over refreshes of Island-series arch for years, and their few forays into new spaces bombed horribly (Radeon VII, HBM, etc). The console contracts in that sense were very good moves since they pretty much bankrolled the development of RDNA into an actually decent arch. They still might not be looking so great had NVidia gone with TSMC instead of Samsung and had a 10-20% performance bump from that--then again who knows what that would have done for Ampere's pricing.

Anyhow, at least now we might get some real competition, which will be good for consumers on the pricing points, but might not be so great for consumers on the "everyone rushing to build their own walled garden" side of things.

DrDork fucked around with this message at 21:46 on Nov 3, 2020

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

Beef posted:

GF doesn't need to chase the same aggressive node technology the other foundries are. It makes sense for logic/compute to chase further shrink, but not everything requires that. AMD demonstrated pretty aptly that it's more economic these days to not do everything on a monolithic die. IO has certain physical constraints that make older mature nodes just as good, at a much lower cost. It makes perfect business sense for GF to specialize their process and tools into things like IO. Isn't GF already producing AMD's IO chiplets?

If they don't bother moving to newer nodes at some point, they'll sink into the poo poo tier of fabs and be forced to compete with all the other producers who make chips for toasters because their tech is outdated.


If GloFo had made a stab at 7nm (even if it wasn't very good), they would have a ready customer in Nvidia right now.

Adbot
ADBOT LOVES YOU

The Gadfly
Sep 23, 2012

ConanTheLibrarian posted:

If they don't bother moving to newer nodes at some point, they'll sink into the poo poo tier of fabs and be forced to compete with all the other producers who make chips for toasters because their tech is outdated.

I thought they were moving to a smaller node with their upcoming Rocket Lake, but no, they are still on 14nm nodes in 2021 :lol:
