|
DrDork posted:Fair enough, but do you really need 750W? Seasonic has 650's that have been considerably more available of late if you're not thinking of...well I don't really know what you'd do to need 750W, given that even a heavily OC'ed 3080 only hits like 400W + 100W for the 5900X + like 50W for everything else. 750's made sense for OCing 10700/10900k's, but not so much for Zen 3. Seasonic's units are SFX-L, not SFX, so there are compatibility reasons they aren't always a 1:1 replacement.
|
# ? Nov 2, 2020 19:24 |
|
DrDork posted:Fair enough, but do you really need 750W? Seasonic has 650's that have been considerably more available of late if you're not thinking of...well I don't really know what you'd do to need 750W, given that even a heavily OC'ed 3080 only hits like 400W + 100W for the 5900X + like 50W for everything else. 750's made sense for OCing 10700/10900k's, but not so much for Zen 3. I’m just going off the recommended specs on the project pages for all the new GPUs. I might consider downgrading to a lower wattage if literally everything else (GPU and CPU included) comes in for this build, but right now I might as well wait to see if stock shows up.
|
# ? Nov 2, 2020 19:37 |
|
VorpalFish posted:Seasonic's units are sfx-l, not sfx so there are compatibility reasons they aren't always a 1:1 replacement. That's true, and I suppose the Corsair SFX line drops from 750 to 600, which is maybe a bit close to the limit. SilverStone has both 650W and 700W SFX ones, though, and EVGA has a 650W one, all in stock at Amazon at reasonable prices. Cross-Section posted:I’m just going off the recommended specs on the project pages for all the new GPUs. I might consider downgrading to a lower wattage if literally everything else (GPU and CPU included) comes in for this build, but right now I might as well wait to see if stock shows up. Quite fair. It just seems that a lot of people are still sticking with some pretty outdated rules of thumb, like not running it past 70% load or whatever, and way over-buying on capacity. I mean, if it's the same price and all, then sure, I'd go for the bigger one every time. But if it's a question of being able to build it at all or not, I wouldn't worry about it. DrDork fucked around with this message at 19:43 on Nov 2, 2020 |
# ? Nov 2, 2020 19:41 |
|
Cross-Section posted:I’m just going off the recommended specs on the project pages for all the new GPUs. GPU recommended specs have to account for people with lovely low-efficiency PSUs that struggle to put out the watts they claim on the label. High quality Gold-or-better units are fairly comfortable delivering their rated watts, and when they put "650" on the box they mean 650 of real 12V, not 580 of 12V and 70 of 5 & 3V. So good 650W PSUs should generally be OK for a 3080 + Ryzen, if the 3080 is not OCed or one of the maximum pre-OCed units. Maxing both a 3080 and a 5950X out with OCs could get a bit dicey, depending on the rest of the system. OTOH if this is a SFX system, the idea of putting 600W of heat inside a shoebox-size container is an interesting challenge itself. You also have to look at things like whether your case mounts the PSU drawing fresh outside air (like most regular desktops these days) or if the PSU is using internal air to assist with airflow. Internal air in a SFX PC can make the PSU get quite warm since it's eating hot air, and that calls for more headroom.
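Back-of-the-envelope, that power budget can be sketched like this (the wattage figures are ballpark assumptions pulled from the thread, not measurements):

```python
# Rough PSU headroom check for a hypothetical 3080 + Ryzen build.
# All draw figures are illustrative assumptions, not measured numbers.
loads_w = {
    "gpu": 400,   # heavily OC'ed 3080, worst case
    "cpu": 100,   # 5900X under all-core load
    "rest": 50,   # motherboard, RAM, fans, drives
}
psu_rating_w = 650  # quality Gold unit: label rating ~= usable 12V capacity

total = sum(loads_w.values())
headroom = psu_rating_w - total
print(f"total draw ~{total} W, headroom ~{headroom} W "
      f"({headroom / psu_rating_w:.0%} of rating)")
```

Even the worst case leaves about 100W spare on a quality 650W unit, which is where the "a good 650 should generally be OK" conclusion comes from.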
|
# ? Nov 2, 2020 22:09 |
|
Klyith posted:OTOH if this is a SFX system, the idea of putting 600W of heat inside a shoebox size container is an interesting challenge itself. You also have to look at things like whether your case mounts the PSU drawing fresh outside air (like most regular desktops these days) or if the PSU is using internal air to assist with airflow. Internal air in a SFX PC can make the PSU get quite warm since it's eating hot air, and that calls for This is the way.
|
# ? Nov 2, 2020 22:12 |
|
Blorange posted:The 5600x will be $300 because they can charge that much for it. It's looking like ~30% more performance for 50% more money. SiSoftware found performance to be ~40% better than the 3600X when reviewing the 5600X. And you are dead wrong about the price being "because they can charge that much for it." AMD is following the same pricing strategy for the 5000 series as Zen 2. Don't assume AMD has the same business model as Intel. Intel maintains a gross margin of around 65% on their consumer CPUs, while AMD forecasts gross margins on Zen 3 consumer CPUs of roughly ~45%, which is where they have been for a while with Zen. Meaning AMD intends to price Zen 3 at the same level of per-unit profit as they received with Zen 2. Zen 3 has a lot of R&D to recoup, and is likely to be a more expensive product to produce due to factors such as TSMC capacity being at record-high demand. They could charge a lot more than $299 for this processor. AMD buying Xilinx in an all-stock deal is a pretty clever move. Prior to the acquisition, AMD had a free cash flow of ~$150 million over the last year, but the combined companies will have nearly $2 billion in free cash flow over the next 12 months.
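To put rough numbers on that margin comparison (the price and margin percentages are the ones from the post; the formula is just the standard gross-margin definition):

```python
# Gross margin = (price - cost) / price, so the implied cost of goods
# at a given price and target margin is price * (1 - margin).
def implied_cost(price: float, gross_margin: float) -> float:
    return price * (1.0 - gross_margin)

price = 299.0        # 5600X launch price
amd_margin = 0.45    # AMD's rough consumer-CPU gross margin (per the post)
intel_margin = 0.65  # Intel's rough consumer-CPU gross margin (per the post)

cost = implied_cost(price, amd_margin)
intel_style_price = cost / (1.0 - intel_margin)
print(f"implied cost at 45% margin: ~${cost:.2f}")
print(f"price needed at 65% margin on the same cost: ~${intel_style_price:.2f}")
```

In other words, an Intel-style margin on the same implied cost would push the sticker price well past $450.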
|
# ? Nov 2, 2020 22:36 |
|
Cross-Section posted:Ah, drat. I guess I’m checking the production date on the Asrock B550m-ITX/AC I ordered when it comes in today, because I couldn’t find hide nor hair of a BIOS Flashback option on the product page. Took a few minutes of looking over this little baby of a board, but... yay I got a good mobo Klyith posted:GPU recommended specs have to account for people with lovely low efficiency PSUs that struggle to put out the watts they claim on the label. High quality gold or better units are fairly comfortable delivering their rated watts, and when they put "650" on the box they mean 650 or real 12V, not 580 of 12V and 70 of 5 & 3V. Good to know. In the end, I caved and got the Silverstone 700W; seems like reviews are good outside of the apparently-inflexible cabling. And yeah, I wasn't planning on OC outside of whatever factory settings may already be applied. As for the cooling aspect, I'm using a NR200 as the case for this build. The PSU mount appears to be out-facing, judging by the mesh (and the, well, mount itself). Cross-Section fucked around with this message at 22:53 on Nov 2, 2020 |
# ? Nov 2, 2020 22:50 |
|
Cross-Section posted:Good to know. In the end, I caved and got the Silverstone 700W; seems like reviews are good outside of the apparently-inflexible cabling. And yeah, I wasn't planning on OC outside of whatever factory settings may already be applied.
|
# ? Nov 2, 2020 22:57 |
|
GRECOROMANGRABASS posted:and is likely to be a more expensive product to produce due to factors such as TSMC capacity being at record high demand Ya know, that's a heck of a good point, and it's unfortunate that nobody's pointed that out so far with all the grouching about the price increases. It sucks that prices are higher for performance chips, but some of the blame has to be pointed at all the other companies that have failed to make a competitive process. (Remember when GlobalFoundries was part of the TSMC-Samsung-GF "not Intel" 7nm development coalition, and right before it was go time they decided the new fabs were so expensive they weren't worth building? )
|
# ? Nov 2, 2020 23:12 |
|
Klyith posted:(Remember when GlobalFoundries was part of the TSMC-Samsung-GF "not Intel" 7nm development coalition, and right before it was go time they decided the new fabs were so expensive they weren't worth building? ) It'd be interesting to see GloFo's books on that one. I mean, Intel hasn't figured out 10/7nm yet, either, but they've had some of the most profitable quarters in their history churning out poo poo on 14nm. Wonder if GloFo has had a similar experience where they've had enough orders to make outsized profits by leveraging mature (and thus very cost-efficient) processes rather than having to recoup the capital expenses of the new foundries.
|
# ? Nov 2, 2020 23:21 |
|
Yeah, the alliance between TSMC, Samsung, and GF was great, and who knows where we might be with that combined R&D collaboration if GF hadn't dropped the ball. Samsung's 8nm process has a transistor gate pitch equal to or very close to TSMC's 10nm process, so I get the impression Samsung isn't much further ahead of where they were when GF dropped out.
|
# ? Nov 2, 2020 23:29 |
|
Late-stage capitalism is probably the reason for GF doing what it did. They had short- and mid-term profits guaranteed from their 12 and 14nm process nodes, and with a 7nm process that's nearly ready for risk production and just needs a little love and attention to get up and running, they probably have the exit strategy of being acquired for a king's ransom long before 12nm is obsolete.
|
# ? Nov 2, 2020 23:42 |
|
Wish they'd release some AMD 5000 benchmarks before the 5th. Don't really feel like racing to add to cart when it's not clear what you're getting.
|
# ? Nov 3, 2020 02:06 |
|
terrorist ambulance posted:Wish they'd release some AMD 5000 bench marks before the 5th. Don't really feel like racing to add to cart when it's not clear what you're getting You can always return it.
|
# ? Nov 3, 2020 02:53 |
|
Does anyone know what time they'll be dropping? I've heard midnight, 6am, and 9am PST and I reeealllly don't want to rely on Discord to compete with thousands of nerds a la the 3080 launch (but it will probably be that way again)
|
# ? Nov 3, 2020 04:20 |
|
Knot My President! posted:Does anyone know what time they'll be dropping? I've heard midnight, 6am, and 9am PST and I reeealllly don't want to rely on Discord to compete with thousands of nerds a la the 3080 launch (but it will probably be that way again) Nope. The assumption is 9am EST like with some other launches, but they've not said anything official.
|
# ? Nov 3, 2020 04:25 |
|
GF doesn't need to chase the same aggressive node technology the other foundries are. It makes sense for logic/compute to chase further shrink, but not everything requires that. AMD demonstrated pretty aptly that it's more economic these days to not do everything on a monolithic die. IO has certain physical constraints that make older mature nodes just as good, at a much lower cost. It makes perfect business sense for GF to specialize their process and tools into things like IO. Isn't GF already producing AMD's IO chiplets?
|
# ? Nov 3, 2020 14:07 |
|
Beef posted:GF doesn't need to chase the same aggressive node technology the other foundries are. It makes sense for logic/compute to chase further shrink, but not everything requires that. AMD demonstrated pretty aptly that it's more economic these days to not do everything on a monolithic die. IO has certain physical constraints that make older mature nodes just as good, at a much lower cost. It makes perfect business sense for GF to specialize their process and tools into things like IO. Isn't GF already producing AMD's IO chiplets? They are, but IIRC AMD is still contractually obligated to purchase some quantity of wafers from them as part of the sale of their interest in the fab, so who knows if they would otherwise. That should run through 2024.
|
# ? Nov 3, 2020 14:37 |
|
A 3600 should be enough for anyone...
|
# ? Nov 3, 2020 14:40 |
|
etalian posted:A 3600 should be enough for anyone... I saw a video of a dude that paired an Athlon 200GE with an RTX 2080 and was still getting good 60+ FPS on a bunch of games at 4k. Is gaming performance really that heavily biased towards blowing wads of cash on your GPU, and the CPU can just be whatever?
|
# ? Nov 3, 2020 14:43 |
|
gradenko_2000 posted:I saw a video of a dude that paired an Athlon 200GE with an RTX 2080 and was still getting good 60+ FPS on a bunch of games at 4k Yes, with some exceptions (flight sims), but also if you want smooth frame-time delivery at high refresh rates, having a fast CPU becomes important. The bias for gaming computers has always been to spend more on the GPU, though.
|
# ? Nov 3, 2020 14:47 |
|
gradenko_2000 posted:I saw a video of a dude that paired an Athlon 200GE with an RTX 2080 and was still getting good 60+ FPS on a bunch of games at 4k Yup, pretty much. Benchmarking at higher resolutions that roast the GPU, like 4K, is GPU-limited, not CPU-limited. You tend to see bigger CPU differences only in older games like Counter-Strike that can go 200+ FPS, or at 1080p. It's also why going from a 3600 to a 3600XT doesn't yield big performance changes.
|
# ? Nov 3, 2020 14:49 |
|
gradenko_2000 posted:I saw a video of a dude that paired an Athlon 200GE with an RTX 2080 and was still getting good 60+ FPS on a bunch of games at 4k More or less. The better gaming CPUs are usually for people who want to run games at 120+ fps and also raise 1.0% & 0.1% lows, iirc. Or do software encoding for streaming while playing a game on the same machine.
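The CPU-vs-GPU bottleneck intuition from the last few replies can be sketched with the usual simple model: each frame costs some CPU time and some GPU time, and whichever is slower sets the frame rate (the millisecond figures below are invented for illustration, not benchmarks):

```python
# Simple bottleneck model: frame rate is limited by whichever of the
# CPU or GPU takes longer per frame. All times are illustrative only.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# At 4K the GPU dominates, so halving CPU time changes nothing:
print("4K, slow CPU:", fps(cpu_ms=8.0, gpu_ms=16.0))   # GPU-bound
print("4K, fast CPU:", fps(cpu_ms=4.0, gpu_ms=16.0))   # still GPU-bound

# At 1080p the GPU finishes quickly and the CPU becomes the limit:
print("1080p, slow CPU:", fps(cpu_ms=8.0, gpu_ms=4.0))  # CPU-bound
print("1080p, fast CPU:", fps(cpu_ms=4.0, gpu_ms=4.0))
```

That's why an Athlon 200GE can look fine at 4K while the same chip would choke trying to feed 144Hz at 1080p.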
|
# ? Nov 3, 2020 14:56 |
|
I'm running a 2500k at 4.3 and a 1070 GPU, and some newer games are unplayable at 1440p (144Hz, G-Sync). I'll go from 100+ fps to 2-3 second freezes randomly and tons of little stutters. Definitely looking forward to a CPU upgrade.
|
# ? Nov 3, 2020 15:01 |
|
Suburban Dad posted:I'm running a 2500k at 4.3 and a 1070 GPU and some newer games are unplayable at 1440p (144hz, gsync) . I'll go from 100+fps to 2-3 second freezes randomly and tons of little stutters. Definitely looking forward to a CPU upgrade. Part of that is because 4c/4t is actually becoming a bottleneck - you might not notice as much of that if you were on a 2600k instead. Not that you didn't get your money's worth and then some out of the 2500k.
|
# ? Nov 3, 2020 15:19 |
|
That CPU is 9 years old. I think we went through a very rare period where CPUs 7-9 years old were incredibly viable and offered tons of longevity. This "generation" has seen games really start to utilize multithreading as well, which is why the old i7s sort of held up, and why those CPUs are sputtering now. I'd almost say that gaming at 1080p is a "solved" problem; I'd never upgrade my CPU if I wasn't trying to push 100 frames constantly at 1440p, and FreeSync takes care of the jitters.
|
# ? Nov 3, 2020 15:22 |
|
You need at least two cores to run the DRM in Ubisoft games.
|
# ? Nov 3, 2020 15:23 |
|
VorpalFish posted:Part of that is because 4c/4t is actually becoming a bottleneck - you might not notice as much of that of you were on a 2600k instead. Not complaining in the least. It's served me really well.
|
# ? Nov 3, 2020 15:34 |
|
Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running a 3700x with slightly lower clock. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they are using all that power to target 30FPS still. Watch Dogs Legion is probably a taste of the future.
|
# ? Nov 3, 2020 15:59 |
|
pyrotek posted:Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running a 3700x with slightly lower clock. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they are using all that power to target 30FPS still. If they're going to target 30 fps it's going to be for graphics reasons. Moreover, they're not going to target 30 fps. On a per-core basis an OC'd Sandy Bridge can keep up with stock Skylake; it's the lack of cores that'll do in older processors, not per-core perf. E:vvvvv also that. Especially for ubi games. Fantastic Foreskin fucked around with this message at 16:20 on Nov 3, 2020 |
# ? Nov 3, 2020 16:18 |
|
pyrotek posted:Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running a 3700x with slightly lower clock. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they are using all that power to target 30FPS still. Halfassed DRM that crashes every couple of minutes and poor use of ray tracing? ...I think you’re right
|
# ? Nov 3, 2020 16:19 |
|
pyrotek posted:Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running a 3700x with slightly lower clock. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they are using all that power to target 30FPS still. There's a lot of buzz about 4k60 and 120Hz gaming on this gen, I very much doubt they are going to target 30 on any resolution.
|
# ? Nov 3, 2020 17:15 |
|
pixaal posted:There's a lot of buzz about 4k60 and 120Hz gaming on this gen, I very much doubt they are going to target 30 on any resolution. I don't think that is going to happen. To the point I feel comfortable citing any off-the-rack gaming article, because MS and Sony themselves are already pushing those goalposts WAY back https://www.windowscentral.com/xbox-series-x-and-ps5-may-not-offer-massive-leap-visuals-initially Most PS5 demo footage was rendering at 1440p 30fps, and the Xbox Series X is already missing 60fps targets on cross-gen games. As always there will be specific games/developers/progress on this front, and launch games are always a bit janky, but you'd expect a cross-gen game to just be blown away by the hardware...
|
# ? Nov 3, 2020 17:30 |
|
mdxi posted:I've ordered a B550 mobo (Gigabyte DS3H) and 32G of DDR4-3600 memory, so whenever I can get my hands on a 5900X, I'll be ready to do an upgrade and some testing. I'll (briefly) have Zen, Zen+, Zen 2 and Zen 3 CPUs all in-house. For fun, I'll probably run the same set of benchmarks across all of them. Yes please!
|
# ? Nov 3, 2020 18:06 |
|
pyrotek posted:Consoles before the PS5 generation had bad CPUs compared to computers. Now they'll be running a 3700x with slightly lower clock. CPU requirements will probably shoot way up in a year or two when most games are targeting the performance level of the new consoles, especially if they are using all that power to target 30FPS still. Yeah, plus AMD is really making a killing by successfully pitching themselves as the high-performance GPU/CPU provider for the next PS and Xbox refreshes. Even though when they come out, the CPU side will be the Zen 2 architecture, while PC CPUs are getting Zen 3 this winter.
|
# ? Nov 3, 2020 21:10 |
|
etalian posted:Yeah plus AMD is really making a killing by successfully pitching themselves as the console high performance GPU/CPU provide for the next PS and Xbox refreshes. Which is still plenty of CPU for games, especially because the consoles are a more controlled environment. This round of consoles feels way less behind at launch than the PS4 / XBone did.
|
# ? Nov 3, 2020 21:16 |
|
Twerk from Home posted:Which is still plenty of CPU for games, especially because the consoles are a more controlled environment. This round of consoles feels way less behind at launch than the PS4 / XBone did. Well it was a much more crafty strategy by AMD focusing on performance instead of being seen as the budget rated CPU company. Also getting the contracts for both Xbox and PS5 hardware was also good move to going along with their more PC / server specific products.
|
# ? Nov 3, 2020 21:21 |
|
etalian posted:Well it was a much more crafty strategy by AMD focusing on performance instead of being seen as the budget rated CPU company. The rumors have always been that AMD doesn't actually make very much in terms of profits off the console SoCs, and were chosen repeatedly because they're the only ones able to offer a single-SoC solution with reasonable enough CPU + GPU performance and were willing to sell them super cheap. It's what more or less kept the lights on over at AMD HQ for a while, and apparently heavily funded AMD's GPU R&D for years. I mean, AMD wasn't playing the budget role because they were "crafty" and wanted to--they were playing it because their arch was pretty crap for a long time, and they got dunked on by Intel both in terms of arch performance and in terms of fabs because Intel was ahead of everyone else. If Intel hadn't hosed up 10nm so badly, things might be considerably different now, but instead Intel is still on a node from 2014. It took Intel almost standing still for 4-6 years for AMD to catch up (depending on what workload you're looking at). AMD taking punches at the performance level is something Intel could absolutely have avoided...but they didn't. On the GPU side of the house they played the budget fiddle because they didn't have the capital to really do much more than warmed over refreshes of Island-series arch for years, and their few forays into new spaces bombed horribly (Radeon VII, HBM, etc). The console contracts in that sense were very good moves since they pretty much bankrolled the development of RDNA into an actually decent arch. They still might not be looking so great had NVidia gone with TSMC instead of Samsung and had a 10-20% performance bump from that--then again who knows what that would have done for Ampere's pricing. 
Anyhow, at least now we might get some real competition, which will be good for consumers on the pricing points, but might not be so great for consumers on the "everyone rushing to build their own walled garden" side of things. DrDork fucked around with this message at 21:46 on Nov 3, 2020 |
# ? Nov 3, 2020 21:43 |
|
Beef posted:GF doesn't need to chase the same aggressive node technology the other foundries are. It makes sense for logic/compute to chase further shrink, but not everything requires that. AMD demonstrated pretty aptly that it's more economic these days to not do everything on a monolithic die. IO has certain physical constraints that make older mature nodes just as good, at a much lower cost. It makes perfect business sense for GF to specialize their process and tools into things like IO. Isn't GF already producing AMD's IO chiplets? If they don't bother moving to newer nodes at some point, they'll sink into the poo poo tier of fabs and be forced to compete with all the other producers who make chips for toasters because their tech is outdated. If GloFo had made a stab at 7nm (even if it wasn't very good), they would have a ready customer in Nvidia right now.
|
# ? Nov 3, 2020 22:19 |
|
ConanTheLibrarian posted:If they don't bother moving to newer nodes at some point, they'll sink into the poo poo tier of fabs and be forced to compete with all the other producers who make chips for toasters because their tech is outdated. I thought they were moving to a smaller node with their upcoming Rocket Lake, but no, they are still on 14nm nodes in 2021
|
# ? Nov 3, 2020 22:37 |