Dr. Video Games 0031
Jul 17, 2004

hobbesmaster posted:

While 140W is the default, it can go much, much higher. The “can’t draw twice the power” claim isn’t actually true. With a good board and uncapped PBO settings, 5950X users report 240W in prime95 small FFT, which should give an absolute upper bound. My 5800X3D can hit 120W in the same test. I guess technically the CCD power is more than double, because the IO die is like max 20W on both?

However, games use a smaller number of threads, and if those threads are spread across two CCDs, cooling is easier for the same power. That’s part of why the 2-CCD CPUs can realistically stay above 5GHz.

He is right in that the 5950X didn't use 2x the power of the 5800X. The 5800X could use a lot more power than 120W in stress tests with PBO enabled.
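
To put rough numbers on that, here's a quick back-of-envelope sketch using the figures from the quoted post (user reports, not measured specs):

code:
# Back-of-envelope per-CCD power using the figures from the quoted post.
# All inputs are rough user reports, not measured specs.
IO_DIE_W = 20          # "the io die is like max 20W on both"

pkg_5950x = 240        # reported 5950X draw, uncapped PBO, prime95 small FFT
pkg_5800x3d = 120      # reported 5800X3D draw in the same test

ccd_5950x = pkg_5950x - IO_DIE_W      # 220W spread across two CCDs
ccd_5800x3d = pkg_5800x3d - IO_DIE_W  # 100W in the one CCD

print(ccd_5950x, 2 * ccd_5800x3d)  # 220 vs 200: the CCD power really is more than double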

The 5800X3D is different though due to the lack of PBO or any kind of power manipulation or overclocking.

Anyway, if your primary use case is gaming, then the 5800X3D makes more sense than the 5950X. It's just the better gaming chip, and it's not even close.


kliras
Mar 27, 2021
speaking of the 5800X3D, anyone else see their cpu go to 70c on boot and stay there for a bit? i thought i was doomed to go through this forever, as it kept happening a bit later too, but apparently that was some weird windows background services that probably decided to run because the computer had been turned off for over a week while i was setting up the new case. had to take off one of the d15 fans to fit the o11d evo, but still pretty weird

maybe i just didn't hear the fans before in my older case whereas i got this one to make it easier to observe everything that was going on - for better and for worse. makes you a bit more paranoid when you can hear all the hysteresis

cinebench performs better than before, but i'll have to check 3dmark tomorrow to compare

hobbesmaster
Jan 28, 2008

Despite people saying the big air tower coolers are equivalent to 240mm AIOs, an AIO’s “soak time” is longer than a few seconds, which helps a bit with brief loads like that.

The regular 5800X immediately runs hot like that too; I guess there’s just too much going on in the one CCD.

FuturePastNow
May 19, 2014


It's a lot of heat in a very small spot. I've also heard the "direct contact" coolers that got so popular might not be the best with CPUs like the 5800, because the hot spot is so small it could be touching only one of the heatpipes.

Klyith
Aug 3, 2007

GBS Pledge Week

FuturePastNow posted:

I've also heard the "direct contact" coolers that got so popular might not be the best with CPUs like the 5800, because the hot spot is so small it could be touching only one of the heatpipes.

This is true but if kliras has a D15 that's not a problem.


hobbesmaster posted:

Despite people saying the big air tower coolers are equivalent to 240mm AIOs, an AIO’s “soak time” is longer than a few seconds, which helps a bit with brief loads like that.

A D15 isn't gonna be thermally soaked by a couple cores of load from a windows boot or update.


I think this is just the same pattern of observed high CPU temperature that we've seen on all 7nm chips, just one step worse. Temps with light workloads are very high because the CPU can boost a core to the max. One core is so much heat in so small a spot that the thermal conduction of the silicon itself is a limiting factor. Spinning up fans doesn't really do much because it isn't the fin stack that's the problem. It's a <60W load, that's nothing.
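
For a rough sense of scale, a back-of-envelope sketch; every number here is a ballpark assumption for illustration, not a measured value:

code:
# Rough heat-flux comparison. Die/core areas and the single-core boost power
# are assumptions for illustration, not measured values.
ccd_area_mm2 = 80     # assumed Zen 3 CCD die area, roughly
core_area_mm2 = 4     # assumed area of a single core, roughly

all_core_w = 120      # assumed CCD power under a heavy all-core load
one_core_w = 18       # assumed power of one core at max single-core boost

print(all_core_w / ccd_area_mm2)    # ~1.5 W/mm^2 spread across the whole CCD
print(one_core_w / core_area_mm2)   # ~4.5 W/mm^2 dumped into one tiny spot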

The 3000 series was this way, and "is my CPU hot" was a super common question. The 5000s did the same but could boost even higher. And the X3D adds a slab of silicon that makes it even worse.

Water cooling does "help", because modern waterblocks get the water super close to the CPU. The skived plate is only a few mm thick. Fastest possible evacuation of heat from the core.

But alternatively, you can realize that the problem isn't really a problem. 70C just isn't as hot as it used to be. If you have a good cooler that you know is mounted well (because all-core prime95 temps are low), just raise your fan curves and set longer step-up/down delays. Hell, on a D15 + 5800X3D pair, set it to a low static fan speed. It has loads more capacity than the CPU can generate.
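
As a minimal sketch of what "longer step-up/down" means in practice (the sensor and PWM calls here are hypothetical stand-ins for whatever your fan software actually exposes):

code:
import time

# Minimal sketch of a slow-responding fan curve: smooth the temperature
# reading so brief single-core boost spikes don't spin the fans up.

SMOOTHING = 0.05  # smaller = slower response, i.e. longer "step-up/down"

def read_cpu_temp():
    return 65.0  # hypothetical stand-in for your platform's sensor API

def set_fan_pwm(percent):
    print(f"fan -> {percent}%")  # hypothetical stand-in for your PWM interface

def fan_percent(temp_c):
    if temp_c < 70:
        return 30  # low static speed; a D15 has headroom for this
    return min(100, 30 + int((temp_c - 70) * 7))  # ramp above 70C

smoothed = read_cpu_temp()
while True:
    smoothed += SMOOTHING * (read_cpu_temp() - smoothed)  # exponential average
    set_fan_pwm(fan_percent(smoothed))
    time.sleep(5)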

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
My last CPU that didn't immediately go to >65C on a moderate load was the 2500K.

I have also set my fan curves for my 5600X and 5600 to be zero until 75C.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Leaked slide confirms AMD Mendocino RDNA2 graphics specifications with two Compute Units

Oofta. Is Intel still using last-gen HD graphics at the low end?

Inept
Jul 8, 2003


2 CU is going to be slow. Even the universally panned 6500XT has 16 CU

Dr. Video Games 0031
Jul 17, 2004

Not sure what you were expecting out of a chromebook chip

Inept
Jul 8, 2003

I was more replying to the implication that Intel was somehow going to be far behind at the low end.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Turning on Progress Quest's ray-tracing mode is gonna own on the Mendocino chips

Blorange
Jan 31, 2007

A wizard did it

gradenko_2000 posted:

Turning on Progress Quest's ray-tracing mode is gonna own on the Mendocino chips

:allears: Please tell me this is real and you get periodic updates on what Ray is up to.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Frames per second? More like seconds per frame

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Hell, at just 2 CU the dual channel DDR5 is a huge waste. Might as well have gone with single channel DDR4 to really cut costs.

Dr. Video Games 0031
Jul 17, 2004

The DDR5 support is probably to "future-proof" it. This seems like one of those chips that's going to stick around and be sold to OEMs for an annoyingly long time, past the point where everyone has already switched to DDR5. It hopefully won't take long for low-end DDR5 to reach price parity with DDR4 anyway.

shrike82
Jun 11, 2005

what's the state of emulation and CPUs? with xenoblade coming out in July, i might want to try out emulation given how bad performance is on the switch these days

are there CPU-specific enhancements on modern emulators? do they leverage the GPU at all?

Dr. Video Games 0031
Jul 17, 2004

shrike82 posted:

what's the state of emulation and CPUs? with xenoblade coming out in July, i might want to try out emulation given how bad performance is on the switch these days

are there CPU-specific enhancements on modern emulators? do they leverage the GPU at all?

Some emulators make use of AVX-512 instructions for a large speed-up, with RPCS3 getting the most benefit there. Yuzu also uses AVX-512, but it seems to receive a much smaller boost from it. This is largely a moot point since you can't buy any modern CPUs with AVX-512 support anymore. You could kinda get it with Intel 12th gen through some BIOS fuckery, but Intel started physically modifying the chips so they don't support those instructions anymore.
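
If you want to check what your own CPU exposes, here's a quick sketch using the third-party py-cpuinfo package (pip install py-cpuinfo; the flag names follow py-cpuinfo's lowercase convention):

code:
# Check for the AVX-512 subsets emulators tend to lean on.
# Requires the third-party py-cpuinfo package: pip install py-cpuinfo
import cpuinfo

flags = set(cpuinfo.get_cpu_info().get("flags", []))
for subset in ("avx512f", "avx512vl", "avx512dq", "avx512bw"):
    print(subset, "yes" if subset in flags else "no")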

Yuzu reportedly benefits from the 5800X3D's extra cache by a lot. I'm seeing reports of a 40% speedup over the 5800X.

shrike82
Jun 11, 2005

Dr. Video Games 0031 posted:

Yuzu reportedly benefits from the 5800X3D's extra cache by a lot. I'm seeing reports of a 40% speedup over the 5800X.

:drat:

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

This is largely a moot point since you can't buy any modern CPUs with AVX-512 support anymore

Zen 4 should have it, though. AMD is being cagey about confirming which instructions it has besides the AI ones, but leaks have shown it has very good coverage.

Wikichip on Zen4 posted:

AVX-512 - 512-bit Vector Instructions

AVX512F - Foundation (first introduced with Intel Skylake)
AVX512CD - Conflict Detection Instructions (Skylake X)
AVX512VL - Vector Length Extensions (Skylake X)
AVX512DQ - Doubleword and Quadword Instructions (Skylake X)
AVX512BW - Byte and Word Instructions (Skylake X)
AVX512 IFMA - Integer Fused Multiply-Add (Cannon Lake)
AVX512 VBMI - Vector Bit Manipulation Instructions (Cannon Lake)
AVX512 VPOPCNTDQ - Vector Population Count Instruction (Ice Lake)
AVX512 BITALG - Bit Algorithms (Ice Lake)
AVX512 VBMI2 - Vector Bit Manipulation Instructions 2 (Ice Lake)
AVX512 VNNI - Vector Neural Network Instructions (Ice Lake)
AVX512 BF16 - BFloat16 Instructions (Cooper Lake)

Not supported: AVX512ER, AVX512PF (Knights Landing); AVX512 4VNNIW, 4FMAPS (Knights Mill); VP2INTERSECT (Tiger Lake)

Ignoring the stuff that Intel only ever shipped on accelerators, it's only missing VP2INTERSECT compared to Intel's latest implementation (that you can't buy lol)

repiv fucked around with this message at 01:19 on May 31, 2022

Dr. Video Games 0031
Jul 17, 2004

I'm getting very little corroboration for Yuzu getting a large boost from the 5800X3D. Someone posted screenshots of Xenoblade 2 with a large framerate uplift, and someone else on reddit mentioned a gain, but that's it. It seems to be a very undertested scenario.


edit: this comparison is also incomplete without showing what an Intel chip can do.

Dr. Video Games 0031 fucked around with this message at 01:30 on May 31, 2022

Truga
May 4, 2014
Lipstick Apathy
not surprising the emulator performs well when you can fit the entire game into cpu cache :v:

Anime Schoolgirl
Nov 28, 2002

Dr. Video Games 0031 posted:

The DDR5 support is probably to "future-proof" it. This seems like one of those chips that's going to stick around and be sold to OEMs for an annoyingly long time, past the point where everyone has already switched to DDR5. It hopefully won't take long for low-end DDR5 to reach price parity with DDR4 anyway.

I desperately want the punchline of Mendocino to be that it shows up as an Athlon 5000G to provide an SKU for the very bottom end of AM5, which is fast approaching "premium-only platform" status

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Anime Schoolgirl posted:

I desperately want the punchline of Mendocino to be that it shows up as an Athlon 5000G to provide an SKU for the very bottom end of AM5, which is fast approaching "premium-only platform" status

man, that'd be rough. Even the Athlon had Vega 3 [three compute units]

SwissArmyDruid
Feb 14, 2014

by sebmojo

gradenko_2000 posted:

man, that'd be rough. Even the Athlon had Vega 3 [three compute units]

To be fair, these are RDNA2 CUs. Not loving GCN.

God gently caress, I am so goddamn happy we are done with GCN forever.

Truga
May 4, 2014
Lipstick Apathy
yeah now we'll be stuck with rdna for a decade lmao

Don Dongington
Sep 27, 2005

#ideasboom
College Slice
The concept of even being able to emulate a current-gen (even Nintendo) system is insane to me. My emulation box is an HP SFF desktop with a Haswell 4770, and it struggles with PS2 stuff, can't handle Wii U, and occasionally chokes on N64 games.

I feel like grabbing a second-hand Wii U and homebrewing it is gonna be a better option than trying to emulate more recent Nintendo stuff, given that a heap of Switch launch titles were apparently ported straight from that system anyway?

NewFatMike
Jun 11, 2015

Dude, the Wii U came out one year before Haswell; of course it's having trouble emulating that.

Prescription Combs
Apr 20, 2005

AMD's Robert Hallock and Frank Azor tried hard to set expectations for Mendocino's built-in graphics in the podcast/vid from the other Videocardz article you posted a few pages back.

Here's the vid where they talk about it; the entire thing is pretty drat good:

https://www.youtube.com/watch?v=_s4w49nAgVs

tl;dr market saturation and providing a product that the market, not us nerds, wants. Other APUs will have much more powerful integrated graphics.

NewFatMike
Jun 11, 2015

I’ve been Very Annoyed ™️ by the Google hardware experience for the last 3 years but a Zen Chrome tablet tickles some very nerdy neurons.

Dr. Video Games 0031
Jul 17, 2004

This article is a pretty good breakdown of the IO capabilities of Zen 4 and the new chipsets: https://www.techpowerup.com/295394/amd-zen-4-socket-am5-explained-pcie-lanes-chipsets-connectivity

In short, a Ryzen 7000 CPU in an X670 motherboard will have a total of 24 usable PCIe Gen 5 lanes, 12 usable Gen 4 lanes, and 8 usable Gen 3 lanes/SATA connections. In comparison, an Alder Lake CPU in a Z690 motherboard has a total of 16 usable PCIe Gen 5 lanes, 16 usable Gen 4 lanes, and 16 usable Gen 3 lanes (not shared with SATA). Technically AMD has more total PCIe bandwidth here due to having 8 more Gen 5 lanes. In either case, it's more than you'd ever practically need.
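
As a rough sketch of the aggregate math, using the usual published per-lane approximations (GB/s per direction, after encoding overhead):

code:
# Rough total PCIe bandwidth per platform, GB/s per direction.
# Per-lane figures are the usual published approximations.
GBPS_PER_LANE = {3: 0.985, 4: 1.969, 5: 3.938}

amd = 24 * GBPS_PER_LANE[5] + 12 * GBPS_PER_LANE[4] + 8 * GBPS_PER_LANE[3]
intel = 16 * GBPS_PER_LANE[5] + 16 * GBPS_PER_LANE[4] + 16 * GBPS_PER_LANE[3]

print(round(amd), round(intel))  # ~126 vs ~110: AMD's 8 extra Gen 5 lanes win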

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Dr. Video Games 0031 posted:

This article is a pretty good breakdown of the IO capabilities of Zen 4 and the new chipsets: https://www.techpowerup.com/295394/amd-zen-4-socket-am5-explained-pcie-lanes-chipsets-connectivity

In short, a Ryzen 7000 CPU in an X670 motherboard will have a total of 24 usable PCIe Gen 5 lanes, 12 usable Gen 4 lanes, and 8 usable Gen 3 lanes/SATA connections. In comparison, an Alder Lake CPU in a Z690 motherboard has a total of 16 usable PCIe Gen 5 lanes, 16 usable Gen 4 lanes, and 16 usable Gen 3 lanes (not shared with SATA). Technically AMD has more total PCIe bandwidth here due to having 8 more Gen 5 lanes. In either case, it's more than you'd ever practically need.

cool

where's my next $80 mobo thank u very much

SwissArmyDruid
Feb 14, 2014

by sebmojo

Prescription Combs posted:

AMD's Robert Hallock and Frank Azor tried hard to set expectations for Mendocino's built-in graphics in the podcast/vid from the other Videocardz article you posted a few pages back.

Here's the vid where they talk about it; the entire thing is pretty drat good:

https://www.youtube.com/watch?v=_s4w49nAgVs

tl;dr market saturation and providing a product that the market, not us nerds, wants. Other APUs will have much more powerful integrated graphics.

well maybe the market should want better poo poo. =E

I am still mad about my Raven Ridge experience.

movax
Aug 30, 2008

Dr. Video Games 0031 posted:

This article is a pretty good breakdown of the IO capabilities of Zen 4 and the new chipsets: https://www.techpowerup.com/295394/amd-zen-4-socket-am5-explained-pcie-lanes-chipsets-connectivity

In short, a Ryzen 7000 CPU in an X670 motherboard will have a total of 24 usable PCIe Gen 5 lanes, 12 usable Gen 4 lanes, and 8 usable Gen 3 lanes/SATA connections. In comparison, an Alder Lake CPU in a Z690 motherboard has a total of 16 usable PCIe Gen 5 lanes, 16 usable Gen 4 lanes, and 16 usable Gen 3 lanes (not shared with SATA). Technically AMD has more total PCIe bandwidth here due to having 8 more Gen 5 lanes. In either case, it's more than you'd ever practically need.

With Gen 5, and people designing devices around Gen 5 lane widths (or using bandwidth bridges), that lane count is beyond fantastic on either platform (e.g., an x2 Gen 5 link = x4 Gen 4 = x8 Gen 3). Even with the 'usual' widths:

x16 PEG
x4 NVMe
x4 NVMe
x2 Wi-Fi

Tons of lanes left. Most consumer 10Gb MACs are still PCIe 3.0, I'd assume. The server space will be wild -- an x1 Gen 5 link on its own is a stupid amount of bandwidth for most devices.
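
A quick sketch of that equivalence, using the usual published per-lane approximations (GB/s per direction):

code:
# Per-lane PCIe throughput roughly doubles each generation, so narrow Gen 5
# links match wider older ones.
GBPS_PER_LANE = {3: 0.985, 4: 1.969, 5: 3.938}

def link_gbps(gen, width):
    return GBPS_PER_LANE[gen] * width

print(link_gbps(5, 2), link_gbps(4, 4), link_gbps(3, 8))
# -> 7.876 7.876 7.88 GB/s each way: x2 Gen 5 ~= x4 Gen 4 ~= x8 Gen 3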

FuturePastNow
May 19, 2014


20Gbps USB ports on an x1 card

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
AMD APUs with RDNA2 integrated graphics only got their first release back in January. Mendocino is just a lowest-end continuation of that.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Asrock announced X670 boards. Figured I'd check out the Taichi, since I fared well with that series. They put loving Killer network stuff onto the board. Goddamnit.

--edit: Wait, looking it up, I end up at Intel Ark. Did they buy it, or just co-opt that name?! :confused:

CaptainSarcastic
Jul 6, 2013



Combat Pretzel posted:

Asrock announced X670 boards. Figured I'd check out the Taichi, since I fared well with that series. They put loving Killer network stuff onto the board. Goddamnit.

--edit: Wait, looking it up, I end up at Intel Ark. Did they buy it, or just co-opt that name?! :confused:

Wait, where are you seeing anything about Intel Ark?

repiv
Aug 13, 2009

Intel bought Killer two years ago; at this point I'd assume their newer products are just Intel NICs with gamerified drivers.

Dunno if you can use the generic Intel drivers with them, though.

Dr. Video Games 0031
Jul 17, 2004

Combat Pretzel posted:

Asrock announced X670 boards. Figured I'd check out the Taichi, since I fared well with that series. They put loving Killer network stuff onto the board. Goddamnit.

--edit: Wait, looking it up, I end up at Intel Ark. Did they buy it, or just co-opt that name?! :confused:



i am way more bothered than i should be by the location of the top x16 slot. msi is doing this too and i hate it


SwissArmyDruid
Feb 14, 2014

by sebmojo

Combat Pretzel posted:

Asrock announced X670 boards. Figured I'd check out the Taichi, since I fared well with that series. They put loving Killer network stuff onto the board. Goddamnit.

--edit: Wait, looking it up, I end up at Intel Ark. Did they buy it, or just co-opt that name?! :confused:

Yeah, that's one of those stories that got lost in the middle of COVID. Intel purchased Rivet Networks back in 2020, and now owns the rights to Killer.
