pmchem
Jan 22, 2010



can someone explain this? what does this mean? will a 5600G be better performance/benchmark-wise than a 5600X at stock settings?

also, what's the latest rumor mill regarding AMD chipsets? any new chipsets on the horizon in the next 6 months?

SamDabbers
May 26, 2003



pmchem posted:

can someone explain this? what does this mean? will a 5600G be better performance/benchmark-wise than a 5600X at stock settings?

also, what's the latest rumor mill regarding AMD chipsets? any new chipsets on the horizon in the next 6 months?

Imagine a 5600X but with integrated graphics.

ufarn
May 30, 2009

SamDabbers posted:

Imagine a 5600X but with integrated graphics.
Will those be usable with a dedicated GPU like Intel's, or is it (still?) mutually exclusive?

SamDabbers
May 26, 2003



ufarn posted:

Will those be usable with a dedicated GPU like Intel's, or is it (still?) mutually exclusive?

You should still be able to use a dGPU but some of the PCIe lanes will be used by the iGPU, if it's anything like previous Zen processors with integrated graphics.

Dramicus
Mar 26, 2010
Grimey Drawer

SamDabbers posted:

You should still be able to use a dGPU but some of the PCIe lanes will be used by the iGPU, if it's anything like previous Zen processors with integrated graphics.

Yeah, it's mostly a case of get the 5600X if you ever plan to use a dGPU, get the 5600G if you don't. You could totally use a dGPU with the 5600G, but it will most likely be running at x8 instead of x16.

FuturePastNow
May 19, 2014


Yeah, all of AMD's Zen APUs so far have only had 8 PCIe lanes for the graphics slot, which is not a problem unless you're using a truly high-end card of the sort few people will ever pair with an APU. But it's one of the compromises they make to stick some pretty OK graphics in there.

repiv
Aug 13, 2009

Should be less of an issue now with PCIe4 though, provided you pair it with an Ampere or Big Navi card?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

repiv posted:

Should be less of an issue now with PCIe4 though, provided you pair it with an Ampere or Big Navi card?

It's not really an issue even on PCIe 3 unless you're gaming with a fast CPU, high-end GPU, and a 1080p monitor. Otherwise it's like a 1-3% difference between PCIe 4.0 x16 and PCIe 3.0 x8 in most games.
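
For anyone who wants the raw numbers behind that: a quick back-of-the-envelope sketch (my own arithmetic, not from the thread; the transfer rates and encoding overheads are the published PCIe spec values):

```python
# Theoretical one-way PCIe bandwidth: transfer rate per lane (GT/s),
# scaled by line-encoding efficiency, times the lane count.
GENS = {
    3: (8.0, 128 / 130),   # PCIe 3.x: 8 GT/s per lane, 128b/130b encoding
    4: (16.0, 128 / 130),  # PCIe 4.x: 16 GT/s per lane, 128b/130b encoding
}

def pcie_gb_per_s(gen: int, lanes: int) -> float:
    """Peak one-direction bandwidth in GB/s for a given link."""
    gt_per_s, efficiency = GENS[gen]
    return gt_per_s * efficiency * lanes / 8  # bits -> bytes

print(f"PCIe 3.0 x8:  {pcie_gb_per_s(3, 8):.1f} GB/s")   # ~7.9 GB/s
print(f"PCIe 3.0 x16: {pcie_gb_per_s(3, 16):.1f} GB/s")  # ~15.8 GB/s
print(f"PCIe 4.0 x16: {pcie_gb_per_s(4, 16):.1f} GB/s")  # ~31.5 GB/s
```

Even x8 on PCIe 3 is nearly 8 GB/s each way, which is why the gap only shows up when a fast card is pushing very high frame rates.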

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

I'm excited as hell. This in an ASRock DeskMini would make for an incredible tiny workstation.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

Should be less of an issue now with PCIe4 though, provided you pair it with an Ampere or Big Navi card?

Does Cezanne finally add PCIe 4? Renoir didn't have it, unlike the rest of the Zen 2 parts.

Cygni
Nov 12, 2005

raring to post

Paul MaudDib posted:

Does Cezanne finally add PCIe 4? Renoir didn't have it, unlike the rest of the Zen 2 parts.

The leaked roadmap from earlier this year suggests it doesn't have PCIe 4. Rembrandt is the first APU with explicitly listed PCIe 4. Rembrandt is also AMD's first part with DDR5.

Arzachel
May 12, 2012

Cygni posted:

The leaked roadmap from earlier this year suggests it doesn't have PCIe 4. Rembrandt is the first APU with explicitly listed PCIe 4. Rembrandt is also AMD's first part with DDR5.

Could it be for power saving reasons since they're going to reuse the same design for laptops?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arzachel posted:

Could it be for power saving reasons since they're going to reuse the same design for laptops?

yeah definitely, the APUs are laptop-first designs that get socketed for the desktop market because it's easy to do. PCIe 4.0 (a) completely doesn't matter in laptops, and (b) is a giant power hog. there's a reason the only chipset with PCIe4 capability suddenly has a chipset fan.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Speaking of which, I wonder if I can force my X570 PCH down to PCIe 3 and nix the fan. Hmmmm

VorpalFish
Mar 22, 2007
reasonably awesometm

Paul MaudDib posted:

yeah definitely, the APUs are laptop-first designs that get socketed for the desktop market because it's easy to do. PCIe 4.0 (a) completely doesn't matter in laptops, and (b) is a giant power hog. there's a reason the only chipset with PCIe4 capability suddenly has a chipset fan.

I thought that reason was that they're repurposed I/O dies on GloFo 12nm, to help meet purchasing volumes on their agreement?

VorpalFish fucked around with this message at 22:47 on Dec 20, 2020

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I do also want to point out that the APUs have less cache than a "pure" CPU, since part of the die that would otherwise be cache is occupied by the iGPU, and that does make them worse than a CPU in most (all?) scenarios where you're already using a dedicated GPU.

I'm just excited for it personally since I like using APUs even with a dGPU for the "insurance" factor of getting to continue to use your computer even if the GPU dies (even if I also own more than one GPU anyway), but also since we never got a Zen 2-based desktop APU that's available over-the-counter.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
just did BIOS flashback on a B550E in preparation to install a 5900X
I was stressing for nothing. Super easy, new tech is great. Can't wait to play 4-year-old games at 300fps

CaptainSarcastic
Jul 6, 2013



Fauxtool posted:

just did BIOS flashback on a B550E in preparation to install a 5900X
I was stressing for nothing. Super easy, new tech is great. Can't wait to play 4-year-old games at 300fps

When I fired up Fallout 4 on my new 3600x it took me a few minutes to realize things were all weird because it was trying to run at like 1000fps. On my old Phenom II it struggled to even get to 30fps in busier areas. It was a :stare: moment for me having to throw a limit on the fps to make it run sanely.
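
(This is the classic symptom of an engine that advances its simulation a fixed step per rendered frame instead of scaling by real elapsed time. A minimal sketch of the failure mode, hypothetical code rather than anything from the actual engine:)

```python
# Hypothetical frame-tied simulation: the engine assumes each rendered
# frame represents exactly 1/60th of a second of game time.
ASSUMED_STEP = 1 / 60  # seconds of game time advanced per frame

def game_speed(fps: int) -> float:
    """Game-time seconds elapsed per wall-clock second at a given fps."""
    return fps * ASSUMED_STEP

for fps in (30, 60, 1000):
    print(f"{fps:4d} fps -> game runs at {game_speed(fps):.2f}x real speed")
# 30 fps -> 0.50x (slow motion), 60 fps -> 1.00x, 1000 fps -> 16.67x
```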

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
I tried to play Skyrim for the first time six years after it came out, on what was then a top-spec PC. I thought it was the most broken crap game until I turned on vsync. First world problems amirite

SwissArmyDruid
Feb 14, 2014

by sebmojo

I want to see some core counts on those parts, and please, for the love of all that is unholy, let them not have GCN graphics.

Until AMD gets their poo poo together with SR-IOV, one of these plus a 5800 XT will be what finally gets me off Windows as my main operating system for good.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

SwissArmyDruid posted:

I want to see some core counts on those parts, and please, for the love of all that is unholy, let them not have GCN graphics.

would it really matter if they used RDNA instead of Vega? I'm under the impression that an iGPU would be so starved of memory bandwidth that you wouldn't be able to goose any more performance out of it relative to just going with a Vega 11 like with the older APUs

Llamadeus
Dec 20, 2005

SwissArmyDruid posted:

I want to see some core counts on those parts, and please, for the love of all that is unholy, let them not have GCN graphics.
Enough has been leaked about these that they're basically confirmed as having the same Vega GPU as the Zen 2 parts.

SwissArmyDruid
Feb 14, 2014

by sebmojo

gradenko_2000 posted:

would it really matter if they used RDNA instead of Vega? I'm under the impression that an iGPU would be so starved of memory bandwidth that you wouldn't be able to goose any more performance out of it relative to just going with a Vega 11 like with the older APUs

With how my one Raven Ridge ultraportable only ever got three driver updates over two years, and still doesn't take generic AMD driver blobs, I have trust issues with not getting driver updates on APUs. I'll hitch my wagon to onboard graphics, fine, but they had better be the next gen ones so I'm not stuck on this side of the hedge.

Cygni
Nov 12, 2005

raring to post

SwissArmyDruid posted:

I want to see some core counts on those parts, and please, for the love of all that is unholy, let them not have GCN graphics.

They use Vega.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

SwissArmyDruid posted:

With how my one Raven Ridge ultraportable only ever got three driver updates over two years, and still doesn't take generic AMD driver blobs, I have trust issues with not getting driver updates on APUs. I'll hitch my wagon to onboard graphics, fine, but they had better be the next gen ones so I'm not stuck on this side of the hedge.

okay yeah, if your concern is that Vega graphics aren't getting proper driver updates relative to if they had RDNA, I totally get that

repiv
Aug 13, 2009

gradenko_2000 posted:

would it really matter if they used RDNA instead of Vega? I'm under the impression that an iGPU would be so starved of memory bandwidth that you wouldn't be able to goose any more performance out of it relative to just going with a Vega 11 like with the older APUs

RDNA is supposed to have improved framebuffer compression compared to earlier architectures so they could squeeze more value out of the bandwidth they have

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

CaptainSarcastic posted:

When I fired up Fallout 4 on my new 3600x it took me a few minutes to realize things were all weird because it was trying to run at like 1000fps. On my old Phenom II it struggled to even get to 30fps in busier areas. It was a :stare: moment for me having to throw a limit on the fps to make it run sanely.

I was replaying fo4 recently and I was getting a little disgusted with the load times given I have a completely ridiculous system by the standards of hardware available at the time of the release

until someone in the fo4 modding thread mentioned that the load times are tied to framerate... which means they're 60 fps limited. amazing. Apparently there is a mod that fixes it by uncapping framerates while loading.

SwissArmyDruid posted:

With how my one Raven Ridge ultraportable only ever got three driver updates over two years, and still doesn't take generic AMD driver blobs, I have trust issues with not getting driver updates on APUs. I'll hitch my wagon to onboard graphics, fine, but they had better be the next gen ones so I'm not stuck on this side of the hedge.

the other disappointing thing about the new APUs (besides trying to sell Zen2 parts as 5000 series :chloe:) is that they're still using Vega graphics. Like, GCN isn't just still a current architecture, AMD has another new generation of mainstream products entirely based on Vega that they haven't even released yet. We won't see mainstream laptop (i.e. not Y-series competitors) parts based on RDNA2 until probably 2022.

MH Knights
Aug 4, 2007

CaptainSarcastic posted:

When I fired up Fallout 4 on my new 3600x it took me a few minutes to realize things were all weird because it was trying to run at like 1000fps. On my old Phenom II it struggled to even get to 30fps in busier areas. It was a :stare: moment for me having to throw a limit on the fps to make it run sanely.

Just make sure you completely turn off weapon debris otherwise you will CTD very quickly! I learned that the hard way.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Vega on Renoir had RDNA1 everything but the compute blocks; I wouldn't be surprised if Vega on Cezanne is something similar. It's super confusing in that Cezanne isn't even a C/V of Renoir with new Zen3 cores: Zen3 cores are bigger and it has twice the cache, so it's already an entirely new physical design.

I'll loving die though if Vega in Cezanne is basically RDNA2 down to a small "IFCache" buffer but with Vega CUs, because gently caress you, buy Rembrandt.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Not sure it's "everything but the compute blocks"; it doesn't have the improved memory compression or anything else that comes with RDNA. (Or does it?)

It seems more accurate to say that it's Vega with an updated media core and I/O block, similar to Gemini Lake on the Intel side. And actually I'm not even all that sure on that: did codec support or HDMI/DP support level change from V64 to the mobile implementation? I thought V64 had HDMI 2.0 and so on...

If it had the memory compression I would care a lot less imo; that's the big value add of RDNA2 for mobile, I think. But if you're afraid of getting islanded on an obsolete architecture, then isn't the compute block exactly what you're worried about?

"Just the compute block" is kind of a big "just".

Paul MaudDib fucked around with this message at 16:41 on Dec 21, 2020

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
Most of this is moot because of how bandwidth-limited iGPUs are with normal DDR, right?

DDR4 is 64 bits per channel, so the most common dual-channel config is 128-bit and does a theoretical ~50GB/s out of that at DDR4-3200 speeds. How can GDDR6 do more than 4x the bandwidth out of the same size bus? The $299 Xbox Series S has an RDNA iGPU and a 128-bit bus, and somehow gets more than 200GB/s of bandwidth out of it.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
System RAM, as in DDR4, is designed for transferring small bits of data very quickly, but at the cost of bandwidth.

GDDR, as in the kind that you find in GPUs, is designed for high bandwidth: moving large chunks of data as fast as possible, but at the cost of higher latency compared to system RAM.

They're different because each is specialized to do the thing that's more important for their use-case. I assume that there's some hackery/customization/specialized tuning as far as consoles using GDDR also as "system RAM" without it hobbling the experience.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
People have gotten the new Xboxes running Windows applications, including emulators, and they work pretty well. It only costs $20 to unlock dev mode on any retail console and run unsigned applications; Xbox is a valid UWP target, so you can just build and run any UWP application and it'll work. I honestly want to see LibreOffice Calc benchmarks on the thing, but everybody is fascinated with how well it runs PS2 games.

Have there been any non-console x86 PCs sold using GDDR as main memory? I'm sure something on Aliexpress exists that I could get Windows or Linux running on. I'm curious now what a machine with high bandwidth and high memory latency would feel like.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
There was a guy who took part of the 24 gigs of GDDR6X on his RTX 3090, designated it to be recognized as a RAM drive, installed the new Crysis on it, and it ran well.

I *think* "high memory latency" is a rather relative term and isn't actually high enough to be appreciably noticeable in a lot of use-cases, but I could be wrong.

VorpalFish
Mar 22, 2007
reasonably awesometm

Twerk from Home posted:

Most of this is moot because of how bandwidth-limited iGPUs are with normal DDR, right?

DDR4 is 64 bits per channel, so the most common dual-channel config is 128-bit and does a theoretical ~50GB/s out of that at DDR4-3200 speeds. How can GDDR6 do more than 4x the bandwidth out of the same size bus? The $299 Xbox Series S has an RDNA iGPU and a 128-bit bus, and somehow gets more than 200GB/s of bandwidth out of it.

GDDR, starting at 5, does 4 transfers per clock vs 2 with DDRx, I believe. With 5X and 6, you multiply clock by 8. So for the purposes of memory bandwidth, GDDR6 is going to have 4x the "effective" clock speed of identically clocked DDR4.

Then on top of that, it's clocked higher as well.

Edit: well, sometimes clocked higher - in your example the DDR4-3200 is clocked at 1600MHz for an effective clock of 3200MHz, vs the GDDR6 clocked at 1750MHz for an effective clock speed of 14000MHz.

VorpalFish fucked around with this message at 17:08 on Dec 21, 2020
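
Putting the numbers from the last couple of posts together, as a sanity check (my arithmetic; the same peak-bandwidth formula applies to both memory types):

```python
# Peak bandwidth = effective transfer rate (MT/s) x bus width (bits) / 8,
# which gives MB/s; divide by 1000 for GB/s.
def peak_gb_per_s(mt_per_s: int, bus_bits: int) -> float:
    return mt_per_s * bus_bits / 8 / 1000

# Dual-channel DDR4-3200: 1600MHz clock x 2 transfers/clock, 2 x 64-bit channels
print(peak_gb_per_s(3200, 128))   # 51.2 GB/s -- the "~50GB/s" above

# GDDR6 at a 1750MHz clock with an 8x effective data rate, 128-bit bus
print(peak_gb_per_s(14000, 128))  # 224.0 GB/s -- the Series S ballpark
```

Same bus width in both cases; the whole 4x+ gap comes from the transfer rate.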

Seamonster
Apr 30, 2007

IMMER SIEGREICH

gradenko_2000 posted:

There was a guy who took part of the 24 gigs of GDDR6X on his RTX 3090, designated it to be recognized as a RAM drive, installed the new Crysis on it, and it ran well.

I *think* "high memory latency" is a rather relative term and isn't actually high enough to be appreciably noticeable in a lot of use-cases, but I could be wrong.

It was a she.

Gwaihir
Dec 8, 2009
Hair Elf

Dilber posted:

B&H must have gotten more 5900Xs in stock than they expected. I was originally a "December 23rd" and then got the email pushing me to January sometime.

It just shipped yesterday.

hmmmm, I originally got an email for December 21st, updated to January, and no update since then. I guess you just got luckier!

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

gradenko_2000 posted:

I assume that there's some hackery/customization/specialized tuning as far as consoles using GDDR also as "system RAM" without it hobbling the experience.

There's probably no need for any hackery or workarounds, because you can't do things like write a script to start 80 processes at once, or flip from Chrome with 350 tabs to Photoshop with an 85GB PSD to Cyberpunk 2077 multiple times per second.

What I'm saying is that I think the use case and UI design make it unlikely that a human would ever notice any possible suboptimal memory behavior.

Dilber
Mar 27, 2007

TFLC
(Trophy Feline Lifting Crew)


Gwaihir posted:

hmmmm, I originally got an email for December 21st, updated to January, and no update since then. I guess you just got luckier!

It came in yesterday, so I am lucky there.

Less lucky: the BIOS I flashed to yesterday has now been pulled for stability issues, so now I get to do a bunch of stress testing to see if I need to go back to an earlier version or just wait for the next one.

Cygni
Nov 12, 2005

raring to post

For you mobile/APU-heads out there:

https://twitter.com/_rogame/status/1340294195433660422

https://twitter.com/_rogame/status/1340294511218585601

One thing of note: there are rumors that Van Gogh may actually be cancelled, with MS instead going ARM.
