ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

PC LOAD LETTER posted:

Apparently it requires lots of luck or really well-binned parts, and stability is still touch and go most of the time.

DDR5 5600 is much more achievable, but still pretty hard with 4x DIMMs, and gets you a decent portion of the gains that DDR5 6000 would give. I'd try to aim for that instead, but apparently even then your chances aren't great.

I think Intel is still a little better with 4x DIMMs (people get to 5600 easier but 6000 is still often a crapshoot), but it's not a big difference anymore.

Epycs only allow one DIMM per channel, and the number of people who need 4 DIMMs is a drop in the ocean, so AMD doesn't have a lot of motivation to make a memory controller that can handle 4 DIMMs well.


Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

ConanTheLibrarian posted:

Epycs only allow one DIMM per channel, and the number of people who need 4 DIMMs is a drop in the ocean, so AMD doesn't have a lot of motivation to make a memory controller that can handle 4 DIMMs well.

All Epycs have been designed to support 2 DIMMs per channel, and AMD says that they're able to fix their latest generation's memory problem with just a BIOS update. https://www.tomshardware.com/news/amd-responds-to-claims-of-epyc-genoa-memory-bug-says-update-on-track

2DPC gets Epyc to 48 DIMMs in a 2 socket server.
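
A quick back-of-the-envelope sketch of where that 48 comes from (Genoa's 12 memory channels per socket is the assumption here; the rest is just multiplication):

# DIMM count for a 2-socket Genoa box at 2 DIMMs per channel (2DPC)
channels_per_socket = 12  # SP5 / Genoa memory channels per socket
dimms_per_channel = 2     # 2DPC
sockets = 2
print(channels_per_socket * dimms_per_channel * sockets)  # 48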

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
Good luck fitting that number on a server motherboard without even more physical compromises than 12 DIMMs/socket already takes. That article even says they think it's more likely to be used in a single-socket setup. Not to mention that memory speed drops from 4800 to 4000 when going to 2DPC on Genoa. To address my original point, the Tom's article also says "market insiders have even predicted that support for 2DPC could end with the DDR6 standard". Gigabyte showcased a board with 2DPC, but I'm assuming it was a prototype because it's not listed on their site 9 months after the photo was taken.
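
As a rough sketch of that 2DPC speed tradeoff, here's per-socket peak bandwidth at the two speeds from the post (12 channels and 64-bit channel width assumed; real-world throughput will be lower):

# Peak theoretical per-socket bandwidth on a 12-channel DDR5 platform
channels = 12
bytes_per_transfer = 8  # 64-bit channel

def peak_gb_s(mt_per_s):
    return channels * bytes_per_transfer * mt_per_s / 1000  # GB/s

print(peak_gb_s(4800))  # 460.8 GB/s at 1DPC
print(peak_gb_s(4000))  # 384.0 GB/s at 2DPC, in exchange for twice the DIMM slots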

Dr. Video Games 0031
Jul 17, 2004

When are we finally going to do proper quad-channel with 1dpc for consumer chips? What exactly is preventing that from being a thing? Is it just a segmentation thing, to make HEDT more appealing?

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Pins and traces.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
For the moment there's still a lot of scope to boost memory bandwidth by increasing memory clocks. Once those gains are exhausted, we'll be even deeper into the mismatch between logic and PHY scaling. Adding an extra two channels may be prohibitively expensive since (a) the CPU manufacturer has to produce a larger die, (b) the mobo manufacturer needs a more complicated board, and (c) the consumer needs to buy 4 DIMMs to get the most out of their system. Therefore...

ConanTheLibrarian posted:

Stack everything. A compute die on top of a cache die on top of a PHY die on top of an HBM die. Let the 3D revolution begin!

PC LOAD LETTER
May 23, 2005
WTF?!
LGA makes it easy to add tons of contacts though. They're already at over 1700 for AM5. LGA2011-v3 and LGA2066 have a bit over 2000 and were 4-channel memory platforms.

There are brand-new mobos that support those sockets selling for around $160 these days, so it's not THAT expensive or onerous to do. Especially vs what new AM5 mobos can cost.

They're probably worried about appeasing OEMs would be my guess. OEMs are already so drat cheap they still often ship modern systems with a single DIMM of the lowest-clocked stuff they can get to save a buck or 2. They'd probably be pissed if they were forced to buy 4 DIMMs every time for AMD systems.

That, and until recently I don't think most cared about iGPU performance, which is where that extra bandwidth would really matter.

FWIW technically DDR5 is already quad channel, buuuut it's 4x 32-bit channels instead of 4x 64-bit channels like with LGA2011-v3 and LGA2066, so what we'd reaaaaallllly want here are 8x 32-bit channel memory systems!
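
To put rough numbers on that (a sketch with assumed transfer rates; the channel widths follow the DDR4/DDR5 specs):

# Peak bandwidth: today's 2-DIMM DDR5 (4x 32-bit subchannels) vs a hypothetical
# 8x 32-bit setup, vs old quad-channel DDR4 on LGA2011-v3 / LGA2066 (4x 64-bit)
def peak_gb_s(num_channels, channel_bits, mt_per_s):
    return num_channels * (channel_bits / 8) * mt_per_s / 1000  # GB/s

print(peak_gb_s(4, 32, 6000))  # 96.0 GB/s  - DDR5-6000, 2 DIMMs (4x 32-bit)
print(peak_gb_s(8, 32, 6000))  # 192.0 GB/s - the hypothetical 8x 32-bit system
print(peak_gb_s(4, 64, 3200))  # 102.4 GB/s - DDR4-3200 quad channel, for comparison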

Tuna-Fish
Sep 13, 2017

Dr. Video Games 0031 posted:

When are we finally going to do proper quad-channel with 1dpc for consumer chips? What exactly is preventing that from being a thing? Is it just a segmentation thing, to make HEDT more appealing?

Cost. The requirements for the signal paths are really strict these days, which means that routing more lines makes every existing line more expensive.

I think there is very little chance that the same platform that needs to support normal, low-end desktop will also ever support 4 channels with DIMMs. It's a bit more possible if Dell CAMM becomes a thing even on desktop, as those were designed to make routing cheaper. But even with them, I think the odds are pretty remote.

Honestly, I think consumer CPU memory buses will only grow wider once memory is soldered on the same substrate as the CPU, because that's the only way to do it cheaply.

PC LOAD LETTER posted:

LGA makes it easy to add tons of contacts though.

The socket is a lot less of a limit these days than what's under the socket.

Klyith
Aug 3, 2007

GBS Pledge Week
The other question to ask is: how many things are actually bandwidth limited by 2 channels of memory on a normal desktop CPU?

You might be able to say something like "bumping from DDR5 4800 to 6000 gives an x% average increase in FPS across a bunch of games", but that speed boost is also decreasing latency. Meanwhile if you take a Threadripper with 8 goddamn memory channels and look at reviews with gaming performance, it's nothing special.

Games aren't the only thing in the world, but I suspect that relatively few desktop tasks are being limited by memory bandwidth, particularly on the midrange chips like a 7800X or 13600K. Quite possibly a 7950X with 16 cores would see some improvement on some sort of serious-business task like rendering or whatever, but that's why HEDT / workstation exists.
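
For scale, a rough per-core split of dual-channel bandwidth (DDR5-6000 and the core counts are just illustrative; workloads that mostly live in cache won't come near these numbers anyway):

# Peak dual-channel DDR5-6000 bandwidth, split evenly across cores
peak_gb_s = 2 * 8 * 6000 / 1000  # 2 channels x 8 bytes x 6000 MT/s = 96 GB/s
for cores in (6, 8, 16):
    print(cores, "cores:", round(peak_gb_s / cores, 1), "GB/s each")
# 6 cores: 16.0, 8 cores: 12.0, 16 cores: 6.0 - an all-core render is far more
# likely to bump into the limit than a typical desktop workload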

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Did memory training get slightly longer with ComboPI 1.0.0.7b?

VorpalFish
Mar 22, 2007
reasonably awesometm

Klyith posted:

The other question to ask is: how many things are actually bandwidth limited by 2 channels of memory on a normal desktop CPU?

You might be able to say something like "bumping from DDR5 4800 to 6000 gives an x% average increase in FPS across a bunch of games", but that speed boost is also decreasing latency. Meanwhile if you take a Threadripper with 8 goddamn memory channels and look at reviews with gaming performance, it's nothing special.

Games aren't the only thing in the world, but I suspect that relatively few desktop tasks are being limited by memory bandwidth, particularly on the midrange chips like a 7800X or 13600K. Quite possibly a 7950X with 16 cores would see some improvement on some sort of serious-business task like rendering or whatever, but that's why HEDT / workstation exists.

Tbh where it really matters for consumers is iGPUs, where memory bandwidth is shared and is a real constraint on how performant your GPU can be.

But at least for now that's way less relevant on desktop, where everyone who wants GPU performance is throwing in a PCIe card anyways.

There's been a lot of talk about iGPUs cannibalizing the low-to-midrange GPU market though, and that's going to need bandwidth if it's going to happen.

Kazinsal
Dec 13, 2011

Tuna-Fish posted:

The socket is a lot less of a limit these days than what's under the socket.

Introducing the world's first 96-layer motherboard PCB, only from ASRock!

Klyith
Aug 3, 2007

GBS Pledge Week

Kazinsal posted:

Introducing the world's first 96-layer motherboard PCB, only from ASRock!

Pffft, boring engineering safety margins to prevent problems? You must be thinking of asus or evga!


The Asrock mobo with quad-channel DDR5 would still have an 8-layer PCB, but the RAM slots are arranged in a rhombus around the CPU to avoid crosstalk. One of them is on the back of the mobo because it's colocated with the GPU PCIe slot. And every other USB jack on the I/O panel is rotated 90 degrees because they found it cancels a small but significant EMI source. Also every setting in the BIOS has to be set to a multiple of seven, so that it will never operate at the same frequency as the DDR.


And they always take 2 extra weeks to ship because they get held up in transit when the Postal Inspector's drug-detection dogs keep alerting on them, for some inexplicable reason.

FuturePastNow
May 19, 2014


VorpalFish posted:

Tbh where it really matters for consumers is iGPUs, where memory bandwidth is shared and is a real constraint on how performant your GPU can be.

But at least for now that's way less relevant on desktop, where everyone who wants GPU performance is throwing in a PCIe card anyways.

There's been a lot of talk about iGPUs cannibalizing the low-to-midrange GPU market though, and that's going to need bandwidth if it's going to happen.

Yeah, for consumers it might matter the most for iGPUs, but aside from Macs the computers that use iGPUs are for the most budget-conscious buyers, where increasing the motherboard cost by $50 might be a real problem.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Klyith posted:

The Asrock mobo with quad-channel DDR5 would still have an 8-layer PCB, but the RAM slots are arranged in a rhombus around the CPU to avoid crosstalk. One of them is on the back of the mobo because it's colocated with the GPU PCIe slot. And every other USB jack on the I/O panel is rotated 90 degrees because they found it cancels a small but significant EMI source. Also every setting in the BIOS has to be set to a multiple of seven, so that it will never operate at the same frequency as the DDR.


And they always take 2 extra weeks to ship because they get held up in transit when the Postal Inspector's drug-detection dogs keep alerting on them, for some inexplicable reason.

:hmmyes:

BlankSystemDaemon
Mar 13, 2009



Klyith confirmed working at AsRock.

Josh Lyman
May 24, 2009


I upgraded my ASUS ROG B650E-F to the newish 1636 BIOS that uses AGESA 1.0.0.7b and they changed the ROG logo on bootup to red lmao.

I was able to run my G.Skill Flare 2x 16GB 6000MHz 36-36-36-96 memory at 6400MHz 30-36-36-76, but the 3DMark score went up less than 2% and the mobo automatically jacked up my CPU voltages, so I pulled it back to stock EXPO.
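
For anyone curious what that timing change works out to in absolute latency, here's the standard first-word CAS calculation (2000 x CL / MT/s; the kit numbers are from the post above):

# Absolute CAS latency for the two configs above
def cas_ns(cl, mt_per_s):
    return 2000 * cl / mt_per_s  # DDR transfers twice per clock, so clock MHz = MT/s / 2

print(cas_ns(36, 6000))  # 12.0 ns   (stock EXPO: 6000 CL36)
print(cas_ns(30, 6400))  # 9.375 ns  (tuned: 6400 CL30)
# plus ~6.7% more raw bandwidth; a mostly GPU-bound 3DMark run barely notices either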

Cygni
Nov 12, 2005

raring to post

new AMD chipset drivers for AM4/AM5 posted

https://www.amd.com/en/support/kb/release-notes/rn-ryzen-chipset-5-08-02-027

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Those are some real lovely release notes.

CaptainSarcastic
Jul 6, 2013



Subjunctive posted:

Those are some real lovely release notes.

Agreed. I have three machines of my own this applies to, as well as two more owned by family members I end up as default tech support for, and I don't feel like I have any idea what it is actually doing.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Well the intern had to take a break from coding FSR3 to write those, so cut them some slack!

ZombieCrew
Apr 1, 2019
Not really a parts question, but I'm putting together a new PC for my gf. I got two codes for Starfield from the parts I bought her. Is there a way to use a code for me and her since she won't be using both?

E: it's through AMD Rewards

Branch Nvidian
Nov 29, 2012



ZombieCrew posted:

Not really a parts question, but I'm putting together a new PC for my gf. I got two codes for Starfield from the parts I bought her. Is there a way to use a code for me and her since she won't be using both?

E: it's through AMD Rewards

It'll depend on if it generates codes or redeems it to a Steam/Epic/whatever account. If it's redeemed via Steam, then a Steam account has to be linked to the AMD Rewards site, and it'll add it to the account. I think you can probably add a Steam account, redeem one code, unlink that account and link another, and redeem the other code.

ZombieCrew
Apr 1, 2019

Branch Nvidian posted:

It'll depend on if it generates codes or redeems it to a Steam/Epic/whatever account. If it's redeemed via Steam, then a Steam account has to be linked to the AMD Rewards site, and it'll add it to the account. I think you can probably add a Steam account, redeem one code, unlink that account and link another, and redeem the other code.

But I'd have to do it on the machine I'm building so it sees the parts?

Branch Nvidian
Nov 29, 2012



Yes, that is correct. You enter the redemption code for one of the parts and you get a verification tool that looks for the specific component. Once it sees that component, it allows you to add the game to your Steam library or gives you the game code or w/e.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Is it confirmed to be a Steam copy?

Branch Nvidian
Nov 29, 2012



Rinkles posted:

Is it confirmed to be a Steam copy?

According to this Reddit post it is, indeed, a Steam copy.
https://www.reddit.com/r/Starfield/comments/14z5eml/confirmed_that_amd_bundle_gives_a_steam_key/

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Klyith posted:

The other question to ask is: how many things are actually bandwidth limited by 2 channels of memory on a normal desktop CPU? ...

Games aren't the only thing in the world, but I suspect that relatively few desktop tasks are being limited by memory bandwidth, particularly on the midrange chips like a 7800X or 13600K. Quite possibly a 7950X with 16 cores would see some improvement on some sort of serious-business task like rendering or whatever, but that's why HEDT / workstation exists.

Yep. I honestly think this was part of the calculus for keeping quad-cores for so long too. That's what the business world wanted as a "decent but not extravagant workstation" (not HEDT) in the 2012-2017 era. The drive for higher core counts wasn't there, and if you were a prosumer the HEDT platform was a lot cheaper and more accessible.

kliras
Mar 27, 2021
no breaking news per se, but check out that on-board fan, good lord

https://twitter.com/VideoCardz/status/1698675506478981164

Arivia
Mar 17, 2011

kliras posted:

no breaking news per se, but check out that on-board fan, good lord

https://twitter.com/VideoCardz/status/1698675506478981164



Which one? Each board has two, one over the chipset (or where the chipset used to be) and one on the VRMs.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://twitter.com/Kepler_L2/status/1698531961294205308

I don't think anyone is actually boycotting AMD like this (lol), but the rumor mills are very bullish on Zen 5 lately

kliras
Mar 27, 2021

gradenko_2000 posted:

https://twitter.com/Kepler_L2/status/1698531961294205308

I don't think anyone is actually boycotting AMD like this (lol), but the rumor mills are very bullish on Zen 5 lately
The people who paid $70 to preorder Starfield, followed by $5 to use PureDark's mod, are definitely the suckers who will buy whatever AMD hardware they need and make up a reason for it after the fact.

Icept
Jul 11, 2001
The worst part about waiting for Zen 5 is you have to wait another half year for the X3D parts

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Icept posted:

The worst part about waiting for Zen 5 is you have to wait another half year for the X3D parts

The internet thinks Zen 5 (AKA Ryzen 8000) has a street date of about a year from now (H2 2024), so AMD will probably unveil it at Computex -- next May? I don't even have the energy to get hyped about that, no matter how badly twitter leakers want me to.

Edit: embarrassing apostrophe error

mdxi fucked around with this message at 19:43 on Sep 4, 2023

Sininu
Jan 8, 2014

I'm eagerly waiting for Zen5 to upgrade my 3700X and avoid the first gen issues DDR5 and new chipsets have.

hobbesmaster
Jan 28, 2008

gradenko_2000 posted:

https://twitter.com/Kepler_L2/status/1698531961294205308

I don't think anyone is actually boycotting AMD like this (lol), but the rumor mills are very bullish on Zen 5 lately

There’s probably one person out there doing this and their reasoning would have to be “AMD is so bad at this. Now, Intel, there’s a company that actually knows how to gently caress over other hardware vendors”

Cygni
Nov 12, 2005

raring to post

Sininu posted:

I'm eagerly waiting for Zen5 to upgrade my 3700X and avoid the first gen issues DDR5 and new chipsets have.

Rumor is that it's the exact same IO die as Zen4 without any silicon revision, so it might not be much different for memory stuff. Still a ways off from knowing for sure, and I think your sentiment is smart regardless.

hobbesmaster
Jan 28, 2008

The current AGESA versions seem to be making good use of the IOD, at least to the capabilities of the Zen4 CPUs. We'll have to see if this is more like Zen1 to 2 or 2 to 3.

Wibla
Feb 16, 2011

Seems the most recent AGESA has tidied up a lot of the memory bullshit?


Indiana_Krom
Jun 18, 2007
Net Slacker

Sininu posted:

I'm eagerly waiting for Zen5 to upgrade my 3700X and avoid the first gen issues DDR5 and new chipsets have.

Same, but I'd be upgrading from an Intel 9900K, assuming I can wait that long.
