|
can someone explain this, what does this mean? will a 5600G be better performance/benchmark-wise than a 5600X at stock settings? also, what's the latest rumor mill regarding AMD chipsets? any new chipsets on the horizon in the next 6 months?
|
# ? Dec 20, 2020 17:46 |
|
pmchem posted:can someone explain this, what does this mean? will a 5600G be better performance/benchmark-wise than a 5600X at stock settings? Imagine a 5600X but with integrated graphics.
|
# ? Dec 20, 2020 17:48 |
|
SamDabbers posted:Imagine a 5600X but with integrated graphics. Will those be usable with a dedicated GPU like Intel's, or is it (still?) mutually exclusive?
|
# ? Dec 20, 2020 17:53 |
|
ufarn posted:Will those be usable with a dedicated GPU like Intel's, or is it (still?) mutually exclusive? You should still be able to use a dGPU but some of the PCIe lanes will be used by the iGPU, if it's anything like previous Zen processors with integrated graphics.
|
# ? Dec 20, 2020 18:13 |
|
SamDabbers posted:You should still be able to use a dGPU but some of the PCIe lanes will be used by the iGPU, if it's anything like previous Zen processors with integrated graphics. Yeah, it's mostly a case of get the 5600x if you ever plan to use a dGPU, get the 5600g if you don't. You could totally use a dGPU with the 5600g, but it will most likely be running at 8x instead of 16x.
|
# ? Dec 20, 2020 18:26 |
|
Yeah all of AMD's Zen APUs so far have also only had 8 PCIe lanes for the graphics slot. Which is not a problem unless you're using a truly high end card of the sort few people will ever pair with an APU. But it's one of the compromises they make to stick some pretty ok graphics in there.
|
# ? Dec 20, 2020 19:22 |
|
Should be less of an issue now with PCIe4 though, provided you pair it with an Ampere or Big Navi card?
|
# ? Dec 20, 2020 19:31 |
|
repiv posted:Should be less of an issue now with PCIe4 though, provided you pair it with an Ampere or Big Navi card? It's not really an issue even on PCIe 3 unless you're gaming with a fast CPU, high-end GPU, and a 1080p monitor. Otherwise it's like a 1-3% difference between PCIe 4.0 x16 and PCIe 3.0 x8 in most games.
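To put rough numbers on the links being compared (the 1-3% figure is from game benchmarks, not something you can derive from raw bandwidth - this is just a sketch of theoretical peaks):

```python
# Theoretical one-direction PCIe bandwidth per lane, after encoding overhead.
# Gen3 signals at 8 GT/s with 128b/130b encoding; gen4 doubles the rate.
PER_LANE_GBPS = {3: 8 * 128 / 130 / 8, 4: 16 * 128 / 130 / 8}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Peak link bandwidth in GB/s for a given PCIe generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

print(f"PCIe 3.0 x8:  {link_bandwidth(3, 8):.1f} GB/s")   # ~7.9 GB/s
print(f"PCIe 4.0 x16: {link_bandwidth(4, 16):.1f} GB/s")  # ~31.5 GB/s
```

So PCIe 4.0 x16 has about 4x the raw bandwidth of 3.0 x8, yet games barely notice, which is the point being made here.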
|
# ? Dec 20, 2020 19:41 |
|
I'm excited as hell. This in an Asrock Deskmini would make for an incredible tiny workstation.
|
# ? Dec 20, 2020 20:03 |
|
repiv posted:Should be less of an issue now with PCIe4 though, provided you pair it with an Ampere or Big Navi card? Does Cezanne finally add PCIe 4? Renoir didn’t have it, unlike the rest of the zen2s.
|
# ? Dec 20, 2020 20:26 |
|
Paul MaudDib posted:Does Cezanne finally add PCIe 4? Renoir didn’t have it, unlike the rest of the zen2s. The leaked roadmap from earlier this year suggests it doesn't have PCIe 4. Rembrandt is the first APU with explicitly listed PCIe 4. Rembrandt is also AMD's first part with DDR5.
|
# ? Dec 20, 2020 21:12 |
|
Cygni posted:The leaked roadmap from earlier this year suggests it doesn't have PCIe 4. Rembrandt is the first APU with explicitly listed PCIe 4. Rembrandt is also AMDs first part with DDR5. Could it be for power saving reasons since they're going to reuse the same design for laptops?
|
# ? Dec 20, 2020 21:17 |
|
Arzachel posted:Could it be for power saving reasons since they're going to reuse the same design for laptops? yeah definitely, the APUs are laptop-first designs that get socketed for the desktop market because it's easy to do. PCIe 4.0 (a) completely doesn't matter in laptops, and (b) is a giant power hog. there's a reason the only chipset with PCIe4 capability suddenly has a chipset fan.
|
# ? Dec 20, 2020 21:37 |
|
Speaking of which, I wonder if I can force my X570 PCH down to PCIe 3 and nix the fan. Hmmmm
|
# ? Dec 20, 2020 21:47 |
|
Paul MaudDib posted:yeah definitely, the APUs are laptop-first designs that get socketed for the desktop market because it's easy to do. PCIe 4.0 (a) completely doesn't matter in laptops, and (b) is a giant power hog. there's a reason the only chipset with PCIe4 capability suddenly has a chipset fan. I thought that reason was because they're repurposed IO dies on GloFo 12nm to help meet purchasing volumes on their agreement? VorpalFish fucked around with this message at 22:47 on Dec 20, 2020 |
# ? Dec 20, 2020 22:45 |
|
I do also want to point out that the APUs have less cache than a "pure" CPU, since part of the die that would otherwise be cache is occupied by the iGPU, and that does make them worse than a CPU in most (all?) scenarios where you're already using a dedicated GPU. I'm just excited for it personally since I like using APUs even with a dGPU for the "insurance" factor: you get to keep using your computer even if the GPU dies (even if I also own more than one GPU anyway). Plus we never got a Zen 2-based desktop APU that's available over-the-counter.
|
# ? Dec 21, 2020 02:03 |
|
just did bios flashback on a b550e in preparation to install a 5900x. I was stressing for nothing. Super easy, new tech is great. Can't wait to play 4 year old games at 300fps
|
# ? Dec 21, 2020 02:34 |
|
Fauxtool posted:just did bios flashback on a b550e in preparation to install a 5900x When I fired up Fallout 4 on my new 3600x it took me a few minutes to realize things were all weird because it was trying to run at like 1000fps. On my old Phenom II it struggled to even get to 30fps in busier areas. It was a moment for me having to throw a limit on the fps to make it run sanely.
|
# ? Dec 21, 2020 02:44 |
|
I tried to play Skyrim for the first time 6 years after it came out, on what was then a top-spec PC. I thought it was the most broken crap game until I turned on vsync. First world problems amirite
|
# ? Dec 21, 2020 02:50 |
|
I want to see some core counts on those parts, and please, for the love of all that is unholy, let them not have GCN graphics. Until AMD gets their poo poo together with SR-IOV, one of these plus a 5800 XT will be what finally gets me off Windows as my main operating system for good.
|
# ? Dec 21, 2020 03:10 |
|
SwissArmyDruid posted:I want to see some core counts on those parts, and please, for the love of all that is unholy, let them not have GCN graphics. would it really matter if they used RDNA instead of Vega? I'm under the impression that an iGPU would be so starved of memory bandwidth that you wouldn't be able to goose any more performance out of it relative to just going with a Vega 11 like with the older APUs
|
# ? Dec 21, 2020 03:18 |
|
SwissArmyDruid posted:I want to see some core counts on those parts, and please, for the love of all that is unholy, let them not have GCN graphics.
|
# ? Dec 21, 2020 03:20 |
|
gradenko_2000 posted:would it really matter if they used RDNA instead of Vega? I'm under the impression that an iGPU would be so starved of memory bandwidth that you wouldn't be able to goose any more performance out of it relative to just going with a Vega 11 like with the older APUs With how my one Raven Ridge ultraportable only ever got three driver updates over two years, and still doesn't take generic AMD driver blobs, I have trust issues with not getting driver updates on APUs. I'll hitch my wagon to onboard graphics, fine, but they had better be the next gen ones so I'm not stuck on this side of the hedge.
|
# ? Dec 21, 2020 03:22 |
|
SwissArmyDruid posted:I want to see some core counts on those parts, and please, for the love of all that is unholy, let them not have GCN graphics. They use Vega.
|
# ? Dec 21, 2020 03:27 |
|
SwissArmyDruid posted:With how my one Raven Ridge ultraportable only ever got three driver updates over two years, and still doesn't take generic AMD driver blobs, I have trust issues with not getting driver updates on APUs. I'll hitch my wagon to onboard graphics, fine, but they had better be the next gen ones so I'm not stuck on this side of the hedge. okay yeah if your concern is that Vega graphics aren't getting proper driver updates relative to if they had RDNA I totally get that
|
# ? Dec 21, 2020 03:35 |
|
gradenko_2000 posted:would it really matter if they used RDNA instead of Vega? I'm under the impression that an iGPU would be so starved of memory bandwidth that you wouldn't be able to goose any more performance out of it relative to just going with a Vega 11 like with the older APUs RDNA is supposed to have improved framebuffer compression compared to earlier architectures so they could squeeze more value out of the bandwidth they have
|
# ? Dec 21, 2020 03:46 |
|
CaptainSarcastic posted:When I fired up Fallout 4 on my new 3600x it took me a few minutes to realize things were all weird because it was trying to run at like 1000fps. On my old Phenom II it struggled to even get to 30fps in busier areas. It was a moment for me having to throw a limit on the fps to make it run sanely.

I was replaying fo4 recently and was getting a little disgusted with the load times, given I have a completely ridiculous system by the standards of hardware available at the time of release, until someone in the fo4 modding thread mentioned that the load times are tied to framerate... which means they're 60 fps limited. amazing. Apparently there is a mod that fixes it by uncapping the framerate while loading.

SwissArmyDruid posted:With how my one Raven Ridge ultraportable only ever got three driver updates over two years, and still doesn't take generic AMD driver blobs, I have trust issues with not getting driver updates on APUs. I'll hitch my wagon to onboard graphics, fine, but they had better be the next gen ones so I'm not stuck on this side of the hedge.

the other disappointing thing about the new APUs (besides trying to sell Zen2 parts as 5000 series) is that they're still using Vega graphics. Like, GCN isn't just still a current architecture - AMD has another new generation of mainstream products entirely based on Vega that they haven't even released yet. We probably won't see mainstream laptop (i.e. not Y-series competitors) parts based on RDNA2 until probably 2022.
|
# ? Dec 21, 2020 04:35 |
|
CaptainSarcastic posted:When I fired up Fallout 4 on my new 3600x it took me a few minutes to realize things were all weird because it was trying to run at like 1000fps. On my old Phenom II it struggled to even get to 30fps in busier areas. It was a moment for me having to throw a limit on the fps to make it run sanely. Just make sure you completely turn off weapon debris otherwise you will CTD very quickly! I learned that the hard way.
|
# ? Dec 21, 2020 05:20 |
|
Vega on Renoir had RDNA1 everything but the compute blocks; wouldn't be surprised if Vega on Cezanne was something similar. It is super confusing in that Cezanne isn't even a C/V of Renoir with new Zen3 cores: Zen3 cores are bigger and it has twice the cache, so it's already an entirely new physical design. I'll loving die though if Vega in Cezanne is basically RDNA2 down to a small "IFCache" buffer but Vega CU's, because gently caress you buy Rembrandt.
|
# ? Dec 21, 2020 13:24 |
|
Not sure it’s “everything but the compute blocks”, it doesn’t have the improved memory compression or anything else that comes with RDNA. (Or does it?) It seems more accurate to say that it’s Vega with an updated media core and IO block, similar to Gemini Lake on the Intel side. And actually I’m not even all that sure on that, did codec support or HDMI/DP support level change from V64 to the mobile implementation? I thought V64 had HDMI 2.0 and so on... If it had the memory compression I would care a lot less imo, that's the big value add for RDNA2 for mobile I think. But if you're afraid of getting islanded on an obsolete architecture, then isn't the compute block exactly what you are worried about? "just the compute block" is kind of a big "just". Paul MaudDib fucked around with this message at 16:41 on Dec 21, 2020 |
# ? Dec 21, 2020 15:12 |
|
Most of this is moot because of how bandwidth limited iGPUs are with normal ddr, right? DDR4 is 64 bits per channel, so the most common dual channel config is 128 bit and does a theoretical ~50GB/s out of that at ddr 3200 speeds. How can GDDR6 do more than 4x the bandwidth out of the same size bus? The $299 xbox series s has an RDNA iGPU, 128 bit bus, and gets more than 200GB/s bandwidth out of it somehow.
|
# ? Dec 21, 2020 15:36 |
|
System RAM, as in DDR4, is designed for transferring small bits of data very quickly, but at the cost of bandwidth. GDDR, the kind that you find in GPUs, is designed for high bandwidth: moving large chunks of data as fast as possible, but at the cost of higher latency compared to system RAM. They're different because each is specialized for what matters most in its use-case. I assume that there's some hackery/customization/specialized tuning as far as consoles using GDDR also as "system RAM" without it hobbling the experience.
|
# ? Dec 21, 2020 15:40 |
|
People have gotten the new Xboxes running Windows applications, including emulators, and they work pretty well. It only costs $20 to unlock dev mode on any retail console and run unsigned applications, Xbox is a valid UWP target so you can just build and run any UWP application and it'll work. I honestly want to see LibreOffice Spreadsheets benchmarks on the thing, but everybody is fascinated with how well it runs PS2 games. Have there been any non-console x86 PCs sold using GDDR as main memory? I'm sure something on Aliexpress exists that I could get Windows or Linux running on. I'm curious now what a machine with high bandwidth and high memory latency would feel like.
|
# ? Dec 21, 2020 15:57 |
|
There was a guy who took part of the 24 gigs of GDDR6X on his RTX 3090, designated it to be recognized as a RAM drive, installed the new Crysis on it, and it ran well. I *think* "high memory latency" is a rather relative term and isn't actually high enough to be appreciably noticeable in a lot of use-cases, but I could be wrong.
|
# ? Dec 21, 2020 16:02 |
|
Twerk from Home posted:Most of this is moot because of how bandwidth limited iGPUs are with normal ddr, right? GDDR, starting with GDDR5, does 4 transfers per clock vs 2 with DDRx, I believe. With GDDR5X and GDDR6 you multiply the clock by 8. So for the purposes of memory bandwidth, GDDR6 is going to have 4x the "effective" clockspeed of identically clocked DDR4. Then on top of that, it's clocked higher as well. Edit: well, sometimes clocked higher - in your example the DDR4-3200 is clocked at 1600MHz for an effective rate of 3200MT/s, vs 1750MHz for the GDDR6 for an effective rate of 14000MT/s. VorpalFish fucked around with this message at 17:08 on Dec 21, 2020 |
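Putting the numbers from these posts into a quick sketch (theoretical peaks; real-world figures are lower):

```python
# Peak memory bandwidth = effective transfer rate (MT/s) x bus width (bytes).
# Figures are the ones from the thread: dual-channel DDR4-3200 (2 x 64-bit)
# vs 14 Gbps GDDR6 on a 128-bit bus, as in the Xbox Series S.

def peak_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return transfer_rate_mts * (bus_width_bits // 8) / 1000

print(f"DDR4-3200, 128-bit:    {peak_bandwidth_gbs(3200, 128):.1f} GB/s")   # 51.2
print(f"GDDR6 14Gbps, 128-bit: {peak_bandwidth_gbs(14000, 128):.1f} GB/s")  # 224.0
```

Same bus width, ~4.4x the bandwidth - all of the difference comes from the effective transfer rate, which matches the "more than 200GB/s" figure quoted for the Series S.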
# ? Dec 21, 2020 16:29 |
|
gradenko_2000 posted:There was a guy who took part of the 24 gigs of GDDR6X on his RTX 3090, designated it to be recognized as a RAM drive, installed the new Crysis on it, and it ran well. It was a she.
|
# ? Dec 21, 2020 16:52 |
|
Dilber posted:B&H must have gotten more 5900x in stock than they expected. I was originally a "december 23rd" and then got the email pushing me to January sometime. hmmmm, I originally got an email for December 21st, updated to January, and no update since then. I guess you just got luckier!
|
# ? Dec 21, 2020 18:22 |
|
gradenko_2000 posted:I assume that there's some hackery/customization/specialized tuning as far as consoles using GDDR also as "system RAM" without it hobbling the experience. There's probably no need for any hackery or workarounds, because you can't do things like write a script to start 80 processes at once, or flip from Chrome with 350 tabs to Photoshop with an 85GB PSD to Cyberpunk 2077 multiple times per second. What I'm saying is that I think the use case and UI design make it unlikely that a human would ever notice any possible suboptimal memory behavior.
|
# ? Dec 21, 2020 20:29 |
|
Gwaihir posted:hmmmm, I originally got an email for December 21st, updated to January, and no update since then. I guess you just got luckier! It came in yesterday, so I am lucky there. Less lucky: the BIOS I flashed to yesterday has now been pulled for stability issues, so now I get to do a bunch of stress testing to see if I need to go back to an earlier version or just wait for the next one.
|
# ? Dec 21, 2020 20:45 |
|
For you mobile/APU-heads out there: https://twitter.com/_rogame/status/1340294195433660422 https://twitter.com/_rogame/status/1340294511218585601 One thing of note: there are some rumors that VanGogh may actually be cancelled, with MS instead going ARM.
|
# ? Dec 21, 2020 22:05 |