|
Pablo Bluth posted:There's no complexity that would free up die space? These desktop CPUs are behemoths; the static cost of a bit of hardware to implement x86 support is extremely small. You can only really utilize so much of the die at once anyway for thermal and power reasons, so while there might be some minuscule cost in terms of engineering and BOM, the actual performance cost of x86 support on x64 is likely zero.
|
# ? Nov 23, 2020 02:07 |
|
|
yeah, legacy x86 is ultimately a fairly small amount of silicon area, as in the whole decoder is well under 5% of the core area. modern CPUs are basically cache with a little bit of processor tacked on: something like 2/3 to 3/4 of the chip is just cache, and of the part that isn't cache, the decoder doesn't make up much area.
|
# ? Nov 23, 2020 02:09 |
|
Or to look at it from another angle, the Pentium II had around 7.5 million transistors. Apparently a Zen2 chiplet has around 4.15 billion transistors.
|
# ? Nov 23, 2020 02:16 |
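(The comparison above is easy to sanity-check. A minimal sketch, using only the two figures quoted in the post — the ~7.5 million transistors attributed to the Pentium II and the ~4.15 billion attributed to a Zen 2 chiplet:)

```python
# Back-of-envelope from the figures quoted above: how many whole
# Pentium IIs fit in one Zen 2 chiplet's transistor budget?
pentium_ii = 7_500_000        # ~7.5 million transistors
zen2_chiplet = 4_150_000_000  # ~4.15 billion transistors

ratio = zen2_chiplet / pentium_ii
print(f"~{ratio:.0f}x")  # roughly 553 Pentium IIs per chiplet
```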
|
What about the chip that has the fewest transistors? Can you get 1, or even 0 transistor chips?
|
# ? Nov 23, 2020 03:17 |
|
gradenko_2000 posted:AMD only went with HBM because it consumes less power, and they needed to free up power from the memory side of the card so they could spend the budget on goosing the GPU ever higher, because they were still on GCN with the Vega and Radeon VII. They also helped developed it, power is why they stayed with it.
|
# ? Nov 23, 2020 03:31 |
|
Mofabio posted:What about the chip that has the fewest transistors? Can you get 1, or even 0 transistor chips? Define “chip” vvv: all of them use transistors of some type to implement gates. The general alternative was vacuum tubes, like a triode. hobbesmaster fucked around with this message at 03:58 on Nov 23, 2020 |
# ? Nov 23, 2020 03:34 |
|
Mofabio posted:What about the chip that has the fewest transistors? Can you get 1, or even 0 transistor chips? Combinatorial logic chips are around. https://electronicsclub.info/cmos.htm
|
# ? Nov 23, 2020 03:43 |
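(The chips on that page are pure combinational logic — no clock, no state, output is a function of the inputs. A toy sketch modeling one gate of a CD4011 quad 2-input NAND, the classic example of such a chip — chosen here as an illustrative part, not one named in the thread:)

```python
# Toy model of one gate from a CD4011-style quad 2-input NAND.
# Combinational logic: the output depends only on the current inputs.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

# Full truth table for a 2-input NAND.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", nand(a, b))

# NAND is functionally complete: e.g. AND(1,1) built from NANDs alone.
assert nand(nand(1, 1), nand(1, 1)) == 1
```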
|
SCheeseman posted:Ryzen laptops do a pretty decent job playing games on Linux (provided it doesn't have Nvidia anything in it). There's nothing wrong with Nvidia support on Linux, except that it isn't open source. About 3 years ago, when AMDGPU started maturing, it was the only sane choice, and AMD's (also closed at the time) drivers were universally regarded as an absolute garbage fire. AMD has done a lot of good work and come incredibly far, but that doesn't change anything about where Nvidia is at. They're still imperfect, but also still mostly fine and safe. Lastly, as someone who lived through the RX 5700 release on Linux (and bought one early, because AMDGPU had gotten so good), I'd recommend waiting a couple-three months to see how AMD's drivers shake out before picking up a 5800 series.
|
# ? Nov 23, 2020 03:56 |
|
mdxi posted:There's nothing wrong with Nvidia support on Linux, except that it isn't open source. Has Nvidia fixed Optimus so it works reliably on Linux yet? Or added support for Wayland? quote:Lastly, as someone who lived through the RX 5700 release on Linux (and bought one early, because AMDGPU had gotten so good), I'd recommend waiting a couple-three months to see how AMDs drivers shake out before picking up a 5800 series. According to Wendell from Level1Techs the 6800 series works great on Linux, both natively and with hardware passthrough to Windows VMs: https://www.youtube.com/watch?v=ykiU49gTNak EDIT: Phoronix didn't report any major issues in their review either. Mr.Radar fucked around with this message at 05:05 on Nov 23, 2020 |
# ? Nov 23, 2020 04:58 |
|
Didn't the kernel devs effectively block NV from supporting Optimus in the binary driver by making the relevant APIs GPL-only?
|
# ? Nov 23, 2020 05:08 |
|
Paul MaudDib posted:Combinatorial logic chips are around. Whoa, that's cool, I love combinatorics. Triangular numbers show up in natural phenomena all the time.
|
# ? Nov 23, 2020 05:16 |
|
mdxi posted:There's nothing wrong with Nvidia support on Linux, except that it isn't open source. it is next to impossible to get many laptop GPUs to work on linux, and AMD+nvidia laptops specifically are a nigh-impossible nut to crack. G14 took like a year to get into working order, and that's an actually popular laptop. if you buy something even a little bit obscure…
|
# ? Nov 23, 2020 06:40 |
|
Sorry, I missed the "laptop" part, and don't have any first-hand knowledge there. Mea culpa.
|
# ? Nov 23, 2020 06:43 |
|
i've encountered quite a few issues with nvidia cards on linux but tbf that's just linux in general
|
# ? Nov 23, 2020 06:44 |
|
Truga posted:it is next to impossible to get many laptop GPUs to work on linux, and AMD+nvidia laptops specifically are a nigh-impossible nut to crack. G14 took like a year to get into working order, and that's an actually popular laptop, if you buy something even a little bit obscure, The G14 has only been out for a little over 7 months. It was released April 2nd.
|
# ? Nov 23, 2020 07:03 |
|
Arzachel posted:The cache isn't inherent to the arch though, I don't think the console APUs have it, do they? Not like it matters, AMD seems to be determined on recycling Vega until DDR5. You're right it isn't inherent to the arch. Some sort of custom work would be needed in the hardware and/or drivers to make it useful here. But they could do it if they wanted to, and given the density they've been able to achieve it might actually pan out economically. I think given the way the iGPU is (eternally) bandwidth starved, so long as it has the minimum necessary DX/OGL/Vulkan feature support it probably doesn't make much sense to switch iGPU arch unless you can actually give it more bandwidth somehow. DDR5 is supposed to help somewhat here vs DDR4, but it's still going to be only dual channel, so I wouldn't go expecting miracles from that either. I sometimes wish either Intel or AMD would be willing to push a 3 or 4 channel main memory solution across their entire lineup. I know it'd add significant costs to the platform (which would probably kill the idea) but that seems to be (short of adding HBM or some other on package/die huge caches) the only way to truly solve the chronic bandwidth starvation issues that iGPUs have had since they were introduced. Given the way the prices are on the X99 LGA2011-v3 socket mobos, which are quad channel, it doesn't seem quuuuuuuuuite so absurd of an idea anymore.
|
# ? Nov 23, 2020 08:53 |
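(The bandwidth math behind the dual-vs-quad-channel point above is simple to sketch. The transfer rates here are illustrative picks — DDR4-3200 and DDR5-4800 — not figures from the post, and this is peak theoretical bandwidth, not what an iGPU actually sees:)

```python
# Rough peak bandwidth: channels x transfer rate (MT/s) x 8 bytes per
# 64-bit channel. Illustrative rates, not a spec quote.
def peak_gbs(channels: int, mt_per_s: int, bytes_per_channel: int = 8) -> float:
    return channels * mt_per_s * bytes_per_channel / 1000  # GB/s

print(peak_gbs(2, 3200))  # dual-channel DDR4-3200: 51.2 GB/s
print(peak_gbs(2, 4800))  # dual-channel DDR5-4800: 76.8 GB/s
print(peak_gbs(4, 3200))  # quad-channel DDR4-3200: 102.4 GB/s
```

Which shows why DDR5 alone helps "somewhat" while doubling the channel count would help far more.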
|
mdxi posted:There's nothing wrong with Nvidia support on Linux, except that it isn't open source. About 3 years ago, when AMDGPU started maturing, it was the only sane choice, and AMD's (also closed at the time) drivers were universally regarded as an absolute garbage fire. Support has rotted. I've had nothing but problems running Nvidia hardware on a bunch of different forms of desktop linux. Vsync is still a mess, driver crashes are frequent (particularly after sleep), support will always be lacking for things like Gallium Nine, and they're still dragging their feet on Wayland. In my experience most modern AMD and Intel GPUs require no configuration for a good experience out of the box; it's night and day.
|
# ? Nov 23, 2020 09:14 |
|
I've been running the proprietary Nvidia drivers on Linux for years and rarely had a problem. A few times there was a glitch due to kernel updates and driver updates getting out of sync, but that's been few and far between. I haven't had to do any more setup on my Linux installs than I have on my Windows installs for Nvidia, at least not for several years. Just for the sake of full disclosure, I have relied on my distro's repos for the Nvidia drivers, and haven't installed them from source or anything. My main desktop is running openSUSE Tumbleweed and the Nvidia drivers are just fine on it.
|
# ? Nov 23, 2020 10:12 |
|
Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps.
|
# ? Nov 23, 2020 11:20 |
|
interrodactyl posted:Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps. You can reduce the power target (PPT) either through the BIOS or Ryzen Master, but 73c isn't bad for all-core loads.
|
# ? Nov 23, 2020 11:28 |
|
Bouncing between variants of Ubuntu and Arch I've had similar problems on just about all of them. Google searches seemed to indicate they weren't isolated to me either; whatever success you've had, there is clear evidence that things aren't going as smoothly for others.
|
# ? Nov 23, 2020 11:46 |
|
interrodactyl posted:Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps. What do you mean "heavy load"? What cooler are you using? Short answer is it's probably not worthwhile for that CPU IMO.
|
# ? Nov 23, 2020 13:10 |
|
There's just something deep in the human psyche about CPU temps huh? Like 60% of people install a CPU and are instantly concerned about its temperature. Mine happily hangs out in the 70s under load boosting all 6 cores north of 4.5, it's an incredible chip! Those are good temps
|
# ? Nov 23, 2020 13:33 |
|
bus hustler posted:There's just something deep in the human psyche about cpu temps huh? Like 60% of people immediately install a cpu and are instantly concerned about its temperature Enough people mis-mount their cooler that it's worth looking at, but that usually causes obvious temperature problems almost instantly.
|
# ? Nov 23, 2020 13:39 |
|
interrodactyl posted:Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps. 73 under load doesn't begin to be notable.
|
# ? Nov 23, 2020 14:37 |
|
interrodactyl posted:Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps. Edit: Whoops this is the CPU thread, I'm going to let my shameful post stand. I got confused by the 6800 discussion and I'm still not used to 5xxx CPU model numbers! I'd say that basically every GPU I've had in the last decade has been over 80C under load. The R9 290 would spend all day long pinned at 95C.
|
# ? Nov 23, 2020 15:03 |
|
The bad old days of CPUs literally smoking if they get too hot are behind us. There's like a billion redundancies built in at the software and hardware level to stop these new-fangled CPUs from destroying themselves.
|
# ? Nov 23, 2020 16:50 |
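(The "billion redundancies" above boil down to closed-loop throttling: over the limit, step clocks down; back under it, step back up. A toy sketch of that idea — real CPUs do this in hardware/firmware at microsecond timescales, and the limit and step values here are assumptions for illustration, not any chip's real numbers:)

```python
# Toy model of thermal throttling: a control loop that trades clock
# speed against temperature. All constants are made-up examples.
TJ_MAX = 95.0   # assumed throttle point, degrees C
STEP_MHZ = 100  # assumed clock adjustment per control tick

def next_clock(temp_c: float, clock_mhz: int, base_mhz: int = 3700,
               boost_mhz: int = 4600) -> int:
    if temp_c >= TJ_MAX:
        return max(base_mhz, clock_mhz - STEP_MHZ)  # too hot: back off
    return min(boost_mhz, clock_mhz + STEP_MHZ)     # headroom: boost

print(next_clock(96.0, 4600))  # hot at max boost: drops to 4500
print(next_clock(70.0, 4500))  # cool again: climbs back to 4600
```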
|
Zedsdeadbaby posted:The bad old days of CPUs literally smoking if they get too hot are behind us. There's like a billion redundancies built in at the software and hardware level to stop these new-fangled CPUs from destroying themselves. They’re like human babies now instead of horses.
|
# ? Nov 23, 2020 16:51 |
|
Twerk from Home posted:Edit: Whoops this is the CPU thread, I'm going to let my shameful post stand. I got confused by the 6800 discussion and I'm still not used to 5xxx CPU model numbers! Gonna pair a 5600X with a 5600XT
|
# ? Nov 23, 2020 17:00 |
|
FuturePastNow posted:If Apple's Arm transition does anything to affect the overall computer industry, it'll be (at least in the near term) to raise the public consciousness that a computer doesn't have to be "an Intel". I mean, they knew this 20 years ago before Apple's transition away from PowerPC and it didn't make much difference...
|
# ? Nov 23, 2020 17:34 |
|
sean10mm posted:What do you mean "heavy load"? I was running Control max settings at 4K with ray tracing, as well as 3DMark benchmarking to set an undervolt on my 3080, and decided to take a look at CPU temps as well. It's primarily a gaming rig. I've got a Noctua NH-U14S on it. And thanks, I'm not too concerned about it temp wise but I'd be fine to tweak things if it's really worth it. But given everyone else's answers, seems like it's not worth the trouble. I'll just set up a custom fan curve and call it a finished build!
|
# ? Nov 23, 2020 17:54 |
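(A custom fan curve like the one mentioned above is just piecewise linear interpolation from temperature to fan duty cycle. A minimal sketch — the curve points are made-up examples, not recommendations for any particular cooler:)

```python
# Piecewise-linear fan curve: (temperature C, fan duty %) points,
# clamped at both ends, interpolated between them.
CURVE = [(40, 30), (60, 45), (73, 70), (85, 100)]  # example points

def fan_duty(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(73))  # 70.0 (% duty) at the load temp reported above
```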
|
Has there been any good work done yet on the best settings for PBO or Curve Optimizer for the 5600x? Zen2 had some weird EDC=1 bugs that you could use, though I mostly just set things to Auto OC in Ryzen Master, but if I could do better I'm down to tinker
|
# ? Nov 23, 2020 18:34 |
|
Speaking of undervolting: seems like it's aimed at squeezing more efficiency out of thermally bound multi-core loads while still letting the CPU juice single cores for maximum turbo frequencies.
|
# ? Nov 23, 2020 20:28 |
|
I think I need to make a bespoke artisanal heirloom PC that will boot from vinyl. https://hackaday.com/2020/11/23/booting-a-pc-from-vinyl-for-a-warmer-richer-os/
|
# ? Nov 23, 2020 22:40 |
|
Hahaha that’s loving awesome
|
# ? Nov 24, 2020 09:01 |
|
The Scan restock this week is uh, looking pitiful. I'm 138th in queue
|
# ? Nov 24, 2020 13:38 |
|
ijyt posted:The Scan restock this week is uh, looking pitiful. I'm 138th in queue lmao at the total backlog count.
|
# ? Nov 24, 2020 14:33 |
|
ijyt posted:The Scan restock this week is uh, looking pitiful. I'm 138th in queue this is actually fantastic, and we don't normally get to see stuff like this across the line. we don't know how many already shipped, but it sure does look like AMD's "don't buy the medium" pricing worked great, pushing tons of "just in case" people to the 5900x.
|
# ? Nov 24, 2020 15:01 |
|
Any particular reason why an action so seemingly low-key as loading file thumbnails in Windows Explorer is making my 5900x shoot up to 75-80C but an actual CPU-intensive program like Assassin's Creed Valhalla only makes it go as high as (roughly) 70C? Or is this just PBO being weird?
|
# ? Nov 24, 2020 15:31 |
|
|
Cross-Section posted:Any particular reason why an action so seemingly low-key as loading file thumbnails in Windows Explorer is making my 5900x shoot up to 75-80C but an actual CPU-intensive program like Assassin's Creed Valhalla only makes it go as high as (roughly) 70C? Probably PBO being PBO; similar things happened with the Zen 2 chips back when they first came out, and still happen with them at times I think? They'd ramp up at the drop of a hat, which was normal behavior, but people freaked out, so an AGESA update changed things some.
|
# ? Nov 24, 2020 15:36 |