K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Pablo Bluth posted:

There's no complexity that would free up die space?

These desktop CPUs are behemoths; the static cost of a bit of hardware to implement x86 support is extremely small. You can only really utilize so much of the die at once anyway for thermal and other power reasons, so while there might be some minuscule cost in terms of engineering and BOM, in actual performance the cost of x86 support on x64 is likely literally zero.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
yeah, legacy x86 is ultimately a fairly small amount of silicon area, as in the whole decoder is well under 5% of the core area.

modern CPUs are basically cache with a little bit of processor tacked on; something like 2/3 or 3/4 of the chip is just cache, and of the part that's not cache, the decoder doesn't make up that much area.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Or to look at it from another angle, the Pentium II had around 7.5 million transistors. Apparently a Zen2 chiplet has around 4.15 billion transistors.
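Quick back-of-the-envelope on those two (approximate) figures:

code:
# rough ratio of the two (approximate) transistor counts above
pentium_ii = 7.5e6     # Pentium II, ~7.5 million transistors
zen2_chiplet = 4.15e9  # Zen 2 chiplet, ~4.15 billion transistors
print(f"{zen2_chiplet / pentium_ii:.0f}x")  # ~553x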

Mofabio
May 15, 2003
(y - mx)*(1/(inf))*(PV/RT)*(2.718)*(V/I)
What about the chip with the fewest transistors? Can you get 1-, or even 0-transistor chips?

wargames
Mar 16, 2008

official yospos cat censor

gradenko_2000 posted:

AMD only went with HBM because it consumes less power, and they needed to free up power from the memory side of the card so they could spend the budget on goosing the GPU ever higher, because they were still on GCN with the Vega and Radeon VII.

They also helped develop it; power is why they stayed with it.

hobbesmaster
Jan 28, 2008

Mofabio posted:

What about the chip with the fewest transistors? Can you get 1-, or even 0-transistor chips?

Define “chip”

vvv all use transistors of some type to implement gates. the general alternative was vacuum tubes, like the triode

hobbesmaster fucked around with this message at 03:58 on Nov 23, 2020

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Mofabio posted:

What about the chip with the fewest transistors? Can you get 1-, or even 0-transistor chips?

Combinatorial logic chips are around.

https://electronicsclub.info/cmos.htm
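e.g. the 4000-series CMOS parts on that page are just a handful of fixed gates with no processor at all, a few transistors per gate. A quick sketch (Python, purely to illustrate) of what one gate of a 4011-style quad 2-input NAND computes:

code:
# truth table for one gate of a 4011-style quad 2-input NAND;
# the whole chip is just four of these, ~4 transistors per gate in CMOS
def nand(a: int, b: int) -> int:
    return 1 - (a & b)

for a in (0, 1):
    for b in (0, 1):
        print(f"A={a} B={b} -> OUT={nand(a, b)}")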

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

SCheeseman posted:

Ryzen laptops do a pretty decent job playing games on Linux (provided it doesn't have Nvidia anything in it).

There's nothing wrong with Nvidia support on Linux, except that it isn't open source. Up until about 3 years ago, when AMDGPU started maturing, Nvidia was the only sane choice, and AMD's (also closed at the time) drivers were universally regarded as an absolute garbage fire.

AMD has done a lot of good work and come incredibly far, but that doesn't change anything about where Nvidia is at. They're still imperfect, but also still mostly fine and safe.

Lastly, as someone who lived through the RX 5700 release on Linux (and bought one early, because AMDGPU had gotten so good), I'd recommend waiting a couple-three months to see how AMD's drivers shake out before picking up a 5800 series.

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.

mdxi posted:

There's nothing wrong with Nvidia support on Linux, except that it isn't open source.

Has Nvidia fixed Optimus so it works reliably on Linux yet? Or added support for Wayland?

quote:

Lastly, as someone who lived through the RX 5700 release on Linux (and bought one early, because AMDGPU had gotten so good), I'd recommend waiting a couple-three months to see how AMD's drivers shake out before picking up a 5800 series.

According to Wendell from Level1Techs, the 6800 series works great on Linux, both natively and with hardware passthrough to Windows VMs:

https://www.youtube.com/watch?v=ykiU49gTNak

EDIT: Phoronix didn't report any major issues in their review either.

Mr.Radar fucked around with this message at 05:05 on Nov 23, 2020

repiv
Aug 13, 2009

Didn't the kernel devs effectively block NV from supporting Optimus in the binary driver by making the relevant APIs GPL-only?

Mofabio
May 15, 2003
(y - mx)*(1/(inf))*(PV/RT)*(2.718)*(V/I)

Paul MaudDib posted:

Combinatorial logic chips are around.

https://electronicsclub.info/cmos.htm

Whoa, that's cool, I love combinatorics. Triangular numbers show up in natural phenomena all the time.

Truga
May 4, 2014
Lipstick Apathy

mdxi posted:

There's nothing wrong with Nvidia support on Linux, except that it isn't open source.

it is next to impossible to get many laptop GPUs to work on linux, and AMD+nvidia laptops specifically are a nigh-impossible nut to crack. the G14 took like a year to get into working order, and that's an actually popular laptop; if you buy something even a little bit obscure, :rip:

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Sorry, I missed the "laptop" part, and don't have any first-hand knowledge there. Mea culpa.

shrike82
Jun 11, 2005

i've encountered quite a few issues with nvidia cards on linux but tbf that's just linux in general

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Truga posted:

it is next to impossible to get many laptop GPUs to work on linux, and AMD+nvidia laptops specifically are a nigh-impossible nut to crack. the G14 took like a year to get into working order, and that's an actually popular laptop; if you buy something even a little bit obscure, :rip:

The G14 has only been out for a little over 7 months. It was released April 2nd.

PC LOAD LETTER
May 23, 2005
WTF?!

Arzachel posted:

The cache isn't inherent to the arch though, I don't think the console APUs have it, do they? Not like it matters, AMD seems determined to recycle Vega until DDR5.

You're right, it isn't inherent to the arch. Some sort of custom work would be needed in the hardware and/or drivers to make it useful here. But they could do it if they wanted to. And given the density they've been able to achieve, it might actually pan out economically.

I think, given the way the iGPU is (eternally) bandwidth-starved, so long as it has the minimum necessary DX/OGL/Vulkan feature support it probably doesn't make much sense to switch iGPU arch unless you can actually give it more bandwidth somehow.

DDR5 is supposed to help somewhat here vs DDR4, but it's still going to be only dual channel, so I wouldn't go expecting miracles from that either. I sometimes wish either Intel or AMD would be willing to push a 3- or 4-channel main memory solution across their entire lineup.

I know it'd add significant costs to the platform (which would probably kill the idea), but that seems to be (short of adding HBM or some other huge on-package/die caches) the only way to truly solve the chronic bandwidth starvation issues that iGPUs have had since they were introduced. Given the way the prices are on the new X99 LGA2011-v3 socket mobos, which are quad channel, it doesn't seem quuuuuuuuuite so absurd of an idea anymore.
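The peak-bandwidth math behind this is simple; a quick sketch assuming DDR4-3200 and standard 64-bit (8-byte) channels:

code:
# theoretical peak bandwidth per channel count for DDR4-3200
# (3200 MT/s x 8 bytes per transfer per 64-bit channel)
MTS = 3200e6
BYTES_PER_TRANSFER = 8

for channels in (2, 3, 4):
    gbps = MTS * BYTES_PER_TRANSFER * channels / 1e9
    print(f"{channels} channels: {gbps:.1f} GB/s")
# 2 channels: 51.2 GB/s, 3 channels: 76.8 GB/s, 4 channels: 102.4 GB/s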

SCheeseman
Apr 23, 2003

mdxi posted:

There's nothing wrong with Nvidia support on Linux, except that it isn't open source. About 3 years ago, when AMDGPU started maturing, it was the only sane choice, and AMD's (also closed at the time) drivers were universally regarded as an absolute garbage fire.

Support has rotted. I've had nothing but problems running Nvidia hardware on a bunch of different forms of desktop Linux. Vsync is still a mess, driver crashes are frequent (particularly after sleep), support will always be lacking for things like Gallium Nine, and they're still dragging their feet on Wayland. In my experience, most modern AMD and Intel GPUs require no configuration for a good experience out of the box; it's night and day.

CaptainSarcastic
Jul 6, 2013



I've been running the proprietary Nvidia drivers on Linux for years and have rarely had a problem. A few times there was a glitch due to kernel updates and driver updates getting out of sync, but those occasions have been few and far between. I haven't had to do any more setup on my Linux installs than I have on my Windows installs for Nvidia, at least not for several years.

Just for the sake of full disclosure, I have relied on my distro's repos for the Nvidia drivers, and haven't installed them from source or anything. My main desktop is running openSUSE Tumbleweed and the Nvidia drivers are just fine on it.

interrodactyl
Nov 8, 2011

you have no dignity
Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps.

Arzachel
May 12, 2012

interrodactyl posted:

Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps.

You can reduce the power target (PPT) either through the BIOS or Ryzen Master, but 73c isn't bad for all-core loads.
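For context, stock AM4 limits follow a commonly cited rule of thumb (PPT is roughly 1.35x TDP), so a 65W part like the 5600X ships with an ~88W package power target. A quick sketch of that rule (actual limits come from the BIOS / Ryzen Master; these are just the usual defaults):

code:
# commonly cited AM4 rule of thumb: stock PPT ~= 1.35 x TDP
def stock_ppt(tdp_watts: float) -> int:
    return round(tdp_watts * 1.35)

for tdp in (65, 105):
    print(f"{tdp}W TDP -> ~{stock_ppt(tdp)}W PPT")
# 65W TDP -> ~88W PPT (5600X), 105W TDP -> ~142W PPT (5800X and up)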

SCheeseman
Apr 23, 2003

Bouncing between variants of Ubuntu and Arch, I've had similar problems on just about all of them. Google searches seemed to indicate they weren't isolated to me either; whatever success you've had, there's clear evidence that things aren't going as smoothly for others.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

interrodactyl posted:

Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps.

What do you mean "heavy load"?

What cooler are you using?

Short answer is it's probably not worthwhile for that CPU IMO.

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!
There's just something deep in the human psyche about CPU temps, huh? Like 60% of people immediately install a CPU and are instantly concerned about its temperature.

Mine happily hangs out in the 70s under load, boosting all 6 cores north of 4.5; it's an incredible chip! Those are good temps.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

bus hustler posted:

There's just something deep in the human psyche about CPU temps, huh? Like 60% of people immediately install a CPU and are instantly concerned about its temperature.

Mine happily hangs out in the 70s under load, boosting all 6 cores north of 4.5; it's an incredible chip! Those are good temps.

Enough people mis-mount their cooler that it's worth looking at, but that usually causes obvious temperature problems almost instantly.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

interrodactyl posted:

Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps.

73 under load doesn't begin to be notable.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

interrodactyl posted:

Is undervolting the 5600x worth it (and if so what's the best way to do it)? Under heavy load I'm hitting ~73c with mine, but would be happy to take a slight loss in gaming fps for cooler temps.

Edit: Whoops, this is the CPU thread; I'm going to let my shameful post stand. I got confused by the 6800 discussion and I'm still not used to 5xxx CPU model numbers!

I'd say that basically every GPU I've had in the last decade has been over 80C under load. The R9 290 would spend all day long pinned at 95C.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
The bad old days of CPUs literally smoking if they got too hot are behind us. There's like a billion redundancies built in at the software and hardware levels to stop these newfangled CPUs from destroying themselves.

Rolo
Nov 16, 2005

Hmm, what have we here?

Zedsdeadbaby posted:

The bad old days of CPUs literally smoking if they got too hot are behind us. There's like a billion redundancies built in at the software and hardware levels to stop these newfangled CPUs from destroying themselves.

They’re like human babies now instead of horses.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Twerk from Home posted:

Edit: Whoops, this is the CPU thread; I'm going to let my shameful post stand. I got confused by the 6800 discussion and I'm still not used to 5xxx CPU model numbers!

I'd say that basically every GPU I've had in the last decade has been over 80C under load. The R9 290 would spend all day long pinned at 95C.

Gonna pair a 5600X with a 5600XT

feedmegin
Jul 30, 2008

FuturePastNow posted:

If Apple's Arm transition does anything to affect the overall computer industry, it'll be (at least in the near term) to raise the public consciousness that a computer doesn't have to be "an Intel".

I mean, they knew this 20 years ago before Apple's transition away from PowerPC and it didn't make much difference...

interrodactyl
Nov 8, 2011

you have no dignity

sean10mm posted:

What do you mean "heavy load"?

What cooler are you using?

Short answer is it's probably not worthwhile for that CPU IMO.

I was running Control at max settings in 4K with ray tracing, as well as 3DMark benchmarking to set an undervolt on my 3080, and decided to take a look at CPU temps as well. It's primarily a gaming rig.

I've got a Noctua NH-U14S on it.

And thanks, I'm not too concerned about it temp-wise, but I'd be fine to tweak things if it's really worth it.

But given everyone else's answers, seems like it's not worth the trouble. I'll just set up a custom fan curve and call it a finished build!

Xaris
Jul 25, 2006

Lucky there's a family guy
Lucky there's a man who positively can do
All the things that make us
Laugh and cry
Has there been any good work done on the best settings for PBO or Curve Optimizer for the 5600X yet? Zen 2 had some weird EDC=1 bugs that you could use, though I mostly just set things to Auto OC in Ryzen Master, but if I could do better I'm down to tinker.

Arzachel
May 12, 2012
Speaking of undervolting

Seems like it's aimed at squeezing more efficiency out of thermally bound multi-core loads while still letting the CPU juice single cores for maximum turbo frequencies.

CaptainSarcastic
Jul 6, 2013



I think I need to make a bespoke artisanal heirloom PC that will boot from vinyl.

https://hackaday.com/2020/11/23/booting-a-pc-from-vinyl-for-a-warmer-richer-os/

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord
Hahaha that’s loving awesome

ijyt
Apr 10, 2012

The Scan restock this week is uh, looking pitiful. I'm 138th in queue :(

etalian
Mar 20, 2006

ijyt posted:

The Scan restock this week is uh, looking pitiful. I'm 138th in queue :(



lmao at the total backlog count.

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!

ijyt posted:

The Scan restock this week is uh, looking pitiful. I'm 138th in queue :(




this is actually fantastic, and we don't normally get to see stuff like this across the line. we don't know how many already shipped, but it sure does look like AMD's "don't buy the medium" pricing worked great at pushing tons of "just in case" people to the 5900x.

Cross-Section
Mar 18, 2009

Any particular reason why an action as seemingly low-key as loading file thumbnails in Windows Explorer is making my 5900x shoot up to 75-80C, but an actual CPU-intensive program like Assassin's Creed Valhalla only makes it go as high as (roughly) 70C?

Or is this just PBO being weird?

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Cross-Section posted:

Any particular reason why an action as seemingly low-key as loading file thumbnails in Windows Explorer is making my 5900x shoot up to 75-80C, but an actual CPU-intensive program like Assassin's Creed Valhalla only makes it go as high as (roughly) 70C?

Or is this just PBO being weird?

Probably PBO being PBO; similar sorts of things happened with the Zen 2 chips back when they first came out, and I think still happen with them at times.

They'd ramp up at the drop of a hat, which was normal behavior, but people freaked out, so an AGESA update changed things some.
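If you want to see whether it's just short boost spikes, polling the temp sensor for a minute makes it obvious. A minimal sketch (Linux, using psutil; "k10temp" is the usual sensor name for AMD CPUs, adjust for your board):

code:
# poll the CPU temp once a second to catch short PBO boost spikes
import time
import psutil

for _ in range(60):
    readings = psutil.sensors_temperatures().get("k10temp", [])
    if readings:
        print(f"{time.strftime('%H:%M:%S')}  {readings[0].current:.1f} C")
    time.sleep(1)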
