Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

shrike82 posted:

Yeah people are being a little glib about the issues with both CPUs and GPUs getting hotter

Not to mention power draw. My country saw a 52% rise in electricity costs in April, projected to rise another ~50% in October. It's out of control. The cost of living situation here is getting abysmally bad, to the point where I'm skipping certain next-gen hardware if it has a higher power draw than what I already use


Gwaihir
Dec 8, 2009
Hair Elf

Zedsdeadbaby posted:

Not to mention power draw. My country saw a 52% rise in electricity costs in April, projected to rise another ~50% in October. It's out of control. The cost of living situation here is getting abysmally bad, to the point where I'm skipping certain next-gen hardware if it has a higher power draw than what I already use

New stuff is still going to have large performance per watt improvements, I feel like that's getting a bit overlooked here. It's just that absolute performance is also going up by so much that even though it's more efficient, it's still possible to use more power.

But that means you get more options. Depending on what you want, you can have the same performance for less power, the same power with more performance, or more power and way more performance. That's the good thing about modern hardware: like DrDork mentioned, it's trivially configurable by dragging a slider.
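Those trade-offs can be sketched numerically. The 1.5x efficiency gain and the cube-root power-to-performance curve below are illustrative assumptions, not measurements of any real part:

```python
# Illustrative only: suppose a new part is 1.5x faster at the same power,
# and that performance scales roughly with the cube root of power near
# the top of the voltage/frequency curve (a rule of thumb, not a datasheet).
EFFICIENCY_GAIN = 1.5

def relative_perf(power_ratio):
    # performance relative to the old part running at its stock power
    return EFFICIENCY_GAIN * power_ratio ** (1 / 3)

print(f"same power:  {relative_perf(1.0):.2f}x perf")
print(f"same perf:   needs {(1 / EFFICIENCY_GAIN) ** 3:.2f}x power")
print(f"1.5x power:  {relative_perf(1.5):.2f}x perf")
```

Same performance at roughly a third of the power, or ~1.7x the performance if you also raise the power limit: the slider picks the point on that curve.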

Truga
May 4, 2014
Lipstick Apathy
has anyone benched X3D in games like oxygen not included or stellaris yet?

Cygni
Nov 12, 2005

raring to post

ChinaTimes is reporting (with some AWESOME Google Translate errors) lots of Zen 4 details. Apparently the 5nm Zen 4 dies will enter full production this month, with the I/O die being on TSMC's 6nm. The X670 chipset will actually be a "chipset" again, with two separate dies. Trying to parse the translation, it sounds to me like the two chipset dies might be identical, though. If that's true, I imagine the "B650" chipset might just be one of the dies. Or they might be two distinct dies with different functions, like in the olden days.

All of this is being made by ASMedia again (which is a return to the high end for them, as X570 was a reused I/O die last time)

https://www-chinatimes-com.translate.goog/newspapers/20220418000171-260206?chdtv&_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en&_x_tr_pto=wapp

TearsOfPirates
Jun 11, 2016

Stultior stulto fuisti, qui tabellis crederes! - Idiot of idiots, to trust what is written!
Would it be worth upgrading from a 3700X to a 5700X (some time in the future), or would the extra $ for the 5800X/5800X3D be worth it for gaming alone, or is it not much of an upgrade either way?

I probably won't go to AM5 until 2025 at least given my life situation atm.

hobbesmaster
Jan 28, 2008

Well, you probably want to upgrade at some point if you're not going to AM5 anytime soon. Depending on your GPU, I'd look at benchmarks to see how much uplift you'd get from a Zen 3.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Cygni posted:

the I/O die being on TSMCs 6nm

:yeshaha:

Lower those IOD watts.

Dr. Video Games 0031
Jul 17, 2004

Truga posted:

has anyone benched X3D in games like oxygen not included or stellaris yet?

Closest I can think of is Hardware Unboxed benching it in Factorio.



No idea how this relates to anything else, though. Creating a proper benchmark pass in a Paradox game sounds hard due to the amount of RNG involved. Even if you set everything up properly so you're in observer mode, the AI will do wildly different things on each pass, which can affect the results. If a global/galaxy-spanning war breaks out in one pass but not another, that's too much variance to make for usable data.
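A quick way to see why that variance ruins the data: compare the run-to-run spread with and without one disruptive event. The per-pass FPS numbers below are invented for illustration:

```python
import statistics

# Hypothetical per-pass average FPS from a late-game grand-strategy benchmark.
# The second set simulates one pass where a galaxy-spanning war broke out.
quiet_passes = [42.1, 41.8, 42.5, 42.0]
war_passes   = [42.1, 33.4, 42.5, 42.0]

def relative_spread(samples):
    # standard deviation as a fraction of the mean
    return statistics.stdev(samples) / statistics.mean(samples)

print(f"quiet runs: {relative_spread(quiet_passes):.1%} spread")
print(f"war run:    {relative_spread(war_passes):.1%} spread")
```

One outlier pass blows the spread up past 10%, which swamps the few-percent differences you're usually trying to measure between CPUs.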

Dr. Video Games 0031 fucked around with this message at 21:08 on Apr 19, 2022

Khorne
May 1, 2002

FuturePastNow posted:

'Cause grandpa still needs his trackball from 1997
because ps2 is the superior format for human interface devices

Con: need to restart your computer to recognize the device

Pro: interrupts instead of polling, no ridiculous protocol overhead, dead simple & cheap to implement flawlessly

There are so many bad USB implementations out there that mess with input latency in keyboards and mice. And let's not even get started on the key rollover & "grouping" of inputs that frequently happens with USB implementations. Even in 2022 you still need to replace your Windows USB driver to set proper poll rates for a number of USB devices, due to how poor the implementation is.

It's only with laptops and more mobile form factors that USB is a good idea for human interface devices, due to limited space for ports (USB can be used for a lot more than mouse/kb, while PS/2 is fairly limited) and no one wanting to restart their laptop when they plug in an external keyboard or mouse.

Khorne fucked around with this message at 21:43 on Apr 19, 2022

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Rinkles posted:

When/will pcie be replaced?

When x86 moves to SoCs with powerful integrated GPUs and a massively wide bus to internal RAM at 1TB+/sec. So... in 5 years? 10? Never?

shrike82
Jun 11, 2005

Hopefully AMD supports USB4 in AM5 (USB4 has TB3 baked in)

The lack of TB support has been glaring

hobbesmaster
Jan 28, 2008

shrike82 posted:

Hopefully AMD supports USB4 in AM5 (USB4 has TB3 baked in)

The lack of TB support has been glaring

From that article…

quote:

The X670 chipset of the Supermicro AM5 platform will be designed and mass-produced by Xiangshuo. Because it is a dual-chip architecture, it means that each computer will be equipped with two chips to support different transmission interfaces such as USB 4, PCIe Gen 4, and SATA.

We’ll see.

Klyith
Aug 3, 2007

GBS Pledge Week

Khorne posted:

because ps2 is the superior format for human interface devices

Con: need to restart your computer to recognize the device

Pro: interrupts instead of polling

here's why you're wrong!

Polling rate means your mouse might wait up to 1/rate seconds to send a message. The mouse is ready to send input, but is waiting to be polled, maybe that happens in zero time and maybe it waits the full cycle. So while the default 1/125hz = 8ms seems really bad, on average the latency will be 1/2 the worst case.

But! Latency before we start a message isn't the whole story. The PC can't act until it receives the full message. Well how long is a mouse message? A classic, ancient, 3-button mouse sends 3 bytes per event. So how long does it take a PS/2 bus to send 3 bytes? PS/2 can go from 7-12 kbps, 900-1500 bytes per second. So if we assume that PS/2 has zero latency to start a message, 3/1500 = 0.002 = 2ms before the mouse finishes sending data and the PC responds.

Now let's look at USB. USB 1 = 1,572,864 bytes per second. 3 bytes = 3/1572864 = 0.0000019s ... effectively zero. So USB is defined by latency to start a message, PS/2 by latency to complete it. And if 125hz is all that USB can ever do, it loses to PS/2 handily.

But you can increase polling speed:
code:
rate    worst   average
125hz   8ms     4ms
250hz   4ms     2ms  <- ties PS/2
500hz   2ms     1ms  <- USB is always better
1000hz  lmao
Hmmmm, wait a minute, what was that about a 3-button mouse? That's not a mouse with a scroll wheel! The MS IntelliMouse added a scroll wheel and increased the bytes per event to 4. This means that even a 250Hz polling rate is, on average, beating PS/2 in latency with any mouse you'd want to use today.
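The numbers above can be checked with a short sketch, using the same assumed event sizes and bus rates from the post (optimistic 1500 B/s for PS/2, USB 1 full speed, 4-byte scroll-wheel events):

```python
# Rough latency model from the post: PS/2 is limited by transfer time,
# USB by the time spent waiting for the next poll.

PS2_BYTES_PER_SEC = 1500        # optimistic end of the post's 900-1500 B/s range
USB1_BYTES_PER_SEC = 1_572_864  # 12 Mbit/s full speed
EVENT_BYTES = 4                 # scroll-wheel mouse event

def ps2_latency_ms(event_bytes=EVENT_BYTES):
    # assume zero wait to start sending; latency is pure transfer time
    return event_bytes / PS2_BYTES_PER_SEC * 1000

def usb_avg_latency_ms(poll_hz, event_bytes=EVENT_BYTES):
    # average wait is half the polling interval, plus (tiny) transfer time
    wait = 0.5 / poll_hz
    transfer = event_bytes / USB1_BYTES_PER_SEC
    return (wait + transfer) * 1000

print(f"PS/2:         {ps2_latency_ms():.2f} ms")
for hz in (125, 250, 500, 1000):
    print(f"USB @ {hz:>4}Hz: {usb_avg_latency_ms(hz):.2f} ms")
```

With 4-byte events PS/2 needs ~2.7ms just to finish transmitting, so 250Hz USB already wins on average and 500Hz+ wins outright.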

New Zealand can eat me
Aug 29, 2008

:matters:


:catstare:

now seems like a good time to mention that I really hope they use an actually good USB chipset on AM5, I do not like being restricted to Intel boards when I want to experiment with 4,000 or 8,000Hz USB polling

E: IIRC it's a "two-pronged" issue: half is the onboard USB controllers chosen on most boards not being capable, and the other half is some AM4-specific limitation that prevents otherwise-supported PCIe USB cards from working

New Zealand can eat me fucked around with this message at 06:08 on Apr 20, 2022

kliras
Mar 27, 2021
5800X3D now available. doesn't seem in stock locally, but you can buy directly from amd

prices i get are 516€ with 22€ in shipping

hobbesmaster
Jan 28, 2008

My local microcenter has 20 so they’re out there. https://www.microcenter.com/product/647926/amd-ryzen-7-5800x3d-vermeer-34ghz-8-core-am4-boxed-processor-cooler-not-included

Actuarial Fables
Jul 29, 2014

Taco Defender
It's nice when a product launches and it's not instantly out of stock everywhere.

Reserved one for pickup at my local microcenter, gonna have some fun after work today.

explosivo
May 23, 2004

Fueled by Satan

How much of a gaming performance upgrade am I looking at if I went from a 3900X to a 5800X3D?

kliras
Mar 27, 2021

explosivo posted:

How much of a gaming performance upgrade am I looking at if I went from a 3900X to a 5800X3D?
here is the uplift just from 5800x

https://www.youtube.com/watch?v=sw97hj18OUE

explosivo
May 23, 2004

Fueled by Satan


Goddamn.

Klyith
Aug 3, 2007

GBS Pledge Week

explosivo posted:

How much of a gaming performance upgrade am I looking at if I went from a 3900X to a 5800X3D?

How often are you playing games that are CPU constrained (and that CPU constraint is lower than the refresh rate of your monitor)?

Because for most people, playing most games, the 3900X is already faster than your GPU or other limiting factors and the performance upgrade is effectively zero.

explosivo
May 23, 2004

Fueled by Satan

Klyith posted:

How often are you playing games that are CPU constrained (and that CPU constraint is lower than the refresh rate of your monitor)?

Because for most people, playing most games, the 3900X is already faster than your GPU or other limiting factors and the performance upgrade is effectively zero.

I have a 3080ti and play on a 144hz 1440p monitor OR a 120hz 4k TV so there's not much more room to budge on that part of it. I have also been having some hardware issues that I think might be a sign of my CPU failing but that's just a bit of a guess based on some recent error logs. You are probably right though that for the most part I'm not exactly starved for performance it just... could be better y'know? :shobon:

Quaint Quail Quilt
Jun 19, 2006


Ask me about that time I told people mixing bleach and vinegar is okay
I actually play quite a few CPU-limited games, but most specifically need single-core performance, so I just live with it on my 3700X instead of going Intel.

One or two apparently benefit greatly from running off the GPU's RAM; I think Kenshi is one.

I won't be getting a 3090 though; the 3080 founders is plenty for me for now.

E: I wonder if going from a 65W to a 105W default TDP would cause my GPU to run hotter/worse, since the founders card kinda pulls air from my CPU area slightly.

Quaint Quail Quilt fucked around with this message at 15:34 on Apr 20, 2022

hobbesmaster
Jan 28, 2008

If you’re looking at AM4 upgrades from a 3900x the 5900x has a street price under $400.

Kibner
Oct 21, 2008

Acguy Supremacy

Klyith posted:

How often are you playing games that are CPU constrained (and that CPU constraint is lower than the refresh rate of your monitor)?

Because for most people, playing most games, the 3900X is already faster than your GPU or other limiting factors and the performance upgrade is effectively zero.

The 5800x3d would also help with minimum frametimes and frametime consistency, even if the max or overall framerate doesn't improve much.
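The point about minimums can be illustrated with the usual "1% low" metric reviewers use; the frametime traces below are invented for illustration:

```python
# Hypothetical frametimes (ms). Similar averages, but the second trace
# has a periodic stutter spike that drags down its 1% lows.
smooth  = [8.3] * 99 + [12.0]
stutter = [8.0] * 99 + [40.0]

def one_percent_low_fps(frametimes_ms):
    # FPS implied by the slowest 1% of frames
    n_worst = max(1, len(frametimes_ms) // 100)
    worst = sorted(frametimes_ms)[-n_worst:]
    return 1000 / (sum(worst) / len(worst))

print(f"smooth:  {one_percent_low_fps(smooth):.0f} fps 1% low")
print(f"stutter: {one_percent_low_fps(stutter):.0f} fps 1% low")
```

Both traces average well over 100fps, but the stutter trace's 1% low collapses to 25fps, which is exactly the kind of difference a big cache tends to smooth out.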

Arzachel
May 12, 2012

Quaint Quail Quilt posted:

E: I wonder if going from 65w to 105w default tdp would cause my GPU to run hotter/worse since the founders kinda pulls from my cpu area slightly.

You can manually cap the power limit and it would barely impact gaming performance.

MikeC
Jul 19, 2004
BITCH ASS NARC
drat, $599 here in canuckistan. Debating with myself whether paying an extra 220 bucks before tax is worth it just to say I have one.

kliras
Mar 27, 2021

kliras posted:

5800X3D now available. doesn't seem in stock locally, but you can buy directly from amd

prices i get are 516€ with 22€ in shipping
aaand it's gone

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map


Oh wow, is the cache die made somewhere in the US? Which fab?

Klyith
Aug 3, 2007

GBS Pledge Week

Sidesaddle Cavalry posted:



Oh wow, is the cache die made somewhere in the US? Which fab?

The IO die -- GloFo's two 14nm foundries are in NY.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
I have no need to upgrade, but I must.

5950x to upgrade from a 5800x? Yes or No? :q: It's $699 (CDN)

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gwaihir posted:

New stuff is still going to have large performance per watt improvements, I feel like that's getting a bit overlooked here. It's just that absolute performance is also going up by so much that even though it's more efficient, it's still possible to use more power.

But that means you get more options. Depending on what you want, you can have the same performance for less power, the same power with more performance, or more power and way more performance. That's the good thing about modern hardware: like DrDork mentioned, it's trivially configurable by dragging a slider.

Yup, this is essentially "the end of Dennard scaling" in a nutshell. Shrinking is still more efficient transistor for transistor, but power per area is actually going up, so if you maintain die size, that's more transistors and more power in total. A 500mm2 5nm chip has more transistors than a 500mm2 7nm chip and thus will pull more power.

In the long term, Dennard’s Ghost is going to force us to actually take some of those gains as efficiency and not just ship giant chips clocked to the max.

A 4060 (or whatever, naming is fluid depending on where the top of the stack goes) is going to be way way faster than a 3070 at iso power. A 4070 will be way faster when power limited down to 3070 levels. Etc etc.

Going from essentially Samsung 10nm (8nm is a 10+) to TSMC 5P is something like a two-node leap. The people fretting about efficiency are absolutely insane. It's just like the 12900KS: a meme SKU at the top of the stack.
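The iso-area argument reduces to simple arithmetic. The scaling factors below are illustrative stand-ins, not real node data:

```python
# Back-of-envelope for "power per area goes up": a hypothetical full-node
# shrink gives ~1.8x density but only ~1.4x better power per transistor.
DENSITY_GAIN = 1.8
POWER_PER_TRANSISTOR_GAIN = 1.4  # old watts/transistor divided by new

def iso_area_power(old_power_w):
    # same die area -> DENSITY_GAIN more transistors, each drawing
    # 1/POWER_PER_TRANSISTOR_GAIN as much power as before
    return old_power_w * DENSITY_GAIN / POWER_PER_TRANSISTOR_GAIN

print(f"{iso_area_power(200):.0f} W")  # a 200W chip's same-size successor
```

As long as density gains outpace per-transistor power gains, a same-size chip on the new node pulls more total power even though it's more efficient per transistor.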

Cygni
Nov 12, 2005

raring to post

Stanley Pain posted:

I have no need to upgrade, but I must.

5950x to upgrade from a 5800x? Yes or No? :q: It's $699 (CDN)

Naw, not unless you are doing something that actually uses the extra cores. If you really want a shiny new toy, consider making an ITX HTPC/NAS/pfSense/home server box. It's fun, cheaper (especially with used parts), and useful!

kliras
Mar 27, 2021
5800x to 5950x only makes (some? any?) sense if you already have a computer to put the old cpu into

acksplode
May 17, 2004



I'm looking at replacing my 3700X with a 5800X3D to extend the life of my AM4 gaming PC as long as possible. But it's not an urgent upgrade at all, so I'm trying to game out the best time to purchase. The GN review said AMD might not manufacture a lot of these -- should I avoid being clever and just buy now at launch price while supply seems to be OK?

Kibner
Oct 21, 2008

Acguy Supremacy

acksplode posted:

I'm looking at replacing my 3700X with a 5800X3D to extend the life of my AM4 gaming PC as long as possible. But it's not an urgent upgrade at all, so I'm trying to game out the best time to purchase. The GN review said AMD might not manufacture a lot of these -- should I avoid being clever and just buy now at launch price while supply seems to be OK?

If you want it and can afford to get it, just go get it. The future is unknowable.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
OTOH, do you need to extend the life of your 3700X? If you're already primarily GPU limited and you don't see that changing much over the next year or two, maybe you're better off saving the money for your next new PC or GPU upgrade or whatever.

It's enough of a tossup that whether it's a good idea for you or not is something you'll have to judge for yourself.

acksplode
May 17, 2004



K8.0 posted:

OTOH, do you need to extend the life of your 3700X? If you're already primarily GPU limited and you don't see that changing much over the next year or two, maybe you're better off saving the money for your next new PC or GPU upgrade or whatever.

It's enough of a tossup that whether it's a good idea for you or not is something you'll have to judge for yourself.

I built this PC in late 2020 w/ a 3080, so I'm hoping to put off a rebuild or GPU upgrade for at least another few years, and upgrading the CPU would be in support of that goal. I'm usually GPU limited, however I have a display that supports 120Hz and I'd like to start playing with that once I upgrade to a capable AVR. My concern is getting into a situation where I could use more CPU perf, but the best compatible upgrade is no longer being sold.

Klyith
Aug 3, 2007

GBS Pledge Week

acksplode posted:

The GN review said AMD might not manufacture a lot of these -- should I avoid being clever and just buy now at launch price while supply seems to be OK?

For reference, these 5800X3D cache dies are the same thing AMD is making for some Milan-X Epyc CPUs, which are selling like hotcakes. The 7773X has 64 cores and 768MB of cache: it is presumably using 8 of the same 32+64MB cache dies as the 5800X3D. It sells for $10,000. This means AMD is losing approximately $800 each time it sells one to schmucks like us instead of HPC servers.
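The arithmetic behind that ~$800 figure, with the 5800X3D's $449 launch price added as an assumption (the post doesn't state it):

```python
# Opportunity-cost arithmetic from the post. X3D_PRICE is an assumed
# $449 launch price, not a number given above.
MILAN_X_7773X_PRICE = 10_000
CACHE_DIES_PER_7773X = 8
X3D_PRICE = 449

revenue_per_die_epyc = MILAN_X_7773X_PRICE / CACHE_DIES_PER_7773X
print(f"Epyc revenue per cache die: ${revenue_per_die_epyc:.0f}")
print(f"foregone per desktop chip:  ${revenue_per_die_epyc - X3D_PRICE:.0f}")
```

Roughly $1,250 of Epyc revenue per cache die versus $449 for the whole desktop chip: about $800 left on the table each time, ignoring the rest of the bill of materials.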

So these things are definitely not going on any sale or discount.


If you want one, don't try to get cute waiting for a sale around the Zen 4 launch. It ain't gonna happen. I don't know if Steve was speaking from insider knowledge or just doing the same math I did, but I expect that when Zen 4 launches (or maybe before), they will just stop making them.


acksplode
May 17, 2004



Thanks, that's really useful to know. Time to buy a new CPU
