Arzachel
May 12, 2012

Kazinsal posted:

I have a rough IPC comparison spreadsheet I use to estimate compute throughput per GHz and based on it, the Ice Lake uarch is 18% faster clock for clock than the Skylake/Kaby Lake/Coffee Lake uarch, and Zen 3 is 17% faster clock for clock than SKL/KBL/CFL. Obviously varies by workload but that's the average.

Rocket Lake is a fair bit slower, right?

Kazinsal
Dec 13, 2011


Arzachel posted:

Rocket Lake is a fair bit slower, right?

Intel claims a 19% average IPC increase between SKL/KBL/CFL and RKL, but I haven't done any testing of my own or seen any reliable third-party tests on it. I'd be interested in knowing for sure.
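
For reference, the spreadsheet math is nothing fancy -- just a benchmark score normalized by the clock it ran at. A minimal sketch in Python (the scores and clocks below are made-up placeholders, not my actual data):
code:
# Clock-for-clock ("IPC") comparison: divide a benchmark score by
# the clock it was measured at, then compare the ratios. These
# numbers are hypothetical placeholders, not real measurements.

def score_per_ghz(score, clock_ghz):
    """Benchmark score per GHz -- a rough stand-in for IPC."""
    return score / clock_ghz

skl = score_per_ghz(score=1000.0, clock_ghz=5.0)   # Skylake-family part
zen3 = score_per_ghz(score=1100.0, clock_ghz=4.7)  # Zen 3 part

uplift_pct = (zen3 / skl - 1.0) * 100.0
print(f"Zen 3 vs SKL, clock for clock: +{uplift_pct:.0f}%")  # ~+17% here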

weaaddar
Jul 17, 2004
HAY GUYS WHAT IS TEH INTERWEBNET, AND ISN'T A0L the SECKZ!? :LOL: 1337
PS I'M A FUCKING LOSER
The big problem with this current gen is that AMD raised their prices by $50 per SKU once they had the performance advantage and didn't release the lower-end non-X parts. So a $300 5600X is being compared against a $160 11400F. Sure, the 5600X might beat it, but not by enough to justify the $140 premium.
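
Put rough numbers on it and the value gap is obvious. A quick perf-per-dollar sketch (the relative performance figure is an assumption for illustration; only the prices come from the actual listings):
code:
# Perf-per-dollar sketch. Prices are the street prices above; the
# relative performance numbers are hypothetical assumptions.
parts = {
    "5600X":  {"price": 300, "rel_perf": 1.15},  # assume ~15% faster
    "11400F": {"price": 160, "rel_perf": 1.00},  # baseline
}
for name, p in parts.items():
    value = p["rel_perf"] / p["price"] * 100  # performance per $100
    print(f"{name}: {value:.2f} perf/$100")
# 5600X ~0.38, 11400F ~0.62 -- the cheaper part wins on value
# even with a generous performance assumption.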

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

Just want to report back that my PC is still running. Haven't touched any settings since last time -- SoC is stock, XMP is active, and the VCore is raised a bit, since lowering it immediately fails the system. I'm kind of becoming interested in hardware topics as a result and binge-watching Gamers Nexus because I dig Steve.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Yes, for some bizarro reason Intel is the budget king nowadays; there are 11400s and 11600s that are vastly, vastly better bang for the buck than a 5600X. The new Ryzens are overpriced and living off everyone's goodwill.

CerealKilla420
Jan 3, 2014

"I need a handle man..."

Rexxed posted:

It depends what you mean by faster. They can usually reach higher clock speeds, but that doesn't always mean every task runs faster, and you're usually talking about overclocking. It can also mean that one core might be fast and the others not as fast. They also (generally) generate more heat since they're on a larger process. AMD CPUs tend to get you more for your dollar, but 10th- and 11th-gen Intel do have a few comparable CPUs. In general, any PC should feel fast as long as it has an SSD for the system drive. If something feels slow, there's probably something wrong, like it's got a hard disk.

Kazinsal posted:

I have a rough IPC comparison spreadsheet I use to estimate compute throughput per GHz and based on it, the Ice Lake uarch is 18% faster clock for clock than the Skylake/Kaby Lake/Coffee Lake uarch, and Zen 3 is 17% faster clock for clock than SKL/KBL/CFL. Obviously varies by workload but that's the average.

Huh, interesting. Yeah, I figured they were actively selecting workloads/benchmark tests that favored Intel in some way. That, and they were only comparing the top-of-the-line non-server chips the two companies were offering.

The only time I've ever gone Intel was with the 2500K, which I used for almost a decade. I like AMD just because they're more transparent with their hardware specifications. My best guess is that Intel is spending a massive amount of money to change public opinion in their favor.


Anyway, I was looking at benchmarks because I've been considering upgrading my 3600X to a 5800X. The benchmarks show a pretty incredible increase in performance, but would it really be worth upgrading? Or should I just wait for the next generation of chips? Has AMD announced whether it will maintain compatibility with X470 boards for the next generation?

CaptainSarcastic
Jul 6, 2013



CerealKilla420 posted:

Anyway, I was looking at benchmarks because I've been considering upgrading my 3600X to a 5800X. The benchmarks show a pretty incredible increase in performance, but would it really be worth upgrading? Or should I just wait for the next generation of chips? Has AMD announced whether it will maintain compatibility with X470 boards for the next generation?

I have a 3600X, and personally I'm waiting to see if AMD releases a Zen 3+ based Ryzen 6000 series for AM4 before considering a CPU upgrade. I have an X570 motherboard, so if there is another spin on Ryzen before AM5, I feel pretty confident my board can support it.

Kazinsal
Dec 13, 2011


Zedsdeadbaby posted:

Yes, for some bizarro reason Intel is the budget king nowadays; there are 11400s and 11600s that are vastly, vastly better bang for the buck than a 5600X. The new Ryzens are overpriced and living off everyone's goodwill.

Yeah, AMD's big selling point these days is being able to keep concerningly high all-core clocks with enough cores on a single socket to replace a quarter of a rack.

Perplx
Jun 26, 2004


Best viewed on Orgasma Plasma
Lipstick Apathy
Even if Intel is a better value on paper, I'll pay a little extra for the power savings. Also, I won't buy another 14nm chip just on principle, since I bought my first 14nm chip in 2015.

mA
Jul 10, 2001
I am the ugly lover.

CaptainSarcastic posted:

I have a 3600X, and personally I'm waiting to see if AMD releases a Zen 3+ based Ryzen 6000 series for AM4 before considering a CPU upgrade. I have an X570 motherboard, so if there is another spin on Ryzen before AM5, I feel pretty confident my board can support it.

It's the right choice. There's very little difference in gaming performance to justify upgrading to the 5000 series if you're gaming at 1440p or above.

LRADIKAL
Jun 10, 2001

Fun Shoe
https://www.youtube.com/watch?v=FqmcWOVv2eY

192MB of 3D-stacked L3 cache on Ryzen, pretty badass; wonder how well it cools. This could also be paired with an APU, which always wants that latency reduction. It's gonna be a trip managing power in those packages. Also, when does Apple start stacking their RAM on top?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

mA posted:

It's the right choice. There's very little difference in gaming performance to justify upgrading to the 5000 series if you're gaming at 1440p or above.

Yeah, the thing about "AMD hasn't yet released lower-stack parts for the 5000 series" is that the Ryzen 3000s are still there:

if you really want the IPC/clocks of Zen 3, then you get Zen 3
if you want a competitor to Zen 3, at the cost of lower efficiency/higher temps, and you're only playing in the 4-to-8 core range, then Intel 11th gen is an option
otherwise, Zen 2 is right there

CerealKilla420
Jan 3, 2014

"I need a handle man..."

CaptainSarcastic posted:

I have a 3600X, and personally I'm waiting to see if AMD releases a Zen 3+ based Ryzen 6000 series for AM4 before considering a CPU upgrade. I have an X570 motherboard, so if there is another spin on Ryzen before AM5, I feel pretty confident my board can support it.

Sounds good to me. I'll probably do the same.

The 3600X is the first CPU that really felt like some next-gen poo poo since the 2500K.

Especially cool considering I was one of the poor bastards that suffered through the FX-8350 era :boehner:

Inverted Sphere
Aug 19, 2013
In a couple of days, when everything gets here, I'll be upgrading from a 4670K to a 5600X (keeping my 6GB GTX 1060 after ultimately deciding not to sell a kidney on the black market), and I'm unreasonably excited to see how much of a jump it is. (One place my CPU absolutely struggles: large Stellaris games and bigass fights in Total War, where the CPU actually does choke and the game has to slow down.)

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

CerealKilla420 posted:

The 3600X is the first CPU that really felt like some next-gen poo poo since the 2500K.

I went from an i3-6100T (2c/4t) to an R5 1600 (6c/12t) and it was like *head exploding gesture*.

Off to the side: has anyone recently gone from a single-display setup to dual-head? I did that yesterday, and my GPU power usage has jumped from 9-11W while browsing to 55-59W. A nearly 5x power usage increase from adding one display seems a bit unreasonable.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
Multi-display is eternally hosed up with regard to cards dropping into their idle state, at least with Nvidia

CoolCab
Apr 17, 2005

glem
Are they different refresh rates? That can make things all fucky-wucky, I've found.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Thanks for the input. This is kinda sounding like a reason to consider upgrading to a wide-format display and treating it as if it were split down the middle. Something to think about.

Edit: Turns out they're not even expensive anymore if you don't care about a high refresh rate: https://smile.amazon.com/Samsung-S34J55W-34-Inch-Ultrawide-LS34J550WQNXZA/dp/B07FBS36W2

mdxi fucked around with this message at 19:01 on Jul 1, 2021

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I'm used to people wasting their time doing UI mockups, but there are people doing mock-ups of upcoming EPYC substrates. Why even? :confused:

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



Was :confused: always animated?

Inept
Jul 8, 2003

Munkeymon posted:

Was :confused: always animated?

It got updated to work in dark mode

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Combat Pretzel posted:

I'm used to people wasting their time doing UI mockups, but there's people doing mock-ups of upcoming EPYC substrates. Why even? :confused:

In the run-up to the release of the PS5, r/ps4 (and then r/ps5) was swamped with render after render of "what the PS5 will look like", both reposts from rumor mills and homegrown wishful-thinking renders done by hobbyists with Blender hard-ons.

The same thing happens, though to a lesser degree, on r/AMD when new video cards are about to be released. So yeah, it's a thing. And I don't understand it either, except that Star Citizen (and the Trump presidency) has taught me that people loving love theorycrafting (and then loudly talking about their favorite bits of theorycrafting as if they were fact).

CPU package mock-ups do seem extra extra dumb though.

FuturePastNow
May 19, 2014


AdoredTV and Moore's Law Is Dead just make up everything they post, and sometimes (very rarely) they manage to guess right. No one should watch any of their videos or read any article using them as a source.

Dr. Video Games 0031
Jul 17, 2004

Malloc Voidstar posted:

Multi-display is eternally hosed up with regards to cards going to idle state, at least with Nvidia

Man, that's disappointing. I don't get why multi-monitor support is still so janky across the board. Microsoft is still playing catch-up with Windows' multi-monitor support too. And there are still new games being released without a monitor-selection option in their in-game menus. It's 2021; who the hell's still using single monitors?

I guess this is the one and only thing AMD can point to in their drivers as being better than Nvidia: AMD's multi-monitor support is generally pretty decent. I haven't had any weird power- or refresh-rate-related issues with them.

GRECOROMANGRABASS
May 14, 2020
Yeah, the AMD drivers are great for multi-monitor/multi-resolution/multi-refresh-rate setups. I'm currently using a stock RX 480 blower model with 8GB of VRAM, and I use a 32" 4K 60Hz monitor as the primary display with a secondary 27" 1440p 144Hz monitor. Both monitors have an instance of Chrome with countless tabs open at the moment, and one of them is playing a Hulu video. The power draw sits around 34W, with occasional but infrequent bumps to ~46W.

Edit: and one monitor is 10-bit color depth, the other 8-bit. One monitor shows FreeSync enabled, the other says "FreeSync Premium" is enabled. I'm pretty impressed with how smoothly this old video card manages two vastly different monitors without a hitch.

GRECOROMANGRABASS fucked around with this message at 00:17 on Jul 2, 2021

Cross-Section
Mar 18, 2009

I remember thinking it was so cool that I was able to switch monitor outputs using just two keyboard shortcuts. It blindsided me when I got a 1070 and found I couldn't do that anymore.

Indiana_Krom
Jun 18, 2007
Net Slacker
I have two monitors, one 1440p/240Hz and the other 1200p/60Hz, and right now my GTX 1080 is idling at 13W (139 MHz). Nvidia cards failing to idle only happens when you have three or more displays connected; as long as it's just two displays, Nvidia cards will still idle down normally without issue and don't seem to care about different refresh rates.
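
If anyone wants to check their own card, nvidia-smi can log the graphics clock and power draw. A small Python sketch (assumes the Nvidia driver tools are installed and on PATH):
code:
# Print the GPU core clock and power draw once per second using
# nvidia-smi's query interface.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=clocks.gr,power.draw",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "139 MHz, 13.02 W" when the card idles down
    time.sleep(1)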

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Joke's on me. The power draw wasn't an effect of dual-head at all.

I did some poking around at ultrawide monitors and found the one I linked at Amazon in my previous post. Then I happened to notice that googling its model number autocompleted to "...costco". So yeah, it's $350 at Amazon but $300 at Costco... and on sale right now for $270. I hopped in the car, picked one up, brought it home, connected it, powered on, and...
code:
amdgpu-pci-0a00
Adapter: PCI adapter
vddgfx:      1000.00 mV 
fan1:        1138 RPM  (min =    0 RPM, max = 4950 RPM)
edge:         +73.0°C  (crit = +100.0°C, hyst = -273.1°C)
                       (emerg = +105.0°C)
junction:     +75.0°C  (crit = +110.0°C, hyst = -273.1°C)
                       (emerg = +115.0°C)
mem:          +74.0°C  (crit = +105.0°C, hyst = -273.1°C)
                       (emerg = +110.0°C)
power1:       56.00 W  (cap = 150.00 W)
I can only guess that driving this many pixels requires waking up stuff that isn't needed for 2560x1440. I'm not exactly mad, though, because one of these is a lot nicer than two mismatched displays side by side. I can have a 1720x1440 browser window and a terminal running emacs next to it, displaying two files side by side, both at 107 characters of line length at my preferred font size. And it was $270, ffs.

Maybe this is just early-production RX 5700 fuckery? These cards had enough other problems, all of which mine seemed to have dodged. Guess I'll find out next upgrade, whenever that happens.
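
If you want to watch that power1 number over time instead of eyeballing sensors output, amdgpu exposes it through hwmon. A sketch (the hwmon index varies per machine, so this searches for the amdgpu node):
code:
# Poll amdgpu's reported power draw from the hwmon sysfs interface
# (Linux). power1_average is reported in microwatts.
import glob
import time

def find_amdgpu_power():
    for hwmon in glob.glob("/sys/class/hwmon/hwmon*"):
        try:
            with open(hwmon + "/name") as f:
                if f.read().strip() == "amdgpu":
                    return hwmon + "/power1_average"
        except OSError:
            continue
    return None

path = find_amdgpu_power()
while path:
    with open(path) as f:
        print("%.1f W" % (int(f.read()) / 1e6))  # microwatts -> watts
    time.sleep(1)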

Inept
Jul 8, 2003

When I just had a single 1080p display, my 5700 would idle at about 9 watts. With two 1440p displays at different refresh rates, it idles at about 36 watts. The core clock stays low, but the VRAM fully clocks up.

Meanwhile my Intel MacBook Pro drives the same displays without becoming a furnace :shrug:
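
You can actually see the VRAM getting pinned in amdgpu's DPM state files if you're on Linux (the card index differs per machine, so this sketch walks all of them):
code:
# Show amdgpu memory-clock DPM states; the active state is marked
# with '*'. With mismatched multi-monitor setups, the memory clock
# often sits at its highest state instead of dropping to idle.
from pathlib import Path

for card in sorted(Path("/sys/class/drm").glob("card[0-9]")):
    mclk = card / "device" / "pp_dpm_mclk"
    if mclk.exists():
        print(card.name + ":")
        print(mclk.read_text(), end="")  # e.g. "0: 100Mhz\n1: 875Mhz *"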

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Indiana_Krom posted:

I have two monitors, one 1440p/240Hz and the other 1200p/60Hz, and right now my GTX 1080 is idling at 13W (139 MHz). Nvidia cards failing to idle only happens when you have three or more displays connected; as long as it's just two displays, Nvidia cards will still idle down normally without issue and don't seem to care about different refresh rates.
I fear what will happen when I'm finally running two 4K120 displays.

fast cars loose anus
Mar 2, 2007

Pillbug
I feel like I need a sanity check on my CPU temps. I've got a newish build (basically 3 weeks old) with a Ryzen 9 3900X on an MSI X570 Gaming Edge. Using the Wraith cooler that came with it, in a case with 3 front fans and a back fan, room temp generally ~24°C. When idling I'm usually between 45-50°C, and playing Hitman 3 with settings jacked up I'm usually between 70-80°C. Running Prime95 on the highest stress setting gets me to 75-80°C at 3700 MHz (monitoring done with AMD Ryzen Master). In the most recent run, 10 seconds after I stopped Prime95 it had dropped from 78°C to 52°C, and was down to 47°C 20 seconds after that.

Anyway, I put all that probably-too-much detail in to ask: when I open this thing up later in the week to add my new NVMe drive, should I reseat the Wraith with new thermal paste? (I have some Arctic Silver lying around but just installed it with the stock paste that was on it out of the box when I built it.) I hear Ryzens run hot, and going back several pages in this thread I get the idea that mine might be running hotter than it should be even taking that into consideration. Thanks for your help!

Dr. Video Games 0031
Jul 17, 2004

fast cars loose anus posted:

I feel like I need a sanity check on my CPU temps. I've got a newish build (basically 3 weeks old) with a Ryzen 9 3900X on an MSI X570 Gaming Edge. Using the Wraith cooler that came with it, in a case with 3 front fans and a back fan, room temp generally ~24°C. When idling I'm usually between 45-50°C, and playing Hitman 3 with settings jacked up I'm usually between 70-80°C. Running Prime95 on the highest stress setting gets me to 75-80°C at 3700 MHz (monitoring done with AMD Ryzen Master). In the most recent run, 10 seconds after I stopped Prime95 it had dropped from 78°C to 52°C, and was down to 47°C 20 seconds after that.

Anyway, I put all that probably-too-much detail in to ask: when I open this thing up later in the week to add my new NVMe drive, should I reseat the Wraith with new thermal paste? (I have some Arctic Silver lying around but just installed it with the stock paste that was on it out of the box when I built it.) I hear Ryzens run hot, and going back several pages in this thread I get the idea that mine might be running hotter than it should be even taking that into consideration. Thanks for your help!

While I'm not familiar with the 3900X's exact characteristics, clock speeds usually aren't boosted by much during intense all-core torture tests, and 80 degrees should be a fair bit below the thermal throttle limit, so that seems fine. The idle temps are warmer than I'd expect, though. The fact that idle thermals are warm while load thermals look fine is probably a result of your fan curve: your fans are barely running at idle, letting your CPU get warm, and then ramping up heavily under load. This is mostly fine; I don't think a CPU can be damaged by idling at warm temperatures. If you'd like to tinker with this, your motherboard BIOS probably has a feature that lets you adjust fan speeds at different temperature levels, or MSI may have software that lets you do so in Windows.
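
The curve itself is just interpolation between temperature/duty breakpoints, conceptually something like this (the breakpoints are made-up examples, not MSI's defaults):
code:
# A fan curve as linear interpolation between (temp C, duty %)
# breakpoints. These breakpoints are hypothetical examples.
CURVE = [(30, 20), (50, 35), (70, 60), (85, 100)]

def fan_duty(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # clamp at max duty past the last point

print(fan_duty(45.0))  # 31.25 -- fans barely spinning at warm idle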

Dr. Video Games 0031 fucked around with this message at 09:26 on Jul 6, 2021

CaptainSarcastic
Jul 6, 2013



fast cars loose anus posted:

I feel like I need a sanity check on my CPU temps. I've got a newish build (basically 3 weeks old) with a Ryzen 9 3900X on an MSI X570 Gaming Edge. Using the Wraith cooler that came with it, in a case with 3 front fans and a back fan, room temp generally ~24°C. When idling I'm usually between 45-50°C, and playing Hitman 3 with settings jacked up I'm usually between 70-80°C. Running Prime95 on the highest stress setting gets me to 75-80°C at 3700 MHz (monitoring done with AMD Ryzen Master). In the most recent run, 10 seconds after I stopped Prime95 it had dropped from 78°C to 52°C, and was down to 47°C 20 seconds after that.

Anyway, I put all that probably-too-much detail in to ask: when I open this thing up later in the week to add my new NVMe drive, should I reseat the Wraith with new thermal paste? (I have some Arctic Silver lying around but just installed it with the stock paste that was on it out of the box when I built it.) I hear Ryzens run hot, and going back several pages in this thread I get the idea that mine might be running hotter than it should be even taking that into consideration. Thanks for your help!

I got better temps with the stock cooler after cleaning off the pre-applied paste and repasting it. Your temps don't sound too crazy, but if I were you I'd consider an aftermarket cooler - I ended up with a be quiet! tower on my 3600X and my thermals are much improved.

fast cars loose anus
Mar 2, 2007

Pillbug

Dr. Video Games 0031 posted:

While I'm not familiar with the 3900X's exact characteristics, clock speeds usually aren't boosted by much during intense all-core torture tests, and 80 degrees should be a fair bit below the thermal throttle limit, so that seems fine. The idle temps are warmer than I'd expect, though. The fact that idle thermals are warm while load thermals look fine is probably a result of your fan curve: your fans are barely running at idle, letting your CPU get warm, and then ramping up heavily under load. This is mostly fine; I don't think a CPU can be damaged by idling at warm temperatures. If you'd like to tinker with this, your motherboard BIOS probably has a feature that lets you adjust fan speeds at different temperature levels, or MSI may have software that lets you do so in Windows.

Thanks, I'll check the fan thing when I get home; MSI has control panels for drat near everything, so I'm sure I can figure something out.

CaptainSarcastic posted:

I got better temps with the stock cooler after cleaning off the pre-applied paste and repasting it. Your temps don't sound too crazy, but if I were you I'd consider an aftermarket cooler - I ended up with a be quiet! tower on my 3600X and my thermals are much improved.

I've been looking at one of those because I don't like how the Hyper 212 Evo moves around even when seated, but I'm worried about the pictures I see of fans overhanging RAM; did you have any issues with that?

e: I found a compatibility checker on be quiet!'s website that says it should be fine with the Dark Rock Pro 4, so I went ahead and got one. If I'm gonna re-paste anyway, I might as well do it for a good cooler; I also hate how the bottom of the Wraith isn't flat.

fast cars loose anus fucked around with this message at 19:01 on Jul 6, 2021

Nice Van My Man
Jan 1, 2008

I keep my whole computer at 40-50°C, just to keep the fan noise down and because I figured that was well within safe temps. Should I be worrying more about my minimum temperature? I've always worried more about my max temps.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Nice Van My Man posted:

I keep my whole computer at 40-50°C, just to keep the fan noise down and because I figured that was well within safe temps. Should I be worrying more about my minimum temperature? I've always worried more about my max temps.

No, that's perfectly legitimate and the silicon will happily run for decades at those temps.

Nerds just have a tendency to min/max. (Self included, obvs.)

CaptainSarcastic
Jul 6, 2013



fast cars loose anus posted:

Thanks, I'll check the fan thing when I get home; MSI has control panels for drat near everything, so I'm sure I can figure something out.

I've been looking at one of those because I don't like how the Hyper 212 Evo moves around even when seated, but I'm worried about the pictures I see of fans overhanging RAM; did you have any issues with that?

e: I found a compatibility checker on be quiet!'s website that says it should be fine with the Dark Rock Pro 4, so I went ahead and got one. If I'm gonna re-paste anyway, I might as well do it for a good cooler; I also hate how the bottom of the Wraith isn't flat.

I have a Dark Rock Slim and it's been great, and it easily clears my 4 sticks of RAM. With a 3900X you probably want more cooling than I need, so the Pro 4 makes sense.

KYOON GRIFFEY JR
Apr 12, 2010



Runner-up, TRP Sack Race 2021/22

fast cars loose anus posted:

I've been looking at one of those because I don't like how the Hyper 212 Evo moves around even when seated

If the cooler is moving around, it isn't properly seated. The screws for the AM4 mounts on the 212s are spring-loaded, and it's basically impossible to torque them down too much. Source: just installed a 212 on AM4.

fast cars loose anus
Mar 2, 2007

Pillbug
I had one on an Intel build, and the cooling was working, as reflected by the actual temps, but the thing was able to rotate more than I'd have liked. I have no idea what else I could have done or what I could have missed, since I followed the instructions.

CaptainSarcastic
Jul 6, 2013



KYOON GRIFFEY JR posted:

If the cooler is moving around, it isn't properly seated. The screws for the AM4 mounts on the 212s are spring-loaded, and it's basically impossible to torque them down too much. Source: just installed a 212 on AM4.

Unless one of the screws breaks in half, in which case the cooler is useless. That's one reason I ended up spending a little more on the be quiet! cooler: the 212 I had originally bought to replace the stock cooler broke like that as I was installing it.
