K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
The 9900k will probably pull slightly more ahead over time, but it's never going to justify the price increase over a 3600. By the time you need to upgrade, your money will be better spent on new hardware than the price difference.


Khorne
May 1, 2002

uhhhhahhhhohahhh posted:

I probably wouldn't buy a 2080 Ti right now, and I don't have a 240hz monitor right now either - but i am considering it at some future date. I was planning on holding out until nvidia's 3000 series/if AMD ever throw up a 2080Ti competitor before I upgrade, depending on how impatient I get. I don't know if I'd buy the 3080 Ti either though. My main concern is if the lead between a 9900k and Zen2/Zen3 is going to increase more, say 4 or 5 years in the future, or close/stay the same.
As long as you are willing to turn graphics settings down so your GPU and CPU bottleneck at about the same point it kinda doesn't matter what GPU you have.

If you aren't willing to turn down graphics settings, zen2 is going to perform the same except in a few niche simulation style games where Intel still has a pretty sizable lead.
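Khorne's point can be sketched in a few lines: the delivered framerate is capped by whichever of the CPU or GPU finishes its frame work last, so a faster CPU only shows up once settings are low enough to lift the GPU ceiling. All numbers below are made up for illustration:

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Delivered framerate is capped by the slower of the two sides."""
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers. At ultra settings the GPU is the limit,
# so the faster CPU buys nothing:
assert effective_fps(cpu_fps=160, gpu_fps=90) == 90
assert effective_fps(cpu_fps=140, gpu_fps=90) == 90

# Turn the settings down and the GPU ceiling rises, so the CPU gap shows:
assert effective_fps(cpu_fps=160, gpu_fps=250) == 160
assert effective_fps(cpu_fps=140, gpu_fps=250) == 140
```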

Khorne fucked around with this message at 19:56 on Jul 28, 2019

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

uhhhhahhhhohahhh posted:

I probably wouldn't buy a 2080 Ti right now, and I don't have a 240hz monitor right now either - but i am considering it at some future date. I was planning on holding out until nvidia's 3000 series/if AMD ever throw up a 2080Ti competitor before I upgrade, depending on how impatient I get. I don't know if I'd buy the 3080 Ti either though. My main concern is if the lead between a 9900k and Zen2/Zen3 is going to increase more, say 4 or 5 years in the future, or close/stay the same.

I can’t imagine the lead between the 9900k and Ryzen 2 growing significantly, at worst it will stay stagnant and at best Ryzen picks up a little bit of steam. I’m sure you would be happy with any of them for the next several years.

Laslow
Jul 18, 2007
Kind of sucks, I want that 3950x to replace my aging Haswell E3, but 4K60 will still be a $1200 pipe dream GPU wise, even waiting til September.

uhhhhahhhhohahhh
Oct 9, 2012

Khorne posted:

As long as you are willing to turn graphics settings down so your GPU and CPU bottleneck at about the same point it kinda doesn't matter what GPU you have.

If you aren't willing to turn down graphics settings, zen2 is going to perform the same except in a few niche simulation style games where Intel still has a pretty sizable lead.

I'm absolutely fine with turning the settings down, especially when playing FPS games, but in single player games as well. I prefer higher fps over how the game looks, any time. I've seen a handful of those silent YouTube benchmark videos where, in BFV specifically, there is still an FPS difference between a 9700 or 9900k and a 3900X with a 2080Ti at 1080p on low settings.

nerdrum
Aug 17, 2007

where am I
Microcenter has a bundle on a 2920x and an ASRock X399 Phantom Gaming 6 for $499. Can someone please tell me not to buy this.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Laslow posted:

Kind of sucks, I want that 3950x to replace my aging Haswell E3, but 4K60 will still be a $1200 pipe dream GPU wise, even waiting til September.

Mainstream 4k gaming remains irritatingly out of reach for a while yet. There was a brief time when my 6700k and 1080ti actually did run high-to-maximum settings without dipping, but game requirements in general have gone up again.

I'm just thankful that I only play Destiny 2 99% of the time and that runs 4k60 on max just fine.

Nvidia really dropped the ball with RTX, they should have capitalized on the growing 4KTV segment & 1080ti hype, and focused on bringing 4k gaming to the masses, not ray tracing.

You'd be surprised what you can get away with CPU-wise to achieve 4K60, CPU stuff is largely independent of resolution compared to GPUs, and only has to run at 60 frames after all.

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



Filling in the gaps?

Ryzen 3900, 3700, 3500 listed by EEC.

https://www.tomshardware.com/news/ryzen-9-3900-ryzen-7-3700-ryzen-5-3500-amd-3000-series,40040.html

ZobarStyl
Oct 24, 2005

This isn't a war, it's a moider.
That 3900 looks amazing for upgrading my old Bulldozer server. If it's still in the 4+ ghz boost range at 65 W TDP there's an amazing amount of high efficiency power in that package.

PC LOAD LETTER
May 23, 2005
WTF?!

uhhhhahhhhohahhh posted:

I've seen a handful of those silent YouTube benchmark videos where, in BFV specifically, there is still an FPS difference between a 9700 or 9900k and a 3900X with a 2080Ti at 1080p on low settings.
Can you actually see the difference between, say, 130 vs 140fps though? Or 230 vs 240fps? And if so, what effect does that have on your gaming? Plenty of competitive gamers will say the difference isn't noticeable and doesn't affect them at all, and even many of those who do claim to see those sorts of differences say it doesn't actually matter.

Generally speaking Zen2 is going to closely trade blows with Coffee Lake/Whiskey Lake in gaming and probably beat it on most other stuff. If getting around 15fps more in a few games where the 'slow' competitor is still getting over 100fps is more important to you, then sure, get the Intel chip.

uhhhhahhhhohahhh
Oct 9, 2012

PC LOAD LETTER posted:

Can you actually see the difference between, say, 130 vs 140fps though? Or 230 vs 240fps? And if so, what effect does that have on your gaming? Plenty of competitive gamers will say the difference isn't noticeable and doesn't affect them at all, and even many of those who do claim to see those sorts of differences say it doesn't actually matter.

Generally speaking Zen2 is going to closely trade blows with Coffee Lake/Whiskey Lake in gaming and probably beat it on most other stuff. If getting around 15fps more in a few games where the 'slow' competitor is still getting over 100fps is more important to you, then sure, get the Intel chip.

I could probably tell the difference between a constant 140 and jumping between 130 and 140. It's the minimums and 1% lows that matter more, and the gap there is bigger than 10fps.
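For anyone curious how the "1% low" figure in those benchmark videos is typically derived from a frame-time log: reviewers differ on the exact method, but one common definition is the average fps over the slowest 1% of frames. A minimal sketch with hypothetical numbers:

```python
def one_percent_low(frame_times_ms):
    """Approximate '1% low' fps: average fps over the slowest 1% of frames.

    This is one common definition; some reviewers use the 99th-percentile
    frame time instead.
    """
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # the worst 1% of samples
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# 99 smooth frames at ~7.1ms (140fps) plus a single 25ms hitch:
times = [1000 / 140] * 99 + [25.0]
print(round(one_percent_low(times)))  # the one hitch dominates the metric
```

This is why a single stutter drags the 1% low far below the average fps even when almost every frame is fast.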

I ordered the 3600 setup anyway. It was probably more like a £200-300 difference between this and the 9900k. I can't imagine anything changing in a year, so 6 cores should be enough, and I don't do anything other than gaming and browsing on my PC. Might mess around with some videos, but not streaming. I'll re-evaluate when Zen3 drops, and maybe pick up a GPU in the meantime if my Fury is really struggling at low settings. Feels weird upgrading after 8 years of the same PC. Hope there's a noticeable difference compared to my 2600k @ 4.2 in BFV! I'm going from 130 to <60fps constantly at the moment - I'm sure it's a CPU bottleneck because I already tested with a Vega56 and it made zero difference, but maybe it was a GPU bottleneck in some places.

PyPy
Sep 13, 2004

by vyelkin
I built a 2700X system on a whim over the past week to replace my Phenom II X4 955. Moved the 1060 in and it's a whole new world.

forest spirit
Apr 6, 2009

Frigate Hetman Sahaidachny
First to Fight Scuttle, First to Fall Sink


uhhhhahhhhohahhh posted:

Hope there's a noticeable difference compared to my 2600k @ 4.2 in BFV! I'm going from 130 to <60fps constantly at the moment - I'm sure it's a CPU bottleneck because I already tested with a Vega56 and it made zero difference, but maybe it was a GPU bottleneck in some places.

dude, you do know that BF5 is notorious for having prevalent stuttering problems on all configurations because the game is poorly assembled? I have a 1070, my two friends an rx480 and a 2080ti, and we all stopped playing because the performance was abysmal, even with every graphical option cranked down to "bullshit".

BF1, which looks better, also runs a lot better. BF5 is just a garbage example to compare builds, even if it is your most played game at this point in time

also, the frame time at 130fps is about 7.7 milliseconds and at 140fps about 7.1 - a difference of barely half a millisecond per frame. even if FPS was jumping between the two I doubt most people would notice if they didn't have an fps counter on the screen
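The arithmetic is easy to check, since the per-frame time budget is just 1000/fps:

```python
def frame_time_ms(fps: float) -> float:
    """Per-frame time budget at a given framerate."""
    return 1000.0 / fps

print(round(frame_time_ms(130), 2))   # ~7.69 ms per frame at 130fps
print(round(frame_time_ms(140), 2))   # ~7.14 ms per frame at 140fps
# The gap between the two rates is only about half a millisecond per frame:
print(round(frame_time_ms(130) - frame_time_ms(140), 2))
```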

forest spirit fucked around with this message at 12:40 on Jul 29, 2019

uhhhhahhhhohahhh
Oct 9, 2012
I know about the performance issues, my fps drops have been going on since release. It was exactly the same in BF1. It's not a stuttering/inconsistency issue, it's persistent low FPS in certain areas on certain maps that isn't solved by zooming into a wall. I'm at 100% CPU usage constantly, and a new GPU made literally zero difference.

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Rabid Snake posted:

Your minimum 1 percent is gonna be horrible on that 2500k, even gaming on a low spec target like 1080p/60hz. You must be stuttering a lot, because I was suffering a lot of stuttering on my Haswell i5 until I jumped to a 9900k.

Checking benchmarks, the 3600 isn't too far behind, unlike my 9900k versus an old 4-core/4-thread Intel processor

2500k's with a mid-4GHz overclock are a lot more usable again with 1903 and the retpoline mitigation; the performance degradation was extremely painful until that came out

dorkanoid
Dec 21, 2004

I'm having some weird temperatures on my 3600, the temperature offset that was an issue with monitoring early Ryzens isn't a thing anymore, right?

It's currently idling at 45-50C (as measured by NZXT CAM and Ryzen Master), with any activity (Windows Defender scanning, etc. - basically single core events) spiking it to ~60ish leading to the AIO spinning up the fans sharply.

So far I've changed the curve to be less steep, but I think I might have to remove the "integrated" paste on the AIO and change it to some new/better paste...

EDIT:

this is not (just) the GIF looping; the temperature slowly drops to 44ish, then spikes to 50+, repeat.

EDIT2: I found part of the issue at least, I had set NZXT CAM to ramp the fans on the CPU temperature rather than the liquid temperature - now it's silent again, with the fans barely moving during a prime95 run.

I'm still seeing the spikes above though.
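The "key the fans to liquid temperature, not die temperature" fix above can be sketched in a few lines. Die temperature reacts instantly to single-core boost events while coolant temperature moves slowly, so the same fan curve behaves completely differently depending on which sensor drives it (the curve breakpoints and temperatures below are made up):

```python
def fan_duty(temp_c: float, lo: float = 35.0, hi: float = 60.0,
             min_duty: int = 25, max_duty: int = 100) -> int:
    """Simple linear fan curve: floor speed below lo, full speed above hi."""
    if temp_c <= lo:
        return min_duty
    if temp_c >= hi:
        return max_duty
    frac = (temp_c - lo) / (hi - lo)
    return round(min_duty + frac * (max_duty - min_duty))

# Die temperature spikes instantly when a core boosts; the coolant barely moves.
cpu_temps    = [45, 60, 45, 62, 44]       # spiky single-core boost events
liquid_temps = [31, 31.5, 31.5, 32, 32]   # the loop's actual heat load

print([fan_duty(t) for t in cpu_temps])     # fans yo-yo up and down
print([fan_duty(t) for t in liquid_temps])  # fans sit quietly at the floor
```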

dorkanoid fucked around with this message at 16:13 on Jul 29, 2019

Khorne
May 1, 2002
This is a common zen2 issue and will hopefully be fixed with a chipset or bios update. There's no real cause and it's normal provided your load temperatures aren't going crazy.

If you really want to get rid of it you can change to windows balanced power plan which will make boost behavior much slower. You also need to make sure your RAM is clocked to under 3600. I'd vote it's not worth doing either of those things because it should be fixed "soon".

dorkanoid
Dec 21, 2004

Khorne posted:

This is a common zen2 issue and will hopefully be fixed with a chipset or bios update. There's no real cause and it's normal provided your load temperatures aren't going crazy.

If you really want to get rid of it you can change to windows balanced power plan which will make boost behavior much slower. You also need to make sure your RAM is clocked to under 3600. I'd vote it's not worth doing either of those things because it should be fixed "soon".

I hit 85C in Prime95, but it dropped back to the loop above after 4-5 seconds when I stopped the run.

Khorne
May 1, 2002

dorkanoid posted:

I hit 85C in Prime95, but it dropped back to the loop above after 4-5 seconds when I stopped the run.
85 in prime95 is better than normal.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
V glad others are beta testing ryzen 3000 and x570

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
Intel manages to put out chips without you worrying about bioses for the most part but amd squeezes out the turd that is x570 and agesa

Slab Squatthrust
Jun 3, 2008

This is mutiny!
OTOH, I bought an ASRock x570 and haven't had any issues with it at all, and everything is running very nicely. I'm not trying to do anything crazy with OCing stuff, but it's not been necessary so far.

Khorne
May 1, 2002

Malcolm XML posted:

Intel manages to put out chips without you worrying about bioses for the most part but amd squeezes out the turd that is x570 and agesa
There's nothing wrong with a 50c idle temp, and x570 itself has worked fine. There are/were bigger AGESA issues with some of the older AM4 boards and zen2. The idle temp is largely due to aggressive boost behavior, but the boost behavior isn't bad or wrong. It just results in a high idle temp with certain background tasks. Zen2's boost will aggressively throw high voltage at the CPU in low current situations.

It may or may not get fixed, because it may or may not be a problem. People just expect a lower number because they're used to every other CPU in existence and a 50c idle usually means "your CPU cooler is on wrong!" That's why I asked him about load temps to rule that out. AMD is releasing a statement tomorrow about the state of AGESA and chipset drivers.

The only real issue they've had is with rdrand, and even that has been around since the FX days. Zen2 managed to break it in slightly different ways, a recent systemd update amplified the problem on newer Linux distros that AMD likely didn't QA against, and Destiny 2 uses it for some reason. It doesn't really affect any other commonly used software.

I agree that Intel usually QAs better, but they also rarely change platform, which simplifies QA significantly, have more resources, and haven't had any radical changes in the past 8-9 years.

Khorne fucked around with this message at 18:10 on Jul 29, 2019

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Malcolm XML posted:

Intel manages to put out chips without you worrying about bioses for the most part but amd squeezes out the turd that is x570 and agesa

Yeah, but navigating the myriad of custom regkeys, bios updates, and microcode updates on the Intel side for meltdown/spectre hasn't been a cakewalk. Feels like a wash these days effort-wise.

Arzachel
May 12, 2012

Malcolm XML posted:

Intel manages to put out chips without you worrying about bioses for the most part but amd squeezes out the turd that is x570 and agesa

Intel pushed out Spectre mitigations through Windows update that made a bunch of laptops bootloop

dex_sda
Oct 11, 2012


Officially back on team red for cpu's, just bought my 3600. Can't wait to get it by the end of the week!

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

nerdrum posted:

Microcenter has a bundle on a 2920x and an ASRock X399 Phantom Gaming 6 for $499. Can someone please tell me not to buy this.

Unless you specifically need the PCIe lanes or RAM then I’d probably recommend a 3900X and a cheapie B450 over it. That will nominally be a little more expensive (maybe $50) but you will save some money back by not having to buy TR-specific cooling/quad channel RAM. The 3900X is also gonna be faster in gaming, faster on AVX-heavy tasks, and cooler in general.

If you can get a 1950X/2950X combo for $500 then go for it, 4 extra cores for the same price would be a reasonable sidegrade. But unless you really want to be on the TR platform the 3900X has killed the low-end TR market.

Paul MaudDib fucked around with this message at 20:00 on Jul 29, 2019

dorkanoid
Dec 21, 2004

Khorne posted:

There's nothing wrong with a 50c idle temp, and x570 itself has worked fine. There are/were bigger AGESA issues with some of the older AM4 boards and zen2. The idle temp is largely due to aggressive boost behavior, but the boost behavior isn't bad or wrong. It just results in a high idle temp with certain background tasks. Zen2's boost will aggressively throw high voltage at the CPU in low current situations.

It may or may not get fixed, because it may or may not be a problem. People just expect a lower number because they're used to every other CPU in existence and a 50c idle usually means "your CPU cooler is on wrong!" That's why I asked him about load temps to rule that out. AMD is releasing a statement tomorrow about the state of AGESA and chipset drivers.

The only real issue they've had is with rdrand, and even that has been around since the FX days. Zen2 managed to break it in slightly different ways, a recent systemd update amplified the problem on newer Linux distros that AMD likely didn't QA against, and Destiny 2 uses it for some reason. It doesn't really affect any other commonly used software.

I agree that Intel usually QAs better, but they also rarely change platform, which simplifies QA significantly, have more resources, and haven't had any radical changes in the past 8-9 years.

I probably wouldn't even have questioned it - except either I changed it, or today's "Ryzen 3000 compatible" update for NZXT CAM made it spin up the fan >50C (and off at 49C).

Khorne
May 1, 2002

dorkanoid posted:

I probably wouldn't even have questioned it - except either I changed it, or today's "Ryzen 3000 compatible" update for NZXT CAM made it spin up the fan >50C (and off at 49C).
Lots of monitoring software and rgb software is triggering higher clocks when it should be idle. But yeah, just zen2 aggressively boosting voltage.

Khorne fucked around with this message at 20:31 on Jul 29, 2019

Soricidus
Oct 21, 2010
freedom-hating statist shill
With ryzen balanced mode, my cpu hits ~80C regularly in games with associated fan noise etc. With ryzen power saver it barely breaks 45C in the same games, and the frame rate isn’t affected because none of them is remotely cpu bound so there is literally no reason for the cpu to be boosting at all.

At this point I don’t care if amd want to claim this is fine and “just how zen2 works”, because that is clearly just wrong.

eames
May 9, 2009

Malcolm XML posted:

Intel manages to put out chips without you worrying about bioses for the most part

to be fair their current consumer chipset is a mild iteration on a product that has been on the market since Q3 2015.
X570 is very different indeed (IO die repurposed as chipset).

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Intel is not immune to this either: X99 was a very rough launch (some boards literally caught fire), and even with the 9900K there were some ASRock BIOSes that silently overvolted the chip to a dangerous degree regardless of user input. As long as it eventually gets fixed it's not a big deal.

That said, AMD’s current take is that this is working as intended, so there may not be a (fixable) problem, and this is just how the chips are. There are no other high power 7nm CPUs on the market at the moment so we have no real point of reference on whether this is normal given 7nm’s voltage and thermal thresholds. Running close to threshold may mean less margin to idle down at low clocks and high density may mean spiky thermals. Or it may be related to infinity fabric power - any load at all means it has to have big expensive off-chip(let) serdes links turned on to hit memory.

Navi is obviously a GPU, which is different, but also presents “challenging” thermal characteristics.

Paul MaudDib fucked around with this message at 21:56 on Jul 29, 2019

surf rock
Aug 12, 2007

We need more women in STEM, and by that, I mean skateboarding, television, esports, and magic.
I really hope AMD ramps up production of the 3900X or at least starts delivering some stock in August. I've had the Newegg auto-notify thing on for it and it's come back into stock once in the past week; I spotted the email about 20 minutes after I received it and the drat thing was already sold out again.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Buildzoid mentions in this vid that they actually specify a higher maximum voltage in their boost algorithm for Ryzen 3000 vs earlier generations.

https://www.youtube.com/watch?v=iZI9ZgwrDYg

MaxxBot fucked around with this message at 23:19 on Jul 29, 2019

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

The demi-rumored no-X 3900 is the chip I wanted since the Zen 2 announcement last November. So I would really like it if AMD could hurry up and announce when that poo poo is going to be in stores.

On the topic of Zen 2 having a rough launch, contrary to AMD's statements about getting temperature sensing working in Linux in a timely manner this time around (unlike the last 2 times):

* No k10temp support for Zen 2 was in the current kernel (5.2) at launch, and there is none today.
* Yesterday we learned that there won't be support in the next version (5.3) either; support is coming in 5.4.
* Kernel release cadence has averaged 67 days over the past 24 releases (back to April, 2015)
* Kernel 5.2 was released on 7 July (22 days ago), so we can expect 5.3 around 12 September, and 5.4 around 18 November
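The date arithmetic above checks out; with the stated 67-day average cadence:

```python
from datetime import date, timedelta

cadence = timedelta(days=67)   # average release gap over the past 24 kernels
k52 = date(2019, 7, 7)         # kernel 5.2 release date

k53 = k52 + cadence            # projected 5.3: 2019-09-12
k54 = k53 + cadence            # projected 5.4: 2019-11-18
print(k53, k54)
```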

It's awesome to have another quarter+ round of trying to do performance/power tuning while having no idea what temperature the CPU is running at.

On the other hand, if you picked up a spiffy new 5700/XT, you can expect it to start doing something other than acting as an unaccelerated VESA-compliant frame buffer when 5.3 drops. As of right now, anyway -- but until yesterday, we all thought we'd be able to check temperatures then, too.

peepsalot
Apr 24, 2007

        PEEP THIS...
           BITCH!

nerdrum posted:

Microcenter has a bundle on a 2920x and an ASRock X399 Phantom Gaming 6 for $499. Can someone please tell me not to buy this.
https://www.youtube.com/watch?v=DYVulrV2ZGU

dorkanoid
Dec 21, 2004

Khorne posted:

Lots of monitoring software and rgb software is triggering higher clocks when it should be idle. But yeah, just zen2 aggressively boosting voltage.

That's not what I meant :)

The NZXT software seems to be able to set the fan curve based on 3 different temperatures: Liquid, CPU or GPU; you want the AIO fans to trigger on liquid temperature, not the CPU temperature, since they're not directly linked - but what I got (due to it being changed from the default) was the fan spinning every time the CPU temperature fluctuated, even if the fans spinning did very little to change the actual temperature.

Other than that, a Kraken x72 was way overkill for a 3600, but it's awesome having cooling this silent (when it's configured right)!

Seamonster
Apr 30, 2007

IMMER SIEGREICH
New chipset drivers are out to address the boost/voltage eagerness of Zen2. Also some Destiny 2 stuff.

I'm seeing more stable idle temps in the 40-41C range in a rather warm room instead of 46-47 with spikes up to 50+; 3800X, 280mm AIO on low.

Golluk
Oct 22, 2008

BangersInMyKnickers posted:

2500k's with a mid 4ghz overclock is a lot more usable again with 1903 and the retpoline mitigation, performance degradation was extremely painful until that came out

Any way to confirm I have those applied? Still running on a 2500K.

Edit: Looks like I check if I have KB4482887 for retpoline.

Golluk fucked around with this message at 18:31 on Jul 30, 2019


Inept
Jul 8, 2003


How were Geocities PC parts reviews better done than this? Youtube has so many long useless videos like this, who the gently caress watches this
