Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
The 1% low improvement is the most impressive part of these benchmarks for me, even though at the resolutions most people play at you may never notice it until the next GPU gen, if even that. Still, consistency is what I'm most interested in rather than huge average boosts.

If you really, really like Outer Worlds, guess you gotta get this CPU then

CaptainSarcastic
Jul 6, 2013



I'm still considering a CPU upgrade, and wanted to ask if the occasional hitch in a game would be reflected in a 1% low and be improved by a stronger processor? I have a 3600X and mostly it does great, but in some games it will very occasionally have a stutter that, as far as I can tell, appears to be CPU-based. This would happen when my game FPS was otherwise perfectly fine, comfortably over 60 frames and usually higher. This might be placebo, but killing a few background processes seemed to help eliminate it. The games where this would happen were usually more intensive, like BL3 or CP2077, as I recall.

My Windows install is pretty barebones as it is used almost entirely for games, so no third-party antivirus or anything running in the background.

My GPU is a 2070 Super, as a point of reference.

explosivo
May 23, 2004

Fueled by Satan

Welp, picked up an X3D from Microcenter :woop:

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

CaptainSarcastic posted:

I'm still considering a CPU upgrade, and wanted to ask if the occasional hitch in a game would be reflected in a 1% low and be improved by a stronger processor? I have a 3600X and mostly it does great, but in some games it will very occasionally have a stutter that, as far as I can tell, appears to be CPU-based. This would happen when my game FPS was otherwise perfectly fine, comfortably over 60 frames and usually higher. This might be placebo, but killing a few background processes seemed to help eliminate it. The games where this would happen were usually more intensive, like BL3 or CP2077, as I recall.

My Windows install is pretty barebones as it is used almost entirely for games, so no third-party antivirus or anything running in the background.

My GPU is a 2070 Super, as a point of reference.

From looking at CPU usage, with Cyberpunk it was definitely my CPU that was responsible for any stutters (partly because MS Defender was interfering).

acksplode
May 17, 2004



Happy_Misanthrope posted:

The 1% low improvement is the most impressive part of these benchmarks for me, even though at the resolutions most people play at you may never notice it until the next GPU gen, if even that. Still, consistency is what I'm most interested in rather than huge average boosts.

Yeah that's ultimately what's motivating me. I figure the big cache might help it punch above its weight in a few years and prevent dips below the target framerate. That plus DLSS or similar upscaling for my GPU and I'm hoping I'll be able to ride this build out for a good long while without further major upgrades.

MikeC
Jul 19, 2004
BITCH ASS NARC

Rinkles posted:

From looking at CPU usage, with Cyberpunk it was definitely my CPU that was responsible for any stutters (partly because MS Defender was interfering).

On a 3600X right now. It holds back performance in CP2077 so much that I can crank RT Psycho on with DLSS Quality at 1440p and the CPU is still the bottleneck in cities

Klyith
Aug 3, 2007

GBS Pledge Week

CaptainSarcastic posted:

I'm still considering a CPU upgrade, and wanted to ask if the occasional hitch in a game would be reflected in a 1% low and be improved by a stronger processor? I have a 3600X and mostly it does great, but in some games it will very occasionally have a stutter that, as far as I can tell, appears to be CPU-based. This would happen when my game FPS was otherwise perfectly fine, comfortably over 60 frames and usually higher. This might be placebo, but killing a few background processes seemed to help eliminate it. The games where this would happen were usually more intensive, like BL3 or CP2077, as I recall.

That isn't a 1% low. A 1% low is the single worst frame out of 100, so if you are getting 50 FPS it's happening every 2 seconds. If you have bad 1% lows the game may feel "stuttery" or inconsistent (particularly if you use vsync on a 60Hz, non-VRR monitor).
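
(To make the arithmetic concrete, here's a rough sketch of how a 1% low can be computed from a frame-time log, e.g. one captured with a tool like CapFrameX or PresentMon. Tools differ on the exact definition; this version averages the slowest 1% of frames, and the numbers are made up.)

    # Rough sketch: average FPS and "1% low" FPS from frame times in ms.
    frame_times_ms = [20.0] * 99 + [120.0]  # ~50 FPS with one bad hitch

    # Average FPS: total frames over total time.
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

    # 1% low: average the slowest 1% of frame times, convert back to FPS.
    slowest = sorted(frame_times_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_1pct_fps = 1000 / (sum(worst_1pct) / len(worst_1pct))

    print(f"avg: {avg_fps:.1f} FPS, 1% low: {low_1pct_fps:.1f} FPS")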


An "occasional hitch", if talking about something that happens less than once a minute, is rarely anything to do with the CPU or GPU.

It could be the game engine -- something poorly optimized that happens once in a while. It could be background processes stealing CPU time or memory, or just blocking something in the game somehow. It could be the internet if you're playing a multiplayer game. It could be secondary hardware, like audio (Realtek drivers), or weird stuff like the AMD fTPM bug.

But the key is, if something isn't happening pretty consistently every ~10-15 seconds, it's not really CPU/GPU performance doing it, and a hardware upgrade is unlikely to solve it. (A complete new system can solve OS or secondary hardware problems, but so can fixing whatever is wrong on the old system.)

kliras
Mar 27, 2021

CaptainSarcastic posted:

I'm still considering a CPU upgrade, and wanted to ask if the occasional hitch in a game would be reflected in a 1% low and be improved by a stronger processor? I have a 3600X and mostly it does great, but in some games it will very occasionally have a stutter that, as far as I can tell, appears to be CPU-based. This would happen when my game FPS was otherwise perfectly fine, comfortably over 60 frames and usually higher. This might be placebo, but killing a few background processes seemed to help eliminate it. The games where this would happen were usually more intensive, like BL3 or CP2077, as I recall.

My Windows install is pretty barebones as it is used almost entirely for games, so no third-party antivirus or anything running in the background.

My GPU is a 2070 Super, as a point of reference.

edge cases can include things like

- excessive usb polling from busted devices
- aforementioned amd ftpm issue
- less-than-great fan curves (eg fan/temp hysteresis) and other cooling issues
- minimum cpu usage being kind of busted on amd (try setting it high in power options and see if you notice a difference; there's a command-line sketch at the end of this post)
- god awful peripheral/rgb firmware, which is to say most of it

i'd just try to find some frame time benchmarks and compare, because all kinds of things can cause judder
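
for the minimum cpu state tweak above, a minimal command-line sketch, assuming a stock windows 10/11 power plan (powercfg and its PROCTHROTTLEMIN alias are standard windows, but check your current plan with powercfg /query first, and run this elevated):

    # Sketch: raise Windows' "minimum processor state" to 100% via powercfg.
    # /setacvalueindex covers plugged-in (AC) power; laptops on battery
    # would also want /setdcvalueindex.
    import subprocess

    for args in (
        ["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
         "SUB_PROCESSOR", "PROCTHROTTLEMIN", "100"],
        ["powercfg", "/setactive", "SCHEME_CURRENT"],  # re-apply the plan
    ):
        subprocess.run(args, check=True)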

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Also, loving vendor software. Like anything besides Afterburner and Precision X is probably a huuuge piece of poo poo that will at least sometimes just destroy your system.

CaptainSarcastic
Jul 6, 2013



Yeah, I think I got the fTPM bug when I had it turned on out of cussedness, and I get some Bluetooth audio glitches once in a while, but those seem unrelated to the hitches I'm seeing - that is, when the audio glitches it doesn't affect framerate. I should probably take a more thorough look at the lower-level hardware drivers. A CPU upgrade is still tempting - the 3600X is a good processor, but part of me wants to upgrade, and going from 6 to 8 cores feels like it would give my system a little longer legs.

Oh, and I avoid vendor software like the plague.

explosivo
May 23, 2004

Fueled by Satan

explosivo posted:

Welp, picked up an X3D from Microcenter :woop:

Holy shitfuck, I just put this in and the difference is immediately noticeable. This chip might be the Real Deal. Also, first time replacing a CPU with the AIO cooler, and the lil hooks on the side made that process a breeze. In and out in like 15-20 minutes.

Kibner
Oct 21, 2008

Acguy Supremacy

explosivo posted:

Holy shitfuck, I just put this in and the difference is immediately noticeable. This chip might be the Real Deal. Also, first time replacing a CPU with the AIO cooler, and the lil hooks on the side made that process a breeze. In and out in like 15-20 minutes.

Nice! What did you upgrade from?

hobbesmaster
Jan 28, 2008

In case anyone wanted an update on my Liquid Freezer II problem: I just got an exchange through Amazon and the new one has the same issue. Again, good spread on the heat spreader, so I'm thinking Arctic must be having some production issues. They haven't responded to my support case yet, so I'm guessing they weren't kidding about the 9 days to hear a reply thing.

explosivo
May 23, 2004

Fueled by Satan

Kibner posted:

Nice! What did you upgrade from?

A 3900X. It wasn't bad, but when I got my 3080 Ti it clearly became the weak link. Now it feels like I'm seeing the full power of the GPU too, and my FPS is pretty consistently 100+ where I was struggling to achieve a steady 60 in something like Far Cry 6.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Cygni posted:

Naw, not unless you are doing something that actually uses the extra cores. If you really want a shiny new toy, consider making an ITX HTPC/NAS/pfSense/home server box. It's fun, cheaper (especially with used parts), and useful!

Already have a home server that's kitted out. I could always use those extra cores for more virtual machines. It really wouldn't be an upgrade and I know it. I just want an excuse ;)

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

explosivo posted:

A 3900X. It wasn't bad, but when I got my 3080 Ti it clearly became the weak link. Now it feels like I'm seeing the full power of the GPU too, and my FPS is pretty consistently 100+ where I was struggling to achieve a steady 60 in something like Far Cry 6.

Wow, what resolution?

explosivo
May 23, 2004

Fueled by Satan

Rinkles posted:

Wow, what resolution?

1440p. I haven't tried 4k yet but I'm very curious to see how much of a difference it makes.

Edit: I tried Dying Light 2 and the difference is staggering. Same with Far Cry 6; DL2 just felt poorly optimized to me. I was playing before with everything maxed out with RTX off and could mostly hang around 60, but it'd dip below quite a bit. This time I turned every RTX toggle on and with DLSS Quality was getting ~100-120 fps.

explosivo fucked around with this message at 01:59 on Apr 21, 2022

shrike82
Jun 11, 2005

did you change anything else in your system - those numbers seem eye-popping

New Zealand can eat me
Aug 29, 2008

:matters:


Far Cry 6 is one of the titles where the X3D absolutely runs away.

If I could be selfish, do any of you X3D-havers have a copy of DiRT Rally 2 and a strong desire to run its openbenchmarking suite?

If anyone happens to have both an X3D & 6900XT, I'll pay for your copy of the game ($19.99 on Steam) once you submit results for this test at 1080/1440

The F1 2021 benchmarks should be evidence enough given it's the more popular cousin of D:R2, but as someone who just really likes the game, I need to know what I'm missing out on before I commit to waiting for Zen 4.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

New Zealand can eat me posted:

Far Cry 6 is one of the titles where the X3D absolutely runs away.

Yeah dogshit engines seem to benefit a ton once they fit into cache - and that’s not a bad thing, there are a lot of dogshit engines! Source (CS:GO, TF2, etc) was another - but apparently they don’t gain further on X3D, they fit into cache and that’s that.

Battlefield V is probably another; there's a gently caress load of stuff going on with physics there, it's an overly massive open-world destructible 64p environment with dynamic lighting / etc. The single-player campaign doesn't stress your system nearly as much.

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!
Jawn V6 foretold how a 1GB cache would bring 'SSD-like' performance improvements. Praised be the platform oracle.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
If that's the path forward, just go with HBM-on-package like Sapphire Rapids. That way you could forgo RAM entirely, which would allow for new motherboard form factors that ditch DIMM slots and leave more room for the heat sinks of future 1000W+ GPUs.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
To be fair, I (and surely many others) took notice of the potential of huge caches for gaming performance back at the time of the 5775C, where disabling the integrated graphics to make full use of the 128 MB of eDRAM really smoothed out minimum frame times. That was effectively L4 cache and not hugely fast, but it was still notable. It was a path I'd hoped Intel would take. They didn't, AMD did, and they're seeing the huge perf gains.

Kazinsal
Dec 13, 2011
Intel's doing it in their next-gen Xeon architecture, but as per usual it's a feature that would be helpful everywhere and Intel is locking it behind "gently caress you, more money".

Truga
May 4, 2014
Lipstick Apathy

Dr. Video Games 0031 posted:

No idea how this relates to anything else, though. Creating a proper benchmark pass in a Paradox game sounds hard due to the amount of RNG involved. Even if you set everything up properly so you're in observer mode, the AI will do wildly different things with each pass, which can affect the results. If a global/galaxy-spanning war breaks out in one pass but not another, that's too much variance to make for usable data.

what you do is take a very-late-game save so the game is cpu-bound rather than time-bound, set the game to max speed, and let it run for 3 years. not much variance happens in stellaris in 3 years, especially in late game when all the events have already fired

then measure the seconds it takes to run said 3 years
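
(truga's method, expressed as a sketch: score the cpu by wall-clock seconds for a fixed span of simulated time. run_three_years() is a hypothetical stand-in for "load the late-game save, set max speed, wait for 3 in-game years"; in practice you'd just use a stopwatch over the real game.)

    # Sketch: time how fast the CPU chews through fixed simulated time.
    import time

    def benchmark_pass(run_three_years) -> float:
        start = time.perf_counter()
        run_three_years()  # hypothetical: the manual stopwatch step
        return time.perf_counter() - start

    # Made-up numbers: lower is better, and the ratio is the CPU speedup.
    old_cpu_s, new_cpu_s = 420.0, 290.0
    print(f"speedup: {old_cpu_s / new_cpu_s:.2f}x")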

Truga fucked around with this message at 08:53 on Apr 21, 2022

BurritoJustice
Oct 9, 2012

HalloKitty posted:

To be fair, I (and surely many others) took notice of the potential of huge caches for gaming performance back at the time of the 5775C, where disabling the integrated graphics to make full use of the 128 MB of eDRAM really smoothed out minimum frame times. That was effectively L4 cache and not hugely fast, but it was still notable. It was a path I'd hoped Intel would take. They didn't, AMD did, and they're seeing the huge perf gains.

The rumour is that the main P-core upgrade with Raptor Lake is 1.5x the L2 and L3 cache, so Intel is catching on a little bit.

EMIB seems to me like a pretty fantastic technology, so if they're working on a cache die, I imagine that could be great.

Arzachel
May 12, 2012

Kazinsal posted:

Intel's doing it in their next-gen Xeon architecture, but as per usual it's a feature that would be helpful everywhere and Intel is locking it behind "gently caress you, more money".

The X3D is also a one-off (at least for now); regular Zen 4 won't be using a cache die either.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

New Zealand can eat me posted:

Far Cry 6 is one of the titles where the X3D absolutely runs away.

I was more surprised that the 3900X was holding it back that much.

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.

Arzachel posted:

The X3D is also a one-off (at least for now); regular Zen 4 won't be using a cache die either.

I think the current rumours say there are Zen 4 chips with 3D V-Cache on the way, but they aren't going to be available when Zen 4 launches this year. Could just mean there's a 7800X3D next year and nothing more though, who knows.

explosivo
May 23, 2004

Fueled by Satan

New Zealand can eat me posted:

Far Cry 6 is one of the titles where the X3D absolutely runs away.

If I could be selfish, do any of you X3D-havers have a copy of DiRT Rally 2 and a strong desire to run its openbenchmarking suite?

I was thinking about playing this again recently, actually; now's probably a good time to. I'll reinstall it today and post my results.

Edit: oh drat this only works on Linux, nevermind. Sorry :(

Edit 2: I will say though, it runs drat well at 1440p! Tried a night race in the rain and it's pretty regularly in the 100-144 fps range with a couple of brief dips to around 60.

explosivo fucked around with this message at 13:58 on Apr 21, 2022

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

HalloKitty posted:

To be fair, I (and surely many others) took notice of the potential of huge caches for gaming performance back at the time of the 5775C, where disabling the integrated graphics to make full use of the 128 MB of eDRAM really smoothed out minimum frame times. That was effectively L4 cache and not hugely fast, but it was still notable. It was a path I'd hoped Intel would take. They didn't, AMD did, and they're seeing the huge perf gains.

I've always wondered why Intel didn't bother looking into that any further, since basically everyone was impressed with how much work that "L4" cache was doing for the 5775C. I'm sure it added a bit to the BOM and all, and I know at the time Intel was running so far ahead of AMD that they didn't have to bother with anything fancy, but still.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
it's gonna "own" when stacked cache is gonna be the premium tier offering the way Intel held out on hyperthreading for so long

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

gradenko_2000 posted:

it's gonna "own" when stacked cache is gonna be the premium tier offering the way Intel held out on hyperthreading for so long

At least it should give palpable gains without also exposing yourself to an entire new category of hardware security vulnerabilities :mmmhmm:

But yeah, I do expect we'll see it as a +$50/$100 premium option in the future if these sell well.

Arzachel
May 12, 2012

lih posted:

I think the current rumours say there are Zen 4 chips with 3D V-Cache on the way, but they aren't going to be available when Zen 4 launches this year. Could just mean there's a 7800X3D next year and nothing more though, who knows.

Yeah, probably depends on how well the 5800X3D does too.

DrDork posted:

I've always wondered why Intel didn't bother looking into that any further, since basically everyone was impressed with how much work that "L4" cache was doing for the 5775C. I'm sure it added a bit to the BOM and all, and I know at the time Intel was running so far ahead of AMD that they didn't have to bother with anything fancy, but still.

Expensive, niche, and honestly not that great at making videogames go faster unless compared to systems running DDR4 1800-2133.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Arzachel posted:

Expensive, niche, and honestly not that great at making videogames go faster unless compared to systems running DDR4 1800-2133.

Expensive and niche seems to be what Intel loves to court, though: if they can add another $20 to the BOM to generate a new market segment they can upcharge $100+ for, they seem to be all for it.

IIRC the 1% lows for the 5775C were very noticeably better, even against high-end systems where it wasn't showing a whole lot of avg FPS gain--much like with the X3D.

Just seems like a lost opportunity to explore an interesting tech branch. Ah well, it might have taken 7 years, but at least AMD has picked up the idea.

Klyith
Aug 3, 2007

GBS Pledge Week

gradenko_2000 posted:

it's gonna "own" when stacked cache is gonna be the premium tier offering the way Intel held out on hyperthreading for so long

Things that cost money to make, costing money to buy? What the hell!


The thing about hyperthreading was that the logic was on every chip and cost functionally nothing to turn on. Heck, creating the functions to be able to fuse off hyperthreading added to the cost of chips. It was a great example of a company with a near-monopoly position coercively extracting money from the market.

Stacked cache, OTOH, is an extra chunk of silicon and a new stage of manufacturing. Those are real actual costs. The bonding process to stack the chips probably has a failure rate, while I doubt there were many Intel chips that would only pass QC with hyperthreading off. It's 100% justified for the chips with 3D cache to cost more.

Fortunately, since we have a competitive environment now, I expect both companies to eventually have their Gamer-Cache™️ models and for the cost of buying it to be relatively in line with the cost of production.

FuturePastNow
May 19, 2014


I love weird niche tech and make bad decisions so I ordered a X3D

explosivo
May 23, 2004

Fueled by Satan

FuturePastNow posted:

I love weird niche tech and make bad decisions so I ordered a X3D

I think you'll find this was a very good and smart decision

Quaint Quail Quilt
Jun 19, 2006


Ask me about that time I told people mixing bleach and vinegar is okay
There are rumors it'll help with VR.
I am also considering it. I don't see it at Best Buy yet, and AMD's site is maybe getting hammered?

Cygni
Nov 12, 2005

raring to post

Quaint Quail Quilt posted:

There are rumors it'll help with VR.
I am also considering it. I don't see it at Best Buy yet, and AMD's site is maybe getting hammered?

If by help with VR you mean help with minimum frametimes, it will depend on the game. But as with all games at higher resolutions (like the ones VR headsets tend to run at), the GPU is gonna matter way, way, way more.

Cygni fucked around with this message at 18:08 on Apr 21, 2022
