|
The 1% low improvement is the most impressive part of these benchmarks for me, even though at the resolutions most play at you may never notice them until the next GPU gen, if even that. Still, consistency is what I'm most interested in rather than huge average boosts. If you really, really like Outer Worlds, guess you gotta get this CPU then
|
# ? Apr 20, 2022 20:20 |
|
I'm still considering a CPU upgrade, and wanted to ask if the occasional hitch in a game would be reflective of a 1% low and be improved by a stronger processor? I have a 3600X and mostly it does great, but in some games it will very occasionally have a stutter that as far as I can tell appears to be CPU-based. This would happen when my game FPS was perfectly fine otherwise, comfortably over 60 frames and usually higher. This might be placebo, but killing a few background processes seemed to help eliminate this. The games where this would happen were usually more intensive, like BL3 or CP2077, as I recall. My Windows install is pretty barebones as it is used almost entirely for games, so no third-party antivirus or anything running in the background. My GPU is a 2070 Super, as a point of reference.
|
# ? Apr 20, 2022 20:44 |
|
Welp, picked up an X3D from Microcenter
|
# ? Apr 20, 2022 20:48 |
|
CaptainSarcastic posted:I'm still considering a CPU upgrade, and wanted to ask if the occasional hitch in a game would be reflective of a 1% low and be improved by a stronger processor? I have a 3600X and mostly it does great, but in some games it will very occasionally have a stutter that as far as I can tell appears to be CPU-based. This would happen when my game FPS was perfectly fine otherwise, comfortably over 60 frames and usually higher. This might be placebo, but killing a few background processes seemed to help eliminate this. The games where this would happen were usually more intensive, like BL3 or CP2077, as I recall. From looking at CPU usage, with Cyberpunk it was definitely my CPU that was responsible for any stutters (partly because MS Defender was interfering).
|
# ? Apr 20, 2022 20:48 |
|
Happy_Misanthrope posted:The 1% low improvement is the most impressive part of these benchmarks for me, even though at the resolutions most play at you may never realize them until the next GPU gen, if even that. Still consistency is what I'm most interested in rather than huge average boosts. Yeah that's ultimately what's motivating me. I figure the big cache might help it punch above its weight in a few years and prevent dips below the target framerate. That plus DLSS or similar upscaling for my GPU and I'm hoping I'll be able to ride this build out for a good long while without further major upgrades.
|
# ? Apr 20, 2022 20:50 |
|
Rinkles posted:From looking at CPU usage, with Cyberpunk it was definitely my CPU that was responsible for any stutters (partly because MS Defender was interfering). On a 3600x right now. It holds back performance in cp2077 so much that I can crank RT psycho on with DLSS quality at 1440p and the CPU is the bottleneck in cities
|
# ? Apr 20, 2022 21:06 |
|
CaptainSarcastic posted:I'm still considering a CPU upgrade, and wanted to ask if the occasional hitch in a game would be reflective of a 1% low and be improved by a stronger processor? I have a 3600X and mostly it does great, but in some games it will very occasionally have a stutter that as far as I can tell appears to be CPU-based. This would happen when my game FPS was perfectly fine otherwise, comfortably over 60 frames and usually higher. This might be placebo, but killing a few background processes seemed to help eliminate this. The games where this would happen were usually more intensive, like BL3 or CP2077, as I recall. That isn't a 1% low. A 1% low is the 1 worst frame out of 100, so if you are getting 50 FPS it's happening every 2 seconds. If you have bad 1% lows the game may feel "stuttery" or inconsistent (particularly if you use vsync on a 60Hz, non-VRR monitor). An "occasional hitch", if talking about something that happens less than once a minute, is rarely anything to do with the CPU or GPU. It could be the game engine -- something poorly optimized that happens once in a while. It could be background processes stealing CPU time or memory or just blocking something on the game somehow. It could be the internet if playing a multiplayer game. It could be secondary hardware, like audio (Realtek drivers) or weird stuff like the AMD fTPM bug. But the key is, if something isn't happening pretty consistently every ~10-15 seconds, it's not really the CPU/GPU performance doing it and a hardware upgrade is unlikely to solve it. (A complete new system can solve OS or secondary hardware problems, but so can fixing whatever is wrong on the old system.)
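To make the "1 worst frame out of 100" point concrete, here's a minimal sketch of how a 1% low is typically computed from a frame-time capture (PresentMon/CapFrameX-style). The exact convention varies between tools; this version uses the common "average of the slowest 1% of frames" definition, and the numbers are made up for illustration:

```python
def one_percent_low_fps(frame_times_ms):
    """Return the 1% low in FPS from a list of per-frame times (ms)."""
    slowest = sorted(frame_times_ms, reverse=True)  # worst frames first
    n = max(1, len(slowest) // 100)                 # the slowest 1% of frames
    avg_worst_ms = sum(slowest[:n]) / n
    return 1000.0 / avg_worst_ms

# 99 smooth frames at 10 ms (100 FPS) plus one 50 ms hitch:
times = [10.0] * 99 + [50.0]
print(round(one_percent_low_fps(times), 1))  # that one hitch -> 20.0 FPS
```

This is why a single once-a-minute hitch barely moves the 1% low over a long capture, while a stutter every few seconds tanks it.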
|
# ? Apr 20, 2022 21:11 |
|
CaptainSarcastic posted:I'm still considering a CPU upgrade, and wanted to ask if the occasional hitch in a game would be reflective of a 1% low and be improved by a stronger processor? I have a 3600X and mostly it does great, but in some games it will very occasionally have a stutter that as far as I can tell appears to be CPU-based. This would happen when my game FPS was perfectly fine otherwise, comfortably over 60 frames and usually higher. This might be placebo, but killing a few background processes seemed to help eliminate this. The games where this would happen were usually more intensive, like BL3 or CP2077, as I recall.
- excessive usb polling from busted devices
- aforementioned amd ftpm issue
- less-than-great fan curves (eg fan/temp hysteresis) and other cooling issues
- minimum cpu usage being kind of busted on amd (try setting it high in power options and see if you notice a difference)
- god awful peripheral/rgb firmware, which is to say most of it
i'd just try to find some frame time benchmarks and compare, because all kinds of things can cause judder
|
# ? Apr 20, 2022 21:13 |
|
Also, loving vendor software. Like anything besides Afterburner and Precision X is probably a huuuge piece of poo poo that will at least sometimes just destroy your system.
|
# ? Apr 20, 2022 21:25 |
|
Yeah, I think I got the fTPM bug when I had it turned on out of cussedness, and I get some Bluetooth audio glitches once in a while, but those seem unrelated to the hitches I'm seeing - that is, if the audio glitches it doesn't affect framerate. I should probably do a more thorough look at the lower level hardware drivers. A CPU upgrade is still tempting - the 3600X is a good processor, but part of me wants to upgrade, and going from 6 to 8 cores feels like it would give my system a little longer legs. Oh, and I avoid vendor software like the plague.
|
# ? Apr 20, 2022 21:44 |
|
explosivo posted:Welp, picked up an X3D from Microcenter Holy shitfuck I just put this in and the difference is immediately noticeable. This chip might be the Real Deal. Also first time replacing a CPU with the AIO cooler and the lil hooks on the side made that process a breeze. In and out in like 15-20 minutes.
|
# ? Apr 20, 2022 22:27 |
|
explosivo posted:Holy shitfuck I just put this in and the difference is immediately noticeable. This chip might be the Real Deal. Also first time replacing a CPU with the AIO cooler and the lil hooks on the side made that process a breeze. In and out in like 15-20 minutes. Nice! What did you upgrade from?
|
# ? Apr 21, 2022 00:09 |
|
In case anyone wanted an update on my liquid freezer ii problem I just got an exchange through Amazon and the new one has the same issue. Again, good spread on the heat spreader so I’m thinking Arctic must be having some production issues. They haven’t responded to my support case yet so I’m guessing they weren’t kidding about the 9 days to hear a reply thing.
|
# ? Apr 21, 2022 00:19 |
|
Kibner posted:Nice! What did you upgrade from? A 3900X. It wasn't bad but when I got my 3080ti it clearly became the weak link. Now it feels like I'm seeing the full power of the GPU too and my FPS is pretty consistently 100+ when I was struggling to achieve a steady 60 in something like far cry 6.
|
# ? Apr 21, 2022 00:22 |
|
Cygni posted:Naw, not unless you are doing something that actually uses the extra cores. If you really want a shiny new toy, consider making an ITX HTPC/NAS/Pfsense/Home server box. Its fun, cheaper (especially with used parts), and useful! Already have a home server that's kitted out. I could always use those extra cores for more virtual machines. It really wouldn't be an upgrade and I know it. I just want an excuse
|
# ? Apr 21, 2022 00:37 |
|
explosivo posted:A 3900X. It wasn't bad but when I got my 3080ti it clearly became the weak link. Now it feels like I'm seeing the full power of the GPU too and my FPS is pretty consistently 100+ when I was struggling to achieve a steady 60 in something like far cry 6. Wow, what resolution?
|
# ? Apr 21, 2022 00:58 |
|
Rinkles posted:Wow, what resolution? 1440p. I haven't tried 4k yet but I'm very curious to see how much of a difference it makes. Edit: I tried Dying Light 2 and the difference is staggering. Same with Far Cry 6, DL2 just felt poorly optimized to me. I was playing before with everything maxed out with the RTX off and could mostly hang around 60 but it'd dip below quite a bit. This time I turned every RTX toggle on and with DLSS Quality was getting ~100-120 fps. explosivo fucked around with this message at 01:59 on Apr 21, 2022 |
# ? Apr 21, 2022 01:53 |
|
did you change anything else in your system - those numbers seem eyepopping
|
# ? Apr 21, 2022 07:11 |
|
Far Cry 6 is one of the titles where the X3D absolutely runs away. If I could be selfish, do any of you X3D-havers have a copy of DiRT Rally 2 and a strong desire to run its openbenchmarking suite? If anyone happens to have both an X3D & 6900XT, I'll pay for your copy of the game ($19.99 on Steam) once you submit results for this test at 1080/1440 The F1 2021 benchmarks should be evidence enough given it's the more popular cousin of D:R2, but as someone who just really likes the game I need to know what I'm missing out on before I commit to waiting for Zen 4
|
# ? Apr 21, 2022 07:24 |
|
New Zealand can eat me posted:Far Cry 6 is one of the titles where the X3D absolutely runs away. Yeah dogshit engines seem to benefit a ton once they fit into cache - and that’s not a bad thing, there are a lot of dogshit engines! Source (CS:GO, TF2, etc) was another - but apparently they don’t gain further on X3D, they fit into cache and that’s that. Battlefield V probably is another, there’s a gently caress load of stuff going on with physics there, it’s an overly massive open world destructible 64p environment with dynamic lighting / etc. The single-player campaign doesn’t stress your system nearly as much.
|
# ? Apr 21, 2022 07:51 |
|
Jawn V6 foretold how a 1GB cache would bring 'SSD-like' performance improvements. Praised be the platform oracle.
|
# ? Apr 21, 2022 07:54 |
|
If that's the path forwards, just go with HBM-on-package like Sapphire Rapids. That way you could forego RAM entirely. This would allow for new motherboard form factors that ditch DIMM slots entirely to allow more room for the heat sinks of future 1000W+ GPUs.
|
# ? Apr 21, 2022 08:29 |
|
To be fair I (and surely many others) took notice of the potential of huge caches with regards to gaming performance back at the time of the 5775C, where disabling the integrated graphics to make full use of the 128 MB of eDRAM really smoothed out minimum frame times. That was effectively L4 cache and not hugely fast, but it was still notable. It was a path I'd hoped Intel would take. They didn't, AMD did, and they're seeing the huge perf gains
|
# ? Apr 21, 2022 08:31 |
|
Intel's doing it in their next-gen Xeon architecture but as per usual it's a feature that would be helpful everywhere but Intel is locking it behind "gently caress you more money".
|
# ? Apr 21, 2022 08:47 |
|
Dr. Video Games 0031 posted:No idea how this relates to anything else, though. Creating a proper benchmark pass in a Paradox game sounds hard due to the amount of RNG involved. Even if you set everything up properly so you're in observer mode, the AI will do wildly different things with each pass which can affect the results. If a global/galaxy-spanning war breaks out in one pass but not another, that's too much variance to make for usable data. what you do is take a very-late-game save so the game is cpu-bound rather than time-bound, set game to max speed, and let it run for 3 years. not much variance happens in stellaris in 3 years, especially in late game when all the events have already fired then measure the seconds it takes to run said 3 years Truga fucked around with this message at 08:53 on Apr 21, 2022 |
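The methodology above boils down to "fix the amount of simulated time, measure the wall-clock time it takes". A minimal sketch of that harness, with a stand-in `advance_one_day` callable as the simulated tick (hypothetical - Stellaris exposes no such API, so in practice you'd time the 3 in-game years with a stopwatch or a screen timer):

```python
import time

def run_benchmark(advance_one_day, days=3 * 360):
    """Time how long a fixed span of in-game days takes to simulate.

    Lower wall-clock time = faster CPU for this workload. Using a fixed
    span of simulated time (not a fixed wall-clock span) is what makes
    runs comparable across machines.
    """
    start = time.perf_counter()
    for _ in range(days):
        advance_one_day()
    return time.perf_counter() - start

# Stand-in workload so the sketch actually runs; the real workload is
# the game's own daily tick on a late-game save at max speed.
elapsed = run_benchmark(lambda: sum(range(1000)))
print(f"3 in-game years took {elapsed:.2f}s of wall time")
```

The late-game save matters because it makes every tick expensive enough that the run is CPU-bound rather than capped by the game's speed setting.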
# ? Apr 21, 2022 08:51 |
|
HalloKitty posted:To be fair I (and surely many others) took notice of the potential of huge caches with regards to gaming performance back at the time of the 5775C, where disabling the integrated graphics to make full use of the 128 MB of eDRAM really smoothed out minimum frame times. That was effectively l4 cache and not hugely fast, but it was still notable. It was a path I'd hoped Intel would take. They didn't, AMD did, and they're seeing the huge perf gains The rumour is that the main P-core upgrade with Raptorlake is 1.5x the L2 and L3 cache, so Intel is catching on a little bit. EMIB seems to me like a pretty fantastic technology so I imagine if they're working on a cache die that could be great.
|
# ? Apr 21, 2022 10:15 |
|
Kazinsal posted:Intel's doing it in their next-gen Xeon architecture but as per usual it's a feature that would be helpful everywhere but Intel is locking it behind "gently caress you more money". The X3D is also a one-off (at least for now), regular Zen 4 won't be using a cache die either.
|
# ? Apr 21, 2022 11:15 |
|
New Zealand can eat me posted:Far Cry 6 is one of the titles where the X3D absolutely runs away. I was more surprised that the 3900X was holding it back that much.
|
# ? Apr 21, 2022 11:21 |
|
Arzachel posted:The X3D is also a one-off (at least for now), regular Zen 4 won't be using a cache die either. I think the current rumours say there's Zen 4 chips with 3D v-cache on the way, but they aren't going to be available when Zen 4 launches this year. Could just mean there's a 7800X3D next year and nothing more though, who knows.
|
# ? Apr 21, 2022 11:50 |
|
New Zealand can eat me posted:Far Cry 6 is one of the titles where the X3D absolutely runs away. I was thinking about playing this again recently actually, now's probably a good time to. I'll reinstall this today and post my results. Edit: oh drat this only works on Linux, nevermind. Sorry Edit 2: I will say though, it runs drat well in 1440p! Tried a night race in the rain and it's pretty regularly in the 100-144 fps range with a couple brief dips below to around 60. explosivo fucked around with this message at 13:58 on Apr 21, 2022 |
# ? Apr 21, 2022 12:31 |
|
HalloKitty posted:To be fair I (and surely many others) took notice of the potential of huge caches with regards to gaming performance back at the time of the 5775C, where disabling the integrated graphics to make full use of the 128 MB of eDRAM really smoothed out minimum frame times. That was effectively l4 cache and not hugely fast, but it was still notable. It was a path I'd hoped Intel would take. They didn't, AMD did, and they're seeing the huge perf gains I've always wondered why Intel didn't bother looking into that any further, since basically everyone was impressed with how much work that "L4" cache was doing for the 5775C. I'm sure it added a bit to the BOM and all, and I know at the time Intel was running so far ahead of AMD that they didn't have to bother with anything fancy, but still.
|
# ? Apr 21, 2022 13:59 |
|
it's gonna "own" when stacked cache is gonna be the premium tier offering the way Intel held out on hyperthreading for so long
|
# ? Apr 21, 2022 14:04 |
|
gradenko_2000 posted:it's gonna "own" when stacked cache is gonna be the premium tier offering the way Intel held out on hyperthreading for so long At least it should give palpable gains without also exposing yourself to an entire new category of hardware security vulnerabilities But yeah, I do expect we'll see it as a +$50/$100 premium option in the future if these sell well.
|
# ? Apr 21, 2022 14:09 |
|
lih posted:I think the current rumours say there's Zen 4 chips with 3D v-cache on the way, but they aren't going to be available when Zen 4 launches this year. Could just mean there's a 7800X3D next year and nothing more though, who knows. Yeah, probably depends on how well the 5800X3D does too. DrDork posted:I've always wondered why Intel didn't bother looking into that any further, since basically everyone was impressed with how much work that "L4" cache was doing for the 5775C. I'm sure it added a bit to the BOM and all, and I know at the time Intel was running so far ahead of AMD that they didn't have to bother with anything fancy, but still. Expensive, niche, and honestly not that great at making videogames go faster unless compared to systems running DDR4 1800-2133.
|
# ? Apr 21, 2022 14:27 |
|
Arzachel posted:Expensive, Niche and honestly not that great at making videogames go faster unless compared to systems running DDR4 1800-2133. Expensive and niche seems to be what Intel loves to court, though: if they can add another $20 to the BOM to generate a new market segment they can upcharge $100+ for, they seem to be all for it. IIRC the 1% lows for the 5775C were very noticeably better, even against high-end systems where it wasn't showing a whole lot of avg FPS gain--much like with the X3D. Just seems like a lost opportunity to explore an interesting tech branch. Ah well, it might have taken 7 years, but at least AMD has picked up the idea.
|
# ? Apr 21, 2022 14:45 |
|
gradenko_2000 posted:it's gonna "own" when stacked cache is gonna be the premium tier offering the way Intel held out on hyperthreading for so long Things that cost money to make, costing money to buy? What the hell! The thing about hyperthreading was that the logic was on every chip and cost functionally nothing to turn on. Heck, creating the functions to be able to fuse off hyperthreading added to the cost of chips. It was a great example of a company with near-monopoly position to coercively extract money from the market. Stacked cache OTOH is an extra chunk of silicon and a new stage of manufacturing. Those are real actual costs. The bonding process to stack the chips probably has a failure rate, while I doubt there were many intel chips that would only pass QC with hyperthreading off. It's 100% justified for the chips with 3d cache to cost more. Fortunately, since we have a competitive environment now, I expect both companies to eventually have their Gamer-Cache™️ models and for the cost of buying it to be relatively in line with the cost of production.
|
# ? Apr 21, 2022 15:21 |
|
I love weird niche tech and make bad decisions so I ordered a X3D
|
# ? Apr 21, 2022 16:47 |
|
FuturePastNow posted:I love weird niche tech and make bad decisions so I ordered a X3D I think you'll find this was a very good and smart decision
|
# ? Apr 21, 2022 16:59 |
|
There are rumors it'll help with VR. I am also considering it. I don't see it at best buy yet, and amd site is maybe getting hammered?
|
# ? Apr 21, 2022 17:43 |
|
Quaint Quail Quilt posted:There are rumors it'll help with VR. If by help with VR, you mean help with minimum frametimes, it will depend on the game. But like all games at higher resolutions (like the ones the VR headsets tend to run at), GPU is gonna matter way, way, way more. Cygni fucked around with this message at 18:08 on Apr 21, 2022 |
# ? Apr 21, 2022 18:05 |