pairofdimes
May 20, 2001

blehhh

Rinkles posted:

Sorry for the posting spam, first PC build and all that.



The motherboard I ordered has these internal USB headers (plus more for older USB standards). I can use them with a case's front I/O panel, right? I ask because I read a random post that said the motherboard doesn't have a front-panel USB-C header.

Both of those headers on the motherboard can provide front-panel USB. What case and motherboard is this? You'll want to make sure the motherboard has all the headers the case needs; I think PCPartPicker can also check this.


Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Z590 Aorus ELITE AX and the Lian Li Lancool 2 Mesh.

Begall
Jul 28, 2008
What do you guys think about this build? https://uk.pcpartpicker.com/list/zNwtcT All the purchased parts are reused from my existing Ryzen 3600 build in a Thermaltake Core V21 case; the aim is to make something that looks better on my desk. I'm thinking of replacing the front fan on the cooler with one of the Aer 2s, but my only concern is whether the non-gaming acoustics will be as good as my current setup with 200mm and 140mm Noctua case fans. I sit next to it all day for work, so I can't be doing with anything audible.

pairofdimes
May 20, 2001

blehhh

Rinkles posted:

Z590 Aorus ELITE AX and the Lian Li Lancool 2 Mesh.

It looks like they should work together.

Case:

2 x USB 3.0
1 x USB 3.1 TYPEC (Optional)

Motherboard:

1 x USB Type-C® port with USB 3.2 Gen 2 support, available through the internal USB header
5 x USB 3.2 Gen1 ports (3 ports on the back panel, 2 ports available through the internal USB header)

The case info page uses the old names for the USB ports, which makes it annoying to compare, but USB 3.0 = USB 3.2 Gen 1 and USB 3.1 = USB 3.2 Gen 2.
Based on that, the motherboard has enough headers for the front-panel USB. Someone correct me if I'm wrong, since the USB naming conventions are terrible.
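The renames are mechanical enough to capture in a tiny lookup table. The helper below is a hypothetical sketch (the function name is made up); the mapping itself follows the USB-IF renaming, with the spec transfer rates in parentheses:

```python
# Legacy USB marketing names vs. the current USB 3.2 naming scheme,
# so case and motherboard spec sheets can be compared directly.
LEGACY_TO_CURRENT = {
    "USB 3.0": "USB 3.2 Gen 1 (5 Gbps)",
    "USB 3.1": "USB 3.2 Gen 2 (10 Gbps)",
    "USB 3.1 Gen 1": "USB 3.2 Gen 1 (5 Gbps)",
    "USB 3.1 Gen 2": "USB 3.2 Gen 2 (10 Gbps)",
}

def normalize(port_name: str) -> str:
    """Return the USB 3.2-style name for a legacy spec-sheet label."""
    return LEGACY_TO_CURRENT.get(port_name, port_name)

print(normalize("USB 3.0"))  # USB 3.2 Gen 1 (5 Gbps)
print(normalize("USB 3.1"))  # USB 3.2 Gen 2 (10 Gbps)
```

So the case's "2 x USB 3.0" wants the motherboard's USB 3.2 Gen 1 header, and the optional "USB 3.1 TYPEC" wants the Gen 2 Type-C header.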

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

pairofdimes posted:

It looks like they should work together.

Case:

2 x USB 3.0
1 x USB 3.1 TYPEC (Optional)

Motherboard:

1 x USB Type-C® port with USB 3.2 Gen 2 support, available through the internal USB header
5 x USB 3.2 Gen1 ports (3 ports on the back panel, 2 ports available through the internal USB header)

The case info page uses the old names for the USB ports, which makes it annoying to compare, but USB 3.0 = USB 3.2 Gen 1 and USB 3.1 = USB 3.2 Gen 2.
Based on that, the motherboard has enough headers for the front-panel USB. Someone correct me if I'm wrong, since the USB naming conventions are terrible.

Thanks. The quirk with the case is that it has a hole for USB C, but you need to buy the cable/port to actually use it. ~$15.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Choosing 3200 CL16 versus 3600 CL18 isn't gonna matter much, right?

roomtone
Jul 1, 2021

by Fluffdaddy
are amd processors better than the intel ones in the mid-high range? as in, i'm looking to get a custom prebuilt with a 3060 Ti in the £1500 range.

i see that when you get up to mega-expensive pcs, the intel i9 12th gen chips seem to be the ones people say are best, but in the mid range i'm not sure whether to go for that i5 11600KF or start again and try customising something with an AMD processor.

DerekSmartymans
Feb 14, 2005

The
Copacetic
Ascetic

roomtone posted:

are amd processors better than the intel ones in the mid-high range? as in, i'm looking to get a custom prebuilt with a 3060 Ti in the £1500 range.

i see that when you get up to mega-expensive pcs, the intel i9 12th gen chips seem to be the ones people say are best, but in the mid range i'm not sure whether to go for that i5 11600KF or start again and try customising something with an AMD processor.

I think the calculus has changed a bit with Intel's newest chips (12xxx), but for a long time the Ryzen 5600 has been the best bang-for-buck gaming processor. If you need a processor for other tasks, more expensive but useful chips are available (more cores/threads, etc.), but for gaming only I still think the 5600 works fine. It's just not the only choice anymore (and I got an i7 10xxx I'm very happy with, so YMMV of course 😵‍💫).

njsykora
Jan 23, 2012

Robots confuse squirrels.


Intel's new stuff is going to cost more all-in right now, though, and seems to have some weird issues, like not working right with some DRM systems, though I expect that'll be fixed eventually. As a general rule, if you're just getting a PC for games you have no real reason to go above the Ryzen 5600 or Intel xx600 tier. The xx700-xx900 parts are really for people doing video editing and 3D modelling, as well as gamers who want to dick wave about having the biggest numbers.

Begall
Jul 28, 2008
If (like me) you’re looking at needing a motherboard that costs £170+ anyway, and are looking at the 5600X/5800X as a CPU, I think there is a lot to be said for biting the bullet and buying into Intel 12th Gen. In the UK you can get a new 12600KF for < £250 and this is a chip which outperforms the 5800X that still goes new for £330+, but obviously this is highly market dependent. On top of just being a plain better performer, the platform you’re buying into can be expected to support Intel 13th gen while it’s not clear if we’ll see another generation on AM4.

BurritoJustice
Oct 9, 2012

Rinkles posted:

Choosing 3200 CL16 versus 3600 CL18 isn't gonna matter much, right?

3600 CL16 is more likely to be B-die, especially if it's 16-16-16-36, which means you can happily run it at 4000-4200 depending on how high your IMC goes in Gear 1. Plus you can tighten the timings to an almost absurd degree, which can boost performance way more than max speed.
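The "won't matter much" intuition checks out arithmetically: first-word latency is CL clock cycles times the clock period, and since DDR transfers twice per clock, one cycle lasts 2000/rate nanoseconds. A quick sketch (the function name is made up):

```python
def first_word_latency_ns(rate_mts: int, cl: int) -> float:
    """Approximate true CAS latency in nanoseconds.

    DDR transfers twice per clock, so the I/O clock runs at rate/2 MHz
    and one cycle lasts 2000/rate_mts nanoseconds.
    """
    return cl * 2000 / rate_mts

print(first_word_latency_ns(3200, 16))  # 10.0
print(first_word_latency_ns(3600, 18))  # 10.0 - identical latency, more bandwidth
print(first_word_latency_ns(3600, 16))  # ~8.9 - why tightened B-die is prized
```

3600 CL18 matches 3200 CL16 on latency while adding bandwidth, which fits the advice above: take whichever is cheaper unless it's B-die you plan to tune.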

LampkinsMateSteve
Jan 1, 2005

I've really fucked it. Have I fucked it?
About the 5600X for gaming: great six-core chip, no doubt, but there may be an outlier: the new Battlefield game. It seems to slam CPUs (especially on the big 128-player servers), and since it's targeted at the 8-core current-gen consoles, it just gobbles those cores up. I have a 12-core 5900X, and it feels like I get a lot better performance than others who have a 3080 GPU but a lesser CPU.

CoolCab
Apr 17, 2005

glem
did you track CPU utilisation? the only content i've seen on it (jay whining) seemed to imply it was just extremely poorly optimized, or maybe exclusively using a single core or something.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

LampkinsMateSteve posted:

About the 5600X for gaming: great six-core chip, no doubt, but there may be an outlier: the new Battlefield game. It seems to slam CPUs (especially on the big 128-player servers), and since it's targeted at the 8-core current-gen consoles, it just gobbles those cores up. I have a 12-core 5900X, and it feels like I get a lot better performance than others who have a 3080 GPU but a lesser CPU.

It uses like one or two cores

notwithoutmyanus
Mar 17, 2009

Zedsdeadbaby posted:

It uses like one or two cores
That's a trope; games have been multithreaded for a decade, specifically including the Battlefield series. It does matter (and not just for dick waving), especially if you plan on running literally anything outside of your game. A few extra fps here and there adds to the minimum fps, which is realistically what matters. If it didn't, people would just need 2-core boxes, and this thread wouldn't even encourage that. https://www.game-debate.com/news/25726/how-the-number-of-cpu-cores-affects-battlefield-v-pc-performance-bf5-cpu-benchmarks

notwithoutmyanus fucked around with this message at 15:28 on Nov 20, 2021

CoolCab
Apr 17, 2005

glem
you can in fact use a bunch of threads in a way that only stresses one or two cores; most modern games do exactly that. you'd be able to tell via cpu utilization, which is what i said. also, that's kind of an atrocious chip to do this kind of CPU benchmarking on, given it's six(?) years old, and the methodology is very poor - there are radical differences in IPC and cache between the 5820K and modern CPUs.
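The measurement being described - many threads in flight but only one or two cores saturated - shows up directly in per-core utilization samples, e.g. from psutil's cpu_percent(percpu=True). The averaging helper and the numbers below are a hypothetical sketch, not real benchmark data:

```python
def average_per_core(samples):
    """Average per-core utilization (percent) across timed samples.

    samples: a list of per-core percentage lists, e.g. one call to
    psutil.cpu_percent(percpu=True) per second while the game runs.
    """
    n = len(samples)
    return [round(sum(core) / n, 1) for core in zip(*samples)]

# Made-up samples from a "multithreaded" game that still leans on core 0:
samples = [
    [95.0, 20.0, 15.0, 10.0],
    [98.0, 25.0, 10.0, 12.0],
    [97.0, 22.0, 12.0, 11.0],
]
print(average_per_core(samples))  # [96.7, 22.3, 12.3, 11.0] - core 0 is the cap
```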

notwithoutmyanus
Mar 17, 2009
Most modern games do that? Uhhhh, not sure where you get that from. Some loads just aren't parallel, but that doesn't mean they'll intentionally front-load one core; that approach would be explicitly terrible, because you'd overheat your CPU just from playing a game. All games are different and I can't tell you what every development team does, but processors will generally spread load even when they aren't instructed to.
It was just an article from 2018 or something.

notwithoutmyanus fucked around with this message at 15:48 on Nov 20, 2021

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Best case I've seen for getting a 12600K yet (though it sounds like the 5900X's core advantage is underutilized).

CoolCab
Apr 17, 2005

glem
they will try to, and it makes vanishingly little difference compared to the things that actually meaningfully impact framerates, as demonstrated by good testing here:



looky here: where the testing is configured to maximize the CPU bottleneck (1080p despite the 3090, a 128-player game with bots), the alleged gap between a six-core and an eight-core of the exact same generation (5600X versus 5800X) is 2 frames at 0.1%, 4 frames at 1%, and 5 frames overall. this happens because even well-multithreaded games, which offload some of the load to available spare threads (and of course modern programs do this), still heavily rely on one or two cores for the game logic processing, and those cores become the bottleneck. i'm sorry, you don't know what you're talking about here, and the example you brought up demonstrates it, frankly.

e: lol beaten
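For anyone wondering how the 1% and 0.1% "lows" in charts like these are produced: one common convention is the mean FPS over the slowest 1% / 0.1% of frames in a frametime capture. Outlets differ (some report the single frametime at the percentile instead), so treat this as a sketch of one convention with made-up numbers:

```python
def percentile_lows(frametimes_ms):
    """Return (average FPS, 1% low FPS, 0.1% low FPS) from a frametime log.

    "Lows" here = mean FPS over the slowest 1% / 0.1% of frames.
    """
    def mean_fps(times):
        return 1000.0 * len(times) / sum(times)

    worst_first = sorted(frametimes_ms, reverse=True)
    n = len(worst_first)
    one_pct_low = mean_fps(worst_first[: max(1, n // 100)])
    point1_pct_low = mean_fps(worst_first[: max(1, n // 1000)])
    return mean_fps(frametimes_ms), one_pct_low, point1_pct_low

# Synthetic capture: 990 smooth 10 ms frames plus 10 hitches at 20 ms.
# The average barely moves, but the lows drop to half the smooth rate.
avg, low1, low01 = percentile_lows([10.0] * 990 + [20.0] * 10)
print(round(avg, 1), low1, low01)  # 99.0 50.0 50.0
```

This is why the lows, not the average, are the number to watch when arguing about stutter.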

repiv
Aug 13, 2009

warzone got 150 players running at 60fps on last gen consoles so i'm gonna go with :dice: probably being :dice: again

Dr. Video Games 0031
Jul 17, 2004

BF2042 is very much multithreaded and will max out all six of your cores on a 5600X in certain situations.

CoolCab
Apr 17, 2005

glem

Dr. Video Games 0031 posted:

BF2042 is very much multithreaded and will max out all six of your cores on a 5600X in certain situations.

the claim was that a higher-core-count cpu (specifically the 5900x) would radically outperform a six-core (the 5600x), particularly in minimum fps, and that's not borne out by testing at all. again, i don't know dick about battlefield nor the software architecture; i was articulating how you could demonstrate whether it was true - the 5900x would show a high level of cpu utilization if it was meaningfully outperforming the 5600x, and from my limited observations i very heavily doubt it. it's not a trope to observe these things.

notwithoutmyanus
Mar 17, 2009
Again, there is a difference between a 5600 and a 5800 and a 5900, of course, but it will also be multiplied by literally anything else you run. The average home situation is not an isolated benchmark situation; most people will be running other things on their cores as they play games.

I'm not saying it'll be extreme, but even a 10% variance is going to impact smoothness and minimum frame rates when games get heavy on the load.

notwithoutmyanus fucked around with this message at 16:31 on Nov 20, 2021

Dr. Video Games 0031
Jul 17, 2004

I was responding to the guy who said it only uses one or two cores. It most definitely does not.

CoolCab
Apr 17, 2005

glem

Dr. Video Games 0031 posted:

I was responding to the guy who said it only uses one or two cores. It most definitely does not.

ah, i see, apologies.


notwithoutmyanus posted:

Again, there is a difference between a 5600 and a 5800 and a 5900, of course, but it will also be multiplied by literally anything else you run. The average home situation is not an isolated benchmark situation; most people will be running other things on their cores as they play games.

I'm not saying it'll be extreme, but even a 10% variance is going to impact smoothness and minimum frame rates when games get heavy on the load.

well, this is the pc building thread, and people come here for advice on what machines to build. given that most of us obviously can't buy every variation of the hardware, we're reliant on good independent testing, methodologies, and viable heuristics to give good advice. again, i don't think i've ever played a battlefield game; i can only go off benchmarks. when someone asks "do i need a 5900x to play battlefield 2042 on ultra" i need a good and reasoned answer before they can justify the extra expenditure - we're talking about almost literally doubling the price.

and i was under the impression the biggest indicator of stuttering is the 0.1% lows, where the difference is closer to 5%, not 10%, on a set of benchmarks designed to most amplify the CPU difference.

notwithoutmyanus
Mar 17, 2009

CoolCab posted:

ah, i see, apologies.

well, this is the pc building thread, and people come here for advice on what machines to build. given that most of us obviously can't buy every variation of the hardware, we're reliant on good independent testing, methodologies, and viable heuristics to give good advice. again, i don't think i've ever played a battlefield game; i can only go off benchmarks. when someone asks "do i need a 5900x to play battlefield 2042 on ultra" i need a good and reasoned answer before they can justify the extra expenditure - we're talking about almost literally doubling the price.

and i was under the impression the biggest indicator of stuttering is the 0.1% lows, where the difference is closer to 5%, not 10%, on a set of benchmarks designed to most amplify the CPU difference.

Well yeah, that's kinda why I posted. It wasn't to be pedantic. There's a significant enough quality-of-life difference, subject to people's budget limitations and what you do, that there's a need for these offerings with different core counts. There's plenty of range between a web-browsing machine, 1080p gaming (average settings), 1080p gaming (higher settings), higher resolutions (average or higher settings), and media editing. A secondary common factor is "do you have a billion browser tabs?" or "do you run a bunch of apps?" It's not just "okay, 1080p gaming, get a mid-range CPU."

It should realistically be something you scale people on, like a 1-10 where a higher score equates to likely needing higher-performance parts. Spending a little more for more cores (not saying jump from a 5600 to a Threadripper) is going to be budget-constrained, but it also affects how long someone could justify keeping that PC as their main workhorse.

E.g. 2.5 years on a 5600, maybe 3-5 on a 5900 - just hypothetical numbers.

Butterfly Valley
Apr 19, 2007

I am a spectacularly bad poster and everyone in the Schadenfreude thread hates my guts.

notwithoutmyanus posted:

Most people will be running other things on their cores as they play games.

Will they?

CoolCab
Apr 17, 2005

glem

notwithoutmyanus posted:

Well yeah, that's kinda why I posted. It wasn't to be pedantic. There's a significant enough quality-of-life difference, subject to people's budget limitations and what you do, that there's a need for these offerings with different core counts. There's plenty of range between a web-browsing machine, 1080p gaming (average settings), 1080p gaming (higher settings), higher resolutions (average or higher settings), and media editing. A secondary common factor is "do you have a billion browser tabs?" or "do you run a bunch of apps?" It's not just "okay, 1080p gaming, get a mid-range CPU."

It should realistically be something you scale people on, like a 1-10 where a higher score equates to likely needing higher-performance parts. Spending a little more for more cores (not saying jump from a 5600 to a Threadripper) is going to be budget-constrained, but it also affects how long someone could justify keeping that PC as their main workhorse.

E.g. 2.5 years on a 5600, maybe 3-5 on a 5900 - just hypothetical numbers.

i'm sorry, i disagree strongly with several chunks of your logic there. we (and i personally) very typically advise that if you're doing something other than gaming, more cores is more better - you can search for similar phrases and find me saying so a bunch of times (lol). if you have a productivity workload the math is different, obviously. but for gaming, and in particular for this specific title, you would need to justify why you'd spend so much for two more frames at 0.1%, and i don't think it's justified at 1080p, where the problem is sharpest. at any other resolution the problem is compounded as you move to a GPU bottleneck - if we were talking about 1440p or something, the difference becomes more academic:



(again, this is still the most possible cpu stress: the highest player counts, padded out with bots, all of which increases cpu load - in other game modes the load drops significantly)

and the futureproofing argument is, in my opinion, also very poor. maybe the p-core/e-core model will take off, games will start shuffling tasks off to different cores (unlikely, but for the purpose of argument), and the entire existing ryzen stack will perform poorly in modern titles; maybe we'll have another period of stagnation like we had with intel, and a good CPU from today will last a decade. maybe the m1 will take the gaming world by storm and in 3-5 years the bleeding edge will be mac gaming. in the absence of a crystal ball i don't know, and it would be irresponsible for us to make recommendations on that basis, although obviously we do mention what's more likely.

CoolCab fucked around with this message at 17:41 on Nov 20, 2021

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Discord is surprisingly cpu-intensive

Hamboner
Jul 22, 2006

What, you tryin' to mug me?
After sticking to gaming laptops for years I am looking to make the switch to a gaming desktop. I am looking mostly at prebuilts and the Newegg PC building service since I am paranoid I would screw up building my own. I am also pretty computer part illiterate. Appreciate all advice on what I should be looking for and what’s reasonable to expect to spend in the current lovely market.

What country are you in? US
What are you using the system for? Light web and office work (nothing more intense than Microsoft Office and video chats) when I'm working from home, otherwise just gaming. The graphics-intensive games I spend the most time on are Total War titles, not twitchy FPSes.
What's your budget? Looking to ideally keep it under $2,000 but have some wiggle room
If you're gaming, what is your monitor resolution / refresh rate? Currently have a 1080p monitor, though interested in potentially upgrading to 1440p in the next year or two

I don’t need to have a PC with crazy bleeding edge graphics but want something that can handle modern games at medium-high settings at 1440p for the next few years. I was thinking a 3070 might be the sweet spot? I am not in a hurry to get to 4k and I come across prebuilts with 3070s that are less than $2k. My current laptop (which has been struggling lately) has a 1060, for reference.

I also know big PC brand prebuilts like HP can be garbage, but also wasn’t sure if this prebuilt with a 3080ti at $2,250 on discount would be worth pulling the trigger on or if it's overkill for what I need.

Butterfly Valley
Apr 19, 2007

I am a spectacularly bad poster and everyone in the Schadenfreude thread hates my guts.

Hamboner posted:

I also know big PC brand prebuilts like HP can be garbage, but also wasn’t sure if this prebuilt with a 3080ti at $2,250 on discount would be worth pulling the trigger on or if it's overkill for what I need.

Garbage case, and absolutely overkill for your requirements.

CoolCab
Apr 17, 2005

glem
also, the 3080 Ti is a power-hungry card with notorious failure rates in esoteric situations, similar to the 3090. i am very hesitant about an OEM variant of those and the PSU that runs it. as i recall, when gamers nexus tested the dells, the PSUs were (by necessity) overbuilt compared to all the other components, but i don't know if HP is taking a similar strategy.

Pilfered Pallbearers
Aug 2, 2007

notwithoutmyanus posted:

Well yeah, that's kinda why I posted. It wasn't to be pedantic. There's a significant enough quality of life difference, subject to people's budget limitations and what you do that there's a need for these offerings with different core options. There is plenty between web browsing/1080p gaming machine (average setting)/1080p gaming (higher settings)/higher resolutions (again, average/higher settings)/media editing. A secondary common factor is "do you have a billion browser tabs"? or "do you run a bunch of apps?". It's not just okay 1080p gaming get middle end cpu.

It should realistically be something you scale people on like a 1-10 where higher score equates to likely higher performance parts needs. Unfortunately the reality is spending a little more for more cores (not saying jump from 5600 to a threadripper), but in general is going to be budget constrained but also effect how long someone would be justified keeping that pc as their main workhorse.

IE: 2.5 years on a 5600, maybe 3-5 on a 5900. Just hypothetical on those examples.

You should give up on this line of logic.

Benchmarking makes it clear that the differences aren’t high, and future proofing is never a good basis to purchase PC components on.

This thread tends to focus on price/performance rather than cutting edge insanity. Spending $300 to maybe get a ~2% increase in the most intensive situations is deeply stupid.

It’s especially stupid because anyone willing to drop that level of cash on a CPU would 100% be playing above 1080p and they’d be super GPU bound before the CPU would even matter.
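The GPU-bound point reduces to a min() of two stage throughputs. A toy model with entirely made-up numbers, just to illustrate why a CPU upgrade buys nothing while the GPU is the cap:

```python
def effective_fps(cpu_fps, gpu_fps):
    """Toy bottleneck model: the frame rate you see is capped by the
    slower of the CPU and GPU stages of the rendering pipeline."""
    return min(cpu_fps, gpu_fps)

print(effective_fps(160, 90))   # 90 - GPU bound: the GPU is the cap
print(effective_fps(200, 90))   # 90 - a faster CPU changes nothing here
print(effective_fps(160, 240))  # 160 - only now does the CPU matter
```

Real pipelines overlap the two stages, so this is only a first-order sketch, but it captures why CPU differences mostly show up at 1080p with a top-end GPU.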

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
For me the choice was between:
  • $350-$400 for a 11600K + Z590

  • ~$520 for 12600K + Z690

I went with the option that saved me ~$150, because the realistic differences aren't that extreme, and I'm probably gonna be GPU bound anyway. Hope it doesn't bite me in the rear end in two years.

DerekSmartymans
Feb 14, 2005

The
Copacetic
Ascetic

Pilfered Pallbearers posted:

This thread tends to focus on price/performance rather than cutting edge insanity. Spending $300 to maybe get a ~2% increase in the most intensive situations is deeply stupid.

It’s especially stupid because anyone willing to drop that level of cash on a CPU would 100% be playing above 1080p and they’d be super GPU bound before the CPU would even matter.

I’d have to agree with this line of reasoning, mainly because of stuff like, “If you don’t know what ‘Threadripper’ means, you don’t need Threadripper components.” I don’t consider it mean, dismissive, or even elitist, because it’s true!

I’ve seen this a lot since I started following this thread in 2020, and it really does make sense. As a personal example, I very, very rarely do anything approaching a “workflow” on my newest system. Sure, I’ll edit a photo or make goatse collages, and occasionally make or edit a video for family (because I have the toolsets and took the time to learn them). My brother sends me all his phone pics occasionally because he has no idea how to remove red-eye from his grandkids’ photos, nor the time to learn the task, and I learned GIMP fifteen-ish years ago so I just help. I was happy with my non-TotL i7-10xxx even though its only upgrade path on the mobo is lateral (a non-starter for the small-percentage sidegrade) and it won’t fit an 11th (or 12th) gen chip. I don’t expect to need a new chip for years, but I enjoy tech minutiae nobody else around me does, so I keep up! I also understand there are people who don’t even know what an i3 is, but have the money and personality to immediately buy the latest cutting-edge i9 because the number is bigger than their last i9 from two years ago.

Most of these people aren’t in this thread. Price AND performance-for-price (value) is much bigger here, even though many could probably spout the specs of their dream build quicker than their kids’ birthdays! Benchmarks are important for the discussion, of course, as is value and, most important to me, what tasks you’re going to be doing with the new system. Benchmarks almost seem most important for power users, but most here aren’t competitive esports contenders doing one exclusive task with 97% of resources optimized for fps. I love Control, but it’s usually on my main 1080p monitor while Edge (8-9 tabs), iTunes (casting to my TV’s sound system), and Windows Explorer (backups/file copying/idling) plus occasional other programs are also open. Others have different needs, but the ones using benchmarking tools already know what’s up and are mainly here to answer questions, not ask. Like Dr. Video Games 0031 in the monitor thread, sometimes posters know so much about a topic it can get hard for a hobbyist to follow, but at that level of dedication it’s best not to slapfight over minor quibbles unless there is a major deficiency or error (even if they just didn’t feel like explaining a technical or statistical exception because it didn’t apply)!

Thanks for listening to my Tedxplanation, rambling and non-essential as it was :science:!

notwithoutmyanus
Mar 17, 2009

Considering they run more than just a benchmark suite, I'd guess so, yes. It's not hard to guess that you have a browser open somewhere, or whatever. It doesn't have to be all GUI foreground stuff, but if you have more than one monitor it's a reasonable guess. A few people may find this a huge problem, but I consider it a standard expectation of literally everyone in existence. Examples: messaging applications, browsers, Steam, etc. Anyone who acts like they keep nothing in their systray and nothing running outside of a game is likely being a bit dishonest here.

Could I be wrong about spending the extra money, compared to the budget perspective of cost efficiency? Sure, I'll acknowledge I could be completely wrong about that, especially given market shortages. But for other groups, maybe that's not the case if people are building to usage expectations.

notwithoutmyanus fucked around with this message at 21:42 on Nov 20, 2021

Pilfered Pallbearers
Aug 2, 2007

notwithoutmyanus posted:

Considering they run more than just a benchmark suite, I'd guess so, yes. It's not hard to guess that you have a browser open somewhere, or whatever. It doesn't have to be all GUI foreground stuff, but if you have more than one monitor it's a reasonable guess. A few people may find this a huge problem, but I consider it a standard expectation of literally everyone in existence. Examples: messaging applications, browsers, Steam, etc. Anyone who acts like they keep nothing in their systray and nothing running outside of a game is likely being a bit dishonest here.

Could I be wrong about spending the extra money, compared to the budget perspective of cost efficiency? Sure, I'll acknowledge I could be completely wrong about that, especially given market shortages. But for other groups, maybe that's not the case if people are building to usage expectations.

My point here is that most people running a build with a 5600X (or similar) are using 3060+ GPUs at 1440p or higher, and at 1440p you're GPU bound way before you're CPU bound, so the CPU is free to do your extra systray poo poo.

I have an i7-7700K and a 3080, play at 1440p, and have an enormous amount of poo poo in the background (Plex server stuff, automation, HDD pooling and detection, 100 Firefox tabs, etc.). Even with that setup, if I shut down all that poo poo I barely gain any FPS, and my CPU is several magnitudes worse than the 5600X.

High-frame-rate (240+) competitive 1080p gaming, or specific workflow tasks unrelated to gaming, are the only reasons to move above a 5600X/12600K.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Pilfered Pallbearers posted:

My point here is that most people running a build with a 5600X (or similar) are using 3060+ GPUs at 1440p or higher, and at 1440p you're GPU bound way before you're CPU bound, so the CPU is free to do your extra systray poo poo.

I have an i7-7700K and a 3080, play at 1440p, and have an enormous amount of poo poo in the background (Plex server stuff, automation, HDD pooling and detection, 100 Firefox tabs, etc.). Even with that setup, if I shut down all that poo poo I barely gain any FPS, and my CPU is several magnitudes worse than the 5600X.

High-frame-rate (240+) competitive 1080p gaming, or specific workflow tasks unrelated to gaming, are the only reasons to move above a 5600X/12600K.

Your minimum framerates are going to suck with a 7700K compared to a 5600X or similar modern CPU, even without apps in the background

I was in your situation, the difference is astronomical. You are doing yourself a disservice pairing that 3080 with a 7700K

Pilfered Pallbearers
Aug 2, 2007

Zedsdeadbaby posted:

Your minimum framerates are going to suck with a 7700K compared to a 5600X or similar modern CPU, even without apps in the background

I was in your situation, the difference is astronomical. You are doing yourself a disservice pairing that 3080 with a 7700K

Oh I’m fully aware, I’m just not in the market to upgrade and I can squeak enough performance out of the 7700k to be happy until it’s time.


mom and dad fight a lot
Sep 21, 2006
Probation
Can't post for 29 days!
Four cores four threads
Runnin' it 'till it's dead
