|
gradenko_2000 posted:Something that occurred to me just now: if the new console generation is going to cause a big leap in system requirements, I wonder how that's going to eat into the value proposition of things like AMD APUs or Intel's Xe iGPU as a way to get into gaming on a budget. GPU prices are high enough that it's hard to get even an entry-level card without throwing at least 200 dollars at it (unless you're willing to go second-hand), so the APU becomes the thing that replaces the entry-level GPU in the budget, but that'll only hold if games continue to be decently runnable on integrated graphics. You should always be willing to buy used if it's an EVGA, MSI or whatever the other one is card that's under warranty. Seriously, it's not like the bad old days where you'd get a flashed-BIOS AMD brick; I don't think Nvidia's been cracked in years, iirc
|
# ? Jul 15, 2020 08:37 |
|
|
CPU-limited games are the badly built games. I upgraded from my old 2500K to reduce load times in Total War: Warhammer 2, and it helped a bit. Then they released a patch that cut load times to a quarter of what they were. And what's up with Unity? Every Unity game I've played has been CPU-limited, and I had to cap Disco Elysium to 70fps to reduce CPU fan noise. Do developers who use Unity just keep making the same mistakes, or does the engine just suck?
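For what it's worth, capping the frame rate helps with fan noise because an uncapped CPU-bound game loop never sleeps; it just spins as fast as it can. Here's a minimal sketch of what a frame limiter does, in Python for illustration (simulate and render are hypothetical stand-ins, not anything from Unity or Disco Elysium):

```python
import time

TARGET_FPS = 70                 # the cap; without it the loop burns a full core
FRAME_TIME = 1.0 / TARGET_FPS

def run_capped(simulate, render):
    """Toy game loop: sleeping off the leftover frame budget is what
    lets the CPU (and its fan) idle instead of busy-spinning."""
    while True:
        start = time.perf_counter()
        simulate()
        render()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)  # idle out the rest of the frame
```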
|
# ? Jul 15, 2020 08:42 |
|
Truga posted:there's a bunch of games where a new cpu does jack because even with a 2080ti you're bottlenecked there, or it's just a badly built game. but then there's games where a decent cpu upgrade will change the game like night and day, like stellaris, kerbal, etc, because they can use all the extra cpu and/or memory bandwidth they can get. welcome to what the blue team has been saying for over ten years, the refreshment table is over on the left. now just imagine what would happen in those games if you were 20% faster than a 3600. With a max-overclocked 8700K you could have had that three years ago; 10-series is all still Skylake, isn't it? but "nobody cares about per-thread performance in 2020," right? but "who cares about another 5 fps average," right? it's always funny to see this realization play out with former hardliners: per-core performance does actually still matter. In this case, an old failing Haswell to a 3600. Paul MaudDib fucked around with this message at 09:05 on Jul 15, 2020 |
# ? Jul 15, 2020 08:51 |
|
Paul MaudDib posted:welcome to what the blue team has been saying for over ten years, the refreshment table is over on the left that's a bizarre bunch of things to say to someone who's a happy Ryzen upgrader
|
# ? Jul 15, 2020 09:09 |
|
Happy_Misanthrope posted:Ugh reading now that Death Stranding is one of those HDR games that require you to turn on HDR in Windows 10 so your desktop looks like poo poo in order for it to show up during the game options. Christ I hate games that require that. HDR on Windows is so loving poo poo compared to any other form of media (including consoles) it's unreal. Microsoft needs to pull their heads out of their arses and actually work on it
|
# ? Jul 15, 2020 09:14 |
|
shrike82 posted:that's a bizarre bunch of things to say to someone who's a happy Ryzen upgrader why's that?
|
# ? Jul 15, 2020 09:16 |
|
Paul MaudDib posted:welcome to what the blue team has been saying for over ten years, the refreshment table is over on the left joke's on you, stellaris runs like poo poo on my brother's 8700K too. what a dumbass lmao. single-core performance does in fact not matter
|
# ? Jul 15, 2020 09:20 |
|
Truga posted:joke's on you, stellaris runs like poo poo on my brother's 8700K too Hmm, which benchmarks are you making this claim based on? It certainly could be one of those games that scales better on cache than latency, haven’t seen benchmarks either way. Paul MaudDib fucked around with this message at 09:27 on Jul 15, 2020 |
# ? Jul 15, 2020 09:24 |
|
the benchmark of "when we play multiplayer his client is slowing down the game because it performs like rear end". he has a 2080ti too, so it's not like the gpu is some onboard poo poo that could be slowing things down
|
# ? Jul 15, 2020 09:31 |
|
shrike82 posted:that's a bizarre bunch of things to say to someone who's a happy Ryzen upgrader wait a minute aren’t you the “NVIDIA is worth zero ($0) because google is going to eat their lunch any day now... any day now...” guy? Bizarre thing to say.
|
# ? Jul 15, 2020 09:34 |
|
Truga posted:the benchmark of "when we play multiplayer neat. and yes, cache improvements are still per-thread performance improvements
|
# ? Jul 15, 2020 09:38 |
|
otoh, maybe it is his gpu making GBS threads the bed. another friend i play with has to watch twitch at exactly 30fps on his second monitor when we play, because otherwise his game performance goes to poo poo and he starts lagging the game in year 0 lmfao :nvidia:
|
# ? Jul 15, 2020 09:45 |
|
lol I’m suddenly reminded of fishmech for some reason. Whatever happened to him?
|
# ? Jul 15, 2020 09:47 |
|
shrike82 posted:lol I’m suddenly reminded of fishmech for some reason. Whatever happened to him? Permad for hosting an ftp server full of goon homegrown.
|
# ? Jul 15, 2020 09:48 |
|
Welp... that’s me told
|
# ? Jul 15, 2020 09:50 |
|
gradenko_2000 posted:Something that occurred to me just now: if the new console generation is going to cause a big leap in system requirements, I wonder how that's going to eat into the value proposition of things like AMD APUs or Intel's Xe iGPU as a way to get into gaming on a budget. GPU prices are high enough that it's hard to get even an entry-level card without throwing at least 200 dollars at it (unless you're willing to go second-hand), so the APU becomes the thing that replaces the entry-level GPU in the budget, but that'll only hold if games continue to be decently runnable on integrated graphics. Integrated graphics are going to double in performance overnight with DDR5, and since CPUs are moving towards increasingly fancy packaging tech, interposers/silicon bridges should be the next big upgrade.
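The "double overnight" claim is really just memory-bandwidth arithmetic: iGPUs share system RAM and are bandwidth-starved, so doubling the transfer rate roughly doubles what they can feed the shaders. A back-of-the-envelope sketch in Python, assuming dual-channel DDR4-3200 against DDR5-6400 (the DDR5 speed is an assumption about where kits will land, and this ignores DDR5's split sub-channels, which don't change the total bus width):

```python
def peak_bandwidth_gb_s(mt_per_s, bus_bits=64, channels=2):
    """Peak theoretical bandwidth: transfers/s * bytes per transfer * channels."""
    return mt_per_s * (bus_bits // 8) * channels / 1000  # MT/s * bytes -> GB/s

ddr4 = peak_bandwidth_gb_s(3200)  # ~51.2 GB/s
ddr5 = peak_bandwidth_gb_s(6400)  # ~102.4 GB/s
print(f"DDR4-3200: {ddr4:.1f} GB/s | DDR5-6400: {ddr5:.1f} GB/s | {ddr5 / ddr4:.1f}x")
```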
|
# ? Jul 15, 2020 10:14 |
|
Also, you aren't playing AAA games with iGPUs today, right? There'll still be tons of games that run on potatoes; it's not like the budget titles in the store are going away with higher requirements. Those types of games are usually built on PC and ported over anyway, since it's usually cheaper. The higher specs on consoles will eventually translate to PCs, but:
1: Things like ray tracing have been out for a year already, so that's nothing new. Consoles have to catch up A LOT before they start exceeding PC hardware, so let's not clutch pearls too much about PC users being left behind.
2: It'll take a while for games that effectively use that console hardware to start getting released. AAA game dev time is years at this point, and while developers can turn dials up to take advantage of the newer hardware, you can always turn those dials down. It won't be "all games released after December 2021 will suddenly have higher requirements".
3: I'm always suspicious of pre-release specs. There will always be weird bottlenecks and handicaps on any hardware platform.
|
# ? Jul 15, 2020 14:32 |
|
Lockback posted:Also, you aren't playing AAA games with iGPUs today, right? I kinda do? I played WoW and CoD: Warzone and Dark Souls 3 and Doom 2016 and The Division on a Vega iGPU for a couple of weeks when I was in an odd time between set-ups. Of course you have to temper your expectations, and running stuff on High is usually not in the cards, but it worked well enough even when chasing 60 Hz. I guess the thing I'm... fearing (if that's even a fair descriptor) is requirements shooting up enough that these drop back to being about as useful as Intel UHDs were.
|
# ? Jul 15, 2020 14:58 |
|
It really depends on what native resolution developers will be targeting, doesn't it? With Sony and MS pushing for >1080p, it's possible / probable that for 1080p60 or lower, GPUs less powerful than the consoles' will keep working as well as they always have.
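To put rough numbers on that: GPU load scales roughly with pixel count, so a console rendering at 4K is pushing about four times the pixels of a 1080p screen. A quick sketch (the linear scaling is a simplification; real games don't scale perfectly with resolution):

```python
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 3840 * 2160  # the 4K target the new consoles are marketed against
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, ~{pixels / base:.0%} of the 4K workload")
```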
|
# ? Jul 15, 2020 15:16 |
|
Those are mostly games that came out a couple of years before the Vega iGPU did (well, not Warzone, but that's an example of a game that isn't really pushing technical boundaries today), so you have that buffer time as iGPUs will probably get better. And yeah, I don't think anyone is arguing this isn't going to accelerate PC requirements somewhat, but I think things like "trying to play certain AAA games on iGPUs" are probably pretty niche, since it sucks today anyway, and I don't think the requirement acceleration is going to be a brick wall so much as a "slightly steeper incline than normal", with probably a few outlier titles here and there.
|
# ? Jul 15, 2020 15:16 |
|
Lockback posted:And yeah, I don't think anyone is arguing this isn't going to accelerate PC requirements somewhat, but I think things like "trying to play certain AAA games on iGPUs" are probably pretty niche, since it sucks today anyway, and I don't think the requirement acceleration is going to be a brick wall so much as a "slightly steeper incline than normal", with probably a few outlier titles here and there. I tend to agree. iGPUs have always been slow enough that AAA gaming hasn't been a realistic option without serious compromises. The only way iGPUs are ever going to deliver good experiences in games that aren't years old / indie / Blizzard titles is if Intel/AMD manage to get a DLSS competitor working and shove it in there. For reference, a 1650 mobile GPU is about equivalent to a PS4 Pro. Most games for the foreseeable future are going to be not just cross-platform but cross-generation, so you at least know they'll have figured out a way to make them playable on that sort of hardware, meaning you're not going to be all that much worse off than you are now. In a year or two, when games are really using the power of the new consoles, you'll also have new generations of iGPUs out to help close that gap somewhat.
|
# ? Jul 15, 2020 15:26 |
|
Thanks for the responses, folks. I think I just have a case of poor brain from having two GPUs die on me over the life of my old Athlon 640, and I was never able to replace them quickly, so there'd always be like a week where I'd be out of a computer. I don't like having to repeat that experience, so now I like having iGPUs as a backup (and more than one spare dedicated GPU besides), but I still can't shake that nagging feeling.
|
# ? Jul 15, 2020 16:17 |
|
Services like GeForce NOW are probably the best bet there. They work really well (as long as you're not playing a competitive twitch game) and don't require anything more than an iGPU, and right now at least GeForce NOW is cheap, especially as a stopgap. Some publishers/devs have pulled their games (since only Nvidia can steal your personal data), but it'll cover you better than an iGPU will.
|
# ? Jul 15, 2020 17:08 |
|
Zedsdeadbaby posted:HDR on Windows is so loving poo poo compared to any other form of media (including consoles) it's unreal I could be wrong here but I think that’s mostly because monitors in general have really bad HDR specs. I hated Windows HDR until I used it on my OLED and it was 100% fine. I was as baffled as anyone at the time.
|
# ? Jul 15, 2020 17:34 |
|
The answer to any PC hardware gap is just buy a Switch. It has (a few) really good games that will easily keep you satisfied for the gap.
|
# ? Jul 15, 2020 17:35 |
|
K8.0 posted:The answer to any PC hardware gap is just buy a Switch. It has (a few) really good games that will easily keep you satisfied for the gap. You're not wrong (the Switch is great, especially if you've got one of the old-model ones with the unpatched comm port), but I got a X1E in large part so I didn't need to carry around yet another device on my train ride in to work every day. While I'd never really suggest mobile gaming for any sort of serious AAA titles or whatever, there's a valid use case for the ability to do some light games on the same hardware you're tied to for work reasons.
|
# ? Jul 15, 2020 18:12 |
|
Taima posted:I could be wrong here but I think that’s mostly because monitors in general have really bad HDR specs. The problem is that it absolutely ruins your desktop: no matter what tweaking you do (and Win10's HDR 'calibration' also sucks), apps and text will look like crap. Even using my PC hooked up to a TV for gaming 99% of the time, I can't just leave it on, as even just browsing from the couch/watching youtube gets painful. My TV's HDR is no great shakes, but games can look noticeably improved with it enabled - apps and the desktop, no way. For games that have a simple HDR toggle in-game that doesn't depend on this, it's fine - turn it on, use the game's calibration, boom - just like console. But the need to enable/disable it from settings for certain games is obnoxious. The process basically goes:
1) Launch game.
2) Hey, HDR is greyed out. gently caress.
3) Quit game.
4) Turn on HDR from Win10 settings. Squint.
5) Run game.
6) Disable HDR when you quit the game.
Even if Win10's new exclusive mode requires the desktop to also be in HDR, it could have been significantly improved by not requiring the user to enable/disable it manually from settings beforehand - the game should see HDR as an option depending on the display. When you enable it in-game, your desktop would also be in HDR, which sure isn't great if you're alt-tabbing, but when you quit, your display goes back to the default you've set in the display settings. It really just needs another setting for "Desktop: SDR/HDR", so you can leave the flag on for games that utilize the new exclusive display mode but have your desktop remain in SDR when not running them. Basically, that's how PS4 Pro HDR works now: you calibrate HDR, but until you launch an HDR game, the PS4 UI (the equivalent of a PC's desktop in this comparison) is in SDR. When you launch an HDR game and hit the PS button to go back to the dashboard, the dashboard remains in HDR too - which is what you want, as TVs can take a few seconds to switch from HDR mode to SDR, and having to wait for that every time you want to just check a trophy would be obnoxious. When you quit an HDR game, your TV switches back to SDR. Happy_Misanthrope fucked around with this message at 18:42 on Jul 15, 2020 |
# ? Jul 15, 2020 18:37 |
|
Windows HDR is basically "we didn't really think this out all the way, but needed to do something with these HDR-capable monitors!" Same with very high DPI monitors: Windows scaling is nowhere near as reliable as it needs to be in order for it to be a good experience. Unfortunately, because of how Windows handles these things in terms of sometimes leaving it up to applications, sometimes not, I doubt we'll see the situation meaningfully improve anytime soon.
|
# ? Jul 15, 2020 18:58 |
|
Oooh gotcha, makes sense, thanks
|
# ? Jul 15, 2020 19:23 |
|
shrike82 posted:We don't even have to go that far. I suspect most of us will be on 1440P for a while so 1440native->1440+ output would be good enough. Are there PSUs that have 12-pin connectors?
|
# ? Jul 15, 2020 22:05 |
|
What does the 12-pin one do that the 2x8 one doesn't?
|
# ? Jul 15, 2020 22:27 |
|
Combat Pretzel posted:What does the 12-pin one do that the 2x8 one doesn't? Make someone rich(er) by making the required adapters. The big question will be whether this is something you'll see only on FE cards, or if they'll let AIBs keep using 6/8-pin connectors so long as they can guarantee the proper power.
|
# ? Jul 15, 2020 22:38 |
|
OhFunny posted:Are there PSUs that have 12-pin connectors? Not yet. But it sounds like they're just two 6-pins smooshed together, so most PSUs should be able to use their existing cabling with, at most, a 6+6 -> 12 adapter. Combat Pretzel posted:What does the 12-pin one do that the 2x8 one doesn't? An 8-pin can do up to 150W. Apparently this 12-pin can go above 300W (400-600 depending), and do it using 4 fewer pins than 2x8. So I guess if you're assuming you'll need that sort of power on a bunch of cards going forward, it'd be an advantage, since it'd be smaller and less cluttered thanks to fewer cables, while also helping ensure power quality by (apparently) having appropriate wire gauge requirements. DrDork fucked around with this message at 00:32 on Jul 16, 2020 |
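The connector ratings are mostly pin-count-times-current arithmetic, so the jump from 150W to 400-600W is easy to sanity-check. A rough sketch, assuming 12V power pins; the per-pin current figures are ballpark assumptions about the terminal ratings, not spec quotes:

```python
VOLTS = 12.0

def connector_watts(power_pins, amps_per_pin):
    """Deliverable power = number of 12V pins * current per pin * 12V."""
    return power_pins * amps_per_pin * VOLTS

# PCIe 8-pin: 3 power pins, rated very conservatively (~4.2 A/pin -> ~150 W)
print(f"8-pin:  {connector_watts(3, 4.2):.0f} W")
# Rumored 12-pin: 6 power pins with higher-rated terminals (~8.5 A/pin assumed)
print(f"12-pin: {connector_watts(6, 8.5):.0f} W")
```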
# ? Jul 16, 2020 00:26 |
|
If Nvidia used a special power connector on the card, it would probably just break out into two 8-pins. But I've seen some posts saying that rumor was false, anyway.
|
# ? Jul 16, 2020 01:24 |
|
They should just ship cards with their own plug-in power adapter off the back.
|
# ? Jul 16, 2020 02:25 |
|
Suddenly, the 750W power supply I've been eyeing might not actually be enough. :grimacing:
|
# ? Jul 16, 2020 02:31 |
|
Lockback posted:They should just ship cards with their own plug in power adapter off the back. And we're right back to the Voodoo 5 days.
|
# ? Jul 16, 2020 02:33 |
|
SwissArmyDruid posted:Suddenly, the 750W power supply I've been eyeing might not actually be enough. :grimacing: I was planning on upgrading just the graphics card, but I seriously may have to rebuild my whole computer and just keep the RAM, hard drives, and maaaaybe the CPU.
|
# ? Jul 16, 2020 02:44 |
|
it's too bad egpus never became more of a thing. they'd help with heat and power consumption issues. i have a razer core x for external compute and it's a neat piece of hardware.
|
# ? Jul 16, 2020 02:44 |
|
|
Ugly In The Morning posted:I was planning on upgrading just the graphics card, but I seriously may have to rebuild my whole computer and just keep the RAM, hard drives, and maaaaybe the CPU. If you're keeping all that, why are you replacing the motherboard?
|
# ? Jul 16, 2020 02:46 |