tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural: it takes the verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

gradenko_2000 posted:

Something that occurred to me just now: if the new console generation is going to cause a big leap in system requirements, I wonder how that will eat into the value proposition of things like AMD APUs or Intel's Xe iGPU as a way to get into gaming on a budget. GPU prices are high enough that it's hard to get even an entry-level card without throwing at least 200 dollars at it (unless you're willing to go second-hand), so an APU ends up replacing the idea of an entry-level GPU in terms of budgeting, but that only holds if games continue to be decently runnable on such graphics.

You should always be willing to buy used if it's an EVGA, MSI, or whatever-the-other-one-is card that's still under warranty. Seriously, it's not like the bad old days where you'd get a brick with a flashed AMD BIOS; I don't think Nvidia's been cracked in years, iirc.

Mercrom
Jul 17, 2009
CPU limited games are the badly built games. I upgraded from my old 2500k to reduce load times in Total War: Warhammer 2 and it helped a bit. Then they released a patch that just cut load times to a quarter of what they were.

And what's up with Unity? Every Unity game I've played has been CPU limited, and I had to cap Disco Elysium to 70fps to reduce CPU fan noise. Do developers who use Unity just keep making the same mistakes or does the engine just suck?
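
For what it's worth, the reason a frame cap calms the CPU down is that a capped loop spends the leftover frame budget sleeping instead of immediately starting the next frame. A minimal sketch of the idea in generic Python (not tied to Unity, Disco Elysium, or any real engine):

code:
import time

def run_frame(sim_work_s=0.004):
    """Stand-in for one frame of game logic/rendering (hypothetical workload)."""
    end = time.perf_counter() + sim_work_s
    while time.perf_counter() < end:
        pass  # busy work, like an uncapped game loop burning a core

def game_loop(fps_cap=70, seconds=2):
    frame_budget = 1.0 / fps_cap
    t_end = time.perf_counter() + seconds
    while time.perf_counter() < t_end:
        start = time.perf_counter()
        run_frame()
        # Sleep away whatever is left of the frame budget instead of spinning.
        # This idle time is what lets the CPU clock down and the fan stay quiet.
        leftover = frame_budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)

if __name__ == "__main__":
    game_loop()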

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Truga posted:

there's a bunch of games where a new cpu does jack because even with a 2080ti you're bottlenecked there, or it's just a badly built game. but then there's games where a decent cpu upgrade will change the game like night and day, like stellaris, kerbal, etc, because they can use all the extra cpu and/or memory bandwidth they can get.

welcome to what the blue team has been saying for over ten years, the refreshment table is over on the left

now just imagine what would happen in those games if you were 20% faster than a 3600. With a max-overclocked 8700K you could have had that three years ago - 10-series is all still skylake isn't it?

but "nobody cares about per-thread performance in 2020 :qq:" right?

but "who cares about another 5 fps average :qq:" right?

it's always funny to see this realization play out with former hardliners, that per-core performance does actually still matter. In this case an old failing Haswell to a 3600.

Paul MaudDib fucked around with this message at 09:05 on Jul 15, 2020

shrike82
Jun 11, 2005

Paul MaudDib posted:

welcome to what the blue team has been saying for over ten years, the refreshment table is over on the left

now just imagine what would happen in those games if you were 20% faster than a 3600. With a max-overclocked 8700K you could have had that three years ago - 10-series is all still skylake isn't it?

but "nobody cares about per-thread performance in 2020 :qq:" right?

but "who cares about another 5 fps average :qq:" right?

it's always funny to see this realization play out with former hardliners, that per-core performance does actually still matter. In this case an old failing Haswell to a 3600.

that's a bizarre bunch of things to say to someone who's a happy Ryzen upgrader

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Happy_Misanthrope posted:

Ugh reading now that Death Stranding is one of those HDR games that require you to turn on HDR in Windows 10 so your desktop looks like poo poo in order for it to show up during the game options. Christ I hate games that require that.

HDR on Windows is so loving poo poo compared to any other form of media (including consoles) it's unreal

Microsoft needs to pull their heads out their arses and actually work on it

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

that's a bizarre bunch of things to say to someone who's a happy Ryzen upgrader

why's that?

Truga
May 4, 2014
Lipstick Apathy

Paul MaudDib posted:

welcome to what the blue team has been saying for over ten years, the refreshment table is over on the left

now just imagine what would happen in those games if you were 20% faster than a 3600. With a max-overclocked 8700K you could have had that three years ago - 10-series is all still skylake isn't it?

but "nobody cares about per-thread performance in 2020 :qq:" right?

but "who cares about another 5 fps average :qq:" right?

it's always funny to see this realization play out with former hardliners, that per-core performance does actually still matter. In this case an old failing Haswell to a 3600.

joke's on you, stellaris runs like poo poo on my brother's 8700K too

what a dumbass lmao. single core performance does in fact not matter

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Truga posted:

joke's on you, stellaris runs like poo poo on my brother's 8700K too

what a dumbass lmao. single core performance does in fact not matter

Hmm, which benchmarks are you making this claim based on?

It certainly could be one of those games that scales better on cache than latency, haven’t seen benchmarks either way.

Paul MaudDib fucked around with this message at 09:27 on Jul 15, 2020

Truga
May 4, 2014
Lipstick Apathy
the benchmark of "when we play multiplayer his client is slowing down the game because it performs like rear end". he has a 2080ti too, so it's not like the gpu is some onboard poo poo that could be slowing things down

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

that's a bizarre bunch of things to say to someone who's a happy Ryzen upgrader

wait a minute aren’t you the “NVIDIA is worth zero ($0) because google is going to eat their lunch any day now... any day now...” guy?

Bizarre thing to say.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Truga posted:

the benchmark of "when we play multiplayer

neat

and yes cache improvements are still per-thread performance improvements

Truga
May 4, 2014
Lipstick Apathy
otoh, maybe it is his gpu making GBS threads the bed

another friend i play with has to watch twitch at exactly 30fps on his second monitor when we play because otherwise his game performance goes to poo poo and he starts lagging the game in year 0 lmfao

:nvidia:

shrike82
Jun 11, 2005

lol I’m suddenly reminded of fishmech for some reason. Whatever happened to him?

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

shrike82 posted:

lol I’m suddenly reminded of fishmech for some reason. Whatever happened to him?

Permad for hosting an ftp server full of goon homegrown.

shrike82
Jun 11, 2005

Welp... that’s me told

Arzachel
May 12, 2012

gradenko_2000 posted:

Something that occurred to me just now: if the new console generation is going to cause a big leap in system requirements, I wonder how that will eat into the value proposition of things like AMD APUs or Intel's Xe iGPU as a way to get into gaming on a budget. GPU prices are high enough that it's hard to get even an entry-level card without throwing at least 200 dollars at it (unless you're willing to go second-hand), so an APU ends up replacing the idea of an entry-level GPU in terms of budgeting, but that only holds if games continue to be decently runnable on such graphics.

Integrated graphics are going to double in performance overnight with DDR5, and since CPUs are moving towards increasingly fancier packaging tech, interposers/silicon bridges should be the next big upgrade.
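
The napkin math behind that doubling, assuming dual-channel setups; the 6400 MT/s DDR5 figure is an assumption for illustration, not a spec for any particular APU:

code:
# Peak theoretical bandwidth = transfer rate (MT/s) x bus width (bytes) x channels
def peak_bandwidth_gbs(mt_per_s, bytes_per_channel=8, channels=2):
    return mt_per_s * bytes_per_channel * channels / 1000  # GB/s

ddr4 = peak_bandwidth_gbs(3200)  # common DDR4 dual-channel kit: 51.2 GB/s
ddr5 = peak_bandwidth_gbs(6400)  # assumed DDR5 speed: 102.4 GB/s
print(ddr4, ddr5, ddr5 / ddr4)   # 51.2 102.4 2.0 -- bandwidth roughly doubles

Whether shipping APUs actually see that depends on the memory clocks they launch with, hence the caveat.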

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Also, you aren't playing AAA games with iGPUs today, right? There'll still be tons of games that run on potatoes; it's not like the budget titles in the store are going away with higher requirements. Those types of games are usually built on PC and ported over anyway, since it's usually cheaper.

The higher specs on consoles will eventually translate to PCs but:

1: Things like Ray Tracing have been out for a year already so that's nothing new. Consoles have to catch up A LOT before they start exceeding PC hardware so let's not clutch pearls too much about PC users being left behind.

2: It'll take a while for games to start getting released that effectively use that console hardware. AAA game dev time is years at this point, and while games will turn dials up to take advantage of the newer hardware, you can always turn those dials back down. It won't be "all games released after December 2021 will suddenly have higher requirements".

3: I'm always suspicious of pre-release specs. There will always be weird bottlenecks and handicaps on any hardware platform.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Lockback posted:

Also, you aren't playing AAA games with iGPUs today, right?

I kinda do? I played WoW and CoD: Warzone and Dark Souls 3 and Doom 2016 and The Division with a Vega iGPU for a couple of weeks when I was in an odd time between set-ups. Of course you have to temper your expectations and running stuff on High is usually not in the cards, but it worked well enough even when chasing after 60 Hz.

I guess the thing I'm... fearing (if that's even a fair descriptor to use) is that requirements shoot up enough that the relative usefulness of these iGPUs drops back to a level where they might as well be Intel UHDs again.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

It really depends on what native resolution developers will be targeting, doesn't it? With Sony and MS pushing for >1080p, it's possible / probable that for 1080p60 or lower, GPUs less powerful than the consoles' will work as well as they always did.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Those are mostly games that came out a couple of years before the Vega iGPU did (well, not Warzone, but that's an example of a game that isn't really pushing technical boundaries today), so you have that buffer time as iGPUs will probably get better.

And yeah, I don't think anyone is arguing this isn't going to accelerate PC requirements somewhat, but things like "trying to play certain AAA games on iGPUs" are probably pretty niche since it sucks today anyway. I don't think the requirement acceleration is going to be a brick wall so much as a "slightly steeper incline than normal", with probably a few outlier titles here and there.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Lockback posted:

And yeah, I don't think anyone is arguing this isn't going to accelerate PC requirements somewhat, but things like "trying to play certain AAA games on iGPUs" are probably pretty niche since it sucks today anyway. I don't think the requirement acceleration is going to be a brick wall so much as a "slightly steeper incline than normal", with probably a few outlier titles here and there.

I tend to agree. iGPUs have always been slow enough that AAA gaming hasn't been a realistic option without serious compromises. The only way iGPUs are ever gonna be usable for good experiences on games that aren't years old / indie / Blizzard titles is if Intel/AMD manage to get a DLSS competitor working and shove it in there.

For reference, a 1650 mobile GPU is about equivalent to a PS4 Pro. Most games for the foreseeable future are going to be not just cross-platform, but cross-generation, so you'll at least know that they've figured out a way to make it playable on that sort of hardware, meaning you're not going to be all that much worse off than you are now. In a year or two when games are really using the power of the new consoles, you'll also have new generations of iGPUs out to help close that gap somewhat.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Thanks for the responses, folks. I think I just have a case of poor brain from having two GPUs die on me over the life of my old Athlon 640, and I was never able to replace them quickly, so there'd always be like a week where I'd be out of a computer. I don't like having to repeat that experience, so now I like having an iGPU as a backup (and more than one spare dedicated GPU besides), but I still can't shake that nagging feeling.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Services like GeForce NOW are probably the best bet there. They work really well (as long as you're not playing a competitive twitch game) and don't require anything more than an iGPU, and right now at least GeForce NOW is cheap, especially as a stopgap. Some publishers/devs have pulled their games (since only Nvidia can steal your personal data), but it'll cover you better than an iGPU will.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Zedsdeadbaby posted:

HDR on Windows is so loving poo poo compared to any other form of media (including consoles) it's unreal

Microsoft needs to pull their heads out their arses and actually work on it

I could be wrong here but I think that’s mostly because monitors in general have really bad HDR specs.

I hated Windows HDR until I used it on my OLED and it was 100% fine. I was as baffled as anyone at the time.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
The answer to any PC hardware gap is just buy a Switch. It has (a few) really good games that will easily keep you satisfied for the gap.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

K8.0 posted:

The answer to any PC hardware gap is just buy a Switch. It has (a few) really good games that will easily keep you satisfied for the gap.

You're not wrong (the Switch is great, especially if you've got one of the old-model ones with the unpatched comm port), but I got a X1E in large part so I didn't need to carry around yet another device on my train ride in to work every day. While I'd never really suggest mobile gaming for any sort of serious AAA titles or whatever, there's a valid use case for the ability to do some light games on the same hardware you're tied to for work reasons.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Taima posted:

I could be wrong here but I think that’s mostly because monitors in general have really bad HDR specs.

I hated Windows HDR until I used it on my OLED and it was 100% fine. I was as baffled as anyone at the time.
It's not that it doesn't work; you can get very similar, if not identical, results to consoles in games with HDR support. It's the implementation: MS didn't really think through the implications of its "exclusive mode but not really" fullscreen optimization improvements in Win10 for HDR. The problem is that with newer games that use this, such as Death Stranding/Shadow of the Tomb Raider/Metro Exodus, you need to have HDR enabled in Win10's settings for the game to recognize you have an HDR display at all, even if the display you're hooked up to supports HDR.

The problem is that it absolutely ruins your desktop: no matter what tweaking you do (and Win10's HDR "calibration" also sucks), apps and text will look like crap. Even though my PC is hooked up to a TV for gaming 99% of the time, I can't just leave it on, as even browsing from the couch or watching YouTube gets painful. My TV's HDR isn't great shakes, but games can look noticeably improved with it enabled; apps and the desktop, no way.

For games that have a simple HDR toggle in-game that doesn't depend on this, it's fine - turn it on, use the game's calibration, boom - just like console. But the need to enable/disable it from settings for certain games is obnoxious. The process basically goes:

1) Launch game.
2) Hey, HDR is greyed out. gently caress.
3) Quit game.
4) Turn on HDR from Win10 settings. Squint.
5) Run game.
6) Disable HDR when you quit game.

Even if Win10's new exclusive mode requires the desktop to also be in HDR, it still could have been significantly improved by not requiring the user to enable/disable it manually from settings beforehand; the game should see HDR as an option based on the display. When you enable it in-game, your desktop would also switch to HDR, which sure isn't great if you're alt-tabbing, but when you quit, your display would go back to the default you've set in the display settings. It really just needs another setting for "Desktop: SDR/HDR", so you can leave the flag on for games that use the new exclusive display mode but have your desktop remain in SDR when not running them.

Basically, that's how PS4 Pro HDR works now: you calibrate HDR, but until you launch an HDR game, the PS4 UI (the equivalent of a PC's desktop in this comparison) is in SDR. When you launch an HDR game and hit the PS button to go back to the dashboard, the dashboard remains in HDR too, which is what you want, as TVs can take a few seconds to switch from HDR mode to SDR, and having to wait for that every time you just want to check a trophy would be obnoxious. When you quit an HDR game, your TV switches back to SDR.
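
That desired behavior is simple enough to sketch. The helpers below are made-up placeholders (not real Windows or driver calls), just to show the switching logic being asked for:

code:
# Toy sketch of the desired switching behavior. set_display_mode() and
# display_supports_hdr() are hypothetical placeholders, not real APIs.

def display_supports_hdr() -> bool:
    return True  # pretend the attached TV/monitor reports HDR support

def set_display_mode(mode: str) -> None:
    print(f"display -> {mode}")  # stand-in for whatever the OS would actually do

def launch_game(game_uses_hdr: bool, desktop_default: str = "SDR") -> None:
    # The game sees HDR as an option whenever the display supports it,
    # without the user flipping a global settings toggle beforehand.
    if game_uses_hdr and display_supports_hdr():
        set_display_mode("HDR")
    try:
        pass  # ... game runs here ...
    finally:
        # On quit, fall back to whatever the user picked for the desktop.
        set_display_mode(desktop_default)

launch_game(game_uses_hdr=True)  # display -> HDR, then back to SDR on exit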

Happy_Misanthrope fucked around with this message at 18:42 on Jul 15, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Windows HDR is basically "we didn't really think this out all the way, but needed to do something with these HDR-capable monitors!" Same with very high DPI monitors: Windows scaling is nowhere near as reliable as it needs to be in order for it to be a good experience.

Unfortunately, because of how Windows handles these things in terms of sometimes leaving it up to applications, sometimes not, I doubt we'll see the situation meaningfully improve anytime soon.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Oooh gotcha, makes sense, thanks :)

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

shrike82 posted:

We don't even have to go that far. I suspect most of us will be on 1440P for a while so 1440native->1440+ output would be good enough.

Anyway, more rumors :-


What's the supply situation for Nvidia at launch? Will I be able to just get it from Amazon?

Are there PSUs that have 12-pin connectors?

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
What does the 12-pin one do that the 2x8 one doesn't?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Combat Pretzel posted:

What does the 12-pin one do that the 2x8 one doesn't?

Make someone rich(er) by making the required adapters.

The big question will be if this is something you'll see only on FE cards, or if they'll let AIBs continue using 6/8 pin connectors so long as they can guarantee the proper power.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

OhFunny posted:

Are there PSUs that have 12-pin connectors?

Not yet. But it sounds like they're just two 6-pins smooshed together, so most PSUs should be able to use their existing cabling with, at most, a 6+6 -> 12 adapter.

Combat Pretzel posted:

What does the 12-pin one do that the 2x8 one doesn't?

8 pins can do up to 150W. Apparently this 12 pin can go above 300W (400-600 depending), and do it using 4 fewer pins. So I guess if you're assuming you'll need that sort of power on a bunch of cards going forward, it'd be an advantage since it'd be smaller and less cluttered thanks to fewer cables, while also helping ensure power quality by (apparently) having appropriate wire gauge requirements.
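
Back-of-the-envelope on why fewer pins can still carry more watts, assuming the rumored connector has six 12V conductors (a PCIe 8-pin only has three); the 300-600W figures are the rumored ones, not a spec:

code:
def amps_per_pin(total_watts, hot_pins, volts=12.0):
    # Current each 12V conductor carries for a given total board power
    return total_watts / volts / hot_pins

print(round(amps_per_pin(150, 3), 2))  # PCIe 8-pin at its 150 W spec: ~4.17 A/pin
print(round(amps_per_pin(300, 6), 2))  # rumored 12-pin at 300 W: ~4.17 A/pin
print(round(amps_per_pin(600, 6), 2))  # at 600 W: ~8.33 A/pin, which is why the
                                       # wire gauge / terminal rating would matter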

DrDork fucked around with this message at 00:32 on Jul 16, 2020

FuturePastNow
May 19, 2014


If Nvidia used a special power connector on the card, it would probably just break out into two 8-pins. But I've seen some posts saying that rumor was false, anyway.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
They should just ship cards with their own plug-in power adapter off the back.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Suddenly, the 750W power supply I've been eyeing might not actually be enough. :grimacing:

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Lockback posted:

They should just ship cards with their own plug in power adapter off the back.

And we're right back to the Voodoo 5 days.

Ugly In The Morning
Jul 1, 2010
Pillbug

SwissArmyDruid posted:

Suddenly, the 750W power supply I've been eyeing might not actually be enough. :grimacing:

I was planning on upgrading just the graphics card, but I seriously may have to rebuild my whole computer and just keep the RAM, hard drives, and maaaaybe the CPU.

shrike82
Jun 11, 2005

it's too bad egpus never became more of a thing. they'd help with heat and power consumption issues.
i have a razer core x for external compute and it's a neat piece of hardware.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ugly In The Morning posted:

I was planning on upgrading just the graphics card, but I seriously may have to rebuild my whole computer and just keep the RAM, hard drives, and maaaaybe the CPU.

If you're keeping all that, why are you replacing the motherboard?
