Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

ItBreathes posted:

15.7% are on a 1070 or better, 14.6% are on a 1060, leaving ~70% of those polled on something less than the 1080p card from 3 years ago. Y'all pretty drat skewed, especially the guy who said "everything below the $350ish 5700s are just 'limp by playing old games' products".

The thing I will note is that Steam stats include a lot of laptops, which is readily apparent by the prevalence of 1366x768 and 720p screens. You would have to go out of your way to purchase a monitor with that resolution... but it’s a super common thing on laptops.

Steam Hardware Survey doesn’t distinguish between PCs that get played on, and the ones that are streaming guests or just get used for store purchases. I have a lovely Thinkpad T410 that has steam on it so I can buy games, I have a NUC in my living room for streaming, but my games PC has a 1080 Ti. Any of them can get surveyed.

For the people building desktop PCs, the specs are a lot higher.

If you’re using anything older than Maxwell or a 7870 you don’t even have driver support, so you’re not spending $60 on the latest AAA title either, therefore your hardware is irrelevant to AAA devs looking to target their audience.

Not that I disagree with your general thrust that there’s a lot of people perfectly happy on lower-spec or older hardware. But there is a pretty real limit to how old something can still be and still be active in titles released in 2019/2020.

The average gaming-spec PC gets an x60. NVIDIA has been clear that’s their best selling segment. It may not be upgraded for 5 years of course, so you do have a long tail of older hardware.

Paul MaudDib fucked around with this message at 23:14 on Dec 31, 2019


shrike82
Jun 11, 2005

lmao “actually you need to unskew the survey to factor in people with multiple computers”

people game on lovely laptops all the time and it’s pretty unusual for non enthusiasts to have more than one PC (desktop/laptop combo)

Worf
Sep 12, 2017

If only Seth would love me like I love him!

shrike82 posted:

lmao “actually you need to unskew the survey to factor in people with multiple computers”

people game on lovely laptops all the time and it’s pretty unusual for non enthusiasts to have more than one PC (desktop/laptop combo)

It might be unusual for them to have a second PC but they definitely have phones and tablets that skew the survey was his point, I assume

I'm p sure steam works on some smart TVs now according to remote play advertising

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Statutory Ape posted:

It might be unusual for them to have a second PC but they definitely have phones and tablets that skew the survey was his point, I assume

I'm p sure steam works on some smart TVs now according to remote play advertising

Pretty sure the Steam hardware survey doesn't poll the Steam phone app. For (phone/tablet/TV) remote play, the Steam instance would still be on said person's main computer.

Paul was saying that people with lovely laptops are skewing the percentages. There's valid reasons for doing so, but among people with only 1 display it's like 12% of people. You could add up all the mobile/integrated GPUs in the list for better numbers but :effort:. More notably, 65% of people with 1 monitor have a 1080p monitor and 80% of people with more than one monitor are running two 1080p monitors. No stats are given for the relative size of these two groups.

Like he said though, the average gaming PC gets an x60, the survey shows it, every report on GPUs says the $200-300 segment is the best selling, and the guys saying that anyone with below a 5700 XT isn't a real gamer are total dipshits.

Fantastic Foreskin fucked around with this message at 00:44 on Jan 1, 2020

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

lmao “actually you need to unskew the survey to factor in people with multiple computers”

people game on lovely laptops all the time and it’s pretty unusual for non enthusiasts to have more than one PC (desktop/laptop combo)

Desktop/laptop combo havers have multiple PCs, obviously, they get surveyed separately. And yeah, people need to log into their steam account to stream, that machine can get surveyed too.

I’m not saying low-spec gamers or casual/indie gamers aren’t “real gamers” at all, but neither are AAA game devs taking your GT210 or Intel HD Graphics 310 into account when they spec out the next Call of Battlefield either. So pretending like “only 17% of the public has more than a GTX 1060 !!1!!1@ is kind of deceptive. Of the people playing high-spec games, the numbers are much higher.

The spec for an “average” purpose-built gaming desktop is around an x60, or whatever is $200-300, although again it may not be upgraded for years and years. Beige boxes and laptops almost certainly aim lower than average PC enthusiast DIY builds.

And yeah there are a fair amount of confounding factors there for anyone trying to dig out specific niches from within the data. It would be nice to drill down to desktop vs laptop systems, or systems that have played a game (not streamed) for more than 10 hours total in their life. Or hardware-hours would be an interesting way to look at that (and might help filter out the effects of cybercafes, which may have lots of unique logins), or by average spend.
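That hardware-hours idea is easy to sketch. The records and field layout below are purely hypothetical (Valve doesn't publish per-machine playtime), but they show how weighting by hours played would shrink the store-only laptops and streaming boxes down to noise:

```python
from collections import defaultdict

# Hypothetical survey records: (gpu_model, hours_actually_played).
# A store-only laptop or a streaming guest logs ~zero play hours,
# so in a hardware-hours view it barely registers.
records = [
    ("GTX 1080 Ti", 400.0),       # main gaming desktop
    ("Intel HD Graphics", 2.0),   # Thinkpad used only for purchases
    ("Intel Iris", 0.0),          # living-room NUC (host does the rendering)
    ("GTX 1060", 150.0),
]

def hardware_hours_share(records):
    """Return each GPU's share of total play hours, not of machines surveyed."""
    hours = defaultdict(float)
    for gpu, played in records:
        hours[gpu] += played
    total = sum(hours.values())
    if total == 0:
        return {}
    return {gpu: h / total for gpu, h in hours.items()}

shares = hardware_hours_share(records)
```

By machine count the 1080 Ti is 1 in 4 here; by hardware-hours it carries over 70% of the weight, which is the point of the exercise.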

Effectively, the public Steam Hardware Survey data is not specific enough to show some of the common things people want to know from data like this. I'm sure Valve will sell you that information, but the public doesn't get it for free; that's the conclusion this topic always ends up at. It's probably by design: studios and publishers will pay for that kind of data.

It's not a pissing match about who's a "true gamer", it's people curious what the data looks like sliced some particular way.

Paul MaudDib fucked around with this message at 01:47 on Jan 1, 2020

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Nah, people upthread were. I'll be the first to admit the Steam Hardware Survey isn't the greatest data set, but it's all we got. It's still enough to reasonably demonstrate that high-end components aren't anywhere near mainstream, given that there are roughly twice as many 1060, 1050, and 1050 Ti owners as there are anything higher end.

Fantastic Foreskin fucked around with this message at 01:13 on Jan 1, 2020

Hackan Slash
May 31, 2007
Hit it until it's not a problem anymore
This caused me to actually check out the survey and I'm glad the 2080S is at the very bottom, tied with the Nvidia 965M and 740.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
Has anyone gotten integer scaling on an RTX card to actually work? First I just couldn't enable it - applying the settings change would not stick, the screen would just blink and go back to the aspect ratio scaling mode instead. Figured out that if you're running your second display on the IGPU then it won't work, so ok, fine, put the secondary on the main GPU and then it'd let me enable it. But then I tried actually playing a game (Starsector) in fullscreen mode, but instead of getting fullscreen I got a borderless window in the top left quarter of the screen with the desktop visible below at some downscaled pixellated resolution. Am I doing something wrong here or does it just not work with some games?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I'm running on an 18.5-inch monitor with a 1366x768 native resolution, and I managed to bump it up to 1600x900, and while various graphical options make it look good in-game, the text is a little too small / grainy (?) when just browsing the internet

is there anything I can do to improve it? It's almost like I want to apply some kind of sharpening or filtering or post-processing, but to Windows

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

gradenko_2000 posted:

I'm running on an 18.5-inch monitor with a 1366x768 native resolution, and I managed to bump it up to 1600x900, and while various graphical options make it look good in-game, the text is a little too small / grainy (?) when just browsing the internet

is there anything I can do to improve it? It's almost like I want to apply some kind of sharpening or filtering or post-processing, but to Windows

you really need to figure out whether your native resolution is really 1366x768 or 1600x900, some monitors let you feed them higher resolutions and they'll downsample it.

there is such a thing as "UI scaling" in windows but feeding it an upscaled UI and then using the monitor to downsample it probably isn't going to produce stellar image quality. Just feed it whatever its native resolution is.

Fabulousity
Dec 29, 2008

Number One I order you to take a number two.

Also as the decade ends let's pour one out for 16:10 displays. :smith:

Voxx
Jul 28, 2009

I'll give 'em a hold
and a break to breathe
And if they can't play nice
I won't play with 'em at all

Fabulousity posted:

Also as the decade ends let's pour one out for 16:10 displays. :smith:

rip

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
1200p was just too perfect for this world.

isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE

Paul MaudDib posted:

I have a lovely Thinkpad T410 that has steam on it so I can buy games, I have a NUC in my living room for streaming, but my games PC has a 1080 Ti. Any of them can get surveyed.

Have you been previously surveyed on those streaming clients? Seems to me that Valve could significantly improve the quality of their surveys if they limit them to machines that have actually been running games, no complicated heuristics necessary.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

isndl posted:

Have you been previously surveyed on those streaming clients? Seems to me that Valve could significantly improve the quality of their surveys if they limit them to machines that have actually been running games, no complicated heuristics necessary.

yeah I've been surveyed on my laptop at least once, maybe twice

Level 1 Thief
Dec 17, 2007

I'm busy, and I'm having fun.

TheFluff posted:

Has anyone gotten integer scaling on an RTX card to actually work? First I just couldn't enable it - applying the settings change would not stick, the screen would just blink and go back to the aspect ratio scaling mode instead. Figured out that if you're running your second display on the IGPU then it won't work, so ok, fine, put the secondary on the main GPU and then it'd let me enable it. But then I tried actually playing a game (Starsector) in fullscreen mode, but instead of getting fullscreen I got a borderless window in the top left quarter of the screen with the desktop visible below at some downscaled pixellated resolution. Am I doing something wrong here or does it just not work with some games?

This isn't very helpful for solving your problem but I haven't had any problems with integer scaling whatsoever since it launched (on a 2070, mainly using a 4k TV). When it's active, any resolution works exactly as it would with any other scaling method, multiplied as many times as will fit and then centered on the screen. Leaves some extra borders on resolutions that don't scale cleanly (like 480p would need a 4.5x so it just does 4x) but I can't think of any times it's failed to do exactly as it's supposed to. I also don't have an IGPU so make of that what you will.
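The behavior described there (largest whole multiple of the source that fits, centered, leftover filled with borders) is simple arithmetic. This hypothetical helper is not NVIDIA's code, just a sketch of what the driver appears to be doing:

```python
def integer_scale(src_w, src_h, dst_w, dst_h):
    """Scale a source resolution onto a display by the largest whole-number
    factor that fits, centered; the remainder becomes black borders."""
    factor = min(dst_w // src_w, dst_h // src_h)
    if factor < 1:
        raise ValueError("source exceeds display; integer scaling impossible")
    out_w, out_h = src_w * factor, src_h * factor
    border_x = (dst_w - out_w) // 2   # per-side horizontal border
    border_y = (dst_h - out_h) // 2   # per-side vertical border
    return factor, out_w, out_h, border_x, border_y
```

The 480p-on-4K case above falls out of the `min()`: 2160 / 480 only allows 4x even though the width would allow more, so you get a 4x image with borders instead of a fractional 4.5x stretch.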

SwissArmyDruid
Feb 14, 2014

by sebmojo

Fabulousity posted:

Also as the decade ends let's pour one out for 16:10 displays. :smith:

I wouldn't pour one out just yet. I mean, everyone knows that 3:2 is god's own resolution for getting work done, but 16:10 is still alive and well in the mobile space, see Dell's new XPS 13 2-in-1.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

SwissArmyDruid posted:

I wouldn't pour one out just yet. I mean, everyone knows that 3:2 is god's own resolution for getting work done, but 16:10 is still alive and well in the mobile space, see Dell's new XPS 13 2-in-1.

Don't try to frighten us with your sorcerer's ways, Lord Vader. Your sad devotion to that ancient religion has not helped you conjure up the stolen data tapes, or given you clairvoyance enough to find the Rebels' hidden fort...

[chokes on own gum]

Craptacular!
Jul 9, 2001

Fuck the DH

K8.0 posted:

What you're saying here isn't really in conflict with what I said. There really IS no GPU like the 1070 on the market today - the 1660 Super is $230 and not as good.

I would say the problem is that there is no GPU like the 1070 Ti out there today. The Super-izing of Turing is nice but happened across all models. The 1070 Ti provided a fairly significant boost for about the same price and simply pushed 1070s down into being more affordable.

The worst part is that Nvidia sells these lower Turing cards that market themselves as RTX but really can't do it without significant compromise. I guess the 1080p crowd needs an RTX card too, but I tell any friends who listen not to buy these rebadged machine learning cards, and to wait until Nvidia makes an RTX generation that doesn't have a bunch of tensor cores costing money and doing nothing.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Craptacular! posted:

I guess the 1080p crowd needs an RTX card too, but I tell any friends who listen not to buy these rebadged machine learning cards, and to wait until Nvidia makes an RTX generation that doesn't have a bunch of tensor cores costing money and doing nothing.

Radeon VII is still a super solid buy if you're literally only doing compute, and you don't need CUDA (OpenCL only).

R7 has been running $500 - 10% ebay bucks. If ROCm fits your bill you would be stupid not to get that card at that price. 2080 Ti compute performance, 16GB VRAM, ~5700XT pricing.

Past that you have the choice of the 1080 Ti vs 2080 (lower vram vs tensor), if you need CUDA.

Paul MaudDib fucked around with this message at 10:06 on Jan 1, 2020

Craptacular!
Jul 9, 2001

Fuck the DH
No, I mean that IMO the 20x0 cards are ML cards repurposed for gaming, and you're having to pay extra for using a card that wasn't intended for games. It's not like Nvidia is going to eat the loss on all of those tensor cores that never turn on.

Setzer Gabbiani
Oct 13, 2004

Fabulousity posted:

Also as the decade ends let's pour one out for 16:10 displays. :smith:

I'm not budging from my U2410, no

Arzachel
May 12, 2012

Craptacular! posted:

No, I mean that IMO the 20x0 cards are ML cards repurposed for gaming, and you're having to pay extra for using a card that wasn't intended for games. It's not like Nvidia is going to eat the loss on all of those tensor cores that never turn on.

Tensor cores account for maybe ~5% of the die.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arzachel posted:

Tensor cores account for maybe ~5% of the die.

"cause the compute market wants it lol"

see also: "cause pixar wants it lol"

you've unearthed the secret nvidia design documents

Arzachel
May 12, 2012
It's kind of like calling every modern GPU a repurposed video encoder because they all have on-die fixed function encoders.

sauer kraut
Oct 2, 2004

Arzachel posted:

Tensor cores account for maybe ~5% of the die.

Shockingly true if you run the numbers on conventional stuff per mm².
The Turing chips are just humongous in general; even the humble 1660 Ti is larger than a 5700 XT :smith:

eames
May 9, 2009

Craptacular! posted:

No, I mean that IMO the 20x0 cards are ML cards repurposed for gaming, and you're having to pay extra for using a card that wasn't intended for games. It's not like Nvidia is going to eat the loss on all of those tensor cores that never turn on.

I agree with you but it sure looks like NVidia is all-in on RTX for now. That seems unlikely to change unless the RTX standard fails because performance stagnates or other manufacturers pull ahead with better rasterization performance (i.e. 5900XT performing the same as a 3080ti at a 30% lower price). Obviously both of those scenarios look very unlikely. Maybe it’ll all make more sense with the next generation. I don’t think Turing was ever intended to be this physically large.

Shaocaholica
Oct 29, 2002

Fig. 5E
Is 8GB+ vram recommended for ‘enthusiast’ gaming these days? For modern titles and let’s say games released in the next 2 years.

VelociBacon
Dec 8, 2009

Sounds about right but the actual card matters more than the vram in many cases.

Arzachel
May 12, 2012

Shaocaholica posted:

Is 8GB+ vram recommended for ‘enthusiast’ gaming these days? For modern titles and let’s say games released in the next 2 years.

Very much depends on what your definition of enthusiast is, but doesn't everything north of a 2060 have 8GB vram anyways?

Shaocaholica
Oct 29, 2002

Fig. 5E

VelociBacon posted:

Sounds about right but the actual card matters more than the vram in many cases.

Sure but if you run out of vram for game assets then even a faster card is going to drop way below a slower card with enough vram.

I guess that might not be a problem with modern cards either if they all have 8+

I have a few projects going on now and some of them are open to the used market at $150 and up, and I guess that might contain sub-8GB cards, which I might avoid.

Arzachel
May 12, 2012

Shaocaholica posted:

Sure but if you run out of vram for game assets then even a faster card is going to drop way below a slower card with enough vram.

I guess that might not be a problem with modern cards either if they all have 8+

I have a few projects going on now and some of them are open to the used market at $150 and up, and I guess that might contain sub-8GB cards, which I might avoid.

If you're on a budget, just put the money towards a faster card. 6GB should be perfectly fine in the near future.

Shaocaholica
Oct 29, 2002

Fig. 5E
Is either AMD or Nvidia Vulkan driver implementation better than the other per similar on paper performance?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Vulkan and DX12 are both low-level APIs. Performance is generally less about the driver and more about how well-implemented the software is, as well as how suited the hardware is to the given task.

Shaocaholica
Oct 29, 2002

Fig. 5E

K8.0 posted:

Vulkan and DX12 are both low-level APIs. Performance is generally less about the driver and more about how well-implemented the software is, as well as how suited the hardware is to the given task.

Ok. I just wanted to be sure there weren't any driver-side implementation issues with Vulkan on either platform, since it's not a commonly used API and probably has less development support on the driver side. Obviously the application side matters but that should be platform independent.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Shaocaholica posted:

Is 8GB+ vram recommended for ‘enthusiast’ gaming these days? For modern titles and let’s say games released in the next 2 years.

The 2080/ti is arguably the only enthusiast card series, and they have sufficient ram, yes.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Craptacular! posted:

No, I mean that IMO the 20x0 cards are ML cards repurposed for gaming, and you're having to pay extra for using a card that wasn't intended for games. It's not like Nvidia is going to eat the loss on all of those tensor cores that never turn on.

I think you are strongly overestimating the effect of the tensor cores on the price. Nvidia charges as much as they do on the top end because they can, not because of production costs. If they removed the tensor cores the price wouldn't really change.

It's the McDonald's example. A cheeseburger, fries and a drink all cost about $1 each. The cost to produce them is about a dollar, about 25 cents and about 5 cents respectively. They charge the most they can for them, not some arbitrary percentage above production cost

shrike82
Jun 11, 2005

Have 20x0 cards dropped significantly in price? Sales didn't meet expectations IIRC, so it makes you wonder why they haven't adjusted prices accordingly

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Because he's wrong and the BOM is in fact why Turing is so expensive. The GPUs are absolutely enormous, and big pieces of high-end silicon are really expensive to produce. Nvidia's a publicly traded company, you can just look at their financials. They're doing well, but they're not doing as well as they would be if they could produce cheap GPUs. The Supers did make a solid dent for them, because they're capitalizing on everything they can get out of the GPUs they're fabbing.

The situation is not likely to change much soon. Samsung 7nm seems to have fallen on its face, and TSMC won't be able to ramp up production nearly enough to meet the combined demand from everyone who wants to fab high-performance parts.


Inept
Jul 8, 2003

Taima posted:

The 2080/ti is arguably the only enthusiast card series, and they have sufficient ram, yes.

I remember when posters in these threads were younger and $200 video cards were mainstream and anything above was enthusiast. Nerds ITT have too much money now.
