|
ItBreathes posted:15.7% are on a 1070 or better, 14.6% are on a 1060, leaving ~70% of those polled on something less than the 1080p card from 3 years ago. Y'all pretty drat skewed, especially the guy who said "everything below the $350ish 5700s are just 'limp by playing old games' products". The thing I will note is that Steam stats include a lot of laptops, which is readily apparent from the prevalence of 1366x768 and 720p screens. You would have to go out of your way to purchase a monitor with that resolution, but it's a super common thing on laptops. The Steam Hardware Survey also doesn't distinguish between PCs that actually get played on and ones that are streaming guests or just get used for store purchases. I have a lovely Thinkpad T410 that has Steam on it so I can buy games, I have a NUC in my living room for streaming, but my games PC has a 1080 Ti. Any of them can get surveyed. For the people building desktop PCs, the specs are a lot higher. If you're using anything older than Maxwell or a 7870 you don't even have driver support, so you're not spending $60 on the latest AAA title either, and your hardware is irrelevant to AAA devs looking to target their audience. Not that I disagree with your general thrust that there's a lot of people perfectly happy on lower-spec or older hardware. But there is a pretty real limit to how old hardware can be and still be active in titles released in 2019/2020. The average gaming-spec PC gets an x60; NVIDIA has been clear that's their best-selling segment. It may not be upgraded for 5 years, of course, so you do have a long tail of older hardware. Paul MaudDib fucked around with this message at 23:14 on Dec 31, 2019 |
# ? Dec 31, 2019 22:53 |
|
|
lmao “actually you need to unskew the survey to factor in people with multiple computers” people game on lovely laptops all the time and it’s pretty unusual for non enthusiasts to have more than one PC (desktop/laptop combo)
|
# ? Jan 1, 2020 00:10 |
|
shrike82 posted:lmao “actually you need to unskew the survey to factor in people with multiple computers” It might be unusual for them to have a second PC but they definitely have phones and tablets that skew the survey was his point, I assume I'm p sure steam works on some smart TVs now according to remote play advertising
|
# ? Jan 1, 2020 00:25 |
|
Statutory Ape posted:It might be unusual for them to have a second PC but they definitely have phones and tablets that skew the survey was his point, I assume Pretty sure the Steam hardware survey doesn't poll the Steam phone app. For (phone/tablet/TV) remote play, the Steam instance would still be on said person's main computer. Paul was saying that people with lovely laptops are skewing the percentages. There are valid reasons for saying so, but among people with only 1 display it's like 12% of people. You could add up all the mobile/integrated GPUs in the list for better numbers, but that's a chore. More notably, 65% of people with 1 monitor have a 1080p monitor and 80% of people with more than one monitor are running two 1080p monitors. No stats are given for the relative size of these two groups. Like he said though, the average gaming PC gets an x60, the survey shows it, every report on GPUs says the $200-300 segment is the best selling, and the guys saying that anyone with below a 5700 XT isn't a real gamer are total dipshits. Fantastic Foreskin fucked around with this message at 00:44 on Jan 1, 2020 |
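The bucketing exercise described above (summing the survey's mobile/integrated GPU lines for better numbers) is easy to sketch. The entries and percentages below are made-up placeholders for illustration, not real survey figures:

```python
# Hypothetical sketch of bucketing Steam survey GPU share lines.
# Names and percentages are illustrative placeholders, not real data.
survey = {
    "GTX 1060": 14.6,
    "GTX 1070": 4.0,
    "Intel HD Graphics 620": 3.1,
    "GTX 1050 Ti (mobile)": 2.5,
}

def share(survey, predicate):
    """Sum the share of all entries whose name matches a predicate."""
    return sum(pct for name, pct in survey.items() if predicate(name))

# Bucket everything that looks integrated or mobile.
integrated_or_mobile = share(
    survey, lambda n: "Intel" in n or "mobile" in n.lower()
)
```

With the placeholder numbers above, the integrated/mobile bucket comes out to 5.6%; the same predicate trick works for any slice of the published list.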
# ? Jan 1, 2020 00:39 |
|
shrike82 posted:lmao “actually you need to unskew the survey to factor in people with multiple computers” Desktop/laptop combo havers have multiple PCs, obviously, and they get surveyed separately. And yeah, people need to log into their Steam account to stream, so that machine can get surveyed too. I'm not saying low-spec gamers or casual/indie gamers aren't "real gamers" at all, but neither are AAA game devs taking your GT210 or Intel HD Graphics 310 into account when they spec out the next Call of Battlefield. So pretending like "only 17% of the public has more than a GTX 1060 !!1!!1" is kind of deceptive. Of the people playing high-spec games, the numbers are much higher. The spec for an "average" purpose-built gaming desktop is around an x60, or whatever is $200-300, although again it may not be upgraded for years and years. Beige boxes and laptops almost certainly aim lower than average PC enthusiast DIY builds. And yeah, there are a fair number of confounding factors there for anyone trying to dig out specific niches from within the data. It would be nice to drill down to desktop vs laptop systems, or systems that have played a game (not streamed) for more than 10 hours total in their life. Hardware-hours would be an interesting way to look at that (and might help filter out the effects of cybercafes, which may have lots of unique logins), or average spend. Effectively the public Steam Hardware Survey data is not specific enough to show some of the common things people want to know from data like this. I'm sure they'll sell you that information, but the public doesn't get it for free. That's the conclusion this topic always ends up with. It's probably by design; studios or publishers will probably pay for that kind of data. It's not a pissing match about who's a "true gamer", it's people curious what the data looks like sliced some particular way. Paul MaudDib fucked around with this message at 01:47 on Jan 1, 2020 |
# ? Jan 1, 2020 00:59 |
|
Nah, people upthread were. I'll be the first to admit the Steam Hardware Survey isn't the greatest data set, but it's all we've got. It's still enough to reasonably demonstrate that high-end components aren't anywhere near mainstream, given that there are ~twice as many 1060, 1050, and 1050 Ti owners as there are anything higher end.
Fantastic Foreskin fucked around with this message at 01:13 on Jan 1, 2020 |
# ? Jan 1, 2020 01:01 |
|
This caused me to actually check out the survey and I'm glad the 2080S is at the very bottom, tied with the Nvidia 965M and 740.
|
# ? Jan 1, 2020 01:19 |
|
Has anyone gotten integer scaling on an RTX card to actually work? First I just couldn't enable it - applying the settings change wouldn't stick; the screen would just blink and go back to the aspect ratio scaling mode instead. I figured out that it won't work if you're running your second display on the iGPU, so okay, fine, I put the secondary on the main GPU and then it let me enable it. But then I tried actually playing a game (Starsector) in fullscreen mode, and instead of getting fullscreen I got a borderless window in the top left quarter of the screen with the desktop visible below it at some downscaled pixellated resolution. Am I doing something wrong here or does it just not work with some games?
|
# ? Jan 1, 2020 01:21 |
|
I'm running on an 18.5-inch monitor with a 1366x768 native resolution, and I managed to bump it up to 1600x900. While various graphical options make it look good in-game, the text is a little too small / grainy (?) when just browsing the internet. Is there anything I can do to improve it? It's almost like I want to apply some kind of sharpening or filtering or post-processing, but to Windows
|
# ? Jan 1, 2020 04:09 |
|
gradenko_2000 posted:I'm running on an 18.5-inch monitor with a 1366x768 native resolution, and I managed to bump it up to 1600x900, and while various graphical options make it look good in-game, the text is a little too small / grainy (?) when just browsing the internet you really need to figure out whether your native resolution is really 1366x768 or 1600x900, some monitors let you feed them higher resolutions and they'll downsample it. there is such a thing as "UI scaling" in windows but feeding it an upscaled UI and then using the monitor to downsample it probably isn't going to produce stellar image quality. Just feed it whatever its native resolution is.
|
# ? Jan 1, 2020 04:16 |
|
Also as the decade ends let's pour one out for 16:10 displays.
|
# ? Jan 1, 2020 04:37 |
|
Fabulousity posted:Also as the decade ends let's pour one out for 16:10 displays. rip
|
# ? Jan 1, 2020 05:29 |
|
1200p was just too perfect for this world.
|
# ? Jan 1, 2020 05:43 |
|
Paul MaudDib posted:I have a lovely Thinkpad T410 that has steam on it so I can buy games, I have a NUC in my living room for streaming, but my games PC has a 1080 Ti. Any of them can get surveyed. Have you been previously surveyed on those streaming clients? Seems to me that Valve could significantly improve the quality of their surveys if they limit them to machines that have actually been running games, no complicated heuristics necessary.
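The filter being suggested is simple to express. This sketch uses invented field names and data, since Valve's actual client telemetry isn't public; the 10-hour cutoff echoes the threshold floated upthread:

```python
# Hypothetical sketch of isndl's suggestion: only survey machines
# that have actually run games locally. Field names and data are
# invented; Valve's real telemetry schema isn't public.
machines = [
    {"name": "games PC", "hours_in_game": 412.0},
    {"name": "streaming NUC", "hours_in_game": 0.0},
    {"name": "ThinkPad T410", "hours_in_game": 0.5},
]

def survey_eligible(machines, min_hours=10.0):
    """Keep only machines with meaningful local playtime."""
    return [m["name"] for m in machines if m["hours_in_game"] >= min_hours]
```

Applied to the three example machines, only the games PC survives the cut - the store-purchases laptop and the streaming box drop out with no heuristics beyond a playtime floor.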
|
# ? Jan 1, 2020 06:06 |
|
isndl posted:Have you been previously surveyed on those streaming clients? Seems to me that Valve could significantly improve the quality of their surveys if they limit them to machines that have actually been running games, no complicated heuristics necessary. yeah I've been surveyed on my laptop at least once, maybe twice
|
# ? Jan 1, 2020 06:08 |
|
TheFluff posted:Has anyone gotten integer scaling on an RTX card to actually work? First I just couldn't enable it - applying the settings change would just not stick, the screen would just blink and go back to the aspect ratio scaling mode instead. Figured out that if you're running your second display on the IGPU then it won't work, so ok, fine, put the secondary on the main GPU and then it'd let me enable it. But then I tried actually playing a game (Starsector) in fullscreen mode, but instead of getting fullscreen I got a borderless window in the top left quarter of the screen with the desktop visible below at some downscaled pixellated resolution. Am I doing something wrong here or does it just not work with some games? This isn't very helpful for solving your problem but I haven't had any problems with integer scaling whatsoever since it launched (on a 2070, mainly using a 4k TV). When it's active, any resolution works exactly as it would with any other scaling method, multiplied as many times as will fit and then centered on the screen. It leaves some extra borders on resolutions that don't scale cleanly (480p would need 4.5x, so it just does 4x), but I can't think of any time it's failed to do exactly as it's supposed to. I also don't have an iGPU, so make of that what you will.
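The behavior described (scale by the largest whole factor that fits, then center, e.g. 480p falling back from 4.5x to 4x on a 4K panel) pins down to a few lines. This is a sketch of the rule, not NVIDIA's actual driver code:

```python
# Sketch of the integer scaling rule as described: scale the source
# by the largest whole factor that fits the display, then center it.
def integer_scale(src_w, src_h, disp_w, disp_h):
    factor = min(disp_w // src_w, disp_h // src_h)
    out_w, out_h = src_w * factor, src_h * factor
    # Center the scaled image; leftover area becomes black borders.
    x = (disp_w - out_w) // 2
    y = (disp_h - out_h) // 2
    return factor, (x, y, out_w, out_h)
```

For 640x480 on a 3840x2160 panel this yields a 4x factor (4.5x vertically would overflow is not possible with whole multiples, so it floors to 4) and a centered 2560x1920 image with borders all around.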
|
# ? Jan 1, 2020 08:21 |
|
Fabulousity posted:Also as the decade ends let's pour one out for 16:10 displays. I wouldn't pour one out just yet. I mean, everyone knows that 3:2 is god's own aspect ratio for getting work done, but 16:10 is still alive and well in the mobile space, see Dell's new XPS 13 2-in-1.
|
# ? Jan 1, 2020 09:16 |
|
SwissArmyDruid posted:I wouldn't pour one out just yet. I mean, everyone knows that 3:2 is god's own resolution for getting work done, but 16:10 is still alive and well in the mobile space, see Dell's new XPS 13 2-in-1. Don't try to frighten us with your sorcerer's ways, Lord Vader. Your sad devotion to that ancient religion has not helped you conjure up the stolen data tapes, or given you clairvoyance enough to find the Rebels' hidden fort... [chokes on own gum]
|
# ? Jan 1, 2020 09:51 |
|
K8.0 posted:What you're saying here isn't really in conflict with what I said. There really IS no GPU like the 1070 on the market today - the 1660 Super is $230 and not as good. I would say the problem is that there is no GPU like the 1070 Ti out there today. The Super-izing of Turing is nice but happened across all models. The 1070 Ti provided a fairly significant boost for about the same price and simply depressed 1070s into being more affordable. The worst part is that Nvidia sells these lower Turing cards that market themselves as RTX but really can't do it without significant compromise. I guess the 1080p crowd needs an RTX card too, but I tell any friends who listen not to buy these rebadged machine-learning cards, and to wait until Nvidia makes an RTX generation that doesn't have a bunch of tensor cores costing money and doing nothing.
|
# ? Jan 1, 2020 09:56 |
|
Craptacular! posted:I guess the 1080p crowd needs an RTX card too but I tell any friends who listen not to buy these rebadged machine learning cards, and wait until Nvidia makes an RTX generation that doesn't have a bunch of tensor cores costing money and doing nothing. The Radeon VII is still a super solid buy if you're literally only doing compute - no CUDA, only OpenCL. The R7 has been running $500 - 10% ebay bucks. If ROCm fits your bill you would be stupid not to get that card at that price: 2080 Ti compute performance, 16GB VRAM, ~5700 XT pricing. Past that, if you need CUDA, you have the choice of the 1080 Ti vs the 2080 (more VRAM vs tensor cores). Paul MaudDib fucked around with this message at 10:06 on Jan 1, 2020 |
# ? Jan 1, 2020 10:00 |
|
No, I mean that IMO the 20x0 cards are ML cards repurposed for gaming, and you're having to pay extra for using a card that wasn't intended for games. It's not like Nvidia is going to eat the loss on all of those tensor cores that never turn on.
|
# ? Jan 1, 2020 10:47 |
|
Fabulousity posted:Also as the decade ends let's pour one out for 16:10 displays. I'm not budging from my U2410, no
|
# ? Jan 1, 2020 11:34 |
|
Craptacular! posted:No, I mean that IMO the 20x0 cards are ML cards repurposed for gaming, and you're having to pay extra for using a card that wasn't intended for games. It's not like Nvidia is going to eat the loss on all of those tensor cores that never turn on. Tensor cores account for maybe ~5% of the die.
|
# ? Jan 1, 2020 11:38 |
|
Arzachel posted:Tensor cores account for maybe ~5% of the die. "cause the compute market wants it lol" see also: "cause pixar wants it lol" you've unearthed the secret nvidia design documents
|
# ? Jan 1, 2020 11:40 |
|
It's kind of like calling every modern GPU a repurposed video encoder because they all have on-die fixed function encoders.
|
# ? Jan 1, 2020 11:56 |
|
Arzachel posted:Tensor cores account for maybe ~5% of the die. Shockingly true if you run the numbers on the conventional logic per mm². The Turing chips are just humongous in general; even the humble 1660 Ti is larger than a 5700 XT
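The die-size claim checks out against the commonly published figures (treat the exact mm² numbers as approximate):

```python
# Rough die-area comparison using commonly published approximate
# die sizes in mm^2.
die_mm2 = {
    "TU102 (2080 Ti)": 754,
    "TU116 (1660 Ti)": 284,
    "Navi 10 (5700 XT)": 251,
}

# TU116 vs Navi 10: the Turing midrange die is larger than AMD's,
# despite TU116 targeting a lower performance tier.
tu116_vs_navi10 = die_mm2["TU116 (1660 Ti)"] / die_mm2["Navi 10 (5700 XT)"]
```

That ratio works out to roughly 1.13, i.e. the 1660 Ti's die is about 13% larger than the 5700 XT's, which is the point being made about Turing's size.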
|
# ? Jan 1, 2020 12:50 |
|
Craptacular! posted:No, I mean that IMO the 20x0 cards are ML cards repurposed for gaming, and you're having to pay extra for using a card that wasn't intended for games. It's not like Nvidia is going to eat the loss on all of those tensor cores that never turn on. I agree with you but it sure looks like NVidia is all-in on RTX for now. That seems unlikely to change unless the RTX standard fails because performance stagnates or other manufacturers pull ahead with better rasterization performance (i.e. 5900XT performing the same as a 3080ti at a 30% lower price). Obviously both of those scenarios look very unlikely. Maybe it’ll all make more sense with the next generation. I don’t think Turing was ever intended to be this physically large.
|
# ? Jan 1, 2020 13:14 |
|
Is 8GB+ vram recommended for ‘enthusiast’ gaming these days? For modern titles and let’s say games released in the next 2 years.
|
# ? Jan 1, 2020 18:38 |
|
Sounds about right but the actual card matters more than the vram in many cases.
|
# ? Jan 1, 2020 18:54 |
|
Shaocaholica posted:Is 8GB+ vram recommended for ‘enthusiast’ gaming these days? For modern titles and let’s say games released in the next 2 years. Very much depends on what your definition of enthusiast is, but doesn't everything north of a 2060 have 8GB vram anyways?
|
# ? Jan 1, 2020 19:02 |
|
VelociBacon posted:Sounds about right but the actual card matters more than the vram in many cases. Sure, but if you run out of VRAM for game assets then even a faster card is going to drop way below a slower card with enough VRAM. I guess that might not be a problem with modern cards either if they all have 8GB+. I have a few projects going on now and some of them are open to the used market at $150 and up, and that range might contain sub-8GB cards, which I might avoid.
|
# ? Jan 1, 2020 19:25 |
|
Shaocaholica posted:Sure but if you run out of vram for game assets then even a faster card is going to drop way below a slower card with enough vram. If you're on a budget, just put the money towards a faster card. 6GB should be perfectly fine in the near future.
|
# ? Jan 1, 2020 19:54 |
|
Is either AMD or Nvidia Vulkan driver implementation better than the other per similar on paper performance?
|
# ? Jan 1, 2020 20:13 |
|
Vulkan and DX12 are both low-level APIs. Performance is generally less about the driver and more about how well-implemented the software is, as well as how suited the hardware is to the given task.
|
# ? Jan 1, 2020 20:30 |
|
K8.0 posted:Vulkan and DX12 are both low level APIs. Performance is generally less about the driver and more about how well-implemented the software is, as well as how suited the hardware is to the given task. Ok. I just wanted to be sure there weren't any driver-side API implementation issues with Vulkan on either platform, since it's a less commonly used API and probably gets less development support on the driver side. Obviously the application side matters, but that should be platform-independent.
|
# ? Jan 1, 2020 22:19 |
|
Shaocaholica posted:Is 8GB+ vram recommended for ‘enthusiast’ gaming these days? For modern titles and let’s say games released in the next 2 years. The 2080/ti is arguably the only enthusiast card series, and they have sufficient ram, yes.
|
# ? Jan 1, 2020 22:25 |
|
Craptacular! posted:No, I mean that IMO the 20x0 cards are ML cards repurposed for gaming, and you're having to pay extra for using a card that wasn't intended for games. It's not like Nvidia is going to eat the loss on all of those tensor cores that never turn on. I think you are strongly overestimating the effect of the tensor cores on the price. Nvidia charges as much as they do on the top end because they can, not because of production costs. If they removed the tensor cores the price wouldn't really change. It's the McDonald's example: a cheeseburger, fries, and a drink all cost about $1 each, while the cost to produce them is about a dollar, about 25 cents, and about 5 cents respectively. They charge the most they can for them, not some arbitrary percentage above production cost.
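Using the rough figures from the post, the margin arithmetic looks like this:

```python
# Margin arithmetic for the McDonald's analogy, using the post's
# rough numbers: (price, production cost) per item.
items = {
    "cheeseburger": (1.00, 1.00),
    "fries": (1.00, 0.25),
    "drink": (1.00, 0.05),
}

margins = {name: price - cost for name, (price, cost) in items.items()}
# All three sell for the same dollar, but the margins range from
# essentially zero (burger) to ~95 cents (drink): pricing tracks
# what each item can fetch, not cost plus a fixed markup.
```

Which is the point being made about the 20-series: the sticker price is set by what the market will bear, so shaving a small slice of die area wouldn't move it much.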
|
# ? Jan 1, 2020 22:55 |
|
Have 20x0 cards dropped significantly in price? Sales didn't meet expectations iirc, so it makes you wonder why they haven't adjusted prices accordingly
|
# ? Jan 2, 2020 01:24 |
|
Because he's wrong and the BOM is in fact why Turing is so expensive. The GPUs are absolutely enormous, and big pieces of high-end silicon are really expensive to produce. Nvidia's a publicly traded company, you can just look at their financials. They're doing well, but they're not doing as well as they would be if they could produce cheap GPUs. The Supers did make a solid dent for them, because they're capitalizing on everything they can get out of the GPUs they're fabbing. The situation is not likely to change much soon. Samsung 7nm seems to have fallen on its face, and TSMC won't be able to ramp up production nearly enough to meet the combined demand from everyone who wants to fab high-performance parts.
|
# ? Jan 2, 2020 01:33 |
|
|
Taima posted:The 2080/ti is arguably the only enthusiast card series, and they have sufficient ram, yes. I remember when posters in these threads were younger and $200 video cards were mainstream and anything above was enthusiast. Nerds ITT have too much money now.
|
# ? Jan 2, 2020 02:03 |