Rusty
Sep 28, 2001
Dinosaur Gum
Seriously, CRTs were massive and most people had tiny screens with huge bezels; at their peak, people were using 17" or less. It's easy to see why everyone was eager to throw them away.

repiv
Aug 13, 2009

big CRTs contained multiple kilograms of lead; they were doomed by environmental regulations as soon as an alternative came along

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
And let's not forget the fun of dealing with convergence issues if you dared go above 17". Also text on CRTs was not always exactly the best experience.

Pivo
Aug 20, 2004


And rear-projection TVs were the only affordable way to go big and they were awful

Fast refresh, wonderful colours and insane contrast did make CRTs pretty great for games though. 21" Trinitron crew represent

Level 1 Thief
Dec 17, 2007

I'm busy, and I'm having fun.
I had a 34" HD CRT for the better part of a decade and I cannot stress enough that it weighed over two hundred pounds.

spunkshui
Oct 5, 2011



Heavy rear end screen, fully plastic feet.

You need to get to a plug in the back?

EEEEEEEEEEEEEEECH across the desk

TerminalSaint
Apr 21, 2007


Where must we go...

we who wander this Wasteland in search of our better selves?
It was all worth it for a good degauss.

repiv
Aug 13, 2009

sony made a 24" widescreen PC CRT and used ones still sell for over £1000 on occasion

someone out there is unironically using a CRT by choice in the year of our lord 2020

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT
I'll always be thankful for tech that turned giant gently caress-off 100 pound televisions into svelte, slim 40 pound sideways monoliths

shrike82
Jun 11, 2005

PC gaming tech is neat but people forget how crazy smartphone tech is.
modern phones are basically science fiction gizmos from two decades ago

CaptainSarcastic
Jul 6, 2013



repiv posted:

sony made a 24" widescreen PC CRT and used ones still sell for over £1000 on occasion

someone out there is unironically using a CRT by choice in the year of our lord 2020

I had one and it was my main monitor up to 2010, when it died. It was fantastic, but it was also huge and weighed literally 98 pounds.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
Are there any details out about the big navi cards that have some legitimacy behind them? Mostly curious about the power consumption.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

buglord posted:

Are there any details out about the big navi cards that have some legitimacy behind them? Mostly curious about the power consumption.

There are rumors/leaks that it'll be 300+W just like Ampere. No official word from AMD yet, but we'll find out everything on Wednesday.

terrorist ambulance
Nov 5, 2009
Is it possible they'll be good enough to actually outpace the 3080 even with all of Nvidia's bells and whistles, or are people just hoping for parity-ish

Sagebrush
Feb 26, 2012

ERM... Actually I have stellar scores on the surveys, and every year students tell me that my classes are the best ones they’ve ever taken.
without DLSS it doesn't really matter how good they are

Xaris
Jul 25, 2006

Lucky there's a family guy
Lucky there's a man who positively can do
All the things that make us
Laugh and cry

The Big Bad Worf posted:

It's absolutely terrible how the convenience and form factor of LCDs basically killed off a perfectly fine display technology. Motion resolution is still universally awful on LCDs until you start pushing really high refresh rates or invest in monitors/displays that can do black frame insertion/backlight strobing very well. I can think of exactly one IPS monitor that gives near CRT-level motion quality, if you run it at 75hz, and also accept LCD "black" in place of the true black a CRT is capable of. Just a full two decades of having to look at smeary messes of images that are harder to resolve than 360p video the moment you pan the camera or you try to track an object moving across the screen with your eyes.

Shipon posted:

"perfectly fine". CRTs were massive, consumed more power than some computers, and were environmental disasters. Good riddance to that garbage.
yeah. i'm not sad they're gone, although i do sympathize with the guy above that high-refresh rate CRTs were Actually A Thing and a lot of CAL/CPL people kept one even long into the LCD era.

i remember playing Natural Selection (an asymmetric HL1 mod that features a lot of super fast-paced blinking/leaping/flying aliens) back in the early 2000s on a high-refresh rate CRT and i was godly at tracking aliens or playing Fade/Lerk, like usually hitting #1-#3. My parents bought me an early LCD and threw away my old one, and my skills went to basically zero because everything was a blurry mush at that refresh rate and it was impossible to play fast-paced stuff on it. poo poo mattered a hell of a lot.

it wasn't until about ~5 years ago that high-refresh rate LCDs became a real thing (likely in part advanced by Overwatch, which also features fast-paced movement abilities and is more e-sports oriented) and god drat it was so nice to finally have 120Hz back. That said, the first high refresh rate LCD i bought was a TN panel and man, the downgrade in color quality from my IPS was SO loving bad I ended up giving it away not too long after and got a Samsung CFG70, which was one of the more affordable ~very colorful~ VA 144hz monitors at the time. Now on a Viewsonic vx2758-2kp-mhd which is OK: good colors, low latency, flat-screen, but it has some displayport blinking issues at times.
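
e: rough back-of-envelope on the motion blur thing, if anyone's curious -- the usual rule of thumb is perceived smear ~= eye-tracking speed x frame persistence, which is why high refresh and strobing/BFI help so much. numbers below are purely illustrative, not measurements:

```python
# rough sketch of sample-and-hold motion blur (illustrative numbers, not measurements)
# rule of thumb: perceived smear while eye-tracking ~= tracking speed * frame persistence

def smear_px(track_speed_px_per_s: float, persistence_s: float) -> float:
    """Approximate perceived blur width, in pixels, while tracking a moving object."""
    return track_speed_px_per_s * persistence_s

speed = 960.0  # px/s, e.g. panning across a 1920px-wide screen in two seconds

for label, persistence in [
    ("60 Hz sample-and-hold LCD", 1 / 60),
    ("144 Hz sample-and-hold LCD", 1 / 144),
    ("CRT / strobed backlight (~1 ms)", 0.001),
]:
    print(f"{label}: ~{smear_px(speed, persistence):.1f} px of smear")
```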

Jeff Fatwood
Jun 17, 2013

Sagebrush posted:

without DLSS it doesn't really matter how good they are

It's always fascinating to me that even after physx, hairworks, g-sync and whatever other Nvidia-exclusive bullshit, people are still going all in on "if it doesn't have X proprietary Nvidia stuff, it's not worth using".

Moly B. Denum
Oct 26, 2007

Jeff Fatwood posted:

It's always fascinating to me that even after physx, hairworks, g-sync and whatever other Nvidia-exclusive bullshit, people are still going all in on "if it doesn't have X proprietary Nvidia stuff, it's not worth using".

To be fair, this seems like the first one of theirs that's actually worth using. We'll just have to wait and see if it gets widespread adoption before Microsoft or whoever makes a hardware-agnostic equivalent.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Xaris posted:

it wasn't until about ~5 years ago that high-refresh rate LCDs became a real thing (likely in part advanced by Overwatch

:chloe:

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
how do/did people game on 4k without DLSS, or in games that don't have DLSS?

Sagebrush
Feb 26, 2012

ERM... Actually I have stellar scores on the surveys, and every year students tell me that my classes are the best ones they’ve ever taken.

Jeff Fatwood posted:

It's always fascinating to me that even after physx, hairworks, g-sync and whatever other Nvidia-exclusive bullshit, people are still going all in on "if it doesn't have X proprietary Nvidia stuff, it's not worth using".

PhysX and hairworks were obviously gimmicks.

VRR is a huge huge deal and if Freesync didn't exist I would be saying that G-Sync support is a dealbreaker in 2020. Getting a monitor without either one is definitely dumb.

DLSS is another technology like G-Sync that can increase the quality and performance of any game (for which it's implemented) at essentially no cost. It's a really huge change, far bigger than dumb poo poo like hairworks.

gradenko_2000 posted:

how do/did people game on 4k without DLSS, or in games that don't have DLSS?

worse

it's obviously perfectly possible to run at native 4k, and it will even look a little sharper than a DLSS image, but if you compare a 3080 to whatever AMD comes out with that performs the same, you can get similar framerates at 4k native on either -- or turn on DLSS on the nVidia card and get up to like a 50% boost with negligible image quality loss. The only way of achieving that on the AMD card is going to be through a much greater drop in quality. It's a no-brainer.
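
To put rough numbers on that boost: DLSS renders internally at a reduced resolution and upscales to the output, so the GPU shades far fewer pixels per frame. The scale factors below are the commonly quoted DLSS 2.x presets -- treat them as approximate rather than gospel:

```python
# approximate internal render resolutions for DLSS 2.x at a 4K output
# (scale factors are the commonly quoted presets; treat as approximate)
out_w, out_h = 3840, 2160
native_px = out_w * out_h

presets = {
    "DLSS Quality (2/3 per axis)": 2 / 3,
    "DLSS Balanced (~0.58 per axis)": 0.58,
    "DLSS Performance (1/2 per axis)": 0.5,
}

for name, scale in presets.items():
    w, h = round(out_w * scale), round(out_h * scale)
    print(f"{name}: {w}x{h} internal, {w * h / native_px:.0%} of the native pixel work")
```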

Sagebrush fucked around with this message at 08:02 on Oct 23, 2020

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

gradenko_2000 posted:

how do/did people game on 4k without DLSS, or in games that don't have DLSS?

Slowly, and without raytracing.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

gradenko_2000 posted:

how do/did people game on 4k without DLSS, or in games that don't have DLSS?

poorly

I mean, there’s a lot of indie/competitive titles you can drive at high frame rates easily, or that don’t care about 60 FPS locked, and up until a year or two ago 60 Hz was all you could get at 4K (especially at any sort of mainstream price, 4K120 was a dream). But in AAAs the answer has always been “poorly”, the 2080 Ti was the first card to really do 4K60 locked in a lot of AAA titles. Before that you upscaled or something.

It’s been one of those things where people have been pronouncing “maybe next generation” since the First 4K Card, which was of course the 980 Ti, but it’s been a moving target since games keep getting tougher. 1080 Ti was close, 2080 Ti finally hit 4K60.

Paul MaudDib fucked around with this message at 08:04 on Oct 23, 2020

Knot My President!
Jan 10, 2005

gradenko_2000 posted:

how do/did people game on 4k without DLSS, or in games that don't have DLSS?


4k@60 or 4k@30, dual GPU, significantly lower settings, etc.

imo 4k gaming was abysmal and/or heavily compromised until relatively recently, and simply out of reach before that

LethalGeek
Nov 4, 2009

It's wild reading this thread while still being on a 1060 and 590 @ 1080p and being perfectly happy with it all.

Though I do really want some ray tracing cores sooner rather than later, the madness of trying to acquire one is beyond my means & desire.

LethalGeek fucked around with this message at 08:43 on Oct 23, 2020

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I gamed at 4K/60 on a vanilla 2080 for 1.5 years. It was totally fine and I didn’t have to “significantly” reduce settings. There are always settings in every AAA game that do very little and tank performance by 20-30%.

I can’t describe how weird it is to do something for 18 months and then have people repeatedly say it’s impossible the entire time.

As far as I can tell the problem is that almost no one games at 4K and benchmarking online is always max settings, so people have this super unrealistic view of the resolution that’s essentially based on unoptimized settings no one would use, combined with hearsay.

4K/60 has been fine for a while. Ray tracing, that’s another matter.

Shipon
Nov 7, 2005
I had a 4k 60hz monitor I used for a while, but it was always difficult to get a good experience at max settings even with a 1080Ti. I moved to a 1440p/165Hz GSync display and it was a complete world of difference; now a 60hz monitor just isn't enough. Give me higher refresh 1440p over 4k any day.

Kazinsal
Dec 13, 2011
Yeah, I run a 1440p 144 Hz monitor with an 8700k and a 1080Ti and it's *juicy*. In modern envelope-pushing games I feel like I might need to move up to a 3080, but A) I don't play those games that often and B) holy gently caress lol, getting a 3080 in Canada: it's bad enough trying to get one in the US, but Nvidia's allocations for the Great White North so far are absolutely abysmal, despite the average Canadian having more relative disposable income than the average American.

Honestly, if I didn't play EVE occasionally, I'd still be on a 1080p 144 Hz screen, because visual and temporal quality mean more to me than resolution (at LCD native). The 1440p was for more visibility in EVE and a larger field of view, but I can't bring myself to play below native resolution on an LCD, so everything else I play is at 1440p to the detriment of either framerate or visual quality.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
You guys are missing the point

8K GAMING is here!*

*Subject to availability. Subject to $30,000 displays. 8K Gaming may or may not actually be here. Framerates can go down as well as up

CaptainSarcastic
Jul 6, 2013



After jumping from a 24" 1080p at 60hz to a 27" 1440p at 144hz I definitely prefer the latter. At this point when I fire up the old machine the monitor feels a little small and crowded, although it's still better than when I have to use a laptop. I like gaming at 1440p, but I absolutely love it for normal day to day browsing and working on documents - the increased screen real estate is fantastic.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


8k gaming isn't just the resolution, it's also the price point!

Jeff Fatwood
Jun 17, 2013

Sagebrush posted:

PhysX and hairworks were obviously gimmicks.

VRR is a huge huge deal and if Freesync didn't exist I would be saying that G-Sync support is a dealbreaker in 2020. Getting a monitor without either one is definitely dumb.

DLSS is another technology like G-Sync that can increase the quality and performance of any game (for which it's implemented) at essentially no cost. It's a really huge change, far bigger than dumb poo poo like hairworks.

Of course. Freesync exists because VRR is a huge deal. Not because G-sync is a huge deal. If G-sync were the only tech in town, VRR would still be niche, because the ~$200 surcharge to enter a proprietary walled garden would keep it that way. My point here was that a closed proprietary approach never works for mass adoption.

DLSS will never be anything more than another G-sync and will be overtaken by a free standard if one ever comes out. Same with raytracing. That's why I find it fascinating that people are so ready to declare everything else useless when Sony and Microsoft don't seem to think like that. Hell, Sony and Microsoft themselves have tried proprietary walled garden poo poo multiple times on different markets and they've either crashed and burned or made a cool little niche that they make a buck off of.

edit. I should have clarified/used better examples, but my point was also that Hairworks and PhysX probably wouldn't have been gimmicks if they were something more open, instead of Nvidia loving around and trying to make a quick buck off of a cool idea. I mean, physics in games and swaying hair are both staples now in game tech and there's no reason that they couldn't be big Nvidia trademarks if they had an actual interest in innovation and not just trying to ultra-capitalize on everything they do. They're not Apple.

double edit. also it's going to be very fitting if Cyberpunk is the last big RTX title and after that the AMD approach takes over, because it's supported by everything and not just thousand-dollar graphics cards that also need another proprietary tech in conjunction to make it run acceptably (RE: Witcher 3 / CDPR / Hairworks)

Jeff Fatwood fucked around with this message at 09:52 on Oct 23, 2020

Truga
May 4, 2014
Lipstick Apathy

Taima posted:

I gamed at 4K/60 on a vanilla 2080 for 1.5 years. It was totally fine and I didn’t have to “significantly” reduce settings. There are always settings in every AAA game that do very little and tank performance by 20-30%.

I can’t describe how weird it is to do something for 18 months and then have people repeatedly say it’s impossible the entire time.

As far as I can tell the problem is that almost no one games at 4K and benchmarking online is always max settings, so people have this super unrealistic view of the resolution that’s essentially based on unoptimized settings no one would use, combined with hearsay.

4K/60 has been fine for a while. Ray tracing, that’s another matter.

:same: but a stupendously overclocked 980Ti

where's my new cards lisa, i need to replace this poo poo and get rid of windows already

shrike82
Jun 11, 2005

you're doubling the pixel count moving from 1440p to 4K so the question is what frame rate were you hitting in 1440p (or lower)

Kazinsal
Dec 13, 2011

shrike82 posted:

you're doubling the pixel count moving from 1440p to 4K so the question is what frame rate were you hitting in 1440p (or lower)

Yeah, the raw pixel difference between 1080p and 1440p is 1.778x, and then the difference between 1440p and 4K is another 2.25x. Resolution bumps are pretty fuckin huge
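
Quick sanity check, just multiplying out the standard resolutions:

```python
# pixel-count ratios between the common gaming resolutions
res = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
px = {name: w * h for name, (w, h) in res.items()}

print(f"1080p -> 1440p: {px['1440p'] / px['1080p']:.3f}x the pixels")  # ~1.778x
print(f"1440p -> 4K:    {px['4K'] / px['1440p']:.3f}x the pixels")     # 2.250x
print(f"1080p -> 4K:    {px['4K'] / px['1080p']:.1f}x the pixels")     # 4.0x
```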

Andrigaar
Dec 12, 2003
Saint of Killers

exquisite tea posted:

8k gaming isn't just the resolution, it's also the price point!

Looks like displays are only $2,000-$5,000 USD!

Except that freak-sized one that LTT had. Isn't that one like $30K?

Kazinsal
Dec 13, 2011

Andrigaar posted:

Looks like displays are only $2,000-$5,000 USD!

Except that freak-sized one that LTT had. Isn't that one like $30K?

:stare:

Based on my monitor upgrade schedule I look forward to being at 4K in 2027 and that's fine by me

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Jeff Fatwood posted:

Of course. Freesync exists because VRR is a huge deal. Not because G-sync is a huge deal. If G-sync were the only tech in town, VRR would still be niche, because the ~$200 surcharge to enter a proprietary walled garden would keep it that way. My point here was that a closed proprietary approach never works for mass adoption.

DLSS will never be anything more than another G-sync and will be overtaken by a free standard if one ever comes out. Same with raytracing. That's why I find it fascinating that people are so ready to declare everything else useless when Sony and Microsoft don't seem to think like that. Hell, Sony and Microsoft themselves have tried proprietary walled garden poo poo multiple times on different markets and they've either crashed and burned or made a cool little niche that they make a buck off of.

edit. I should have clarified/used better examples, but my point was also that Hairworks and PhysX probably wouldn't have been gimmicks if they were something more open, instead of Nvidia loving around and trying to make a quick buck off of a cool idea. I mean, physics in games and swaying hair are both staples now in game tech and there's no reason that they couldn't be big Nvidia trademarks if they had an actual interest in innovation and not just trying to ultra-capitalize on everything they do. They're not Apple.

double edit. also it's going to be very fitting if Cyberpunk is the last big RTX title and after that the AMD approach takes over, because it's supported by everything and not just thousand-dollar graphics cards that also need another proprietary tech in conjunction to make it run acceptably (RE: Witcher 3 / CDPR / Hairworks)

Yeah Nvidia, stop trying to make a quick buck from technology you developed.
Technology can be successful for a company without every potential customer having access to it. We have plenty of examples of this. Not everything developed is intended to be ubiquitous, nor should it be, because designing for ubiquity often means steep compromises.
Saying Nvidia doesn't have an actual interest in innovation is patently false. And your comments about real-time ray-tracing here betray your ignorance about how the graphics APIs implement it in games.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

Jeff Fatwood posted:

DLSS will never be anything more than another G-sync and will be overtaken by a free standard if one ever comes out. Same with raytracing.

But right now there literally is nothing else like it, because it's something that Nvidia have pioneered for the home market. So whining that Nvidia are marketing their own thing that they spent a lot of time and effort developing (and which demonstrably works and is awesome) is weird.

Yes, an open solution which works on all hardware is preferable, and will likely happen, but right now it is not there at all, so calling it niche and gimmicky just because it's in its infancy is a bit absurd.

Were Pixel Shaders "niche" when the GeForce 3 launched, and like 4 games used them?

shrike82
Jun 11, 2005

pixel shaders are a bad example because it's not like game developers were coding against a proprietary black box solution that only worked on Nvidia cards
