|
Seriously, CRTs were massive and most people had tiny screens with huge bezels; even at their peak, most people were using 17" or less. It's easy to see why everyone was eager to throw them away.
|
# ? Oct 23, 2020 03:23 |
|
big CRTs contained multiple kilograms of lead; they were doomed by environmental regulations as soon as an alternative came along
|
# ? Oct 23, 2020 03:23 |
|
And let's not forget the fun of dealing with convergence issues if you dared go above 17". Also text on CRTs was not always exactly the best experience.
|
# ? Oct 23, 2020 03:39 |
|
And rear-projection TVs were the only affordable way to go big, and they were awful. Fast refresh, wonderful colours and insane contrast did make CRTs pretty great for games though. 21" Trinitron crew represent
|
# ? Oct 23, 2020 03:42 |
|
I had a 34" HD CRT for the better part of a decade and I cannot stress enough that it weighed over two hundred pounds.
|
# ? Oct 23, 2020 03:46 |
|
Heavy rear end screen, fully plastic feet. You need to get to a plug in the back? EEEEEEEEEEEEEEECH across the desk
|
# ? Oct 23, 2020 03:56 |
|
It was all worth it for a good degauss.
|
# ? Oct 23, 2020 04:00 |
|
sony made a 24" widescreen PC CRT and used ones still sell for over £1000 on occasion. someone out there is unironically using a CRT by choice in the year of our lord 2020
|
# ? Oct 23, 2020 04:10 |
|
I'll always be thankful for tech that turned giant gently caress-off 100 pound televisions into svelte, slim 40 pound sideways monoliths
|
# ? Oct 23, 2020 04:18 |
|
PC gaming tech is neat but people forget how crazy smartphone tech is. modern phones are basically science fiction gizmos from two decades ago
|
# ? Oct 23, 2020 04:19 |
|
repiv posted:sony made a 24" widescreen PC CRT and used ones still sell for over £1000 on occasion I had one and it was my main monitor up to 2010, when it died. It was fantastic, but it was also huge and weighed literally 98 pounds.
|
# ? Oct 23, 2020 04:29 |
|
Are there any details out about the big navi cards that have some legitimacy behind them? Mostly curious about the power consumption.
|
# ? Oct 23, 2020 04:32 |
|
buglord posted:Are there any details out about the big navi cards that have some legitimacy behind them? Mostly curious about the power consumption. There are rumors/leaks that it'll be 300+W just like Ampere. No official word from AMD yet, but we'll find out everything on Wednesday.
|
# ? Oct 23, 2020 04:37 |
|
Is it possible they'll be good enough to actually outpace the 3080 even with all of Nvidia's bells and whistles, or are people just hoping for parity-ish?
|
# ? Oct 23, 2020 04:49 |
|
without DLSS it doesn't really matter how good they are
|
# ? Oct 23, 2020 04:53 |
|
The Big Bad Worf posted:It's absolutely terrible how the convenience and form factor of LCDs basically killed off a perfectly fine display technology. Motion resolution is still universally awful on LCDs until you start pushing really high refresh rates or invest in monitors/displays that can do black frame insertion/backlight strobing very well. I can think of exactly one IPS monitor that gives near CRT-level motion quality, if you run it at 75hz, and also accept LCD "black" in place of the true black a CRT is capable of. Just a full two decades of having to look at smeary messes of images that are harder to resolve than 360p video the moment you pan the camera or you try to track an object moving across the screen with your eyes.

Shipon posted:"perfectly fine". CRTs were massive, consumed more power than some computers, and were environmental disasters. Good riddance to that garbage.

i remember playing Natural Selection (an asymmetric HL1 mod that features a lot of super fast-paced blinking/leaping/flying aliens) back in the early 2000s on a high-refresh-rate CRT and was godly at tracking aliens or playing Fade/Lerk, usually hitting #1-#3. My parents bought me an early LCD and threw away my old one, and my skills went to basically zero because everything was a blurry mush at that refresh rate and impossible to play fast-paced stuff on. poo poo mattered a hell of a lot.

it wasn't until about ~5 years ago that high-refresh-rate LCDs became a real thing (likely in part advanced by Overwatch, which also features fast-paced movement abilities and is more e-sports oriented) and god drat it was so nice to finally have 120Hz back. That said, the first high-refresh-rate LCD i bought was a TN panel and man, the downgrade in color quality over my IPS was SO loving bad I ended up giving it away not too long after and got a Samsung CFG70, which was one of the more affordable ~very colorful~ VA 144hz monitors at the time.

Now on a Viewsonic vx2758-2kp-mhd which is OK: good colors, low latency, flat screen, but it has some displayport blinking issues at times.
|
# ? Oct 23, 2020 07:17 |
|
Sagebrush posted:without DLSS it doesn't really matter how good they are It's always fascinating to me that even after physx, hairworks, g-sync and whatever else Nvidia exclusive bullshit, people are still going all in on "if it doesn't have X proprietary Nvidia stuff, it's not worth using".
|
# ? Oct 23, 2020 07:24 |
|
Jeff Fatwood posted:It's always fascinating to me that even after physx, hairworks, g-sync and whatever else Nvidia exclusive bullshit, people are still going all in on "if it doesn't have X proprietary Nvidia stuff, it's not worth using". To be fair, this seems like the first one of theirs that's actually worth using. We'll just have to wait and see if it gets widespread adoption before Microsoft or whoever makes a hardware-agnostic equivalent.
|
# ? Oct 23, 2020 07:48 |
|
Xaris posted:it wasn't until about ~5 years ago that high-refresh-rate LCDs became a real thing (likely in part advanced by Overwatch
|
# ? Oct 23, 2020 07:54 |
|
how do/did people game on 4k without DLSS, or in games that don't have DLSS?
|
# ? Oct 23, 2020 07:54 |
|
Jeff Fatwood posted:It's always fascinating to me that even after physx, hairworks, g-sync and whatever else Nvidia exclusive bullshit, people are still going all in on "if it doesn't have X proprietary Nvidia stuff, it's not worth using".

PhysX and hairworks were obviously gimmicks. VRR is a huge huge deal, and if Freesync didn't exist I would be saying that G-Sync support is a dealbreaker in 2020. Getting a monitor without either one is definitely dumb. DLSS is another technology like G-Sync that can increase the quality and performance of any game (for which it's implemented) at essentially no cost. It's a really huge change, far bigger than dumb poo poo like hairworks.

gradenko_2000 posted:how do/did people game on 4k without DLSS, or in games that don't have DLSS?

worse

it's obviously perfectly possible to run at native 4k, and it will even look a little sharper than a DLSS image, but comparing a 3080 and whatever AMD comes out with that performs the same, you'll have the ability to get similar framerates at 4k native on either -- or turn on DLSS on nVidia and get up to like a 50% boost with negligible image quality loss. The only way of achieving that on the AMD card is going to be through a much greater drop in quality. It's a no-brainer.

Sagebrush fucked around with this message at 08:02 on Oct 23, 2020 |
# ? Oct 23, 2020 07:58 |
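For context on where that DLSS framerate headroom comes from: the card renders internally at a reduced resolution and the network upscales to the output size. A rough sketch, assuming the commonly cited DLSS 2.x per-axis scale factors (Quality 2/3, Balanced 0.58, Performance 1/2) — the factors and the helper function here are illustrative assumptions, not anything from an official SDK:

```python
# Per-axis render-scale factors (assumed, commonly cited for DLSS 2.x modes).
DLSS_MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
}

def internal_resolution(width, height, mode):
    """Internal render resolution for a given output size and DLSS mode."""
    scale = DLSS_MODES[mode]
    return round(width * scale), round(height * scale)

# At 4K output, Performance mode shades only a quarter of the pixels,
# which is where the big framerate boost comes from.
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```

So "4K with DLSS Performance" is, in raw shading cost, roughly 1080p rendering plus the upscale overhead.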
|
gradenko_2000 posted:how do/did people game on 4k without DLSS, or in games that don't have DLSS? Slowly, and without raytracing.
|
# ? Oct 23, 2020 08:00 |
|
gradenko_2000 posted:how do/did people game on 4k without DLSS, or in games that don't have DLSS?

poorly

I mean, there’s a lot of indie/competitive titles you can drive at high frame rates easily, or that don’t care about 60 FPS locked, and up until a year or two ago 60 Hz was all you could get at 4K (especially at any sort of mainstream price, 4K120 was a dream). But in AAAs the answer has always been “poorly”, the 2080 Ti was the first card to really do 4K60 locked in a lot of AAA titles. Before that you upscaled or something.

It’s been one of those things where people have been pronouncing “maybe next generation” since the First 4K Card, which was of course the 980 Ti, but it’s been a moving target since games keep getting tougher. 1080 Ti was close, 2080 Ti finally hit 4K60.

Paul MaudDib fucked around with this message at 08:04 on Oct 23, 2020 |
# ? Oct 23, 2020 08:01 |
|
gradenko_2000 posted:how do/did people game on 4k without DLSS, or in games that don't have DLSS? 4k@60 or 4k@30, dual GPU, significantly lower settings, etc. imo 4k gaming was simply out of reach until relatively recently and abysmal until very recently
|
# ? Oct 23, 2020 08:03 |
|
It's wild reading this thread while still being on a 1060 and 590 @ 1080p and being perfectly happy with it all. Though I do really want some ray tracing cores sooner rather than later, but the madness of trying to acquire one is beyond my means & desire. LethalGeek fucked around with this message at 08:43 on Oct 23, 2020 |
# ? Oct 23, 2020 08:13 |
|
I gamed at 4K/60 on a vanilla 2080 for 1.5 years. It was totally fine and I didn’t have to “significantly” reduce settings. There are always settings in every AAA game that do very little and tank performance by 20-30%. I can’t describe how weird it is to do something for 18 months and then have people repeatedly say it’s impossible the entire time. As far as I can tell the problem is that almost no one games at 4K and benchmarking online is always max settings, so people have this super unrealistic view of the resolution that’s essentially based on unoptimized settings no one would use, combined with hearsay. 4K/60 has been fine for a while. Ray tracing, that’s another matter.
|
# ? Oct 23, 2020 08:36 |
|
I had a 4k monitor at 60hz that I used for a while, but it was always difficult to get a good experience at max settings even with a 1080Ti. I moved to a 1440p/165hz G-Sync display and it was a complete world of difference, and now a 60hz monitor just isn't enough. Give me higher refresh 1440p over 4k any day.
|
# ? Oct 23, 2020 08:51 |
|
Yeah, I run a 1440p/144Hz monitor with an 8700k and a 1080Ti and it's *juicy*. In modern envelope-pushing games I feel like I might need to move up to a 3080, but A) I don't play those games that often and B) holy gently caress lol, getting a 3080 in Canada. it's bad enough trying to get one in the US, but Nvidia's allocations for the Great White North so far are absolutely abysmal despite the average Canadian having more relative disposable income than the average American.

Honestly, if I didn't play EVE occasionally, I'd still be on a 1080p144 screen because visual and temporal quality mean more to me than resolution (at LCD native). The 1440p was for more visibility in EVE and a larger field of view, but I can't bring myself to play below native resolution on an LCD, so everything else I play is at 1440p to the detriment of either framerate or visual quality.
|
# ? Oct 23, 2020 08:59 |
|
You guys are missing the point 8K GAMING is here!* *Subject to availability. Subject to $30,000 displays. 8K Gaming may or may not actually be here. Framerates can go down as well as up
|
# ? Oct 23, 2020 09:11 |
|
After jumping from a 24" 1080p at 60hz to a 27" 1440p at 144hz I definitely prefer the latter. At this point when I fire up the old machine the monitor feels a little small and crowded, although it's still better than when I have to use a laptop. I like gaming at 1440p, but I absolutely love it for normal day to day browsing and working on documents - the increased screen real estate is fantastic.
|
# ? Oct 23, 2020 09:15 |
|
8k gaming isn't just the resolution, it's also the price point!
|
# ? Oct 23, 2020 09:15 |
|
Sagebrush posted:PhysX and hairworks were obviously gimmicks.

Of course. Freesync exists because VRR is a huge deal. Not because G-sync is a huge deal. If G-sync was the only tech in town, VRR would still be niche because the ~200usd surcharge to enter a proprietary walled garden will keep it that way. My point here was that a closed proprietary approach never works for mass adoption. DLSS will never be anything more than another G-sync and will be overtaken by a free standard if one ever comes out. Same with raytracing. That's why I find it fascinating that people are so ready to declare everything else useless when Sony and Microsoft don't seem to think like that. Hell, Sony and Microsoft themselves have tried proprietary walled garden poo poo multiple times on different markets and they've either crashed and burned or made a cool little niche that they make a buck off of.

edit. I should have clarified/used better examples, but my point was also that Hairworks and PhysX probably wouldn't have been gimmicks if they were something more open, instead of Nvidia loving around and trying to make a quick buck off of a cool idea. I mean, physics in games and swaying hair are both staples of game tech now, and there's no reason they couldn't be big Nvidia trademarks if Nvidia had an actual interest in innovation and not just in ultra-capitalizing on everything they do. They're not Apple.

double edit. also it's going to be very fitting if Cyberpunk is the last big RTX title and after that the AMD approach takes over, because it's supported by everything and not just thousand-dollar graphics cards that also need another proprietary tech in conjunction to make it run acceptably (RE: Witcher 3 / CDPR / Hairworks)

Jeff Fatwood fucked around with this message at 09:52 on Oct 23, 2020 |
# ? Oct 23, 2020 09:16 |
|
Taima posted:I gamed at 4K/60 on a vanilla 2080 for 1.5 years. It was totally fine and I didn’t have to “significantly” reduce settings. There are always settings in every AAA game that do very little and tank performance by 20-30%.

but a stupendously overclocked 980Ti. where's my new cards lisa, i need to replace this poo poo and get rid of windows already
|
# ? Oct 23, 2020 09:24 |
|
you're more than doubling the pixel count moving from 1440p to 4K, so the question is what frame rate were you hitting at 1440p (or lower)
|
# ? Oct 23, 2020 09:27 |
|
shrike82 posted:you're more than doubling the pixel count moving from 1440p to 4K Yeah, the raw pixel difference between 1080p and 1440p is 1.78x, and then the difference between 1440p and 4K is another 2.25x. Resolution bumps are pretty fuckin huge
|
# ? Oct 23, 2020 09:29 |
|
exquisite tea posted:8k gaming isn't just the resolution, it's also the price point! Looks like displays are only $2,000-$5,000 USD! Except that freak-sized one that LTT had. Isn't that one like $30K?
|
# ? Oct 23, 2020 09:35 |
|
Andrigaar posted:Looks like displays are only $2,000-$5,000 USD! Based on my monitor upgrade schedule I look forward to being at 4K in 2027 and that's fine by me
|
# ? Oct 23, 2020 09:36 |
|
Jeff Fatwood posted:Of course. Freesync exists because VRR is a huge deal. Not because G-sync is a huge deal. If G-sync was the only tech in town, VRR would still be niche because the ~200usd surcharge to enter a proprietary walled garden will keep it that way. My point here was that a closed proprietary approach never works for mass adoption. Yeah Nvidia, stop trying to make a quick buck from technology you developed. Technology can be successful for a company without every potential customer having access to it. We have plenty of examples of this. Not everything developed is intended to be ubiquitous and nor should it be, because designing for ubiquity will often mean steep compromises. Saying Nvidia doesn't have an actual interest in innovation is patently false. And your comments about real-time ray-tracing here betray your ignorance about how the graphics APIs implement it in games.
|
# ? Oct 23, 2020 10:08 |
|
Jeff Fatwood posted:DLSS will never be anything more than another G-sync and will be overtaken by a free standard if one ever comes out. Same with raytracing. But right now there literally is nothing else like it, because it's something that Nvidia have pioneered for the home market. So whining that Nvidia are marketing their own thing that they spent a lot of time and effort developing (and which demonstrably works and is awesome) is weird. Yes, an open solution which works on all hardware is preferable, and will likely happen, but right now it is not there at all, so calling it niche and gimmicky just because it's in its infancy is a bit absurd. Were Pixel Shaders "niche" when the GeForce 3 launched, and like 4 games used them?
|
# ? Oct 23, 2020 10:16 |
|
pixel shaders are a bad example because it's not like game developers were coding against a proprietary black box solution that only worked on Nvidia cards
|
# ? Oct 23, 2020 10:24 |