priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

Vasler posted:

Where are you finding the 2070 black for $700 CAD? I've only been able to find that card for $800+ CAD.

It’s actually $659 CAD after rebate at Canada Computers:

https://ca.pcpartpicker.com/product/mYYLrH/evga-geforce-rtx-2070-8gb-black-video-card-08g-p4-1071-kr

This is the barest-bones version, with 2-slot height and no LEDs etc


orcane
Jun 13, 2012

Fun Shoe

Vasler posted:

Thanks for all your help guys. I do have one more question (I hope this is the last one).

Looking at the 1070 Ti prices (used), they're very similar to what a new 2060 costs. What are the pros/cons of getting a used 1070 Ti vs a new 2060?

This sounds like a stupid question but there must be some reason why used cards are being recommended over new ones.

Is the 1070/2060 going to be a short-term solution until a new generation comes out or will it last for a while?
TLDR: It's because the current Nvidia generation occupies about the same spot in price/performance as the old one, so you're not getting 1070 Ti performance out of a 2060 for 1060 pricing, you roughly get 1070 Ti performance out of a 2060 for 1070 Ti money.

Long answer: The RTX cards have less VRAM for the "same" price (except the 2070/1080, which both have 8 GB), which has the potential to suck in the long run, and people don't like that in a $350+ GPU. The new tensor cores mainly do two things for games: enable raytracing effects (RTX) at playable framerates, and increase performance at higher resolutions with deep learning supersampling (DLSS), a form of AA where a machine learning algorithm upscales a faster, lower-resolution render to look closer to the native-resolution image. The caveat is that few games do RTX at this point, and there might not be enough games for the feature to matter before a refresh generation does RTX better/faster. And DLSS is very limited because Nvidia has to train the algorithm for every game and resolution, and apparently also target framerate.

Used GTX 1070 Ti: You're buying a used card. Modern Nvidia GPUs have protections (e.g. no critical overvolting or BIOS flashing), but a card can still fail for other reasons. However, a 1070 Ti that came with 3 years of (transferable!) warranty still has over 1.5 years left, because the cards only came out in November 2017. It might still be a bigger hassle than a new card you bought yourself. The 1070 Ti comes with 2 GB more VRAM, which may or may not matter during the card's lifetime in your system.

The RTX 2060-series has the tensor cores and has advantages in low-level APIs (DX12/Vulkan) and HDR over the 1070 Ti, along with shader improvements. Also, it's better at GPU compute if you do more than play games. And as a minor point, the RTX 2060 will get driver support for longer and possible optimizations in the future that the 1070 Ti will not get anymore (this was not really a thing in 1000-series/Pascal over 900-series/Maxwell so I wouldn't count on it).

Whether it's a short-term solution depends on your demands. Either card should play modern games at max details up to 1440p at 60fps (the longer you keep the card, the more you'll have to turn down details or resolution); if you need more, you need a faster card. The next big jump in performance requirements is probably another year+ out though (new console generation launches rumoured in 2020). In terms of the tensor features, I expect a die shrink to let Nvidia get cheaper or faster RTX products out the door by the time RTX is more widely supported, so I wouldn't specifically buy a 2060 for that. Whether this will happen soon is unknown; some people assume Nvidia will move the RTX line to a smaller process within a year, but IMO as long as AMD can't really fight back, they might sit on their mature process and try to milk enthusiasts for a while longer - used Pascal cards will not last forever.

orcane fucked around with this message at 23:38 on Feb 19, 2019

Vasler
Feb 17, 2004
Greetings Earthling! Do you have any Zoom Boots?

orcane posted:

TLDR: It's because the current Nvidia generation occupies about the same spot in price/performance as the old one, so you're not getting 1070 Ti performance out of a 2060 for 1060 pricing, you roughly get 1070 Ti performance out of a 2060 for 1070 Ti money.

Long answer: The RTX cards have less VRAM for the "same" price (except the 2070/1080, which both have 8 GB), which has the potential to suck in the long run, and people don't like that in a $350+ GPU. The new tensor cores mainly do two things for games: enable raytracing effects (RTX) at playable framerates, and increase performance at higher resolutions with deep learning supersampling (DLSS), a form of AA where a machine learning algorithm upscales a faster, lower-resolution render to look closer to the native-resolution image. The caveat is that few games do RTX at this point, and there might not be enough games for the feature to matter before a refresh generation does RTX better/faster. And DLSS is very limited because Nvidia has to train the algorithm for every game and resolution, and apparently also target framerate.

Used GTX 1070 Ti: You're buying a used card. Modern Nvidia GPUs have protections (e.g. no critical overvolting or BIOS flashing), but a card can still fail for other reasons. However, a 1070 Ti that came with 3 years of (transferable!) warranty still has over 1.5 years left, because the cards only came out in November 2017. It might still be a bigger hassle than a new card you bought yourself. The 1070 Ti comes with 2 GB more VRAM, which may or may not matter during the card's lifetime in your system.

The RTX 2060-series has the tensor cores and has advantages in low-level APIs (DX12/Vulkan) and HDR over the 1070 Ti, along with shader improvements. Also, it's better at GPU compute if you do more than play games. And as a minor point, the RTX 2060 will get driver support for longer and possible optimizations in the future that the 1070 Ti will not get anymore.

Whether it's a short-term solution depends on your demands. Either card should play modern games at max details up to 1440p at 60fps (the longer you keep the card, the more you'll have to turn down details or resolution); if you need more, you need a faster card. The next big jump in performance requirements is probably another year+ out though (new console generation launches rumoured in 2020). In terms of the tensor features, I expect a die shrink to let Nvidia get cheaper or faster RTX products out the door by the time RTX is more widely supported, so I wouldn't specifically buy a 2060 for that. Whether this will happen soon is unknown; some people assume Nvidia will move the RTX line to a smaller process within a year or so.

This is fantastic. Thank you so much for taking the time and effort to post this. It's a shame that things shook out the way they did, as I really liked how my 970 was reasonably priced for pretty great performance.

Since I game in 1080p and have no plans to upgrade in the near future, I wonder how much the 6 GB vs 8 GB VRAM will matter. If I stick with 1080p what would you recommend? Searching for a 1070 Ti or grabbing a 2060? Which specific one should I get?

Again, apologies for these basic questions. I really appreciate your response and everyone else's that has responded to me. Thank you so much!

Dominoes
Sep 20, 2007

Dominoes posted:

Much apprec. I noticed you can buy directly from Nvidia. Any reason to buy a branded (Gigabyte Evga etc) one off Amazon instead?
Thanks again dudes; bought.

FuzzySlippers
Feb 6, 2009

At high resolutions (3440×1440) does the CPU have much impact on gaming? I've still got a Haswell i7 4770k and I'm thinking of buying a 2080. Looking at benchmarks it seems like the 4770k is still fine, but few sites benchmark back that far, and most do it at low resolutions to exaggerate the CPU impact.

Craptacular!
Jul 9, 2001

Fuck the DH

Setzer Gabbiani posted:

Vulkan's problems have nothing to do with Linux, and everything to do with the fact that the AAA industry abandoned Khronos API's around the time of UT2k4, and they reinforced this decision around the time when Brink and Rage were things to point at. id is the only dev that voluntarily uses OGL/Vulkan, something that is more than likely a Carmack/Sousa clause that Bethesda probably can't wait to weasel out of at some point. Every modern PC release has been DX11 all the way down, unless it's sidelined with a DX12 renderer that adds absolutely nothing

I meant in the sense that if Nvidia can divorce ray-tracing from DX12 with their own proprietary stuff it would do a lot to improve operating systems not called Windows, a world where presently everyone favors AMD no matter how bad their product is.

I am fierce about ending Microsoft's OS dominance among people who care about games. Direct3D is just a hurdle and the industry needs to detoxify itself from it, and if MS is going to make DX12 a huge pain in the rear end tied into their attempts to also take over the digital downloads space then it's a good time as any for the industry to move to something that can work on another OS.

Stickman posted:

Honestly, the most disappointing aspect of the 2060 is that its existence means that we won't get a 1070 Ti-level Turing GTX card.

It's essentially a 1070; the VRAM is the most disappointing part.

That said the sheer number of 6GB cards in circulation will mean that games won't shut the door on them for some time to come.

Craptacular! fucked around with this message at 00:14 on Feb 20, 2019

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

FuzzySlippers posted:

At high resolutions (3440×1440) does the CPU have much impact on gaming? I've still got a Haswell i7 4770k and I'm thinking of buying a 2080. Looking at benchmarks it seems like the 4770k is still fine, but few sites benchmark back that far, and most do it at low resolutions to exaggerate the CPU impact.

Depends what you care about. You'll see much higher average framerates, but your bad frametimes, where you're getting CPU bound, probably won't change much. How much is going to depend on the game; how much it bothers you will depend on the game and you.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

priznat posted:

Oh drat that’s gettin expensive

"Maxing out" a high-refresh display is a fool's errand. I ran a 3418DW off of a 970 for the better part of a year and was perfectly happy since 40fps GSync is more than playable. You'll be happy with it too.

It's best to look at synched displays as something you buy to age *with* your hardware, not another component that needs to be fed with upgrades to keep it ~operating optimally~.

VelociBacon
Dec 8, 2009

I think a 60fps minimum is advisable with a high refresh rate monitor, even with G-Sync. Otherwise you might as well connect your 8-year-old nephew's Xbox to it.

Indiana_Krom
Jun 18, 2007
Net Slacker
Yeah, the feel of a game at 180+ fps is absolutely a lot smoother and snappier than at 40 fps, even if it's a G-Sync monitor that does it all without tearing or lag.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I'm not disputing that - I'm just saying that telling someone to go from a $450 GPU to a $630 GPU is a jump, especially when they've already dropped ~$750 on a monitor.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

BIG HEADLINE posted:

"Maxing out" a high-refresh display is a fool's errand. I ran a 3418DW off of a 970 for the better part of a year and was perfectly happy since 40fps GSync is more than playable. You'll be happy with it too.

It's best to look at synched displays as something you buy to age *with* your hardware, not another component that needs to be fed with upgrades to keep it ~operating optimally~.

Yeah I have a 970 now and will see how it is. If it sucks then a 2060/2070 or possibly 2080 (but a bit on the pricey side) will be the next upgrade. Guess I am locked in to nvidia somewhat now though, hope navi isn’t a wonder gpu :haw:

Cygni
Nov 12, 2005

raring to post

Adding to the bad news is that you'll be dropping settings to get a consistent 100fps even with a goddamn 2080 Ti in some titles (AC: Odyssey, GTAV).

Nothing is more disheartening than splashing out for a big GPU/monitor and seeing 30fps on the FPS counter. So yeah, i wouldn't chase the "maxing" dragon.

VelociBacon
Dec 8, 2009

9900k and 2080 Ti here, both overclocked, and not reaching 144fps at 1440p maxed on some stuff (if you count maxing the resolution scaler as maxing). It's a kick in the balls but that's our hobby.

Craptacular!
Jul 9, 2001

Fuck the DH
I was a "highest preset available" guy but I've absolutely learned to live with lower settings since going to 1440/144, often just stepping down settings in GFE until a profile hits my desired framerate (although in some games I look at detailed performance guides). It's very nice that most of the time the difference between highest and high is indistinguishable.

The day this 1070 needs Medium-Low in the majority of games to hit my target framerate, though... oof. I've pretty much resigned myself to never owning a Switch because I'll need that money for a GPU upgrade someday.

Admiral Joeslop
Jul 8, 2010




I just need to power 1080p/144hz with whatever card I get. I don't plan on upgrading this ASUS VG248QE anytime soon. Hopefully will get some more use out of this 970 once I upgrade the CPU and do a fresh Windows install.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Craptacular! posted:

The day this 1070 needs Medium-Low in the majority of games to hit my target framerate, though... oof. I've pretty much resigned myself to never owning a Switch because I'll need that money for a GPU upgrade someday.

uh... isn't that pretty much now? I don't know how you define "majority of games" but the 1070 hasn't exactly been a spring chicken at 1440p for a while now. I owned one and gamed on it at 1440p since launch day (just got a 2080).

It's kind of irrelevant though. PC gaming is, in all probability, going to die entirely when the next batch of consoles is released. I only got the 2080 to hold me off until then. After that, I'm probably done with PC gaming forever. Once consoles can do 4k/60 it's just over. It's been a good run; I've been in the PC gaming market since floppy disks were actually floppy, but even the current gen consoles severely threaten the viability of PC gaming, and 99.9% of people couldn't give less of a gently caress about 144fps.

Taima fucked around with this message at 03:48 on Feb 20, 2019

Setzer Gabbiani
Oct 13, 2004

poo poo's bad when hot garbage like DLSS is being promoted as a cool new development, instead of, y'know, cards powerful enough to handle poo poo maxed at their native resolution without needing a hardware accelerated Topaz plugin to fake it

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Setzer Gabbiani posted:

poo poo's bad when hot garbage like DLSS is being promoted as a cool new development, instead of, y'know, cards powerful enough to handle poo poo maxed at their native resolution without needing a hardware accelerated Topaz plugin to fake it

Completely agree.

Truth is, when I got this 2080, I moved my entire gaming rig out to the living room permanently to game on a 75 inch, 4k TV. My home office gaming/productivity room is now powered entirely off a Macbook Pro connected to external monitors, and I don't miss having a gaming PC in that room at all. My gaming will be on gamepad only via TV, though I am loosely considering buying something ridiculous like the Roccat Sova to use a mouse in the living room.

So even now my classic PC gaming setup is dead entirely, in early 2019. The GPU market can barely come up with worthwhile price/performance, and I'm sorry but ray tracing/DLSS is dead in the water until further notice (and I say that as a 2080 owner). If you want more evidence of this, take a look at the Nexus walkthrough of RTX improvements in the new Metro (spoiler: it's not kind on the RTX technology it uses).

I also own a PS4 Pro, and the fact is, a lot of the newer games are "good enough" for me at 4k/30fps and look amazing. And that's with the monumentally lovely PS4 Pro hardware. Given a new console, we are looking at the complete ruination of PC gaming entirely. I feel like Steam intuitively knows this, which is why they have clearly just been trying to ride out the current market climate until such a time that they are irrelevant.

PCs will have their arena in the productivity space, and that will have to be enough.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
spicy hot take to counter the spicy hot take: consoles other than portable things like the switch are dead already, there's just no point now that they're just bad pc's with old and lovely pc hardware that you can't upgrade and can't do productivity tasks on. you can get the same games on pc anyway.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Yeah, objectively speaking PC gaming is the strongest it's ever been. Console exclusives are almost entirely dead, the rise of phones and tablets has annihilated the casual gaming market, and IMO one of the most overlooked factors is that those devices have also pushed a significant number of people off laptops and back to desktops, and once you have a desktop it's hard to argue it shouldn't also be a gaming machine.

People are reading way the gently caress too much into waiting for a new kind of GPU to hit a die shrink, get content, and come into its own.

The Gunslinger
Jul 24, 2004

Do not forget the face of your father.
Fun Shoe

Taima posted:

It's kind of irrelevant though. PC gaming is, in all probability, going to die entirely when the next batch of consoles is released.

Man I've heard this so many times over the years. Not to get into console war silliness (love mine) but I'll never give up PC gaming. It's the inputs, flexibility, game pricing, open platform and so much more that keep me coming back. Ironically I see more people building gaming desktops than ever thanks to streamers.

It's a lovely generation of video cards for price to performance combined with features that don't really have adoption yet, it'll pass.

Cygni
Nov 12, 2005

raring to post

Computer Gaming: Dead since 1983 and loving it

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Taima posted:

Given a new console, we are looking at the complete ruination of PC gaming entirely. I feel like Steam intuitively knows this, which is why they have clearly just been trying to ride out the current market climate until such a time that they are irrelevant.

People keep saying this, but I think it's far more likely that consoles are going to start emulating PCs. Instead of evolutionary platform upgrades every ~7 years, I see consoles becoming modular and/or upgradable. I think the "Pro" consoles were a test case to see if people would be willing to re-buy something they already owned to have a "better experience" than other players who *weren't* willing or able to do so. To make things even more evil, seeing as Sony and Microsoft both have the hardware support frameworks in place for their existing consumer electronics/computers, I could see them taking a cue from Apple and making user upgrades impossible or prohibited by technology akin to Apple's "T2" chip.

And before you hand-wave this notion - imagine the shittiest little fucknut console gamer you possibly can, and how much stock they place on being ~good at vidya gaemz~ and how much they'd be willing to spend/bug their mom for to have a game that's just a bit smoother and more responsive than a stock console owner.

PC gaming's always got a future because it's a value added proposition. Sure, you need to spend a good amount of money (but far less than most 'adult' hobbies) to make the experience truly amazing, but so long as you *can* play games on a PC, people will.

BIG HEADLINE fucked around with this message at 04:55 on Feb 20, 2019

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I get the apprehension, I really do.

All I can say is that I'm about as hardcore of a PC gamer as you could possibly get, historically, and I'm gaming on a 4K TV in my living room. I don't think it's a spicy take. I don't think it's a far leap. The market is speaking for itself in a way it never has, and people like me - the mainstays of PC gaming through thick and thin - well, a lot of us are now one step away from dismissing the market entirely. And again, I literally say this as a dude who bought a 2080 YESTERDAY. As in, Monday, the day before today.

I feel that it's incredibly disingenuous to say "oh they've been saying that since forever". Ok, sure, but they said the exact same thing about steam engines. It's relevant until it's not. The people who are invested, get to doubt the naysayers, until the writing is on the wall. The privileged position is not the people arguing for change, it's the people who get to say "there was no change" year after year.

The Gunslinger posted:

Man I've heard this so many times over the years. Not to get into console war silliness (love mine) but I'll never give up PC gaming. It's the inputs, flexibility, game pricing, open platform and so much more that keep me coming back. Ironically I see more people building gaming desktops than ever thanks to streamers.

It's a lovely generation of video cards for price to performance combined with features that don't really have adoption yet, it'll pass.

Yep. I get it. Unfortunately while this seems like a good argument to you personally (and perhaps even me) it means absolutely nothing to the world at large. And while you're doing your best to afford that new hotness ultrawide GSYNC monitor, the world has moved on.

PC gaming has been saved recently because its competition is basically poo poo. Incredibly old console architectures, or in the case of Xbox One X, just a lovely environment in general. I dunno, believe what you want. I've been PC gaming for 20 loving years. But I think it's done soon, and we'll see who is right.

Taima fucked around with this message at 05:00 on Feb 20, 2019

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
oh come on now. consoles and pc's are pretty much the same thing now dude, it's the same commodity hardware and the same graphics API's and the same game engines and to some extent even the same loving operating systems and companies (read: microsoft). who cares if you play on a tv in your livingroom or on a monitor at a desk or if you buy your games on app store a rather than app store b or if you play using a controller or a mouse/keyboard, it's all the same games and the same market. i don't think there's a strong argument to be made that the higher end enthusiast segment of the market will suddenly disappear for some mysterious reason or another.

you're just getting old, grandpa.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord

BIG HEADLINE posted:

PC gaming's always got a future because it's a value added proposition. Sure, you need to spend a good amount of money (but far less than most 'adult' hobbies).

I was talking to a buddy about this and it’s kinda wild how cheap it is if you’re comparing it to car modding or something.

Re the dying pc gaming thing: if you consider Best Buy to be even a slightly reliable indicator of the PC gaming market, go in there now and see the tons of components, peripherals, and gaming PCs that weren’t there in the mid 2000s-early 2010s.

GPUs are indeed at a sucky place though, and it hurts because I want to replace my GTX 1070 so I can play Metro at higher settings, but yuck, the RTX cards look like bad value right now.

Cygni
Nov 12, 2005

raring to post

i could see game streaming having an impact on both consoles and PCs pretty soon.

but thats another thing people have talked about for decades but never seems to get over the hump. i remember using a Geode based Citrix thin client in like 99 in a library and thinking it was gonna take over the world, lol.

Craptacular!
Jul 9, 2001

Fuck the DH

Taima posted:

uh... isn't that pretty much now? I don't know how you define "majority of games" but the 1070 hasn't exactly been a spring chicken at 1440p for a while now.

I mostly play Destiny 2, Dota, League, etc. My expectations aren't that high, and with some tweaking I can get 80 FPS in Monster Hunter World which is certainly good enough for me. AC Origins craters it, but it's the exception more than the rule. Like the next game I'm excited to play on this thing is Yakuza Kiwami. Then DOA6. Then DMC5. This isn't exactly Frostbite Engine stuff.

I only bought this card in August 2017 and I'm not exactly raring to replace it already, especially since the current gen is a mini-shitshow and the [Jim Sterling whine voice] Triple-Ehhhh game industry is a huge category 5 shitstorm of broken dreams and late-stage capitalism. Why upgrade for more lootbox battle royale bullshit?

Craptacular! fucked around with this message at 06:13 on Feb 20, 2019

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender
PC/Console gaming isn't dead, it just won't be on dedicated hardware. We'll all be launching a game streaming app on our STB of choice and then pick up where we left off on our browser or phone.

That's what should be keeping Nvidia/AMD/MS/Sony/Intel up at night. Why spend $400 on a new console or $1k+ on a gaming PC when I can just stream a game to my TV using my Chromecast and pick it up later on my Chromebook.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

Krailor posted:

PC/Console gaming isn't dead, it just won't be on dedicated hardware. We'll all be launching a game streaming app on our STB of choice and then pick up where we left off on our browser or phone.

That's what should be keeping Nvidia/AMD/MS/Sony/Intel up at night. Why spend $400 on a new console or $1k+ on a gaming PC when I can just stream a game to my TV using my Chromecast and pick it up later on my Chromebook.

This is where a lot of it is heading, with things like PlayStation Now and the possible Google streaming service too. OnLive was just ahead of its time, and there weren't enough datacenters spread around to get the latency down for a lot of people.

There will be subscription tiers for quality, so you get more compute resources and bandwidth accordingly.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Krailor posted:

That's what should be keeping Nvidia/AMD/MS/Sony/Intel up at night.

nVidia's already tried to get out in front of this with ~GeForce Now~, but they're too early. I also think the barrier to entry with regards to a streaming-only system is tied to fixing our antiquated data infrastructure. 5G probably won't be up to snuff in its first iterations, and even 10 or 100Gbit internet is limited by 1) the cost of the hardware on both the end user and ISP side, and 2) latency.

BIG HEADLINE fucked around with this message at 10:12 on Feb 20, 2019

Arzachel
May 12, 2012

Krailor posted:

PC/Console gaming isn't dead, it just won't be on dedicated hardware. We'll all be launching a game streaming app on our STB of choice and then pick up where we left off on our browser or phone.

That's what should be keeping Nvidia/AMD/MS/Sony/Intel up at night. Why spend $400 on a new console or $1k+ on a gaming PC when I can just stream a game to my TV using my Chromecast and pick it up later on my Chromebook.

Even if you magically handwave all the infrastructure issues, a good chunk of games are always going to have direct controls that feel like rear end with 20+ms of input lag introduced.

Also, I find the claims that 4k TVs are killing PC gaming funny when my PS4 has been gathering dust ever since I hooked up a laptop, and now an SFF PC, to said TV.

Arzachel fucked around with this message at 09:28 on Feb 20, 2019

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Arzachel posted:

Even if you magically handwave all the infrastructure issues, a good chunk of games are always going to have direct control that feels like rear end with 20+ms of input lag introduced.

A majority of people most likely play with 100ms of input lag or more already since that's about where you end up with a 60Hz monitor and v-sync triple buffering, even if your monitor has no input lag. The new Smash Bros for the Switch has ~90ms of input lag. It's way less noticeable than you'd think.

craig588
Nov 19, 2005

by Nyc_Tattoo
100 ms of input lag seems insane. I'm disabled and my muscles move at about a 50 ms delay, and that's extremely debilitating. I don't know how most people could deal with a 100 ms delay. I'd expect it to be more like 16 or 33 ms.

TorakFade
Oct 3, 2006

I strongly disapprove


The whole console ecosystem tends to be much easier to work with, but also much more limited in terms of inputs, deals, etc.

I mean, a console game tends to be about 20€ more expensive than a PC game. A medium-weight gaming PC can cost you about 800-1000€ vs 300-400€ for a console, true, but when you're spending an extra 20€ on each game, the scales tip at around 30 games. During the system's lifetime a "gamer" will probably buy more than that; it's barely 5 games a year.

Plus a well-built PC will last you far longer than a console will, with basic upgrading (changing the GPU 3-4 years in works wonders, and even with the crazy inflated prices we have now, if you're aiming for console parity you're going to target mid-range GPUs, so 300-400€ or so).

I don't think PC gaming is dead, and that's from someone who recently built a 1500€ PC to replace the aging one I had, and stopped playing the PS3 / didn't even think about getting a One or PS4 because I was tired of the PSN, paying a subscription to play online, the excessive cost of physical games, and not having access to a whole lot of games that just don't play well with a controller.

Consoles are great for: people who aren't very tech savvy and don't want to bother with building/upgrading/messing with drivers etc, people who play just a few games, or people who don't have enough disposable income / don't want to spend a lot and are OK with "reduced" graphics quality (when consoles will do 4k/60, PCs will probably be on 8k/60 if not 120 :) )

PCs are great for: people who really really love vidyagames and will buy a lot of them, people who love tinkering with stuff, people who have disposable income and want poo poo to look as good as possible and don't want to "settle" for lesser options, and people who love the kind of games that just don't work on a couch/controller setup (and there are many).

Both camps are fully "valid" too, no shame in being a console or a PC gamer, or heck even both... it's just that neither is dying or going anywhere anytime soon, not until we can get 1 Gbps fiber optic to every house and could maaaybe potentially stream games from a central server.
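To make the "scales tip at around 30 games" point above concrete, here's a quick back-of-envelope sketch. The midpoint figures (900€ PC, 350€ console) are my assumptions taken from the ranges in the post, not market data:

```python
# Break-even sketch for the PC-vs-console cost argument above.
# All figures are rough midpoints of the numbers in the post, not market data.
pc_cost = 900          # midpoint of the 800-1000€ "medium-weight gaming PC"
console_cost = 350     # midpoint of the 300-400€ console price
premium_per_game = 20  # typical console game price premium in €

hardware_gap = pc_cost - console_cost              # 550€ upfront difference
breakeven_games = hardware_gap / premium_per_game  # games needed to recoup it

print(f"Break-even after about {breakeven_games:.0f} games")
# ~28 games, i.e. roughly the "around 30" figure - about 5 games a year
# over a 6-year system lifetime
```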

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

craig588 posted:

100 ms of input lag seems insane. I'm disabled and my muscles move at about a 50 ms delay and that's extremely debilitating. I don't know most people could deal with a 100 ms delay. I'd expect it to be more like 16 or 33 ms.

It might sound crazy but that's just how it is. See for example blurbusters (graph at the bottom), Dan Luu, or Battle(non)sense's button-to-pixel videos on YouTube. What's typically measured here is time from input device actuation to first visible on-screen reaction (typically measured in the center of the screen - monitors refresh top to bottom over the duration of a frame; they don't update the entire picture at once). I don't think you can get this much below 50ms at all with a 60Hz monitor.
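For anyone wondering where a number like that comes from, here's a rough, illustrative latency budget for a 60Hz display with triple-buffered v-sync. The individual component values are assumptions for the sketch, not measurements:

```python
# Back-of-envelope button-to-pixel latency on a 60 Hz monitor with
# triple-buffered v-sync. Each component is an illustrative estimate.
FRAME_MS = 1000 / 60  # ~16.7 ms per refresh

input_sampling = FRAME_MS / 2     # input lands mid-frame on average
simulate_and_render = FRAME_MS    # one frame of game logic + GPU work
buffer_queue = 2 * FRAME_MS       # triple buffering can hold ~2 finished frames
scanout_to_center = FRAME_MS / 2  # panel scans top to bottom; center is mid-scan

total = input_sampling + simulate_and_render + buffer_queue + scanout_to_center
print(f"~{total:.0f} ms")
# ~67 ms before display processing and pixel response, which is how the
# measured chain climbs toward 100 ms on typical setups
```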

TheFluff fucked around with this message at 12:48 on Feb 20, 2019

Alchenar
Apr 9, 2008

Krailor posted:

PC/Console gaming isn't dead, it just won't be on dedicated hardware. We'll all be launching a game streaming app on our STB of choice and then pick up where we left off on our browser or phone.

That's what should be keeping Nvidia/AMD/MS/Sony/Intel up at night. Why spend $400 on a new console or $1k+ on a gaming PC when I can just stream a game to my TV using my Chromecast and pick it up later on my Chromebook.

Truth-in-the-middle: the PC/console gaming distinction will fall away, as in the future you will have a single piece of hardware in your home that does the gruntwork, streaming data to your monitor/TV/radio as you choose. It'll also be the device controlling your heating/AC/lights and unlocking your door when you get home, because hey, why not.

The next form factor for PCs will be a box built into the design of your house - a computer that will control everything, and you will get to choose how beefy you want it.

..btt
Mar 26, 2008

Taima posted:

PC gaming is, in all probability, going to die entirely when the next batch of consoles is released.

:lol:

Been a while since I've heard that one.


Truga
May 4, 2014
Lipstick Apathy

Alchenar posted:

Truth-in-the-middle: the PC/console gaming distinction will fall away, as in the future you will have a single piece of hardware in your home that does the gruntwork, streaming data to your monitor/TV/radio as you choose. It'll also be the device controlling your heating/AC/lights and unlocking your door when you get home, because hey, why not.

The next form factor for PCs will be a box built into the design of your house - a computer that will control everything, and you will get to choose how beefy you want it.

lol no, that's an incredibly bad idea

internet connected "computer that will control everything" is such a colossally bad idea there's a whole twitter account about it

e: and re: pc gaming dying, pc gaming is still cheaper than consoles unless you insist on running 4k at 60hz+ (not that consoles can do that), remember nvidia dudes were whining about amd "dumping" with their 500 series during the last earnings report shitshow.

Truga fucked around with this message at 13:26 on Feb 20, 2019
