Worf
Sep 12, 2017

If only Seth would love me like I love him!

lllllllllllllllllll posted:

Following current trends and priorities those will likely offer 8k with RTX on in... 30 FPS.

ya this is probably literally the plan tbh

the ground will reflect the sun perfectly in every game ever but good luck trying to read any of the graffiti on the brick walls at x480 res

Truga
May 4, 2014
Lipstick Apathy
the funny thing there is that upping the texture resolution is basically free these days as long as you have the vram to store them
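(To put rough, purely illustrative numbers on the "basically free as long as you have the VRAM" point: the sketch below assumes uncompressed RGBA8 textures with a full mip chain; real engines use block-compressed formats, so actual footprints are several times smaller. The takeaway is that quadrupling texture resolution mostly costs VRAM, not per-frame GPU work, since the shader still samples roughly one texel per screen pixel.)

```python
# Rough VRAM cost of an uncompressed RGBA8 texture with a full mip chain.
# Purely illustrative -- real engines use block compression (BC1/BC7 etc.),
# which cuts these numbers by roughly 4-8x.
def texture_vram_mb(width, height, bytes_per_pixel=4, mip_chain=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly 1/3 on top of the base level.
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_vram_mb(size, size):.0f} MB")
```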

eames
May 9, 2009

Many of the people who are happy with consoles or 60 Hz screens just never experienced a decent system with high refresh rates/low input latency/steady frametimes. It's a completely different experience; I know this because I was one of them. :shobon:
It'd be a nice first step for next gen consoles to support 1080p/120Hz now that some OLEDs support that out of the box.

Craptacular!
Jul 9, 2001

Fuck the DH
The console making GBS threads screams insecurity. 4K is already in a lot of homes, and 60fps is now established as the premium tier for cinematic-style titles.

Truga
May 4, 2014
Lipstick Apathy
And yet, just a couple months ago I saw some dumbass outlet do an unironic "cinematic experience" 30fps excuse. I forgot where it was, but it was still just as funny and dumb as 10 years ago.

Mr.PayDay
Jan 2, 2004
life is short - play hard
The point to be made is that consoles are always a crafted compromise: it's PC tech inside, but already outdated at release. That hasn't been the selling point since the mid 90s, though, when the Atari Jaguar or the Panasonic 3DO were ahead of their time.
The moment PC gaming got 3D accelerators, that "battle" was "lost".

Consoles have three main selling points imo: 1.) exclusive titles, 2.) chill and lazy couch gaming in front of a big TV, 3.) "cheaper" than a new PC.
We could argue endlessly about point 3, because online services like PSN are 60 Euro/Dollar per year and PC gaming gets more sales overall.
On top of that, and that's what I want to get at: you can get a GPU that provides the same graphics and quality as the native 4K XB1X console of 2018 for less money.
Of course, that's not a complete PC build, so you have to weigh that choice.

I feel it's a big misunderstanding when console players say they can get that amazing 4K experience on the way cheaper consoles. Yeah, sure, but locked at 30fps (the PS4 Pro struggles heavily even to hold 30 fps in RDR2) and with cut-down effects to actually allow 4K and 30 fps in some titles.

I had that argument about Forza Horizon 4, Metro Exodus, Project Cars 2 and The Division 2 with a buddy who claimed that a PC could maybe provide "slightly" better gfx.
We got some booze, our ladies were out, and I fired up FH4, PC2, TD2 and Metro Exodus. I've got a 2080 Ti, so I play at 1440p Ultra in all games anyway, and on top of that I turned RTX on in Metro Exodus.
Long story short, it didn't take long for the "WTF" moments once he experienced all 4 games with smooth G-Sync and 100+ fps average, while Metro Exodus tanked to 60 with RTX, of course.
My buddy admitted it was a visual "night and day" difference that he can't "unsee" after this. Especially Forza with 8x MSAA and Ultra effects, still at 100 fps, completely blew him away.
Racing games at 30 fps? Lol, why would you even do that?

"Hey, The XB1X can fire Forza Horizon 4 at 60 fps!!"
Yes, it is the visually cut down "Performance" Mode and the difference is huge imo.

That's the beauty and advantage of PC gaming: you get the fps AND the quality. You don't have to choose.

I think it's funny when people still believe they can substitute PC gaming with a console "because the graphics are almost the same".
That's the wrong reason in the first place. It might be the budget, but it's never the tech.

You can get 1:1 XB1X 4K gfx and fps with a mid-tier 300-350 Dollar/Euro GPU.
You get way better than XB1X 4K fps AND effects with a 5700 or 2070+ GPU.

For everything else, no console can even come close to the arsenal of PC graphics options and the smooth 60-144+ fps experience.
But that's NOT their selling point to begin with, right? So a "battle" doesn't make sense.

I have my consoles for the exclusive titles and Joypad + TV couch sessions on that big rear end 65" OLED and that's where I don't miss the PC.

Mr.PayDay fucked around with this message at 15:09 on Dec 30, 2019

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Craptacular! posted:

The console making GBS threads screams insecurity. 4K is already in a lot of homes, and 60fps is now established as the premium tier for cinematic-style titles.

please expand upon your insecurity remark

i think experience wise the consoles ive owned (every current gen except nintendo) is extremely sub par

i and many others itt were secure enough to go buy the console(s) and use them enough to form educated opinions

its a GPU thread, excluding consoles to some degree is intrinsic to the form factor.

i apologize if i misunderstood your post, i assure you it was borne of the pattern of posts you've made that in my opinion are incendiary itt :shrug:

consoles scale graphics regardless of the resolution of the output device lol, because i got some news regarding the 1080p/4k experience console people are getting at definitely 60 fps and not 30

thats not even getting into the fact that a console experience in excess of 60 FPS is going to probably cost more than just owning a PC setup that can do the same :shrug:

Worf fucked around with this message at 15:37 on Dec 30, 2019

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
If you've gone out and bought every console then it's really hard to say they are always lovely. Obviously they're good enough to keep buying, which is why they make them.

They're different experiences but I think a lot of people in this thread are way way way underestimating the percent of game players that are happy to play a AAA game at 1080p, medium settings at 45fps on PC or anything a console gives and not get mad enough about it to post online.

I know people with 1440 or 4k monitors that downscale to play a game and don't care.

Klyith
Aug 3, 2007

GBS Pledge Week

Mr.PayDay posted:

A "bit nicer" than Xbox, you say? ;-) The PC Ultra settings are on a whole different level of detail (literally) and visual immersion

Nah. I can go play a game from 5 or 10 years ago that is a much bigger loss of 'fidelity' than the difference between a console and $1000 GPU today, and still be totally immersed. Even just on the visual level, I'm not talking about some gameplay > visuals argument. Style and artistic choice are easily equal to photorealism (or more important IMO).

edit: RDR2 does push the best case for the expensive GPU, but that's because their artistic choice was photorealism.


Mr.PayDay posted:

On top of that, and that's what I want to get at: you can get a GPU that provides the same graphics and quality as the native 4K XB1X console of 2018 for less money.

Wait, I thought that a GPU that costs less than grand was a compromised experience that is missing an entire level of immersion?

Klyith fucked around with this message at 16:44 on Dec 30, 2019

orcane
Jun 13, 2012

Fun Shoe
Yeah, to a very large number of people the current consoles' often-not-even-1080p, medium to low settings at 30fps is "good enough". Prettier and faster is nice, but not "pay significantly more money for it" nice. They can play the game they want, it runs, it looks nice enough, done. If they cared about (or could afford) 1440p/120fps or 4k/60fps at "not allowed to move a single slider down from ultra" settings, they would be posting on PC hardware forums.

Console users and players on mainstream PCs are paying a fraction to get a gaming experience that's acceptable to them (Pareto principle etc.) and I don't expect the next generation of consoles to really alter that deal in the long run (as usual the differences to a gaming PC will be a bit smaller right around the console's release). There are "pro" consoles now for people who value performance and visuals - enough to drop another $200 on a faster version of their console, not enough to get a gaming PC. But even with those, the price tag you can put on a console for the mass market is limited. Even if some magical leap of technology suddenly allowed $500 consoles to hit 4k/60fps at high details, the manufacturers would probably still aim for sub-4k rendering at medium details and possibly target 30 fps (outside of specific genres) in order to sell that console for $300 or $400 instead.

orcane fucked around with this message at 16:59 on Dec 30, 2019

Klyith
Aug 3, 2007

GBS Pledge Week

Lockback posted:

I know people with 1440 or 4k monitors that downscale to play a game and don't care.

orcane posted:

Console users and players on mainstream PCs are paying a fraction to get a gaming experience that's acceptable to them

I've related this experience ITT before but a whole lot of the horrible, awful, enjoyment destroying flaws that high end PCMR people talk about are literally invisible unless you become sensitized to them. Some of them, like micro-stutter, you drat near have to train yourself to look for.

20 years ago (!) I played a whole lot of Morrowind, which was a game that was really bad about screen tearing. It was so noticeable in that game that I started seeing it all the time, and ever since then I've been in the vsync on always camp. But I've watched friends playing games with some serious tearing, they didn't see it and were mystified when I mentioned it.


OTOH, I had a game spontaneously reset to 1080p resolution recently (my monitor is 1440). Took me a while to notice, and at first I went looking into the driver control panel settings rather than the game settings. The menu text was a bit blurry so I thought MLAA had been enabled or something. If it hadn't been for the menu I'd have been none the wiser.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Lockback posted:

If you've gone out and bought every console then it's really hard to say they are always lovely. Obviously they're good enough to keep buying, which is why they make them.

They're different experiences but I think a lot of people in this thread are way way way underestimating the percent of game players that are happy to play a AAA game at 1080p, medium settings at 45fps on PC or anything a console gives and not get mad enough about it to post online.

I know people with 1440 or 4k monitors that downscale to play a game and don't care.

I can't play PS4 or Xbox games with my friends on any of my PCs. Just because a person does a thing doesn't mean they approve or appreciate it in particular contexts. Hth.

Consoles are underpowered, cheap, bad hardware that have been holding back gaming tech imho. I buy every single one still because I'd rather socialize than not

30 fps is still a joke

I appreciate the social and gaming experience and detest the visual, OS, hardware, peripheral, and drm experience

Worf fucked around with this message at 19:19 on Dec 30, 2019

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I like 144hz and everything but the biggest gaming wow moment I’ve ever had since buying a Riva TNT is easily, without a doubt, 4k/60 OLED Gsync.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Also "they're manufactured, obviously they're worth owning" in the GPU thread of all places is a riot

Cygni
Nov 12, 2005

raring to post

Why are y’all arguing about consoles and “value” for computer toys again? :(

TheCoach
Mar 11, 2014

Statutory Ape posted:

Consoles are underpowered, cheap, bad hardware that have been holding back gaming tech imho.

Good, the last drat thing we want is the nutjob hardware arms race of late 90s/early 2000s especially when no one has the god drat money for that kind of thing right now.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

K8.0 posted:

People not buying them doesn't mean they don't want to. The drive is there because games are murdering GPUs right now. You have to spend more than ever to get decent performance in most games. For many years, $300-330 bought a GPU that would run almost every game at/close to max detail at the max res/refresh your average gamer's monitor supported.
"Max res/refresh" in those years meant 1080p at 60hz. There is no doubt the scaling has slowed and the lack of competition in some segments has amplified this, but we were stuck at 1080p/1200p for a long time, the jump to 4K and '4k-ish' widescreen resolutions with 120+ hz displays has skewed the value proposition I think. Yeah we're not getting as much with each new generation now, but we're expecting a lot more.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Statutory Ape posted:


Consoles are underpowered, cheap, bad hardware that have been holding back gaming tech imho.
The games that actually push graphics tech wouldn't exist without them. A DIY PC just isn't a feasible option for the majority of the public.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Happy_Misanthrope posted:

"Max res/refresh" in those years meant 1080p at 60hz. There is no doubt the scaling has slowed and the lack of competition in some segments has amplified this, but we were stuck at 1080p/1200p for a long time, the jump to 4K and '4k-ish' widescreen resolutions with 120+ hz displays has skewed the value proposition I think. Yeah we're not getting as much with each new generation now, but we're expecting a lot more.

Just to put it into reference, 1080p60 is 124,416,000 pixels / second, 1440p144 is 530,841,600 pixels / second (~4.27x), 4K60 is 497,664,000 pixels / second (4x).

In conclusion, people whining about video cards are tossers, and the jump in displays is leaps and bounds more than can be expected from silicon.
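(As a sanity check on that arithmetic, here's a tiny, purely illustrative calculation of raw pixel throughput, just resolution times refresh rate and nothing more; it reproduces the figures above, with 1440p144 at roughly 4.3x 1080p60 and 4K60 at exactly 4x.)

```python
# Raw pixel throughput: width * height * refresh rate.
# Purely illustrative -- ignores per-pixel rendering cost, overdraw, etc.
modes = {
    "1080p60":  (1920, 1080, 60),
    "1440p144": (2560, 1440, 144),
    "4K60":     (3840, 2160, 60),
}

base = 1920 * 1080 * 60  # pixels per second at 1080p60

for name, (w, h, hz) in modes.items():
    px = w * h * hz
    print(f"{name}: {px:,} px/s ({px / base:.2f}x vs 1080p60)")
```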

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Happy_Misanthrope posted:

The games that actually push graphics tech wouldn't exist without them. A DIY PC just isn't a feasible option for the majority of the public.

I understand the catch 22 there

I have to carry an external hard drive to bring my Xbox One to my friend's house because even if I swapped the internal drive for an SSD, it uses a SATA 2 port on the mobo for that slot

My gripes can coexist with what you said imho, since I agree (to an extent) and understand what you're saying and still think what I said is true :shrug:

E: that being said, without looking it up I bet most of the best looking console games are on engines crafted by what are traditionally PC houses lol

Worf fucked around with this message at 20:29 on Dec 30, 2019

Craptacular!
Jul 9, 2001

Fuck the DH

Statutory Ape posted:

please expand upon your insecurity remark

i think experience wise the consoles ive owned (every current gen except nintendo) is extremely sub par

i and many others itt were secure enough to go buy the console(s) and use them enough to form educated opinions

its a GPU thread, excluding consoles to some degree is intrinsic to the form factor.

i apologize if i misunderstood your post, i assure you it was borne of the pattern of posts you've made that in my opinion are incendiary itt :shrug:

I didn't mean for you to think I was directing that toward you. I actually wasn't referring to you, but to MrPayDay, who is aggravating with almost every post. All he does is talk about how slightly diffused shadows or minute lighting differences are totally worth tripling the price of your graphics card. Just buy it!

But I also think there's a little realism lacking in discussion like:

quote:

not even getting into the fact that a console experience in excess of 60 FPS is going to probably cost more than just owning a PC setup that can do the same :shrug:

Relatively nobody cares about experiences over 60 FPS. Going over 60 FPS is not on most games' agenda. The reason I keep bringing up Assassin's Creed, Tomb Raider, Watch Dogs, and all these other elaborate single-player campaigns is that they aren't focused on twitchy competitive combat and are more focused on immersion and knocking your socks off with textures. If presented a range of FPS to work with, they will choose 60 and up the detail. They do not care about FPS above 60, and on consoles they'd probably artificially cap high FPS to ensure consistent performance in all scenes.

I have that same monitor you do, and yes, I know that my Overwatch performance is better for it. If I cap the game at 60, my confidence in my tracking and lining up headshots drops massively. But I bought that thing because I sink more hours into FPS than any other genre by a massive margin. And the success of those things shows just what a big market shooters and competitive exclusives like League and Dota are for the PC platform, but they're irrelevant on consoles beyond, I guess, COD heads? COD has always worked to run at a higher frame rate than most console games, and I suppose I could show console COD fans why they should move to the PC version of MW and buy an adaptive sync monitor, but 100 FPS is basically loving nothing over 60 when it comes to stuff like The Witcher.

Craptacular! fucked around with this message at 20:33 on Dec 30, 2019

Truga
May 4, 2014
Lipstick Apathy

TheCoach posted:

Good, the last drat thing we want is the nutjob hardware arms race of late 90s/early 2000s especially when no one has the god drat money for that kind of thing right now.

the thing about the late 90s/early 00s hardware race was though, that i could, as a poor kid, get cards on the very cheap, because anything not latest did not run the latest lovely AAA game on high because it lacked <feature>, so i could score them hilariously cheap.

e: sure, i played max payne at ~25fps, but otoh the card cost me the equivalent of about $20 and I got the game for :filez:

Truga fucked around with this message at 20:39 on Dec 30, 2019

Arzachel
May 12, 2012

Truga posted:

the thing about the late 90s/early 00s hardware race was though, that i could, as a poor kid, get cards on the very cheap, because anything not latest did not run the latest lovely AAA game on high because it lacked <feature>, so i could score them hilariously cheap.

e: sure, i played max payne at ~25fps, but otoh the card cost me the equivalent of about $20 and I got the game for :filez:

On the other hand a 7870 will still happily run every game seven years later at 1080p with playable fps.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
As someone who prefers pc gaming versus consoles the next gen consoles are gonna own.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

Arzachel posted:

On the other hand a 7870 will still happily run every game seven years later at 1080p with playable fps.

Define playable. 30FPS on everything low 1080p for RDR2 isn't really playable to most PC gamers.

Truga
May 4, 2014
Lipstick Apathy

Arzachel posted:

On the other hand a 7870 will still happily run every game seven years later at 1080p with playable fps.

yeah and that's why it still costs $60+ on ebay. not to even mention the CPU needed to run these games :v:

Worf
Sep 12, 2017

If only Seth would love me like I love him!

B-Mac posted:

As someone who prefers pc gaming versus consoles the next gen consoles are gonna own.

just let me be hardware agnostic for multiplayer. Separate by input and call it a day, forever, the end

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Arzachel posted:

On the other hand a 7870 will still happily run every game seven years later at 1080p with playable fps.

Let's not forget that to meet GTA V's "4K 30" recommended spec only takes a 7870.

Stickman
Feb 1, 2004

HalloKitty posted:

Let's not forget that to meet GTA V's "4K 30" recommended spec only takes a 7870.

GTA V was released in 2013 :corsair:

Arzachel
May 12, 2012

jisforjosh posted:

Define playable. 30FPS on everything low 1080p for RDR2 isn't really playable to most PC gamers.

Then you have a very skewed idea of what hardware most people are running. Besides, it's still going to be a far better experience than trying to play Oblivion on an FX5200, let me tell you! (It was fine, if extremely ugly, once they patched in an ultra low preset that disabled most shader effects.)

Mr.PayDay
Jan 2, 2004
life is short - play hard

Klyith posted:


OTOH, I had a game spontaneously reset to 1080p resolution recently (my monitor is 1440). Took me a while to notice
That explains a lot about your post tbh. Maybe people just have different, untrained, or more tolerant "senses" in that case.
When Nvidia forced you to manually turn G-Sync on again, I had buddies who did not notice it was off (how?!).
I just can't comprehend how you wouldn't notice 1080p on your 27" monitor. I can literally see if there is a "problem" with my resolution, G-Sync or smoothness because of driver problems or setting malfunctions.
Because of a misclick I hard-locked my Swift to 60 Hz. It took me seconds after starting a BF V session to notice something was off.
Buddies invite me over to "check" if their system runs smoothly because I can literally see it if something is off.

My "worst" days of recent gaming were a year ago when my Swift's PSU died. I had to play on a non-G-Sync 60 Hz monitor for a few days.
:psyduck:

I can never go back, it's a game-breaker for me, and I wonder how gamers deal with ghosting and tearing without G-Sync/FreeSync...
There obviously is a personal visual sensitivity, as your experience and mine seem to prove.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Source your quotes

The Big Bad Worf
Jan 26, 2004
Quad-greatness

Taima posted:

I like 144hz and everything but the biggest gaming wow moment I’ve ever had since buying a Riva TNT is easily, without a doubt, 4k/60 OLED Gsync.

Buying my first OLED really made me feel like we've truly lost the last 20 years to lovely, smeary LCD panels. They were stuck at 60 Hz for like a decade when even a mediocre CRT could do at least 85 Hz with amazing motion clarity and deep blacks.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

2020 is basically just going to be who can get the best FPS on quake 2 on the nicest monitor itt

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

Arzachel posted:

Then you have a very skewed idea of what hardware most people are running. Besides, it's still going to be a far better experience than trying to play Oblivion on an FX5200, let me tell you! (It was fine, if extremely ugly, once they patched in an ultra low preset that disabled most shader effects.)

Going off of Steam hardware surveys, something near 50% of those surveyed are running something better than a 960, so while yes, I'm a bit skewed, as are most people on this forum, we're not that far off from reality.

Mr.PayDay
Jan 2, 2004
life is short - play hard

Craptacular! posted:

...but 100 FPS is basically loving nothing over 60 when it comes to stuff like The Witcher.

That's where you are wrong tho. Every game feels and runs smoother with additional fps. I will agree that a difference between 140 and 180 fps is an absurd debate, but the experience between 60 and 100 fps is on a completely different level. Just take your Witcher example: fight and action scenes with fast camera and angle changes.
If you aren't sensitive and can't see it, good for you (you can save tons of money); you just wouldn't get an immersion and experience benefit from a high-end GPU.

The leap of 40 additional fps, from the 40-60 range up to 80-100, is loving huge and makes everything so much smoother.
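(For what it's worth, simple frame-time arithmetic backs this up: going from 60 to 100 fps shaves several milliseconds off every frame, while going from 140 to 180 fps saves barely more than one. A quick illustrative calculation:)

```python
# Frame time in milliseconds at a given fps, and how much each jump saves.
def frame_time_ms(fps):
    return 1000.0 / fps

for lo, hi in ((40, 60), (60, 100), (140, 180)):
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} fps: {frame_time_ms(lo):.1f} ms -> {frame_time_ms(hi):.1f} ms "
          f"(saves {saved:.1f} ms per frame)")
```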

Mr.PayDay fucked around with this message at 22:19 on Dec 30, 2019

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

15.7% are on a 1070 or better, 14.6% are on a 1060, leaving ~70% of those polled on something less than the 1080p card from 3 years ago. Y'all are pretty drat skewed, especially the guy who said "everything below the $350ish 5700s are just 'limp by playing old games' products".

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Wistful of Dollars posted:

What's the over/under on AMD coming up with something that actually competes with the future 3080?

Well Nvidia is dealing with fab issues right now, an unplanned switch from Samsung to TSMC, which could mean significant delays. AMD's best bet would be putting out something to compete with the 2080 Super and 2080 Ti and hopefully having a while to sell it before the RTX 3000 series comes.

Mr.PayDay
Jan 2, 2004
life is short - play hard

ItBreathes posted:

15.7% are on a 1070 or better, 14.6% are on a 1060, leaving ~70% of those polled on something less than the 1080p card from 3 years ago. Y'all are pretty drat skewed, especially the guy who said "everything below the $350ish 5700s are just 'limp by playing old games' products".

So 90% are just casual gamers who don't give a drat about gfx sliders, more news at 11.
That doesn't change the fact that console settings are easily matched by GPUs that don't cost 800 Euros.
So if you care about gfx and immersion, this is the thread for discussing which GPU can get you native 4K console settings and fps without selling a kidney.

Arzachel
May 12, 2012

ItBreathes posted:

15.7% are on a 1070 or better, 14.6% are on a 1060, leaving ~70% of those polled on something less than the 1080p card from 3 years ago. Y'all are pretty drat skewed, especially the guy who said "everything below the $350ish 5700s are just 'limp by playing old games' products".

Also, the Chinese and SEA markets dwarf Steam's install base, and I'd be surprised if the numbers there weren't tilted even more heavily towards the low end.
