DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

eames posted:

Isn't that the whole point of the GeForce Experience tool? It used to be; I haven't used it in years, since they started requiring a login.

Sort of. It hasn't done a good job of keeping up with the times in terms of 4K and high-refresh-rate monitors. For example, it doesn't suggest anything different for my 3440x1440 monitor when it's set to 60Hz vs 100Hz.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

MikeC posted:

Hardware Unboxed says they got better performance from DX12. Weird.

Seems like it depends on your CPU/GPU combination.

repiv
Aug 13, 2009

Also depends on how you define better performance: some people are saying that Vulkan has better average FPS, but DX12 has significantly better frametimes.
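
To make that distinction concrete: two runs can report the same average FPS while one has much worse frametime spikes, so you really want both numbers. A rough sketch of pulling them out of a per-frame frametime log (the file name and format here are just assumptions for illustration, not any particular tool's output):

code:
import statistics

# Read a per-frame frametime log in milliseconds, one value per line
# (hypothetical file; substitute whatever your capture tool exports).
with open("frametimes_ms.txt") as f:
    frametimes = [float(line) for line in f if line.strip()]

avg_fps = 1000.0 / statistics.mean(frametimes)                   # the headline number
worst_1pct_ms = sorted(frametimes)[int(len(frametimes) * 0.99)]  # 99th-percentile frametime
one_percent_low_fps = 1000.0 / worst_1pct_ms

print(f"average FPS: {avg_fps:.1f}")
print(f"99th-percentile frametime: {worst_1pct_ms:.2f} ms")
print(f"1% low FPS: {one_percent_low_fps:.1f}")
A Vulkan run could win on the first number while a DX12 run wins on the last two, which is exactly the split being described.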

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Mail me cards and I'll do benchmarks.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

lllllllllllllllllll posted:

Yep, or from 1024x768 to 1080p in computerland. Now I want to make the jump from 1080p to 1440p already, but it's an investment of around $1100 (screen and new graphics card). A lot of money for not all that much gain. Maybe next year.

All really depends on what you're going for. I shopped around & recently updated my monitor to a 144Hz 1440p screen, this one to be exact, for $232 (it's $250 now, but still not a bad price):
https://www.amazon.com/gp/product/B07V39QHMY/ref=ppx_yo_dt_b_asin_title_o00_s01?ie=UTF8&psc=1

It's not the most high-end fancypants screen, but it's really nice for the price IMHO. I still rock an i5 4690K @ 4.5GHz with 16GB of DDR3-1866 & an overclocked GTX 1070, all my drives are SSDs, & games run wicked smooth. I didn't think the difference between 60-75Hz & 144Hz would be that big, but after doing comparisons I can never go back. It's almost like when movies started increasing frame rates & everything felt weirdly fast/smooth for a while.

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
That does look interesting and seems like an affordable alternative to those pricey (mostly IPS) screens. Thanks for that!

VelociBacon
Dec 8, 2009

27" seems extremely small to be curving the screen on.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
It's a VA panel. Motion clarity sucks really bad on those. Curved 27" is an easy tell: they curve it because the viewing angle is even worse than TN's, and if they didn't you'd notice color shift.

(sorry, I know, “ACKCHUYALLY THAT THING YOU GOT IS TERRIBLE AND SUCKS”, this is why you never tell goons you bought a thing that you like)

Right now if you’re near a Microcenter, they have the XF270HU(A?) for $299. Buy that instead.

Mr.PayDay
Jan 2, 2004
life is short - play hard

gradenko_2000 posted:

is chasing after the ability to play every title at Ultra settings really a realistic goal for people to aim for?

The max settings, aka Ultra or Extreme (like in FH4), offer the best possible visual output the devs and designers implemented, so playing the game at the most beautiful settings will always be a goal for many gamers.
Why would you not?
Because at some point you have to start sacrificing stuff. And often marginal, minor visual effects cost 20% of overall performance, so the older your GPU is, the more you're forced to hunt for the sweet spot in many new games.

RDR2 is absolutely stunning and beautiful on ultra and additional maxed settings.
Cutting down effects at 1440p will add 5-10 more FPS without disabling anything, but those little differences take the visuals down from perfect to merely very good.

Games like Overwatch or CoD, and especially Counter-Strike, are way more FPS-focused, and in the action you just won't enjoy the nice MSAA stuff or smooth shadows anyway.

Just compare RDR2 ultra to medium settings and tell me you don't notice serious differences. You can literally see it.
Why would you even play RDR2 on low settings? In the worst case, maybe an Xbox would be the better way to enjoy the visual immersion.

Games like RDR2, Far Cry New Dawn, and Metro Exodus are loving beautiful on Ultra settings.
You can tweak a handful of settings, but these games lose their visual art piece by piece with every setting you have to sacrifice.

Enjoying games the way the devs offer them, and that means the visual max settings, is always my way to go. So yeah, I am chasing ultra settings, and I can't understand why you would not (if you could!).
Being limited by GPU power (and budget) is a burden tho, I know.
Different topic.
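
As a back-of-the-envelope illustration of that trade-off (the ~20% figure is the estimate above; the 60 FPS baseline is made up for the example):

code:
# Illustrative arithmetic only: what one expensive visual setting that costs
# ~20% of GPU performance does to framerate and per-frame time budget.
baseline_fps = 60.0                          # hypothetical FPS with the setting off
cost = 0.20                                  # the setting eats ~20% of performance
fps_with_setting = baseline_fps * (1.0 - cost)

frame_time_off = 1000.0 / baseline_fps       # ~16.7 ms per frame
frame_time_on = 1000.0 / fps_with_setting    # ~20.8 ms per frame

print(f"{baseline_fps:.0f} FPS -> {fps_with_setting:.0f} FPS")         # 60 -> 48
print(f"{frame_time_off:.1f} ms -> {frame_time_on:.1f} ms per frame")  # 16.7 -> 20.8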

Mr.PayDay fucked around with this message at 01:20 on Nov 7, 2019

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Is anyone playing Red Dead 2 on OLED VRR? I'm in my early 30s so my friends all had kids and stopped gaming. I need someone to talk to about how loving awesome this is.

I knew that OLED VRR would change gaming to some degree when the news first broke but it's... it's really good. I can't help but feel like we're looking at the future of gaming here.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Talking out my rear end here, but provided you have a replicable sequence, you should be able to capture and quantitatively describe the difference between a frame at different quality settings, yes? At the very least you should be able to get a straight screen capture. Is it at all possible to get information on how a GPU is deciding to render a given frame?
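
One rough sketch of the first part: capture the same frame at two quality presets and compute a simple difference metric between the screenshots. The filenames here are hypothetical, and PSNR is just one crude choice of metric:

code:
import numpy as np
from PIL import Image

# Hypothetical captures of the same frame at two quality presets.
ultra = np.asarray(Image.open("frame_ultra.png").convert("RGB"), dtype=np.float64)
high = np.asarray(Image.open("frame_high.png").convert("RGB"), dtype=np.float64)

mse = np.mean((ultra - high) ** 2)    # mean squared pixel difference
psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

print(f"MSE:  {mse:.2f}")
print(f"PSNR: {psnr:.2f} dB")         # higher PSNR = the two presets look more alike
As for the second part, frame-capture tools like RenderDoc or Nvidia Nsight can record the draw calls and state behind a single frame, though that's a per-frame debugging view rather than a "why did the GPU choose this" explanation.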

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Taima posted:

Is anyone playing Red Dead 2 on OLED VRR? I'm in my early 30s so my friends all had kids and stopped gaming. I need someone to talk to about how loving awesome this is.

I knew that OLED VRR would change gaming to some degree when the news first broke but it's... it's really good. I can't help but feel like we're looking at the future of gaming here.

No, I'm gaming at 1080p on a big rear end IPS panel I think. Might be LED backlit, but don't quote me on that... I gotta be honest though I didn't look up that much about this thing when I got it, just scored it cheap on craigslist and try to keep in a graphics card that will make shiny poo poo on it at 60 FPS. I'm continuously impressed at how much fancier TVs get all the time, but I don't keep up with the tech personally these days from a consumption standpoint or read much about them. I dunno what VRR is even. I bet it's great, I'll check it out in 2025 when everyone else is rendering to V1 in the occipital lobe.

Agreed fucked around with this message at 06:47 on Nov 7, 2019

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Hey man nothing wrong with that. Before I got the OLED I was on a 65 inch LED too, at 1080p, with a 2080. Exodus with ray tracing on high looked killer on that old thing, sounds like we had a similar setup.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I feel I've been profoundly influenced by having cut my PC gaming teeth back in 1999 on an eMachines-type computer with one of ATI's really garbage Rage 3D chips that had no T&L, if I remember right. Got like 12 frames per second (3 FPS when spells were being cast) in EverQuest in some old version of DirectX, maybe DX5 or DX6. Admittedly I did get my first real computer within a couple of years and enjoyed seeing proper framerates for the first time, but I spent so much time gaming on that old total shite that it's just built into my brain now to wait 'til it's really pressing to upgrade. Hence the 780 Ti for 6 years, haha. In the end I love my shinies, but I am not especially demanding in the scheme of things, and I have been so happy with the 2070 so far: it runs everything I want on max that isn't RTX, can do RTX so I can please my retinas, and can downsample from 1440p or even 3K or 4K in some titles (thanks to the low-by-today's-standards 60 FPS target!).

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Agreed posted:

No, I'm gaming at 1080p on a big rear end IPS panel I think. Might be LED backlit, but don't quote me on that... I gotta be honest though I didn't look up that much about this thing when I got it, just scored it cheap on craigslist and try to keep in a graphics card that will make shiny poo poo on it at 60 FPS. I'm continuously impressed at how much fancier TVs get all the time, but I don't keep up with the tech personally these days from a consumption standpoint or read much about them. I dunno what VRR is even. I bet it's great, I'll check it out in 2025 when everyone else is rendering to V1 in the occipital lobe.

VRR is variable refresh rate, and it's pretty much around the corner. Basically, screens right now run at set refresh rates (60Hz or 144Hz, for example), so the framerate has to match that or you get stutters, hitches, tearing, and all kinds of bad things you don't want. Vsync forces games to match their framerate to the display's refresh rate to avoid all that, but it comes at the cost of slight input delay, and if a vsynced game's framerate drops below the refresh rate the slowdowns are even worse. VRR is the opposite: it makes the display's refresh rate match the framerate of whatever game you're playing, and shows each frame immediately as it renders. So you get smooth picture output and virtually no input delay at all. You can tell when a game drops to 59fps with vsync; you can't tell when a game drops to 59fps with VRR.

G-Sync and FreeSync are two examples of VRR, and screens have had one or the other for a few years now. The upcoming HDMI 2.1 standard includes VRR, and some TVs, such as LG's 2019 lineup, already come with FreeSync built in. This is kind of a big deal, as AMD finally made some cards capable of driving high resolutions at high framerates, such as the 5700 XT, and Nvidia is finally playing ball and including FreeSync support on their cards.
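
A toy way to see that difference in numbers (not real driver code; the 60Hz refresh and the render times are made up): with vsync, a frame that misses the ~16.7ms deadline waits for the next refresh, while with VRR the display just refreshes whenever the frame is ready.

code:
import math

REFRESH_MS = 1000.0 / 60.0                        # fixed 60 Hz refresh interval
render_times_ms = [15.0, 17.0, 16.0, 20.0, 15.5]  # hypothetical per-frame render times

def vsync_display_time(render_ms):
    # with vsync, the frame can only be shown on the next refresh boundary
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for r in render_times_ms:
    print(f"render {r:5.1f} ms -> vsync shows it at {vsync_display_time(r):5.1f} ms, "
          f"VRR shows it at {r:5.1f} ms")
A 17ms frame gets held until 33.3ms on the fixed-refresh display but shows up at 17ms with VRR - that held frame is exactly the stutter/hitch being described.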

Arzachel
May 12, 2012

DrDork posted:

While I generally agree with you that you can play a lot of games with some compromises at 4K, trotting out an MMO that released in 2013 alongside a PS4 variant, and an isometric RPG, is proooobably not the strongest argument you can make. And I say that as someone who plays a ton of FFXIV (on a laptop with a 1650. It's fun!).

Well, they're running a GPU from 2014, so it works out. Change that to a 5700XT/2070S and the list of games in which you have to take a noticeable image quality hit to run at 4K60 is very small (ignoring RTX). RDR2 might, but it also seems to have a whole load of nonsense settings like stacking anti-aliasing on top of TAA, so maybe not. Metro Exodus? I honestly can't think of anything else.

repiv
Aug 13, 2009

Apparently RDR2 still has the bizarre performance cliff from GTA5, where the engine shits the bed if the CPU is too fast in certain cases :eng99:

The higher the average FPS gets the worse the frametimes get, especially on chips with very high single-thread throughput but not a ton of threads.

https://www.youtube.com/watch?v=z_ty-gajwoA

repiv fucked around with this message at 13:58 on Nov 7, 2019

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
eat that 9900kailures

wait

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Yeah, over in the game thread people are complaining. Like, most i5s stutter badly every 30 seconds or so.

Enos Cabell
Nov 3, 2004


Weird, I have a 9700K but haven't noticed any stuttering at all. 1080 Ti @ 1440p, mostly high settings, averaging around 65 FPS.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
are older-than-Ryzen AMD CPUs so bad that reviewers are no longer including them, even when they're still including i7s from 2011, and even when the minimum specs still suggest an FX?

Llamadeus
Dec 20, 2005

Enos Cabell posted:

Weird, I have a 9700K but haven't noticed any stuttering at all. 1080 Ti @ 1440p, mostly high settings, averaging around 65 FPS.
Not that weird, a bit later in the video they show that the stuttering only occurs if the framerate gets too high and goes away at 1440p medium with a 2080Ti.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

gradenko_2000 posted:

are older-than-Ryzen AMD CPUs so bad that reviewers are no longer including them, even when they're still including i7s from 2011, and even when the minimum specs still suggest an FX?

They haven't included the FX series in their reviews for a while now; I would assume because Bulldozer is poopy poop. I think the last time I saw it benched was in 2017, when Ryzen was first released.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
an FX-8150 is better than a 2500K nowadays though, AMD buyers receiving that finewine, the multicore future is today (tm)

also, you definitely will want to own a 1600 in another year when it finally starts to surpass the 7600K's averages consistently outside of random edge cases. especially since your average console is going to be pushing a higher framerate than a 1600 before too long.

boy, AMD users really won out on the >4 year tail end of that bet too, way to save $100 over just buying the i7. the 8700k was con lake, amirite guys, barely able to hit 4.7 on average and only on $500 XOC motherboards with extreme cooling, right?

Paul MaudDib fucked around with this message at 14:56 on Nov 7, 2019

repiv
Aug 13, 2009

Llamadeus posted:

Not that weird, a bit later in the video they show that the stuttering only occurs if the framerate gets too high and goes away at 1440p medium with a 2080Ti.

Yeah it's not as big of an issue in practice once you put more load onto the GPU, but it's useful to know that the game is completely useless as a CPU benchmark.

TorakFade
Oct 3, 2006

I strongly disapprove


repiv posted:

Apparently RDR2 still has the bizarre performance cliff from GTA5, where the engine shits the bed if the CPU is too fast in certain cases :eng99:

The higher the average FPS gets the worse the frametimes get, especially on chips with very high single-thread throughput but not a ton of threads.

The game has huge issues right now though: tons of people unable to play, heavy stuttering, and so on. They pushed it out a bit too soon, I guess.

I ran some tests and it turns out that my 2600X + EVGA 1080 FTW can push an average of 30-32 FPS at 1440p with every setting on Ultra (or High where Ultra is not available), 34-36 FPS with a mix of High and Ultra, and 40-42 FPS with a mix of Medium and High, which doesn't make much sense to me - I'd expect a lot more variance. Also, overclocking the GPU by +70MHz core, +700MHz memory gives an easy extra 5 FPS, making it perfectly playable in any of those configurations.

Oh, and minimum FPS / frametimes are pretty great: FPS very rarely dips below 30, and even at a steady 35 FPS I don't feel any stutter or notice choppiness; it's as fluid as other games are at 60 FPS. Maybe it's the slow pace of the game? Makes me happy that I can get a little extra bling anyway :)

(This might change once I get to more populated areas of the map, I'm still in the beginning "tutorial")

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

TorakFade posted:

The game has huge issues right now though: tons of people unable to play, heavy stuttering, and so on. They pushed it out a bit too soon, I guess.

I think that's just Ryzen users though?

:unsmigghh:

Maybe don't buy the budget processor and expect to play on launch day, is all I'm saying.

ufarn
May 30, 2009
The 2600K performs better than the 9600K; it's such a mess of coding.

Turns out crunch and managerial reigns of chaos don't produce polished products.

TorakFade
Oct 3, 2006

I strongly disapprove


Paul MaudDib posted:

I think that's just Ryzen users though?

:unsmigghh:

Maybe don't buy the budget processor and expect to play on launch day, is all I'm saying.

Dunno, my own Ryzen worked perfectly (once I sorted out the "launcher not launching" bug, which was caused by the BIOS, specifically AGESA 1.0.0.3) - and I'd say it's good enough to feed a 1080; I'm definitely GPU-limited, since a slight overclock on the GPU nets me 5 easy extra FPS.

Apparently Ryzens have big issues launching the game and crashing due to BIOSes, while the stutter problem is very common on i5 CPUs new and old. I guess maybe don't ever buy anything less than "top 3 SKU" if you want to play videogames? :v:

I believe after a few patches and optimization passes it will be alright, though this is definitely a game that will push you to upgrade if you value eye-candy and high FPS

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

TorakFade posted:

Dunno, my own Ryzen worked perfectly (once I sorted out the "launcher not launching" bug, which was caused by the BIOS, specifically AGESA 1.0.0.3) - and I'd say it's good enough to feed a 1080; I'm definitely GPU-limited, since a slight overclock on the GPU nets me 5 easy extra FPS.

Apparently Ryzens have big issues launching the game and crashing due to BIOSes, while the stutter problem is very common on i5 CPUs new and old. I guess maybe don't ever buy anything less than "top 3 SKU" if you want to play videogames? :v:

I believe after a few patches and optimization passes it will be alright, though this is definitely a game that will push you to upgrade if you value eye-candy and high FPS

it's a big enough release I'm sure it'll get fixed

MikeC
Jul 19, 2004
BITCH ASS NARC

Paul MaudDib posted:

I think that's just Ryzen users though?

:unsmigghh:

Maybe don't buy the budget processor and expect to play on launch day, is all I'm saying.

I think you took a wrong turn on the way to the reddit fanboy trolling wars.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
WCCF is talking about Ampere getting delayed till late 2020 or early 2021. Any reason to believe these rumors?

https://wccftech.com/nvidia-geforce-rtx-2080-ti-super-q1-2020-launch-rumor/

repiv
Aug 13, 2009

Taima posted:

any reason to believe these rumors?

yeah, if videocardz posts a corroborating rumour

wccf is bad

Klyith
Aug 3, 2007

GBS Pledge Week

gradenko_2000 posted:

are older-than-Ryzen AMD CPUs so bad that reviewers are no longer including them, even when they're still including i7s from 2011, and even when the minimum specs still suggest an FX?

When reviewers add in old stuff for comparison purposes, the most useful way is to choose what is most common among the audience. Relatively few people bought FX chips since they were hot (literally :v:) garbage at the time. So even though the later versions have had a weird long tail of "out-performs the CPUs it got crushed by at release", that info isn't really valuable for anything. Whereas there are still a fair number of people with 2600Ks.


MikeC posted:

I think you took a wrong turn on the way to the reddit fanboy trolling wars.

Hey, why let a complete opposite reading of the data get in the way of an opportunity to smug?

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Paul please take your meds, we are all worried about you.

ufarn
May 30, 2009

Taima posted:

WCCF is talking about Ampere getting delayed till late 2020 or early 2021, any reason to believe these rumors?

https://wccftech.com/nvidia-geforce-rtx-2080-ti-super-q1-2020-launch-rumor/
Be hilarious if the smart move for me a year and a half ago was actually to have bought a used 1080 Ti instead of a used 1070.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Okay, who's the bozo who used their genie's wish for AMD to land an incredible streak of lucky breaks in the industry instead of, say, something important, like world peace, the world pulling together to solve global warming, or an end to anti-scientific thought?

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
On one hand it'd suck if my Super got obsoleted so quickly at the beginning of next year, but on the other it's crushing anything I throw at it at 1440p, so whatever

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Endymion FRS MK1 posted:

On one hand it'd suck if my Super got obsoleted so quickly at the beginning of next year, but on the other it's crushing anything I throw at it at 1440p, so whatever

A better video card coming out won't make yours any worse.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Paul MaudDib posted:

I think that's just Ryzen users though?

:unsmigghh:

Maybe don't buy the budget processor and expect to play on launch day, is all I'm saying.

just don't post for a day, just a loving day
