Ham Sandwiches
Jul 7, 2000

PC LOAD LETTER posted:

Diminishing returns tend to kick in pretty quickly after 72Hz for most people; unless you've got 'golden eyes' you won't notice much, if any, difference past 100Hz.

How do people come by these bizarre beliefs? Is anything over 30fps wasted too, because of your eyes?

If you scroll a window around at 240Hz you'll see the difference from 120Hz really clearly, even if your eyes are the plain normal ones most mortals have.


Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Parker Lewis posted:

How big is the difference between a 100Hz and 144Hz G-SYNC monitor? Are there diminishing returns once you start going beyond 100Hz or is it a case of "the more frames, the better"?

I'm trying to decide if a 100Hz 3440x1440 or a 144Hz 2560x1440 would be the better upgrade (from an LG 34UM95); I've never used a G-SYNC monitor or anything above a 60Hz panel.

Frame rate is inversely proportional to perceived motion blur. The higher a monitor's refresh rate, the more complete images it produces per second, and the less time your eyes have to make do with a reduced-quality view of an object in motion.

If you don't play a lot of games where identifying stuff flying across your screen helps you win, then don't worry about it.
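
To put rough numbers on that inverse relationship (a back-of-the-envelope sketch; the 1000px/s scroll speed is just an arbitrary example I picked): a sample-and-hold LCD keeps each frame on screen for the whole refresh period, so an eye tracking a moving object smears it across however far the object moves during that hold.

code:

def hold_time_ms(refresh_hz):
    return 1000.0 / refresh_hz

def blur_px(refresh_hz, speed_px_per_s=1000):
    # eye-tracking smear ~ distance travelled during one frame hold
    return speed_px_per_s * hold_time_ms(refresh_hz) / 1000.0

for hz in (60, 100, 144, 240):
    print(f"{hz}Hz: held {hold_time_ms(hz):.1f}ms -> ~{blur_px(hz):.0f}px of smear")

# 60Hz ~17px, 100Hz 10px, 144Hz ~7px, 240Hz ~4px of smear: blur really is
# inversely proportional to refresh rate, with shrinking absolute gains.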

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Humans can effectively notice up to ~500fps in the periphery and ~200fps in the focal area. Human eyes are not analogous to a shutter or frame buffer; vision is based more on response time and illumination, and it's really good at noticing even minuscule changes in light. 100Hz/fps or higher is good enough but not lifelike, and going 200 or above will look funny and might even hit the uncanny valley, because of subconsciously perceived differences: the periphery is expecting "500fps".

Parker Lewis
Jan 4, 2006

Can't Lose


Sidesaddle Cavalry posted:

Frame rate is inversely proportional to perceived motion blur. The higher a monitor's refresh rate, the more complete images it produces per second, and the less time your eyes have to make do with a reduced-quality view of an object in motion.

If you don't play a lot of games where identifying stuff flying across your screen helps you win, then don't worry about it.

I mostly play single-player, third-person action/RPGs, not twitch FPS games, so I'm leaning towards sticking with 3440x1440 and sacrificing frame rates over 100 to do so.

Going back to 2560x1440 feels strangely claustrophobic after getting used to the 21:9 aspect ratio. Even something like GTA V feels a lot less cinematic when I go back to playing it at 16:9.

Gwaihir
Dec 8, 2009
Hair Elf

HalloKitty posted:

Personally I remember the CRT days, and a CRT driven at 60Hz would be uncomfortable to look at; 75Hz was OK, but I could still perceive flickering, especially in peripheral vision. 85Hz and above seemed to stop the flickering, though, so all these VR headsets targeting 90Hz are probably bang on: high enough to be smooth, but low enough to be achievable by a graphics card you can buy.

Of course, perception may vary.

Echoing this, as another guy with a 144Hz monitor. There's not much difference to me in-game, mainly because I can't push 144fps in anything I play but War Thunder with a single 980.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Parker Lewis posted:

I mostly play single-player, third-person action/RPGs, not twitch FPS games, so I'm leaning towards sticking with 3440x1440 and sacrificing frame rates over 100 to do so.

Going back to 2560x1440 feels strangely claustrophobic after getting used to the 21:9 aspect ratio. Even something like GTA V feels a lot less cinematic when I go back to playing it at 16:9.

Aspect ratio and image fidelity do seem to have positive effects across more game genres.

Also, there's a school of thought that sees motion blur as a good thing, if it replicates the experience of the widely accepted 24fps film format, for example, or if it's added artificially to make motion appear more dramatic.

PC LOAD LETTER
May 23, 2005
WTF?!

Rakthar posted:

How do people come by these bizarre beliefs? Is anything over 30fps wasted too, because of your eyes?

If you scroll a window around at 240Hz you'll see the difference from 120Hz really clearly, even if your eyes are the plain normal ones most mortals have.

So I didn't say anything over 30fps is wasted, and I personally can't see any difference at all with a 120Hz or '240Hz' screen, nor can most people. I do fine with 60Hz, and I can see a difference between 60Hz and 85Hz, but not between 72Hz and 85Hz, FWIW. 72Hz is ~3x the rate many movies are displayed at, and the monitor refresh rate won't sync up with fluorescent lights to cause flicker either, so it's hardly an unreasonable number or something that was just made up.
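
Rough numbers behind that, for anyone who wants to check the math (the beat-frequency part is my own crude model, not anything rigorous): film is 24fps, so 72Hz shows each film frame exactly three times, and fluorescent tubes on 60Hz mains pulse at 120Hz, so the nearest refresh harmonic sits comfortably far from the light's flicker.

code:

def film_repeats(refresh_hz, film_fps=24.0):
    # an integer result means judder-free pulldown
    return refresh_hz / film_fps

def beat_hz(refresh_hz, light_hz=120.0, harmonics=4):
    # crude visible "beat": nearest refresh harmonic vs the light's pulse rate
    return min(abs(m * refresh_hz - light_hz) for m in range(1, harmonics + 1))

for hz in (60, 72, 85):
    print(f"{hz}Hz: film frame shown {film_repeats(hz):.1f}x, "
          f"~{beat_hz(hz):.0f}Hz beat against 120Hz lighting")

# 60Hz: 2.5x (judder) and a 0Hz beat, so any phase drift shows up as slow
# shimmer; 72Hz: exactly 3x and a ~24Hz beat, too fast to read as flicker.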

FaustianQ posted:

Humans can effectively notice up to ~500fps in the periphery and ~200fps in the focal area. Human eyes are not analogous to a shutter or frame buffer; vision is based more on response time and illumination, and it's really good at noticing even minuscule changes in light. 100Hz/fps or higher is good enough but not lifelike, and going 200 or above will look funny and might even hit the uncanny valley, because of subconsciously perceived differences: the periphery is expecting "500fps".

Fighter pilots tend to be the guys who can actually see and perceive detail at ~200fps, but most people just can't do that. I think the average most people can see is somewhere around 45fps, but elderly people are included in that average, which probably won't apply to most posters here or gamers in general. I don't know what the figure is for people 30 and under, but I think it's safe to say you almost certainly won't need anything near 200Hz, or for that matter 120Hz, to get smooth gameplay.

PC LOAD LETTER fucked around with this message at 21:14 on Jun 19, 2015

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

PC LOAD LETTER posted:

So I didn't say anything over 30fps is wasted, and I personally can't see any difference at all with a 120Hz or '240Hz' screen, nor can most people. I do fine with 60Hz, and I can see a difference between 60Hz and 85Hz, but not between 72Hz and 85Hz, FWIW. 72Hz is ~3x the rate many movies are displayed at, and the monitor refresh rate won't sync up with fluorescent lights to cause flicker either, so it's hardly an unreasonable number or something that was just made up.

Fighter pilots tend to be the guys who can actually see and perceive detail at ~200fps, but most people just can't do that. I think the average most people can see is somewhere around 45fps, but elderly people are included in that average, which probably won't apply to most posters here or gamers in general. I don't know what the figure is for people 30 and under, but I think it's safe to say you almost certainly won't need anything near 200Hz, or for that matter 120Hz, to get smooth gameplay.

Nah, what you're thinking of for fighter pilots is visual acuity, which is more the ability to resolve detail from the rods inside the eye. The human ability to detect light changes, which is what constitutes a different "frame", is much more basic than that; not to say variance isn't possible, but it's much more even across the population. That doesn't mean you won't start getting smooth image quality at 60Hz and greater; what I mean is that differences are detectable by the human eye up to 500fps, on a sliding scale of "real". 120Hz is smooth, but the eye doesn't see it as "real", because light in reality updates at a different speed than the light from the generated image on the screen. It's not an argument for 200Hz monitors, though, but I could see it being one for "full immersion VR!".

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
I think part of the issue here that gets completely lost in the whole "200fps!" "no, 500fps!" "no, only 100fps!" banter is that--for the vast majority of people--anything over the mid-80s is "smooth enough to be pleasing." If you sit a 100Hz monitor right next to a 240Hz monitor and scroll black text on a white background at high speed, will average people be able to point out which one is which? Yup. Will most of them care? Nope.

PC LOAD LETTER is effectively correct when he talks about diminishing returns. You can still figure out which panel is which, but the difference between a 144Hz panel and a 240Hz panel in normal use is going to be perceptually a hell of a lot smaller than the difference between a 60Hz panel and a 144Hz one, even though there are actually fewer additional frames going 60->144 than 144->240.
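
The arithmetic behind that is worth spelling out (trivial, but it makes the point): what shrinks with each jump isn't the frame count but the frame time.

code:

def frame_time_ms(hz):
    return 1000.0 / hz

for lo, hi in ((60, 144), (144, 240)):
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo}Hz -> {hi}Hz: +{hi - lo} frames/s, "
          f"but each frame only arrives {saved:.2f}ms sooner")

# 60 -> 144 shaves ~9.72ms off every frame; 144 -> 240 shaves only ~2.78ms,
# despite adding more frames per second. Hence the diminishing returns.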

To answer the original question: Parker Lewis, you are much less likely to get a 100Hz monitor and end up sitting there going "man, I wish I had 144Hz!" than you are to get a 2560x1440 one and go "man, I wish I had gone 3440x1440!" (assuming, of course, that you have GPU(s) capable of pushing 3440x1440 at acceptable framerates).

Ham Sandwiches
Jul 7, 2000

DrDork posted:

I think part of the issue here that gets completely lost in the whole "200fps!" "no, 500fps!" "no, only 100fps!" banter is that--for the vast majority of people--anything over the mid-80s is "smooth enough to be pleasing." If you sit a 100Hz monitor right next to a 240Hz monitor and scroll black text on a white background at high speed, will average people be able to point out which one is which? Yup. Will most of them care? Nope.

I find these sorts of assumptions on behalf of the audience very weird. Do you recall when 720p / 1080p / HDTV in general was just getting big? How many articles did you see claiming that most people could not tell the difference between SD and HD content? How many examples of HDTVs hooked up with composite cables were mocked on the internet?

And yet these days, do you honestly think people can't tell the difference between extra pixels? The fact that 1440p is now considered the minimum for 27" screens when that res wasn't even viable a few years ago seems to say otherwise.

Seeing the same kind of debate about how "people don't need it" and "most can't tell the difference", only now applied to Hz instead of pixels, is odd to me. I guess this is another thing I disagree on. Smooth enough to be pleasing? The difference is immediately noticeable; just because 100 feels like a big step up from 60 or whatever they're used to doesn't mean they can't or won't grow to appreciate the difference over time.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rakthar posted:

And yet these days, do you honestly think people can't tell the difference between extra pixels? The fact that 1440p is now considered the minimum for 27" screens when that res wasn't even viable a few years ago seems to say otherwise.

You're conflating the simple ability to discern a difference with what's "good enough" for it to be safe to move on to other things; only the latter is the deciding factor. Yes, if you put a 720p screen right next to a 1080p one, people can (usually, but not always) say which is which. If you just give them a 720p on its own and ask what it is, it's much harder. That everyone recommends 1440p at 27" now is because it has become the "sweet spot" in terms of what's available for the price--not that people simply cannot discern 1440p from 4K or something; just that 4K for most people is not worth the extra cost, and the price difference between 1080p and 1440p is fairly small now. A while back, when the cheapest 1440p 27" was $800, not many people recommended them because they didn't make a lot of sense at the time.

Same, too, with the argument about sky-high refresh rates. Can people pick which is which? Yeah, usually (though again, it's much easier to pick 60 vs 144 than it is 144 vs 240--just like it's much easier to tell a game running at 30FPS vs 60FPS than 100FPS vs 130FPS). Is worrying about 100Hz vs 144Hz worthwhile when there are other factors more likely to make a meaningful impact on which monitor he should buy? No, it's not--100Hz is almost certainly "good enough" compared to 144Hz to be able to move on to other bits, like resolution.

e; I mean, I'm sure at some point going forward 240Hz or whatever will become "the standard" just as 144Hz is slowly replacing 60Hz, because it is better. But we're not there yet, and you need some pretty hilarious GPU power to push 100+ frames at even 1440p right now, so unless you're rolling with SLI'd 980Ti's or something, you literally will never know the difference between 100 and 144Hz outside of some very minor black-on-white scroll smoothing, or if your idea of entertainment is sitting in CS:GO and spinning in circles all day.

DrDork fucked around with this message at 02:03 on Jun 20, 2015

DonkeyHotay
Jun 6, 2005

I've been tempted for a while to get a monitor with a higher resolution than my current 1080p. Right now Dell is selling the U2715H for a fairly sweet $540 plus a $200 gift card. If 90% of my computer time is spent either browsing the web or gaming, would I be better served by the Dell, or by a G-Sync or high-refresh monitor?

theblackw0lf
Apr 15, 2003

"...creating a vision of the sort of society you want to have in miniature"
On the Acer XB270HU, the brightness setting really just controls the backlight, correct? Is there a way to change the actual brightness of the monitor?

Running through Microsoft's calibration test, it seems my monitor is too bright. But the only way to really change it is by using the control panel for my video card. Changing the brightness on the monitor just controls the backlight.

theblackw0lf fucked around with this message at 11:19 on Jun 20, 2015

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Rakthar posted:

And yet these days, do you honestly think people can't tell the difference between extra pixels?

I'm sure many people won't notice the difference. People don't seem to care about huge artifacts on their TV from low-quality streams; as long as there's something telling them it's Full HD, they're pacified.

How many times have you seen (more so 10 years ago than now) people with widescreen TVs set to stretch 4:3 content right out, and yet they never even noticed, never thought to check what was wrong?

HalloKitty fucked around with this message at 12:27 on Jun 20, 2015

Corin Tucker's Stalker
May 27, 2001


One bullet. One gun. Six Chambers. These are my friends.
I wanted a glossy monitor with an HDMI port for console games, so I picked up a Dell S2415H that was on sale for $160. This thing's great: fantastic image quality, nice design with a super-thin bezel, and the speakers are better than I was expecting. The lack of height adjustment in the stand is a bummer, but fortunately it's set at a perfect height for me.

Doing a side-by-side comparison with my old matte Dell is surprising. I had grown accustomed to the protective coating on the matte screen, but now it seems so grainy while the new one is crystal clear.

CopperHound
Feb 14, 2012

theblackw0lf posted:

On the Acer XB270HU, the brightness setting really just controls the backlight, correct? Is there a way to change the actual brightness of the monitor?

Running through Microsoft's calibration test, it seems my monitor is too bright. But the only way to really change it is by using the control panel for my video card. Changing the brightness on the monitor just controls the backlight.

Uhm.. Are you thinking of gamma instead of brightness? What's wrong with turning down the backlight? Backlight is the brightness control unless you want to squash the dynamic range of your monitor.
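
To illustrate the squashing (toy numbers: I'm assuming an 8-bit panel, and the 0.7 dim factor and 250-nit peak are made up): dimming digitally remaps pixel values and collapses distinct levels together, while dimming the backlight scales the light output and keeps all 256 steps.

code:

levels = list(range(256))                       # 8-bit input values

gpu_dimmed = [int(v * 0.7) for v in levels]     # digital "brightness" remap
print("distinct levels after GPU dimming:", len(set(gpu_dimmed)))     # 179

backlight_nits = [v / 255 * 250 * 0.7 for v in levels]  # same dim via backlight
print("distinct levels via the backlight:", len(set(backlight_nits)))  # 256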

CopperHound fucked around with this message at 17:18 on Jun 21, 2015

Bald Stalin
Jul 11, 2004

Our posts
I'm in the market for a big 144Hz monitor to game on. I've been using two Asus VW246Hs for years, so cheap TN panels without height adjustment, bad color, and light bleed don't bother me. It seems like the biggest monitors you can get that are 144Hz and above 1080p are 27" only? Any recommendations?

Etrips
Nov 9, 2004

Having Teemo Problems?
I Feel Bad For You, Son.
I Got 99 Shrooms
And You Just Hit One.

d3rt posted:

I'm in the market for a big 144Hz monitor to game on. I've been using two Asus VW246Hs for years, so cheap TN panels without height adjustment, bad color, and light bleed don't bother me. It seems like the biggest monitors you can get that are 144Hz and above 1080p are 27" only? Any recommendations?

How deep are your pockets? Would you want G-Sync/Freesync?

Bald Stalin
Jul 11, 2004

Our posts
Deep, though if such a monitor exists for under $700 I might be able to pick up two.

Oh, and just from reading two Newegg reviews, apparently I can't use Freesync because I have an Nvidia video card (and plan on buying a 980 Ti soon).

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

d3rt posted:

Deep, though if such a monitor exists for under $700 I might be able to pick up two.

Oh, and just from reading two Newegg reviews, apparently I can't use Freesync because I have an Nvidia video card (and plan on buying a 980 Ti soon).

Would you be interested in a single 21:9 aspect ratio wide monitor? A bunch of those are going to hit the market soon, with G-sync too.

Confusion
Apr 3, 2009

PC LOAD LETTER posted:

So I didn't say anything over 30fps is wasted, and I personally can't see any difference at all with a 120Hz or '240Hz' screen, nor can most people. I do fine with 60Hz, and I can see a difference between 60Hz and 85Hz, but not between 72Hz and 85Hz, FWIW. 72Hz is ~3x the rate many movies are displayed at, and the monitor refresh rate won't sync up with fluorescent lights to cause flicker either, so it's hardly an unreasonable number or something that was just made up.

Fighter pilots tend to be the guys who can actually see and perceive detail at ~200fps, but most people just can't do that. I think the average most people can see is somewhere around 45fps, but elderly people are included in that average, which probably won't apply to most posters here or gamers in general. I don't know what the figure is for people 30 and under, but I think it's safe to say you almost certainly won't need anything near 200Hz, or for that matter 120Hz, to get smooth gameplay.

The problem is that humans simply do not see frame-by-frame.

The fighter pilot test that the 200fps figure comes from is the following: flash a bright white shape of an airplane on a black background, and have the fighter pilot attempt to identify the airplane. In this test they could identify airplanes flashed for only 1/200th of a second.

However, you cannot convert this to a frame rate humans are supposed to be able to see. For instance, with a black airplane shape on a bright background, they probably couldn't see it even at 1/50th of a second. Or show two shapes in succession at 1/200th each, and they sure as hell won't be able to identify either one. That is because the rods and cones in our eyes have a sort of afterglow effect when they detect light, which makes them transmit a signal to the brain for longer than the light is actually there.
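
A toy model of that afterglow, if it helps (the 20ms decay constant and 0.1 threshold are numbers I invented for illustration, not measured physiology): a first-order response to a 5ms flash stays above a detection threshold several times longer than the flash itself lasts.

code:

TAU_MS, DT_MS, THRESHOLD = 20.0, 1.0, 0.1

def respond(stimulus):
    # leaky integrator: response rises with light, then decays slowly
    r, out = 0.0, []
    for s in stimulus:
        r += (s - r) * (DT_MS / TAU_MS)
        out.append(r)
    return out

flash = [1.0] * 5 + [0.0] * 95                  # 5ms flash, then darkness
visible_ms = sum(1 for r in respond(flash) if r > THRESHOLD)
print(f"5ms flash stays above threshold for ~{visible_ms}ms")   # ~18ms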

Truga
May 4, 2014
Lipstick Apathy
Humans don't see in frames, so the more the better, really. :v:

Bald Stalin
Jul 11, 2004

Our posts

Sidesaddle Cavalry posted:

Would you be interested in a single 21:9 aspect ratio wide monitor? A bunch of those are going to hit the market soon, with G-sync too.

Sure, that could work, provided there's a reliable way to split the screen. Windows 8.1's split screen doesn't seem to work with anything other than their weird 'apps'.

This looks pretty good too: http://www.newegg.com/Product/Product.aspx?Item=N82E16824236466 (ASUS MG279Q), but it says Freesync. Will I still be able to get the higher refresh rate on an Nvidia card?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

d3rt posted:

This looks pretty good too: http://www.newegg.com/Product/Product.aspx?Item=N82E16824236466 (ASUS MG279Q), but it says Freesync. Will I still be able to get the higher refresh rate on an Nvidia card?

Freesync/Gsync don't affect maximum refresh rate, just what happens when you don't hit the monitor's configured refresh rate with the game's frame rate.
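
In pseudologic (my sketch, not any real driver API; the 35-90Hz window is the MG279Q's range mentioned just below): with a fixed refresh a late frame either tears or gets held back a full cycle, while variable refresh just delays the scan-out until the frame is ready, within the panel's supported range.

code:

def fixed_refresh(frame_time_ms, refresh_hz=144, vsync=True):
    period = 1000.0 / refresh_hz
    if frame_time_ms <= period:
        return "frame shown on schedule"
    return "previous frame repeated (stutter)" if vsync else "mid-scan swap (tearing)"

def variable_refresh(frame_time_ms, vrr_min_hz=35, vrr_max_hz=90):
    hz = 1000.0 / frame_time_ms
    if vrr_min_hz <= hz <= vrr_max_hz:
        return f"panel refreshes at {hz:.0f}Hz, in step with the game"
    return "outside the VRR window: back to fixed-refresh behavior"

print(fixed_refresh(12.5))       # 80fps on a 144Hz panel: stutter or tearing
print(variable_refresh(12.5))    # the same frame with VRR: a clean 80Hz refresh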

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

d3rt posted:

Sure, that could work, provided there's a reliable way to split the screen. Windows 8.1's split screen doesn't seem to work with anything other than their weird 'apps'.

This looks pretty good too: http://www.newegg.com/Product/Product.aspx?Item=N82E16824236466 (ASUS MG279Q), but it says Freesync. Will I still be able to get the higher refresh rate on an Nvidia card?

Freesync is an option, and on this monitor it works between 35-90Hz, but you can turn it on or off. Since you don't have the option to use Freesync, it will just work like any other 144Hz monitor.

Bald Stalin
Jul 11, 2004

Our posts
Sounds like I will prefer a Gsync monitor then.

27"+ >1920x1080p 144Hz preferably G-sync monitor preferably under $700 but there's $ wiggle room. Any actual existing monitor recommendations? Thanks for the helpful replies everyone.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

HalloKitty posted:

Freesync is an option, and on this monitor it works between 35-90Hz, but you can turn it on or off. Since you don't have the option to use Freesync, it will just work like any other 144Hz monitor.

Wait, it's a 144Hz monitor and Freesync only functions up to 90? Really?

Why on earth would you ever miss frames when targeting 144Hz... :rolleye:

repiv
Aug 13, 2009

Subjunctive posted:

Wait, it's a 144Hz monitor and Freesync only functions up to 90? Really?

Why on earth would you ever miss frames when targeting 144Hz... :rolleye:

To say Asus half-assed that monitor is an understatement: in its original release it was configured to accept 144Hz but actually skipped every 6th frame, as if it were 120Hz. They had to recall all stock to fix the firmware :v:
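
The bug in miniature (my sketch of what "skipped every 6th frame" works out to): accept a 144Hz signal, drop one frame in six, and you're really displaying 120 frames a second.

code:

incoming = list(range(144))                     # one second of frames at 144Hz
displayed = [f for i, f in enumerate(incoming) if (i + 1) % 6 != 0]
print(len(displayed))                           # 120: a "144Hz" panel showing 120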

BurritoJustice
Oct 9, 2012

d3rt posted:

Sounds like I will prefer a Gsync monitor then.

27"+ >1920x1080p 144Hz preferably G-sync monitor preferably under $700 but there's $ wiggle room. Any actual existing monitor recommendations? Thanks for the helpful replies everyone.

The Acer XB270HU is $750 and is 1440p/144Hz/IPS/GSync/27". Ticks all your boxes and then some.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Subjunctive posted:

Wait, it's a 144Hz monitor and Freesync only functions up to 90? Really?

Why on earth would you ever miss frames when targeting 144Hz... :rolleye:

This is especially odd as not only is it not a limitation of the spec, there are other TN monitors that have a Freesync range from 40-144Hz. Although I think 40 is a bit high on the low end, and I'd prefer to see around 30, this shows that ASUS have done something bizarre to only allow 90 on the top end.

BurritoJustice
Oct 9, 2012

HalloKitty posted:

This is especially odd as not only is it not a limitation of the spec, there are other TN monitors that have a Freesync range from 40-144Hz. Although I think 40 is a bit high on the low end, and I'd prefer to see around 30, this shows that ASUS have done something bizarre to only allow 90 on the top end.

Supposedly the limitation is due to it being the only monitor that allows Freesync and overdrive at the same time, and the complications from that led to the crappy range.

Freesync really is a rainforest compared to the nice walled garden of GSync; there's no consistency in the experience between Freesync monitors, whereas that consistency is a real hallmark of GSync.

Incredulous Dylan
Oct 22, 2004

Fun Shoe
I'm really impressed by the picture quality on my LG 34UM95, but I think I'm ready to go back to a real gaming monitor. I've written about this a few times in the thread already but the monitor's response times are just too slow for the super twitch gaming that I do. The tearing is god awful and very distracting without v-sync on, and at 3440x1440 even with a 980 ti I can't guarantee I'm gonna always be above 60 fps for v-sync when keeping the eye candy on. Even when just playing TF2, with my high mouse sensitivity the monitor cannot display fast enough to keep up with fast spins, etc. Not enough games I play support the full 3440x1440 properly, either (mostly Valve titles and for good reason, but it sucks). I'm gonna pick up a G-sync monitor, compare the two directly and then return/sell one. I'll probably just Craigslist the 34UM95 since I didn't keep the huge box for it, but if I do put it up on SA mart I'll link here for first dibs.

I've been following the thread so afaik I should be waiting for a few months for the new crop of g-sync monitors to be released.

Incredulous Dylan fucked around with this message at 17:31 on Jun 22, 2015

Mozi
Apr 4, 2004

Forms change so fast
Time is moving past
Memory is smoke
Gonna get wider when I die
Nap Ghost
Possibly somewhat relevant, but the ULMB feature on the XB270HU works really really well. So it's not just a good monitor for playing new games at max settings and feeling like your 40-odd FPS is higher than it is by using GSYNC, but you can pop open some 2d game or older game like TF2, switch over to 100Hz/ULMB, and really see the benefits there. ULMB didn't really factor into my decision to buy the monitor but it's a really nice feature.

Incredulous Dylan
Oct 22, 2004

Fun Shoe
I just placed an order for a refurbished XB270HU at $699 on Amazon, so we will see how it shakes out. I was checking out the new monitors coming and didn't see much beyond size that competes with it. I'm a high-volume Amazon seller myself, so I sent them an email with the order to ensure everything goes as smoothly as possible and I don't waste time/their money receiving a defective product. Looking forward to comparing the two! Any ideas on a fair price to sell the 34UM95 at if it's in perfect condition?

Incredulous Dylan fucked around with this message at 18:35 on Jun 22, 2015

Bald Stalin
Jul 11, 2004

Our posts

BurritoJustice posted:

The Acer XB270HU is $750 and is 1440p/144Hz/IPS/GSync/27". Ticks all your boxes and then some.

Purchased, thank you. Found it here http://www.nothingbutsavings.com/Pr...xK-bhoCc6zw_wcB for $728 new. Google Trusted store, whatever that means.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

BurritoJustice posted:

Supposedly the limitation is due to it being the only monitor that allows Freesync and overdrive at the same time, and the complications from that led to the crappy range.

Freesync really is a rainforest compared to the nice walled garden of GSync; there's no consistency in the experience between Freesync monitors, whereas that consistency is a real hallmark of GSync.

I'm getting mine tomorrow; I'll see if I can disable overdrive and get a better Freesync range in return. What the hell does overdrive do for me anyway?

repiv
Aug 13, 2009

FaustianQ posted:

What the hell does overdrive do for me anyway?

Improves response time using overshoot trickery: https://youtu.be/2Fi1QHhdqV4?t=2574
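
The gist in one line (a simplified sketch of the idea, not a real panel algorithm; the 0.3 gain is arbitrary): the controller commands a value past the target so the slow liquid crystal transitions faster, at the risk of visible overshoot ghosting if the gain is too aggressive.

code:

def overdrive(current, target, gain=0.3, lo=0, hi=255):
    driven = target + gain * (target - current)   # push past the target value
    return max(lo, min(hi, round(driven, 1)))

print(overdrive(0, 128))       # 166.4: a rising transition is driven harder
print(overdrive(255, 128))     # 89.9: a falling transition is undershot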

Parker Lewis
Jan 4, 2006

Can't Lose


Incredulous Dylan posted:

I'm really impressed by the picture quality on my LG 34UM95, but I think I'm ready to go back to a real gaming monitor. I've written about this a few times in the thread already but the monitor's response times are just too slow for the super twitch gaming that I do. The tearing is god awful and very distracting without v-sync on, and at 3440x1440 even with a 980 ti I can't guarantee I'm gonna always be above 60 fps for v-sync when keeping the eye candy on. Even when just playing TF2, with my high mouse sensitivity the monitor cannot display fast enough to keep up with fast spins, etc. Not enough games I play support the full 3440x1440 properly, either (mostly Valve titles and for good reason, but it sucks). I'm gonna pick up a G-sync monitor, compare the two directly and then return/sell one. I'll probably just Craigslist the 34UM95 since I didn't keep the huge box for it, but if I do put it up on SA mart I'll link here for first dibs.

I've been following the thread so afaik I should be waiting for a few months for the new crop of g-sync monitors to be released.

Sup 34UM95 / 980 Ti buddy. I've actually found that an overclocked 980 Ti can handle just about anything other than Witcher 3 Ultra at 3440x1440; even GTA V with everything on High runs well. But I do share a lot of the same gripes about using the 34UM95 as a gaming monitor.

Valve and Blizzard games (other than WoW which works great) being locked to 16:9 is a drag. NBA 2K15 doesn't have any ultrawide support either. Having to use hacks like Flawless Widescreen to enable/fix 21:9 for games like Mass Effect 3 and The Witcher 2 leaves a bad taste in my mouth. But I will say that I have had a pretty high success rate of games that I've been wanting to play having native support for 21:9, probably better than 75%.

I don't play many twitch games so the response time isn't as big of a problem for me as it sounds like it is for you but it does kind of suck feeling like I'm living with tradeoffs on a $750 monitor.

It's a tough call because when 21:9 works I feel like pretty much any game is more enjoyable at 3440x1440 than 2560x1440; I've been playing BioShock Infinite, Dishonored, Guild Wars 2, Skyrim, and Civ V lately and they all benefit from the wider screen and I have a hard time going back to 2560x1440 (I still have a 27" Benq sitting around). Upgrading from a 970 to an overclocked 980 Ti has pretty much negated the performance argument (I used to justify sticking to 1080p or 1440p because the 970 had trouble with some games at 3440x1440). But there is something nice in knowing that a 2560x1440 monitor is going to support every game and be one less thing to worry about.

I guess in the end I have decided that 3440x1440 works enough of the time that the ultrawide experience makes it worth it to me to deal with using widescreen hacks or dealing with pillarboxed 16:9 games for the handful of titles that don't support it. I guess that living with hacks and games being locked to 16:9 just comes with the territory of owning an ultrawide monitor right now.

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

Parker Lewis posted:

Sup 34UM95 / 980 Ti buddy. I've actually found that an overclocked 980 Ti can handle just about anything other than Witcher 3 Ultra at 3440x1440; even GTA V with everything on High runs well. But I do share a lot of the same gripes about using the 34UM95 as a gaming monitor.

Valve and Blizzard games (other than WoW which works great) being locked to 16:9 is a drag. NBA 2K15 doesn't have any ultrawide support either. Having to use hacks like Flawless Widescreen to enable/fix 21:9 for games like Mass Effect 3 and The Witcher 2 leaves a bad taste in my mouth. But I will say that I have had a pretty high success rate of games that I've been wanting to play having native support for 21:9, probably better than 75%.

I don't play many twitch games so the response time isn't as big of a problem for me as it sounds like it is for you but it does kind of suck feeling like I'm living with tradeoffs on a $750 monitor.

It's a tough call because when 21:9 works I feel like pretty much any game is more enjoyable at 3440x1440 than 2560x1440; I've been playing BioShock Infinite, Dishonored, Guild Wars 2, Skyrim, and Civ V lately and they all benefit from the wider screen and I have a hard time going back to 2560x1440 (I still have a 27" Benq sitting around). Upgrading from a 970 to an overclocked 980 Ti has pretty much negated the performance argument (I used to justify sticking to 1080p or 1440p because the 970 had trouble with some games at 3440x1440). But there is something nice in knowing that a 2560x1440 monitor is going to support every game and be one less thing to worry about.

I guess in the end I have decided that 3440x1440 works enough of the time that the ultrawide experience makes it worth it to me to deal with using widescreen hacks or dealing with pillarboxed 16:9 games for the handful of titles that don't support it. I guess that living with hacks and games being locked to 16:9 just comes with the territory of owning an ultrawide monitor right now.

The upshot is that 3440x1440 is a new standard that companies are going to start marketing toward, so it should get a lot easier for games to support ultrawide gaming.

I can never go back to a 16:9 monitor ever again; 21:9 is just too drat good.


Bald Stalin
Jul 11, 2004

Our posts
People like me like dual screens because it's quick and easy to move windows between the two. Are you guys with ultrawides using software to 'split' them into two? Or just buying two ultrawides? As far as I can tell, Windows 8.1 doesn't support doing this natively, unless you count their lovely 'apps'.
