|
PC LOAD LETTER posted:Diminishing returns tend to kick in pretty quickly after 72Hz for most people; unless you've got 'golden eyes' you won't notice much if any difference past 100Hz. How do people have these bizarre beliefs? Is anything over 30fps wasted too, because of your eyes? If you scroll a window around at 240Hz you'll really clearly see the difference from 120Hz, even if your eyes are the plain normal ones most mortals have.
|
# ? Jun 19, 2015 18:13 |
|
Parker Lewis posted:How big is the difference between a 100Hz and 144Hz G-SYNC monitor? Are there diminishing returns once you start going beyond 100Hz or is it a case of "the more frames, the better"? Frame rates are inversely proportional to perceived motion blur. The higher the refresh rate of a monitor, the better it is at producing more complete images per second, and the less time your eyes have to make do with tracking an object in motion in reduced quality. If you don't play a lot of games where identifying stuff flying across your screen helps you win, then don't worry about it.
|
# ? Jun 19, 2015 18:35 |
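A quick sketch of the inverse relationship Sidesaddle Cavalry describes: on a sample-and-hold panel, each frame persists for the full refresh interval, and that hold time is what a tracking eye smears into motion blur. This is illustrative only; it ignores pixel response time and strobing modes like ULMB.

```python
# Each frame on a sample-and-hold display is held for the whole refresh
# interval; the longer the hold, the more blur a tracking eye perceives.
def frame_persistence_ms(refresh_hz: float) -> float:
    """Milliseconds each frame stays on screen at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 100, 144, 240):
    print(f"{hz:>3} Hz -> each frame held for {frame_persistence_ms(hz):.2f} ms")
```

Doubling the refresh rate halves the hold time, which is why the jump from 60Hz is so obvious and the later jumps less so.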
|
Humans can effectively notice up to 500fps in the periphery and ~200fps in a focus area. Human eyes are not analogous to a shutter or frame buffer; vision is based more on response time and illumination, and is really good at noticing even minuscule changes in light. 100Hz or higher is good enough but not lifelike, and going to 200 or above will look funny and might even hit uncanny valley due to the subconsciously perceived differences, because the periphery is expecting "500fps".
|
# ? Jun 19, 2015 18:36 |
|
Sidesaddle Cavalry posted:Frame rates are inversely proportional to perceived motion blur. The higher the refresh rate of a monitor, the better it is at producing more complete images per second, and the less time your eyes have to make do with tracking an object in motion in reduced quality. I mostly play single player, third person action / RPGs, not twitch FPS games, so I'm leaning towards sticking with 3440x1440 and sacrificing FPS over 100 to do so. Going back to 2560x1440 feels strangely claustrophobic after getting used to the 21:9 aspect ratio. Even something like GTA V feels a lot less cinematic when I go back to playing it at 16:9.
|
# ? Jun 19, 2015 19:05 |
|
HalloKitty posted:Personally I remember the CRT days, and a CRT driven at 60Hz would be uncomfortable to look at, 75Hz was OK but I could still perceive flickering, especially when viewed in peripheral vision; however 85Hz and above seemed to stop the flickering, so all these VR headsets targeting 90Hz are probably bang on. High enough to be smooth, but low enough to be achievable by a graphics card you can buy. Echoing this, as another guy with a 144hz monitor. There's not much difference to me in game, mainly because I can't push 144 fps in anything I play but war thunder with a single 980.
|
# ? Jun 19, 2015 19:16 |
|
Parker Lewis posted:I mostly play single player, third person action / RPGs, not twitch FPS games, so I'm leaning towards sticking with 3440x1440 and sacrificing FPS over 100 to do so. Aspect ratio and image fidelity does seem to have positive effects on more game genres. Also there's a school of thought that sees motion blur as a good thing, if it replicates the experience of the widely accepted 24fps film format for example, or if it's an artificial additive to make motion appear more dramatic
|
# ? Jun 19, 2015 19:33 |
|
Rakthar posted:How do people have these bizarre beliefs? Anything over 30 fps wasted too because of your eyes? FaustianQ posted:Humans can effectively notice up to 500fps in the peripheral and ~200fps in a focus area. Human eyes are not analogous to a shutter or frame buffer, it's based more on response time and illumination and is really good at noticing even miniscule changes in light. So I didn't say anything over 30fps is wasted and I personally can't see any difference at all with a 120Hz or '240hz' screen nor can most people. I do fine with 60Hz and I can see a difference between 60Hz and 85Hz but not 72Hz and 85Hz FWIW. 72Hz is ~3x what many movies are displayed at and the monitor refresh rate won't sync up with fluorescent lights either to cause flicker so its hardly an unreasonable number or something that has just been made up. PC LOAD LETTER fucked around with this message at 21:14 on Jun 19, 2015 |
# ? Jun 19, 2015 21:12 |
|
PC LOAD LETTER posted:So I didn't say anything over 30fps is wasted and I personally can't see any difference at all with a 120Hz or '240hz' screen nor can most people. I do fine with 60Hz and I can see a difference between 60Hz and 85Hz but not 72Hz and 85Hz FWIW. 72Hz is ~3x what many movies are displayed at and the monitor refresh rate won't sync up with fluorescent lights either to cause flicker so its hardly an unreasonable number or something that has just been made up. Nah, what you're thinking of for fighter pilots is visual acuity, which is more the ability to resolve detail from the rods inside the eye. Human ability to detect light changes, which is what constitutes a different "frame", is much more basic than that; not to say variance isn't possible, but it's much more even across the spectrum. That doesn't mean you won't get smooth image quality at 60Hz and above. What I mean is that the differences are detectable by the human eye up until 500fps, and are a sliding scale of "real". 120Hz is smooth, but the eye doesn't see it as "real" because the light in reality is updated at a different speed than the light for the generated image on the screen. It's not an argument for 200Hz monitors though, but I could see it being one for "full immersion VR!".
|
# ? Jun 19, 2015 21:54 |
|
I think part of the issue here that gets completely lost in the whole "200fps!" "no, 500fps!" "no, only 100fps!" banter is that--for the incredibly vast majority of people--anything over the mid-80s is "smooth enough to be pleasing." If you sit a 100Hz monitor right next to a 240Hz monitor and scroll black-text-on-white-background at high speeds, will average people be able to point out which one is which? Yup. Will most of them care? Nope.

PC LOAD LETTER is effectively correct when he talked about diminishing returns. You can still figure out which panel is which, but the difference between a 144Hz panel and a 240Hz panel in normal use is going to be perceptually a hell of a lot less than the difference between a 60Hz panel and a 144Hz one, even though there are actually fewer additional frames going 60->144 than 144->240.

To answer the original question: Parker Lewis, you are much less likely to get a 100Hz monitor and end up sitting there going "man, I wish I had 144Hz!" than you are to get a 2560x1440 one and go "man, I wish I had gone 3440x1440!" (assuming, of course, that you have GPU(s) capable of pushing 3440x1440 at acceptable framerates).
|
# ? Jun 20, 2015 00:25 |
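The diminishing returns DrDork describes fall straight out of the arithmetic: perceived smoothness tracks frame time, which shrinks hyperbolically with refresh rate, so each additional Hz buys less. A quick illustration:

```python
# Frame-time deltas show why 60 -> 144 Hz feels bigger than 144 -> 240 Hz
# even though the second step adds more frames per second.
def frame_time_ms(hz: float) -> float:
    """Time between frames at a given refresh rate, in milliseconds."""
    return 1000.0 / hz

for lo, hi in [(60, 144), (144, 240)]:
    delta = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo:>3} -> {hi} Hz: each frame arrives {delta:.2f} ms sooner")
# 60 -> 144 Hz shaves ~9.72 ms off every frame; 144 -> 240 Hz only ~2.78 ms.
```

The 84 extra frames from 60 to 144 buy over three times the frame-time improvement of the 96 extra frames from 144 to 240.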
|
DrDork posted:I think part of the issue here that gets completely lost in the whole "200fps!" "no, 500fps!" "no, only 100fps!" banter is that--for the incredibly vast majority of people--anything over the mid-80's is "smooth enough to be pleasing." If you sit a 100Hz monitor right next to a 240Hz monitor and scroll black-text-on-white-background at high speeds, will average people be able to point out which one is which? Yup. Will most of them care? Nope. I find these sorts of assumptions on behalf of the audience very weird. Do you recall when 720p / 1080p / HDTV in general was just getting big? How many articles did you see claiming that most people could not tell the difference between SD and HD content? How many examples of HDTVs hooked up with composite cables were mocked on the internet? And yet these days, do you honestly think people can't tell the difference between extra pixels? The fact that 1440p is now considered the minimum for 27" screens, when that res wasn't even viable a few years ago, seems to say otherwise.

Seeing the same kind of debate about how "people don't need it" and "most can't tell the difference", only now applied to Hz instead of pixels, is odd to me. I guess this is another thing I disagree on. Smooth enough to be pleasing? The difference is immediately noticeable. Just because 100 feels like a big step up from 60 or whatever they're used to doesn't mean that over time they can't or won't grow to appreciate the difference.
|
# ? Jun 20, 2015 01:00 |
|
Rakthar posted:And yet these days, do you honestly think people can't tell the difference between extra pixels? The fact that 1440p is now considered the minimum for 27" screens when that res wasn't even viable a few years ago seems to say otherwise. Same, too, with the argument about sky-high refresh rates. Can people pick which is which? Yeah, usually (though again, it's much easier to pick 60 vs 144 than it is 144 vs 240--just like it's much easier to tell a game running at 30FPS vs 60FPS than 100FPS vs 130FPS). Is worrying about 100Hz vs 144Hz worthwhile when there are other factors more likely to make a meaningful impact on which monitor he should buy? No, it's not--100Hz is almost certainly "good enough" compared to 144Hz to be able to move on to other bits, like resolution. e; I mean, I'm sure at some point going forward 240Hz or whatever will become "the standard" just as 144Hz is slowly replacing 60Hz, because it is better. But we're not there yet, and you need some pretty hilarious GPU power to push 100+ frames at even 1440p right now, so unless you're rolling with SLI'd 980Ti's or something, you literally will never know the difference between 100 and 144Hz outside of some very minor black-on-white scroll smoothing, or if your idea of entertainment is sitting in CS:GO and spinning in circles all day. DrDork fucked around with this message at 02:03 on Jun 20, 2015 |
# ? Jun 20, 2015 01:55 |
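The "pretty hilarious GPU power" remark can be made concrete with raw pixel throughput. This is only a rough proxy (it ignores per-frame fixed costs like draw-call overhead, so it understates the real gap), but the scaling is clear:

```python
# Pixels the GPU must shade per second at a few render targets,
# relative to a 1080p/60 baseline. Rough proxy for GPU load only.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

base = pixels_per_second(1920, 1080, 60)
targets = [
    ("1080p @ 60", 1920, 1080, 60),
    ("1440p @ 100", 2560, 1440, 100),
    ("3440x1440 @ 100", 3440, 1440, 100),
]
for name, w, h, f in targets:
    pps = pixels_per_second(w, h, f)
    print(f"{name}: {pps / 1e6:.0f} Mpx/s ({pps / base:.1f}x 1080p60)")
```

1440p at 100fps is roughly 3x the pixel throughput of 1080p60, and 3440x1440 at 100fps roughly 4x, which is why single-GPU setups of the era struggled to saturate a 144Hz panel at those resolutions.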
|
I've been tempted for a while to get a monitor with a resolution higher than my current 1080p. Right now Dell is selling a U2715H for a fairly sweet $540 + a $200 gift card. If 90% of my computer time is spent either browsing the web or gaming, would I be better served by the Dell, or a G-Sync or high refresh monitor?
|
# ? Jun 20, 2015 03:32 |
|
On the Acer XB270HU, the brightness really just controls the backlight correct? Is there a way to change the actual brightness of the monitor? Running through Microsoft's calibration test, it seems my monitor is too bright. But the only way to really change it is by using the control panel for my video card. Changing the brightness on the monitor just controls the backlight. theblackw0lf fucked around with this message at 11:19 on Jun 20, 2015 |
# ? Jun 20, 2015 10:39 |
|
Rakthar posted:And yet these days, do you honestly think people can't tell the difference between extra pixels? I'm sure many people won't notice the difference. People don't seem to care about huge artifacts on their TV from low quality streams, as long as there's something telling them it's Full HD; they're pacified. How many times have you seen (more so 10 years ago than now) people with widescreen TVs set to stretch the 4:3 content right out, and yet they never even notice, they never thought to check what was wrong? HalloKitty fucked around with this message at 12:27 on Jun 20, 2015 |
# ? Jun 20, 2015 11:36 |
|
I wanted a glossy monitor with an HDMI port for console games, so I picked up a Dell S2415h that was on sale for $160. This thing's great. Fantastic image quality, nice design with a super thin bezel, and the speakers are better than I was expecting. The lack of height adjustment in the stand is a bummer, but fortunately it's set at a perfect height for me. Doing a side-by-side comparison with my old matte Dell is surprising. I had grown accustomed to the protective coating on the matte screen, but now it seems so grainy while the new one is crystal clear.
|
# ? Jun 20, 2015 23:23 |
|
theblackw0lf posted:On the Acer XB270HU, the brightness really just controls the backlight correct? Is there a way to change the actual brightness of the monitor? Uhm.. Are you thinking of gamma instead of brightness? What's wrong with turning down the backlight? Backlight is the brightness control unless you want to squash the dynamic range of your monitor. CopperHound fucked around with this message at 17:18 on Jun 21, 2015 |
# ? Jun 21, 2015 17:09 |
|
I'm in the market for a big 144Hz monitor to game on. Been using two Asus VW246Hs for years so cheap TN panels without height adjustment, bad color and light bleed don't bother me. Seems like the biggest monitors you can get that are 144hz and >1080p are 27" only? Any recommendations?
|
# ? Jun 22, 2015 08:46 |
|
d3rt posted:I'm in the market for a big 144Hz monitor to game on. Been using two Asus VW246Hs for years so cheap TN panels without height adjustment, bad color and light bleed don't bother me. Seems like the biggest monitors you can get that are 144hz and >1080p are 27" only? Any recommendations? How deep are your pockets? Would you want G-Sync/Freesync?
|
# ? Jun 22, 2015 08:51 |
|
Deep, though if such a monitor exists for under $700 I might be able to pick up two. Oh and just from reading 2 newegg reviews, apparently I can't use freesync because I have an nVidia videocard (and plan on buying a 980 Ti soon).
|
# ? Jun 22, 2015 08:53 |
|
d3rt posted:Deep, though if such a monitor exists for under $700 I might be able to pick up two. Would you be interested in a single 21:9 aspect ratio wide monitor? A bunch of those are going to hit the market soon, with G-sync too.
|
# ? Jun 22, 2015 09:39 |
|
PC LOAD LETTER posted:So I didn't say anything over 30fps is wasted and I personally can't see any difference at all with a 120Hz or '240hz' screen nor can most people. I do fine with 60Hz and I can see a difference between 60Hz and 85Hz but not 72Hz and 85Hz FWIW. 72Hz is ~3x what many movies are displayed at and the monitor refresh rate won't sync up with fluorescent lights either to cause flicker so its hardly an unreasonable number or something that has just been made up. The problem is that humans simply do not see frame-by-frame. The fighter pilot test that the 200fps figure comes from is the following: flash a bright white shape of an airplane on a black background, and have the fighter pilot attempt to identify the airplane. In this test they could identify airplanes even when flashed for only 1/200th of a second. However, you cannot convert this to a frame rate humans are supposed to be able to see. For instance, with a black airplane shape on a bright background, they probably couldn't see it even at 1/50th of a second. Or show two shapes in succession at 1/200th each and they sure as hell won't be able to identify either one of them. That is because the rods and cones in our eyes have a sort of afterglow effect when they detect light, which makes them transmit a signal to the brain for longer than there actually is light.
|
# ? Jun 22, 2015 10:40 |
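The afterglow effect described above can be caricatured with a toy leaky-integrator model. To be clear, the time constant here is a made-up illustrative number, not measured physiology; the point is only the qualitative shape, that a very short bright flash keeps signaling well after it ends.

```python
import math

# Toy model of photoreceptor "afterglow": the receptor charges up while
# light hits it and decays exponentially afterwards, so a bright 1/200 s
# (5 ms) flash keeps signaling the brain long after the flash has ended.
# tau_ms is an arbitrary illustrative time constant, not physiology.
def response_after_flash(flash_ms: float, t_ms: float, tau_ms: float = 50.0) -> float:
    """Response level t_ms after a unit-intensity flash of flash_ms ends."""
    peak = 1.0 - math.exp(-flash_ms / tau_ms)  # charge built up during the flash
    return peak * math.exp(-t_ms / tau_ms)     # exponential decay after it ends

for t in (0, 10, 25, 50):
    print(f"{t:>2} ms after a 5 ms flash: response {response_after_flash(5.0, t):.3f}")
```

Even 50 ms after a 5 ms flash the modeled response is still nonzero, which is why a single bright flash is detectable while two rapid successive shapes smear together.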
|
Humans don't see in frames so the more the better, really.
|
# ? Jun 22, 2015 10:42 |
|
Sidesaddle Cavalry posted:Would you be interested in a single 21:9 aspect ratio wide monitor? A bunch of those are going to hit the market soon, with G-sync too. Sure that could work provided there's a reliable way to split the screen. Windows 8.1 split screen doesn't seem to work with anything other than their weird 'apps'. This looks pretty good too http://www.newegg.com/Product/Product.aspx?Item=N82E16824236466 ASUS MG279Q but it says free-sync. Will I still be able to get the higher refresh rate on an nVidia card?
|
# ? Jun 22, 2015 15:34 |
|
d3rt posted:This looks pretty good too http://www.newegg.com/Product/Product.aspx?Item=N82E16824236466 ASUS MG279Q but it says free-sync. Will I still be able to get the higher refresh rate on an nVidia card? Freesync/Gsync don't affect maximum refresh rate, just what happens when you don't hit the monitor's configured refresh rate with the game's frame rate.
|
# ? Jun 22, 2015 15:37 |
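The distinction BurritoJustice draws (adaptive sync only changes what happens when a frame misses the refresh deadline, and never raises the panel's maximum refresh rate) can be sketched with idealized timing math:

```python
import math

# With a fixed refresh (plain V-Sync), a finished frame waits for the
# next refresh tick; with Freesync/G-Sync the monitor refreshes the
# moment the frame is ready. Idealized sketch; ignores scanout time.
def display_latency_ms(render_ms: float, refresh_hz: float, adaptive: bool) -> float:
    """Time from frame start until it appears on screen."""
    if adaptive:
        return render_ms                       # panel waits for the frame
    tick = 1000.0 / refresh_hz                 # fixed refresh interval
    return math.ceil(render_ms / tick) * tick  # round up to the next tick

# A 12 ms frame (~83 fps) on a 144 Hz panel (tick ~6.94 ms):
print(display_latency_ms(12.0, 144, adaptive=True))   # shown as soon as rendered
print(display_latency_ms(12.0, 144, adaptive=False))  # held until the second tick
```

When the game renders faster than the tick, both paths behave the same, which is why a FreeSync monitor driven by an NVIDIA card still works as an ordinary high-refresh panel.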
|
d3rt posted:Sure that could work provided there's a reliable way to split the screen. Windows 8.1 split screen doesn't seem to work with anything other than their weird 'apps'. Freesync is an option, and on this monitor it works between 35-90Hz, but you can turn it on or off. Since you don't have the option to use Freesync, it will just work like any other 144Hz monitor.
|
# ? Jun 22, 2015 15:44 |
|
Sounds like I will prefer a Gsync monitor then. 27"+ >1920x1080p 144Hz preferably G-sync monitor preferably under $700 but there's $ wiggle room. Any actual existing monitor recommendations? Thanks for the helpful replies everyone.
|
# ? Jun 22, 2015 15:46 |
|
HalloKitty posted:Freesync is an option, and on this monitor it works between 35-90Hz, but you can turn it on or off. Since you don't have the option to use Freesync, it will just work like any other 144Hz monitor. Wait, it's a 144Hz monitor and Freesync only functions up to 90? Really? Why on earth would you ever miss frames when targeting 144Hz...
|
# ? Jun 22, 2015 15:49 |
|
Subjunctive posted:Wait, it's a 144Hz monitor and Freesync only functions up to 90? Really? To say Asus half-assed that monitor is an understatement. In its original release it was configured to accept a 144Hz signal but actually skipped every sixth frame, as if it were 120Hz. They had to recall all stock to fix the firmware.
|
# ? Jun 22, 2015 15:55 |
|
d3rt posted:Sounds like I will prefer a Gsync monitor then. The Acer XB270HU is $750 and is 1440p/144Hz/IPS/GSync/27". Ticks all your boxes and then some.
|
# ? Jun 22, 2015 16:28 |
|
Subjunctive posted:Wait, it's a 144Hz monitor and Freesync only functions up to 90? Really? This is especially odd as not only is it not a limitation of the spec, there are other TN monitors that have a Freesync range from 40-144Hz. Although I think 40 is a bit high on the low end, and I'd prefer to see around 30, this shows that ASUS have done something bizarre to only allow 90 on the top end.
|
# ? Jun 22, 2015 16:29 |
|
HalloKitty posted:This is especially odd as not only is it not a limitation of the spec, there are other TN monitors that have a Freesync range from 40-144Hz. Although I think 40 is a bit high on the low end, and I'd prefer to see around 30, this shows that ASUS have done something bizarre to only allow 90 on the top end. Supposedly the limitation is due to it being the only monitor that allows Freesync and overdrive at the same time, and the complications from this led to there being a crappy range. Freesync really is a rainforest compared to the nice walled garden of GSync; there is no consistency in the experience between monitors, which is a real hallmark of GSync.
|
# ? Jun 22, 2015 16:34 |
|
I'm really impressed by the picture quality on my LG 34UM95, but I think I'm ready to go back to a real gaming monitor. I've written about this a few times in the thread already but the monitor's response times are just too slow for the super twitch gaming that I do. The tearing is god awful and very distracting without v-sync on, and at 3440x1440 even with a 980 ti I can't guarantee I'm gonna always be above 60 fps for v-sync when keeping the eye candy on. Even when just playing TF2, with my high mouse sensitivity the monitor cannot display fast enough to keep up with fast spins, etc. Not enough games I play support the full 3440x1440 properly, either (mostly Valve titles and for good reason, but it sucks). I'm gonna pick up a G-sync monitor, compare the two directly and then return/sell one. I'll probably just Craigslist the 34UM95 since I didn't keep the huge box for it, but if I do put it up on SA mart I'll link here for first dibs. I've been following the thread so afaik I should be waiting for a few months for the new crop of g-sync monitors to be released. Incredulous Dylan fucked around with this message at 17:31 on Jun 22, 2015 |
# ? Jun 22, 2015 17:29 |
|
Possibly somewhat relevant, but the ULMB feature on the XB270HU works really really well. So it's not just a good monitor for playing new games at max settings and feeling like your 40-odd FPS is higher than it is by using GSYNC, but you can pop open some 2d game or older game like TF2, switch over to 100Hz/ULMB, and really see the benefits there. ULMB didn't really factor into my decision to buy the monitor but it's a really nice feature.
|
# ? Jun 22, 2015 17:43 |
|
I just placed an order for a refurbished XB270HU at $699 on Amazon, so we will see how it shakes out. I was checking out the new monitors coming and didn't see much beyond size that competes with it. I'm a high volume Amazon seller myself, so I sent them an email with the order to ensure everything goes smoothly as possible and I don't waste time/their money receiving a defective product. Looking forward to comparing the two! Any ideas on a fair price to sell the 34UM95 at if it is in perfect condition?
Incredulous Dylan fucked around with this message at 18:35 on Jun 22, 2015 |
# ? Jun 22, 2015 18:07 |
|
BurritoJustice posted:The Acer XB270HU is $750 and is 1440p/144Hz/IPS/GSync/27". Ticks all your boxes and then some. Purchased, thank you. Found it here http://www.nothingbutsavings.com/Pr...xK-bhoCc6zw_wcB for $728 new. Google Trusted store, whatever that means.
|
# ? Jun 22, 2015 18:14 |
|
BurritoJustice posted:Supposedly the limitation is due to it being the only monitor that allows Freesync and overdrive at the same time, and the complications from this lead to there being a crappy range. I'm getting mine tomorrow, will see if I can disable overdrive and get a better Freesync range in return. What the hell does Overdrive do for me anyway?
|
# ? Jun 22, 2015 19:08 |
|
FaustianQ posted:What the hell does Overdrive do for me anyway? Improves response time using overshoot trickery: https://youtu.be/2Fi1QHhdqV4?t=2574
|
# ? Jun 22, 2015 19:30 |
|
Incredulous Dylan posted:I'm really impressed by the picture quality on my LG 34UM95, but I think I'm ready to go back to a real gaming monitor. I've written about this a few times in the thread already but the monitor's response times are just too slow for the super twitch gaming that I do. The tearing is god awful and very distracting without v-sync on, and at 3440x1440 even with a 980 ti I can't guarantee I'm gonna always be above 60 fps for v-sync when keeping the eye candy on. Even when just playing TF2, with my high mouse sensitivity the monitor cannot display fast enough to keep up with fast spins, etc. Not enough games I play support the full 3440x1440 properly, either (mostly Valve titles and for good reason, but it sucks). I'm gonna pick up a G-sync monitor, compare the two directly and then return/sell one. I'll probably just Craigslist the 34UM95 since I didn't keep the huge box for it, but if I do put it up on SA mart I'll link here for first dibs. Sup 34UM95 / 980 Ti buddy. I've actually found that an overclocked 980 Ti can handle just about anything other than Witcher 3 Ultra at 3440x1440; even GTA V with everything on High runs well. But I do share a lot of the same gripes about using the 34UM95 as a gaming monitor. Valve and Blizzard games (other than WoW which works great) being locked to 16:9 is a drag. NBA 2K15 doesn't have any ultrawide support either. Having to use hacks like Flawless Widescreen to enable/fix 21:9 for games like Mass Effect 3 and The Witcher 2 leaves a bad taste in my mouth. But I will say that I have had a pretty high success rate of games that I've been wanting to play having native support for 21:9, probably better than 75%. I don't play many twitch games so the response time isn't as big of a problem for me as it sounds like it is for you but it does kind of suck feeling like I'm living with tradeoffs on a $750 monitor. 
It's a tough call because when 21:9 works I feel like pretty much any game is more enjoyable at 3440x1440 than 2560x1440; I've been playing BioShock Infinite, Dishonored, Guild Wars 2, Skyrim, and Civ V lately and they all benefit from the wider screen and I have a hard time going back to 2560x1440 (I still have a 27" Benq sitting around). Upgrading from a 970 to an overclocked 980 Ti has pretty much negated the performance argument (I used to justify sticking to 1080p or 1440p because the 970 had trouble with some games at 3440x1440). But there is something nice in knowing that a 2560x1440 monitor is going to support every game and be one less thing to worry about. I guess in the end I have decided that 3440x1440 works enough of the time that the ultrawide experience makes it worth it to me to deal with using widescreen hacks or dealing with pillarboxed 16:9 games for the handful of titles that don't support it. I guess that living with hacks and games being locked to 16:9 just comes with the territory of owning an ultrawide monitor right now.
|
# ? Jun 22, 2015 19:52 |
|
Parker Lewis posted:Sup 34UM95 / 980 Ti buddy. I've actually found that an overclocked 980 Ti can handle just about anything other than Witcher 3 Ultra at 3440x1440; even GTA V with everything on High runs well. But I do share a lot of the same gripes about using the 34UM95 as a gaming monitor. The upshot is that the 3440x1440 is a new standard that companies are going to start marketing toward so it should be a lot easier for games to support ultra widescreen gaming. I can never go back to a 16:9 monitor ever again, 21:9 is just too drat good.
|
# ? Jun 23, 2015 03:04 |
|
People like me like dual screens because it's quick and easy to move windows between the two. Are you guys with ultrawides using software to 'split' them into two? Or just buying two ultrawides? As far as I can tell, Windows 8.1 doesn't support doing this natively, unless you count their lovely 'apps'.
|
# ? Jun 23, 2015 04:32 |