TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Truga posted:

Is that a thing many people get? Because the first thing I do when I get a monitor is dial it full bright and then grumble about it not being bright enough lol

:catstare:
Do you work outside in sunlight near the equator or something? I run my monitor at like 6/100 brightness after gradually dialing it down until it didn't hurt to look at after an hour or so of use. The recommended brightness for a well-lit office is 120 nits, but most monitors are capable of 300 nits or more.

Truga
May 4, 2014
Lipstick Apathy
N-No I live in a swamp where the colder half of the year is dominated by permanent fog lol.

Then again, I never understood why people wear sunglasses either, maybe my eyes are just weird. Coworkers also often tell me "how the gently caress can you even see anything" because I run my 13" 2560x1700 laptop with no scaling and can read from it just fine while normally sitting at a desk and they can't. :v:

LimburgLimbo
Feb 10, 2008
I know virtually nothing about monitors, but I'm buying a new rig and want a gaming monitor that can do 144Hz at a decent resolution, etc.

Was checking what was around within the range of specs I'm looking at on https://www.productchart.com/monitors/ and heard about Freesync and G-Sync for the first time. Is this something I should really care about? I've been reading some breakdowns of the tech but don't really have a subjective grasp of how much it will matter. My GPU is going to be an Nvidia RTX 2070 Super, so I understand that I should presumably look for G-Sync supported monitors.

On that note if there's go-to monitors with 144Hz in the ~300-400 USD range (preferably 27" I think) I'm very interested.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
VRR (Freesync/Gsync) is the only way to get good input latency without tearing. It's really good. There are a lot of options for monitors, but assuming you want the best value in your budget, the short answer is buy a Nixeus EDG-27s v2. Buy the non-S version if you care about having a height-adjustable stand, or buy/supply one yourself.

To actually use VRR, you have to do several things:

Set your refresh rate to 144Hz.
Enable G-Sync in the Nvidia control panel.
Use the new framerate cap in the Nvidia drivers to cap your framerate to 140 FPS. (It's actually good now; you don't need to use RTSS anymore.)
And it's probably not a bad idea to force V-sync on in the Nvidia control panel as well.

Additionally, for some games (particularly Overwatch), it's worth creating a game profile that disables the framerate limiter so you can use the in-game one instead for even lower latency.
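
If it helps to picture what a frame cap is actually doing, here's a minimal sketch of a sleep-based frame limiter in Python - not how the Nvidia driver or RTSS implements it, just the general idea of holding frames to a fixed interval so delivery never outruns the VRR window:

code:

import time

TARGET_FPS = 140                   # a few FPS below the 144Hz refresh rate
FRAME_INTERVAL = 1.0 / TARGET_FPS  # ~7.14 ms per frame

def render_frame():
    """Placeholder for the game's actual rendering work."""
    pass

deadline = time.perf_counter()
for _ in range(1000):              # bounded number of frames, just for the sketch
    render_frame()
    # Schedule the next frame one interval out and sleep off the leftover time,
    # so frames are never presented faster than the monitor can refresh.
    deadline += FRAME_INTERVAL
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)
    else:
        # Running behind: reset instead of bursting fast frames to "catch up",
        # which is exactly the kind of uneven pacing a cap is meant to prevent.
        deadline = time.perf_counter()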

KingEup posted:

I’ve never used a VR headset but blur busters reports:
https://www.blurbusters.com/faq/oled-motion-blur/

The persistence and motion clarity are still nowhere near CRT levels. Also, it's possible for them to use a very short strobe duration and still maintain decent brightness because they're only lighting a tiny display 1" away from your eyes. Monitors need to be a lot brighter, and strobing on them is more compromised as a result.
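
To put rough numbers on the brightness trade-off (back-of-the-envelope duty-cycle math with made-up panel figures, not measurements of any particular display):

code:

# Rough duty-cycle math for backlight strobing (illustrative numbers only).
refresh_hz = 120
frame_time_ms = 1000 / refresh_hz        # ~8.33 ms per refresh
strobe_ms = 1.0                          # a short 1 ms pulse per refresh
duty_cycle = strobe_ms / frame_time_ms   # ~0.12

peak_nits = 400                          # hypothetical panel peak brightness
average_nits = peak_nits * duty_cycle    # ~48 nits perceived average

print(f"duty cycle: {duty_cycle:.0%}, average brightness: {average_nits:.0f} nits")
# A panel that looks fine at full persistence ends up dim once it's only
# lit ~12% of the time, which is why strobing on monitors is more compromised.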

LCDs loving suck and I almost wish they had stayed too lovely for mainstream use. SED/FED consumer displays would probably always have been really expensive, but at least they would be good. Since LCDs are cheap and good enough for most uses, we're stuck with them loving forever.

K8.0 fucked around with this message at 21:01 on Jan 16, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
I mean, I get that in some ways they're inferior to what could maybe possibly have been achieved with advanced CRT derivatives, but I'm also pretty damned ok with not having a run-of-the-mill 24" monitor cost >$1k and weigh enough to rule out a monitor arm, while also taking up a big chunk of my desk.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
What's the thread recommendation on wall-mount monitor arms again? still AmazonBasics?

Does a monitor arm have to be drilled into a stud?

Zarin
Nov 11, 2008

I SEE YOU

Paul MaudDib posted:

What's the thread recommendation on wall-mount monitor arms again? still AmazonBasics?

Does a monitor arm have to be drilled into a stud?

I would think you absolutely want a stud, preferably with beefy screws.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Zarin posted:

I would think you absolutely want a stud, preferably with beefy screws.

:quagmire:

Never thought about it but it'd be nice to be able to just lift it off my desk to clean or reroute cables. I'm using a wooden kitchen table type thing and it doesn't really have any holes for a mount, it doesn't have much of a lip, and I don't want to drill it. A wall-mount arm would work though.

The Big Bad Worf
Jan 26, 2004
Quad-greatness

KingEup posted:

I’ve never used a VR headset but blur busters reports:
https://www.blurbusters.com/faq/oled-motion-blur/

I hope a rolling-shutter technique comes to desktop displays. I really strongly prefer having good motion clarity; without something like ULMB (or whatever any other brand calls their backlight-strobing technology), the entire screen just looks like a smeary mess as soon as you pan the camera anywhere. That's partially because LCD panels are still very slow relative to CRTs, but primarily because of the "sample and hold" technique relative to the older impulse style.

The ViewSonic XG270 is supposed to have very good backlight strobing (tuned by Blur Busters, even), but only at lower refresh rates (75-144Hz), and it's not quite CRT-perfect or even as good as we've seen in VR headsets, according to some hands-on reports. Really hate that we're still stuck with these lovely slow panels when CRTs were so fast and looked so clear.
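
The usual Blur Busters rule of thumb is that perceived blur scales with persistence - at 960 pixels/second of motion, roughly 1 pixel of smear per millisecond the frame stays lit. A quick illustration of why sample-and-hold looks smeary next to an impulse/strobed display (illustrative numbers, not XG270 measurements):

code:

# Motion blur from persistence: blur (px) ≈ motion speed (px/s) × persistence (s).
motion_speed_px_per_s = 960          # a fairly brisk camera pan

def blur_px(persistence_ms: float) -> float:
    return motion_speed_px_per_s * (persistence_ms / 1000)

print(blur_px(16.7))   # 60Hz sample-and-hold: ~16 px of smear
print(blur_px(6.9))    # 144Hz sample-and-hold: ~6.6 px
print(blur_px(1.0))    # 1 ms strobe/impulse: ~1 px, CRT-like clarity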

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

:quagmire:

Never thought about it but it'd be nice to be able to just lift it off my desk to clean or reroute cables. I'm using a wooden kitchen table type thing and it doesn't really have any holes for a mount, it doesn't have much of a lip, and I don't want to drill it. A wall-mount arm would work though.

Depends on how heavy your monitor is and how adventurous you want to get. If it's just a 24" or something you can easily get away with just using some higher-capacity (75lbs+) wall anchors--like 4 of 'em. If you plan on yanking the thing around a lot, or it's a 32" heavy monster, then yeah, you probably want to drill that into a stud if at all possible.

Another option would be to do a lip/clamp mount. I know you said your tabletop is thin, but you can always fix that by just getting another piece of wood and slapping it up under the tabletop--a 6" square piece would probably be sufficient for a smaller monitor, though a 12" square would be better. Hell, you probably could just use the clamp force to keep it in place if you didn't want to glue/screw it together.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
What are the CRTs to get if you want to do a retro thing?

I think I remember one of the Syncmasters being really good?

Also for TVs, it's the Sony WEGAs with high-scan and digital inputs?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Sony FW900 Trinitrons were usually the go-to. 24" (22.5" viewable) 2304x1440@80Hz, 160Hz max, 93lbs of goodness.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

Paul MaudDib posted:

Also for TVs, it's the Sony WEGAs with high-scan and digital inputs?

Any TV with a SCART RGB input or, failing that, any 60Hz TV with component inputs.

As for the Sony Trinitron, you also have to decide for yourself whether you prefer an aperture grille or an in-line shadow mask.

KingEup fucked around with this message at 08:19 on Jan 17, 2020

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

DrDork posted:

Sony FW900 Trinitrons were usually the go-to. 24" (22.5" viewable) 2304x1440@80Hz, 160Hz max, 93lbs of goodness.

I still miss mine for some things. I'm glad we have cheap LCDs tho since there is no contest between reading text - especially a lot of it - on LCDs vs CRT.

Artelier
Jan 23, 2015


LG 27GL850, anything I should know about it concerning media consumption, typing, photo/video editing, or gaming? I found it by chance today, and I think it will be a massive upgrade, but it's pricey.

For reference, it's RM1,900 here, while the (surely lower quality) Nitro VG271UP is only RM1,400, but that one is also 2K, IPS, and 144Hz. No Nixeus here, though that would be around RM1,400 too if it were available.

Besides the price, I can't find a display unit either. But nobody seems to have a display unit of any 2K monitor out, so... I have no real frame of reference and am being cautious before dropping serious cash. I see people earlier on the previous page saying the LG is very good, but they didn't go that much into it. The enthusiasm is really tempting though!

EDIT:

After researching some of the similarly priced or cheaper options, I guess what I'm asking is, LG27GL850, worth the money?

Artelier fucked around with this message at 17:07 on Jan 17, 2020

Constellation I
Apr 3, 2005
I'm a sucker, a little fucker.
It's very very good, only issue is that it's pricier than the rest of the competition.

What you can do is try to see if the LG 27GL83A-B is available. The main difference is that it has an sRGB gamut, while the LG 27GL850 uses a DCI-P3 gamut.

VelociBacon
Dec 8, 2009

I've been out of the thread for a few months - is there a good IPS, 4k, >100hz, adaptive sync monitor right now or are they all still over $1k and with lovely HDR modules and all that?

Cygni
Nov 12, 2005

raring to post

Constellation I posted:

It's very very good, only issue is that it's pricier than the rest of the competition.

What you can do is try to see if the LG 27GL83A-B is available. The main difference is that it has an sRGB gamut, while the LG 27GL850 uses a DCI-P3 gamut.

And as myself and others can confirm, some (all?) 27GL83As actually ship with the exact same panel/backlight/gamut as the 27GL850. It's an identical monitor, just without the USB hub in the back. YMMV though.

LimburgLimbo
Feb 10, 2008
I'm guessing that 2560 x 1080 on a 34" wide monitor is to be avoided?

I'm eyeing the LG 34GL750 because, for the price, it seems like a lot of monitor.

Torn a little between that and an ASUS TUF VG27AQ, but the VG27AQ is apparently going for ~330 USD equivalent where I am in Taiwan (versus ~500 USD on Amazon in the US, for some reason?).

Someone let me know if I'm overlooking something big and dumb here.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

LimburgLimbo posted:

I'm guessing that 2560 x 1080 on a 34" wide monitor is to be avoided?

In the same way that 1080p on a 27" is, yes--but in the same way, some people find it perfectly acceptable, so YMMV. I'd always recommend 3440x1440 or higher, though.

LimburgLimbo
Feb 10, 2008

DrDork posted:

In the same way that 1080p on a 27" is, yes--but in the same way, some people find it perfectly acceptable, so YMMV. I'd always recommend 3440x1440 or higher, though.

Yeah, that was my thinking. I've also never had a big curved monitor before, so I'm not sure I want to jump in just yet. I figure, for the specs and price point, if I want to upgrade to a wide curved one later I can do so and keep the Asus as a second monitor.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
90% of the people who will tell you to buy an ultrawide have never lived with one.
The other 10% just want you to make the same mistake they did.

Ultrawides are for people with money to throw away. The value proposition is awful, do not consider them if you care about money.

BONESTORM
Jan 27, 2009

Buy me Bonestorm or go to Hell!

K8.0 posted:

90% of the people who will tell you to buy an ultrawide have never lived with one.
The other 10% just want you to make the same mistake they did.

Ultrawides are for people with money to throw away. The value proposition is awful, do not consider them if you care about money.

I love mine because I play primarily FPS and, since I got a wheel and pedal setup, racing games. It’s a lot more immersive compared to my old 16:9 1080p monitor. Outside those use cases though, you are paying substantially more compared to similarly spec’d 16:9 monitors for minor benefit. I wouldn’t recommend them for most people, but I’ll never go back.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

K8.0 posted:

90% of the people who will tell you to buy an ultrawide have never lived with one.
The other 10% just want you to make the same mistake they did.

Ultrawides are for people with money to throw away. The value proposition is awful, do not consider them if you care about money.

I will assume you're in the 90%, then. I have one and it's great, and my girlfriend saw it and now wants one, too. I do have the desk space to allow me to have it and two 27", so I don't have to make the same "one big vs two discrete screens" trade-off that some people do.

You do pay a considerable premium for it vs a 27" with otherwise similar specs, admittedly, but that's also like noting that both a BMW and a Toyota can easily do 60mph on the way to the store: true enough, but maybe not entirely the point. I mean, the value proposition of the 2080Ti is pretty bad, too.

v1ld
Apr 16, 2012

Same - the extra horizontal space of an ultrawide lets you have two reasonably sized windows side by side, which is a significant and palpable benefit over 16:9. The extra ~30% of horizontal space matters.

If a resolution/aspect ratio is to be questioned, I'd look at 4K first: the increased DPI comes packed into roughly the same surface area as 1080p/1440p screens, which means greater resolution/DPI but not more actual usable space - I can't get much more on screen with a 4K monitor than I can with 1440p. I question whether the increased fidelity is worth the increased cost, both in price and in rendering overhead, over 1440p.

A 3440x1440 monitor is lower DPI than 4K but offers more usable screen space; it's a no-brainer for my usage.
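
For anyone who wants the numbers behind that: pixel density is just the diagonal pixel count over the diagonal size, and it shows why 27" 4K buys fidelity rather than workspace, while 34" ultrawide buys workspace at basically 1440p density.

code:

import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch = diagonal resolution / diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 2560x1440: {ppi(2560, 1440, 27):.0f} PPI')   # ~109 PPI
print(f'27" 3840x2160: {ppi(3840, 2160, 27):.0f} PPI')   # ~163 PPI - denser, same area
print(f'34" 3440x1440: {ppi(3440, 1440, 34):.0f} PPI')   # ~110 PPI - same density, ~30% wider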

Unless you're narrowly focused on gaming as the only use case for a monitor, the extra horizontal space is very useful - and I'm liking it for gaming as well, though it's not as useful there as for normal usage.

Zarin
Nov 11, 2008

I SEE YOU

K8.0 posted:

90% of the people who will tell you to buy an ultrawide have never lived with one.
The other 10% just want you to make the same mistake they did.

Ultrawides are for people with money to throw away. The value proposition is awful, do not consider them if you care about money.

At the risk of seriously responding to somebody who just might be trolling:

I assume by "value proposition" you mean that you can get higher refresh rates and better LCD panel quality on a 1080p or 1440p 16:9 monitor? If so, then sure, I'll agree with that.

However, for many types of games (MMOs, RTS, MOBAs, Racing, probably many others) the extra screen real estate is pretty handy, in that they allow you to see more of the map or have more room for chatboxes/information displays/etc. I will qualify my statement by saying that it is my understanding that 4k gaming just makes the image look prettier, and doesn't translate into "being able to see more of the map" or whatever.

For productivity, it's my understanding that 4k can grant more screen real-estate. I haven't had the chance to do spreadsheet things on a 4k monitor, so I can't say if I'd prefer 4k or 3440x1440 for that.

At any rate, you're the first person I've met who was disappointed by ultrawide, and I've been recommending it ever since I started using one.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zarin posted:

However, for many types of games (MMOs, RTS, MOBAs, Racing, probably many others) the extra screen real estate is pretty handy, in that they allow you to see more of the map or have more room for chatboxes/information displays/etc. I will qualify my statement by saying that it is my understanding that 4k gaming just makes the image look prettier, and doesn't translate into "being able to see more of the map" or whatever.

Unless you're playing Overwatch or other "e-sports" that take themselves too seriously and somehow figure that allowing entirely uncapped FPS is "not game enhancing" but allowing a wider FOV is, and thus do stupid poo poo like chop off 1/3 of the vertical space to effectively stretch a 16:9 frame onto a 21:9 monitor. Because somehow that's better than simply displaying the full 16:9 frame with pillarboxing :shrug:

Zarin
Nov 11, 2008

I SEE YOU

DrDork posted:

Unless you're playing Overwatch or other "e-sports" that take themselves too seriously and somehow figure that allowing entirely uncapped FPS is "not game enhancing" but allowing a wider FOV is, and thus do stupid poo poo like chop off 1/3 of the vertical space to effectively stretch a 16:9 frame onto a 21:9 monitor. Because somehow that's better than simply displaying the full 16:9 frame with pillarboxing :shrug:

It's not chopping and stretching, though . . . 3440x1440 is just extra horizontal space compared to 1440p. I will see more of the map than a 1440p monitor, and/or have extra space for chat boxes, meters, etc.

As for uncapping FPS, it's my understanding that there are some diminishing returns there. Many people seem to report that north of 120hz becomes hard to distinguish, and I figure if a video card is good enough to run 1440p at 144hz, then it can probably run 3440x1440 at 120hz - the game would probably feel the same to me, so I would find myself opting for "seeing more stuff". :shrug: (I'm assuming that 144hz is the top-end of 16:9 G-Sync and 120hz is the top-end of 21:9 G-Sync, I may be off there)

Maybe we're just talking past each other here, though; I'm only running on assumptions of what you mean by "value proposition". Can you define it for me? Because it's possible we are sensitive to/notice different things about displays.

Zarin fucked around with this message at 22:54 on Jan 17, 2020

Sphyre
Jun 14, 2001

Playing overwatch in 21:9 will crop the top and bottom, not increase the FOV. Like so:



It's the same for DOTA, probably league of legends, etc.
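
To put a rough number on how much you lose when a game handles 21:9 by cropping rather than widening the view (standard FOV trig; the 103° figure is what I believe Overwatch's horizontal FOV cap is, so treat that as an assumption):

code:

import math

def vertical_fov(horizontal_fov_deg: float, aspect: float) -> float:
    """Vertical FOV for a fixed horizontal FOV at a given aspect ratio."""
    h = math.radians(horizontal_fov_deg)
    return math.degrees(2 * math.atan(math.tan(h / 2) / aspect))

h_fov = 103.0                         # assumed horizontal FOV cap
print(vertical_fov(h_fov, 16 / 9))    # ~70.5° of vertical view at 16:9
print(vertical_fov(h_fov, 21 / 9))    # ~56.6° at 21:9 - the top/bottom get cropped away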

Personally I vastly prefer my 27" 1440p over the Alienware 34" ultrawide I tried and quickly returned. But I've never really cared for dual monitor setups and the like so YMMV

Zarin
Nov 11, 2008

I SEE YOU

Sphyre posted:

Playing overwatch in 21:9 will crop the top and bottom, not increase the FOV. Like so:



It's the same for DOTA, probably league of legends, etc.

Personally I vastly prefer my 27" 1440p over the Alienware 34" ultrawide I tried and quickly returned. But I've never really cared for dual monitor setups and the like so YMMV

Oh, interesting, I did not know that!

I know in Diablo 3, it seems like I can see things that are off-screen to my buddies with 16:9 screens, and in MMOs it gave me a lot of extra space to unclutter the center of the screen.

Yeah, with HotS, it used to letterbox the sides of the screen. I was surprised that it didn't seem to do that anymore, but it also didn't feel like I was seeing anything extra, so that may be what's going on there as well.

"What games you typically play" is probably the most important aspect of monitor selection. I'll have to play around with resolution settings and see which games are cropping vs. actually increasing visual area. Thanks for sharing that!

Zarin fucked around with this message at 23:16 on Jan 17, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zarin posted:

Oh, interesting, I did not know that!

Yeah, that's what I was getting at: there are some games that take the "competitive" aspect too seriously and force a 16:9 viewport because allowing a proper 21:9 would, indeed, increase FOV like you mention. Overwatch is probably the most notable one that does that, but I think there are a few other Blizzard titles, too. The stupid part is that--while I agree with you that the difference between 144 and 120Hz ain't much, and diminishing returns are absolutely a thing--Overwatch will happily let you play at 1000Hz if you want, and then put on a serious face and claim that that's not any real advantage over 60Hz, which is dumb and patently not true.

Most games let you just do whatever and enjoy the wider FOV 21:9 offers, which is grand. Though some older games (Fallout, Skyrim, etc) will simply stretch 16:9 to 21:9 by warping the outer edges, which makes for somewhat odd viewing experiences at times.

Zarin posted:

Maybe we're just talking past each other here, though; I'm only running on assumptions of what you mean by "value proposition". Can you define it for me? Because it's possible we are sensitive to/notice different things about displays.

Mostly that you can get a 27" IPS 1440p 120Hz (or better) monitor for like $300-$400, while a 34" 1440p 100-120Hz monitor is still $700. You pay about twice as much for a screen that isn't twice as good or twice as large, so it's a "bad value" in that sense. But in the same way, a 2080Ti isn't twice as fast as a 2070, despite costing twice as much, so it too is a "bad value" that people still buy because it offers something unique. For the ultrawide, it's...well, being wide. For the 2080Ti, it's because there's no other way to get that top-end speed. So my point was mostly that it being a "bad value" isn't necessarily a reason to avoid buying it if it happens to give you what you're looking for.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


K8.0 posted:

To actually use VRR, you have to do several things:

Set your refresh rate to 144Hz.
Enable G-Sync in the Nvidia control panel.
Use the new framerate cap in the Nvidia drivers to cap your framerate to 140 FPS. (It's actually good now; you don't need to use RTSS anymore.)

Why would I set a frame rate cap to 140? Why not 144 or not even have a cap at all?

repiv
Aug 13, 2009

Zarin posted:

I'll have to play around with resolution settings and see which games are cropping vs. actually increasing visual area. Thanks for sharing that!

WSGF is a good reference for this, e.g. https://www.wsgf.org/dr/overwatch/en

repiv fucked around with this message at 23:29 on Jan 17, 2020

Zarin
Nov 11, 2008

I SEE YOU

DrDork posted:

Yeah, that's what I was getting at: there are some games that take the "competitive" aspect too seriously and force a 16:9 viewport because allowing a proper 21:9 would, indeed, increase FOV like you mention. Overwatch is probably the most notable one that does that, but I think there are a few other Blizzard titles, too. The stupid part is that--while I agree with you that the difference between 144 and 120Hz ain't much, and diminishing returns are absolutely a thing--Overwatch will happily let you play at 1000Hz if you want, and then put on a serious face and claim that that's not any real advantage over 60Hz, which is dumb and patently not true.

Okay, I'm picking up what you're putting down now! I guess I had assumed that games that didn't want you to have that extra advantage would pillarbox it, and those that didn't care let you run the resolution. Granted, I formed this assumption based on how WoW and HotS behaved when I first got an ultrawide, and never did any extra reading on it.

And yeah, even though I said there were "diminishing returns", given enough diminished returns, you still get actual returns.


quote:

Most games let you just do whatever and enjoy the wider FOV 21:9 offers, which is grand. Though some older games (Fallout, Skyrim, etc) will simply stretch 16:9 to 21:9 by warping the outer edges, which makes for somewhat odd viewing experiences at times.

I think it's possible to fix that in Skyrim with a shitload of mods, but I'm not surprised that older games don't quite know what to do with it, either.


quote:

Mostly that you can get a 27" IPS 1440p 120Hz (or better) monitor for like $300-$400, while a 34" 1440p 100-120Hz monitor is still $700. You pay about twice as much for a screen that isn't twice as good or twice as large, so it's a "bad value" in that sense. But in the same way, a 2080Ti isn't twice as fast as a 2070, despite costing twice as much, so it too is a "bad value" that people still buy because it offers something unique. For the ultrawide, it's...well, being wide. For the 2080Ti, it's because there's no other way to get that top-end speed. So my point was mostly that it being a "bad value" isn't necessarily a reason to avoid buying it if it happens to give you what you're looking for.

Makes sense! Yeah, I'm 100% in agreement with you now. Seems like my understanding of ultrawide was missing some pieces. While I still don't regret having one, I'll be opting to run 16:9 1440p in the games that are only pretending to support 21:9 resolutions - if for no other reason than reduced eye travel.

Thanks, thread, for taking the time to explain this to me!

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Zarin posted:

It's not chopping and stretching, though . . . 3440x1440 is just extra horizontal space compared to 1440p. I will see more of the map than a 1440p monitor, and/or have extra space for chat boxes, meters, etc.

As for uncapping FPS, it's my understanding that there are some diminishing returns there. Many people seem to report that north of 120hz becomes hard to distinguish, and I figure if a video card is good enough to run 1440p at 144hz, then it can probably run 3440x1440 at 120hz - the game would probably feel the same to me, so I would find myself opting for "seeing more stuff". :shrug: (I'm assuming that 144hz is the top-end of 16:9 G-Sync and 120hz is the top-end of 21:9 G-Sync, I may be off there)

Maybe we're just talking past each other here, though; I'm only running on assumptions of what you mean by "value proposition". Can you define it for me? Because it's possible we are sensitive to/notice different things about displays.

Just to interject, 3440x1440 is slightly more than 34% more pixels per frame than 2560x1440. Assuming each pixel is equally difficult to render, a GPU that can do 144fps at 1440p will only push ~107fps at 3440x1440. Not the end of the world, but a substantial jump in pixelcount and required processing power.
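
Working that out explicitly (same simplifying assumption that frame cost scales linearly with pixel count):

code:

px_1440p = 2560 * 1440        # 3,686,400 pixels per frame
px_uw    = 3440 * 1440        # 4,953,600 pixels per frame

extra = px_uw / px_1440p - 1  # ≈ 0.34, i.e. ~34% more pixels to render
fps_at_1440p = 144
fps_at_uw = fps_at_1440p / (px_uw / px_1440p)   # ≈ 107 fps, all else being equal

print(f"{extra:.1%} more pixels, ~{fps_at_uw:.0f} fps instead of {fps_at_1440p}")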

:goonsay:

Zarin
Nov 11, 2008

I SEE YOU

ItBreathes posted:

Just to interject, 3440x1440 is slightly more than 34% more pixels per frame than 2560x1440. Assuming each pixel is equally difficult to render, a GPU that can do 144fps at 1440p will only push ~107fps at 3440x1440. Not the end of the world, but a substantial jump in pixelcount and required processing power.

:goonsay:

I guess I just made up the numbers and was assuming that 1) the card could do even more than 144 (like a 2080 Ti or something) and 2) you'd cap your frames using G-Sync. (Is there even a benefit to leaving frames uncapped and going far beyond the refresh rate of the monitor? I assume there must be, and I guess it's true that there are 200+hz monitors starting to hit the market now)

Thank you for the math though! Based on the words I typed, though, you're technically correct (which is the best kind of correct) and I appreciate your post :3:

Artelier
Jan 23, 2015


Constellation I posted:

It's very very good, only issue is that it's pricier than the rest of the competition.

What you can do is try to see if the LG 27GL83A-B is available. The main difference is that it has an sRGB gamut, while the LG 27GL850 uses a DCI-P3 gamut.

I looked it up, and where I'm at, it's actually cheaper than most of the competition, discounting the properly budget models. It's still a relatively big chunk of money, but if I'm going to get one at all, I might as well get this one.

That other model you described, the 27GL83A-B, sounds great and in some ways (price-wise) is even better, but it's not available here, so either I get a 27GL850 or just hold on, live with my current 1080p, and build up more cash.

Constellation I
Apr 3, 2005
I'm a sucker, a little fucker.
At 27", there's nothing really interesting on the horizon to hold out for IMO, unless you're interested in better 27" 4k 144Hz panels or the upcoming 1440p 240Hz ones.

There are also plenty of similar options available (just a higher panel lottery), like the Nixeus EDG27S v1/v2. Though I'm not sure on availability in M'sia.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

VelociBacon posted:

I've been out of the thread for a few months - is there a good IPS, 4k, >100hz, adaptive sync monitor right now or are they all still over $1k and with lovely HDR modules and all that?

The Acer XV273K has gotten cheaper and is around $700 on Amazon now. It's okay but I hesitate to call it "good" - for that price I'd say it's acceptable if you really want 4K high refresh rate, but when it comes to panel quality and response time it's quite poor compared to what you get at 1440p. One particularly annoying issue is that while it's capable of quite good response times under ideal circumstances, enabling freesync forces a less aggressive overdrive mode so you get a really mediocre result - like, the $300 Nixeus EDG-27v2 performs better.

Tab8715 posted:

Why would I set a frame rate cap to 140? Why not 144 or not even have a cap at all?

If your game renders faster than your refresh rate with variable refresh rate enabled, you get increased input lag, because the rendering will start to behave like you have V-sync enabled and you'll have frames buffered in a queue waiting to get drawn on the screen. Setting a frame rate limit just below the screen's max refresh rate ensures you get to enjoy the selling point of variable refresh rate - low latency with no tearing.

Zarin posted:

(Is there even a benefit to leaving frames uncapped and going far beyond the refresh rate of the monitor? I assume there must be, and I guess it's true that there are 200+hz monitors starting to hit the market now)

On paper, yes: if you disable V-sync and variable refresh rate and render at a frame rate significantly higher than the monitor's refresh rate, you can get slightly better latency than with variable refresh rate plus a frame rate cap. The price you pay for this is tearing, though at very high frame rates it's not super noticeable. In reality, once you're over 140fps any further frame rate increases offer incredibly small benefits - we're talking reducing latency by single-digit milliseconds. Might be worth doing if your daily bread depends on how fast you can click people's heads, but not otherwise.
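
The frame-time arithmetic behind "single-digit milliseconds", if you want to see it - this is just division, not a latency measurement:

code:

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(frame_time_ms(140))   # ~7.1 ms per frame at a 140fps cap
print(frame_time_ms(240))   # ~4.2 ms at 240fps uncapped
print(frame_time_ms(500))   # ~2.0 ms at 500fps
# Even going from 140fps to 500fps only shaves ~5 ms off the render interval,
# before you account for the tearing you take on by giving up VRR.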

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Zarin posted:

At the risk of seriously responding to somebody who just might be trolling:

I assume by "value proposition" you mean that you can get higher refresh rates and better LCD panel quality on a 1080p or 1440p 16:9 monitor? If so, then sure, I'll agree with that.

However, for many types of games (MMOs, RTS, MOBAs, Racing, probably many others) the extra screen real estate is pretty handy, in that they allow you to see more of the map or have more room for chatboxes/information displays/etc. I will qualify my statement by saying that it is my understanding that 4k gaming just makes the image look prettier, and doesn't translate into "being able to see more of the map" or whatever.

For productivity, it's my understanding that 4k can grant more screen real-estate. I haven't had the chance to do spreadsheet things on a 4k monitor, so I can't say if I'd prefer 4k or 3440x1440 for that.

At any rate, you're the first person I've met who was disappointed by ultrawide, and I've been recommending it ever since I started using one.

I wrote the post on a 34GK950F. I am a person with money to throw away.

By value proposition I mean you pay about twice as much as for a 16:9 monitor for 34% more screen real estate that is only useful sometimes.

Yes, in some games you can see more. However, in most games, you either can't see more because they limit horizontal FOV, or they don't support ultrawide at all, or the extra real estate is almost counterproductive because the UI design pushes elements out to the corners where they're hard to see. For the small selection of immersive games that support it well, it's cool. For everything else, you're wasting GPU horsepower rendering things well outside your central field of view. It's absolutely not a competitive advantage in almost any game; there are no serious competitive gamers using them for anything. As far as immersion goes, for the most part 4K would serve you better (although those monitors don't exist yet, and 4K is far harder to drive).

It's also basically worthless on the desktop. Everything is designed for 16:9, so you don't really get much of anything out of a 21:9 display. Either you're stretching apps across it in a way that's useless, or you're splitting the screen in half and having UIs get compressed to a sub-4:3 aspect ratio. Either way, again outside a very limited selection of apps, you're better off with more 16:9 monitors. The one place where ultrawide is particularly strong is consuming ultrawide media, although there's a limited selection of that. Most wide-format movies are letterboxed, so you have to manually zoom and crop if you want them close to full screen.

But the biggest overall reason not to buy an ultrawide is still the money one. You're paying a lot of money for what is at best a very marginal upgrade. It just doesn't make sense, most people have budgets and care about how much they spend on things. When there are monitors for $300 that are 99% as good as the best you can get and not compromised in any meaningful way, it's hard to argue for anything more expensive unless you don't mind wasting money.

Tab8715 posted:

Why would I set a frame rate cap to 140? Why not 144 or not even have a cap at all?

To further emphasize what TheFluff said, if you leave your framerate uncapped and your framerate exceeds your refresh rate, VRR/Gsync/Freesync is off and you are just using a conventional monitor. You either get input latency from vsync or you get tearing.

The reason you cap to 140 instead of 144 is that what you're actually doing is limiting the interval between frames. If you try to time it perfectly, you will sometimes get frames that come too quickly, and you wind up having VRR constantly going off and on and getting bad frame pacing as a result. Capping 4 FPS below refresh is generally enough to avoid this. Some people use 3 or 2 but the incredibly small benefits aren't worth the significant increase in how often you get issues from it.
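
The interval math makes that margin concrete (assuming a 144Hz panel; the headroom is just the cap's frame interval minus the refresh period):

code:

refresh_hz = 144
refresh_period_ms = 1000 / refresh_hz          # ~6.944 ms between refreshes

for cap_fps in (143, 142, 140):
    cap_interval_ms = 1000 / cap_fps
    headroom_ms = cap_interval_ms - refresh_period_ms
    print(f"cap {cap_fps} fps -> {headroom_ms:.3f} ms of headroom per frame")

# cap 143 fps -> 0.049 ms  (any frame-time jitter pushes frames past the refresh rate)
# cap 142 fps -> 0.098 ms
# cap 140 fps -> 0.198 ms  (enough slack that VRR stays engaged)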

K8.0 fucked around with this message at 05:45 on Jan 18, 2020
