The Gunslinger
Jul 24, 2004

Do not forget the face of your father.
Fun Shoe
I ended up trying a Samsung Odyssey G7 and I'm returning the LG 27GL83A-B I got. It was fine but the Odyssey just blows it away. I was coming from an old Acer XB271hu which is still a solid monitor to this day but the Samsung is super impressive. Blacks are excellent, motion handling is good and the calibration was decent out of the box. I expected problems with backlight bleed and flickering but no issues at all, if there is a panel lottery then I definitely won it.

I guess the firmware has fixed the Gsync issues because mine is flawless with adaptive sync enabled on all of the games I've tried. This thing even has some bias lighting built into the back but it's too weak to really notice so I turned it off. I thought the curve would bother me but it's the opposite, it's loving great for gaming.

I'm going to return the LG and sell the Acer to help offset the cost a bit. I could've probably waited until Black Friday for a deal but no guarantees and with the supply situation these days I would rather just eat the $100 savings and enjoy it right now. Anyways what a great monitor for gaming, super happy with it so far.


Dr. Video Games 0031
Jul 17, 2004

If that kind of quality was universal among all G7s shipped, it would be the best gaming monitor on the market, hands down. The QA issues seem way too frequent for a $700 - $800 monitor, though. Glad yours turned out well. When everything works correctly, it looks amazing.

The Gunslinger
Jul 24, 2004

Do not forget the face of your father.
Fun Shoe

Dr. Video Games 0031 posted:

If that kind of quality was universal among all G7s shipped, it would be the best gaming monitor on the market, hands down. The QA issues seem way too frequent for a $700 - $800 monitor, though. Glad yours turned out well. When everything works correctly, it looks amazing.

Yeah I made sure to pick it up from Amazon so that I could return it easily. I think they had a bad run in 2020, looking at reddit for 2021 I don't see anywhere near as many complaints.

CaptainSarcastic
Jul 6, 2013



Running a 1440p monitor off a 1070 I think would be doable, with some caveats. I ran my current 27" 1440p 144hz monitor off a 1060 6GB briefly before deciding I needed to upgrade my GPU, but I had just built a new system and really wanted my graphics performance to be more in line with the rest of my machine.

I could run modern titles at an acceptable framerate on the 1060, but it required dialing quality settings down in-game. The 1070 should provide more headroom than that, but still likely require putting more stuff at medium or whatever. Had I been running a dual-monitor setup I would've turned off the second monitor, which was my practice on my old less-powerful system where I did run dual monitors. Personally the 27" at 1440p is enough screen real estate I don't feel the need to run dual-monitors, but I'm also used to alt-tabbing out of a game if I need to look up a walkthrough or something.

Max Wilco
Jan 23, 2012

I'm just trying to go through life without looking stupid.

It's not working out too well...

Dr. Video Games 0031 posted:

You don't need any special software for what you want. Most modern games have an option in the settings for which monitor they're displayed on. Though not all... *cough* Flight Simulator. For those that don't, they generally run on your primary display, which is a setting you can change in the Windows display settings quite easily.

The one you linked is a good one, though it's not exactly the same as the monitor you saw on RTINGS. The one RTINGS reviewed is the quantum dot variant that has a wider color gamut. This one will perform identically in most aspects, but will have a narrower color gamut. This doesn't matter for 99.9% of content since both still cover the full sRGB gamut. I can't find any clear differences between this non-QD version and the G273QF ($310) I mentioned earlier aside from the G273QF having a worse stand that isn't height adjustable. The Gigabyte M27Q is $300, has slightly worse response times (won't be noticeable to most people), a wider color gamut than the G273QF (again, not relevant for most content, but it may be nice to have), and a flaw in the form of BGR subpixels that will make text appear a bit fuzzier in programs that do custom text anti-aliasing without using ClearType (Chrome is one, I think?). And an adjustable stand. These would be my picks for mid-budget 1440p monitors.

I think I'd either go with the MSI G273QF, or the non-QD MAG274QRF. On Amazon, there's only like a $10 price difference between the Gigabyte M27Q and the MSI G273QF, and the BGR subpixel flaw you describe with the Gigabyte is a mark against it for me, since it sounds like it makes some text harder to read.

Of the two MSI monitors, I'm leaning more towards the MAG274QRF, since it looks like it would fit on my desk better (the G273QF has those legs that stick out to the side, which I imagine would make it awkward when trying to place the two monitors close together) and the tilt/rotate adjustment would help with angling where I can see it properly. RTINGS doesn't seem to have a review for the non-QD monitor, but on Amazon, the QD variant is $200 more ($569), and based on what you said, the lack of QD isn't too noticeable.

CaptainSarcastic posted:

Running a 1440p monitor off a 1070 I think would be doable, with some caveats. I ran my current 27" 1440p 144hz monitor off a 1060 6GB briefly before deciding I needed to upgrade my GPU, but I had just built a new system and really wanted my graphics performance to be more in line with the rest of my machine.

I could run modern titles at an acceptable framerate on the 1060, but it required dialing quality settings down in-game. The 1070 should provide more headroom than that, but still likely require putting more stuff at medium or whatever. Had I been running a dual-monitor setup I would've turned off the second monitor, which was my practice on my old less-powerful system where I did run dual monitors. Personally the 27" at 1440p is enough screen real estate I don't feel the need to run dual-monitors, but I'm also used to alt-tabbing out of a game if I need to look up a walkthrough or something.

Yeah, it's the sort of thing where you have to experiment with what works well and what doesn't. If I try running something on the 1440p and the performance suffers, I can either turn down the settings or toggle over exclusively to the 1080p to play it.

Eventually, if/when things eventually improve, and buying/building a computer becomes more viable, I can upgrade to a more powerful system and run games on a 1440p with ease.

DerekSmartymans
Feb 14, 2005

The
Copacetic
Ascetic

Max Wilco posted:

I think I'd either go with the MSI G273QF, or the non-QD MAG274QRF. On Amazon, there's only like a $10 price difference between the Gigabyte M27Q and the MSI G273QF, and the BGR subpixel flaw you describe with the Gigabyte is a mark against it for me, since it sounds like it makes some text harder to read.

Of the two MSI monitors, I'm leaning more towards the MAG274QRF, since it looks like it would fit on my desk better (the G273QF has those legs that stick out to the side, which I imagine would make it awkward when trying to place the two monitors close together) and the tilt/rotate adjustment would help with angling where I can see it properly. RTINGS doesn't seem to have a review for the non-QD monitor, but on Amazon, the QD variant is $200 more ($569), and based on what you said, the lack of QD isn't too noticeable.

Yeah, it's the sort of thing where you have to experiment with what works well and what doesn't. If I try running something on the 1440p and the performance suffers, I can either turn down the settings or toggle over exclusively to the 1080p to play it.

Eventually, if/when things eventually improve, and buying/building a computer becomes more viable, I can upgrade to a more powerful system and run games on a 1440p with ease.

I understand jumping up in resolution for quality on the right hardware. I understand why 1ms response is good for many Pro and ProAm e-sports because it can mean the difference between $50K and having to sell your kid or something. I can, even with my eyesight, see a difference between human eye max fps and 60 fps. I can’t really tell between 1080p at 60 and 1080p at 75, but realize other people can as well as unlocking your reflexes in Fortnite or Halo multiplayer.

Why in the world would you need 240fps on any game or even movie? Are there any 144fps Battlezone wizards who lose to 200+fps Chads? Or is it just a fun thing to stream or screenshot your overlay, kinda like a pie-eating contest? I do, again, see why games with consequences take a player, with faster equipment and the reflexes to use it, to another level. Is it like running a custom cooler to overclock your 3090 200% at -40°C for ”street” nerd cred? Is there any advantage to 220fps gameplay over playing at 198fps? I figure I’m just not keeping up with new hardware as well as being really bad at First Person Shooters against humans, even though I play well in single player campaigns against HardMode AI in the exact same game. Hope this is an ok thread for this because monitors matter at the high end but not as much for fans of other genres with slower gameplay styles, like PvP in an MMO or Starcraft 2.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Regardless of your skill level, 240hz is absolutely a measurable advantage over 144hz. poo poo, there are mice now that poll at 8000 hz and that's a demonstrable improvement over 1000hz. Both in terms of latency reduction and also in terms of how things play out in practical reality, there's really no such thing as "enough" FPS or low enough latency. It's not going to make the difference between being a good and a bad player (In FPSs, that has far more to do with understanding how movement and positioning work than anything else), but it will give you a clear, statistically measurable advantage. e: to be clear, that advantage scales down exponentially. 60 FPS is an immense advantage over 30 FPS, 120 FPS is a pretty big advantage over 60 FPS. 120 to say 165... a lot of people can notice. 165 to 240? You probably won't notice, but you will perform better.

There's also a big advantage to not being GPU bound, although there are ways to mitigate it. Having GPU utilization at 100% creates serious latency issues. If you're a serious competitive player in a game where minimum settings at a competitive resolution only gets you 100 FPS, you should cap your FPS to something like 95 and you will perform better because your latency will be lower (as well as more consistent).
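A toy model of the cap-below-the-GPU-limit trick described above. The queue depth here is a made-up illustrative number, not a measurement; real render queues vary by game and driver settings:

```python
# Toy model of why capping FPS just below the GPU limit can lower input latency.
# Assumption (hypothetical, not measured): a GPU-bound game queues ~2 frames
# ahead of display, while a cap-limited game keeps the queue empty.

def frame_time_ms(fps):
    """Time to produce one frame at a given frame rate."""
    return 1000.0 / fps

def gpu_bound_latency(fps, queued_frames=2):
    # Each queued frame adds one full frame time before your input is shown.
    return (1 + queued_frames) * frame_time_ms(fps)

def capped_latency(fps):
    # With the GPU never saturated, a frame goes out as soon as it's rendered.
    return frame_time_ms(fps)

uncapped = gpu_bound_latency(100)   # GPU pegged at 100 FPS
capped = capped_latency(95)         # capped just below the GPU limit

print(f"uncapped @100 FPS: ~{uncapped:.1f} ms")  # ~30.0 ms
print(f"capped   @95 FPS:  ~{capped:.1f} ms")    # ~10.5 ms
```

Even in this crude sketch, giving up 5 FPS buys back most of the queueing latency, which is the effect being described.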

I bet you FPS matters more in SC2 than it does in most games. Execution scaling in RTSs is insane, there's really no cap to how fast you can be and the benefits you gain from it.

e2: also, to answer the movie part of your question: movies come in all kinds of fucky framerates. You may or may not notice the occasional hitching that comes from playing 59.94 FPS TV content on a 60.0hz monitor. It's fairly easy to see the artifacting of a 24 FPS movie juddering on a fixed 60hz monitor (although frankly 24 FPS always looks like absolute garbage no matter what). At higher refresh rates, the juddering is dramatically reduced. 120 and 144hz are multiples of 24 and thus have extremely minor judder depending on source (a lot of digital movie content is actually 23.976 FPS), plus it's harder to see judder when the jump from one frame to another is 7ms and not 17ms.
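The cadence behind that judder is easy to sketch. The nearest-refresh presentation below is a simplification of what real players and displays do, but it shows the uneven 3:2 hold pattern at 60hz versus the even holds at 120hz:

```python
# Sketch of 24 FPS judder: how many refresh cycles each movie frame occupies
# on a fixed-refresh display (simplified nearest-refresh presentation).

def hold_pattern(source_fps, refresh_hz, frames=6):
    """Refresh cycles each source frame stays on screen."""
    pattern = []
    shown = 0
    for i in range(1, frames + 1):
        total = int(i * refresh_hz / source_fps + 0.5)  # nearest refresh
        pattern.append(total - shown)
        shown = total
    return pattern

print(hold_pattern(24, 60))   # [3, 2, 3, 2, 3, 2] -> uneven ~50/33 ms holds
print(hold_pattern(24, 120))  # [5, 5, 5, 5, 5, 5] -> perfectly even holds
```

The alternating 3-refresh/2-refresh holds (classic 3:2 pulldown) are what reads as judder; at a multiple of 24 every frame is held equally long.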

K8.0 fucked around with this message at 04:29 on Oct 24, 2021

Dr. Video Games 0031
Jul 17, 2004

What the quantum dots give you is a full Adobe RGB color gamut, which is great if you're a professional photo editor, but it oversaturates the image of any standard content you'll consume. Some people actually like oversaturated images, but going with a fully unrestrained 100% Adobe RGB color space in standard sRGB content can be a bit intense. People's faces will glow bright red. Quantum dots also give you better DCI-P3 coverage—that's the HDR color space. HDR content that takes full advantage of the expanded color space will appear more vibrant without being oversaturated. In practice, I've only really come across this in random showcase videos on youtube, and I can't think of any games off the top of my head that make full use of DCI-P3. This means you'll be sticking to SDR mode with the sRGB color space for just about everything, and in that respect, the non-QD version should look the same as the QD version. The non-QD version probably has a wider-than-sRGB gamut as well, just not as wide as the QD version, letting you still get more saturated colors if you really want.

(note that with either version, you'll be able to use sRGB gamut clamps or emulation modes to keep the colors accurate if you want. I know MSI released an updated firmware that gives the QD version a fully customizable sRGB mode, at least, and if the non-QD version lacks this then there's software that can do it, which is what I currently use for my XB273U GX.)

DerekSmartymans posted:

I understand jumping up in resolution for quality on the right hardware. I understand why 1ms response is good for many Pro and ProAm e-sports because it can mean the difference between $50K and having to sell your kid or something. I can, even with my eyesight, see a difference between human eye max fps and 60 fps. I can’t really tell between 1080p at 60 and 1080p at 75, but realize other people can as well as unlocking your reflexes in Fortnite or Halo multiplayer.

Why in the world would you need 240fps on any game or even movie? Are there any 144fps Battlezone wizards who lose to 200+fps Chads? Or is it just a fun thing to stream or screenshot your overlay, kinda like a pie-eating contest? I do, again, see why games with consequences take a player, with faster equipment and the reflexes to use it, to another level. Is it like running a custom cooler to overclock your 3090 200% at -40°C for ”street” nerd cred? Is there any advantage to 220fps gameplay over playing at 198fps? I figure I’m just not keeping up with new hardware as well as being really bad at First Person Shooters against humans, even though I play well in single player campaigns against HardMode AI in the exact same game. Hope this is an ok thread for this because monitors matter at the high end but not as much for fans of other genres with slower gameplay styles, like PvP in an MMO or Starcraft 2.

Even if you can't perceive a difference in smoothness, there are things unconsciously happening within your brain when viewing a sample-and-hold display (almost all non-CRT displays) at low frame rates which make the image appear less clear. You may not even notice it until you see side-by-side comparisons (if you have a high refresh monitor, this site can show you the difference), but once you get past 120, 140 fps or so, the difference starts to diminish. It could be that my 270hz monitor doesn't have fast enough response times to produce true 270 fps visuals, but there's not a big difference between 135 fps and 270 fps—135 is already pretty drat clear. The main benefits that extra refresh rate gives you mostly have to do with competitive multiplayer stuff, yes. Faster refresh rates mean less input lag. A large amount of the input lag in modern systems is the time it takes to get a frame onto your monitor's screen, and the faster the screen refreshes, the less wait time there is. There are also demonstrable improvements to reaction times. Even if you don't consciously see a difference, your brain still reacts faster to an image that updates more rapidly.

For the vast majority of people, I agree that the mid 100s is good enough. Now that 120Hz is making its way into mainstream televisions and there are a handful of console games that have 120Hz modes, perhaps 120fps will eventually become the new performance baseline in the future. It'll take probably a good decade to get there, but I can see it happening. Hardware is improving faster than visual fidelity.
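As a rough illustration of the refresh-rate share of input lag described above (ignoring scanout time, the game loop, and USB polling): on average, an update waits half a refresh interval before the next refresh begins.

```python
# Rough refresh-rate contribution to input lag: a finished frame waits, on
# average, half a refresh interval before the display starts showing it.

def refresh_latency_ms(hz):
    interval = 1000.0 / hz
    return interval / 2  # average wait until the next refresh begins

for hz in (60, 120, 144, 240, 360):
    print(f"{hz:3d} Hz: ~{refresh_latency_ms(hz):.1f} ms average wait")
```

The gaps shrink fast: 60 to 120 Hz saves about 4 ms of average wait, while 240 to 360 Hz saves well under 1 ms, which matches the diminishing-returns point above.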

Dr. Video Games 0031 fucked around with this message at 04:28 on Oct 24, 2021

DerekSmartymans
Feb 14, 2005

The
Copacetic
Ascetic

K8.0 posted:

<snip really clear response>

I want to let you know I appreciate that answer! I figured there had to be a reason or five; if it weren’t even slightly a booster, pros wouldn’t do it in the first place. Their livelihoods depend on it, the same as a pro surfer needing a top-of-the-line wax or Tony Hawk needing a very specific board he’s practiced with for a new trick, because muscle memory is a real phenomenon. They can still kick rear end without custom gear (me at 1080p), but the bleeding edge means the newest tech at the highest specs (oc’d 3090 etc) available.

Thanks again! I just hope I can tell the difference after my corneal transplant, which is the main reason I cannot compete in old hobbies or recognize the difference in native HD and 4K. Saves me money, I guess 😳.

Edit: Thanks to you too, Doc! I hope I didn’t come off like a jerk or even sarcastic; I’ve had an eye that has lower visual acuity than a Planarian for a decade, and figured lack of good depth perception was the reason I can’t function much at reading text on greater than 1080/60 from about 24”-30” away from my screens! My eye is perfectly fine except for my pyramidal-shaped cornea and we’ve run out of weapons to mitigate it now. I really hope by this time next year to be asking about a couple of decent 1440p/120fps monitors!

DerekSmartymans fucked around with this message at 05:02 on Oct 24, 2021

The Joe Man
Apr 7, 2007

Flirting With Apathetic Waitresses Since 1984

K8.0 posted:

If it's just WFH and you aren't getting a GPU for over a year, why not save a ton of money on 4k60hz and then buy a 4k144 next year?

A 1050Ti is DP 1.4 and thus lacks the bandwidth to drive 4k144 at full color depth. You can do 4k120 on the desktop and it will be fine.

Thanks for responding but I legit don't understand what you're saying. I'd be able to do 120hz but not 144hz?

I just want to know if I'll be able to do WFH/Desktop/Normal stuff but also run the same games that I do now and a handful of older MP ones at 144hz (CS:GO, Quake Live, etc...).

I like to future-proof.

Dr. Video Games 0031
Jul 17, 2004

The 1050 Ti supports DisplayPort 1.4 with a firmware update, which allows up to 120hz at 4K. Most recent video cards also support Display Stream Compression, a form of visually lossless, effectively latency-free video compression that enables higher refresh rates, but the 1050 Ti does not. It CAN still do 4K 144hz, but this requires something called chroma subsampling, a lossy compression technique explained here. I've never had to use it, but apparently games running at 4K don't suffer much from it (I've heard mixed things from actual people so YMMV), while Windows desktop usage will suffer quite a bit.

Going with a 4K 144Hz monitor and just sticking to 120Hz until you get a better video card would be fine. You should not have a problem running Windows apps at 4K, though I sorta doubt the 1050 Ti's ability to play even CS:GO at high frame rates at 4K.
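The bandwidth arithmetic behind those limits can be checked with raw pixel rates. This sketch ignores blanking overhead, so the real limits are slightly tighter than these numbers suggest:

```python
# Back-of-the-envelope check on the DisplayPort 1.4 limits discussed above.
# Raw pixel rates only; real signals include blanking overhead.

DP14_GBPS = 25.92  # DP 1.4 effective payload (HBR3 after 8b/10b encoding)

def gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

full_rgb_144 = gbps(3840, 2160, 144, 24)    # 8-bit RGB, no subsampling
full_rgb_120 = gbps(3840, 2160, 120, 24)
subsampled_144 = gbps(3840, 2160, 144, 16)  # 4:2:2 chroma subsampling

print(f"4K144 RGB:   {full_rgb_144:.1f} Gbps, fits: {full_rgb_144 <= DP14_GBPS}")
print(f"4K120 RGB:   {full_rgb_120:.1f} Gbps, fits: {full_rgb_120 <= DP14_GBPS}")
print(f"4K144 4:2:2: {subsampled_144:.1f} Gbps, fits: {subsampled_144 <= DP14_GBPS}")
```

4K144 full RGB lands around 28.7 Gbps, over the ~25.9 Gbps payload, while 4K120 full RGB and 4K144 with 4:2:2 subsampling both fit, which is exactly the 120-without-compromise / 144-with-subsampling split described above.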

Dr. Video Games 0031 fucked around with this message at 11:48 on Oct 24, 2021

Butterfly Valley
Apr 19, 2007

I am a spectacularly bad poster and everyone in the Schadenfreude thread hates my guts.
It can do ~120fps at 4k with low/medium settings according to the video I just watched.

Bakalakadaka
Sep 18, 2004

sup I haven't bought a new monitor in over a decade and I want to reconfigure my pc setup that currently involves a 60 inch 1080p tv (GLORIOUS COUCH GAMING). I have an oldish PC with a gtx 1070 which has been just fine for everything at 1080p but going by the last page it looks like I should still be good going for a 1440p monitor in the 27-30 inch range? Anything special I should look for or avoid or can I pretty much just buy anything that looks good?

Dr. Video Games 0031
Jul 17, 2004

Bakalakadaka posted:

sup I haven't bought a new monitor in over a decade and I want to reconfigure my pc setup that currently involves a 60 inch 1080p tv (GLORIOUS COUCH GAMING). I have an oldish PC with a gtx 1070 which has been just fine for everything at 1080p but going by the last page it looks like I should still be good going for a 1440p monitor in the 27-30 inch range? Anything special I should look for or avoid or can I pretty much just buy anything that looks good?

We've been naming quite a few 27" 1440p monitors worth buying these last few pages. I wouldn't go with literally any monitor, as there are some landmines to avoid (don't buy a cheap VA panel for instance). I would at minimum consult reputable review sources like RTINGS, Hardware Unboxed, and TFTCentral. There's a lot of marketing BS in monitors, and it can be hard to tell the actual quality of a monitor from the product pages and user reviews. I wouldn't buy a monitor that hasn't been extensively tested by a reputable reviewer.

That said, I wouldn't get a 1440p monitor for your 1070 unless you're keeping your 1080p display hooked up to it as well. There are going to be some games that don't run very well at 1440p on the 1070. You can expect somewhere around a 35% drop in frame rate going from 1080p to 1440p.
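For a rough sanity check on that figure: 1440p pushes exactly 16/9 times as many pixels as 1080p, which bounds how bad the drop could be if frame rate scaled perfectly with pixel count (it usually doesn't, hence the smaller observed ~35%).

```python
# Pixel-count comparison behind the 1080p -> 1440p performance drop.

pixels_1080p = 1920 * 1080
pixels_1440p = 2560 * 1440

ratio = pixels_1440p / pixels_1080p      # ~1.78x the pixels
worst_case_drop = 1 - 1 / ratio          # if fps scaled exactly with pixels

print(f"1440p renders {ratio:.2f}x the pixels of 1080p")
print(f"worst-case frame rate drop: ~{worst_case_drop:.0%}")  # ~44%
```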

Bakalakadaka
Sep 18, 2004

Dr. Video Games 0031 posted:

We've been naming quite a few 27" 1440p monitors worth buying these last few pages. I wouldn't go with literally any monitor, as there are some landmines to avoid (don't buy a cheap VA panel for instance). I would at minimum consult reputable review sources like RTINGS, Hardware Unboxed, and TFTCentral. There's a lot of marketing BS in monitors, and it can be hard to tell the actual quality of a monitor from the product pages and user reviews. I wouldn't buy a monitor that hasn't been extensively tested by a reputable reviewer.

That said, I wouldn't get a 1440p monitor for your 1070 unless you're keeping your 1080p display hooked up to it as well. There are going to be some games that don't run very well at 1440p on the 1070. You can expect somewhere around a 35% drop in frame rate going from 1080p to 1440p.

Thanks for the input! My main games right now aren't all that gpu intensive so I could probably work with the frame rate drop, and I do have three other 1080p screens to work with if there's something that my system really can't handle at 1440p. I'll probably stick with the old setup for a bit while I work on unpacking from moving and do some shopping later.

Badly Jester
Apr 9, 2010


Bitches!
I find myself getting extremely tired of my current 2x 24" 1080p setup, in no small part due to the extremely lovely ergonomics of these monitors and their giant bezels. Because I'm at the stage of my goony life cycle where I'm more concerned with productivity rather than gaming, I keep eyeing 1440p 34-35" ultrawide setups.

Am I correct in assuming that my GTX 970 will not hamstring such a setup if I can handle playing the occasional game windowed? I hardly ever play AAA titles on my computer anymore where :myimmersion: could be ruined. For stuff like Rimworld I don't think it would matter at all.

For example, the LG 34GN850-B keeps popping up, but it's closer to 800 than my (somewhat malleable) 700€ limit and I think I'd end up overpaying for gaming performance that may be nice for futureproofing, but absolutely wasted on me right now.
Given that I don't need balls-to-the-wall gaming performance from a monitor, are there actually any stand-out models in the ~500-700€ budget range? The mess of naming conventions and local availability (Germany) make it pretty difficult to find useful reviews, so I might as well take a shot on yinz opinions.

Volguus
Mar 3, 2009
Now that "Dr. Video Games 0031" posted the link to TFT Central, I went ahead and looked at their LG CX OLED review. And it's a bummer:

quote:

The resolution is a sensible size for the screen at least and 3840 x 2160 Ultra HD makes it more viable than if it was a 1080p only. On the 48″ model (which is certainly the better choice relative to the others for desktop use) the pixel pitch is 0.2767mm which is exactly the same as a 24″ monitor at 1920 x 1080. That means that from up close from a normal PC monitor viewing distance the font size is comfortable and comparable to a common desktop monitor. That’s fine, but it means you need to be at a normal PC monitor viewing distance which then means the screen is far too large and in your face. If you move yourself back to a more comfortable distance so that the screen size is less of a problem, the text size becomes too small and so you then have to use Operating System scaling to make it bigger. This reduces your desktop real-estate and adds complications for some applications, games, and systems. Scaling isn’t always the easiest thing to use on PCs.

I would have wanted to put it on the wall in front of me, which would have given me about 1m or so distance. And the primary purpose would have been office use (coding) with a secondary (distant) gaming and movie watching. drat.
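The pixel-pitch figures in that quote fall straight out of the screen geometry, if you want to check other size/resolution combinations for your wall-mount distance:

```python
# Pixel pitch from diagonal size and horizontal resolution (16:9 assumed).

import math

def pixel_pitch_mm(diagonal_inches, horizontal_pixels, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_inches * w / math.hypot(w, h)  # panel width in inches
    return width_in * 25.4 / horizontal_pixels         # 25.4 mm per inch

print(f'48" 4K:    {pixel_pitch_mm(48, 3840):.4f} mm')  # ~0.2767 mm
print(f'24" 1080p: {pixel_pitch_mm(24, 1920):.4f} mm')  # ~0.2767 mm, identical
print(f'27" 1440p: {pixel_pitch_mm(27, 2560):.4f} mm')  # ~0.2335 mm
```

The 48" 4K and 24" 1080p pitches come out identical, which is exactly the TFT Central point: at monitor distance the text size is fine, but the panel itself is enormous.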

Dr. Video Games 0031
Jul 17, 2004

This is an issue that other 4K monitors have to address too. You probably don't want to run a 32" 4K monitor unscaled (unless you have excellent eyesight), and you definitely don't want to do that for a 28" or 27" 4K monitor. So it's really the same dilemma there, just with the monitor closer to your face. And in the end, windows scaling works out fine for most things these days. There are some minor annoyances, and you do lose some screen real estate, that's true, but you can think of it as using that extra resolution to render text and UI elements with more detail. That's one of the biggest benefits of going 4K, being able to render text with more pixels for a smoother, cleaner, easier to read look.

However, there's a catch with LG OLEDs in the form of their WBGR subpixel structure. Not only do they use BGR, which some programs fail to account for when rendering text (causing blurry text), but they have an additional white subpixel which throws every PC text renderer for a loop. It's not unreadable unscaled, but you do get some color fringing issues and slightly less clarity than other displays. Scaling an OLED is an issue though. Look at the text rendering examples in RTINGS's review of the C1 using a 10-point font here. Note that higher scaling factors actually break ClearType pretty badly when using this subpixel format. And for some reason, windows' non-AA'ed whole-pixel text rendering also shits the bed at 125% scaling on WBGR displays. They didn't show what text looks like in google apps, but it's probably not great since those tend to fare the worst when it comes to BGR subpixels.

That said, 10-point fonts aren't too common, and these issues always look far worse when magnified 100 times like these images are. The text will look better with larger font sizes at their regular viewing size. It's something you should be aware of though since you would be working with text a lot on this thing. That issue along with the risk of burn-in (how easy is it to avoid static interface elements as a coder?) would sway me away from an OLED for now, to be honest. The tech as it currently exists doesn't seem ideal for use as a PC monitor. Just go with a multi-monitor setup if screen real estate is what you're after and gaming/movie watching isn't a major concern.

Dr. Video Games 0031 fucked around with this message at 03:42 on Oct 26, 2021

Question Time
Sep 12, 2010



Recently I got a Gigabyte M27Q monitor for my secondary computer on this thread's recommendation, and side-by-side, the colors look amazing in many games compared to my old 34" ultrawide. I assume that is the HDR - is there a recommended HDR ultrawide monitor I could upgrade to?

Butterfly Valley
Apr 19, 2007

I am a spectacularly bad poster and everyone in the Schadenfreude thread hates my guts.
It's not the HDR because HDR in monitors almost universally sucks.

Question Time
Sep 12, 2010



Butterfly Valley posted:

It's not the HDR because HDR in monitors almost universally sucks.

So, just higher quality in general, due to some arcane specs I wouldn't know how to read, I guess?

That makes it even more important to get a recommended one from people who know what they are talking about, I guess.

Unsinkabear
Jun 8, 2013

Ensign, raise the beariscope.





M27Q is one of the few options that have outstanding color accuracy out of the box. Most need help, and without a fancy colorimeter I'm not sure how us regular folk are supposed to eyeball that calibration. Maybe correct color just isn't a thing most people care about, but it's a big one to me. My might-buy list is exclusively your M27Q or the XV272U KVbmiiprzx (at this point I'm just waiting to see if either one goes on sale), because they're the only two recommended options that RTINGS scored 9+ on pre-calibration color accuracy. Everything else is 7-ish or below with weirdness visible to the naked eye.

Dr. Video Games 0031
Jul 17, 2004

Butt Discussin posted:

So, just higher quality in general, due to some arcane specs I wouldn't know how to read, I guess?

That makes it even more important to get a recommended one from people who know what they are talking about, I guess.

The M27Q has a wide gamut. Color space is a weirdly complex topic, but the gist of it is that the M27Q is capable of displaying more colors than most monitors. By default, it may actually be displaying too many colors and is oversaturating the image. The M27Q is tuned to the "Adobe RGB" color space, while most games are designed for the narrower "sRGB" color space. A "color space" is basically a range of colors that we've decided to turn into a standard so displays know what to target if they want the image to be accurate. So if you want the image to be more accurate to the game developer's original intent, you should go into your monitor's settings and switch into sRGB mode. There's no need to do this if you prefer the look of the default mode, but some images can appear very off if oversaturated. A common example is that people may appear sunburnt, as the monitor will take any hint of redness in their skin and blow it out of proportion. Darker browns and tans will still look mostly normal, but lighter skintones end up appearing very wrong. This can happen for blues and greens too, but there are fewer examples that look obviously wrong to our eyes (we're much more sensitive to when people look wrong). sRGB mode will restrict the color range and map colors to their intended tones when viewing sRGB content.

If you already are on sRGB mode, then I guess what you're noticing is just the greater sRGB coverage. Until recently, many older TN displays had 90% or worse sRGB coverage, which means that those images looked more desaturated than intended on those monitors. Going from ~90% to ~100% coverage isn't usually such a huge leap though, so either your old monitor really sucked or you're not on sRGB mode.

I typed this right after waking up so I hope it makes sense.

edit: HDR mode is only enabled if you specifically turn it on in windows and then also your monitor's settings (it should have some kind of separate HDR mode). The image will appear washed out in windows until you enable it in your monitor too, at which point it should hopefully look normal. Then you must launch games in exclusive full screen mode, and only then will you have the option to enable HDR in the game. Now, with all the stars aligned, you may have an HDR gaming experience, and feel the disappointment as your monitor's HDR looks the same as SDR but with the game dynamically adjusting the brightness setting, basically. Same-frame contrast will still be poo poo on most "HDR" monitors, sorry to say. If the game supports a wide color gamut in HDR mode you may benefit from that, but I'm not sure I've actually run into any games that do (you'd basically have to design every asset twice, once for sRGB, and again for HDR's wider color space, and nobody seems willing to do that).
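A small numerical illustration of the oversaturation effect described above, using the standard published sRGB and Adobe RGB primaries (the matrices below are the usual RGB-to-XYZ conversions at a D65 white point; each column is one primary):

```python
# Feeding sRGB values straight to a wide-gamut panel moves colors to the
# panel's own primaries. Columns of each matrix are the R, G, B primaries
# in CIE XYZ (D65), from the published sRGB and Adobe RGB definitions.

SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2973, 0.6274, 0.0753],
    [0.0270, 0.0707, 0.9911],
]

def chromaticity(matrix, channel):
    """xy chromaticity of one primary (one column of the RGB->XYZ matrix)."""
    X, Y, Z = (matrix[row][channel] for row in range(3))
    total = X + Y + Z
    return round(X / total, 3), round(Y / total, 3)

GREEN = 1  # column index of the green primary
print("pure green as sRGB intends it:    ", chromaticity(SRGB_TO_XYZ, GREEN))
print("same value on an Adobe RGB panel: ", chromaticity(ADOBE_TO_XYZ, GREEN))
```

The sRGB green lands at (0.30, 0.60) but displays at (0.21, 0.71) on an unclamped wide-gamut panel, i.e. noticeably further from white and thus more saturated than intended; an sRGB clamp mode maps it back.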

Dr. Video Games 0031 fucked around with this message at 04:05 on Oct 28, 2021

Dr. Video Games 0031
Jul 17, 2004

Speaking of HDR:

https://twitter.com/tomshardware/status/1453454537570795531

Note that this "inexpensive" 27-inch 1440p 170Hz monitor is $1000, lmao.

FALD = "Full-Array Local Dimming," which is being produced here by an array of independently controlled mini-LED backlights. There are 576 dimming zones in this monitor, which is the kind of dimming that's necessary for decent same-scene contrast ratios. There's a catch, however. Take a dark scene and put a bright object in it that fills up exactly one of these dimming zones. That object will appear very bright, and the surrounding scene will appear very dark—OLED-like, even. But make that object much smaller so it only takes up a portion of the dimming zone, and what will happen is either the panel's logic will tone down the brightness of that one zone (and thus make the bright object dimmer), or the dark space around the bright object will "bloom" with light because the whole zone is being illuminated and IPS pixels suck at blocking light. In especially bad instances of blooming, it can be more distracting than if the image simply had universally bad contrast.
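If you want the zone/bloom tradeoff spelled out concretely, here's a toy Python model of an FALD backlight. The zone size, the leakage figure, and the "backlight follows the brightest pixel in the zone" rule are all simplifications I made up for illustration; real panels use much fancier control logic.

```python
ZONE = 8            # 8x8 pixels per dimming zone (illustrative)
LEAKAGE = 1 / 1000  # a "black" pixel still passes 1/1000th of the backlight,
                    # i.e. roughly a 1000:1 native IPS contrast ratio

def render(frame):
    """Apply per-zone dimming: each zone's backlight follows its brightest
    pixel, and every pixel transmits between LEAKAGE and 1.0 of that light."""
    h, w = len(frame), len(frame[0])
    out = [[0.0] * w for _ in range(h)]
    for zy in range(0, h, ZONE):
        for zx in range(0, w, ZONE):
            ys = range(zy, min(zy + ZONE, h))
            xs = range(zx, min(zx + ZONE, w))
            backlight = max(frame[y][x] for y in ys for x in xs)
            for y in ys:
                for x in xs:
                    transmit = LEAKAGE + frame[y][x] * (1 - LEAKAGE)
                    out[y][x] = backlight * transmit
    return out

# Dark 16x16 frame with a single bright pixel
frame = [[0.0] * 16 for _ in range(16)]
frame[3][3] = 1.0
shown = render(frame)

print(shown[0][0])    # black pixel sharing the bright pixel's zone: glows (bloom)
print(shown[12][12])  # black pixel in an all-dark zone: truly black, zone is off
```

The bright pixel forces its whole zone's backlight to full, so its black neighbors leak light (that's the bloom), while zones with nothing bright in them can switch off completely and hit true black. More, smaller zones shrink the blooming halo, which is the whole argument for higher zone counts.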

The M1 iPad Pro and new MacBook displays have even denser local dimming zones, and users are reporting a fair amount of light blooming in HDR content (less so for the MacBooks). The Asus PG32UQX is a 32" 4K IPS monitor with 1152 zones (42% more zone density than this new AOC monitor), and the HDR is said to be pretty good for the most part, but there's still some noticeable bloom. In contrast, the 49" super ultrawide Odyssey Neo G9 (basically two 27" displays stitched into one) has 2048 zones, and the HDR is almost OLED-like even in the most challenging of scenarios, which is due in part to the zone density being 25% greater still than the Asus monitor's, but also to VA pixels being much better at blocking out light (hence their superior native contrast).
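For the curious, those density percentages are just back-of-envelope math: for a fixed aspect ratio, screen area scales with the diagonal squared, so zones per unit area is roughly zones / diagonal². A quick Python check, treating the 32:9 Neo G9 as two 27" 16:9 panels like I did above:

```python
def density(zones, diagonal_inches):
    # zones per unit area; area ~ diagonal^2 at a fixed aspect ratio
    return zones / diagonal_inches ** 2

aoc = density(576, 27)     # new AOC monitor, 27" 16:9
asus = density(1152, 32)   # Asus PG32UQX, 32" 16:9
neo = density(2048, 27) / 2  # Neo G9: 2048 zones over two 27" 16:9 panels' area

print(asus / aoc)   # ~1.42, i.e. ~42% higher zone density than the AOC
print(neo / asus)   # ~1.25, i.e. the Neo G9 is ~25% denser still
```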

All of this is to say that I don't think a 27" IPS monitor with 576 zones is going to make for a mind-blowing HDR experience, which makes the $1000 price tag rather disappointing. Monitor manufacturers have also shown that controlling FALD backlights is tricky, and most have not yet mastered the art. Still though, this will probably give you a superior experience to SDR. It doesn't seem worth $1000, but it's a step in the right direction. Other companies are putting out flagship products with FALD backlights, and it hopefully won't be long before the flagship features become the norm. This will be the next hot thing for manufacturers to market now that they've exhausted the public's interest in GtG response times.

Dr. Video Games 0031 fucked around with this message at 02:59 on Oct 28, 2021

Question Time
Sep 12, 2010



Dr. Video Games 0031 posted:

The M27Q has a wide gamut. Color space is a weirdly complex topic, but the gist of it is that the M27Q is capable of displaying more colors than most monitors. By default, it may actually be displaying too many colors and oversaturating the image. The M27Q is tuned to the "Adobe RGB" color space, while most games are designed for the narrower "sRGB" color space. A "color space" is basically a range of colors that we've decided to turn into a standard so displays know what to target if they want the image to be accurate. So if you want the image to be more accurate to the game developer's original intent, you should go into your monitor's settings and switch into sRGB mode. There's no need to do this if you prefer the look of the default mode, but some images can appear very off when oversaturated. A common example is that people may appear sunburnt, as the monitor will take any hint of redness in their skin and blow it out of proportion. Darker browns and tans will still look mostly normal, but lighter skin tones end up appearing very wrong. This can happen with blues and greens too, but there are fewer examples that look obviously wrong to our eyes (we're much more sensitive to when people look wrong). sRGB mode will restrict the color range and map colors to their intended tones when viewing sRGB content.

If you're already in sRGB mode, then I guess what you're noticing is just the greater sRGB coverage. Until recently, many older TN displays had 90% or worse sRGB coverage, which means images looked more desaturated than intended on those monitors. Going from ~90% to ~100% coverage isn't usually such a huge leap, though, so either your old monitor really sucked or you're not in sRGB mode.

I typed this right after waking up so I hope it makes sense.

edit: HDR mode is only enabled if you specifically turn it on in Windows and then also in your monitor's settings (it should have some kind of separate HDR mode). The image will appear washed out in Windows until you enable it on your monitor too, at which point it should hopefully look normal. Then you must launch games in exclusive fullscreen mode, and only then will you have the option to enable HDR in the game. Now, with all the stars aligned, you may have an HDR gaming experience, and feel the disappointment as your monitor's HDR looks the same as SDR but with the game dynamically adjusting the brightness setting, basically. Same-frame contrast will still be poo poo on most "HDR" monitors, sorry to say. If the game supports a wide color gamut in HDR mode you may benefit from that, but I'm not sure I've actually run into any games that do (you'd basically have to design every asset twice, once for sRGB, and again for HDR's wider color space, and nobody seems willing to do that).

This is very informative, thank you very much!

HDR was actually disabled on both monitors, and enabling it does seem to produce a slight improvement on both, mainly darker blacks. Nothing world-shattering, but it's better than before. I guess I've become the old guy paying for all the expensive features and not using them out of ignorance, but it's good to clear that up.

On the M27Q, the different color modes do make a big difference, and setting it to "sRGB mode" does make it look a lot closer to the other monitor. I guess the M27Q set to "FPS" or "RTS" mode just makes it look supersaturated, and more beautiful to me.

Rand Brittain
Mar 25, 2013

"Go on until you're stopped."
Does anybody have any experience with Samsung's return process? I want to send back my G7 to be repaired because the auto-switch-input function doesn't work, and it's still in warranty, but I can't get anybody at Samsung to admit they've ever seen the problem in spite of the dozens of people on their forums complaining about it, which makes me loath to ship a large, fragile monitor away and wait a month for it to come back and then, possibly, work better.

(I wish I could send it back to Newegg for a replacement but by the time I realized it was actually broken and not just weird I'd already thrown out all the boxes.)

SkyeAuroline
Nov 12, 2020

Rand Brittain posted:

Does anybody have any experience with Samsung's return process? I want to send back my G7 to be repaired because the auto-switch-input function doesn't work, and it's still in warranty, but I can't get anybody at Samsung to admit they've ever seen the problem in spite of the dozens of people on their forums complaining about it, which makes me loath to ship a large, fragile monitor away and wait a month for it to come back and then, possibly, work better.

(I wish I could send it back to Newegg for a replacement but by the time I realized it was actually broken and not just weird I'd already thrown out all the boxes.)

Not for monitors, but I'm currently fighting them over my phone, which I sent in for repairs and not only came back unrepaired but is actually now significantly worse than it was before. Support gives zero fucks and won't guarantee anything, even what's spelled out in the warranty policy. Terrible support.

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

If you're within the time limit to return it to the vendor instead, I'd do that.

Rand Brittain
Mar 25, 2013

"Go on until you're stopped."

Rexxed posted:

If you're within the time limit to return it to the vendor instead, I'd do that.

I'm not, and I don't have the packaging any longer. It's very annoying because it seems like every review said "wow, this is a high-end monitor; the best you can get if you want to spend the money" and then after I spent my bonus on it all the comments popped up saying "by the way this has an astonishing number of QA issues and features that don't work."

On the bright side, it displays pictures very prettily.

Dennis McClaren
Mar 28, 2007

"Hey, don't put capture a guy!"
...Well I've got to put something!
I just bought a new Acer Nitro 5 gaming laptop with an RTX 3050 Ti in it. I used the Nvidia control panel and the external monitor's controls to calibrate my external gaming monitor yesterday.
But today, when I started using the laptop itself without the external monitor, I noticed there's a yellow tinge/tint over all the white colors on my display. Whether it's in a movie/video, where the yellow-shaded whites are very prominent, or just browsing Chrome, all the whites are kind of washed with a yellow influence.
I didn't change any display settings on my laptop though - just the external monitor. No Windows settings. So I'm not sure how this happened.
How do I get rid of this yellow tint that's discoloring my whites? Thanks for any help.

Withnail
Feb 11, 2004
I'm sure buying 'vintage' tech is a terrible idea. But I just picked up this old 30 inch cinema display for $50 off craigslist. And it's still, sort of, good?

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Withnail posted:

I'm sure buying 'vintage' tech is a terrible idea. But I just picked up this old 30 inch cinema display for $50 off craigslist. And it's still, sort of, good?



is it dvi-i? also it seems that thing isn't height adjustable (but hardly an issue for $50)

Withnail
Feb 11, 2004

Rinkles posted:

is it dvi-i? also it seems that thing isn't height adjustable (but hardly an issue for $50)

It's dvi-d, I just plugged it into the gtx1080 and 4.1 million pixels fired up (and the room got a little warmer).

DerekSmartymans
Feb 14, 2005

The
Copacetic
Ascetic

Dennis McClaren posted:

I just bought a new Acer Nitro 5 gaming laptop with an RTX 3050 Ti in it. I used the Nvidia control panel and the external monitor's controls to calibrate my external gaming monitor yesterday.
But today, when I started using the laptop itself without the external monitor, I noticed there's a yellow tinge/tint over all the white colors on my display. Whether it's in a movie/video, where the yellow-shaded whites are very prominent, or just browsing Chrome, all the whites are kind of washed with a yellow influence.
I didn't change any display settings on my laptop though - just the external monitor. No Windows settings. So I'm not sure how this happened.
How do I get rid of this yellow tint that's discoloring my whites? Thanks for any help.

Check the laptop settings for Windows 10's "Night light" feature being on. It shifts colors down from blue light to warm, yellow-orange colors on my older laptop. I used to get the same treatment from f.lux until MS came out with their own official tool.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
given the current market it’s hard to argue with an older high end monitor for $50. Yeah i’m sure the newer high-end stuff is better, but it’s also not gonna be $50, and quality in midrange/low-end monitors drops off fast below $300.

take apple out of the picture and what if it was an older pro dell IPS or something, like a P3012Q (made up model) then yeah sure whatever, worth it if you’re looking for something along those lines

Paul MaudDib fucked around with this message at 04:57 on Oct 29, 2021

Dr. Video Games 0031
Jul 17, 2004

https://www.newegg.com/viewsonic-vx...-900-_-10292021

Viewsonic VX2758-2KP-MHD, the decent 27" 144Hz 1440p IPS we've brought up a few other times in this thread, is on sale for $220 currently. If you've been looking for a budget-tier 1440p 144Hz display, this is it. The color gamut coverage is average for an IPS. Not quite wide gamut, but it has full sRGB coverage and then some. The response times aren't quite as good as the current fastest monitors, but most people probably wouldn't be able to tell the difference between this and a faster IPS at 144Hz. It's a perfectly good monitor for $220 with no pointless bells and whistles.

Inner Light
Jan 2, 2020



Dr. Video Games 0031 posted:

https://www.newegg.com/viewsonic-vx...-900-_-10292021

Viewsonic VX2758-2KP-MHD, the decent 27" 144Hz 1440p IPS we've brought up a few other times in this thread, is on sale for $220 currently. If you've been looking for a budget-tier 1440p 144Hz display, this is it. The color gamut coverage is average for an IPS. Not quite wide gamut, but it has full sRGB coverage and then some. The response times aren't quite as good as the current fastest monitors, but most people probably wouldn't be able to tell the difference between this and a faster IPS at 144Hz. It's a perfectly good monitor for $220 with no pointless bells and whistles.

Looks good. One thing, though: it doesn't have height adjust on the stand. Some goons will put it on a VESA mount so it doesn't matter, but if you are using the default stand, height adjustment is real nice, especially if you have a standing desk.

Dennis McClaren
Mar 28, 2007

"Hey, don't put capture a guy!"
...Well I've got to put something!

DerekSmartymans posted:

Check the laptop settings for Windows 10's "Night light" feature being on. It shifts colors down from blue light to warm, yellow-orange colors on my older laptop. I used to get the same treatment from f.lux until MS came out with their own official tool.

That's what I thought too, because I remember that gives a yellow tint. But sadly no, it's turned off.

In reference to my post above on this page..
Any other display pros here want to give it a shot? Any reason why my screen has a yellow tinge/tint over all my whites? It's subtle, but it's noticeable enough to throw off videos and shows I'm watching when white scenes are obviously off-white with a yellow tint.
How can I get this fixed?
RTX 3050 Ti on an Acer Nitro 5

SkyeAuroline
Nov 12, 2020

Dr. Video Games 0031 posted:

https://www.newegg.com/viewsonic-vx...-900-_-10292021

Viewsonic VX2758-2KP-MHD, the decent 27" 144Hz 1440p IPS we've brought up a few other times in this thread, is on sale for $220 currently. If you've been looking for a budget-tier 1440p 144Hz display, this is it. The color gamut coverage is average for an IPS. Not quite wide gamut, but it has full sRGB coverage and then some. The response times aren't quite as good as the current fastest monitors, but most people probably wouldn't be able to tell the difference between this and a faster IPS at 144Hz. It's a perfectly good monitor for $220 with no pointless bells and whistles.

If all I've got is an RX 580, is it worth pushing for this? I already have a rough time at just 1080p60 sometimes.
Not in the best spot to pick up any hardware right now but if it's the best option I might be able to squeeze it out.


K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Depends on what you're doing and when. Any modern games are going to get murdered. You can probably still get to 60ish FPS though. Desktop experience would be a big upgrade. Older games would be a big upgrade, especially older stuff where you can get really high framerates.

I don't think you should necessarily feel rushed to buy a monitor just because it's an upgrade if you have no money and aren't going to any time soon.
