Dr. Video Games 0031
Jul 17, 2004

codo27 posted:

Rtings has a great rep, but they constantly cite HDR as a worthwhile difference between one monitor and one that doesn't have it. But it's been said here that it's basically a scam as far as monitors go. What gives?

Rtings is a great source of test data, and most of their high scorers are solid monitors. That said, the way their scoring criteria works has led to some strange results before, and they are perhaps too lenient on features that don't fully deliver.

HDR in monitors isn't universally a scam though. There are OLED monitors and monitors with mini-LED backlights that have enough backlight zones to give a satisfying HDR experience. I'm sure there are some monitors that rtings gives undue credit to just for having basic HDR support, though.

Ultimately, you need to look at just the parts of the reviews that matter to you. I tend to ignore any editorializing they do, since they sometimes have weird priorities.


codo27
Apr 21, 2008

I put a scratch on my original ROG Swift when moving a few years back and have just dealt with it. I want something new though, and I want VESA mountable so I can have the monitor sit a bit further back from me. I'd like to make the jump to 4K, but I'm running an HDMI cable out to the living room TV for that, as that will only be for casual/lounging style games anyway since I won't play my shooters and poo poo without high frames. So I guess it's going to be another 27" @ 1440p. The M27Q is on sale right now, though many of its extra features seem superfluous for my use case. Viewsonic has an option that's $100 CAD cheaper, but as usual I'm given pause by reviews which may or may not be worth a gently caress. Couldn't care less about hard to reach USB ports, fancy OSDs, RGB or ergonomics - it's going on a wall mount anyway.

One thing I did notice the other day, we were going through our wedding photos in bed the morning we received them, this was on my Surface Laptop. When I looked at them again later on the Swift, the colors looked so bad in comparison. That screen is 8 years old now, I wonder how one of these newer displays would handle color in comparison.

e: apparently the Viewsonic's response time jumps to 6ms when G-Sync is enabled, so that sucks.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

codo27 posted:

One thing I did notice the other day, we were going through our wedding photos in bed the morning we received them, this was on my Surface Laptop. When I looked at them again later on the Swift, the colors looked so bad in comparison. That screen is 8 years old now, I wonder how one of these newer displays would handle color in comparison.

Judging by you saying it's 8 years old and a quick glance at your old posts ITT: it's almost certainly TN, which means any modern IPS monitor is going to have colors and contrast levels that blow it out of the water.

As for HDR, the quick summary is that you probably won't use it that much even on a dedicated gaming monitor (because HDR support in PC games is not that widespread and even when it's available it might not be a good idea to turn it on, depending on the game), but if you do want it and know you'll use it you'll have to pay a lot of money to get an implementation that's actually good. There's also a difference between actual HDR and the monitor having a wide color gamut/being able to accept a signal with 10-bit color.

BTW there's also an M27Q X now that's 240 Hz and has a normal RGB subpixel layout.

InternetOfTwinks
Apr 2, 2011

Coming out of my cage and I've been doing just bad
Got my dual monitor VESA arm setup all worked out at this point, huge quality of life upgrade from the old 1360x768 TV with overscan issues I was using for a display before. Probably will add one more so I can have a display permanently in portrait orientation for code/web content, but I'm pretty happy with it all now. Couple things I think could elevate it though, was hoping I could get pointed in the right direction.

I'm looking for like, a tiny display, something that can sit on the desk under the main monitors where I could put, for instance, Spotify or Discord and be able to check it at a glance while using the main monitors for other tasks. Touchscreen capability would be the cherry on top, being able to pause/play and other simple tasks on my desktop without toggling the peripheral switch over from my work laptop would be super convenient.

Would also be nice if I could set up a hardware switch to handle switching my monitors between the two computers. Currently I'm using the monitor buttons to switch both monitors individually, which works but is kinda inconvenient - bordering on obnoxious if my desktop is in sleep mode, since these Sceptre monitors will auto-switch back to the last connection if they aren't receiving a signal within like half a second of switching to a different input.

Bad Purchase
Jun 17, 2019




^ i'm also curious about whether there are any good KVMs out there, especially the "M" part and particularly something supporting multiple displays. but given how finicky monitors and graphics cards can be with cables and refresh rates and especially features like g/free sync, i wouldn't be surprised if sticking with the monitor's built-in input switching is the best way to go.

knox_harrington
Feb 18, 2011

Running no point.

InternetOfTwinks posted:

I'm looking for like, a tiny display, something that can sit on the desk under the main monitors where I could put, for instance, Spotify or Discord and be able to check it at a glance while using the main monitors for other tasks. Touchscreen capability would be the cherry on top, being able to pause/play and other simple tasks on my desktop without toggling the peripheral switch over from my work laptop would be super convenient.

How tiny? How much $? You can look at portable monitors that may fit the bill. They'll be around 14"

Lenovo make good ones apparently (no personal experience) though I'm sure you could find something cheaper off Amazon.

https://www.lenovo.com/us/en/p/accessories-and-software/monitors/office/61dduar6us

InternetOfTwinks
Apr 2, 2011

Coming out of my cage and I've been doing just bad
14" might be a bit on the large side, was thinking more like 7-10", but I'll check those out and see if they fit the bill.

knox_harrington
Feb 18, 2011

Running no point.

Looks like some rpi oriented screens could work (and a lot cheaper)

https://www.waveshare.com/product/raspberry-pi/7inch-hdmi-lcd-h-with-case.htm

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me

knox_harrington posted:

Looks like some rpi oriented screens could work (and a lot cheaper)

https://www.waveshare.com/product/raspberry-pi/7inch-hdmi-lcd-h-with-case.htm

This is a good idea, but I would maybe look for something with more vertical resolution. Modern day Windows can be hard to navigate with only 600 vertical pixels. Or maybe make sure it works in portrait mode I guess.

Vintersorg
Mar 3, 2004

President of
the Brendan Fraser
Fan Club



Ok, this is just insane.

https://www.youtube.com/watch?v=CsoKWsZ-Tyw

Samsung 4k 55" curved monitor, $3500~

Vintersorg fucked around with this message at 18:57 on Aug 15, 2022

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Bad Purchase posted:

^ i'm also curious about whether there are any good KVMs out there, especially the "M" part and particularly something supporting multiple displays. but given how finicky monitors and graphics cards can be with cables and refresh rates and especially features like g/free sync, i wouldn't be surprised if sticking with the monitor's built-in input switching is the best way to go.

I use this software solution (basically get a cheap USB switch and trigger monitor input switches with software based on USB connect/disconnect events). It's not exactly user-friendly though and probably near unusable if you're not a pretty big nerd.
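One way to approximate that kind of setup on Linux, purely as a sketch (this is not necessarily the tool referred to above): a udev rule fires when the USB switch hands its devices to this machine, and a script flips the monitors over DDC/CI with ddcutil (VCP feature 60 is input select in the MCCS spec). The USB vendor/product IDs and the input value 0x0f are placeholders you'd replace with your own hardware's values.

```shell
# /etc/udev/rules.d/99-kvm.rules  (USB IDs below are placeholders for your switch)
# When the USB switch's devices appear on this machine, grab the monitors.
ACTION=="add", SUBSYSTEM=="usb", ATTR{idVendor}=="1a2b", ATTR{idProduct}=="3c4d", \
    RUN+="/usr/local/bin/monitors-to-desktop.sh"

# /usr/local/bin/monitors-to-desktop.sh
#!/bin/sh
# VCP feature 60 is input select; 0x0f is DisplayPort-1 on many monitors,
# but check `ddcutil capabilities` for your panel's actual input codes.
ddcutil --display 1 setvcp 60 0x0f
ddcutil --display 2 setvcp 60 0x0f
```

Note udev kills long-running RUN+= processes quickly, so keep the script to the two ddcutil calls.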

Dr. Video Games 0031
Jul 17, 2004

TheFluff posted:

Judging by you saying it's 8 years old and a quick glance at your old posts ITT: it's almost certainly TN, which means any modern IPS monitor is going to have colors and contrast levels that blow it out of the water.

As for HDR, the quick summary is that you probably won't use it that much even on a dedicated gaming monitor (because HDR support in PC games is not that widespread and even when it's available it might not be a good idea to turn it on, depending on the game), but if you do want it and know you'll use it you'll have to pay a lot of money to get an implementation that's actually good. There's also a difference between actual HDR and the monitor having a wide color gamut/being able to accept a signal with 10-bit color.

BTW there's also an M27Q X now that's 240 Hz and has a normal RGB subpixel layout.

Eh, I haven't found HDR to make a gaming experience worse yet, though some games have an underwhelming presentation. Windows Auto HDR also does a decent job of filling in the holes. I have to squint every time I use a cutting laser in a dark area while playing Hardspace: Shipbreaker, for instance - maybe you wouldn't find that fun, but I do! So I find that I'm using the HDR mode on my monitor quite a bit, actually.

codo27 posted:

I put a scratch on my original ROG Swift when moving a few years back and have just dealt with it. I want something new though, and I want VESA mountable so I can have the monitor sit a bit further back from me. I'd like to make the jump to 4K, but I'm running an HDMI cable out to the living room TV for that, as that will only be for casual/lounging style games anyway since I won't play my shooters and poo poo without high frames. So I guess it's going to be another 27" @ 1440p. The M27Q is on sale right now, though many of its extra features seem superfluous for my use case. Viewsonic has an option that's $100 CAD cheaper, but as usual I'm given pause by reviews which may or may not be worth a gently caress. Couldn't care less about hard to reach USB ports, fancy OSDs, RGB or ergonomics - it's going on a wall mount anyway.

One thing I did notice the other day, we were going through our wedding photos in bed the morning we received them, this was on my Surface Laptop. When I looked at them again later on the Swift, the colors looked so bad in comparison. That screen is 8 years old now, I wonder how one of these newer displays would handle color in comparison.

e: apparently the Viewsonic's response time jumps to 6ms when G-Sync is enabled, so that sucks.

I looked into the VX-2768 sometime last year, and apparently it's a refresh of the already mediocre 2758 except with much worse response times across the board. I'd avoid it. The M27Q is a nice, vibrant IPS, though the BGR subpixels can cause text clarity issues in some instances (anywhere Windows ClearType or zero anti-aliasing is used should be fine, but custom text AA is often broken). You may not need the extra features, but the only things I'm seeing for cheaper than the M27Q in Canada right now are cheap junk that I wouldn't recommend. The M27Q is really the baseline.

a dingus
Mar 22, 2008

Rhetorical questions only
Fun Shoe
HDR is awesome. I'm probably only playing games that do it well but after seeing it part of me only wants to play games that support it.

VelociBacon
Dec 8, 2009

Yeah, I have an M32U and I have to say when I play games with HDR it absolutely makes a difference and I always use it.

It's a huge pain in the rear end sometimes though, because most games won't auto-toggle HDR. I find myself launching a game, realizing I forgot to turn HDR on, closing the game, turning HDR on, and relaunching. Some games front-load the brightness/HDR settings on first launch, and in those cases it's really annoying to dig those settings back up after the game has already decided I'm not using HDR.

CAPTAIN CAPSLOCK
Sep 11, 2001



Vintersorg posted:

Ok, this is just insane.

Samsung 4k 55" curved monitor, $3500~

I want Hardware Unboxed to do testing on that monstrosity. :getin:

SpaceDrake
Dec 22, 2006

I can't avoid filling a game with awful memes, even if I want to. It's in my bones...!
So after more than a decade, I return to Movax's House of Monitor Chat. Although it's given me eleven years of good service, I have reason to think my Dell U2410 is beginning to suffer various kinds of failure (or is at least becoming very sensitive to heat, which is making this summer A Struggle, needless to say) and is nearing the end of its operational life without extensive refurbishment. From a cursory look, however, it doesn't seem like "modern" UltraSharps offer the same feature set as my good ol' 2410 (a soundbar, composite/component PiP, easy monitor controls, etc. - no need for DVI since it's dead on new cards, but still). Would I be better off just getting an already-refurbed 2410, since I really like what my 2410 does as it stands? There are some good prices out there, but that also seems dubious. (My current home setup would make getting a much bigger monitor a bit of a struggle without also replacing my desk/re-arranging my working area, and much bigger than 27" would be a waste in any event.)

EDIT: Uhp, yep, there it goes. Only displaying red and can't be turned off without unplugging it, and turning it back on is pure black screen territory, unlike previously where an unplug would let it work... for a while. I think that last power blip got to it even through the surge protector (which is a bit worrying on the surge protector's side) with the recent heat doing nothing to help. The display itself still seems fine but the control board for it has probably taken a pounding over the years - and I swear the whole monitor's been getting hotter than it should over the past few years at least. It'd probably cost more to repair/fix the board for the thing than to just get a new display in 2022, so... pour out a 40, it's done.

(Thank god I saved my rinky-dink SP1908FP for contingencies like this. :v: Shine on, you trooper.)

So what am I looking at for a replacement with similar image quality, at least? The S2421H seems to have built-in speakers (with PC or console audio able to come in via HDMI, I'd assume) and two HDMIs, so I could flip between PC input and console input easily without a switch. I know DVI on modern monitors (and cards, once my 970 finally no longer cuts the mustard for real) is dead, but drat it hurts to lose those nifty RCA input options and that wonderful cornucopia of jack options in the back, and goddamnit I liked being a weirdo with a 1920x1200 monitor :argh:. On the other hand, the price is pretty much exactly what I'd want in a swift replacement main monitor (or even just a monitor in general, since in truth I never did use a lot of those extra features I paid coin for back then, I have no real personal interest in anything above 1080p/1200p anyway, and it can get here pretty goddamned fast). It also looks like, with IPS panels becoming standard, a lot of the features I got the UltraSharp for in 2010-11 (the high color range, etc.) are now a lot more standard on monitors in the $200 range, though the newer UltraSharps lack speakers. Is this all pretty correct, or am I missing things?

fake edit: also I guess the RCA thing is just a casualty of RCA not being able to push over 1080p, so as soon as 4k came into vogue, the writing was on the wall. :sigh:

SpaceDrake fucked around with this message at 17:29 on Aug 16, 2022

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Just wanted to say re: the QD OLED chat that if you're willing to move up to an S95B you will not get the pentile issues at all. At least I don't see it whatsoever. Possibly because the DPI is much higher than the Alienware.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Taima posted:

Just wanted to say re: the QD OLED chat that if you're willing to move up to an S95B you will not get the pentile issues at all. At least I don't see it whatsoever. Possibly because the DPI is much higher than the Alienware.

A 55" S95B should have a DPI of ~80. A 3440x1440 34" ultrawide has a DPI of about 110 (which is about the same as a 4k display at 40", for reference). But you're likely not sitting as close to the TV as you would a monitor, so the lower DPI is fine and it's gonna be a lot harder to pick out single-sub-pixel lines.
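Those figures fall straight out of the diagonal-pitch arithmetic; a quick sketch to check them (the `ppi` helper name is just for illustration):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: pixel count along the diagonal divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 55)))  # 55" 4K (e.g. S95B) -> 80
print(round(ppi(3440, 1440, 34)))  # 34" ultrawide      -> 110
print(round(ppi(3840, 2160, 40)))  # 40" 4K             -> 110
```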

As much as using an OLED TV as a monitor does have some pretty real use cases and advantages, sadly my desk just won't work for something that large.

SpaceDrake
Dec 22, 2006

I can't avoid filling a game with awful memes, even if I want to. It's in my bones...!

DrDork posted:

As much as using an OLED TV as a monitor does have some pretty real use cases and advantages, sadly my desk just won't work for something that large.

I kind of alluded to this in my other :words: post: going much above a 27" for my desk is going to be pushing it for me, and the sort of desk that'd support a 30+" monitor is something of an investment unto itself, never minding how I'm not entirely sure I'd use that much screen real estate at typical desk distances. If anything, it's a bit of a relief to see that 24" monitors still exist, after watching TVs and monitors balloon in size over the past decade as 4K or higher has come into vogue.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

DrDork posted:

A 55" S95B should have a DPI of ~80. A 3440x1440 34" ultrawide has a DPI of about 110 (which is about the same as a 4k display at 40", for reference). But you're likely not sitting as close to the TV as you would a monitor, so the lower DPI is fine and it's gonna be a lot harder to pick out single-sub-pixel lines.

As much as using an OLED TV as a monitor does have some pretty real use cases and advantages, sadly my desk just won't work for something that large.

Oh that's fair.

I guess what I mean is that because it's fairly far away from your desk, it has an effectively higher DPI, right? And that's what I'd recommend for anyone who can't fit it on their desk: just get another table and set it up about a human's height away from your keyboard. You wouldn't want this monitor anywhere near that close to yourself anyways, so it's a win-win.

I run it like this, the TV is about 5 to 5.5 feet away. I tried all manner of distances and that's what I subjectively thought was the correct setup.



I own a lot of the best technology, such as the Dell U4021QW, a 65" LG C9, etc., and QD OLED is just... the end game.

The brightness, color volume, and insane response time of the panel are just out of this world for both gaming and PC use specifically. There's no competition. Make no mistake; there are colors on this TV that you have literally never seen before on a panel. It's nuts. I did the last 25% of God of War PC on this panel at 4K/110 fps with DLSS Quality and my jaw was on the floor the whole time. The panel itself also supports 4K/144, which is nice to have if you're looking at a higher end Lovelace; the 4090 will absolutely push those numbers, but it's not required. 4K/90 fps on a slower card gets you 90% of the way there in practice.

Also, for some reason Samsung is dropping the price on the panel absurdly quickly; the 65 inch is somehow already $1650 on the ed/corp site they run.

I paid $2400 in a brick and mortar store, which kinda sucks, but I also bought the insanely stupid Best Buy burn-in warranty because I was using it for PC. I think the warranty was $600, but 5 years of peace of mind? Totally worth. I work from home, so this thing will be sitting on the desktop with me coding in things like Sublime all day - if anyone is going to cause burn-in, it's me.

However, in theory the way QD OLED works should significantly lower the chances of general burn-in compared to a traditional OLED.

Another weird side effect is that I suffer from eye strain... just a natural side effect of being an aging (mid 30s) goon who is on a screen at work and then after work. But somehow, the S95B has significantly decreased my eye strain versus the LG C9. No idea why that is, but it's been a huge blessing.

e: I can dig up how to purchase the S95B on the ed/corp site for $1650 if anyone cares, just say so

Taima fucked around with this message at 19:58 on Aug 16, 2022

shrike82
Jun 11, 2005

The odyssey ark seems more interesting if you want a huge monitor at your desk

I don’t see how using a TV as a monitor works

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Taima posted:

I guess what I mean is that because it's fairly far away from your desk, it has an effective higher DPI, right?

Sorta! What you're talking about is better classified as something like "pixels per arc-second"--which effectively takes into account both the pixel density of the display and your distance away from it. The further away you are from a given display, the fewer of its pixels fit into an arc-second of your vision. Or, conversely, the fewer pixels you need to maintain a given perceived level of detail.

Helpfully, absolutely no one bothers to include such data on their displays, though you can use some online calculators to figure it out. this calculator has the "Visual Acuity distance" as the distance for which the display in question has a 1-pixel-per-arc-second result, which is (in theory) what someone with 20/20 vision should be able to see as "clear".

For a 55" 4k, that distance is 3.7ft, which isn't that far off from the 5.5ft you find comfortable, and being even further away would likely go a long way toward "removing" the sub-pixel haloing that the Alienware is being dinged for--since now you're (in theory) too far away to discern individual pixels, let alone sub-pixels.
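For the curious, the acuity-distance math is simple enough to redo by hand. A sketch using the common 1-arc-minute-per-pixel convention for 20/20 vision (the function name is mine); it lands a touch under the 3.7 ft figure above, presumably because the calculator uses a slightly different acuity constant:

```python
import math

def acuity_distance_ft(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Distance at which one pixel subtends 1 arc-minute (the usual 20/20 threshold)."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    pixel_pitch_in = 1.0 / ppi
    one_arcmin_rad = math.radians(1.0 / 60.0)
    return pixel_pitch_in / math.tan(one_arcmin_rad) / 12.0  # inches -> feet

print(round(acuity_distance_ft(3840, 2160, 55), 1))  # 55" 4K TV     -> ~3.6 ft
print(round(acuity_distance_ft(3440, 1440, 34), 1))  # 34" ultrawide -> ~2.6 ft
```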

But yeah, I'm basically mad that QD OLED sounds absolutely bonkers good, except for that part. I typically sit 25-30" away from my monitor, which in terms of Visual Acuity distance would mean I'd want about a 34" 4k--a 34" 3440x1440 ain't quite gonna do it for deleting off that halo, and I know it would drive me nuts using it for office stuff for 6hrs a day.

If the prices keep dropping, though, I might just have to pick up one of the 65" S95B's to replace my existing 65" LED "120 Zone joke-mode HDR" TCL from a couple of years back. 'Cuz drat do they look nice.

shrike82 posted:

The odyssey ark seems more interesting if you want a huge monitor at your desk

I don’t see how using a TV as a monitor works

The short answer is you don't put it on your desk to begin with. I've seen people mount them on the wall and have their desk pulled away from it a bit, I've seen people who just do couch-desk setups in front of the TV, etc. If you've got a big, deep desk, you can sometimes get away with using the 40-42" ones right there, but typically that still ends up involving some head movement to see the corners which can be draining after a while. Anything over the 40's you need to be a couple of extra feet away from for it to be comfortable for most people.

DrDork fucked around with this message at 20:35 on Aug 16, 2022

Klungar
Feb 12, 2008

Klungo make bessst ever video game, 'Hero Klungo Sssavesss Teh World.'

Anyone have an ELI5 walkthrough of calibrating my X27q's? It's been 15 years since I last bought a monitor, so I'm not quite sure what I'm supposed to be shooting for here.

codo27
Apr 21, 2008

Taima posted:

I own a lot of the best technology

Rub that in my face why don't ya.


Dr. Video Games 0031
Jul 17, 2004

Taima posted:


I run it like this, the TV is about 5 to 5.5 feet away. I tried all manner of distances and that's what I subjectively thought was the correct setup.



If this is the 55-inch version, then I would probably move up to 4 feet away personally, but I'm sure you've already tried that. Everyone has their own preferences with this kind of thing.

SpaceDrake posted:

I kind of alluded to this in my other :words: post: going much above a 27" for my desk is going to be pushing it for me, and the sort of desk that'd support a 30+" monitor is something of an investment unto itself, never minding how I'm not entirely sure I'd use that much screen real estate at typical desk distances. If anything, it's a bit of a relief to see that 24" monitors still exist, after watching TVs and monitors balloon in size over the past decade as 4K or higher has come into vogue.

If 27" is doable though, I'd just go with that. 27" 1440p has sorta become the standard now, especially for gaming, and most of the best LCDs right now are of that spec. Like the LG 27GP83B (on sale now for $300, which is a pretty good deal) or the Acer XV272U KV. The latter will have slightly better contrast and a better built-in sRGB mode. These will be tough to drive on your 970 when gaming, though.

For 24", the only real options are 1080p. I think you should also get a high-refresh option instead of sticking with 60hz office monitors. You're gaming on it, and being able to use the PS5/XSX's 120hz mode or a newer GPU's high-framerate capabilities would be a nice upgrade. There are sub-$200 options available, and the AOC 24G2 is the current budget king.

Of these, only the Acer monitor has speakers, but I have to say that I have never heard monitor speakers that I've actually liked. I don't know what kind of speakers your old monitor had, but they're typically absolute trash, the worst speakers you can imagine. I'm suspecting that the S2421H you linked earlier will be the same. Modern monitors will have a 3.5mm line out that you can use for headphones or separate speakers, so no separate switch is needed.

Dr. Video Games 0031
Jul 17, 2004

Good lord, the size of the heatsink on the back of Asus' upcoming 42" W-OLED monitor.

https://www.youtube.com/watch?v=yudNKx8mdLQ&t=40s

SpaceDrake
Dec 22, 2006

I can't avoid filling a game with awful memes, even if I want to. It's in my bones...!

Dr. Video Games 0031 posted:

If 27" is doable though, I'd just go with that. 27" 1440p has sorta become the standard now, especially for gaming, and most of the best LCDs right now are of that spec. Like the LG 27GP83B (on sale now for $300, which is a pretty good deal) or the Acer XV272U KV. The latter will have slightly better contrast and a better built-in sRGB mode. These will be tough to drive on your 970 when gaming, though.

For 24", the only real options are 1080p. I think you should also get a high-refresh option instead of sticking with 60hz office monitors. You're gaming on it, and being able to use the PS5/XSX's 120hz mode or a newer GPU's high-framerate capabilities would be a nice upgrade. There are sub-$200 options available, and the AOC 24G2 is the current budget king.

Of these, only the Acer monitor has speakers, but I have to say that I have never heard monitor speakers that I've actually liked. I don't know what kind of speakers your old monitor had, but they're typically absolute trash, the worst speakers you can imagine. I'm suspecting that the S2421H you linked earlier will be the same. Modern monitors will have a 3.5mm line out that you can use for headphones or separate speakers, so no separate switch is needed.

Ahh, I wish I'd seen this just a hair earlier; though, ultimately, I stuck with 24" because of 1080p to help the 970 along a while longer, yeah. Ended up going with the little brother of the LG you linked, which was also on sale for what was obviously a pretty good loving deal. Managed to snag it and some decent-seeming budget speakers for $200, with the speakers only really there for when I don't feel like using my high quality headphones (or it's a little uncomfortable for them, like a lot of this summer has been).

Thank you, though! It at least drives home that I was generally on the right track with that purchase. And it's a bit wild being able to get a 24" for that price given how much the 2410 cost me a decade ago. :vv: Who says technology doesn't still march on?

Dr. Video Games 0031 posted:

Good lord, the size of the heatsink on the back of Asus' upcoming 42" W-OLED monitor.

https://www.youtube.com/watch?v=yudNKx8mdLQ&t=40s

All I can think about is what kind of loving sauna you'd turn a room into if you used this, a 3080 or equivalent, and an overclocked CPU in the same space. At some point cool air has to come from somewhere.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

SpaceDrake posted:

All I can think about is what kind of loving sauna you'd turn a room into if you used this, a 3080 or equivalent, and an overclocked CPU in the same space. At some point cool air has to come from somewhere.


Yeah, but that heatsink seems mostly to be just so they can avoid having a fan while keeping it cooler than other monitors. Most 55" OLEDs so far have, in actual use, averaged a little under 100W, including the C9 series. The Alienware QD-OLED takes like 43W. I'd be surprised if these ones aren't in line with those numbers, as well.

But you're not wrong about the rest of it. I'm on the top floor of my complex, in a "pop up" room that basically means it has kinda lovely insulation because it was added after the main building was constructed. Between my 3080 (even tuned), 5800X3D (so no OC), and my partner's computer (2700X + 3070), two human bodies and 6 monitors...it gets right toasty.

Vintersorg
Mar 3, 2004

President of
the Brendan Fraser
Fan Club



Is it as hot as the days of plasmas? I had the last of the Panasonic plasmas - 60". We put that beast in my bedroom one day after I got a new LED TV and my wife wanted to play Mario Kart. It was so drat hot in there - in the middle of winter. And we never needed heat in that apartment, as we were on the 3rd floor of 4.

Dr. Video Games 0031
Jul 17, 2004

Vintersorg posted:

Is it as hot as the days of plasmas? I had the last of the Panasonic plasmas - 60". We put that beast in my bedroom one day after I got a new LED TV and my wife wanted to play Mario Kart. It was so drat hot in there - in the middle of winter. And we never needed heat in that apartment, as we were on the 3rd floor of 4.

Apparently early 60" plasmas could pull over 500W, though I can't find any concrete testing on this, only random blogs and forum posts. And later-generation plasmas seemingly got much more power efficient—someone reported 135W for a 60" Samsung (though I don't know if that's with the brightness set all the way low or what).

500W would absolutely heat up a room, especially a small third-floor apartment bedroom. In contrast, the typical usage of the 42" C2 is 45 - 65W according to Hardware Unboxed, with peak usage at around 120W. LCDs will be around half to a third of this. So even an OLED wouldn't be nearly as bad as the most power-draining plasmas. You can comfortably put one in a bedroom.
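To put those draws in room-heating terms: essentially every watt a display pulls ends up as heat in the room, and 1 W of continuous draw is about 3.412 BTU/hr. A small sketch; the plasma and OLED wattages are the ones quoted above, the LCD figure is my rough assumption:

```python
def btu_per_hour(watts: float) -> float:
    """Convert continuous electrical draw to heat output (1 W ~= 3.412 BTU/hr)."""
    return watts * 3.412

# Plasma and OLED wattages as quoted above; LCD is a rough assumption.
for name, watts in [('early 60" plasma', 500), ('42" C2 OLED, typical', 55), ('27" LCD, typical', 25)]:
    print(f"{name}: ~{btu_per_hour(watts):.0f} BTU/hr")
```

At ~1700 BTU/hr, an early plasma really was in small-space-heater territory, which tracks with the bedroom story above.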

Asus is putting a heatsink there for (potentially) enhanced brightness. OLED panels have to walk a fine tightrope, and it takes a lot of juice to make them bright. This generates a lot of heat, and heat is bad for the organic compounds that make up the pixels. This results in both temporary image retention and permanent "burn-in" once the compounds deteriorate enough (it's more like burn-out). Because of this, display manufacturers are often reluctant to feed the panels too much juice in fear that it'll cause endurance issues or more obvious temporary image retention. But if you put a bigass heatsink on the back, you can drive the panel harder before these issues get too bad, which means more brightness and longevity.

Other display manufacturers use heatsinks too, though only on high-end models typically. And this is probably the biggest OLED heatsink I've ever seen. Unless that image's perspective is playing tricks on my eyes, that thing's thick. I think Asus is afraid of burn-in more than anything, since W-OLED is said to be less resistant to that than QD-OLED.

Dr. Video Games 0031 fucked around with this message at 22:57 on Aug 17, 2022

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Dr. Video Games 0031 posted:

I think Asus is afraid of burn-in more than anything, since W-OLED is said to be less resistant to that than QD-OLED.

That's my take, too. A couple of dollars of extra metal in the BOM is probably a reasonable hedge against (1) a bunch of returns / warranty replacements in a year or two, and (2) bad PR for what's gonna be their first venture into W-OLEDs outside laptops.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Dr. Video Games 0031 posted:

If this is the 55-inch version, then I would probably move up to 4 feet away personally, but I'm sure you've already tried that. Everyone has their own preferences with this kind of thing.

Excuse me. This is the 65 inch!

codo27 posted:

Rub that in my face why don't ya.

Apologies sir, my intention was not to brag but just to state that I have some of the best monitors and they look pretty bad next to QD OLED tech honestly :)

DrDork posted:

For a 55" 4k, that distance is 3.7ft, which isn't that far off from the 5.5ft you find comfortable, but it being further off would likely go a long way to "removing" the sub-pixel haloing that the Alienware is being dinged for--since now you're (in theory) too far away to be able to discern individual pixels, let alone sub-pixels.

Thank you for that insanely informative reply! It is a 65 inch. Pardon me. I'm just amazed you can (if you qualify) get the 65 inch S95B for less than $1700. The competition in panel tech is off the charts.
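For anyone curious where distance figures like the 3.7 ft above come from: the usual rule of thumb is that a pixel stops being individually resolvable once it subtends less than about one arcminute of visual angle at 20/20 acuity. Here's a rough sketch of that calculation (the exact 3.7 ft figure likely used slightly different assumptions; this version lands at about 3.6 ft for a 55" 4K panel):

```python
import math

# Rule-of-thumb sketch: a pixel "disappears" once it subtends less than
# ~1 arcminute of visual angle (standard 20/20 acuity assumption).
# Figures quoted elsewhere may use slightly different acuity assumptions.

def pixel_invisibility_distance_ft(diagonal_in, horiz_px, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)  # panel width from diagonal
    pitch_in = width_in / horiz_px                 # size of one pixel
    # distance at which one pixel subtends exactly 1 arcminute
    dist_in = pitch_in / math.tan(math.radians(1 / 60))
    return dist_in / 12

print(f"55in 4K: {pixel_invisibility_distance_ft(55, 3840):.1f} ft")
print(f"65in 4K: {pixel_invisibility_distance_ft(65, 3840):.1f} ft")
```

So the 65" version buys you roughly another two thirds of a foot before individual pixels blend together.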

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Dr. Video Games 0031 posted:

Good lord, the size of the heatsink on the back of Asus' upcoming 42" W-OLED monitor.

https://www.youtube.com/watch?v=yudNKx8mdLQ&t=40s

Is this a panacea? I understand the basic principle here, but I do NOT understand whether throwing successively larger heatsinks at the problem solves the issue linearly.

Also can someone, for example, add a heatsink to the back of a S95B? It seems to have very little heat sink, BUT it also subjectively feels like it generates less heat than my C9... not sure if I'm making that up though.

I'm guessing that you would need to make contact to the internals to properly add a heatsink, but who knows.

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Taima posted:

Is this a panacea? I understand the basic principle here, but I do NOT understand whether throwing successively larger heatsinks at the problem solves the issue linearly.

Also can someone, for example, add a heatsink to the back of a S95B? It seems to have very little heat sink, BUT it also subjectively feels like it generates less heat than my C9... not sure if I'm making that up though.

I'm guessing that you would need to make contact to the internals to properly add a heatsink, but who knows.

It's likely that it would have to be engineered for it. It's perhaps not impossible to add one to a monitor, but you'd be disassembling it and trying to fit a big heatsink onto the back of the panel sandwich inside, which is usually housed in aluminum. I wouldn't do it with a monitor that works, since you'd be doing a lot of work that could easily damage it.

Sir Sidney Poitier
Aug 14, 2006

My favourite actor


I am changing some things around, this is my current setup, for reference:


The ultrawide on the right is a Samsung Odyssey G5 34" 3440x1440. Whilst I like it, it's had a frequent flicker fault from the start, regardless of input source. I sent it back for repair; it was "repaired" and returned with exactly the same issue, so I'm returning it. Looking for replacements, nothing in particular caught my eye - everything seemed a bit mediocre, and on spec the G5 still looked largely like the best option at the price (bought at £388, it's now at £425 or so) - so I started looking a bit higher up. I went looking for something bigger because I'm a sucker for more screen real estate - this monitor will be used both for gaming connected to the PC, and for work connected to the laptop beneath it. The 49" megawide monitors seemed a bit too highly priced, but my eye landed on the Gigabyte FV43U - it costs a bit more than the G5 I'm returning, but it's bigger and has more usable space for working too.

Is it silly to try and fit it in that space?

Bad Purchase posted:

^ i'm also curious about whether there are any good KVMs out there, especially the "M" part and particularly something supporting multiple displays. but given how finicky monitors and graphics cards can be with cables and refresh rates and especially features like g/free sync, i wouldn't be surprised if sticking with the monitor's built-in input switching is the best way to go.

This is the other reason I am interested in the FV43U - I presently manually switch input sources and have a separate USB switch. It's a pain in the arse and if it can be done in one place it'd be a lot better.

Dr. Video Games 0031
Jul 17, 2004

DrDork posted:

That's my take, too. A couple of dollars of extra metal in the BOM is probably a reasonable hedge against (1) a bunch of returns / warranty replacements in a year or two, and (2) bad PR for what's gonna be their first venture into W-OLEDs outside laptops.

W-OLED is LG's OLED panel designed for TVs. Laptops have been using panels primarily from Samsung (their AMOLED panels, which use icky subpixel layouts but can achieve very dense DPIs and apparently 240Hz now, which is fun), but those panels aren't suitable for desktop/gaming monitors.

Density is the real issue here, with the problem being that the circuitry and stuff required to control each pixel don't shrink when you shrink the pixels. This means as you make displays more pixel-dense, the ratio of non-illuminated circuitry to illuminated subpixels gets worse, resulting in lower brightness and, potentially, a grainier image (this is what Vincent from HDTVTest is talking about when he mentions "pixel aperture ratio"). The 42" C2 is noticeably dimmer than its larger counterparts for instance, and any 42" OLED monitors coming out will be using the same panel. The subpixels being smaller also means it takes less time for burn-in to appear. So panel manufacturers are really fighting a two-front battle as they try to shrink OLED panels intended for TVs to monitor-ish sizes. I think Asus would be okay with a sub-200-nits measurement on the 100% white test, but they really don't want to deal with burn-in-related RMAs.
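To put a toy model behind the aperture-ratio point: assume each pixel cell needs a fixed-width border of drive circuitry that doesn't shrink with the pixel. The pixel pitches below are approximately right for 4K panels at those sizes, but the 60 µm circuitry width is invented purely for illustration, not a real panel spec:

```python
# Toy aperture-ratio model: each pixel cell has a fixed-width border of
# drive circuitry (TFTs etc.) that does NOT shrink with the pixel, so
# smaller pixels lose a larger fraction of their area to non-emissive
# circuitry. The 60um circuitry width is made up for illustration.

def aperture_ratio(pitch_um, circuitry_um):
    emissive_um = max(pitch_um - circuitry_um, 0)
    return (emissive_um / pitch_um) ** 2  # emissive fraction of cell area

for label, pitch_um in [("77in 4K", 440), ("55in 4K", 315), ("42in 4K", 240)]:
    ratio = aperture_ratio(pitch_um, 60)
    print(f"{label}: pitch {pitch_um} um -> aperture ratio {ratio:.0%}")
```

With the same circuitry width, the 42" panel's emissive fraction drops to roughly 56% versus about 75% at 77" in this toy model - the same mechanism behind the 42" C2 being dimmer than its larger siblings.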

This is why so little progress has been made toward OLED monitors, especially 4K ones. JOLED is doing stuff with inkjet printing (with stripe RGB pixels!), but those are relatively low-brightness 60Hz displays designed for professionals. You could theoretically scale up smartphone/tablet/laptop-style displays to desktop monitor sizes, but you'd run into some pretty severe text clarity issues, they aren't the most burn-in resistant, and they also don't get as bright as TVs for truly punchy HDR--though I think some esports pros would accept the shortcomings for 240+ Hz OLED. Instead of that, Samsung and LG are trying to shrink TV technology, but it's not easy. QD-OLED will probably be the first to do a normal-sized 4K gaming monitor once Samsung applies some recent breakthroughs with blue OLED. They can already do a 27" 1440p, but for manufacturing reasons they're going with 34" 1440p ultrawide only for now. (The panels are made by carving up a larger "motherglass," and Samsung has to play RE4 inventory Tetris with every panel size they're manufacturing at once, since they only have one assembly line right now.)

(edit: this explanation was simplified and not 100% accurate since stuff like the "deposition" method also results in lower aperture ratios at high DPIs. Another Japanese display company... Japan Display, recently claimed to have discovered a deposition technique that could help monitors a lot. The claims are bold (double the brightness and triple the lifespan!), though I feel like I hear claims like these often and only rarely do they translate into tangible improvements.)

Taima posted:

Also can someone, for example, add a heatsink to the back of a S95B? It seems to have very little heat sink, BUT it also subjectively feels like it generates less heat than my C9... not sure if I'm making that up though.

I'm guessing that you would need to make contact to the internals to properly add a heatsink, but who knows.

If by "someone" you mean "a large corporation like Sony," then the answer is yes, and they've already done exactly that with the A95K (another QD-OLED TV).



Unlike when they've done this with LG's panels, they aren't using the heatsink to make the thing brighter. They're just using it to reduce image retention instead. (I guess QD-OLED is already bright enough)

Sorry for rambling on and on about OLED poo poo. It's an interesting topic to me, and I'm hopeful we could start seeing more actual monitor options next year.

Dr. Video Games 0031 fucked around with this message at 05:46 on Aug 18, 2022

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Dr. Video Games 0031 posted:

Sorry for rambling on and on about OLED poo poo. It's an interesting topic to me, and I'm hopeful we could start seeing more actual monitor options next year.

I also remember last year fondly.

SuperTeeJay
Jun 14, 2015

Interesting post, thanks. I had assumed that the absence of OLED monitors was for purely commercial reasons (manufacturing capacity better used on TVs; keep flogging dogshit IPS panels for as long as possible). With that said, I'd probably go blind if my monitor was anywhere near as bright as my OLED TV so the dimmer the better.

Dr. Video Games 0031
Jul 17, 2004

SuperTeeJay posted:

Interesting post, thanks. I had assumed that the absence of OLED monitors was for purely commercial reasons (manufacturing capacity better used on TVs; keep flogging dogshit IPS panels for as long as possible). With that said, I'd probably go blind if my monitor was anywhere near as bright as my OLED TV so the dimmer the better.

It's not just about peak brightness, but full-screen brightness - e.g., how much will ABL dim the screen when you're looking at Gmail or Excel? The 42" C2 gets pretty dim, and you wouldn't want it to get any dimmer than that unless you do everything in a heavily light-controlled environment. One of the things that makes the AW3423DW so awesome is that it can do everything you want on the Windows desktop at a good brightness level, with no automatic dimming in SDR mode and very little automatic dimming in the HDR True Black 400 mode. What we really need is more of that.

Dr. Video Games 0031 fucked around with this message at 06:53 on Aug 18, 2022


Shipon
Nov 7, 2005
I have noticed that if I put videos on my C2 and then split the screen with a browser, ABL results in an extremely annoying fluctuation in brightness. I would not use this thing as a monitor - I pretty much only use it to play full-screen games and watch YouTube.

I also just picked up one of the LG 16:18 displays and set it next to my C2. This thing is crisp: 140 DPI, slightly denser than the 32" 4K monitors out there, and text looks incredible on it. The built-in KVM switch feature is great too - I can just plug my work laptop in and have it charge off the display, as well as switch between the work and home desktops. The mount it comes with is also great - hefty yet easy to move, like it would cost $150 if sold separately from the display or something.

The aspect ratio is great - in landscape I can have a window just for Twitter off to the side and two stacked windows on the other side of the monitor, or I can have all four quadrants of the monitor show a decently large window, like when I need to remote into one of my pieces of equipment at work or w/e. The only downside is that it's only 60 Hz - it's definitely not meant for gaming, but I'm so used to high refresh rates on my display that even browsing the internet or doing stuff in Excel is frustrating at such a low refresh rate. I don't know if $700 is good value for it, but so far it feels like a pretty nice splurge.
