Worf
Sep 12, 2017

If only Seth would love me like I love him!

Wonder how many people are needlessly disappointed in poo poo because they justifiably think they should be using HDR


DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Or have adaptive contrast enabled, despite the fact that literally nobody thinks it works well.

VelociBacon
Dec 8, 2009

As a gamer who cares more about competitive advantage than an impressive visual experience, I think I'd prefer non-HDR for shooters and such if it blows out your highlights and crushes your blacks. Or am I misunderstanding this?

I'm the kind of guy that plays with the gamma slider way too high so I can get as much information into my dumb eyes as possible.

sauer kraut
Oct 2, 2004
I'm afraid that artsy graphics people are gonna abuse it too to create dark rooms where you can't see poo poo, interspersed with really bright lights burning a hole into your monitor.
It's gonna be like the early xbox360 era all over again with its bloom effects, like shown here https://www.resetera.com/threads/a-look-back-at-the-fuckload-of-bloom-era.28756/

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

sauer kraut posted:

I'm afraid that artsy graphics people are gonna abuse it too to create dark rooms where you can't see poo poo, interspersed with really bright lights burning a hole into your monitor.
It's gonna be like the early xbox360 era all over again with its bloom effects, like shown here https://www.resetera.com/threads/a-look-back-at-the-fuckload-of-bloom-era.28756/

I can feel the humidity from every one of those screenshots, just looking at them is making my balls stick to my leg

repiv
Aug 13, 2009

the reason early bloom looked terrible is that games were still rendering in LDR, so they had no idea how bright pixels were and faked it by assuming anything vaguely white must be bright

that sort of problem isn't going to come back; any renderer written in the last decade or so renders HDR internally and knows how bright pixels are through the whole pipeline

incidentally that's also why "enabling" HDR usually doesn't affect performance: games were rendering HDR anyway, and "enabling" it actually just switches the final resolve to a less destructive one
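to make that concrete, here's a toy sketch of what that "final resolve" step does - a simple Reinhard-style tone map squashing a linear HDR buffer into the 0..1 SDR range. just an illustration in Python/numpy, not any particular engine's operator:

import numpy as np

def reinhard_tonemap(hdr_rgb, exposure=1.0):
    # squash linear HDR radiance (0..infinity) into 0..1 for an SDR output;
    # real engines use fancier curves (filmic, ACES), but the idea is the same
    x = hdr_rgb * exposure
    return x / (1.0 + x)

# toy HDR "frame": a dark pixel, a midtone, and a 50x-overbright highlight
frame = np.array([[0.02, 0.02, 0.02],
                  [0.50, 0.40, 0.30],
                  [50.0, 45.0, 40.0]])
print(np.round(reinhard_tonemap(frame), 3))
# the 50.0 highlight lands around 0.98 instead of clipping to flat white,
# which is why the renderer needs real brightness info all the way through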

repiv fucked around with this message at 23:41 on Dec 7, 2019

Klyith
Aug 3, 2007

GBS Pledge Week

sauer kraut posted:

I'm afraid that artsy graphics people are gonna abuse it too to create dark rooms where you can't see poo poo, interspersed with really bright lights burning a hole into your monitor.

That's not what HDR in monitors with expanded color space is. Same with HDR filters on photos. An HDR monitor / TV actually displays colors and light ranges that your eyes can see but a normal monitor can't show. The standard sRGB color range is extremely meh because it was based on what was possible with electrons being thrown at phosphor dots: you could only make green so green with those. Now we have the tech to make greens greener.

Bad HDR effects are trying to imitate what your eyes see by ramping brightness or contrast up and down rather than actually displaying it.


The problem with wide color gamut is that until everything that's on your monitor -- from the graphical UI, to photos on the internet, to video game content and rendering -- is designed with that color space in mind it looks like crap in HDR mode. What we really needed was for the HDR display standards to actually add more numeric range to the color values, so like when the sRGB Red goes from 0-255 the DCI Red could go from 0-320. Unfortunately nobody had extra half-bits sitting around and bits don't work like that. So we're left with the situation that sRGB and DCI-P3 are visually incompatible and sRGB content looks washed out when you turn on HDR.
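For anyone curious, you can see the sRGB vs P3 mismatch with a few lines of numpy. This derives the RGB-to-XYZ matrices from the standard chromaticity coordinates (using the D65 white point, i.e. Display P3 rather than cinema P3) and shows that the same "100% red" code value means a different, more saturated color in P3. Just a sketch, not a color management library:

import numpy as np

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    # standard colorimetry derivation: build the RGB->XYZ matrix from the
    # xy chromaticities of the primaries and the white point
    def xy_to_xyz(x, y):
        return np.array([x / y, 1.0, (1 - x - y) / y])
    prims = np.column_stack([xy_to_xyz(x, y) for x, y in primaries_xy])
    white = xy_to_xyz(*white_xy)
    scale = np.linalg.solve(prims, white)  # make R=G=B=1 land on the white point
    return prims * scale

D65 = (0.3127, 0.3290)
srgb = rgb_to_xyz_matrix([(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)], D65)
p3 = rgb_to_xyz_matrix([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)], D65)

red = np.array([1.0, 0.0, 0.0])  # the same numeric "full red" in both spaces
print("sRGB red -> XYZ:", np.round(srgb @ red, 4))
print("P3 red -> XYZ:", np.round(p3 @ red, 4))
# same code value, different physical color - so without per-content conversion
# you get exactly the oversaturated / washed-out mismatches described above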

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
HDR isn't just wide color gamut; wide color gamut has been around forever. It's also about being able to display much higher contrast and maximum brightness by controlling the backlight for individual pixels/zones. Of course, most cheaper TVs/monitors can't do this, so the effect is simply faked.
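Here's a toy model of what that zone-based fakery does and why it halos - each backlight zone has to be driven as bright as the brightest pixel it contains (illustrative numbers only):

import numpy as np

# target luminance in nits: a dim scene with one small 1000-nit highlight
target = np.full((8, 16), 0.5)
target[3, 10] = 1000.0
zone_h, zone_w = 4, 4  # coarse backlight zones, far bigger than a pixel

backlight = np.zeros_like(target)
for i in range(0, target.shape[0], zone_h):
    for j in range(0, target.shape[1], zone_w):
        zone = target[i:i+zone_h, j:j+zone_w]
        backlight[i:i+zone_h, j:j+zone_w] = zone.max()  # the whole zone lights up for one bright pixel

print("backlight behind a pixel that should be near-black:", backlight[0, 10])
# -> 1000.0: everything sharing a zone with the highlight gets lit up, which is the
# blooming/halo you see on zone-dimmed LCDs; per-pixel emissive displays don't have this problem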

Geemer
Nov 4, 2010



HDR monitors are trash because instead of having actually finer gradation in color levels, they fake it by turning the backlight up to eye-searing and stepping it down if a part is supposed to be darker.
I'm sitting here with my HDR 400 (the version of the standard that shouldn't exist because it's just that underwhelming, but exists so manufacturers can put a fancy new logo on their boxes) capable monitor set to 0% brightness and it's OK to watch, but if I turn up the brightness to 5% it's already uncomfortable even though I'm in a well-lit room, or even with daylight coming in.
This thing shipped with brightness defaulted to 80%, and if I turn on HDR mode in the HUD it gets stuck at 100% and literally hurts to look at.
The real HDR monitors go to 1000 nits instead of the paltry 400 this one does, and I cannot imagine anyone enjoying needing to put loving sunglasses on to be able to look at their monitor.

Shaocaholica
Oct 29, 2002

Fig. 5E
In order for HDR and wide gamut to work correctly in a desktop environment, the OS needs to support HDR and wide gamut rendering and be aware of which apps/buffers need to be rendered which way, so everything doesn't look like garbage. AFAIK none of the publicly available OS builds do this yet. You have one-off apps that hijack the whole screen, but that's a pretty simplistic implementation. Game consoles are totally going to beat desktop OSes on this one.
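Conceptually the missing piece is just a compositor that knows each surface's color space and converts everything to the output space before scanout - something like this sketch (made-up structure and names, not any real OS API):

# conceptual sketch only - surface names, spaces, and functions are invented
surfaces = [
    {"name": "desktop UI", "space": "sRGB", "pixels": "..."},
    {"name": "HDR game", "space": "scRGB", "pixels": "..."},
    {"name": "browser", "space": "sRGB", "pixels": "..."},
]

def convert(pixels, src_space, dst_space):
    # placeholder for the real gamut + transfer-function conversion
    return pixels if src_space == dst_space else f"converted({src_space}->{dst_space})"

def compose(surfaces, output_space="HDR10"):
    # every surface gets mapped into the display's space instead of being blasted out raw
    return [convert(s["pixels"], s["space"], output_space) for s in surfaces]

print(compose(surfaces))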

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Klyith posted:

That's not what HDR in monitors with expanded color space is. Same with HDR filters on photos. An HDR monitor / TV actually displays colors and light ranges that your eyes can see but a normal monitor can't show. The standard sRGB color range is extremely meh because it was based on what was possible with electrons being thrown at phosphor dots: you could only make green so green with those. Now we have the tech to make greens greener.

Bad HDR effects are trying to imitate what your eyes see by ramping brightness or contrast up and down rather than actually displaying it.


The problem with wide color gamut is that until everything that's on your monitor -- from the graphical UI, to photos on the internet, to video game content and rendering -- is designed with that color space in mind it looks like crap in HDR mode. What we really needed was for the HDR display standards to actually add more numeric range to the color values, so like when the sRGB Red goes from 0-255 the DCI Red could go from 0-320. Unfortunately nobody had extra half-bits sitting around and bits don't work like that. So we're left with the situation that sRGB and DCI-P3 are visually incompatible and sRGB content looks washed out when you turn on HDR.

You're conflating color gamut with contrast here. The "dynamic range" part of HDR strictly refers to contrast ratio - that is, ratio of luminosity between bright things and dark things. HDR monitors exist to "solve" the "problem" that if you look at the sun in a video game on an SDR monitor it doesn't blind you in real life. In reality this happens because the sun is so much brighter than any other object, in a way a monitor could never represent, but someone got it into their head to try to make monitors more "life-like" by increasing the maximum brightness. HDR display standards also include wide gamut color as a freebie, but that's strictly speaking not HDR.

This is sort of related to the numeric range issues you bring up, because with only 24 bits of mixed color and luminosity resolution per pixel you can't represent some things being vastly brighter than others without making the resolution unacceptably low at one end of the range. In fact in SDR the resolution is already too low, and gamma is one way to try to deal with this - it basically compresses the luminosity values into a smaller range (the stored value is pushed through a nonlinear curve, roughly a power function, to get the actual luminance shown on the screen, which spends more of the available steps on dark tones).
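Quick numbers on what the gamma curve buys you, approximating sRGB as a plain 2.2 power curve (the real transfer function has a short linear toe, but this is close enough to show the point):

gamma = 2.2
for code in (1, 16, 128, 254, 255):
    linear = (code / 255) ** gamma
    print(f"code {code:3d} -> {linear:.5f} of max luminance")
# the bottom 16 codes together cover only ~0.2% of max luminance, while the single
# step from 254 to 255 is ~0.9% - the encoding spends its precision on dark tones,
# where your eyes would otherwise see banding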

HDR is loving dumb, I don't want the monitor to loving blind me, gently caress off already. I'll gladly take higher precision color and linear gamma though, those are nice.

TheFluff fucked around with this message at 05:17 on Dec 8, 2019

Indiana_Krom
Jun 18, 2007
Net Slacker

TheFluff posted:

HDR is loving dumb, I don't want the monitor to loving blind me, gently caress off already. I'll gladly take higher precision color and linear gamma though, those are nice.

Yeah, pretty much. I have no idea why someone thought sitting down and looking at a display that could throw 1000+ nits would somehow be a good idea. Like, the huge contrast between the light and the dark isn't particularly useful if it's accomplished by making the whites so bright they force you to turn your eyes away from the screen. Properly implemented HDR rendering on a good SDR display does pretty much 100% of the effect with 0% of the eye strain.

ufarn
May 30, 2009
We're probably not going to get working HDR for desktop until we move on to microLED panels or something.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

ufarn posted:

We're probably not going to get working HDR for desktop until we move on to microLED panels or something.

There are some proper HDR monitors out there, but they're all in the thousand-plus dollar range. Some of the very top-of-the-line gaming monitors are HDR 1000 with local dimming and VA panels that provide the necessary contrast to do good HDR. Then there are also the top-of-the-line 4K productivity monitors.

If the HDR group's website is up to date, only 13 monitors exist that do HDR 1000. These also seem to include the BFGD "monitors", which are so big it would be silly to call them monitors instead of the TVs they actually are.

https://displayhdr.org/certified-products/

I do question how your eyes don't burn out of your head watching proper HDR content on a monitor at normal monitor viewing distances. Even the peak 400-nit brightness my current HDR 400 monitor does was eye-searing, and I dropped the brightness by half for day-to-day use in SDR.

v1ld
Apr 16, 2012

Nifty link to the HDR site. That Samsung is a very good deal if you want a 49" screen - too wide for my use.

Re: brightness. I used the TFTCentral profile and instructions for this new LG 34GK950F. Their recommended brightness setting is 17 (default was 80 I think). I found that very dim as I adjusted it down after getting used to the eye sear for a few hours. But now it's perfect and bright scenes seem bright without taxing the eyes. Overall their profile and recommended settings make a very large difference to the feel of the monitor.


Have to say I'm very happy with the monitor. There were some initial hiccups with the DisplayPort input not working, but that completely went away after a few false starts - apparently a common complaint. No problems with wake from sleep, etc. Seem to have gotten lucky with no backlight bleed I can see in the black screen tester, nor any bad pixels. I hope it continues that way; there are reports of this monitor degrading over time and I want it to be a 5-10 year monitor.

Gaming is great with the 48-144 Freesync range and the wide color gamut. Can't say I'm sensitive to input lag, but it's apparently quite low.

I like that it is simple to update the firmware - it updated to 3.10, which is more recent than the 3.1 TFTCentral received in Q1 when they did their review. So LG are actively fixing issues, good to see.

The 5700 with XT BIOS is doing a great job of pushing pixels. The only downside has been that Radeon Chill, which was working flawlessly with a cheap 48-75Hz Freesync 1080p monitor, crashes every few minutes on this monitor in every game I tried (Warframe and Outward). So that's been disabled in favor of Enhanced Sync + in-game fps limiter. I like the idea of Chill, so I hope they fix it.

v1ld fucked around with this message at 16:54 on Dec 8, 2019

v1ld
Apr 16, 2012

This is a more general question, but I'm most worried about the monitor, so: is it OK to turn off external power to the monitor many times a day? I.e., can it damage the monitor itself? I'm less worried about the power supply.

I'm on a kick to reduce vampire power draw from devices that are otherwise off and have set up some smart plugs/strips to control entire segments. For example, there's now a "PC power" group in Alexa to cut off power to all the devices connected to this monitor at one go.

Can the 2-3 times a day turning off/on of external power cause harm to these monitors (or other devices)?

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
Continuing Kraken G12/2080Ti adventures: I found a cheap-rear end thermometer with K-type thermocouple probes included in a local nerd store, so out of curiosity I picked it up and stuck one probe to one of the power stages on the VRM section that sits next to the display outputs (that is, with no airflow from the G12 fan) and one to one of the memory modules. The VRM is unsurprisingly not a problem at all and in fact seems to do better without the original cooler than with it. If it's anything like Asus' motherboard VRMs, I suspect it balances the load on the power stages based on their temperature, which probably helps.

The VRAM, on the other hand, is more concerning. I'm reading close to 75C on the case of the module, which, if Igor's Lab is to be believed, is probably an internal temperature around 90C. That's probably okay, at least as far as the spec sheet goes, but I'd be happier if it was lower. Question is how to get airflow in there - most of the area around the core is blocked by the G12 or the pump/coldplate assembly. Heatsinks without airflow seem sort of dubious but might as well try. Could also try mounting heatsinks to the back of the PCB where the modules are; there's some airflow there from the CPU cooler.

At least I managed to get rid of CAM. Radiator fans are hooked up to a chassis fan header on the motherboard via a Y-splitter (thanks, past me, for unwittingly buying a motherboard with lots of fan headers, including one rated for 3A of current) and as such are controllable via Argus Monitor. For controlling the pump I found liquidctl, which is a neat little Python utility that can do everything I want from CAM, except it only runs once at login instead of staying in the background. The G12 fan is annoyingly voltage-controlled (3-pin connector), which means it runs at 100% all the time when hooked up to the AIO's fan connector. It's thankfully not very loud, but I'll need to fix that.
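If anyone else hits the same "runs once at login" limitation, something like this could work as a dumb background loop - it polls the GPU temperature with nvidia-smi and re-issues a liquidctl pump duty when the target changes. The temperature/duty breakpoints are made up, and it assumes "liquidctl set pump speed <duty>" is supported on your particular Kraken, so treat it as a sketch rather than a finished tool:

import subprocess, time

# made-up breakpoints: (GPU temp in C, pump duty in %)
CURVE = [(40, 50), (55, 70), (65, 85), (75, 100)]

def gpu_temp():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"])
    return int(out.strip().splitlines()[0])

def duty_for(temp):
    duty = CURVE[0][1]
    for threshold, d in CURVE:
        if temp >= threshold:
            duty = d
    return duty

last = None
while True:
    duty = duty_for(gpu_temp())
    if duty != last:  # only poke the device when the target actually changes
        subprocess.run(["liquidctl", "set", "pump", "speed", str(duty)], check=False)
        last = duty
    time.sleep(10)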

Shaocaholica
Oct 29, 2002

Fig. 5E
Is the Vulkan implementation better/worse on either brand or a specific arch?

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

v1ld posted:

Re: brightness. I used the TFTCentral profile and instructions for this new LG 34GK950F. Their recommended brightness setting is 17 (default was 80 I think). I found that very dim as I adjusted it down after getting used to the eye sear for a few hours. But now it's perfect and bright scenes seem bright without taxing the eyes. Overall their profile and recommended settings make a very large difference to the feel of the monitor.

I've long had a suspicion that many people who complain about eye strain and (for example) say that they dislike reading long texts on a computer screen simply have their monitors set too bright. Basically all monitors default to being eye-searingly bright out of the box, and you usually have to turn the brightness almost all the way down for use in a dimly lit room at home. I also suspect this is one of several reasons for the rise of "dark mode" UI. HDR is another - apparently, on consoles, where HDR is more common since it's more of a thing in TVs, a common complaint is that white UI elements are blindingly bright.

Shaocaholica
Oct 29, 2002

Fig. 5E
White UI on an HDR display needs to be a design consideration. Seems like the content creators are simply not designing for it.

If HDR:
enable_alternate_UI_set()

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

TheFluff posted:

Continuing Kraken G12/2080Ti adventures: I found a cheap-rear end thermometer with K-type thermocouple probes included in a local nerd store, so out of curiosity I picked it up and stuck one probe to one of the power stages on the VRM section that sits next to the display outputs (that is, with no airflow from the G12 fan) and one to one of the memory modules. The VRM is unsurprisingly not a problem at all and in fact seems to do better without the original cooler than with it. If it's anything like Asus' motherboard VRMs, I suspect it balances the load on the power stages based on their temperature, which probably helps.

The VRAM, on the other hand, is more concerning. I'm reading close to 75C on the case of the module, which, if Igor's Lab is to be believed, is probably an internal temperature around 90C. That's probably okay, at least as far as the spec sheet goes, but I'd be happier if it was lower. Question is how to get airflow in there - most of the area around the core is blocked by the G12 or the pump/coldplate assembly. Heatsinks without airflow seem sort of dubious but might as well try. Could also try mounting heatsinks to the back of the PCB where the modules are; there's some airflow there from the CPU cooler.

At least I managed to get rid of CAM. Radiator fans are hooked up to a chassis fan header on the motherboard via a Y-splitter (thanks, past me, for unwittingly buying a motherboard with lots of fan headers, including one rated for 3A of current) and as such are controllable via Argus Monitor. For controlling the pump I found liquidctl, which is a neat little Python utility that can do everything I want from CAM, except it only runs once at login instead of staying in the background. The G12 fan is annoyingly voltage-controlled (3-pin connector), which means it runs at 100% all the time when hooked up to the AIO's fan connector. It's thankfully not very loud, but I'll need to fix that.

All the spec sheets I could find for GDDR6 showed a max temp of 95-100C, so I wouldn't stress too much about 75C; it's well below max.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

B-Mac posted:

All the spec sheets I could find for GDDR6 showed a max temp of 95-100C, so I wouldn't stress too much about 75C; it's well below max.

I'm measuring case temperature though, which is significantly lower than internal temperature. The question is how much lower. This is where an EVGA card with all those temperatures readable in software would be handy, but I don't have one of those, unfortunately. The Igor's Lab numbers I linked above suggest a ~15C difference between Tcase and Tjunction on GDDR6, but I really don't know if my numbers are at all comparable to his. I think he was testing Micron chips while I've got Samsung, for one thing, and I'm really not that confident in my probe mounting or bargain-basement thermometer.

Still, I haven't seen the memory downclocking under load, nor any weird artifacts or instability, so I don't think it's overheating. But running less hot can't hurt, and loving around is half the reason for doing a mod like this anyway :v:

Klyith
Aug 3, 2007

GBS Pledge Week

v1ld posted:

This is a more general question, but I'm most worried about the monitor, so: is it OK to turn off external power to the monitor many times a day? I.e., can it damage the monitor itself? I'm less worried about the power supply.

I'm on a kick to reduce vampire power draw from devices that are otherwise off and have set up some smart plugs/strips to control entire segments. For example, there's now a "PC power" group in Alexa to cut off power to all the devices connected to this monitor at one go.

Can the 2-3 times a day turning off/on of external power cause harm to these monitors (or other devices)?

Monitors should be fine. Cutting power to your PC while it's on is a bad idea; while it's off, it's no problem. Make sure to check the current ratings on the smart plugs and stay well below the total, which shouldn't be difficult unless you have a monster PC.


Also, just personally, I wouldn't want to put expensive PC stuff on a smart power plug that probably has extremely inferior surge protection compared to a $30 Tripp Lite or whatever. Saving a few pennies while putting thousands of dollars of hardware at higher risk seems like a bad idea.
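For the rating check, the arithmetic is just watts over volts - made-up loads on 120V mains as an example:

loads_watts = {"PC under load": 450, "monitor": 60, "speakers": 25, "misc": 30}  # illustrative numbers
volts = 120
amps = sum(loads_watts.values()) / volts
print(f"{sum(loads_watts.values())} W -> {amps:.1f} A on a 15 A plug")  # ~4.7 A, plenty of margin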



TheFluff posted:

You're conflating color gamut with contrast here. The "dynamic range" part of HDR strictly refers to contrast ratio - that is, ratio of luminosity between bright things and dark things.

Right, but when fat bossy gerbil turned on "HDR mode" I think that also turned on the wide color gamut. And the wide gamut is what makes the color look like poo poo if you don't have things set up properly. That's been a problem for a long time, including back when professional monitors had AdobeRGB gamut without the HDR contrast & luminosity range. It was a frequent thing to see people post about getting their fancy new expensive monitor and complain that everything looked off.

v1ld
Apr 16, 2012

On-topic stuff: I've been running the 5700 with XT BIOS for a few weeks now and thought I'd report on stability. The only two things I've found it to have issues with are:
- Radeon Chill hard crashes (immediate power off) on a 3440x1440 monitor. Crashes stop if Chill is disabled.
- Radeon Image Sharpening causes infrequent driver crashes, where it recovers with the "disabling your power/overclock overrides" message.

No crashes, system or games, outside of those two options with the XT BIOS and an overclock of 2040MHz core, 900MHz RAM, +20% power. I wonder if either of them expects the extra hardware on the real 5700 XT to be present.

So overall I'm happy running with the BIOS hack. Neither feature is a must-have, though I'd like both to be enabled.


The Chill crash is an immediate power off, no warning at all. Does that indicate a possible power problem? It's got a beefy Corsair supply, 650W I think, but almost 7 years old now.

The crash reliably happens within a few minutes of starting a game. Only occurs with the LG 34GK950F, not a Philips 48-75Hz 1080p Freesync monitor.

E: The Philips connects over HDMI, the LG over DP. So that's a possible difference. Not going to test the LG over HDMI, no point because of how many other variables will have to be changed to accommodate that.

Klyith posted:

Monitors should be fine. Cutting power to your PC while it's on is a bad idea; while it's off, it's no problem. Make sure to check the current ratings on the smart plugs and stay well below the total, which shouldn't be difficult unless you have a monster PC.

Also, just personally, I wouldn't want to put expensive PC stuff on a smart power plug that probably has extremely inferior surge protection compared to a $30 Tripp Lite or whatever. Saving a few pennies while putting thousands of dollars of hardware at higher risk seems like a bad idea.

I'm checking for 15A on all the plugs/strips. Your point on surge protection is a good one and I will look at getting something to plug the smart strips into. It'll be a bit of an unholy daisy chain, but worth it.

v1ld fucked around with this message at 18:41 on Dec 8, 2019

wolrah
May 8, 2006
what?

Beautiful Ninja posted:

These also seem to include the BFGD "monitors", which are so big it would be silly to call them monitors instead of the TVs they actually are.
As I see it, in the era of 16:9 LCDs being the default across the board, TVs are a subset of monitors that are capable of operating standalone. If it requires an external source device to be useful, it's a monitor. If it has a tuner or streaming functionality built in, it's a TV.

BFGDs exist on both sides of that line: the HP, for example, has a Shield built in, so I'd argue it's a TV, whereas the Asus does not and is thus a monitor.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

v1ld posted:

I'm checking for 15A on all the plugs/strips. Your point on surge protection is a good one and I will look at getting something to plug the smart strips into. It'll be a bit of an unholy daisy chain, but worth it.

The gold standard would be a Tripp Lite ISOBAR. I've got my UPSes plugged into one and it's been a tank for years. Just look down at the LEDs every few months to see if it's reporting a fault.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

TheFluff posted:

Question is how to get airflow in there - most of the area around the core is blocked by the G12 or the pump/coldplate assembly. Heatsinks without airflow seem sort of dubious but might as well try. Could also try mounting heatsinks to the back of the PCB where the modules are; there's some airflow there from the CPU cooler.

You can always grab a spare fan and mount it perpendicular to the card itself, so it's blowing towards the motherboard. Even if it's not perfectly squared up to the VRAM, just pushing air around the general area may help, particularly if you are able to slap some small heatsinks on.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

v1ld posted:

The Chill crash is an immediate power off, no warning at all. Does that indicate a possible power problem? It's got a beefy Corsair supply, 650W I think, but almost 7 years old now.

The crash reliably happens within a few minutes of starting a game. Only occurs with the LG 34GK950F, not a Philips 48-75Hz 1080p Freesync monitor.

I'm not an expert, but instant hard shutdowns do point to some kind of low-level power delivery problem, yes - either the PSU shutting itself off or the motherboard shutting down because the PWR_OK signal goes outside the acceptable limits. Anything higher level than that usually fails in a more graceful way (even a bluescreen or image corruption is more "graceful" than a hard shutdown). It doesn't have to be overcurrent protection either - as far as I understand it, load transients (like when the GPU goes from doing something relatively low-powered to something very high-powered in a matter of milliseconds) can make the voltage regulation go out of spec for a brief moment, just long enough to trigger some kind of protection mechanism. The 5700 XT has a higher power limit than the 5700, so that in combination with how Radeon Chill starts and stops rendering could be what's triggering it.

If you currently have the card hooked up with a single PCIe power cable with two connectors on it, you could try running two separate cables with one connector each instead (I know Seasonic recommends this for cards with >225W TDP). Even if your PSU has a single 12V rail, that would at least help minimize voltage drop in the cables.
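Rough numbers behind the two-cable suggestion, assuming ~225W comes in over the 8-pin cables (ignoring the slot's 75W):

cable_watts = 225  # assumed draw over the PCIe cables, not counting the slot's 75 W
volts = 12
total_amps = cable_watts / volts
print(f"one daisy-chained cable: ~{total_amps:.1f} A")       # ~18.8 A through one cable and its connectors
print(f"two separate cables: ~{total_amps / 2:.1f} A each")  # ~9.4 A each, less resistive drop under transients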

DrDork posted:

You can always grab a spare fan and mount it perpendicular to the card itself, so it's blowing towards the motherboard. Even if it's not perfectly squared up to the VRAM, just pushing air around the general area may help, particularly if you are able to slap some small heatsinks on.

Yeah, I think I'll give that a shot. Jank city, but that's fine :v:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I had a Leadtek Winfast 6800 whose fan failed back in the mid 2000s, and ended up tying an 80mm fan to it with archery serving (it was on hand) using the four holes in the corner of the PCB to line it up over the failed fan's hole (having moved the failed one out of the way since the card didn't function if that fan was unplugged from an on-PCB power jack, even broken) and using it like that for another year at least if I remember right.

I am pretty sure this whole memory just came from reading the word jank, thanks for listening folks

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Agreed posted:

I had a Leadtek Winfast 6800 whose fan failed back in the mid 2000s, and ended up tying an 80mm fan to it with archery serving (it was on hand) using the four holes in the corner of the PCB to line it up over the failed fan's hole (having moved the failed one out of the way since the card didn't function if that fan was unplugged from an on-PCB power jack, even broken) and using it like that for another year at least if I remember right.

I am pretty sure this whole memory just came from reading the word jank, thanks for listening folks

Hey, if it works, it works. Linus did a video once where he did basically the same thing, slapping a couple of Noctuas on top of a GPU heatsink, and it ended up running cooler and quieter than with the stock fans. The only real downside, outside of looking jank, is that you need to use motherboard fan headers to control the fans rather than GPU software.

GRINDCORE MEGGIDO
Feb 28, 1985


Just buy a GPU-fan-header-to-4-pin fan adapter and run the fans off the card. Unless it's a weird snowflake connector.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I wish you guys had talked about Windows HDR earlier. I got that LG 5K2K ultrawide last week and it just looked like absolute poo poo on Windows. Turned off HDR yesterday... fixed everything. Though macOS has infinitely better scaling at super high resolutions.

That being said, HDR on my OLED is completely 100% great. So I'm not super clear on the difference; maybe it's that the OLED does "real" HDR, or has per-pixel dimming? Hmm.

e: I should clarify- I mean, HDR on Windows on the OLED.

Taima fucked around with this message at 03:43 on Dec 10, 2019

Shaocaholica
Oct 29, 2002

Fig. 5E
OLED is per-pixel dimming. Everything else is big chunky zones.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
So you're saying it's essentially overexposed because it's just taking giant panels of backlighting and overdoing it in some misguided attempt to emulate HDR? Weirdly, none of the reviews of the panel mentioned it :/

That would explain it because good as the panel is, the dimming zones are absolute poo poo. There's literally like 8 dimming zones.

Shaocaholica
Oct 29, 2002

Fig. 5E

Taima posted:

So you're saying it's essentially overexposed because it's just taking giant panels of backlighting and overdoing it in some misguided attempt to emulate HDR? Weirdly, none of the reviews of the panel mentioned it :/

That would explain it because good as the panel is, the dimming zones are absolute poo poo. There's literally like 8 dimming zones.

It's gimmicky for movies. It's atrocious for desktop use, not that they're really advertised for that, though.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
HDR in terms of increased brightness is pointless dogshit on monitors. They've always been capable of far too much brightness and almost everyone runs them far too bright and suffers eye strain as a result. On a TV that you're sitting 10' from, 1000 nits might make sense, but on a monitor, anything over 300 is definitely useless.

The problem with contrast is a more fundamental one. OLEDs and CRTs are light-emitting devices. Because of this, they can go from zero brightness to full brightness. OLEDs are particularly good at this, because each pixel emits its own light with no crosstalk/bleed. LCDs are light-blocking devices placed in front of a backlight. An LCD will always leak some light through, and so has seriously reduced contrast. So again, at least in the case of LCDs, HDR is largely pointless, because the dynamic range really isn't there in the technology.

LCDs also have garbage response times across the board. OLED is a huge step up in response time, but CRTs still retain the god-tier crown because they have extremely short persistence, which gives them motion clarity that sample-and-hold displays can only compete with if they achieve refresh rates many times higher, with appropriate response times.

LCDs are basically the CFLs of the monitor world, but unlike CFLs, which were only foisted on us for a few brief years before the emergence of LED lights, we're likely to be stuck with LCDs for quite a while before an actually good monitor technology emerges. There are only four good things about them: they're extremely crisp, they use less space than a CRT, cost less to make than a CRT, and require less power than a CRT. In the long run, they will be remembered as the bastard display technology that outstayed its welcome by decades.
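That contrast ceiling is easy to put numbers on - assume an LCD leaks some fixed fraction of its backlight (the 0.1% here is illustrative):

import math

peak_nits = 1000
leak_fraction = 0.001  # assumed 0.1% of the backlight leaks through a "black" pixel
black_nits = peak_nits * leak_fraction
contrast = peak_nits / black_nits
print(f"{contrast:.0f}:1 contrast, about {math.log2(contrast):.1f} stops")  # 1000:1, ~10 stops
# cranking the backlight raises the black floor along with the peak, so the ratio doesn't improve;
# an emissive display with a near-zero black level isn't capped like this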

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
between an RX 580 for 82 USD, an RX 480 for 85, and an RX 470 for 68 (all from the used market), I don't really have a perspective on the price-performance comparison besides that the 480 is probably not going to be it

what to do?

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

gradenko_2000 posted:

between an RX 580 for 82 USD, an RX 480 for 85, and an RX 470 for 68 (all from the used market), I don't really have a perspective on the price-performance comparison besides that the 480 is probably not going to be it

what to do?

The 580.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

The 580 is an OC'd 480; the 470 is a slightly cut-down 480.

So the 580.


isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE

K8.0 posted:

LCDs are basically the CFLs of the monitor world, but unlike CFLs, which were only foisted on us for a few brief years before the emergence of LED lights, we're likely to be stuck with LCDs for quite a while before an actually good monitor technology emerges. There are only four good things about them: they're extremely crisp, they use less space than a CRT, cost less to make than a CRT, and require less power than a CRT. In the long run, they will be remembered as the bastard display technology that outstayed its welcome by decades.

"It's a bastard technology that outstayed its welcome while an actually good technology was developed" seems unnecessarily down on LCDs. Yeah, they aren't perfect, but we have generations of products that would never have been possible with only CRT tech, and OLEDs have their own problems (burn in tends to be prominent). CFLs made largely no difference in lighting but LCDs reshaped the technological landscape.
