Brut
Aug 21, 2007
Probation
Can't post for 10 days!

Cojawfee posted:

Because everyone can just drop a few hundred dollars on a device they will use once.

The other side of that argument is that they could save another couple hundred dollars by not getting an IPS, since they'll end up not using that color accuracy even once.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Cojawfee posted:

Because everyone can just drop a few hundred dollars on a device they will use once.

If you're looking for a "price isn't important" monitor for graphic design that you're going to connect to a $2000 MacBook, there's really no excuse--it's simply a cost of doing that sort of work. Also you can get many of the lower-end offerings (which are probably all she needs) for <$200 these days.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


Brut posted:

The other side of that argument is that they could save another couple hundred dollars by not getting an IPS, since they'll end up not using that color accuracy even once.

How to witness color deformation and degradation in a TN monitor posted:

1) Get a TN panel larger than a netbook display. Or one straight off a netbook display.

2) Look straight on.

Even if you don't have factory or on-site calibration, a non-TN display will look markedly less bad, partly because viewing-angle distortion is far less of an issue and partly because most of them use 8-bit panels (they'll tell you if they're 10-bit), and if it's 6-bit they'll at least be pretty drat good at faking it. Keep in mind that if the manufacturer's using TN for the panel and it's not ALL of [expensive/144Hz/not-gamer-oriented], it's hard to imagine they're not cutting corners everywhere else.
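
For a sense of scale on those bit depths, a quick Python sketch (straight per-channel math; FRC dithering, which fakes the higher figure temporally, isn't modeled here):
code:
# Shades per channel and total colors for common panel bit depths.
for bits in (6, 8, 10):
    shades = 2 ** bits      # levels per R/G/B channel
    print(f"{bits}-bit: {shades} shades/channel, {shades ** 3:,} colors")

# 6-bit:  64 shades/channel,         262,144 colors
# 8-bit:  256 shades/channel,     16,777,216 colors
# 10-bit: 1024 shades/channel, 1,073,741,824 colors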

And even if you feel you need Accurate Color Or Else, unless color management is actually vital (because you do visual media for money or actual not-Internet prestige or something), you can get by with a $100 model or a pass-around - and if it is vital, you're derelict if you don't spring for a quality calibrator.

dont be mean to me fucked around with this message at 01:27 on Mar 14, 2015

Jippa
Feb 13, 2009
I'm currently very happy with my 120Hz monitor for gaming. I have a more long-term question: will monitors keep getting higher refresh rates in the future?

I know we already have 144Hz; will we get significantly higher than that in the next few years?

SCheeseman
Apr 23, 2003

Jippa posted:

I'm currently very happy with my 120Hz monitor for gaming. I have a more long-term question: will monitors keep getting higher refresh rates in the future?

I know we already have 144Hz; will we get significantly higher than that in the next few years?

There's not much point; anything higher than 96Hz is gonna be almost impossible to detect for most of the population. I think 600Hz is the next "sweet spot" because it is divisible by all the main refresh rates used today (24Hz, 25Hz, 50Hz, 60Hz), but I dunno when or if that'll happen.
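
That divisibility claim is easy to sanity-check; a quick Python sketch, for anyone who wants to verify it:
code:
from functools import reduce
from math import gcd

rates = [24, 25, 50, 60]  # film, PAL, and NTSC-ish content frame rates

def lcm(a, b):
    return a * b // gcd(a, b)

# Smallest refresh rate that every content rate divides evenly:
print(reduce(lcm, rates))  # -> 600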

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

SwissCM posted:

There's not much point; anything higher than 96Hz is gonna be almost impossible to detect for most of the population. I think 600Hz is the next "sweet spot" because it is divisible by all the main refresh rates used today (24Hz, 25Hz, 50Hz, 60Hz), but I dunno when or if that'll happen.

What would be the point if we have adaptive sync? Could we even perceive a change that occurred over 1/600th of a second?

My guess is somewhere around 200 Hz is the endgame as far as making a screen look perfectly smooth is concerned.

That said, I think anything above 85 Hz or so is going to be very smooth, and far above, diminishing returns will kick in hard. Might as well use the GPU power to make the scene prettier.

As far as consoles are concerned, it's still de rigueur to poo poo out 30 FPS presentations which are far from visually smooth, so I reckon it'll be a while before we see any real progress, since many devs target 30 FPS as acceptable. Which sure as hell won't be acceptable for VR.

HalloKitty fucked around with this message at 18:12 on Mar 14, 2015

SCheeseman
Apr 23, 2003

HalloKitty posted:

What would be the point if we have adaptive sync? Could we even perceive a change that occurred over 1/600th of a second?

My guess is somewhere around 200 Hz is the endgame as far as making a screen look perfectly smooth is concerned.

That said, I think anything above 85 Hz or so is going to be very smooth, and far above, diminishing returns will kick in hard. Might as well use the GPU power to make the scene prettier.

As far as consoles are concerned, it's still seen as perfectly acceptable to poo poo out 30 FPS presentations which are far from visually smooth, so I reckon it'll be a while before we see any real progress, since many devs target 30 FPS as acceptable. Which sure as hell won't be acceptable for VR.

Kind of a weird scenario, but at 600Hz you could have both a 50Hz video and a 60Hz video running on the same screen without any jitter.

Otherwise yeah, adaptive sync is the way to go.
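
To make that concrete, here's a small sketch (assumes plain frame repetition, no interpolation):
code:
def refreshes_per_frame(refresh_hz, content_fps):
    # A non-integer result means an uneven frame cadence (judder).
    return refresh_hz / content_fps

for refresh in (60, 120, 600):
    for fps in (24, 25, 50, 60):
        print(f"{fps} fps on {refresh} Hz: "
              f"{refreshes_per_frame(refresh, fps):g} refreshes/frame")

# At 600 Hz every common rate lands on a whole number (25, 24, 12, 10),
# so 50 Hz and 60 Hz streams can share the screen judder-free.
# At 60 Hz, 24 fps needs 2.5 refreshes/frame -- hence 3:2 pulldown judder.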

Wasabi the J
Jan 23, 2008

MOM WAS RIGHT
Look, if you're a professional in a field, I would expect you to have the wherewithal to find the appropriate tools for the job you do.

Multimeters are not cheap, but I wouldn't hire an electrician who didn't have one.

Wasabi the J fucked around with this message at 18:14 on Mar 14, 2015

skylined!
Apr 6, 2012

THE DEM DEFENDER HAS LOGGED ON

Wasabi the J posted:

Underwhelming by all accounts.

I'm trying to find somewhere to see the Dell 34" curved monitor in person - having trouble; found a lot of positive feedback online though. Would you mind going into any detail on your experiences with them?

eggyolk
Nov 8, 2007


At 600hz you could probably have some very cool motion blur effects.

Wasabi the J
Jan 23, 2008

MOM WAS RIGHT

skylined! posted:

I'm trying to find somewhere to see the Dell 34" curved monitor in person - having trouble; found a lot of positive feedback online though. Would you mind going into any detail on your experiences with them?

I saw one in person at Fry's in Vegas, and it was just ... meh. I liked the ultrawide aspect (which I knew going in, having a 2560x1080 monitor myself), but the curve was a negative for me; it wasn't pronounced enough to fill any more of my field of view, yet it went beyond simply compensating for the perspective skew of the edges being slightly farther away, and actually pushed useful screen elements in the corners to awkward positions in my field of view.

I have pretty good vision, though I've never been tested for anything that might gently caress up my peripheral vision. Regardless, it made trying to do something simple (like finding an icon in the upper or lower left of the monitor) a slightly jarring experience as my eyes had to focus NEARER on an object that my brain "knew" was further away.

All this testing was done in a store environment, though, so in order to get the most complete review, I'd have to take a $1500 gamble, buy one, mount it well, and adjust everything to myself.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

Wasabi the J posted:

I saw one in person at Fry's in Vegas, and it was just ... meh. I liked the ultrawide aspect (which I knew going in, having a 2560x1080 monitor myself), but the curve was a negative for me; it wasn't pronounced enough to fill any more of my field of view, yet it went beyond simply compensating for the perspective skew of the edges being slightly farther away, and actually pushed useful screen elements in the corners to awkward positions in my field of view.

I have pretty good vision, though I've never been tested for anything that might gently caress up my peripheral vision. Regardless, it made trying to do something simple (like finding an icon in the upper or lower left of the monitor) a slightly jarring experience as my eyes had to focus NEARER on an object that my brain "knew" was further away.

All this testing was done in a store environment, though, so in order to get the most complete review, I'd have to take a $1500 gamble, buy one, mount it well, and adjust everything to myself.

So you mean underwhelming by your account, not all accounts.

Wasabi the J
Jan 23, 2008

MOM WAS RIGHT

KingEup posted:

So you mean underwhelming by your account, not all accounts.

http://www.gizmodo.co.uk/2015/03/one-week-living-with-lgs-curved-ultrawide-219-monitor/

http://www.gizmodo.co.uk/2015/03/lg-34uc97-curved-led-monitor-review-more-screen-than-your-desk-can-handle/

And users in this very thread talking about the backlight issues its curved shape exacerbates.

Like I was implying before, it's cool because it's a huge 21:9 display, not because it's got a weird shape.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

HalloKitty posted:

As far as consoles are concerned, it's still de rigueur to poo poo out 30 FPS presentations which are far from visually smooth, so I reckon it'll be a while before we see any real progress, since many devs target 30 FPS as acceptable. Which sure as hell won't be acceptable for VR.

I think you've got it exactly backwards here. e - now I get that you're referring to games on consoles themselves rather than their impact on PC gaming. Because the current consoles were so incredibly weak even when released, PC games are already running at double the framerates, and as hardware continues to scale up, framerates are going to go even higher. I don't think I have a practical use for a monitor bigger than 27"/1440; it's already much larger than my central FOV, I can't really discern pixels in motion, and I would much prefer scaling to really high framerates over higher resolution at this point.

Also, there are benefits to increasing framerates up to enormous numbers that we will never hit, just like resolution. It's always going to be a tradeoff, but framerates have been lagging behind for a long time and we're finally starting to see a serious push in that direction which is nice.

K8.0 fucked around with this message at 21:22 on Mar 14, 2015

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Except for the fact that, quite frankly, the industry has a lot more plans in the pipeline for upping resolution than they do for Hz. VR isn't going to change much because it's going to be pushed via headsets running two independent screens, so you can still get away with 60Hz display hardware (though obviously it'll take quite a bit more to actually drive them, GPU-wise). Consoles will also push more of a resolution-over-framerate approach, because some people actually have 4k TVs in their homes, while basically no one has a TV that's actually capable of displaying a 240Hz signal.

You can argue your personal preferences back and forth, but we're not going to see anything higher than 144Hz for a long time, simply because there is minimal demand for it. 120-144Hz is "fast enough" to provide an extremely smooth picture, and as has been noted, past that the diminishing gains kick in pretty hard. Then you have to figure in the computational cost of stupid-high refresh rates: 1080p@60Hz is roughly 124M pixels/s. 1080p@144Hz is 299M pixels/s. 1080p@240Hz is 498M pixels/s, which is exactly 4k@60Hz, and it already takes a hell of a setup to run 4k games at 50-60 FPS. So in the next ~5 years we'll mostly see everyone catching up to 4k/5k resolutions, while 1440p and 1080p IPS/VA monitors get a bump to 120/144Hz, and then once those all settle in for a bit, we'll see some work on increasing 4k framerates.
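
The arithmetic behind those numbers, as a quick sketch:
code:
def pixel_rate(width, height, hz):
    # Raw pixels per second the GPU must deliver for a given mode.
    return width * height * hz

modes = {
    "1080p@60":  (1920, 1080, 60),
    "1080p@144": (1920, 1080, 144),
    "1080p@240": (1920, 1080, 240),
    "4k@60":     (3840, 2160, 60),
}
for name, m in modes.items():
    print(f"{name}: {pixel_rate(*m) / 1e6:.0f} Mpx/s")

# 1080p@60: 124 | 1080p@144: 299 | 1080p@240: 498 | 4k@60: 498
# 4k has exactly 4x the pixels of 1080p, so 1080p@240Hz == 4k@60Hz.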

But you're not gonna see a 1080p@240/600Hz monitor anytime soon, because seriously, why bother?

DrDork fucked around with this message at 01:29 on Mar 15, 2015

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DrDork posted:

Except for the fact that, quite frankly, the industry has a lot more plans in the pipeline for upping resolution than they do for Hz. VR isn't going to change much because it's going to be pushed via headsets running two independent screens, so you can still get away with 60Hz display hardware (though obviously it'll take quite a bit more to actually drive them, GPU-wise).

Nobody doing PC-attached VR headsets is doing 60Hz displays.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Subjunctive posted:

Nobody doing PC-attached VR headsets is doing 60Hz displays.

The Samsung Gear VR uses a 1440p@60Hz display (which admittedly isn't PC-attached), and the Razer one uses a 1080p@60Hz display. As to the others, they're not really breaking new ground either. Most of them look to be in the 1080p-ish range at 90-120Hz. No one's getting too crazy because, again, the computational expense of 240+Hz displays gets pretty hefty really fast.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DrDork posted:

The Samsung Gear VR uses a 1440p@60Hz display (which admittedly isn't PC-attached), and the Razer one uses a 1080p@60Hz display. As to the others, they're not really breaking new ground either. Most of them look to be in the 1080p-ish range at 90-120Hz. No one's getting too crazy because, again, the computational expense of 240+Hz displays gets pretty hefty really fast.

Is Razer really doing low-persistence at 60Hz in their consumer version? That's going to be pretty bad compared to the others. I guess it's really just a dev kit though.

I think 90Hz is going to be the standard for PC-attached HMDs; I don't think anyone has been able to produce evidence that LP 120Hz is distinguishably better than 90, and the content performance requirements go up dramatically. (Faster response in terms of change from one pixel value to another will be a good area of improvement, but also harder AIUI than raising the refresh rate.)

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Subjunctive posted:

Is Razer really doing low-persistence at 60Hz in their consumer version? That's going to be pretty bad compared to the others. I guess it's really just a dev kit though.

Yeah, though as you noted, it's just a dev kit right now. There's hope that it'll get bumped up--Oculus started with a 60Hz dev kit, too. On the other hand, they note that the display "can be replaced by a phone," so it sounds like they may be tracking more towards the Samsung make-a-cheapish-mount-for-existing-hardware route than making a purpose-built set of display hardware.

Ham Sandwiches
Jul 7, 2000

DrDork posted:

But you're not gonna see a 1080p@240/600Hz monitor anytime soon, because seriously, why bother?

I have a 240Hz TV (60 fps for real, frame interpolation for the rest). I was skeptical, but for everything that doesn't require lightning-fast response times, the difference in smoothness is significant.

I would really like to buy a 120Hz-native display with frame interpolation for an effective 480/600Hz. I get that it's not strictly necessary, but the extra smoothness does help and gives things a very slick feel. It also helps massively with motion blur on LCDs.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DrDork posted:

Yeah, though as you noted, it's just a dev kit right now. There's hope that it'll get bumped up--Oculus started with a 60Hz dev kit, too.

Yeah, but we never considered it to be good enough for high-quality VR. Even DK2's 75Hz was known all along to be a stepping stone; we knew we had to hit at least 90Hz for it to be comfortable.

(120Hz at HMD resolutions also exceeds HDMI bandwidth, and some GPUs can't even push 90Hz at Crescent Bay resolutions because they don't have the pixel clock for it. Better to spend bandwidth on increased resolution than on refresh rate above 90Hz, I think.)
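
For a rough sense of where HDMI runs out, a sketch (the 2560x1440 panel is a stand-in, not Crescent Bay's actual spec; 24-bit color is assumed, and blanking intervals, which push the real requirement higher, are ignored):
code:
def raw_gbps(width, height, hz, bpp=24):
    # Uncompressed video data rate, ignoring blanking overhead.
    return width * height * hz * bpp / 1e9

HDMI_1_4 = 8.16  # Gbit/s of video data (10.2 Gbit/s TMDS, 8b/10b encoded)
HDMI_2_0 = 14.4  # Gbit/s of video data (18 Gbit/s TMDS)

for hz in (90, 120):
    print(f"2560x1440@{hz}: {raw_gbps(2560, 1440, hz):.1f} Gbit/s "
          f"(HDMI 1.4: {HDMI_1_4}, HDMI 2.0: {HDMI_2_0})")

# @90:  8.0 Gbit/s -- already at HDMI 1.4's ceiling before blanking
# @120: 10.6 Gbit/s -- over HDMI 1.4 even without blanking overhead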

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Agreed. Either way, high-quality 3D is really an HDMI 2.0 or bust kinda deal, which at least gets you stereo 1080p@60Hz. The real answer is DisplayPort, and I'm not sure why that's not being pushed harder by the higher-end devices. Sure, you can push your lower-rez devices off a medium-range GPU (as long as you're not trying to render anything too stressful), but if you're already talking about 1440p or larger displays at 90+Hz, you're going to need a higher-end card to begin with, and guess what? Most of them have DP now.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

My understanding from the folks working on the display side of the hardware is that it's harder to accommodate DP with the low-persistence panels that are available in quantity. Might not be inherent, just how they were built when the current gen of HMDs was being designed.

I'll ask on Monday if I remember.

drewmoney
Mar 11, 2004
After waiting 6 months or so for the price to drop, or something better to come along, I finally purchased the ASUS PB287Q. It's my first 4K monitor and I really like it. As a gamer, having to run at 60Hz takes some getting used to, but thanks to my GTX 980 I can run all games at 4k@60fps with some graphics settings adjusted. Viewing angles are great, colours are great and the brightness is excellent.

While I'll miss 120Hz and 120fps, I'm looking forward to running more games in 4k.

Betty
Apr 14, 2008
Saw the BenQ XL2730Z, a FreeSync-capable monitor, pop up on PCPartPicker today. Quick, someone buy one and tell me it's worth getting! Based on basic comparisons it's the Asus ROG Swift but with FreeSync instead of G-Sync: TN panel, 27 inch, 2560x1440, 144Hz refresh rate, 1ms response time. Radeon owners rejoice!

Mr.PayDay
Jan 2, 2004
life is short - play hard

Betty posted:

Saw the BenQ XL2730Z, a FreeSync-capable monitor, pop up on PCPartPicker today. Quick, someone buy one and tell me it's worth getting! Based on basic comparisons it's the Asus ROG Swift but with FreeSync instead of G-Sync: TN panel, 27 inch, 2560x1440, 144Hz refresh rate, 1ms response time. Radeon owners rejoice!

In the German Amazon store, the BenQ is ~80 Euro more expensive than the Asus ROG Swift. How and why? I'd have guessed FreeSync would be way cheaper than G-Sync's price policy.

evensevenone
May 12, 2001
Glass is a solid.
I think Carmack thinks 8K at 90fps is about where HMDs get good (i.e. the screen-door effect goes away, with dot pitch about in line with current display sizes/viewing distances). So that will probably be a new generation of interconnects and several GPU generations away. I think to get retina-sized pixels across the FOV you need around 20K, which might actually be feasible in a decade or so.

But it might make more sense to take advantage of the non-uniformity of the retina somehow using eye tracking.
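
Napkin math behind those figures, as a sketch (the ~60 pixels/degree value is a common estimate of 20/20 acuity; the FOV numbers here are assumptions, not any headset's spec):
code:
ACUITY_PPD = 60  # ~pixels per degree at 20/20 visual acuity

for fov_h, fov_v, label in [(100, 100, "current-gen HMD FOV"),
                            (200, 135, "full human FOV")]:
    print(f"{label}: ~{fov_h * ACUITY_PPD}x{fov_v * ACUITY_PPD} per eye")

# current-gen HMD FOV: ~6000x6000 per eye
# full human FOV:      ~12000x8100 per eye -- "20K-class" once you count
# both eyes plus supersampling headroom for the optics.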

EoRaptor
Sep 13, 2003

by Fluffdaddy

evensevenone posted:

I think Carmack thinks 8K at 90fps is about where HMDs get good (i.e. the screen-door effect goes away, with dot pitch about in line with current display sizes/viewing distances). So that will probably be a new generation of interconnects and several GPU generations away. I think to get retina-sized pixels across the FOV you need around 20K, which might actually be feasible in a decade or so.

But it might make more sense to take advantage of the non-uniformity of the retina somehow using eye tracking.

VR is probably not a miracle, but I also think we will start to cheat a bit, since you can use way fewer pixels for areas that aren't the center of focus, saving a huge amount of rendering time and transit bandwidth.

We are slowly getting to the point where head and eye tracking is fast enough that we wouldn't even notice the missing bits. Our eyes do this type of thing all the time already, and I expect the knowledge of how that works to be applied.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Foveated rendering is something we've looked at; you still have to push just as many pixels (they all have to light up) unless you invent new signaling, but you could make those pixels much cheaper to generate. (The human eye only perceives full resolution over something like 3° of arc.) There are definitely a lot of tricks left to play on the human brain, but I don't think anyone is going to get foveated rendering working in this headset generation. I'm not sure fast enough eye tracking is possible with current sensing tech, though eye and head motion both top out at about 1000°/sec, so the general rendering responsiveness requirement might be about the same as for tracking head rotation.
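
The potential savings are striking even on napkin math; a sketch (assumes pixels spread uniformly across the FOV, which real HMD optics don't quite do):
code:
FOV_DEG = 100    # assumed per-axis headset field of view
FOVEA_DEG = 3    # ~arc over which the eye resolves full detail

fraction = (FOVEA_DEG / FOV_DEG) ** 2
print(f"Full-detail region: {fraction:.2%} of the frame")
# -> 0.09% of the frame needs full resolution; everything else could
# be shaded far more cheaply -- if eye tracking can keep up.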

The Baumann
Jun 2, 2013

En Garde, Fuckboy
Is there a special term for when the image on your screen bleeds to the side?

It's a pretty big problem for my monitor and I'm curious as to what it's called so I can try to fix it.

Zorilla
Mar 23, 2005

GOING APE SPIT

Baumann posted:

Is there a special term for when the image on your screen bleeds to the side?

It's a pretty big problem for my monitor and I'm curious as to what it's called so I can try to fix it.

Is it persistent or does it only happen with motion? Are you using DVI/HDMI or VGA? This sounds a lot like a VGA-connected monitor with badly adjusted timing. Even if you're not using VGA, some monitors still let you play with timing when connected to a DVI/HDMI signal, so it may be worth investigating whether those settings are still set to 0 like they probably should be.

As always, photos of the problem would help.

The Baumann
Jun 2, 2013

En Garde, Fuckboy

Zorilla posted:

Is it persistent or does it only happen with motion? Are you using DVI/HDMI or VGA? This sounds a lot like a VGA-connected monitor with badly adjusted timing. Even if you're not using VGA, some monitors still let you play with timing when connected to a DVI/HDMI signal, so it may be worth investigating whether those settings are still set to 0 like they probably should be.

As always, photos of the problem would help.


So after looking into the timing, it was not set to zero. Fixing that seems to have done the trick, and the image on my monitor is a lot better now. Thanks for your help.

DammitJanet
Dec 26, 2006

Nice shootin', Tex.
Is it realistic to think we'll see a 34-inch ultrawide monitor under $500 this year?

Etrips
Nov 9, 2004

Having Teemo Problems?
I Feel Bad For You, Son.
I Got 99 Shrooms
And You Just Hit One.

DammitJanet posted:

Is it realistic to think we'll see a 34-inch ultrawide monitor under $500 this year?

I wouldn't hold my breath on that.

DammitJanet
Dec 26, 2006

Nice shootin', Tex.

Etrips posted:

I wouldn't hold my breath on that.

I had a feeling. The only one I'm all that interested in is LG's 34UM67, but that fucker only supports FreeSync, and I just got a GTX 970 and was looking forward to getting into G-Sync. I went to Best Buy the other night to get a look at a 29" ultrawide to see if it would be big enough to suit me (it wasn't). Trying to look at a 27" 2560x1440 IPS (my next choice) was an even bigger challenge, since BB had not one monitor on display at a resolution higher than 1920x1080.

Tearing and v-sync stutter are my biggest motivations for upgrading from this dried out turd, so now I'm thinking of taking my ~$330 to Amazon and getting a QNIX 2710 for OC purposes... maybe the increased refresh rate will be close enough to the effect of G-Sync?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

DammitJanet posted:

Tearing and v-sync stutter are my biggest motivations for upgrading from this dried out turd, so now I'm thinking of taking my ~$330 to Amazon and getting a QNIX 2710 for OC purposes... maybe the increased refresh rate will be close enough to the effect of G-Sync?

Just make sure you have realistic expectations about being able to actually drive it in whatever games you've got. I know a 970 is a beefy piece of meat, but even it will bog down if you're trying to do 1440p@90+Hz with all the pretties on in some games (1440p@90Hz is about 2.7 times the pixel rate of 1080p@60Hz). NVIDIA is a dick for refusing to support FreeSync at this point, but I guess it's all about :10bux: and that ain't gonna change for a few years until they're basically forced into it by monitor manufacturers who don't want to pay the royalty.
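
The ratio, for anyone checking the math, as a one-liner sketch:
code:
ratio = (2560 * 1440 * 90) / (1920 * 1080 * 60)
print(f"1440p@90Hz pushes {ratio:.2f}x the pixels of 1080p@60Hz")  # ~2.67x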

DammitJanet
Dec 26, 2006

Nice shootin', Tex.
I'd be happy to switch to 1920x1080 for some games that need it. Is that not an option with an OC'd Korean affair?

Zorilla
Mar 23, 2005

GOING APE SPIT

DammitJanet posted:

I'd be happy to switch to 1920x1080 for some games that need it. Is that not an option with an OC'd Korean affair?

You can run pretty much any resolution you want. Your video card will be handling the scaling instead of the monitor's signal board, but it's totally automatic and you don't have to configure anything. Even the BIOS splash screen will be stretched to fill the screen.

The only real caveat is that it's a simple, linear stretch, so things might look a bit more blurred if you're used to monitors that use a "sharpness" setting to determine how smooth or pixelated lower resolutions look when scaled up.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

DammitJanet posted:

I'd be happy to switch to 1920x1080 for some games that need it. Is that not an option with an OC'd Korean affair?

You can always set a game to whatever render resolution you want--the GPU will simply upscale whatever the result is to the 1440p signal that the Korean monitor requires, so no worries there. The problem is that 1440 / 1080 = 1.333..., and since that isn't a whole number, the visual quality suffers and things end up looking blurry or fuzzy. A better option for most games would be to leave it at 1440p and knock down some of the other graphics options, like FSAA, shadows, etc., to maintain an acceptable framerate. But you can toy around with a variety of those options until you figure out what you like best; it all will work, it's just a question of what looks best to you.
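
A quick sketch of why that scale factor matters (simple per-axis math; assumes plain linear upscaling):
code:
native = (2560, 1440)
for src_w, src_h in [(1920, 1080), (1280, 720)]:
    sx, sy = native[0] / src_w, native[1] / src_h
    clean = sx.is_integer() and sy.is_integer()
    print(f"{src_w}x{src_h} -> {native[0]}x{native[1]}: {sx:.3f}x scale, "
          f"{'clean integer scale' if clean else 'fractional, so blurry'}")

# 1920x1080 -> 2560x1440: 1.333x scale, fractional, so blurry
# 1280x720  -> 2560x1440: 2.000x scale, clean integer scale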

Wasabi the J
Jan 23, 2008

MOM WAS RIGHT

DammitJanet posted:

I'd be happy to switch to 1920x1080 for some games that need it. Is that not an option with an OC'd Korean affair?

No scaler.

It'd be worth looking into the 34" 1080p ultrawides if it's just for gaming.
