|
Cojawfee posted:Because everyone can just drop a few hundred dollars on a device they will use once. The other side of that argument is that they could save another couple hundred dollars by not getting an IPS for the color accuracy they'll end up not getting even once.
|
# ? Mar 14, 2015 01:06 |
|
Cojawfee posted:Because everyone can just drop a few hundred dollars on a device they will use once. If you're looking for a "price isn't important" monitor for graphic design that you're going to connect to a $2000 MacBook, there's really no excuse--it's simply a cost of doing that sort of work. Also you can get many of the lower-end offerings (which are all she probably needs) for <$200 these days.
|
# ? Mar 14, 2015 01:18 |
|
Brut posted:The other side of that argument is that they could save another couple hundred dollars by not getting an IPS for the color accuracy they'll end up not getting even once. How to witness color deformation and degradation in a TN monitor posted:1) Get a TN panel larger than a netbook display. Or one straight off a netbook display. Even if you don't have factory or on-site calibration, a not-TN display will look markedly less bad, partly because angle deformation is far less of an issue and partly because most of them are using 8 bit panels (they'll tell you if they're 10-bit) and if it's 6 bit they'll at least be pretty drat good at faking it. Keep in mind that if the manufacturer's using TN for the panel and it's not ALL of [expensive/144Hz/not-gamer-oriented] it's hard to imagine they're not cutting corners everywhere else. And even if you feel you need Accurate Color Or Else, unless color management is actually vital (because you do visual media for money or actual not-Internet prestige or something) you can get by with a $100 model or a passaround - and if it is vital you're derelict if you don't spring for a quality calibrator. dont be mean to me fucked around with this message at 01:27 on Mar 14, 2015 |
# ? Mar 14, 2015 01:24 |
|
I'm currently very happy with my 120 hz monitor for gaming. I have a more long term question. Will monitors keep getting higher refresh rates in the future? I know we already have 144, will we get significantly higher than that in the next few years?
|
# ? Mar 14, 2015 09:41 |
|
Jippa posted:I'm currently very happy with my 120 hz monitor for gaming. I have a more long term question. Will monitors keep getting higher refresh rates in the future? There's not much point, anything higher than 96hz is gonna be almost impossible to detect for most of the population. I think 600hz is the next "sweet spot" because it is divisible by all the main refresh rates used today (24hz, 25hz, 50hz, 60hz) but I dunno when or if that'll happen.
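The 600hz "sweet spot" isn't arbitrary: 600 is the least common multiple of the usual content frame rates, which a quick sketch can confirm (Python, purely to check the arithmetic):

```python
from math import gcd
from functools import reduce

def lcm(a, b):
    # least common multiple via the gcd identity
    return a * b // gcd(a, b)

# common content frame rates: film, PAL film transfers, PAL video, NTSC-ish
rates = [24, 25, 50, 60]
print(reduce(lcm, rates))  # 600
```

A display at that rate can show each of those sources with a whole number of refreshes per source frame, so none of them need pulldown judder.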
|
# ? Mar 14, 2015 11:03 |
|
SwissCM posted:There's not much point, anything higher than 96hz is gonna be almost impossible to detect for most of the population. I think 600hz is the next "sweet spot" because it is divisible by all the main refresh rates used today (24hz, 25hz, 50hz, 60hz) but I dunno when or if that'll happen. What would be the point if we have adaptive sync? Could we even perceive a change that occurred over 1/600th of a second? My guess is somewhere around 200 Hz is the endgame as far as making a screen look perfectly smooth is concerned. That said, I think anything above 85 Hz or so is going to be very smooth, and far above that, diminishing returns will kick in hard. Might as well use the GPU power to make the scene prettier. As far as consoles are concerned, it's still de rigueur to poo poo out 30 FPS presentations which are far from visually smooth, so I reckon it'll be a while before we see any real progress, since many devs target 30 FPS as acceptable. Which sure as hell won't be for VR. HalloKitty fucked around with this message at 18:12 on Mar 14, 2015 |
# ? Mar 14, 2015 11:45 |
|
HalloKitty posted:What would be the point if we have adaptive sync? Could we even perceive a change that occured over 1/600th of a second? Kind of a weird scenario, but at 600hz you could have both a 50hz video and 60hz video running on the same screen without any jitter. Otherwise yeah, adaptive sync is the way to go.
|
# ? Mar 14, 2015 12:56 |
|
Look, if you're a professional in a field, I would expect you have the wherewithal to find the appropriate tools for the job you do. Multimeters are not cheap, but I wouldn't hire an electrician that didn't have one. Wasabi the J fucked around with this message at 18:14 on Mar 14, 2015 |
# ? Mar 14, 2015 15:57 |
|
Wasabi the J posted:Underwhelming by all accounts. I'm trying to find somewhere to see the Dell 34" curved monitor in person - having trouble; found a lot of positive feedback online though. Would you mind going into any detail on your experiences with them?
|
# ? Mar 14, 2015 17:41 |
|
At 600hz you could probably have some very cool motion blur effects.
|
# ? Mar 14, 2015 17:41 |
|
skylined! posted:I'm trying to find somewhere to see the Dell 34" curved monitor in person - having trouble; found a lot of positive feedback online though. Would you mind going into any detail on your experiences with them? I saw one in person at Fry's in Vegas, and it was just ... meh. I liked the ultrawide aspect (which I knew going in, having a 2560x1080 monitor, myself), however the curve was a negative to me; it wasn't severe enough to fill my field of view any more, yet it was too extreme to simply overcome the perspective skewing of the edges being slightly farther away, and actually pushed useful screen elements in the corners to awkward positions in my field of view. I have pretty good vision, though I've never been tested for anything that might gently caress up my peripheral vision. Regardless, it made trying to do something simple (like finding an icon in the upper or lower left of the monitor) a slightly jarring experience as my eyes had to focus NEARER on an object that my brain "knew" was further away. All this testing was done in a store environment, though, so in order to get the most complete review, I'd have to take a $1500 gamble, buy one, mount it well, and adjust everything to myself.
|
# ? Mar 14, 2015 18:15 |
|
Wasabi the J posted:I saw one in person at Fry's in Vegas, and it was just ... meh. I liked the ultrawide aspect (which I knew going in, having a 2560x1080 monitor, myself), however the curve was a negative to me; it wasn't severe enough to fill my field of view any more, yet it was too extreme to simply overcome the perspective skewing of the edges being slightly farther away, and actually pushed useful screen elements in the corners to awkward positions in my field of view. So you mean underwhelming by your account, not all accounts.
|
# ? Mar 14, 2015 18:31 |
|
KingEup posted:So you mean underwhelming by your account, not all accounts. http://www.gizmodo.co.uk/2015/03/one-week-living-with-lgs-curved-ultrawide-219-monitor/ http://www.gizmodo.co.uk/2015/03/lg-34uc97-curved-led-monitor-review-more-screen-than-your-desk-can-handle/ And users in this very thread talking about the backlight issues its curved shape exacerbates. Like I was implying before, it's cool because it's a huge 21:9 display, not because it's got a weird shape.
|
# ? Mar 14, 2015 18:57 |
|
HalloKitty posted:As far as consoles are concerned, it's still de rigueur to poo poo out 30 FPS presentations which are far from visually smooth, so I reckon it'll be a while before we see any real progress, since many devs target 30 FPS as acceptable. Which sure as hell won't be for VR. I think that you've got it exactly backwards here. e - now I get that you're referring to games on consoles themselves rather than their impact on PC gaming. Because the current consoles were so incredibly weak even when released, PC games are already running at double the framerates, and as hardware continues to scale up, framerates are going to go even higher. I don't think I have a practical use for a monitor bigger than 27"/1440; it's already much larger than my central FOV and I can't really discern pixels in motion, and I would much prefer scaling to really high framerates over higher resolution at this point. Also, there are benefits to increasing framerates up to enormous numbers that we will never hit, just like resolution. It's always going to be a tradeoff, but framerates have been lagging behind for a long time, and we're finally starting to see a serious push in that direction, which is nice. K8.0 fucked around with this message at 21:22 on Mar 14, 2015 |
# ? Mar 14, 2015 21:14 |
|
Except for the fact that, quite frankly, the industry has a lot more plans in the pipeline for upping resolution than they do for Hz. VR isn't going to change much because it's going to be pushed via headsets running two independent screens, so you can still get away with 60Hz display hardware (though obviously it'll take quite a bit more to actually drive them, GPU-wise). Consoles will also push more of a resolution-over-framerate approach, because some people actually have 4k TVs in their homes, while basically no one has a TV that's actually capable of displaying a 240Hz signal. You can argue your personal preferences back and forth, but we're not going to see anything higher than 144Hz for a long time, simply because there is minimal demand for it. 120-144Hz is "fast enough" to provide an extremely smooth picture, and as has been noted, past that the diminishing gains kick in pretty hard. Then you have to figure in the computational cost of stupid-high refresh rates: 1080p@60Hz is effectively ~124M pixels/s. 1080p@144Hz is ~299M pixels/s. 1080p@240Hz is ~498M pixels/s, which is the same throughput as 4k@60Hz, and it already takes a hell of a setup to run 4k games at 50-60 FPS. So in the next ~5 years we'll mostly see everyone catching up to 4k/5k resolutions, while 1440p and 1080p IPS/VA monitors get a bump to 120/144Hz, and then once those all settle in for a bit, we'll see some work on increasing 4k framerates. But you're not gonna see a 1080p@240/600Hz monitor anytime soon, because seriously, why bother? DrDork fucked around with this message at 01:29 on Mar 15, 2015 |
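The 1080p@240Hz vs. 4k@60Hz comparison is in fact exact, since 4k is 2x2 1080p and 240 is 4x60. A quick sketch of the raw pixel throughput (ignoring blanking intervals and bit depth):

```python
def pixel_rate(width, height, hz):
    # raw pixels delivered per second (ignores blanking intervals and bit depth)
    return width * height * hz

print(pixel_rate(1920, 1080, 240))  # 497664000
print(pixel_rate(3840, 2160, 60))   # 497664000 -- identical throughput
```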
# ? Mar 15, 2015 01:25 |
|
DrDork posted:Except for the fact that, quite frankly, the industry has a lot more plans in the pipeline for upping resolution than they do for Hz. VR isn't going to change much because it's going to be pushed via headsets running two independent screens, so you can still get away with 60Hz display hardware (though obviously it'll take quite a bit more to actually drive them, GPU-wise). Nobody doing PC-attached VR headsets is doing 60Hz displays.
|
# ? Mar 15, 2015 01:28 |
|
Subjunctive posted:Nobody doing PC-attached VR headsets is doing 60Hz displays. The Galaxy Gear VR uses a 1440p@60Hz display (which admittedly isn't PC-attached), and the Razor one uses a 1080p@60Hz display. As to the others, they're also not really breaking new ground, either. Most of them look to be in the 1080p-ish range at 90-120Hz. No one's getting too crazy because, again, the computational expense of 240+Hz displays gets pretty hefty really fast.
|
# ? Mar 15, 2015 01:42 |
|
DrDork posted:The Galaxy Gear VR uses a 1440p@60Hz display (which admittedly isn't PC-attached), and the Razor one uses a 1080p@60Hz display. As to the others, they're also not really breaking new ground, either. Most of them look to be in the 1080p-ish range at 90-120Hz. No one's getting too crazy because, again, the computational expense of 240+Hz displays gets pretty hefty really fast. Is Razor really doing low-persistence at 60Hz in their consumer version? That's going to be pretty bad compared to the others. I guess it's really just a dev kit though. I think 90Hz is going to be the standard for PC-attached HMDs; I don't think anyone has been able to produce evidence that LP 120Hz is distinguishably better than 90, and the content performance requirements go up dramatically. (Faster response in terms of change from one pixel value to another will be a good area of improvement, but also harder AIUI than raising the refresh rate.)
|
# ? Mar 15, 2015 01:59 |
|
Subjunctive posted:Is Razor really doing low-persistence at 60Hz in their consumer version? That's going to be pretty bad compared to the others. I guess it's really just a dev kit though. Yeah, though as you noted, it's just a dev kit right now. There's hope that it'll get bumped up--Oculus started with a 60Hz dev kit, too.
|
# ? Mar 15, 2015 02:07 |
|
DrDork posted:But you're not gonna see a 1080p@240/600Hz anytime soon, because seriously, why bother? I have a 240 hz tv (60 fps for real, frame interpolation for the rest). I was skeptical, but for everything that doesn't require lightning fast response times, the difference in smoothness is significant. I would really like to buy a 120 hz native display that had frame interpolation for an effective 480 / 600 hz. I get that it's not strictly necessary, but the extra smoothness does help and gives things a very slick feel. Also it helps massively with motion blur on LCDs.
|
# ? Mar 15, 2015 02:24 |
|
DrDork posted:Yeah, though as you noted, it's just a dev kit right now. There's hope that it'll get bumped up--Oculus started with a 60Hz dev kit, too. Yeah, but we never considered it to be good enough for high-quality VR. Even DK2's 75Hz was known all along to be a stepping stone; we knew we had to hit at least 90Hz for it to be comfortable. (120Hz at HMD resolutions also exceeds HDMI bandwidth, and some GPUs can't even push 90Hz at Crescent Bay resolutions because they don't have the pixel clock for it. Better to spend bandwidth on increased resolution than refresh rate above 90Hz, I think.)
|
# ? Mar 15, 2015 05:18 |
|
Agreed. Either way, high-quality 3D is really an HDMI 2.0 or bust kinda deal, which at least gets you stereo 1080p@60Hz. The real answer is DisplayPort, and I'm not sure why that's not being pushed harder by the higher-end devices. Sure, you can push your lower-rez devices off a medium-range GPU (as long as you're not trying to render anything too stressful), but if you're already talking about 1440p or larger displays at 90+Hz, you're going to need a higher-end card to begin with, and guess what? Most of them have DP now.
|
# ? Mar 15, 2015 05:35 |
|
My understanding from the folks working on the display side of the hardware is that it's harder to accommodate DP with the low-persistence panels that are available in quantity. Might not be inherent, but just how they were built when the current gen of HMDs were being designed. I'll ask on Monday if I remember.
|
# ? Mar 15, 2015 05:41 |
|
After waiting 6 months or so for the price to drop, or for something better to come along, I finally purchased the ASUS PB287Q. It's my first 4K monitor and I really like it. As a gamer, having to run at 60hz takes some getting used to, but thanks to my GTX 980 I can run all games at 4k@60fps with some graphics settings adjusted. Viewing angles are great, colours are great and the brightness is excellent. While I'll miss 120hz and 120fps, I'm looking forward to running more games in 4k.
|
# ? Mar 15, 2015 06:20 |
|
Saw the BenQ XL2730Z, a FreeSync-capable monitor, pop up on PCPartPicker today. Quick, someone buy one and tell me if it's worth getting! Based on basic comparisons it's the Asus ROG Swift but with FreeSync instead of G-Sync: TN panel, 27 inch, 2560x1440, 144hz refresh rate, 1ms response time. Radeon owners rejoice!
|
# ? Mar 15, 2015 17:52 |
|
Betty posted:Saw the BenQ XL2730Z, a FreeSync-capable monitor, pop up on PCPartPicker today. Quick, someone buy one and tell me if it's worth getting! Based on basic comparisons it's the Asus ROG Swift but with FreeSync instead of G-Sync: TN panel, 27 inch, 2560x1440, 144hz refresh rate, 1ms response time. Radeon owners rejoice! In the German Amazon store, the BenQ is about ~80 Euro more expensive than the Asus ROG Swift. How and why? I'd have guessed FreeSync monitors would be way cheaper than G-Sync ones, given their pricing policies.
|
# ? Mar 15, 2015 18:51 |
|
I think Carmack thinks 8K at 90fps is about where HMDs get good (i.e. the screen-door effect goes away, and dot pitch is about in line with current display sizes/viewing distances). So that will probably take a new generation of interconnects and several GPU generations. I think to get retina-sized pixels across the FOV you need around 20K, which might actually be feasible in a decade or so. But it might make more sense to take advantage of the non-uniformity of the retina somehow using eye tracking.
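That 20K figure is roughly what falls out of textbook acuity numbers. A back-of-envelope check (the acuity and FOV values below are rough approximations, not specs from any headset):

```python
# Peak human visual acuity is commonly cited as ~60 cycles/degree;
# the Nyquist limit then requires two pixels per cycle.
cycles_per_degree = 60
pixels_per_degree = 2 * cycles_per_degree  # 120

# Assume a ~170-degree horizontal field of view (approximate).
fov_degrees = 170

print(pixels_per_degree * fov_degrees)  # 20400 -- the "20K" ballpark
```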
|
# ? Mar 15, 2015 20:39 |
|
evensevenone posted:I think Carmack thinks 8K at 90fps is about where HMDs get good (i.e. the screen-door effect goes away, and dot pitch is about in line with current display sizes/viewing distances). So that will probably take a new generation of interconnects and several GPU generations. I think to get retina-sized pixels across the FOV you need around 20K, which might actually be feasible in a decade or so. VR is probably not getting a miracle, but I also think we will start to cheat a bit, as you can use way fewer pixels for areas that aren't the center of focus, saving a huge amount of rendering time and transit bandwidth. We are slowly getting to the point where head and eye tracking is fast enough that we wouldn't even notice the missing bits. Our eyes do this type of thing all the time already, and I expect the knowledge of how that works to be applied.
|
# ? Mar 15, 2015 21:21 |
|
Foveated rendering is something we've looked at; you still have to push just as many pixels (they all have to light up) unless you invent new signaling, but you could make those pixels much cheaper to generate. (The human eye only perceives full resolution over something like 3° of arc.) There are definitely a lot of tricks left to play on the human brain, but I don't think anyone is going to get foveated rendering working in this headset generation. I'm not sure fast enough eye tracking is possible with current sensing tech, though eye and head motion both top out at about 1000°/sec, so the general rendering responsiveness requirement might be about the same as for tracking head rotation.
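To see why foveated rendering is so attractive: if full acuity covers only ~3° of arc, the fraction of a wide-FOV frame that actually needs full resolution is tiny. A rough sketch (the 110° FOV is an assumed figure, and the geometry is simplified to a flat square):

```python
fov_deg = 110      # assumed total field of view per eye (not any headset's spec)
foveal_deg = 3     # region of full acuity, per the ~3 degrees of arc figure

# Linear fraction in each axis, squared for area (flat-screen approximation)
full_res_fraction = (foveal_deg / fov_deg) ** 2
print(f"{full_res_fraction:.3%}")  # ~0.074% of the frame needs full resolution
```

Even with generous padding around the gaze point, the full-resolution region stays a small fraction of the frame, which is where the rendering savings come from.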
|
# ? Mar 16, 2015 01:47 |
|
Is there a special term for when the image on your screen bleeds to the side? It's a pretty big problem for my monitor and I'm curious as to what its called so I can try to fix it.
|
# ? Mar 16, 2015 02:34 |
|
Baumann posted:Is there a special term for when the image on your screen bleeds to the side? Is it persistent or does it only happen with motion? Are you using DVI/HDMI or VGA? This sounds a lot like a VGA-connected monitor with badly adjusted timing. Even if not using VGA, some monitors still let you play with timing when connected to a DVI/HDMI signal, so it may be worth investigating whether or not those are still set to 0 like they probably should be. As always, photos of the problem would help.
|
# ? Mar 16, 2015 03:26 |
|
Zorilla posted:Is it persistent or does it only happen with motion? Are you using DVI/HDMI or VGA? This sounds a lot like a VGA-connected monitor with badly adjusted timing. Even if not using VGA, some monitors still let you play with timing when connected to a DVI/HDMI signal, so it may be worth investigating whether or not those are still set to 0 like they probably should be. So after looking into the timing, it was not set to zero. Fixing that seems to have done the trick, and the image on my monitor is a lot better now. Thanks for your help.
|
# ? Mar 16, 2015 04:05 |
|
Is it realistic to think we'll see a 34-inch ultrawide monitor under $500 this year?
|
# ? Mar 16, 2015 04:41 |
|
DammitJanet posted:Is it realistic to think we'll see a 34-inch ultrawide monitor under $500 this year? I wouldn't hold my breath on that.
|
# ? Mar 16, 2015 04:44 |
|
Etrips posted:I wouldn't hold my breath on that. I had a feeling. The only one I'm all that interested in is LG's 34UM67, and that fucker only supports Freesync, and I just got a GTX 970, and I was looking forward to getting into G-sync. I went to Best Buy the other night to get a look at a 29" ultrawide to see if it would be big enough to suit me (it wasn't). Trying to look at a 27" 2560x1440 IPS (my next choice) was an even bigger challenge since BB had not one monitor on display at a resolution higher than 1920x1080. Tearing and v-sync stutter are my biggest motivations for upgrading from this dried out turd, so now I'm thinking of taking my ~$330 to amazon and getting a QNIX 2710 for OC purposes...maybe the increased refresh rate will be close enough to the effect of G-sync?
|
# ? Mar 16, 2015 05:14 |
|
DammitJanet posted:Tearing and v-sync stutter are my biggest motivations for upgrading from this dried out turd, so now I'm thinking of taking my ~$330 to amazon and getting a QNIX 2710 for OC purposes...maybe the increased refresh rate will be close enough to the effect of G-sync?
|
# ? Mar 16, 2015 06:15 |
|
I'd be happy to switch to 1920x1080 for some games that need it. Is that not an option with an OC'd Korean affair?
|
# ? Mar 16, 2015 06:17 |
|
DammitJanet posted:I'd be happy to switch to 1920x1080 for some games that need it. Is that not an option with an OC'd Korean affair? You can run pretty much any resolution you want. Your video card will be handling the scaling instead of the monitor's signal board, but it's totally automatic and you don't have to configure anything. Even the BIOS splash screen will be stretched to fill the screen. The only real caveat is that it's a simple, linear stretch, so things might look a bit more blurred if you're used to monitors that use a "sharpness" setting to determine how smooth or pixelated lower resolutions look when scaled up.
|
# ? Mar 16, 2015 06:46 |
|
DammitJanet posted:I'd be happy to switch to 1920x1080 for some games that need it. Is that not an option with an OC'd Korean affair?
|
# ? Mar 16, 2015 06:47 |
|
DammitJanet posted:I'd be happy to switch to 1920x1080 for some games that need it. Is that not an option with an OC'd Korean affair? No scaler. It'd be worth looking into the 34" 1080 ultra wides if it's just for gaming.
|
# ? Mar 16, 2015 06:47 |