|
movax posted:poo poo, sorry, that was copied over from the U2410 description. I added an 'UNCONFIRMED' to the U2412 description. No idea if the U2412M suffers from the same. 2412 definitely doesn't support 1920x1080 without scaling. It's probably the monitor's biggest downside.
|
# ? Oct 9, 2011 01:00 |
|
|
sethsez posted:2412 definitely doesn't support 1920x1080 without scaling. It's probably the monitor's biggest downside. It seems like basically no 24" monitor below $500 supports 1:1 pixel mapping.
|
# ? Oct 9, 2011 01:39 |
|
Apparently, the HP ZR2440w supports 1:1 pixel mapping. It's the replacement for the HP ZR24w and probably has the same panel as the U2412M. http://h10010.www1.hp.com/wwpc/us/en/sm/WF05a/382087-382087-64283-72270-3884471-5163690.html quote:1:1 scaling supports full HD 1080p letterboxing.
|
# ? Oct 9, 2011 06:23 |
|
Sorry if this has been posted already but are there even standards for getting 10bit color from an Application > OS > GPU > display connection > display? I can see an application writing to a buffer on a special output card which is connected with a special connection to a special display but what about 'normal' high end desktop displays and apps rendering just on the desktop?
|
# ? Oct 9, 2011 16:40 |
|
Shaocaholica posted:Sorry if this has been posted already but are there even standards for getting 10bit color from an Application > OS > GPU > display connection > display? HDMI has the GPU > display connection > display portion covered. The HDMI standard includes specifications for supporting several wide-gamut color spaces and up to 16 bits per color. DisplayPort has this covered as well. Supposedly support for 10-bit or 16-bit color depth was added in Windows 7, but with some searching (including a bit of MSDN documentation) I couldn't find anything said about it after Windows 7 actually came out, nor how it would be used. It is definitely possible to do 30-bit color with OpenGL and NVIDIA Quadro cards, and I think AMD's FireGL cards as well.
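The bit-depth point is easy to demonstrate with a quick sketch (my illustration, not anything from the HDMI/DisplayPort specs mentioned): quantizing the same smooth gradient to 8-bit versus 10-bit precision shows how many more distinct steps 10-bit preserves, which is what reduces visible banding.

```python
# Illustration: why 10 bits per channel matters. Quantizing a smooth
# 0.0-1.0 gradient to 8-bit (256 levels) versus 10-bit (1024 levels)
# shows how many distinct steps survive -- fewer steps means visible
# banding on shallow gradients.

def quantize(value, bits):
    """Map a 0.0-1.0 float to the nearest of 2**bits levels."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

def distinct_steps(samples, bits):
    """Count distinct output levels across a uniform gradient."""
    return len({quantize(i / (samples - 1), bits) for i in range(samples)})

if __name__ == "__main__":
    samples = 4096  # a 4096-pixel-wide gradient
    print(distinct_steps(samples, 8))   # 256 distinct levels
    print(distinct_steps(samples, 10))  # 1024 distinct levels
```

The gradient here is 4096 pixels wide, so each 8-bit level spans about 16 pixels (a visible band), while each 10-bit level spans about 4.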
|
# ? Oct 9, 2011 23:12 |
|
Are we getting to the point where 'prosumer' grade displays can talk to a colorimeter and calibrate themselves? No application running in an OS, just plug the colorimeter into the display. I know some high end displays do this but they are way outside the scope of the average joe and even small business. There also needs to be some way for the OS to know this fact which can be sending a profile downstream on the cable or simply having the display limit itself to a known standard gamut which is already defined in the OS like sRGB or AdobeRGB, etc.
|
# ? Oct 10, 2011 04:14 |
|
I need a new monitor for gaming, movies, and general Windows use/surfing the net. I like fighting games, so input lag is an important issue for me (anything above 30 ms is really really bad.) I'm not going to be doing image/movie editing, and I don't really care about strict color accuracy. Black bars don't bug me. I'll also be upgrading my video card, so I should be able to handle any resolution. I have no hard limit on my budget, but I don't really want to spend more than $1000. Any advice on what I should get?
|
# ? Oct 10, 2011 13:40 |
|
Ashenai posted:I need a new monitor for gaming, movies, and general Windows use/surfing the net. I like fighting games, so input lag is an important issue for me (anything above 30 ms is really really bad.) I'm not going to be doing image/movie editing, and I don't really care about strict color accuracy. Black bars don't bug me. You may not have a hard budget limit, but what resolution you get will greatly affect how much video card you need. A 1920x1080 monitor will require maybe $200-300 in video hardware, while a 2560x1600 monitor will run closer to $600. What resolution/monitor size do you really want? The Dell U2412M is a pretty good default choice, as it's 1920x1200, 24", IPS (so good viewing angles/color), and has essentially no input lag.
|
# ? Oct 10, 2011 13:46 |
|
Crackbone posted:You may not have a hard budget limit, but what resolution you get will greatly affect how much video card you need. A 1920x1080 monitor will require maybe $200-300 in video hardware, while a 2560x1600 monitor will run closer to $600. It's going to sit on my desk maybe four feet away from my face, so I don't need it to be huge. 24" sounds perfect, and based on a quick googlin' the U2412M seems very appealing. I think that's what I'll go with, thanks! edit: man, good monitors are pretty inexpensive these days
|
# ? Oct 10, 2011 13:56 |
|
If you're really going to game a decent amount, think about grabbing a 120hz panel like the BenQ XL2410T. It's really loving awesome, but you'll need to download a good color profile for it and tweak it a little.
|
# ? Oct 10, 2011 16:35 |
|
Taima posted:If you're really going to game a decent amount, think about grabbing a 120hz panel like the BenQ XL2410T. It's really loving awesome, but you'll need to download a good color profile for it and tweak it a little. Err, what actually happens to the loaded color profile when the system goes into fullscreen 3D mode? Does it just get bypassed or tossed? I know some games are aware of this, but most aren't, right?
|
# ? Oct 10, 2011 16:37 |
|
Zhentar posted:Supposedly support for 10-bit or 16-bit color depth was added in Windows 7, but with some searching (including a bit of MSDN documentation) I couldn't find anything about it said after Windows 7 actually came out nor how it would be used. It is definitely possible to do 30-bit color with OpenGL and nVidia Quattro cards, and I think AMD's FireGL cards as well. Shaocaholica posted:Are we getting to the point where 'prosumer' grade displays can talk to a colorimeter and calibrate themselves? No application running in an OS, just plug the colorimeter into the display. Shaocaholica posted:Err, what actually happens to the loaded color profile when the system goes into fullscreen 3D mode? Does it just get pypassed or tossed? I know some games are aware of this but most aren't right?
|
# ? Oct 10, 2011 18:36 |
|
Oh wow. That would drive me nuts. It annoyed me enough that installing the new ATI drivers reverted to some default colour profile that didn't even show in my colour management interface. I noticed it straight away, so I guess at least you can say it's not placebo. You'd think the drivers would look at the profile you were using before, and even if something about reinitialising the drivers fucks the profile, revert to the profile you used to use once the drivers are done.
|
# ? Oct 10, 2011 18:56 |
|
HalloKitty posted:Oh wow. That would drive me nuts. This is kinda why I asked if we're at the point where displays control the color correction LUT, not the OS/driver/GPU. That way, no matter what OS/application/GPU you're using, it should work transparently, given a minimal level of awareness on the part of the OS/application/GPU.
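For anyone unfamiliar with the LUT being discussed: whether it lives in the GPU's output stage or inside the display, a 1D color-correction LUT is just a precomputed table that every incoming channel value is passed through. A minimal sketch (the gamma curve here is a stand-in example, not any particular monitor's correction):

```python
# Minimal sketch of a 1D color-correction LUT: each incoming 8-bit
# channel value indexes a precomputed table of corrected output values.
# The gamma curve is an illustrative stand-in for a real calibration.

def build_gamma_lut(gamma, size=256):
    """Precompute a LUT that applies a gamma curve to 8-bit values."""
    return [round(((i / (size - 1)) ** gamma) * (size - 1)) for i in range(size)]

def apply_lut(pixels, lut):
    """Run every channel value of an image through the LUT."""
    return [lut[p] for p in pixels]

if __name__ == "__main__":
    lut = build_gamma_lut(1 / 2.2)      # brightens midtones
    print(apply_lut([0, 64, 128, 255], lut))
```

The appeal of doing this in the monitor is exactly what's argued above: the table is applied to whatever pixels arrive, regardless of which OS, driver, or application produced them.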
|
# ? Oct 10, 2011 20:26 |
|
Shaocaholica posted:This is kinda why I asked if we're at the point where displays control the color correction LUT, not the OS/driver/GPU. That way, no matter what OS/application/GPU you're using, it should work transparently, given a minimal level of awareness on the part of the OS/application/GPU. True. That'd be really nice, but bear in mind: most people see nothing wrong with cheap 6-bit TN panels (nothing necessarily wrong with this, they serve a useful purpose!), so I'm not sure self-calibrating monitors will reach the mainstream.
|
# ? Oct 10, 2011 20:29 |
|
When will AMOLED screens go mainstream? I love the screen on my Galaxy S, and would like to have pitch blacks on my monitor. I assume it's a problem with making such large screens. Other than that, are there limitations like screen lag, ghosting, etc.?
|
# ? Oct 10, 2011 20:37 |
|
Animal posted:When will AMOLED screens go mainstream? I love the screen on my Galaxy S, and would like to have pitch blacks on my monitor. I assume its a problem with making such a large screens. Other than that, are there limitations like screen lag, ghosting, etc? Sony will sell you a 17" OLED monitor for a mere $4,100. I don't know if it's been overcome yet (Sony doesn't offer a spec here), but in the past one of the major limitations of OLED has been a life span significantly lower than what most users will expect from a desktop display (displays in phones, of course, benefit from both lower usage and a comparatively short useful lifespan).
|
# ? Oct 10, 2011 21:04 |
|
Ashenai posted:edit: man, good monitors are pretty inexpensive these days Don't know if you've actually bought yet, and maybe you saw it, but there is a $80 off coupon code valid until the 13th for the U2412M. Brings it down to $319. XD5W7S1JRP52C8
|
# ? Oct 11, 2011 05:23 |
|
JediJesseS posted:Don't know if you've actually bought yet, and maybe you saw it, but there is a $80 off coupon code valid until the 13th for the U2412M. Brings it down to $319. I didn't see it, and I already bought it. Thanks, though. On the plus side, the monitor really owns. Thanks, Crackbone!
|
# ? Oct 11, 2011 05:40 |
|
Ashenai posted:I didn't see it, and I already bought it. Thanks, though Dell customer service is excellent; if you politely contact them via live chat or phone, they may be able to work with you and apply that discount retroactively. Never hurts to politely ask for something!
|
# ? Oct 11, 2011 05:53 |
|
Taima posted:If you're really going to game a decent amount, think about grabbing a 120hz panel like the BenQ XL2410T. It's really loving awesome, but you'll need to download a good color profile for it and tweak it a little. I have the Asus 120Hz and I just could not go back to 60.
|
# ? Oct 11, 2011 16:26 |
|
ToastyX posted:Apparently, the HP ZR2440w supports 1:1 pixel mapping. It's the replacement for the HP ZR24w and probably has the same panel as the U2412M. What the hell? HP already has a replacement for the ZR24w?
|
# ? Oct 11, 2011 17:01 |
|
Tab8715 posted:What the hell? HP already has a replace for the zr24w? Yep; HP announced a full new set of IPS displays in various sizes, ZR2x40Ws. 20", 22", 24" and 27" IIRC.
|
# ? Oct 11, 2011 18:25 |
|
Pardon my ignorance but how do games work with >60hz displays? I mean, does vsync become less noticeable? If not, and I keep vsync on, how often will modern games even go beyond 60fps @ 1080p?
|
# ? Oct 11, 2011 20:51 |
|
FYI the Dell u2711 is on for $719 CAD now on dell.ca: http://accessories.dell.com/sna/products/Monitors_Flat_Panel_Widescreen/productdetail.aspx?c=ca&l=en&s=dhs&cs=cadhs1&sku=224-8284 So tempting.
|
# ? Oct 11, 2011 21:26 |
|
JediJesseS posted:Don't know if you've actually bought yet, and maybe you saw it, but there is a $80 off coupon code valid until the 13th for the U2412M. Brings it down to $319. What the gently caress is wrong with the Dell website? I've spent 15 minutes trying to buy this monitor with this coupon and at various stages in the process, I keep getting 'Your cart is empty or has expired' messages.
|
# ? Oct 11, 2011 21:40 |
|
Shaocaholica posted:Pardon my ignorance but how do games work with >60hz displays? I mean, does vsync become less noticeable? If not, and I keep vsync on, how often will modern games even go beyond 60fps @ 1080p? VSync will try to maintain an FPS that's an even divisor of the screen refresh rate; the problem people have nowadays is that even a slight FPS drop to 50 gets floored down to 30FPS to prevent tearing. Triple-buffering + Vsync try to combat this, though they sometimes need to be forced via something like D3DOverrider. 120Hz displays with Vsync/trip-buffering on would drop to 60FPS before dropping further; if you played without VSync whatsoever, then you'd still get tearing, but at a nice high refresh rate.
|
# ? Oct 11, 2011 21:43 |
|
Shaocaholica posted:Pardon my ignorance but how do games work with >60hz displays? I mean, does vsync become less noticeable? If not, and I keep vsync on, how often will modern games even go beyond 60fps @ 1080p? V-Sync works exactly the same at 120hz as it does at 60hz. If your video card/game is fast enough to consistently render frames at 120fps, then it will do 120fps, and if it's not, you'll get something less. And it is theoretically possible (with naive v-sync implementations) that you could only reach 119fps, with every frame rendering in exactly the same amount of time, and end up stuck at 60fps. If, and how often, that would happen depends on a huge number of factors so it's not something you could easily generalize.
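Zhentar's "stuck at 60fps" scenario can be sketched with a toy model of naive double-buffered v-sync (the model and numbers are illustrative assumptions, not any actual driver's implementation): a finished frame is only shown at the next vertical refresh, and the next frame can't start rendering until the previous one is displayed.

```python
# Toy model of naive double-buffered v-sync: a finished frame can only
# be displayed at the next vertical refresh, and the next frame cannot
# start rendering until the previous one is shown.

def vsynced_fps(render_time, refresh_hz, frames=1000):
    """Average FPS when every frame takes render_time seconds to draw."""
    interval = 1.0 / refresh_hz
    t = 0.0
    for _ in range(frames):
        t += render_time                 # finish drawing the frame
        # wait for the next refresh boundary before it can be displayed
        refreshes = -(-t // interval)    # ceiling division on floats
        t = refreshes * interval
    return frames / t
```

With a constant 1/119s render time at 120Hz, every frame just misses a refresh and waits for the next, so the model lands on exactly 60fps; the same model reproduces movax's example of 50fps flooring to 30 at 60Hz.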
|
# ? Oct 11, 2011 21:48 |
|
movax posted:VSync will try to maintain an FPS that's an even divisor of the screen refresh rate; the problem people have nowadays is that even a slight FPS drop to 50 gets floored down to 30FPS to prevent tearing. Triple-buffering + Vsync try to combat this, though they sometimes need to be forced via something like D3DOverrider. Well, having it drop to 60fps would be nice. Still, how many games can even run at >60fps at 1080p on ~1-year-old hardware?
|
# ? Oct 11, 2011 21:49 |
|
cheese posted:What the gently caress is wrong with the Dell website? I've spent 15 minutes trying to buy this monitor with this coupon and at various stages in the process, I keep getting 'Your cart is empty or has expired' messages. Apparently it was Opera. I tried on Firefox and it worked just fine.
|
# ? Oct 11, 2011 21:55 |
|
movax posted:VSync will try to maintain an FPS that's an even divisor of the screen refresh rate; I'm not aware of any V-Sync technique that intentionally attempts to maintain an even divisor. The problem with the traditional double-buffering model is that you can't start drawing the next frame until the current one gets displayed. This means that the time to draw each individual frame will effectively always be a multiple of the refresh interval, but the average of that is not necessarily a divisor of the refresh rate. If the average time to render is close to a multiple of the refresh interval, some frames will take fewer refreshes than others (e.g. at 60Hz, if frames take 17ms +/- 1ms, some frames will finish in one refresh and some in two) and you'll end up with an FPS that's not an integer divisor of the refresh rate.
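The mixed case above can be checked with a toy model of double-buffered v-sync (an illustrative assumption, not real driver behavior): alternating fast and slow frames at 60Hz take one and two refreshes respectively, and the average lands between the divisors.

```python
import itertools

# Toy model of double-buffered v-sync with varying frame times: each
# finished frame waits for the next refresh boundary before display.

def vsynced_fps_mixed(frame_times, refresh_hz, frames=1000):
    """Average FPS for a repeating cycle of per-frame render times."""
    interval = 1.0 / refresh_hz
    times = itertools.cycle(frame_times)
    t = 0.0
    for _ in range(frames):
        t += next(times)                   # draw the frame
        t = -(-t // interval) * interval   # snap to the next refresh
    return frames / t
```

Alternating 16ms and 18ms frames at 60Hz gives one refresh for the fast frame and two for the slow one: two frames per three refreshes, i.e. 40fps, which is not an integer divisor of 60.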
|
# ? Oct 11, 2011 22:33 |
|
Ok so I've just had this monitor delivered!! http://www.ebuyer.com/240824-lg-w2363d-pf-lcd-tft-23-3d-ready-hdmi-monitor-w2363d-pf The instructions for the monitor are asking me to uninstall my graphics card drivers before installing some GeForce drivers (I have an ATI card but I'm guessing that's still compatible) from a CD-ROM provided. However, most of this seems to be in aid of calibrating the 3D features. I'm not at all interested in the 3D capabilities of the monitor (I don't even have the glasses needed) and bought it solely for the 120Hz. Should I be fine just plugging the monitor in without uninstalling my graphics drivers? I definitely didn't have to do anything like this for my last monitor. Thanks for any help.
|
# ? Oct 12, 2011 15:02 |
|
If you don't care, neither should your graphics driver. Just plug it in, verify that you can select 120Hz for the refresh rate, and enjoy.
|
# ? Oct 12, 2011 15:14 |
|
Do you guys think the cost difference is worth it between an HP ZR22w and an HP ZR2240w? The former uses CCFL, and I can't imagine the increased power consumption over the WLED version would come anywhere near paying for the premium -- it would be interesting to calculate the cost savings over a year assuming 20 hours/week usage. Does the ZR2240w offer any practical benefits over the ZR22w other than the decreased size/weight and reduced power consumption? Also, I wish you could get a 1080p display at 19-20". Laptops have taught me that high pixel density is loving neat (Sony Z crew represent).
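The back-of-the-envelope calculation suggested above is quick to run. The wattages and electricity price below are assumptions for illustration (CCFL and WLED 24" monitors of that era differed by very roughly 20W), not measured figures for these models:

```python
# Rough yearly electricity cost of a monitor backlight. The wattages
# and price per kWh are illustrative assumptions, not measured specs.

def yearly_cost(watts, hours_per_week, price_per_kwh):
    """Electricity cost of running a device for one year (52 weeks)."""
    kwh = watts * hours_per_week * 52 / 1000.0
    return kwh * price_per_kwh

if __name__ == "__main__":
    ccfl = yearly_cost(55, 20, 0.12)   # assumed ~55 W CCFL monitor
    led = yearly_cost(35, 20, 0.12)    # assumed ~35 W WLED monitor
    print(f"yearly savings: ${ccfl - led:.2f}")
```

At those assumed numbers the LED model saves only about $2.50 a year at 20 hours/week, so the power savings alone come nowhere near paying for the premium.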
|
# ? Oct 12, 2011 16:54 |
|
kalibar posted:Do you guys think the cost difference is worth it between an HP ZR22w and an HP ZR2240w? The former uses CCFL, and I can't imagine the increased power consumption over the WLED version would come anywhere near paying for the premium -- it would be interesting to calculate the cost savings over a year assuming 20 hours/week usage. This is entirely subjective and based off the few CCFL/LED monitors I've tried, but colors look less "washed out" on my LED monitors and they are more pleasant to stare at for long periods of time.
|
# ? Oct 12, 2011 17:05 |
|
kalibar posted:Do you guys think the cost difference is worth it between an HP ZR22w and an HP ZR2240w? The former uses CCFL, and I can't imagine the increased power consumption over the WLED version would come anywhere near paying for the premium -- it would be interesting to calculate the cost savings over a year assuming 20 hours/week usage. I don't think there are major differences except for the backlight and any industrial design tweaks; still no reviews out for the new ZR-series yet. Some of them seem to be following Dell's playbook in taking 6-bit panels and applying A(FRC) though. On something as small as a 22", I don't know what the savings would be, but I do know my U3011 sucks down >100W when it's on due to the sheer size it has to light up. Oh well, I'm not getting a new monitor until this one is stolen or utterly destroyed!
|
# ? Oct 12, 2011 17:06 |
|
DrDork posted:If you don't care, neither should your graphics driver. Just plug it in, verify that you can select 120Hz for the refresh rate, and enjoy. Ok, thanks for the help.
|
# ? Oct 12, 2011 19:15 |
|
kalibar posted:Do you guys think the cost difference is worth it between an HP ZR22w and an HP ZR2240w? The former uses CCFL, and I can't imagine the increased power consumption over the WLED version would come anywhere near paying for the premium -- it would be interesting to calculate the cost savings over a year assuming 20 hours/week usage.
|
# ? Oct 13, 2011 02:30 |
|
Good feedback, thanks guys. Didn't notice that little tidbit about the power consumption, that's really interesting. The thing is, ZR22w's are less than $200 on eBay. So like, I could have dual 1080p IPS displays for less than 400 bucks. That's pretty appealing.
|
# ? Oct 13, 2011 07:23 |
|
|
I picked up a second-hand Samsung 2693HM, 25.5" 1920x1200, for €200. Unfortunately, there's a vertical red line every time I turn it on. It disappears after about 15 minutes; alternatively, if I rotate the monitor 90 degrees it disappears, and if I'm lucky I can bring the screen back to horizontal and it stays fine. That makes me think gravity affects the red line. I'm tempted to open the screen up to fix it, but won't unless someone has advice.

The other reason I'm posting is the built-in speakers: they don't mute when I plug speakers into the headphone socket. Anyone have any ideas? Why have a headphone socket if the speakers don't turn off?

Otherwise, I'd say the screen is too big. I tried it at my regular desk and it was too high up. I'm on a taller stool now and it's OK, but I think I'll really only use it with my Apple TV. That said, the extra 60 lines on the top and bottom during videos are barely noticeable, and the viewing angle seems grand from 10' away. The difference in pixel size between my MacBook Air and this screen was weird, so I think I'll pick up a second-hand 23" Apple Cinema Display (~€300) and use that for actual work.
|
# ? Oct 14, 2011 00:23 |