|
Does this look like completely normal levels of glow/bleed? It looks pretty solid to me.
|
# ? Apr 25, 2023 06:43 |
|
By overexposed photograph standards, I'd say it looks about normal or maybe slightly better than average? It's very hard to tell in photos though due to how they exaggerate the effects of glow and bleed. If it looks okay to your eye in person, then that's all that matters.
Dr. Video Games 0031 fucked around with this message at 07:18 on Apr 25, 2023 |
# ? Apr 25, 2023 07:07 |
|
I would really like some advice picking a monitor. I've read the OP but still have a few things I'd like people's thoughts on. I work from home full time as a developer and game a lot in the evenings, so gaming and coding are the primary uses. Oh, and a bit of music production and (even less frequently) occasional video editing. My graphics card is an RTX 3080. Budget-wise, probably 300-400 GBP, but I'd go up to 500 if there was good reason.
Stuff I'm reasonably sure about:
Size - Pretty sure I want 27". I want something bigger than the 24" I have currently, but I think anything beyond that is going to be impractical in terms of viewing distance and desk space.
Resolution - Pretty sure I want 1440p. I've considered 4K but I'm not sure if you can really tell the difference at that size and, for gaming, I'm more interested in frame rates, smoothness and responsiveness than res. Thoughts? If you think I should get 4K I'd be interested in hearing why.
Refresh rate - Something high? 144Hz is probably enough, I think.
Stuff I'm not so sure about:
G-Sync / FreeSync - Not 100% sure what I'm looking for here. There seem to be very few fully native G-Sync monitors, and they are more expensive. But I gather that FreeSync monitors are compatible these days? Is that right? Are there any tangible drawbacks to using a non-native G-Sync (but G-Sync compatible) monitor with an Nvidia card? Is it worth spending the money? Would appreciate advice on this.
Panel type - I'm guessing for my budget, it's going to be some variant of IPS, right?
HDR - Seems from reading this thread it's not worth it at this price point. Is that right?
Curvature - I don't think I want a curved one, but a lot of them seem to be, so I'm wondering if I'm unnecessarily restricting my options. I can see why it's good for gaming, not so sure about coding etc.
Also, in a previous job I worked in an office where we had ultrawide curved monitors, and they had this annoying effect where, when moving my head, I would see sort of vertical lines strobing. Not a great description, but hopefully you know what I mean. I found it pretty irritating. Is that a common thing, or a problem with certain panel types or something? I think that's about it. Interested in everyone's thoughts. Especially if anyone just wants to give me a solid "buy this, you'll be happy" recommendation, because honestly researching all this sort of stuff drives me mad, I just go round in circles and never make a decision. Bonus points: Something I can just order on Amazon UK and have here next day would be great. chippy fucked around with this message at 17:01 on Apr 25, 2023 |
# ? Apr 25, 2023 16:58 |
|
Ran into an odd bug with my Alienware QD-OLED where g-sync stopped working (didn't show up as a feature in the nvidia control panel) and I had to power cycle the monitor to get it working again. It was a good reminder of how bad tearing is without g-sync, because I was playing Dead Island 2 and the tearing was incredibly noticeable even with a 120fps average frame rate
|
# ? Apr 26, 2023 03:29 |
|
shrike82 posted:Ran into an odd bug with my Alienware QD-OLED where g-sync stopped working (didn't show up as a feature on nvidia control panel) and I had to power cycle the monitor to get it working again I splurged on a 7950gx2 back in the day. You think tearing is bad with 1 gpu? Watch 2 of them frankensteined together in 1 pci slot try to render a forest in world of warcraft.
|
# ? Apr 26, 2023 03:40 |
|
chippy posted:I would really like some advice picking a monitor. I've read the OP but still have a few things I'd like people's thoughts on. I work from home full time as a developer and game a lot in the evenings so gaming and coding are the primary uses. Oh, and a bit of music production and (even less frequently) occasional video editing. My graphics card is an RTX 3080. I think these are supposed to be pretty good, but there are so many affordable options in this range that 27" 1440p high refresh is pretty much a dime a dozen now: GIGABYTE M27Q, ASUS VG27AQ
|
# ? Apr 26, 2023 03:43 |
|
chippy posted:I would really like some advice picking a monitor. I've read the OP but still have a few things I'd like people's thoughts on. I work from home full time as a developer and game a lot in the evenings so gaming and coding are the primary uses. Oh, and a bit of music production and (even less frequently) occasional video editing. My graphics card is an RTX 3080. 27" 1440p IPS is a good idea for this budget. As for G-Sync/Freesync, nearly every monitor these days will be g-sync compatible, and it's not really worth spending extra for a native g-sync monitor. And you're right about HDR not being worth it in this price range. Looking at PCPartPicker, the monitor I'd choose in this list is the LG 27GP850. It has a really nice, clear image that's hard to beat. The VG27AQ is cheaper, but it's a much older model that has slower response times and slightly worse gamut coverage. There may be better deals out there that PCPartPicker isn't finding—their coverage in the UK seems rather spotty.
|
# ? Apr 26, 2023 04:02 |
|
Thanks a lot for your suggestions. I have a couple more questions based on these: - There seem to be very similar models/variants of the LG 27GP850: LG 27GP850-B, and LG 27GP850P-B. Are they all basically the same, slightly newer versions etc.? - The LG 27GP850 apparently has a VRR range of 60-180Hz for GeForce cards. The GIGABYTE M27Q apparently goes down to 48Hz. What happens if your framerate drops below these ranges? You get tearing again? In that case, it seems better to go for the one with the wider range. The Gigabyte one will apparently do frame-doubling below this range, although that is described as "AMD low framerate compensation" so maybe it only works with AMD cards? Any idea if the LG one does something like this? Or do all VRR monitors do something like this when outside the range? Thanks again! edit: OK, according to https://www.rtings.com/monitor/reviews/lg#:~:text=need%20with%20LG.-,Lineup,-LG%20offers%20three, looks like maybe the P just means 2021, and the -B just means it's in black. My VRR questions still stand though. chippy fucked around with this message at 15:23 on Apr 26, 2023 |
# ? Apr 26, 2023 15:04 |
|
RTINGS reports a 20 - 180 fps range for g-sync on the 27GP850. I believe the monitor does frame doubling/tripling when it drops below 60. So the refresh rate will never drop below 60hz, but VRR still functions under it. The M27Q also supports VRR down to 20fps or even lower, reportedly, also through frame duplication. I don't believe there's any tangible difference between these monitors' VRR implementations, even if the M27Q can technically hit a lower refresh rate. That just means that frame duplication starts at 47fps instead of 59fps. This works on Nvidia cards as well as AMD cards. The different models of the 27GP850 are just minor variations, as far as I know. Maybe some slight differences in chassis design, built-in RGB LEDs, USB ports, etc. It shouldn't be anything that affects the panel.
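If it helps to see the frame-duplication fallback as logic, here's a rough sketch. The function name is made up, and the 60-180 range is just the reported range for this monitor, not anything from a spec; real firmware behavior will differ in the details.

```python
# Illustrative sketch of Low Framerate Compensation (LFC): when the
# game's frame rate drops below the panel's VRR floor, the monitor
# displays each frame 2x/3x/... so the actual refresh stays in range.

def lfc_refresh(fps, vrr_min=60, vrr_max=180):
    """Return (panel_refresh_hz, frames_shown_per_game_frame)."""
    if fps >= vrr_min:
        return min(fps, vrr_max), 1  # plain VRR: one refresh per frame
    k = 2
    while fps * k < vrr_min:  # double, triple, ... back into range
        k += 1
    return fps * k, k

print(lfc_refresh(120))  # (120, 1) -> normal VRR
print(lfc_refresh(40))   # (80, 2)  -> each frame displayed twice
print(lfc_refresh(25))   # (75, 3)  -> each frame displayed three times
```

Either way, new frames still hit the screen the moment the game delivers them, which is why duplication below the floor feels the same as "true" low-range VRR.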
|
# ? Apr 26, 2023 15:15 |
|
Dr. Video Games 0031 posted:RTINGS reports a 20 - 180 fps range for g-sync on the 27GP850. I believe the monitor does frame doubling/tripling when it drops below 60. So the refresh rate will never drop below 60hz, but VRR still functions under it. The M27Q also supports VRR down to 20fps or even lower, reportedly, also through frame duplication. I don't believe there's any tangible difference between these monitors' VRR implementations, even if the M27Q can technically hit a lower refresh rate. That just means that frame duplication starts at 47fps instead of 59fps. This works on Nvidia cards as well as AMD cards. I just found Nvidia's own compatibility list, and that says a bottom end of 60 for the 27GP850. Presumably that's more likely to be correct than RTINGS. But if I'm understanding, you're saying that if the monitor is doing frame doubling/tripling below 60, the difference between that and the monitor doing actual VRR <60Hz is not actually perceivable? So effectively it is doing VRR down to 20, just via frame duplication as opposed to truly matching the refresh rate to the framerate?
|
# ? Apr 26, 2023 15:26 |
|
The backlight on my 2 year old AOC U2790VQ 27" 4K monitor is dying. I need an equivalent replacement. Should I get the same one or something else? I hardly game, I just want a 4k 27" with a VESA mount.
|
# ? Apr 27, 2023 04:47 |
|
does anybody own the AW3423DWF and can just clearly tell me if it's going to suck having the freesync version on my 3080 ti?
|
# ? Apr 27, 2023 20:06 |
|
What’re you worried about? As I understand it, it’s not quite as nice a panel overall, but I don’t think there’s any difference between having the gsync module or not, if that’s what you mean?
|
# ? Apr 27, 2023 20:43 |
|
I have that panel and a 3080 FE. I can't tell the difference from gsync, except I heard the gsync panel had an annoying fan noise when the gsync version first released, and it didn't allow firmware updates (which the freesync one does)
|
# ? Apr 27, 2023 21:02 |
|
Mine doesn’t make a noise I can hear but yeah no fw updates, something about the gsync module
|
# ? Apr 27, 2023 21:07 |
|
edit:Nvm
Scalding Coffee fucked around with this message at 04:39 on Apr 28, 2023 |
# ? Apr 27, 2023 22:27 |
|
The Gigabyte M27U is now on newegg: https://www.newegg.com/p/N82E16824012061 Looks like the MSRP is $550, which is a really nice starting price for a high-refresh 27" 4K monitor. I'm sure we'll be seeing some good discounts on it by black friday, too.
|
# ? Apr 28, 2023 09:31 |
|
A few pages back I was posting about an LG TV vs a QD-OLED UW. Both options really have their pros and cons for my use case, but it is Golden Week here in Japan and I was able to grab a 48” C1 for a very good price with a free 5 year warranty. Can’t wait to try it!
|
# ? Apr 29, 2023 21:54 |
|
So I've recently purchased the following computer, awaiting delivery:
Processor: AMD Ryzen 7 5700X
Motherboard: MSI B550M-P [MATX]
NVIDIA GeForce RTX 4070 Ti 12GB
32GB [3200MHz] DDR4 RAM
1TB NVMe M.2 Solid State Drive
Additional Storage HDD: 4TB Hard Disk Drive
Power Supply: 750W 80 Plus Gold
As I've currently got a fairly mediocre 10-year-old Samsung s24b300 1080p monitor that caps out at 60Hz, I'm going to get a new monitor as well. Given that there's a bit of a gap between now and when I last looked into monitors, I'd appreciate some clarification on some general points (especially given that the OP of this thread is... a little out of date). Basically:
1. 4K vs. 1440p. This is probably the biggest thing I'm unsure of. After researching a bit I'm leaning slightly towards 1440p, as from what I can see 4K gaming (particularly for new games with ray-tracing and all of the bells and whistles) is pretty much the purview of x80/x90 class cards (which in Australia cost close to AU$1,500-2,000) and would likely also mean committing to the frequent upgrade treadmill. Meanwhile, when looking at a 1080/1440/4K side-by-side comparison video on a friend's 32-inch 4K monitor, I have to say that the differences between 1080 and 1440 were really pronounced, but I kinda had to squint looking between 1440 and 4K to see a difference. Plus any 4K monitor that gets over a 60Hz refresh rate is $1,000+ here. The types of games I play tend to run a wide gamut, but I do want to play things like Street Fighter 6 as well as some very execution-heavy action games where stable framerates are very important.
1.a The impression I get is that it is far preferable to run games at a monitor's native resolution, as it will look poor if e.g. I buy a 4K monitor for everything my computer can handle at that resolution but then play anything it struggles with at 1440p. Is this accurate/how much of a big deal is it?
2. Refresh rate. I'm seeing 144Hz as the happy medium for gaming monitors; is there any reason I would want to aim higher/could be satisfied with 60Hz?
3. Response time: It's my understanding that a 1ms response time from the monitor is pretty much achievable/desirable for gaming these days. Are there any downsides to this/is it overkill in terms of added expense?
4. Is there anything else about a monitor in general that I should look out for with regards to its capabilities that I haven't addressed above?
5. I'm aiming for either 27 or 32 inches; are these sizes appropriate in terms of pixels-per-inch if I go for 1440p?
6. Will my old HDMI cable be sufficient (bought it 10 years ago with my current monitor)? Or do displays with resolutions higher than 1080p require newer cables (i.e. has HDMI cable bandwidth increased significantly in recent years, necessitating an upgrade)?
7. I use my desk for work from home, but need to use my work-provided Surface Pro. Up until now I've just been pulling the HDMI cable out of my own computer and connecting it to the Surface Pro via a USB-HDMI adapter, but I don't particularly want to keep futzing around at the back of my case every day I'm WFH. I had an idea to use something like this so that I can switch between using my own computer and my work computer for input. Would this potentially introduce lag/some other kind of detriment that would affect how games play, e.g. if I'm not connecting the new monitor to the computer directly?
Thanks in advance, I know it's a lot of questions, but there seem to be a lot of considerations when making a purchase like this.
|
# ? Apr 30, 2023 01:48 |
|
Breetai posted:1.a The impression that I get is that it is far preferable to run games in a monitor's native resolution as it will look poor if e.g. I buy a 4K monitor for everything that my computer can handle at that resolution but then play anything it struggles with at 1440p. Is this accurate/how much of a big deal is it? NVIDIA cards support integer scaling so if you bought a 4k display you can run games at 720p, 1080p, 2160p (whatever gives you frame-rates you’re happy with) with no blur. If you bought a 1440p display you can run games at 720p or 1440p with no blur.
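The condition behind that is simple to check yourself: a render resolution is blur-free under integer scaling only when it divides the native resolution by the same whole number on both axes. A quick arithmetic sketch (function and variable names are made up, this isn't any driver API):

```python
# Which render resolutions scale to a native panel resolution by a
# whole-number factor (the blur-free integer-scaling case)?

def integer_scales(native_w, native_h, candidates):
    """Return (w, h, factor) for each candidate that divides evenly."""
    out = []
    for w, h in candidates:
        if native_w % w == 0 and native_h % h == 0 and native_w // w == native_h // h:
            out.append((w, h, native_w // w))
    return out

common = [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]
print(integer_scales(3840, 2160, common))  # 720p (3x), 1080p (2x), 2160p (1x)
print(integer_scales(2560, 1440, common))  # 720p (2x), 1440p (1x)
```

Note 1440p is *not* in the 4K list: 3840/2560 = 1.5, which is exactly why 1440p content on a 4K panel needs blurry non-integer scaling.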
|
# ? Apr 30, 2023 07:27 |
|
Stotty posted:I'm looking to upgrade to a bigger screen, having upgraded my PC recently. I plumped for the M32Q in the end (32" 1440p @ 165Hz). First impressions are pretty good. 32" is verging on being ever so slightly cumbersome, but my previous experience with a 27" @ 1440p was that I found the text a little difficult to read at the distance I sit without increasing the scaling a touch (and therefore losing most of the benefit of the extra screen real estate) - it might have just been a really poo poo panel though, to be honest. It's a shame there's not a bit more granularity in screen sizes. Desktop/productivity is fine, the text is perfectly clear, on par with my old 1080p display, which is totally fine for me, and the extra screen space is nice. Colours seem OK. No dead pixels that I can see. A little bit of noticeable IPS glow in one corner in really dark conditions. It's really nice for any sort of media consumption. Games look decent, and for the sort of gaming I do the 32" size isn't a detriment - I can see that if you're playing competitive shooters it might be a drawback.
|
# ? Apr 30, 2023 10:20 |
|
Breetai posted:So I've recently purchased the following computer, awaiting delivery: I bought a similar spec PC to yours not that long ago. The articles I read pointed to the 4070 Ti not really being a good pick for 4K gaming because of the downsized memory controller (the 4070 Ti is better than or at least on par with the 3090 in most 1440p benchmarks, if I recall correctly). 1440p @ 32" is the same PPI as 1080p @ 24", I think - so I guess it would depend on whether you want more clarity, or are happy with what you have now and just want a bigger screen/more desktop space. Most decent monitors should come with a new HDMI/DisplayPort cable (or both). Some of the Gigabyte monitors (and maybe other brands too) have a KVM switch built in. Stotty fucked around with this message at 10:41 on Apr 30, 2023 |
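For what it's worth, the PPI guess checks out, and it's easy to verify with basic geometry (diagonal pixel count divided by diagonal size in inches):

```python
# Pixels-per-inch comparison: 1440p @ 32" really does match 1080p @ 24".
import math

def ppi(w, h, diagonal_inches):
    return math.hypot(w, h) / diagonal_inches  # diagonal px / diagonal in

print(round(ppi(2560, 1440, 32), 1))  # 91.8
print(round(ppi(1920, 1080, 24), 1))  # 91.8 -- identical density
print(round(ppi(2560, 1440, 27), 1))  # 108.8 -- the "more clarity" route
```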
# ? Apr 30, 2023 10:38 |
|
Breetai posted:is there any reason I would want to aim higher/could be satisfied with 60Hz? I bought a new gaming PC with an RTX 3080 about a year ago, and up until this week was using it with an old Dell 60Hz 1920x1200 monitor. I finally got around to getting a new monitor this week: 1440p, 165Hz, G-Sync compatible. I ran everything with V-Sync on before because I hate tearing, and now I'm using G-Sync with a 165 FPS framerate cap. I'm not sure if it's down to the increased framerate, or the decreased latency, or both, but HOLY poo poo it feels like I upgraded my entire computer. Games, especially FPS, feel amazingly smooth and responsive now. The controls feel so tight and snappy. It seems obvious now, but I had no idea my old monitor was bottlenecking the performance (or at least perceived performance) like that. p.s. Thanks for the recommendation Dr. Video Games 0031, you certainly earned your username.
|
# ? Apr 30, 2023 11:05 |
|
if anything i would prioritize 1440p 144/165hz over 4k 60 because the difference between 1440p/4k isn't as immediately noticeable as the difference between 60/144hz. You will immediately notice how much smoother everything feels. 4070Ti might be getting to the point where 4k high refresh is viable performance wise although you'll still be paying a lot for that privilege.
|
# ? May 1, 2023 00:04 |
|
Breetai posted:So I've recently purchased the following computer, awaiting delivery: 1. 1440p is far better value, high framerate 4K is still very expensive compared to high framerate 1440p. you're correct that trying to run games at 1440p on a 4K monitor will just look bad. a 4070 Ti is also going to start to struggle with 4K games sooner, especially with bad ports that are VRAM hungry that we're already seeing.
2. there is no reason to stick to a 60Hz monitor these days. decent 1440p gaming monitors mostly jump right past 144Hz and start at 165Hz these days anyway.
3/4. advertised response times are pretty much all lies. it's important to get an IPS monitor, not VA or TN. OLED should only be considered if you're only using the monitor for gaming (not work or web browsing or anything) and it's still a top-end option that it doesn't sound like you'd be considering. you should also have a look at reviews here: https://www.rtings.com/monitor https://www.youtube.com/@Hardwareunboxed https://www.youtube.com/@monitorsunboxed their recommendations might be a bit off in terms of value due to not focusing on the australian market, but it's a good starting point, and you can compare actual response times (not just advertised specs, which can be misleading).
5. yeah those are the sizes to look at for 1440p. there's the obvious size/pixel density tradeoff
6. high framerates require higher bandwidth than a ten year old HDMI cable can handle. you'll need a DisplayPort cable (or HDMI 2.1 for high framerate 4K), but monitors tend to come with the relevant cables.
7. good monitors tend to have multiple inputs these days so you shouldn't even need any sort of switch like that
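A back-of-envelope calculation shows why the old cable won't cut it for point 6. The 1.25x blanking overhead and 24 bits/pixel here are rough simplifying assumptions, not exact CVT timing figures, but they're close enough to show which links are viable:

```python
# Rough uncompressed video bandwidth estimate: an HDMI 1.4-era cable
# tops out around 10.2 Gbps, which 1440p at 165Hz blows well past.

def gbps(w, h, hz, bits_per_pixel=24, blanking=1.25):
    return w * h * hz * bits_per_pixel * blanking / 1e9

print(round(gbps(1920, 1080, 60), 1))   # 3.7  -> fine on an old cable
print(round(gbps(2560, 1440, 165), 1))  # 18.2 -> needs DP or HDMI 2.0+
print(round(gbps(3840, 2160, 144), 1))  # 35.8 -> HDMI 2.1 / DP with DSC
```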
|
# ? May 1, 2023 00:30 |
|
Coming hat in hand to the monitor thread for technical help: I have 2 monitors, and whenever my computer wakes up, even from the monitors just turning off (so the system itself isn't asleep), everything flickers rapidly and then all of my windows get shunted to my main monitor and I have to distribute them again. Any clue on why this is happening and how to stop it?
|
# ? May 1, 2023 19:31 |
|
change my name posted:Coming hat in hand to the monitor thread for technical help: I have 2 monitors, and whenever my computer wakes up, even from the monitors just turning off (so the system itself isn't asleep), everything flickers rapidly and then all of my windows get shunted to my main monitor and I have to distribute them again. Any clue on why this is happening and how to stop it? It's happening because windows is losing sight of at least one of your monitors and thinks it's no longer connected. Not all monitors will behave this way, and I suppose it's down to how the sleep mode is implemented. Windows 11 works around this issue by remembering the position of your windows and restoring them when waking your monitors from sleep. The flickering and poo poo may still happen, but everything should be in the right place at least once it stops. Microsoft didn't see fit to backport this feature to windows 10 despite how easy it would likely be, so you're out of luck if you don't want to upgrade.
|
# ? May 1, 2023 20:04 |
|
Dr. Video Games 0031 posted:It's happening because windows is losing sight of at least one of your monitors and thinks it's no longer connected. Not all monitors will behave this way, and I suppose it's down to how the sleep mode is implemented. Windows 11 works around this issue by remembering the position of your windows and restoring them when waking your monitors from sleep. The flickering and poo poo may still happen, but everything should be in the right place at least once it stops. Microsoft didn't see fit to backport this feature to windows 10 despite how easy it would likely be, so you're out of luck if you don't want to upgrade. I guess I'll upgrade then, I've been putting it off. But this only started recently so I didn't really have incentive to (when I started hooking my PC up to my new TV for movie nights)
|
# ? May 1, 2023 20:09 |
|
change my name posted:I guess I'll upgrade then, I've been putting it off. But this only started recently so I didn't really have incentive to (when I started hooking my PC up to my new TV for movie nights) Not to dissuade you from upgrading, given the security benefits of W11, but if you really don't want to / aren't ready to yet, there are 3rd party programs like DisplayFusion that you can set to do basically the same thing: if your PC/monitor sleeps at, say, 60 minutes idle, have the app save window locations at 55 minutes and then have a restore-position trigger on wake. You'll generally still see windows shuffle around, but at least it'll be taken care of for you.
|
# ? May 1, 2023 20:42 |
|
It’s the tv that sets off the flickering for me. Win 11 doesn’t eliminate that flickering, but I’m pretty sure it resolves it quicker than 10 did.
|
# ? May 1, 2023 20:44 |
|
KingEup posted:NVIDIA cards support integer scaling so if you bought a 4k display you can run games at 720p, 1080p, 2160p (whatever gives you frame-rates you’re happy with) with no blur. Cheers for that, at this stage because I'll be keeping the 1080p screen as a second monitor, I'm leaning towards going 1440p and anything old enough not to support 1440p I can just switch up which is my primary display to run at 1080 on my current one so I avoid the blur from scaling. Stotty posted:I bought a similar spec PC to you not that long ago. The articles I read pointed to the 4070ti not really being a good pick for 4K gaming because of the downsized memory controller (the 4070ti is better or at least on par with 3090 in most 1440p benchmarks if i recall correctly). Thanks. At this stage it may work out to a middle ground and a 27 inch screen, although I'll have to see. The Gigabyte M27Q X looks like a great monitor, only it may not be readily available in Australia. M32Q has slightly worse response times/contrast, but might be easier to actually get a hold of. lih posted:1. 1440p is far better value, high framerate 4K is still very expensive compared to high framerate 1440p. you're correct that trying to run games at 1440p on a 4K monitor will just look bad. a 4070 Ti is also going to start to struggle with 4K games sooner, especially with bad ports that are VRAM hungry that we're already seeing. That's awesome, thanks. rtings.com is an amazing resource and the massive searchable table they have is utterly brilliant for winnowing down exactly what I need. Cheers for the heads-up on IPS as well.
|
# ? May 2, 2023 08:34 |
|
lih posted:OLED should only be considered if you're only using the monitor for gaming (not work or web browsing or anything) Why is this? Burn-in?
|
# ? May 3, 2023 14:53 |
|
chippy posted:Why is this? Burn-in? That's partially it, but also they just get super dim on all-white screens. The Xeneon Flex I was testing out maxed out at like 135 nits on a 100% white screen, and the autodimming bouncing all over the place annoyed the poo poo out of me
|
# ? May 3, 2023 15:26 |
|
Some OLED monitors are better for brightness. Asus' recent monitors can do around 180 nits on a 100% white screen, and the QD-OLED ultrawides can do 240. I think mixing in some light web browsing alongside gaming and media consumption is fine, but I wouldn't use an OLED display for heavy-duty work/browsing.
|
# ? May 3, 2023 16:53 |
|
Super dim is subjective, my OLED is plenty bright for me, I never have it near full brightness.
|
# ? May 3, 2023 18:20 |
|
I run the AW34 at like 75% and it’s great.
|
# ? May 3, 2023 18:27 |
|
What’s the deal with OLED being damaged by direct sunlight? I have a small room and need to decide the best wall to put the TV on. My options are to put the TV against the window, meaning it will get direct sunlight on the back of the TV (if I forget to close the curtains) and the unit might get hot, or to put it on a side wall where it won’t get as hot but the screen might occasionally get hit by direct sunlight (again, if I forget to close the curtains).
|
# ? May 4, 2023 00:23 |
|
Hey there. Got an Alienware AW3423DW open box at Microcenter. Under a thousand for a QD-OLED display, heck yeah. Only problem I'm running into is that the shape of the monitor doesn't work well with my cheap amazon webcam. Is there a better alternative out there that will stay put? Also, is setting the monitor to 144Hz for 10-bit color worth it at all? Seems glorious even in 8-bit. Should I be setting it to HDR 400 or 1000 for best HDR results?
|
# ? May 4, 2023 19:34 |
|
1) I have a logitech and it stays in place fine 2) not that anyone can discern 3) opinions vary, I've been happy with 1000
|
# ? May 4, 2023 19:38 |
|
misterevilcat posted:the shape of the monitor doesn't work well with my cheap amazon webcam. Is there a better alternative out there that will stay put? Suck it up and get some Dual Lock: https://www.amazon.com/Dual-Reclosable-Fastener-SJ3560-Clear/dp/B0141MQRGI That will let you stick pretty much anything to anything else and have it be removable/adjustable. It's a little pricey but the adhesive is VHB which is some structural-grade poo poo, yet removes without residue. I used it in my office to stick webcams to the top/bottom of various TVs to make Zoom setups and they held on for years. Zero VGS fucked around with this message at 21:54 on May 4, 2023 |
# ? May 4, 2023 21:51 |