wolrah
May 8, 2006
what?
I have a preorder in on a new GTX970 and unexpectedly sold an expensive piece of equipment recently so I'm thinking about blowing the extra cash on upgrading off my pair of old Dell 20" 1680x1050s (2005FPW primary, E207WFP secondary).

Since >60Hz 1440p non-Gsync doesn't seem to be a thing at the moment, is anyone running the Asus PB287Q (28" 4K 60Hz) next to the VG248QE (24" 1080p 144Hz)? I was originally assuming there was a 1440p equivalent to the VG248QE (high refresh rate but no G-Sync because I still want DVI or HDMI to support game consoles) but since there doesn't seem to be and this would only cost a little bit more than the PG278Q (27" 1440p 144Hz G-Sync) I'm considering it. I'd have the 1080p unit in the center for ideal gaming and put the 4K off to the side to more than quadruple my secondary space.

Can 144Hz mode even work on DVI? nVidia unfortunately allowed their OEMs to keep using the crappy 2xDVI + HDMI + DP port layout they've had for a while on the GTX 970s instead of the nice 3xDP + HDMI + DVI layout from the 980s, so I'm stuck with only one DisplayPort output until at some point in the future I go SLI. I don't think the 4K monitor has HDMI 2.0 support so that'll have to use the DisplayPort connection, leaving the fast monitor with the old interface.

Is running two monitors with such vastly different physical resolutions next to each other going to be disorienting? I'm running Windows 7 primarily at the moment so my OS-level scaling is pretty much crap there, and while I haven't looked into it on Linux I'm going to assume my Ubuntu install is just as bad. I know 8 handles high DPI better, does it properly handle different DPIs on different monitors?

edit: Just thought of another possible plan...

VG248QE in the center still, put the Dell 20 inchers on each side, and get a Seiki 4K to mount on my wall above my main screens to use for IRC, notifications, and the like where nothing's getting moved around often so the 30Hz refresh doesn't really matter.

wolrah fucked around with this message at 04:49 on Oct 4, 2014


wolrah
May 8, 2006
what?
If my GTX550 will do 144Hz, your R9 270X had drat well better. Whether it'll deliver over 60 FPS in a modern game is another matter entirely of course, but it is pretty noticeable even at the desktop. Never has my mouse moved smoother.

That said, I cannot agree more on it being a bad idea if color accuracy has financial value to you. My old 2005FPW is significantly better, and a modern panel built for quality is likely another world. Just changing my seating position by a few inches, slouching versus not, etc. causes major color shifts on both my TN panels. The Asus is actually worse than my old Dell E-series.

wolrah
May 8, 2006
what?

GreatGreen posted:

Wasn't there some kind of tech a while back that was supposed to be able to split backlights into a grid, each section of which could be individually toggled on and off or independently dimmed?

Yeah, lots of LED-backlit TVs do this, some even with colored LEDs. It's called local dimming. Not sure if any computer monitors do it though. Obviously this requires true backlighting rather than the edge-lit design used in laptops, so the thinnest of displays are automatically out of the picture.
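The idea can be sketched in a few lines. The max-per-zone rule below is a simplified stand-in for whatever real TVs actually do (they use fancier spatial/temporal filtering), and all names here are mine:

```python
# Toy local-dimming pass: split the backlight into a grid of zones and
# drive each zone from the brightest pixel it covers.

def zone_levels(frame, zones_x, zones_y):
    h, w = len(frame), len(frame[0])
    zh, zw = h // zones_y, w // zones_x
    levels = []
    for zy in range(zones_y):
        row = []
        for zx in range(zones_x):
            block = [frame[y][x]
                     for y in range(zy * zh, (zy + 1) * zh)
                     for x in range(zx * zw, (zx + 1) * zw)]
            row.append(max(block))  # zone backlight = brightest covered pixel
        levels.append(row)
    return levels

# 4x4 test frame: bright spot top-left, dark everywhere else
frame = [[255 if (x < 2 and y < 2) else 0 for x in range(4)] for y in range(4)]
print(zone_levels(frame, 2, 2))  # [[255, 0], [0, 0]]
```

Three of the four zones can shut their backlight off entirely, which is where the contrast improvement comes from.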

wolrah
May 8, 2006
what?

smushroomed posted:

Just found out my asus strix 970 only has one dp and one hdmi.

Last time I used hdmi on a monitor, it gave me this fuzzy blur that I couldn't get over until I used dvi instead

Does anyone else have a problem with hdmi signals being fuzzy/blurry with their monitors?

HDMI is basically single link DVI + audio on a different connector. They're so much the same thing that adapters between the two are just wires linking the appropriate pins on the plugs. Any difference between the two when operating within spec means something's wrong. Blur especially, there should be no reason you'd see blur in a digital video link. Errors tend to show up as "sparkles" in the displayed image.

e: ^^ yeah that makes sense for the blur, though I haven't seen a PC video card defaulting to overscan in a long time. Monitors built with the assumption that the HDMI port would be used for TV-ish things though...

wolrah fucked around with this message at 03:44 on Oct 26, 2014

wolrah
May 8, 2006
what?

Zartans Lady Mask posted:

Can I ask about LCD arms in this thread?

I'm about to buy an arm for my LCD (specifically, this one), but I'm concerned about the strain put on my desk. The desk in question is this one, which has a solid pine top, albeit about 3/4 of an inch thick. I would be clamping from the side, rather than at the back, and then extending the arm over so my screen is relatively centered on the desk. The screen weighs about 11lb.

This seems like it would put an awfully large amount of stress on the area that the mount is attached, and the last thing I want is for the wood to break or something. Is this a totally retarded concern?

You're fine. I'm using this triple monitor mount with a 24" in the middle and two 20" on the sides clamped to a 1" thick fiberboard desk and it doesn't even show stress. Solid wood will carry a lot more.

edit: If you're really concerned it looks like you should also be able to fit a 1x4, maybe even a 2x4 under the desk between the desk surface and the clamp to spread the load further. It's not necessary but could be cheap and simple peace of mind.

wolrah fucked around with this message at 17:34 on Nov 4, 2014

wolrah
May 8, 2006
what?

Rakthar posted:

What if you want to play games? Many have a fixed interface size which looks tiny at high resolutions.

And most of those games are sufficiently old that doubled 1080p still looks fine, so you at least have a solid option if they're made unusable by 4K. It's a trade off, plenty of developers both in games and elsewhere took shortcuts that mean their software doesn't handle high DPIs well.

Personally I see a high DPI as sufficiently beneficial to be worth putting effort into finding workarounds for broken apps and/or replacing them with something that does UI the right way. With games, at least as far as I've run into while messing with 4K via DSR, the problems seem to be mostly limited to menus and other UI that's not critical to gameplay, so it's just a minor inconvenience.

wolrah
May 8, 2006
what?
HDMI 2.0 is required for 4K@60Hz with full 4:4:4 color. nVidia and a few TV manufacturers support 4K@60 over HDMI 1.4 using 4:2:0 color, but this mode is not without compromises. Basically the color resolution is half the brightness resolution. For videos it's hard to tell, and games are mostly the same, but text and GUI-type stuff is not great.

It's a fine hack for TV use, but not for something being used as a primary desktop display.
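A rough back-of-envelope shows why the 4:2:0 hack fits where 4:4:4 doesn't. The 4400x2250 total timing is the standard CTA-861 figure for 4K60; treat the rest as approximate:

```python
# 4K60 at the CTA-861 total timing of 4400x2250 needs a 594 MHz TMDS
# clock at 4:4:4, well past HDMI 1.4's 340 MHz ceiling. 4:2:0 averages
# 12 bits per pixel instead of 24, so the link can run at half the clock.

H_TOTAL, V_TOTAL, HZ = 4400, 2250, 60
pixel_clock = H_TOTAL * V_TOTAL * HZ / 1e6   # MHz

HDMI_1_4_MAX = 340.0   # max TMDS clock, MHz
HDMI_2_0_MAX = 600.0

print(f"4:4:4 needs {pixel_clock:.0f} MHz")      # 594 MHz -> HDMI 2.0 only
print(f"4:2:0 needs {pixel_clock / 2:.0f} MHz")  # 297 MHz -> fits HDMI 1.4
```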

As far as the consoles go you're only getting a 1080p output from them, I think everything does 1080p60 happily.

Only a few brand new TVs and the latest (nVidia 9 series) graphics cards support HDMI 2.0. The 4:2:0 thing apparently has a bit more support on the market, but it's still not much.

wolrah
May 8, 2006
what?

App13 posted:

Oh drat, those are almost as much as my monitor.

Yeah they're expensive, but think of a good monitor mount more like a good power supply, case, or the monitor itself. If you choose well it'll last you a long time. It's not the sort of thing you replace every few years.

edit: The more movement it supports the more you want to make sure you get a good one. A simple wall mount or desktop stand with a few degrees of motion on 1 or 2 axes? Pretty much anything works fine. Arm with multiple flex points intended to support a large monitor? Go big or go home.

wolrah fucked around with this message at 02:26 on Dec 21, 2014

wolrah
May 8, 2006
what?

GenericGirlName posted:

Tell me NOT to buy another Dell E248WFP and use my VGA-HDMI convertor thing to make it work in the HDMI port. Please be the voice of reason in my life.

Don't do this. Ignore the VGA port, they really shouldn't exist on a computer built since LCDs became standard.

HDMI and DVI are practically the same thing. Monoprice or Amazon will have plenty of adapter dongles or cables for so little you might pay more for shipping than for the actual device, and you get to keep a pure digital signal.

As for the monitor itself, it is an older unit and Dell's E series are one of their lower end lines, but if you like it and can get one for really cheap there is something nice about having your side-by-side monitors be identical.

Quaggan Sympathist posted:

This may be the wrong place to ask this, I'm not sure. But I'm at my wits' end here and I thought you guys might have a suggestion: I need a designated display for a Playstation 2. They output 4:3 natively, and run any other aspect ratio like total rear end. So I need something 4:3 that takes RCA or HDMI (there are some pretty reliable PS2->HDMI converters)... and as far as I can tell such a thing is a drat unicorn. Any ideas?

4:3 displays aren't produced for the mass market anymore, so if you can find a new one it'll probably be a lot more expensive than it should be. I say pick up a cheap used PC monitor if you just can't deal with pillarboxing, otherwise use any modern monitor and ignore the black bars.

A TV-sized 4:3 flat panel will probably be hard to find used as well, so you might be stuck with a CRT cannonball or rear-projection.

wolrah fucked around with this message at 23:48 on Dec 26, 2014

wolrah
May 8, 2006
what?

Knifegrab posted:



This problem does not go away, unless I reboot my computer, at which point my display functions normally.

Does anyone know what is happening here, and if I can fix this problem without having to constantly reboot my computer?

Are you using DVI and clocking the monitor over 60Hz?

That looks a lot like what happened when I didn't realize my existing DVI cable was single link and tried to run the monitor at 144Hz on it before my DisplayPort cable arrived. Maybe something's not waking up properly and forgetting to use dual link.
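For the curious, the math behind why single link can't do it: single-link TMDS is capped at a 165 MHz pixel clock, and 1080p144 blows well past that. The blanking figures below are assumed round numbers, not exact CVT timings:

```python
# Single-link DVI tops out at a 165 MHz pixel clock; dual link adds a
# second set of TMDS data pairs to go beyond that.

SINGLE_LINK_LIMIT_MHZ = 165.0

def pixel_clock_mhz(h_active, v_active, hz, h_blank=160, v_blank=45):
    # total pixels per frame (active + blanking) times frames per second
    return (h_active + h_blank) * (v_active + v_blank) * hz / 1e6

for hz in (60, 144):
    clk = pixel_clock_mhz(1920, 1080, hz)
    verdict = "fits single link" if clk <= SINGLE_LINK_LIMIT_MHZ else "needs dual link"
    print(f"1080p@{hz}: ~{clk:.0f} MHz -> {verdict}")
# 1080p@60 lands around 140 MHz (fits); 1080p@144 needs roughly 337 MHz.
```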

priznat posted:

I also have a 2005fpw sitting around ... Dell IPS monitors 4 lyfe.

My 2005FPW is the first non-laptop LCD I ever had and it's still running great once warmed up. Still does better colors than any of my other monitors too, but the rest are all TN so that's no surprise. Capacitors are hosed though so it does all kinds of fun things when it's booting up from cold. I have a replacement kit, just need to get around to it.

wolrah
May 8, 2006
what?

Incredulous Dylan posted:

I will say that as much as I love this new monitor, going from 120hz back to 60hz is a very noticeable difference. I used to play a ton of twitch shooters and still play TF2. I simply cannot play my old play style effectively at only 60hz - it just isn't smooth enough for very high sensitivity movements to be able to register what is going on and flick off shots. I've been playing less and less shooters over the years and am just moving over into strategy games/DOTA style stuff so it isn't a huge deal but absolutely be aware of that.

Even at the desktop it's a big deal. Both of my housemates bought the VG248QEs a few months before I did and while I could certainly see a difference in games I'm not really a twitch gamer so I didn't really care. I then used one of their machines for a few minutes to look something up and noticed how much smoother even moving things around on the desktop was. I work from home and thus spend 10-12 hours a day in front of my computer, so I went and bought one the next day. I want to get two more for my other monitor spots, but I'm waiting on the IPS 144Hz models to hit the market so I can decide if I want one of those for my center position.

Mousing over to the side monitors it's noticeable even watching the pointer move around. I used to run them overclocked to 72Hz to at least kind of match up 2:1 but custom clock rates and DSR apparently don't mix.

wolrah fucked around with this message at 01:13 on Jan 16, 2015

wolrah
May 8, 2006
what?
I'm pretty sure that the 760 is new enough that you can use any combination of ports you have and it'll work fine. Some of the older cards had four ports but could only do 2-3 monitors. Either way if you only have two monitors it doesn't matter and anything with the right ports should do the job.

Beware that nVidia's power saving sometimes has issues with mismatched monitors of any kind. I have to run nVidia Inspector's Multi-Display Power Saver to get my 970 to reliably clock below 900MHz at the desktop, and the tradeoff is occasional glitches on the DisplayPort connection when it changes clock modes.

You don't actually have to use DisplayPort, 144Hz 1080p works fine over the included dual-link DVI cable. I'm using it because both of my DVI ports are taken up by my other monitors.

wolrah
May 8, 2006
what?

Kragger99 posted:

I might be wrong, but that monitor has G-Sync, and doesn't G-Sync only work over display port?
I have the same monitor, and was going to switch to using display port instead of DVI for that reason (if it's correct).

You are correct that G-Sync requires DisplayPort, but the VG248QE is only G-Sync Ready, not actually equipped with it. It's a $200 part, you have to open the monitor up (not hard, but more than many want to do), and you lose all your other inputs as well as the OSD. Really hard to justify IMO.

grimcreaper posted:

Guess i should have done my own research instead of listening to a friend. Ah well, this will be my first time ever using a displayport cable. Got it for 2 bucks after some credit on newegg with free premier shipping so not a biggy.

DisplayPort is still a technically better interface; as a geek I'd be using it with this even if I didn't have to, so since you got the cable cheap it's all good.

wolrah
May 8, 2006
what?
Is anyone aware of a way to disable input searching on the VG248QE? I have a spare Chromecast that I want to leave on the HDMI input, but any time my PC tries to put the monitor to sleep it switches to the HDMI input. There is no way to put the Chromecast to sleep. I've searched the menus and nothing, just hoping maybe I've missed something or there's a hidden option or something.

wolrah
May 8, 2006
what?

Sidesaddle Cavalry posted:

Be careful if you decide on a Asus VG248QE, especially if you try the gray market. Those things originally only had Gsync onboard as a retrofit, so you need to make sure that Gsync was already installed in the first place.

That's still the case, they do not ship with the GSync card (which is why they're cheaper than the monitors with it built in) and I have not seen any stores actually stocking the things anymore.

I'd recommend against this particular unit if you want GSync because of that, plus it loses features when you install the card (all non-Displayport inputs disabled, crosshair/timer overlay disabled). That said it's always the cheapest 144Hz monitor I can find, so if you don't care about GSync it's great.

wolrah
May 8, 2006
what?

Chalets the Baka posted:

When are we going to skip the whole monitor middle-man business and stream images directly into our brains via electrodes?

DARPA's working on it.

http://www.theregister.co.uk/2015/02/17/darpas_google_glass_will_plug_straight_into_your_brain/

edit: If this ever happens, bets on how long it takes before someone tries to mess with the interface (overclocking it perhaps) and zaps their brain?

wolrah fucked around with this message at 21:31 on Feb 17, 2015

wolrah
May 8, 2006
what?

zer0spunk posted:

Who are these magical folks that hate 3:2 pulldown?

Are a lot of you watching content side by side on 60hz and 120/240 panels simultaneously? Am I just smoothing over motion artifacts with my brain? I know they exist based on understanding motion and the need for duplication of certain frames, but I'm just not susceptible to it at all.

On a sideways panning shot or anything with what's supposed to be consistent smooth motion it's really obvious. How annoying it is depends entirely on the film, I usually don't notice it unless I'm looking for it but if I do look for it I can usually find it pretty easily.

edit: Yeah larger screens do make a difference as well, I never noticed it on a PC monitor but can easily see it on a 52" LCD and it was clear as day when I didn't have 24 Hz working right on my 110" projector.

wolrah fucked around with this message at 21:25 on Feb 20, 2015

wolrah
May 8, 2006
what?

Tanbo posted:

Yeah, just remember DVI doesn't carry audio so if you're using DVI, and you didn't specifically connect the audio port in the back so you can listen to the terrible built in speakers, they won't function.

It depends; because of the compatibility in both directions it wouldn't surprise me one bit to find that some devices can take HDMI-format audio on their DVI port. Pretty much every HDMI-capable GPU can do the other way around, so it might even be possible to have an HDMI signal flowing entirely over DVI ports and cables.

wolrah
May 8, 2006
what?
How's Asus' RMA turnaround? My VG248QE's DisplayPort input entirely failed and the DVI port seems like it might have problems too, so I opened a ticket and got a RMA number. I'm just curious how long I'll be without my 144Hz goodness.

wolrah
May 8, 2006
what?
Minor warning that nVidia's drivers disable GPU accelerated PhysX if they detect any non-nVidia GPUs in the system. Obviously not the biggest loss in the world but worth noting.

wolrah
May 8, 2006
what?

Subjunctive posted:

That can't be true for IGPs, or nobody would ever have it enabled.

I can't speak for older ones off the top of my head because it's been so long, but both the IGPs I have available to me at the moment disable themselves by default when a standalone GPU is installed. i7-4790K and A10-7850K, and neither system shows any GPUs aside from the standalone cards.

wolrah
May 8, 2006
what?

PC LOAD LETTER posted:

Holy loving poo poo $50K but its badass.

I was going to make fun of their idea of using it as an always-on digital art display, but then I realized that the kind of person who can drop $50k on a projector doesn't care in the slightest about a few hundred bucks for a bulb every few months or the extra 500 watts of power draw.

edit: Ok, apparently its light source is a laser diode rated at 20k hours, or over two years of constant use. Can't find anything about replacement though.

wolrah fucked around with this message at 04:32 on May 17, 2015

wolrah
May 8, 2006
what?

Daviclond posted:

Windows is really lovely at handling this stuff and the only workaround was to cover up the HDMI pin (I forgot the exact pin number) that transmitted the "power off" signal to the PC using a small piece of electrical tape. This meant the PC did not know when the monitor was switched off.

Why are you blaming Windows here? The monitor is removing the "I'm here" signal, so Windows responds appropriately to the fact that the monitor no longer appears to be there. Would you rather it ignore the signal altogether and any time you plug/unplug a monitor you have to manually enable/disable it?

Complain to the monitor manufacturer for doing it wrong, don't blame the software that is seeing a monitor no longer there and reacting reasonably to it.

wolrah
May 8, 2006
what?

DrDork posted:

The problem is that it would be laughably easy for Microsoft to simply add an option to, you know, not do that and correct the issue that's caused by cheap monitors reporting as HDTVs and not proper monitors (which I have to imagine is due to the parts being cheaper). But despite the issue persisting for years and plenty of requests, they haven't bothered.

Arguably you can make the same complaint about AMD and NVidia, since it'd also be addressable via graphics drivers.

There are other OSes that give you options for it. Hell, even older versions of Windows behaved better.

Having dealt with many of these kinds of situations over the years at this point I'm firmly in the camp of not coddling lovely vendors who defy the standards. Otherwise you end up with issues where the vendor says "works for me" because they're using something that tolerates their bullshit even though they're doing it wrong. See most of the XP->Vista problems for example, XP tolerated lots of things programs should never have been doing in the first place and then those assholes blamed Vista for it breaking when it was their own fault.

A lot of bugs and odd behavior can be traced back to workarounds for something else doing something wrong. Workarounds also eliminate a lot of the incentive for the vendor to actually fix their poo poo in the future, they just toss the workaround in their FAQs and call it a day.

If I had my way Raymond Chen would have a lot less to blog about...

wolrah
May 8, 2006
what?

DrDork posted:

And let's be clear--the monitors in this case are not doing anything "out of standards." They are acting like a HDTV--a completely valid display option--and are acting according to those standards.
...
The standards require a disconnect signal be sent when the TV turns off to notify the host of its status;

I've been reading the specs and have not seen anything to support your claim that HDTVs are supposed to handle hot-plug any differently and especially nothing that says hot-plug should be used to indicate power status.

DVI 1.0 Standard posted:

2.2.9.1 System Hot Plugging Requirements
When the host detects a transition above +2.0 volts or below +0.8 volts the graphics subsystem must generate a system level event (OS dependent) to inform the Operating System of the event. Additionally, if the DVI compliant monitor is a digital monitor, when "Monitor Removal" is detected the graphics subsystem must disable the T.M.D.S. transmitter within 1 second.

HDMI 1.3 Standard posted:

8.4.4 Enhanced DDC Sink
The Sink shall be Enhanced DDC read compliant.
The Sink shall be capable of responding with EDID 1.3 data and up to 255 extension blocks, each
128 bytes long (up to 32K bytes total E-EDID memory) whenever the Hot Plug Detect signal is
asserted.
The Sink should be capable of providing E-EDID information over the Enhanced DDC channel
whenever the +5V Power signal is provided. This should be available within 20msec after the +5V
Power signal is provided.

<snip>

8.5 Hot Plug Detect Signal
An HDMI Sink shall not assert high voltage level on its Hot Plug Detect pin when the E-EDID is
not available for reading. This requirement shall be fulfilled at all times, even if the Sink is
powered-off or in standby. The Hot Plug Detect pin may be asserted only when the +5V Power
line from the Source is detected.

DisplayPort 1.1 Standard posted:

3.3 Hot Plug / Unplug Detect Circuitry
The HPD signal is asserted by the DisplayPort Sink whenever the Sink is connected to either its main power
supply or “trickle” power.

DVI and HDMI both leave enough opening that it's not technically against the standard to turn off the hot plug signal when the sink is powered off, but they make it clear that the intent is to indicate attached/removed and that the source should act accordingly. HDMI says that EDID "should" be available any time a source is attached and implies that HPD should be asserted whenever EDID is available.

DisplayPort on the other hand is very clear, a sink that stops asserting HPD when the device is still receiving power is doing it wrong.

In all cases the HPD "monitor present" signal is the same level as the EDID standby power provided by the source, so a simple connection between the two pins internal to the sink is all that's necessary. To make it turn on and off with the display actually took more effort than not, so your claim of requiring extra hardware to do it right is incorrect.
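To make the two behaviors concrete, here's a toy model of a compliant versus a misbehaving sink. The class names and structure are mine; only the two HPD rules come from the quoted specs:

```python
# Toy model of hot-plug detect: a compliant sink asserts HPD whenever it
# has power (even in standby); a misbehaving one ties HPD to the panel's
# power button, so the source thinks the monitor was unplugged.

class Sink:
    def __init__(self, compliant):
        self.compliant = compliant
        self.powered = True       # mains or trickle power present
        self.display_on = True    # panel actually lit

    def hpd(self):
        if self.compliant:
            # Spec behavior: asserted any time the sink is powered,
            # even when the display itself is off or in standby.
            return self.powered
        # Wrong: HPD follows the panel state instead of power presence.
        return self.powered and self.display_on

good, bad = Sink(compliant=True), Sink(compliant=False)
for s in (good, bad):
    s.display_on = False          # user turns the monitor "off" (standby)

print(good.hpd())  # True  -> source keeps the monitor in its layout
print(bad.hpd())   # False -> source reacts as if it was unplugged
```

This is exactly why the desktop gets rearranged with the bad monitors and not the good ones.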

wolrah fucked around with this message at 00:35 on May 18, 2015

wolrah
May 8, 2006
what?

DrDork posted:

Windows decides that HDTVs should be handled differently than monitors are, despite there being no requirement or real rationale to do so. Hence why sleeping a "monitor" works just fine and nothing gets dicked up, but sleeping the same display when reporting as a "HDTV" often causes Windows to freak out and randomize your windows, and then re-randomize them when everything wakes back up.

No, it doesn't handle them differently. The displays that cause this problem are turning off the "I'm here" signal, and Windows then sees the monitor as not there anymore, which is the correct response. If it's a secondary monitor, it rearranges things because you don't want applications or icons getting stuck off on a monitor that doesn't exist. If it's the sole monitor that was just disconnected, it defaults to a virtual 1024x768 display, as noted in the Microsoft document linked in the post that started this discussion, because Windows 7 has no concept of a headless system for obvious reasons.

The badly behaving displays are telling Windows they're no longer plugged in, Windows then reacts exactly as it would if you unplugged it. What type of display it is does not matter. The display should not be turning off the HPD signal when sleeping or soft-off.

wolrah
May 8, 2006
what?
If you have three identical resolution displays nVidia Surround can make them appear as one large display which will just work with any game that isn't idiotic about resolutions.

Unfortunately a lot of older games have hardcoded lists of resolutions they'll let you select so you may need to manually edit config files in some cases, and other games are just lovely about non-standard aspect ratios.

Very few games actually properly support multiple distinct monitors, usually this will be limited to simulators.

wolrah
May 8, 2006
what?

ryangs posted:

Are they any active adapters or interfaces between the computer and monitor that might resolve this? It sounds like a mere handshaking or EDID issue, or something along those lines.

Other options include unplugging my monitor every time I reboot (super ghetto), or getting a new monitor (not really what I want to do, as this monitor is otherwise exactly what I want and need).

If it is EDID related you might be able to get away with buying a short DVI extension cable ($5 or less online at places like Monoprice) and yanking the DDC pin out. I think the Apple monitors signal brightness changes over that line though so you might lose functionality if that's the case. No idea what other impacts it might have.

wolrah
May 8, 2006
what?

KS posted:

Just a heads up that these monitor arms are shockingly good for the price.

http://www.amazon.com/gp/product/B00ITG90D0

I've installed 5 now for work.

I use the three-way version of that one and it's really nice, but it's a tight fit to do a wraparound setup with a single 24 in the center and two 20s on the sides, I don't think you'd be able to do anything except straight in line with three 24s.

edit: Also, it's basically impossible to find a setting for the vertical angle hold screws that's flexible but still holds position, so you have to find your spot and crank 'em down. The side-by-side flexibility works perfectly though.

wolrah
May 8, 2006
what?

The Deadly Hume posted:

I guess it also makes it easier when shipping to countries with different electricity supplies, they can just throw in the correct adapter or transformer to suit etc.

Yeah, but it's not like input-flexible power supplies are exactly rocket science; most switching power supplies support all common household voltages. Manufacturers just change out the cord to the wall for inline bricks, or the plug segment on a wall-wart. Same thing with an internal supply: just change out the cord for different markets. Aside from audio equipment and probably some high-end radio gear, most power supplies these days are switching rather than linear.

wolrah
May 8, 2006
what?

DaveSauce posted:

Well that's where I'm confused...I thought an LED display was exactly that: an LCD that uses LEDs for the backlight. Is there more to it?

No, that's exactly it. That post was in response to you asking whether something was LED or LCD, as if they were separate. While there are actual LED displays they're usually in the category we'd call "Jumbotron" so for normal people LED is a subset of LCD indicating the backlight technology in use.

wolrah
May 8, 2006
what?

DaveSauce posted:

Right...it's just that the Amazon listing showed it as LCD instead of LED, when it is in fact an LED because it's LED-backlit. I'm over it!

That's because it is LCD. You seem to think that LCD = CCFL-backlit LCD, which is incorrect.

wolrah
May 8, 2006
what?

ufarn posted:

What's the main reason behind the VAST difference in crispness of watching Twitch on my (calibrated) U2414H versus the iPad app? Are Chrome's a big factor here, or are other things at work?

I'm just wondering why someone like Lirik's 1440p looks like blown-up 720 or worse on the 1080p monitor with Source.

Because it is blown up 720p? Twitch limits most users to 720p and while I'm not familiar with that particular streamer, checking his stream right now shows in the "Video Playback Stats" display "Video Resolution: 1280x720" so it doesn't look like he's one of the chosen ones that are allowed to use a higher resolution.

wolrah
May 8, 2006
what?

Subjunctive posted:

I accidentally set my screen back to 60Hz and spent a few days trying to figure out what was wrong with my GPUs because it felt broken even on the desktop.

I fired up Project CARS yesterday and for some reason it had decided after the patch that 1080p@50Hz was a great idea. Suddenly refreshing at almost 1/3 of the normal speed was like driving in to a school zone on a divided highway.

wolrah
May 8, 2006
what?

EoRaptor posted:

This is just how displayport behaves. With other connectors, even if the monitor is turned off, there is a passive connection between two pins that says 'something is plugged in here'. Not so for displayport.

Not true, DisplayPort is actually more strict about this than DVI and HDMI. I looked up the specs when a similar claim was made a few months ago: http://forums.somethingawful.com/showthread.php?action=showpost&postid=445449225

If your monitor is "disappearing" when powered off it's doing DisplayPort wrong.

wolrah
May 8, 2006
what?

Zorilla posted:

I just picked up a cheap ASUS PB278Q on Craigslist and, yep, I can add this one to the long list of monitors that do DisplayPort wrong.

That's really odd, since Asus gets it right in the VG248QE.

wolrah
May 8, 2006
what?

Wasabi the J posted:

Maybe I'm not correct but I thought triple buffering meant rendering ahead 3 frames but I think I was misinformed.

Zorilla posted:

I don't see how anything else is possible unless the system is capable of rendering frames at three times the refresh rate. If it really does work the other way, that means it's discarding two out of every three frames under ideal conditions.

Triple buffering adds a variable latency up to just short of one refresh cycle. How many frames behind that puts you depends on how your framerate and refresh rate compare.


Single buffering has the GPU drawing directly to the framebuffer that's being sent to the display. This can cause flickering and tearing when the order of things being done by the GPU doesn't line up with the scan to the screen. Old consoles did this.

Double buffering has the GPU rendering to a "back buffer" while the display is fed from the "front buffer". The buffers are flipped when the GPU is done with the frame. This prevents flicker, but still tears when the buffer is flipped mid-scan. Playing modern computer games or consoles this is generally the default "no vsync" mode.

Both of these add no significant latency, the first is drawing to the screen before the frame is even done and the second starts immediately when it is ready.

Triple buffering adds a second back buffer. The GPU alternates which one it's rendering to frame by frame and the inactive one gets flipped to the front when the display is ready. This eliminates tearing by preventing a front buffer flip in the middle of a scan, but if it happens to complete a frame *just* after the display hit another refresh that frame will be delayed for a full cycle or possibly discarded depending on the actual framerate being rendered. You get the graphical benefits of vsync with less of the latency/jitter because the game engine can keep running at whatever framerate it wants rather than caring about syncing up with the display.

Anandtech has a good explanation here: http://www.anandtech.com/show/2794/2
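The "variable latency up to just short of one refresh cycle" claim is easy to check with a little simulation (numbers chosen so the GPU outpaces the display; if it renders slower than the refresh rate the worst case grows accordingly):

```python
# Simulate triple buffering: at each vsync the display scans out the
# newest frame the GPU has finished, and latency is how stale that
# frame is at scan-out time.

REFRESH_HZ = 60.0    # display refresh rate
RENDER_FPS = 90.0    # GPU render rate (faster than the display here)

refresh_period = 1.0 / REFRESH_HZ
frame_done = [n / RENDER_FPS for n in range(1, 400)]  # frame completion times

latencies = []
for k in range(1, 200):
    vsync = k * refresh_period
    completed = [t for t in frame_done if t <= vsync]
    latencies.append(vsync - completed[-1])  # age of the displayed frame

print(f"min: {min(latencies) * 1000:.2f} ms")
print(f"max: {max(latencies) * 1000:.2f} ms "
      f"(refresh period is {refresh_period * 1000:.2f} ms)")
```

With 90 FPS against 60 Hz the latency bounces between zero and a few milliseconds, always under one refresh period, which is the jitter being described above.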

wolrah
May 8, 2006
what?

DrDork posted:

That almost looks like your video card is detecting it as an HDTV or something and is trying to do some sort of blurring or overscan. Check the settings in the Catalyst/NVidia control panel to ensure it's being seen correctly as a digital monitor.

That looks like Windows' DPI scaling being derpy. If an app doesn't declare itself as compatible Windows just literally blows up the contents. (Mac OS X did the same, but properly scaled most standard system widgets so only custom widgets and bitmap images ended up fuzzy).

You can set a compatibility option in the shortcut (or a few other ways that are harder) to disable scaling for specific programs or you can just set it to 100% system-wide in the display control panel.

wolrah
May 8, 2006
what?

Etrips posted:

iirc, I remember people can "rub" out stuck pixels sometimes.

I've had success on three different older monitors (two 2005FPWs and some unknown 19" non-wide) by just opening something that flickers through the colors and putting it under the pixels in question. I'm sure there are programs specifically for this, but I've always just used rgb.swf (google it and mute your sound, don't use if you have epilepsy).


wolrah
May 8, 2006
what?

Captain Yossarian posted:

An HDMI to displayport adapter?

Theoretically, but they're pricey and hard to find, and might add a frame or two of lag, plus you'd then have to be switching cables around if you don't also spend money on a DisplayPort switch.

To get the level of convenience you'd have from a monitor with native inputs you'd end up spending an extra $150 or so it looks like between the adapter and switch.
