Indiana_Krom
Jun 18, 2007
Net Slacker
I can easily tell whenever my frame rate drops under 80. 100 FPS looks and feels better, but past 144 it's diminishing returns outside of very high-motion games. Doom 2016, for instance, looks and plays incredible at its 200 FPS engine cap if you have a display that can keep up. 60 Hz only looks moderately less smooth overall thanks to a really solid motion blur implementation, but the downside is that everything is blurred into complete obscurity. You can rapidly turn around in Doom at 200 FPS and still clearly see the imp that was about to attack you from behind, whereas at 60 Hz it's all a blur and it's much more challenging to tell the imp from the rocks behind it.

IMO spatial resolution is a problem that has been largely solved, with 1080p on 24", 1440p on 27", and 4K on larger displays. But temporal resolution remains a huge problem: 60 Hz is choppy and makes everything so blurry the resolution becomes irrelevant. I can read text while scrolling on a 240 Hz display fairly easily; on the exact same display at 60 Hz it is a completely unreadable smudge until it stops scrolling. This is an unavoidable problem with sample-and-hold displays like LCDs. The extra resolution of 4K is completely wasted most of the time you are actually using the display for anything that moves, because the only way to keep the motion "smooth" is to blur the image to the point that even 720p would get the job done. And even if you don't use motion blur, just being a sample-and-hold display will blur it to hell at 60 Hz.
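To put rough numbers on the scrolling example (the scroll speed below is just an illustrative assumption, not a measurement):

```python
# Rough sketch of why scrolling text smears on a sample-and-hold display:
# the content keeps moving between refreshes, but each frame is held static
# for the full refresh period, so your tracking eye sweeps across a frozen image.

def sample_and_hold_blur_px(scroll_speed_px_s: float, refresh_hz: float) -> float:
    """Approximate smear width in pixels while eye-tracking scrolling content."""
    frame_persistence_s = 1.0 / refresh_hz  # each frame is shown for the whole period
    return scroll_speed_px_s * frame_persistence_s

for hz in (60, 120, 240):
    blur = sample_and_hold_blur_px(scroll_speed_px_s=1200, refresh_hz=hz)
    print(f"{hz:>3} Hz -> ~{blur:.0f} px of smear at 1200 px/s scrolling")

# 60 Hz -> ~20 px, 120 Hz -> ~10 px, 240 Hz -> ~5 px.
# At 60 Hz the smear is roughly as wide as a line of body text is tall,
# which is why it's unreadable until it stops.
```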

One really good way to see the impact of higher refresh rates and strobing is to look at the chase camera comparisons on Blur Busters and their TestUFO site.

Edit: F%&# page snipe...

EdEddnEddy
Apr 5, 2012



On mobile this is sort of the case as well. Since phones shoot for 60 Hz, scrolling text on a 6" 1440p screen can be a blurry mess.

I wonder if the 120Hz display of the Razer Phone could make its way outside of Razer alone?

craig588
Nov 19, 2005

by Nyc_Tattoo

Indiana_Krom posted:

60 Hz is choppy and makes everything so blurry the resolution becomes irrelevant.

You just happened to say it, not calling you out at all, but I like that we've gotten to the point that now 60 FPS isn't good enough. Advancement of technology!

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
Having a high-framerate monitor definitely cursed me into spending more. 60FPS really does feel awful to me now.

Indiana_Krom
Jun 18, 2007
Net Slacker

craig588 posted:

You just happened to say it, not calling you out at all, but I like that we've gotten to the point that now 60 FPS isn't good enough. Advancement of technology!

It isn't actually anything new; I've always hated 60 Hz. There is a reason I kept using CRTs until my last one died from the horizontal deflection failing in like 2012: my 19" CRTs did 100 Hz refresh at the 1280x960 resolution I used the most at the time. I was pretty unhappy in the years in between, before 144 Hz + *sync became available in LCDs. Hell, back in the early 2000s the game I was playing competitively looked good enough at 640x480, and I played it there because the particular monitor I was on could refresh at 160 Hz at that resolution, and people wondered how I could react to stuff so fast...

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
There's an entire swath of games where you're not flicking your viewport 180 degrees in a split second, folks. Yes, temporal resolution is a thing and 120 fps benefits it greatly, but especially on large TVs, even at 30 fps it's ridiculous to say 4K is 'irrelevant'. It depends entirely on the game, and in most games you're not wildly changing your perspective; plus even at 60 fps or lower there are still elements like specular aliasing which a higher resolution benefits greatly - especially in motion, where it will still shimmer like crazy regardless of the frame rate. When you play on a 55"+ TV, 1080p can look pretty brutal.

Happy_Misanthrope fucked around with this message at 03:11 on Mar 27, 2019

Corb3t
Jun 7, 2003

I can't help but grimace every time I see scrolling text on a 60hz display.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Gay Retard posted:

I can't help but grimace every time I see scrolling text on a 60hz display.

this could be the thread title of a few threads I read

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

EdEddnEddy posted:

On mobile this is sort of the case as well. Since phones shoot for 60 Hz, scrolling text on a 6" 1440p screen can be a blurry mess.

I wonder if the 120Hz display of the Razer Phone could make its way outside of Razer alone?

one of the iPads had a high refresh screen. Don't remember which, maybe they all do now

Cygni
Nov 12, 2005

raring to post

you see takes on this forum like 60hz is so bad that resolution is irrelevant and linux dual boot gaming discussions, it’s incredible

this place is the corner case emporium and I love it

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
My AOC2460PG is a few years old now but I always marvel at the scrolling on ULMB 120hz mode. It's a shame I can't use gsync at the same time.

Does any 4KTV have a similar sort of function?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

orcane posted:

Oh no poor Nvidia had to make huge loving chips to replace their much smaller older chips and that means of course they HAVE TO charge $$$ even though the additional features are a complete gamble at this point, stupid ungrateful consumers.

I'm going to mail Jensen a few more leather jackets now, I feel so bad for him.

you should do the math on area-difference-per-average-core between Turing and Turing Minor to see the "wasted die space" between RTX and GTX

something like 5% I think? boy, I really overestimated it at like 15%.

NVIDIA squeezed out a little more gain by pushing FP16 through the tensor cores on the RTX cards; if they had done real dedicated FP16 units plus tensor units on top, it would have been even less.

this is the post-Moore's Law era for GPUs, a 20% gen-on-gen uarch improvement with no shrink is fantastic and you should absolutely mail Jensen another jacket or two, you're going to look back at this as a great era, until NVIDIA figures out chiplet GPUs and drops costs. Which is probably 3 years? until they can port NVSWITCH down to consumer cards?

lol if you think RTG can afford to do it

whatever NVIDIA shits out on 7nm, you should buy it asap, RTG is probably going to poo poo out another Hawaii rehash on 7nm

Paul MaudDib fucked around with this message at 10:48 on Mar 27, 2019

Arzachel
May 12, 2012

Zigmidge posted:

You sound like you're justifying what you paid for pixel counts.

2560x1440@144fps is more pixels than 4k@60 :wink:
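For the curious, that claim does hold up in terms of pixels pushed per second (simple arithmetic, nothing else assumed):

```python
# Pixels per second pushed by each display mode.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

qhd_144 = pixels_per_second(2560, 1440, 144)  # 530,841,600 px/s
uhd_60 = pixels_per_second(3840, 2160, 60)    # 497,664,000 px/s
print(qhd_144, uhd_60, qhd_144 > uhd_60)      # 1440p@144 edges out 4K@60
```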

TheFluff posted:

Of course it's all about pixel density, that's what makes it look sharp. I use 150% UI scaling on my display - it's not about fitting more stuff on the screen. In that NASA paper that K8.0 linked earlier they assert that spatial acuity (whatever that means) is typically 30-60 "pixels" per degree. I didn't see a source cited for it, but the higher end of that is about in the same ballpark as what Apple said was needed for full transparency in the retina displays (70 pixels per degree, I think). The further away you sit from the screen, the less pixel density you need.

The paper uses *cycles* per degree, the cycles being pairs of black and white lines. Resolution increases for the most part don't introduce drastically different features, so my interpretation of that graph would be that more complex/busy objects will appear less fluid at the same framerate, but that's not what K8.0 was claiming.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Gay Retard posted:

I can't help but grimace every time I see scrolling text on a 60hz display.

Oh yeah, I'm definitely in the camp that 100+ Hz is a massive difference, no doubt (even in just desktop usage it feels so much better), just that 60 Hz (or lower) doesn't negate 4K entirely by any stretch.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Zedsdeadbaby posted:

My AOC2460PG is a few years old now but I always marvel at the scrolling on ULMB 120hz mode. It's a shame I can't use gsync at the same time.

Does any 4KTV have a similar sort of function?

Got $5k laying around you don't need?

WithoutTheFezOn
Aug 28, 2005
Oh no

Fauxtool posted:

one of the iPads had a high refresh screen. Don't remember which, maybe they all do now

All the iPad Pro models except the first 9.7" one.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Arzachel posted:

The paper uses *cycles* per degree, the cycles being pairs of black and white lines. Resolution increases for the most part don't introduce drastically different features, so my interpretation of that graph would be that more complex/busy objects will appear less fluid at the same framerate, but that's not what K8.0 was claiming.

The part that's most relevant to what I was talking about is on page 6, where they reference a couple studies where pilots actually performed worse when presented with a more detailed image at the same framerate, because like I said the brain's spatial processing breaks down when it's able to perceive a lot of detail but not sufficient motion to accompany it.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Sounds like we need to get ourselves some better brains :v:

Speaking of the one post from a little bit earlier, whatever happened to that hopeful feature of G/FreeSync and ULMB at the same time? I seem to recall someone doing experiments to prove that variable backlight intensities can be synced to variable refresh rates, but didn't hear anything about commercial adoption of the tech after that....

ufarn
May 30, 2009
There's apparently also a proprietary ULMB by ASUS, ELMB.

EdEddnEddy
Apr 5, 2012



Yeah, didn't the start of that also sort of begin with their 3D Vision 2 displays, which pulsed the backlight even brighter to make up for the darkening that happens with the shutter glasses?

Indiana_Krom
Jun 18, 2007
Net Slacker

Sidesaddle Cavalry posted:

Sounds like we need to get ourselves some better brains :v:

Speaking of the one post from a little bit earlier, whatever happened to that hopeful feature of G/FreeSync and ULMB at the same time? I seem to recall someone doing experiments to prove that variable backlight intensities can be synced to variable refresh rates, but didn't hear anything about commercial adoption of the tech after that....

It wouldn't work, because you don't know when the next frame is going to be delivered, so you can't know how bright to make the current frame. Even if you used a fixed strobe interval, the apparent brightness of the screen would vary with the framerate: the more FPS, the brighter the screen would appear.
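A minimal sketch of why a fixed strobe and a variable refresh don't mix (the strobe length here is a made-up example value):

```python
# With one fixed-length backlight flash per frame, the fraction of time the
# backlight is lit (a rough proxy for perceived brightness) scales with the
# instantaneous framerate, so brightness would swing around as FPS changes.

STROBE_MS = 2.0  # hypothetical fixed strobe length per frame

def backlight_duty_cycle(fps: float, strobe_ms: float = STROBE_MS) -> float:
    """Fraction of each second the backlight is on at a given framerate."""
    return min(1.0, fps * strobe_ms / 1000.0)

for fps in (40, 60, 100, 144):
    print(f"{fps:>3} fps -> backlight lit {backlight_duty_cycle(fps):.0%} of the time")

# 40 fps -> 8%, 144 fps -> 29%: on a VRR display the screen would visibly
# pulse brighter and dimmer as the framerate swings.
```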

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
That's true for Gsync. You could however do it with Freesync, since Freesync determines the timing for the next frame ahead of time. Unfortunately that's probably quite a ways off.

VelociBacon
Dec 8, 2009

So I wanted to post about my experiences with the EVGA Hybrid kit for my EVGA 2080ti XC Ultra.

The kit comes in a box exactly like the GPU comes in. It took me around 2 hours from pulling the card to booting up the system with the kit installed. I do a fair bit of hobby work with electronics, worked for a few years in machining, work on my own cars, etc. I don't think it's that difficult but it's an expensive card and I took my time to not gently caress up and damage the PCB.

Technically the kit works as advertised. It replaces the open-fan triple-slot cooler with a blower + 120mm AIO. So it's two heat sinks + thermal pads that interface with the VRM/memory, and a 120mm AIO liquid cooler that mounts to the GPU. A fan that comes pre-assembled on one of the heat sinks provides flow over the heat sinks and out the back of the card. The AIO comes with thermal paste pre-applied, which I wish I'd known about before spending 30 bucks on more Kryonaut. Here are some images from their site:






Temps before would eventually hit 80C under auto fan speed (I think it went to 100% fan speed at 75C) and then, I assume, start thermal limiting with the way GPU Boost works. With the same auto fan control it now doesn't get higher than around 62C, at which point, according to Afterburner, the fans are running at 45%. It's almost completely quiet at that point. It idles around 32-36C with the fans off and just the pump running (can't hear it). I'll try turning up my clock, but I assume it won't really be any more stable with the cooler; it's a quality-of-life upgrade more than anything and will prevent it from thermal throttling.

There are a number of issues with the kit, which I noticed are well reported on the EVGA forums and elsewhere when I had to google things. I don't see any response from EVGA. Here's a list of the issues I ran into:

  • The kit often refers to screws by their metric size and comes with 4 or 5 bags of screws which all have the same label on the bag. I guess they're providing a lot of spares because the cooler can be fitted to a few different models, but it isn't labelled very well, or even accurately in some cases.
  • The instructions tell you to route cables a certain way from the AIO pump, with photos and everything, but then later on contradict their own instructions and show the cables routed another way. If you follow the initial instructions like I did, the cables don't reach where they need to connect and you have to remove the AIO pump after bolting it down, which obviously isn't ideal for the thermal paste.
  • They tell you that you don't have to remove the backplate to use these black washers that they want you to put between the AIO retaining screws and the PCB. They actually show the washers in an image underneath the backplate, but the washers are too large to get into position without removing it.
  • The cable for the blower fan was sticking up enough to touch the fan blades the way it came out of the box (it's pushed into a channel in the heat sink). I had to get a tiny screwdriver and press the cable down into the heat sink channel to get it clear of the fan.
  • Instructions tell you that you don't have to remove some bolts on the IO plate that absolutely need to be removed (on my card).

Let me know if anyone has questions. I had to pay out the rear end for the kit because :canada:, $170 USD for the kit, $40 USD for shipping, and then another $37 CAD in customs fees at the border.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord

Arzachel posted:

2560x1440@144fps is more pixels than 4k@60 :wink:

Haha, that's wild. I always thought I was better off performance-wise at 1440p/144fps. I guess I never thought about it hard enough. Going up to that made my GTX 1070 feel old.

I ran some non-scientific tests and I don't think I can really see the difference between 100 and 144 fps. The jump from 60 to 100 is pretty nice. Much like having an SSD, I can't go back. And now I'm gonna need a new graphics card so I can play on settings higher than Medium.

Indiana_Krom
Jun 18, 2007
Net Slacker

K8.0 posted:

That's true for Gsync. You could however do it with Freesync, since Freesync determines the timing for the next frame ahead of time. Unfortunately that's probably quite a ways off.

That's impossible; how can you set the timing for a frame that doesn't exist yet? The whole point of the *sync technologies is that frames aren't always delivered at a fixed swap interval of the display, so they made a monitor that could wait until the frame is actually done and then swap it in, even at irregular intervals.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Yeah, I put that wrong. Freesync CAN decide the frame timing ahead of time, because the GPU has complete control. It wouldn't be a big deal to extend Freesync to support the feature. To be clear, while Gsync behaves like you're saying, Freesync really doesn't. With Gsync, the GPU delivers each completed frame to the monitor and the monitor handles the actual display timing. With Freesync, the GPU just spools out the vblank until it's ready to start the next frame, and the monitor just sits there hoping it comes before the monitor breaks.

There's really nothing wrong with guessing when your next frame will be ready. Frame pacing has improved massively over the past several years. The point of VRR isn't just to perfectly match the start of the refresh with the frame becoming complete; as long as you do a good job chasing it around, you're going to be 99% of the way there, and trading off a slightly lower average framerate for less blur could be very worthwhile.
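A minimal sketch of that "chase it around" idea, predicting the next frame time from an exponential moving average of recent ones (the filter and its numbers are my own assumption, not anything AMD or NVIDIA documents):

```python
# Predict when the next frame will be ready from recent frame times, so a
# strobe/refresh could be scheduled slightly ahead of time instead of waiting.

class FramePacer:
    def __init__(self, alpha: float = 0.2, initial_ms: float = 16.7):
        self.alpha = alpha             # smoothing factor for the moving average
        self.predicted_ms = initial_ms

    def observe(self, frame_time_ms: float) -> float:
        """Feed in the last measured frame time; returns the prediction for the next one."""
        self.predicted_ms += self.alpha * (frame_time_ms - self.predicted_ms)
        return self.predicted_ms

pacer = FramePacer()
for ft in (16.2, 17.1, 15.9, 22.0, 16.5):  # made-up frame times in milliseconds
    print(f"observed {ft:5.1f} ms -> predict next ~{pacer.observe(ft):.1f} ms")

# The prediction lags a frame behind on the 22 ms spike, but tracks steady
# pacing within a millisecond or two, which is the "99% of the way there" idea.
```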

K8.0 fucked around with this message at 01:15 on Mar 28, 2019

Stickman
Feb 1, 2004

Could the length of the strobe be varied instead of the brightness? Or would that cause unwanted visual effects?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Presumably they're always making the strobe as short as it can be while generating the specified brightness. The point is to snapshot the state of the LCD instead of letting you see it transition colors.

Listerine
Jan 5, 2005

Exquisite Corpse
I bought an RTX 2070 for GPU rendering, but I got a discount code for a game, and even though I'm not really gaming on my PC much I thought what the hell.

To redeem the game, I need to go through Nvidia's Gaming Experience software. I wasn't really interested in installing this. Can I install it, download the game, and then deinstall Gaming Experiencing? Or am I stuck having to keep Gaming Experience installed to play the game?

VelociBacon
Dec 8, 2009

Listerine posted:

I bought an RTX 2070 for GPU rendering, but I got a discount code for a game, and even though I'm not really gaming on my PC much I thought what the hell.

To redeem the game, I need to go through Nvidia's Gaming Experience software. I wasn't really interested in installing this. Can I install it, download the game, and then deinstall Gaming Experiencing? Or am I stuck having to keep Gaming Experience installed to play the game?

Do you mean GeForce experience? It's also how you get new drivers and most people have no (serious) issues with it. Yes you can uninstall it after.

Listerine
Jan 5, 2005

Exquisite Corpse

VelociBacon posted:

Do you mean GeForce experience? It's also how you get new drivers and most people have no (serious) issues with it. Yes you can uninstall it after.

Yes, that's what I meant- posting from work and relying on my terrible memory. Thanks.

Indiana_Krom
Jun 18, 2007
Net Slacker

Stickman posted:

Could the length of the strobe be varied instead of the brightness? Or would that cause unwanted visual effects?

The length of the strobe basically is the brightness in most cases: the longer it is, the brighter it looks, but the downside is more potential for blur.

A 2.4 ms strobe is roughly equivalent to running your display at 416 FPS as far as motion blur is concerned. The way ULMB and other strobe modes like it work is by essentially resetting your eyes after every frame; it's less about controlling what you see and more about controlling what you don't see. Sample-and-hold LCDs induce blur by nature: even if they transitioned from one frame to the next absolutely instantly (0 ms), it would still be blurry at a 60, 120, or even 240 Hz refresh rate.

You have to think about how you see something move on an LCD and how the process actually works. When you move the mouse across the screen, the cursor isn't really moving; it is just a series of pictures of the cursor in rapid succession, where each new image has it in a different position along a line. The problem with sample-and-hold types is that as your eye follows the cursor "moving", your eye is sweeping at a constant speed across the screen while the images of the cursor aren't actually moving at all; the blur is the disconnect between how far your eye moves while the cursor sits stationary during every frame. ULMB works by only very briefly flashing the image, and then it's black the rest of the time. Due to the nature of persistence of vision you only see the flash and the rest of the time you don't see anything, and the flash is so short your eye basically isn't moving relative to the length of time the image is actually there, so you don't get the blurring effect.
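Putting the persistence math above into numbers (the eye-tracking speed is just an illustrative assumption):

```python
# Eye-tracking blur scales with how long each static frame stays visible, so a
# short strobe behaves like a much higher refresh rate as far as blur goes.

def blur_px(eye_speed_px_s: float, persistence_ms: float) -> float:
    """Smear width = distance the tracked eye travels while one static frame is shown."""
    return eye_speed_px_s * (persistence_ms / 1000.0)

EYE_SPEED = 2000  # px/s, e.g. tracking a fast-moving cursor (assumed value)

print(blur_px(EYE_SPEED, 1000 / 60))   # 60 Hz sample-and-hold: ~33 px of smear
print(blur_px(EYE_SPEED, 1000 / 240))  # 240 Hz sample-and-hold: ~8 px
print(blur_px(EYE_SPEED, 2.4))         # 2.4 ms ULMB strobe: ~5 px
print(1000 / 2.4)                      # ~416, the "equivalent FPS" figure above
```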

craig588
Nov 19, 2005

by Nyc_Tattoo

VelociBacon posted:

Do you mean GeForce experience? It's also how you get new drivers and most people have no (serious) issues with it. Yes you can uninstall it after.

You don't need it for new drivers, it's never done anything beneficial for me. I don't use it because I want to keep the possibility space for issues narrow. It doesn't help me so I'm not going to let it run doing nothing. At some point they wanted to make it mandatory for drivers, but they rolled back on that.

VelociBacon
Dec 8, 2009

craig588 posted:

You don't need it for new drivers, it's never done anything beneficial for me. I don't use it because I want to keep the possibility space for issues narrow. It doesn't help me so I'm not going to let it run doing nothing. At some point they wanted to make it mandatory for drivers, but they rolled back on that.

I know, but someone who hasn't heard of it before isn't going to be manually installing drivers without it, so it feels like the lesser of two evils.

Cavauro
Jan 9, 2008

Installing the drivers is extremely straightforward. That poster who works in rendering probably just doesn't want any bloat and doesn't happen to know what GeForce Experience is.

iastudent
Apr 22, 2008

If anyone's still looking for a Vega, Newegg has the Sapphire Nitro+ Vega 64 deal back for $420 (before shipping/taxes). Includes the usual AMD 3-game bundle with it.

https://www.newegg.com/Product/Product.aspx?Item=N82E16814202321

EdEddnEddy
Apr 5, 2012



The ONLY thing I didn't like about GeForce Experience was that when downloading/installing drivers, it kept them all in some deep program data directory with no way of deleting them, so it started to eat up a lot of GB on my (at the time) smaller SSD.

Did that ever get fixed where you could remove past drivers in the app itself?

Also, if anyone gets a new RTX with games like Anthem or BF5 and you already have it or don't want it, hit me up. I have a ton of extra keys from past Humble Bundles and other purchases, so we could work out a trade or something.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
So I have the chance to pick up an "XFX Radeon RX 580 GTS XXX Edition" for ~$160 through a credit card promo, but I have to confess an ignorance toward AMD card makers. Is XFX still decent or is there a reason they're the cheapest?

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Not sure if it goes here but what does AMD get out of releasing tech like ProRender that works with Nvidia cards?

I’m using Modo 3D, they recently implemented this and while it’s a nice feature, it’s weird. I’m probably not the only one confused because every time they mention this feature they clarify “it’s an agnostic solution! It works with all brands!”

Nvidia made that Optix denoiser available for 3d apps but I think that one only works with their cards, and the Raytracing stuff is/was limited to the new models

Geemer
Nov 4, 2010



Comfy Fleece Sweater posted:

Not sure if it goes here but what does AMD get out of releasing tech like ProRender that works with Nvidia cards?

I’m using Modo 3D, they recently implemented this and while it’s a nice feature, it’s weird. I’m probably not the only one confused because every time they mention this feature they clarify “it’s an agnostic solution! It works with all brands!”

Nvidia made that Optix denoiser available for 3d apps but I think that one only works with their cards, and the Raytracing stuff is/was limited to the new models

Maybe they want mindshare? Hoping people will think "AMD? Oh yeah, they made that tool I use. Maybe it'll work even better on their cards, I should get one when I upgrade."
