|
I can easily tell whenever my frame rate drops under 80. 100 FPS looks and feels better, but past 144 it's diminishing returns outside of very high-motion games. Doom 2016, for instance, looks and plays incredibly at its 200 FPS engine cap if you have a display that can keep up. At 60 Hz it only looks moderately less smooth overall thanks to a really solid motion blur implementation, but the downside is that everything is blurred into complete obscurity. You can rapidly turn around in Doom at 200 FPS and still clearly see the imp that was about to attack you from behind, whereas at 60 Hz it's all a blur and it's much more challenging to tell the imp from the rocks behind it. IMO spatial resolution is a problem that has largely been solved: 1080p on 24", 1440p on 27", and 4K on larger displays. But temporal resolution remains a huge problem; 60 Hz is choppy and makes everything so blurry the resolution becomes irrelevant. I can read text while scrolling on a 240 Hz display fairly easily; on the exact same display at 60 Hz it is a completely unreadable smudge until it stops scrolling. This is an unavoidable problem with sample-and-hold displays like LCDs. The extra resolution of 4K is completely wasted most of the time you are actually using the display for anything that moves, because the only way to keep the motion "smooth" is to blur the image to the point that even 720p would get the job done. And even if you don't use motion blur, just being a sample-and-hold display will blur it to hell at 60 Hz. One really good way to see the impact of higher refresh rates and strobing is to look at the chase camera comparisons on Blur Busters and their TestUFO site. Edit: gently caress, page snipe...
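The sample-and-hold smear is easy to put rough numbers on. Here's a quick sketch of the usual persistence model (perceived smear ≈ eye-tracking speed × time each frame is held on screen); the scroll speed and timings are illustrative assumptions, not measurements:

```python
# Rough numbers for sample-and-hold smear, using the usual persistence model:
# perceived smear = eye-tracking speed x time each frame is held on screen.
# The scroll speed and timings below are illustrative assumptions.

def smear_px(speed_px_per_s: float, persistence_s: float) -> float:
    """Perceived smear in pixels while the eye tracks moving content."""
    return speed_px_per_s * persistence_s

SPEED = 1920.0  # content crossing a 1080p-wide screen in one second

for label, persistence_s in [
    ("60 Hz sample-and-hold ", 1 / 60),
    ("240 Hz sample-and-hold", 1 / 240),
    ("2 ms strobe (ULMB-ish)", 0.002),
]:
    print(f"{label}: ~{smear_px(SPEED, persistence_s):.1f} px of smear")
```

At that speed, 60 Hz smears content across ~32 pixels per frame, which is why the text turns to mush, while a short strobe keeps it down to a few pixels.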
|
# ? Mar 26, 2019 23:17 |
|
|
|
On mobile this is sort of the case as well. Since phones shoot for 60 Hz, scrolling text on a 6" 1440p screen can be a blurry mess. I wonder if the 120 Hz display of the Razer Phone could make its way beyond just Razer?
|
# ? Mar 27, 2019 00:32 |
|
Indiana_Krom posted:60 Hz is choppy and makes everything so blurry the resolution becomes irrelevant. You just happened to say it, not calling you out at all, but I like that we've gotten to the point that now 60 FPS isn't good enough. Advancement of technology!
|
# ? Mar 27, 2019 01:10 |
|
Having a high-framerate monitor definitely cursed me into spending more. 60FPS really does feel awful to me now.
|
# ? Mar 27, 2019 01:46 |
|
craig588 posted:You just happened to say it, not calling you out at all, but I like that we've gotten to the point that now 60 FPS isn't good enough. Advancement of technology! It isn't actually anything new; I've always hated 60 Hz. There is a reason I kept using CRTs until my last one died from the horizontal deflection failing in like 2012: my 19" CRTs did 100 Hz refresh at the 1280x960 resolution I used the most at the time. I was pretty unhappy in the years in between before 144 Hz + *sync became available in LCDs. Hell, back in the early 2000s the game I was playing competitively looked good enough at 640x480, and I played it there because the particular monitor I was on could refresh at 160 Hz at that resolution, and people wondered how I could react to stuff so fast...
|
# ? Mar 27, 2019 01:55 |
|
There's an entire swath of games where you're not flicking your viewport 180 degrees in a split second, folks. Yes, temporal resolution is a thing and 120 FPS benefits it greatly, but especially on large TVs, even at 30 FPS it's ridiculous to say 4K is 'irrelevant'. It depends entirely on the game, and in most games you're not wildly changing your perspective; plus even at 60 FPS or lower there are still elements like specular aliasing which a higher resolution benefits greatly, especially in motion, where it will still shimmer like crazy regardless of the frame rate. When you play on a 55"+ TV, 1080p can look pretty brutal.
Happy_Misanthrope fucked around with this message at 03:11 on Mar 27, 2019 |
# ? Mar 27, 2019 02:47 |
|
I can't help but grimace every time I see scrolling text on a 60hz display.
|
# ? Mar 27, 2019 03:32 |
|
Gay Retard posted:I can't help but grimace every time I see scrolling text on a 60hz display. this could be the thread title of a few threads I read
|
# ? Mar 27, 2019 03:38 |
|
EdEddnEddy posted:On mobile this is sort of the case as well. Since Phones shoot for 60hz scrolling text on a 6" 1440P screen can be a blurry mess. One of the iPads had a high refresh rate screen. Don't remember which; maybe they all do now
|
# ? Mar 27, 2019 05:08 |
|
You see takes on this forum like "60 Hz is so bad that resolution is irrelevant" and Linux dual-boot gaming discussions; it's incredible. This place is the corner case emporium and I love it
|
# ? Mar 27, 2019 05:15 |
|
My AOC 2460PG is a few years old now but I always marvel at the scrolling in ULMB 120 Hz mode. It's a shame I can't use G-Sync at the same time. Does any 4K TV have a similar sort of function?
|
# ? Mar 27, 2019 09:21 |
|
orcane posted:Oh no poor Nvidia had to make huge loving chips to replace their much smaller older chips and that means of course they HAVE TO charge $$$ even though the additional features are a complete gamble at this point, stupid ungrateful consumers. You should do the math on area-difference-per-average-core between Turing and Turing Minor to see the "wasted die space" between RTX and GTX. Something like 5%, I think? Boy, I really overestimated it at like 15%. NVIDIA squeezed a little bit more gains by pushing FP16 through the tensor cores on the RTX cards; if they had done real dedicated FP16 units plus tensor units it would have been even less. This is the post-Moore's Law era for GPUs; a 20% gen-on-gen uarch improvement with no shrink is fantastic and you should absolutely mail Jensen another jacket or two. You're going to look back at this as a great era, until NVIDIA figures out chiplet GPUs and drops costs. Which is probably 3 years, until they can port NVSwitch down to consumer cards? lol if you think RTG can afford to do it. Whatever NVIDIA shits out on 7nm, you should buy it asap; RTG is probably going to poo poo out another Hawaii rehash on 7nm. Paul MaudDib fucked around with this message at 10:48 on Mar 27, 2019 |
# ? Mar 27, 2019 10:43 |
|
Zigmidge posted:You sound like you're justifying what you paid for pixel counts. 2560x1440@144fps is more pixels than 4k@60. TheFluff posted:Of course it's all about pixel density, that's what makes it look sharp. I use 150% UI scaling on my display - it's not about fitting more stuff on the screen. In that NASA paper that K8.0 linked earlier they assert that spatial acuity (whatever that means) is typically 30-60 "pixels" per degree. I didn't see a source cited for it, but the higher end of that is in about the same ballpark as what Apple said was needed for full transparency in the Retina displays (70 pixels per degree, I think). The further away you sit from the screen, the less pixel density you need. The paper uses *cycles* per degree, the cycles being pairs of black and white lines. Resolution increases for the most part don't introduce drastically different features, so my interpretation of that graph would be that more complex/busy objects will appear less fluid at the same framerate, but that's not what K8.0 was claiming.
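For anyone who wants to sanity-check those acuity numbers, here's a back-of-the-envelope pixels-per-degree calculator. The panel width and viewing distance below are my own illustrative assumptions, not figures from the paper:

```python
import math

# Back-of-the-envelope pixels-per-degree calculator, to sanity-check the
# 30-60 px/deg acuity range mentioned above. The panel width and viewing
# distance are my own illustrative assumptions, not from the paper.

def pixels_per_degree(h_res: int, width_in: float, distance_in: float) -> float:
    """Pixels subtended by one degree of visual angle at the screen center."""
    px_per_inch = h_res / width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# A 27" 16:9 panel is roughly 23.5" wide; assume a 28" viewing distance.
print(f'1440p @ 27": {pixels_per_degree(2560, 23.5, 28):.0f} px/deg')
print(f'4K @ 27":    {pixels_per_degree(3840, 23.5, 28):.0f} px/deg')
```

Under those assumptions, 1440p at 27" lands in the low 50s px/deg and 4K around 80, which lines up with the "retina at normal viewing distance" claim; sit further back and both numbers go up.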
|
# ? Mar 27, 2019 11:49 |
|
Gay Retard posted:I can't help but grimace every time I see scrolling text on a 60hz display. Oh yeah I'm definitely in the camp that 100+ hz is a massive difference no doubt (even in just desktop usage it feels so much better), just that 60hz (or lower) doesn't negate 4k entirely by any stretch.
|
# ? Mar 27, 2019 15:08 |
|
Zedsdeadbaby posted:My AOC2460PG is a few years old now but I always marvel at the scrolling on ULMB 120hz mode. It's a shame I can't use gsync at the same time. Got $5k laying around you don't need?
|
# ? Mar 27, 2019 15:09 |
|
Fauxtool posted:one of the ipads had a high refresh screen. Dont remember which, maybe they all do now
|
# ? Mar 27, 2019 16:56 |
|
Arzachel posted:The paper uses *cycles* per degree, the cycles being pairs of black and white lines. Resolution increases for the most part don't introduce drastically different features, so my interpretation of that graph would be that more complex/busy objects will appear less fluid at the same framerate but that's not what K8.0 was claiming. The part that's most relevant to what I was talking about is on page 6, where they reference a couple studies where pilots actually performed worse when presented with a more detailed image at the same framerate, because like I said the brain's spatial processing breaks down when it's able to perceive a lot of detail but not sufficient motion to accompany it.
|
# ? Mar 27, 2019 17:09 |
|
Sounds like we need to get ourselves some better brains Speaking of the one post from a little bit earlier, whatever happened to that hopeful feature of G/FreeSync and ULMB at the same time? I seem to recall someone doing experiments to prove that variable backlight intensities can be synced to variable refresh rates, but didn't hear anything about commercial adoption of the tech after that....
|
# ? Mar 27, 2019 20:27 |
|
There's apparently also a proprietary ULMB by ASUS, ELMB.
|
# ? Mar 27, 2019 20:35 |
|
Yeah, didn't that also sort of begin with their 3D Vision 2 displays, which pulsed the backlight even brighter to make up for the darkening that happens with the shutter glasses?
|
# ? Mar 27, 2019 20:38 |
|
Sidesaddle Cavalry posted:Sounds like we need to get ourselves some better brains It wouldn't work, because you don't know when the next frame is going to be delivered so you can't know how bright to make the current frame. Even if you used a fixed strobe interval, the apparent brightness of the screen would vary with the framerate, the more FPS the brighter the screen would appear.
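The brightness problem is easy to illustrate: perceived brightness is roughly peak luminance times duty cycle, and with a fixed-length pulse the duty cycle scales directly with framerate. The pulse width and luminance here are made-up numbers, just to show the scaling:

```python
# Why a fixed-length strobe fights variable refresh: perceived brightness is
# roughly peak luminance x duty cycle, and with a fixed pulse the duty cycle
# scales with framerate. Pulse width and luminance are made-up numbers.

def apparent_nits(peak_nits: float, pulse_s: float, fps: float) -> float:
    """Average luminance of a display strobed once per frame."""
    return peak_nits * pulse_s * fps  # duty cycle = pulse_s * fps

PULSE_S = 0.002  # fixed 2 ms backlight pulse
for fps in (48, 60, 90, 120):
    print(f"{fps:3d} fps -> ~{apparent_nits(400.0, PULSE_S, fps):.0f} nits")
```

With those numbers the screen would look twice as bright at 120 FPS as at 60, so the framerate swings of VRR would show up as visible brightness flicker.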
|
# ? Mar 27, 2019 22:22 |
|
That's true for Gsync. You could however do it with Freesync, since Freesync determines the timing for the next frame ahead of time. Unfortunately that's probably quite a ways off.
|
# ? Mar 27, 2019 22:27 |
|
So I wanted to post about my experiences with the EVGA Hybrid kit for my EVGA 2080 Ti XC Ultra. The kit comes in a box exactly like the one the GPU comes in. It took me around 2 hours from pulling the card to booting up the system with the kit installed. I do a fair bit of hobby work with electronics, worked for a few years in machining, work on my own cars, etc. I don't think it's that difficult, but it's an expensive card and I took my time to not gently caress up and damage the PCB. Technically the kit works as advertised. It replaces the open-fan triple-slot cooler with a blower + 120mm AIO. So it's two heatsinks + thermal pads that interface with the VRM/memory, and a 120mm AIO liquid cooler that mounts to the GPU. A fan that comes pre-assembled on one of the heatsinks provides flow over the heatsinks and out the back of the card. The AIO comes with thermal paste pre-applied, which I wish I knew about before spending 30 bucks on more Kryonaut. Here are some images from their site: Temps before would eventually hit 80C under auto fan speed (I think it went to 100% fan speed at 75C) and then, I assume, start thermal limiting with the way GPU Boost works. With the same auto fan control it now doesn't get higher than around 62C, where according to Afterburner the fans are running at 45%. It's almost completely quiet at that point. It idles around 32-36C with the fans off and just the pump running (can't hear it). I'll try turning up my clock, but I assume it won't really be any more stable with the cooler; it's a quality of life upgrade more than anything and will prevent it from thermal throttling. There are a number of issues with the kit that I noticed are well reported online on the EVGA forums and elsewhere, as I had to google stuff up. I don't see any response from EVGA. Here's a list of issues I ran into:
Let me know if anyone has questions. I had to pay out the rear end for the kit: $170 USD for the kit itself, $40 USD for shipping, and then another $37 CAD in customs fees at the border.
|
# ? Mar 27, 2019 23:05 |
|
Arzachel posted:2560x1440@144fps is more pixels than 4k@60 Haha, that's wild. I always thought I was better off performance-wise at 1440p/144fps. I guess I never thought about it hard enough. Going up to that made my GTX 1070 feel old. I ran some non-scientific tests and I don't think I can really see the difference between 100 and 144 FPS. The jump from 60 to 100 is pretty nice. Much like having an SSD, I can't go back. And now I'm gonna need a new graphics card so I can play on settings higher than Medium.
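The quoted comparison checks out if you multiply it through (pure arithmetic, no assumptions):

```python
# The quoted throughput comparison, multiplied out: pixels pushed per second.
def px_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

print(px_per_second(2560, 1440, 144))  # 1440p at 144 Hz
print(px_per_second(3840, 2160, 60))   # 4K at 60 Hz
```

1440p at 144 Hz works out to about 531 million pixels per second versus about 498 million for 4K at 60 Hz, so the 1440p high-refresh target is indeed slightly more demanding in raw pixel throughput.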
|
# ? Mar 27, 2019 23:13 |
|
K8.0 posted:That's true for Gsync. You could however do it with Freesync, since Freesync determines the timing for the next frame ahead of time. Unfortunately that's probably quite a ways off.
|
# ? Mar 28, 2019 00:48 |
|
Yeah, I put that wrong. Freesync CAN decide the frame timing ahead of time, because the GPU has complete control. It wouldn't be a big deal to extend Freesync to support the feature. To be clear, while G-Sync behaves like you're saying, Freesync really doesn't. With G-Sync, the GPU delivers each completed frame to the monitor and the monitor handles the actual display timing. With Freesync, the GPU just spools the vblank until it's ready to start the next frame, and the monitor just sits there hoping it comes before the monitor breaks. There's really nothing wrong with guessing when your next frame will be. Frame pacing has improved massively over the past several years. The point of VRR isn't just to perfectly match the start of the refresh with the frame becoming complete; as long as you do a good job chasing it around you're going to be 99% of the way there, and trading a slightly lower average framerate for less blur could be very worthwhile. K8.0 fucked around with this message at 01:15 on Mar 28, 2019 |
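A hypothetical controller "chasing it around" could be as simple as an exponential moving average over recent frame times; this is just a sketch of the idea, not how any real scaler firmware works:

```python
# Sketch of "guessing when the next frame will land": an exponential moving
# average over recent frame times predicts the next one. A hypothetical
# strobe controller could use the prediction to schedule pulse timing.
# Illustration only; not how any real scaler firmware works.

class FramePredictor:
    def __init__(self, alpha: float = 0.25):
        self.alpha = alpha       # smoothing factor: higher = reacts faster
        self.estimate_ms = None  # current predicted frame time

    def observe(self, frame_ms: float) -> float:
        """Feed one measured frame time; return the prediction for the next."""
        if self.estimate_ms is None:
            self.estimate_ms = frame_ms
        else:
            self.estimate_ms += self.alpha * (frame_ms - self.estimate_ms)
        return self.estimate_ms

pred = FramePredictor()
for frame_ms in (16.7, 16.9, 17.4, 16.5, 16.6):  # made-up frame times
    prediction = pred.observe(frame_ms)
print(f"predicted next frame time: ~{prediction:.2f} ms")
```

As long as frame pacing is reasonably stable, a predictor like this stays within a fraction of a millisecond of the real cadence, which is the "99% of the way there" being described.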
# ? Mar 28, 2019 01:13 |
|
Could the length of the strobe be varied instead of the brightness? Or would that cause unwanted visual effects?
|
# ? Mar 28, 2019 01:21 |
|
Presumably they're always making the strobe as short as it can be while generating the specified brightness. The point is to snapshot the state of the LCD instead of letting you see it transition colors.
|
# ? Mar 28, 2019 01:30 |
|
I bought an RTX 2070 for GPU rendering, but I got a discount code for a game, and even though I'm not really gaming on my PC much I thought what the hell. To redeem the game, I need to go through Nvidia's Gaming Experience software. I wasn't really interested in installing this. Can I install it, download the game, and then uninstall Gaming Experience? Or am I stuck having to keep Gaming Experience installed to play the game?
|
# ? Mar 28, 2019 01:57 |
|
Listerine posted:I bought an RTX 2070 for GPU rendering, but I got a discount code for a game, and even though I'm not really gaming on my PC much I thought what the hell. Do you mean GeForce experience? It's also how you get new drivers and most people have no (serious) issues with it. Yes you can uninstall it after.
|
# ? Mar 28, 2019 02:05 |
|
VelociBacon posted:Do you mean GeForce experience? It's also how you get new drivers and most people have no (serious) issues with it. Yes you can uninstall it after. Yes, that's what I meant- posting from work and relying on my terrible memory. Thanks.
|
# ? Mar 28, 2019 02:10 |
|
Stickman posted:Could the length of the strobe be varied instead of the brightness? Or would that cause unwanted visual effects? A 2.4 ms strobe is roughly equivalent to running your display at 416 FPS as far as motion blur is concerned. The way ULMB and other strobe modes like it work is by essentially resetting your eyes after every frame; it's less about controlling what you see and more about controlling what you don't see. Sample-and-hold LCDs induce blur by nature; even if they transitioned from one frame to the next absolutely instantly (0 ms), it would still be blurry at a 60, 120, or even 240 Hz refresh rate. You have to think about how you see something move on an LCD and how the process actually works. When you move the mouse across the screen, the cursor isn't really moving; it is just a series of pictures of the cursor in rapid succession, where each new image has it in a different position along a line. But the problem with sample-and-hold types is that as your eye follows the cursor "moving", your eye is moving at a constant speed across the screen while the images of the cursor aren't actually moving at all; the blur is the disconnect between how much your eye moves while the cursor is stationary during every frame. ULMB works by only very briefly flashing the image, and then it's black the rest of the time. Due to the nature of persistence of vision you only see the flash, and the flash is so short your eye basically isn't moving relative to the length of time the image is actually there, so you don't get the blurring effect.
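The persistence-to-equivalent-refresh math above is just the reciprocal: a display that holds each image for t seconds has the motion clarity of an ideal zero-transition sample-and-hold panel refreshing at 1/t Hz. A one-liner checks the 2.4 ms figure:

```python
# Checking the persistence math: a strobe that holds the image for t seconds
# gives the motion clarity of an ideal zero-transition sample-and-hold panel
# refreshing at 1/t Hz. Pure arithmetic, no assumptions beyond that model.

def equivalent_hz(persistence_s: float) -> float:
    """Sample-and-hold refresh rate with the same motion blur."""
    return 1.0 / persistence_s

print(f"2.4 ms strobe ~ {equivalent_hz(0.0024):.1f} Hz sample-and-hold equivalent")
print(f"1.0 ms strobe ~ {equivalent_hz(0.001):.1f} Hz sample-and-hold equivalent")
```

1/0.0024 s is about 416.7 Hz, matching the figure in the post.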
|
# ? Mar 28, 2019 02:45 |
|
VelociBacon posted:Do you mean GeForce experience? It's also how you get new drivers and most people have no (serious) issues with it. Yes you can uninstall it after. You don't need it for new drivers, it's never done anything beneficial for me. I don't use it because I want to keep the possibility space for issues narrow. It doesn't help me so I'm not going to let it run doing nothing. At some point they wanted to make it mandatory for drivers, but they rolled back on that.
|
# ? Mar 28, 2019 07:21 |
|
craig588 posted:You don't need it for new drivers, it's never done anything beneficial for me. I don't use it because I want to keep the possibility space for issues narrow. It doesn't help me so I'm not going to let it run doing nothing. At some point they wanted to make it mandatory for drivers, but they rolled back on that. I know but someone who hasn't heard of it before isn't going to be manually installing drivers without it so it feels like the lesser of the evils.
|
# ? Mar 28, 2019 08:07 |
|
Installing the drivers is extremely straightforward. That poster who works in rendering probably just doesn't want any bloat and doesn't happen to know what GeForce Experience is
|
# ? Mar 28, 2019 10:16 |
|
If anyone's still looking for a Vega, Newegg has the Sapphire Nitro+ Vega 64 deal back for $420 (before shipping/taxes). Includes the usual AMD 3-game bundle with it. https://www.newegg.com/Product/Product.aspx?Item=N82E16814202321
|
# ? Mar 28, 2019 11:52 |
|
The ONLY thing I didn't like about GeForce Experience was that when downloading/installing drivers, it kept them all in some deep ProgramData directory with no way of deleting them, so they started to eat up a lot of GB on my, at the time, smaller SSD. Did that ever get fixed so you can remove past drivers in the app itself? Also, if anyone gets a new RTX with games like Anthem or BFV and you already have them or don't want them, hit me up. I have a ton of extra keys from past Humble Bundles and other purchases that we could work out a trade or something.
|
# ? Mar 28, 2019 19:37 |
|
So I've the chance to pick up an "XFX Radeon RX 580 GTS XXX Edition" for ~$160 through a credit card promo, but I have to confess an ignorance toward AMD card makers. Is XFX still decent or is there a reason they're the cheapest?
|
# ? Mar 28, 2019 20:49 |
|
Not sure if it goes here but what does AMD get out of releasing tech like ProRender that works with Nvidia cards? I’m using Modo 3D, they recently implemented this and while it’s a nice feature, it’s weird. I’m probably not the only one confused because every time they mention this feature they clarify “it’s an agnostic solution! It works with all brands!” Nvidia made that Optix denoiser available for 3d apps but I think that one only works with their cards, and the Raytracing stuff is/was limited to the new models
|
# ? Mar 28, 2019 20:59 |
|
|
|
Comfy Fleece Sweater posted:Not sure if it goes here but what does AMD get out of releasing tech like ProRender that works with Nvidia cards? Maybe they want mindshare? Hoping people will think "AMD? Oh yeah, they made that tool I use. Maybe it'll work even better on their cards, I should get one when I upgrade."
|
# ? Mar 28, 2019 21:04 |