|
Sininu posted:I'm aware of that and I've even linked Blurbusters here and in monitor thread more than few times before. I'd bet it's that way by design and there's nothing wrong with it, it's just not useful to you because it doesn't buffer. It's very useful for people who want minimum input lag on a VRR display though. On a fixed refresh rate display if you want minimum input lag you just uncap it and let it run at 300fps with no vsync and deal with the (minor, at that frame rate) tearing. e: it's kinda obvious but worth pointing out that precisely achieving a given framerate is in fact exactly the same problem as keeping the rendering synchronized to the display refresh rate, and has the same solution (render into a buffer one frame in advance and wait for the right moment to present it - that is, vsync), since you can't predict exactly how long rendering will take. TheFluff fucked around with this message at 23:52 on Jan 15, 2019 |
# ? Jan 15, 2019 23:41 |
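TheFluff's point above can be sketched in a few lines: a precise frame cap is just a present loop against absolute deadlines, exactly like vsync. A minimal Python sketch, with the render step simulated by a sleep (this is an illustration of the scheduling idea, not how a real engine presents):

```python
import time

def run_capped(frames, target_fps, render):
    """Render `frames` frames, presenting each on a fixed cadence.

    As with vsync, each frame is rendered ahead of its deadline and
    presented only when the deadline arrives; absolute deadlines keep
    timing errors from accumulating into drift.
    """
    period = 1.0 / target_fps
    present_times = []
    next_deadline = time.perf_counter() + period
    for _ in range(frames):
        render()  # render into a back buffer; duration is unpredictable
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # wait for the "refresh" before presenting
        present_times.append(time.perf_counter())
        next_deadline += period  # schedule against the clock, not the last frame
    return present_times

# Fake 5 ms render step, capped at 50 fps (one present every 20 ms)
times = run_capped(20, 50, lambda: time.sleep(0.005))
intervals = [b - a for a, b in zip(times, times[1:])]
```

In practice `time.sleep` overshoots by a millisecond or two, which is why real limiters spin-wait the last stretch; the structure is the point here.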
|
|
|
So this may be a very niche use case, but you can run both a GSync Monitor and FreeSync monitor at the same time. Both of my monitors are running with XSync enabled and both are not giving any kind of issues. My Pixio PX277 and Acer Predator X34 are running great without any kind of flickering or issues. I can run the Nvidia Pendulum demo between both monitors and it doesn't stutter or tear.
|
# ? Jan 15, 2019 23:42 |
|
Sininu posted:I don't have hardware that can support VRR, (planning to change it this year.) Okay, but the original post I was responding to was about "how to use Gsync properly." I use in-game caps when available and don't even have RTSS or Nvidia Inspector installed on a Gsync monitor (but I also am not a CSGO player). I'm not saying there aren't scientifically evident improvements in doing things differently, but none that someone of my age (mid-30s) can appreciate. I can appreciate 141 FPS over 60 FPS in Overwatch. But 120-141, especially in a game like League where I'm offered 120 or 144 but not single-digit FPS values, is a much tighter comparison. Craptacular! fucked around with this message at 00:00 on Jan 16, 2019 |
# ? Jan 15, 2019 23:58 |
|
Craptacular! posted:Okay, but the original post I was responding to was about "how to use Gsync properly." I did jump the gun there yeah. When I saw you mention CSGO my bad experience with its cap immediately came to mind and I wanted to post how bad it was. Didn't consider what was posted before and if it would apply to my experience.
|
# ? Jan 16, 2019 00:05 |
|
craig588 posted:I'm not sensitive to tearing so I needed to use my monitor's OSD to verify Free/Gsync was working. Normally I'm very against any objective measurements, because if you can only notice up to 30 FPS, knowing that you're not running at 120 FPS is only going to make you sad even if you can't see it. If some people can't see the difference in different caps then they're fine, no need to chase perfection if they can't see the difference. For competitive games, there's an argument to be made that even if an effect isn't noticeable in the conscious sense (like input lag), it might still have an effect on competitive performance. You'd probably want to do some blind tests to verify the actual effect size, though. Speaking of blind tests, have you ever tried having a friend randomly cap your framerate at 30/60/90/120 and try to rate the qualitative experience? You might find that it's more noticeable than you think when you're not specifically looking for it. If you truly can't tell the difference, I'm kind of jealous. High refresh has ruined me and now all my future monitors are doomed to be ridiculously expensive.
|
# ? Jan 16, 2019 00:11 |
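For anyone who actually wants to run that blind test, randomizing the trial order is the only part worth scripting; `blind_trial_order` is a hypothetical helper name, and the capping and rating are done by hand:

```python
import random

def blind_trial_order(caps=(30, 60, 90, 120), reps=3, seed=None):
    """Return a shuffled list of framerate caps for a blind test.

    The friend applies each cap in order while the rater, who never
    sees this list, scores smoothness per trial. Comparing scores to
    the revealed caps afterwards gives a rough estimate of the effect.
    """
    rng = random.Random(seed)
    trials = [cap for cap in caps for _ in range(reps)]
    rng.shuffle(trials)
    return trials

order = blind_trial_order(seed=42)
```

Three repetitions per cap is an arbitrary choice; more repetitions make the comparison less noisy.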
|
60 to 120 is easy for me to see, 120 to 144 I can only notice side by side.
|
# ? Jan 16, 2019 00:18 |
|
SlayVus posted:So this may be a very niche use case, but you can run both a GSync Monitor and FreeSync monitor at the same time. Both of my monitors are running with XSync enabled and both are not giving any kind of issues. My Pixio PX277 and Acer Predator X34 are running great without any kind of flickering or issues. I can run the Nvidia Pendulum demo between both monitors and it doesn't stutter or tear. That's really great to hear. I assume that when you do that it's capped to 80 or 100hz or whatever your Predator is clocked to?
|
# ? Jan 16, 2019 00:21 |
|
K8.0 posted:That's really great to hear. I assume that when you do that it's capped to 80 or 100hz or whatever your Predator is clocked to? Yeah, both monitors are clocked at 100Hz just because of weird Windows issues when you try to run monitors at different refresh rates, like videos stuttering in YouTube. With the Pixio having such a high FS range 30-145, running it at 100 still lets FS run at 30-100 which gives me like a 3.3x scale range. I also purchased this Pixio refurbished like 2 or 3 months ago, so with this driver update I'm finally able to confirm all features of a refurb monitor I bought work. My X34 is also a refurb as well lol, which has been going strong for 2 years and 10 months. SlayVus fucked around with this message at 00:35 on Jan 16, 2019 |
# ? Jan 16, 2019 00:32 |
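The "3.3x scale range" arithmetic above checks out, and the same ratio is what determines whether frame-doubling (LFC) can cover dips below the floor; the "at least ~2x" rule of thumb below is the commonly cited requirement, not a spec quote. A quick sketch:

```python
def vrr_range(floor_hz, ceiling_hz):
    """Ratio of an adaptive-sync range, plus whether frame-doubling
    (LFC) can cover framerates below the floor; a ceiling of at
    least ~2x the floor is generally needed for that to work."""
    ratio = ceiling_hz / floor_hz
    return ratio, ratio >= 2.0

# The Pixio's 30-145 Hz panel capped at 100 Hz still spans 30-100
ratio, lfc_ok = vrr_range(30, 100)
```

By the same arithmetic, a lot of budget 48-75 Hz FreeSync panels fall short of the 2x ratio, which is why dips below 48 fps tear or stutter on them.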
|
I definitely appreciate 120+ fps in a game like Overwatch but for your standard AAA action-adventure title anything above 60fps starts to look real similar to me. Much more preferable in those instances to have an immovable 60fps cap than a jittery 75.
|
# ? Jan 16, 2019 00:42 |
|
exquisite tea posted:I definitely appreciate 120+ fps in a game like Overwatch but for your standard AAA action-adventure title anything above 60fps starts to look real similar to me. Much more preferable in those instances to have an immovable 60fps cap than a jittery 75. I've said before that VRR, particularly well-implemented VRR that's often not cheap, reminds me of when people talk about Nintendo having "good enough tech". High refresh monitors are really nice sometimes, but while I'm okay turning off reflections in Overwatch to hit 140+ solid, I'm a lot less likely to do that in a Final Fantasy campaign. We don't always really want all that FPS at the cost of what has to be sacrificed for it in every single game. 90 average in Monster Hunter isn't ideal, but only because the game could be coded so much better. On the other hand, the fact that I can sit solidly at 90FPS and not complain of tears or stutters for either having too much FPS or not enough FPS is really cool. At the risk of getting a little too warez-y for this forum, I particularly like it with emulated Breath of the Wild because it looks great without its wildly unpredictable framerate being that much of a problem.
|
# ? Jan 16, 2019 00:53 |
|
Could a 2070 drive 1440p @high frames? What's the window for step up?
|
# ? Jan 16, 2019 02:53 |
|
codo27 posted:Could a 2070 drive 1440p @high frames? What's the window for step up? Yes? But if that's your only objective, a 1070ti or a Vega 56 (depending on your monitor) might be a better value.
|
# ? Jan 16, 2019 03:13 |
|
Sure, it depends on the game and settings. Here's Doom 2016 averaging 170 on Ultra. Games like Witcher 3 and Far Cry 5 are closer to 100-110 on Ultra, and the newest stuff like Shadow of the Tomb Raider and AC: Odyssey are closer to 60 unless you turn down some settings. E: Subtract ~15% for a 1070 Ti/2060, add 15-25% for a 2080/1080 Ti (or 30-50% for a 2080 Ti, depending on CPU/initial fps). EE: The Step-Up window is 90 days to apply. Stickman fucked around with this message at 03:23 on Jan 16, 2019 |
# ? Jan 16, 2019 03:16 |
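Turning those rule-of-thumb percentages into numbers is simple arithmetic; the factors below are the post's ballpark adjustments applied to the Doom 170 fps figure, not benchmark measurements:

```python
def scale_fps(baseline_fps, factor):
    """Apply a rough relative-performance factor to a 2070 baseline."""
    return round(baseline_fps * factor)

# Factors are rough estimates from the post above, not measurements
doom_2070 = 170
estimates = {
    "1070 Ti / 2060": scale_fps(doom_2070, 0.85),  # subtract ~15%
    "2080 / 1080 Ti": scale_fps(doom_2070, 1.20),  # add 15-25%, midpoint
    "2080 Ti":        scale_fps(doom_2070, 1.40),  # add 30-50%, midpoint
}
```

These scale roughly linearly only while GPU-bound; at a CPU-limited 170+ fps the bigger cards gain less, which is the "depending on CPU/initial fps" caveat.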
|
codo27 posted:Could a 2070 drive 1440p @high frames? What's the window for step up? Yeah totally, just drop some settings from ULTRA based on some performance videos and you will be at great frame rates Greensync indeed works on my Freesync AOC AGON panel Don’t forget to reapply your colour correction profiles after the driver update if you do a clean install.
|
# ? Jan 16, 2019 03:57 |
|
Thanks for the info regarding upgrading and that MSI 1070 ITX card. After reading more reviews I pulled the trigger. Will have it by the end of the week. It's interesting to see how these modern SFF cards actually hold their own without sacrificing a lot. It's reportedly pretty quiet even under load, despite only having one fan.
|
# ? Jan 16, 2019 14:26 |
|
RTX 2060s are in stock now for MSRP ($350) + free games. Benchmarks have them mostly very slightly outperforming 1070 Tis and occasionally outperforming the 1080 (Wolfenstein II at 1440p). The -2GB of RAM causes 99th-percentile framerates to drop below the 1070's at 4K in some games (also Wolfenstein II). https://www.anandtech.com/show/13762/nvidia-geforce-rtx-2060-founders-edition-6gb-review/10 https://www.guru3d.com/articles-pages/msi-geforce-rtx-2060-gaming-z-review,1.html With 1070-Ti-level power consumption, it seems pretty solid against the Vega 56 until those prices come down. Definitely too expensive for a 1060 replacement, but we already knew that and the 580/590s have us covered for now! Stickman fucked around with this message at 17:31 on Jan 16, 2019 |
# ? Jan 16, 2019 17:13 |
|
Looks like it's 400€ in EU, as usual..
|
# ? Jan 16, 2019 17:21 |
|
Meh, after all these driver updates, I still have the issue that sometimes, my displays don't wake up anymore until I powercycle the computer. Here and there, it's also just one display that doesn't come up while the other does, and I get the distinct feeling, it's yet another GSync issue. Because when it happens, the LED keeps flickering, indicating mode switches. And when only one comes up, the user interface keeps locking up hard all the time. Eventually I noticed, if I turn off the display stuck asleep, it fixes itself. But FreeSync/Adaptive Sync is apparently super terrible.
|
# ? Jan 17, 2019 05:12 |
|
I used to have that problem with dp on my 980ti, and was only solved by going to hdmi. I thought for sure it was my monitor but when I got my 1080ti I tried dp with it and all was fine so I've been using it since.
|
# ? Jan 17, 2019 06:00 |
|
Combat Pretzel posted:Meh, after all these driver updates, I still have the issue that sometimes, my displays don't wake up anymore until I powercycle the computer. Here and there, it's also just one display that doesn't come up while the other does, and I get the distinct feeling, it's yet another GSync issue. Because when it happens, the LED keeps flickering, indicating mode switches. And when only one comes up, the user interface keeps locking up hard all the time. Eventually I noticed, if I turn off the display stuck asleep, it fixes itself. But FreeSync/Adaptive Sync is apparently super terrible. And you're sure it's not a "monitor deep sleep" option etc in the monitor software itself? That is the cause of that issue like 99% of the time I hear about it
|
# ? Jan 17, 2019 06:07 |
|
Which is quieter? EVGA GeForce RTX 2060 XC ULTRA (one large fan) or EVGA GeForce RTX 2060 XC (two smaller fans) The EVGA site only talks about space. In theory, I've been led to believe bigger fans are quieter. Can I apply that reasoning to conclude the XC Ultra is quieter?
|
# ? Jan 17, 2019 06:42 |
|
Statutory Ape posted:And you're sure it's not a "monitor deep sleep" option etc in the monitor software itself? That is the cause of that issue like 99% of the time I hear about it
|
# ? Jan 17, 2019 06:45 |
|
daab posted:Which is quieter? They both seem to have two fans...? Usually single fan cards are ITX cards, which are smaller, hotter and louder.
|
# ? Jan 17, 2019 06:53 |
|
daab posted:Which is quieter? The only one I can find noise data for is the Ultra. 28dB at idle and 36.6dB at full load. Hace posted:They both seem to have two fans...? The Ultra has 2 fans and the XC Gaming and XC Black only have 1.
|
# ? Jan 17, 2019 07:02 |
|
That single fan doesn't look any bigger than the dual fans though, it's just a smaller heatsink to save costs and for small form factor builds. And heatsink surface area is a more important factor.
|
# ? Jan 17, 2019 07:11 |
|
Yeah the single fan card is just smaller in size, the fan is the same diameter. It'll be hotter and louder.
|
# ? Jan 17, 2019 07:29 |
|
Llamadeus posted:That single fan doesn't look any bigger than the dual fans though, it's just a smaller heatsink to save costs and for small form factor builds. And heatsink surface area is a more important factor. Yeah after looking at it again, it appears you're correct...same size fans. So I guess it's just to make it shorter, but fatter. And now I'm wondering if the heatsink surface area is actually more or less, and I'm worried I need to use trigonometry. Maybe more importantly, the XC Ultra says its "Boost Speed" is 1830 MHz compared to the XC's 1755 MHz. $20 CAD extra for 75 MHz more, a longer heatsink, and another fan. Hmmmm.....decisions.... Which one should I go for?
|
# ? Jan 17, 2019 07:37 |
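No trigonometry needed; to a first approximation the comparison is just fin count times fin face area. The dimensions below are entirely made up, purely to illustrate how a shorter-but-taller fin stack can land close to a longer, slimmer one:

```python
def fin_area_cm2(n_fins, fin_len_cm, fin_height_cm):
    """Rough heatsink surface area: two exposed faces per fin.
    Ignores the base plate and fin edges; fine for a comparison."""
    return n_fins * 2 * fin_len_cm * fin_height_cm

# Hypothetical dimensions, purely illustrative
short_fat = fin_area_cm2(40, 17, 4.0)   # shorter card, taller fin stack
long_slim = fin_area_cm2(40, 27, 2.5)   # longer two-fan card, slimmer stack
```

Total area being similar doesn't make cooling equal, of course: airflow per fin and fan speed (hence noise) still favor the two-fan layout.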
|
I would take the Ultra. Like Hace said, the fans are the same size so you can almost guarantee that the single fan will be louder and hotter.
|
# ? Jan 17, 2019 07:44 |
|
daab posted:Yeah after looking at it again, it appears you're correct...same size fans. So I guess it's just to make it shorter, but fatter. And now I'm wondering if the heatsink surface area is actually more or less and I'm worried I need to use trigonometry. I would take the one with dual fans and undervolt/underclock it to ~75% of the target power limit. That'd only cost ~3% performance and the card will be almost inaudible.
|
# ? Jan 17, 2019 07:52 |
|
eames posted:I would take card the one with dual fans and undervolt/underclock it to ~75% of the target power limit. That'd only cost 3% performance and the card will be almost inaudible. Interesting idea. I have that 1070 itx come in, I might try this if it turns out louder than anticipated.
|
# ? Jan 17, 2019 07:59 |
|
Hm, so far none of the 2060 cards have a zero-fan mode. Wonder when those will show up.
|
# ? Jan 17, 2019 09:32 |
|
eames posted:I would take card the one with dual fans and undervolt/underclock it to ~75% of the target power limit. That'd only cost 3% performance and the card will be almost inaudible. I should give this a try. Been crunching with Folding@Home and would like my 1080 to be quieter.
|
# ? Jan 17, 2019 09:49 |
|
Does anyone have any experience with the RTX FE coolers? Are they quiet on desktop usage?
|
# ? Jan 17, 2019 10:06 |
|
Gunder posted:Does anyone have any experience with the RTX FE coolers? Are they quiet on desktop usage? I didn’t notice this at first, but EVGA’s single-fan 2060s (the XC and Black) are three-slot cards. That’s probably better cooling than a two-slot iTX card, but what case wants a short three-slot card? Most iTX cases are two-slot max.
|
# ? Jan 17, 2019 10:27 |
|
lllllllllllllllllll posted:Hm, so far none of the 2060 cards have a zero-fan mode. Wonder when those will show up. Apparently EVGA’s dual-fan card (XC Ultra) does, and I wouldn’t be surprised if their other two do as well.
|
# ? Jan 17, 2019 10:32 |
|
Stickman posted:Apparently EVGA’s dual-fan card (XC Ultra) does, and I wouldn’t be surprised if their other two do as well. I was looking at itx 20xx cards and thought these looked totally pointless.
|
# ? Jan 17, 2019 10:49 |
|
Yeah evga's marketing on the 2060 is kinda bad. The XC non-Ultra cooler on the 2070, 2080, and 2080 Ti is the XC Ultra cooler on the 2060. I feel like some people are going to buy the 2060 XC Ultra expecting the fat cooler and not get it. In evga's defense, they would say that "Black", "Gaming", "XC" etc. don't refer to specific coolers, but to their position within evga's own product stack. But like so many things in evga marketing, it's really confusing. Especially when other brands do refer to the cooler in the name (Asus Dual, MSI Armor, etc), and evga does too when it comes to the blower/hydro/FTW3. Evga is just consistently sort of a mess with its SKUs.
|
# ? Jan 17, 2019 12:13 |
|
I did the upgrade to 417 yesterday and did some testing on my Samsung 32UJ590 display (I know it's a cheap one but I use my PC so little I don't notice the shortcomings). I wanted to add two things for whoever wants to activate G-Sync Compatible on a Samsung monitor: - You need to activate FreeSync on the monitor first, then open the Nvidia panel - Use FreeSync Standard mode; Extreme mode is nausea-inducing (which was likely the mode in the YouTube CES test someone posted a few pages back). If you want a quick test, the old Nvidia pendulum demo is still up and will trigger the monitor's FreeSync faults.
|
# ? Jan 17, 2019 13:16 |
|
How does pricing on non-recent model video cards work? Do prices on new cards ever drop, or are they like Intel computer chips in that they never lower price? I'd like to get a 1060 6GB at some point in time to upgrade my computer which is only a few years old, but would like to spend < $100 on it. Will I ever be able to buy a new one for that price, or is the only option eBay?
|
# ? Jan 17, 2019 13:25 |
|
|
|
silence_kit posted:How does pricing on non-recent model video cards work? Do prices on new cards ever drop, or are they like Intel computer chips in that they never lower price? eBay, shop clearance sales (we forgot we had that one / display case), or refurbished items. New ones are unlikely to stay on the shelves for more than a few weeks.
|
# ? Jan 17, 2019 13:33 |