TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Sininu posted:

I'm aware of that and I've even linked Blurbusters here and in the monitor thread more than a few times before.
CSGO's cap is so bad for me it can't even maintain a smooth 60 FPS. I have no idea what's wrong with it; other games I've played with in-game caps have worked way better.

E: more serious competitive people don't use caps at all in CSGO

I'd bet it's that way by design and there's nothing wrong with it, it's just not useful to you because it doesn't buffer. It's very useful for people who want minimum input lag on a VRR display though. On a fixed refresh rate display if you want minimum input lag you just uncap it and let it run at 300fps with no vsync and deal with the (minor, at that frame rate) tearing.

e: it's kinda obvious but worth pointing out that precisely achieving a given framerate is in fact exactly the same problem as keeping the rendering synchronized to the display refresh rate, and has the same solution (render into a buffer one frame in advance and wait for the right moment to present it - that is, vsync), since you can't predict exactly how long rendering will take.
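To make the "render one frame ahead, wait, then present" idea concrete, here's a rough Python sketch of a software frame limiter. The timings and function names are illustrative stand-ins, not anything from a real engine:

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # seconds between presents

present_times = []

def render_frame():
    # Stand-in for real rendering work; its duration varies unpredictably.
    time.sleep(0.004)

def present():
    # Stand-in for swapping the back buffer to the screen.
    present_times.append(time.perf_counter())

# Render one frame in advance, then wait for the scheduled slot to present it.
next_slot = time.perf_counter() + FRAME_TIME
for _ in range(6):
    render_frame()
    while time.perf_counter() < next_slot:  # spin until the "refresh" moment
        pass
    present()
    next_slot += FRAME_TIME
```

The point being: the limiter can't make rendering faster, it can only delay the present until the right moment, which is exactly what vsync does too.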

TheFluff fucked around with this message at 23:52 on Jan 15, 2019


SlayVus
Jul 10, 2009
Grimey Drawer
So this may be a very niche use case, but you can run both a GSync Monitor and FreeSync monitor at the same time. Both of my monitors are running with XSync enabled and both are not giving any kind of issues. My Pixio PX277 and Acer Predator X34 are running great without any kind of flickering or issues. I can run the Nvidia Pendulum demo between both monitors and it doesn't stutter or tear.

Craptacular!
Jul 9, 2001

Fuck the DH

Sininu posted:

I don't have hardware that can support VRR, (planning to change it this year.)

Okay, but the original post I was responding to was about "how to use Gsync properly."

I use in-game caps when available and don't even have RTSS or Nvidia Inspector installed on a Gsync monitor (but I also am not a CSGO player). I'm not saying there isn't scientifically-evident improvements in doing things differently, but none that someone of my age (mid-30s) can appreciate. I can appreciate 141 FPS over 60 FPS in Overwatch. But 120-141, especially in a game like League where I'm offered 120 or 144 but not single digit FPS values, is a much tighter comparison.

Craptacular! fucked around with this message at 00:00 on Jan 16, 2019

Sininu
Jan 8, 2014

Craptacular! posted:

Okay, but the original post I was responding to was about "how to use Gsync properly."

I use in-game caps when available and don't even have RTSS or Nvidia Inspector installed on a Gsync monitor (but I also am not a CSGO player). I'm not saying there isn't scientifically-evident improvements in doing things differently, but none that someone of my age (mid-30s) can appreciate. I can appreciate 141 FPS over 60 FPS in Overwatch. But 120-141, especially in a game like League where I'm offered 120 or 144 but not single digit FPS values, is a much tighter comparison.

I did jump the gun there yeah. When I saw you mention CSGO my bad experience with its cap immediately came to mind and I wanted to post how bad it was. Didn't consider what was posted before and if it would apply to my experience.

Stickman
Feb 1, 2004

craig588 posted:

I'm not sensitive to tearing so I needed to use my monitor's OSD to verify Free/Gsync was working. Normally I'm very against any objective measurements because if you can only notice up to 30 FPS, knowing that you're not running at 120 FPS is only going to make you sad even if you can't see it. If some people can't see the difference in different caps then they're fine, no need to chase perfection if they can't see the difference.

For competitive games, there's an argument to be made that even if an effect isn't noticeable in the conscious sense (like input lag), it might still have an effect on competitive performance. You'd probably want to do some blind tests to verify the actual effect size, though.

Speaking of blind tests, have you ever tried having a friend randomly cap your framerate at 30/60/90/120 and try to rate the qualitative experience? You might find that it's more noticeable than you think when you're not specifically looking for it. If you truly can't tell the difference, I'm kind of jealous. High refresh has ruined me and now all my future monitors are doomed to be ridiculously expensive.

craig588
Nov 19, 2005

by Nyc_Tattoo
60 to 120 is easy for me to see, 120 to 144 I can only notice side by side.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

SlayVus posted:

So this may be a very niche use case, but you can run both a GSync Monitor and FreeSync monitor at the same time. Both of my monitors are running with XSync enabled and both are not giving any kind of issues. My Pixio PX277 and Acer Predator X34 are running great without any kind of flickering or issues. I can run the Nvidia Pendulum demo between both monitors and it doesn't stutter or tear.

That's really great to hear. I assume that when you do that it's capped to 80 or 100hz or whatever your Predator is clocked to?

SlayVus
Jul 10, 2009
Grimey Drawer

K8.0 posted:

That's really great to hear. I assume that when you do that it's capped to 80 or 100hz or whatever your Predator is clocked to?

Yeah, both monitors are clocked at 100Hz just because of weird Windows issues when you try to run monitors at different refresh rates, like videos stuttering on YouTube. With the Pixio having such a wide FreeSync range (30-145), running it at 100 still lets FS run at 30-100, which gives me like a 3.3x scale range. I also purchased this Pixio refurbished like 2 or 3 months ago, so with this driver update I'm finally able to confirm all features of a refurb monitor I bought actually work. My X34 is a refurb as well lol, and it's been going strong for 2 years and 10 months.
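For anyone wondering where the 3.3x figure comes from, it's just the ratio of the FreeSync range endpoints (the ~2.5x threshold in the comment is AMD's usual rule of thumb for Low Framerate Compensation, quoted from memory):

```python
# FreeSync range ratio at the poster's settings: max refresh / min refresh.
fs_min_hz = 30
fs_max_hz = 100  # the panel goes to 145 Hz, but it's being run at 100 Hz here

ratio = fs_max_hz / fs_min_hz
print(f"FreeSync range ratio: {ratio:.1f}x")  # needs roughly >= 2.5x for LFC
```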

SlayVus fucked around with this message at 00:35 on Jan 16, 2019

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


I definitely appreciate 120+ fps in a game like Overwatch but for your standard AAA action-adventure title anything above 60fps starts to look real similar to me. Much more preferable in those instances to have an immovable 60fps cap than a jittery 75.

Craptacular!
Jul 9, 2001

Fuck the DH

exquisite tea posted:

I definitely appreciate 120+ fps in a game like Overwatch but for your standard AAA action-adventure title anything above 60fps starts to look real similar to me. Much more preferable in those instances to have an immovable 60fps cap than a jittery 75.

I've said before that VRR, particularly well-implemented VRR that's often not cheap, reminds me of when people talk about Nintendo having "good enough tech". High refresh monitors are really nice sometimes, but while I'm okay turning off reflections in Overwatch to hit a solid 140+, I'm a lot less likely to do that in a Final Fantasy campaign. We don't always want all that FPS given what has to be sacrificed for it in every single game. 90 average in Monster Hunter isn't ideal, but only because the game could be coded so much better. On the other hand, the fact that I can sit solidly at 90FPS and not complain of tears or stutters, for either having too much FPS or not enough, is really cool.

At the risk of getting a little too warez-y for this forum, I particularly like it with emulated Breath of the Wild because it looks great without its wildly unpredictable framerate being that much of a problem.

codo27
Apr 21, 2008

Could a 2070 drive 1440p @high frames? What's the window for step up?

Craptacular!
Jul 9, 2001

Fuck the DH

codo27 posted:

Could a 2070 drive 1440p @high frames? What's the window for step up?

Yes? But if that's your only objective, a 1070ti or a Vega 56 (depending on your monitor) might be a better value.

Stickman
Feb 1, 2004

Sure, it depends on the game and settings.

Here's Doom 2016 averaging 170 on Ultra. Games like Witcher 3 and Far Cry 5 are closer to 100-110 on Ultra, and the newest stuff like Shadow of the Tomb Raider and AC: Odyssey are closer to 60 unless you turn down some settings.

E: Subtract ~15% for a 1070 Ti/2060, add 15-25% for a 2080/1080 Ti (or 30-50% for a 2080 Ti, depending on CPU/initial fps).

EE: Step-Up is 90 days to apply.
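Treating those percentages as straight multipliers, a quick sketch of what the estimates come out to. The multipliers are the rules of thumb above, not benchmark data, and the 2070 baselines are rounded from the numbers in this post:

```python
# Rough fps estimates for other cards, scaled from RTX 2070 figures (Ultra).
rtx2070_fps = {"Doom 2016": 170, "Witcher 3": 105, "Shadow of the Tomb Raider": 60}

scale = {
    "1070 Ti / 2060": 0.85,  # "subtract ~15%"
    "2080 / 1080 Ti": 1.20,  # "add 15-25%" (midpoint)
    "2080 Ti": 1.40,         # "30-50% more" (midpoint; CPU-dependent)
}

estimates = {
    card: {game: round(fps * factor) for game, fps in rtx2070_fps.items()}
    for card, factor in scale.items()
}
for card, est in estimates.items():
    print(card, est)
```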

Stickman fucked around with this message at 03:23 on Jan 16, 2019

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

codo27 posted:

Could a 2070 drive 1440p @high frames? What's the window for step up?

Yeah totally, just drop some settings from ULTRA based on some performance videos and you will be at great frame rates

Greensync indeed works on my Freesync AOC AGON panel :getin: Don’t forget to reapply your colour correction profiles after the driver update if you do a clean install.

Mindblast
Jun 28, 2006

Moving at the speed of death.


Thanks for the info regarding upgrading and that msi 1070 itx card. After reading more reviews I pulled the trigger. Will have it by the end of the week.

It's interesting to see how these modern SFF cards actually hold their own without sacrificing a lot. This one is even reported to be pretty silent under load despite having only one fan.

Stickman
Feb 1, 2004

RTX 2060s are in stock now for msrp ($350) + free games.

Benchmarks have them mostly very slightly outperforming 1070 Tis and occasionally outperforming the 1080 (Wolfenstein II at 1440p). The -2GB of RAM cause 99th percentile frame times to drop below the 1070 at 4K in some games (also Wolfenstein II).

https://www.anandtech.com/show/13762/nvidia-geforce-rtx-2060-founders-edition-6gb-review/10

https://www.guru3d.com/articles-pages/msi-geforce-rtx-2060-gaming-z-review,1.html

With 1070-Ti-level power consumption, it seems pretty solid against the Vega 56 until those prices come down. Definitely too expensive for a 1060 replacement, but we already knew that and the 580/90's got us covered for now!

Stickman fucked around with this message at 17:31 on Jan 16, 2019

alex314
Nov 22, 2007

Looks like it's 400€ in EU, as usual..

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Meh, after all these driver updates, I still have the issue that sometimes, my displays don't wake up anymore until I powercycle the computer. Here and there, it's also just one display that doesn't come up while the other does, and I get the distinct feeling, it's yet another GSync issue. Because when it happens, the LED keeps flickering, indicating mode switches. And when only one comes up, the user interface keeps locking up hard all the time. Eventually I noticed, if I turn off the display stuck asleep, it fixes itself. But FreeSync/Adaptive Sync is apparently super terrible.

slidebite
Nov 6, 2005

Good egg
:colbert:

I used to have that problem with DP on my 980 Ti, and it was only solved by switching to HDMI. I thought for sure it was my monitor, but when I got my 1080 Ti I tried DP with it and all was fine, so I've been using it since.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Combat Pretzel posted:

Meh, after all these driver updates, I still have the issue that sometimes, my displays don't wake up anymore until I powercycle the computer. Here and there, it's also just one display that doesn't come up while the other does, and I get the distinct feeling, it's yet another GSync issue. Because when it happens, the LED keeps flickering, indicating mode switches. And when only one comes up, the user interface keeps locking up hard all the time. Eventually I noticed, if I turn off the display stuck asleep, it fixes itself. But FreeSync/Adaptive Sync is apparently super terrible.

And you're sure it's not a "monitor deep sleep" option or the like in the monitor's own settings? That is the cause of that issue like 99% of the time I hear about it.

daab
Nov 14, 2005
Shit, Negro, that's all you had to say!
Which is quieter?

EVGA GeForce RTX 2060 XC ULTRA (one large fan)
or
EVGA GeForce RTX 2060 XC (two smaller fans)

The EVGA site only talks about space.

In theory, I've been led to believe bigger fans are quieter. Can I apply that reasoning to conclude the XC Ultra is quieter?

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Statutory Ape posted:

And you're sure it's not a "monitor deep sleep" option or the like in the monitor's own settings? That is the cause of that issue like 99% of the time I hear about it.
Pretty sure, because it's a problem that only showed up with the 400 series drivers.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

daab posted:

Which is quieter?

EVGA GeForce RTX 2060 XC ULTRA (one large fan)
or
EVGA GeForce RTX 2060 XC (two smaller fans)

The EVGA site only talks about space.

In theory, I've been led to believe bigger fans are quieter. Can I apply that reasoning to conclude the XC Ultra is quieter?

They both seem to have two fans...?

Usually single fan cards are ITX cards, which are smaller, hotter and louder.

Regrettable
Jan 5, 2010



daab posted:

Which is quieter?

EVGA GeForce RTX 2060 XC ULTRA (one large fan)
or
EVGA GeForce RTX 2060 XC (two smaller fans)

The EVGA site only talks about space.

In theory, I've been led to believe bigger fans are quieter. Can I apply that reasoning to conclude the XC Ultra is quieter?

The only one I can find noise data for is the Ultra. 28dB at idle and 36.6dB at full load.
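For a rough sense of what that 8.6 dB jump means, here's a sketch using the common (and only approximate) rule of thumb that +10 dB reads as about twice as loud:

```python
# Idle vs. load noise for the EVGA 2060 XC Ultra, from the figures above.
idle_db = 28.0
load_db = 36.6

delta_db = load_db - idle_db
# Rule of thumb: every +10 dB is roughly a perceived doubling of loudness.
perceived_ratio = 2 ** (delta_db / 10)
print(f"+{delta_db:.1f} dB -> ~{perceived_ratio:.1f}x as loud at full load")
```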

Hace posted:

They both seem to have two fans...?

Usually single fan cards are ITX cards, which are smaller, hotter and louder.

The Ultra has 2 fans and the XC Gaming and XC Black only have 1.

Llamadeus
Dec 20, 2005
That single fan doesn't look any bigger than the dual fans though, it's just a smaller heatsink to save costs and for small form factor builds. And heatsink surface area is a more important factor.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
Yeah the single fan card is just smaller in size, the fan is the same diameter. It'll be hotter and louder.

daab
Nov 14, 2005
Shit, Negro, that's all you had to say!

Llamadeus posted:

That single fan doesn't look any bigger than the dual fans though, it's just a smaller heatsink to save costs and for small form factor builds. And heatsink surface area is a more important factor.

Yeah, after looking at it again, it appears you are correct... same size fans. So I guess it's just to make it shorter but fatter. And now I'm wondering if the heatsink surface area is actually more or less, and I'm worried I need to use trigonometry.

Maybe more importantly, the XC Ultra says its "Boost Speed" is 1830 MHz compared to the XC's 1755 MHz. $20 CAD extra for 75 MHz more, a longer heatsink, and another fan. Hmmmm.....decisions....

Which one should I go for?
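Back-of-the-envelope on that $20, using only the numbers from the post above:

```python
# Rated boost clocks for the two EVGA RTX 2060 variants being compared.
xc_boost_mhz = 1755
ultra_boost_mhz = 1830
price_delta_cad = 20

gain_pct = (ultra_boost_mhz - xc_boost_mhz) / xc_boost_mhz * 100
print(f"+{ultra_boost_mhz - xc_boost_mhz} MHz "
      f"(~{gain_pct:.1f}% boost clock) for ${price_delta_cad} CAD")
```

About a 4% rated clock bump; the extra fan and longer heatsink are arguably worth more than the clocks themselves.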

Regrettable
Jan 5, 2010



I would take the Ultra. Like Hace said, the fans are the same size so you can almost guarantee that the single fan will be louder and hotter.

eames
May 9, 2009

daab posted:

Yeah after looking at it again, it appears you be correct...same size fans. So I guess its just to make it shorter, but fatter. And now I'm wondering if the heatsink surface area is actually more or less and I'm worried I need to use trigonometry.

Maybe more importantly, the XC Ultra says its "Boost Speed" is 1830 MHz compared to the XC's 1755 MHz. $20 CAD extra for 75 MHz more, a longer heatsink, and another fan. Hmmmm.....decisions....

Which one should I go for?

I would take the one with dual fans and undervolt/underclock it to ~75% of the target power limit. That'd only cost ~3% performance and the card would be almost inaudible.
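A sketch of what that looks like in practice. The 160 W rating is an assumption for a reference RTX 2060; `nvidia-smi -pl` is the real switch for setting a power limit, though the supported range varies per card:

```python
# Hypothetical undervolt/power-limit target: ~75% of the card's rated TDP.
rated_watts = 160  # assumed reference RTX 2060 rating
target_watts = round(rated_watts * 0.75)
print(f"Set the power limit to ~{target_watts} W")
# Applied from an admin shell with, e.g.:
#   nvidia-smi -pl 120
# (MSI Afterburner's power-limit slider does the same thing on Windows.)
```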

Mindblast
Jun 28, 2006

Moving at the speed of death.


eames posted:

I would take the one with dual fans and undervolt/underclock it to ~75% of the target power limit. That'd only cost ~3% performance and the card would be almost inaudible.

Interesting idea. I have that 1070 ITX coming in; I might try this if it turns out louder than anticipated.

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
Hm, so far none of the 2060 cards have a zero-fan mode. Wonder when those will show up.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

eames posted:

I would take the one with dual fans and undervolt/underclock it to ~75% of the target power limit. That'd only cost ~3% performance and the card would be almost inaudible.

I should give this a try. Been crunching with Folding@Home and would like my 1080 to be quieter.

Gunder
May 22, 2003

Does anyone have any experience with the RTX FE coolers? Are they quiet on desktop usage?

Stickman
Feb 1, 2004

Gunder posted:

Does anyone have any experience with the RTX FE coolers? Are they quiet on desktop usage?

I didn’t notice this at first, but EVGA’s single-fan 2060s (the XC and Black) are three-slot cards. That’s probably better cooling than a two-slot iTX card, but what case wants a short three-slot card? Most iTX cases are two-slot max.

Stickman
Feb 1, 2004

lllllllllllllllllll posted:

Hm, so far none of the 2060 cards have a zero-fan mode. Wonder when those will show up.

Apparently EVGA’s dual-fan card (XC Ultra) does, and I wouldn’t be surprised if their other two do as well.

GRINDCORE MEGGIDO
Feb 28, 1985


Stickman posted:

Apparently EVGA’s dual-fan card (XC Ultra) does, and I wouldn’t be surprised if their other two do as well.

I was looking at itx 20xx cards and thought these looked totally pointless.

Cygni
Nov 12, 2005

raring to post

Yeah, EVGA's marketing on the 2060 is kinda bad. The XC non-Ultra cooler on the 2070, 2080, and 2080 Ti is the XC Ultra cooler on the 2060. I feel like some people are going to buy the 2060 XC Ultra expecting the fat cooler and not get it.

In EVGA's defense, they would say that "Black", "Gaming", "XC", etc. don't refer to specific coolers, but to their position within EVGA's own product stack. But like so many things in EVGA marketing, it's real confusing. Especially when other brands do refer to the cooler in the name (Asus Dual, MSI Armor, etc), and EVGA does too when it comes to the blower/hydro/FTW3. EVGA is just consistently sort of a mess with its SKUs.

SlowBloke
Aug 14, 2017
I did the upgrade to 417 yesterday and did some testing on my Samsung 32UJ590 display (I know it's a cheap one, but I use my PC so little I don't notice the shortcomings). I wanted to add two things for whoever wants to activate G-Sync Compatible on a Samsung monitor:

- You need to activate FreeSync on the monitor first, then open the Nvidia panel
- Use FreeSync standard mode; extreme mode is nausea inducing (which was likely the mode in the YouTube CES test someone posted a few pages back).

If you want a quick test, the old Nvidia pendulum demo is still up and will expose any FreeSync faults on the monitor.

silence_kit
Jul 14, 2011

by the sex ghost
How does pricing on non-recent model video cards work? Do prices on new cards ever drop, or are they like Intel computer chips in that they never lower price?

I'd like to get a 1060 6GB at some point in time to upgrade my computer which is only a few years old, but would like to spend < $100 on it. Will I ever be able to buy a new one for that price, or is the only option eBay?


alex314
Nov 22, 2007

silence_kit posted:

How does pricing on non-recent model video cards work? Do prices on new cards ever drop, or are they like Intel computer chips in that they never lower price?

I'd like to get a 1060 6GB at some point in time to upgrade my computer which is only a few years old, but would like to spend < $100 on it. Will I ever be able to buy a new one for that price, or is the only option eBay?

eBay, store clearance sales (the "we forgot we had that one" / display-unit kind), or refurbished items. New ones are unlikely to stay on the shelves for more than a few weeks.
