repiv
Aug 13, 2009

v1ld posted:

Thanks for the explanation.

I was and am kinda hoping that a 3080 Ti with DLSS 2.71828 cranked up can render an image based on the laws of physics, circa whenever we figured out basic reflection and refraction (say, Newton?), without too many baked-in assists. Sounds like we're still a long way away from doing that at, say, 3440x1440@60Hz.

Minecraft RTX is basically that and it's murder to run on current hardware even with DLSS 2.0; we're a way off from being able to render current AAA-quality scenes with those techniques :pcgaming:

There was a SIGGRAPH talk from an Nvidia researcher outlining his predicted timeline for real-time raytracing, and his (probably optimistic) estimate for generalized pathtracing in games was the year of our lord 2035.


shrike82
Jun 11, 2005

i'm lolling at Microsoft trying to market XSX as a 4K120 box

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
You also have to keep in mind that a game like CP2077 is built around current lighting techniques, with raytracing added on. Future games designed with raytracing-capable hardware as the primary or only target will handle lighting differently and will make different choices about what to sacrifice for optimization's sake.

Indiana_Krom
Jun 18, 2007
Net Slacker

repiv posted:

Minecraft RTX is basically that and it's murder to run on current hardware even with DLSS 2.0; we're a way off from being able to render current AAA-quality scenes with those techniques :pcgaming:

There was a SIGGRAPH talk from an Nvidia researcher outlining his predicted timeline for real-time raytracing, and his (probably optimistic) estimate for generalized pathtracing in games was the year of our lord 2035.

Yeah, doing the whole lighting model in ray tracing is expensive, but it is worth noting that it doesn't become any more expensive in a current AAA game than it is in Minecraft RTX or Quake 2 RTX. Basically, the expense of switching your entire lighting model to ray tracing is a largely fixed one, which means it is a lot closer to becoming a thing in current AAA-quality scenes than you would think.

Twibbit
Mar 7, 2013

Is your refrigerator running?
The more complex the geometry, the longer it takes to build the BVH structure each frame, which is a CPU cost.

repiv
Aug 13, 2009

More complex geometry takes longer for the RT cores to traverse too. Raytracing has the nice property that the traversal cost scales logarithmically with scene complexity, unlike raster, which is more linear, but going from trivial Minecraft geometry to today's AAA assets still hurts.

Minecraft's geometry is also perfectly axis-aligned, the best-case scenario for BVH construction.
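
(An aside, not from the thread: a toy cost model with made-up constants, and no culling/LOD on the raster side, just to illustrate the log-versus-linear scaling described above. `bvh_traversal_cost` and `raster_cost` are hypothetical helpers, not any real API.)

```python
import math

def bvh_traversal_cost(num_tris: int, rays: int, tests_per_level: float = 1.5) -> float:
    """Each ray descends a roughly balanced BVH: ~log2(N) levels, a few node tests per level."""
    return rays * math.log2(max(num_tris, 2)) * tests_per_level

def raster_cost(num_tris: int, per_tri_work: float = 1.0) -> float:
    """Naive raster touches every submitted triangle, so work grows linearly with N."""
    return num_tris * per_tri_work

rays = 1920 * 1080  # one primary ray per pixel at 1080p
for tris in (10_000, 1_000_000, 100_000_000):  # Minecraft-ish up to AAA-ish
    print(f"{tris:>11,} tris  trace={bvh_traversal_cost(tris, rays):.3g}  "
          f"raster={raster_cost(tris):.3g}")
```

The trace column barely moves across four orders of magnitude of scene complexity while the raster column explodes, which is the shape of the argument; real renderers complicate both sides considerably.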

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Anyone else have the most recent driver update break RTX Voice? That's all that's changed on my setup and my mic is all choppy now. Worked great before.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

shrike82 posted:

i'm lolling at Microsoft trying to market XSX as a 4K120 box

Kinda reminds me of this:

CaptainSarcastic
Jul 6, 2013



I've been playing Metro Exodus recently and it's pretty nice-looking but I feel like I expected a bit more. I have found that with everything maxed out and RT set to performance it runs more smoothly if I turn on v-sync, which seems weird to me. It runs at a solid 60fps that way, whereas I swear with v-sync off I was getting better framerates but it was less smooth.

I haven't installed the new driver yet - might see if that makes any difference.

Also, I've seen some ridiculously bad physics happening with the mutants so far. They get stuck on things, stutter, and their collision in general seems wonky. Anybody else seen that? Other enemies behave as expected, but the mutants seem a bit broken.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
I recall a few times I’ve killed a mutant so hard they partially clip into something and wiggle around a bit, but it never really happened enough that it stood out to me. I wonder if framerate and physics are tied together - Fallout 4 got really weird when I forced it to play at 100fps.

Burno
Aug 6, 2012

Dogen posted:

Anyone else have the most recent driver update break RTX Voice? That's all that's changed on my setup and my mic is all choppy now. Worked great before.

I was getting behind-the-scenes RTX Voice crashes (the program kept working but the input stopped) and a choppy mic; when it's choppy I also lose like 30fps in games, and both were happening more frequently over time. Seems RTX Voice got pushed out with the pandemic and left to fend for itself. I ditched it last week.

sauer kraut
Oct 2, 2004

Dogen posted:

Anyone else have the most recent driver update break RTX Voice? That's all that's changed on my setup and my mic is all choppy now. Worked great before.

Yeah, apparently with this new Windows 10 2004 hardware scheduling thing enabled, RTX Voice on Pascal cards gets reamed when running alongside a demanding game/GPU load.
Unsurprisingly, so does Chrome hardware acceleration.

Try disabling it in Windows' graphics settings. Not sure if you need the Windows Store Nvidia control app to do that.
Or revert to the driver before this one; it's the first to support this wonderful Windows feature.
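
(Not from the thread: if you'd rather script the toggle, the Settings switch is backed by the HwSchMode registry value on Windows 10 2004 and later, where 2 means enabled and 1 means disabled. A minimal sketch, assuming an elevated prompt and a reboot afterwards:)

```python
import winreg

# Hardware-accelerated GPU scheduling toggle (Windows 10 2004+).
# HwSchMode: 2 = enabled, 1 = disabled. Requires admin rights and a reboot.
KEY = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY, 0, winreg.KEY_SET_VALUE) as k:
    winreg.SetValueEx(k, "HwSchMode", 0, winreg.REG_DWORD, 1)  # 1 = off

print("Hardware-accelerated GPU scheduling disabled; reboot to apply.")
```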

sauer kraut fucked around with this message at 12:48 on Jun 26, 2020

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



future ghost posted:

FWIW my MSI 2080 Ti doesn't idle at all if I have more than one monitor connected. Primary is 1440p 165Hz and secondary is 1440p 60Hz. Setting the primary to 144Hz has the same issue, using a 1080p 60Hz secondary made no difference, and it still persists with the latest Nvidia drivers. I had to put the secondary panel on the onboard GPU to get idle clocks back. Prepare to invest in DisplayLink adapters or a low-end separate GPU just in case.

Stating again that using the Multi Display Power Saver from Nvidia Inspector easily fixes this with very low effort and no other noticeable impact.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

buglord posted:

I wonder if framerate and physics are tied together - Fallout 4 got really weird when I forced it to play at 100fps.

I believe they are indeed tied together for Fallout 4, yeah.
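
(An aside, not from the thread: Fallout 4's engine steps physics once per rendered frame, which is why uncapping the framerate breaks it. The standard engine-design fix is a fixed timestep decoupled from rendering; here is a generic sketch of that pattern, not Bethesda's actual code, with `world` and `render` as hypothetical stand-ins.)

```python
import time

PHYSICS_DT = 1.0 / 60.0  # physics always advances in fixed 60Hz steps

def game_loop(world, render):
    """Fixed-timestep loop: the render rate can vary, the physics dt never does."""
    previous = time.perf_counter()
    accumulator = 0.0
    while world.running:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= PHYSICS_DT:  # catch up in fixed increments
            world.step(PHYSICS_DT)        # physics never sees a variable dt
            accumulator -= PHYSICS_DT
        render(world)                     # draw at whatever fps the GPU manages
```

When an engine instead steps physics once per rendered frame, forcing 100fps shrinks the effective dt, and anything tuned for 60fps (springs, impulses, collision resolution) starts to misbehave, which matches what buglord describes.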

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

mcbexx posted:

Stating again that using the Multi Display Power Saver from Nvidia Inspector easily fixes this with very low effort and no other noticeable impact.
I looked into that, but it seemed like you had to go through whitelisting applications to make it work correctly, based on what I read. Putting the secondary screen on the Intel GPU was zero effort.

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



future ghost posted:

I looked into that, but it seemed like you had to go through whitelisting applications to make it work correctly, based on what I read. Putting the secondary screen on the Intel GPU was zero effort.

You may be confusing it with the RivaTuner Statistics Server, which has the option to assign settings on a per-app basis; the MDPS is a set-and-forget configuration.

I no longer have an iGPU, so this is the one that works for me.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

mcbexx posted:

Stating again that using the Multi Display Power Saver from the Nvidia Inspector easily fixes this with very low effort and no other noticable impact.

With this enabled, my 1080 Ti pushing 3 monitors will actually drop down so low in power use that it can't muster the effort to support all three screens and will crash randomly. It works for some people, but not for others. It's certainly worth a shot if you've got a card that won't idle, though.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

DrDork posted:

With this enabled, my 1080 Ti pushing 3 monitors will actually drop down so low in power use that it can't muster the effort to support all three screens and will crash randomly. It works for some people, but not for others. It's certainly worth a shot if you've got a card that won't idle, though.



this feels like a humble brag

Shaocaholica
Oct 29, 2002

Fig. 5E
Ok, so, laptop gaming. How do I know which GPU is going to get used when my laptop has a dGPU and an iGPU? Does Windows try to use the dGPU when on wall power and the iGPU on battery? What's the best way to configure this, if possible?

Arzachel
May 12, 2012

Shaocaholica posted:

Ok, so, laptop gaming. How do I know which GPU is going to get used when my laptop has a dGPU and an iGPU? Does Windows try to use the dGPU when on wall power and the iGPU on battery? What's the best way to configure this, if possible?

GPU drivers have a whitelist for games; sometimes it fucks up and you have to force one manually, either through the driver control panel or through the right-click context menu (for Nvidia, at least; I've never had an AMD GPU in a laptop).

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Statutory Ape posted:

this feels like a humble brag

It's not. It means that I can't use Nvidia Inspector to fix the idling issue, so I'm stuck burning ever so slightly more electricity to heat my room whether I want to or not. I'd give these new drivers a shot, but honestly for me RTX Voice is worth more than fixing idle clocks right now.

Humble bragging would be noting the FPS the 1080 Ti gets me in Stardew Valley. Outstanding frame pacing on it, really. Just the best.

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Shaocaholica posted:

Ok, so, laptop gaming. How do I know which GPU is going to get used when my laptop has a dGPU and an iGPU? Does Windows try to use the dGPU when on wall power and the iGPU on battery? What's the best way to configure this, if possible?

You can force the dGPU to be the one used through the Nvidia control panel; you'll find the option in the general settings. You can also set it program by program.
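
(Not from the thread: since Windows 10 1803 there's also a per-app GPU preference under Settings > System > Display > Graphics settings, stored per executable in the registry. A minimal sketch; the game path here is hypothetical, and GpuPreference uses 1 for power saving (usually the iGPU) and 2 for high performance (usually the dGPU):)

```python
import winreg

EXE = r"C:\Games\SomeGame\game.exe"  # hypothetical path; substitute your own
KEY = r"Software\Microsoft\DirectX\UserGpuPreferences"

# Per-app GPU preference backing the Windows Graphics settings page:
# "GpuPreference=1;" = power saving (iGPU), "GpuPreference=2;" = high performance (dGPU).
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY) as k:
    winreg.SetValueEx(k, EXE, 0, winreg.REG_SZ, "GpuPreference=2;")
```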

originalnickname
Mar 9, 2005

tree
Hey, so... I think the DisplayPort controller on my 1080 Ti is dead. Does anyone have any suggestions for possibly re-animating it, or testing whether it's actually dead? I'm 3 whole months out of warranty :(

Things I've tried:

-Tried all the ports
-Tried a different cable (2 different ones)
-Tried all the ports on the monitor
-Tried another monitor
-Tried to run that DP 1.4 BIOS patch
-Tried a clean driver reinstall
-Installed Linux and tried on that.

I'm not holding out much hope, but any suggestions would be greatly appreciated. The HDMI port still works, so I'm running that, but no G-Sync/FreeSync this way.

Thanks!

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Does it show anything on the screen during POST with just that GPU installed? If not, and you've tested all that, you might have a dead card.

Twibbit
Mar 7, 2013

Is your refrigerator running?
Turned on GPU scheduling a while ago and noticed no major difference in usable game performance.

But I noticed a huge jump in the performance of things that are kind of pointless. The FFXIV main menu went from 250+ fps to just under 400. Even when rendering my character model on the character select screen I'm now at over 300 fps. This all goes away once I'm in-game and getting the normal performance. Turns out the only situations where scheduling on the CPU is the bottleneck are ones already well beyond the 165Hz of my monitor.

Nothing important about this, just thought it was amusing.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Man, this is the worst general timeframe in graphics cards: that stagnant middle ground where you know much, MUCH better stuff will be available in a few months, but you're just forced to wait with no information (especially since Nvidia seems to be playing chicken with AMD and holding their cards, as in playing cards, not graphics cards, till the last possible second), creating a scenario where we know next to nothing for sure.

Like holy poo poo, just post the 3090 you cowards. My wallet will grow a consciousness and buy the preorder before I even wake up in the morning.

Personally speaking, I feel like we've been waiting forever. Turing really should have had HDMI 2.1, and didn't, to much disappointment, but at this point, having HDMI 2.1 technology and having to wait for a card to support it is just torture. I can't even remember a time when it was this stupid, in terms of waiting for the tech stack to mature. There are these separate technologies that all have to come together at once, and in the meanwhile your fuckin' dick is flapping in the wind at 4K/60 waiting for the 120Hz support.

I realize this is not everyone's issue, and most people are sitting on smaller, higher-refresh panels, but goddamn, can the HDMI 2.1 era just loving start already, poo poo. OLED gaming is already taking the crown at 60Hz; 120Hz will be untouchable with OLED response times and color accuracy/infinite contrast.

It's not just people waiting for HDMI 2.1, either. Ray tracing needs to get less intensive immediately. The Turing cards (of which I own a 2080, so no bias here) will become a footnote. At best a step towards something good, and at worst a failure. A true low point in graphics card history. Let's move on to something better.

Taima fucked around with this message at 18:11 on Jun 28, 2020

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Taima posted:

Man, this is the worst general timeframe in graphics cards:

Nope, I've felt great about my pre-shitcoin-boom 1080 Ti purchase for way longer than I ever imagined possible. Who would have thought it actually represented great value?

HalloKitty fucked around with this message at 18:25 on Jun 28, 2020

Indiana_Krom
Jun 18, 2007
Net Slacker
I bought a 1080 about two months after they launched (before the Ti was even announced IIRC). I'm still using it.

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

With the exception of a couple of games, my ROG 1070 has held strong! The only reason I'm considering an upgrade now is that I'm a crazy keener who wants to run Cyberpunk on max graphics with zero stutters.

Just hoping that my 7700k can still carry it.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

HalloKitty posted:

Nope, I've felt great about my pre-shitcoin-boom 1080 Ti purchase for way longer than I ever imagined possible. Who would have thought it actually represented great value?

Sorry if I wasn't abundantly clear on this: I'm talking about the 2000 series, not a card launched over 3 years ago. You're actually just proving my point.


Indiana_Krom posted:

I bought a 1080 about two months after they launched (before the Ti was even announced IIRC). I'm still using it.

That's a good call. I only got the 2080 because I found it for $500 new. Otherwise I would be on my old poo poo too.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

HalloKitty posted:

Who would have thought it actually represented great value?

I got mine because it was right at the start of the inventory problems due to mining, and any reasonably priced 1080 was sold out.

B&H had an FTW3 at MSRP, and since I can only really complain about the difference in price between a 1080 and the Ti here, I will say I actually got lucky.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Taima posted:

It's not just people waiting for HDMI 2.1, either. Ray tracing needs to get less intensive immediately. The Turing cards (of which I own a 2080, so no bias here) will become a footnote. At best a step towards something good, and at worst a failure. A true low point in graphics card history. Let's move on to something better.

People have been trying to figure out how to make ray tracing less intensive for decades. Nobody's come up with anything substantive yet. The best we've got is limited scope plus really fast silicon.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Statutory Ape posted:

I got mine because it was right at the start of the inventory problems due to mining, and any reasonably priced 1080 was sold out.

I got mine because I managed to mine enough buttcoins on a launch 1080 that it paid for itself and also for the 1080 Ti upgrade. I also haven't upgraded since then because... why?

And yeah, HDMI 2.1/DP 2.0 needs to get here pronto so the real reason for all of this can start dropping on the market: 4K+ 120Hz+ monitors. I'm not upgrading poo poo until I can get a monitor compellingly better than the 3440x1440@100Hz I have right now.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

sauer kraut posted:

Yeah, apparently with this new Windows 10 2004 hardware scheduling thing enabled, RTX Voice on Pascal cards gets reamed when running alongside a demanding game/GPU load.
Unsurprisingly, so does Chrome hardware acceleration.

Try disabling it in Windows' graphics settings. Not sure if you need the Windows Store Nvidia control app to do that.
Or revert to the driver before this one; it's the first to support this wonderful Windows feature.

I hadn't turned it on, and it doesn't seem to be on by default, so it seems like the driver itself is the culprit. Oh well, it was cool while it worked; I hope they start updating it at some point.

Also, I tried turning on the new setting for kicks; it's annoying because it makes you reboot, and it doesn't really seem to do anything for performance.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I am still feeling optimistic about my 6700K/1080 Ti/850 Evo system and Cyberpunk 2077. I can probably see it running high (not highest) settings at 1440p60 with a slightly reduced resolution scale, and an NPC count reduction to ease the load on the ageing CPU.

Fingers crossed! If not, well, there's no better excuse to get upgrades :shobon:

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

originalnickname posted:

Hey, so... I think the DisplayPort controller on my 1080 Ti is dead. Does anyone have any suggestions for possibly re-animating it, or testing whether it's actually dead? I'm 3 whole months out of warranty :(

Things I've tried:

-Tried all the ports
-Tried a different cable (2 different ones)
-Tried all the ports on the monitor
-Tried another monitor
-Tried to run that DP 1.4 BIOS patch
-Tried a clean driver reinstall
-Installed Linux and tried on that.

I'm not holding out much hope, but any suggestions would be greatly appreciated. The HDMI port still works, so I'm running that, but no G-Sync/FreeSync this way.

Thanks!

I'd check for any physical damage first; then, depending on how comfortable you are with it, you can try re-flashing your card's BIOS with NVFlash. I've seen a few cases where certain features stop working or cards become unstable, and either flashing a new BIOS or re-flashing the existing one sometimes fixes the problem.
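
(Not from the thread: a sketch of the usual back-up-first NVFlash sequence, wrapped in Python just for annotation. The flag spellings are from memory, so confirm them against `nvflash --help` for your version, and only ever flash a BIOS intended for your exact card model.)

```python
import subprocess

def nv(*args):
    """Run an nvflash command, echoing it first."""
    print(">", "nvflash", *args)
    subprocess.run(["nvflash", *args], check=True)

nv("--save", "backup.rom")  # always dump the current BIOS before anything else
nv("--protectoff")          # some boards require disabling the EEPROM write protect
nv("backup.rom")            # re-flash (here, re-applying the freshly saved dump)
```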

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Zedsdeadbaby posted:

I am still feeling optimistic about my 6700K/1080 Ti/850 Evo system and Cyberpunk 2077. I can probably see it running high (not highest) settings at 1440p60 with a slightly reduced resolution scale, and an NPC count reduction to ease the load on the ageing CPU.

Fingers crossed! If not, well, there's no better excuse to get upgrades :shobon:

I actually think you're super wrong. The lack of DLSS 2.0 support will make your GPU look like poo poo in 2077. It will run, and if that's your concern then more power to you, but it will be absolutely nothing compared to even a 2000-series card, let alone a 3000-series one. A 2070 is going to run a train on the 1080 Ti in Cyberpunk, that's just the truth.

People who haven't tried DLSS 2.0 continue to underrate the enormous impact it will have in titles that support it (and increasingly that's looking like most titles with heavy GPU workloads).

Space Gopher posted:

People have been trying to figure out how to make ray tracing less intensive for decades. Nobody's come up with anything substantive yet. The best we've got is limited scope plus really fast silicon.

Just out of curiosity, do you feel that the 3000 series will have nothing to aid ray tracing beyond sheer rendering power? Not saying you're wrong at all, just curious whether you would elaborate on your viewpoint. That could be true for all we know.

Taima fucked around with this message at 21:31 on Jun 28, 2020

Cygni
Nov 12, 2005

raring to post

That's... a bit presumptuous considering the game doesn't even come out until November, and the company's last game still runs badly on top-end hardware with things maxed, explicitly because of how they implemented Nvidia-specific features. And we've also seen a grand total of like one big-name DLSS 2.0 implementation.

A 1080 Ti will run the game at 1440p fine; it certainly won't "look like poo poo".

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
CP2077 has to run on current gen consoles. It's going to be playable on a lot of really lovely systems.


Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
It's all relative. 2077 will run on a 1080 Ti... probably fairly well. It will, however, be far behind even 2000-level GPUs. That's just the reality of DLSS 2.0, let alone the other optimizations that have been made during the tenure of the 2000 series, and whatever witchcraft the 3000 series will employ in terms of ray tracing, etc.

My point is we're basically working from completely different goalposts. If the goal is "this game will run and look ok" then fine, conceded. If you think it's going to look anywhere near as good as it would on even a 2000-level system, you're dreaming.

Cygni posted:

That's... a bit presumptuous considering the game doesn't even come out until November, and the company's last game still runs badly on top-end hardware with things maxed, explicitly because of how they implemented Nvidia-specific features. And we've also seen a grand total of like one big-name DLSS 2.0 implementation.

You can be as skeptical as you want about DLSS 2.0, but as someone who has seen its effects across multiple titles, and given the incredibly vast amount of dev help that Nvidia is injecting into 2077, I think you're crazy if you think it won't make a giant difference. But all questions will be answered shortly and we can revisit it then.

DLSS 2.0 is as close to magic as we've come in my entire 20+ year tenure following graphics cards, starting with my good old Riva TNT in 1998. It took a while (DLSS 1.0 was poo poo), but we're here now, and it's the real deal. I'm totally ok with skeptics doubting it, because I've seen it firsthand :shrug:

e: and for the record, people might be thinking "didn't you just talk poo poo on the 2000 series," and yes I did, and do. The 2000 series needed DLSS 2.0 out of the gate, and Nvidia failed to make that happen. Now, when DLSS is making real inroads, we are already effectively in Ampere country. So I don't even really count it as a 2000 series feature, though that series of cards will benefit from it greatly.

Taima fucked around with this message at 21:54 on Jun 28, 2020
