terrorist ambulance
Nov 5, 2009
10x0 cards don't have the hardware for ray tracing

repiv
Aug 13, 2009

terrorist ambulance posted:

10x0 cards don't have the hardware for ray tracing

They do have a software DXR implementation in the drivers, though.

I'm not sure it ever got updated for DXR 1.1, so that might be what's blocking it from working in Cyberpunk.
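
For anyone curious, the gate is just a D3D12 feature query. A minimal sketch of the kind of check a game would do (the feature query is the real API; the function name and the guess that Cyberpunk gates on tier 1.1 are mine):

#include <d3d12.h>

// Minimal sketch of a DXR capability check. The feature query is the
// standard D3D12 API; whether Cyberpunk gates on exactly this is
// speculation on my part.
bool SupportsDxr11(ID3D12Device5* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return false;

    // If Pascal's software fallback never got updated past DXR 1.0, it
    // reports TIER_1_0 here, and a game that requires TIER_1_1 (inline
    // raytracing etc.) would grey out its RT options on those cards.
    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1;
}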

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Oh! Maybe that's it, I'm not even running the latest drivers.

And yeah, it worked (really slowly) in Control and Quake II RTX before, so I was hoping to at least see what's going on.

hobbesmaster
Jan 28, 2008

mobby_6kl posted:

Oh! Maybe that's it, I'm not even running the latest drivers.

And yeah, it worked (really slowly) in Control and Quake II RTX before, so I was hoping to at least see what's going on.

At least Control has a photo mode you can use it in, iirc.

pyrotek
May 21, 2004



DeathSandwich posted:

Fair, but the framerate is noticeably worse at 1080p. I'm probably bottlenecked at both ends, so the answer basically turns back to "build or buy a proper desktop again, dumbass," which I've put the brakes on for so long because I've had zero issues with the Thunderbolt GPU dock on the laptop until now.

You can also go the other way. Lower the resolution as far as possible and see what kind of framerate you get. If you only get 30 FPS at 480p or whatever, you probably won't ever get above that.

There's also the possibility that bandwidth limitations from the PCIe 3.0 x4 Thunderbolt link could be hindering your performance. I know Forza Horizon 4 was massively limited by that, and it's possible this game is too.

DeathSandwich
Apr 24, 2008

I fucking hate puzzles.

pyrotek posted:

You can also go the other way. Lower the resolution as far as possible and see what kind of framerate you get. If you only get 30 FPS at 480p or whatever, you probably won't ever get above that.

There's also the possibility that bandwidth limitations from the PCIe 3.0 x4 Thunderbolt link could be hindering your performance. I know Forza Horizon 4 was massively limited by that, and it's possible this game is too.

What I've read online shows about a 20% GPU performance loss on external monitors when connected over Thunderbolt 3. Given that the 980 is already near the bottom of the minimum spec sheet, it makes sense that I'm getting the performance I'm getting.

Basically, I know what I'm going to be doing once I get my tax refund next year and (hopefully) things are reasonably back in stock, or at least scalper prices aren't as insane.

Edit: Gaming on the HP Spectre wasn't really intended to be a permanent solution to begin with. I brought my (granted, by that point a six-ish-year-old relic) desktop with me when I moved apartments two years ago, and it lasted about two weeks after the move before it ate poo poo and died. I got the eGPU setup because I had both the GTX 980 (it was the newest part of my old desktop rig and still pretty okay two years ago) and the laptop on hand, and the eGPU dock was significantly cheaper than buying a new desktop at that time. Once I actually got it up and running, it ran so much better than the old relic desktop for the games I was playing that I didn't really need to think about a desktop.

DeathSandwich fucked around with this message at 18:35 on Dec 10, 2020

v1ld
Apr 16, 2012

ijyt posted:

Nah I was never going under 60 fps. Dunno what LFC is.

Low Framerate Compensation, which some monitors have; it handles the case where you drop below your lower sync limit. If your monitor has it, check that it's enabled. It's possible you're seeing short, transient dips below the G-Sync limit. If not :iiam:

If your monitor doesn't do LFC, established thought ITT seems to be that enabling vsync in the NVIDIA software is the way around this for NVIDIA cards, so vsync kicks in when you're below the G-Sync lower limit. It shouldn't be enabled in the game itself, only in the NVIDIA software.

Animal
Apr 8, 2003

Ran CP on my Razer Blade with a GeForce 2070. At 1080p with DLSS on Quality, it runs mostly above 60 fps on High with RT turned wholly off. Looks pretty good, though RT really transforms this game. I feel you are missing out without it.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Anyone with a Big Navi card wanna share their experience?

Mr. Neutron
Sep 15, 2012

~I'M THE BEST~

terrorist ambulance posted:

10x0 cards don't have the hardware for ray tracing

I'm getting 5-6 fps in Quake II RTX on a 1070 Ti!

bus hustler
Mar 14, 2019

Not to make this the CP-only thread, but related: I think anyone on a 2080-class card or better should just ride this generation out. The jump from a 2080 or better to a 3080 isn't worth the price. This was supposed to be the "RT is here" game, and so far it's not.

Feeling pretty good about this whole console generation and getting a 4080 :coal:

Leaning toward DLSS = extremely for real, RT = next generation, maybe

gary oldmans diary
Sep 26, 2005
I'm on a 2060, but I'll just turn down the graphics settings until the scalpers are out of play.

repiv
Aug 13, 2009

I'm probably sitting this gen out as well; I play so few AAA games these days that this 2070S should last until Hopper comes around.

pyrotek
May 21, 2004



DeathSandwich posted:

What I've read online shows about a 20% GPU performance loss on external monitors when connected over Thunderbolt 3. Given that the 980 is already near the bottom of the minimum spec sheet, it makes sense that I'm getting the performance I'm getting.

Basically, I know what I'm going to be doing once I get my tax refund next year and (hopefully) things are reasonably back in stock, or at least scalper prices aren't as insane.

Edit: Gaming on the HP Spectre wasn't really intended to be a permanent solution to begin with. I brought my (granted, by that point a six-ish-year-old relic) desktop with me when I moved apartments two years ago, and it lasted about two weeks after the move before it ate poo poo and died. I got the eGPU setup because I had both the GTX 980 (it was the newest part of my old desktop rig and still pretty okay two years ago) and the laptop on hand, and the eGPU dock was significantly cheaper than buying a new desktop at that time. Once I actually got it up and running, it ran so much better than the old relic desktop for the games I was playing that I didn't really need to think about a desktop.

I actually had an HP Spectre x360 with an eGPU myself. In my experience, you only got that 20% performance drop if you used the laptop's screen or the outputs through your laptop, not if you used the outputs on the GPU itself. The problem is sharing the limited bandwidth. By using the outputs on the GPU itself, the PCIe lanes are only used for transferring data to and from the GPU, not for feeding a display.
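
Some napkin math on why feeding the laptop's own panel hurts (all numbers below are my own rough assumptions, not measurements):

#include <cstdio>

int main()
{
    // Rough assumption: Thunderbolt 3 carries PCIe 3.0 x4, call it
    // ~22 Gbps usable after protocol overhead.
    const double link_gbps = 22.0;

    // Driving the laptop's internal 1080p60 panel means every finished
    // frame travels back across that same link.
    const double pixels = 1920.0 * 1080.0;  // pixels per frame
    const double bits_per_px = 32.0;        // 8-bit RGBA framebuffer
    const double fps = 60.0;

    const double readback_gbps = pixels * bits_per_px * fps / 1e9;
    std::printf("readback: %.1f Gbps (%.0f%% of the link)\n",
                readback_gbps, 100.0 * readback_gbps / link_gbps);
    // Prints roughly 4.0 Gbps (18% of the link), in the same ballpark
    // as the ~20% loss people report for internal-display use.
    return 0;
}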

Most games worked well on that setup, with the exception of CPU-heavy games like Assassin's Creed Origins/Odyssey and bandwidth-heavy games like Forza Horizon 4.

GeForce Now might deliver better performance than your current setup. Maybe give that a shot until you can get a desktop?

bus hustler posted:

Not to make this the CP-only thread, but related: I think anyone on a 2080-class card or better should just ride this generation out. The jump from a 2080 or better to a 3080 isn't worth the price. This was supposed to be the "RT is here" game, and so far it's not.

Feeling pretty good about this whole console generation and getting a 4080 :coal:

Leaning toward DLSS = extremely for real, RT = next generation, maybe

It's worth checking how much you can sell your 2080 or better for. With the crazy prices used cards are going for at the moment, you might be able to upgrade for very little money if you can manage to get a new card.

pyrotek fucked around with this message at 19:44 on Dec 10, 2020

bus hustler
Mar 14, 2019

It's in the back of my mind, but I just put an aftermarket cooler on my 2080 Ti, and that usually tanks the resale; the stock cooler is a blower, which tanks the value too. My whole setup stays cool and barely makes noise over ambient, so I'm in a good place.

I'm at 1440p, so it's not really worth it. Mayyybe, since I'm not upgrading my CPU or anything for a long time, I'll get the itch on a 3080 Ti.

DeathSandwich
Apr 24, 2008

I fucking hate puzzles.

pyrotek posted:

I actually had an HP Spectre x360 with an eGPU myself. In my experience, you only got that 20% performance drop if you used the laptop's screen or the outputs through your laptop, not if you used the outputs on the GPU itself. The problem is sharing the limited bandwidth. By using the outputs on the GPU itself, the PCIe lanes are only used for transferring data to and from the GPU, not for feeding a display.

Most games worked well on that setup, with the exception of CPU-heavy games like Assassin's Creed Origins/Odyssey and bandwidth-heavy games like Forza Horizon 4.

GeForce Now might deliver better performance than your current setup. Maybe give that a shot until you can get a desktop?


It's worth checking how much you can sell your 2080 or better for. With the crazy prices used cards are going for at the moment, you might be able to upgrade for very little money if you can manage to get a new card.

I'll give GeForce Now a look tonight. I might have a lead on a 3070 that I can get my hands on this weekend. If I can get that, I'll pop it in and see how it flies; at worst, if it doesn't work, I'll use it in a desktop build after January.

skylined!
Apr 6, 2012

THE DEM DEFENDER HAS LOGGED ON

bus hustler posted:

It's in the back of my mind, but I just put an aftermarket cooler on my 2080 Ti, and that usually tanks the resale; the stock cooler is a blower, which tanks the value too. My whole setup stays cool and barely makes noise over ambient, so I'm in a good place.

I'm at 1440p, so it's not really worth it. Mayyybe, since I'm not upgrading my CPU or anything for a long time, I'll get the itch on a 3080 Ti.

In the same boat: on a 2080 (non-Super) with a water block on a custom loop. I sold the stock cooler (it's an EVGA Hybrid, so it came with an AIO water cooler and radiator), so I'd have to sell the card with the water block, or with no cooler at all lol.

Under the right circumstances it'd make sense, but those circumstances are pretty limited.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
Does anyone know if the GeForce Now codes that came with my 3080 are locked to people with a 30xx GPU installed, or can anyone use them? I have a few I don't need.

skylined!
Apr 6, 2012

THE DEM DEFENDER HAS LOGGED ON
edit: awesome, double post

Happy Noodle Boy
Jul 3, 2002


Fauxtool posted:

Does anyone know if the GeForce Now codes that came with my 3080 are locked to people with a 30xx GPU installed, or can anyone use them? I have a few I don't need.

You need to be on a system with a 30XX to redeem them, I believe.


Is it Cold War? I’d buy Cold War for cheap. You can generally keep an eye on eBay for NVIDIA codes, but CW is still $30+ there. It’s how I bought Monster Hunter World a while back for like $15.

Happy Noodle Boy fucked around with this message at 19:58 on Dec 10, 2020

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Fauxtool posted:

Does anyone know if the GeForce Now codes that came with my 3080 are locked to people with a 30xx GPU installed, or can anyone use them? I have a few I don't need.

The instructions on the GeForce site say you need to log into GeForce Experience, verify your 30xx-series card, then put the code in. It sucks; I wanted to give mine away too.

Also lol oops, I just realized that the new flow-through fan design on the 30xx series is now dumping heat directly onto my CPU heatsink...

change my name fucked around with this message at 20:17 on Dec 10, 2020

wolrah
May 8, 2006
what?

v1ld posted:

Low Framerate Compensation, which some monitors have; it handles the case where you drop below your lower sync limit. If your monitor has it, check that it's enabled. It's possible you're seeing short, transient dips below the G-Sync limit. If not :iiam:

LFC is a GPU-side thing, not a monitor-side thing. The monitor just has to have a variable sync range wide enough to support it, which from a technical standpoint means the maximum variable refresh rate must be at least double the minimum. For practical reasons the vendors prefer a larger range than that: AMD's slides when they introduced LFC said the maximum needs to be at least 2.5x the minimum, and support is required for any of the more advanced FreeSync branding beyond the basic tier. NVIDIA doesn't have any published resources I can find, but their G-Sync Compatible list doesn't have anything with a range smaller than 2.5x either, so it seems like they're going with that too.
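
To show why the doubled range is the technical floor, here's a toy version of the frame-repeat logic (my own sketch of the math, not actual driver code):

#include <cstdio>

// Below the panel's minimum VRR rate, LFC scans each frame out
// multiple times so the effective refresh stays inside the window.
int lfc_multiplier(double fps, double min_hz, double max_hz)
{
    if (fps >= min_hz)
        return 1;  // already inside the VRR window

    int k = 2;
    while (fps * k < min_hz)
        ++k;  // smallest repeat count that clears the floor

    // With max >= 2x min, some multiple always lands in [min, max];
    // with a narrower range there are frame rates no multiple can hit.
    return (fps * k <= max_hz) ? k : -1;  // -1: no valid multiple
}

int main()
{
    // 30 fps on a 48-144 Hz panel: doubled to 60 Hz, fits fine.
    std::printf("30 fps, 48-144 Hz -> x%d\n", lfc_multiplier(30, 48, 144));
    // 40 fps on a hypothetical 48-75 Hz panel: doubling gives 80 Hz,
    // over the max, so LFC can't help, hence the 2x requirement.
    std::printf("40 fps, 48-75 Hz  -> x%d\n", lfc_multiplier(40, 48, 75));
    return 0;
}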

Animal
Apr 8, 2003

Digital Foundry has their Cyberpunk 2077 tech tour up.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

change my name posted:

The instructions on the GeForce site say you need to log into GeForce Experience, verify your 30xx-series card, then put the code in. It sucks; I wanted to give mine away too.

Also lol oops, I just realized that the new flow-through fan design on the 30xx series is now dumping heat directly onto my CPU heatsink...

You could probably log into their account from your PC and activate it.

bus hustler
Mar 14, 2019

Fauxtool posted:

You could probably log into their account from your PC and activate it.


edit: wow, I'm dumb, gonna go make some coffee. You'd obviously just log into their account in GeForce Experience.

Xaris
Jul 25, 2006

Lucky there's a family guy
Lucky there's a man who positively can do
All the things that make us
Laugh and cry

Fauxtool posted:

Does anyone know if the GeForce Now codes that came with my 3080 are locked to people with a 30xx GPU installed, or can anyone use them? I have a few I don't need.

I would say just make some throwaway accounts if anyone wants one. Or yeah, if someone wants to provide their account, just log in and activate it. It's not like GeForce Experience accounts are anything special, and I hate that you have to have one anyway.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
I don't feel like putting in the effort to do it myself, but if anyone with a 30xx wants to use my GeForce Now codes to help someone else without a beefy gaming rig, let me know. I have 2 codes.

DeadlyHalibut
May 31, 2008

bus hustler posted:

Not to make this the CP-only thread, but related: I think anyone on a 2080-class card or better should just ride this generation out. The jump from a 2080 or better to a 3080 isn't worth the price. This was supposed to be the "RT is here" game, and so far it's not.

Feeling pretty good about this whole console generation and getting a 4080 :coal:

Leaning toward DLSS = extremely for real, RT = next generation, maybe

RT is definitely here for the 3080. I've been playing Control and Cyberpunk and it's pretty awesome. Of course, you need to combine it with DLSS to get the frame rates up.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?



huh, that's a pretty high-res cube ma-... oh right


e: no player character tho!

bus hustler
Mar 14, 2019

DeadlyHalibut posted:

RT is definitely here for the 3080. I've been playing Control and Cyberpunk and it's pretty awesome. Of course, you need to combine it with DLSS to get the frame rates up.

Doesn't seem to do much in Cyberpunk for me: RTX doesn't add much, but it did make everything blurry with DLSS. Control plays fine on 2000-series cards too, but that once again remains the only game people can really cite for it.

It's just not a big deal to me, and it's going to nuke performance for this entire generation. Given how well the game plays with it off on the 2080 Ti, it just isn't worth upgrading for. If only the 3080 and 3090 can run it, and only with DLSS... it's a generation off.

bus hustler fucked around with this message at 20:49 on Dec 10, 2020

Cygni
Nov 12, 2005

raring to post

Cyberpunk RT looks great to me maxed out, but I never played Control so I dunno.

Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.


Dell comes through, here's my Cyberpunk 2077 Starter Kit







I'm not leaving this room until 2021

Shear Modulus
Jun 9, 2010



They're giving away GeForce Now codes with the super-deluxe brand-new graphics cards? Isn't that the product you buy if you want to stream games to your lovely computer from NVIDIA's servers? I.e., not what you would do if you had just bought the latest graphics card?

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Cabbages and Kings posted:



Dell comes through, here's my Cyberpunk 2077 Starter Kit







I'm not leaving this room until 2021

I can't even play "click here" strategy games while high, much less FPSes. Good luck not dying.

Shear Modulus
Jun 9, 2010



Buy a new 2020 model-year vehicle and get discounted car rentals from Enterprise.

Animal
Apr 8, 2003

In CP there's a night-and-day difference between RT on and off in certain areas. Inside the Afterlife nightclub I toggled it on and off, and it makes the environment look very different and much better, even more so than in Control.

You'll still have fun with the game on an older generation with RT off, but it's not the same visual experience. I was just playing it on my laptop with a 2070 and it definitely feels last-gen.

terrorist ambulance
Nov 5, 2009
I see lots of reflections and lights on wet pavement, windows, etc. in Cyberpunk, and it looks pretty good. I'd say it's really great if it weren't for the fact that your character's reflection doesn't show up, which looks strange and jarring in motion.

repiv
Aug 13, 2009

I wonder if omitting your character from reflections was intentional; first-person characters tend to move in really bizarre and unrealistic ways that look ridiculous from any other perspective.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

repiv posted:

I wonder if omitting your character from reflections was intentional; first-person characters tend to move in really bizarre and unrealistic ways that look ridiculous from any other perspective.



I've seen my character T-posing in their shadow

Butter Hole
Dec 8, 2011

change my name posted:

The instructions on the GeForce site say you need to log into GeForce Experience, verify your 30xx-series card, then put the code in. It sucks; I wanted to give mine away too.

Also lol oops, I just realized that the new flow-through fan design on the 30xx series is now dumping heat directly onto my CPU heatsink...

Gamers Nexus did some tests on this and found it has no negative effect, and possibly even a positive one, on air-cooled CPU temperatures. They said any airflow > no airflow, even if it's hot air. A traditional GPU cooler is just blasting hot air all over your case anyway. I wouldn't worry about it!
