Enos Cabell
Nov 3, 2004


shrike82 posted:

it's an odd feature that runs better the higher-end your card is - frame generation for Harry Potter runs great on my 4090/13600K

Cyberpunk seemed to work well with it too

Really? I'm also on a 4090/12700k and with frame generation on it's basically unplayable to me.

Cross-Section
Mar 18, 2009

Frame Generation was the only way I was able to get rid of (most of) the stuttering in Witcher 3 Next-Gen tbh

Instant Grat
Jul 31, 2009

Just add
NERD RAAAAAAGE
TVs have had "frame generation" for like a decade now and it's universally reviled by everyone who knows enough about technology to be able to explain what it is

in what way is nvidia's thing different

Kibner
Oct 21, 2008

Acguy Supremacy

Instant Grat posted:

TVs have had "frame generation" for like a decade now and it's universally reviled by everyone who knows enough about technology to be able to explain what it is

in what way is nvidia's thing different

It uses past frames to predict future frames instead of looking at previous and next frames and interpolating the difference.

I think, anyway. I'm probably wrong.

repiv
Aug 13, 2009

nvidia's thing has much more pixel metadata to draw on (exact motion vectors, depth buffer, etc) than the TV interpolators, which just have the colour buffer to work with

it's analogous to how DLSS/FSR2 get better results than the upscaler in your TV, they have more information to work with
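
To make repiv's point concrete, here's a toy sketch (mine, not NVIDIA's actual algorithm; array shapes and names are invented) of the difference between interpolating with only a colour buffer and interpolating with engine-supplied motion vectors:

```python
import numpy as np

def tv_style_midpoint(prev_rgb, next_rgb):
    # A TV interpolator only sees final colour buffers, so motion has to be
    # estimated from the image itself; degenerately simplified here to a
    # plain blend, which ghosts on any real motion.
    return (prev_rgb.astype(np.float32) + next_rgb.astype(np.float32)) / 2

def game_style_midpoint(prev_rgb, motion_px):
    # A game engine hands over exact per-pixel motion vectors (and depth,
    # not shown), so each pixel can be advected to its true halfway point.
    # motion_px: (h, w, 2) array of per-pixel offsets in pixels.
    h, w, _ = prev_rgb.shape
    ys, xs = np.mgrid[0:h, 0:w]
    half = np.rint(motion_px / 2).astype(int)
    src_y = np.clip(ys - half[..., 1], 0, h - 1)
    src_x = np.clip(xs - half[..., 0], 0, w - 1)
    return prev_rgb[src_y, src_x]
```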

repiv
Aug 13, 2009

Kibner posted:

It uses past frames to predict future frames instead of looking at previous and next frames and interpolating the difference.

I think, anyway. I'm probably wrong.

nah DLSS framegen does interpolate between two frames, you're probably thinking of VR which only extrapolates from past frames to avoid incurring any latency (but image quality suffers for it)

the big problem with extrapolation is that when there's a disocclusion there's no sure way of knowing what the newly revealed pixels are supposed to look like, interpolation can peek into the future to find out
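
A minimal toy model of that disocclusion trade-off (made-up numbers, not any real framegen code):

```python
def interpolate(prev_val, next_val, was_hidden):
    # DLSS-framegen style: the generated frame sits *between* two real
    # frames, so a pixel hidden in the previous frame can be read
    # straight out of the next one.
    return next_val if was_hidden else 0.5 * (prev_val + next_val)

def extrapolate(prev_val, neighbourhood_avg, was_hidden):
    # VR-reprojection style: no future frame exists yet, so a newly
    # revealed pixel can only be guessed from its surroundings - but
    # nothing waits on a future frame, so no added latency.
    return neighbourhood_avg if was_hidden else prev_val

# a pixel uncovered by a moving object; its true value is 0.9
print(interpolate(0.2, 0.9, was_hidden=True))   # 0.9 - peeked at the future
print(extrapolate(0.2, 0.4, was_hidden=True))   # 0.4 - had to guess
```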

Enos Cabell
Nov 3, 2004


So based on comments yesterday I decided to revisit frame generation on my 4090 in Hogwarts. For whatever reason, disabling hw accelerated gpu scheduling in windows settings and then re-enabling it solved all the issues I was having. Lag is still slightly noticeable with frame gen on, but it's nothing like it was before.

Another weird thing I'm seeing is that with DLSS enabled, my GPU usage never goes above 70%, except in menus where it will climb to the 90s. Turn off DLSS and usage shoots back up to the high 90s again. Running around in Hogsmeade there's only about a 3-5 fps increase with DLSS on / frame gen off over native 4K, so something is definitely screwy.

BurritoJustice
Oct 9, 2012

Enos Cabell posted:

So based on comments yesterday I decided to revisit frame generation on my 4090 in Hogwarts. For whatever reason, disabling hw accelerated gpu scheduling in windows settings and then re-enabling it solved all the issues I was having. Lag is still slightly noticeable with frame gen on, but it's nothing like it was before.

Another weird thing I'm seeing is that with DLSS enabled, my GPU usage never goes above 70%, except in menus where it will climb to the 90s. Turn off DLSS and usage shoots back up to the high 90s again. Running around in Hogsmeade there's only about a 3-5 fps increase with DLSS on / frame gen off over native 4K, so something is definitely screwy.

You're CPU bottlenecked, so DLSS is just letting your GPU hit the limit easier and with less power draw.
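
A back-of-envelope model of why that happens (numbers invented for illustration): per-frame cost is roughly max(CPU time, GPU time), and upscaling only shrinks the GPU term.

```python
def fps(cpu_ms, gpu_ms):
    # each frame costs roughly whichever side is slower
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 11.0                       # say the CPU can feed ~90 fps, tops
print(fps(cpu_ms, gpu_ms=14.0))     # native 4K, GPU-bound: ~71 fps
print(fps(cpu_ms, gpu_ms=8.0))      # DLSS Quality: ~91 fps, now CPU-bound
print(fps(cpu_ms, gpu_ms=5.0))      # DLSS Performance: still ~91 fps
# past that point the GPU idles part of every frame - hence the ~70% usage
```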

lordfrikk
Mar 11, 2010

Oh, say it ain't fuckin' so,
you stupid fuck!
Frame generation on my 4090 works fine, but I dislike the visual artifacts it creates so I don't use it.

Dr. Video Games 0031
Jul 17, 2004

Enos Cabell posted:

So based on comments yesterday I decided to revisit frame generation on my 4090 in Hogwarts. For whatever reason, disabling hw accelerated gpu scheduling in windows settings and then re-enabling it solved all the issues I was having. Lag is still slightly noticeable with frame gen on, but it's nothing like it was before.

Another weird thing I'm seeing is that with DLSS enabled, my GPU usage never goes above 70%, except in menus where it will climb to the 90s. Turn off DLSS and usage shoots back up to the high 90s again. Running around in Hogsmeade there's only about a 3-5 fps increase with DLSS on / frame gen off over native 4K, so something is definitely screwy.

This is very normal for DLSS or any kind of upscaling. Lowering the resolution will only do so much for you if you're already CPU limited (and it's fairly easy to be in HogLeg, I've heard).

Enos Cabell
Nov 3, 2004


BurritoJustice posted:

You're CPU bottlenecked, so DLSS is just letting your GPU hit the limit easier and with less power draw.

Ahh ok, that makes sense. Bummer to be hitting CPU bottlenecks already on a 12700k.

Bloopsy
Jun 1, 2006

you have been visited by the Tasty Garlic Bread. you will be blessed by having good Garlic Bread in your life time, but only if you comment "ty garlic bread" in the thread below

lordfrikk posted:

Frame generation on my 4090 works fine, but I dislike the visual artifacts it creates so I don't use it.

It works great in the few games I've tried; however, Cyberpunk and Witcher 3 next-gen have a really annoying bug (or maybe feature) where every time you exit a menu the frames drop to single digits and GPU usage sits at 0% before ramping back up to normal after a few seconds. It's really annoying, but I tolerate it for the boost in precious frames. Spider-Man Remastered does not do this at all. Visual artifacts are random in both occurrence and intensity, but it's mostly never an issue.

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

This is very normal for DLSS or any kind of upscaling. Lowering the resolution will only do so much for you if you're already CPU limited (and it's fairly easy to be in HogLeg, I've heard).

this is why DLAA should be a standard feature, if you're CPU limited then lowering the internal resolution is pointless

better yet just let us set the scaling ratio to whatever we want (UE5 TSR gets this right at least)
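
For reference, the usual DLSS presets are just fixed internal-resolution ratios, with DLAA being the same pipeline at 1.0. The ratios below are the commonly cited ones; treat them as approximate:

```python
PRESETS = {
    "DLAA":              1.0,    # native-res input, AA only
    "Quality":           0.667,
    "Balanced":          0.58,
    "Performance":       0.5,
    "Ultra Performance": 0.333,
}

def internal_res(out_w, out_h, ratio):
    # per-axis scale, so pixel count shrinks with the ratio squared
    return round(out_w * ratio), round(out_h * ratio)

for name, ratio in PRESETS.items():
    print(name, internal_res(3840, 2160, ratio))
# at 4K, "Quality" renders 2560x1440 internally; if the CPU is the wall,
# that smaller render buys nothing - repiv's argument for a free slider
```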

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Hey could anyone elaborate on how to make Cyberpunk 2077 look its best? I get this might be better in the CP2077 thread but this thread is full of specialized knowledge on how to maximize graphics output.


Enos Cabell posted:

Ahh ok, that makes sense. Bummer to be hitting CPU bottlenecks already on a 12700k.

Aww really? When my system gets CPU capped I'm just... impressed. It owns. The 4090 owns. The only time I've been more satisfied with a generational upgrade was when my dad bought a Riva TNT2 for his work computer and my brother and I were sure those graphics at the time were the best graphics can get!!11!11

The 4090 will grow with upcoming processors at 4k which, honestly, I can't think of a single time that was ever the case? Would be interested to be proven wrong though- I'm sure there's some edge cases and it would be cool to hear about them.

Imo people don't give Nvidia nearly enough credit for the 4090. It's loving beautiful.

Taima fucked around with this message at 17:18 on Mar 14, 2023

Dr. Video Games 0031
Jul 17, 2004

There's nothing really special to Cyberpunk. Just max out all the settings. Turn on Psycho RT even. The 4090 can do that at 4K with DLSS Quality or Balanced and run it fine.

Also how did you manage to accidentally quote a post from 6 years ago? :psyduck:

edit: Alternatively, just wait for the RT Overdrive mode to come out

Dr. Video Games 0031 fucked around with this message at 15:49 on Mar 14, 2023

lordfrikk
Mar 11, 2010

Oh, say it ain't fuckin' so,
you stupid fuck!
Cyberpunk 2077 with everything absolutely maxed out in 4K is one of the few times in my life that I've been REALLY impressed by graphics in a videogame. One of the best uses of raytracing so far, too. I've tried Metro Exodus and in my mind's eye it looks just like I remember Last Light, though I'm sure it's not true...

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
CP2077 is really a very pretty game and pretty much ideal for RT.

Agree that to make it look its best, just max the settings. If you have to make compromises, I think keeping RT on high generally gives the most bang for the buck. DLSS can make things look a little muddy, but that can weirdly improve some of the cityscapes too.

Indoors isn't as good looking as the city, imo.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Dr. Video Games 0031 posted:

Also how did you manage to accidentally quote a post from 6 years ago? :psyduck:

Oops no idea, fixed!

Aren't there third-party mods that make the game look better beyond the intrinsic baseline? I can go ahead and max settings and frame generation or whatever, but the mod scene is a bit of a black box to me.

It looks like it's possible: https://www.youtube.com/watch?v=M0HmDHPXKjs

repiv
Aug 13, 2009

you could use DLSSTweak to force it to use DLAA, or upscale at a ratio in-between DLSS Quality and DLAA
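
Roughly, that means dropping a dlsstweaks.ini next to the game executable. The key names below are from memory of the DLSSTweaks readme and may differ between versions, so check against the ini the tool actually ships with:

```ini
[DLSS]
ForceDLAA = true        ; feed DLSS a native-res image, i.e. pure AA

[DLSSQualityLevels]
Enable = true           ; or remap a preset to a custom ratio instead,
Quality = 0.85          ; e.g. in-game "Quality" becomes 85% scale
```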

VelociBacon
Dec 8, 2009

Can I just say I came to this thread somewhat recently asking why our PCs can't do the upscaling that even midlevel TVs do, and then RTX Video Super Resolution came like 6 months later. Which one of you is Jensen Huang?

e; drat I think it was the monitors thread I'm dumb as hell

Shipon
Nov 7, 2005

Taima posted:

Hey could anyone elaborate on how to make Cyberpunk 2077 look its best? I get this might be better in the CP2077 thread but this thread is full of specialized knowledge on how to maximize graphics output.

Aww really? When my system gets CPU capped I'm just... impressed. It owns. The 4090 owns. The only time I've been more satisfied with a generational upgrade was when my dad bought a Riva TNT2 for his work computer and my brother and I were sure those graphics at the time were the best graphics can get!!11!11

The 4090 will grow with upcoming processors at 4k which, honestly, I can't think of a single time that was ever the case? Would be interested to be proven wrong though- I'm sure there's some edge cases and it would be cool to hear about them.

Imo people don't give Nvidia nearly enough credit for the 4090. It's loving beautiful.

it is a huge uplift for sure, it's just, like, $1600, come on man

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Taima posted:

Imo people don't give Nvidia nearly enough credit for the 4090. It's loving beautiful.

People give Nvidia plenty of credit for the 4090. It's just that most people don't care about $1600 GPUs and the 4090 makes very clear how incredibly abusive Nvidia is being with every product slotted below it.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I know the 30-series was when they dropped support for VirtualLink, but is there any other way to get USB+DisplayPort on a single cable from those cards, or the 40-series?

repiv
Aug 13, 2009

plot twist: apparently the PSVR2 is the one and only headset that uses virtuallink natively

if a PC driver ever emerges it will require either one of the GPUs with native VL, or an obscure and expensive adapter to mux DP+USB into VL

BigRoman
Jun 19, 2005
I just built a new PC and installed a Zotac RTX 3070 Ti and my god the coil whine! It sounds like a dentist's drill. My old PC was louder (max fans all the time, and I only had a 1080 Ti), so I can't tell if this is just a perspective thing or not. Also, my monitor only has a 75 Hz refresh rate, but the game I was testing it out on has vsync enabled and is limited to 75 fps.

Should I buy a monitor with a higher refresh rate? Is this the new normal and will I have to invest in a set of good headphones?

In hindsight, I probably should have purchased from a better manufacturer, but it was the only card I could find without a 50% markup. Maybe this is why I got such a deal.

Former Human
Oct 15, 2001

So Moore's Law is Dead is claiming the MSRP of the upcoming RTX 4070 is... $750.

Yes, that's only $50 less than the suggested price of the 4070Ti. Now both cards will be totally overpriced and nearly cost the same. On what planet does this make sense?

https://www.youtube.com/watch?v=JIqoMyjmC5A

Dr. Video Games 0031
Jul 17, 2004

It doesn't make sense, and I'm very skeptical that it will actually be that price.

DoombatINC
Apr 20, 2003

Here's the thing, I'm a feminist.

I'd trust tech leaks reported by Weekly World News before MLID

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION

Taima posted:

Aww really? When my system gets CPU capped I'm just... impressed. It owns. The 4090 owns. The only time I've been more satisfied with a generational upgrade was when my dad bought a Riva TNT2 for his work computer and my brother and I were sure those graphics at the time were the best graphics can get!!11!11

The 4090 will grow with upcoming processors at 4k which, honestly, I can't think of a single time that was ever the case? Would be interested to be proven wrong though- I'm sure there's some edge cases and it would be cool to hear about them.

Imo people don't give Nvidia nearly enough credit for the 4090. It's loving beautiful.

Jensen, is that you?

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
as a rule pricing leaks are pretty much always wrong anyway because vendors often don't make a final decision until very close to the announcement

shrike82
Jun 11, 2005

what does it mean to give credit to a videocard lol

Geemer
Nov 4, 2010

it's when you take out a loan to buy one

AutismVaccine
Feb 26, 2017

SPECIAL NEEDS SQUAD

Geemer posted:

it's when you take out a loan to buy one

too true

Kazinsal
Dec 13, 2011

calling it now, the 7090 will be $2999 and people in this thread will still loving buy it and call it a bargain

pyrotek
May 21, 2004

Former Human posted:

So Moore's Law is Dead is claiming the MSRP of the upcoming RTX 4070 is... $750.

Yes, that's only $50 less than the suggested price of the 4070Ti. Now both cards will be totally overpriced and nearly cost the same. On what planet does this make sense?

https://www.youtube.com/watch?v=JIqoMyjmC5A

You think they'll price a card that is slower than the 3080 higher than the 3080? They didn't even do that during the crypto boom with the 3070 Ti. It will probably be $599, which is still ridiculous.

Nalin
Sep 29, 2007

Hair Elf

Kazinsal posted:

calling it now, the 7090 will be $2999 and people in this thread will still loving buy it and call it a bargain

Just you wait. AMD has chiplets now so they can make a $10,000 GPU with 40 chiplets. It will come in its own dedicated tower and you have to run an extension cord to your kitchen so you can unplug your stove and use its 240v outlet, but it won't matter because AMD will be the KING and will revolutionize the next generation of graphics. It will be so far ahead of the game that NVIDIA won't be able to compete anymore, and you can bet it will be a struggle for gamers to find the time to stop masturbating over it and drive to their bank for a loan.

Kazinsal
Dec 13, 2011

Nalin posted:

Just you wait. AMD has chiplets now so they can make a $10,000 GPU with 40 chiplets. It will come in its own dedicated tower and you have to run an extension cord to your kitchen so you can unplug your stove and use its 240v outlet, but it won't matter because AMD will be the KING and will revolutionize the next generation of graphics. It will be so far ahead of the game that NVIDIA won't be able to compete anymore, and you can bet it will be a struggle for gamers to find the time to stop masturbating over it and drive to their bank for a loan.

the minute a top end GPU becomes more expensive than a mesa boogie I'm buying a loving console

Theophany
Jul 22, 2014

SUCK MY COCK FROM BEHIND, FROG BOY

2022 FIA Formula 1 WDC

Nalin posted:

Just you wait. AMD has chiplets now so they can make a $10,000 GPU with 40 chiplets. It will come in its own dedicated tower and you have to run an extension cord to your kitchen so you can unplug your stove and use its 240v outlet, but it won't matter because AMD will be the KING and will revolutionize the next generation of graphics. It will be so far ahead of the game that NVIDIA won't be able to compete anymore, and you can bet it will be a struggle for gamers to find the time to stop masturbating over it and drive to their bank for a loan.

A couple of psychopaths over at overclock.net have shunt modded their 7900 XTXs and they're drawing 600-700 watts at full tilt lol.

Shipon
Nov 7, 2005

Kazinsal posted:

the minute a top end GPU becomes more expensive than a mesa boogie I'm buying a loving console

eh, a top end GPU might be a bad per-dollar value but it's still better than the NaN value a console offers (what games lol)

Truga
May 4, 2014
Lipstick Apathy

Theophany posted:

A couple of psychopaths over at overclock.net have shunt modded their 7900 XTXs and they're drawing 600-700 watts at full tilt lol.

are they getting any extra perf from that tho?
