|
shrike82 posted:it's an odd feature which runs better the higher end card you have - frame generation for Harry potter runs great on my 4090/13600K Really? I'm also on a 4090/12700k and with frame generation on it's basically unplayable to me.
|
# ? Mar 13, 2023 14:30 |
|
Frame Generation was the only way I was able to get rid of (most of) the stuttering in Witcher 3 Next-Gen tbh
|
# ? Mar 13, 2023 14:32 |
|
TVs have had "frame generation" for like a decade now and it's universally reviled by everyone who knows enough about technology to be able to explain what it is. in what way is nvidia's thing different?
|
# ? Mar 13, 2023 14:47 |
|
Instant Grat posted:TVs have had "frame generation" for like a decade now and it's universally reviled by everyone who knows enough about technology to be able to explain what it is It uses past frames to predict future frames instead of looking at previous and next frames and interpolating the difference. I think, anyway. I'm probably wrong.
|
# ? Mar 13, 2023 14:50 |
|
nvidia's thing has much more pixel metadata to draw on (exact motion vectors, depth buffer, etc.) than the TV interpolators, which just have the colour buffer to work with. it's analogous to how DLSS/FSR2 get better results than the upscaler in your TV: they have more information to work with
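For illustration only (a toy sketch with assumed simple per-pixel vectors, nothing like the real DLSS pipeline), here's roughly what having exact motion vectors buys you: pixels can be scattered to where the engine says they're going, instead of guessing motion from colour alone.

```python
import numpy as np

def warp_interpolate(prev_frame, motion_vectors, t=0.5):
    """Toy sketch: scatter each pixel a fraction t along its per-pixel
    motion vector, which a game engine knows exactly (a TV interpolator
    has to estimate motion from the colour buffer alone). Illustrative
    only; the real thing also uses depth and an optical-flow unit."""
    h, w = prev_frame.shape[:2]
    out = np.zeros_like(prev_frame)
    ys, xs = np.mgrid[0:h, 0:w]
    # move every pixel partway along its known motion vector
    dst_x = np.clip((xs + t * motion_vectors[..., 0]).astype(int), 0, w - 1)
    dst_y = np.clip((ys + t * motion_vectors[..., 1]).astype(int), 0, h - 1)
    out[dst_y, dst_x] = prev_frame[ys, xs]
    return out  # holes left behind are disocclusions with no known colour
```

Note the hole: a moving pixel vacates its old location, and nothing in the past frame says what belongs there.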
|
# ? Mar 13, 2023 14:51 |
|
Kibner posted:It uses past frames to predict future frames instead of looking at previous and next frames and interpolating the difference. nah, DLSS framegen does interpolate between two frames; you're probably thinking of VR, which only extrapolates from past frames to avoid incurring any latency (but image quality suffers for it). the big problem with extrapolation is that when there's a disocclusion there's no sure way of knowing what the newly revealed pixels are supposed to look like; interpolation can peek into the future to find out
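A toy sketch of the distinction (1D lists of pixel brightness, nothing to do with actual DLSS internals):

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Blend between a known past frame and a known FUTURE frame.
    Newly revealed pixels already exist in frame_b, so the generated
    frame can 'peek ahead' at what they should look like."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

def extrapolate(frame_prev, frame_curr):
    """VR-style: project forward from past frames only (no added latency,
    but disoccluded pixels have to be guessed from motion history)."""
    return [c + (c - p) for p, c in zip(frame_prev, frame_curr)]
```

The latency cost of interpolation comes from holding the real frame back until the in-between one has been shown, which is the trade-off described above.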
|
# ? Mar 13, 2023 15:15 |
|
So based on comments yesterday I decided to revisit frame generation on my 4090 in Hogwarts. For whatever reason, disabling hw accelerated gpu scheduling in windows settings and then re-enabling it solved all the issues I was having. Lag is still slightly noticeable with frame gen on, but it's nothing like it was before. Another weird thing I'm seeing is that with dlss enabled, my gpu usage never goes above 70%, except in menus where it will climb to the 90s. Turn off dlss and usage shoots back up to the high 90s again. Running around in Hogsmeade there is only about a 3-5 fps increase with dlss on / frame gen off over native 4k, so something is definitely screwy.
|
# ? Mar 14, 2023 13:36 |
|
Enos Cabell posted:So based on comments yesterday I decided to revisit frame generation on my 4090 in Hogwarts. For whatever reason, disabling hw accelerated gpu scheduling in windows settings and then re-enabling it solved all the issues I was having. Lag is still slightly noticeable with frame gen on, but it's nothing like it was before. You're CPU bottlenecked, so DLSS is just letting your GPU hit the limit easier and with less power draw.
|
# ? Mar 14, 2023 14:03 |
|
Frame generation on my 4090 works fine but I dislike visual artifacts it creates so I don't use it.
|
# ? Mar 14, 2023 14:12 |
|
Enos Cabell posted:So based on comments yesterday I decided to revisit frame generation on my 4090 in Hogwarts. For whatever reason, disabling hw accelerated gpu scheduling in windows settings and then re-enabling it solved all the issues I was having. Lag is still slightly noticeable with frame gen on, but it's nothing like it was before. This is very normal for DLSS or any kind of upscaling. Lowering the resolution will only do so much for you if you're already CPU limited (and it's fairly easy to be in HogLeg, I've heard)
|
# ? Mar 14, 2023 14:20 |
|
BurritoJustice posted:You're CPU bottlenecked, so DLSS is just letting your GPU hit the limit easier and with less power draw. Ahh ok, that makes sense. Bummer to be hitting CPU bottlenecks already on a 12700k.
|
# ? Mar 14, 2023 14:59 |
|
lordfrikk posted:Frame generation on my 4090 works fine but I dislike visual artifacts it creates so I don't use it. It works great in the few games I've tried; however, Cyberpunk and Witcher 3 next gen have a really annoying bug (or maybe feature) in which every time you exit a menu the frames drop to single digits and GPU usage is at 0%, before ramping back up to normal after a few seconds. It's really annoying, but I tolerate it for the boost in precious frames. Spider-Man Remastered does not do this at all. Visual artifacts are random in both occurrence and intensity, but mostly it's not ever an issue.
|
# ? Mar 14, 2023 15:06 |
|
Dr. Video Games 0031 posted:This is very normal for DLSS or any kind of upscaling. Lowering the resolution will only do so much for you if you're already CPU limited (and it's fairly easy to be in HogLeg, I've heard) this is why DLAA should be a standard feature; if you're CPU limited then lowering the internal resolution is pointless. better yet, just let us set the scaling ratio to whatever we want (UE5 TSR gets this right at least)
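Concretely, the ratio knob is just arithmetic on the render target (the preset ratios below are the commonly cited DLSS ones, not something from this thread):

```python
def internal_resolution(output_w, output_h, scale):
    """Internal render resolution for a given upscaling ratio.
    scale = 1.0 is DLAA (render at native, keep the AA pass);
    DLSS Quality is ~2/3, Balanced ~0.58, Performance 0.5."""
    return round(output_w * scale), round(output_h * scale)
```

So at 4K output, Quality mode renders about 2560x1440 internally; if the CPU already can't keep the GPU fed at that resolution, dropping further to Performance's 1920x1080 buys nothing, which is the argument for exposing a 1.0 (DLAA) setting.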
|
# ? Mar 14, 2023 15:24 |
|
Hey, could anyone elaborate on how to make Cyberpunk 2077 look its best? I get this might be better in the CP2077 thread, but this thread is full of specialized knowledge on how to maximize graphics output. Enos Cabell posted:Ahh ok, that makes sense. Bummer to be hitting CPU bottlenecks already on a 12700k. Aww really? When my system gets CPU capped I'm just... impressed. It owns. The 4090 owns. The only time I've been more satisfied with a generational upgrade was when my dad bought a Riva TNT2 for his work computer and my brother and I were sure those graphics at the time were the best graphics could get!!11!11 The 4090 will grow with upcoming processors at 4k, and honestly, I can't think of a single time that was ever the case before? Would be interested to be proven wrong though- I'm sure there's some edge cases and it would be cool to hear about them. Imo people don't give Nvidia nearly enough credit for the 4090. It's loving beautiful. Taima fucked around with this message at 17:18 on Mar 14, 2023 |
# ? Mar 14, 2023 15:33 |
|
There's nothing really special to Cyberpunk. Just max out all the settings. Turn on Psycho RT even. The 4090 can do that at 4K with DLSS Quality or Balanced and run it fine. Also how did you manage to accidentally quote a post from 6 years ago? edit: Alternatively, just wait for the RT Overdrive mode to come out Dr. Video Games 0031 fucked around with this message at 15:49 on Mar 14, 2023 |
# ? Mar 14, 2023 15:38 |
|
Cyberpunk 2077 with everything absolutely maxed out in 4K is one of the few times in my life that I've been REALLY impressed by graphics in a videogame. One of the best uses of raytracing so far, too. I've tried Metro Exodus and in my mind's eye it looks just like I remember Last Light, though I'm sure it's not true...
|
# ? Mar 14, 2023 16:35 |
|
CP2077 is really a very pretty game and pretty much ideal for RT. Agree that to make it look its best, just max the settings. If you have to make compromises, I think keeping RT on high generally gives the most bang for the buck. DLSS can make things look a little muddy, but that can weirdly improve some of the cityscapes too. Indoors isn't as good looking as the city, imo.
|
# ? Mar 14, 2023 16:41 |
|
Dr. Video Games 0031 posted:Also how did you manage to accidentally quote a post from 6 years ago? Oops no idea, fixed! Aren't there third party mods that make the game look better beyond the intrinsic baseline? I can go ahead and max settings and frame generation or whatever- but the mod scene is a bit of a black box to me. It looks like it's possible: https://www.youtube.com/watch?v=M0HmDHPXKjs
|
# ? Mar 14, 2023 17:18 |
|
you could use DLSSTweak to force it to use DLAA, or upscale at a ratio in-between DLSS Quality and DLAA
|
# ? Mar 14, 2023 17:24 |
|
Can I just say I came to this thread somewhat recently asking why our PCs can't do the upscaling that even midlevel TVs do and then RTX super resolution came like 6 months later. Which one of you is Jensen Huang? e; drat I think it was the monitors thread I'm dumb as hell
|
# ? Mar 14, 2023 17:24 |
|
Taima posted:Hey could anyone elaborate on how to make Cyberpunk 2077 look its best? I get this might be better in the CP2077 thread but this thread is full of specialized knowledge on how to maximize graphics output. it is a huge uplift for sure, it's just, like, $1600, come on man
|
# ? Mar 14, 2023 22:37 |
|
Taima posted:Imo people don't give Nvidia nearly enough credit for the 4090. It's loving beautiful. People give Nvidia plenty of credit for the 4090. It's just that most people don't care about $1600 GPUs and the 4090 makes very clear how incredibly abusive Nvidia is being with every product slotted below it.
|
# ? Mar 14, 2023 23:11 |
|
I know the 30-series was when they dropped support for VirtualLink, but is there any other way to get USB+DisplayPort on a single cable from those cards, or the 40-series?
|
# ? Mar 15, 2023 03:29 |
|
plot twist: apparently the PSVR2 is the one and only headset that uses VirtualLink natively. if a PC driver ever emerges, it will require either one of the GPUs with native VL or an obscure and expensive adapter to mux DP+USB into VL
|
# ? Mar 15, 2023 04:03 |
|
I just built a new pc and installed a Zotac RTX 3070 Ti and my god, the coil whine! It sounds like a dentist's drill. My old pc was louder (max fans all the time, and I only had a 1080 Ti), so I can't tell if this is just a perspective thing or not. Also, my monitor only has a 75 Hz refresh rate, but the game I was testing it out on has vsync enabled and is limited to 75 fps. Should I buy a monitor with a higher refresh rate? Is this the new normal, and will I have to invest in a set of good headphones? In hindsight, I probably should have purchased from a better manufacturer, but it was the only card I could find without a 50% markup. Maybe this is why I got such a deal.
|
# ? Mar 15, 2023 05:37 |
|
So Moore's Law is Dead is claiming the MSRP of the upcoming RTX 4070 is... $750. Yes, that's only $50 less than the suggested price of the 4070Ti. Now both cards will be totally overpriced and nearly cost the same. On what planet does this make sense? https://www.youtube.com/watch?v=JIqoMyjmC5A
|
# ? Mar 15, 2023 05:45 |
|
It doesn't make sense, and I'm very skeptical that it will actually be that price.
|
# ? Mar 15, 2023 05:46 |
|
I'd trust tech leaks reported by Weekly World News before MLID
|
# ? Mar 15, 2023 05:50 |
|
Taima posted:Aww really? When my system gets CPU capped I'm just... impressed. It owns. The 4090 owns. The only time I've been more satisfied with a generational upgrade was when my dad bought a Riva TNT2 for his work computer and my brother and I were sure those graphics at the time were the best graphics can get!!11!11 Jensen, is that you?
|
# ? Mar 15, 2023 05:56 |
|
as a rule pricing leaks are pretty much always wrong anyway because they often don't make a final decision until very close to the announcement
|
# ? Mar 15, 2023 06:28 |
|
what does it mean to give credit to a videocard lol
|
# ? Mar 15, 2023 06:41 |
|
it's when you take out a loan to buy one
|
# ? Mar 15, 2023 06:58 |
|
Geemer posted:it's when you take out a loan to buy one too true
|
# ? Mar 15, 2023 07:02 |
|
calling it now, the 7090 will be $2999 and people in this thread will still loving buy it and call it a bargain
|
# ? Mar 15, 2023 07:51 |
|
Former Human posted:So Moore's Law is Dead is claiming the MSRP of the upcoming RTX 4070 is... $750. You think they'll price a card that is slower than the 3080 higher than the 3080? They didn't even do that during the crypto boom with the 3070 Ti. It will probably be $599, which is still ridiculous.
|
# ? Mar 15, 2023 09:01 |
|
Kazinsal posted:calling it now, the 7090 will be $2999 and people in this thread will still loving buy it and call it a bargain Just you wait. AMD has chiplets now so they can make a $10,000 GPU with 40 chiplets. It will come in its own dedicated tower and you have to run an extension cord to your kitchen so you can unplug your stove and use its 240v outlet, but it won't matter because AMD will be the KING and will revolutionize the next generation of graphics. It will be so far ahead of the game that NVIDIA won't be able to compete anymore, and you can bet it will be a struggle for gamers to find the time to stop masturbating over it and drive to their bank for a loan.
|
# ? Mar 15, 2023 09:42 |
|
Nalin posted:Just you wait. AMD has chiplets now so they can make a $10,000 GPU with 40 chiplets. It will come in its own dedicated tower and you have to run an extension cord to your kitchen so you can unplug your stove and use its 240v outlet but it won't matter because AMD will be the KING and will revolutionize the next generation of graphics. It will be so far ahead of the game that NVIDIA won't be able to compete anymore and you can bet it will be a struggle for gamers find the time to stop masturbating over it and drive to their bank for a loan. the minute a top end GPU becomes more expensive than a Mesa Boogie I'm buying a loving console
|
# ? Mar 15, 2023 09:47 |
|
Nalin posted:Just you wait. AMD has chiplets now so they can make a $10,000 GPU with 40 chiplets. It will come in its own dedicated tower and you have to run an extension cord to your kitchen so you can unplug your stove and use its 240v outlet but it won't matter because AMD will be the KING and will revolutionize the next generation of graphics. It will be so far ahead of the game that NVIDIA won't be able to compete anymore and you can bet it will be a struggle for gamers find the time to stop masturbating over it and drive to their bank for a loan. A couple of psychopaths over at overclock.net have shunt modded their 7900 XTXs and they're drawing 600-700 watts at full tilt lol.
|
# ? Mar 15, 2023 12:14 |
|
Kazinsal posted:the minute a top end GPU becomes more expensive than a mesa boogie I'm buying a loving console
|
# ? Mar 15, 2023 12:38 |
|
|
Charles Leclerc posted:A couple of psychopaths over at overclock.net have shunt modded their 7900 XTXs and they're drawing 600-700 watts at full tilt lol. are they getting any extra perf from that tho?
|
# ? Mar 15, 2023 12:59 |