Dr. Video Games 0031
Jul 17, 2004

repiv posted:

a few scenes are clearly too dark with the more realistic lighting because they were designed around light leaking :v:

epic ran into a similar issue when shipping lumen in fortnite, they ended up adding a hack to the renderer which intentionally allows a small amount of skylight to leak through walls, to ensure it never gets too dark

It looks so good though, man
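(For anyone curious what the Fortnite hack repiv describes might look like in spirit, here's a rough conceptual sketch; the function, names, and the 2% leak fraction are invented for illustration, not Epic's actual Lumen code:)

code:
# Rough idea of a "skylight leak" floor: clamp indirect lighting so fully
# enclosed interiors never go completely black. Purely illustrative; the
# names and the 2% leak fraction are made up, not Epic's implementation.
def shade_indirect(gi_sample, sky_color, leak_fraction=0.02):
    ambient_floor = [c * leak_fraction for c in sky_color]
    return [max(gi, floor) for gi, floor in zip(gi_sample, ambient_floor)]

# e.g. a pitch-black interior sample still picks up a faint sky tint
print(shade_indirect([0.0, 0.0, 0.0], [0.4, 0.6, 1.0]))  # roughly [0.008, 0.012, 0.02]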


Shumagorath
Jun 6, 2001
I was really excited to play Cyberpunk when I moved up from a 1070 to a 3080, but it's wild to think I'll be looking forward to playing one game over three different cards spanning five generations (assuming I can power a theoretical 5080 with minimal other upgrades from 12th gen Intel... and a 15A home circuit :v:).

Did Crysis even have that kind of staying power? Maybe Stellaris with CPUs?

Shumagorath fucked around with this message at 15:27 on Apr 10, 2023

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Dr. Video Games 0031 posted:

It looks so good though, man



The game always needed a flashlight or night-vision cyberware

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Wait, is that supposed to be good? The 1070 was faster than 980Ti, 2070 was faster than 1080 Ti, etc.

repiv
Aug 13, 2009

im morbidly curious to see how overdrive runs on AMD cards, it's the same tech that underpinned Portal RTX and that was... rough to say the least

Saturnine Aberrance
Sep 6, 2010

Creator.

Please make me flesh.


mobby_6kl posted:

Wait, is that supposed to be good? The 1070 was faster than 980Ti, 2070 was faster than 1080 Ti, etc.

Not really, but it would at least make the 4070 the clear choice over the 4080, since that doesn't give you much over the 3080 as it is.

I have a feeling I'll be skipping 5xxx too, at this rate. $1k+ for the xx80 and uninspiring xx70s isn't great. MAYBE 5xxx will show generational improvement like we used to see, but I don't see how Nvidia wouldn't price it ever more insanely at this rate unless AMD severely kneecaps their RT performance advantage.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Shame Cyberpunk is a pretty shallow and hollow experience, but holy crap does that lighting look good

mobby_6kl posted:

Wait, is that supposed to be good? The 1070 was faster than 980Ti, 2070 was faster than 1080 Ti, etc.

The 1080 Ti is faster than the 2070 in most situations; it usually lands between a 2070 and a 2070 Super

HalloKitty fucked around with this message at 17:37 on Apr 10, 2023

pyrotek
May 21, 2004



repiv posted:

im morbidly curious to see how overdrive runs on AMD cards, it's the same tech that underpinned Portal RTX and that was... rough to say the least
https://www.youtube.com/watch?v=cr9ZPRKm9dU

Here is an interview posted a couple of weeks ago after GDC that goes into a little more detail than I've seen elsewhere. Knapik mentions that while it is technically hardware-agnostic, it will only run well on very high-end hardware. I took that to mean it will run like poo poo on any Radeon GPU, especially given how extra rays tank AMD performance far more than NVIDIA's.

That preview video honestly looks incredible. I guess they didn't go back in and add any lighting to the world, based on how dark some of the scenes are. I wonder if that is true of story-important cutscenes and missions? Maybe they'll go back and add that kind of stuff later; it is just a technology preview, after all.

I wonder how the lighting change affects VRAM usage? It looks like it gives you a little less than half the performance of RT ultra, which means the 4070 Ti should be able to give better performance at 1440p than the 4090 at UHD with equivalent settings from the video.
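(Back-of-envelope for that last claim, treating path-tracing cost as roughly proportional to pixel count; the 55% relative-throughput figure is my own rough guess for a 4070 Ti vs. a 4090, not something from the video:)

code:
# Crude model: fps ~ GPU throughput / pixels rendered
pixels_4k = 3840 * 2160        # 8,294,400
pixels_1440p = 2560 * 1440     # 3,686,400 (~44% of 4K)
rel_throughput = 0.55          # assumed 4070 Ti throughput relative to a 4090

rel_fps = rel_throughput / (pixels_1440p / pixels_4k)
print(f"4070 Ti @ 1440p vs 4090 @ 4K: ~{rel_fps:.2f}x")  # ~1.24x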

pyrotek fucked around with this message at 15:52 on Apr 10, 2023

kliras
Mar 27, 2021

HalloKitty posted:

Shame Cyberpunk is a pretty shallow and hollow experience, but holy crap does that lighting look good
tbh, most people's idea of cyberpunk is just fancy neon lighting, so at least it's on brand

Shumagorath
Jun 6, 2001

HalloKitty posted:

Shame Cyberpunk is a pretty shallow and hollow experience, but holy crap does that lighting look good

kliras posted:

tbh, most people's idea of cyberpunk is just fancy neon lighting, so at least it's on brand
:lol: at both of your thematic literacy

kliras
Mar 27, 2021

Shumagorath posted:

:lol: at both of your thematic literacy
sorry you felt personally called out

meanwhile, tsmc numbers ain't looking so hot:

https://twitter.com/Techmeme/status/1645367441973972993

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
CP2077 was definitely designed for at least some RT and not pure rasterization. That said, in that video it seems like he has the brightness way up or something, or maybe it just doesn't look very good without any RT. I feel like the max-rasterization comparison is making Overdrive look better than it maybe is; Psycho vs Overdrive seems much, much closer.

I know Overdrive is one of those "push your hardware" settings, but it doesn't seem like all that much of a game changer for graphics.

kliras
Mar 27, 2021
some of the "before" pics definitely look a little extreme in the worst-case-scenario end of things. maybe the position of the sun was extremely uncharitable at that particular angle

interesting video, but the editing's a little weird on it; they probably didn't have a lot of time to get something out, which might also explain the lack of timestamps

the fps figures are at 9:05 in the video for those looking btw

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
I agree that sometimes the difference can be subtle, but sometimes you just don’t recognize how “wrong” rasterized lighting is till you have a direct comparison.

This is my personal go-to, cherry-picked RT off/on side-by-side

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

kliras posted:

sorry you felt personally called out

meanwhile, tsmc numbers ain't looking so hot:

https://twitter.com/Techmeme/status/1645367441973972993

if the rumours about China's invasion hold true, tsmc will have bigger problems

Shumagorath posted:

:lol: at both of your thematic literacy

It was my own, probably incomplete and questionable opinion; we all like different things. I thought it was pretty fun, but it's missing a lot of the features they suggested would be in the game, and it lacks basic functionality found in open-world games over a decade older, which leaves it feeling slightly sterile and unfinished. For what it's worth, I only really played it around launch day and for several weeks after, so I'll probably pick it up again, time permitting

Edit: and it wasn't a dig or response to your earlier post, I hadn't read it at the time

HalloKitty fucked around with this message at 07:24 on Apr 11, 2023

kliras
Mar 27, 2021
would probably be nice if there were a benchmark on rails of some sort to have direct video comparisons instead of randomly walking around to record stuff. benchmarks feel like an increasingly rare thing nowadays unfortunately

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

kliras posted:

would probably be nice if there were a benchmark on rails of some sort to have direct video comparisons instead of randomly walking around to record stuff. benchmarks feel like an increasingly rare thing nowadays unfortunately

There is a benchmark, they added one in 1.6, I think. Not great for CPU testing, though.

kliras
Mar 27, 2021

Rinkles posted:

There is a benchmark, they added one in 1.6, I think. Not great for CPU testing, though.
yeah i guess that's the problem: that benchmarks are supposed to be fairly representative of performance. we probably need a different name for something that's just to show visual quality (and maybe test vram shenanigans), but people will treat it as a benchmark regardless

Enos Cabell
Nov 3, 2004


Buddy of mine wants to replace his old 1070 and has about $275ish to do it. I was thinking some flavor of AMD in that price range, but he's an nvidia fanboy for some dumb reason and won't consider alternatives. Used 3060 12GB a decent option at that price? Kind of afraid to steer him towards something with less vram.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Didn’t it always have a thing for testing the performance of your settings? I figured people used that to benchmark.

Yaoi Gagarin
Feb 20, 2014

Dr. Video Games 0031 posted:

It looks so good though, man



If we keep doing this we're going to need oled monitors just to see the details in the dark parts of the screen

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Games should do what Civ 6 does, where there's a specific benchmark for CPU load and one for graphics, assuming both matter for your game

Weirdoman
Jun 12, 2001

I was walking down the street when I saw a bovus. And then it hit me...I was hit by a bovus.

Enos Cabell posted:

Buddy of mine wants to replace his old 1070 and has about $275ish to do it. I was thinking some flavor of AMD in that price range, but he's an nvidia fanboy for some dumb reason and won't consider alternatives. Used 3060 12GB a decent option at that price? Kind of afraid to steer him towards something with less vram.

He's not a true fanboy if he's not willing to drop a grand/his pants for Jensen. He's probably going to have to find something used if that's all he has to spend and still wants nvidia

New Zealand can eat me
Aug 29, 2008

:matters:


I will be surprised and impressed if the 7900 XTX can stay near 30 fps at 1080p using this. Comes out tomorrow, huh? I wonder if AMD will have a driver ready for it

Nfcknblvbl
Jul 15, 2002

VostokProgram posted:

If we keep doing this we're going to need oled monitors just to see the details in the dark parts of the screen

I think this is more on map builders to consider what their levels will look like without the light bleeding in from outside.

sauer kraut
Oct 2, 2004

Enos Cabell posted:

Buddy of mine wants to replace his old 1070 and has about $275ish to do it. I was thinking some flavor of AMD in that price range, but he's an nvidia fanboy for some dumb reason and won't consider alternatives. Used 3060 12GB a decent option at that price? Kind of afraid to steer him towards something with less vram.

Yeah, there isn't much else; $275 is now firmly in the territory of 'last-gen, lower-midrange AMD model during a firesale', so... :unsmith:
Should be a ~50% bump on average, not great, not terrible. The real upgrade is coming out soon, the purported $599 4070.
Just make sure it's the 12GB 3060, since there's also a severely cut-down 8GB version.

kliras
Mar 27, 2021
also used markets have been rear end for a while, doubt anything's changed for the better

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Saturnine Aberrance posted:

Not really, but it would at least make the 4070 the clear choice over the 4080, since that doesn't give you much over the 3080 as it is.


Do you mean per dollar or something? The 4080 is around 50% faster in rasterization than the 3080 and is like 25% faster than the 3080 Ti, not to mention it's a 16GB card. The 4080 is an amazing card for the power; it's just priced absurdly. If it were priced like the 3080 FE was, it'd be an all-timer.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

VostokProgram posted:

If we keep doing this we're going to need oled monitors just to see the details in the dark parts of the screen

Not to mention that there are plenty of scenes which end up overlit with path tracing. Presumably it's a question of balancing the number of light sources for any given scene, but considering it has to work for normal ray tracing and raster too, that's probably a hard trick to pull off.


I suppose this sets the bar for the next gen of consoles. Will APUs in 2027/8 have more grunt than a 4090 so all lighting can be path traced by default?

kliras
Mar 27, 2021
capcom still can't figure out the default brightness settings for resident evil on pc, so underlit scenes are definitely going to be a blast

horror playthroughs on twitch and youtube are a good example of people running into horrible settings on hardware and software

Twibbit
Mar 7, 2013

Is your refrigerator running?
I wonder if there will be any way to make RT overdrive playable on 3080 ti at 1440p

kliras
Mar 27, 2021

Twibbit posted:

I wonder if there will be any way to make RT overdrive playable on 3080 ti at 1440p
keep in mind it runs at

- 18fps on a 4090 in 4k w/ dlaa
- 59fps with dlss 2 in 4k performance mode, ie 1080p
- 95fps with dlss 3 in 4k performance mode

so #2 would be the equivalent of quality mode 1440p with dlss 2, and that's running on the 4090

i'm probably messing up my conversions, but it's gonna be rough

kliras fucked around with this message at 22:01 on Apr 10, 2023

Josh Lyman
May 24, 2009


Saturnine Aberrance posted:

Not really, but it would at least make the 4070 the clear choice over the 4080, since that doesn't give you much over the 3080 as it is.

I have a feeling I'll be skipping 5xxx too, at this rate. $1k+ for the xx80 and uninspiring xx70s isn't great. MAYBE 5xxx will show generational improvement like we used to see, but I don't see how Nvidia wouldn't price it ever more insanely at this rate unless AMD severely kneecaps their RT performance advantage.
In that CP2077 RT video, they only got 18 fps at 4K on a 4090 and needed to use DLSS 3 performance mode to hit 90 fps. So even if AMD had RT on par with Nvidia, they'd need to improve FSR, since it, like DLSS, will be necessary for usable RT.

Nfcknblvbl
Jul 15, 2002

As a 4090 owner using a 1440p monitor I’m very happy to see these FPS numbers.

repiv
Aug 13, 2009

kliras posted:

keep in mind it runs at

- 18fps on a 4090 in 4k w/ dlaa
- 59fps with dlss 2 in 4k performance mode, ie 1080p
- 95fps with dlss 3 in 4k performance mode

so #2 would be the equivalent of quality mode 1440p with dlss 2, and that's running on the 4090

i'm probably messing up my conversions, but it's gonna be rough

quality mode at 1440p is 960p internal, so it's a little bit lighter than 4k performance (1080p internal)
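(If you want to sanity-check those conversions yourself: DLSS renders at roughly 2/3 of the output resolution per axis in Quality mode and 1/2 in Performance mode, so a quick sketch:)

code:
# Internal render resolution for the common DLSS modes (per-axis scale factors)
DLSS_SCALE = {"quality": 2 / 3, "performance": 1 / 2}

def internal_res(width, height, mode):
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(3840, 2160, "performance"))  # (1920, 1080) -> 1080p internal
print(internal_res(2560, 1440, "quality"))      # (1707, 960)  -> 960p internal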

hobbesmaster
Jan 28, 2008

So, how small of a postage stamp can a 3080 run this in?

kliras
Mar 27, 2021

hobbesmaster posted:

So, how small of a postage stamp can a 3080 run this in?
battaglia sighed as he pulled out his crt monitor

https://twitter.com/Dachsjaeger/status/1642073449874051073

SwissArmyDruid
Feb 14, 2014

by sebmojo

Shumagorath posted:

I was really excited to play Cyberpunk when I moved up from a 1070 to a 3080, but it's wild to think I'll be looking forward to playing one game over three different cards spanning five generations (assuming I can power a theoretical 5080 with minimal other upgrades from 12th gen Intel... and a 15A home circuit :v:).

Did Crysis even have that kind of staying power? Maybe Stellaris with CPUs?

I mean, how many generations of hardware did it take for the GTA5 hard limit to be discovered?

hobbesmaster
Jan 28, 2008

SwissArmyDruid posted:

I mean, how many generations of hardware did it take for the GTA5 hard limit to be discovered?

Not as many as you might initially think since the PC version was delayed for so long.


Shipon
Nov 7, 2005

VostokProgram posted:

If we keep doing this we're going to need oled monitors just to see the details in the dark parts of the screen


19 years later bros, we did it
