|
repiv posted:a few scenes are clearly too dark with the more realistic lighting because they were designed around light leaking It looks so good though, man
|
# ? Apr 10, 2023 15:19 |
|
|
|
I was really excited to play Cyberpunk when I moved up from a 1070 to a 3080, but it's wild to think I'll be looking forward to playing one game over three different cards spanning five generations (assuming I can power a theoretical 5080 with minimal other upgrades from 12th gen Intel... and a 15A home circuit). Did Crysis even have that kind of staying power? Maybe Stellaris with CPUs? Shumagorath fucked around with this message at 15:27 on Apr 10, 2023 |
# ? Apr 10, 2023 15:19 |
|
Dr. Video Games 0031 posted:It looks so good though, man The game always needed a flashlight/night vision cyberware
|
# ? Apr 10, 2023 15:20 |
|
gradenko_2000 posted:https://twitter.com/VideoCardz/status/1645393342547603457?t=kO76SltksTMjTAcuwjrhfA&s=19 Wait, is that supposed to be good? The 1070 was faster than 980Ti, 2070 was faster than 1080 Ti, etc.
|
# ? Apr 10, 2023 15:31 |
|
im morbidly curious to see how overdrive runs on AMD cards, it's the same tech that underpinned Portal RTX and that was... rough to say the least
|
# ? Apr 10, 2023 15:33 |
mobby_6kl posted:Wait, is that supposed to be good? The 1070 was faster than 980Ti, 2070 was faster than 1080 Ti, etc. Not really, but it would at least make the 4070 the clear choice over the 4080, since that doesn't give you much over the 3080 as it is. I have a feeling I'll be skipping 5xxx too, at this rate. $1k+ for xx80, uninspiring xx70s isn't great. MAYBE 5xxx will show generational improvement like we used to see, but i don't see how Nvidia wouldn't price it ever more insanely at this rate unless AMD severely kneecaps their RT performance advantage.
|
|
# ? Apr 10, 2023 15:43 |
|
Shame Cyberpunk is a pretty shallow and hollow experience, but holy crap does that lighting look good mobby_6kl posted:Wait, is that supposed to be good? The 1070 was faster than 980Ti, 2070 was faster than 1080 Ti, etc. The 1080 Ti is faster than the 2070 in most situations; it usually lands between a 2070 and a 2070 Super HalloKitty fucked around with this message at 17:37 on Apr 10, 2023 |
# ? Apr 10, 2023 15:47 |
|
repiv posted:im morbidly curious to see how overdrive runs on AMD cards, it's the same tech that underpinned Portal RTX and that was... rough to say the least Here is an interview posted a couple of weeks ago after GDC that goes into a little bit more detail than I've seen other places. Knapik mentions that while it is technically hardware agnostic, it will only run well on very high end hardware. I took that to mean it will run like poo poo on any Radeon GPU, especially given how extra rays tank AMD performance far more than NVIDIA. That preview video honestly looks incredible. I guess they didn't go back in and add any lighting to the world based off of how dark some of the scenes are. I wonder if that is true with story-important cutscenes and missions? Maybe they'll go back and add that kind of stuff in later, it is just a technology preview. I wonder how the lighting change affects VRAM usage? It looks like it gives you a little less than half the performance of RT ultra, which means the 4070 Ti should be able to give better performance at 1440p than the 4090 at UHD with equivalent settings from the video. pyrotek fucked around with this message at 15:52 on Apr 10, 2023 |
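For what it's worth, the 1440p-vs-UHD guess is just pixel-count arithmetic. A quick sketch, assuming frame time scales roughly linearly with pixels rendered (only a loose approximation for a path tracer):

```python
# Back-of-envelope pixel-count comparison behind the 1440p-vs-UHD guess.
# Assumes performance scales roughly with pixels rendered, which is only
# an approximation; real scaling depends on the workload.
uhd = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440   # 3,686,400 pixels
ratio = uhd / qhd
print(ratio)        # 2.25: UHD pushes 2.25x the pixels of 1440p
```

So if the 4070 Ti is anywhere above ~45% of a 4090's speed, it comes out ahead at 1440p versus the 4090 at UHD, all else being equal.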
# ? Apr 10, 2023 15:49 |
|
HalloKitty posted:Shame Cyberpunk is a pretty shallow and hollow experience, but holy crap does that lighting look good
|
# ? Apr 10, 2023 15:49 |
|
HalloKitty posted:Shame Cyberpunk is a pretty shallow and hollow experience, but holy crap does that lighting look good kliras posted:tbh, most people's idea of cyberpunk is just fancy neon lighting, so at least it's on brand
|
# ? Apr 10, 2023 15:51 |
|
Shumagorath posted:at both of your thematic literacy meanwhile, tsmc numbers ain't looking so hot: https://twitter.com/Techmeme/status/1645367441973972993
|
# ? Apr 10, 2023 15:53 |
|
CP2077 was definitely designed for at least some RT and not pure rasterization, that said in that video it seems like he has brightness way up or something. Or maybe it just doesn't look very good without any RT. I feel like comparing against pure rasterization is making overdrive look better than it maybe is; Psycho vs Overdrive seems much, much closer. I know overdrive is one of those "push your hardware" settings but it doesn't seem like all that much of a game changer for graphics.
|
# ? Apr 10, 2023 16:20 |
|
some of the "before" pics definitely look a little extreme in the worst-case-scenario end of things. maybe the position of the sun was extremely uncharitable at that particular angle interesting video, but the editing's a little weird on it; they probably didn't have a lot of time to get something out, which might also explain the lack of timestamps the fps figures are at 9:05 in the video for those looking btw
|
# ? Apr 10, 2023 16:24 |
|
I agree that sometimes the difference can be subtle, but sometimes you just don’t recognize how “wrong” rasterized lighting is till you have a direct comparison. This is my personal go-to, cherry-picked RT off/on side by side
|
# ? Apr 10, 2023 16:38 |
|
kliras posted:sorry you felt personally called out if the rumours about China's invasion hold true, t Shumagorath posted:at both of your thematic literacy It was my own, probably incomplete and questionable opinion; we all like different things. I thought it was pretty fun, but it's missing a lot of the features they suggested would be in the game, and it lacks basic functionality found in open world games over a decade older, which results in a slightly sterile and unfinished feeling. For what it's worth, I mainly only played it on launch day and for several weeks after, so I'll probably pick it up again, time permitting Edit: and it wasn't a dig or response to your earlier post, I hadn't read it at the time HalloKitty fucked around with this message at 07:24 on Apr 11, 2023 |
# ? Apr 10, 2023 16:46 |
|
would probably be nice if there were a benchmark on rails of some sort to have direct video comparisons instead of randomly walking around to record stuff. benchmarks feel like an increasingly rare thing nowadays unfortunately
|
# ? Apr 10, 2023 16:46 |
|
kliras posted:would probably be nice if there were a benchmark on rails of some sort to have direct video comparisons instead of randomly walking around to record stuff. benchmarks feel like an increasingly rare thing nowadays unfortunately There is a benchmark, they added one in 1.6, I think. Not great for CPU testing, though.
|
# ? Apr 10, 2023 16:49 |
|
Rinkles posted:There is a benchmark, they added one in 1.6, I think. Not great for CPU testing, though.
|
# ? Apr 10, 2023 16:53 |
|
Buddy of mine wants to replace his old 1070 and has about $275ish to do it. I was thinking some flavor of AMD in that price range, but he's an nvidia fanboy for some dumb reason and won't consider alternatives. Used 3060 12GB a decent option at that price? Kind of afraid to steer him towards something with less vram.
|
# ? Apr 10, 2023 16:57 |
|
Didn’t it always have a thing for testing the performance of your settings? I figured people used that to benchmark.
|
# ? Apr 10, 2023 17:00 |
|
Dr. Video Games 0031 posted:It looks so good though, man If we keep doing this we're going to need oled monitors just to see the details in the dark parts of the screen
|
# ? Apr 10, 2023 17:06 |
|
Games should do what Civ 6 does where there's a specific benchmark for CPU load, and one for graphics, assuming both matter for your game
|
# ? Apr 10, 2023 17:07 |
|
Enos Cabell posted:Buddy of mine wants to replace his old 1070 and has about $275ish to do it. I was thinking some flavor of AMD in that price range, but he's an nvidia fanboy for some dumb reason and won't consider alternatives. Used 3060 12GB a decent option at that price? Kind of afraid to steer him towards something with less vram. He's not a true fanboy if he's not willing to drop a grand/his pants for Jensen. He's probably going to have to find something used if that's all he has to spend and still wants nvidia
|
# ? Apr 10, 2023 17:10 |
|
I will be surprised and impressed if the 7900XTX can stay near 30 in 1080p using this. Comes out tomorrow huh? I wonder if AMD will have a driver for it ready
|
# ? Apr 10, 2023 19:57 |
|
VostokProgram posted:If we keep doing this we're going to need oled monitors just to see the details in the dark parts of the screen I think this is more on map builders to consider what their levels will look like without the light bleeding in from outside.
|
# ? Apr 10, 2023 20:21 |
|
Enos Cabell posted:Buddy of mine wants to replace his old 1070 and has about $275ish to do it. I was thinking some flavor of AMD in that price range, but he's an nvidia fanboy for some dumb reason and won't consider alternatives. Used 3060 12GB a decent option at that price? Kind of afraid to steer him towards something with less vram. Yeah there isn't much else, 275USD is now firmly in the territory of 'last gen, lower midrange AMD model during a firesale' so... Should be a ~50% bump on average, not great not terrible. The real upgrade is coming out soon, the purported $599 4070. Just make sure it's a 12GB 3060 since there is a severely cut 8GB one also.
|
# ? Apr 10, 2023 20:43 |
|
also used markets have been rear end for a while, doubt anything's changed for the better
|
# ? Apr 10, 2023 20:45 |
|
Saturnine Aberrance posted:Not really, but it would at least make the 4070 the clear choice over the 4080, since that doesn't give you much over the 3080 as it is. Do you mean per dollar or something? The 4080 is around 50% faster in rasterization than the 3080 and is like 25% faster than the 3080ti, not to mention it's a 16GB card. The 4080 is an amazing card for the power, it's just priced absurdly. If it was priced like the 3080 FE was it'd be an all-timer.
|
# ? Apr 10, 2023 20:48 |
|
VostokProgram posted:If we keep doing this we're going to need oled monitors just to see the details in the dark parts of the screen Not to mention that there are plenty of scenes which end up overlit with path tracing. Presumably it's a question of balancing the number of light sources for any given scene, but considering it has to work for normal ray tracing and raster too, that's probably a hard trick to pull off. I suppose this sets the bar for the next gen of consoles. Will APUs in 2027/8 have more grunt than a 4090 so all lighting can be path traced by default?
|
# ? Apr 10, 2023 20:51 |
|
capcom still can't figure out the default brightness settings for resident evil on pc, so underlit scenes are definitely going to be a blast horror playthroughs on twitch and youtube are a good example of people running into horrible settings on hardware and software
|
# ? Apr 10, 2023 20:54 |
|
I wonder if there will be any way to make RT overdrive playable on 3080 ti at 1440p
|
# ? Apr 10, 2023 21:48 |
|
Twibbit posted:I wonder if there will be any way to make RT overdrive playable on 3080 ti at 1440p - 18fps on a 4090 in 4k w/ dlaa - 59fps with dlss 2 in 4k performance mode, ie 1080p - 95fps with dlss 3 in 4k performance mode so #2 would be the equivalent of quality mode 1440p with dlss 2, and that's running on the 4090 i'm probably messing up my conversions, but it's gonna be rough kliras fucked around with this message at 22:01 on Apr 10, 2023 |
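The conversions are just the per-axis scale factors each DLSS mode uses. A rough sketch (these ratios are the commonly cited approximations for DLSS 2 modes, not an official spec):

```python
# Rough sketch of the DLSS internal-resolution arithmetic above.
# Per-axis scale factors are the commonly cited approximations for
# DLSS 2 modes; treat them as ballpark figures, not NVIDIA's spec.
DLSS_SCALE = {
    "quality": 2 / 3,          # ~0.667 per axis
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Return the internal resolution DLSS renders at before upscaling."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# 4k performance mode renders internally at 1080p:
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
# 1440p quality mode renders internally at ~960p:
print(internal_resolution(2560, 1440, "quality"))      # (1707, 960)
```

So 1440p quality (960p internal) is actually a slightly lighter load than 4k performance (1080p internal), before counting the upscale cost itself.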
# ? Apr 10, 2023 21:58 |
|
Saturnine Aberrance posted:Not really, but it would at least make the 4070 the clear choice over the 4080, since that doesn't give you much over the 3080 as it is.
|
# ? Apr 10, 2023 22:00 |
|
As a 4090 owner using a 1440p monitor I’m very happy to see these FPS numbers.
|
# ? Apr 10, 2023 22:07 |
|
kliras posted:keep in mind it runs at quality mode at 1440p is 960p internal, so it's a little bit lighter than 4k performance (1080p internal)
|
# ? Apr 10, 2023 22:08 |
|
So, how small of a postage stamp can a 3080 run this in?
|
# ? Apr 10, 2023 22:16 |
|
hobbesmaster posted:So, how small of a postage stamp can a 3080 run this in? https://twitter.com/Dachsjaeger/status/1642073449874051073
|
# ? Apr 10, 2023 22:26 |
|
Shumagorath posted:I was really excited to play Cyberpunk when I moved up from a 1070 to a 3080, but it's wild to think I'll be looking forward to playing one game over three different cards spanning five generations (assuming I can power a theoretical 5080 with minimal other upgrades from 12th gen Intel... and a 15A home circuit ). I mean, how many generations of hardware did it take for the GTA5 hard limit to be discovered?
|
# ? Apr 10, 2023 22:51 |
|
SwissArmyDruid posted:I mean, how many generations of hardware did it take for the GTA5 hard limit to be discovered? Not as many as you might initially think since the PC version was delayed for so long.
|
# ? Apr 10, 2023 23:01 |
|
|
|
VostokProgram posted:If we keep doing this we're going to need oled monitors just to see the details in the dark parts of the screen 19 years later bros, we did it
|
# ? Apr 10, 2023 23:41 |