|
Shumagorath posted:It's hard to see when the effects are already fairly subtle vs normal RTX and they do a lot of wipes / quick cuts instead of split / side-by-side. I'm definitely not getting 4090 FOMO but it's nice to know I'll have something to look forward to on a 5080 or whatever.

They don't really linger too much on the changes, and some scenes aren't actually too different anyway because the original lighting was already good. Outdoor scenes during the daytime with primarily direct lighting from the sun, for instance, aren't going to see a dramatic improvement because that's something the older methods already excelled at: It's still a little better, though, with a more natural feel to how things are shaded.

Any scenes that are indirectly lit or have many artificial lights will likely see a large improvement: This is comparing rasterization to path tracing, though, so it's not the fairest comparison. I think Misty's face would be lit correctly with normal ray tracing, for instance, but I doubt even normal ray tracing gave the NPCs shadows in the next comparison. That's something I noticed about the ray tracing in CP2077: a lot of lights in the open world wouldn't cast shadows, but the new path tracing method will allow for unlimited shadow-casting lights. There was one scene they showed with both normal ray tracing and their path tracing:

As for whether it's worth the additional performance hit, probably not? But it is fun to see. And I'm gonna use it anyway.

Dr. Video Games 0031 fucked around with this message at 02:09 on Apr 8, 2023
# ? Apr 8, 2023 01:38 |
|
|
# ? Jun 5, 2024 12:42 |
|
Hemish posted:I second the usb suggestion. I never put my computer to sleep but my Thrustmaster T16000m Joystick setup will straight up not even let my screens turn off for the power saving mode let alone sleep mode if I were to try to use it. I have to unplug it when not using it to play.

Interesting. I have the Thrustmaster Warthog and it did the same thing. I also have to leave it unplugged. It didn't resolve all the issues, but it resolved some!
|
# ? Apr 8, 2023 01:44 |
|
That last comparison is specifically an area my 3080 was already struggling with at night at 1440p/performance. I'm guessing this is a "4090" setting. Maybe it'll be neat to revisit with my 6080 (12GB) in however many years that will be.
|
# ? Apr 8, 2023 01:45 |
|
New GPU Review https://www.youtube.com/watch?v=-aMiEszQBik
|
# ? Apr 8, 2023 01:51 |
|
cyberpunk DLC launch is probably a good time to revisit the game with the new RT stuff. not sure if the base game has enough going for it for another replay
|
# ? Apr 8, 2023 01:59 |
|
doomisland posted:New GPU Review

this rocks so fuckin hard. wish i could justify buying one for my old graphics card collection
|
# ? Apr 8, 2023 03:28 |
|
doomisland posted:New GPU Review

Hell yes

Hemish posted:I second the usb suggestion. I never put my computer to sleep but my Thrustmaster T16000m Joystick setup will straight up not even let my screens turn off for the power saving mode let alone sleep mode if I were to try to use it. I have to unplug it when not using it to play.
|
# ? Apr 8, 2023 03:49 |
|
Hemish posted:I second the usb suggestion. I never put my computer to sleep but my Thrustmaster T16000m Joystick setup will straight up not even let my screens turn off for the power saving mode let alone sleep mode if I were to try to use it. I have to unplug it when not using it to play.

I should add that the screen doesn't turn off either. I haven't added any new USB devices. If it is a USB device that is preventing it from sleeping, what is causing the GPU to underclock now, or even before, when it was going to sleep successfully?
|
# ? Apr 8, 2023 14:48 |
|
An active USB device definitely won't let the screen turn off. I dunno about the GPU clocks, but I'd check those USB devices as a first step.
|
# ? Apr 8, 2023 14:56 |
|
Has there been any confirmation of the 5090/Blackwell rumours-or-leaks that popped up in December?
|
# ? Apr 8, 2023 20:11 |
|
KingKapalone posted:I should add that the screen doesn't turn off either. I haven't added any new USB devices. If it is a USB device that is preventing it from sleeping, what is causing the GPU to under clock now or even before when it was going to sleep successfully?

this is the reason i keep my controller unplugged unless actively using it. turns out while it wasnt preventing the pc from hibernating, it could and did gently caress with pc and monitor sleep. for whatever reason my usb headphones dont cause this problem, but the controller sure does
|
# ? Apr 8, 2023 20:21 |
|
everspace 2 (on PC game pass) is the first game in a while that'll 100% your GPU at a pause menu. i power limit my 4090 to 70% and also frame cap it so it's still under 65C, but lol that it's sucking down over 300W to do "nothing"
|
# ? Apr 8, 2023 23:58 |
|
Rookie mistake
|
# ? Apr 9, 2023 00:00 |
|
shrike82 posted:everspace 2 (on PC game pass) is the first in a game in a while that'll 100% your GPU at a pause menu

every loving time lol
|
# ? Apr 9, 2023 00:06 |
|
The Ascent will totally draw 300W at the pause menu, and more if you don't vsync it.

Dr. Video Games 0031 posted:They don't really linger too much on the changes, and some scenes aren't actually too different anyway because the original lighting was already good. Outdoors scenes during the daytime with primarily direct lighting from the sun for instance aren't going to see a dramatic improvement because that's something the older methods already excelled at:
|
# ? Apr 9, 2023 00:12 |
|
Dr. Video Games 0031 posted:They don't really linger too much on the changes, and some scenes aren't actually too different anyway because the original lighting was already good. Outdoors scenes during the daytime with primarily direct lighting from the sun for instance aren't going to see a dramatic improvement because that's something the older methods already excelled at:

The bounce lighting and shadows in the path-traced middle pictures are definitely better. I do have to wonder just how fast the shadows in particular will update. I've noticed a tendency for shadows to update with a noticeable delay in RT games, particularly Portal RTX, as it is the closest comparison to this.

Also, the LCD light in the bottom picture must be incredibly, blindingly bright to anyone in the building across from it, the way it lights up the bottom of the overpass above it. I have to wonder: did the game use any faked lighting? If so, will the artists have to go around everywhere and potentially add lights to scenes to make sure they are lit appropriately?
|
# ? Apr 9, 2023 00:33 |
|
I should clarify that the first three sets of screenshots might actually have ray tracing enabled on the left, or at least I saw some people say they did. It wasn't clear in the video, so I'm not sure. The lighting issue on Misty's face would be because some lights don't cast shadows, even self-shadows, which can cause light to bleed through character models in weird ways.
|
# ? Apr 9, 2023 00:33 |
|
A guy at work spent $300 on an Intel GPU. It's for his kid's computer and he's hoping Diablo 4 runs well on it. The only thing I could say is that he should have said something before he bought it and that I hope it works out for him. I'm sure Intel GPUs are in a "better" spot, but he could have done better.
|
# ? Apr 9, 2023 04:04 |
|
The A750 and A770 aren't terrible cards
|
# ? Apr 9, 2023 04:45 |
|
My DIY monitor project is coming along. I bought a laptop panel "B173ZAN06.8", which is 17.3 inches with 4K and 120hz, and an eDP to DP board. Then the board was overheating, so I had to go on Mouser.com and buy upgraded diodes and inductors to get the heat output in check.

Now that I have it plugged into my RTX 3080 and running stable, a pleasant surprise is that G-Sync Compatible was automatically supported just by enabling it: Pretty nice for a desktop driving a hacked laptop panel.

Is there any way to get further details about the adaptive sync? Namely, what range has my 3080 decided to support? The panel's datasheet doesn't seem to mention a minimum frequency for adaptive sync; all I found was this:
|
# ? Apr 9, 2023 04:51 |
|
Does the gsync indicator option in nvidia control panel always show when vrr is enabled for the display or does it only show when in range and active? If it is the latter, you could load up an intensive game with the option enabled and see if the indicator turns off when the frame rate drops below 48 FPS or whatever the minimum end of the range is.
|
# ? Apr 9, 2023 05:20 |
|
Bloodplay it again posted:Does the gsync indicator option in nvidia control panel always show when vrr is enabled for the display or does it only show when in range and active? If it is the latter, you could load up an intensive game with the option enabled and see if the indicator turns off when the frame rate drops below 48 FPS or whatever the minimum end of the range is.

Hmm, I'm not sure. I turned it on, and this is what I'm getting:

- 120hz, no frame limit: indicator on
- 120hz, frame limited to 20 FPS: indicator on
- custom resolution to set the monitor itself to 20hz, no frame limit: indicator off

So if I lower the panel to an unsupported hz like 20, it seems to disable G-Sync Compatible, but if I lower the framerate to 20 while the hz is 120, it stays on. The NVidia control panel won't let me go below a frame cap of 20, so it is hard to tell. I ran the Pendulum Demo with the 20 FPS cap enabled and it didn't seem to tear. Does that sound like what others have experienced? I didn't know FreeSync could work down to 20 FPS, but it has been a while since the last time I checked.
|
# ? Apr 9, 2023 06:14 |
|
I haven't personally tested mine, but Rtings has displays like the LG C1's VRR range working down to 20 fps/Hz, so what you're seeing could easily make sense.
|
# ? Apr 9, 2023 06:46 |
|
Zero VGS posted:For a while, the only Twinaxial risers in existence were made by 3M and they were in the vicinity of $300 each. Be thankful these randos started making them in the past 5 years, $50 is a downright bargain in comparison, especially with inflation.

That's completely fair. Once I started looking into what else they were used for, it made a whole lot of sense and gave me a little peace of mind. "If it's good enough for modular military equipment, it must be good enough for my dumb bullshit"

Zero VGS posted:My DIY monitor project is coming along. I bought a laptop panel "B173ZAN06.8" which is 17.3 inches with 4K and 120hz, and an eDP to DP board, then the board was overheating so I had to go on Mouser.com and buy upgraded diodes and inductors to get the heat output in check.

This is loving sick. Seriously, I was just talking about how much I'd love a few laptop sized 4k/120hz panels.

Try 48; that's generally the absolute lowest you will see a decent FreeSync monitor support (though I do recall some 2k/4k monitors going as low as 30). If that doesn't work, then it's 60. You'd have the most luck asking the nerds at TFTCentral or BlurBusters (and I'd wager someone at either of those forums has figured out the perfect EDID config already, or is able to). I'm not exactly sure how you'd confirm that it's working; generally when it's done with monitors, they have an OSD display and you can watch it to see if it's actually adaptive or it just flips between /60 values. If you don't want to gently caress around, just use 60.

E: If it wasn't clear, this isn't about what your GPU has decided to support. You control the EDID of the display, and can edit that to say whatever you want. This is awesome except for all of the times you mess up and tell your GPU to give you something the monitor can't display.

New Zealand can eat me fucked around with this message at 11:54 on Apr 9, 2023
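For what it's worth, the range a panel advertises lives in the EDID's Display Range Limits descriptor (tag 0xFD), which is what EDID editors change under the hood. Here's a minimal sketch of pulling the vertical rate range out of a raw EDID dump, following the EDID 1.3/1.4 descriptor layout; the sample bytes at the bottom are completely made up for illustration, not from any real panel:

```python
def range_limits(edid: bytes):
    """Scan the four 18-byte descriptor blocks of a base EDID for
    the Display Range Limits descriptor (tag 0xFD)."""
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # Display descriptors start 00 00 00 <tag>; detailed timing
        # descriptors start with a nonzero pixel clock instead.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return {
                "v_min_hz": d[5],              # min vertical rate (Hz)
                "v_max_hz": d[6],              # max vertical rate (Hz)
                "h_min_khz": d[7],             # min horizontal rate (kHz)
                "h_max_khz": d[8],             # max horizontal rate (kHz)
                "pixel_clock_mhz": d[9] * 10,  # stored in 10 MHz units
            }
    return None

# Fabricated 128-byte EDID containing one range-limits descriptor:
edid = bytearray(128)
edid[54:64] = bytes([0x00, 0x00, 0x00, 0xFD, 0x00, 48, 120, 30, 160, 60])
print(range_limits(bytes(edid)))
```

A 48 Hz floor like the one faked above is the kind of value you'd typically see; anything below that usually relies on the driver doubling frames (LFC) rather than the panel itself refreshing that slowly.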
# ? Apr 9, 2023 11:51 |
|
New Zealand can eat me posted:E: If it wasn't clear, this isn't about what your GPU has decided to support. You control the EDID of the display, and can edit that to say whatever you want. This is awesome except for all of the times you mess up and tell your GPU to give you something the monitor can't display.

Is there something to edit/override the EDID from within Windows? It's nice right now because I'm streaming to a laptop with Moonlight, so even if I completely gently caress up monitor settings and the panel blacks out, I can still see what I'm doing via the stream.
|
# ? Apr 9, 2023 12:08 |
Shipon posted:Yep, pretty much any controller seems to just not let Windows go to sleep

This was fixed on 11 in a recent patch. Supposedly it will roll out to 10 eventually. Maybe.
|
|
# ? Apr 9, 2023 12:57 |
|
Zero VGS posted:Is there something to edit/override the EDID from within Windows? It's nice right now because I'm streaming to a laptop with Moonlight, so even if I completely gently caress up monitor settings and the panel blacks out, I can still see what I'm doing via the stream.

CRU (Custom Resolution Utility). It has a try-revert process, but it's still easy enough to get into a stupid state.
|
# ? Apr 9, 2023 13:13 |
|
I'd sure like to know why my 3080 can idle at 50.6MHz memory clock when I have my two 4K displays configured to 144hz and 120hz respectively, but needs to go full idiot at 1187MHz, wasting an additional 50W of power, when I raise the second one from 120hz to 144hz. Given that there are multiple memory clocking steps, including 101MHz, why can't it just settle on that instead if 50MHz is too low? JFC Nvidia.
|
# ? Apr 9, 2023 13:18 |
|
I'm not doing the math, but that probably exceeds 600MHz total pixel clock. AMD struggles with the same issue, chugging along at 105W for 2x 1080p/280hz screens.
|
# ? Apr 9, 2023 13:21 |
|
Combat Pretzel posted:I'd sure like to know why my 3080 can idle at 50.6MHz memory clock when I have my two 4K displays configured to 144hz and 120hz respectively, but needs to go full idiot at 1187MHz, wasting additional 50W of power, when I raise the second one from 120hz to 144hz. Given that there's multiple memory clocking steps, including 101MHz, why can't it just settle on that instead, if 50MHz is too low? JFC Nvidia.

If it's that borderline you can properly make a custom timing profile with a reduced blanking interval and get them both to run at 144Hz without hitting the pixel clock limit that mandates the VRAM clock up.
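The reason reduced blanking helps is just arithmetic: pixel clock = (active + blanking) pixels per line × (active + blanking) lines × refresh rate. A quick sketch with ballpark CVT-style figures; the blanking numbers here are illustrative, not any specific monitor's actual timings:

```python
def pixel_clock_mhz(h_active, v_active, h_blank, v_blank, refresh_hz):
    # Pixel clock = total pixels per frame (active + blanking) x refresh rate
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

# Ballpark comparison for a 3840x2160 panel at 144Hz:
standard = pixel_clock_mhz(3840, 2160, 560, 90, 144)  # CVT-style blanking
reduced = pixel_clock_mhz(3840, 2160, 160, 62, 144)   # reduced blanking
print(f"{standard:.0f} MHz vs {reduced:.0f} MHz")
```

Shaving the blanking interval cuts the required pixel clock by roughly ten percent in this example, which can be the difference between sitting just above or just below whatever threshold forces the VRAM to clock up.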
|
# ? Apr 9, 2023 14:02 |
|
BurritoJustice posted:If it's that borderline you can properly make a custom timing profile with a reduced blanking interval and get them both to run at 144Hz without hitting the pixel clock limit that mandates the VRAM clocks up.

This is mostly about G-Sync. In the past, I could actually choose between 75, 120 and 144hz, but for whatever reason, half a year ago, Nvidia removed that; it defaults to the highest refresh rate and doesn't allow selection of anything else. Before, I set them both to 120hz, had low clocks and also G-Sync. I'm not sure if it's for all displays or specifically this Samsung one, because it's been problematic generally.
|
# ? Apr 9, 2023 15:13 |
|
I have that issue with my Samsung monitor when I use HDMI but not with DisplayPort. HDMI just shows me 60hz and 165hz (this monitor's max refresh rate), and also only a couple oddball resolutions in addition to 4K. When using DisplayPort though, I get a range of refresh rate and resolution options. So it may be an issue with how Samsung handles EDID data across their different outputs.
|
# ? Apr 9, 2023 15:22 |
|
Same story, only 144hz. What's odd is that the system acts like it's a different display with each combination. Then again, I also think the firmware is generally FUBAR, because a different display type shows up in the registry. See this:

The LC49G95T is an Odyssey G9. I never had one of these. Also, it showed up again after scrubbing things with DDU. Samsunged, I guess.
|
# ? Apr 9, 2023 16:01 |
|
Nvidia is supposedly launching the 4070 on the 13th. Are there any rumors about when AMD will launch the rest of the Radeon 7000 series?
|
# ? Apr 9, 2023 17:29 |
|
doomisland posted:New GPU Review

It's weird how we just collectively forgot that during the mid/late 90s the entire world was at a dutch angle. I guess it's just something you got used to.

edit: holy poo poo, a surplus P3 Dell XPS in almost that exact config was the first "gaming" PC I had

edit 2: and all cards had random artifacting like that, which changed with driver versions. I completely forgot that was why you'd later accept "oh yes, of course it's normal to download a video driver from some random dude's website because it has the right tweaks."

hobbesmaster fucked around with this message at 19:22 on Apr 9, 2023
# ? Apr 9, 2023 18:58 |
|
https://twitter.com/VideoCardz/status/1645393342547603457?t=kO76SltksTMjTAcuwjrhfA&s=19 Book it, let's see if this turns out to be true
|
# ? Apr 10, 2023 14:20 |
|
https://www.youtube.com/watch?v=I-ORt8313Og
|
# ? Apr 10, 2023 14:34 |
|
That’s a bigger change from old RT than I was expecting.
|
# ? Apr 10, 2023 14:56 |
|
a few scenes are clearly too dark with the more realistic lighting because they were designed around light leaking. epic ran into a similar issue when shipping lumen in fortnite; they ended up adding a hack to the renderer which intentionally allows a small amount of skylight to leak through walls, to ensure it never gets too dark
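Conceptually that hack is just a floor on sky visibility in the shading sum. A toy sketch of the idea; the function, parameter names, and the 2% leak factor are all made up for illustration, since Epic hasn't published the exact value:

```python
def shade(direct, indirect, skylight, sky_visibility, leak=0.02):
    """Toy 'skylight leak' floor: even when the sky is fully occluded
    (sky_visibility == 0), a small fraction of skylight still gets
    added so enclosed interiors never render pitch black."""
    sky = skylight * max(sky_visibility, leak)
    return direct + indirect + sky

# A fully enclosed room with no direct or bounce light still gets a floor:
print(shade(direct=0.0, indirect=0.0, skylight=1.0, sky_visibility=0.0))  # 0.02
```

The tradeoff is exactly what the posts above describe: scenes authored around leaky rasterized lighting stay readable, at the cost of interiors never being physically-correctly dark.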
|
# ? Apr 10, 2023 15:01 |
|
|
# ? Jun 5, 2024 12:42 |
|
fun little video on VRAM and some of the subtle ways things start failing when games run out of it: https://www.youtube.com/watch?v=Rh7kFgHe21k
|
# ? Apr 10, 2023 15:10 |