Dr. Video Games 0031
Jul 17, 2004

Shumagorath posted:

It's hard to see when the effects are already fairly subtle vs. normal RTX, and they do a lot of wipes / quick cuts instead of split / side-by-side comparisons. I'm definitely not getting 4090 FOMO, but it's nice to know I'll have something to look forward to on a 5080 or whatever.

They don't really linger too much on the changes, and some scenes aren't actually too different anyway because the original lighting was already good. Outdoor scenes during the daytime with primarily direct lighting from the sun, for instance, aren't going to see a dramatic improvement because that's something the older methods already excelled at:



It's still a little better though, with a more natural feel to how things are shaded. Any scenes that are indirectly lit or have many artificial lights will likely see a large improvement:



This is comparing rasterization to path tracing though, so it's not the fairest comparison. I think Misty's face would be lit correctly with normal ray tracing for instance, but I doubt even normal ray tracing gave the NPCs shadows in the next comparison. That's something I noticed about the ray tracing in CP2077; a lot of lights in the open world wouldn't cast shadows, but the new path tracing method will allow for unlimited shadow-casting lights. There was one scene they showed with both normal ray tracing and their path tracing:



As for whether it's worth the additional performance hit, probably not? But it is fun to see. And I'm gonna use it anyway.

Dr. Video Games 0031 fucked around with this message at 02:09 on Apr 8, 2023


MarcusSA
Sep 23, 2007

Hemish posted:

I second the USB suggestion. I never put my computer to sleep, but my Thrustmaster T16000M joystick setup will straight up not even let my screens turn off for power-saving mode, let alone sleep mode if I were to try to use it. I have to unplug it when I'm not using it to play.

Interesting. I have the Thrustmaster Warthog and it did the same thing.

I also have to leave it unplugged.

Unplugging it didn't resolve all the issues, but it resolved some!

hobbesmaster
Jan 28, 2008

That last comparison is specifically an area where my 3080 was already struggling at night at 1440p/performance. I'm guessing this is a "4090" setting.

Maybe it’ll be neat to revisit with my 6080 (12GB :v:) in however many years that will be.

doomisland
Oct 5, 2004

New GPU Review :spooky:

https://www.youtube.com/watch?v=-aMiEszQBik

shrike82
Jun 11, 2005

cyberpunk DLC launch is probably a good time to revisit the game with the new RT stuff
not sure if the base game has enough going for it for another replay

Cygni
Nov 12, 2005

raring to post


this rocks so fuckin hard. wish i could justify buying one for my old graphics card collection

Shipon
Nov 7, 2005

Hell yes

Hemish posted:

I second the USB suggestion. I never put my computer to sleep, but my Thrustmaster T16000M joystick setup will straight up not even let my screens turn off for power-saving mode, let alone sleep mode if I were to try to use it. I have to unplug it when I'm not using it to play.

Yep, pretty much any controller seems to just not let Windows go to sleep

KingKapalone
Dec 20, 2005
1/16 Native American + 1/2 Hungarian = Totally Badass

Hemish posted:

I second the USB suggestion. I never put my computer to sleep, but my Thrustmaster T16000M joystick setup will straight up not even let my screens turn off for power-saving mode, let alone sleep mode if I were to try to use it. I have to unplug it when I'm not using it to play.

I should add that the screen doesn't turn off either. I haven't added any new USB devices. If it is a USB device that is preventing it from sleeping, what is causing the GPU to underclock now, or even before, when it was going to sleep successfully?

MarcusSA
Sep 23, 2007

An active USB device definitely won't let the screen turn off. I dunno about the GPU clocks, but I'd check those USB devices as a first step.
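
If you want to see exactly what's holding the machine awake before you start unplugging things, powercfg will tell you. Rough sketch (untested, assumes Windows and an elevated prompt since powercfg /requests needs admin), so treat it as a starting point rather than gospel:

code:
# List what's blocking sleep / display-off and which devices are allowed to wake the PC.
# Run from an elevated prompt: "powercfg /requests" needs admin rights.
import subprocess

def run(cmd):
    # Run a command and return its text output, ignoring any odd console encoding.
    return subprocess.run(cmd, capture_output=True, text=True, errors="ignore").stdout

if __name__ == "__main__":
    # Anything listed under DISPLAY or SYSTEM here is what's keeping
    # the screen on / the PC awake.
    print("=== Active power requests ===")
    print(run(["powercfg", "/requests"]))

    # Devices currently armed to wake the machine; HID stuff like
    # joysticks and controllers tends to show up in this list.
    print("=== Devices armed to wake ===")
    print(run(["powercfg", "/devicequery", "wake_armed"]))

Whatever shows up under DISPLAY in the first list is the thing refusing to let your screens turn off.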

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Has there been any confirmation of the 5090/Blackwell rumours or leaks that popped up in December?

UHD
Nov 11, 2006


KingKapalone posted:

I should add that the screen doesn't turn off either. I haven't added any new USB devices. If it is a USB device that is preventing it from sleeping, what is causing the GPU to underclock now, or even before, when it was going to sleep successfully?

this is the reason i keep my controller unplugged unless i'm actively using it. turns out while it wasn't preventing the pc from hibernating, it could and did gently caress with pc and monitor sleep

for whatever reason my usb headphones don't cause this problem, but the controller sure does

shrike82
Jun 11, 2005

everspace 2 (on PC game pass) is the first game in a while that'll 100% your GPU at a pause menu
i power limit my 4090 to 70% and also frame cap it so it's still under 65C but lol that it's sucking down over 300W to do "nothing"
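
if anyone wants to sanity-check what their own card is pulling at a pause menu, the nvidia-ml-py (pynvml) bindings make it a few lines. just a sketch for illustration; it only reads power and temperature (no admin needed), nothing gets changed:

code:
# Poll GPU power draw and temperature once a second via NVML.
# pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older bindings return bytes
    name = name.decode()
limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000.0
print(f"{name}: current power limit {limit_w:.0f} W")

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"power: {power_w:6.1f} W   temp: {temp_c} C")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()

leave it running, alt-tab to the pause menu, and watch the number climb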

MarcusSA
Sep 23, 2007

Rookie mistake

Shipon
Nov 7, 2005

shrike82 posted:

everspace 2 (on PC game pass) is the first game in a while that'll 100% your GPU at a pause menu
i power limit my 4090 to 70% and also frame cap it so it's still under 65C but lol that it's sucking down over 300W to do "nothing"

every loving time lol

Shumagorath
Jun 6, 2001
The Ascent will totally draw 300W at the pause menu, and more if you don't vsync it.

Dr. Video Games 0031 posted:

They don't really linger too much on the changes, and some scenes aren't actually too different anyway because the original lighting was already good. Outdoor scenes during the daytime with primarily direct lighting from the sun, for instance, aren't going to see a dramatic improvement because that's something the older methods already excelled at:



It's still a little better though, with a more natural feel to how things are shaded. Any scenes that are indirectly lit or have many artificial lights will likely see a large improvement:



This is comparing rasterization to path tracing though, so it's not the fairest comparison. I think Misty's face would be lit correctly with normal ray tracing for instance, but I doubt even normal ray tracing gave the NPCs shadows in the next comparison. That's something I noticed about the ray tracing in CP2077; a lot of lights in the open world wouldn't cast shadows, but the new path tracing method will allow for unlimited shadow-casting lights. There was one scene they showed with both normal ray tracing and their path tracing:



As for whether it's worth the additional performance hit, probably not? But it is fun to see. And I'm gonna use it anyway.

Once again, thank you for putting more work into your SH/SC posts than most people put into their actual jobs.

pyrotek
May 21, 2004



Dr. Video Games 0031 posted:

They don't really linger too much on the changes, and some scenes aren't actually too different anyway because the original lighting was already good. Outdoor scenes during the daytime with primarily direct lighting from the sun, for instance, aren't going to see a dramatic improvement because that's something the older methods already excelled at:



It's still a little better though, with a more natural feel to how things are shaded. Any scenes that are indirectly lit or have many artificial lights will likely see a large improvement:



This is comparing rasterization to path tracing though, so it's not the fairest comparison. I think Misty's face would be lit correctly with normal ray tracing for instance, but I doubt even normal ray tracing gave the NPCs shadows in the next comparison. That's something I noticed about the ray tracing in CP2077; a lot of lights in the open world wouldn't cast shadows, but the new path tracing method will allow for unlimited shadow-casting lights. There was one scene they showed with both normal ray tracing and their path tracing:



As for whether it's worth the additional performance hit, probably not? But it is fun to see. And I'm gonna use it anyway.

The bounce lighting and shadows in the path-traced middle pictures are definitely better. I do have to wonder just how fast the shadows in particular will update, though. I've noticed that things like shadows tend to update with a noticeable delay in RT games, particularly Portal RTX, which is the closest comparison to this.

Also, the LCD light in the bottom picture must be incredibly, blindingly bright to anyone in the building across from it the way it lights up the bottom of the overpass above it.

I have to wonder: did the game use any faked lighting? If so, will the artists have to go around everywhere and potentially add lights to scenes to make sure they are lit appropriately?

Dr. Video Games 0031
Jul 17, 2004

I should clarify that the first three sets of screenshots might actually have ray tracing enabled on the left, or at least I saw some people say they did. It wasn't clear in the video, so I'm not sure. The lighting issue on Misty's face would be because some lights don't cast shadows, not even self-shadows, which can cause light to bleed through character models in weird ways.
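
To put it in toy terms (this is illustrative pseudocode, not how CDPR's renderer actually works): shading a point is roughly "diffuse term times light color times visibility," and a light flagged as non-shadow-casting just skips the visibility check, so geometry between the light and the surface, including the character's own head, never blocks anything:

code:
# Toy illustration only. A light that doesn't cast shadows skips the
# visibility test, so an occluder between it and the surface changes nothing.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade_point(normal, to_light, light_color, albedo, occluded, casts_shadows):
    n_dot_l = max(dot(normal, to_light), 0.0)               # basic Lambert diffuse term
    visibility = 0.0 if (casts_shadows and occluded) else 1.0
    return [albedo[i] * light_color[i] * n_dot_l * visibility for i in range(3)]

# Same geometry, same blocker in the way -- only the shadow flag differs.
bleeds = shade_point((0, 0, 1), (0, 0, 1), (1, 1, 1), (0.8, 0.6, 0.5),
                     occluded=True, casts_shadows=False)    # lit anyway: light "bleeds" through
correct = shade_point((0, 0, 1), (0, 0, 1), (1, 1, 1), (0.8, 0.6, 0.5),
                      occluded=True, casts_shadows=True)    # properly shadowed: goes black
print(bleeds, correct)

With path tracing, every light contribution is occlusion-tested by construction, which is why the face stops glowing.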

MarcusSA
Sep 23, 2007

A guy at work spent $300 on an Intel GPU. It's for his kid's computer and he's hoping Diablo 4 runs well on it.

The only thing I could say is that he should have said something before he bought it and that I hope it works out for him.

I'm sure Intel GPUs are in a "better" spot, but he could have done better.

Blurb3947
Sep 30, 2022
The A750 and A770 aren't terrible cards

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
My DIY monitor project is coming along. I bought a laptop panel, "B173ZAN06.8", which is 17.3 inches with 4K and 120Hz, and an eDP-to-DP board. Then the board was overheating, so I had to go on Mouser.com and buy upgraded diodes and inductors to get the heat output in check.

Now that I have it plugged into my RTX 3080 and running stable, a pleasant surprise is that G-Sync compatible was automatically supported just by enabling it:



Pretty nice for a desktop driving a hacked laptop panel. Is there any way to get further details about the adaptive sync? Namely, what range has my 3080 decided to support? The panel's datasheet doesn't seem to mention a minimum frequency for adaptive sync; all I found was this:

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
Does the gsync indicator option in nvidia control panel always show when vrr is enabled for the display or does it only show when in range and active? If it is the latter, you could load up an intensive game with the option enabled and see if the indicator turns off when the frame rate drops below 48 FPS or whatever the minimum end of the range is.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Bloodplay it again posted:

Does the gsync indicator option in nvidia control panel always show when vrr is enabled for the display or does it only show when in range and active? If it is the latter, you could load up an intensive game with the option enabled and see if the indicator turns off when the frame rate drops below 48 FPS or whatever the minimum end of the range is.

Hmm, I'm not sure. I turned it on, and this is what I'm getting:

- 120hz, no frame limit: indicator on

- 120hz, frame limited to 20 FPS: indicator on

- custom resolution to set the monitor itself to 20hz, no frame limit: indicator off

So if I lower the panel itself to an unsupported refresh rate like 20Hz, it seems to disable G-Sync Compatible, but if I lower the framerate to 20 while the refresh rate is 120Hz, it stays on. The Nvidia control panel won't let me set a frame cap below 20, so it's hard to tell. I ran the Pendulum Demo with the 20 FPS cap enabled and it didn't seem to tear.

Does that sound like what others have experienced? I didn't know FreeSync could work down to 20 FPS, but it has been a while since I last checked.

some dillweed
Mar 31, 2007

I haven't personally tested mine, but Rtings has displays like the LG C1's VRR range working down to 20 fps/Hz, so what you're seeing could easily make sense.

New Zealand can eat me
Aug 29, 2008

:matters:


Zero VGS posted:

For a while, the only Twinaxial risers in existence were made by 3M and they were in the vicinity of $300 each. Be thankful these randos started making them in the past 5 years; $50 is a downright bargain in comparison, especially with inflation.

That's completely fair. Once I started looking into what else they were used for, it made a whole lot of sense and gave me a little peace of mind. "If it's good enough for modular military equipment, it must be good enough for my dumb bullshit."

Zero VGS posted:

My DIY monitor project is coming along. I bought a laptop panel, "B173ZAN06.8", which is 17.3 inches with 4K and 120Hz, and an eDP-to-DP board. Then the board was overheating, so I had to go on Mouser.com and buy upgraded diodes and inductors to get the heat output in check.

Now that I have it plugged into my RTX 3080 and running stable, a pleasant surprise is that G-Sync compatible was automatically supported just by enabling it:



Pretty nice for a desktop driving a hacked laptop panel. Is there any way to get further details about the adaptive sync? Namely, what range has my 3080 decided to support? The panel's datasheet doesn't seem to mention a minimum frequency for adaptive sync; all I found was this:



This is loving sick. Seriously, I was just talking about how much I'd love a few laptop-sized 4K/120Hz panels. Try 48; that's generally the absolute lowest you will see a decent FreeSync monitor support (though I do recall some 2K/4K monitors going as low as 30), and if that doesn't work, then it's 60. You'd have the most luck asking the nerds at TFTCentral or BlurBusters (and I'd wager someone at either of those forums has figured out the perfect EDID config already, or is able to). I'm not exactly sure how you'd confirm that it's working; generally when it's done with monitors, they have an OSD you can watch to see if it's actually adaptive or just flipping between /60 values. If you don't want to gently caress around, just use 60.

E: If it wasn't clear, this isn't about what your GPU has decided to support. You control the EDID of the display, and can edit that to say whatever you want. This is awesome except for all of the times you mess up and tell your GPU to give you something the monitor can't display.
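
If you want to see what the panel is advertising right now before you start overriding things, the raw EDID is sitting in the Windows registry. Rough, untested sketch below; note that a lot of laptop panels only expose their VRR range in a DisplayID/extension block, which this deliberately doesn't parse, so "nothing found" doesn't mean there's no range:

code:
# Dump EDIDs from the registry and look for the EDID 1.x "display range limits"
# descriptor (tag 0xFD), which carries min/max vertical refresh.
import winreg

def edid_blobs():
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as model_key:
                for j in range(winreg.QueryInfoKey(model_key)[0]):
                    instance = winreg.EnumKey(model_key, j)
                    try:
                        path = instance + r"\Device Parameters"
                        with winreg.OpenKey(model_key, path) as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                            yield model, bytes(edid)
                    except OSError:
                        pass  # no EDID stored for this instance

def range_limits(edid):
    # The base block holds four 18-byte descriptors at offsets 54/72/90/108.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]  # min / max vertical rate in Hz
    return None

for model, edid in edid_blobs():
    limits = range_limits(edid)
    if limits:
        print(f"{model}: vertical range {limits[0]}-{limits[1]} Hz")
    else:
        print(f"{model}: no range-limits descriptor in the base block")

CRU will show you the same descriptors with a GUI, but it's handy to have a dump you can diff before and after you start editing.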

New Zealand can eat me fucked around with this message at 11:54 on Apr 9, 2023

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

New Zealand can eat me posted:

E: If it wasn't clear, this isn't about what your GPU has decided to support. You control the EDID of the display, and can edit that to say whatever you want. This is awesome except for all of the times you mess up and tell your GPU to give you something the monitor can't display.

Is there something to edit/override the EDID from within Windows? It's nice right now because I'm streaming to a laptop with Moonlight, so even if I completely gently caress up monitor settings and the panel blacks out, I can still see what I'm doing via the stream.

Theris
Oct 9, 2007

Shipon posted:

Yep, pretty much any controller seems to just not let Windows go to sleep

This was fixed on 11 in a recent patch. Supposedly it will roll out to 10 eventually. Maybe.

New Zealand can eat me
Aug 29, 2008

:matters:


Zero VGS posted:

Is there something to edit/override the EDID from within Windows? It's nice right now because I'm streaming to a laptop with Moonlight, so even if I completely gently caress up monitor settings and the panel blacks out, I can still see what I'm doing via the stream.

CRU (Custom Resolution Utility). It has a try-revert process, but it's still easy enough to get into a stupid state.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I'd sure like to know why my 3080 can idle at a 50.6MHz memory clock when I have my two 4K displays configured to 144Hz and 120Hz respectively, but needs to go full idiot at 1187MHz, wasting an additional 50W of power, when I raise the second one from 120Hz to 144Hz. Given that there are multiple memory clocking steps, including 101MHz, why can't it just settle on that instead if 50MHz is too low? JFC Nvidia.

New Zealand can eat me
Aug 29, 2008

:matters:


I'm not doing the math, but that probably exceeds 600MHz total pixel clock. AMD struggles with the same issue. Chuggin' along at 105W for 2x 1080p/280Hz screens.

BurritoJustice
Oct 9, 2012

Combat Pretzel posted:

I'd sure like to know why my 3080 can idle at a 50.6MHz memory clock when I have my two 4K displays configured to 144Hz and 120Hz respectively, but needs to go full idiot at 1187MHz, wasting an additional 50W of power, when I raise the second one from 120Hz to 144Hz. Given that there are multiple memory clocking steps, including 101MHz, why can't it just settle on that instead if 50MHz is too low? JFC Nvidia.

If it's that borderline, you can probably make a custom timing profile with a reduced blanking interval and get them both to run at 144Hz without hitting the pixel clock limit that forces the VRAM to clock up.
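
For anyone wondering why blanking matters at all: the pixel clock is just htotal × vtotal × refresh rate, and the blanking interval is the dead space between the active resolution and those totals, so shrinking it shrinks the clock. Back-of-the-envelope sketch; the 4400x2250 total is the standard CTA 4K timing, while the reduced-blanking totals are illustrative rather than exact CVT-RB numbers:

code:
# Pixel clock = htotal * vtotal * refresh. Trimming blanking lowers the totals,
# which lowers the clock for the same active resolution and refresh rate.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# Standard CTA-861 4K timing: 3840x2160 active inside a 4400x2250 total.
print("4K144, standard blanking:", round(pixel_clock_mhz(4400, 2250, 144)), "MHz")

# Illustrative reduced-blanking totals (not exact CVT-RB values): less dead
# space around the active area, so the same refresh needs a lower clock.
print("4K144, reduced blanking :", round(pixel_clock_mhz(3920, 2200, 144)), "MHz")

Whether that saving is enough to duck under whatever internal threshold makes the VRAM clock up is card-specific, but it's why the custom timing trick can work.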

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

BurritoJustice posted:

If it's that borderline, you can probably make a custom timing profile with a reduced blanking interval and get them both to run at 144Hz without hitting the pixel clock limit that forces the VRAM to clock up.

I have to investigate this, thanks for pointing it out. A quick try with the built-in CVT Reduced Blank profile didn't help. I have to see how far I get using a manual profile.

This is mostly about G-Sync. In the past, I could actually choose between 75, 120, and 144Hz. But for whatever reason, half a year ago, Nvidia removed that; it now defaults to the highest refresh rate and doesn't allow selection of anything else. Before, I set them both to 120Hz and had low clocks and also G-Sync. I'm not sure if it's for all displays or specifically this Samsung one, because it's been problematic generally.

Dr. Video Games 0031
Jul 17, 2004

I have that issue with my Samsung monitor when I use HDMI but not with DisplayPort. HDMI just shows me 60hz and 165hz (this monitor's max refresh rate), and also only a couple oddball resolutions in addition to 4K. When using DisplayPort though, I get a range of refresh rate and resolution options. So it may be an issue with how Samsung handles EDID data across their different outputs.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Same story, only 144hz. What's odd is that the system acts like it's a different display with each combination. Then again, I also think that the firmware is generally FUBAR, because a different display type shows up in the registry. See this:



The LC49G95T is an Odyssey G9. I never had one of these. Also, it showed up again after scrubbing things with DDU.

Samsunged I guess.

Drakhoran
Oct 21, 2012

Nvidia is supposedly launching the 4070 on the 13th. Are there any rumors about when AMD will launch the rest of the Radeon 7000 series?

hobbesmaster
Jan 28, 2008


It's weird how we just collectively forgot that during the mid/late '90s the entire world was at a Dutch angle. I guess it's just something you got used to :shrug:

edit: holy poo poo, a surplus P3 Dell XPS in almost that exact config was the first "gaming" PC I had

edit 2: and all cards had random artifacting like that, which changed with driver versions. I completely forgot that was why, later on, you'd accept "oh yes, of course it's normal to download a video driver from some random dude's website because it has the right 'tweaks'."

hobbesmaster fucked around with this message at 19:22 on Apr 9, 2023

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://twitter.com/VideoCardz/status/1645393342547603457?t=kO76SltksTMjTAcuwjrhfA&s=19

Book it, let's see if this turns out to be true

repiv
Aug 13, 2009

:blastback: :pcgaming: :blaster:

https://www.youtube.com/watch?v=I-ORt8313Og

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
That’s a bigger change from old RT than I was expecting.

repiv
Aug 13, 2009

a few scenes are clearly too dark with the more realistic lighting because they were designed around light leaking :v:

epic ran into a similar issue when shipping lumen in fortnite; they ended up adding a hack to the renderer which intentionally allows a small amount of skylight to leak through walls, to ensure it never gets too dark


kliras
Mar 27, 2021
fun little video on vram and some of the subtle ways things start failing when games run out of it

https://www.youtube.com/watch?v=Rh7kFgHe21k
