mobby_6kl
Aug 9, 2009

by Fluffdaddy


Lol well there's the reason nvidia only let Cyberpunk reviews out first.

spunkshui
Oct 5, 2011



mobby_6kl posted:



Lol well there's the reason nvidia only let Cyberpunk reviews out first.

The second I saw they were only allowing reviewers to test Cyberpunk, I knew it wasn't worth my time to learn anything about the product.

Other than knowing it is garbage.

repiv
Aug 13, 2009

CPUs have too many cores now, can we bring back software rendering until GPUs get their poo poo together

kliras
Mar 27, 2021
it's so stupid that nvidia gpu prices are what they are, so streamers might just end up getting some vanity threadripper pc around october for av1 at like ten times the power draw

also a shame that you need a 4070ti or up for two nvenc streams

kliras fucked around with this message at 14:58 on Jun 28, 2023

Icept
Jul 11, 2001
You can use RTX framegen, so that means it's a great deal, right? Right?

kliras
Mar 27, 2021
the people don't want to play starfield with rtx, they want to play it with the most miserable bloom cranked up to 300 like bethesda intended

Arzachel
May 12, 2012

Icept posted:

You can use RTX framegen, so that means it's a great deal, right? Right?

Wish I could framegen a good GPU

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

I further retract my comments about the $250 RX 7600 after seeing that it's on par with or just beats the 4060 most of the time

repiv
Aug 13, 2009

it's funny how the reaction to starfield being AMD sponsored is overwhelmingly negative basically everywhere; even on /r/AMD there's a strong sentiment that their sponsorships are now all downside (sandbagging upscalers and raytracing) with no discernible upside even for AMD users

repiv fucked around with this message at 15:23 on Jun 28, 2023

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Idk if it's true across the board, but AMD "optimized" games do sometimes allow AMD cards to punch above their weight

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Rinkles posted:

Idk if it's true across the board, but AMD "optimized" games do sometimes allow AMD cards to punch above their weight

On the one hand, Starfield needs to run acceptably on the consoles, even including the Series S which has about the GPU power of a 6500 XT with more memory bandwidth.

I'd bet that starfield is decently AMD optimized and plays to their advantages but it will not matter one bit because all the Nvidia people who would complain have 3080s or above and it'll run fine there anyway.

Edit: 4060 looking like poo poo, way to go Nvidia.

Twerk from Home fucked around with this message at 15:32 on Jun 28, 2023

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Too bad, maybe the 4050 will be good value

This is a joke

FuturePastNow
May 19, 2014


Nvidia's marketing people should be fed into a frame generator feet-first

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



mobby_6kl posted:



Lol well there's the reason nvidia only let Cyberpunk reviews out first.

Lol. Lmao

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
I don't know what you all are complaining about, this new 3060 looks pretty good vs the older 4060 and 4060 Ti:



It's even got 12GB of VRAM to future proof a tiny bit! I know that it's only a couple FPS faster than the 4060 Ti, but still this new Ampere seems to perform a bit better than last year's Ada.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Twerk from Home posted:

I don't know what you all are complaining about, this new 3060 looks pretty good vs the older 4060 and 4060 Ti:



It's even got 12GB of VRAM to future proof a tiny bit! I know that it's only a couple FPS faster than the 4060 Ti, but still this new Ampere seems to perform a bit better than last year's Ada.

What’s going on here?

Yudo
May 15, 2003

4060 bad? Cherry picked benchmarks misleading? No, I never, not my Nvidia!

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Twerk from Home posted:

I don't know what you all are complaining about, this new 3060 looks pretty good vs the older 4060 and 4060 Ti:



It's even got 12GB of VRAM to future proof a tiny bit! I know that it's only a couple FPS faster than the 4060 Ti, but still this new Ampere seems to perform a bit better than last year's Ada.

Don't worry though, no-one games at 1080p!


Yudo posted:

4060 bad? Cherry picked benchmarks misleading? No, I never, not my Nvidia!

It's all actually AMD's fault, for <insert 10 pages of reasons>

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Nvidia's just trying to give Intel a chance to catch up :3:

Anime Schoolgirl
Nov 28, 2002

Twerk from Home posted:

On the one hand, Starfield needs to run acceptably on the consoles, even including the Series S which has about the GPU power of a 6500 XT with more memory bandwidth.

I'd bet that starfield is decently AMD optimized and plays to their advantages but it will not matter one bit because all the Nvidia people who would complain have 3080s or above and it'll run fine there anyway.

Edit: 4060 looking like poo poo, way to go Nvidia.
it runs at 1280p or 1296p on Series X with a frame cap of 30, so if the renderer is the bottleneck I'd expect it to be around 720p on Series S. However, the scenes and assets don't seem to be particularly complicated or detailed, so I wonder if it's the case that DRS is triggering too fast when the framerate dips below 30 for a couple of frames.
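For what it's worth, the kind of DRS behaviour being guessed at above is easy to sketch. This is a toy controller with made-up thresholds, not anything from Bethesda's engine: reacting to a single slow frame thrashes the resolution, averaging over a short window doesn't.

```python
from collections import deque

TARGET_MS = 33.3   # 30 fps frame budget
WINDOW = 10        # frames to average over (arbitrary illustrative value)

class DrsController:
    """Toy dynamic-resolution controller: adjusts render scale from
    recent frame times instead of reacting to every single frame."""

    def __init__(self, scale=1.0, min_scale=0.55):
        self.scale = scale          # 1.0 = native resolution
        self.min_scale = min_scale  # floor on the render scale
        self.history = deque(maxlen=WINDOW)

    def update(self, frame_ms):
        self.history.append(frame_ms)
        avg = sum(self.history) / len(self.history)
        # Only react when the *average* misses the budget, so a dip
        # below 30 fps for a couple of frames doesn't immediately
        # knock the resolution down.
        if avg > TARGET_MS * 1.05:
            self.scale = max(self.min_scale, self.scale - 0.05)
        elif avg < TARGET_MS * 0.90:
            self.scale = min(1.0, self.scale + 0.05)
        return self.scale
```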

Dr. Video Games 0031
Jul 17, 2004

Benchmark averages tend to vary from +15% better than the 3060 to +22% or thereabouts. A very :effort: result.

It is not as comically bad as the 4060 Ti in my opinion, but it still has its moments.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

repiv posted:

CPUs have too many cores now, can we bring back software rendering until GPUs get their poo poo together

Nanite has software rendering, right?

repiv
Aug 13, 2009

yeah, but it's software rasterization running on the GPU in compute shaders

Farecoal
Oct 15, 2011

There he go

repiv posted:

/r/fucktaa is very funny because they're absolutely convinced that everyone hates TAA as much as they do, even as their posts get like... 10 upvotes

it takes some unique brainworms to look at the UE5 City sample with no AA and think "this is preferable to TAA"

https://www.youtube.com/watch?v=6vg-duTMj6M&t=175s

this guy actually tries running 200% render percentage at 1080p (so rendering at 4K) without AA and you can see there's still aliasing; 4K alone isn't enough

Whatever happened to FXAA and such

Truga
May 4, 2014
Lipstick Apathy
https://www.youtube.com/watch?v=oMNFoNtKCR8

Truga
May 4, 2014
Lipstick Apathy

Farecoal posted:

Whatever happened to FXAA and such

they're not so good at removing jaggies when going for a photorealistic look, so some people decided they're trash

temporal antialiasing & co often introduce blur or ghosting, which is insanely more jarring than jaggies to me, but apparently it's perfectly fine for everyone else??

repiv
Aug 13, 2009

FXAA/SMAA1x were only effective against "macro scale" aliasing patterns like around the silhouettes of objects, they can't do anything about subpixel aliasing which is a much bigger issue now

in that UE5 City video you can see him enable FXAA at 4:30 and it just makes the image blurrier without meaningfully reducing the amount of aliasing

repiv fucked around with this message at 20:50 on Jun 28, 2023
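A rough way to see why, sketched with a toy version of the luma-contrast test that FXAA-style filters start from (real FXAA is more involved, and the threshold here is just illustrative): the filter only acts where neighbouring pixels differ a lot, and sub-pixel detail that barely shifts a pixel's colour never trips that test.

```python
import numpy as np

def luma(rgb):
    # Rec. 709 luma weights, rgb as floats in [0, 1]
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def fxaa_like_edge_mask(rgb, threshold=0.125):
    """Toy edge test: mark pixels whose luma differs strongly from their
    neighbours. Only these would get blurred by an FXAA-style filter."""
    l = luma(rgb)
    # 4 direct neighbours (wrapping at the border, for brevity)
    n, s = np.roll(l, 1, axis=0), np.roll(l, -1, axis=0)
    w, e = np.roll(l, 1, axis=1), np.roll(l, -1, axis=1)
    contrast = np.maximum.reduce([l, n, s, w, e]) - np.minimum.reduce([l, n, s, w, e])
    # Sub-pixel geometry that only nudges a pixel's colour slightly stays
    # below the threshold, so the filter leaves the shimmer alone.
    return contrast > threshold
```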

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Rinkles posted:

What’s going on here?

memory bandwidth has been slashed and burned

steckles
Jan 14, 2006

Yeah, morphological AA is good at smoothing lines, but pretty bad at resolving Moiré patterns, which is what more-triangles-than-pixels will usually give you. The whole concept was pretty neat when it landed on the PS3 though. I recall being blown away by how smooth God of War 3 looked when it came out. I suppose it might still have use in a pure SSAA setting to remove some high frequencies before downsampling and get you a slightly smoother image for the same sample count.

Actually, that reminds me: I would sometimes combat Moiré when resizing photos by adding a light noise layer over the affected areas in Photoshop before downsampling, and it'd often work really well to smooth stuff out while still looking natural. I wonder if you could apply something stupid like that, driven by some metric like the ratio of a nanite mesh's surface area to its projected size, to break things up and trade Moiré for "filmic" noise. I'd guess it'd look terrible and require a ton of scene-specific tuning, but it might be a fun thing to code up.
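Roughly what that photo trick looks like in code, assuming numpy/Pillow; the noise level is made up, and you'd really want it masked to the affected areas rather than applied globally:

```python
import numpy as np
from PIL import Image

def downsample_with_dither(path, out_size, noise_sigma=2.0):
    """Add light noise before downsampling so regular high-frequency
    detail turns into grain instead of Moiré bands."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)
    noise = np.random.normal(0.0, noise_sigma, img.shape).astype(np.float32)
    dithered = np.clip(img + noise, 0, 255).astype(np.uint8)
    # Lanczos keeps things sharp; the noise has already broken up the
    # patterns that would otherwise alias into Moiré.
    return Image.fromarray(dithered).resize(out_size, Image.LANCZOS)
```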

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

HalloKitty posted:

memory bandwidth has been slashed and burned

but that still seems like an incredible outlier

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Rinkles posted:

but that still seems like an incredible outlier

Yeah, it's obviously cherry-picked for comedy purposes; it must be sensitive in a very specific way. It'd be interesting to know why.

repiv
Aug 13, 2009

steckles posted:

Yeah, morphological AA is good at smoothing lines, but pretty bad at resolving Moiré patterns, which is what more-triangles-than-pixels will usually give you. The whole concept was pretty neat when it landed on the PS3 though. I recall being blown away by how smooth God of War 3 looked when it came out. I suppose it might still have use in a pure SSAA setting to remove some high frequencies before downsampling and get you a slightly smoother image for the same sample count.

Actually, that reminds me: I would sometimes combat Moiré when resizing photos by adding a light noise layer over the affected areas in Photoshop before downsampling, and it'd often work really well to smooth stuff out while still looking natural. I wonder if you could apply something stupid like that, driven by some metric like the ratio of a nanite mesh's surface area to its projected size, to break things up and trade Moiré for "filmic" noise. I'd guess it'd look terrible and require a ton of scene-specific tuning, but it might be a fun thing to code up.

maybe once they're all-in on software rasterization they could randomize the sample positions per-pixel to trade aliasing for noise? unreal isn't at the point where they can get that experimental though, they still use hardware rasterization for parts of the pipeline so the software rasterizer needs to match the regular sampling grid of the HW rasterizer for them to fit together seamlessly
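As a toy illustration of that idea (a single-triangle coverage test, nothing like Nanite's actual rasterizer): sampling every pixel at its centre produces structured aliasing once triangles get smaller than pixels, while jittering the sample position inside each pixel trades that structure for noise.

```python
import random

def edge(ax, ay, bx, by, px, py):
    # Signed area: positive if (px, py) lies to the left of edge a->b
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def covered(tri, px, py):
    (ax, ay), (bx, by), (cx, cy) = tri
    w0 = edge(bx, by, cx, cy, px, py)
    w1 = edge(cx, cy, ax, ay, px, py)
    w2 = edge(ax, ay, bx, by, px, py)
    return (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0)

def rasterize(tri, width, height, jitter=False):
    """Point-sample a triangle once per pixel, either at the pixel centre
    (regular grid, aliasing) or at a random point inside the pixel (noise)."""
    img = []
    for y in range(height):
        row = []
        for x in range(width):
            sx = x + (random.random() if jitter else 0.5)
            sy = y + (random.random() if jitter else 0.5)
            row.append(1 if covered(tri, sx, sy) else 0)
        img.append(row)
    return img
```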

Weird Pumpkin
Oct 7, 2007

repiv posted:

FXAA/SMAA1x were only effective against "macro scale" aliasing patterns like around the silhouettes of objects, they can't do anything about subpixel aliasing which is a much bigger issue now

in that UE5 City video you can see him enable FXAA at 4:30 and it just makes the image blurrier without meaningfully reducing the amount of aliasing

Is there an explainer about what subpixel aliasing would mean?

I'd think that any aliasing within a pixel wouldn't show up, since each pixel can only be one color. But I might also just be HILARIOUSLY out of date on how monitors and stuff work

or is it just that if enough of it stacks up it still shows up as jaggies, but because of the way those jaggies are formed, the previous forms of anti-aliasing don't deal with them properly?

repiv
Aug 13, 2009

subpixel elements become a major problem in motion, because they might be visible on some frames if they happen to land on the singular point within the pixel that the rasterizer samples, and then not on the next frame, and then maybe visible again after that, resulting in horrible flickering/shimmering artifacts

FXAA and friends worked by looking for clearly defined edges in the image and then blurring/smoothing them over, but something that small doesn't have any edges to speak of
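The flicker part is easy to see with a throwaway example: a feature 0.3 px wide only gets drawn on frames where it happens to straddle the pixel-centre sample point, so as it drifts a fraction of a pixel per frame it pops in and out (all numbers here are made up).

```python
def sliver_visible(left, width=0.3, sample_x=0.5):
    """A vertical sliver narrower than a pixel is only rendered on frames
    where it covers the single sample point at the pixel centre."""
    return left <= sample_x < left + width

# Drift the sliver 0.13 px per frame and watch it flicker on and off.
for frame in range(10):
    left = (frame * 0.13) % 1.0
    print(frame, "visible" if sliver_visible(left) else "gone")
```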

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Truga posted:

they're not so good at removing jaggies when going for a photorealistic look, so some people decided they're trash

temporal antialiasing & co often introduce blur or ghosting, which is insanely more jarring than jaggies to me, but apparently it's perfectly fine for everyone else??

Yeah, I'm the same: when I enable temporal antialiasing I see ghosting all the time, to the point where it makes the game unplayable (to me).

Truga
May 4, 2014
Lipstick Apathy
the only game i've played where it got *too* jarring was black desert, where moving through grass at speed turned it into green sludge lmao

but yeah, it can get annoying in many cases

Instant Grat
Jul 31, 2009

Just add
NERD RAAAAAAGE


lol

repiv
Aug 13, 2009

finewine is back!

Weird Pumpkin
Oct 7, 2007

repiv posted:

subpixel elements become a major problem in motion, because they might be visible on some frames if they happen to land on the singular point within the pixel that the rasterizer samples, and then not on the next frame, and then maybe visible again after that, resulting in horrible flickering/shimmering artifacts

FXAA and friends worked by looking for clearly defined edges in the image and then blurring/smoothing them over, but something that small doesn't have any edges to speak of

Ohhh I see, thanks for the quick explanation!

steckles
Jan 14, 2006

repiv posted:

maybe once they're all-in on software rasterization they could randomize the sample positions per-pixel to trade aliasing for noise? unreal isn't at the point where they can get that experimental though, they still use hardware rasterization for parts of the pipeline so the software rasterizer needs to match the regular sampling grid of the HW rasterizer for them to fit together seamlessly

I wonder how much performance they'd be losing by just rasterizing everything in the shader. Although if I recall the nanite paper correctly, they're using a DDA-type algorithm for that, which might be tough to adapt to per-pixel sample points, so maybe adding that would be impractical right now. Hardware support for per-pixel sample locations would be cool though.

That no-TAA unreal video looked shimmery as hell, but I did really like the complete lack of ghosting. Maybe I've spent too long playing with graphics algorithms and the parts of my brain that notice small rendering errors are over-developed, but I've always been super sensitive to temporal artefacts.
