latinotwink1997
Jan 2, 2008

Taste my Ball of Hope, foul dragon!


hobbesmaster posted:

192 bit = 6 chips. GDDR6 comes in 8Gb or 16Gb chips so 12 is the best they can do on a 4070 (ti).

How did they only do 12 again? 12 isn’t a multiple of 8, last I checked. Is it supposed to be 2Gb chips, cuz 6x2 = 12.

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

I think that's a part of it. If developers can't stream as much data per second off your storage drive, then they may need to preload more data which causes longer load times and more VRAM consumption. It's also possible that developers are trying to account for players who have slower SATA SSDs. DirectStorage could help, but GPU decompression only came out early this year/late last year, so it may take some time for it to trickle into shipped games.

even when games do start using GPU decompression, that's just trading CPU overhead for GPU overhead

i suspect this compute-based iteration of decompression is a stop-gap, and the plan is for future GPU architectures to have fixed function decompression blocks like the consoles

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Alex Battaglia retweeted this video of someone doing a from-the-ground-up simple implementation of ray tracing

https://www.youtube.com/watch?v=Qz0KTGYJtUk

I haven't coded in a long time, but I still understood a lot of it, and it explained a lot of things about how 3D rendering actually works, what a "shader" is, why images can look "noisy", how TAA accounts for that noise, and a number of other basic concepts related to GPUs and graphics. It's cool and educational.

repiv
Aug 13, 2009

best of luck to non-ada-havers

https://www.youtube.com/watch?v=Tk7Zbzd-6fs

Instant Grat
Jul 31, 2009

Just add
NERD RAAAAAAGE

latinotwink1997 posted:

How did they only do 12 again? 12 isn’t a multiple of 8, last I checked. Is it supposed to be 2Gb chips, cuz 6x2 = 12.

gigabit, not gigabyte

TorakFade
Oct 3, 2006

I strongly disapprove



drat. Is that going to be usable at 4K on a 4080? Guessing yes with DLSS3 and if you don't mind framerates <100, but I always hear that the 4090 is so much better than everything else that I'm kinda "worried"...

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

repiv posted:

even when games do start using GPU decompression, that's just trading CPU overhead for GPU overhead

i suspect this compute-based iteration of decompression is a stop-gap, and the plan is for future GPU architectures to have fixed function decompression blocks like the consoles

I'm really skeptical that we'll see hardware like that make appreciable penetration into the PC gaming market anytime soon when we've already got such a wealth of extra cores sitting around doing nothing that we could use instead. For actually running today's games, 6 core processors remain perfectly sufficient, but Intel is throwing you 14 cores in the value i5 segment and 24 cores at the high end, while AMD offers you 12 or 16 cores for a pretty drat good value.

Why not just pull 2 or 4 of those extra cores that are sitting around doing nothing and use them for useful stuff? Hell, that's how I feel about AV1 hardware encode too, why go out of your way to seek that when you could just get the CPU with a ton of cores and CPU encode it with SVT-AV1?
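
A toy sketch of the spare-cores-for-decompression idea (zlib is just standing in for whatever codec a real engine would use; none of this is from an actual engine): chop the compressed asset stream into independently-compressed chunks and hand them to a small worker pool.

code:
import zlib
from concurrent.futures import ProcessPoolExecutor

def decompress_chunk(chunk: bytes) -> bytes:
    # Each chunk is compressed independently, so workers need no shared state.
    return zlib.decompress(chunk)

if __name__ == "__main__":
    # Fake "asset" chunks; a real pipeline would be reading these off the SSD.
    chunks = [zlib.compress(bytes([i]) * 4_000_000) for i in range(8)]

    # Dedicate a handful of otherwise-idle cores to decompression.
    with ProcessPoolExecutor(max_workers=4) as pool:
        assets = list(pool.map(decompress_chunk, chunks))

    print(sum(len(a) for a in assets), "bytes decompressed")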

Nfcknblvbl
Jul 15, 2002

Hell yeah, 130 fps. So path traced gaming begins now?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

latinotwink1997 posted:

How did they only do 12 again? 12 isn’t a multiple of 8, last I checked. Is it supposed to be 2Gb chips, cuz 6x2 = 12.

16 Gb x 6 = 96 gigabits

96 / 8 = 12 gigabytes
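
As a quick Python sketch of the same arithmetic, assuming one 32-bit interface per GDDR6 chip (as explained further down the thread) and the 8 Gb / 16 Gb chip densities:

code:
BITS_PER_CHIP = 32            # each GDDR6 chip has a 32-bit interface
DENSITIES_GBIT = (8, 16)      # available GDDR6 chip densities

def vram_options(bus_width_bits: int) -> list[float]:
    """Possible VRAM sizes in gigabytes for a given memory bus width."""
    chips = bus_width_bits // BITS_PER_CHIP         # 192-bit bus -> 6 chips
    return [chips * d / 8 for d in DENSITIES_GBIT]  # gigabits -> gigabytes

print(vram_options(192))  # [6.0, 12.0] -> 12 GB is the most a 4070 (Ti) bus carries
print(vram_options(256))  # [8.0, 16.0]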

repiv
Aug 13, 2009

TorakFade posted:

drat. Is that going to be usable at 4K on a 4080? Guessing yes with DLSS3 and if you don't mind framerates <100, but I always hear that the 4090 is so much better than everything else that I'm kinda "worried"...

they're not specifying which upscaling preset they used to hit that performance so :iiam:

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Nfcknblvbl posted:

Hell yeah, 130 fps. So path traced gaming begins now?

How did they get 130fps from 16? I thought it was generating a frame after every real rendered frame

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

mobby_6kl posted:

How did they get 130fps from 16? I thought it was generating a frame after every real rendered frame

it's comparing "RT on, [all] DLSS off" with "RT on, DLSS [2, and 3] on"

it looks like a huge jump because it doesn't show you the intermediate step between DLSS off and DLSS 3, which is "just DLSS 2 turned on"
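
Roughly, with made-up round numbers just to show the shape of the jump (the real scaling factors aren't in the chart):

code:
native_fps = 16           # "RT on, all DLSS off"
upscaler_speedup = 4.0    # DLSS 2 rendering at a fraction of output res (illustrative)
framegen_factor = 2.0     # DLSS 3: one generated frame per rendered frame, ignoring overhead

dlss2_fps = native_fps * upscaler_speedup   # ~64 fps: the hidden intermediate step
dlss3_fps = dlss2_fps * framegen_factor     # ~128 fps: in the ballpark of the quoted 130
print(dlss2_fps, dlss3_fps)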

hobbesmaster
Jan 28, 2008

latinotwink1997 posted:

How did they only do 12 again? 12 isn’t a multiple of 8, last I checked. Is it supposed to be 2Gb chips, cuz 6x2 = 12.

I fixed my confusing missing units at the end there. Memory chips are always in Gigabits, memory assemblies are always in Gigabytes*.

*Gibibytes. unlike hard drives solid state devices do use gibibytes for marketing. the memory standards group, JEDEC, defined gigabytes to be gibibytes and megabytes to be mebibytes.

Instant Grat
Jul 31, 2009

Just add
NERD RAAAAAAGE
My favorite bewildering "power of 2/power of 10" fact is that the "Megabyte" that 3.5 inch floppies can store 1.44 of is defined as "1000 lots of 1024 bytes"
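
The arithmetic checks out against the usual 3.5 inch geometry (2 sides x 80 tracks x 18 sectors x 512 bytes - the geometry isn't from this thread, just the standard spec):

code:
bytes_total = 2 * 80 * 18 * 512        # sides * tracks * sectors/track * bytes/sector
print(bytes_total)                      # 1474560
print(bytes_total / (1000 * 1024))      # 1.44 "megabytes", the marketing unit
print(bytes_total / 1_000_000)          # ~1.47 decimal MB
print(bytes_total / (1024 * 1024))      # ~1.41 MiB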

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

my body is ready

UHD
Nov 11, 2006



sure hope this lands before I finish the game. cp2077 isn’t nearly compelling enough (to me) for me to play through it twice just for extra shiny.

Ihmemies
Oct 6, 2012

3080's 10GB of ram was fine in 2020 at 1440p, but doesn't feel that fine anymore in 2023 with 4K. I'd like a 16GB+ GPU with 2x the performance, for under 1000€, but they are not available yet from Nvidia. I think Radeons cost too much too: a 7900XTX with 24GB ram is 50% faster but costs like 1300€ in Finland. The regular 7900 is not fast enough compared to a 3080. A 4080 is also a meh upgrade, and costs 1500€+.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
Did anybody pick up any of the 20GB 3080s? Apparently ~$450-550 used last year: https://www.tomshardware.com/news/geforce-rtx-3080-20gb-gpus-emerge-for-around-dollar575

repiv
Aug 13, 2009

UHD posted:

sure hope this lands before I finish the game. cp2077 isn’t nearly compelling enough (to me) for me to play through it twice just for extra shiny.

it's due out on the 11th

CBD Corndog
Jun 21, 2009



repiv posted:

it's due out on the 11th

There's also a big DLC coming this year sometime which will give you an excuse to go back and turn on all the fun graphics settings

FuturePastNow
May 19, 2014


Twerk from Home posted:

Did anybody pick up any of the 20GB 3080s? Apparently ~$450-550 used last year: https://www.tomshardware.com/news/geforce-rtx-3080-20gb-gpus-emerge-for-around-dollar575

I thought those were mining-only cards that don't work with Nvidia's driver.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Zedsdeadbaby posted:

My impression is that as more and more current-gen/PS5 ports come to PC, the PC ports are suffering because they have no equivalent of the PS5's super-fast decompression capability

PCs have directstorage but it's just not being used, at all

The PS5 also has 16GB of GDDR6.

Edit: vvv Yes, I meant to tack on that it's on a 256-bit bus, which helps keep things speedy. My bad for posting while working and getting distracted.

mdxi fucked around with this message at 17:26 on Apr 4, 2023

Kibner
Oct 21, 2008

Acguy Supremacy

mdxi posted:

The PS5 also has 16GB of GDDR6.

Which they don't use all of for the GPU because it is a shared memory pool and some of it goes to the CPU.

repiv
Aug 13, 2009

some of it is also reserved for the OS, i don't think it's publicly known how much but it's probably a few gigs

the last generation systems reserved about 3GB apparently

https://www.gameinformer.com/b/news/archive/2013/07/26/report-playstation-4-reserves-nearly-half-of-ram-for-os.aspx

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Ihmemies posted:

3080's 10GB of ram was fine in 2020 and 1440p, but doesn't feel that fine anymore in 2023 with 4K.

In my opinion it was always a sub-par offering. Two generations and three and a half years earlier, and only one slot higher in the stack (a tier that should surely sit two slots lower after two generations), the 1080 Ti already gave you 11GB.

latinotwink1997
Jan 2, 2008

Taste my Ball of Hope, foul dragon!


hobbesmaster posted:

Memory chips are always in Gigabits, memory assemblies are always in Gigabytes*.

Ah, makes more sense then. The only time I usually see Gb used correctly is with internet companies trying to make their speeds seem faster (big numbers!). What’s their reasoning instead of calling them 1GB and 2GB chips?

hobbesmaster posted:

*Gibibytes. unlike hard drives solid state devices do use gibibytes for marketing. the memory standards group, JEDEC, defined gigabytes to be gibibytes and megabytes to be mebibytes.

Ugh

hobbesmaster
Jan 28, 2008

latinotwink1997 posted:

Ah, makes more sense then. The only time I usually see Gb used correctly is with internet companies trying to make their speeds seem faster (big numbers!). What’s their reasoning instead of calling them 1GB and 2GB chips?

At least for the last mile, the “number of symbols on the physical layer per second” is actually the most honest way to report, because that’s what is actually being delivered. Each OSI layer adds checksums and other overhead. Symbols and bits aren’t synonymous in most communications schemes, but the concept from voice modems remains - i.e. your 56k modem could send 56000 symbols per second over the phone line, but your OS would never see that rate because modems have to do their own checks before the raw data is sent to the PC. ISPs are of course also selling you a connection to the wider internet, but the same is true for every connection step, so why not be consistent.

I’m not 100% on the history of the bits thing for memory, but with individual chips you’re also talking about individual pins, each of which carries 1 bit. E.g. GDDR6 is 32 bits wide, meaning there are 32 wires that can be driven in parallel at the same time. Again, consistency is nice here. You do have to get used to the physical-to-software shift between bits and bytes if you’re looking at, say, registers, memory maps, or device tables or whatever.

At least someone set the rules for memory, so even if you don’t like it, it’s consistent.
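
For what it's worth, the per-pin convention also makes the bandwidth math fall out cleanly; the 21 Gbps per pin below is just an example data rate, not a figure from this thread:

code:
def bandwidth_GBs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Aggregate memory bandwidth in GB/s: bits per second across the whole bus, divided by 8."""
    return bus_width_bits * gbps_per_pin / 8

print(bandwidth_GBs(192, 21.0))  # 504.0 GB/s on a 192-bit bus at 21 Gbps per pin
print(bandwidth_GBs(256, 21.0))  # 672.0 GB/s on a 256-bit bus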

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.

repiv posted:

some of it is also reserved for the OS, i don't think it's publicly known how much but it's probably a few gigs

the last generation systems reserved about 3GB apparently

https://www.gameinformer.com/b/news/archive/2013/07/26/report-playstation-4-reserves-nearly-half-of-ram-for-os.aspx

i think the DF guys said they understand it to be about 3.5GB reserved for the PS5 OS

Dr. Video Games 0031
Jul 17, 2004

https://www.youtube.com/watch?v=74Ve8mjqhQo&t=740s

Even at 45W, the 4050 is better than expected, tbh. It's 30% faster than the 85W 3050 Ti in two of the three games tested (the other being 10% faster), and it matches the mobile 3060 twice too.

That said, they seem to be overcharging the poo poo out of the laptop parts too. That $1000 price tag is crazy when there are cheaper 3060 laptops with much better screens and build quality.

sauer kraut
Oct 2, 2004

Am I the only one noping out when those overbright 'realistic' windows came up?
Natural sunlight sucks balls, that's why my crib is draped with double curtains plus cutouts during the summer. (can't install shutters etc because it's a listed ancient building)
Keep that poo poo out my games ffs. Too much bright/dark contrast and it washes out all the colors.
Also I couldn't care less what some water puddle reflection looks like at steep angles when I'm sprinting and parkouring through the streets at 50 mph.

sauer kraut fucked around with this message at 01:23 on Apr 5, 2023

Dr. Video Games 0031
Jul 17, 2004

It's not even more natural, and it seems unrelated to the ray tracing. They just made the image more overexposed for some reason, and it makes it look much worse than it should. Hopefully HDR mode fixes it, or if not, they let you tone it down.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

sauer kraut posted:

Am I the only one noping out when those overbright 'realistic' windows came up?
Natural sunlight sucks balls, that's why my crib is draped with double curtains plus cutouts during the summer. (can't install shutters etc because it's a listed ancient building)
Keep that poo poo out my games ffs. Too much bright/dark contrast and it washes out all the colors.
Also I couldn't care less what some water puddle reflection looks like at steep angles when I'm sprinting and parkouring through the streets at 50 mph.

GoonClosingBlindsFromFireworks.jpg

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Dr. Video Games 0031 posted:

It's not even more natural, and it seems unrelated to the ray tracing. They just made the image more overexposed for some reason, and it makes it look much worse than it should. Hopefully HDR mode fixes it, or if not, they let you tone it down.

That scene is in the opening tutorial sort of mission and you go from crawling in a dark hallway to that room, so it might be simulating acclimation time.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Subjunctive posted:

That scene is in the opening tutorial sort of mission and you go from crawling in a dark hallway to that room, so it might be simulating acclimation time.

I really don't get when they do that effect... my own eyes can do that just fine.

repiv
Aug 13, 2009

your monitor has significantly less dynamic range than your eyes, so the game has to simulate fake eyes to gauge how to set the exposure appropriately and compress it down to the range the monitor can display

in theory if you had a HDR monitor with an absurdly high brightness ceiling then yes, you could just rely on your eyes, but otherwise bloom and dynamic exposure are the workarounds
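
As a toy sketch of what that "fake eye" amounts to - log-average the scene brightness, pick an exposure, then squash the result into 0..1 - with Reinhard standing in for the tone-map curve (real games use fancier operators; this isn't from any particular engine):

code:
import numpy as np

def auto_expose_and_tonemap(scene_luminance: np.ndarray, key: float = 0.18) -> np.ndarray:
    """Squeeze unbounded scene luminance into the 0..1 range an SDR monitor can show."""
    # "Fake eye": adapt exposure to the log-average brightness of what's on screen.
    log_avg = np.exp(np.mean(np.log(scene_luminance + 1e-6)))
    exposed = scene_luminance * (key / log_avg)
    # Reinhard tone map: compress highlights instead of clipping them to white.
    return exposed / (1.0 + exposed)

# A dark hallway pixel, a mid-tone, and two blown-out window pixels (arbitrary scene units).
print(auto_expose_and_tonemap(np.array([0.05, 0.2, 50.0, 500.0])))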

repiv fucked around with this message at 01:49 on Apr 5, 2023

Shumagorath
Jun 6, 2001
You also don’t want to own / power a monitor capable of making you squint like sun rays in a dimly lit room.

Arivia
Mar 17, 2011

Shumagorath posted:

You also don’t want to own / power a monitor capable of making you squint like sun rays in a dimly lit room.

Maybe not a monitor but a therapy lamp is a good thing

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Shumagorath posted:

You also don’t want to own / power a monitor capable of making you squint like sun rays in a dimly lit room.

I want a monitor bright enough that it bleaches my shadow into the wall behind me. Tesla can sell me a huge capacitor bank for it.

repiv
Aug 13, 2009

Subjunctive posted:

I want a monitor bright enough that it bleaches my shadow into the wall behind me. Tesla can sell me a huge capacitor bank for it.

:hmmyes:

https://www.youtube.com/watch?v=WlFVPnGEb8o

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
as an amateur photographer it was very interesting getting an LG C1 OLED and then watching the LOTR trilogy. It's sort of like a "zone system" for luminosity on top of the one for tonality: things can have greater/lesser luminosity in addition to their actual tonal color - the shift between dark caves and bright plains, etc. And really the peak luminosity is only used for certain things like lightning and gandalf the white's staff and the balrog's whip; it was kinda obvious when they were firing up 1000 nits for an effect.

perhaps it's less deliberately done in stuff that wasn't mastered up to HDR after the fact - I haven't noticed it since, even on HDR content - but yeah, it was kinda used as an eye-searer in LOTR at least, and that was the film where I studied it the most deliberately.
