|
hobbesmaster posted:192 bit = 6 chips. GDDR6 comes in 8Gb or 16Gb chips so 12 is the best they can do on a 4070 (ti). How did they only do 12 again? 12 isn’t a multiple of 8, last I checked. Is it supposed to be 2Gb chips, cuz 6x2 = 12.
|
# ? Apr 4, 2023 13:55 |
|
|
Dr. Video Games 0031 posted:I think that's a part of it. If developers can't stream as much data per second off your storage drive, then they may need to preload more data which causes longer load times and more VRAM consumption. It's also possible that developers are trying to account for players who have slower SATA SSDs. DirectStorage could help, but GPU decompression only came out early this year/late last year, so it may take some time for it to trickle into shipped games. even when games do start using GPU decompression, that's just trading CPU overhead for GPU overhead. i suspect this compute-based iteration of decompression is a stop-gap, and the plan is for future GPU architectures to have fixed function decompression blocks like the consoles
|
# ? Apr 4, 2023 14:01 |
|
Alex Battaglia retweeted this video of someone doing a from-the-ground-up simple implementation of ray tracing https://www.youtube.com/watch?v=Qz0KTGYJtUk I haven't coded in a long time, but I still understood a lot of it, and it explained a lot of things about how 3D rendering actually works, what a "shader" is, why images can look "noisy", how TAA accounts for that noise, and a number of other basic concepts related to GPUs and graphics. It's cool and educational.
|
# ? Apr 4, 2023 14:11 |
|
best of luck to non-ada-havers https://www.youtube.com/watch?v=Tk7Zbzd-6fs
|
# ? Apr 4, 2023 14:13 |
|
latinotwink1997 posted:How did they only do 12 again? 12 isn’t a multiple of 8, last I checked. Is it supposed to be 2Gb chips, cuz 6x2 = 12. gigabit, not gigabyte
|
# ? Apr 4, 2023 14:20 |
|
repiv posted:best of luck to non-ada-havers drat. Is that going to be usable at 4K on a 4080? Guessing yes with DLSS3 and if you don't mind framerates <100, but I always hear that the 4090 is so much better than everything else that I'm kinda "worried"...
|
# ? Apr 4, 2023 14:20 |
|
repiv posted:even when games do start using GPU decompression, that's just trading CPU overhead for GPU overhead I'm really skeptical that we'll see hardware like that make appreciable penetration into the PC gaming market anytime soon when we've already got such a wealth of extra cores sitting around doing nothing that we could use instead. For actually running today's games, 6 core processors remain perfectly sufficient, but Intel is throwing you 14 cores in the value i5 segment and 24 cores at the high end, while AMD offers you 12 or 16 cores for a pretty drat good value. Why not just pull 2 or 4 of those idle cores and use them for useful stuff? Hell, that's how I feel about AV1 hardware encode too: why go out of your way to seek that when you could just get the CPU with a ton of cores and CPU encode it with SVT-AV1?
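A minimal sketch of the "spare cores" idea, using Python's stdlib zlib with a thread pool standing in for whatever codec and job system a real engine would actually use (zlib releases the GIL on large buffers, so the threads really do spread across cores); the chunk count and worker count here are made up for illustration:

```python
# Sketch: decompress asset chunks on spare CPU cores instead of
# fixed-function hardware. zlib stands in for a real game codec.
import zlib
from concurrent.futures import ThreadPoolExecutor

def decompress_chunk(chunk: bytes) -> bytes:
    return zlib.decompress(chunk)

def load_assets(chunks, workers=4):
    # workers=4 ~ "pull 2 or 4 of those extra cores"
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(decompress_chunk, chunks))

# 8 fake 64KiB assets, compressed then loaded back in parallel
assets = [zlib.compress(bytes([i]) * 65536) for i in range(8)]
loaded = load_assets(assets)
print(sum(len(b) for b in loaded))  # 8 * 65536 = 524288
```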
|
# ? Apr 4, 2023 14:21 |
|
Hell yeah, 130 fps. So path traced gaming begins now?
|
# ? Apr 4, 2023 14:24 |
|
latinotwink1997 posted:How did they only do 12 again? 12 isn’t a multiple of 8, last I checked. Is it supposed to be 2Gb chips, cuz 6x2 = 12. 16x6 = 96 Gigabit 96/8 = 12 Gigabyte
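hobbesmaster's math above, spelled out (values are for the 192-bit bus and the 16Gb chips mentioned upthread; GDDR6 chips have a 32-bit interface):

```python
# GDDR6 capacity math: chips are sold in gigaBITS, not gigabytes.
bus_width = 192    # bits, the 4070 (ti) memory bus
chip_width = 32    # bits per GDDR6 chip interface
chip_gbit = 16     # densest available GDDR6 chip, in gigabits

chips = bus_width // chip_width        # 192 / 32 = 6 chips
total_gbit = chips * chip_gbit         # 6 x 16 = 96 Gb
total_gbyte = total_gbit // 8          # 96 / 8 = 12 GB
print(chips, total_gbit, total_gbyte)  # 6 96 12
```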
|
# ? Apr 4, 2023 14:36 |
|
TorakFade posted:drat. Is that going to be usable at 4K on a 4080? Guessing yes with DLSS3 and if you don't mind framerates <100, but I always hear that the 4090 is so much better than everything else that I'm kinda "worried"... they're not specifying which upscaling preset they used to hit that performance so
|
# ? Apr 4, 2023 14:39 |
|
Nfcknblvbl posted:Hell yeah, 130 fps. So path traced gaming begins now? How did they get 130fps from 16? I thought it was generating a frame after every real rendered frame
|
# ? Apr 4, 2023 15:24 |
|
mobby_6kl posted:How did they get 130fps from 16? I thought it was generating a frame after every real rendered frame it's comparing "RT on, [all] DLSS off" with "RT on, DLSS [2, and 3] on". it looks like a huge jump because it doesn't show you the intermediate step between DLSS off and DLSS 3, which is "just DLSS 2 turned on"
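Back-of-the-envelope version of that jump (the multipliers here are assumptions for illustration, not measured numbers from the video):

```python
# Illustrative only: rough multipliers, not benchmark data.
native_fps = 16       # "RT on, all DLSS off" figure from the video
upscale_gain = 4.0    # assumed DLSS 2 performance-mode speedup
framegen_gain = 2.0   # DLSS 3 inserts one generated frame per rendered frame

dlss2_fps = native_fps * upscale_gain    # the hidden intermediate step
dlss3_fps = dlss2_fps * framegen_gain    # lands near the quoted 130
print(dlss2_fps, dlss3_fps)              # 64.0 128.0
```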
|
# ? Apr 4, 2023 15:27 |
|
latinotwink1997 posted:How did they only do 12 again? 12 isn’t a multiple of 8, last I checked. Is it supposed to be 2Gb chips, cuz 6x2 = 12. I fixed my confusing missing units at the end there. Memory chips are always in Gigabits, memory assemblies are always in Gigabytes*. *Gibibytes, strictly. Unlike hard drives, solid state devices do use gibibytes for marketing: the memory standards group, JEDEC, defined gigabytes to be gibibytes and megabytes to be mebibytes.
|
# ? Apr 4, 2023 15:37 |
|
My favorite bewildering "power of 2/power of 10" fact is that the "Megabyte" that 3.5 inch floppies can store 1.44 of is defined as "1000 lots of 1024 bytes"
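You can check that cursed definition against the actual geometry of a high-density 3.5 inch disk (2 sides x 80 tracks x 18 sectors x 512 bytes):

```python
# The HD floppy's real capacity, from its disk geometry
bytes_total = 2 * 80 * 18 * 512       # 1,474,560 bytes

# Marketing "1.44 MB" = 1000 lots of 1024 bytes:
print(bytes_total / (1000 * 1024))    # 1.44 -- the floppy "megabyte"

# versus the two self-consistent definitions:
print(bytes_total / 1_000_000)        # 1.47456 decimal MB
print(bytes_total / 1_048_576)        # ~1.406 MiB
```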
|
# ? Apr 4, 2023 15:45 |
|
repiv posted:best of luck to non-ada-havers my body is ready
|
# ? Apr 4, 2023 16:05 |
|
repiv posted:best of luck to non-ada-havers sure hope this lands before I finish the game. cp2077 isn’t nearly compelling enough (to me) for me to play through it twice just for extra shiny.
|
# ? Apr 4, 2023 16:09 |
|
3080's 10GB of ram was fine in 2020 and 1440p, but doesn't feel that fine anymore in 2023 with 4K. I'd like a 16GB+ GPU with 2x the performance for under 1000€, but they are not available yet from Nvidia. I think Radeons cost too much too: a 7900XTX with 24GB ram is 50% faster but costs like 1300€ in Finland. Regular 7900 is not fast enough compared to a 3080, and a 4080 is also a meh upgrade at 1500€+.
|
# ? Apr 4, 2023 16:13 |
|
Did anybody pick up any of the 20GB 3080s? Apparently ~$450-550 used last year: https://www.tomshardware.com/news/geforce-rtx-3080-20gb-gpus-emerge-for-around-dollar575
|
# ? Apr 4, 2023 16:14 |
|
UHD posted:sure hope this lands before I finish the game. cp2077 isn’t nearly compelling enough (to me) for me to play through it twice just for extra shiny. it's due out on the 11th
|
# ? Apr 4, 2023 16:15 |
|
repiv posted:it's due out on the 11th There's also a big DLC coming this year sometime which will give you an excuse to go back and turn on all the fun graphics settings
|
# ? Apr 4, 2023 16:20 |
|
Twerk from Home posted:Did anybody pick up any of the 20GB 3080s? Apparently ~$450-550 used last year: https://www.tomshardware.com/news/geforce-rtx-3080-20gb-gpus-emerge-for-around-dollar575 I thought those were mining-only cards that don't work with Nvidia's driver.
|
# ? Apr 4, 2023 16:49 |
|
Zedsdeadbaby posted:My impression is that as more and more current-gen/PS5 ports come to PC, the PC ports are suffering because they have no equivalent of the PS5's super-fast decompression capability The PS5 also has 16GB of GDDR6. Edit: vvv Yes, I meant to tack on that it's on a 256-bit bus, which helps keep things speedy. My bad for posting while working and getting distracted. mdxi fucked around with this message at 17:26 on Apr 4, 2023 |
# ? Apr 4, 2023 17:04 |
|
mdxi posted:The PS5 also has 16GB of GDDR6. Which they don't use all of for the GPU because it is a shared memory pool and some of it goes to the CPU.
|
# ? Apr 4, 2023 17:11 |
|
some of it is also reserved for the OS, i don't think it's publicly known how much but it's probably a few gigs the last generation systems reserved about 3GB apparently https://www.gameinformer.com/b/news/archive/2013/07/26/report-playstation-4-reserves-nearly-half-of-ram-for-os.aspx
|
# ? Apr 4, 2023 17:15 |
|
Ihmemies posted:3080's 10GB of ram was fine in 2020 and 1440p, but doesn't feel that fine anymore in 2023 with 4K. In my opinion it was always a sub-par offering. The 1080 Ti, two generations and three and a half years earlier, already gave you 11GB, and it sat only one slot higher in the stack (where, two generations on, the equivalent card should surely sit two slots lower).
|
# ? Apr 4, 2023 17:18 |
|
hobbesmaster posted:Memory chips are always in Gigabits, memory assemblies are always in Gigabytes*. Ah, makes more sense then. The only time I usually see Gb used correctly is with internet companies trying to make their speeds seem faster (big numbers!). What’s their reasoning instead of calling them 1GB and 2GB chips? hobbesmaster posted:*Gibibytes. unlike hard drives solid state devices do use gibibytes for marketing. the memory standards group, JEDEC, defined gigabytes to be gibibytes and megabytes to be mebibytes. Ugh
|
# ? Apr 4, 2023 18:24 |
|
latinotwink1997 posted:Ah, makes more sense then. The only time I usually see Gb used correctly is with internet companies trying to make their speeds seem faster (big numbers!). What’s their reasoning instead of calling them 1GB and 2GB chips? At least for last mile, "number of symbols on the physical layer per second" is actually the most honest way to report, because that's what is actually being delivered. Each layer of the OSI model adds checksums and other overhead. Symbols and bits aren't synonymous in most communications schemes, but the concept from voice modems remains: your 56k modem could signal 56000 bits per second over the phone line, but your OS would never see that full rate because modems have to do their own checks before the raw data is sent to the PC. ISPs of course are also selling you a connection to the wider internet, but the same is true for every connection step, so why not be consistent. I'm not 100% on the history of the bits thing on memory, but with individual chips you're also talking about individual pins, which each represent 1 bit. E.g. GDDR6 is 32 bits wide, meaning there are 32 wires in parallel that can be transmitted over at the same time. Again, consistency is nice here. You do have to get used to the physical-to-software shift between bits and bytes if you're looking at, say, registers, memory maps, or device tables. At least someone set the rules for memory, so even if you don't like it, it's consistent.
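A toy sketch of why the per-pin view is convenient: bus width in bits times per-pin data rate gives bandwidth directly, with the divide-by-8 only appearing at the end (the 21 Gbps rate here is an assumed figure for illustration, not any specific card's spec):

```python
# Hypothetical 192-bit GDDR6 setup: 6 chips x 32 pins each
bus_width_bits = 192
data_rate_gbps = 21    # per-pin data rate, assumed for illustration

# Every pin moves one bit per transfer, so bandwidth falls straight out:
bandwidth_gbs = bus_width_bits * data_rate_gbps / 8
print(bandwidth_gbs)   # 504.0 GB/s
```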
|
# ? Apr 4, 2023 18:42 |
|
repiv posted:some of it is also reserved for the OS, i don't think it's publicly known how much but it's probably a few gigs i think the DF guys said they understand it to be about 3.5GB reserved for the PS5 OS
|
# ? Apr 4, 2023 23:18 |
|
https://www.youtube.com/watch?v=74Ve8mjqhQo&t=740s Even at 45W, the 4050 is better than expected, tbh. It's 30% faster than the 85W 3050 Ti in two of the three games tested (the other being 10% faster), and it matches the mobile 3060 twice too. That said, they seem to be overcharging the poo poo out of the laptop parts too. That $1000 price tag is crazy when there are cheaper 3060 laptops with much better screens and build quality.
|
# ? Apr 5, 2023 01:03 |
|
repiv posted:best of luck to non-ada-havers Am I the only one noping out when those overbright 'realistic' windows came up? Natural sunlight sucks balls, that's why my crib is draped with double curtains plus cutouts during the summer. (can't install shutters etc because it's a listed ancient building) Keep that poo poo out my games ffs. Too much bright/dark contrast and it washes out all the colors. Also I couldn't care less what some water puddle reflection looks like at steep angles when I'm sprinting and parkouring through the streets at 50 mph. sauer kraut fucked around with this message at 01:23 on Apr 5, 2023 |
# ? Apr 5, 2023 01:16 |
|
It's not even more natural, and it seems unrelated to the ray tracing. They just made the image more overexposed for some reason, and it makes it look much worse than it should. Hopefully HDR mode fixes it, or if not, they let you tone it down.
|
# ? Apr 5, 2023 01:22 |
|
sauer kraut posted:Am I the only one noping out when those overbright 'realistic' windows came up? GoonClosingBlindsFromFireworks.jpg
|
# ? Apr 5, 2023 01:25 |
|
Dr. Video Games 0031 posted:It's not even more natural, and it seems unrelated to the ray tracing. They just made the image more overexposed for some reason, and it makes it look much worse than it should. Hopefully HDR mode fixes it, or if not, they let you tone it down. That scene is in the opening tutorial sort of mission and you go from crawling in a dark hallway to that room, so it might be simulating acclimation time.
|
# ? Apr 5, 2023 01:29 |
|
Subjunctive posted:That scene is in the opening tutorial sort of mission and you go from crawling in a dark hallway to that room, so it might be simulating acclimation time. I really don't get when they do that effect... my own eyes can do that just fine.
|
# ? Apr 5, 2023 01:37 |
|
your monitor has significantly less dynamic range than your eyes, so the game has to simulate fake eyes to gauge how to set the exposure appropriately and compress it down to the range the monitor can display in theory if you had a HDR monitor with an absurdly high brightness ceiling then yes, you could just rely on your eyes, but otherwise bloom and dynamic exposure are the workarounds repiv fucked around with this message at 01:49 on Apr 5, 2023 |
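A minimal sketch of that "fake eye", assuming a classic auto-exposure plus Reinhard tone map (real engines use a log-average luminance and smooth the exposure over time, but the shape is the same):

```python
# Sketch: average scene luminance drives an exposure scale, then a
# Reinhard curve compresses HDR values into the monitor's [0, 1) range.
def reinhard(l: float) -> float:
    return l / (1.0 + l)

def auto_expose(luminances, key=0.18):
    # key=0.18 is the photographic "middle grey" target
    avg = sum(luminances) / len(luminances)   # real engines: log average
    exposure = key / avg
    return [reinhard(l * exposure) for l in luminances]

# Dark hallway vs. blown-out window: same code, wildly different exposure
print(auto_expose([0.01, 0.02, 0.05]))
print(auto_expose([5.0, 50.0, 500.0]))
```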
# ? Apr 5, 2023 01:39 |
|
You also don’t want to own / power a monitor capable of making you squint like sun rays in a dimly lit room.
|
# ? Apr 5, 2023 01:45 |
|
Shumagorath posted:You also don’t want to own / power a monitor capable of making you squint like sun rays in a dimly lit room. Maybe not a monitor but a therapy lamp is a good thing
|
# ? Apr 5, 2023 01:48 |
|
Shumagorath posted:You also don’t want to own / power a monitor capable of making you squint like sun rays in a dimly lit room. I want a monitor bright enough that it bleaches my shadow into the wall behind me. Tesla can sell me a huge capacitor bank for it.
|
# ? Apr 5, 2023 02:03 |
|
Subjunctive posted:I want a monitor bright enough that it bleaches my shadow into the wall behind me. Tesla can sell me a huge capacitor bank for it. https://www.youtube.com/watch?v=WlFVPnGEb8o
|
# ? Apr 5, 2023 02:05 |
|
|
as an amateur photographer it was very interesting getting an LG C1 OLED and then watching the LOTR trilogy. It's sort of like a "zone system" for luminosity in addition to tonality: things can have greater or lesser luminosity on top of their actual tonal color, like the shift between dark caves and bright plains. And really the peak luminosity is only used for certain things, like lightning and gandalf the white's staff and the balrog's whip; it was kinda obvious when they were firing up 1000 nits for an effect. perhaps it's done less deliberately in stuff that wasn't retroactively mastered up to HDR after the fact, I haven't noticed it since even on HDR content, but yeah, it was kinda used as an eye-searer in LOTR at least, and that was the film where I studied it the most deliberately.
|
# ? Apr 5, 2023 02:10 |