|
Nfcknblvbl posted:I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now? I only watch long form videos about games I want to play and then never do.
|
# ? Dec 5, 2023 23:56 |
|
Honestly I game on my Steam Deck more than my PC these days
|
# ? Dec 6, 2023 01:06 |
|
i play games a ton, but i also only buy gpus every 5 years or so. 980ti lasted me over 6, and it'd probably last 7 or 8 if nvidia didn't skimp on vram and put 8 instead of 6 on it, because it ran all the games i play fine, it's just alttabbing with a game running got ridiculously slow when the 6gb started to be not quite enough
|
# ? Dec 6, 2023 01:16 |
|
Truga posted:i play games a ton, but i also only buy gpus every 5 years or so. 980ti lasted me over 6, and it'd probably last 7 or 8 if nvidia didn't skimp on vram and put 8 instead of 6 on it, because it ran all the games i play fine, it's just alttabbing with a game running got ridiculously slow when the 6gb started to be not quite enough They didn't skimp on VRAM with the 980ti lol, you can level that criticism at a lot of modern GPUs but the 6GB on the 980ti is just a product of its time. It's a very large 384-bit memory bus, and they fully loaded it with the most dense chips available at the time. They could've done 12GB, but it would've required sandwiching VRAM on both sides of the PCB, which is a cost and reliability nightmare. The only times they've done sandwich VRAM on consumer cards are the 3090 (which failed a lot) and, more recently, the 4060ti (because that card was a panicked reflex to VRAM backlash).
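The bus-width arithmetic in that post works out as a quick back-of-the-envelope calculation. A minimal sketch, assuming the standard GDDR5/GDDR6 figures (32-bit interface per chip, 4Gb being the densest GDDR5 available in 2015) — none of the function names here come from the thread:

```python
# Back-of-the-envelope VRAM capacity from bus width and chip density.
# GDDR5/GDDR6 chips each have a 32-bit interface, so a 384-bit bus
# hosts 12 chips; a "clamshell" layout doubles that by putting chips
# on both sides of the PCB.

def max_vram_gb(bus_width_bits: int, chip_gb: float, clamshell: bool = False) -> float:
    chips = bus_width_bits // 32
    if clamshell:
        chips *= 2
    return chips * chip_gb

# 980 Ti era: the densest GDDR5 chips were 4Gb (0.5 GB) each
print(max_vram_gb(384, 0.5))                  # 6.0  -> the 980 Ti's 6GB
print(max_vram_gb(384, 0.5, clamshell=True))  # 12.0 -> needs double-sided VRAM
print(max_vram_gb(512, 0.5))                  # 8.0  -> the R9 390's 8GB on its 512-bit bus
```

Same math explains Truga's 8GB Radeons: AMD got there single-sided only because the R9 390 carried an unusually wide 512-bit bus.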
|
# ? Dec 6, 2023 01:26 |
|
Truga posted:i play games a ton, but i also only buy gpus every 5 years or so. 980ti lasted me over 6, and it'd probably last 7 or 8 if nvidia didn't skimp on vram and put 8 instead of 6 on it, because it ran all the games i play fine, it's just alttabbing with a game running got ridiculously slow when the 6gb started to be not quite enough I'm in the same boat but I try to stay up to date on GPU stuff just because it's always great drama to rubberneck.
|
# ? Dec 6, 2023 01:41 |
|
I just started playing through Star Wars: The Force Unleashed for the first time in many years. It is hilarious playing on a 4090: with the default 30 FPS cap the game has, the GPU doesn't even come out of its 210 MHz idle desktop clocks to run it (and only reaches like 40% utilization at that), and the power consumption on the 4090 is all of 20W, when it idles at 16. I remember when this game brought a GPU of mine to its knees.
|
# ? Dec 6, 2023 01:46 |
|
Nfcknblvbl posted:I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now? I used my 3060Ti to play Snowrunner yesterday.
|
# ? Dec 6, 2023 01:52 |
|
BurritoJustice posted:The only times they've done sandwich VRAM on consumer cards is the 3090 (which failed a lot) It failed because both Nvidia (with the FE) and most of the partners cheaped out on proper cooling, up to and including even the smallest of efforts. You don't need to excuse/justify Nvidia's actions for everything, regardless of how much you like them.
|
# ? Dec 6, 2023 02:41 |
|
yeah don't the high end pro cards have sandwich VRAM to cram 48GB onto them, and it's not like those have particularly exotic cooling solutions it's clearly doable
|
# ? Dec 6, 2023 02:46 |
|
Yeah, and in many instances those pro cards are under just as much thermal pressure as the 3090s were seeing for mining, etc. But I guess in that case, when cards are costing several thousand dollars or more, it's easier to justify putting $10 pads everywhere.
|
# ? Dec 6, 2023 02:48 |
|
Nfcknblvbl posted:I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now? I play games often, but for all the time I spend playing around with hardware and games, it's still a hobby for me. Capital "G" Gamers, if that's what you are referring to, take what's a hobby for most and turn it into a lifestyle, and thereby become insufferable.
|
# ? Dec 6, 2023 02:49 |
|
Truga posted:that's all great, but radeons in 2015 shipped with 8 People have completely memory holed that Nvidia shipped the 3060 with 12GB of VRAM for $329 MSRP a few years ago (because it was the peak of GPU shortage and they weren't ever really available for that price).
|
# ? Dec 6, 2023 02:49 |
|
Nfcknblvbl posted:I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now? esports games can run on potatoes and typically should be run on lowest quality anyway, people aren't buying 4090s to play counterstrike lol
|
# ? Dec 6, 2023 02:52 |
|
I thought esports games all need to be run at 300 fps or why even bother. I can barely tell above 75 but I also don’t do stranger danger multiplayer
|
# ? Dec 6, 2023 03:11 |
|
Twerk from Home posted:People have completely memory holed that Nvidia shipped the 3060 with 12GB of VRAM for $329 MSRP a few years ago (because it was the peak of GPU shortage and they weren't ever really available for that price). This whole thing started because he said the 980ti was being stingy with RAM, despite there being no games I am aware of that would have come close to using more than 6GB of VRAM when the 980ti was released.
|
# ? Dec 6, 2023 03:16 |
|
Also i’m pretty sure 8gb VRAM wouldn’t be the norm in anything but the high end for a long time after the 900 series, even for amd
|
# ? Dec 6, 2023 03:22 |
|
Canned Sunshine posted:It failed because both Nvidia (with the FE) and most of the partners cheaped out on proper cooling, up to and including even the smallest of efforts. I ain't excusing poo poo, I literally said they've skimped out in other cases in the same sentence. Truga posted:that's all great, but radeons in 2015 shipped with 8 The R9 390 had a comically large 512-bit bus; it was literally the last GPU to do so. It's just revisionist history to suggest that NVIDIA not putting the largest physically possible amount of VRAM on the 980ti was some egregious gimping move. BurritoJustice fucked around with this message at 03:31 on Dec 6, 2023 |
# ? Dec 6, 2023 03:29 |
|
FuzzySlippers posted:I thought esports games all need to be run at 300 fps or why even bother. I can barely tell above 75 but I also don’t do stranger danger multiplayer A 1080Ti can run CS2 around 180-200 FPS at 1080p and most of the esports people just run at 1080p or even 720p if they can
|
# ? Dec 6, 2023 03:32 |
|
UHD posted:Also i’m pretty sure 8gb VRAM wouldn’t be the norm in anything but the high end for a long time after the 900 series, even for amd The GTX 1070 had 8gb and was $379. The RX 480 8gb was $239. Both came out in 2016. Cards stagnated on 8gb for a loooong time.
|
# ? Dec 6, 2023 03:46 |
|
Shipon posted:A 1080Ti can run CS2 around 180-200 FPS at 1080p and most of the esports people just run at 1080p or even 720p if they can Hell, a lot of them (and people who want to be like them) are still running at poo poo like 1280x1024 stretched. Peripheral vision? What's that?
|
# ? Dec 6, 2023 03:52 |
|
Lockback posted:This whole thing started because he said the 980ti was being stingy with RAM, despite there being no games I am aware of that would have come close to using more than 6GB of VRAM when the 980ti was released. Oh how quickly people have forgotten Skyrim...
|
# ? Dec 6, 2023 05:24 |
|
Nfcknblvbl posted:I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now? I play a lot of games and the 4090 is my favorite card of all time. Playing through FF7 Remake at 120 FPS 4K right now, along with Pikmin 4 at 60 FPS on Yuzu. 2023 has been insane for games, truly. e: super excited for Forbidden West early 2024, that game is going to look bonkers Taima fucked around with this message at 05:40 on Dec 6, 2023 |
# ? Dec 6, 2023 05:35 |
|
Inept posted:The GTX 1070 had 8gb and was $379. The RX 480 8gb was $239. Both came out in 2016. Because until games built for the PS5/Xbox Series X came out, that was fine for almost everything. Now we're on a new generation of games.
|
# ? Dec 6, 2023 07:11 |
|
Well the good news is that by the time GTA6 is out, I'd be able to get a used 4090 for like $200 probably
|
# ? Dec 6, 2023 10:44 |
|
mobby_6kl posted:Well the good news is that by the time GTA6 is out, I'd be able to get a used 4090 for like $200 probably 200 new dollars, worth 200,000 old dollars
|
# ? Dec 6, 2023 10:57 |
|
https://twitter.com/VideoCardz/status/1732316650920096010?s=20
|
# ? Dec 6, 2023 14:03 |
|
Why not just... stop selling it
|
# ? Dec 6, 2023 14:47 |
|
oh for... why Jensen
|
# ? Dec 6, 2023 14:51 |
|
I guess vram must be expensive
|
# ? Dec 6, 2023 14:53 |
|
MarcusSA posted:I guess vram must be expensive https://www.tomshardware.com/news/gddr6-vram-prices-plummet $27 for 8 gigabytes on the spot market as of June, so yeah, nvidia really has to rein in costs here.
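For scale, the spot-price figure from that article works out like this. A rough sketch only — spot prices aren't what Nvidia actually pays under long-term contracts, and board/bus changes add costs beyond the chips themselves:

```python
# Rough BOM math from the cited Tom's Hardware spot price:
# ~$27 per 8 GB of GDDR6 on the spot market (mid-2023 figure).
PRICE_PER_GB = 27.0 / 8  # ~$3.38 per GB

def vram_cost(gigabytes: float) -> float:
    """Approximate memory cost of a card at spot pricing."""
    return gigabytes * PRICE_PER_GB

for gb in (6, 8, 12):
    print(f"{gb} GB: ~${vram_cost(gb):.2f}")
# 6 GB: ~$20.25
# 8 GB: ~$27.00
# 12 GB: ~$40.50
```

So the difference between a 6GB and a 12GB card is on the order of $20 of memory, which is the point being made sarcastically above.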
|
# ? Dec 6, 2023 15:04 |
|
The 3050 is kind of a crappy card, isn't it? My guess is the only use case moving forward is as an encoder/decoder or really light gaming, so VRAM doesn't matter. I don't know why you don't just make a 3050 LE or something, but Nvidia naming has been mostly about creating confusion/trolling for a while now.
|
# ? Dec 6, 2023 15:17 |
|
Lockback posted:The 3050 is kind of a crappy card isn't it? My guess is the only use case moving forward is as a encoder/decoder or really light gaming so VRam doesn't matter. It's fine, I used it to upgrade my friend's PC and it's perfectly serviceable at playing BG3 at 1440p if you enable DLSS. She doesn't super care about new releases so the $180 I paid was worth it.
|
# ? Dec 6, 2023 15:29 |
|
https://www.computerbase.de/2023-12...rr_funktioniert FSR3 now supports VRR as of the version shipping in Avatar, so it'll be due a second chance. I think the game is getting mid reviews though
|
# ? Dec 6, 2023 16:29 |
|
change my name posted:It's fine, I used it to upgrade my friend's PC and it's perfectly serviceable at playing BG3 at 1440p if you enable DLSS. She doesn't super care about new releases so the $180 I paid was worth it. It’s a fine low end card at 8gb. 6 is way too little for anything remotely new. poo poo the 1070 had 8gb. If you just want to play old games? Sure, but I’d argue that at that point it’s still a mediocre deal at 6gb especially when you look at used cards.
|
# ? Dec 6, 2023 16:33 |
|
repiv posted:https://www.computerbase.de/2023-12...rr_funktioniert Alex Battaglia seems to like it
|
# ? Dec 6, 2023 16:34 |
|
Cyrano4747 posted:It’s a fine low end card at 8gb. 6 is way too little for anything remotely new. poo poo the 1070 had 8gb. Oh, I just meant the 8GB version. The cut-down 6GB variant is trash, especially since you can pick up a used 12GB 3060 for $200 even these days pretty easily
|
# ? Dec 6, 2023 16:35 |
|
change my name posted:Oh, I just meant the 8GB version. The cut-down 6GB variant is trash, especially since you can pick up a used 12GB 3060 for $200 even these days pretty easily Nvidia won't sell you a 12GB 3060 though, production of the 3060 has been fully replaced with the slower 3060 8GB at the same MSRP: https://www.techspot.com/review/2581-nvidia-rtx-3060-8gb/ These are on the shelves at Best Buy, people are paying $300 for them. Cynically, I think the only reason they exist is to let Nvidia keep selling some cheap Ampere without making their new GPUs look stingy on VRAM.
|
# ? Dec 6, 2023 16:41 |
|
change my name posted:Oh, I just meant the 8GB version. The cut-down 6GB variant is trash, especially since you can pick up a used 12GB 3060 for $200 even these days pretty easily In this example BG3 runs fine on 6GB though. You wouldn't see a difference between 6 and 8.
|
# ? Dec 6, 2023 17:06 |
|
Lockback posted:In this example BG3 runs fine on 6GB though. You wouldn't see a difference between 6 and 8. This example is likely not true in this specific instance as Nvidia will probably cut down the memory bus even further and stick with the lower 115-watt power cap that the revised 8GB version was saddled with. Thankfully this was a launch model I bought allllllll the way back through Newegg's GPU lottery (lol) change my name fucked around with this message at 17:20 on Dec 6, 2023 |
# ? Dec 6, 2023 17:12 |