MarcusSA
Sep 23, 2007

Nfcknblvbl posted:

I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now?

I only watch long form videos about games I want to play and then never do.


change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Honestly I game on my Steam Deck more than my PC these days

Truga
May 4, 2014
Lipstick Apathy
i play games a ton, but i also only buy gpus every 5 years or so. the 980ti lasted me over 6, and it'd probably have lasted 7 or 8 if nvidia hadn't skimped on vram and had put 8GB on it instead of 6, because it ran all the games i play fine; it's just that alt-tabbing with a game running got ridiculously slow once the 6GB stopped being quite enough

BurritoJustice
Oct 9, 2012

Truga posted:

i play games a ton, but i also only buy gpus every 5 years or so. the 980ti lasted me over 6, and it'd probably have lasted 7 or 8 if nvidia hadn't skimped on vram and had put 8GB on it instead of 6, because it ran all the games i play fine; it's just that alt-tabbing with a game running got ridiculously slow once the 6GB stopped being quite enough

They didn't skimp on VRAM with the 980ti lol, you can level that criticism at a lot of modern GPUs but the 6GB on the 980ti is just a product of its time. It's a very large 384-bit memory bus, and they fully loaded it with the densest chips available at the time. They could've done 12GB, but it would've required sandwiching VRAM on both sides of the PCB, which is a cost and reliability nightmare. The only times they've done sandwich VRAM on consumer cards are the 3090 (which failed a lot) and, more recently, the 4060 Ti (because that card was a panicked reflex to VRAM backlash).
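
A rough back-of-the-envelope sketch of the capacity math being described here (assuming 32-bit-wide GDDR5 chips and the 4Gb / 512MB densities that were the largest available in 2015; the vram_gb helper is just for illustration):

    # Rough VRAM capacity math for a GPU memory bus (illustrative sketch).
    # Each GDDR5 chip has a 32-bit interface, so bus width / 32 gives the chip
    # count in a normal single-sided layout; clamshell ("sandwich") doubles it.

    def vram_gb(bus_width_bits: int, chip_capacity_gb: float, clamshell: bool = False) -> float:
        chips = bus_width_bits // 32          # one chip per 32-bit channel
        if clamshell:
            chips *= 2                        # chips on both sides of the PCB
        return chips * chip_capacity_gb

    # 980 Ti: 384-bit bus with 0.5 GB (4 Gb) chips -> 6 GB single-sided, 12 GB clamshell
    print(vram_gb(384, 0.5))                  # 6.0
    print(vram_gb(384, 0.5, clamshell=True))  # 12.0

    # R9 390: 512-bit bus with the same 0.5 GB chips -> 8 GB single-sided
    print(vram_gb(512, 0.5))                  # 8.0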

Cyrano4747
Sep 25, 2006

Yes, I know I'm old, get off my fucking lawn so I can yell at these clouds.

Truga posted:

i play games a ton, but i also only buy gpus every 5 years or so. the 980ti lasted me over 6, and it'd probably have lasted 7 or 8 if nvidia hadn't skimped on vram and had put 8GB on it instead of 6, because it ran all the games i play fine; it's just that alt-tabbing with a game running got ridiculously slow once the 6GB stopped being quite enough

I'm in the same boat, but I try to stay up to date on GPU stuff just because it's always great drama to rubberneck.

Truga
May 4, 2014
Lipstick Apathy

BurritoJustice posted:

They didn't skimp on VRAM with the 980ti lol, you can level that criticism at a lot of modern GPUs but the 6GB on the 980ti is just a product of its time. It's a very large 384-bit memory bus, and they fully loaded it with the densest chips available at the time. They could've done 12GB, but it would've required sandwiching VRAM on both sides of the PCB, which is a cost and reliability nightmare. The only times they've done sandwich VRAM on consumer cards are the 3090 (which failed a lot) and, more recently, the 4060 Ti (because that card was a panicked reflex to VRAM backlash).
that's all great, but radeons in 2015 shipped with 8 :v:

Indiana_Krom
Jun 18, 2007
Net Slacker
I just started playing through Star Wars: The Force Unleashed for the first time in many years. It is hilarious playing it on a 4090: with the game's default 30 FPS cap, the GPU doesn't even come out of its 210 MHz idle desktop clocks to run it (and only reaches about 40% utilization at that), and power consumption on the 4090 is all of 20W, when it idles at 16. I remember when this game brought a GPU of mine to its knees.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Nfcknblvbl posted:

I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now?

I used my 3060Ti to play Snowrunner yesterday.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



BurritoJustice posted:

The only times they've done sandwich VRAM on consumer cards are the 3090 (which failed a lot)

It failed because both Nvidia (with the FE) and most of the partners cheaped out on proper cooling, up to and including even the smallest effort of eating a small added cost for better thermal pads on the VRAM behind the backplate.

You don't need to excuse/justify Nvidia's actions for everything, regardless of how much you like them.

repiv
Aug 13, 2009

yeah don't the high end pro cards have sandwich VRAM to cram 48GB onto them, and it's not like those have particularly exotic cooling solutions

it's clearly doable

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Yeah, and in many instances those pro cards are under just as much thermal pressure as the 3090s saw from mining, etc.

But I guess in that case, when cards are costing several thousand dollars or more, it's easier to justify putting $10 pads everywhere.

Bloopsy
Jun 1, 2006

you have been visited by the Tasty Garlic Bread. you will be blessed by having good Garlic Bread in your life time, but only if you comment "ty garlic bread" in the thread below

Nfcknblvbl posted:

I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now?

I play games often, but for as much time as I spend playing around with hardware and games, it's still a hobby for me. Capital "G" Gamers, if that's what you are referring to, take what's a hobby for most, turn it into a lifestyle, and thereby become insufferable.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Truga posted:

that's all great, but radeons in 2015 shipped with 8 :v:

People have completely memory holed that Nvidia shipped the 3060 with 12GB of VRAM for $329 MSRP a few years ago (because it was the peak of GPU shortage and they weren't ever really available for that price).

Shipon
Nov 7, 2005

Nfcknblvbl posted:

I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now?

esports games can run on potatoes and typically should be run on lowest quality anyway, people aren't buying 4090s to play counterstrike lol

FuzzySlippers
Feb 6, 2009

I thought esports games all need to be run at 300 fps or why even bother. I can barely tell above 75 but I also don’t do stranger danger multiplayer :shrug:

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Twerk from Home posted:

People have completely memory holed that Nvidia shipped the 3060 with 12GB of VRAM for $329 MSRP a few years ago (because it was the peak of GPU shortage and they weren't ever really available for that price).

This whole thing started because he said Nvidia was being stingy with VRAM on the 980ti, despite there being no games I am aware of that would have come close to using more than 6GB of VRAM when the 980ti was released.

UHD
Nov 11, 2006


Also I'm pretty sure 8GB of VRAM wasn't the norm in anything but the high end for a long time after the 900 series, even for AMD

BurritoJustice
Oct 9, 2012

Canned Sunshine posted:

It failed because both Nvidia (with the FE) and most of the partners cheaped out on proper cooling, up to and including even the smallest effort of eating a small added cost for better thermal pads on the VRAM behind the backplate.

You don't need to excuse/justify Nvidia's actions for everything, regardless of how much you like them.

I ain't excusing poo poo, I literally said they've skimped out in other cases in the same sentence.

Truga posted:

that's all great, but radeons in 2015 shipped with 8 :v:

The R9 390 had a comically large 512-bit bus; it was literally the last GPU to use one.

It's just revisionist history to suggest that NVIDIA not putting the largest physically possible amount of VRAM on the 980ti was some egregious gimping move.

BurritoJustice fucked around with this message at 03:31 on Dec 6, 2023

Shipon
Nov 7, 2005

FuzzySlippers posted:

I thought esports games all need to be run at 300 fps or why even bother. I can barely tell above 75 but I also don’t do stranger danger multiplayer :shrug:

A 1080Ti can run CS2 around 180-200 FPS at 1080p and most of the esports people just run at 1080p or even 720p if they can

Inept
Jul 8, 2003

UHD posted:

Also I'm pretty sure 8GB of VRAM wasn't the norm in anything but the high end for a long time after the 900 series, even for AMD

The GTX 1070 had 8gb and was $379. The RX 480 8gb was $239. Both came out in 2016.

Cards stagnated on 8gb for a loooong time.

Kazinsal
Dec 13, 2011

Shipon posted:

A 1080Ti can run CS2 around 180-200 FPS at 1080p and most of the esports people just run at 1080p or even 720p if they can

Hell, a lot of them (and people who want to be like them) are still running at poo poo like 1280x1024 stretched. Peripheral vision? What's that?

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Lockback posted:

This whole thing started because he said Nvidia was being stingy with VRAM on the 980ti, despite there being no games I am aware of that would have come close to using more than 6GB of VRAM when the 980ti was released.

Oh how quickly people have forgotten Skyrim...

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Nfcknblvbl posted:

I find it kind of strange how little overlap there is between this GPU thread and there being Gamers in here. Are all of you only generating anime waifus with your 4090s now?

I play a lot of games and the 4090 is my favorite card of all time. Playing through ff7 remake at 120 fps 4k right now along with Pikmin 4 at 60fps on Yuzu :cabot: 2023 has been insane for games, truly.

e: super excited for Forbidden West early 2024, that game is going to look bonkers

Taima fucked around with this message at 05:40 on Dec 6, 2023

Shipon
Nov 7, 2005

Inept posted:

The GTX 1070 had 8gb and was $379. The RX 480 8gb was $239. Both came out in 2016.

Cards stagnated on 8gb for a loooong time.

because until games designed for the ps5/xbox series x came out, that was Fine for almost everything. now we're on a new generation of games

mobby_6kl
Aug 9, 2009

by Fluffdaddy


Well the good news is that by the time GTA6 is out, I'd be able to get a used 4090 for like $200 probably

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

mobby_6kl posted:



Well the good news is that by the time GTA6 is out, I'd be able to get a used 4090 for like $200 probably

200 new dollars, worth 200,000 old dollars

shrike82
Jun 11, 2005

https://twitter.com/VideoCardz/status/1732316650920096010?s=20

:lol:

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.


Why not just... stop selling it

orcane
Jun 13, 2012

Fun Shoe
oh for... why Jensen :psyduck:

MarcusSA
Sep 23, 2007


I guess vram must be expensive

Inept
Jul 8, 2003

MarcusSA posted:

I guess vram must be expensive

https://www.tomshardware.com/news/gddr6-vram-prices-plummet

$27 for 8 gigabytes on the spot market as of June, so yeah, nvidia really has to rein in costs here.
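
To put that spot price in perspective, a quick sketch of what extra GDDR6 would add to a card's bill of materials (illustrative only: it treats the article's ~$27-per-8GB June figure as a flat per-gigabyte rate, ignoring contract pricing and bus-width constraints):

    # Rough BOM impact of extra GDDR6 at the quoted spot price (illustrative only).
    price_per_8gb = 27.0               # USD, June spot price per the linked article
    price_per_gb = price_per_8gb / 8   # ~$3.38 per GB

    for extra_gb in (2, 4, 8):
        print(f"+{extra_gb} GB of GDDR6 is roughly ${extra_gb * price_per_gb:.2f} at spot")
    # +2 GB ~ $6.75, +4 GB ~ $13.50, +8 GB ~ $27.00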

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
The 3050 is kind of a crappy card, isn't it? My guess is the only use case moving forward is as an encoder/decoder or really light gaming, so VRAM doesn't matter.

I don't know why they don't just make a 3050 LE or something, but Nvidia naming has been mostly about creating confusion/trolling for a while now.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Lockback posted:

The 3050 is kind of a crappy card, isn't it? My guess is the only use case moving forward is as an encoder/decoder or really light gaming, so VRAM doesn't matter.

I don't know why they don't just make a 3050 LE or something, but Nvidia naming has been mostly about creating confusion/trolling for a while now.

It's fine, I used it to upgrade my friend's PC and it's perfectly serviceable at playing BG3 at 1440p if you enable DLSS. She doesn't super care about new releases so the $180 I paid was worth it.

repiv
Aug 13, 2009

https://www.computerbase.de/2023-12...rr_funktioniert

FSR3 now supports VRR as of the version shipping in avatar so it'll be due a second chance i think

the game is getting mid reviews though

Cyrano4747
Sep 25, 2006

Yes, I know I'm old, get off my fucking lawn so I can yell at these clouds.

change my name posted:

It's fine, I used it to upgrade my friend's PC and it's perfectly serviceable at playing BG3 at 1440p if you enable DLSS. She doesn't super care about new releases so the $180 I paid was worth it.

It’s a fine low end card at 8gb. 6 is way too little for anything remotely new. poo poo the 1070 had 8gb.

If you just want to play old games? Sure, but I’d argue that at that point it’s still a mediocre deal at 6gb especially when you look at used cards.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

repiv posted:

https://www.computerbase.de/2023-12...rr_funktioniert

FSR3 now supports VRR as of the version shipping in avatar so it'll be due a second chance i think

the game is getting mid reviews though

Alex Battaglia seems to like it

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Cyrano4747 posted:

It’s a fine low end card at 8gb. 6 is way too little for anything remotely new. poo poo the 1070 had 8gb.

If you just want to play old games? Sure, but I’d argue that at that point it’s still a mediocre deal at 6gb especially when you look at used cards.

Oh, I just meant the 8GB version. The cut-down 6GB variant is trash, especially since you can pick up a used 12GB 3060 for $200 even these days pretty easily

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

change my name posted:

Oh, I just meant the 8GB version. The cut-down 6GB variant is trash, especially since you can pick up a used 12GB 3060 for $200 even these days pretty easily

Nvidia won't sell you a 12GB 3060 though; production of the 3060 has been fully replaced with the slower 3060 8GB at the same MSRP: https://www.techspot.com/review/2581-nvidia-rtx-3060-8gb/

These are on the shelves at Best Buy, and people are paying $300 for them. Cynically, I think the only reason they exist is to let Nvidia keep selling some cheap Ampere without making their new GPUs look stingy on VRAM.
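
The "slower" part is mostly the memory bus: the 8GB board uses a narrower bus than the 12GB one, which cuts peak bandwidth even at the same GDDR6 speed. A quick sketch (assuming the commonly reported 192-bit vs 128-bit buses and 15 Gbps GDDR6 on both variants):

    # Peak memory bandwidth = (bus width in bits / 8) * data rate in Gbps -> GB/s
    def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
        return bus_width_bits / 8 * data_rate_gbps

    print(bandwidth_gbs(192, 15))  # 3060 12GB: 360.0 GB/s
    print(bandwidth_gbs(128, 15))  # 3060 8GB:  240.0 GB/s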

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

change my name posted:

Oh, I just meant the 8GB version. The cut-down 6GB variant is trash, especially since you can pick up a used 12GB 3060 for $200 even these days pretty easily


In this example BG3 runs fine on 6GB though. You wouldn't see a difference between 6 and 8.


change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Lockback posted:

In this example BG3 runs fine on 6GB though. You wouldn't see a difference between 6 and 8.

That likely won't hold in this specific instance, as Nvidia will probably cut down the memory bus even further and stick with the lower 115-watt power cap that the revised 8GB version was saddled with. Thankfully mine was a launch model I bought allllllll the way back through Newegg's GPU lottery (lol)

change my name fucked around with this message at 17:20 on Dec 6, 2023
