Inept
Jul 8, 2003

Cygni posted:

Why are you mad that they made a new product?

i think they're mad that the price sucks rear end

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

change my name posted:

Oh man, time to spit shine Dishonored Everquest

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
AV1 encode/decode will probably matter more to me than all the other fancy stuff, oh boy.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Just remember that this is not the whole stack, so even if you want the new features, you can keep waiting for the 4050

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

necrobobsledder posted:

AV1 encode/decode will probably matter more to me than all the other fancy stuff, oh boy.

You're going to get a better value in games and in encoding by buying an AMD 7700XT or discount 6900XT and an Intel GPU. Multi-GPU is back, baby!

orcane
Jun 13, 2012

Fun Shoe

gradenko_2000 posted:

Just remember that this is not the whole stack, so even if you want the new features, you can keep waiting for the 4050

Yeah but that will release in like 2024 for probably $500 at this rate.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Also you can buy a used 3080 FE on Reddit for like $500 right now; I might just do that, flip my 3070, and wait until the 5000 series. I need to test high refresh rate monitors at native resolution for work, so I actually have a use case...

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Power usage (4090, 4080, not4080)


Not as bad as early rumors.

Cygni
Nov 12, 2005

raring to post

Inept posted:

i think they're mad that the price sucks rear end

I don't think we really know that until we get reviews though. Nvidia claims the 12GB 4080 outperforms the 3080 Ti by 2x. At $899, that's a significant uplift even vs the 3080 Ti's current street price. There are some worrying caveats in Nvidia's marketing, but we won't really know until the reviews come out. If you weren't in the $900+ price bracket, this announcement wasn't for you. The lower-end cards will come later as always. Would have been nice to see them release something down at the $699 bracket before year's end, but oh well. There are plenty of Ampere cards on steep discount to buy instead, but anyone who does needs to be aware that they are legally forbidden from whining about it when Nvidia replaces them with something with better performance in 4 months.

As always, ignore the naming and only look at price/performance. The x80 will continue to inflate ever higher. It's just a name, and they know they can raise the price and average margin and many people will follow the name anyway. A base Honda CR-V is $27k now for the exact same reason.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Twerk from Home posted:

You're going to get a better value in games and in encoding by buying an AMD 7700XT or discount 6900XT and an Intel GPU. Multi-GPU is back, baby!

On a strictly gaming + encoding basis you're right. But I do ML research as a hobby, AMD is allergic to making good frameworks for people doing ML work, and the industry just doesn't have the bandwidth to deal with AMD's shenanigans for the most part. I really wish I could use AMD GPUs given I don't want to give nVidia money, but I have to yield somewhere. I may be better off trying to use Apple's frameworks for PyTorch and Keras. For now I'm just going to do my training on GCP anyway, given it's much cheaper in the end than essentially paying the $2k premium for the pro-tier stuff, and I'm still a cheapskate at heart. Should probably have done that in hindsight rather than plunking down $2k for an RTX 3090 the other year.
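
For anyone else curious about the Apple route, the MPS backend in recent PyTorch builds is basically just another device string. Rough sketch, assuming PyTorch 1.12+ on an Apple Silicon Mac (the model and batch here are made-up placeholders):

code:

import torch
import torch.nn as nn

# Use Apple's Metal (MPS) backend if available, otherwise fall back to the CPU
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

model = nn.Linear(128, 10).to(device)        # tiny placeholder model
batch = torch.randn(32, 128, device=device)  # fake batch of 32 samples
logits = model(batch)                        # forward pass runs through Metal
print(logits.shape, device)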

kliras
Mar 27, 2021

Twerk from Home posted:

You're going to get a better value in games and in encoding by buying an AMD 7700XT or discount 6900XT and an Intel GPU. Multi-GPU is back, baby!

do any companies other than nvidia separate out the encode workload? there's always fun stuff to do on two-pc setups, but at that point you might as well encode with the cpu

if amd don't do that with rdna3, it's going to be a huge bummer and dealbreaker for me
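
for reference, cpu-side av1 through ffmpeg's svt-av1 wrapper looks roughly like this. just a sketch - the preset/crf numbers and filenames are placeholders you'd tune yourself, and it assumes an ffmpeg build with libsvtav1 compiled in:

code:

import subprocess

# software AV1 encode on the CPU via ffmpeg's SVT-AV1 wrapper
subprocess.run([
    "ffmpeg", "-i", "gameplay.mkv",
    "-c:v", "libsvtav1",  # CPU AV1 encoder
    "-preset", "8",       # higher preset = faster encode, less compression efficiency
    "-crf", "35",         # constant-quality target
    "-c:a", "copy",       # pass the audio through untouched
    "gameplay_av1.mkv",
], check=True)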

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Probably too early to tell, but does it at least look like these prices are justified by their performance boost compared to Ampere?

Sorry, I haven’t had time to watch the presentation yet.

E: partially answered when I was typing this

MarcusSA
Sep 23, 2007

Cygni posted:


Nvidia claims the 12GB 4080 outperforms the 3080 Ti by 2x. At $899,


Yeah but is that 2x with DLSS 3.0 or just straight up? I’m guessing it’s with DLSS and straight up it’s not nearly as fast.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Rinkles posted:

Probably too early to tell, but does it at least look like these prices are justified by their performance boost compared to Ampere?

Sorry, I haven’t had time to watch the presentation yet.

Yes, these prices are all a much better performance value than the RTX 3090.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

If they had improved the VRAM count at each performance tier more, it would be more obvious that, in terms of relative processing power at launch, the 4080 12GB ≈ 3080 10GB, the 4080 16GB ≈ 3090, and the 4090 had no Ampere precedent. They kinda nerfed the 3090 tier's VRAM but lowered its price by $300 in exchange with the 4080 16GB, while not increasing the base 4080's VRAM at all versus the final 12GB 3080. But in terms of GPU power, it isn't so crazy. It's just priced a bit higher than expected for the product closest to Ampere's launch 3080.

Agreed fucked around with this message at 17:45 on Sep 20, 2022

kliras
Mar 27, 2021
look at the vents of this thing

https://twitter.com/VideoCardz/status/1572264521506631681

twitter not embedding

Cygni
Nov 12, 2005

raring to post

MarcusSA posted:

Yeah but is that 2x with DLSS 3.0 or just straight up? I’m guessing it’s with DLSS and straight up it’s not nearly as fast.

Those are the worrying caveats I was talking about. They only released numbers with DLSS 3.0 at 4K Ultra, which is definitely going to be a best-case scenario for Lovelace. But ignoring internal numbers altogether is always a wise decision, so we will find out soon. Not sure when the review embargo is, but launch is only like 3 weeks away.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Twerk from Home posted:

Yes, these prices are all a much better performance value than the RTX 3090.

Well I was more interested in the 3080 comparison

Inept
Jul 8, 2003

I bet the evga ceo is feeling pretty smug right now lol

TyrantWD
Nov 6, 2010
Ignore my doomerism, I don't think better things are possible
My 3080 is not quite cutting it at 3440x1440, but at these prices, I think I'll wait for the 5080 and just do an entirely new build.

Power requirements are starting to be concerning as well. 850W doesn't always cut it for my 3080, so you are probably going to need 1000-1200W to feel comfortable on a 4080+. Not only are these cards expensive, but they are also going to be quite costly to operate.

ijyt
Apr 10, 2012

I'm just going to treat the 40 series like I treated the 20 series when I had a 1080: overpriced and skippable. Going to eat up every 4080 review come November though.

e: :captainpop: my 3080 10G is great at 3440x1440

orcane
Jun 13, 2012

Fun Shoe

kliras posted:

look at the vents of this thing

https://twitter.com/VideoCardz/status/1572264521506631681

twitter not embedding


This looks like a Borderlands gun now. I love it.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Huh, I’m surprised they didn’t do much of a redesign of the FE models. I guess they found a look they like.

repiv
Aug 13, 2009

https://www.youtube.com/watch?v=qyGWFI1cuZQ

analysis is coming later, but DF posted some 120fps DLSS3 footage so we can pixel peep the interpolated frames a bit

Cao Ni Ma
May 25, 2010

Twerk from Home posted:

Yes, these prices are all a much better performance value than the RTX 3090.

3090s were terrible cards and strictly in the halo category that very few people buy into.

Now with the x80s going up to $1200, they've blurred the line between "terrible product you shouldn't buy" and "well it's just a few hundred more and it's actually available".

That's another thing, how's availability going to look for the cut-down 4080s vs the full one?

ijyt
Apr 10, 2012

Rinkles posted:

Huh, I’m surprised they didn’t do much of a redesign of the FE models. I guess they found a look they like.

Wasn't the cooler design aesthetically the same from the 700 series all the way to the 10 series? I think the 20 series is really the only recent generation that hasn't had a long-lived cooler design (for better or worse)

infraboy
Aug 15, 2002

Phungshwei!!!!!!1123
Hopefully Best Buy's website is updated to handle the load in a month or so. :lol: if they force you to have Total Tech.

DoombatINC
Apr 20, 2003

Here's the thing, I'm a feminist.

Rinkles posted:

Huh, I’m surprised they didn’t do much of a redesign of the FE models. I guess they found a look they like.

It's a sharp look, one of the more handsome GPU designs currently on the market

kliras
Mar 27, 2021
ada will also feature "dual av1 encoding", but it sounds like it's not for improving quality as much as resolution and encode/render time. so a horizontal improvement if you will

https://blogs.nvidia.com/blog/2022/09/20/nvidia-studio-geforce-rtx-40-series/?=&linkId=100000150615477

quote:

Video editors and livestreamers are getting a massive boost, too. New dual encoders cut video export times nearly in half. Live streamers get encoding benefits with the eighth-generation NVIDIA Encoder, including support for AV1 encoding.

quote:

Video production is getting a significant boost with GeForce RTX 40 Series GPUs. The feeling of being stuck on pause while waiting for videos to export gets dramatically reduced with the GeForce RTX 40 Series’ new dual encoders, which slash export times nearly in half.

The dual encoders can work in tandem, dividing work automatically between them to double output. They’re also capable of recording up to 8K, 60 FPS content in real time via GeForce Experience and OBS Studio to make stunning gameplay videos.

the wording is very weird, and they make it sound like it's specific to the 4090 which i'm sure won't cause confusion
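
for what it's worth, ffmpeg already has an av1_nvenc wrapper, so on ada it should mostly be a matter of swapping encoder names. sketch only - assuming a recent enough ffmpeg/driver combo, and assuming the dual encoders get used automatically rather than via some flag, since nvidia's wording doesn't say:

code:

import subprocess

# hardware AV1 encode through NVENC on an RTX 40 series card
subprocess.run([
    "ffmpeg", "-i", "capture.mkv",
    "-c:v", "av1_nvenc",  # NVENC AV1 encoder (Ada and newer)
    "-preset", "p5",      # NVENC presets go p1 (fastest) to p7 (best quality)
    "-b:v", "8M",         # placeholder bitrate target
    "-c:a", "copy",
    "capture_av1.mkv",
], check=True)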

EvilBlackRailgun
Jan 28, 2007


DF results look intriguing

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

repiv posted:

https://www.youtube.com/watch?v=qyGWFI1cuZQ

analysis is coming later, but DF posted some 120fps DLSS3 footage so we can pixel peep the interpolated frames a bit

I think this gives us some context for that "2x 3090ti" 4090 figure


Cygni
Nov 12, 2005

raring to post

Cao Ni Ma posted:

3090s were terrible cards and strictly in the halo category that very few people buy into.

Now with the x80s going up to $1200, they've blurred the line between "terrible product you shouldn't buy" and "well it's just a few hundred more and it's actually available".

I would say that the 3080 Ti was made specifically to establish that middle. Nvidia wants a wide product stack to serve everyone from whales to kids, so there will be prices in a big range. The prices alone don't tell you anything; we need the performance side of the coin too, which we don't really have. Nvidia was always going to have a ~$1200 card in this lineup, but what performance are they offering for the dollar?

repiv
Aug 13, 2009

pro-tip for viewing that DF video, youtube lets you step frame by frame using the , and . keys

you can certainly spot some oddities every other frame when the interpolation kicks in, but i didn't spot anything too egregious

it will break in cases where disocclusion forces it to guess details that didn't exist in the last real frame, but at high framerates that should be less obvious

repiv fucked around with this message at 18:12 on Sep 20, 2022

Gwaihir
Dec 8, 2009
Hair Elf
This whole release seems very very silly when still, to this day, the most common resolution on the fuckin steam hardware survey by a gigantic mile is 1920 * 1080.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Cao Ni Ma posted:

3090s were terrible cards and strictly in the halo category that very few people buy into.

Now with the x80s going up to $1200, they've blurred the line between "terrible product you shouldn't buy" and "well it's just a few hundred more and it's actually available".

3090 had a smaller increase in CUDA cores over the 3080 than the 4080 16GB does over the 4080 12GB; in a sense they're reducing VRAM but also the price of the 3090-alike this gen. The 4090 didn't have an analogous Ampere part, IMO; it'll be interesting to see what kind of performance it gets - 41% more CUDA cores vs the next product, maybe the extra $$ won't seem as wasteful this go around. I dunno. $1700 after tax is way outta my price range for a GPU upgrade either way.

Looking forward to benchmarks, chilling out for now 'til more info is available.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Agreed posted:

If they had improved the VRAM count at each performance tier more, it would be more obvious that, in terms of relative processing power at launch, the 4080 12GB ≈ 3080 10GB, the 4080 16GB ≈ 3090, and the 4090 had no Ampere precedent. They kinda nerfed the 3090 tier's VRAM but lowered its price by $300 in exchange with the 4080 16GB, while not increasing the base 4080's VRAM at all versus the final 12GB 3080. But in terms of GPU power, it isn't so crazy. It's just priced a bit higher than expected for the product closest to Ampere's launch 3080.

So kinda like how a 3070 is a 2080 Ti but with less vram, or the 2080 is a 1080 Ti with less vram :thunk:

yeah that’s not really an unusual move, NVIDIA often does that, pascal was really the exception with those huge across-the-board increases in vram… some skus doubled and that’s abnormal. The x80 is always nerfed vs the prior top chip in some respects, it’s like a cost optimized version.

Paul MaudDib fucked around with this message at 19:03 on Sep 20, 2022

Inept
Jul 8, 2003

Gwaihir posted:

This whole release seems very very silly when still, to this day, the most common resolution on the fuckin steam hardware survey by a gigantic mile is 1920 * 1080.

nvidia is determined to cater to ever higher end price points. the 7090 will cost 80 million dollars and have one buyer

Cygni
Nov 12, 2005

raring to post

Gwaihir posted:

This whole release seems very very silly when still, to this day, the most common resolution on the fuckin steam hardware survey by a gigantic mile is 1920 * 1080.

Friends don't let friends buy 1080p monitors in 2022.

Also, lol:

load bearing GPU

repiv
Aug 13, 2009

Cygni posted:

load bearing GPU

:sickos:

Cao Ni Ma
May 25, 2010

Cygni posted:

I would say that the 3080 Ti was made specifically to establish that middle. Nvidia wants a wide product stack to serve everyone from whales to kids, so there will be prices in a big range. The prices alone don't tell you anything; we need the performance side of the coin too, which we don't really have. Nvidia was always going to have a ~$1200 card in this lineup, but what performance are they offering for the dollar?

The 3080ti was also not a good card, and every reviewer was basically factoring in the market conditions at the time to possibly justify it.

The market conditions don't exist anymore. Saying that the 12GB 4080 is nearly twice as fast as the 3080ti and only $100 more doesn't mean much when the 3080ti was a terrible product.
