shrike82
Jun 11, 2005

The 8K gaming skew in the marketing for the 3090 is so weird. They don’t say anything about what the 24GB gets you - no mention of compute use cases.


K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

shrike82 posted:

Makes you wonder why they didn’t keep the Titan branding for the 3090 if they’re calling the 3080 the flagship. Odd positioning.

Probably because the Titan branding is a disaster:

GeForce GTX Titan
GeForce GTX Titan Black
GeForce GTX Titan Z
GeForce GTX Titan X
Nvidia Titan X
Nvidia Titan Xp
Nvidia Titan V
Nvidia Titan V CEO Edition
Nvidia Titan RTX

It's a goddamn abomination. If I jumbled the order of those you'd have absolutely no idea which cards were higher and lower performance.

Or they may be keeping the Titan branding around for later, or even skipping the Titan name this gen and bringing it back for Hopper or something. Seems unlikely, though, since based on specs the 3090 is clearly a Titan-class product. Or maybe they always would have preferred to call the Titans x90 but couldn't because of the (thankfully now dead) dual-GPU cards. With no plans for an Ampere dual-GPU card, and Hopper making that concept obsolete, maybe they felt now was the time to reclaim the x90 branding. Who knows, Jensen does what Jensen wants. If he's going to sell me good GPUs, I won't complain too much.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Bloodplay it again posted:

1 year of geforce now is the promo for the RTX 3000 series cards. Sorry to anyone who was hoping for Cyberpunk.

Edit: of course I missed it 4 pages ago



you missed this too

Coffee Jones
Jul 4, 2004

16 bit? Back when we was kids we only got a single bit on Christmas, as a treat
And we had to share it!

Rinkles posted:

D3 on toasters is a myth. A conspiracy by big toast to sell toasters. (It really doesn't run great on low end devices)

It was “playable, I guess” on my Sandy Bridge iGPU laptop, and I beat it that way :downs:

Kazinsal
Dec 13, 2011
So I guess I'm going from a 1080 Ti to a 3080 then. I play at 1440p 144Hz so... woohoo, I get to crank everything to Beyond Ultra!

Inept
Jul 8, 2003

Buy a 3080 so you can use Geforce Now to stream Cyberpunk.

shrike82
Jun 11, 2005

https://twitter.com/kopite7kimi/status/1300846591503298562?s=20

Worf
Sep 12, 2017

If only Seth would love me like I love him!


which AMD site do they prefer

shrike82
Jun 11, 2005

lol that all their performance numbers are RTX and DLSS on

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Reaching into my old rear end bag of tech memories, fair to say the 3000 series will do for ray-tracing what the ATI R300 did for anti-aliasing?

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

shrike82 posted:

lol that all their performance numbers are RTX and DLSS on

Is it distorting the comparison even if both are using DLSS?

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

shrike82 posted:

lol that all their performance numbers are RTX and DLSS on

I mean DF found that it crushed the 2080 without either one by like +75% or some poo poo so

the_enduser
May 1, 2006

They say the user lives outside the net.



Inept posted:

Buy a 3080 so you can use Geforce Now to stream Cyberpunk.

lmao seriously, this feels so dumb. Why would anyone want Geforce NOW when you have a system that can melt steel beams????

Kazinsal
Dec 13, 2011

the_enduser posted:

lmao seriously, this feels so dumb. Why would anyone want Geforce NOW when you have a system that can melt steel beams????

It's purely marketing so Nvidia can still "commit" to Cyberpunk without CD Projekt Red having to commit to Cyberpunk.

This announcement is 100% going to be paired with an upcoming announcement of another release delay.

shrike82
Jun 11, 2005

sean10mm posted:

I mean DF found that it crushed the 2080 without either one by like +75% or some poo poo so

The numbers in the article mix usage of DLSS and RT so doesn’t seem true

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Kazinsal posted:

It's purely marketing so Nvidia can still "commit" to Cyberpunk without CD Projekt Red having to commit to Cyberpunk.

This announcement is 100% going to be paired with an upcoming announcement of another release delay.

lmao at the other studios trying desperately to swerve their products' release dates safely out of the way of CDPR drunkenly careening across the calendar, seriously it has been a hoot

but it's understandable, release anytime close to cyberpunk and gamers will be like "call of what now?", it will be like when Titanfall 2 released in the two weeks sandwiched between Battlefield 1 and COD Infinite Warfare. Nobody wants to get titanfall'd.

Paul MaudDib fucked around with this message at 22:26 on Sep 1, 2020

Carecat
Apr 27, 2004

Buglord
So when is Skyrim RTX announced with raytraced next gen console ports?

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Carecat posted:

So when is Skyrim RTX announced with raytraced next gen console versions?

Don't know how good that would look without a PBR texture overhaul

Kazinsal
Dec 13, 2011

Rinkles posted:

Don't know how good that would look without a PBR texture overhaul

They would just throw it all through Materialize and it'd sell ten million copies anyways

an actual dog
Nov 18, 2014

Digital Foundry published an enthusiastic Star Citizen video, so calm down about that video lol

repiv
Aug 13, 2009

Combat Pretzel posted:

The amount of raytraced samples taken with this RTX stuff, does it self-regulate based on frame rates (e.g. a fps limiter in game) and how long the rasterizer is taking? Card/driver going "poo poo's taking too long, sampling some more" or whatever.

The sampling strategy is up to the game engine. Raytracing usually happens after rasterization (per-pixel RT effects need the G-buffer to know where to launch rays from), so the passes are serialized and a slow raster pass doesn't free up time for extra samples.

RT techniques that are decoupled from pixels might be able to run async alongside rasterization though? Nobody has shipped anything like that yet but NV has been working on middleware for it: https://developer.nvidia.com/rtxgi
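
To make the ordering concrete, here's a minimal sketch of that dependency (all names and numbers are made up for illustration, this isn't engine or RTXGI code): per-pixel RT can't start until the raster pass has written the depth and normals the rays are built from.

# Minimal sketch: per-pixel ray generation out of a G-buffer.
# Everything is illustrative (tiny buffers, identity camera matrix,
# mirror-reflection rays) -- assumptions, not anyone's engine code.
import numpy as np

H, W = 4, 4                                    # tiny "framebuffer" for the example
depth = np.full((H, W), 0.5)                   # depth written by the raster pass
normals = np.tile([0.0, 1.0, 0.0], (H, W, 1))  # G-buffer normals, also from raster
inv_view_proj = np.eye(4)                      # stand-in inverse view-projection
cam_pos = np.zeros(3)

def world_pos(x, y, d):
    # Screen coordinate + depth -> world position via the inverse view-projection.
    ndc = np.array([2 * x / W - 1, 1 - 2 * y / H, d, 1.0])
    p = inv_view_proj @ ndc
    return p[:3] / p[3]

rays = []
for y in range(H):
    for x in range(W):
        origin = world_pos(x + 0.5, y + 0.5, depth[y, x])
        view = origin - cam_pos
        view /= np.linalg.norm(view)
        n = normals[y, x]
        direction = view - 2.0 * np.dot(view, n) * n   # mirror reflection ray
        rays.append((origin, direction))

# Only now could a TraceRays-style dispatch run: the origins and normals
# simply don't exist until rasterization has filled the G-buffer.
print(f"built {len(rays)} rays from the G-buffer")

Probe-based stuff like the RTXGI link above traces from fixed world-space probe positions instead of screen pixels, which is why it's a candidate for overlapping with raster.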

ufarn
May 30, 2009

Carecat posted:

400W AIB?


Seems like bad reporting that describes the maximum rated capacity as what they're actually going for. Similar to sites writing about next-gen consoles having 8K120, just because HDMI 2.1 technically supports it.
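
For anyone wondering what "technically supports" is doing there, a rough back-of-the-envelope (the format and overhead numbers are my own assumptions, nothing from the post):

# Napkin math on 8K120 vs. what an HDMI 2.1 link can actually carry.
# Assumptions: 7680x4320, 120 Hz, 10-bit RGB, ~18% blanking overhead,
# 48 Gbps FRL with 16b/18b coding. Treat all of these as ballpark.
pixels = 7680 * 4320
bits_per_pixel = 3 * 10               # RGB, 10 bits per channel
refresh_hz = 120
blanking = 1.18                       # rough allowance for blanking intervals

needed_gbps = pixels * bits_per_pixel * refresh_hz * blanking / 1e9
usable_gbps = 48 * 16 / 18            # FRL payload after channel coding

print(f"8K120 10-bit uncompressed: ~{needed_gbps:.0f} Gbps")
print(f"HDMI 2.1 usable payload:   ~{usable_gbps:.1f} Gbps")
# => only reachable with DSC compression, and that's before asking the GPU
#    to actually render 8K at 120fps.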

Worf
Sep 12, 2017

If only Seth would love me like I love him!

an actual dog posted:

Digital Foundry published an enthusiastic Star Citizen video, so calm down about that video lol

this is extremely damning tbh

Internet Explorer
Jun 1, 2005





lol good luck buying a PSU right now

Twibbit
Mar 7, 2013

Is your refrigerator running?

shrike82 posted:

The numbers in the article mix usage of DLSS and RT so doesn’t seem true

Doom has neither of those and they showed amazing gains there

Happy Noodle Boy
Jul 3, 2002


Internet Explorer posted:

lol good luck buying a PSU right now

MicroCenter supremacy.

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
So will purchasing a 3080 get me through the entire generation with my computer outperforming launch Xbox Series X multiplats?

Carecat
Apr 27, 2004

Buglord

Internet Explorer posted:

lol good luck buying a PSU right now

This is why I had it lined up before the presentation started.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

shrike82 posted:

The numbers in the article mix usage of DLSS and RT so doesn’t seem true

I was under the impression that 2 of the games benched (Doom Eternal, Borderlands 3) don't even support RT or DLSS so would that not be just normal rasterization gains?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Twibbit posted:

Doom has neither of those and they showed amazing gains there

And if for some reason DLSS 3.0 is working on Doom without engine integration, then why wouldn't you count it as baseline performance anyway? That'd mean DLSS works on loving everything.

shrike82
Jun 11, 2005

What cards are people planning to buy?

I’m still 50-50 on getting the 3090 - would be good to know if there’s a 48GB SKU coming out anytime soon. Not to mention seeing some compute numbers.

Cream-of-Plenty
Apr 21, 2010

"The world is a hellish place, and bad writing is destroying the quality of our suffering."
I just want an EVGA 3080. Their company has always treated me good, especially when it comes to technical problems and RMAs. It wouldn't be a huge upgrade like some of the other people with like GTX 970s right now (I've got a 1080 at the moment) but I think I'm ready for it.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

shrike82 posted:

What cards are people planning to buy?

I’m still 50-50 on getting the 3090 - would be good to know if there’s a 48GB SKU coming out anytime soon. Not to mention seeing some compute numbers.

3090. It'll be terrible price/performance, but that extra 15-20% performance is relevant for VR as the Valve Index will take whatever hardware you throw at it. I've got money saved because of coronavirus pretty much canceling all my plans since March.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

shrike82 posted:

would be good to know if there’s a 48GB SKU coming out anytime soon.

Lol, why?

shrike82
Jun 11, 2005

Rinkles posted:

Lol, why?

ML
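
Rough illustration of why the VRAM number is the whole ballgame for that use case (the model sizes and the plain fp32 + Adam setup are assumptions for the example, not anything shrike82 said):

# Napkin math: VRAM needed just to hold a model during training,
# assuming plain fp32 weights + gradients + Adam optimizer state.
# Model sizes are hypothetical; activations and batch data come on top.
def training_vram_gb(params_billions, bytes_per_param=4):
    weights = params_billions * 1e9 * bytes_per_param   # fp32 parameters
    grads = weights                                     # one gradient per parameter
    adam = 2 * weights                                  # Adam's m and v moments
    return (weights + grads + adam) / 1e9

for size in (0.35, 0.75, 1.5):
    print(f"{size}B params -> ~{training_vram_gb(size):.0f} GB before activations")

Under those assumptions a 1.5B-parameter model already fills 24GB before a single activation is stored, which is the gap a 48GB SKU would close.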

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

shrike82 posted:

What cards are people planning to buy?

I’m still 50-50 on getting the 3090 - would be good to know if there’s a 48GB SKU coming out anytime soon. Not to mention seeing some compute numbers.

I'm on the fence as well, between the 3080 and the 3090. Will the 3090 be worth the extra money? Hell no, but I've got the disposable income now and plan on this being the last hurrah of my computer upgrade purchases for at least 5 years while the rest of my money goes to starting a family.

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

I'm 50-50 on the 3080 and 3090, and decided to go with EVGA so that if something goes wrong I'll be covered by a better RMA process.
The only thing I'm unsure of is which cooling solution I want to go with... Probably FTW3, because I have nowhere in my case to mount the radiator from their hybrid solutions.

the_enduser
May 1, 2006

They say the user lives outside the net.



Grabbing a 3080 FE probably, none of the AIBs look appealing really. Gamers Flair is dumb af

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

shrike82 posted:

What cards are people planning to buy?

I’m still 50-50 on getting the 3090 - would be good to know if there’s a 48GB SKU coming out anytime soon. Not to mention seeing some compute numbers.

I'm 100% getting the 3070. Exact manufacturer/model depends on the reviews, but I'm leaning towards a bigass triple fan card (non-RGB) unless the FE cooler is proven to be some kind of unexpected noise vs. performance revelation.


Worf
Sep 12, 2017

If only Seth would love me like I love him!

Cream-of-Plenty posted:

I just want an EVGA 3080. Their company has always treated me good, especially when it comes to technical problems and RMAs. It wouldn't be a huge upgrade like some of the other people with like GTX 970s right now (I've got a 1080 at the moment) but I think I'm ready for it.

You're getting like at least double the raster power + RTX + integer scaling + a billion other things

the 1080 is a lot closer to the 970 than the 3080 imo
