punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

wolrah posted:

Assuming you have 8+ cores, I'd say it's a safe bet unless AMD really pulls something magical out of their rear end.

I feel that I can breathe easy now that I have a baseline of what video card, CPU, and SSD I'm going to purchase.

A Proper Uppercut
Sep 30, 2008

Has anyone bought a video card from a place like Best Buy recently? I've got a bullshit $500 Visa gift card from my last job that I can't use online, but I'd like to use it toward the purchase of a new video card.

shrike82
Jun 11, 2005

no new slate of DLSS games announced during the presentation which is disappointing - not sure why they aren't pushing partnerships with devs more aggressively

Mercrom
Jul 17, 2009

Kraftwerk posted:

I'll just play Doom Eternal or Metro Exodus since my 1070 crawls on those games.

What? Doom Eternal runs better than Doom 2016 on my 1070.

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



RTX 20x0 was just used for anchor pricing, lol.

Chimp_On_Stilts
Aug 31, 2004
Holy Hell.

K8.0 posted:

Also corona panic is keeping a lot of people home. If you didn't drop 5k on a vacation this year, but you're home all the goddamn time, all of a sudden an $800 or $1400 GPU looks like a really good buy.

Yep.

I'm probably going to get a 3090. I know it's expensive, but my vacations are all cancelled, I'm not eating out, etc.

I am very fortunate to be able to work from home, so my income is the same but my expenses are significantly reduced compared to times when I could leave the house. Net/net, a very expensive PC is still less money than I'd have spent on travel, leisure, restaurants, entertainment, etc. over the course of an entire normal year.

I suspect this is a common scenario among people eyeballing the 3090 despite the high price.

Chimp_On_Stilts fucked around with this message at 23:53 on Sep 1, 2020

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
You could also say it's a natural conclusion of the world segmenting ever more aggressively into haves and have-nots.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Well, on the plus side, AMD is probably going to sell a shitload of CPUs to go with those nvidia cards :shrug:

Chimp_On_Stilts
Aug 31, 2004
Holy Hell.
Any thoughts on CPUs that will for sure not bottleneck these cards at >1080p resolutions and 144Hz?

I assume the answer is "basically any current gen CPU", but I've got weeks to continue spinning my wheels about which CPU to pair a new GPU with so I am curious for your thoughts.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

shrike82 posted:

no new slate of DLSS games announced during the presentation which is disappointing - not sure why they aren't pushing partnerships with devs more aggressively

There isn't much in the way of major PC games releasing this year, and a lot of them have DLSS/RTX support already. Watch Dogs: Legion, Cyberpunk, Call of Duty, and Crysis Remastered are all upcoming major DLSS+RTX titles. Death Stranding also released pretty recently with DLSS support. MSFS 2020 is going to get at least RTX when it gets updated for DX12. The percentage of AAAs supporting NV's RTX features has been pretty high since the launch of DLSS 2.0 back in April.

MikeC
Jul 19, 2004
BITCH ASS NARC

shrike82 posted:

no new slate of DLSS games announced during the presentation which is disappointing - not sure why they aren't pushing partnerships with devs more aggressively

This is the only thing that is still giving me pause on Ampere. While Big Navi won't beat the 3090, I find it unlikely they won't have a competent 3080 competitor at a cheaper price that doesn't double as a personal space heater. DLSS remains an ace in the hole, but only if mass adoption starts to happen.

PC LOAD LETTER
May 23, 2005
WTF?!

Statutory Ape posted:

Well, on the plus side, AMD is probably going to sell a shitload of CPUs to go with those nvidia cards :shrug:

It depends on how they price RDNA2 and how it performs.

If it really does perform around a 2080 Ti and sells for less than the competing new NV cards, it'll sell just fine. Losing to the 3090 won't much matter in practical terms, since nearly no one is going to spend $1400 on a gaming card, though it would be interesting to see if they could pull it off.

Of course, if they screw it up somehow (either hardware or drivers) then yeah, that won't happen. But since it's essentially what is going into the new consoles, my WAG is they actually got everything working OK, otherwise MS/Sony would be suing the poo poo out of them.

movax
Aug 30, 2008

Statutory Ape posted:

Well, on the plus side, AMD is probably going to sell a shitload of CPUs to go with those nvidia cards :shrug:

Yeah 2021 is going to be defined by "the" combo of Ryzen 3xxx / 4xxx and GeForce RTX 3070/3080. Athlon 64s/Radeon 9700 PROs were pretty popular IIRC, and of course Sandy Bridge / Ivy Bridge CPUs with... GTX 480s, right?

CaptainSarcastic
Jul 6, 2013



A Proper Uppercut posted:

Has anyone bought a video card from a place like Best Buy recently? I've got a bullshit $500 Visa gift card from my last job that I can't use online, but I'd like to use it toward the purchase of a new video card.

I tried buying a GPU from them online in May, and it resulted in me having to contact the Better Business Bureau in order to get it canceled. The card showed as in stock, but clearly wasn't, and they wouldn't let me cancel a clearly back-ordered item through any ordinary channels.

A few weeks before that I ordered a monitor from them online and it arrived no problem and I'm really happy with it.

I guess I'm saying that Best Buy online is a land of contrasts.

Well, seriously, I'd mainly be concerned if I had to deal with their customer service in any way. If you could grab a card in a brick-and-mortar store that would be great, but I'd be prepared for fuckery online.

repiv
Aug 13, 2009

Beautiful Ninja posted:

There isn't much in the way of major PC games releasing this year, and a lot of them have DLSS/RTX support already. Watch Dogs: Legion, Cyberpunk, Call of Duty, and Crysis Remastered are all upcoming major DLSS+RTX titles.

Fortnite is getting DLSS+RTX too, and that implies DLSS is going to be a core UE4 feature going forwards

https://www.youtube.com/watch?v=8pU7d_XzQa4

shrike82
Jun 11, 2005

They would've announced it if UE supported DLSS.

Mr.PayDay
Jan 2, 2004
life is short - play hard
Herkelman (CVP & GM AMD Radeon) just answered with a 🤔

https://twitter.com/sherkelman/status/1300842481886662658
Is this a smirking “too soon”? Is it defeat?
No counter? No leaks? Is AMD just conceding the first wave to the 3070 and 3080?

The AMD subreddit has already had some nice meltdowns, with hysterical posts and threads getting deleted.

Or in other words: On a scale from 0 to 3090, how hosed is AMD now?

repiv
Aug 13, 2009

shrike82 posted:

They would've announced it if UE supported DLSS.

AFAIK every engine feature they've ever added to Fortnite has been upstreamed into mainline UE4

Maybe they're sitting on the announcement until it actually drops in the public UE4 builds

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

shrike82 posted:

They would've announced it if UE supported DLSS.

They may not be ready to announce it as a prime-time feature yet, but UE4 got DLSS engine integration as a feature months ago. It's probably still beta level though.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

MikeC posted:

This is the only thing that is still giving me pause on Ampere. While Big Navi won't beat the 3090, I find it unlikely they won't have a competent 3080 competitor at a cheaper price that doesn't double as a personal space heater.

How do you think that AMD will manage to exceed NVIDIA on efficiency? What evidence has there been that they'll even be able to hit the 3080 mark ignoring DLSS capability? Many second-place market contenders have learned that just wanting really badly to catch the leader doesn't get you there, and AMD's record on this stuff--especially in terms of running cool--doesn't make me at all confident that they'll catch the 3080 in 2020 at least.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Mr.PayDay posted:

Herkelman (CVP & GM AMD Radeon) just answered with a 🤔

https://twitter.com/sherkelman/status/1300842481886662658
Is this a smirking “too soon”? Is it defeat?
No counter? No leaks? Is AMD just conceding the first wave to the 3070 and 3080?

The AMD subreddit has already had some nice meltdowns, with hysterical posts and threads getting deleted.

Or in other words: On a scale from 0 to 3090, how hosed is AMD now?

kinda makes me want to rebuy :thunk: as :poorampere:

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Mr.PayDay posted:

Herkelman (CVP & GM AMD Radeon) just answered with a 🤔

https://twitter.com/sherkelman/status/1300842481886662658
Is this a smirking “too soon”? Is it defeat?
No counter? No leaks? Is AMD just conceding the first wave to the 3070 and 3080?

The AMD subreddit has already had some nice meltdowns, with hysterical posts and threads getting deleted.

Or in other words: On a scale from 0 to 3090, how hosed is AMD now?

If AMD doesn't say anything within the next two weeks, presume extremely hosed. That's when the 3080 goes on sale, and if they don't do anything to get those buyers to wait for Big Navi, presume it's not competitive.

Wouldn't be shocked if Big Navi only ends up between a 3070 and a 3080, but without DLSS, so it gets massacred in a large portion of major games. I also expect the RT to only be at Turing level if they aren't going to use dedicated ASICs for it.

repiv
Aug 13, 2009

Beautiful Ninja posted:

They may not be ready to announce it as a prime-time feature yet, but UE4 got DLSS engine integration as a feature months ago. It's probably still beta level though.

It "got" DLSS integration in the sense that Nvidia is providing patches that add DLSS support on top of Epics code, but Epic isn't supporting that

Fortnite getting DLSS is the first sign of Epic integrating DLSS as a first-class feature.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

somebody add ray tracing to the emoji

Truga
May 4, 2014
Lipstick Apathy

repiv posted:

Fortnite is getting DLSS+RTX too, and that implies DLSS is going to be a core UE4 feature going forwards

I don't think it's that simple. I thought the whole point of DLSS is that it's trained by supersampled game frames, so UE "supporting" DLSS means jack poo poo when you use your own art style for your own game. Unless nvidia bothers training the DLSS for your game and releases a driver package with the data, it won't work I think.

shrike82
Jun 11, 2005

Nvidia's UE4 branch just added support for dynamic-resolution inputs for DLSS, which is cool.

repiv
Aug 13, 2009

Truga posted:

I don't think it's that simple. I thought the whole point of DLSS is that it's trained by supersampled game frames, so UE "supporting" DLSS means jack poo poo when you use your own art style for your own game. Unless nvidia bothers training the DLSS for your game and releases a driver package with the data, it won't work I think.

That's how DLSS 1.0 worked; DLSS 2.0 doesn't require any per-game training.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Truga posted:

I don't think it's that simple. I thought the whole point of DLSS is that it's trained by supersampled game frames, so UE "supporting" DLSS means jack poo poo when you use your own art style for your own game. Unless nvidia bothers training the DLSS for your game and releases a driver package with the data, it won't work I think.

DLSS2 does not rely on per-game training.

shrike82
Jun 11, 2005

Truga
May 4, 2014
Lipstick Apathy
Oh, oops I missed that :v: sorry

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

akadajet posted:

can't believe that leaker saying $2000 was full of poo poo

Pretty sure a few of the AIB cards with AIOs will be nearly $2k.

shrike82
Jun 11, 2005

lol don't forget additional markups due to demand

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Subjunctive posted:

How do you think that AMD will manage to exceed NVIDIA on efficiency? What evidence has there been that they'll even be able to hit the 3080 mark ignoring DLSS capability? Many second-place market contenders have learned that just wanting really badly to catch the leader doesn't get you there, and AMD's record on this stuff--especially in terms of running cool--doesn't make me at all confident that they'll catch the 3080 in 2020 at least.

TSMC has a big fat process lead on everyone else at this point, and AMD is in tight with them. Architecturally there's no chance they catch Nvidia, but there is an outside shot they're competitive on efficiency by not being THAT far behind while running on a superior process.

Shogunner
Apr 29, 2010

Ready to crash and burn.
I never learn.
I'm on the rapetrain.
https://twitter.com/NVIDIAGeForce/status/1300930559699423234

ugh we're all gonna get fuckin sniped by bots on September 17th huh

shrike82
Jun 11, 2005

odd that they got stuck on SS 8nm and G6X (as opposed to HBM)

repiv
Aug 13, 2009



guess the decompression does run on compute then

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Truga posted:

I don't think it's that simple. I thought the whole point of DLSS is that it's trained by supersampled game frames, so UE "supporting" DLSS means jack poo poo when you use your own art style for your own game. Unless nvidia bothers training the DLSS for your game and releases a driver package with the data, it won't work I think.

That was DLSS 1.0. DLSS 2.0 works differently: it's basically TAA where the weighting of the samples is controlled by a neural net, so it doesn't need training on a specific game.

In basic TAA you take pixel samples (in integer space), treat them as representative of some higher-resolution real-space, and weight them to try to back out a more accurate representation of that real-space without aliasing. But there is no intuitively correct way to determine the weights; it depends on what surfaces there are, how they're moving, etc., so normally some approximating heuristic is used.

What DLSS 2.0 does is give the algorithm some “intuitive understanding” of how to take its samples and weight them when feeding them into the TAA step. What it's learning is how surfaces behave when in motion and how that translates into output images, and that doesn't really change between games.
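
To make the “TAA with learned weights” idea concrete, here's a toy single-channel sketch in Python/NumPy. This is my own simplified illustration, not Nvidia's implementation: the heuristic_weight function stands in for the hand-tuned confidence logic a classic TAA would use, and DLSS 2.0 conceptually swaps that function for a neural net trained offline on generic content, which is why no per-game training is needed.

# Toy TAA-style temporal accumulation (illustrative only, not Nvidia's code).
import numpy as np

def heuristic_weight(current, reprojected_history):
    # Hand-tuned TAA-style confidence: trust history less where it disagrees
    # strongly with the current frame (e.g. disocclusions, ghosting risk).
    diff = np.abs(current - reprojected_history)
    return np.clip(0.9 - 2.0 * diff, 0.0, 0.9)

def accumulate(current, history, motion_vectors, weight_fn=heuristic_weight):
    # Reproject the history buffer into the current frame using motion vectors,
    # then blend: out = w * history + (1 - w) * current.
    h, w = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    reprojected = history[src_y, src_x]
    wgt = weight_fn(current, reprojected)  # DLSS 2.0: a neural net predicts these weights
    return wgt * reprojected + (1.0 - wgt) * current

# Tiny demo: a static 64x64 "scene" converges as jittered noisy samples accumulate.
rng = np.random.default_rng(0)
truth = rng.random((64, 64))
history = truth + rng.normal(0.0, 0.1, truth.shape)   # noisy first frame
no_motion = np.zeros((64, 64, 2))
for _ in range(8):
    noisy = truth + rng.normal(0.0, 0.1, truth.shape)  # new jittered sample
    history = accumulate(noisy, history, no_motion)
print("mean error after accumulation:", np.abs(history - truth).mean())

The point of the sketch is just that the accumulation/reprojection machinery is generic; the only part DLSS 2.0 learns is the weighting, which is why it generalizes across games instead of needing per-title training data.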

PC LOAD LETTER
May 23, 2005
WTF?!

shrike82 posted:

odd that they got stuck on SS 8nm.

Supposedly Samsung was making all kinds of promises about performance, yields, and price right up until late last year/early this year, when details started to leak that it was going to lose significantly on at least two of those vs TSMC's 7nm. Shortly after, NV said they were going to use TSMC's 7nm for at least some of their GPUs too, since it was too late to switch fabs if they wanted to launch in 2020.

IIRC SS's 8nm is closer to TSMC's 10nm than their 7nm performance-wise, but I don't think there are any solid public numbers on that yet, so it's still just a rumor for now.

shrike82
Jun 11, 2005

I don't expect AMD to beat Nvidia on pure performance, but there's definitely an opening for them on some kind of efficiency metric versus a 320W 3080, especially since they have both TSMC and HBM.

movax
Aug 30, 2008

Mr.PayDay posted:

Or in other words: On a scale from 0 to 3090, how hosed is AMD now?

Not supremely — they make the GPU for the ninth generation of home consoles and are going to make a fuckload of them. They can also sell that silicon to PC users with the benefit of (probably) bigger thermal envelopes at very cost-competitive price points. And the more units those consoles sell, the less game developers can just ignore it or throw in entirely with DLSS or something like that.
