Black Griffon
Mar 12, 2005

Now, in the quantum moment before the closure, when all become one. One moment left. One point of space and time.

I know who you are. You are destiny.


Got Watch Dogs 2 with my last card, and even though I have no hopes for Legion being nearly as good, it's a neat bit of recursion.

movax
Aug 30, 2008

Out of curiosity I went to try and trace back my GPUs... only got as far back as a used Ti 4600, but from there I think I hit a 7900, 8800 GTS, GTX 460, GTX 670 and then GTX 1080. Surprised at past me for upgrading so "quick" from 460 to 670, but maybe that was a big jump.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
No matter what initially got Nvidia onto Samsung, one thing is for sure: Nvidia is making out like loving bandits slinging the 3080 at $699. There's no way they're simply dumping their margins to hold off AMD. In all likelihood they got a killer deal from Samsung.

Encrypted
Feb 25, 2016

K8.0 posted:

HWUnboxed just did a video on CPU performance for benching the 30 series. At 4K the 3950X tends to pull ahead in a lot of DX12/Vulkan titles. Not surprising: with newer APIs and engines having better multithreading support, a bunch more cores can help avoid waiting on the CPU when you're GPU bottlenecked. It's not something I'd previously considered very much, but it's worth thinking about.

They also did a video on PCIe bandwidth scaling, and while there is often literally no difference, a lot of newer games and engines show that more bandwidth clearly helps. Ampere is a HUGE step forward in performance; it will absolutely be bottlenecked on 3.0 x16 in some games.

The differences won't be massive, but a few percent more performance is worth noting, especially when it's in more demanding situations that you'll care about more as hardware ages.

The 3950X having six extra cores and costing 200 bucks more than the 10900K, but just barely squeezing out a few more frames in some games while getting beaten or matched in DX12, means the 3950X isn't really a great buy.

You are also counting on game devs putting in the extra effort (lol) to optimize for multithreading just to reach barely equal performance. Which is a lot less likely to happen for non-AAA titles.

Mr.PayDay
Jan 2, 2004
life is short - play hard

shrike82 posted:

I don't expect AMD to beat Nvidia on pure performance, but there's definitely an opening for them on some kind of efficiency metric versus a 320W 3080, especially since they have both TSMC and HBM.

I'm not only curious about AMD's raw performance but especially about their raytracing solution, since both next-gen consoles prominently feature RT on the RDNA 2 design.
It will be interesting to see how AMD and Nvidia end up in direct raytracing comparisons and benchmarks.

About the aggressive pricing:
The 3080 at $699 is interesting, as it now sits EXACTLY in the price-to-performance bracket where many would have expected AMD's counter to the 3080.

The Radeon VII had a great environment for pricing, as the 2080 Ti was beyond 1200 €/$ and the 2080s were around 800-900 €/$, so it was an easy undercut at 700 €/$.

That breathing room is denied by Nvidia now, as their high-end Ampere "flagship" GPU, the 3080 (the 3090 is targeted at enthusiasts, designers, and disposable-income nerds), is now sold for less than the expected (feared) 800-1000 moneys.

Custom street prices will be 100-200 more, depending on demand and the AIC solution, obviously.

Mr.PayDay fucked around with this message at 00:32 on Sep 2, 2020

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
The Radeon VII was weird. An absolutely gigantic portion of its total cost was just in the HBM alone. If they hadn't gone HBM that card could arguably have been a lot more successful at a lower price point.

Truga
May 4, 2014
Lipstick Apathy

Encrypted posted:

The 3950X having six extra cores and costing 200 bucks more than the 10900K, but just barely squeezing out a few more frames in some games while getting beaten or matched in DX12, means the 3950X isn't really a great buy.

You are also counting on game devs putting in the extra effort (lol) to optimize for multithreading just to reach barely equal performance. Which is a lot less likely to happen for non-AAA titles.

the real reason to buy high end amd chips is 64mb of cache

2020 is the first time ever i'm playing stellaris into late game with gameplay staying smooth, despite all the excuses from paradox about how their game can't be made any faster

MikeC
Jul 19, 2004
BITCH ASS NARC

Mr.PayDay posted:

Herkelman (CVP & GM AMD Radeon) just answered with a 🤔

https://twitter.com/sherkelman/status/1300842481886662658
Is this a smirking "too soon"? Is it defeat?
No counter? No leaks? Allowing AMD to lose the first wave to the 3070 and 3080?

The AMD Reddit already had nice meltdowns and deleted hysterical posts and threads.

Or in other words: On a scale from 0 to 3090, how hosed is AMD now?

It has been less than 12 hours since Huang walked on stage. Relax, RDNA 2 leaks will be coming soon. Certainly before the 3080 launches. You will know AMD is royally hosed if you don't hear anything in the next 2 weeks.


Subjunctive posted:

How do you think that AMD will manage to exceed NVIDIA on efficiency? What evidence has there been that they'll even be able to hit the 3080 mark ignoring DLSS capability? Many second-place market contenders have learned that just wanting really badly to catch the leader doesn't get you there, and AMD's record on this stuff--especially in terms of running cool--doesn't make me at all confident that they'll catch the 3080 in 2020 at least.

TSMC's 7nm node is already better than the Samsung node Nvidia is using for gaming Ampere; this is why they will run quieter. The latest leaks have RDNA 2 using TSMC's N7P (not EUV), which can supply another 10% energy saving or 7% clock speed, your choice. I haven't heard anything about RDNA 2 being moved to TSMC's 7nm EUV (N7+), which would be even better. It is almost certain the node advantage will let RDNA 2 win the efficiency crown.

DLSS remains their trump card, but DLSS is not in every title yet. I am optimistic about mass adoption by next year, but that is not certain. Without DLSS, though, there is no reason to doubt that AMD can crank out something that can match the 3080. They matched the 2070 with RDNA 1 via the 5700 XT and cancelled their top RDNA 1 die, which was scheduled to compete with the 2080 (the original 'Big Navi'). Maybe some day we will know why, but it might have been the fact that RDNA 1 had issues with power consumption despite being on TSMC N7; that's probably why they didn't come out with their rumored 72 CU RDNA 1 card. But now we know the Xbox and PS5 have RDNA 2 on an APU that maxes out at 300W. A discrete RDNA 2 GPU can have 300W to itself (less power than the 3080), so a full-sized 80 CU RDNA 2 GPU should be able to match or come close to the 3080, which is about 25% better than a 2080 Ti.
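
As a rough sanity check on that scaling argument, here is a back-of-envelope sketch. All figures are assumptions for illustration: the 5700 XT treated as a 2070 equivalent, a 2080 Ti as roughly 1.45x a 2070 at 4K, and linear CU scaling taken as an optimistic upper bound.

```python
# Back-of-envelope check of the "80 CU RDNA 2 vs 3080" argument.
# All inputs are rough assumptions, not measurements.
rdna1_cus = 40            # 5700 XT
rdna2_cus = 80            # rumored full "Big Navi" die
perf_5700xt = 1.00        # normalize: 5700 XT ~ RTX 2070
perf_3080 = 1.45 * 1.25   # ~1.81x a 2070, via the 2080 Ti step

# Naive linear CU scaling at equal clocks -- optimistic, since real
# scaling is sub-linear without extra memory bandwidth to feed it:
naive_rdna2 = perf_5700xt * (rdna2_cus / rdna1_cus)

print(f"3080 target:        {perf_3080:.2f}x a 2070")
print(f"naive 80 CU RDNA 2: {naive_rdna2:.2f}x a 2070")
# Even with substantial scaling losses, 80 CUs plus any per-CU or
# clock gains lands in the 3080's neighborhood, which is the point.
```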


I believe in Ampere and will probably be on the 3080 train after reviews come out and we get a better idea of what RDNA 2 is or isn't. But to act like they can't catch the 3080 is hubris. Jensen wouldn't have been nearly as aggressive as he was today on pricing if he himself didn't think RDNA 2 was a threat.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Taima posted:

The Radeon VII was weird. An absolutely gigantic portion of its total cost was just in the HBM alone. If they hadn't gone HBM that card could arguably have been a lot more successful at a lower price point.

Don't forget the SHROUD.

I've a feeling Raja's a 'swing for the fences' guy. He just starts swinging before the ball is even thrown.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
OMG 1000 posts. Where can I go to reserve this fabled $700 3080? Preferably somewhere I can get an EVGA version, since I think the XC3 is the only one that will fit. Just post all the reservation sites, anyone.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Taima posted:

The Radeon VII was weird. An absolutely gigantic portion of its total cost was just in the HBM alone. If they hadn't gone HBM that card could arguably have been a lot more successful at a lower price point.

Radeon VII makes more sense when you realize it was a Titan-style prestige card that was never intended to sell units, just to blunt the “lol it's 2019 and AMD still has nothing to compete with 1080 Ti let alone anything newer” factor until they could get Navi out six months or so later. They absolutely knew Navi was going to be better and cheaper and nobody should really have bought VII, that wasn't the point.

It was leftover trash MI50 compute cards that they couldn’t sell because ROCm is a loving mess and welp might as well see what we can get for them from the gamers. Easy to turn into a "prestige" product.

(Bearing in mind this is exactly how Titans got started as well!)

It also probably would not have existed at all if NVIDIA had come out swinging with Turing pricing like they are doing on Ampere.

Paul MaudDib fucked around with this message at 00:47 on Sep 2, 2020

repiv
Aug 13, 2009

https://news.developer.nvidia.com/new-features-to-dlss-coming-to-nvidia-rtx-unreal-engine-4-branch/

quote:

Improved VR support. Maintaining the 2880×1600 resolution of top-end VR head mounted displays while delivering the recommended 90 FPS has been made easier with DLSS.

I think this is the first time Nvidia has talked about DLSS being suitable for VR?
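
To put numbers on why DLSS helps here, a sketch. This assumes DLSS 2.0's "Quality" preset at roughly 2/3 render scale per axis; Nvidia's post doesn't specify which presets VR will use:

```python
# Shading work saved by upscaling at a top-end VR HMD resolution.
W, H, FPS = 2880, 1600, 90   # headset resolution and target framerate

native_px_per_s = W * H * FPS
scale = 2 / 3                # assumed per-axis internal render scale
dlss_px_per_s = (W * scale) * (H * scale) * FPS

print(f"native:        {native_px_per_s / 1e6:.0f} Mpx/s")
print(f"DLSS internal: {dlss_px_per_s / 1e6:.0f} Mpx/s "
      f"({dlss_px_per_s / native_px_per_s:.0%} of native)")
# ~415 vs ~184 Mpx/s: under half the shading cost before the upscale
# pass, which is why holding 90 FPS gets "easier".
```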

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Zero VGS posted:

OMG 1000 posts. Where can I go to reserve this fabled $700 3080? Preferably somewhere I can get an EVGA version, since I think the XC3 is the only one that will fit. Just post all the reservation sites, anyone.

Preorders aren't live yet and probably won't be until 9/17.

shrike82
Jun 11, 2005

Yeah AMD's GPU division seems like it's getting back on its feet after Raja left.
It's hilarious how he's failing upwards and is a serious contender to take over as CEO at Intel.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

shrike82 posted:

Yeah AMD's GPU division seems like it's getting back on its feet after Raja left.
It's hilarious how he's failing upwards and is a serious contender to take over as CEO at Intel.

Their *hardware* is. They desperately need to find a driver team worth a drat and send them a few dump trucks full of money.

Truga
May 4, 2014
Lipstick Apathy

repiv posted:

I think this is the first time Nvidia has talked about DLSS being suitable for VR?

yeah afaik there's still a lot of work that needs to be done (not to mention having games actually support it), but it's definitely coming sometime during the 3000 generation if it's coming.

SuperTeeJay
Jun 14, 2015

For anyone interested in UK/EU pricing, Overclockers is adding models: https://www.overclockers.co.uk/pc-components/graphics-cards/nvidia/geforce-rtx-3080. The fancier 3080s will go for £750-800.

I would probably have been happy with a £650 FE, but the illustration of hot air being shat toward the position of the CPU air cooler ain't great.

Truga
May 4, 2014
Lipstick Apathy

BIG HEADLINE posted:

Their *hardware* is. They desperately need to find a driver team worth a drat and send them a few dump trucks full of money.

i'm honestly waiting for them to scrap the windows driver and just run a wrapper around the linux one, in a reverse "old linux ATi catalyst" way

Mr.PayDay
Jan 2, 2004
life is short - play hard

Taima posted:

The Radeon VII was weird. An absolutely gigantic portion of its total cost was just in the HBM alone. If they hadn't gone HBM that card could arguably have been a lot more successful at a lower price point.

Didn't help that the "street prices" were 800-900 euros over here. Many expected a 2080 competitor for 500-600 bucks.

Gwaihir
Dec 8, 2009
Hair Elf

Encrypted posted:

The 3950X having six extra cores and costing 200 bucks more than the 10900K, but just barely squeezing out a few more frames in some games while getting beaten or matched in DX12, means the 3950X isn't really a great buy.

You are also counting on game devs putting in the extra effort (lol) to optimize for multithreading just to reach barely equal performance. Which is a lot less likely to happen for non-AAA titles.

The story there isn't so much the 3950X as it is the PCIe 4.0 GPU gaining relatively more of a performance advantage at 4K while the PCIe 3.0 GPU was in a statistical tie. That's the relevant part for people looking forward.
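
For reference on the PCIe side, the raw link arithmetic is simple. A minimal sketch, assuming the standard 128b/130b encoding and ignoring protocol overhead (which trims real-world throughput further):

```python
# Usable one-way bandwidth of a PCIe x16 link, per generation.
# Simplified model: GT/s per lane x 16 lanes x encoding efficiency.
GENS = {
    3: (8.0, 128 / 130),   # gen: (GT/s per lane, 128b/130b encoding)
    4: (16.0, 128 / 130),
}

def x16_bandwidth_gb_s(gen: int) -> float:
    gts, eff = GENS[gen]
    return gts * 16 * eff / 8  # /8 converts gigabits to gigabytes

for gen in GENS:
    print(f"PCIe {gen}.0 x16: ~{x16_bandwidth_gb_s(gen):.1f} GB/s each way")
# ~15.8 GB/s on 3.0 vs ~31.5 GB/s on 4.0: a card that streams enough
# data to crowd the 3.0 link has headroom to gain on 4.0.
```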

VorpalFish
Mar 22, 2007
reasonably awesometm

MikeC posted:

It has been less than 12 hours since Huang walked on stage. Relax, RDNA 2 leaks will be coming soon. Certainly before the 3080 launches. You will know AMD is royally hosed if you don't hear anything in the next 2 weeks.


TSMC's 7nm node is already better than the Samsung node Nvidia is using for gaming Ampere; this is why they will run quieter. The latest leaks have RDNA 2 using TSMC's N7P (not EUV), which can supply another 10% energy saving or 7% clock speed, your choice. I haven't heard anything about RDNA 2 being moved to TSMC's 7nm EUV (N7+), which would be even better. It is almost certain the node advantage will let RDNA 2 win the efficiency crown.

DLSS remains their trump card, but DLSS is not in every title yet. I am optimistic about mass adoption by next year, but that is not certain. Without DLSS, though, there is no reason to doubt that AMD can crank out something that can match the 3080. They matched the 2070 with RDNA 1 via the 5700 XT and cancelled their top RDNA 1 die, which was scheduled to compete with the 2080 (the original 'Big Navi'). Maybe some day we will know why, but it might have been the fact that RDNA 1 had issues with power consumption despite being on TSMC N7; that's probably why they didn't come out with their rumored 72 CU RDNA 1 card. But now we know the Xbox and PS5 have RDNA 2 on an APU that maxes out at 300W. A discrete RDNA 2 GPU can have 300W to itself (less power than the 3080), so a full-sized 80 CU RDNA 2 GPU should be able to match or come close to the 3080, which is about 25% better than a 2080 Ti.

I believe in Ampere and will probably be on the 3080 train after reviews come out and we get a better idea of what RDNA 2 is or isn't. But to act like they can't catch the 3080 is hubris. Jensen wouldn't have been nearly as aggressive as he was today on pricing if he himself didn't think RDNA 2 was a threat.

RDNA was basically at parity with Turing efficiency-wise, and that was TSMC 7nm vs 12nm.

I don't think the situation improves vs Ampere on Samsung 8nm. They may still be on a better process, but that gap is shrinking.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Encrypted posted:

The 3950X having six extra cores and costing 200 bucks more than the 10900K, but just barely squeezing out a few more frames in some games while getting beaten or matched in DX12, means the 3950X isn't really a great buy.

You are also counting on game devs putting in the extra effort (lol) to optimize for multithreading just to reach barely equal performance. Which is a lot less likely to happen for non-AAA titles.

The point isn't "you should buy a 3950X," it's "the 3950X is the proper platform for benching Ampere." Same reason you should bench CPUs with a 3090; that doesn't mean everyone needs a 3090.

movax
Aug 30, 2008

BIG HEADLINE posted:

Their *hardware* is. They desperately need to find a driver team worth a drat and send them a few dump trucks full of money.

I was going to ask — one of the historical differentiators all the way back to the ATI R100/R200 days was that driver quality differed greatly. I've never first-hand owned or run an ATI/AMD GPU — is this historical FUD or do AMD drivers still leave a lot to be desired vs NV?

VorpalFish
Mar 22, 2007
reasonably awesometm

shrike82 posted:

odd that they got stuck on SS 8nm and G6X (as opposed to HBM)

I wonder if it came down to them worrying about supply. As it's playing out, their consumer cards aren't competing for fab capacity or memory with the A100. Otherwise HBM2E makes all the sense in the world if you want to push that much bandwidth.

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

I think a lot of people hoping to build a system for Cyberpunk are going to be running AMD cards, because it's very likely that nobody here will be able to get an Ampere card at release due to bots and demand.

Before a 3080 reaches your hands you will need to pay the reseller tax, to the tune of anywhere from 200 to 800 depending on how badly you need it. Finding a 3080 or 3090 in October will be about as easy as finding toilet paper in April.

Truga
May 4, 2014
Lipstick Apathy

movax posted:

I was going to ask — one of the historical differentiators all the way back to the ATI R100/R200 days was that driver quality differed greatly. I've never first-hand owned or run an ATI/AMD GPU — is this historical FUD or do AMD drivers still leave a lot to be desired vs NV?

early amd drivers weren't good but they fixed their poo poo eventually. now it's FUD about 95% of the time, but now and then it isn't and people fall over each other calling the drivers trash for 5 years every time and here we are

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Paul MaudDib posted:

Radeon VII makes more sense when you realize it was a Titan-style prestige card that was never intended to sell units, just to blunt the “lol it's 2019 and AMD still has nothing to compete with 1080 Ti let alone anything newer” factor until they could get Navi out six months or so later. They absolutely knew Navi was going to be better and cheaper and nobody should really have bought VII, that wasn't the point.

It was leftover trash MI50 compute cards that they couldn’t sell because ROCm is a loving mess and welp might as well see what we can get for them from the gamers. Easy to turn into a "prestige" product.

(Bearing in mind this is exactly how Titans got started as well!)

It also probably would not have existed at all if NVIDIA had come out swinging with Turing pricing like they are doing on Ampere.

radeon vii was purely to satisfy investors that there was, in fact, a 7nm consumer gpu

it was clearly a shitshow in every other way except as a cheap MI50

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Truga posted:

early amd drivers weren't good but they fixed their poo poo eventually. now it's FUD about 95% of the time, but now and then it isn't and people fall over each other calling the drivers trash for 5 years every time and here we are

nah the RDNA 1 drivers still suck rear end

MikeC
Jul 19, 2004
BITCH ASS NARC

movax posted:

I was going to ask — one of the historical differentiators all the way back to the ATI R100/R200 days was that driver quality differed greatly. I've never first-hand owned or run an ATI/AMD GPU — is this historical FUD or do AMD drivers still leave a lot to be desired vs NV?

Some people are legitimately screwed over. The 5700 XT was a great example: the mainstream reviewers never had issues on their setups, while I had occasional black screens until the big Xmas update. For a small minority of people, problems continue to persist. There is a dude on YouTube who consistently tests every single release, beta or otherwise, and can report bugs on one setup or another.

VorpalFish posted:

RDNA was basically at parity with Turing efficiency-wise, and that was TSMC 7nm vs 12nm.

I don't think the situation improves vs Ampere on Samsung 8nm. They may still be on a better process, but that gap is shrinking.


Yes, but they obviously fixed that for RDNA 2, since the APU fits in a 300W package of which the GPU only gets a portion. So unless you have magic information we don't, saying they can't keep the same efficiency when scaling past 56 CUs at the Xbox clock speeds, you are really just blowing smoke.

MikeC fucked around with this message at 00:54 on Sep 2, 2020

shrike82
Jun 11, 2005

the trailer for CP2077 with RTX on looked... meh? bolting on RT at a late stage is going to limit the game's visual flair - especially with a game that's been in development for a while

don't forget that it was originally slated to launch at the start of this year (i.e., with Turing cards). people are going to be fine as long as they don't feel the need to run the game at "ultra" settings

Black Griffon
Mar 12, 2005

Now, in the quantum moment before the closure, when all become one. One moment left. One point of space and time.

I know who you are. You are destiny.


I am curious to know what the price will be here in Norway. Demand is not quite as high, so there's a higher chance of getting one closer to launch, but shit's also more expensive here because we're richer than y'all.

movax
Aug 30, 2008

shrike82 posted:

as long as they don't feel the need to run the game at "ultra" settings

lol

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

Here is a post from the EVGA forums detailing their Ampere products:
https://forums.evga.com/Introducing-the-EVGA-GeForce-RTX-30-Series-with-iCX3-Technology-m3072847-p7.aspx#3073192

EVGA_TechLeeM posted:

Currently no pre-order planned. Similarly, no pricing is available at this time.

Like the US store, we cannot generally comment on availability at this time for the EU store. Availability, however, will not be any sooner than the dates NVIDIA mentioned for each GPU.

HYBRID, HC, and KPE cards will come a bit later than the XC3 and FTW3 cards. No ETA to give you, which is probably your next question.

HYBRIDs will be 240mm, except for the KPE HYBRID, which will be 360mm. No plans for a 120mm HYBRID at this time.
HYBRIDs will be 2 Slot.
XC3 cards will all be 2.2 slots. Length is 11.23in. - 285.37mm / Height is 4.38in. - 111.15 mm.
FTW3 cards will be 2.75 slots. Length is 11.81in. - 300mm / Height is 5.38in. - 136.75mm.
There will be an EVGA NVLINK. No ETA at this time.
Since I've seen this mentioned incorrectly, all cards are 3 DisplayPort, 1 HDMI.

PowerLink is expected to work with the XC3 models. For obvious reasons, it will not work with the 3090/3080 FTW3 cards due to the number of PCIe power connectors. Either way, we will have the compatibility list updated when the cards are available on the website.
Step-Up will begin when cards are available. Products will not be listed prior to general availability. I would expect one of the XC3 models will be listed for availability, but we will make that decision prior to general availability.
I'm not sure why people assume bots are buying cards over regular people. It was a popular explanation for why 20-Series cards were always out of stock, which completely ignored literal supply-and-demand issues (lots of people wanted them, but there weren't many initially). Yes, there are per-person quantity caps for each card. Yes, we require a captcha to log in, which is also required to create a profile. This should help to comfort some of you.

repiv
Aug 13, 2009

Black Griffon posted:

I am curious to know what the price will be here in Norway. Demand is not quite as high, so there's a higher chance of getting it closer to launch, but shits also more expensive here because we're richer than y'all.

https://www.nvidia.com/nb-no/geforce/buy/

movax
Aug 30, 2008


So the idea with the HYBRIDs is to watercool the GPU and air-cool the VRM/RAM, and then the HC is a full waterblock?

Happy Noodle Boy
Jul 3, 2002


shrike82 posted:

the trailer for CP2077 with RTX on looked... meh? bolting on RT at a late stage is going to limit the game's visual flair - especially with a game that's been in development for a while

Something about 2077 in general feels off, and I just can't tell if it's the gameplay not being good enough to carry everything else or if everything else will carry the gameplay.

Like, a lot of it looks real good, but the few snippets of gunplay look... stiff? Same with the driving.

shrike82
Jun 11, 2005

Happy Noodle Boy posted:

Something about 2077 in general feels off, and I just can't tell if it's the gameplay not being good enough to carry everything else or if everything else will carry the gameplay.

Like, a lot of it looks real good, but the few snippets of gunplay look... stiff? Same with the driving.

the fact that it's first-person?

Black Griffon
Mar 12, 2005

Now, in the quantum moment before the closure, when all become one. One moment left. One point of space and time.

I know who you are. You are destiny.



oh hey there we go.

... that's a lot of kroner.

repiv
Aug 13, 2009

Happy Noodle Boy posted:

Like a lot of it looks real good but the few snippets of gunplay looks... stiff? Same with the driving.

the combat in TW3 sucked and that was their third attempt at it

their first go at first-person shooter combat probably won't be amazing

Symetrique
Jan 2, 2013




Anyone have experience with preordering from Microcenter? Wondering if trying to get a card through them will be easier than directly through Nvidia.
