Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Taima posted:

Sure you have titles like Horizon where they seemingly rejected DLSS on purpose because AMD was sponsoring it, but saying that it bit them in the rear end is an understatement. That game would have been received completely differently if it had the headroom of DLSS.
Eh, not really. It would still be an awful port; there are a lot more problems with it than just GPU usage. DLSS 2.0 would be nice, yes, but it would do nothing for the exorbitant CPU demands, the constant crashing, the flickering grass (which in some cases actually makes checkerboarded 4K on the Pro look better than native 4K on my PC), the bugged animations/snow deformation, or the ~40fps it runs at on a 1060, a card that vastly outnumbers RTX cards in the installed base of PC gamers. It's completely hosed on numerous levels, even aside from performance.

Death Stranding, for example, runs exactly like you would expect without DLSS: a 1060 can run it at 1080p/60, which is what you usually get comparing a 1060 to a base PS4, since the 1060 is at or above twice as fast (and it also has the option of AMD FidelityFX to get close to, if not better than, the Pro's 4K mode at a locked 30). My 1660 easily dusts my PS4 Pro in the same titles; that's definitely not the case with H:ZD, though. DLSS is not a panacea for a bad port.

Dr. Fishopolis
Aug 31, 2004

ROBOT
I was hoping the 30-series launch would make used 20-series cards cheap enough for me to want to upgrade my 980Ti, but if the top-end card requires being connected directly to a nuclear power facility, I have a weird feeling I'm gonna be waiting for Big Navi instead.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Dr. Fishopolis posted:

I was hoping the 30-series launch would make used 20-series cards cheap enough for me to want to upgrade my 980Ti, but if the top-end card requires being connected directly to a nuclear power facility, I have a weird feeling I'm gonna be waiting for Big Navi instead.

How much of a performance jump were you hoping for in order to justify a new card? A 2080Ti is already 100-150% faster than a 980Ti, depending on the game. Once we see how the actual performance shakes out, maybe look at the 3070 or 3080, which should handily beat the 2080Ti anyhow.

I also wouldn't put too much stock in the "zomg 1000W PSU!!!!" claims just yet. "Recommended" PSU wattages have always been a bit of a joke. Your 980Ti, for example, "requires" a 600W+ PSU, with "recommended" wattage at least 50W higher than that, despite it being quite possible to run a perfectly happy system on a good chunk less.

The current 2080Ti "requires" a 650W PSU as well, despite total-system draw under load often being under 450W for a "reasonable" system. So a PSU-maker's recommendation to bump that a little to 750W shouldn't be terribly concerning, and I'd guess it means an actual increase of ~50W in power for the card itself over the 2080Ti.
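If you want to sanity-check the padding yourself, here's a rough back-of-the-envelope sketch in Python; every wattage in it is an illustrative guess for a hypothetical "reasonable" build, not a measured number:

```python
# Rough PSU-sizing sanity check. All wattages are illustrative guesses
# for a hypothetical "reasonable" gaming system, not measurements.
components_w = {
    "GPU (2080Ti-class board power)": 260,
    "CPU (8-core under gaming load)": 95,
    "Motherboard, RAM, SSDs, fans": 60,
}

load_w = sum(components_w.values())  # ~415W total draw under load
padding = 1.5                        # vendors pad heavily for cheap PSUs
recommended_w = load_w * padding

print(f"Estimated load under gaming: {load_w}W")
print(f"Padded 'recommended' PSU:   ~{recommended_w:.0f}W")
# A system actually pulling ~415W ends up with a 600-750W sticker
# "requirement" once vendor padding is applied.
```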


Dilber
Mar 27, 2007

TFLC
(Trophy Feline Lifting Crew)


Yeah, I've been making fun of my 1000W PSU since it's such overkill, but this card should bring me to at least 50% load on it, so it's still overkill, but less so.

shrike82
Jun 11, 2005

DrDork posted:

The current 2080Ti "requires" a 650W PSU as well, despite total-system draw under load often being under 450W for a "reasonable" system. So a PSU-maker's recommendation to bump that a little to 750W shouldn't be terribly concerning, and I'd guess it means an actual increase of ~50W in power for the card itself over the 2080Ti.

The recommendation is upping to 850W

Dr. Fishopolis
Aug 31, 2004

ROBOT

DrDork posted:

How much of a performance jump were you hoping for in order to justify a new card? A 2080Ti is already 100-150% faster than a 980Ti, depending on the game. Once we see how the actual performance shakes out, maybe look at the 3070 or 3080, which should handily beat the 2080Ti anyhow.

Honestly, I don't need much of a performance bump if DLSS is the way forward. I'm also not concerned with power draw, because I'm just not going to buy a 300W graphics card. What I'm really saying is that pushing out a card with that big a die that sucks that much juice does not exactly scream confidence on Nvidia's part.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Taima posted:

It's actually not a substantial price increase for the top end. The 2080Ti was $1200 minimum, and we're being quoted $1299 minimum, with the FE bumped to $1399. That's immaterial for that specific market, and the conditions for that market are actually better than ever for multiple reasons. It's possible that prices might settle higher, but we will have to see. In any case, the point remains that the top market doesn't care.

Re: price/perf through the rest of the range, there's basically no reliable data, especially for anything below a 3080, so I would definitely hold off on that assessment. Please tell me you're not relying on that "comprehensive benchmark" image where all the cards are supposedly compared, because that is insanely fake.

And re: DLSS 2, I respectfully disagree. I get that people were burned on DLSS 1.0, but only intensive titles need DLSS, not every title. Nvidia is going to be reaching out to everybody to get DLSS in their games, and they have way more impetus to do that than they ever have. Who is making a GPU-intensive game in 2020+ and doesn't want an absolute shitload of free performance? I get your reluctance, but DLSS is the most magical single technology in... honestly, I don't even know what the last comparable tech is. This thing is fully happening.

Sure you have titles like Horizon where they seemingly rejected DLSS on purpose because AMD was sponsoring it, but saying that it bit them in the rear end is an understatement. That game would have been received completely differently if it had the headroom of DLSS. If anything it's a major warning to publishers to include that poo poo, especially when Death Stranding released at the same time and ran on a potato as long as it had DLSS.

Yes, the absolute top end customers will buy whatever comes out. If Jensen comes out with a 2080Ti, scribbles "3090" on it in Sharpie, and announces a $2000 price tag, they'll buy it to replace their 2080Tis. They're kind of irrelevant to the discussion of whether the rest of us are going to be convinced to upgrade. Also, "we're being quoted $1299 minimum" doesn't mean much, given what we already know about volatility in pricing strategies leading up to launch.

What we know for sure (barring some very elaborate fakery) is that there's some kind of monster-sized card with a couple of crazy custom PCBs, an expensive and elaborate cooler, and even a new high-current power connector, and that it's probably being put into the "gaming" product line rather than the pro/datacenter-targeted parts. We don't know much about the specifics of the product lineup, but it's most plausible that it's targeted above the current 2080Ti tier as an extreme halo part. So: super high performance, super high price tag, probably irrelevant outside of that very narrow niche.

For the normal-people range, I'm not basing anything on whatever probably-faked "comprehensive benchmark" you're talking about, just market conditions. Nvidia has very little competition and no reason to offer massive price cuts or performance boosts at the same pricing tier. We haven't seen any indication that they'll do otherwise. It's possible they'll surprise us, but that's why I said "most plausible scenario" instead of "absolutely guaranteed to happen."

Finally, for DLSS, I actually agree with you. It's very impressive technology that is likely to be relevant in the future, and prolong the lifespan of current cards that support it (although we'll probably see all kinds of caveats as adoption increases). Right now, though, it's only marginally relevant. The current marquee support is basically just Control (OK, high requirements, although it does scale down nicely) and Death Stranding (yes, it's a DLSS showcase, but the dirty secret is that it'll run well and look good on a non-DLSS potato, too). So - once everything starts coming out with DLSS support, then it'll be a good time to upgrade for DLSS.

In the meantime, it's not a reason to rush out on launch day. I'm still not seeing any good reason to do that, unless you've already figured out that you're going to be upgrading because what you've got now isn't enough.

(super hot take: there's a good chance that, in retrospect, we're going to see the jump from the 1000 series to the high-end 2000 series as the big one, because it introduced new long-term features like DLSS, and the 2000-to-3000 jump as comparatively minor)

Some Goon posted:

so unless you're a real loathsome type that can't stand the idea that a console can rival your computer, your GPU needs are still going to be dictated by your monitor.

I mean, we are talking about a hobby where a good chunk of people seem to think that "we are the master race" jokes are the height of comedy, sooooo....

Alchenar
Apr 9, 2008

shrike82 posted:

People should also keep an eye on whether the boards come out on Samsung 8nm and run hot; that's another reason to wait for a 7nm Super respin.

I don't know anything about engineering, but it seems intuitively unlikely that a 7nm Super respin could deliver much in this regard.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Alchenar posted:

I don't know anything about engineering, but it seems intuitively unlikely that a 7nm Super respin could deliver much in this regard.

It's from a different foundry, so there's still hope of significant power-efficiency differences, I think?

If that 3090 is actually on 7nm as rumored, then I'm guessing we're hosed, yeah.

shrike82
Jun 11, 2005

The A100 PCIe is on TSMC 7nm and specced at 250W... it could be that the 3090 is redlining frequency, etc., and also that the new RT core implementation runs hot (the A100 doesn't have RT cores). That's another reason to wait for a 3080 Ti/Super with more sane specs.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
If you're going to wait for the refresh, you may as well wait for Hopper, which will be a much bigger leap. And if you're going to wait that long, why not wait for 2nd-gen MCM, since it'll be much more mature?

Logical goon conclusion: never buy a GPU.

shrike82
Jun 11, 2005

Because there might actually be games out that need the power in a year’s time?

Arguing that not rushing to buy Ampere at launch is equivalent to never wanting to upgrade is a galactic brain take.

MikeC
Jul 19, 2004
BITCH ASS NARC

shrike82 posted:

Because there might actually be games out that need the power in a year’s time?

Arguing that not rushing to buy Ampere at launch is equivalent to never wanting to upgrade is a galactic brain take.

The hype train is reaching the ridiculous levels it's at now thanks to Cyberpunk featuring RT and DLSS. For a lot of gamers (myself included), it's a must-have day 1 title, and I sure as hell want my 60-90 FPS with RT on when I play it. Big Navi might arrive in time to compete, but hell, I am going back to Team Green if that's what it takes.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

K8.0 posted:

Logical goon conclusion: never buy a GPU.

shrike82 posted:

Arguing that not rushing to buy Ampere at launch is equivalent to never wanting to upgrade is a galactic brain take.

Arguing about what anyone should buy at all, when we know neither the performance, nor the price, nor the power budget, nor even the manufacturing process any of these cards are based on, is a galactic brain take at this point, and K8.0, I know you're not that dumb.

Let's just all agree we're bored and spitballing various hypothetical situations, guys, and that we all want new cards because if we didn't we wouldn't be poasting in this dumb thread.

Dr. Fishopolis
Aug 31, 2004

ROBOT
if big navi comes out with ray tracing and reasonably stable drivers in time for cyberpunk, we'd all probably end up in some sort of quantum improbability vortex.

shrike82
Jun 11, 2005

I get paid for the number of the posts I make here.

I’m interested in a 3090/Titan replacement for my RTX Titan but if the heat/power rumours are true, I’m going to try to get work to buy me an A100 for WFH use.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

K8.0 posted:

If you're going to wait for the refresh, you may as well wait for Hopper, which will be a much bigger leap. And if you're going to wait that long, why not wait for 2nd-gen MCM, since it'll be much more mature?

Logical goon conclusion: never buy a GPU.

Never wait, always try to future-proof: the darkest times are now.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Dr. Fishopolis posted:

if big navi comes out with ray tracing and reasonably stable drivers in time for cyberpunk, we'd all probably end up in some sort of quantum improbability vortex.

Isn't AMD always like 50% worse performance/watt than Nvidia, regardless of anything else? These 3080 cards look like they're running kinda hot, but isn't AMD's architecture just always way hotter, or am I misremembering?

Like, any time AMD has claimed a huge efficiency increase or an "overclocker's dream," it's been a straight-up lie, I feel like.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

DrDork posted:

Arguing about what anyone should buy at all, when we know neither the performance, nor the price, nor the power budget, nor even the manufacturing process any of these cards are based on, is a galactic brain take at this point, and K8.0, I know you're not that dumb.

Let's just all agree we're bored and spitballing various hypothetical situations, guys, and that we all want new cards because if we didn't we wouldn't be poasting in this dumb thread.

I agree with you; that's the point. There are a lot of really dumb posts being made assuming that Ampere is probably/definitely going to be a terrible buy, or that the refresh is the real value target. A refresh is hardly guaranteed, hardly guaranteed to have process parity with AMD if the original Ampere doesn't, and/or may represent a terrible value if it comes shortly before Hopper. That said, I would argue that it is probably quite a safe move to preorder Ampere the moment it becomes available, because the high-end GPUs will launch first and demand will definitely outstrip supply unless they are terrible in a way that is incredibly unlikely.


Zero VGS posted:

Isn't AMD always like 50% worse performance/watt than Nvidia, regardless of anything else? These 3080 cards look like they're running kinda hot, but isn't AMD's architecture just always way hotter, or am I misremembering?

Like, any time AMD has claimed a huge efficiency increase or an "overclocker's dream," it's been a straight-up lie, I feel like.

AMD got a huge efficiency boost from 7nm. If Nvidia has gone 8nm, they are almost certainly going to get SOME kind of efficiency boost, and almost certainly still retain the efficiency crown. Obviously they would do even better on 7nm, and it's always possible that RDNA2 is some magical, unexpected leap forward. So like everything else it's just guesswork based around reasonable assumptions.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Nah they're pretty close right now. Of course, they have a whole architectural gen advantage and are still behind, but ya know, close.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zero VGS posted:

Isn't AMD always like 50% worse performance/watt than Nvidia, regardless of anything else? These 3080 cards look like they're running kinda hot, but isn't AMD's architecture just always way hotter, or am I misremembering?

AMD has been bad in the past, but that's for two reasons: (1) they went a loooong time basically respinning slightly modified versions of the same uarch. The RDNA uarch in the 5700(XT) didn't entirely fix that issue, but it was a good step. (2) because AMD has been so behind on uarch and driver optimizations, one of the main ways they've stayed competitive(ish) in the mid-tier has been by juicing the gently caress out of their cards to eke out every last frame they possibly could. Much like with CPU overclocking and whatnot, that may mean you end up spending +20% power for a +5% performance gain, which totally fucks up the efficiency metrics, but does make that little FPS bar just a bit higher.
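To put numbers on how badly that kind of juicing wrecks the efficiency column, here's a quick illustrative calculation; the stock figures are made up purely for the example, not benchmarks of any real card:

```python
# How a +20% power / +5% performance factory overclock hurts perf/watt.
# Stock numbers below are made up purely for illustration.
stock_power_w = 225.0
stock_fps = 100.0

juiced_power_w = stock_power_w * 1.20  # +20% power
juiced_fps = stock_fps * 1.05          # +5% performance

stock_eff = stock_fps / stock_power_w
juiced_eff = juiced_fps / juiced_power_w

print(f"stock:  {stock_eff:.3f} fps/W")
print(f"juiced: {juiced_eff:.3f} fps/W "
      f"({(juiced_eff / stock_eff - 1) * 100:+.1f}%)")
# 1.05 / 1.20 = 0.875, so the juiced card is ~12.5% worse in fps/W
# even though its FPS bar is slightly taller.
```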

People are hoping that AMD's RDNA2 is gonna be a lot better than they've been in the past, to which I say "we'll see." The TFLOP counts they're advertising on the new consoles are pretty good, and the power budget they're presumably working within suggests it at least isn't a hideously over-drawn stack, but then again it's hard to gently caress that up on 7nm. That AMD hasn't really said much concrete about Big Navi is a bit of a concern, and if the 30-series launches without any sort of PR response from AMD, I think it'd be safe to conclude that they don't think they'll be able to directly compete again this generation.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

K8.0 posted:

That said, I would argue that it is probably quite a safe move to preorder Ampere the moment it becomes available, because the high-end GPUs will launch first and demand will definitely outstrip supply unless they are terrible in a way that is incredibly unlikely.

I'm using a 1080Ti that can no longer max out my monitor on a bunch of the games I play at the settings I want, and I'm sitting on a pile of WFH funbux, so I'm probably gonna grab whatever sub-$1k card they put out, regardless of whether they're terrible or not. ¯\_(ツ)_/¯

e; also, dear KVM manufacturers, can you please stop loving up your implementation of the DP protocol so that I can run a monitor >60Hz on it? I really don't want to have to press two buttons to switch all my poo poo around when I should only have to press one. Also plz to be making a HDMI 2.1 KVM immediately, kthx.

Mukaikubo
Mar 14, 2006

"You treat her like a lady... and she'll always bring you home."
Yeah. My original target was for sure Cyberpunk, but right now I've got the flight simulator bug real bad. It's absolutely wrecking my processor and memory as much as my graphics card, which is unsurprising since it's a four-year-old system. And if I'm replacing those, that's pretty much a new build, so I may as well also upgrade my GTX 1060 (the 6GB version), right? But buying a 2000-series card now feels weird, and the rumored prices for the 3070 - which is a huge step up from the 1060 - are at the high end of my price tolerance... but then waiting for those is getting iffy.

Right now I'm debating building the new computer and putting this 1060 in it until I can get a 3070, and just grinning and bearing everything bottlenecking my current graphics card *hard* for a few months, because right now my computer is shaky enough that I can't even stably run the newest game I'm enamored with. So it's more a general "I want to build a new computer, and since I am, I may as well get the best video card in my price range, which is (probably) about to be the 3070, and I am very antsy about waiting well into the new year for it" proposition.

shrike82
Jun 11, 2005

It’s odd that Nvidia didn’t push for DLSS support on MSFS. That’s a game that seems like it could use it.

I hope they’re going to announce a slew of DLSS games at launch because support’s been anaemic.

Mukaikubo
Mar 14, 2006

"You treat her like a lady... and she'll always bring you home."

shrike82 posted:

It’s odd that Nvidia didn’t push for DLSS support on MSFS. That’s a game that seems like it could use it.

I hope they’re going to announce a slew of DLSS games at launch because support’s been anaemic.

It really does - they're using so much machine-learning voodoo in the game's development that you would have thought it'd be a welcome addition. And frankly, a ton of the scenery in Flight Simulator (varying clouds and weather, distant small terrain) is probably right up DLSS's alley, from my limited understanding of how the training goes.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Mukaikubo posted:

Yeah. My original target was for sure Cyberpunk, but right now I've got the flight simulator bug real bad.

If money matters to you, and if you can grind it out for another month, the 30-series will have at least been announced and you can make the choice between plunking down for one of those or trying to pick up a fire-sale 20-series card.

Then, if that alleviates your bottleneck enough to be able to wait another month or so, Zen3 should be out before the end of the year, promising better performance than current Zen2, plus PCIe 4.0 for better future utility.

Sucks that you're getting the bug at basically the worst time to want to build a new computer. :(

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
MSFS is a huge work in progress and it's not surprising DLSS wasn't on the list for support at launch. It seems like it's definitely a game Nvidia should be targeting though, because it's quite demanding and also likely to remain reasonably popular among the high-budget crowd for the next several years.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Mukaikubo posted:

It really does - they're using so much machine-learning voodoo in the game's development that you would have thought it'd be a welcome addition. And frankly, a ton of the scenery in Flight Simulator (varying clouds and weather, distant small terrain) is probably right up DLSS's alley, from my limited understanding of how the training goes.

MSFS would be a poster child for DLSS considering how much 4K seems to ruin its poo poo.

I keep wondering if there's weird corporate ego/market stuff at play with DLSS adoption. For example, Microsoft is in something of a pickle politically: AMD is powering its console, and embracing DLSS in a big way might run counter to that partnership.

You would assume that MS has good relations with Nvidia but they haven't afaik really gone out of their way to support its proprietary tech. It seems like they're trying to be impartial, generally speaking.

K8.0 posted:

MSFS is a huge work in progress and it's not surprising DLSS wasn't on the list for support at launch. It seems like it's definitely a game Nvidia should be targeting though, because it's quite demanding and also likely to remain reasonably popular among the high-budget crowd for the next several years.

I hope it's this and not politics.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"
Can't you comfortably run an SLI 2080Ti setup on 1000W? Seems insane to me that a single 3090 would require 850W.

Craptacular!
Jul 9, 2001

Fuck the DH
The Microsoft way is to make their own equivalent of DLSS that's bundled into DirectX and thus can also work on their console. They don't need Nvidia's DLSS server farm for deep learning because they already own Azure.

Microsoft does not care if PC offers a better experience than their console, but they don't want to help make it that way. They also don't want to owe Nvidia poo poo; Nvidia will make new cards that run the game better eventually anyhow.

ufarn
May 30, 2009
If I were Nvidia, I'd pay MS to not include DLSS so people will buy a 3090 at launch only for MSFS to get DLSS 2.0 a month later.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

jisforjosh posted:

Can't you comfortably run an SLI 2080Ti setup on 1000W? Seems insane to me that a single 3090 would require 850W.

Someone already pointed this out but the "required" PSU numbers are basically bullshit 100% of the time historically. They're usually extremely padded.

Craptacular! posted:

The Microsoft way is to make their own equivalent of DLSS that's bundled into DirectX and thus can also work on their console. They don't need Nvidia's DLSS server farm for deep learning because they already own Azure.

Call me crazy but I doubt the main obstacle to DLSS implementation is owning or having access to a server farm.

repiv
Aug 13, 2009

Flight Simulator doesn't even use DX12 currently; rendering-wise, their first priority will probably be moving to it to help alleviate the massive CPU bottlenecks.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

shrike82 posted:

It’s odd that Nvidia didn’t push for DLSS support on MSFS. That’s a game that seems like it could use it.

I hope they’re going to announce a slew of DLSS games at launch because support’s been anaemic.

I am having flashbacks to 2018, when Nvidia had a slide with a bunch of games "coming soon" with RTX support. It is Aug 2020, and I think like three of the titles have RTX support in some way or another, and two have DLSS 1.0 support.

The pic is still here on their blog.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

repiv posted:

Flight Simulator doesn't even use DX12 currently

How did that happen?

Indiana_Krom
Jun 18, 2007
Net Slacker
The required/recommended PSU specs from Nvidia/AMD assume some complete shitbox of a PSU that will literally explode and start launching components through the case past 40% of its "rated" output.

repiv
Aug 13, 2009

Rinkles posted:

How did that happen?

Long development cycle, probably; they committed to DX11 when it was still current and ran with it.

It is kind of glaring now that we're seeing more and more AAA games use DX12 or Vulkan exclusively, though, especially since other Microsoft studios have led the charge (e.g. Forza and Gears have had multiple DX12-only installments already).


MikeC
Jul 19, 2004
BITCH ASS NARC

repiv posted:

Flight Simulator doesn't even use DX12 currently; rendering-wise, their first priority will probably be moving to it to help alleviate the massive CPU bottlenecks.

Is this why it runs like dogshit right now?

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin
People really need to chill out about PSU requirements for these GPUs. The RTX Titan could pull 320 watts, and Nvidia recommended a 650W unit.

If people really want to reduce uncertainty about these things, they need to buy a Kill A Watt and end their ignorance about how much power their computer uses.
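For anyone who does grab one: a Kill A Watt reads AC draw at the wall, while PSU wattage ratings are DC output, so you have to fold in the PSU's efficiency to compare the two. A minimal sketch, with made-up example numbers:

```python
# Converting a wall-meter reading (AC) into actual PSU load (DC).
# The reading and efficiency figure are made-up examples.
wall_draw_w = 420.0      # hypothetical Kill A Watt reading while gaming
psu_efficiency = 0.90    # ballpark for an 80 PLUS Gold unit near 50% load

dc_load_w = wall_draw_w * psu_efficiency  # what the PSU actually delivers
print(f"DC load on the PSU: ~{dc_load_w:.0f}W")
# ~378W of real load means even a 650W unit is only at ~58% capacity.
```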

Samadhi
May 13, 2001

So what you're saying is that if I am building a new PC and planning on finishing it with a 3080 or 3070, an 850W Platinum PSU will be fine...
