Stickman
Feb 1, 2004

ChaseSP posted:

I've redeemed the AMD Rewards code and still nothing. RE2 has been out for a while now, which is making me upset at this point.

Apparently they're out of stock and are supposedly replenishing somewhere around the 18th-20th. If it's not up by then I'd contact AMD (or contact them to complain now, since that's extremely annoying).

Riflen posted:

To me it sounds like DLSS doesn't work well or can't work at all at high frame rates. That's why they're limiting the options that way.

I wonder if the tensor cores simply bottleneck you hard if you're running at settings that would give you ~8ms frametimes or lower. You'll notice Nvidia are recommending Ultra settings all over the place.
Seems like the DLSS performance improvement can only be spent on higher fidelity, not on better motion resolution from higher frame rates (by playing at lower resolutions or turning down quality settings).

It seems like the worst-case scenario would just be 2080-level performance at 1440p. Given the reports, though, I suppose 2080-level DLSS might perform significantly worse than just straight TAA at 1440p.
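As a back-of-the-envelope sketch of that fixed-overhead idea (the 1.5 ms tensor cost below is an assumed number for illustration, not a measurement), a fixed per-frame cost eats a bigger share of the budget the shorter your frametimes get:

code:
# Hypothetical fixed per-frame DLSS cost vs. shrinking frame budgets.
DLSS_OVERHEAD_MS = 1.5  # assumed, purely illustrative

for render_ms in (16.7, 8.0, 4.0):  # ~60, ~125, ~250 fps render times
    fps_plain = 1000.0 / render_ms
    fps_dlss = 1000.0 / (render_ms + DLSS_OVERHEAD_MS)
    loss = 100.0 * (1.0 - fps_dlss / fps_plain)
    print(f"{render_ms:4.1f} ms -> {fps_dlss:5.1f} fps ({loss:4.1f}% below {fps_plain:5.1f})")
The same overhead that costs ~8% at 60 fps costs ~27% at 250 fps, which would square with Nvidia recommending Ultra settings everywhere.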

Stickman fucked around with this message at 19:54 on Feb 13, 2019


Parallelwoody
Apr 10, 2008


I'm loving this absolutely broke-dick delivery of expensive-ass features.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Sort of also enjoying the schadenfreude, but man, you gotta wonder if Nvidia set raytracing in games back a couple of years because of their overreach. It's clear that real-time raytracing, or DLSS, or any of this mumbo-jumbo bullshit needed another year to cook before being announced.

Stickman
Feb 1, 2004

SwissArmyDruid posted:

Sort of also enjoying the schadenfreude, but man, you gotta wonder if Nvidia set raytracing in games back a couple of years because of their overreach. It's clear that real-time raytracing, or DLSS, or any of this mumbo-jumbo bullshit needed another year to cook before being announced.

I doubt it. Now we actually have raytracing in games (and more on the way). The only way it'd be a setback is if they ditch it entirely with the next generation, and even then only if it's still not ready (which would be hard to call a "setback").

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



repiv posted:

DLSS may be a wet fart but the RTGI implementation in Metro looks really nice. OC3D has some comparison sliders.

Here are some more clips in motion, comparing both options (DLSS/RTX) on and off.
DLSS looks noticeably blurry. What's the point of a high-resolution monitor when everything looks like an Emmanuelle movie from the '70s?

https://www.youtube.com/watch?v=8q7KCTXy2Jc

Funny how RTX on just disables all sunlight shining through the window (0:55).

Still not convinced by either feature. I just want to go back to solid performance improvements at midrange prices. My 970 is getting long in the tooth and no option is appealing right now. Shrink that shit, cut RTX and slap 11 GB on the 2080, lower the price 30%, dunno. I don't want to stopgap with a current 2070 or 2080, and the Ti is still just ridiculously expensive.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

mcbexx posted:

Here are some more clips in motion, comparing both options (DLSS/RTX) on and off.
DLSS looks noticeably blurry. What's the point of a high-resolution monitor when everything looks like an Emmanuelle movie from the '70s?

https://www.youtube.com/watch?v=8q7KCTXy2Jc

Funny how RTX on just disables all sunlight shining through the window (0:55).

Still not convinced by either feature. I just want to go back to solid performance improvements at midrange prices. My 970 is getting long in the tooth and no option is appealing right now. Shrink that shit, cut RTX and slap 11 GB on the 2080, lower the price 30%, dunno. I don't want to stopgap with a current 2070 or 2080, and the Ti is still just ridiculously expensive.

I don’t want to assume what your GPU budget or upgrade history is like, but the 1070 Ti, 1080, 2060, or Vega 56/64 will offer a 70-100% performance increase. The 2060 is $350ish and the Vega 56 is $330 and up, not too far off what a launch 970 cost.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/33.html

https://pcpartpicker.com/products/video-card/#c=436,404,405&sort=price

Stickman
Feb 1, 2004

Besides disabling the indoor sunlight, RTX lighting just seems far too dark indoors. It looks nice, but dark just doesn't work all that well on current monitors. That said, it's probably more of an artistic direction issue than RTX itself, and we'll see how it plays out in motion.

Stickman
Feb 1, 2004

B-Mac posted:

I don’t want to assume what your GPU budget or upgrade history is like, but the 1070 Ti, 1080, 2060, or Vega 56/64 will offer a 70-100% performance increase. The 2060 is $350ish and the Vega 56 is $330 and up, not too far off what a launch 970 cost.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/33.html

https://pcpartpicker.com/products/video-card/#c=436,404,405&sort=price

That Red Dragon Vega 56 is out of stock, unfortunately. It was definitely the best deal out there - 1070 Ti/2060-level performance with 8 GB of VRAM for $340 plus three free games. Otherwise, it's still used 1070 Tis for $280ish - EVGA, Gigabyte, and MSI cards will still have 1.5+ years of transferable warranty remaining. 2060s are just so hard to recommend with only 6 GB of VRAM, and used 1080s are reaching the end of their transferable warranties.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
The problem with a lot of these initial DLSS comparisons is still being repeated now: they're testing the performance uplift against the native, non-DLSS final output res, and/or showing how it looks next to 1440p, which is the 'real' rendering res of DLSS 4K.

DLSS does have overhead compared to its base res without DLSS - it ain't free, regardless of the tensor cores doing a lot of the work - so you can't compare the performance of, say, non-DLSS 1440p and DLSS 4K, which is 'rendering' at 1440p. You're definitely not going to get 1440p performance with DLSS 4K, but you certainly will get a perf uplift over straight 4K. What was obvious to me when DLSS's performance numbers were revealed is what these tests should do: pick an intermediate resolution under 4K that gets you within the same ballpark of performance as DLSS 4K, and compare the visual quality of that mode, using regular upscaling, against what DLSS is delivering. That's what DLSS should be measured against.

Both AMD and Nvidia let you set any arbitrary res (I often use 2880x1620 and 3200x1800 on my 1060, depending on the game), and many games have resolution scaling sliders now to boot, so there's no reason not to do this. Of course DLSS 4K will give you a good boost over native 4K, and it should (apparently not with Exodus, currently) give you an image quality boost over 1440p. But since you're not getting 1440p performance, I only care how it looks compared to a native res in between that and 4K: how does DLSS 4K compare to 3200x1800? Is DLSS 4K better quality than 4K in BF5 with 80% resolution scaling? Is there any difference at all? If there isn't, or it's minuscule, it's largely pointless.

I think Hardware Unboxed was the only one (or one of few) to do this when they reviewed the FFXV DLSS patch. They tested it against 1800p and found virtually no benefit.
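To make that concrete, here's the pixel-count arithmetic for the resolutions being discussed (nothing measured, just math):

code:
base = 3840 * 2160  # native 4K
resolutions = {
    "native 4K":            (3840, 2160),
    "3200x1800":            (3200, 1800),
    "4K at 80% scale":      (int(3840 * 0.8), int(2160 * 0.8)),
    "1440p (DLSS 4K base)": (2560, 1440),
}
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:21s} {px:>9,} px ({100 * px / base:5.1f}% of 4K)")
1800p renders about 69% of 4K's pixels and 80% scaling about 64%, while DLSS's 1440p base is about 44% - so an apples-to-apples image quality comparison should land somewhere in that intermediate band, exactly as described above.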

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
:yooge: It just works folks

Geek Icon
May 8, 2006
Hello.
Why are the new 2080 Ti cards so expensive, and why are there so many versions of them from Asus and EVGA?

I couldn't afford, or even justify the cost of, a 1080 Ti back in the days when bitcoin was a revolutionary new and improved way to buy yourself a Lamborghini, so I ended up waiting months for new supply and still overpaid for a 1080.

Is upgrading from a 1080 card to a 2080 one even worth it for a 1440p monitor?

Probably not.


Why and how the fuck, man, this is dumb, my gaming hobby has become fucking expensive

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Lack of competition and a willingness from consumers to pay that much, basically

AMD was supposed to mix things up a little, but they just released a card that's only as fast as Nvidia's best offering from two years ago, for the exact same price. The Radeon VII was supposed to be cheaper than the 1080 Ti and it... isn't.

..btt
Mar 26, 2008

Stickman posted:

Besides disabling the indoor sunlight, RTX lighting just seems far too dark indoors. It looks nice, but dark just doesn't work all that well on current monitors. That said, it's probably more of an artistic direction issue than RTX itself, and we'll see how it plays out in motion.

It seems that existing rendering techniques require a different lighting balance than ray tracing, which is to be expected. I wonder if it will be possible to have well composed scenes that work for traditional render paths and ray tracing in the context of games. Maybe just add a shitload of light sources that are disabled unless you're using RTX?
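A hypothetical sketch of that last idea - pick a light set at load time based on the render path. Every name here is invented for illustration, not from any real engine:

code:
def active_lights(scene, rtx_on):
    # Shared physically-placed sources, plus extras that exist only to
    # brighten the ray-traced path, since RT-lit interiors come out darker.
    lights = list(scene["base_lights"])
    if rtx_on:
        lights += scene["rtx_only_lights"]
    return lights

scene = {
    "base_lights": ["sun", "window_sky"],
    "rtx_only_lights": ["fill_bounce_1", "fill_bounce_2"],
}
print(active_lights(scene, rtx_on=True))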

shrike82
Jun 11, 2005

The deep learning research papers that Nvidia or Nvidia-adjacent researchers have published don't give the sense that they have an edge over academia or the tech giants, e.g. Google.

I wouldn't be surprised if DLSS is a toy application some Nvidia researcher played with - one that doesn't work well in real-life conditions when thrown against a broad set of games - that got seized upon by suits in the company out of desperation.

PC LOAD LETTER
May 23, 2005
WTF?!

Zedsdeadbaby posted:

and a willingness from consumers to pay that much, basically
Weelll, RTX sales in general are supposed to be kinda mediocre to even kinda bad right now, so it'd probably be more correct to say a lack of high-end competition, plus NV and the OEMs giving no fucks about providing value and wanting to keep prices as near crypto-boom levels as possible.

eames
May 9, 2009

The GTX 1660 Ti is a good step in the right direction; hopefully RTX sales remain "mediocre" so they'll be forced to extend that range upwards (1770? 1880?).
All of the RTX/tensor-related features remind me of HairWorks: good concepts introduced 1-2 generations before the hardware is ready for them. Not many people are going to pay $$$ to upgrade their Pascal GPUs for an IQ feature that then results in a massive net FPS loss.

repiv
Aug 13, 2009

eames posted:

The GTX 1660 Ti is a good step in the right direction; hopefully RTX sales remain "mediocre" so they'll be forced to extend that range upwards (1770? 1880?).
All of the RTX/tensor-related features remind me of HairWorks: good concepts introduced 1-2 generations before the hardware is ready for them. Not many people are going to pay $$$ to upgrade their Pascal GPUs for an IQ feature that then results in a massive net FPS loss.

The difference is that HairWorks was built on the standard compute and tessellation pipelines, so it failing to take off didn't really cost Nvidia anything. They had to support compute/tess anyway for other things.

RT and Tensor on the other hand have dedicated silicon that's completely dead weight if adoption doesn't pan out.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
RTX cards are expensive because they're fucking enormous. At 445mm², the 2060/2070 die is nearly as big as the 1080 Ti's 471mm². The 2080 Ti's is 754mm².
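Some napkin math on why area hurts: fewer candidate dies per wafer and worse yield, compounding. The defect density and wafer price below are assumed illustrative numbers, not anyone's real figures:

code:
import math

WAFER_MM = 300.0       # wafer diameter
DEFECTS_PER_CM2 = 0.1  # assumed defect density, illustrative
WAFER_COST = 6000      # assumed cost per wafer in USD, illustrative

def dies_per_wafer(area_mm2):
    # Standard approximation, including edge loss.
    return int(math.pi * (WAFER_MM / 2) ** 2 / area_mm2
               - math.pi * WAFER_MM / math.sqrt(2 * area_mm2))

def die_yield(area_mm2):
    # Simple Poisson model: yield falls exponentially with die area.
    return math.exp(-(area_mm2 / 100.0) * DEFECTS_PER_CM2)

for name, area in (("2060/2070 (TU106)", 445),
                   ("1080 Ti (GP102)", 471),
                   ("2080 Ti (TU102)", 754)):
    good = dies_per_wafer(area) * die_yield(area)
    print(f"{name:17s} {area} mm^2 -> ~{good:3.0f} good dies, ~${WAFER_COST / good:.0f} each")
Under those made-up assumptions the 754mm² die costs roughly 2.5x as much per good die as the 445mm² one, before boards, memory, or margin enter into it.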

orcane
Jun 13, 2012

Fun Shoe

K8.0 posted:

RTX cards are expensive because they're fucking enormous. At 445mm², the 2060/2070 die is nearly as big as the 1080 Ti's 471mm². The 2080 Ti's is 754mm².
Oh no, poor Nvidia had to make huge fucking chips to replace their much smaller older chips, and that means of course they HAVE TO charge $$$ even though the additional features are a complete gamble at this point, stupid ungrateful consumers.

I'm going to mail Jensen a few more leather jackets now, I feel so bad for him.

wargames
Mar 16, 2008

official yospos cat censor

repiv posted:

RT and Tensor on the other hand have dedicated silicon that's completely dead weight if adoption doesn't pan out.

Adoption won't pan out, as it's not a feature supported in the console market, which AMD controls.

eames
May 9, 2009

repiv posted:

RT and Tensor on the other hand have dedicated silicon that's completely dead weight if adoption doesn't pan out.

Well, that's the gamble they took with this architecture: they tried to find a way to recoup some of their AI/datacenter investments in the gaming market. The only reason this strategy works right now is that there's zero competition at the high end.

alex314
Nov 22, 2007

Nvidia has pockets deep enough to throw some cash at game studios and ask them to add the feature. We'll see how the MechWarrior implementation works out.

repiv
Aug 13, 2009

wargames posted:

Adoption won't pan out, as it's not a feature supported in the console market, which AMD controls.

RT adoption is going pretty well considering the "consoles won't support it until next-next-gen at the earliest" roadblock - e.g. Epic just confirmed that UE4 will support DXR as a first-class feature, unlike previous NV-centric tech that was relegated to unofficial UE4 forks.

Tensor seems to be DOA for games, though; as far as I know, not a single developer is using NV's tensor denoisers for their DXR implementations. Everyone rolled their own compute-based denoiser.
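For flavor, a toy version of what a compute-based spatial denoiser does - real DXR denoisers are temporal and vastly more sophisticated, this just shows the basic shape (edge-aware weighted averaging of noisy per-pixel results):

code:
import numpy as np

def toy_denoise(img, radius=2, sigma=0.1):
    # Weight each neighbor by how close its value is to the center pixel,
    # then average - a crude bilateral-style blur that preserves edges.
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            patch = img[max(0, y - radius):y + radius + 1,
                        max(0, x - radius):x + radius + 1]
            weights = np.exp(-((patch - img[y, x]) ** 2) / (2 * sigma ** 2))
            out[y, x] = (patch * weights).sum() / weights.sum()
    return out

noisy = np.clip(0.5 + 0.2 * np.random.randn(64, 64), 0.0, 1.0)
print(noisy.std(), toy_denoise(noisy).std())  # std drops after filtering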

PC LOAD LETTER
May 23, 2005
WTF?!

K8.0 posted:

RTX cards are expensive because they're fucking enormous.
Huge die sizes, on what was then cutting-edge bulk process tech no less, have been the norm in the GPU biz for a long, long time now.

Yes, they cost no small amount of money, but it used to be that they could still sell graphics cards, particularly previous-gen "old" ones, for much more reasonable prices.

edit: \/\/\/\/\/\/\/\/ it probably is true that it's mostly Nvidia doing the price gouging, though

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

orcane posted:

Oh no, poor Nvidia had to make huge fucking chips to replace their much smaller older chips, and that means of course they HAVE TO charge $$$ even though the additional features are a complete gamble at this point, stupid ungrateful consumers.

I'm going to mail Jensen a few more leather jackets now, I feel so bad for him.

Dude, relax - he's stating an obvious fact: they're huge and cost a lot to make. Of course Nvidia didn't have to go this route, and it's not a defense of the strategy that led to this point, but it's absolutely true that the die sizes and GDDR6 are significant factors in the cards' high prices. It's not just Nvidia gouging.

Inept
Jul 8, 2003

repiv posted:

RT adoption is going pretty well considering the "consoles won't support it until next-next-gen at the earliest" roadblock

Is there any indicator at all that AMD and the console manufacturers care about this? Even if they do, it seems too late in the game to change their current designs unless next-gen consoles are 3+ years out.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

eames posted:

The GTX 1660 Ti is a good step in the right direction; hopefully RTX sales remain "mediocre" so they'll be forced to extend that range upwards (1770? 1880?).
All of the RTX/tensor-related features remind me of HairWorks: good concepts introduced 1-2 generations before the hardware is ready for them. Not many people are going to pay $$$ to upgrade their Pascal GPUs for an IQ feature that then results in a massive net FPS loss.
I don't think it's a 'step' in any direction, though - it was likely always planned, since they know they can't scale the tensor cores down to a price point that actually moves a good number of cards. Going beyond the 1660 Ti performance-wise without tensor cores would completely undercut the 2060, and they're just not going to do that unless RTX cards absolutely bomb - which they likely won't, as they're now the only choice in that performance bracket since Nvidia stopped making the mid/high-end 10-series cards.

repiv
Aug 13, 2009

Inept posted:

Is there any indicator at all that AMD and the console manufacturers care about this? Even if they do, it seems too late in the game to change their current designs unless next-gen consoles are 3+ years out.

Yeah, that's why I said next-next-gen - it's not happening on consoles any time soon. But engine developers seem interested regardless (either because it's cool, or because NV is shoveling money at them, or maybe both), considering DXR/RTX already has more confirmed games than NV's previous efforts like VXGI ever had.

..btt
Mar 26, 2008
I think ray tracing looks great when done well, and I find it a little strange that people posting in this thread, of all places, talk as if they want it to fail. I mean, I haven't bought an RTX card despite having pre-ordered a 1080 Ti, but I absolutely hope it works out and becomes affordable in a generation or two.

Nvidia obviously has profit margins in mind, but so do AMD and Intel, and Nvidia is probably the only company in a position to drive adoption in the current market. Of course it won't be driven by the console market, which is geared toward cheap, relatively low-end hardware. PCs had functional dedicated 3D hardware before consoles did, too, and I'm glad that worked out.

orcane
Jun 13, 2012

Fun Shoe
More announced games at this point :v:

repiv
Aug 13, 2009

Well VXGI shipped in a grand total of zero games so RTX already has that beat :v:

Only the stripped down VXAO ever shipped, and only in two games.

orcane
Jun 13, 2012

Fun Shoe
Haha it was in Rise of the Tomb Raider, too.

Shadow still doesn't have RTX :thunk:

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Inept posted:

Is there any indicator at all that AMD and the console manufacturers care about this? Even if they do, it seems too late in the game to change their current designs unless next-gen consoles are 3+ years out.

Don't expect any hardware assisted ray tracing in consoles until next-next gen.

There have been and continue to be some great innovations in simulating light realistically on general purpose hardware though. Look at the game Claybook as an example. So I expect we'll see some cool stuff from upcoming games regardless.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

..btt posted:

I think ray tracing looks great when done well, and I find it a little strange that people posting in this thread, of all places, talk as if they want it to fail.
People are more likely saying they want the considerable die area dedicated to accelerating RTX to go toward boosting more traditional rendering methods right now. I want ray tracing, but the problem is it's arriving at a time when framerates above 60 and high-res rendering (4K or ultrawide) are becoming commonplace. RTX, as it exists now, requires you to step back on both.

As well, after two years people were likely hoping 4K ~60 fps would be reachable by mid-tier cards, when in reality the price to reach it has remained largely the same.

Cygni
Nov 12, 2005

raring to post

While console adoption definitely helps PC adoption of certain features, it's certainly not the be-all end-all... or shit, even the most important factor in whether a PC game supports certain features.

On top of that, I frankly don't expect much from the next wave of Sony/MS consoles, and the best-selling console in the US is an Nvidia reference design with a cart slot.

Womyn Capote
Jul 5, 2004


I built a new gaming PC a few months ago but kept my old GTX 970 in it. Any advice for a reasonable upgrade under $500?

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Womyn Capote posted:

I built a new gaming PC a few months ago but kept my old GTX 970 in it. Any advice for a reasonable upgrade under $500?

The RTX 2070 is right around $500 (or sometimes slightly under) and will roughly double your FPS in many games (assuming you're not CPU-bound). The step below that is the 2060, which is around $350 for somewhere around 50% better FPS (again, very rough numbers here), or the upcoming 1660 Ti, which you should probably wait and see reviews of before committing. What's your monitor like, though? A 2070 is overkill for 1080p 60fps.
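The CPU-bound caveat in one line: the displayed frame rate is capped by whichever side is slower (all the frame times below are made-up examples):

code:
def fps(cpu_ms, gpu_ms):
    # A frame isn't done until both the CPU and GPU work for it is.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 10.0             # CPU can feed 100 fps (assumed)
print(fps(cpu_ms, 20.0))  # GTX 970-ish load: GPU-bound at 50 fps
print(fps(cpu_ms, 10.0))  # ~2x the GPU: doubled to 100 fps
print(fps(cpu_ms, 5.0))   # even faster GPU: still 100 fps, now CPU-bound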

sauer kraut
Oct 2, 2004

Womyn Capote posted:

I built a new gaming PC a few months ago but kept my old GTX 970 in it. Any advice for a reasonable upgrade under $500?

The 1660 Ti is launching tomorrow; it's about the only interesting new card atm and might be up your alley.
Otherwise, a used 1070 Ti/Vega for $300-something, or a used 1080 Ti for $500-something.

EdEddnEddy
Apr 5, 2012



While I feel DLSS will die in the long run, I hope RT is here to stay and gets better. Let the GPU render the game as it was intended, and just improve performance without trying to make it do more with less.

Real-time RT is finally sort of here, and going by how that Quake 2 RT project panned out, more of that would be great - either for remakes of old classics, or for new games using it with a more simplistic design but vastly more realistic shading/shadows/etc.


..btt
Mar 26, 2008

Happy_Misanthrope posted:

People are more likely saying they want the considerable die area dedicated to accelerating RTX to go toward boosting more traditional rendering methods right now. I want ray tracing, but the problem is it's arriving at a time when framerates above 60 and high-res rendering (4K or ultrawide) are becoming commonplace. RTX, as it exists now, requires you to step back on both.

As well, after two years people were likely hoping 4K ~60 fps would be reachable by mid-tier cards, when in reality the price to reach it has remained largely the same.

Yeah, I can understand that, but vilifying Nvidia specifically for something nobody else is delivering either seems a bit weird to me. The crypto bubble kinda screwed Nvidia, even if it was their own doing, so it seems understandable, if not ideal, that they aren't pushing prices down in a market sector with no effective competition.

Hopefully the 16xx line can fulfill that desire to some extent though.
