SlayVus
Jul 10, 2009
Grimey Drawer

DrDork posted:

Dumping the tensor cores for more CUDA cores is something they very reasonably could do from a design standpoint. But it'd be yet another chip to tape out and worry about yields on, and it would undercut their argument that ray-tracing is The Future, so I doubt we'll see a non-tensor 2080/Ti variant. It also probably wouldn't be any cheaper, because die space is die space, regardless of what you fill it with.

Supposedly the RTX 2070 will have 2304 CUDA cores vs. the 1070's 1920. The RTX 2080 has 2944 (vs. 2560) and the 2080 Ti will have 4352 (vs. 3584). Cutting off the tensor cores would be all you'd have to do. The 2070 has 20% more CUDA cores, the 2080 15% more, and the Ti 21.4% more. Die space is die space, yes, but the 2080 Ti's die is supposedly going to be 60.08% larger than the 1080 Ti's, and the major portion of that extra 60% is going to be tensor cores. If you're cutting 40% of the die off because of the tensor cores, you're fitting, what, 20-30% more dies on a single wafer?

SlayVus fucked around with this message at 22:04 on Sep 11, 2018
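A quick back-of-the-envelope check on that wafer math (a rough sketch, assuming a 300mm wafer, the widely reported 471mm² GP102 die, and a rumored ~754mm² TU102, which matches the 60.08% figure above; it ignores edge loss, scribe lines, and defect yield):

```python
# Naive dies-per-wafer estimate: usable wafer area / die area.
# Treat strictly as napkin math -- no edge loss, no yield model.
import math

wafer_area = math.pi * (300 / 2) ** 2  # 300mm wafer, ~70,686 mm^2

full_die = 754  # rumored 2080 Ti (TU102) die size, mm^2
cut_die = 471   # 1080 Ti (GP102) die size, mm^2 -- stand-in for a tensor-free part

dies_full = wafer_area // full_die  # ~93
dies_cut = wafer_area // cut_die    # ~150
print(f"{(dies_cut / dies_full - 1) * 100:.0f}% more dies per wafer")  # ~61%
```

If anything, the naive estimate comes out higher than 20-30%, though a real layout can't excise the tensor blocks that cleanly.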


Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

VelociBacon posted:

I would think the whole architecture of the chip is going to be radically different for ray-tracing cards, so they wouldn't want to split their own market and have people buying GTX 2080s for less than RTX ones.

It would be like shipping cards without PhysX support; it just doesn't make any sense when you're counting on developers to buy into your new gimmick. If Nvidia ever allows a G-Sync monitor to also support FreeSync on the same SKU, then I might believe they'll forgo their proprietary tech to give people more options. But that runs counter to everything they've ever done as a business; they add reasons to lock you into their brand, not subtract them.

repiv
Aug 13, 2009

Videocardz posted some new Turing architecture info.

Mesh Shading sounds like AMD's Primitive Shaders. Maybe NV's version will actually work!

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!
I don't think we can say that just adding more CUDA cores will actually yield big performance gains past a certain point. I'm sure there's a point where performance scaling per CUDA core starts to drop significantly; we saw the same thing with AMD's chips, and I have no reason to believe that Nvidia's chips don't have their own limits.

SlayVus
Jul 10, 2009
Grimey Drawer

AVeryLargeRadish posted:

I don't think we can say that just adding more CUDA cores will actually yield big performance gains past a certain point. I'm sure there's a point where performance scaling per CUDA core starts to drop significantly; we saw the same thing with AMD's chips, and I have no reason to believe that Nvidia's chips don't have their own limits.

If anything, so far it looks like Nvidia is being hamstrung by memory bandwidth. Just compare the 1080 Ti to the Titan XP: almost exactly the same core configuration except for ROPs, yet the Ti can outperform the Titan XP thanks to its superior memory bandwidth. If the new cards are going to have GDDR6 and the 2080 Ti really gets 616 GB/s, that's a 27.27% increase in memory bandwidth on top of its 21.4% more CUDA cores.
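Checking those uplifts against the 1080 Ti's published figures (a quick sketch; 484 GB/s and 3584 cores for the 1080 Ti are the known specs, the 2080 Ti numbers are the rumored ones above):

```python
# Verify the claimed 2080 Ti uplifts over the 1080 Ti.
bw_old, bw_new = 484, 616          # GB/s: GDDR5X vs rumored GDDR6
cores_old, cores_new = 3584, 4352  # CUDA cores

print(f"bandwidth:  +{(bw_new / bw_old - 1) * 100:.2f}%")       # +27.27%
print(f"CUDA cores: +{(cores_new / cores_old - 1) * 100:.1f}%")  # +21.4%
```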

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
NVIDIA improved delta compression again too.

quote:

Turing architecture brings new lossless compression techniques. NVIDIA claims that their further improvements to ‘state of the art’ Pascal algorithms have provided (in NVIDIA’s own words) a ‘50% increase in effective bandwidth on Turing compared to Pascal’.
https://videocardz.com/77895/the-new-features-of-nvidia-turing-architecture
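For anyone wondering what delta compression actually does, here's a toy sketch of the core idea (real GPU delta color compression works on 2D pixel tiles with several encoding modes and is lossless; this only shows why small deltas compress better than raw values):

```python
# Toy delta encoding: store one base value per tile plus small
# per-pixel deltas. Small deltas need fewer bits than full values,
# which is where the effective-bandwidth win comes from.

def delta_encode(tile):
    base = tile[0]
    return base, [v - base for v in tile[1:]]

def delta_decode(base, deltas):
    return [base] + [base + d for d in deltas]

tile = [200, 201, 199, 200, 202]           # smoothly varying color channel
base, deltas = delta_encode(tile)
print(base, deltas)                        # 200 [1, -1, 0, 2]
assert delta_decode(base, deltas) == tile  # lossless round trip
```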

EdEddnEddy
Apr 5, 2012



So the odds that Turing is 30%+ faster than Pascal, especially once the drivers are actually cooked, wouldn't be out of the realm of possibility, all things considered.

Man, the wait to see is always a bummer. :(

ChaseSP
Mar 25, 2013



Personally I'm looking forward to the second-generation ray-tracing cards that will hopefully be more affordable.

cowofwar
Jul 30, 2002

by Athanatos
Yeah this gen is a hard skip

Worf
Sep 12, 2017

If only Seth would love me like I love him!

cowofwar posted:

Yeah this gen is a hard skip

Same, but largely because $2k in video cards in a 2-3 year span is pretty stupid money.

ChaseSP
Mar 25, 2013



Plenty of reasons: horrible cost/performance ratios, and groundbreaking tech likely to go through teething phases. Damn if it won't make some sick-looking gameplay streams though.

SwissArmyDruid
Feb 14, 2014

by sebmojo
...yes. Because Twitch compression doesn't squash the shit out of subtle detail or anything.

Anime Schoolgirl
Nov 28, 2002

6000kbps looks great!!!!! ...in 540p
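The napkin math behind the joke (a rough sketch, assuming 60fps; bits-per-pixel is a crude proxy for how hard the encoder has to squash detail):

```python
# Bits available per pixel per frame at a fixed stream bitrate.
def bits_per_pixel(bitrate_kbps, width, height, fps=60):
    return bitrate_kbps * 1000 / (width * height * fps)

for name, w, h in [("540p", 960, 540), ("720p", 1280, 720), ("1080p", 1920, 1080)]:
    print(f"{name}: {bits_per_pixel(6000, w, h):.3f} bpp")
# 540p: 0.193, 720p: 0.109, 1080p: 0.048 -- at 6Mbps, 1080p60 gets
# a quarter of the bits per pixel that 540p does.
```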

Cygni
Nov 12, 2005

raring to post

i dunno, i thought raytracing looked pretty cool in that shitty compressed-ass stream i watched the presentation on

VelociBacon
Dec 8, 2009

Probably the kind of thing you don't notice because it's working.

SwissArmyDruid
Feb 14, 2014

by sebmojo
I agree that they were able to convey that it was working, but those were largely static shots, either still or close to it, and nothing like what you'd expect realistic FPS gameplay to actually look like.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
So is ray tracing related to the thing where the game renders at a low resolution, but then has AI upscale it for “cheap” and fills in the blanks? Are both those things coming out at the same time?

Craptacular!
Jul 9, 2001

Fuck the DH

Statutory Ape posted:

Same, but largely because $2k in video cards in a 2-3 year span is pretty stupid money.

Yep, anyone who bought a video card in the past year saw such crazy prices that they either couldn't afford one until just now (and couldn't afford RTX anyway), or spent so much money on a card that has since cratered in value that they aren't exactly looking to spend more.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

buglord posted:

So is ray tracing related to the thing where the game renders at a low resolution, but then has AI upscale it for “cheap” and fills in the blanks? Are both those things coming out at the same time?

No, the AI upscaling is its own thing, though it uses the same tensor cores as the AI denoising in Nvidia's ray-tracing tech demos. Both ray tracing and AI upscaling have to be implemented on a game-by-game basis, so it's hard to say exactly when games with these features will be available, though I would assume they will start showing up alongside the launch of the new cards or soon after.
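A conceptual sketch of the upscaling idea (this is not Nvidia's DLSS, whose internals are proprietary; a plain bilinear filter stands in for the trained network, just to show where the step sits in the frame pipeline):

```python
# Render at a lower internal resolution, then reconstruct a larger
# frame. In DLSS the reconstruction is a trained neural network run
# on the tensor cores; bilinear interpolation here is only a stand-in.

def upscale_bilinear(img, out_w, out_h):
    """img: list of rows of grayscale floats (the low-res 'render')."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for y in range(out_h):
        fy = y * (in_h - 1) / (out_h - 1)
        y0 = int(fy); ty = fy - y0; y1 = min(y0 + 1, in_h - 1)
        row = []
        for x in range(out_w):
            fx = x * (in_w - 1) / (out_w - 1)
            x0 = int(fx); tx = fx - x0; x1 = min(x0 + 1, in_w - 1)
            top = img[y0][x0] * (1 - tx) + img[y0][x1] * tx
            bot = img[y1][x0] * (1 - tx) + img[y1][x1] * tx
            row.append(top * (1 - ty) + bot * ty)
        out.append(row)
    return out

low_res = [[0.0, 1.0], [1.0, 0.0]]       # a tiny "rendered" frame
frame = upscale_bilinear(low_res, 4, 4)  # reconstructed at 2x
print(frame[0])                          # [0.0, 0.333..., 0.666..., 1.0]
```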

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin
There was an interview at Gamescom where an SVP at Nvidia explained that almost all developers send Nvidia pre-release builds of their games anyway, so that in itself shouldn't be any kind of barrier to DLSS support.

https://youtu.be/m5WJCIe0ihc

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Yes, you need to send your game to NVIDIA anyway for driver tuning.

quote:

Maintaining the drivers with the current wide surface area is tricky. Although AMD and NV have the resources to do it, the smaller IHVs (Intel, PowerVR, Qualcomm, etc.) simply cannot keep up with the necessary investment. More importantly, explaining to devs the correct way to write their render pipelines has become borderline impossible. There are too many failure cases. It's been understood for quite a few years now that you cannot max out the performance of any given GPU without having someone from NVIDIA or AMD physically grab your game source code, load it on a dev driver, and do a hands-on analysis. These are the vanishingly few people who have actually seen the source to a game, the driver it's running on, the Windows kernel it's running on, and the full specs for the hardware. Nobody else has that kind of access or engineering ability.
https://www.gamedev.net/forums/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/#entry5215019

Truga
May 4, 2014
Lipstick Apathy
yeah, proprietary software sucks, news at 11

ufarn
May 30, 2009
Someone, somewhere, brought up EVGA misrepresenting ICX2 for their GPUs, and GN did a video on it:

https://www.youtube.com/watch?v=yRjfxTu-a-I

Icept
Jul 11, 2001
Is there a reason that AMD has not seen (at least publicly) the blowback from the crypto-bubble pop that Nvidia did, where Nvidia agreed to reclaim a significant number of chips from its AIB partners?

I assume that the miners were purchasing every GPU available, with either AMD or Nvidia chips. Is it because AMD's production was simply not at the same output as Nvidia's, and therefore the AIB partners were left holding fewer chips at the time of the "collapse"?

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf

ufarn posted:

Someone, somewhere, brought up EVGA misrepresenting ICX2 for their GPUs, and GN did a video on it:

https://www.youtube.com/watch?v=yRjfxTu-a-I

Lol wasn't it a goon ITT?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Icept posted:

Is there a reason that AMD has not seen (at least publicly) the blowback from the crypto-bubble pop that Nvidia did, where Nvidia agreed to reclaim a significant number of chips from its AIB partners?

I assume that the miners were purchasing every GPU available, with either AMD or Nvidia chips. Is it because AMD's production was simply not at the same output as Nvidia's, and therefore the AIB partners were left holding fewer chips at the time of the "collapse"?

You pretty much already got it: AMD wasn't able to ramp up production for various reasons, so they didn't have a bubble in the same sense. AMD's partners were left holding basically no chips, because they never managed to produce enough to result in a glut.

TacticalHoodie
May 7, 2007

I was going to Step-Up the EVGA 1080 Ti SC I got yesterday to a 2080, but I am just going to stick with it and skip Turing. I am getting 144 FPS now in the games I am playing, and with G-Sync I am not going to benefit from anything higher than that on my monitor. Also, EVGA support is killer. The rep I had last night helped me fix my issues with Ryzen stuttering on my new card (disable SMT and don't install the HD Audio drivers) and was able to figure out I was dropping packets on my internet connection too. I am considering getting their CPU AIO and a case from them because the support alone is worth the price.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
That support almost sounds too good to be true. Did you get that kind of help for one specific game, or was it a number of different issues you brought up separately?

Icept
Jul 11, 2001

DrDork posted:

You pretty much already got it: AMD wasn't able to ramp up production for various reasons, so they didn't have a bubble in the same sense. AMD's partners were left holding basically no chips, because they never managed to produce enough to result in a glut.

Alright, bit of a silver lining for AMD I guess :)

Lord Ludikrous
Jun 7, 2008

Enjoy your tea...

Found a Gigabyte 1070 Ti Founders Edition on eBay for £230 Buy It Now. Seems good?

1gnoirents
Jun 28, 2014

hello :)

AEMINAL posted:

Lol wasn't it a goon ITT?

I came here to post this, and yeah, it was, lol. He got a thorough, seemingly hand-written response from EVGA too.

ughhhh
Oct 17, 2012

Whiskey A Go Go! posted:

Also, EVGA [...] I am considering getting their CPU AIO and a case from them because the support alone is worth the price.

The new case sucks and is huge. The Hadron is a cool little ITX case (I don't think it's for sale anymore), but it comes with a built-in PSU that has a dinky little 40mm fan (I ended up removing mine and turning the drive cage into a slot for an SFX PSU).

EVGA's PSUs, GPUs, and some motherboards are excellent, but the other stuff is kinda lacking. It's also weird, since I have the ITX case but they never sold an ITX mobo for the Z370. I even waited a couple of months hoping they would make one, but they never did.

TacticalHoodie
May 7, 2007

Sidesaddle Cavalry posted:

That support is almost sounding too good to be true. Did you get that kind of help for one specific game or was it a number of different issues you brought up separately?

It kinda just came up when I was testing Battlefield 1 with him. He wanted to see if the FPS was dropping over and over again like before, causing my stuttering, and it turned out it was happening only in multiplayer, not in single player. We ran a loop ping test and found out that the internet connection was also causing stuttering by dropping packets. It was a nice above-and-beyond thing he did.
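That kind of loop ping test is easy to reproduce at home (a minimal sketch assuming a Unix-like `ping` with -c/-W flags; on Windows the flags are -n/-w, and a real diagnosis would use something like mtr):

```python
# Fire single pings in a loop and count lost replies. A non-zero
# exit code from ping means no reply arrived within the timeout.
import subprocess

def packet_loss(host="8.8.8.8", count=100):
    lost = 0
    for _ in range(count):
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "1", host],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        )
        if result.returncode != 0:
            lost += 1
    return 100 * lost / count

print(f"packet loss: {packet_loss():.1f}%")
```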

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
I can only dream of techs/reps with the kind of expertise to solve one's own personal gaming hiccups, and I almost feel like I want to whine to EVGA about the max overclocks of my refurb 980 Ti just for the experience :v:

ughhhh posted:

The new case sucks and is huge. The Hadron is a cool little ITX case (I don't think it's for sale anymore), but it comes with a built-in PSU that has a dinky little 40mm fan (I ended up removing mine and turning the drive cage into a slot for an SFX PSU).

Can't remember if we had a case thread here, but those DG-8x cases seemed like they were at least nice for having a full compartment in the rear for 2x120mm fans/240mm rads. What's wrong with them?

ufarn
May 30, 2009

AEMINAL posted:

Lol wasn't it a goon ITT?
It was in some thread, but I couldn't find the post by searching for "ICX" when I posted it. Only went back like five pages; guess the GPU thread is just way more active than I give it credit for, even during the downtime.

ughhhh
Oct 17, 2012

Sidesaddle Cavalry posted:

Can't remember if we had a case thread here, but those DG-8x cases seemed like they were at least nice for having a full compartment in the rear for 2x120mm fans/240mm rads. What's wrong with them?

It looks like an old CRT TV and is just as big. Also has the EVGA logo plastered all over it.

EdEddnEddy
Apr 5, 2012



Sidesaddle Cavalry posted:

I can only dream of techs/reps with the kind of expertise to solve one's own personal gaming hiccups, and I almost feel like I want to whine to EVGA about the max overclocks of my refurb 980 Ti just for the experience :v:


Can't remember if we had a case thread here, but those DG-8x cases seemed like they were at least nice for having a full compartment in the rear for 2x120mm fans/240mm rads. What's wrong with them?

Damn, techs like that are few and far between. That is the sort of troubleshooting I enjoy doing: getting a visible fix not only for the core problem, but also identifying and fixing problems the user may not even have known they had.

Captain Yossarian
Feb 24, 2011

All new" Rings of Fire"
Something was wrong on this website and I demand they give me a better thing for free, and furthermore

Aeka 2.0
Nov 16, 2000

:ohdear: Have you seen my apex seals? I seem to have lost them.




Dinosaur Gum
This is my first time getting an Nvidia-branded card. I usually go for EVGA, but the cost is a bit much. My question is: are the 2080 Ti Nvidia-branded cards binned? Do they have overclocking potential, or is it a big "we don't know" at this point?


AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Aeka 2.0 posted:

This is my first time getting an Nvidia-branded card. I usually go for EVGA, but the cost is a bit much. My question is: are the 2080 Ti Nvidia-branded cards binned? Do they have overclocking potential, or is it a big "we don't know" at this point?

They are almost certainly binned, but we don't know how well they will OC in general.
