|
DrDork posted:Dumping the tensor cores for more CUDA cores is something they very reasonably could do from a design standpoint. But it'd be yet another chip to tape out and worry about yields and such, and it would undercut their argument that ray-tracing is The Future, so I doubt we'll see a non-tensor 2080/Ti variant. It also probably wouldn't be any cheaper, because die space is die space, regardless of what you fill it with. Supposedly the RTX 2070 will have 2304 CUDA cores vs the 1070's 1920. The RTX 2080 has 2944 (vs 2560) and the 2080 Ti will have 4352 (vs 3584). Cutting the tensor cores off is all you would have to do. The 2070 has 20% more CUDA cores, the 2080 has 15% more, and the Ti has 21.4% more. Die space is die space, yes, but the 2080 Ti is supposedly going to have a 60.08% larger die than the 1080 Ti, and the major portion of that 60% is going to be tensor cores. If you're cutting 40% of the die off because of tensor cores, you're fitting, what, 20-30% more dies on a single wafer? SlayVus fucked around with this message at 22:04 on Sep 11, 2018
# ? Sep 11, 2018 22:01 |
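That wafer guess can be sanity-checked with the standard first-order dies-per-wafer estimate. A minimal sketch, assuming the rumored die sizes (TU102 at roughly 754 mm², and a hypothetical tensor-free variant at 60% of that) and a 300 mm wafer; defect yield is ignored:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order estimate: wafer area divided by die area,
    minus a correction term for partial dies lost at the wafer edge.
    Ignores defect yield and scribe lines."""
    radius = wafer_diameter_mm / 2.0
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(gross - edge_loss)

full_die = 754.0               # rumored TU102 area in mm^2 (assumption)
trimmed_die = full_die * 0.6   # hypothetical die with ~40% (tensor/RT area) cut

print(dies_per_wafer(full_die), dies_per_wafer(trimmed_die))
```

By this rough estimate the trimmed die fits quite a bit more than 20-30% extra candidates per wafer, and smaller dies also tend to yield better, which only widens the gap.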
|
|
|
VelociBacon posted:I would think the whole architecture of the chip is going to be radically different for ray tracing cards so they wouldn't want to split their own market and have people buying GTX 2080s for less than RTX ones. It would be like shipping cards without PhysX support; it just doesn't make any sense when you're counting on developers to buy into your new gimmick. If Nvidia ever allows a G-Sync monitor to support FreeSync as well on the same SKU, then I might believe they'll forgo their proprietary tech to give people more options. But that runs counter to everything they've ever done as a business; they add reasons to lock you in to their brand, not subtract them.
|
# ? Sep 11, 2018 22:04 |
|
Videocardz posted some new Turing architecture info. Mesh Shading sounds like AMD's Primitive Shaders. Maybe NV's version will actually work!
|
# ? Sep 11, 2018 22:23 |
I don't think we can say that just adding more CUDA cores will actually yield big performance gains past a certain point; there's bound to be a point where performance scaling per CUDA core drops significantly. We saw the same thing with AMD's chips, and I have no reason to believe that Nvidia's chips don't have their own limits.
|
|
# ? Sep 11, 2018 23:17 |
|
AVeryLargeRadish posted:I don't think we can say that just adding more CUDA cores will actually yield big performance gains past a certain point; there's bound to be a point where performance scaling per CUDA core drops significantly. We saw the same thing with AMD's chips, and I have no reason to believe that Nvidia's chips don't have their own limits. If anything, so far it looks like Nvidia is being hamstrung by memory bandwidth. Just compare the 1080 Ti vs the Titan XP: almost exactly the same core configuration except for ROPs, yet the Ti can outperform the Titan XP by using its superior memory bandwidth. If the new cards are going to have GDDR6 and the 2080 Ti supposedly gets 616 GB/s, that's a 27.27% increase in memory bandwidth on top of its 21.4% more CUDA cores.
|
# ? Sep 11, 2018 23:37 |
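Those percentages check out against the specs being quoted. A quick sanity check, assuming 484 GB/s for the 1080 Ti (11 Gbps GDDR5X on a 352-bit bus) and the rumored 616 GB/s for the 2080 Ti (14 Gbps GDDR6, same bus width):

```python
def pct_increase(new: float, old: float) -> float:
    """Percentage increase of new over old."""
    return (new / old - 1.0) * 100.0

# Memory bandwidth = bus width (bits) * transfer rate / 8 bits per byte
gtx_1080_ti_bw = 352 * 11e9 / 8 / 1e9   # 484.0 GB/s
rtx_2080_ti_bw = 352 * 14e9 / 8 / 1e9   # 616.0 GB/s

print(round(pct_increase(rtx_2080_ti_bw, gtx_1080_ti_bw), 2))  # bandwidth gain
print(round(pct_increase(4352, 3584), 1))                      # CUDA core gain
```

Both the 27.27% bandwidth figure and the 21.4% CUDA core figure fall straight out of the quoted numbers.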
|
NVIDIA improved delta compression again too. quote:Turing architecture brings new lossless compression techniques. NVIDIA claims that their further improvements to 'state of the art' Pascal algorithms have provided (in NVIDIA's own words) a '50% increase in effective bandwidth on Turing compared to Pascal'.
|
# ? Sep 11, 2018 23:47 |
|
So the odds that Turing is 30%+ faster than Pascal, especially once the drivers are actually cooked, would not be out of the realm of possibility, all things considered. Man, the wait to see is always a bummer.
|
# ? Sep 11, 2018 23:51 |
|
Personally I'm looking forward to the second generation raytracing cards that will hopefully be more affordable.
|
# ? Sep 12, 2018 00:06 |
|
Yeah this gen is a hard skip
|
# ? Sep 12, 2018 00:07 |
|
cowofwar posted:Yeah this gen is a hard skip Same but largely because $2k in video cards in 2-3 year span is pretty stupid money
|
# ? Sep 12, 2018 00:45 |
|
Plenty of reasons: horrible cost/performance ratios, and groundbreaking tech likely to go through teething phases. drat if it won't make some sick looking gameplay streams though.
|
# ? Sep 12, 2018 03:12 |
|
...yes. Because Twitch compression doesn't squash the poo poo out of subtle detail or anything.
|
# ? Sep 12, 2018 03:30 |
|
6000kbps looks great!!!!! ...in 540p
|
# ? Sep 12, 2018 03:46 |
|
i dunno i thought raytracing looked pretty cool in that lovely compressed rear end stream i watched the presentation on
|
# ? Sep 12, 2018 03:48 |
|
The kind of thing you don't notice because it's working, probably.
|
# ? Sep 12, 2018 03:54 |
|
I agree that they were able to convey that it was working, but those were largely static shots, either still, or close to still, and nothing like what you'd expect realistic gameplay of FPSes to actually be like.
|
# ? Sep 12, 2018 05:11 |
|
So is ray tracing related to the thing where the game renders at a low resolution, but then has AI upscale it for “cheap” and fills in the blanks? Are both those things coming out at the same time?
|
# ? Sep 12, 2018 05:51 |
|
Statutory Ape posted:Same but largely because $2k in video cards in 2-3 year span is pretty stupid money Yep, anyone who bought a video card in the past year has seen such crazy prices that either they couldn’t afford one until just now and couldn’t afford RTX anyway, or just spent so much money on a card that just cratered in depreciation and isn’t exactly looking for more.
|
# ? Sep 12, 2018 06:11 |
buglord posted:So is ray tracing related to the thing where the game renders at a low resolution, but then has AI upscale it for “cheap” and fills in the blanks? Are both those things coming out at the same time? No, the AI upscaling is its own thing, though it uses the same tensor cores as the AI denoising in Nvidia's ray tracing tech demos. Both ray tracing and the AI upscaling have to be implemented on a game-by-game basis, so it's hard to say exactly when games with these features will be available, though I would assume they'll start showing up alongside the launch of the new cards or soon after.
|
|
# ? Sep 12, 2018 06:26 |
|
There was an interview at Gamescom where an SVP at Nvidia explained that almost all developers send Nvidia pre-release builds of their games anyway, so that in itself shouldn't be any kind of barrier to DLSS support. https://youtu.be/m5WJCIe0ihc
|
# ? Sep 12, 2018 10:41 |
|
Yes, you need to send your game to NVIDIA anyway for driver tuning. quote:Maintaining the drivers with the current wide surface area is tricky. Although AMD and NV have the resources to do it, the smaller IHVs (Intel, PowerVR, Qualcomm, etc) simply cannot keep up with the necessary investment. More importantly, explaining to devs the correct way to write their render pipelines has become borderline impossible. There's too many failure cases. It's been understood for quite a few years now that you cannot max out the performance of any given GPU without having someone from NVIDIA or AMD physically grab your game source code, load it on a dev driver, and do a hands-on analysis. These are the vanishingly few people who have actually seen the source to a game, the driver it's running on, the Windows kernel it's running on, and the full specs for the hardware. Nobody else has that kind of access or engineering ability.
|
# ? Sep 12, 2018 10:53 |
|
yeah, proprietary software sucks, news at 11
|
# ? Sep 12, 2018 10:56 |
|
Someone, somewhere, brought up EVGA misrepresenting ICX2 for their GPUs, and GN did a video on it: https://www.youtube.com/watch?v=yRjfxTu-a-I
|
# ? Sep 12, 2018 11:12 |
|
Is there a reason that AMD has not seen (at least publicly) the blowback from the cryptobubble pop that Nvidia did, where they agreed to reclaim a significant number of chips from the AIB partners? I assume that the miners were purchasing every GPU available, whether with AMD or Nvidia chips. Is it because AMD's production was simply not at the same output as Nvidia's, and therefore the AIB partners were left holding fewer chips at the time of the "collapse"?
|
# ? Sep 12, 2018 13:49 |
|
ufarn posted:Someone, somewhere, brought up EVGA misrepresenting ICX2 for their GPUs, and GN did a video on it: Lol wasn't it a goon ITT?
|
# ? Sep 12, 2018 13:53 |
|
Icept posted:Is there a reason that AMD has not seen (at least publicly) the blowback to the cryptobubble pop as Nvidia did, where they agreed to reclaim a significant number of chips from the AIB partners? You pretty much already got it: AMD wasn't able to ramp up production for various reasons, so they didn't have a bubble in the same sense. AMD's partners were left holding basically no chips, because they never managed to produce enough to result in a glut.
|
# ? Sep 12, 2018 13:57 |
I was going to step up from the EVGA 1080 Ti SC I got yesterday to a 2080, but I am just going to stick with it and skip Turing. I am getting 144 FPS now in the games I am playing, and with G-Sync I am not going to benefit from anything higher than that on my monitor. Also, EVGA support is killer. The rep I had last night helped me fix my issues with Ryzen stuttering on my new card (disable SMT and don't install the HD Audio drivers) and was able to figure out I was dropping packets on my internet connection too. I am considering getting their CPU AIO and a case from them because the support alone is worth the price.
|
|
# ? Sep 12, 2018 14:20 |
|
That support is almost sounding too good to be true. Did you get that kind of help for one specific game or was it a number of different issues you brought up separately?
|
# ? Sep 12, 2018 14:24 |
|
DrDork posted:You pretty much already got it: AMD wasn't able to ramp up production for various reasons, so they didn't have a bubble in the same sense. AMD's partners were left holding basically no chips, because they never managed to produce enough to result in a glut. Alright, bit of a silver lining for AMD I guess
|
# ? Sep 12, 2018 14:30 |
|
Found a Gigabyte 1070 Ti Founders Edition on eBay for £230 buy it now. Seems good?
|
# ? Sep 12, 2018 14:32 |
|
AEMINAL posted:Lol wasn't it a goon ITT? I came here to post this, and yeah, it was, lol. He got a thorough, seemingly hand-written response from EVGA too
|
# ? Sep 12, 2018 15:14 |
|
Whiskey A Go Go! posted:Also EVGA I am considering getting their CPU AIO and a case from them because the support alone is worth the price. The new case sucks and is huge. The Hadron is a cool little ITX case (I don't think it's for sale anymore), but it comes with a built-in PSU that has a dinky little 40mm fan (I ended up removing mine and turning the drive cage into a slot for an SFX PSU). EVGA's PSUs, GPUs, and some motherboards are excellent, but the other stuff is kinda lacking. It's also weird since I have the ITX case but they never sold an ITX mobo for the Z370. I even waited a couple of months hoping they would make one, but they never did.
|
# ? Sep 12, 2018 15:16 |
Sidesaddle Cavalry posted:That support is almost sounding too good to be true. Did you get that kind of help for one specific game or was it a number of different issues you brought up separately? It kinda just came up when I was testing in Battlefield 1 with him. He wanted to see if the FPS was dropping over and over again like before to cause my stuttering, and it was happening only in multiplayer games but not in single-player. Ran a loop ping test and found out that the internet connection was also causing stuttering issues by dropping packets. It was a nice above-and-beyond thing he did.
|
|
# ? Sep 12, 2018 15:29 |
|
I can only dream of techs/reps with the kind of expertise to solve one's own personal gaming hiccups, and I almost feel like I want to whine to EVGA about the max overclocks of my refurb 980 Ti just for the experience ughhhh posted:The new case sucks and is huge. The Hadron is a cool little ITX case (I don't think it's for sale anymore), but it comes with a built-in PSU that has a dinky little 40mm fan (I ended up removing mine and turning the drive cage into a slot for an SFX PSU). Can't remember if we had a case thread here, but those DG-8x cases seemed like they were at least nice for having a full compartment in the rear for 2x120mm fans/240 rads. What's wrong with them?
|
# ? Sep 12, 2018 16:20 |
|
AEMINAL posted:Lol wasn't it a goon ITT?
|
# ? Sep 12, 2018 16:21 |
|
Sidesaddle Cavalry posted:Can't remember if we had a case thread here but those DG-8x cases seemed like they were at least nice for having a full compartment in the rear for 2x120mm fans/240 rads. What's wrong with them? It looks like an old CRT TV and is just as big. Also has the EVGA logo plastered all over it.
|
# ? Sep 12, 2018 16:44 |
|
Sidesaddle Cavalry posted:I can only dream of techs/reps with the kind of expertise to solve one's own personal gaming hiccups and I almost feel like I want to whine to EVGA about the max overclocks of my refurb 980ti just for the experience drat, techs like that are few and far between. That is the sort of troubleshooting I enjoy doing: you get a visible fix not only for the core problem, but you also identify and fix problems users may not even know they had.
|
# ? Sep 12, 2018 17:15 |
|
Something was wrong on this website and I demand they give me a better thing for free, and furthermore
|
# ? Sep 12, 2018 17:37 |
|
This is my first time getting an Nvidia-branded card. I usually go for the EVGA, but the cost is a bit much. My question is: are the 2080 Ti Nvidia-branded cards binned? Do they have overclocking potential, or is it a big "we don't know" at this point?
|
# ? Sep 12, 2018 17:39 |
|
|
Aeka 2.0 posted:This is my first time getting an Nvidia branded card. I usually go for the EVGA but the cost is a bit much. My question is are the 2080ti Nvidia branded cards binned, do they have overclocking potential or is it a big "we don't know" at this point? They are almost certainly binned, but we don't know how well they will OC in general.
|
|
# ? Sep 12, 2018 17:42 |