|
How plausible is the rumour of Nvidia being short on GDDR5?
|
# ? Sep 5, 2017 11:26 |
|
|
|
The 3GB 1060 is not bad at the right price. At $180 (or in my case, $200 with Rocket League) for a 1080p player I'd have no problem with it. Any more and I'd take a $250 6GB, but those don't really exist. The 3GB card doesn't do badly if you're aware of which settings melt performance for the smallest visual difference, such as depth of field in Destiny 2, high resolution shadows in GTA5, enhanced smoke/fog in Arkham Knight, etc. Recognize these things and turn them off and it does really well for now. The only real "nothing helps" scenario I've found is running into town in Witcher 3. Benchmarks generally agree, with the average frame rate being very close but the 1% dips being much more dramatic. Whereas the 1050 Ti is just not a meaningful step forward from many cards of four years ago. Craptacular! fucked around with this message at 11:33 on Sep 5, 2017 |
# ? Sep 5, 2017 11:31 |
|
The RX 560 with 4 GB is still going to be twice as fast (and more) than a 6850. The GTX 1050 and 1050 Ti even more so. It really depends on how much someone wants to spend and what games they want to play/how tolerant they are of lower settings. The common view is that it's not worth saving another 30 or 50 bucks to downgrade entry level cards to even lower end ones and lose another 25%+ of "already low" performance, but someone else might not be bothered by the performance hit and could spend those 30 bucks on something else that's important. Granted, it's a small niche, but that doesn't make the chip outright bad or useless (unlike, say, the RX 550 and GT 1030). That said, I do agree that you want to get at least a 1050 Ti for gaming anything remotely modern in 1080p. Or an RX 570 I guess, if buttcoin mining on GPUs didn't exist.
|
# ? Sep 5, 2017 11:44 |
|
Scarecow posted:Errrr what Fairly sure that's pretty much bang on.
|
# ? Sep 5, 2017 12:02 |
|
HalloKitty posted:Fairly sure that's pretty much bang on. Yeah, looking at benchmarks I did of my 980 Tis vs my 1080 Tis you're right, man, here I was thinking it was only like 10%
|
# ? Sep 5, 2017 12:29 |
|
VostokProgram posted:Is an RX 560 or a GTX 1050ti a noticeable upgrade from an HD 6850? And how much would either of those cost if the market wasn't hosed right now? Even if the difference in specifications isn't that great, the fact that Terascale driver development has been dead for years means either of those cards will beat an HD 6850 silly. From my experience even something as old as a GeForce GTX 460 smacks an HD 6850 around in newer games. Nvidia is still putting a tiny amount of effort into Fermi drivers. AMD is quick to abandon older architectures. I'm pretty sure they quit trying on Terascale support while still selling low-end cards on the architecture.
|
# ? Sep 5, 2017 14:20 |
|
SwissArmyDruid posted:What was that alleged number that people were saying that Vega 64 could do in terms of cryptocurrency? 48, right? I thought people were saying it could do around 70? And everyone here laughed about it.
|
# ? Sep 5, 2017 15:06 |
|
SwissArmyDruid posted:What was that alleged number that people were saying that Vega 64 could do in terms of cryptocurrency? 48, right? Before launch, Gibbo was saying that it could do 70 MH/s, but it looks like that was (at best) a wild-rear end guess of what a card running at optimal settings with a tuned kernel could do, and at worst he was lying to drum up sales. 48 MH/s is the most I think anyone has ever gotten from it. He really burned a lot of credibility/goodwill with this whole launch - although his comments about the existence of a manufacturer's rebate appear to have been more-or-less confirmed by other retailers.
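A back-of-envelope check makes the 70 MH/s figure look implausible on its face. Ethash is memory-bound (each hash reads 64 mixes of 128 bytes from the DAG), so Vega 64's published ~483.8 GB/s of HBM2 bandwidth puts a hard ceiling on hashrate. This is a rough bound under those assumptions, not a claim about any particular tuned kernel:

```python
# Back-of-envelope: Ethash is memory-bound, so peak memory bandwidth caps hashrate.
# 483.8 GB/s is Vega 64's published HBM2 bandwidth; Ethash reads 64 x 128 B of DAG per hash.
bandwidth_gb_s = 483.8
bytes_per_hash = 64 * 128            # 8192 bytes of DAG traffic per hash

max_mh_s = bandwidth_gb_s * 1e9 / bytes_per_hash / 1e6
print(round(max_mh_s, 1))            # ~59.1 MH/s theoretical ceiling
```

By this bound, ~48 MH/s is already about 80% of theoretical peak, and 70 MH/s would exceed what the memory could physically deliver.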
|
# ? Sep 5, 2017 16:24 |
|
Craptacular! posted:The 3GB 1060 is not bad at the right price. At $180 (or in my case, $200 with Rocket League) for a 1080p player I'd have no problem with it. Any more and I'd take a $250 6GB, but those don't really exist. Towns in Witcher 3 are just very problematic in general, and there also appears to be a substantial CPU/memory-bandwidth component to it. I think it's one of those situations that probably benefits from running from an SSD (to make load-ins as fast as possible), plus fast RAM, plus at least a quad-core CPU with decent single-thread performance (Ryzen, Haswell, or Skylake/Kaby Lake). Also, you may want to turn Hairworks down to Geralt Only if it's enabled, otherwise I'm guessing all the townspeople get super-tessellated wind-tunnel hair. Fortunately it's not really a big deal in terms of gameplay; you might get a little bit of stutter if you're running through and a lot of stuff is loading in, but once you're actually in an area it's pretty stable, and there are relatively few combat sequences in cities anyway. But yeah, the 1050 and 1050 Ti underperform badly compared to the 1060 and above (in general, not just TW3). Dunno if the blame falls on Samsung there or what. Paul MaudDib fucked around with this message at 16:39 on Sep 5, 2017 |
# ? Sep 5, 2017 16:33 |
|
Paul MaudDib posted:Before launch, Gibbo was saying that it could do 70 MH/s, but it looks like that was (at best) a wild-rear end guess of what a card running at optimal settings running a tuned kernel could do, and at worst he was lying to drum up sales. 48 MH/s is the most I think anyone has ever gotten from it. GamersNexus and Videocardz also said their sources backed up the 70MH/sec rumour, so it wasn't just Gibbo making poo poo up to sell cards. The bad information came from somewhere further up the rumour chain.
|
# ? Sep 5, 2017 16:34 |
|
ufarn posted:How plausible is the rumour of Nvidia being short on GDDR5? Lol I still can't believe my 1080ti has 11 loving gigs of vram, how the hell am I supposed to use that?
|
# ? Sep 5, 2017 17:17 |
|
Zero VGS posted:Lol I still can't believe my 1080ti has 11 loving gigs of vram, how the hell am I supposed to use that? What, you never went whole hog in modding Skyrim? You haven't lived until every blade of grass is lovingly rendered with 4k textures!
|
# ? Sep 5, 2017 17:21 |
|
Zero VGS posted:Lol I still can't believe my 1080ti has 11 loving gigs of vram, how the hell am I supposed to use that? You're using it in the sense that running 11 memory modules in parallel provides enough bandwidth for the GPU to do its thing. The alternative would be to use 512MB modules and only have 5.5GB overall - better to have too much RAM than too little.
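The bandwidth point can be made concrete with the 1080 Ti's published numbers (11 chips on a 352-bit bus, 11 Gbps GDDR5X):

```python
# Why 11 modules: each GDDR5X chip contributes a 32-bit channel, so more chips
# means a wider bus and more bandwidth, with capacity along for the ride.
modules = 11
bus_width_per_module = 32            # bits per GDDR5X chip channel
data_rate_gbps = 11                  # effective Gbps per pin

total_bus_bits = modules * bus_width_per_module        # 352-bit bus
bandwidth_gb_s = total_bus_bits * data_rate_gbps / 8   # bits -> bytes
print(total_bus_bits, bandwidth_gb_s)                  # 352 484.0
```

Halving the per-chip capacity to 512 MB would keep the same 484 GB/s but leave only 5.5 GB, which is the trade-off the post describes.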
|
# ? Sep 5, 2017 17:26 |
|
Zero VGS posted:Lol I still can't believe my 1080ti has 11 loving gigs of vram, how the hell am I supposed to use that? Your neural networks don't have to be all that big before you've burned through those 11 gigs.
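As a rough illustration of how quickly training eats VRAM: the 16 bytes/parameter figure below assumes fp32 weights, gradients, and two Adam moment buffers, and ignores activations (which often dominate), so treat it as a floor, not a precise estimate:

```python
# Sketch: persistent training state per parameter with Adam in fp32:
# 4 B weights + 4 B gradients + 8 B optimizer moments = 16 B/param,
# before counting activations or framework overhead.
def training_vram_gb(params, bytes_per_param=16):
    return params * bytes_per_param / 1024**3

print(round(training_vram_gb(500e6), 1))   # a 500M-param model: ~7.5 GB of state alone
```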
|
# ? Sep 5, 2017 18:20 |
|
Here's a quick and dirty graph I made of prices in July at the height of the bubble vs now. Prices have definitely come down a fair bit, but they're still $70-100 over their all-time lows. The real insult is that a lot of those all-time lows were nearly 10 months ago. I imagine with the rumors of GDDR5 price increases, because memory is the realm of lovely price fixing and artificially constrained supply, we may never get back down to those lows.
|
# ? Sep 5, 2017 18:30 |
|
fritz posted:Your neural networks don't have to be all that big before you've burned through those 11 gigs. Can neural networks be used to make videogame AI-controlled teammates not retarded?
|
# ? Sep 5, 2017 18:30 |
|
fritz posted:Your neural networks don't have to be all that big before you've burned through those 11 gigs. Also justification for why the Quadro P6000 has 24 gigs. I remember when the Titan Xp launched, Gamers Nexus had a video mentioning how researchers found the extra gig of memory in the Xp versus the 1080 Ti to be worth the price increase.
|
# ? Sep 5, 2017 19:04 |
|
Hey everybody, tweet Linus to buy one of those $100k quad-Volta compute machines to play Crysis on... since they're for sale now
|
# ? Sep 5, 2017 19:20 |
|
1gnoirents posted:Hey everybody tweet linus to buy one of those 100k quad volta compute machines to play crisis on... since theyre for sale now Only quad Volta? lol if you're not gaming on an octo-volta system
|
# ? Sep 5, 2017 19:25 |
|
repiv posted:Only quad Volta? lol if you're not gaming on an octo-volta system Oh yeah, that's the one I meant. So weird that Volta cards are available and yet somehow not available for consumer markets for another 6 months. I wonder how that is, it's a conundrum
|
# ? Sep 5, 2017 19:26 |
|
GDDR6 isn't ready yet, and Nvidia isn't desperate enough to resort to putting HBM2 on a consumer card
|
# ? Sep 5, 2017 19:31 |
|
There probably isn't enough HBM2 availability to satisfy the market for Nvidia cards though.
|
# ? Sep 5, 2017 19:33 |
|
ufarn posted:How plausible is the rumour of Nvidia being short on GDDR5? EVERYONE is short on GDDR5.
|
# ? Sep 5, 2017 19:33 |
|
SwissArmyDruid posted:EVERYONE is short on GDDR5.
|
# ? Sep 5, 2017 20:31 |
|
ufarn posted:Are the buttcoiners to blame, or is it the usual NAND story of RAM being spread out in demand across storage drives, memory, and whatnot? Samsung and Hynix have cut GDDR5 supply and upped prices 30%, claiming it's due to the move to GDDR6 and moving production to other forms of RAM for other products (like phones). As usual, they are likely doing it to keep the artificial scarcity going, and the fact that they did it in unison despite higher-than-ever demand for GDDR5 after years and years of manufacture and time to up supply is just a coincidence winkwinknudgenudge
|
# ? Sep 5, 2017 20:50 |
|
I think it had to do with untimely upgrades at various fabs needing to go through, at about the same time that mobile devices with LPDDR4 are really taking off?
|
# ? Sep 5, 2017 20:50 |
|
repiv posted:Only quad Volta? lol if you're not gaming on an octo-volta system LOL at the £11.99 delivery fee.
|
# ? Sep 5, 2017 20:51 |
ufarn posted:Are the buttcoiners to blame, or is it the usual NAND story of RAM being spread out in demand across storage drives, memory, and whatnot? Samsung and SK Hynix moved some GDDR5 production capacity over to producing DDR4 instead and increased prices on GDDR5 by 30% as a result.
|
|
# ? Sep 5, 2017 20:51 |
|
This is a totally baseless accusation, but I'm guessing a shitload of iPhones going into manufacture has probably caused Samsung to give GDDR5 the short shrift.
|
# ? Sep 5, 2017 21:35 |
|
Paul MaudDib posted:Before launch, Gibbo was saying that it could do 70 MH/s, but it looks like that was (at best) a wild-rear end guess I'm surprised at that. Normally he only seems to speculate on how bad other races are.
|
# ? Sep 5, 2017 21:47 |
|
GRINDCORE MEGGIDO posted:I'm surprised at that. Normally he only seems to speculate on how bad other races are. You get a free Britain First membership with each Vega.
|
# ? Sep 5, 2017 21:54 |
|
Scarecow posted:haha you mean more like maybe 10% over an OC'ed 1080ti At least Nvidia will probably still offer big performance gains, just at a big price, whereas Intel gives small performance gains at high prices. Scarecow posted:Yeah looking at benchmarks I did of my 980 Tis vs my 1080 Tis you're right, man here I was thinking it was only like 10% Stock vs stock the 1070 is like 5-10% faster than the 980 Ti, but when you OC both the performance is basically identical. MaxxBot fucked around with this message at 22:31 on Sep 5, 2017 |
# ? Sep 5, 2017 22:27 |
|
Thanks for the many responses to my question, everyone. Not going to quote them all for space. The person using the 6850 is for the most part fine with it since they play a lot of 2D games that would run well on anything. They've only had trouble in Skyrim. But if they ever want to play more recent games it's going to be a problem. Also, this 6850 is now the noisiest component in the computer since I modernized everything else but the drives this weekend. PBCrunch posted:Even if the difference in specifications isn't that great, the fact that Terascale driver development has been dead for years means either of those cards will beat an HD 6850 silly. From my experience even something as old as a GeForce GTX 460 smacks an HD 6850 around in newer games. Nvidia is still putting a tiny amount of effort into Fermi drivers. Yeah, it being old Terascale is part of why I want to replace it. I set them up with the Crimson ReLive beta driver for Terascale and that seems to work OK at least.
|
# ? Sep 6, 2017 00:07 |
|
Space Racist posted:I remember when the Titan Xp launched, Gamers Nexus had a video mentioning how researchers found the extra gig of memory in the Xp versus the 1080 Ti to be worth the price increase. I've had models that needed that extra little bit and would run on the Xp and would not on the 1080Ti.
|
# ? Sep 6, 2017 00:26 |
|
VostokProgram posted:Thanks for the many responses to my question, everyone. Not going to quote them all for space. What you should buy depends on a number of factors, including what kind of games they play, what sort of case they have, and what their priorities are. Almost everyone here will recommend you buy a dual-fan model from EVGA, MSI, etc., and for the vast majority of people that's fine, but there are edge cases and there's a big variety of graphics card formats out there. I mean, if someone's only wish was to play Skyrim or League of Legends while not having a noisy computer, I'd tell them to buy this and save money. If someone has a compact little cube case with poor airflow and wanted to play Destiny, a 1060 blower would be more appropriate, etc. I actually think the little 1030s are cute. They're surprisingly acceptable at games from before 2015 as low-noise alternatives to integrated graphics. I kind of want to put one in my DVR to handle video decoding.
|
# ? Sep 6, 2017 00:27 |
|
I know AMD has limited resources but it'd be really nice of them to release a "Crimson Final" for Terascale.
|
# ? Sep 6, 2017 00:31 |
|
Video cards remain expensive forever and I must live 5+ years with my 290x so I can buy a used Nvidia 1488 to get adequate performance?
|
# ? Sep 6, 2017 12:20 |
|
Are 290Xs still not sky-high on eBay? I sold my reference 290 for a nice chunk of change and bought a 1070 before those went crazy. At this point, if you're looking to upgrade, might as well sell the 290X and go for a 1080 or 1080 Ti if you can swing it.
|
# ? Sep 6, 2017 13:45 |
|
Avalanche posted:If anything friends and GPU buying have taught me with Nvidia assuming you have the cash to spend is to just buy the Ti card of every generation. The Ti card will end up being equivalent to the mid range card of the new generation sort of how the 980ti now nets you basically the same performance of a 1070. There are really two ways to do it if you want to be on the generation treadmill. Buying the *70s as they come out is a totally valid way to do it. You pay a good bit less, and between the *70 and the *80 Ti of a generation coming out, you've got similar performance. If you want to get real spendy there's the *80/*80 Ti flip dance, or even the Titans (you aren't paying for the Titan's performance, but to get it sooner). Just getting *60s works as well, but that's a different price range.
|
# ? Sep 6, 2017 14:38 |
|
|
|
Is it better to get a 1070 Ti over a 1050 Ti, then? I'm putting together an SFF build and I don't plan on running everything on ultra, but I do want to be able to play new games as they come out.
|
# ? Sep 6, 2017 14:57 |