|
Instant Grat posted:If it's not affecting you to the point where you're noticing performance drops, why would you care I have no clue. I think it's super sleazy of NVIDIA to cover this up until it was found out (lying about card specs), but if you don't actually experience performance issues in real scenarios, what the gently caress does it matter? HalloKitty fucked around with this message at 17:40 on Feb 3, 2015 |
# ? Feb 3, 2015 17:03 |
|
Megasabin posted:I actually don't know. I got the card recently, and I haven't been playing anything taxing on it. It was actually a Christmas gift in anticipation of The Witcher III. If I have stuttering while trying to play the Witcher because of this issue I'm not going to be happy, so I'm trying to sort it out now.
|
# ? Feb 3, 2015 17:07 |
|
HalloKitty posted:I have no clue. I think it's super sleazy of NVIDIA to cover this up until it was found out (lying about card specs), but if you don't actually experience performance issues in real scenarios, what the gently caress does it matter? Oh absolutely it's sleazy. I think it's garbage that they're more-or-less getting away with straight-up lying about the hardware specs of the card. But it still performs extremely well.
|
# ? Feb 3, 2015 17:22 |
|
I'm more upset about the gsync thing tbqh
|
# ? Feb 3, 2015 17:25 |
|
Found this on a Korean website today. http://www.bodnara.co.kr/bbs/article.html?num=118035
|
# ? Feb 3, 2015 18:22 |
|
And they're calling it 980 Ti instead of Titan. Hrm. Probably still gonna be called a Titan anyways.
SwissArmyDruid fucked around with this message at 18:43 on Feb 3, 2015 |
# ? Feb 3, 2015 18:35 |
|
Instant Grat posted:If it's not affecting you to the point where you're noticing performance drops, why would you care I play a lot of stuff on an Oculus and it's pretty hard to troubleshoot already. I'd just like to have a way to see VRAM usage to know whether I should be considering that as an issue when I'm stuttering or dropping frames. The DK2 benefits greatly from supersampling, and I could run not just over 3.5gigs but 4 when I use that. I think there's some apps that will show VRAM usage but having some way to tell when I'm changing settings with the Rift actually on my face, that would be nice.
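For the "some way to see VRAM usage" wish above: one low-effort approach is to log it from `nvidia-smi`, which ships with the driver and has a machine-readable CSV query mode. A minimal sketch (the helper names are my own, not from any tool mentioned in the thread):

```python
import subprocess

def parse_vram_csv(line):
    """Parse one 'used, total' row from nvidia-smi CSV output,
    e.g. '3512 MiB, 4096 MiB' -> (3512, 4096)."""
    used_s, total_s = line.split(",")
    used = int(used_s.strip().split()[0])
    total = int(total_s.strip().split()[0])
    return used, total

def query_vram():
    # --query-gpu with CSV output avoids scraping the
    # human-readable table; one row per GPU.
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"]).decode()
    return [parse_vram_csv(row) for row in out.strip().splitlines()]
```

Run `query_vram()` in a loop (or just `watch nvidia-smi` in a terminal) while changing settings, then check the log afterwards — no need to see the screen with the Rift on your face.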
|
# ? Feb 3, 2015 18:49 |
|
Tweet from Robert Hallock over at AMD: https://twitter.com/Thracks/status/561708827662245888 The AMD biases aside, he's not wrong? Civ: Beyond Earth's implementation of Mantle does something like this when run in Crossfire, where one card renders the top half of the screen, and the other card renders the bottom half of the screen, but it remains to be seen if DX12 can do anything resembling this, or indeed, if it's even better to do than AFR.
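For anyone unfamiliar with the two multi-GPU schemes being contrasted here, a toy sketch of the difference (function names are made up for illustration, not any real API):

```python
def sfr_bands(height, num_gpus):
    # Split-frame rendering (what the Civ: Beyond Earth Mantle
    # example does): each GPU renders one horizontal band of the
    # *same* frame, so both cards contribute to every frame.
    band = height // num_gpus
    return [(g * band, height if g == num_gpus - 1 else (g + 1) * band)
            for g in range(num_gpus)]

def afr_owner(frame_index, num_gpus):
    # Alternate-frame rendering (the traditional SLI/Crossfire
    # default): whole frames are handed out round-robin, which
    # scales throughput but adds latency and frame-pacing issues.
    return frame_index % num_gpus
```

So `sfr_bands(1080, 2)` gives `[(0, 540), (540, 1080)]` — top half on one card, bottom half on the other — while AFR would have card 0 and card 1 simply trading entire frames.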
|
# ? Feb 3, 2015 18:50 |
|
SwissArmyDruid posted:And they're calling it 980 Ti instead of Titan. Hrm. Probably still gonna be called a Titan anyways. Or they'll launch a separate model with stuff like double precision fully enabled like they did last time.
|
# ? Feb 3, 2015 19:08 |
|
floor is lava posted:That last .5gb of ram doesn't exist as far as most games are concerned. I played Dark Souls 2 at 4k fine but it's not exactly the most vram intensive thing. Setting Shadow of Mordor up to use HD textures which lists as using 6+ gb of vram still only uses 3.5gb max on the card. It's hard to say if it'll mess you up. Depends on the driver and game's vram allocation. Sure it exists - (Also Mordor at 2560 * 1600 on Ultra doesn't use all 4 gigs of ram on my 980, it tends to sit more at about 3.5, so that's not exactly super reliable as a tell)
|
# ? Feb 3, 2015 19:27 |
|
Hace posted:I'm more upset about the gsync thing tbqh What was this? That you'll be able to use GSync without a different monitor?
|
# ? Feb 3, 2015 19:49 |
|
Lockback posted:What was this? That you'll be able to use GSync without a different monitor? At least on gaming notebooks, the adaptive sync found in eDP is being enabled in software as "GSync" by nVidia even though it's actually FreeSync. I don't think we know yet if that's going to be able to happen in regular monitors using FreeSync in the future.
|
# ? Feb 3, 2015 19:53 |
|
Yeah, I agree with AnandTech that the 970 specs being wrong is a stupid mistake rather than a willful lie. But I am peeved as gently caress about G-Sync/FreeSync segmentation.
|
# ? Feb 3, 2015 20:01 |
|
SwissArmyDruid posted:Tweet from Robert Hallock over at AMD: https://twitter.com/Thracks/status/561708827662245888

It doesn't sound AMD biased, just an explanation of a low-level API giving you more control over the hardware, which doesn't sound far-fetched. Although it does indeed sound promising, it's still a multiple-GPU scenario, which hopefully we can avoid in the first place by having good enough single GPU cards.

vv But I thought the difference was, VESA® Adaptive-Sync is the result of, as you mention, embedded DisplayPort's feature brought onto the desktop - but at the request of AMD. Really, that's kind of the thing - AMD isn't ever hiding the fact their poo poo is based on a now open VESA standard, whereas NVIDIA seems to be, by using the exact same technology on laptops but giving it the name of their older proprietary tech.

VESA posted:Q: Is VESA’s new AdaptiveSync supported?

HalloKitty fucked around with this message at 20:12 on Feb 3, 2015 |
# ? Feb 3, 2015 20:03 |
|
Beautiful Ninja posted:At least on gaming notebooks, the adaptive sync found in eDP is being enabled in software as "GSync" by nVidia even though it's actually FreeSync. I don't think we know yet if that's going to be able to happen in regular monitors using FreeSync in the future.

It's not FreeSync, it's Nvidia taking the platform-agnostic AdaptiveSync spec and then slapping the G-Sync name onto it.

* eDP (embedded DisplayPort) has had the capability to send VBLANK signals to tell the LCD to just refresh whatever image is already being displayed since 2012, at least.
* DisplayPort 1.2a is a revision that adds eDP's VBLANK signal over to desktop monitors.
* AdaptiveSync is the name for this technology on desktop monitors.
* FreeSync is AMD's name for enabling variable refresh using AdaptiveSync.
* G-Sync uses their own scaler and doesn't use AdaptiveSync.
* Mobile G-Sync does not use a special scaler, but just uses eDP's VBLANK signal on laptop monitors.

Yeah, it's complicated, huh?

SwissArmyDruid fucked around with this message at 20:06 on Feb 3, 2015 |
# ? Feb 3, 2015 20:04 |
|
SwissArmyDruid posted:It's not FreeSync, it's Nvidia taking the platform-agnostic AdaptiveSync spec and then slapping the G-Sync name onto it. And any time I see AdaptiveSync, I mentally replace it with AdaptiveVSync, which is a totally different thing.
|
# ? Feb 3, 2015 20:07 |
|
Instant Grat posted:Oh absolutely it's sleazy. I think it's garbage that they're more-or-less getting away with straight-up lying about the hardware specs of the card. They aren't getting away with anything lol. They are getting exactly the kind of heat you would expect from this. Pop on over to the nvidia forums or somewhere else for some lols. It's like nvidia burned their house down and pissed in their mouths. I'd be annoyed if they were kind of getting away with it but they most certainly are not. This kind of stuff sticks around. It's kind of a shame it didn't happen around the AMD release or there would be some serious drama.
|
# ? Feb 3, 2015 22:00 |
|
1gnoirents posted:They aren't getting away with anything lol. They are getting exactly the kind of heat you would expect from this. Pop on over to nvidia forums or someowhere else for some lols. It's like nvidia burned their house down and pissed in their mouths
|
# ? Feb 3, 2015 22:08 |
|
Mobile GSync isn't FreeSync. Desktop GSync isn't FreeSync. Nvidia has done nothing shady. I mean I know the Nvidia hate train is in full steam right now, but come on people.
|
# ? Feb 3, 2015 23:54 |
|
Bleh Maestro posted:Found this on some korea website today. If those specs are correct I want a 965Ti if they aren't over $300.
|
# ? Feb 4, 2015 01:40 |
|
BurritoJustice posted:Mobile GSync isn't FreeSync. Desktop GSync isn't FreeSync. Nvidia has done nothing shady. But NVidia is taking an industry standard tech and putting a proprietary name on it. (AMD is also doing this, and I also don't like it, but I like them pushing the industry standard as opposed to a proprietary black box adding to BOM.)
|
# ? Feb 4, 2015 03:08 |
|
BurritoJustice posted:Mobile GSync isn't FreeSync. Desktop GSync isn't FreeSync. Nvidia has done nothing shady.
|
# ? Feb 4, 2015 03:10 |
|
Rastor posted:FreeSync is an AMD term for Adaptive Sync. Mobile GSync (at least the leaked Alpha version) is Adaptive Sync. Therefore, Mobile GSync is FreeSync. And nVidia are very much shady assholes for saying they can't possibly support an Adaptive Sync solution, when it has been proven that not only can they, they've developed drivers that do so. Where did they say that? Drop a link?
|
# ? Feb 4, 2015 03:14 |
|
Subjunctive posted:Where did they say that? Drop a link? Factory Factory posted:Poor Nvidia this week... the assholes.
|
# ? Feb 4, 2015 04:09 |
|
Rastor posted:FreeSync is an AMD term for Adaptive Sync. Mobile GSync (at least the leaked Alpha version) is Adaptive Sync. Therefore, Mobile GSync is FreeSync. And nVidia are very much shady assholes for saying they can't possibly support an Adaptive Sync solution, when it has been proven that not only can they, they've developed drivers that do so. Mobile GSync isn't AdaptiveSync, the misinformation here is absurd. It uses the custom VBLANK support that has been in eDP forever, which mobile GPUs use and desktop GPUs don't. That isn't AdaptiveSync. AdaptiveSync is the recent DP 1.3 standard that can be optionally added to 1.2a implementations to allow variable VBLANK on desktop solutions. Nvidia stating that they cannot support AdaptiveSync on their current desktop GPUs is correct, as they are only DP 1.2 and literally don't have hardware support for it. Similar to how AMD cards before the 290x don't support AdaptiveSync in games, as the newer cards have DP 1.2a. The GSync module was created to allow variable refresh rates on desktop GPUs before there was a standard for it, it also is not AdaptiveSync. Like I said earlier, if they don't support AdaptiveSync on their future DP1.3 GPUs then yeah they are dicks, but right now there is a real hardware limitation. Mountains and molehills. Edit: this is similar to AMDs original demonstration of variable refresh right as GSync was kicking off, it used a laptop because for AMD it was literally only possible using eDP at the time, as it has always been possible.
|
# ? Feb 4, 2015 04:20 |
|
BurritoJustice posted:if they don't support AdaptiveSync on their future DP1.3 GPUs Can nVidia actually do this (or just never adopt DP 1.3), seeing as royalty-free doesn't equal "we don't care how you use this", and - as a member of VESA - nVidia's probably got more expected of them than just some jerk with a manufacturing plant does?
|
# ? Feb 4, 2015 04:32 |
|
BurritoJustice posted:Mobile GSync isn't AdaptiveSync, the misinformation here is absurd. It uses the custom VBLANK support that has been in eDP forever, which mobile GPUs use and desktop GPUs don't. That isn't AdaptiveSync. AdaptiveSync is the recent DP 1.3 standard that can be optionally added to 1.2a implementations to allow variable VBLANK on desktop solutions. Nvidia stating that they cannot support AdaptiveSync on their current desktop GPUs is correct, as they are only DP 1.2 and literally don't have hardware support for it. Similar to how AMD cards before the 290x don't support AdaptiveSync in games, as the newer cards have DP 1.2a.
|
# ? Feb 4, 2015 04:44 |
|
I'm sure they'll learn their lesson when this fracas causes people to flock away from their cards into the market-snuggling embrace of AMD.
|
# ? Feb 4, 2015 04:46 |
|
Rastor posted:Poor, sweet, innocent nVidia, they just happen to not have implemented DP 1.2a. This has nothing at all to do with trying to lock in its customers, no sir. They'll get right on that DisplayPort 1.2a / 1.3 support I'm sure. Right around the same time their G-Sync scalers support the HDMI 2.0 that their 900-series video cards tout, I gather.
|
# ? Feb 4, 2015 05:05 |
|
Subjunctive posted:I'm sure they'll learn their lesson when this fracas causes people to flock away from their cards into the market-snuggling embrace of AMD. Can't tell if this is sarcastic or not; despite SAD's hope that HBM provides AMD with some temporary advantage, there is a reason there is a "deathwatch" thread. I'd almost toxx on the 300 series cards flubbing and Zen being the guillotine.
|
# ? Feb 4, 2015 05:23 |
|
FaustianQ posted:Can't tell if this is sarcastic or not Very.
|
# ? Feb 4, 2015 05:26 |
|
Subjunctive posted:Very. Nvidia learning a lesson, anyone buying AMD cards, or just "Yes"?
|
# ? Feb 4, 2015 05:28 |
|
FaustianQ posted:Nvidia learning a lesson, anyone buying AMD cards, or just "Yes"? Pretty much.
|
# ? Feb 4, 2015 05:29 |
|
i would purchase amd gpus but their products are just significantly inferior to nvidia's and it doesn't look like they'll be able to overcome the 100W+ lead that nVidia has without a significant retune of their gpu
|
# ? Feb 4, 2015 05:55 |
|
Rastor posted:Poor, sweet, innocent nVidia, they just happen to not have implemented DP 1.2a. This has nothing at all to do with trying to lock in its customers, no sir. They'll get right on that DisplayPort 1.2a / 1.3 support I'm sure. Say whatever you want about them not implementing 1.2a, but that doesn't change that the whole "everything is FreeSync and the module is just $$DRM$$" thing is patently false. Mobile GSync still is not FreeSync, and still they cannot just issue driver support for FreeSync/AdaptiveSync on their current GPUs.
|
# ? Feb 4, 2015 06:24 |
|
Yeah, I totally want to support amd. nVidia is evil, but they run like 100w cooler so like, it's a better deal.
|
# ? Feb 4, 2015 09:46 |
|
My 980 SC is only pulling 175W from the wall. When you can do that, AMD, without catching a city like Dresden on fire again, I'll come back. There were some leaked slides of the new R9 3xx and they showed power consumption at well over 300W. No thanks! All the VRAM in the world doesn't fix a bad product. I have yet to use all of the VRAM on my 980, so I think the 970 is ok. I run 3840 x 1080.
|
# ? Feb 4, 2015 12:00 |
|
BurritoJustice posted:Say whatever you want about them not implementing 1.2a, but that doesn't change that the whole "everything is FreeSync and the module is just $$DRM$$" thing is patently false. Mobile GSync still is not FreeSync, and still they cannot just issue driver support for FreeSync/AdaptiveSync on their current GPUs. But that is exactly what just happened? The current gsync branding is from a leaked driver that enables the feature on an already existing laptop, that has no special hardware outside of eDP support. Installing this driver turned on adaptivesync and gave it the branding of mobile gsync. Now, this won't happen on desktop cards, because despite DP 1.2a and DP1.3 both being finalized and having chips available prior to the release of the 9xx series, it wasn't added as a feature to those cards. Even when the 960 came out nearly a year later, it still wasn't added. I wonder why?
|
# ? Feb 4, 2015 15:44 |
|
Darkpriest667 posted:MY 980SC is only pulling 175W from the wall, When you can do that AMD without catching a city like Dresden on fire again I'll come back. There were some leaked slides of the new R9 3xx and it showed power consumption at well over 300W. No thanks! I could give a poo poo if the new 390 pulls 500 watts as long as it is the fastest card I can get my hands on. Why do people act like 100w is this giant deal? It's 1 loving light bulb. It might cost you an extra 50 cents a month in electricity. Also, gently caress Nvidia for being giant lovely assholes that overcharge for everything. The last Nvidia card I actually had for more than a month was my 690. And for the record, from my personal actual use of the 980 and 970 and 290, my 290 was the easiest card to actually work with and had the best performance at 11.5 million pixels.
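The "50 cents a month" claim is easy to sanity-check with back-of-the-envelope math; a sketch assuming a roughly-US-average $0.12/kWh (your rate and hours will obviously vary):

```python
def monthly_cost_usd(extra_watts, hours_per_day, usd_per_kwh=0.12):
    # Energy in kWh = (watts / 1000) * hours; 30-day month assumed.
    kwh = extra_watts / 1000.0 * hours_per_day * 30
    return kwh * usd_per_kwh
```

At 4 hours of gaming a day, an extra 100W works out to about $1.44 a month — so "50 cents" holds for lighter use (around 1.4 hours a day), and even heavy use only pushes it to a couple of dollars.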
|
# ? Feb 4, 2015 15:51 |
|
|
EoRaptor posted:But that is exactly what just happened? Just to verify, you do realize that the laptop didn't work in 100% of the cases that a display with the GSync module has been shown to work, yes? There's issues with flickering and sometimes the display completely blanking altogether at low frame rates. These issues are reproducible on the ROG Swift, which has the GSync module, but it's noted that they're much less severe. Link In other words, the GSync module is actually doing something, it's not $100 of inert electronics. edit: I think that article also speculated that what isn't clear right now is whether additional hardware is actually required to address these issues, or if it's possible that they could be reliably handled in software. (With the subtext being that maybe Nvidia just jumped straight to a hardware solution because that's easier to monetize than a driver update). jkyuusai fucked around with this message at 16:47 on Feb 4, 2015 |
# ? Feb 4, 2015 16:40 |