HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Instant Grat posted:

If it's not affecting you to the point where you're noticing performance drops, why would you care

I have no clue. I think it's super sleazy of NVIDIA to cover this up until it was found out (lying about card specs), but if you don't actually experience performance issues in real scenarios, what the gently caress does it matter?

HalloKitty fucked around with this message at 17:40 on Feb 3, 2015

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Megasabin posted:

I actually don't know. I got the card recently, and I haven't been playing anything taxing on it. It was actually a christmas gift in anticipation for the Witcher III. If I have stuttering while trying to play the Witcher because of this issue I'm not going to be happy, so I'm trying to sort it out now.

Given past history, The Witcher 3 will likely have problems at release that make it barely playable even for people with GTX 980 SLI setups at 1080p... and that'll be resolved within maybe a month. It will most likely be fine for whatever parts you can play, but you really shouldn't buy hardware that far ahead, especially for a single game. I'm still on a GTX 680 driving a 3440x1440 monitor and I don't really see anything terrible happen in my games. For The Witcher 3 I might upgrade around then. Who knows what'll happen?

Instant Grat
Jul 31, 2009

Just add
NERD RAAAAAAGE

HalloKitty posted:

I have no clue. I think it's super sleazy of NVIDIA to cover this up until it was found out (lying about card specs), but if you don't actually experience performance issues in real scenarios, what the gently caress does it matter?

Oh absolutely it's sleazy. I think it's garbage that they're more-or-less getting away with straight-up lying about the hardware specs of the card.

But it still performs extremely well :shrug:

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
I'm more upset about the gsync thing tbqh

Bleh Maestro
Aug 30, 2003
Found this on some Korean website today.



http://www.bodnara.co.kr/bbs/article.html?num=118035

SwissArmyDruid
Feb 14, 2014

by sebmojo
And they're calling it 980 Ti instead of Titan. Hrm. Probably still gonna be called a Titan anyways.

SwissArmyDruid fucked around with this message at 18:43 on Feb 3, 2015

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Instant Grat posted:

If it's not affecting you to the point where you're noticing performance drops, why would you care

I play a lot of stuff on an Oculus and it's already pretty hard to troubleshoot. I'd just like to have a way to see VRAM usage, to know whether I should be considering that as an issue when I'm stuttering or dropping frames. The DK2 benefits greatly from supersampling, and I could run not just over 3.5 gigs but the full 4 when I use that.

I think there are some apps that will show VRAM usage, but having some way to check it while I'm changing settings with the Rift actually on my face would be nice.
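
A minimal sketch of one way to watch VRAM usage from outside a game, assuming NVIDIA's NVML Python bindings (the pynvml package) and driver are installed; nvidia-smi reports the same counters from the command line:

code:
# Poll VRAM usage once per second via NVML (assumes the pynvml bindings are installed).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # total/used/free, in bytes
        used_mb = mem.used / (1024 ** 2)
        total_mb = mem.total / (1024 ** 2)
        print(f"VRAM: {used_mb:5.0f} / {total_mb:5.0f} MiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()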

SwissArmyDruid
Feb 14, 2014

by sebmojo
Tweet from Robert Hallock over at AMD: https://twitter.com/Thracks/status/561708827662245888

AMD bias aside, he's not wrong? Civ: Beyond Earth's Mantle implementation does something like this when run in Crossfire: one card renders the top half of the screen and the other renders the bottom half. It remains to be seen whether DX12 can do anything resembling this, or indeed whether it's even better than AFR.
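
For context, a conceptual sketch of how the two multi-GPU schemes divide work; this is purely illustrative Python (render_region is a made-up placeholder, not any real graphics API):

code:
# Conceptual sketch: AFR vs SFR work division across two GPUs. Illustrative only.
FRAME_W, FRAME_H = 1920, 1080

def render_region(gpu, frame_idx, x, y, w, h):
    # Placeholder for "GPU <gpu> renders this rectangle of frame <frame_idx>".
    print(f"frame {frame_idx}: GPU{gpu} renders {w}x{h} at ({x},{y})")

def afr(frame_idx):
    # Alternate Frame Rendering: each GPU owns whole frames in turn,
    # so each card needs its own full copy of the frame's resources.
    render_region(frame_idx % 2, frame_idx, 0, 0, FRAME_W, FRAME_H)

def sfr(frame_idx):
    # Split Frame Rendering (the Civ: BE Mantle approach described above):
    # both GPUs work on the same frame, one taking the top half, one the bottom.
    half = FRAME_H // 2
    render_region(0, frame_idx, 0, 0, FRAME_W, half)
    render_region(1, frame_idx, 0, half, FRAME_W, FRAME_H - half)

for i in range(3):
    afr(i)
    sfr(i)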

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

SwissArmyDruid posted:

And they're calling it 980 Ti instead of Titan. Hrm. Probably still gonna be called a Titan anyways.

Or they'll launch a separate model with stuff like double precision fully enabled like they did last time.

Gwaihir
Dec 8, 2009
Hair Elf

floor is lava posted:

That last .5gb of ram doesn't exist as far as most games are concerned. I played Dark Souls 2 at 4k fine, but it's not exactly the most vram-intensive thing. Setting Shadow of Mordor up to use HD textures, which lists as using 6+ gb of vram, still only uses 3.5gb max on the card. It's hard to say if it'll mess you up. Depends on the driver and the game's vram allocation.

Sure it exists-

(Also, Mordor at 2560x1600 on Ultra doesn't use all 4 gigs of RAM on my 980; it tends to sit more at about 3.5, so that's not exactly a super reliable tell.)

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Hace posted:

I'm more upset about the gsync thing tbqh

What was this? That you'll be able to use GSync without a different monitor?

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Lockback posted:

What was this? That you'll be able to use GSync without a different monitor?

At least on gaming notebooks, the adaptive sync found in eDP is being enabled in software as "GSync" by nVidia even though it's actually FreeSync. I don't think we know yet if that's going to be able to happen in regular monitors using FreeSync in the future.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Yeah, I agree with AnandTech that the 970 specs being wrong is a stupid mistake rather than a willful lie. But I am peeved as gently caress about G-Sync/FreeSync segmentation.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

SwissArmyDruid posted:

Tweet from Robert Hallock over at AMD: https://twitter.com/Thracks/status/561708827662245888

AMD bias aside, he's not wrong? Civ: Beyond Earth's Mantle implementation does something like this when run in Crossfire: one card renders the top half of the screen and the other renders the bottom half. It remains to be seen whether DX12 can do anything resembling this, or indeed whether it's even better than AFR.

It doesn't sound AMD biased, just an explanation of a low-level API giving you more control over the hardware, which doesn't sound far-fetched. Although it does indeed sound promising, it's still a multiple-GPU scenario, which hopefully we can avoid in the first place by having good enough single GPU cards.

vv But I thought the difference was that VESA® Adaptive-Sync is the result of bringing, as you mention, embedded DisplayPort's feature to the desktop - but at the request of AMD. Really, that's kind of the thing - AMD isn't ever hiding the fact their poo poo is based on a now-open VESA standard, whereas NVIDIA seems to be, by using the exact same technology on laptops but giving it the name of their older proprietary tech.

VESA posted:

Q: Is VESA’s new AdaptiveSync supported?

A: Yes. AdaptiveSync was first supported by DisplayPort 1.2a, and it is already supported in some available products. This is also branded as “Free-Sync” from AMD, which is based on VESA’s AdaptiveSync Standard.

HalloKitty fucked around with this message at 20:12 on Feb 3, 2015

SwissArmyDruid
Feb 14, 2014

by sebmojo

Beautiful Ninja posted:

At least on gaming notebooks, the adaptive sync found in eDP is being enabled in software as "GSync" by nVidia even though it's actually FreeSync. I don't think we know yet if that's going to be able to happen in regular monitors using FreeSync in the future.

It's not FreeSync, it's Nvidia taking the platform-agnostic AdaptiveSync spec and then slapping the G-Sync name onto it.

* eDP (embedded DisplayPort) has had the capability, since at least 2012, to send VBLANK signals telling the LCD to just refresh whatever image is already being displayed.
* DisplayPort 1.2a is a revision that adds eDP's VBLANK signal to desktop monitors.
* AdaptiveSync is the name for this technology on desktop monitors.
* FreeSync is AMD's name for enabling variable refresh using AdaptiveSync.
* G-Sync uses their own scaler and doesn't use AdaptiveSync.
* Mobile G-Sync does not use a special scaler, but just uses eDP's VBLANK signal on laptop monitors.

Yeah, it's complicated, huh?

SwissArmyDruid fucked around with this message at 20:06 on Feb 3, 2015

jkyuusai
Jun 26, 2008

homegrown man milk

SwissArmyDruid posted:

It's not FreeSync, it's Nvidia taking the platform-agnostic AdaptiveSync spec and then slapping the G-Sync name onto it.


* DisplayPort 1.2a is a revision that adds eDP's VBLANK signal to desktop monitors.
* AdaptiveSync is the name for this technology on desktop monitors.
* FreeSync is AMD's name for enabling variable refresh using AdaptiveSync.
* G-Sync uses their own scaler and doesn't use AdaptiveSync.
* Mobile G-Sync uses eDP's VBLANK signal on laptop monitors.

Yeah, it's complicated, huh?

And any time I see AdaptiveSync, I mentally replace it with AdaptiveVSync, which is a totally different thing. :( :arghfist:

1gnoirents
Jun 28, 2014

hello :)

Instant Grat posted:

Oh absolutely it's sleazy. I think it's garbage that they're more-or-less getting away with straight-up lying about the hardware specs of the card.

But it still performs extremely well :shrug:

They aren't getting away with anything, lol. They are getting exactly the kind of heat you would expect from this. Pop on over to the nvidia forums or somewhere else for some lols. It's like nvidia burned their house down and pissed in their mouths.

I'd be annoyed if they were kind of getting away with it, but they most certainly are not. This kind of stuff sticks around. It's kind of a shame it didn't happen around the AMD release or there would be some serious drama.

g0del
Jan 9, 2001



Fun Shoe

1gnoirents posted:

They aren't getting away with anything, lol. They are getting exactly the kind of heat you would expect from this. Pop on over to the nvidia forums or somewhere else for some lols. It's like nvidia burned their house down and pissed in their mouths.

I'd be annoyed if they were kind of getting away with it, but they most certainly are not. This kind of stuff sticks around. It's kind of a shame it didn't happen around the AMD release or there would be some serious drama.

My favorite posts are the people who are very, very angry about how nVidia treated them, and are definitely returning their 970 for a full refund... so they can buy a 980.

BurritoJustice
Oct 9, 2012

Mobile GSync isn't FreeSync. Desktop GSync isn't FreeSync. Nvidia has done nothing shady.

I mean I know the Nvidia hate train is in full steam right now, but come on people.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

If those specs are correct, I want a 965 Ti, as long as it isn't over $300.

SwissArmyDruid
Feb 14, 2014

by sebmojo

BurritoJustice posted:

Mobile GSync isn't FreeSync. Desktop GSync isn't FreeSync. Nvidia has done nothing shady.

I mean I know the Nvidia hate train is in full steam right now, but come on people.

But NVidia is taking an industry standard tech and putting a proprietary name on it. (AMD is also doing this, and I also don't like it, but I like them pushing the industry standard as opposed to a proprietary black box adding to BOM.)

Rastor
Jun 2, 2001

BurritoJustice posted:

Mobile GSync isn't FreeSync. Desktop GSync isn't FreeSync. Nvidia has done nothing shady.

I mean I know the Nvidia hate train is in full steam right now, but come on people.

FreeSync is an AMD term for Adaptive Sync. Mobile GSync (at least the leaked Alpha version) is Adaptive Sync. Therefore, Mobile GSync is FreeSync. And nVidia are very much shady assholes for saying they can't possibly support an Adaptive Sync solution, when it has been proven that not only can they, they've developed drivers that do so.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Rastor posted:

FreeSync is an AMD term for Adaptive Sync. Mobile GSync (at least the leaked Alpha version) is Adaptive Sync. Therefore, Mobile GSync is FreeSync. And nVidia are very much shady assholes for saying they can't possibly support an Adaptive Sync solution, when it has been proven that not only can they, they've developed drivers that do so.

Where did they say that? Drop a link?

Rastor
Jun 2, 2001

Subjunctive posted:

Where did they say that? Drop a link?

Factory Factory posted:

Poor Nvidia this week... the assholes.

You know how they totally hypothetically could support FreeSync but aren't? That's because they're re-implementing it and calling it G-Sync for use in gaming laptops. As was extremely obvious yet they always denied, the G-Sync module was never required except to bring adaptive sync to non-mobile hardware. It's just an overpriced little monitor driver board that Nvidia can capture BoM dollars with.

BurritoJustice
Oct 9, 2012

Rastor posted:

FreeSync is an AMD term for Adaptive Sync. Mobile GSync (at least the leaked Alpha version) is Adaptive Sync. Therefore, Mobile GSync is FreeSync. And nVidia are very much shady assholes for saying they can't possibly support an Adaptive Sync solution, when it has been proven that not only can they, they've developed drivers that do so.

Mobile GSync isn't AdaptiveSync, the misinformation here is absurd. It uses the custom VBLANK support that has been in eDP forever, which mobile GPUs use and desktop GPUs don't. That isn't AdaptiveSync. AdaptiveSync is the recent DP 1.3 standard that can be optionally added to 1.2a implementations to allow variable VBLANK on desktop solutions. Nvidia stating that they cannot support AdaptiveSync on their current desktop GPUs is correct, as they are only DP 1.2 and literally don't have hardware support for it. Similar to how AMD cards before the 290x don't support AdaptiveSync in games, as the newer cards have DP 1.2a.

The GSync module was created to allow variable refresh rates on desktop GPUs before there was a standard for it, it also is not AdaptiveSync. Like I said earlier, if they don't support AdaptiveSync on their future DP1.3 GPUs then yeah they are dicks, but right now there is a real hardware limitation.

Mountains and molehills.


Edit: this is similar to AMD's original demonstration of variable refresh right as GSync was kicking off; it used a laptop because, at the time, it was only possible for AMD via eDP, where it has always been possible.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


BurritoJustice posted:

if they don't support AdaptiveSync on their future DP1.3 GPUs

Can nVidia actually do this (or just never adopt DP 1.3), seeing as royalty-free doesn't equal "we don't care how you use this", and - as a member of VESA - nVidia's probably got more expected of them than just some jerk with a manufacturing plant does?

Rastor
Jun 2, 2001

BurritoJustice posted:

Mobile GSync isn't AdaptiveSync, the misinformation here is absurd. It uses the custom VBLANK support that has been in eDP forever, which mobile GPUs use and desktop GPUs don't. That isn't AdaptiveSync. AdaptiveSync is the recent DP 1.3 standard that can be optionally added to 1.2a implementations to allow variable VBLANK on desktop solutions. Nvidia stating that they cannot support AdaptiveSync on their current desktop GPUs is correct, as they are only DP 1.2 and literally don't have hardware support for it. Similar to how AMD cards before the 290x don't support AdaptiveSync in games, as the newer cards have DP 1.2a.

The GSync module was created to allow variable refresh rates on desktop GPUs before there was a standard for it, it also is not AdaptiveSync. Like I said earlier, if they don't support AdaptiveSync on their future DP1.3 GPUs then yeah they are dicks, but right now there is a real hardware limitation.

Mountains and molehills.


Edit: this is similar to AMD's original demonstration of variable refresh right as GSync was kicking off; it used a laptop because, at the time, it was only possible for AMD via eDP, where it has always been possible.

Poor, sweet, innocent nVidia, they just happen to not have implemented DP 1.2a. This has nothing at all to do with trying to lock in its customers, no sir. They'll get right on that DisplayPort 1.2a / 1.3 support I'm sure.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I'm sure they'll learn their lesson when this fracas causes people to flock away from their cards into the market-snuggling embrace of AMD.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Rastor posted:

Poor, sweet, innocent nVidia, they just happen to not have implemented DP 1.2a. This has nothing at all to do with trying to lock in its customers, no sir. They'll get right on that DisplayPort 1.2a / 1.3 support I'm sure.

Right around the same time their G-Sync scalers support the HDMI 2.0 that their 900-series video cards tout, I gather.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Subjunctive posted:

I'm sure they'll learn their lesson when this fracas causes people to flock away from their cards into the market-snuggling embrace of AMD.

Can't tell if this is sarcastic or not. Despite SAD's hope that HBM provides AMD with some temporary advantage, there is a reason there's a "deathwatch" thread.

I'd almost toxx on the 300 series cards flubbing and Zen being the guillotine.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

FaustianQ posted:

Can't tell if this is sarcastic or not

Very.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Nvidia learning a lesson, anyone buying AMD cards, or just "Yes"?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

FaustianQ posted:

Nvidia learning a lesson, anyone buying AMD cards, or just "Yes"?

Pretty much.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
I would purchase AMD GPUs, but their products are just significantly inferior to nVidia's, and it doesn't look like they'll be able to overcome the 100W+ lead that nVidia has without a significant retune of their GPU.

BurritoJustice
Oct 9, 2012

Rastor posted:

Poor, sweet, innocent nVidia, they just happen to not have implemented DP 1.2a. This has nothing at all to do with trying to lock in its customers, no sir. They'll get right on that DisplayPort 1.2a / 1.3 support I'm sure.

Say whatever you want about them not implementing 1.2a, but that doesn't change that the whole "everything is FreeSync and the module is just $$DRM$$" thing is patently false. Mobile GSync still is not FreeSync, and still they cannot just issue driver support for FreeSync/AdaptiveSync on their current GPUs.

LRADIKAL
Jun 10, 2001

Fun Shoe
Yeah, I totally want to support AMD. nVidia is evil, but their cards run like 100W cooler, so like, it's a better deal.

Darkpriest667
Feb 2, 2015

I'm sorry I impugned
your cocksmanship.
My 980 SC is only pulling 175W from the wall. When you can do that, AMD, without catching a city like Dresden on fire again, I'll come back. There were some leaked slides of the new R9 3xx and they showed power consumption at well over 300W. No thanks!

All the VRAM in the world doesn't fix a bad product. I have yet to use all of the VRAM on my 980, so I think the 970 is OK. I run 3840x1080.

EoRaptor
Sep 13, 2003

by Fluffdaddy

BurritoJustice posted:

Say whatever you want about them not implementing 1.2a, but that doesn't change that the whole "everything is FreeSync and the module is just $$DRM$$" thing is patently false. Mobile GSync still is not FreeSync, and still they cannot just issue driver support for FreeSync/AdaptiveSync on their current GPUs.

But that is exactly what just happened?

The current gsync branding is from a leaked driver that enables the feature on an already existing laptop, that has no special hardware outside of eDP support.

Installing this driver turned on adaptivesync and gave it the branding of mobile gsync.

Now, this won't happen on desktop cards, because despite DP 1.2a and DP1.3 both being finalized and having chips available prior to the release of the 9xx series, it wasn't added as a feature to those cards. Even when the 960 came out nearly a year later, it still wasn't added. I wonder why?

veedubfreak
Apr 2, 2005

by Smythe

Darkpriest667 posted:

My 980 SC is only pulling 175W from the wall. When you can do that, AMD, without catching a city like Dresden on fire again, I'll come back. There were some leaked slides of the new R9 3xx and they showed power consumption at well over 300W. No thanks!

All the VRAM in the world doesn't fix a bad product. I have yet to use all of the VRAM on my 980, so I think the 970 is OK. I run 3840x1080.

I could give a poo poo if the new 390 pulls 500 watts, as long as it is the fastest card I can get my hands on. Why do people act like 100W is this giant deal? It's 1 loving light bulb. It might cost you an extra 50 cents a month in electricity. Also, gently caress Nvidia for being giant lovely assholes that overcharge for everything. The last Nvidia card I actually had for more than a month was my 690.

And for the record, from my personal actual use of the 980, 970, and 290, my 290 was the easiest card to actually work with and had the best performance at 11.5 million pixels.
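
For what it's worth, the "light bulb" arithmetic roughly holds up; the gaming hours and electricity rate below are assumptions for illustration, not measurements:

code:
# Rough monthly cost of an extra 100 W of GPU draw while gaming (assumed usage/rate).
extra_watts = 100
hours_per_day = 2.5        # assumed gaming time
price_per_kwh = 0.12       # assumed USD per kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30   # ~7.5 kWh
cost_per_month = kwh_per_month * price_per_kwh            # ~$0.90

print(f"{kwh_per_month:.1f} kWh/month -> ${cost_per_month:.2f}/month")
At lighter usage or cheaper power it lands right around the 50-cent figure quoted above.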

jkyuusai
Jun 26, 2008

homegrown man milk

EoRaptor posted:

But that is exactly what just happened?

The current gsync branding is from a leaked driver that enables the feature on an already existing laptop, that has no special hardware outside of eDP support.

Installing this driver turned on adaptivesync and gave it the branding of mobile gsync.

Now, this won't happen on desktop cards, because despite DP 1.2a and DP1.3 both being finalized and having chips available prior to the release of the 9xx series, it wasn't added as a feature to those cards. Even when the 960 came out nearly a year later, it still wasn't added. I wonder why?

Just to verify, you do realize that the laptop didn't work in 100% of the cases where a display with the GSync module has been shown to work, yes? There are issues with flickering, and sometimes the display blanking out completely at low frame rates. These issues are reproducible on the ROG Swift, which has the GSync module, but it's noted that they're much less severe there.

Link

In other words, the GSync module is actually doing something; it's not $100 of inert electronics.

edit: I think that article also speculated that what isn't clear right now is whether additional hardware is actually required to address these issues, or if it's possible that they could be reliably handled in software. (With the subtext being that maybe Nvidia just jumped straight to a hardware solution because that's easier to monetize than a driver update).

jkyuusai fucked around with this message at 16:47 on Feb 4, 2015
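
For the curious, one plausible shape of a software-side fix for the low-frame-rate flicker is frame repetition: when frames arrive slower than the panel's minimum refresh, scan each one out multiple times so every refresh interval stays inside the panel's supported range. This is only a sketch of the general idea (the panel limits are assumptions), not anyone's actual implementation:

code:
# Sketch of frame repetition for a variable-refresh panel at low frame rates.
# Illustrative only; the panel's supported range is an assumption.
PANEL_MIN_HZ = 30.0
PANEL_MAX_HZ = 75.0

def refresh_plan(fps):
    """Return (repeats, refresh_hz): scan each frame out 'repeats' times so the
    panel's refresh rate stays within [PANEL_MIN_HZ, PANEL_MAX_HZ]."""
    fps = min(fps, PANEL_MAX_HZ)          # can't refresh faster than the panel allows
    repeats = 1
    while fps * repeats < PANEL_MIN_HZ:   # double/triple the frame until in range
        repeats += 1
    return repeats, fps * repeats

for fps in (60, 40, 24, 12):
    repeats, hz = refresh_plan(fps)
    print(f"{fps} fps -> show each frame {repeats}x, panel refreshes at {hz:.0f} Hz")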
