SwissArmyDruid
Feb 14, 2014

by sebmojo

Don Tacorleone posted:

Does this mean the new UE4 engine games won't support SLI ever?

No, it just means we're probably not going to see AFR (alternate-frame rendering) when running a UE4 game in SLI. Nvidia could go with SFR (split-frame rendering) instead, or some kind of compositing, wherein one card handles all the geometry and textures and then hands off to the next card for lighting and particles, that sort of thing.
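
Since AFR vs. SFR comes up a lot in this thread, here's a minimal sketch of how the two schemes divide work between two GPUs. The Region type and the scheduling functions are made-up illustrations, not any real driver API:

code:

from dataclasses import dataclass

@dataclass
class Region:
    x: int
    y: int
    width: int
    height: int

def afr_schedule(frame_index, width, height):
    """Alternate-frame rendering: each GPU owns whole frames, alternating."""
    gpu = frame_index % 2
    return [(gpu, Region(0, 0, width, height))]

def sfr_schedule(frame_index, width, height):
    """Split-frame rendering: every frame is split, e.g. into top/bottom halves."""
    half = height // 2
    return [(0, Region(0, 0, width, half)),
            (1, Region(0, half, width, height - half))]

for f in range(3):
    print("frame", f, "AFR:", afr_schedule(f, 1920, 1080))
    print("frame", f, "SFR:", sfr_schedule(f, 1920, 1080))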

SwissArmyDruid fucked around with this message at 20:06 on Mar 20, 2015

wolrah
May 8, 2006
what?

Don Tacorleone posted:

Does this mean the new UE4 engine games won't support SLI ever?

https://answers.unrealengine.com/questions/21746/does-the-ue4-engine-support-sli.html

quote:

The deferred rendering techniques used by UE4 rely on data from the previous frame to render the current frame and as a result are not SLI friendly. You could investigate which features are needed for SLI and potentially avoid them, however since that is not a usecase we have here at Epic, i'm not sure how well it will work as we keep extending UE4 with new functionality.

This basically says that AFR will probably never work with UE4. As SwissArmyDruid notes, there are more complicated multi-GPU modes that may be usable in the future, but my understanding is that these require the game to be aware of what's going on, so they can't simply be forced on and tried for games that don't explicitly enable them the way AFR can be.

Also, to be clear: though the question is SLI-specific, the same limitations apply to all multi-GPU technologies.
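
A toy model of why that previous-frame dependency is so unfriendly to AFR: with alternating frames on two GPUs, the history buffers a frame needs always live on the other card and have to cross the bus first. The 10ms render and 2ms copy figures below are illustrative assumptions, not measurements:

code:

def afr_frame_cost(frame_index, render_ms=10.0, history_copy_ms=2.0,
                   uses_previous_frame=True):
    """Cost of one frame under AFR when it reads last frame's buffers."""
    owner = frame_index % 2                # GPU rendering this frame
    history_owner = (frame_index - 1) % 2  # GPU holding the previous frame's buffers
    cost = render_ms
    if uses_previous_frame and frame_index > 0 and owner != history_owner:
        cost += history_copy_ms            # cross-GPU transfer before rendering starts
    return cost

with_history = sum(afr_frame_cost(f) for f in range(60))
without_history = sum(afr_frame_cost(f, uses_previous_frame=False) for f in range(60))
print(f"60 AFR frames with history copies:    {with_history:.0f} ms")
print(f"60 AFR frames without the dependency: {without_history:.0f} ms")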

Darkpriest667
Feb 2, 2015

I'm sorry I impugned
your cocksmanship.

GokieKS posted:

Except the K6000 had 12GB of VRAM too. But don't let the inability to distinguish between 6000 and K6000 get in the way of your rants. :V

You're right, the K6000 has 12 GB and the 6000 has 6... You can't buy the K6000 anymore. You CAN buy the K80, which has 24 GB of VRAM; guess the price =)


EDIT -- looks like the M6000 was released (sorry, I don't keep up with workstation cards as much as I should). It will suffice nicely at the $5K price point.

Darkpriest667 fucked around with this message at 23:27 on Mar 20, 2015

SwissArmyDruid
Feb 14, 2014

by sebmojo

wolrah posted:

https://answers.unrealengine.com/questions/21746/does-the-ue4-engine-support-sli.html


This basically says that AFR will probably never work with UE4. As SwissArmyDruid notes, there are more complicated multi-GPU modes that may be usable in the future, but my understanding is that these require the game to be aware of what's going on, so they can't simply be forced on and tried for games that don't explicitly enable them the way AFR can be.

Also, to be clear: though the question is SLI-specific, the same limitations apply to all multi-GPU technologies.

Let's be fair here, we likely weren't going to be seeing AFR again anyway. Not with VR pushing down frame latencies and Vulkan and DX12 on the horizon trying to be more efficient. AMD has said that with LiquidVR, their model is going to be having one card drive each eye individually, while with Nvidia, we knew as early as September of last year that they had already started looking forward with VR Direct and VR SLI, another scheme where each card renders one eye. Mantle, when it was used for Civ:BE, used a split-frame rendering implementation for Crossfire, where one card rendered the top half of the screen and the other card the bottom half.



I don't have an Nvidia slide; Nathan Reed's website is up and down and up and down, and I can't stay connected long enough to download his GDC presentation. Believe me when I say it's the same gist, though the actual implementation is a bit different.
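
As a rough sketch of the one-card-per-eye approach being described (the function and device numbering here are hypothetical, not LiquidVR's or VR SLI's actual API): both eye views belong to the same frame, so two cards can work on it in parallel instead of alternating whole frames:

code:

EYES = ("left", "right")

def submit_vr_frame(frame_index, gpus=(0, 1)):
    """Per-GPU work list: each card renders one eye of the *same* frame."""
    return {gpu: {"frame": frame_index, "eye": eye}
            for gpu, eye in zip(gpus, EYES)}

# Both GPUs are busy on frame 42 at once, which keeps per-frame latency down,
# unlike AFR where a single GPU owns the whole frame.
print(submit_vr_frame(42))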

SwissArmyDruid fucked around with this message at 23:37 on Mar 20, 2015

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

KakerMix posted:

It's worse with ATI cards and multi gpu rendering :ssh:

Fixed that for you. (and yes I've done both)

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
So will Nvidia cards ever get FreeSync or is it only for AMD cards?

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

spasticColon posted:

So will Nvidia cards ever get FreeSync or is it only for AMD cards?

Nvidia can use it if they want to; they even have a beta driver that uses the technology in laptops (though it's still called G-Sync), but they refused to bump their new cards to the DisplayPort 1.2a spec and probably want to keep selling G-Sync modules, soooooo

zer0spunk
Nov 6, 2000

devil never even lived
Is it normal for a card to go up 50 bucks in MSRP a few weeks after launch and stay there for 4 months? I got a reference EVGA 980 for $560 back in November, and it's since jumped to $608 at all the retailers, which is odd since I was hoping that at the 6-month mark it might be an option to pick up another for SLI. I don't really understand the pricing here... 2x 980 = $1200, the new Titan, which is on par, = $1000... wha?

SlayVus
Jul 10, 2009
Grimey Drawer

zer0spunk posted:

Is it normal for a card to go up 50 bucks in MSRP a few weeks after launch and stay there for 4 months? I got a reference EVGA 980 for $560 back in November, and it's since jumped to $608 at all the retailers, which is odd since I was hoping that at the 6-month mark it might be an option to pick up another for SLI. I don't really understand the pricing here... 2x 980 = $1200, the new Titan, which is on par, = $1000... wha?

Demand for the cards. Remember the ATI Bitcoin-mining price hikes? Newegg was selling 290s for like $500-$600.

Party Plane Jones
Jul 1, 2007

by Reene
Fun Shoe
Card shipments are off by a fourth versus last year, probably due to the Euro falling.

1gnoirents
Jun 28, 2014

hello :)

spasticColon posted:

So will Nvidia cards ever get FreeSync or is it only for AMD cards?

If FreeSync takes off, they will. But it will either have to really take off (something like becoming a common monitor feature), or be better than G-Sync for less cost, or some combination of the two.

SwissArmyDruid
Feb 14, 2014

by sebmojo
PCPer video comparing G-Sync and two FreeSync monitors: https://youtu.be/-ylLnT2yKyA

Two things to remember when watching this:

1: FreeSync appears to rely heavily on the quality of the panel in order to deliver a quality image.
2: Mobile G-Sync does not have a memory buffer, and it, too, will suffer the same ghosting issues as FreeSync and be reliant on panel quality to minimize them.

Until we get better panels, it looks like this round, from a technological standpoint, goes to Nvidia for superior performance.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


spasticColon posted:

So will Nvidia cards ever get FreeSync or is it only for AMD cards?

At some point they'll have to if they like having DisplayPort sockets on their cards or if they like working with contemporary monitors (as VESA changes/updates the standards), but that could be many years from now.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


I'm a little confused with FreeSync - if my FPS drops below 48fps it won't work?

BurritoJustice
Oct 9, 2012

Tab8715 posted:

I'm a little confused with FreeSync - if my FPS drops below 48fps it won't work?

The exact minimum FPS depends on the monitor, but yes. Basically, when you go below the minimum Hz a FreeSync monitor can display, it just locks the monitor at that minimum. So in that case the screen becomes a constant-refresh-rate 48Hz screen whenever your FPS is below 48. That means you get all the associated issues of a static refresh rate, like stutter and tearing, except marginally worse because you have a 48Hz screen instead of a 60Hz one at that instant.

G-Sync, on the other hand, uses a small amount of RAM in the screen as a frame buffer, which it uses to refresh the screen multiple times per image. For example, with G-Sync at 30FPS (the minimum), your screen shows 30 images at thirty refreshes a second. When you drop to 29FPS, the screen refreshes those twenty-nine images twice each at 58Hz. It continues these multiples down to 1FPS, where it refreshes the same image sixty times at 60Hz. Because of this, G-Sync has a minimum refresh rate of 30Hz but can work with effectively any framerate.
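
A back-of-the-envelope model of the two behaviours just described, assuming a hypothetical 48-144Hz FreeSync panel and a 30Hz-minimum G-Sync module. Nvidia hasn't published the module's exact repeat policy; this sketch just picks the smallest repeat count that keeps the refresh rate at or above the minimum, which reproduces the 29FPS -> 58Hz example:

code:

import math

def freesync_refresh(fps, min_hz=48, max_hz=144):
    """Below the panel's minimum, the screen falls back to a fixed refresh rate."""
    if fps < min_hz:
        return min_hz                     # static 48Hz: stutter/tearing come back
    return min(fps, max_hz)               # inside the range: refresh tracks FPS

def gsync_refresh(fps, min_hz=30, max_hz=144):
    """Repeat each frame enough times to stay at or above the module's minimum."""
    if fps >= min_hz:
        return min(fps, max_hz), 1
    repeats = math.ceil(min_hz / fps)     # e.g. 29 FPS -> 2 repeats -> 58Hz
    return fps * repeats, repeats

for fps in (60, 47, 29, 10):
    g_hz, repeats = gsync_refresh(fps)
    print(f"{fps} fps -> FreeSync panel at {freesync_refresh(fps)}Hz, "
          f"G-Sync panel at {g_hz}Hz ({repeats}x per frame)")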

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I have no idea why AMD doesn't just handle that on the GPU side; it seems like a simple solution there, so why aren't they doing it?

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

K8.0 posted:

I have no idea why AMD doesn't just handle that on the GPU side; it seems like a simple solution there, so why aren't they doing it?

Could you elaborate on what this simple solution would entail? Adding a frame buffer to the GPU display output, which would add cost and complexity (and would probably not be compatible with the DP standard)? When your frame rate drops below 48 FPS your card is already at 100% usage.

The existing solution is G-Sync's frame buffer on the display side. That's the reason for the extra cost.

LiquidRain
May 21, 2007

Watch the madness!

I imagine AMD can spare an extra framebuffer in VRAM and push that to the monitor if a frame is taking too long.

Truga
May 4, 2014
Lipstick Apathy
I don't even get why this is an issue, though. Cards are already doing what's being described here, aren't they? It's not like your monitor suddenly goes blank or flickers if your framerate drops to 3 for whatever reason.

repiv
Aug 13, 2009

LiquidRain posted:

I imagine AMD can spare an extra framebuffer in VRAM and push that to the monitor if a frame is taking too long.

The problem is you don't know how much of a window you have to push a repeat frame - if the real frame is completed before you finish, you either have to abort the repeat frame (causing tearing) or force the real frame to wait (meaning you see the same frame for two full 1/144s = ~7ms intervals when the frame may have only taken 10ms to render).

The RAM on the G-Sync module means they can stream in fresh frames and draw repeated frames in parallel, without repeat frames ever hogging the line.
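
To put rough numbers on that window problem: at 144Hz one refresh takes 1000/144, about 6.9ms, to scan out, so a new frame that arrives just after a panel-side repeat has started either tears in or waits out the rest of the repeat. A small illustrative calculation (the timings are assumptions, not measurements):

code:

SCANOUT_MS = 1000.0 / 144.0   # ~6.9ms to scan out one refresh on a 144Hz panel

def extra_latency_ms(new_frame_arrival_ms, repeat_started_at_ms):
    """Extra time a freshly rendered frame waits if a repeat refresh is in flight."""
    repeat_ends = repeat_started_at_ms + SCANOUT_MS
    if new_frame_arrival_ms >= repeat_ends:
        return 0.0                        # the repeat already finished, no penalty
    return repeat_ends - new_frame_arrival_ms

# A frame that took 10ms to render, landing 1ms after a repeat refresh began:
print(f"{extra_latency_ms(new_frame_arrival_ms=10.0, repeat_started_at_ms=9.0):.1f} ms added")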

repiv fucked around with this message at 14:18 on Mar 23, 2015

suddenlyissoon
Feb 17, 2002

Don't be sad that I am gone.
Are the 970/980 really receiving only a gimped version of DirectX12 or have the children of NeoGAF finally melted my brain?

This is the specific post

LiquidRain
May 21, 2007

Watch the madness!

repiv posted:

The problem is you don't know how much of a window you have to push a repeat frame - if the real frame is completed before you finish, you either have to abort the repeat frame (causing tearing) or force the real frame to wait (meaning you see the same frame for two full 1/144s = ~7ms intervals when the frame may have only taken 10ms to render).

Same problem exists if you self-panel-refresh by putting the memory on the monitor. You'll be doing some form of frame pacing when you get that low.

repiv
Aug 13, 2009

LiquidRain posted:

Same problem exists if you self-panel-refresh by putting the memory on the monitor. You'll be doing some form of frame pacing when you get that low.

The same problem exists, but there's more they can do to alleviate it, since they're not bound to displaying frames at the exact time they're received.

The 768MB RAM on the G-Sync board is enough to buffer about eight 1440p frames, so if they receive a new frame while re-drawing an existing one they could place it in a queue and use slight time-warping to catch up without having to tear or skip a frame.

(I don't know if that's what they're doing, but that's a lot of RAM to put on there for no reason...)

Kazinsal
Dec 13, 2011
You can fit 54 uncompressed 2560x1440x32 frames in 768 MB. They're a bit over 14 MB each.
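
Spelling out the arithmetic for anyone who wants to check it:

code:

WIDTH, HEIGHT, BYTES_PER_PIXEL = 2560, 1440, 4             # 32 bits = 4 bytes per pixel
frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL             # 14,745,600 bytes
print(f"{frame_bytes / 2**20:.2f} MB per frame")           # 14.06 MB
print(f"{(768 * 2**20) // frame_bytes} frames in 768 MB")  # 54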

repiv
Aug 13, 2009

Kazinsal posted:

You can fit 54 uncompressed 2560x1440x32 frames in 768 MB. They're a bit over 14 MB each.

Yeah I converted bits to bytes twice. oops

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

suddenlyissoon posted:

Are the 970/980 really receiving only a gimped version of DirectX12 or have the children of NeoGAF finally melted my brain?

This is the specific post

As far as I can tell from a quick search, there seems to be truth to this, yes.

But at the same time (it seems) there are other features Maxwell supports that GCN currently doesn't.

HalloKitty fucked around with this message at 15:05 on Mar 23, 2015

repiv
Aug 13, 2009

suddenlyissoon posted:

Are the 970/980 really receiving only a gimped version of DirectX12 or have the children of NeoGAF finally melted my brain?

This is the specific post

I wouldn't worry about this too much; the most important features of DX12 are changes to the software model and will work identically on any tier of hardware.

beejay
Apr 7, 2002

Tab8715 posted:

I'm a little confused with FreeSync - if my FPS drops below 48fps it won't work?

Only if you have a monitor whose minimum refresh rate is 48Hz. In that case, if your framerate falls below 48fps, things go back to "normal" behavior, and you can choose to have VSync on or off in such situations, just as you can now with a normal monitor. So it will basically work exactly like things work now if you fall to lower framerates. Additionally, FreeSync can handle down to 9Hz/fps, but the monitors that are currently out only go down as far as 40Hz.

In "ideal" fps situations (50-70fps), both technologies function basically the same, with G-Sync carrying a 1-2% performance penalty.

beejay fucked around with this message at 15:32 on Mar 23, 2015

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

beejay posted:

Additionally, FreeSync can handle down to 9Hz/fps, but the monitors that are currently out only go down as far as 40Hz.

Is this confirmed somewhere? Because if so, then a mature FreeSync makes G-Sync pointless - no one (sane) is going to be playing games or watching things at 9 frames per second.

repiv
Aug 13, 2009

FaustianQ posted:

Is this confirmed somewhere? Because if so, then a mature FreeSync makes G-Sync pointless - no one (sane) is going to be playing games or watching things at 9 frames per second.

People keep tossing around that FreeSync supports 9-240Hz, but that's kind of misleading. You get one of these ranges: 36-240Hz, 21-144Hz, 17-120Hz, or 9-60Hz.

It could still theoretically outperform G-Sync's 30-144Hz, but in practice that hasn't happened yet.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

FaustianQ posted:

Is this confirmed somewhere? Because if so, then a mature FreeSync makes G-Sync pointless - no one (sane) is going to be playing games or watching things at 9 frames per second.

As the guy above me said, there are maximum ranges that can be chosen, according to the AMD FreeSync FAQ page.

9-60Hz, 17-120Hz, 21-144Hz, 36-240Hz.

What we're actually getting is pretty crappy in comparison, but that's most likely up to the monitor manufacturers.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I mean, for me it's usually a case of my games holding very steadily at 60 and maybe very rarely going down to 55... but I have to choose between the lag of V-sync, or turning V-sync off and still getting screen tearing even when I'm at 60fps solid, with or without frame limiting. So FreeSync as it exists today would be pretty sweet for me.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Even if 9-240Hz is misleading, the ranges provided for a FreeSync monitor still make G-Sync pointless, but only if the required panel quality doesn't erase the cost savings. I'm just not seeing G-Sync's advantage personally :shrug:

veedubfreak
Apr 2, 2005

by Smythe
I just wish someone would put out a friggin' thin-bezel 1440p IPS monitor. The ASUS thin-bezel monitors are TN and the color sucks. I looked at them at Microcenter.

SCheeseman
Apr 23, 2003

17-120Hz is all you really need anyway. Once a monitor comes out with those kinds of specs it's pretty much game over for G-Sync.

beejay
Apr 7, 2002

Nah, it will definitely hang on for a while and a lot of people will buy it regardless.

repiv
Aug 13, 2009

SwissCM posted:

17-120Hz is all you really need anyway. Once a monitor comes out with those kinds of specs it's pretty much game over for G-Sync.

AMD are using the "wouldn't that be great?" method of system design, by setting a goal that may not be achievable and making other companies take the fall if it turns out not to be.

I hope FreeSync does well for the sake of competition, but these superior specs mean nothing without a proof of concept implementation to back up their feasibility.

repiv fucked around with this message at 20:51 on Mar 23, 2015

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
I think FreeSync will take off in a year or so, when people realize the panels cost way less and there are more options than one or two panels per manufacturer.

beejay
Apr 7, 2002

Whatever happens will be very interesting. Nvidia has such a large presence that they don't have to give up right away, but certainly the majority of consumers won't pay the premium for the monitors. Monitor manufacturers won't enjoy making more expensive displays that may not sell well next to FreeSync models that aren't much more expensive than "regular" monitors. At that point Nvidia can choose to subsidize them in some way, or just coast on the enthusiasts who stick with G-Sync out of brand loyalty while more people shift to AMD to get adaptive sync at a lower cost. Either way, adaptive sync in general is going to be a big deal soon.

veedubfreak
Apr 2, 2005

by Smythe

repiv posted:

AMD are using the "wouldn't that be great?" method of system design, by setting a goal that may not be achievable and making other companies take the fall if it turns out not to be.

I hope FreeSync does well for the sake of competition, but these superior specs mean nothing without a proof of concept implementation to back up their feasibility.

It's better than the alternative, which is Nvidia sitting on their hands and never innovating because AMD is already behind - kind of like Intel has been for the past 10 years.
