Truga
May 4, 2014
Lipstick Apathy
Yeah, crossfire (and sli) have all sorts of issues, but it seems like with current APIs, AFR is about the best they can manage. And AFR with different cards is bad for obvious reasons.

From what I could understand of the dx12 change, it doesn't really see an extra gpu, instead it just sees an extra 1000 (or however many your gpu has) processors. How this will work in practice with the gpu ram split between the cards remains to be seen, but if people can make engines that exploit this efficiently, multi gpu might end up being better than single gpu in pretty much every way.

Haeleus
Jun 30, 2009

He made one fatal slip when he tried to match the ranger with the big iron on his hip.

Darkpriest667 posted:

What version of VLC is it? Was the reinstall the newest version? I know the last two Nvidia drivers have given me some driver failures when I was doing certain things. ELITE keeps crashing unless you downclock to like 1100MHz on a 980 SC. Apparently boost clocks on the Maxwell cards don't play nice with some things, I'm finding out. I think there is a way to force a constant clock and disable the boost and the energy saver, but I've been so busy the last 3 weeks I haven't had the chance to look.

It's the latest 64 bit release, not sure of the version number (not on the actual computer right now). I just reduced my OC on my GPU via Precision X as I had actually increased it recently, so perhaps that is what's been causing instability.

repiv
Aug 13, 2009

Truga posted:

From what I could understand of the dx12 change, it doesn't really see an extra gpu, instead it just sees an extra 1000 (or however many your gpu has) processors. How this will work in practice with the gpu ram split between the cards remains to be seen, but if people can make engines that exploit this efficiently, multi gpu might end up being better than single gpu in pretty much every way.

If DX12 is anything like Mantle (and it probably is), it's actually the exact opposite. Under Mantle the engine sees each GPU as a completely distinct entity much like how OpenCL or CUDA do things.

Each GPU has its own command queue and any VRAM allocation only exists on one card, so the engine is responsible for splitting up the work and combining the results by issuing GPU-GPU memory copies.

They might use simple methods like AFR, or something more clever where each card does different tasks, causing less duplication across the cards' VRAM and allowing higher utilization than SLi/CF did.

So... DX12 might be amazing for multi-GPU but it depends. Implementations will range from "efficiently splits between GPU and IGP" to "chokes if the GPUs aren't matched" to "only uses the first GPU :effort:"

repiv fucked around with this message at 19:03 on May 1, 2015
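For illustration only (my sketch, not anything from repiv's post or a real engine), this is roughly what that per-GPU model looks like in C++/D3D12 terms: every adapter gets its own device and its own command queue, and nothing is shared unless the engine copies it explicitly.

// Hedged sketch: enumerate each physical GPU and give it its own D3D12
// device and direct command queue. Error handling and library linking
// (d3d12.lib, dxgi.lib) are omitted for brevity.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

struct GpuContext {
    ComPtr<IDXGIAdapter1>      adapter;
    ComPtr<ID3D12Device>       device;  // one device per physical GPU
    ComPtr<ID3D12CommandQueue> queue;   // each GPU gets its own command queue
};

std::vector<GpuContext> EnumerateGpus()
{
    std::vector<GpuContext> gpus;
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip the WARP software rasterizer

        GpuContext ctx;
        ctx.adapter = adapter;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&ctx.device))))
            continue;

        D3D12_COMMAND_QUEUE_DESC qdesc = {};
        qdesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        ctx.device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&ctx.queue));

        // Any resource created on ctx.device lives in that card's VRAM only;
        // getting results to another GPU means an explicit copy.
        gpus.push_back(ctx);
    }
    return gpus;
}

The point of the sketch is just that the API hands the engine N independent GPUs rather than one magic combined one, which is exactly why the work-splitting strategy (AFR or otherwise) becomes the engine's problem.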

veedubfreak
Apr 2, 2005

by Smythe
Turns out my Titan X is not in fact capable of running a 50% overclock :( Started having issues and had to go back to the stock BIOS. Oh well, it's not like I'm really giving it more than it can handle with SW:ToR. Kind of sad though, as it ran perfectly fine for 3 weeks before starting to have driver crashes. Temps are perfectly reasonable at 40C, but I guess it still has too many voltage locks.

penus penus penus
Nov 9, 2014

by piss__donald
hmmm, should probably get another one then

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

THE DOG HOUSE posted:

hmmm, should probably get another one then

same

The_Franz
Aug 8, 2003

repiv posted:

So... DX12 might be amazing for multi-GPU but it depends. Implementations will range from "efficiently splits between GPU and IGP" to "chokes if the GPUs aren't matched" to "only uses the first GPU :effort:"

The more practical use would probably be to use the IGP for compute tasks like physics or building indirect draw lists, freeing up the discrete card to focus on rendering.

Truga
May 4, 2014
Lipstick Apathy

The_Franz posted:

The more practical use would probably be to use the IGP for compute tasks like physics or building indirect draw lists, freeing up the discrete card to focus on rendering.

This is what I want to happen. The iGPU runs OpenCL well enough for basic physics these days.
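As an illustration of that idea (a minimal sketch under my own assumptions, not anything from the posts above), picking out the integrated GPU with the plain OpenCL C API could look like this; the host-unified-memory check is only a heuristic for "this is probably the IGP":

// Hedged sketch: find an integrated GPU to run a physics kernel on, leaving
// the discrete card free for rendering. The actual kernel is not shown.
#include <CL/cl.h>
#include <vector>

cl_device_id FindIntegratedGpu()
{
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);
    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    for (cl_platform_id platform : platforms) {
        cl_uint numDevices = 0;
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 0, nullptr, &numDevices) != CL_SUCCESS)
            continue;
        std::vector<cl_device_id> devices(numDevices);
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, numDevices, devices.data(), nullptr);

        for (cl_device_id dev : devices) {
            // Integrated GPUs share memory with the host, so treat
            // host-unified memory as a rough "this is the IGP" signal.
            cl_bool unified = CL_FALSE;
            clGetDeviceInfo(dev, CL_DEVICE_HOST_UNIFIED_MEMORY,
                            sizeof(unified), &unified, nullptr);
            if (unified == CL_TRUE)
                return dev;
        }
    }
    return nullptr;  // no integrated GPU found; fall back to a CPU physics path
}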

Wiggly Wayne DDS
Sep 11, 2010



I guess showing off comparisons with MOBAs is all the rage these days: https://www.youtube.com/watch?v=sV9aa_12Ap0

If anyone knows of a better source video fire away

Megasabin
Sep 9, 2003

I get half!!
I'm thinking about picking up another GTX 970 to SLI with for The Witcher 3; however, the original card I have is one of those aftermarket overclocked ones (Zotac GeForce GTX 970 AMP Extreme Edition 4 GB GDDR5 PCI). Do I have to get the exact same card, or will a stock GTX 970 be compatible in SLI with my card?

repiv
Aug 13, 2009

Any 970 will work, but if the factory overclocks are different they will both run at the speed of the slower card by default.

repiv
Aug 13, 2009

More specifics on how DX12 multi-adapter works.

tl;dr there are three modes developers can use:
Implicit: Game sees one GPU, driver handles all load splitting as with SLi/CFX. Still requires driver profiles for good performance.
Explicit Linked: Game sees a single pseudo-GPU with multiple command queues and memory pools; each card can seamlessly access the other's RAM over the SLi/CF bridge or XDMA.
Explicit Unlinked: Game sees completely independent GPUs, allows mismatched combinations like discrete+IGP or AMD+nVidia. Cross-GPU memory copies have to make a slow round-trip via the CPU.

So don't go mixing AMD and nVidia cards in your machine just yet, games that actually support that won't be the norm :v:

repiv fucked around with this message at 15:06 on May 5, 2015
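To make the linked/unlinked distinction above concrete, here is a hypothetical C++/D3D12 sketch (my illustration, not from repiv's post): a linked adapter group appears as one device reporting more than one node, and work is steered to a specific physical GPU through a node mask, whereas unlinked GPUs are simply separate adapters, each needing its own device as in the earlier enumeration sketch.

// Hedged sketch of the "Explicit Linked" case: one pseudo-GPU, several nodes.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

bool HasLinkedGpus(ID3D12Device* device)
{
    // More than one node on a single device means the driver has exposed
    // the SLI/CF link to the API (Explicit Linked mode).
    return device->GetNodeCount() > 1;
}

ComPtr<ID3D12CommandQueue> CreateQueueOnNode(ID3D12Device* device, UINT nodeIndex)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
    desc.NodeMask = 1u << nodeIndex;  // bit N selects physical GPU N in the link
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}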

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
I gather the norm is gonna be a combination of intel HD and AMD or nvidia?

repiv
Aug 13, 2009

If they use Explicit Unlinked at all, it will be IGP+GPU, yes.

Nobody is going to buy AMD+NV setups until games can use it, and no developer is going to pay the performance penalty over Explicit Linked unless many users have mixed-vendor setups. Never going to happen IMO.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Especially with stuff like NVLink favoring same-vendor multi-GPU setups.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
The whole aspect of this mixed GPU stuff under DX12 doesn't make a whole lot of sense when nvidia are the dominant graphics vendor though, they're actively hostile to competition. I kinda feel like nvidia would put poo poo in their drivers to handicap or outright disable AMD cards under those multi adapter conditions. It just seems like something they'd do.

Wiggly Wayne DDS
Sep 11, 2010



cat doter posted:

The whole aspect of this mixed GPU stuff under DX12 doesn't make a whole lot of sense when nvidia are the dominant graphics vendor though, they're actively hostile to competition. I kinda feel like nvidia would put poo poo in their drivers to handicap or outright disable AMD cards under those multi adapter conditions. It just seems like something they'd do.
I wouldn't be surprised at this happening incidentally - where they put in nvidia-specific shader optimisations that are faster in general but hurt this scenario.

Gyrotica
Nov 26, 2012

Grafted to machines your builders did not understand.
Given that pretty much everyone has iGPUs, though, if nVIDIA doesn't make their cards play nice, AMD could always implement it as an incentive to go red over green.

repiv
Aug 13, 2009

Microsoft can just tell them to cut it out for WHQL certification if they try any shenanigans.

Trying to get decent frame pacing with mis-matched discrete GPUs is problematic enough that nVidia shouldn't need to deliberately hinder it though :confuoot:

repiv fucked around with this message at 18:54 on May 5, 2015

Glass of Milk
Dec 22, 2004
to forgive is divine
Darn, a new Nvidia bundle and I just got mine a couple weeks ago. Ah well, I already have more games than I have time for anyways.

penus penus penus
Nov 9, 2014

by piss__donald
Wow, if those were two games you were going to buy anyways then drat that's a deal

Truga
May 4, 2014
Lipstick Apathy

repiv posted:

Explicit Linked: Game sees a single pseudo-GPU with multiple command queues and memory pools; each card can seamlessly access the other's RAM over the SLi/CF bridge or XDMA.

That's what I meant the other day. I guess it's confirmed now.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

cat doter posted:

The whole aspect of this mixed GPU stuff under DX12 doesn't make a whole lot of sense when nvidia are the dominant graphics vendor though, they're actively hostile to competition. I kinda feel like nvidia would put poo poo in their drivers to handicap or outright disable AMD cards under those multi adapter conditions. It just seems like something they'd do.

Making laptop hybrid stuff work will be a lot easier in this model, I think. It's a fragile nightmare of shimming and trickery right now, and even if the content isn't programmed against it, things should be better with the improved driver model. Having the IGP do an AA pass once it's been given the frame for scan-out could be a nice win.

(I wonder if you could fake gsync that way with the IGP resubmitting as needed, hmm.)

HERAK
Dec 1, 2004

Glass of Milk posted:

Darn, a new Nvidia bundle and I just got mine a couple weeks ago. Ah well, I already have more games than I have time for anyways.

The new bundle is making me seriously consider a second 970 for SLI; I need to crunch the numbers vs stepping up to a 980 or possibly a 980 Ti. There have been hints about something dropping this week from some press.

pr0zac
Jan 18, 2004

~*lukecagefan69*~


Pillbug

Glass of Milk posted:

Darn, a new Nvidia bundle and I just got mine a couple weeks ago. Ah well, I already have more games than I have time for anyways.

God dammit I literally bought my 970 on Sunday.

Nm, Newegg was nice enough to hook me up.

pr0zac fucked around with this message at 01:40 on May 6, 2015

somethingawful bf
Jun 17, 2005
That bundle is really making me consider getting a 970. I'm looking at this one http://www.amazon.com/MSI-GTX-970-GAMING-100ME/dp/B00TPLKR7Q/ref=cm_wl_huc_item is that a decent MSI one? It sort of sucks because I was planning on holding out until a consumer VR headset is out and weighing my options then, but I was planning on getting these two games anyway. I have a 2GB 770 right now, so I'm assuming the difference won't be huge but it will probably help a bit in Elite Dangerous in VR at least, right?

Party Plane Jones
Jul 1, 2007

by Reene
Fun Shoe

Poetic Justice posted:

That bundle is really making me consider getting a 970. I'm looking at this one http://www.amazon.com/MSI-GTX-970-GAMING-100ME/dp/B00TPLKR7Q/ref=cm_wl_huc_item is that a decent MSI one? It sort of sucks because I was planning on holding out until a consumer VR headset is out and weighing my options then, but I was planning on getting these two games anyway. I have a 2GB 770 right now, so I'm assuming the difference won't be huge but it will probably help a bit in Elite Dangerous in VR at least, right?


From what I recall you want the red 970 (the 4G), as all you get with this one is a backplate, which is going to vanish if you ever need to RMA the card since it's a limited edition. Plus the 4G is cheaper by about 20 bucks.

somethingawful bf
Jun 17, 2005

Party Plane Jones posted:

From what I recall you want the red 970 (the 4G), as all you get with this one is a backplate, which is going to vanish if you ever RMA the card since it's a limited edition.

I thought someone mentioned the red was the older first run, and the other versions might be newer (I don't even know if that really matters)? Plus I don't mind spending the extra $15-$20 for the backplate either, really.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
The backplate on my MSI 290 is actually pretty nice. Gives it more stability since it's a huge card, and it has some thermal tape and pads attached for backside cooling. Probably not really necessary, but after having cards sag over the years it's nice to have at least, and it looks a lot better than the 280X that lacked a backplate.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Poetic Justice posted:

That bundle is really making me consider getting a 970. I'm looking at this one http://www.amazon.com/MSI-GTX-970-GAMING-100ME/dp/B00TPLKR7Q/ref=cm_wl_huc_item is that a decent MSI one? It sort of sucks because I was planning on holding out until a consumer VR headset is out and weighing my options then, but I was planning on getting these two games anyway. I have a 2GB 770 right now, so I'm assuming the difference won't be huge but it will probably help a bit in Elite Dangerous in VR at least, right?

A 970 will be 20%-25% faster than the 770 at 1080p according to the benchmarks I've looked at; I'd expect that gap to widen over time due to increasing VRAM requirements.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Poetic Justice posted:

That bundle is really making me consider getting a 970. I'm looking at this one http://www.amazon.com/MSI-GTX-970-GAMING-100ME/dp/B00TPLKR7Q/ref=cm_wl_huc_item is that a decent MSI one? It sort of sucks because I was planning on holding out until a consumer VR headset is out and weighing my options then, but I was planning on getting these two games anyway. I have a 2GB 770 right now, so I'm assuming the difference won't be huge but it will probably help a bit in Elite Dangerous in VR at least, right?

I saw it and was about to pull the trigger when I realized my mobo supports crossfire, not sli.

I was stupid and a "team red" buyer when I built it. I was positive I would only buy AMD cards forever.

Do current motherboards support only one or the other, or do most of the decent mobos support both?

It's this
http://www.newegg.com/Product/Product.aspx?Item=N82E16813131770

I really wish Witcher 3 would hurry up and come out so I can see where my system stands. I'm just looking for an excuse to upgrade.

Fauxtool fucked around with this message at 07:48 on May 6, 2015

BurritoJustice
Oct 9, 2012

Neither crossfire nor SLI needs any form of licensing for motherboards. The only difference is that crossfire will run with the two cards at any PCI-E link speed, whereas SLI is limited to 8x or 16x links. As modern Intel CPUs have 16 PCIE lanes from the CPU, for a motherboard to support SLI it needs the additional routing (more expensive) in the motherboard to run the slots at 8x/8x from the CPU.

Ironically, this actually means that motherboards that only support crossfire are generally poor choices for crossfire, as they have the first slot at PCIE 3.0 16x and the second slot at PCIE 2.0 4x (an eighth of the bandwidth) from the chipset. As the second slot is through the chipset, it shares the DMI link from the chipset to the CPU with the storage as well, so it can cause weird bottlenecks.
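Rough arithmetic behind the "an eighth of the bandwidth" figure, assuming the nominal per-lane, per-direction rates (about 500 MB/s for PCIe 2.0 and 985 MB/s for PCIe 3.0) and ignoring protocol overhead:

\[
\frac{16 \times 985\ \text{MB/s}}{4 \times 500\ \text{MB/s}} \approx \frac{15.8\ \text{GB/s}}{2.0\ \text{GB/s}} \approx 8
\]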

Kazinsal
Dec 13, 2011
And if you *really* need the bandwidth, X99 can generally deliver quad 8x links.

You may be in one or both of the "I have too much money and not enough sense" or "I have no financial planning and should not live on my own" categories if you are building something where you *need* quad PCIe 3.0 8x, though. That way leads madness (and a ridiculous power bill).

sauer kraut
Oct 2, 2004
drat that's a nice 970 bundle, well played nVidia.

orcane
Jun 13, 2012

Fun Shoe
Of course I bought my computer a) just before they announced the first bundle (let alone this current one) and b) from the wrong retailer (since only one in Switzerland is part of the program), uncool guys :argh:

sauer kraut
Oct 2, 2004
In case you missed it, the Oculus Rift was announced to launch no earlier than Q1 2016 so please don't buy any hardware to 'prepare' for it. No specs or reqs afaik.

my kinda ape
Sep 15, 2008

Everything's gonna be A-OK
Oven Wrangler

Poetic Justice posted:

That bundle is really making me consider getting a 970. I'm looking at this one http://www.amazon.com/MSI-GTX-970-GAMING-100ME/dp/B00TPLKR7Q/ref=cm_wl_huc_item is that a decent MSI one? It sort of sucks because I was planning on holding out until a consumer Vr headset is out and weigh my options then, but I was planning on getting these two games anyway. I have a 2gb 770 right now, so I'm assuming the difference won't be huge but it will probably help a bit in elite dangerous in Vr at least, right?

I just picked that one up a week ago when it was actually $5 cheaper than the red "gaming" edition version on Newegg after the rebate. FWIW the backplate is really nice looking. Hell of an upgrade from the old 6870 I had been using!

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

sauer kraut posted:

In case you missed it, the Oculus Rift was announced to launch no earlier than Q1 2016 so please don't buy any hardware to 'prepare' for it. No specs or reqs afaik.

If I was a betting man I'd say it'll be 1440p (likely a modified Samsung display) at 90Hz, so my hardware recommendation is "lol".

Sorbus
Apr 1, 2010
My GTX980 arrived today, but I can't install it yet as I don't have a 6 pin -> 2 x 8 pin adapter :cripes:

e: no wait I had one, new gpu installed!

Sorbus fucked around with this message at 17:48 on May 6, 2015

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
I remember when the Oculus first looked like it was coming out nearly 2 years ago; they were talking up how a GTX 770 was the card you should be aiming for to get a quality gaming experience on it.
