Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Paul MaudDib posted:

People love to freak out about how 1080p should be enough for any honest man, but the topic of "how much resolution is too much" isn't a novel one to science; it's a basic question that gets asked any time anyone does signal transforms (e.g. A2D or D2A conversion). It makes sense to increase display resolution to at least the Nyquist rate (twice what you can actually resolve, with the units here being PPI); otherwise, just like with video, you get aliasing. Potentially go further still, and of course it also makes sense to do SSAA beyond your display resolution. That works out to (I think) at least 8K on your basic 27-32" desktop monitor.

By the way, the same is also true of refresh rate. If the human eye can actually resolve 80 Hz or whatever, an ideal panel would at least double it. We're doing a lot better in that area recently - 144 Hz panels are nothing to complain about, let alone 165 Hz.

Fuzzy Mammal
Aug 15, 2001

Lipstick Apathy
Wouldn't you want to measure in angles of arc rather than pixels per inch in order to cancel out any effect of distance from monitor?
I also feel you could up anti-aliasing many times more than currently to reduce shimmer and pixel crawl.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Fuzzy Mammal posted:

Wouldn't you want to measure in angles of arc rather than pixels per inch in order to cancel out any effect of distance from monitor?
I also feel you could up anti-aliasing many times more than currently to reduce shimmer and pixel crawl.

That's the ideal, but it involves the end user in the measurement. If I sit 14" away from a 27" panel (which I am right now), that's going to result in a different pixels-per-arcsecond figure than for someone sitting 20" away from the same panel.

So basically what we get is PPI, because that's trivial to calculate from the panel size+resolution. Then, we say that the PPI for a desktop monitor that you're sitting 14" away from should be X, the PPI for a phone you're holding 8 inches away from your face should be Y, and the PPI for a TV you're sitting 10 feet away from should be Z. In other words "what you can actually resolve" is the factor that changes with the end user.
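
Here's a rough back-of-the-envelope sketch of that math - the ~60 pixels-per-degree acuity figure and the 16:9 panel assumption are mine, not anything official:

```python
import math

# Assume ~60 pixels per degree is roughly what the eye resolves (a common
# rule-of-thumb figure), then double it per the Nyquist argument above.
ACUITY_PPD = 60
NYQUIST_PPD = 2 * ACUITY_PPD

def required_ppi(viewing_distance_in, ppd=NYQUIST_PPD):
    """PPI needed to hit a pixels-per-degree target at a given viewing distance."""
    # One degree of visual angle spans 2 * d * tan(0.5 deg) inches on the panel.
    inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
    return ppd / inches_per_degree

def panel_resolution(diag_in, ppi, aspect=(16, 9)):
    """Horizontal x vertical pixel count for a given diagonal size and PPI."""
    w, h = aspect
    diag = math.hypot(w, h)
    return round(diag_in * w / diag * ppi), round(diag_in * h / diag * ppi)

for dist in (14, 20):
    ppi = required_ppi(dist)
    print(f'{dist}" away: ~{ppi:.0f} PPI, 27" panel = {panel_resolution(27, ppi)}')
# 14" away: ~490 PPI -> roughly 11.6K x 6.5K on a 27" panel
# 20" away: ~344 PPI -> roughly 8.1K x 4.6K on a 27" panel
```

So at a more typical 20" viewing distance the Nyquist-doubled target for a 27" panel comes out right around 8K, which lines up with the estimate above; sit closer and the number climbs quickly.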

Yeah supersampling does own if you've got the hardware. Pretty much unambiguously a Good Thing That Helps Stop Aliasing. Ideally it would be super great to run 4-8x SSAA at 165hz across a 3-5 panel Eyefinity setup at ~27" each. We're really not talking fully diminishing returns up to that point.

Paul MaudDib fucked around with this message at 03:39 on Nov 4, 2015

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
About MGPU capability in DX12 - could Intel drop a consumer-grade Xeon Phi-like card to crunch compute in tandem with an AMD or Nvidia GPU? Would such a thing be worth it?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

About MGPU capability in DX12 - could Intel drop a consumer-grade Xeon Phi-like card to crunch compute in tandem with an AMD or Nvidia GPU? Would such a thing be worth it?

It's really difficult to say because the Phi is such a wildcard in terms of architecture - it's like if you built a GPU with 244 independent processors that could each run 16 SIMD ops instead of 4 banks of 32 processors that could each run 32 SIMD slots. In theory it's a much better general-purpose processor but somewhat worse than a balls-out SIMD processor (assuming the SIMD processor is working on an ideal workload). They aren't common at all in consumer gear so nobody programs for them.
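
For a quick sanity check, taking the numbers above at face value (they're a loose sketch of a Knights-Corner-era Phi versus a made-up GPU layout, not exact specs for any shipping part):

```python
# Loose Phi-vs-GPU lane arithmetic using the figures from the post above.
phi_lanes = 244 * 16       # independent threads x SIMD width
gpu_lanes = 4 * 32 * 32    # banks x processors x SIMD slots
print(phi_lanes, gpu_lanes)  # 3904 vs 4096
```

The raw width ends up in the same ballpark; the real difference is how flexibly those lanes can be scheduled, which is exactly the general-purpose-versus-balls-out-SIMD trade-off.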

I think the argument probably goes along much the same lines as pairing an NVIDIA GPU for graphics and an AMD GPU for async compute. In theory the Phi should be loving awesome for compute stuff like PhysX. But it'll probably underperform versus hardware that's optimized for graphics pipelines rather than general compute. At the end of the day the custom-tailored balls-out special-purpose parallel processor wins over the general-purpose independent cores on an equivalent amount of silicon.

Will anyone program for the 1 in 100k hardware configuration? Will NVIDIA let PhysX run on that configuration? (no)

Paul MaudDib fucked around with this message at 02:50 on Nov 4, 2015

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

About MGPU capability in DX12 - could Intel drop a consumer-grade Xeon Phi-like card to crunch compute in tandem with an AMD or Nvidia GPU? Would such a thing be worth it?

Seriously though, if they dropped a mini Phi with AVX512 into their iGPU they would suddenly crack the chicken-and-egg problem just via raw market share. I have no idea if the Skylake gen and beyond have those capabilities built in - to my knowledge they don't, but it's really an intriguing idea.

They would never do it because it would cut into their sales of Xeons to the supercomputer crowd, but still.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Paul MaudDib posted:

Seriously though, if they dropped a mini Phi with AVX512 into their iGPU they would suddenly crack the chicken-and-egg problem just via raw market share. I have no idea if the Skylake gen and beyond have those capabilities built in - to my knowledge they don't, but it's really an intriguing idea.

They would never do it because it would cut into their sales of Xeons to the supercomputer crowd, but still.

And if AMD were to do so with Zen and Greenland? If Zen is competitive enough, I'm sure Intel would drop a refresh just to prevent such a thing from happening. In fact, if the rumor of AVX512 for Zen+ is true, then I don't see why Zen+ couldn't have such a capability.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

Paul MaudDib posted:

By the way, the same is also true of refresh rate. If the human eye can actually resolve 80 Hz or whatever, an ideal panel would at least double it. We're doing a lot better in that area recently - 144 Hz panels are nothing to complain about, let alone 165 Hz.

The really interesting thing about this is that if a game is actually running at 165 Hz, the need for anti-aliasing becomes significantly reduced because of the way persistence of vision works. Some games have tried to fake this effect (like Project CARS on PS4) but it creates horrible ghosting; seeing it for realsies is some loving awesome poo poo.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

And if AMD were to do so with Zen and Greenland? If Zen is competitive enough, I'm sure Intel would drop a refresh just to prevent such a thing from happening. In fact, if the rumor of AVX512 for Zen+ is true, then I don't see why Zen+ couldn't have such a capability.

An AMD-based Phi would definitely be a thing.

I don't know how plausible it is - you could feasibly reach that point by jumping off from a low-power Jaguar-type core (e.g. Kabini) but redesigning it as an HBM-based GPU that you stick into an APU. It's not 100% out of the realm of possibility, but I don't know that Zen+ is specifically architected like that. Keller-senpai could do it. He's my hero :3:

Can you even drop a fraction of a GPU's worth of bandwidth and power into an iGPU? No idea - probably not with DDR3/4. Maybe with HBM. But if anyone can, then it's Keller.

Just running AVX512 on Zen+ isn't enough though. The draw is specifically the "general-purpose highly parallel processor" - if Zen+ isn't running ~256 cores like Knights Landing, it's not going to compete in that space specifically as a GPU-assist add-on card. 4 cores with 16x SIMD is nice and might draw AVX512 to Intel's CPU space, but it's not going to coerce Intel into dropping an iGPU Phi onto the consumer market.

Anyway, just bear in mind that you're basically asking for a totally different architecture from GCN. Yes, it could be done, and Intel is doing it right now to try and break into the GPGPU-compute market, but so far I've seen no indication that AMD is abandoning GCN or that Intel is moving the tech down to the consumer space.

Paul MaudDib fucked around with this message at 03:31 on Nov 4, 2015

Kazinsal
Dec 13, 2011


Panty Saluter posted:

If you don't mind spending some money and disassembling your card, Arctic Cooling makes some neato coolers that work really well. I had a 4850 with an awful tiny cooler that was mediocre when clean but loved to trap dust, leading to super high temperatures. A fully passive Accelero S1 kept my load temps lower than stock cooler idle temps (the card would idle 70-75C with the stock cooler).

Also make sure you have enough space in your case :v:

I kind of want to slam an Accelero Xtreme IV onto my 290. I mean, yeah, it would make the whole thing about four slots tall, but holy poo poo, it can drop a 290X from 95 C to 56 C. That's impressive.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

cat doter posted:

The really interesting thing about this is that if a game is actually running at 165 Hz, the need for anti-aliasing becomes significantly reduced because of the way persistence of vision works. Some games have tried to fake this effect (like Project CARS on PS4) but it creates horrible ghosting; seeing it for realsies is some loving awesome poo poo.

That's interesting, but I'm going to need an 8K 165 Hz screen to compare as a reference. Also don't forget to drive it at full speed. Maybe some SSAA. And G-sync please. :colbert:

(Nah that's interesting period, got a link to an article on the topic?)

Paul MaudDib fucked around with this message at 03:39 on Nov 4, 2015

SwissArmyDruid
Feb 14, 2014

by sebmojo

Paul MaudDib posted:

That's interesting, but I'm going to need an 8K 165 Hz screen to compare as a reference. Also don't forget to drive it at full speed. Maybe some SSAA. And G-sync please. :colbert:

(Nah that's interesting period, got a link to an article on the topic?)

G-sync is dead, remember? Intel has gone Freesync. :v:

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Paul MaudDib posted:

People love to freak out about how 1080p should be enough for any honest man, but the topic of "how much resolution is too much" isn't a novel one to science; it's a basic question that gets asked any time anyone does signal transforms (e.g. A2D or D2A conversion). It makes sense to increase display resolution to at least the Nyquist rate (twice what you can actually resolve, with the units here being PPI); otherwise, just like with video, you get aliasing. Potentially go further still, and of course it also makes sense to do SSAA beyond your display resolution. That works out to (I think) at least 8K on your basic 27-32" desktop monitor.

I definitely agree. I'm mainly citing subpixel aliasing (and the shimmer it causes) as something that jumps out and immediately reminds you you're looking at a screen - probably the most immediately obvious artifact - and the lower limit for a Truly Sufficient Resolution™ is going to be at least what it takes to properly deal with it.

FaustianQ posted:

40" might be a bit too big for being only 2ft away.

It's roughly a taller 3440x1440, and those screens are cool as gently caress; I'd be interested.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Twerk from Home posted:

We are there already, look at phones, tablets, and the Macbook.

A lot of high-end phones throttle, and the MacBook heavily throttles.

HalloKitty fucked around with this message at 07:46 on Nov 4, 2015

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

Paul MaudDib posted:

That's interesting, but I'm going to need an 8K 165 Hz screen to compare as a reference. Also don't forget to drive it at full speed. Maybe some SSAA. And G-sync please. :colbert:

(Nah that's interesting period, got a link to an article on the topic?)

Unfortunately I can't remember specifically where I read about that, so you'll have to take what I say with a grain of salt, but supposedly the idea is that frames past our perceptible limit of around 83 Hz blend together into a kind of in-eye AA. It's a pretty easy thing to test yourself if your monitor's refresh rate is high enough: aliasing on scenes in motion is just way easier to see the lower the framerate is.

ballsman
Jun 16, 2003

a fuckin bomb
Grimey Drawer
Looking for some 980ti advice. I've gone through 3 Gigabyte G1's with hideous coil whine, finally managed to get a refund from newegg and wash my hands of the thing. Is the reference EVGA a safe bet? The cooler doesn't really matter, plan is to put it on water at some point.

VelociBacon
Dec 8, 2009

ballsman posted:

Looking for some 980ti advice. I've gone through 3 Gigabyte G1's with hideous coil whine, finally managed to get a refund from newegg and wash my hands of the thing. Is the reference EVGA a safe bet? The cooler doesn't really matter, plan is to put it on water at some point.

Where is your case? I honestly can't hear any coil whine from my case on the floor with that GPU even when I'm trying.

ballsman
Jun 16, 2003

a fuckin bomb
Grimey Drawer

VelociBacon posted:

Where is your case? I honestly can't hear any coil whine from my case on the floor with that GPU even when I'm trying.

On my desk, so pretty close. I've had some noisy video cards before (previous 970 G1 for example) but nothing like these. I can hear the pitch go up and down as scenery changes in a game, it's incredible :v:

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

ballsman posted:

Looking for some 980ti advice. I've gone through 3 Gigabyte G1's with hideous coil whine, finally managed to get a refund from newegg and wash my hands of the thing. Is the reference EVGA a safe bet? The cooler doesn't really matter, plan is to put it on water at some point.

The EVGA one should be fine since you are going to toss a loop on it, but I've heard reports of coil whine on just about every card, so it looks like you were just unlucky with the Gigabyte ones.

japtor
Oct 28, 2005

cat doter posted:

Unfortunately I can't remember specifically where I read about that, so you'll have to take what I say with a grain of salt, but supposedly the idea is that frames past our perceptible limit of around 83 Hz blend together into a kind of in-eye AA. It's a pretty easy thing to test yourself if your monitor's refresh rate is high enough: aliasing on scenes in motion is just way easier to see the lower the framerate is.

Sounds reasonable enough, although I imagine it'd need enough motion and/or resolution still. Like for horizon lines or other little elements that might be relatively stable at times, jaggies can still stand out - but I guess low motion is kind of effectively like a lower framerate in that scenario.

I haven't played Project CARS, but is it basically trying to do motion blur... and doing a crappy job of it? I remember some early driving game on the 360 doing motion blur at 30fps and it looked nice (at least at the time), but it was still a noticeable difference just running at 60fps. If done well at 60 it should just look smoother, but if the framerate is inconsistent and the blur isn't done well I could imagine it going awry quickly.

In any case, now I really want a higher-refresh monitor next time I get one. Unfortunately I also want a high-res superwide (6880x2880) that doesn't exist yet... can anything even run that resolution yet with a single cable, DP 1.3 or TB3?
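
For what it's worth, some napkin math (my own numbers, ignoring blanking overhead) suggests even DP 1.3 falls just short of that resolution at 60 Hz uncompressed:

```python
# Hypothetical 6880x2880 superwide at 60 Hz, 8-bit color, vs DP 1.3 HBR3.
width, height, hz, bpp = 6880, 2880, 60, 24
payload_gbps = width * height * hz * bpp / 1e9
dp13_effective_gbps = 4 * 8.1 * 8 / 10   # 4 lanes of HBR3 after 8b/10b coding
print(f"{payload_gbps:.1f} Gbps needed vs {dp13_effective_gbps:.1f} Gbps available")
# ~28.5 vs ~25.9 -- so no 60 Hz at that resolution over a single DP 1.3 cable
# without compression or chroma subsampling, never mind a higher refresh rate.
```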

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

japtor posted:

Sounds reasonable enough, although I imagine it'd need enough motion and/or resolution still. Like for horizon lines or other little elements that might be relatively stable at times, jaggies can still stand out, but I guess low motion is kind of effectively like a lower framerate in that scenario.

I haven't played Project CARS, but is it basically trying to do motion blur... and doing a crappy job of it? I remember some early driving game on the 360 doing motion blur at 30fps and it looked nice (at least at the time), but it was still a noticeable difference just running at 60fps. If done well at 60 it should just look smoother, but if the framerate is inconsistent and the blur isn't done well I could imagine it going awry quickly.

In any case, now I really want a higher-refresh monitor next time I get one. Unfortunately I also want a high-res superwide (6880x2880) that doesn't exist yet... can anything even run that resolution yet with a single cable, DP 1.3 or TB3?

The ghosting in Project CARS is their custom temporal AA as far as I know, the principle being that the more motion there is, the less perceptible aliasing is - but in practice it just kinda looks like poo poo.
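
For anyone wondering what temporal AA means mechanically, here's a minimal generic sketch of the history-buffer idea (not Project CARS' actual implementation): each new frame gets blended into an accumulation of previous frames, which smooths edges but smears anything that moved if the history isn't reprojected - hence the ghosting.

```python
import numpy as np

def temporal_accumulate(current, history, alpha=0.1):
    """Blend the current frame into an accumulated history buffer.

    Lower alpha = more frames averaged together = smoother edges,
    but also more smearing on moving objects when the history isn't
    motion-compensated.
    """
    if history is None:
        return current.astype(np.float32)
    return alpha * current.astype(np.float32) + (1.0 - alpha) * history
```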

SlayVus
Jul 10, 2009
Grimey Drawer

cat doter posted:

The ghosting in Project CARS is their custom temporal AA as far as I know, the principle being that the more motion there is, the less perceptible aliasing is - but in practice it just kinda looks like poo poo.

Temporal AA is just bad overall. It's been a feature in Unreal Engine since 4.4, and I feel like most games are going to use it, but it just looks bad: it makes lines fuzzy, it blurs motion... just bad things all around.

Odette
Mar 19, 2011

ballsman posted:

Looking for some 980ti advice. I've gone through 3 Gigabyte G1's with hideous coil whine, finally managed to get a refund from newegg and wash my hands of the thing. Is the reference EVGA a safe bet? The cooler doesn't really matter, plan is to put it on water at some point.

Are you sure it's not your PSU that's causing the whine?

Fathis Munk
Feb 23, 2013

??? ?
Hey guys, maybe you can help me out - I'm planning to swap out my old GTX 580 for a 970 :toot:

Now, since my PSU recently died, I had to fall back on an older 550W model. It runs my current rig with the 580 fine as far as I can tell - no shutdowns or anything. All other parts being equal, will I need higher wattage if I swap the GPU for a 970?

SlayVus
Jul 10, 2009
Grimey Drawer

Odette posted:

Are you sure it's not your PSU that's causing the whine?

My Galax 980 Ti HOF has coil whine during the Windows Experience Index test and in extremely high frame rate games.

Fathis Munk posted:

Hey guys, maybe you can help me out - I'm planning to swap out my old GTX 580 for a 970 :toot:

Now, since my PSU recently died, I had to fall back on an older 550W model. It runs my current rig with the 580 fine as far as I can tell - no shutdowns or anything. All other parts being equal, will I need higher wattage if I swap the GPU for a 970?

The 970 should use less power:

GTX 580: 244W TDP
GTX 970: 145W TDP
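
A rough headroom sketch for the 550W question above - the CPU and rest-of-system numbers are placeholders, swap in your own parts:

```python
# Back-of-the-envelope PSU headroom: the 970 swap lowers the worst case by ~100 W.
psu_watts = 550
gpu_tdp = {"GTX 580": 244, "GTX 970": 145}
cpu_tdp = 95            # assumed quad-core desktop CPU (hypothetical)
rest_of_system = 50     # board, RAM, drives, fans (rough guess)
for gpu, tdp in gpu_tdp.items():
    peak = tdp + cpu_tdp + rest_of_system
    print(f"{gpu}: ~{peak} W peak draw vs {psu_watts} W PSU")
```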

SlayVus fucked around with this message at 10:31 on Nov 4, 2015

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

Fathis Munk posted:

Hey guys, maybe you can help me out - I'm planning to swap out my old GTX 580 for a 970 :toot:

Now, since my PSU recently died, I had to fall back on an older 550W model. It runs my current rig with the 580 fine as far as I can tell - no shutdowns or anything. All other parts being equal, will I need higher wattage if I swap the GPU for a 970?

You'll need less, not more - the 970 is quite power efficient, though it gets less so the more you overclock it.

Verizian
Dec 18, 2004
The spiky one.

Zero VGS posted:

Does this mean I can use a $50 Radeon GPU or APU to drive a FreeSync monitor with a 980ti to actually crunch the shiny?

It's already been answered, but yes - you could even use a future Intel iGPU, as they have plans to support FreeSync, and as of Skylake the iGPU is fully DX12 compatible. Not sure if Skylake itself supports FreeSync though?

Going back to the Stardock video, there have been a few inaccuracies pointed out already, but the Stardock guys in it seemed to be claiming they hadn't specifically optimised for the feature when they said they were surprised it worked as well as it did, jumping from ~30fps to just below 60fps. Does that mean a DX12 title will be able to gain some benefit just by using the new API, but with actual optimisation there could be even bigger gains than practically doubling your FPS?

Fathis Munk
Feb 23, 2013

??? ?
Yesss perfect, thank you very much!

BurritoJustice
Oct 9, 2012

Verizian posted:

It's already been answered, but yes - you could even use a future Intel iGPU, as they have plans to support FreeSync, and as of Skylake the iGPU is fully DX12 compatible. Not sure if Skylake itself supports FreeSync though?

Going back to the Stardock video, there have been a few inaccuracies pointed out already, but the Stardock guys in it seemed to be claiming they hadn't specifically optimised for the feature when they said they were surprised it worked as well as it did, jumping from ~30fps to just below 60fps. Does that mean a DX12 title will be able to gain some benefit just by using the new API, but with actual optimisation there could be even bigger gains than practically doubling your FPS?

You absolutely don't want to use an iGPU alongside a discrete card with DX12, and the same goes for cards of super mismatched performance. Given that it uses alternate frame rendering, you'd have massive, unplayable frame time spikes. The multi-adapter thing is only worth it if you have two graphics cards that are very similar in performance, hence all the benches with Fury Xs and stock 980 Tis.

I remember when the first DX12 slides came out about using an iGPU in a form of split frame rendering with a Titan X. While the total FPS went up and the average frame time went down, the frame latency increased massively, as every frame had to be passed to the iGPU and then be hosed around with.
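
A toy illustration of the frame-pacing problem with badly mismatched GPUs under alternate frame rendering (made-up per-frame times, and it ignores pipelining, but it shows how the average FPS hides the stutter):

```python
# Frames alternate between a fast dGPU and a slow iGPU (hypothetical numbers).
fast_ms, slow_ms = 8.0, 40.0
frame_times = [fast_ms if i % 2 == 0 else slow_ms for i in range(10)]
avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
print("frame times (ms):", frame_times)
print(f"average: {avg_fps:.0f} FPS, but every other frame stalls for {slow_ms:.0f} ms")
```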

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

BurritoJustice posted:

You absolutely don't want to use an iGPU alongside a discrete card with DX12, and the same goes for cards of super mismatched performance. Given that it uses alternate frame rendering, you'd have massive, unplayable frame time spikes. The multi-adapter thing is only worth it if you have two graphics cards that are very similar in performance, hence all the benches with Fury Xs and stock 980 Tis.

I remember when the first DX12 slides came out about using an iGPU in a form of split frame rendering with a Titan X. While the total FPS went up and the average frame time went down, the frame latency increased massively, as every frame had to be passed to the iGPU and then be hosed around with.

Wouldn't HBM for the iGPU overcome this deficiency? This seems like a pretty logical step to making it workable.

BurritoJustice
Oct 9, 2012

FaustianQ posted:

Wouldn't HBM for the iGPU overcome this deficiency? This seems like a pretty logical step to making it workable.

With alternate frame rendering it will never work with cards of mismatched performance; that can't be fixed. With SFR it would have to be coded well and specifically. Who knows what the future holds, but I doubt that iGPU + fast dGPU will ever be used outside of the ultra-budget area (say an R7 250X and an A10-7870K in hybrid CrossFire). AMD themselves have hybrid CrossFire disabled with anything above a 250X, as the performance delta becomes too great.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
It just started getting cold (for us at least) in southern California. My 980ti is running so cool and silent, I love it.
Amazing what a huge difference not having a 90 F room can make.

Starkk
Dec 31, 2008


Fauxtool posted:

It just started getting cold (for us at least) in southern California. My 980ti is running so cool and silent, I love it.
Amazing what a huge difference not having a 90 F room can make.

Srsly my 980ti isn't warming my room anymore, I had to put pants on today for the first time in months

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

Wouldn't HBM for the iGPU overcome this deficiency? This seems like a pretty logical step to making it workable.

No. Even once the memory is the same, you're still slaving something with maybe an eighth (or worse) of the performance to a full-sized card.

And as we all well know, nothing in multi-GPU setups scales linearly 1:1, and everything always scales down to the lowest common denominator. So sure, if you'd like to CrossFire a pair of R3 320s...

ballsman
Jun 16, 2003

a fuckin bomb
Grimey Drawer

Odette posted:

Are you sure it's not your PSU that's causing the whine?

Could be, I don't have a second PSU to test it out with. It's a brand new EVGA P2 though so it's not like I bought a lovely one. I'm gonna give the EVGA card a shot, at least I'll be able to deal with their RMA people instead of newegg.

Anime Schoolgirl
Nov 28, 2002

cat doter posted:

You'll need less, not more - the 970 is quite power efficient, though it gets less so the more you overclock it.

The power climb isn't too bad either. It goes up to 225W max when overclocked to its limits (and some factory OCs run 200W), which is about the same as an R9 280X at stock.


Getting the reference 290 today. Can't wait to see how it'll beat three concurrent room fans on full blast. :shepface:

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
My 970, OC'd to almost 1500 MHz, and a 4790K OC'd to 4.6 GHz pull 280W max from the wall running at full tilt.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Anime Schoolgirl posted:

Getting the reference 290 today. Can't wait to see how it'll beat three concurrent room fans on full blast. :shepface:

Enjoy the CLC ready edition cooler.

Anime Schoolgirl
Nov 28, 2002

Default fan profile: benches at 92 C with no noise - hotter than I'd like.
Default MSI Afterburner profile: benches at 70 C with distinct noise from my PC, but not overpowering my room fans. I'd say this one is a success.

Just in time for Fallout 4. :downsgun:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

ballsman posted:

Looking for some 980ti advice. I've gone through 3 Gigabyte G1's with hideous coil whine, finally managed to get a refund from newegg and wash my hands of the thing. Is the reference EVGA a safe bet? The cooler doesn't really matter, plan is to put it on water at some point.

I have the lowest-clocked twin-fan EVGA 980ti and it doesn't have any coil whine I can detect. Every 970 I've gone through has had it, but not this one.

Verizian posted:

It's already been answered, but yes - you could even use a future Intel iGPU, as they have plans to support FreeSync, and as of Skylake the iGPU is fully DX12 compatible. Not sure if Skylake itself supports FreeSync though?

Okay, yeah, I have Skylake and I'll be super loving pumped if I can run FreeSync off the motherboard with my 980ti - that's the ideal solution. Maybe someone can hack it in.

Zero VGS fucked around with this message at 17:48 on Nov 5, 2015
