MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Wow they must really want to dump their stock fast.

CyberPingu
Sep 15, 2013


If you're not striving to improve, you'll end up going backwards.
Lol 4 per household. 90% of those will be up on eBay in a week

NJD2005
Sep 3, 2006
...

CyberPingu posted:

Lol 4 per household. 90% of those will be up on eBay in a week

I love it: they spend $200 to make $20 and have to deal with idiots on eBay.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

MaxxBot posted:

Wow they must really want to dump their stock fast.

Microcenter also had some prices on 1060s that I'd consider nearly dumping levels ($153 after tax for a 1060 3 GB). I wonder if AIB partners got a heads-up that consumer Volta wasn't too far off.

(probably just a normal sale, but man, I've never seen EVGA do a tweet like that before)

1gnoirents
Jun 28, 2014

hello :)
I mean they are 980tis, I guess I should be surprised an AIB had enough stock of those to warrant a sale at all.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

1gnoirents posted:

I mean they are 980tis, I guess I should be surprised an AIB had enough stock of those to warrant a sale at all.

I mean, they were new-production until just over a year ago, step-up from firesale 980 Tis just ended in like November, and you're going to get warranty returns and stuff for years.

Also, NVIDIA sold a pretty much unprecedented amount of 980 Tis, given that it launched at $650. It was monstrously popular since it was the first single card that would do guaranteed 60+ fps at 1440p ultra settings, it was a popular choice for VR, etc. I scored a sweet price mistake on a 1080, otherwise I'd still be using mine.

1gnoirents
Jun 28, 2014

hello :)

Paul MaudDib posted:

I mean, they were new-production until just over a year ago, step-up from firesale 980 Tis just ended in like November, and you're going to get warranty returns and stuff for years.

Also, NVIDIA sold a pretty much unprecedented amount of 980 Tis, given that it launched at $650. It was monstrously popular since it was the first single card that would do guaranteed 60+ fps at 1440p ultra settings, it was a popular choice for VR, etc. I scored a sweet price mistake on a 1080, otherwise I'd still be using mine.

Yeah, true, I guess. I suppose I've never actually followed "old" SKUs after new stuff releases, either.

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

Is this the reason for the Vega delay? How does Apple pay for product, all at once? Basically, if Apple asked for 50% or more of AMD's Vega stock and paid well for it, would it appear in the next earnings report?

Well, that's Apple and HP... where's Dell?

https://videocardz.com/newz/hp-omen-desktop-pc-to-ship-with-amd-vega-10

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Quick question:

Can you mix different brands to use SLI with the 1080 Ti?

I got an Asus 1080 Ti Founder's Edition... I found an excellent deal on an EVGA ( Model # 11G-P4-6390-KR ), almost half price. Should I go for it?

edit: deal is not online, local tiny store and they only have one, sorry, just wanted to check this before buying...

Comfy Fleece Sweater fucked around with this message at 00:11 on Jun 7, 2017

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Comfy Fleece Sweater posted:

Quick question:

Can you mix different brands to use SLI with the 1080 Ti?

I got an Asus 1080 Ti Founder's Edition... I found an excellent deal on an EVGA ( Model # 11G-P4-6390-KR ), almost half price. Should I go for it?

edit: deal is not online, local tiny store and they only have one, sorry, just wanted to check this before buying...

They should SLI just fine with each other.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Comfy Fleece Sweater posted:

Quick question:

Can you mix different brands to use SLI with the 1080 Ti?

I got an Asus 1080 Ti Founder's Edition... I found an excellent deal on an EVGA ( Model # 11G-P4-6390-KR ), almost half price. Should I go for it?

edit: deal is not online, local tiny store and they only have one, sorry, just wanted to check this before buying...

On paper and according to NVidia officially, yes.

In real life, not always. I had a GTX 970 from Gigabyte and Zotac that wouldn't SLI together, it was confirmed by others as well, and NVidia told me to go gently caress myself even though their site says two cards of the same series (i.e. 970, 1080 etc) will always SLI regardless of brand/board.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Zero VGS posted:

On paper and according to NVidia officially, yes.

In real life, not always. I had a GTX 970 from Gigabyte and Zotac that wouldn't SLI together, it was confirmed by others as well, and NVidia told me to go gently caress myself even though their site says two cards of the same series (i.e. 970, 1080 etc) will always SLI regardless of brand/board.

EVGA and Zotac both shipped models of 970 that were incompatible with other brands (sometimes incompatible within the same brand). There have been a few other times as well, I want to say back in the 700 generation and maybe one funky model of the 980 Ti too?

I remember reading that at times there have been both mechanical incompatibilities (someone put the SLI fingers in the wrong spot) and electrical ones (someone didn't implement the spec right), but I can't source that.

Comfy Fleece Sweater posted:

Quick question:

Can you mix different brands to use SLI with the 1080 Ti?

I got an Asus 1080 Ti Founder's Edition... I found an excellent deal on an EVGA ( Model # 11G-P4-6390-KR ), almost half price. Should I go for it?

edit: deal is not online, local tiny store and they only have one, sorry, just wanted to check this before buying...

If this is two FE cards, I would assume you're safe; they're a reference implementation. I don't think I've heard of any SLI issues with the 10-series, period, let alone with ref cards.

At half price, are you sure it's not a 1080 non-Ti? If it's legit, that's a hell of a deal (it would actually be a great deal even for a non-Ti)

How well does the FE cooler cope with all that TDP? What clocks/temps do you hit? I assume fan is probably cranked close to 100%, do you turn the power limit up?

Paul MaudDib fucked around with this message at 00:51 on Jun 7, 2017

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Paul MaudDib posted:

EVGA and Zotac both shipped models of 970 that were incompatible with other brands (sometimes incompatible within the same brand). There have been a few other times as well, I want to say back in the 700 generation and maybe one funky model of the 980 Ti too?

I remember reading that at times there have been both mechanical incompatibilities (someone put the SLI fingers in the wrong spot) and electrical ones (someone didn't implement the spec right), but I can't source that.


If this is two FE cards, I would assume you're safe; they're a reference implementation. I don't think I've heard of any SLI issues with the 10-series, period, let alone with ref cards.

At half price, are you sure it's not a 1080 non-Ti? If it's legit, that's a hell of a deal (it would actually be a great deal even for a non-Ti)

How well does the FE cooler cope with all that TDP? What clocks/temps do you hit? I assume fan is probably cranked close to 100%, do you turn the power limit up?

Oh, my cooling is pretty great: got a Corsair Air 540 with a bunch of fans, and I rarely see it above 70C unless I'm running benchmarks; the default fan profile rarely goes above 60% fan speed. The most I've seen is 83C, and then it throttles down.

I hope a second 1080 Ti doesn't get too hot with the two of them running together, but I used to run 2x 970s and they seemed generally less efficient.
I also got an 850W power supply, so I hope things are good.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Comfy Fleece Sweater posted:

Oh, my cooling is pretty great: got a Corsair Air 540 with a bunch of fans, and I rarely see it above 70C unless I'm running benchmarks; the default fan profile rarely goes above 60% fan speed. The most I've seen is 83C, and then it throttles down.

I hope a second 1080 Ti doesn't get too hot with the two of them running together, but I used to run 2x 970s and they seemed generally less efficient.
I also got an 850W power supply, so I hope things are good.

Yeah, that's not bad. Use GPU-Z and get some readings while you game and see what it looks like, in particular the boost clock.

The cooler you keep it, the higher it'll boost, so you may actually want to just crank the fan up. Especially since the other Pascal overclocking pro-tip is that it's power-limited and the #1 thing you can do is push the "power limit" slider all the way to the stops. Core clocking usually gets some gains, memory OC usually gets some gains, but core voltage often actually hurts due to increased heat/throttling. The best thing you can do for Pascal is just throw power at it and keep it cool. I assume the 1080 Ti probably follows.

Each card will put out ~230W average at stock, or 275W average at the 120% power limit. With 88W for CPU you're at 638W which is fine for your 850W PSU, if it's trustworthy. And you have plenty of headroom even for peak current there.
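
A quick sanity check of that budget (a minimal back-of-the-envelope sketch in Python; the wattages are the estimates above, not measured figures):

CARD_W = 275  # ~average per 1080 Ti at the 120% power limit (stock: ~230)
CPU_W = 88    # CPU estimate from above
PSU_W = 850

total = 2 * CARD_W + CPU_W
print(total)          # 638 W of the 850 W rating
print(PSU_W - total)  # 212 W of headroom for peaks, fans, and drives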

These days I would just get AIO coolers and a bracket for it, it's about $80 per card (Corsair H55 and NZXT G10, or the $150 EVGA kit if you want to be fancy), but it'll help performance a lot. That's probably one of the more effective ways to increase performance after you're at the "1080 Ti SLI" stage of the game. Maybe get a Noctua or Gentle Typhoon static-pressure-style fan as a replacement for the AIO one too (the defaults tend to be loud). I would think your case has plenty of mounts for the radiator.

Paul MaudDib fucked around with this message at 01:39 on Jun 7, 2017

Seamonster
Apr 30, 2007

IMMER SIEGREICH

It'll be Alienware branding when it finally does come.

The Gasmask
Nov 30, 2006

Breaking fingers like fractals

Obsurveyor posted:

All good points! I forgot that Netflix/video/Chrome is basically unusable when I render in Blender (I use it more for modeling stuff than rendering). I think I may leave this temporary 1060 in my machine for that when I finally step up to the 1080 Ti later this month.

If you're not using the secondary card for gaming, the awesome thing is you don't need to worry about SLI/Crossfire: rendering uses as many cards as you want without issue (it scales linearly, so 2x 1080 would be ~double the performance of one, for example), and you can always disable the non-display card in Device Manager when gaming to prevent annoying issues with having dual cards.

In my case, if I use OpenCL, I can have the 2x GPUs in the RPD + the 1070 rendering one image for some crazy 3x GPU performance gains, even though they're different brands and wouldn't work together for gaming.

The Gasmask fucked around with this message at 02:49 on Jun 7, 2017

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Paul MaudDib posted:

Yeah, that's not bad. Use GPU-Z and get some readings while you game and see what it looks like, in particular the boost clock.

The cooler you keep it, the higher it'll boost, so you may actually want to just crank the fan up. Especially since the other Pascal overclocking pro-tip is that it's power-limited and the #1 thing you can do is push the "power limit" slider all the way to the stops. Core clocking usually gets some gains, memory OC usually gets some gains, but core voltage often actually hurts due to increased heat/throttling. The best thing you can do for Pascal is just throw power at it and keep it cool. I assume the 1080 Ti probably follows.

Each card will put out ~230W average at stock, or 275W average at the 120% power limit. With 88W for CPU you're at 638W which is fine for your 850W PSU, if it's trustworthy. And you have plenty of headroom even for peak current there.

These days I would just get AIO coolers and a bracket for it, it's about $80 per card (Corsair H55 and NZXT G10, or the $150 EVGA kit if you want to be fancy), but it'll help performance a lot. That's probably one of the more effective ways to increase performance after you're at the "1080 Ti SLI" stage of the game. Maybe get a Noctua or Gentle Typhoon static-pressure-style fan as a replacement for the AIO one too (the defaults tend to be loud). I would think your case has plenty of mounts for the radiator.

Wow, these are excellent tips! I'll have to look at those fans; I hadn't heard of those brands. I do use MSI Afterburner for fiddling around with temps, but I haven't tried increasing power. Great stuff, thanks! :)

eames
May 9, 2009

https://www.youtube.com/watch?v=MwWIXzTOoZY&t=227s

SlayVus
Jul 10, 2009
Grimey Drawer

Okay, so each rack is double sided and he has 6 cards per rack, 4 racks stacked 4 high. He has 32 racks per side, giving 128 nodes with 96 cards per node, giving a total of 12,288 GPUs.

Edit: Just in that hangar alone was 12k+ GPUs. If those hangars are split in the middle, and it looks like he has 2 hangars, probably 48,000-50,000 GPUs.
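
The arithmetic checks out, for what it's worth (a quick sketch; all counts are the eyeball estimates above):

cards_per_node = 96
nodes = 128
per_hangar = cards_per_node * nodes
print(per_hangar)      # 12288 GPUs in the hangar shown
print(4 * per_hangar)  # 49152 if there are four such halves (2 hangars, each split)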

SlayVus fucked around with this message at 08:23 on Jun 7, 2017

eames
May 9, 2009

He commented that there are "a couple ten thousand" cards, if I understood correctly. The power bill is 1 million euros/month. There's a second video here.

I knew that GPU mining is a thing but the scale of this completely blows my mind.
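
Those two numbers are roughly consistent, for the record (a sketch; the per-card draw and electricity rate are my assumptions, not from the video):

cards = 50_000        # "a couple ten thousand", per the comment above
watts_per_card = 300  # assumed: card plus its share of CPU/PSU overhead
eur_per_kwh = 0.09    # assumed industrial electricity rate
hours = 730           # ~one month

kwh = cards * watts_per_card / 1000 * hours
print(round(kwh))                # ~10,950,000 kWh/month
print(round(kwh * eur_per_kwh))  # ~985,500 EUR, right around 1M/month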

eames fucked around with this message at 08:24 on Jun 7, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Welp, I guess if they're operating at that scale, it's entirely possible AMD and Nvidia have a reason to offer buttcoin mining cards because even a hundred such operations can utterly drain the market.

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

I'm :psyduck: that my 3 1/2 year-old 290 with a rattly fan sold for more than a new 980ti. Up is down; dogs are cats - thanks, bitminers! :toot:.

Wistful of Dollars
Aug 25, 2009

MaxxBot posted:

Pretty much exactly three years ago I bought an R9 290 on eBay for $270, and I just resold it on eBay for $250 :lol:

Oh poo poo, I have one sitting around I still need to sell.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
Remember when GPUs were for video games and trying to cure cancer? Pepperidge Farm remembers.

rex rabidorum vires
Mar 26, 2007

KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN
All this talk of buttcoin and I decided to look up what the R9 Fury was going for. $325 to $375 :stare: Really think I'm going to flip it and use that for a 1070.

Truga
May 4, 2014
Lipstick Apathy

rex rabidorum vires posted:

All this talk of buttcoin and I decided to look up what the R9 Fury was going for. $325 to $375 :stare: Really think I'm going to flip it and use that for a 1070.

I'd keep it and spend the money on a freesync screen instead. Set up a cronjob to mine while you're at work, too. :v:
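
For anyone who actually wants to try the cronjob idea, a minimal sketch (the miner command is hypothetical; swap in whatever you really run):

#!/usr/bin/env python3
# Start/stop wrapper so cron can run a miner during work hours only.
# Example crontab entries, assuming this lives at ~/mine.py:
#   0 9  * * 1-5  ~/mine.py start
#   0 17 * * 1-5  ~/mine.py stop
import subprocess
import sys

MINER = ["ethminer", "-G"]  # hypothetical miner command line

def start():
    subprocess.Popen(MINER)  # detach; keeps running after cron's shell exits

def stop():
    subprocess.run(["pkill", "-f", MINER[0]])

if __name__ == "__main__":
    {"start": start, "stop": stop}[sys.argv[1]]()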

1gnoirents
Jun 28, 2014

hello :)

Truga posted:

I'd keep it and spend the money on a freesync screen instead. Set up a cronjob to mine while you're at work, too. :v:

I think the plus there is he'd be spending no money at all

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy
https://twitter.com/CDemerjian/status/872445675056898048

Also apparently the 4 shader engines are segregated, making 8?

https://www.youtube.com/watch?v=owL_KY9sIx8

Anarchist Mae fucked around with this message at 19:53 on Jun 7, 2017

rex rabidorum vires
Mar 26, 2007

KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN

Truga posted:

I'd keep it and spend the money on a freesync screen instead. Set up a cronjob to mine while you're at work, too. :v:

That was the original plan when I got this particular GPU. Minus the mining part.

1gnoirents posted:

I think the plus there is he'd be spending no money at all

I actually paid $240 for the card (open box) and was able to get Sapphire to honor a $20 rebate on it despite the previous owner clipping the UPCs off of it, so I'd be turning ~$100+ profit on the card and reinvesting that into the 1070.

1gnoirents
Jun 28, 2014

hello :)
I understand the freesync argument fully; it's really the biggest downside to Nvidia. However, *finally* decent gsync monitors are popping up for reasonable prices. I just bought a Dell 27" gsync, 1440p, 144 Hz monitor for $400 flat last week. Although that particular price has shot back up, it's now the third time it's dropped that low. Hopefully we'll see more follow suit. It's not going to touch the cheaper freesync monitors (and I can't imagine it ever will), but feature-wise it's finally approaching competitive in my book.

The selection of gsync monitors regardless of price is still quite disappointing, but at least there aren't any "gsync" monitors like there are """"freesync"""" 42-44 Hz half rear end models.

ItBurns
Jul 24, 2007

1gnoirents posted:

I understand the freesync argument fully; it's really the biggest downside to Nvidia. However, *finally* decent gsync monitors are popping up for reasonable prices. I just bought a Dell 27" gsync, 1440p, 144 Hz monitor for $400 flat last week. Although that particular price has shot back up, it's now the third time it's dropped that low. Hopefully we'll see more follow suit. It's not going to touch the cheaper freesync monitors (and I can't imagine it ever will), but feature-wise it's finally approaching competitive in my book.

The selection of gsync monitors regardless of price is still quite disappointing, but at least there aren't any "gsync" monitors like there are """"freesync"""" 42-44 Hz half rear end models.

Is it an IPS? Model?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

ItBurns posted:

Is it an IPS? Model?

No, Dell only uses TN panels in their gaming lineup. Which sucks because their build quality is way better than Acer/Asus.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Which is weird, because they made the change from TN to IPS on their 7000-series gaming notebooks.

I have no idea what their desktop hangup is.

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

I've not followed this Freesync vs Gsync debate at all. Does it really make a big difference compared to a normal monitor? I use a semi-crappy Samsung curved monitor.

Since I've got Nvidia cards, I just need to get a Gsync monitor to take advantage of... what exactly?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Comfy Fleece Sweater posted:

I've not followed this Freesync vs Gsync debate at all. Does it really make a big difference compared to a normal monitor? I use a semi-crappy Samsung curved monitor.

Since I've got Nvidia cards, I just need to get a Gsync monitor to take advantage of... what exactly?

Gsync/FreeSync allow you to turn off v-sync and still have no screen tearing. With v-sync on, if the framerate dips below 60 it'll lock at 30, which can look pretty obnoxious, plus it adds a couple milliseconds of input lag. Without v-sync, you get tearing.

If you have a non-*sync monitor, you will get no benefit.

I have used FreeSync and Gsync and they are both great, when they work. I've had a couple FreeSync games where freesync failed to activate for whatever weird reason, and Gsync stupidly disables itself if you are rendering above the monitor's cap, so you have to use a 3rd-party frame limiter there.

MagusDraco
Nov 11, 2011

even speedwagon was trolled
I've had one or two games where g-sync screws up (the Trails in the Sky games; maybe Deadly Premonition, but that could be something else) but otherwise it's been great.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Comfy Fleece Sweater posted:

I've not followed this Freesync vs Gsync debate at all. Does it really make a big difference compared to a normal monitor? I use a semi-crappy Samsung curved monitor.

Since I've got Nvidia cards, I just need to get a Gsync monitor to take advantage of... what exactly?

Given that you have SLI 1080 Tis, GSync isn't gonna do much for you. However, you can probably run LightBoost in Fast Sync mode, even at 1440p ultra. That's gonna be the best monitor improvement for you.

GSync fixes frame pacing. When the rate your GPU turns out frames is different from the rate at which your monitor draws them, something has to give. You either end up with tearing (displaying half of one frame and half of another) or judder (displaying frames for an uneven amount of time, like one frame twice and the next frame only once). VSync off gives you tearing, VSync on gives you judder. Judder is a lot of what makes low-framerate gameplay feel bad and that's one of the reasons everyone recommends disabling it most of the time (the other being input lag).

What GSync does is basically have your monitor only draw a frame when the GPU turns it out. It does this by varying the refresh rate - so if your GPU is drawing 40 fps, your monitor runs at 40 Hz. This makes gameplay below 60 fps hugely smoother, and has a noticeable benefit up to about 100 Hz. It is seriously, seriously way better; 45 fps is totally playable in most games, like running 60 Hz normally. But if you can max your monitor's refresh rate out, then it can't do anything; it's not going to magically make the monitor run faster than it normally could.
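
To make the judder concrete, here's a toy model of that 40 fps case (a sketch, not anyone's actual driver logic): on a fixed 60 Hz panel each finished frame waits for the next refresh, so on-screen time alternates between one and two refresh intervals.

import math

REFRESH_HZ, RENDER_FPS = 60, 40

# Refresh tick on which frame i first appears: frame i finishes at
# i/RENDER_FPS seconds, and ticks fire every 1/REFRESH_HZ seconds.
shown = [math.ceil(i * REFRESH_HZ / RENDER_FPS) for i in range(1, 9)]
print([b - a for a, b in zip(shown, shown[1:])])
# [1, 2, 1, 2, 1, 2, 1] -> held ~16.7 ms, then ~33.3 ms: judder.
# With GSync the panel refreshes when the frame is ready instead,
# so every frame is held exactly 25 ms.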

Lightboost enhances sharpness. Normally, you get a little bit of blurriness caused by pixels changing color. What lightboost does is keep the backlight off for most of the draw cycle, and then strobe it once everything is drawn, so you get a "flashbulb" effect rather than blurring. This does produce a noticeable increase in sharpness, the main downside being that you can't combine GSync with it in current implementations (GSync changes the framerate; if you strobe the backlight every time you draw a frame, your brightness will increase as your framerate does). NVIDIA filed a patent for a workaround, so future Gsync monitors may be able to do both Lightboost and GSync at the same time.

Finally, Fast Sync is another rendering trick like VSync. However, instead of a double-buffer like VSync it uses a triple buffer. The advantage is you don't get the input lag like you do with VSync, the downside is that you need to be rendering at least twice your monitor's actual refresh rate. So for a monitor running in 120 Hz mode, you need to be able to push at least 240 fps in your game. Normally that would be a challenge at 1440p... but you have SLI 1080 Tis which should be enough horsepower to run most AAA titles at 240fps at high/ultra settings.

All GSync monitors that are at least 120 Hz support Lightboost. Fast Sync does not require any specific monitor support, it happens in the NVIDIA drivers.

Paul MaudDib fucked around with this message at 21:59 on Jun 7, 2017

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Comfy Fleece Sweater posted:

I've not followed this Freesync vs Gsync debate at all. Does it really make a big difference compared to a normal monitor? I use a semi-crappy Samsung curved monitor.

Since I've got Nvidia cards, I just need to get a Gsync monitor to take advantage of... what exactly?

Yes, it's a big difference, especially if you do not have the GPU horsepower to keep a game comfortably above your monitor's refresh rate at all times. Also, most *Sync monitors (particularly GSync ones) are high-Hz monitors, which also makes a big difference.

The crux of *sync is to try to match up the framerate of your GPU with the refresh rate of the monitor by slowing/speeding the monitor to match the GPU. The end effect is that everything looks a good bit smoother in action than it otherwise would if the rates were mismatched. The most noticeable impact is that, when you have a low FPS, the game will appear smoother, effectively giving the appearance of a higher framerate. Many people describe the effect as roughly equivalent to jumping a class in GPU (1070->1080, for example). On the higher end, as Zero VGS noted, it allows you to disable V-Sync and not have to worry about the nastiness that occurs if you ever dip below whatever the sync rate is (60Hz normally).

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
GSync is great but just bear in mind the guy just bought his second 1080 Ti for SLI, so he's in no danger of dipping below 60 fps.

Like, I bet you could easily do a triple-monitor 1440p surround setup at 165 Hz and probably come close to maxing it out. Or 4K surround. Multi-monitor surround would be the only situations where he'll see a benefit from GSync.

For a single 1440p 144/165 Hz monitor with 1080 Ti SLI, LightBoost + Fast Sync is the way to go, because it does boost sharpness quite a bit with no impact on input lag.

GSync is really more intended for people who are running 30-80 fps, i.e. a GPU that is not really capable of maxing out its monitor's refresh rate, or even barely capable of staying above 30fps, like a 780 Ti/970/1060 (which land around 45 fps average, in my experience).

(in my experience 30fps is where even GSync crashes and burns. The frame pacing is still perfect, but I can feel the slideshow effect from lack of frames.)

Paul MaudDib fucked around with this message at 22:26 on Jun 7, 2017

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

rex rabidorum vires posted:

All this talk of buttcoin and I decided to look up what the R9 Fury was going for. $325 to $375 :stare: Really think I'm going to flip it and use that for a 1070.

Yeah, I would go for it. People are going nuts trying to buy AMD cards right now, so you could probably price it towards the $375 end and still sell it.
