betamax hipster
Aug 13, 2016

Comfy Fleece Sweater posted:

Same question, but I don't do VR, just regular videogames - it's probably useful to keep my 970 around for PhysX?

Not really, no.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Subjunctive posted:

Why does AMD let the vendors use the FreeSync trademark if the ranges are lovely?

Because that way they can claim that "65% of all monitors support FreeSync!" and build mind-share, even if it's just a very minimal token support for most of them.

There are only like a handful of monitors that support LFC, which is the actual bar that you need to clear to compete reasonably with GSync. Most of the ones that do are 27" 1440p/144 Hz midrange-tier monitors with a handful of 24" 1080p 144 Hz monitors thrown in. There are very few monitors under 144 Hz that support LFC. If those were the only ones that got FreeSync branding it would be a lot harder to build mindshare about FreeSync's "ubiquitous" support and "no GSync tax" because the monitors with LFC support trend strongly towards the premium side of the market.

That's the catch you really have to watch, especially with the 4K FreeSync monitors. Almost no 60 Hz monitors support LFC - including 4K monitors. The 4K GSync monitors all have sync ranges down to 24 Hz and will do LFC. The FreeSync ones won't; it's usually 40-60 Hz at best. Better than nothing, but it's not a comparable feature set.

I've been watching the 4K monitors in particular because the LGs are getting cheap for 4K 27" IPS... but no LFC is pretty much a dealbreaker for me. The Acer Predators still seem to be the way to go.

Paul MaudDib fucked around with this message at 02:29 on Apr 3, 2017
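The LFC bar described above is easy to sanity-check against any monitor's advertised sync range. A minimal sketch, assuming the usual rule of thumb that LFC can only engage when the max refresh is at least double the min (so the driver can always multiply a low framerate back into the supported window):

```python
def supports_lfc(min_hz: float, max_hz: float) -> bool:
    """Rule of thumb: LFC needs max refresh >= 2x min refresh,
    so frames below the range can be doubled back into it."""
    return max_hz >= 2 * min_hz

# Ranges mentioned in the post:
print(supports_lfc(30, 144))  # 1440p/144 Hz FreeSync panel
print(supports_lfc(40, 60))   # typical 4K FreeSync range: no LFC
print(supports_lfc(24, 60))   # 24-60 Hz (GSync-style 4K range): LFC works
```

This is why a 40-60 Hz 4K FreeSync range fails the bar while a 24-60 Hz range clears it.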

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Comfy Fleece Sweater posted:

Same question, but I don't do VR, just regular videogames - it's probably useful to keep my 970 around for PhysX?

No, not at all.

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

AVeryLargeRadish posted:

No, not at all.

Really? I thought it helped a lot with particles and stuff?

1gnoirents
Jun 28, 2014

hello :)

Comfy Fleece Sweater posted:

Really? I thought it helped a lot with particles and stuff?

It does for very specific games that use it, but if you have something better than a 970 it can presumably handle it on its own. It's worth far more as the 120 McDoubles or whatever they go for today

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
Since it's been weighing on my mind: there's an EVGA 1080 FTW Gaming DT available locally for $450 that's in fantastic shape, with all the components, and it's been refitted with new thermal pads and Arctic Silver thermal compound.

I'm wondering if I should go for it or if maybe there is a better option for me to consider.

CaptainSarcastic
Jul 6, 2013



Comfy Fleece Sweater posted:

Really? I thought it helped a lot with particles and stuff?

I think that was somewhat true in the past, but it hit the point of diminishing returns some time ago. From what I've read, keeping a second card just for PhysX with a 10xx card is close to futile. I was toying with the idea myself when I got my GTX 1060, but reading up on it made me table the idea.

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Very well. I actually have 2 970's in SLI, so I'll give one away to a worthy kid nerd and I'll run some tests on the other, and then probably sell it.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Comfy Fleece Sweater posted:

Very well. I actually have 2 970's in SLI, so I'll give one away to a worthy kid nerd and I'll run some tests on the other, and then probably sell it.

As someone who upgraded from dual 970s to a 1080, I still have one of the 970s in my machine. It makes a difference with the Gibs+Fluids setting in Killing Floor 2 :gibs:

The 970 allows me to run Ultra settings @ 4K 60 FPS. I drop down to 40 FPS without it. Not that I recommend anyone doing this for 1 game though :shobon:

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy
So I'm running SLI 980 Tis with the latest drivers, and the last couple of days I've been getting issues with the core clocks locking at a very low speed (normally 1450 MHz, stuck at 500ish). The only thing that fixes it is rebooting; it's not a driver crash, and it seems to happen right from startup, so I'm at a loss as to why it's happening.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Scarecow posted:

So I'm running SLI 980 Tis with the latest drivers, and the last couple of days I've been getting issues with the core clocks locking at a very low speed (normally 1450 MHz, stuck at 500ish). The only thing that fixes it is rebooting; it's not a driver crash, and it seems to happen right from startup, so I'm at a loss as to why it's happening.

Had this exact issue with my 1080. The only thing that fixed it was downgrading to a version of the drivers that didn't cause this. In my case it was 2 or 3 driver revisions back :(

1gnoirents
Jun 28, 2014

hello :)

Stanley Pain posted:

As someone who upgraded from dual 970s to a 1080, I still have one of the 970s in my machine. It makes a difference with the Gibs+Fluids setting in Killing Floor 2 :gibs:

The 970 allows me to run Ultra settings @ 4K 60 FPS. I drop down to 40 FPS without it. Not that I recommend anyone doing this for 1 game though :shobon:

I was on the fence about mentioning this one exception lol. But yes, this is the sole exception I'm aware of

edit: I guess I should point out it's also the only game I've ever seen where it actually makes a difference and is sweet as well.

1gnoirents fucked around with this message at 14:49 on Apr 3, 2017

eames
May 9, 2009

Apple is making a GPU by the way, so I guess this sort of belongs here. Obviously only for iOS devices, but who knows how far it will scale.

If they really want to compete with Nvidia in neural network/image recognition applications for their car project then this could turn out to be quite interesting considering the resources they have available. ¯\_(ツ)_/¯

http://www.anandtech.com/show/11243/apple-developing-custom-gpu-dropping-imagination

Stanley Pain
Jun 16, 2001

by Fluffdaddy

1gnoirents posted:

I was on the fence about mentioning this one exception lol. But yes, this is the sole exception I'm aware of

edit: I guess I should point out it's also the only game I've ever seen where it actually makes a difference and is sweet as well.


Yeah, the fluids and gibs setting is REALLY well done. Playing with the blood pools never gets old. There have been a few times where the 970 has been hitting 90% utilization when things got crazy on screen. I think some of the Batman games can make use of hardware PhysX, and there's also Warframe and Hawken as well. There's also that Russian Dark Souls clone.

Obsurveyor
Jan 10, 2003

Anything that uses Unity can theoretically benefit from it, since they use PhysX as their 3D physics engine. However, they use Box2D for 2D physics, and I'm not sure about their particles (I believe it's their own engine, called Shuriken, but I'm not sure about the collision part).

EdEddnEddy
Apr 5, 2012



SwissArmyDruid posted:

If you Korean frankenmonitor people had your way, you'd be using monstrosities like this, on account of some poo poo or another.



source: https://www.reddit.com/r/pcmasterrace/comments/62llzu/who_remember_this_back_in_ces_2008/

edit: have some video, for the halibut: https://www.youtube.com/watch?v=PJlJ7OleBCU

Some guy has one of those here at the local Intel LanFest events. The thing looks older than that one though, and you can see where each screen sort of stitches together. It's also pretty low-res, but I guess it's still one of the more original ultrawides lol.


That 1080 Ti Strix cooler does look really nice. I still wish that 980 Ti trade-up program had happened, but I feel I'm SOL and have to do it the old-fashioned way.

repiv
Aug 13, 2009

eames posted:

Apple is making a GPU by the way, so I guess this sort of belongs here. Obviously only for iOS devices, but who knows how far it will scale.

If they really want to compete with Nvidia in neural network/image recognition applications for their car project then this could turn out to be quite interesting considering the resources they have available. ¯\_(ツ)_/¯

http://www.anandtech.com/show/11243/apple-developing-custom-gpu-dropping-imagination



:popeye:

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
wrecked

eames
May 9, 2009

:captainpop:
Apple is probably going to buy them out for peanuts in a few months/years.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Well, here we go - cue the "what 1060 should I get *now*" era: http://www.guru3d.com/news-story/gigabyte-launches-aorus-gtx-1060-6gb-with-faster-9-gbps-gddr5-memory.html

At least there's a marking on the box that shows the higher-speed GDDR. Same with the up-rated 1080.

1gnoirents
Jun 28, 2014

hello :)

BIG HEADLINE posted:

Well, here we go - cue the "what 1060 should I get *now*" era: http://www.guru3d.com/news-story/gigabyte-launches-aorus-gtx-1060-6gb-with-faster-9-gbps-gddr5-memory.html

At least there's a marking on the box that shows the higher-speed GDDR. Same with the up-rated 1080.

I'm curious if this will make any difference at all for a 1060. Memory speed has had a spotty record for gains in the past (I'll never forget squeezing every MHz out of 660 Tis though), but I kind of stopped paying attention for Pascal.

1gnoirents fucked around with this message at 20:54 on Apr 3, 2017
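For what it's worth, the raw uplift is easy to work out: peak memory bandwidth is just bus width in bytes times the per-pin data rate. A quick sketch, assuming the 1060's stock 8 Gbps GDDR5 on its 192-bit bus (this is the theoretical peak, not a measured gaming gain):

```python
def mem_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

stock = mem_bandwidth_gbs(192, 8.0)  # regular GTX 1060: 192 GB/s
fast = mem_bandwidth_gbs(192, 9.0)   # 9 Gbps model: 216 GB/s
print(stock, fast, f"+{fast / stock - 1:.1%}")  # 12.5% more peak bandwidth
```

Whether that 12.5% shows up in framerates depends on how bandwidth-bound a given game is.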

EdEddnEddy
Apr 5, 2012




Jesus, that is a drop. Makes all the little mountains look like nothing once that cliff happened.

Now, considering how good their GPUs have been for Apple all this time, why the heck would they drop them like this just now? Literally to open them up to purchase, or is it just Apple being proprietary dicks and probably stealing all they know from them anyway?

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

EdEddnEddy posted:

Jesus, that is a drop. Makes all the little mountains look like nothing once that cliff happened.

Now, considering how good their GPUs have been for Apple all this time, why the heck would they drop them like this just now? Literally to open them up to purchase, or is it just Apple being proprietary dicks and probably stealing all they know from them anyway?

Apple doesn't give a fuuuuuuuuuuuuuuuck about other businesses when they have an objective. They hosed over the Sapphire plant too, depending on who you believe.

Business is business for Apple. They're not the type of corporation that will work with their suppliers and give them a heads-up or anything. Conversely, they're still working with Samsung, of all corps, because they benefit.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
Curious, where's the thread title from? I hope scantily clad elf women are no longer a thing on GPU coolers.

craig588
Nov 19, 2005

by Nyc_Tattoo
Someone posted how they liked the trend of "clean, no female on blower" videocards. A decade ago the art was real bad.

Bleh Maestro
Aug 30, 2003

BIG HEADLINE posted:

Well, here we go - cue the "what 1060 should I get *now*" era: http://www.guru3d.com/news-story/gigabyte-launches-aorus-gtx-1060-6gb-with-faster-9-gbps-gddr5-memory.html

At least there's a marking on the box that shows the higher-speed GDDR. Same with the up-rated 1080.

How much does this actually affect real world performance?

redeyes
Sep 14, 2002

by Fluffdaddy
I finally understand why my RX 480 always beats the 1060 I had: I use a 4K monitor. The 256-bit bus on the 480 vs the 192-bit on the 1060 really only matters for 4K stuff. Below that, the 1060 generally comes out on top.
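The bus-width gap here is easy to put numbers on. A rough sketch, assuming both cards run their memory at the common 8 Gbps effective rate (true for the 8 GB RX 480 and the stock 1060; the 4 GB 480 ships slower):

```python
def peak_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    # bytes transferred per cycle across the bus x effective rate per pin
    return bus_bits / 8 * gbps_per_pin

rx480 = peak_bandwidth_gbs(256, 8.0)    # RX 480: 256 GB/s
gtx1060 = peak_bandwidth_gbs(192, 8.0)  # GTX 1060: 192 GB/s
print(rx480, gtx1060)  # the 480 has ~33% more raw bandwidth to feed 4K
```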

GRINDCORE MEGGIDO
Feb 28, 1985


redeyes posted:

Below that, the 1060 generally comes out on top.

Is that true with modern drivers?

redeyes
Sep 14, 2002

by Fluffdaddy

GRINDCORE MEGGIDO posted:

Is that true with modern drivers?

Like right now drivers, yes. Odd question, both cards are current gen.

GRINDCORE MEGGIDO
Feb 28, 1985


redeyes posted:

Like right now drivers, yes. Odd question, both cards are current gen.

I don't see why that's an odd question. On launch, it was ~10% or more slower than a 1060. But AMD's drivers got a big improvement last December, at which point it was on a par with a 1060, right down to 1080p.

I just wondered whether that trend still held.

GRINDCORE MEGGIDO fucked around with this message at 00:55 on Apr 4, 2017

wargames
Mar 16, 2008

official yospos cat censor

GRINDCORE MEGGIDO posted:

I don't see why that's an odd question. On launch, it was slower than a 1060. But AMD's drivers got a big improvement last December, at which point it was on a par with / above a 1060 in DX11 and faster overall in DX12.

I just wondered whether that trend still held.

The difference is not enough to care about; get whichever one is on sale or whichever *sync option is available to you.

GRINDCORE MEGGIDO
Feb 28, 1985


I'm not looking at either, thank you (I have a 1080). Was just curious if more recent drivers put the 1060 back ahead noticeably, or something. That would have surprised me as generally AMD's drivers eke more performance out over time.

GRINDCORE MEGGIDO fucked around with this message at 00:54 on Apr 4, 2017

Drakhoran
Oct 21, 2012

GRINDCORE MEGGIDO posted:

Is that true with modern drivers?

Depends on the game. I saw a couple of videos at the end of last year that tested the 1060 vs the 480 with (what was at the time) the latest drivers. Both concluded that on the whole the two cards are pretty much equally fast. Some games will significantly favor one card or the other, but unless you're primarily buying your card to play one of those games, just buy whichever is cheapest.

Edit: Here's one of the videos:

https://www.youtube.com/watch?v=CiYQqNiqQKU&t=914s

1gnoirents
Jun 28, 2014

hello :)

GRINDCORE MEGGIDO posted:

I'm not looking at either, thank you (I have a 1080). Was just curious if more recent drivers put the 1060 back ahead noticeably, or something. That would have surprised me as generally AMD's drivers eke more performance out over time.

They did, not sure I could count on the dramatic 290-Raquel improvements going forward though. Unless we assume AMD releases terrible drivers at launch.

There was a time when the 770 was comparable to a 290 in some games at the time...

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Newegg is blowing out (swidt) new Gigabyte blower 1080s for $420 on eBay: http://www.ebay.com/itm/GIGABYTE-GeForce-GTX-1080-DirectX-12-GV-N1080TTOC-8GD-8GB-256-Bit-GDDR5X-PCI-Exp-/302105143290?rmvSB=true

Not a horrible deal if you were already considering ~$350-380 for a 1070.

wargames
Mar 16, 2008

official yospos cat censor

BIG HEADLINE posted:

Newegg is blowing out (swidt) new Gigabyte blower 1080s for $420 on eBay: http://www.ebay.com/itm/GIGABYTE-GeForce-GTX-1080-DirectX-12-GV-N1080TTOC-8GD-8GB-256-Bit-GDDR5X-PCI-Exp-/302105143290?rmvSB=true

Not a horrible deal if you were already considering ~$350-380 for a 1070.

$420 for a 1080 is way better than $380 for a 1070. Since Vega won't beat a 1080 Ti, I wonder what the price point will be, since we can get 1080s for $420.

GRINDCORE MEGGIDO
Feb 28, 1985


wargames posted:

$420 for a 1080 is way better than $380 for a 1070. Since Vega won't beat a 1080 Ti, I wonder what the price point will be, since we can get 1080s for $420.

If they sell them for $420 / £450 or so, I'm grabbing one for FreeSync. I hope reviews are out a little while before it launches so I can unload this 1080.

Seamonster
Apr 30, 2007

IMMER SIEGREICH

BIG HEADLINE posted:

Newegg is blowing out (swidt) new Gigabyte blower 1080s for $420 on eBay: http://www.ebay.com/itm/GIGABYTE-GeForce-GTX-1080-DirectX-12-GV-N1080TTOC-8GD-8GB-256-Bit-GDDR5X-PCI-Exp-/302105143290?rmvSB=true

Not a horrible deal if you were already considering ~$350-380 for a 1070.

Are these easy (reference layout) to put under water?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Seamonster posted:

Are these easy (reference layout) to put under water?

http://www.coolingconfigurator.com/waterblock/3831109831274

The only reference Gigabyte card is the FE.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Risky Bisquick posted:

Needs a 10mm screamer fan tbqh

Make it sound like those siren whistles at full speed

https://www.youtube.com/watch?v=0pNWVy-yGFg
