ZobarStyl
Oct 24, 2005

This isn't a war, it's a moider.

Deathreaper posted:

Thinking of getting a Fury X over a GTX 1070 for my 2560x1600 60Hz monitor. The Fury X is slightly cheaper, available, and seems to have improved substantially in benchmarks since its initial release. Plus DX12/Vulkan with async compute is benefiting AMD cards nicely. Only thing I'm worried about is the 4GB of VRAM. It would be replacing two R9 290s which are just too hot/noisy in CF and I don't want to deal with CF anymore. What do you guys think?
The 4GB VRAM cutoff really isn't an issue, but if you're running 4K you may want to wait for Vega. Other than that it's a vastly underrated card if you can snag it at fire-sale prices. I haven't noticed any sound issues from mine at all, and it doesn't really change speed at load as far as I can tell.


Stanley Pain
Jun 16, 2001

by Fluffdaddy

bull3964 posted:

Vulkan trip report on Doom with my GTX 1080 (at 4k).

FPS seems to be locked at 60 just like on OpenGL, but it doesn't feel as smooth. Something about it just isn't quite right. Loading times went up a ton with it too.

It does seem more stable though.

I'm actually dipping into the mid 30s on Vulkan vs a constant 60 FPS on OpenGL. Visually it doesn't look the same either right now; I can't really pinpoint it since Vulkan screenshots aren't working for me.

JacksAngryBiome
Oct 23, 2014
Why is DisplayPort a thing? From plugs breaking to monitor/resolution detection problems, I have never heard anything good about it. Is it a "fits more resolution through the cord" thing? (Lowly 1080p user here.)

SwissArmyDruid
Feb 14, 2014

by sebmojo
Variable refresh is not compatible with DVI.

FreeSync can work over HDMI (via AMD's vendor extension) or DP; G-Sync requires DP.

SwissArmyDruid fucked around with this message at 13:33 on Jul 12, 2016

Truga
May 4, 2014
Lipstick Apathy
DP is a really really good and cool standard.

DP implementation by a lot of vendors is a very poo poo and trash thing, though. Give it a few more years and it'll get better; right now 99% of screens out there can still easily coast on DVI, so they do.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


SwissArmyDruid posted:

Variable refresh is not compatible with DVI.

FreeSync can work over HDMI (via AMD's vendor extension) or DP; G-Sync requires DP.

Can't get above 60Hz on HDMI though.
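(That ceiling is basically bandwidth arithmetic. A rough sketch — the blanking numbers are an approximation of CVT-RB timings, and 340 MHz is HDMI 1.4's single-link TMDS clock limit:)

```python
# Rough check of which modes fit through HDMI 1.4.
# Blanking overhead is approximated (CVT-RB adds roughly 160 px of
# horizontal and ~40 lines of vertical blanking); 340 MHz is the
# HDMI 1.4 single-link TMDS pixel clock ceiling.

HDMI_14_MAX_PIXEL_CLOCK = 340e6  # Hz

def approx_pixel_clock(width, height, refresh_hz):
    """Approximate required pixel clock for a mode, blanking included."""
    return (width + 160) * (height + 40) * refresh_hz

for w, h, hz in [(2560, 1440, 60), (2560, 1440, 120), (1920, 1080, 60)]:
    clk = approx_pixel_clock(w, h, hz)
    verdict = "fits" if clk <= HDMI_14_MAX_PIXEL_CLOCK else "too fast"
    print(f"{w}x{h}@{hz}: ~{clk / 1e6:.0f} MHz -> {verdict} for HDMI 1.4")
```

1440p at 60Hz squeaks through at roughly 240 MHz; push the refresh up and you blow well past the limit, which is why high-refresh monitors of this era are DP-only.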

penus penus penus
Nov 9, 2014

by piss__donald

Deathreaper posted:

Thinking of getting a Fury X over a GTX 1070 for my 2560x1600 60Hz monitor. The Fury X is slightly cheaper, available, and seems to have improved substantially in benchmarks since its initial release. Plus DX12/Vulkan with async compute is benefiting AMD cards nicely. Only thing I'm worried about is the 4GB of VRAM. It would be replacing two R9 290s which are just too hot/noisy in CF and I don't want to deal with CF anymore. What do you guys think?

Without another compelling reason I can't see this being a great choice. With everything on the table, in the best-case scenario a Fury X is on par with an AIB 1070. In the other scenarios, which is most of them currently, it loses, sometimes very badly.

Combine that with the fact it's using roughly twice the power to do this, on top of all the other quirks. I don't think this architecture has another trick up its sleeve.

And yes, you can definitely hit 4GB at 1600p. It's not terribly common but it will be. But you're already in the same boat now, so it might not be a big deal.

The fact you found one cheaper is the only thing that makes me wonder, but I don't know...

LiquidRain
May 21, 2007

Watch the madness!

JacksAngryBiome posted:

Why is DisplayPort a thing? From plugs breaking to monitor/resolution detection problems, I have never heard anything good about it. Is it a "fits more resolution through the cord" thing? (Lowly 1080p user here.)
On top of being able to drive higher than 60Hz, DP is a thing because HDMI requires everyone who uses it to pay a licensing fee; DP is royalty-free. DP also does a ton of really, really cool poo poo that nobody uses it for! Like driving multiple monitors through one cable (either with a splitter or by daisy-chaining monitors), carrying USB over it, and carrying Ethernet over it. But nobody uses it for anything but video.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

ZobarStyl posted:

The 4GB VRAM cutoff really isn't an issue, but if you're running 4K you may want to wait for Vega. Other than that it's a vastly underrated card if you can snag it at fire-sale prices. I haven't noticed any sound issues from mine at all, and it doesn't really change speed at load as far as I can tell.

That is not true these days; there are a number of games that hit the VRAM limit and start stuttering on 4GB cards.

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy
Quick test of the new Tomb Raider patch on a 2500k@4.3 + GTX1080 and 8GB RAM @ 1440p max settings:

Avg(Min) frames
DX12 102(63), 82(29), 75(35) - Overall 87
DX11 97(4), 75(2), 70(13) - Overall 81

Might do more because I'm curious if a second run would iron out the very visible DX11 single hitches that brought it down really low once per test stage. Watching it and seeing the numbers does seem to fit the DX12 theme of smoothing stuff out if your GPU outpaces your CPU. The 1080 is finally stretching the legendary 2500K's legs and giving me an itch to replace it (esp. since my other 8GB of memory just died :negative:).
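(For anyone sanity-checking the overall numbers: they're close to a plain mean of the three per-stage averages. Quick sketch — the small gap on DX12 is presumably the built-in benchmark weighting its stages differently:)

```python
# Per-stage average FPS from the run above (three benchmark stages).
dx12_avgs = [102, 82, 75]
dx11_avgs = [97, 75, 70]

overall_dx12 = sum(dx12_avgs) / len(dx12_avgs)  # ~86.3, reported as 87
overall_dx11 = sum(dx11_avgs) / len(dx11_avgs)  # ~80.7, reported as 81

print(f"DX12 overall ~{overall_dx12:.0f}, DX11 overall ~{overall_dx11:.0f}")
```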

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Yeah, I think you'll definitely want to run TR in DX12 mode now, when before it was iffy how much of a performance boost it gave, if any.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

Truga posted:

If you have a dell monitor, you can sometimes correct the dumb "off=unplugged" poo poo somewhere in the OSD settings.

They're all Dell U2410s; unfortunately this doesn't work for me. My old solution was to use non-DP cables, but the 1080 only has the one DVI port and I now have one more monitor than that, so at least one of them is getting a DP cable.

bigmandan
Sep 11, 2001

lol internet
College Slice
Well, the Zotac 1070 I ordered from Newegg Canada, delivered by Purolator, is delayed in transit due to higher than normal shipment volumes... drat you Canada Post :argh:

When I eventually get the card should I bother with a fresh driver install or will the existing nVidia drivers be fine?

Gwaihir
Dec 8, 2009
Hair Elf

Desuwa posted:

I forgot how much I hate displayport's loving hotplug detection. It's part of the spec where if you turn a monitor off it's treated as if it's been unplugged. I like to turn my secondary monitors off when watching something on my main screen, but with displayport that causes the desktop to flicker and resize every time. Windows avoids doing this for the primary monitor, at least, or my monitor doesn't follow the worst parts of the spec.

The HDMI ports on my monitors are already full, so I need three DVI connections. On my GTX 1080 I've got the DVI port for one, I can use an HDMI to DVI cable for another monitor, and for the last monitor I'll try a DP->DVI cable, but I'm not too confident in that. It might end up being a dp -> hdmi -> EDID ghost (because they don't make these for DP, at least not yet) -> dvi frankenstein chain.

I hate computers. I wish I could blame this on Windows but Mac and Linux have the exact same behaviour without any software fix, though you can work around it with nvidia's workstation cards. It's like they never thought people might use more than one monitor.

At least 1200p60 is in the range of passive DP to HDMI/DVI converters.

It's not part of the spec, but a lot of monitors (And cables!!) gently caress it up.
When done right, turning off the monitor should do nothing. Only unplugging/hard powering off (Some monitors have a mechanical switch in back by the plug in addition to the bezel mounted front power) should trigger the "Display disconnected, better re-shuffle your desktop!" behavior.

Anecdotally, my Dell U3014 fucks it up and acts like it's been disconnected when powered off, but my U2412 does not and works fine. Also working fine, HP Z27s, and Acer XB270hu.

e: It's not part of the spec that they're supposed to appear disconnected when soft-powered off, that is. They're totally supposed to vanish if actually unplugged.

Gwaihir fucked around with this message at 15:43 on Jul 12, 2016

ZobarStyl
Oct 24, 2005

This isn't a war, it's a moider.

AVeryLargeRadish posted:

That is not true these days, there are a number of games that hit the VRAM limit and start stuttering on 4GB cards.
Allow me to clarify: absolutely nothing I've thrown at my Fury X has had a problem with the 4GB limitation at 3x1080p triple-head.

The Slack Lagoon
Jun 17, 2008



I've been out of the loop for a few weeks. Any good articles on how the 1060 will stack up vs the 970? Good upgrade or sidegrade?

pigdog
Apr 23, 2004

by Smythe
[wrong thread]

pigdog fucked around with this message at 18:51 on Jul 12, 2016

Truga
May 4, 2014
Lipstick Apathy

Massasoit posted:

I've been out of the loop for a few weeks. Any good articles on how the 1060 will stack up vs the 970? Good upgrade or sidegrade?

Very much a sidegrade, unless you do VR. Then maybe an upgrade, if people start using the new Nvidia tech in their VR games a lot.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Massasoit posted:

I've been out of the loop for a few weeks. Any good articles on how the 1060 will stack up vs the 970? Good upgrade or sidegrade?

If you have a 970 and want an upgrade, your best bet is selling it (you might still be able to get $200!) and getting a GTX 1070.

Gwaihir
Dec 8, 2009
Hair Elf

Massasoit posted:

I've been out of the loop for a few weeks. Any good articles on how the 1060 will stack up vs the 970? Good upgrade or sidegrade?

The 1060 looks like it will be equivalent to a 980 performance wise. Not a large upgrade from a 970, but you could (and should) do

Twerk from Home posted:

If you have a 970 and want an upgrade, your best bet is selling it (you might still be able to get $200!) and getting a GTX 1070.

^^ If you're looking to upgrade meaningfully.

The Slack Lagoon
Jun 17, 2008



Okay, thanks. Looks like a nice card for the cost though.

Was planning to get a 1070 in September so I'll stay the course

sauer kraut
Oct 2, 2004

bigmandan posted:

When I eventually get the card should I bother with a fresh driver install or will the existing nVidia drivers be fine?

Nuke everything with DDU, reboot, and reinstall; no need to risk any weird issues.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Does anyone know if there's a way to override the blower fan's minimum RPM on the RX 480? Would I need to mod the BIOS?

The noise is fine for gaming but it has an incredibly annoying idle sound. I taped the blower still for two minutes and the temp didn't go above 55C with normal desktop use, so it seems like this thing should have a passive mode like the open-air coolers do.

njsykora
Jan 23, 2012

Robots confuse squirrels.


So would I be right in thinking that if the benchmarks are solid for the 1060 I'm better off getting that to replace my 750 over the 1070 if I have no interest in going above 1080p?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

njsykora posted:

So would I be right in thinking that if the benchmarks are solid for the 1060 I'm better off getting that to replace my 750 over the 1070 if I have no interest in going above 1080p?

Depending on local availability and your budget, the fire-sale 980 Tis might be a better deal than a 1060. If you're penny pinching, the RX 470 is likely to be a better performance bargain than anything else out there.

sout
Apr 24, 2014

Still considering options for new GPU.
Using a HAF 912 case. It seems most people tend to prefer the fan style coolers instead of the reference blowers. Are they much quieter or something? They seem to direct heat in a pretty weird direction away from all the exhaust fans on the case.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

sout posted:

Still considering options for new GPU.
Using a HAF 912 case. It seems most people tend to prefer the fan style coolers instead of the reference blowers. Are they much quieter or something? They seem to direct heat in a pretty weird direction away from all the exhaust fans on the case.

Yes, they are much quieter and also cool much better. The air does not need to be directly in the path of the case fans to get moved. Some open coolers blow downwards onto the heatsink of the card while others pull air through the heatsink and eject it away from the card, so which direction the cooler pushes the hot air depends on the specific card. Regardless, as long as your case has good ventilation, an open cooler will generally be better than a blower cooler in just about every way.

penus penus penus
Nov 9, 2014

by piss__donald

sout posted:

Still considering options for new GPU.
Using a HAF 912 case. It seems most people tend to prefer the fan style coolers instead of the reference blowers. Are they much quieter or something? They seem to direct heat in a pretty weird direction away from all the exhaust fans on the case.

I'm sure there are scenarios where not venting via a blower creates some kind of runaway heat inside the case; however, I have never seen it. Open-air coolers always cool much better by design, and for most people that overcomes the disadvantage of dumping heat into the case. The only time I've personally seen a need for a blower was in SLI, but even then it stayed below throttling, and considering the reference cards already ran at near-throttling temperatures I wasn't entirely convinced the blower was better - but everybody says it is.

In a largish case like that with one GPU, open-air will be clearly better.

Number19
May 14, 2003

HOCKEY OWNS
FUCK YEAH


The "display disconnected" thing with DisplayPort happens because monitor vendors want as low a power draw when off as possible, as a checkbox item for marketing materials. To do this they stop powering the DP port, and sometimes the HDMI ones too. Some let you keep the ports alive through their OSD settings so long as main power is supplied; some don't.

It's stupid though, and all to save fractions of a watt at best.

Party Plane Jones
Jul 1, 2007

by Reene
Fun Shoe

Number19 posted:

The "display disconnected" thing with DisplayPort happens because monitor vendors want as low a power draw when off as possible, as a checkbox item for marketing materials. To do this they stop powering the DP port, and sometimes the HDMI ones too. Some let you keep the ports alive through their OSD settings so long as main power is supplied; some don't.

It's stupid though, and all to save fractions of a watt at best.

Isn't it an EU requirement to have half-watt standby modes?

Potato Salad
Oct 23, 2014

nobody cares


Is the 750Ti still the king of low-end, highly-efficient Nvidia-based cards? Please tell me something newer has decisively taken its place.

penus penus penus
Nov 9, 2014

by piss__donald

Potato Salad posted:

Is the 750Ti still the king of low-end, highly-efficient Nvidia-based cards? Please tell me something newer has decisively taken its place.

For the money, no, nothing's taken its place; that's still going to be the one. I saw one for $90 new somewhere today.

Bleh Maestro
Aug 30, 2003
RX 460 might be a competing option soon?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Potato Salad posted:

Is the 750Ti still the king of low-end, highly-efficient Nvidia-based cards? Please tell me something newer has decisively taken its place.

nVidia actually updated the 750 Ti in some markets to be a newer GPU with the same marketing name, so you can find a 750 Ti that's actually Maxwell 2.0 like the 970 and friends.

I wish I had a better source, but nobody really tends to cover budget graphics cards years after they came out: http://wccftech.com/nvidia-geforce-gtx-750-gm206-maxwell-2-0-architecture/

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Potato Salad posted:

Is the 750Ti still the king of low-end, highly-efficient Nvidia-based cards? Please tell me something newer has decisively taken its place.

Do you want just an efficient card, or do you need something with no PCI-E power connector requirement? GTX 950s are much, much faster than 750 Tis - around 40-50% faster. You can get an EVGA one for $109.99 after rebate right now on Newegg - http://www.newegg.com/Product/Product.aspx?Item=N82E16814487159

http://www.newegg.com/Product/Product.aspx?Item=N82E16814126090 - This one has no power connector but costs 25 bucks more

E: Wait for RX 460 to come out though, as that will be power efficient enough not to need PCI-E power and may outperform this option.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...


http://wccftech.com/nvidia-gtx-1060-performance-benchmarks-leak/

mango sentinel
Jan 5, 2001

by sebmojo
So if I've got $300 burning a hole in my pocket I should just sit on it until the 480 and 1060 have had a chance to slapfight for a couple months, right?

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

mango sentinel posted:

So if I've got $300 burning a hole in my pocket I should just sit on it until the 480 and 1060 have had a chance to slapfight for a couple months, right?

August seems like a good time to revisit it, yeah. Both GPUs should be available with plenty of aftermarket models and benchmarks.

Number19
May 14, 2003

HOCKEY OWNS
FUCK YEAH


Party Plane Jones posted:

Isn't it a requirement for the EU to have 1/2 watt standby modes?

Something like that. It sucks that some monitor manufacturers are removing the option to disable the setting now though.


EdEddnEddy
Apr 5, 2012



ZobarStyl posted:

Allow me to clarify: absolutely nothing I've thrown at my Fury X has had a problem with the 4GB limitation at 3x1080p triple-head.

A Fury with 4GB of HBM isn't really an apples-to-apples comparison to anything running GDDR5(X).

With the bandwidth HBM has to move stuff around, 4GB can handle up to 4K detail without the memory performance limitations that 4GB GDDR5 cards hit at the same res.

On any card not running HBM (currently, anything that isn't a Fury), you do want more than 4GB for anything at 1440p or above.
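(Putting rough numbers on that bandwidth gap — specs quoted from memory, so treat them as approximate. Peak bandwidth is just bus width times per-pin data rate:)

```python
# Back-of-envelope peak memory bandwidth:
# GB/s = bus width (bits) / 8 * per-pin data rate (Gbps)
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

fury_x  = bandwidth_gbs(4096, 1.0)  # HBM1: 4096-bit bus @ ~1 Gbps/pin
gtx_970 = bandwidth_gbs(256, 7.0)   # GDDR5: 256-bit bus @ ~7 Gbps/pin

print(f"Fury X ~{fury_x:.0f} GB/s vs GTX 970 ~{gtx_970:.0f} GB/s")
# Fury X ~512 GB/s vs GTX 970 ~224 GB/s
```

Whether that extra bandwidth actually substitutes for capacity is debatable (a texture that doesn't fit still doesn't fit), but it does explain why the Fury cards behave differently at the 4GB mark than GDDR5 cards do.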



Also, with all the DP talk, the only thing I absolutely loathe about it is the stupid-as-hell button to disconnect the drat thing. With all other current connectors just being push plugs, why the hell did they think having a latch button on the wrong side of the connector was a good thing? If I accidentally plug the thing in under my DVI, it's a royal pain to press just right to get it out. Same goes for a crappily designed monitor where the connector faces the screen itself from underneath or something, and you have to use some other tools or someone else's slim hand to get the thing out. :argh:


I have been looking on and off for any sort of chained DP setup since I first saw that DP supported it, but sadly it seems nobody has used it, which is a bummer. One cable for multiple displays sounds neat and clean.
