sauer kraut
Oct 2, 2004
I can't believe that I could order a custom, in stock 1060 at MSRP right now.
The plain black dual fan Palit (NE51060015J9)/Gainward (3712) seem to be the 'reference' model for price this round.

e; there also is a plain black MSI dual fan model, below the Armor, at MSRP. Good luck snatching that though.
MSI GeForce GTX 1060 6GT OC (V809-2205R)

sauer kraut fucked around with this message at 19:17 on Jul 19, 2016

Gwaihir
Dec 8, 2009
Hair Elf

Zerilan posted:

So how does this 1060 compare to the 1070? Currently using a 780, which can at least run everything I'm playing on OK settings, so I'm not in urgent need of an upgrade, but I'm trying to decide whether to just get a 1060 now or wait until the holiday season and get a 1070 or 80 then.

The 1060 is basically the old GTX 980 and the 1070 is the 980 Ti, both at roughly 50W lower power usage. Both also look to gain about 15% performance from overclocking.

It's a $150 jump from a 1060 to a 1070, though, for 30-35% better performance, so unless you're gaming above 1080p it doesn't make much sense price/performance-wise.
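Back-of-the-envelope on that price/performance claim (the MSRPs here are assumed round numbers: $250 for a 1060, $400 for a 1070):

```python
def relative_increase(old, new):
    """Fractional increase going from old to new."""
    return (new - old) / old

# Assumed launch MSRPs, matching the ~$150 gap discussed above.
price_1060, price_1070 = 250.0, 400.0
perf_gain = 0.32  # midpoint of the 30-35% range quoted above

price_gain = relative_increase(price_1060, price_1070)  # 0.60
# Relative performance-per-dollar of the 1070 vs the 1060:
perf_per_dollar = (1 + perf_gain) / (1 + price_gain)    # 0.825

print(f"price: +{price_gain:.0%}, perf: +{perf_gain:.0%}, "
      f"perf/$ vs 1060: {perf_per_dollar:.1%}")
```

So at 1080p you pay 60% more money for ~32% more frames, and the 1070 delivers only ~82% of the 1060's performance per dollar, which is the whole argument in a nutshell.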

mango sentinel
Jan 5, 2001

by sebmojo
I asked before but didn't see an answer. When do these actually tend to start showing up at retail?

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Paul MaudDib posted:

5K 120hz ultrawide gaming is not a thing yet.

Sounds like someone doesn't know any train enthusiasts.

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!

mango sentinel posted:

I asked before but didn't see an answer. When do these actually tend to start showing up at retail?

As in when you can buy one? Now, mine arrives tomorrow.

mango sentinel
Jan 5, 2001

by sebmojo

Mega Comrade posted:

As in when you can buy one? Now, mine arrives tomorrow.

I meant brick and mortar, sorry.

mango sentinel fucked around with this message at 19:21 on Jul 19, 2016

Gwaihir
Dec 8, 2009
Hair Elf

mango sentinel posted:

I asked before but didn't see an answer. When do these actually tend to start showing up at retail?

Right now? Nowinstock has them going in and out of stock all day so far.

e: Oh, B&M :saddowns:

Hieronymous Alloy
Jan 30, 2009


Why! Why!! Why must you refuse to accept that Dr. Hieronymous Alloy's Genetically Enhanced Cream Corn Is Superior to the Leading Brand on the Market!?!




Morbid Hound
From reading this thread for the past few weeks it sounds like all the card wars arguments are moot because all the cards being made sell out anyway, demand outstrips manufacture.

kimcicle
Feb 23, 2003

mango sentinel posted:

I meant brick and mortar, sorry.

Saw a couple pop in and out of stock quickly at microcenter. Best Buy has a few listings online, but no option for in-store pickup yet.

afkmacro
Mar 29, 2009



Hieronymous Alloy posted:

From reading this thread for the past few weeks it sounds like all the card wars arguments are moot because all the cards being made sell out anyway, demand outstrips manufacture.

I don't know about you, but I still think AMD sucks regardless of how many cards they sell.

sauer kraut
Oct 2, 2004
By the way, any FPS number you see for Nvidia cards in Vulkan Doom is likely wrong.
At the moment there seems to be no way to measure it accurately, due to a driver issue.

jabro
Mar 25, 2003

July Mock Draft 2014

1st PLACE
RUNNER-UP
got the knowshon


I got one of the LG 34" 1080p ultrawide monitors on Prime Day. It's FreeSync, so I thought about getting an RX 480 to take advantage of it, since my 770ti is making me drop my graphics settings lower than I like. Since those are out of stock until whenever, is FreeSync a good enough thing that I should wait, instead of getting a 1060 now?

SwissArmyDruid
Feb 14, 2014

by sebmojo
This needed to be shared:

edit: Never mind. Imgur is ded right now, but somehow the embedded images in the reddit post still work:

editedit: Imgur not hosed anymore.



https://www.reddit.com/r/techsupportgore/comments/4t8vlu/i_did_a_horrible_thing/

They used thermal adhesive, not thermal compound when repasting their 480.

SwissArmyDruid fucked around with this message at 02:01 on Jul 20, 2016

japtor
Oct 28, 2005

EdEddnEddy posted:

One thing about SLI. I remember reading up on 3dfx and their Voodoo chips right before they got bought out. The Voodoo 5 6000 had 4 of those VSA-100 chips and technically, with the tech the chips brought, they could stack as many of them as you could mount on a board and connect together (which sounded like a nightmare above 4 and had issues all over with AGP being what it was). But when everything worked, they scaled rather well.

What is keeping Nvidia from using some of that old research and doing something similar (or is that what they are doing with the Compute Link with Tesla?) and instead of being SLI when they have a dual GPU board? Or is this sort of tech similar to what we have now with the Shader Cores and we have come so far now that all the magic that was T&L is just older than dirt tech news these days?
I asked something similar before, but with PowerVR cause a lot of their designs seem to be slapping more cores on and scaling performance well with that. But I don't know if it's comparable cause all their stuff is integrated at the SoC level, like whether their cores are akin to standard GPUs or more like the individual units in them (whatever they're called, where they already have a shitload).

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me

Gwaihir posted:

The 1060 is basically the old GTX 980 and the 1070 is the 980 Ti, both at roughly 50W lower power usage. Both also look to gain about 15% performance from overclocking.

It's a $150 jump from a 1060 to a 1070, though, for 30-35% better performance, so unless you're gaming above 1080p it doesn't make much sense price/performance-wise.

The performance numbers I've been seeing look like the GTX 1060 is closer to an AIB GTX 970 (mild factory OC) than a GTX 980 in most games. Granted, all three cards are fairly close together, but the "1060 equals 980" mantra doesn't seem to be really accurate.

Gwaihir
Dec 8, 2009
Hair Elf

PBCrunch posted:

The performance numbers I've been seeing look like the GTX 1060 is closer to an AIB GTX 970 (mild factory OC) than a GTX 980 in most games. Granted, all three cards are fairly close together, but the "1060 equals 980" mantra doesn't seem to be really accurate.

I mean, seems pretty close?


Same relative performance holds at 2560x1440.

Maybe an aftermarket 980 OC'd to 1.5GHz brings the margin a little closer, since it looks like the 1060 only picks up 14-15% from overclocking, but measuring OC-to-OC is a crapshoot.

Enigma
Jun 10, 2003
Raetus Deus Est.

Is ZOTAC a decent brand for a 1060? I am not familiar with them.

sauer kraut
Oct 2, 2004

jabro posted:

I got one of the LG 34" 1080p ultrawide monitors on Prime Day. It's FreeSync, so I thought about getting an RX 480 to take advantage of it, since my 770ti is making me drop my graphics settings lower than I like. Since those are out of stock until whenever, is FreeSync a good enough thing that I should wait, instead of getting a 1060 now?

*sync is great and you should wait for a custom 480.

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!

jabro posted:

I got one of the LG 34" 1080p ultrawide monitors on Prime Day. It's FreeSync, so I thought about getting an RX 480 to take advantage of it, since my 770ti is making me drop my graphics settings lower than I like. Since those are out of stock until whenever, is FreeSync a good enough thing that I should wait, instead of getting a 1060 now?

While I think the 1060 is a slightly better card, that advantage dwindles with *sync on the table. Wait for a 480.

repiv
Aug 13, 2009

EdEddnEddy posted:

One thing about SLI. I remember reading up on 3dfx and their Voodoo chips right before they got bought out. The Voodoo 5 6000 had 4 of those VSA-100 chips and technically, with the tech the chips brought, they could stack as many of them as you could mount on a board and connect together (which sounded like a nightmare above 4 and had issues all over with AGP being what it was). But when everything worked, they scaled rather well.

What is keeping Nvidia from using some of that old research and doing something similar (or is that what they are doing with the Compute Link with Tesla?) and instead of being SLI when they have a dual GPU board? Or is this sort of tech similar to what we have now with the Shader Cores and we have come so far now that all the magic that was T&L is just older than dirt tech news these days?

There was nothing magical about 3dfx SLI - all they did was alternate rendering the horizontal lines of the image between each GPU, which is similar in principle to SFR in modern SLI/CF setups.

SFR was killed by render passes that sample adjacent pixels, AFR took its place, and now AFR is being killed by render passes that sample previous frames, so AMD/NV said "gently caress this" and DX12 EMA was born to make it the engine developers' problem :whip:
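A toy sketch of that 3dfx-style scanline split (illustrative names only, nothing like a real driver API): each GPU owns every Nth line, which is exactly why a pass that reads neighbouring lines needs data the other GPU owns.

```python
def assign_scanlines(height, num_gpus):
    """3dfx-style SLI: GPU i renders scanlines i, i+N, i+2N, ...
    Each GPU effectively renders the scene at reduced vertical resolution."""
    return {gpu: list(range(gpu, height, num_gpus)) for gpu in range(num_gpus)}

work = assign_scanlines(height=8, num_gpus=2)
# GPU 0 -> [0, 2, 4, 6], GPU 1 -> [1, 3, 5, 7]
# A shader sampling the line directly above or below any pixel always
# needs a line owned by the *other* GPU -- the cross-GPU dependency that
# killed this scheme (and SFR) once render passes started sampling
# adjacent pixels.
```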

jabro
Mar 25, 2003

July Mock Draft 2014

1st PLACE
RUNNER-UP
got the knowshon


sauer kraut posted:

*sync is great and you should wait for a custom 480.


Mega Comrade posted:

While I think the 1060 is a slightly better card, that advantage dwindles with *sync on the table. Wait for a 480.

That's what I figured but was starting to get that I CAN GET A CARD RIGHT NOW anxiety. Thanks, guys.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Enigma posted:

Is ZOTAC a decent brand for a 1060? I am not familiar with them.

Zotac is fine.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

FaustianQ posted:

I don't know how much effort went into its design, but my MG279Q supports a 30-90Hz range, which is pretty drat good; an increase to 30-120Hz for Freesync should basically make it a no-brainer

There's a firmware mod for that exact monitor which turns its range into 55-144Hz.

Enigma posted:

I wish I still played multiplayer fps games, but family/career and such. Mostly single player stuff these days that I can pause. I'm getting a 144Hz monitor, so I'm hoping that is good enough.

As I've stated before, it is. Once you get a 144Hz monitor, the GSync/FreeSync stuff no longer makes a great difference, because the faster refresh rate of the monitor already more than halves tearing/stutter.

60Hz->144Hz is the big one.
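The "more than halves" part is just refresh arithmetic: a tear or stutter artifact persists for at most one refresh interval, so 60Hz -> 144Hz alone shrinks the worst case by ~58%.

```python
def frame_interval_ms(refresh_hz):
    """Time one refresh stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz

t_60 = frame_interval_ms(60)    # ~16.7 ms
t_144 = frame_interval_ms(144)  # ~6.9 ms
# Worst-case time a tear/stutter artifact is visible drops by ~58%:
reduction = 1 - t_144 / t_60
```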

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

spasticColon posted:

Economically feasible. And the "new" consoles are going to do some kind of fancy up-scaling to meet the 4K target. It's gonna be a while yet before a $250 video card or a $400 console can run games natively at 4K with all the graphical whistles and bells turned on.

Edit: Honestly I just don't understand the 4K gaming circlejerk. Movies and TV do look better on 4K sets but for gaming I'm satisfied with 1080p high/ultra settings at 60fps.

I would argue the opposite: the difference between 1080p and 4K is much more visible at typical monitor viewing distances than at typical TV viewing distances.
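One way to make that concrete is pixels per degree of visual angle. The screen sizes and viewing distances below are purely illustrative assumptions, not anyone's actual setup:

```python
import math

def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    """Horizontal pixels per degree of visual angle at the given distance."""
    fov_deg = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# Assumed setups: a 27" 16:9 monitor (~23.5" wide) viewed at 24",
# vs a 55" TV (~48" wide) viewed at 8 feet.
monitor_1080 = pixels_per_degree(1920, 23.5, 24)  # ~37 ppd
tv_1080      = pixels_per_degree(1920, 48, 96)    # ~68 ppd
# ~60 ppd is a common rule of thumb for 20/20 acuity: the 1080p TV is
# already near that limit, while the 1080p monitor is far below it, so
# the step up to 4K has much more visible headroom on the desk.
```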

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

spasticColon posted:

Edit: Honestly I just don't understand the 4K gaming circlejerk. Movies and TV do look better on 4K sets but for gaming I'm satisfied with 1080p high/ultra settings at 60fps.

I'm struggling to understand why this would be the case. Are you saying that there's a quality improvement from 4K video that you don't see in gaming, or that you care about the improvement for video but don't for games for some reason? Have you seen games running at 4K at the same settings otherwise that you run, or are you just saying that you can't miss what you've never experienced?

MaxxBot posted:

I would argue the opposite, the difference between 1080p and 4k is much more visible at typical monitor viewing distances than it is at typical TV viewing distances.

Past the clear tautology of "you can always see more detail on an object if you're closer to it", you kind of have to define typical distance and your monitor/TV sizes to make a statement like this.

Eletriarnation fucked around with this message at 21:17 on Jul 19, 2016

Sininu
Jan 8, 2014

spasticColon posted:

Edit: Honestly I just don't understand the 4K gaming circlejerk. Movies and TV do look better on 4K sets but for gaming I'm satisfied with 1080p high/ultra settings at 60fps.

The more pixels there are, the less visible aliasing becomes; movies don't have that problem, so extra resolution benefits games far more than video.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Paul MaudDib posted:

Ultrawide is a thing. Surround gaming is a thing. People will learn not to rubberneck constantly. As someone who works with multiple monitors open it's a godsend, and xthetenth never stops jerking off about how awesome his X34 is for gaming.

5K 120hz ultrawide gaming is not a thing yet. And even then people love to crank up the supersampling and all that poo poo, so we will need to be effectively driving some 4x or 8x multiple of that. Every time a newer larger resolution comes out people say "oh, 1080p is so dense that you don't need supersampling because you can't see the aliasing" and then they inevitably turn it back on when GPU performance catches up.

We have a ways to go. And by then we will probably come up with something even newer and even more wasteful of GPU horsepower.

I'll always take more room, but honestly I love my ultrawides more for work/incessant screwing around online. It's real nice for gaming, but it's not "two uncramped windows up at once" levels of nice, and that's before getting into wing monitors flanking the ultrawide. I'd definitely cross-shop 144Hz 16:9s if it were just for gaming, though. If I had the power I'd run Eyefinity with it and the wings, no question about it, and if somebody makes a 4K VR headset intended for working with virtualized screen space I'd take out a loan for that poo poo.

Klyith
Aug 3, 2007

GBS Pledge Week

Mega Comrade posted:

So with DX12, is the 480 gonna pull out ahead of the 1060, or is the initial "AMD is more future proof" thing more of a current-driver effect?

Maybe more like pull even than pull ahead, but there are a lot of unknowns about this still.

AMD seems to have more performance boost from DX12 / Vulkan than nvidia does right now. It could be that GCN is more suited to the next gen software than the nvidia Kepler & Pascal tech. And between AMD's positions in the standards (gave Mantle to Vulkan, active on DX12) and occupying both consoles, it seems unlikely that performance will suddenly fall off a cliff for them. Games are going to be targeting GCN as a main platform for years to come.

OTOH there aren't a whole lot of games to confirm the trend. It could be that nvidia just hasn't yet put much effort into driver work for it. Or possibly the GTX 11xx will be the one that gets major revisions built for the DX12/Vulkan future. As much as AMD seems to make GPUs that age gracefully, Nvidia seems to put a lot of work into thrashing the gently caress out of games that are on the shelves now.

If you're looking at the 480/1060 decision, I'd only count the 480 as having future-proof potential if you think you're gonna keep it for a good long time. If you're buying cheap because you buy a video card every generation, it doesn't matter.


Hieronymous Alloy posted:

From reading this thread for the past few weeks it sounds like all the card wars arguments are moot because all the cards being made sell out anyway, demand outstrips manufacture.

I think the pent-up demand from the last generation stretching out for nearly two years is really something else.

Skuto posted:

As I've stated before, it is. Once you get a 144Hz monitor, the GSync/FreeSync stuff no longer makes a great difference, because the faster refresh rate of the monitor already more than halves tearing/stutter.

60Hz->144Hz is the big one.
Agreed. *sync is probably a big thing for comp fps players, but for most people I'd illustrate like this:

Gsync
Freesync
--
144hz
-
|
|
-
60hz

Enigma
Jun 10, 2003
Raetus Deus Est.

Klyith posted:

Maybe more like pull even than pull ahead, but there are a lot of unknowns about this still.

AMD seems to have more performance boost from DX12 / Vulkan than nvidia does right now. It could be that GCN is more suited to the next gen software than the nvidia Kepler & Pascal tech. And between AMD's positions in the standards (gave Mantle to Vulkan, active on DX12) and occupying both consoles, it seems unlikely that performance will suddenly fall off a cliff for them. Games are going to be targeting GCN as a main platform for years to come.

OTOH there aren't a whole lot of games to confirm the trend. It could be that nvidia just hasn't yet put much effort into driver work for it. Or possibly the GTX 11xx will be the one that gets major revisions built for the DX12/Vulkan future. As much as AMD seems to make GPUs that age gracefully, Nvidia seems to put a lot of work into thrashing the gently caress out of games that are on the shelves now.

If you're looking at the 480/1060 decision, I'd only count the 480 as having future-proof potential if you think you're gonna keep it for a good long time. If you're buying cheap because you buy a video card every generation, it doesn't matter.


I think the pent-up demand from the last generation stretching out for nearly two years is really something else.

Agreed. *sync is probably a big thing for comp fps players, but for most people I'd illustrate like this:

Gsync
Freesync
--
144hz
-
|
|
-
60hz

Cool, thanks! That's really the deciding factor for me between a 1060 and 480, but if it makes that little of a difference at 144Hz then I will go with the 1060.

Sininu
Jan 8, 2014

Everyone knows how Nvidia cards have issues with idle clocks on 144Hz screens, and how Nvidia doesn't really seem to care about fixing it. But there's also another big issue Nvidia doesn't seem to care about, and that's diagonal screen tearing with laptop cards. It's been a thing for at least 1.5 years and Nvidia hasn't said a word about it.
https://forums.geforce.com/default/topic/903422/geforce-mobile-gpus/diagonal-screen-tearing-issues-on-gtx-860m-870m-960m-965m-970m-980m-/
http://forum.notebookreview.com/threads/strange-diagonal-screen-tearings.771358/
I only noticed this a week ago on my laptop, because I almost never play games on the laptop's own screen, but now it's incredibly distracting.

SwissArmyDruid
Feb 14, 2014

by sebmojo

spasticColon posted:

Economically feasible. And the "new" consoles are going to do some kind of fancy up-scaling to meet the 4K target. It's gonna be a while yet before a $250 video card or a $400 console can run games natively at 4K with all the graphical whistles and bells turned on.

Edit: Honestly I just don't understand the 4K gaming circlejerk. Movies and TV do look better on 4K sets but for gaming I'm satisfied with 1080p high/ultra settings at 60fps.

Please bear with the fact that it's from Gamespot: this is one of the BEST demonstrations of what 4K does for gaming that I have ever found on the internet, and it involves demonstrating diagonal lines at various resolutions... in Photoshop.

https://www.youtube.com/watch?v=1UcBwsQTwwI&t=109s

The explanation carries all the way through to 4:10, where they explain how you can get away with no anti-aliasing at all if you've got enough resolution.

In short: More pixels mean less jaggies, because each individual pixel is a much smaller part of an overall line or edge, so the "stair effect" is reduced. This means you can turn AA down or off entirely and reclaim the power you'd normally use for it. This power can then be put towards texture quality or effects or whatever.

SwissArmyDruid fucked around with this message at 21:47 on Jul 19, 2016
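The stair-step point is easy to reproduce with a toy nearest-pixel rasterizer (an illustration, nothing like real hardware): the same line on a grid of twice the resolution gets twice as many stairs, each half the physical size.

```python
def rasterize_line(slope, width):
    """Nearest-pixel rasterization of y = slope*x across `width` columns;
    returns the row index lit in each column."""
    return [round(slope * x) for x in range(width)]

def step_count(rows):
    """Count the vertical jumps ('stairs') along the rasterized edge."""
    return sum(1 for a, b in zip(rows, rows[1:]) if a != b)

coarse = rasterize_line(0.3, 10)  # 3 stairs on a 10-column grid
fine = rasterize_line(0.3, 20)    # 6 stairs on a 20-column grid
# Same line, double the resolution: twice as many stairs, but each pixel
# (and therefore each stair) is half the physical size, so the edge
# looks smoother at the same screen size.
```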

Gunder
May 22, 2003

Edit: nvm.

Enigma
Jun 10, 2003
Raetus Deus Est.

AVeryLargeRadish posted:

Zotac is fine.

So this would be a fine purchase?

EdEddnEddy
Apr 5, 2012



SwissArmyDruid posted:

Please bear with the fact that it's from Gamespot: this is one of the BEST demonstrations of what 4K does for gaming that I have ever found on the internet, and it involves demonstrating diagonal lines at various resolutions... in Photoshop.

https://www.youtube.com/watch?v=1UcBwsQTwwI&t=109s

The explanation carries all the way through to 4:10, where they explain how you can get away with no anti-aliasing at all if you've got enough resolution.

In short: More pixels mean less jaggies, because each individual pixel is a much smaller part of an overall line or edge, so the "stair effect" is reduced. This means you can turn AA down or off entirely and reclaim the power you'd normally use for it. This power can then be put towards texture quality or effects or whatever.

This has been an argument since the Voodoo 5/GeForce 2 era, when AA was really starting to be applicable.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

SwissArmyDruid posted:

In short: More pixels mean less jaggies, because each individual pixel is a much smaller part of an overall line or edge, so the "stair effect" is reduced. This means you can turn AA down or off entirely and reclaim the power you'd normally use for it. This power can then be put towards texture quality or effects or whatever.
That only works on still images. For moving images like video or games, jaggies remain distracting even at high resolutions because you see "temporal aliasing", which manifests as shimmer or sparkling. The only way to fix this is to filter out detail smaller than the pixels of the display. One way to do that is to supersample and scale down; the other is to subsample and scale up. The latter is much faster and often works almost as well.
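The supersample-and-scale-down path in miniature (a pure-Python sketch, not a real renderer): render at 2x per axis, then box-filter each 2x2 block into one output pixel.

```python
def downsample_2x(img):
    """Box-filter a hi-res image (list of rows of intensities) down 2x:
    each output pixel is the mean of a 2x2 block, i.e. 4x supersampling."""
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, len(img[0]), 2)]
            for y in range(0, len(img), 2)]

# A hard black/white edge rendered at 2x resolution:
hires = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 1, 1, 1],
         [0, 1, 1, 1]]
lores = downsample_2x(hires)
# -> [[0.0, 1.0], [0.5, 1.0]]: the 0.5 is the grey edge pixel that a
# plain 1x render would have snapped to pure black or white, which is
# what filters out the sub-pixel detail that causes shimmer in motion.
```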

wolrah
May 8, 2006
what?

EdEddnEddy posted:

One thing about SLI. I remember reading up on 3dfx and their Voodoo chips right before they got bought out. The Voodoo 5 6000 had 4 of those VSA-100 chips and technically, with the tech the chips brought, they could stack as many of them as you could mount on a board and connect together (which sounded like a nightmare above 4 and had issues all over with AGP being what it was). But when everything worked, they scaled rather well.

What is keeping Nvidia from using some of that old research and doing something similar (or is that what they are doing with the Compute Link with Tesla?) and instead of being SLI when they have a dual GPU board? Or is this sort of tech similar to what we have now with the Shader Cores and we have come so far now that all the magic that was T&L is just older than dirt tech news these days?

3DFX-era SLI divided up the load by having each GPU handle a fraction of the scan lines, basically allocating them in order. Effectively each one is rendering the same scene at the same time with a reduced vertical resolution.

This doesn't work for operations that need to have the entire framebuffer or nearby lines available, which if I'm not mistaken is most pixel shader operations and possibly even some hardware lighting. Basically the same sort of problem modern alternate-frame SLI/CF has with deferred rendering, just on a different level.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Enigma posted:

So this would be a fine purchase?

Yes it is fine, Zotac is fine

wicka
Jun 28, 2007


Skuto posted:

As I've stated before, it is. Once you get a 144Hz monitor, the GSync/FreeSync stuff no longer makes a great difference, because the faster refresh rate of the monitor already more than halves tearing/stutter.

60Hz->144Hz is the big one.

i would like to hear more about this bc the gsync premium is really the only thing stopping me from buying a new monitor rn

Enigma
Jun 10, 2003
Raetus Deus Est.

Taima posted:

Yes it is fine, Zotac is fine

Thank you! Sorry, just a little nervous about my first foray into building and a touch overwhelmed by the number of options.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
I wish I had known that the small Zotac cards would pop up as in stock during the afternoon. I got up 2 hours after launch this morning and saw that everything on Newegg/Amazon was already sold out. I figured that was the end for a week or so, so I said gently caress it and ordered an FE despite it being almost too long to fit in my case and not really wanting a blower cooler. Now I see that I could have gotten what I wanted for $20 less, but it's too late unless I want to just order both and try to flip the FE.
