snuff
Jul 16, 2003

Massasoit posted:

I got the gigabyte g1 gaming because I stopped at local microcenter and they happened to have it in stock and I put it on "EcoMode" which I think underclocks it. Not sure it really does much. It also has an overclocking mode

Bleh Maestro posted:

Ign's daily deals page said there was an $80 off Gigabyte G1 1070 on sale today. It sold out but did anyone see this? Was it $80 off $429 or some higher price?

If so, that's a good deal to look out for.

The Gigabyte G1 is probably the worst AIB card, outside of blower cards.

The Slack Lagoon
Jun 17, 2008



snuff posted:

The Gigabyte G1 is probably the worst AIB card, outside of blower cards.

How so?

snuff
Jul 16, 2003

Loud at load compared to the competition.

Peanut3141
Oct 30, 2009

snuff posted:

Loud at load compared to the competition.

I can agree with this. My EVGA FTW blows it out of the water in terms of full load cooler noise.

The Slack Lagoon
Jun 17, 2008



Peanut3141 posted:

I can agree with this. My EVGA FTW blows it out of the water in terms of full load cooler noise.

It is a bit loud, but I'm glad I don't have coil whine. Every EVGA card I've had has had coil whine - 660 Ti, 760, 970

averox
Feb 28, 2005



:dukedog:
Fun Shoe
I dunno, I don't really notice it. My case isn't right next to my head.

Then again, central air is blasting and I couldn't give too much poo poo about a little extra air noise.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Admiral Ray posted:

Should still be better than that i5-760, though: Comparison between 760, 8350, and 2600k. The FX-8350 has a much better single thread rating for Passmark.

My understanding is that games can be relatively low-IPC code with lots of cache misses, pipeline flushes, and branching, which means that long-pipeline CPU designs like the Pentium 4 and AMD Bulldozer aren't able to perform as well as they should be capable of. Benchmarks like Passmark tend to be set up to show off the CPU performing at its best, while heavily branching code can bring them to their knees.

Another issue confounding things is that with Bulldozer vs. Nehalem / Sandy Bridge etc., you've got one that's integer-math heavy and one that's comparatively FPU heavy. CPUs have never been as simple as "X CPU is better than Y CPU"; we've just gotten used to things being good enough that we don't have to think about it.
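As a minimal illustration of the branchy-code point (the array size, threshold, and data below are invented, and the size of the effect varies a lot by CPU): the same loop runs much slower when its branch is unpredictable, because every misprediction flushes the pipeline, and long-pipeline designs like the Pentium 4 and Bulldozer pay the biggest penalty.

```cpp
// Sketch only: time the same work with unpredictable vs predictable branches.
#include <algorithm>
#include <chrono>
#include <cstdint>
#include <iostream>
#include <random>
#include <vector>

// Sum all elements above a threshold; the `if` is the branch the predictor has to guess.
static std::int64_t sum_over_threshold(const std::vector<int>& data) {
    std::int64_t sum = 0;
    for (int v : data) {
        if (v > 128) {
            sum += v;
        }
    }
    return sum;
}

int main() {
    // Random bytes 0..255: unsorted, the branch is a coin flip; sorted, it is almost
    // perfectly predictable.
    std::vector<int> data(1 << 22);
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> dist(0, 255);
    for (int& v : data) v = dist(rng);

    auto time_it = [&](const char* label) {
        auto t0 = std::chrono::steady_clock::now();
        volatile std::int64_t sink = sum_over_threshold(data);  // volatile so it isn't optimized away
        auto t1 = std::chrono::steady_clock::now();
        (void)sink;
        std::cout << label << ": "
                  << std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count()
                  << " ms\n";
    };

    time_it("unsorted (branch mispredicts often)");
    std::sort(data.begin(), data.end());
    time_it("sorted   (branch predicts well)");
}
```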

wicka
Jun 28, 2007


i've never noticed the noise from my G1

penus penus penus
Nov 9, 2014

by piss__donald

Phlegmish posted:

I'm a bit surprised, looking at PassMark it has a decent score:

https://www.cpubenchmark.net/cpu.php?cpu=AMD+FX-8350+Eight-Core

I guess maybe per core it's not very good?

Yes, that's the shortest summary of AMD CPU problems. A $60 Intel Pentium could outperform it in games, and in fact in a lot of things (though some specific recent games have been bottlenecked even by an actual 2-core Intel, just as a reference point).

CPU synthetic benchmarks are unfortunately highly misleading for average use, especially for games.

It's actually kind of difficult to find benchmarks with AMD CPUs anymore, at least comprehensive ones with a focus on the CPU's effect on things. There was a lot of hope early on that DirectX 12 and Vulkan would close the gap between the common 8-core FX and the respective common 4-core non-HT Intel, but as far as I can tell it's not even remotely close. The AMD CPUs benefit, but so do the Intels, and the gap remains. I tried to look this up again now out of interest but couldn't find a whole lot, which is kind of surprising.

Here's another thing that really shocked me when I first experienced it. If you play games online, especially ones where latency is important or lots of things are happening, the difference between AMD and Intel is amazing. Even locked at exactly the same framerate, the AMD acts like a stuttering mess compared to an Intel. I went from an AMD Phenom II 965 BE with a maxed-out OC, a chip I highly regarded for a very long time, to a bone-stock 4670K and was absolutely blown away at the difference. In single player the fps boost was minor at the time, but once I went online it was clear as day. These were games like BF3 and the like. I thought the Intel hype was half bullshit too; I was having some pretty strong buyer's remorse before I booted it up. FX CPUs actually got worse, at least the ones right after the Phenoms. Good luck finding benchmarks on the difference between CPUs in single player vs. multiplayer; they just didn't happen.

Zen CPUs might fix the huge gap, but I'm rambling too much about CPUs. Just want anybody here to know: if you're riding on an old FX CPU and wondering why your fps doesn't exactly match what your new GPU says it should do, that's almost certainly why.
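To make the "same framerate, still a stuttering mess" point concrete, here's a small sketch comparing two invented frame-time traces with identical average FPS; the spiky one is what stutter looks like in the numbers even though the average is fine (the traces and spike values are made up purely for illustration).

```cpp
// Minimal sketch of why average FPS hides stutter: same average, very different
// 99th-percentile and worst-case frame times.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <numeric>
#include <vector>

static void report(const char* label, std::vector<double> frame_ms) {
    double total = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0);
    double avg_fps = 1000.0 * frame_ms.size() / total;
    std::sort(frame_ms.begin(), frame_ms.end());
    std::size_t idx = std::min(frame_ms.size() - 1,
                               static_cast<std::size_t>(frame_ms.size() * 0.99));
    std::cout << label << ": avg " << avg_fps << " fps, 99th-percentile frame "
              << frame_ms[idx] << " ms, worst " << frame_ms.back() << " ms\n";
}

int main() {
    // Both traces average ~16.7 ms/frame (~60 fps); the values are invented.
    std::vector<double> smooth(1000, 16.7);
    std::vector<double> stuttery(1000, 15.0);
    for (std::size_t i = 0; i < stuttery.size(); i += 20) stuttery[i] = 49.0;  // periodic spikes

    report("smooth  ", smooth);
    report("stuttery", stuttery);
}
```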

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

PerrineClostermann posted:

...looks like I'm crashing at more than +80 in Afterburner...

Did you increase the voltage? I didn't get much headroom on the core with my EVGA SC 1070, but I did add a few more clocks on the memory. Overall not a huge change.

snuff
Jul 16, 2003

wicka posted:

i've never noticed the noise from my G1

That's great, but it's unfair to potential buyers not to mention that it's one of the objectively loudest AIBs.

Barry
Aug 1, 2003

Hardened Criminal

snuff posted:

That's great, but it's unfair to potential buyers not to mention that it's one of the objectively loudest AIBs.

Is there a comparison of noise levels somewhere?

I have a G1 and haven't heard anything out of it, at least not over any other ambient noise.

e: What's the deal with voltage control in MSI Afterburner? It's incredibly non-intuitive.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

THE DOG HOUSE posted:

Here's another thing that really shocked me when I first experienced it. If you play games online, especially ones where latency is important or lots of things are happening, the difference between AMD and Intel is amazing. Even locked at exactly the same framerate, the AMD acts like a stuttering mess compared to an Intel. I went from an AMD Phenom II 965 BE with a maxed-out OC, a chip I highly regarded for a very long time, to a bone-stock 4670K and was blown away at the difference. In single player the fps boost was minor at the time, but once I went online it was clear as day. These were games like BF3 and the like. I thought the Intel hype was half bullshit too; I was having some pretty strong buyer's remorse before I booted it up. FX CPUs actually got worse, at least the ones right after the Phenoms. Good luck finding benchmarks on the difference between CPUs in single player vs. multiplayer; they just didn't happen.

The long and short of it is that a 6600k utterly destroys everything in the AMD stable, even on threaded tasks, even before we get into hyperthreading, hexacore, etc.

wicka
Jun 28, 2007


snuff posted:

That's great, but it's unfair to potential buyers not to mention that it's one of the objectively loudest AIBs.

that's great, where are these objective comparisons?

repiv
Aug 13, 2009

wicka posted:

that's great, where are these objective comparisons?

https://www.computerbase.de/2016-07/geforce-gtx-1080-partnerkarten-vergleich-test/

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

This was about the 1070, not the 1080.

snuff
Jul 16, 2003

Barry posted:

Is there a comparison of noise levels somewhere?

I have a G1 and haven't heard anything out of it, at least not over any other ambient noise.

e: What's the deal with voltage control in MSI Afterburner? It's incredibly non-intuitive.

http://www.eteknix.com/palit-gamerock-premium-gtx-1070-graphics-card-review/11/

That's just off the top of my head, I have heard it from several sources and owners in this thread.

wicka
Jun 28, 2007


AVeryLargeRadish posted:

This was about the 1070, not the 1080.

probably going to be very similar results

Barry
Aug 1, 2003

Hardened Criminal

snuff posted:

http://www.eteknix.com/palit-gamerock-premium-gtx-1070-graphics-card-review/11/

That's just off the top of my head, I have heard it from several sources and owners in this thread.

I think it's obvious that it's noisier under load than the other cards (which may or may not be all that important to you) but I don't know if that's enough to immediately mark it as the worst of the AIB cards, especially at a potential $80 off.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
RX 480 announced at $200, third-party cards are basically $260-270 and still in extremely high demand :iamafag: The value against a 1060 seems completely eroded unless you truly believe in the brand or DX12/Vulkan.

Xarn
Jun 26, 2015
I believe in FreeSync :v:.

It also helps that I am not in a hurry to upgrade RIGHT NOW.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

wolrah posted:

I'm just not seeing where the performance advantage is over accessing those same SSDs installed in the host system itself. It's two M.2 slots as currently implemented, so the absolute best case is 4 PCIe lanes per slot worth of bandwidth each. Assuming you're not trying to build a budget compute system on a PCIe-limited platform like LGA115x there's basically no bandwidth advantage. Depending on what motherboard you have and where you put the GPUs and SSDs there may be a latency advantage. It's been confirmed to be nothing more than a PCIe switch on the card, so in theory a pair of SSDs installed on the same PCIe controller as the GPU in a normal PC could be used exactly the same way. Even if there is a latency difference from the onboard SSD versus in the host the latency difference between RAM and SSD is so large that it seems like the difference between the two SSD configurations would be a fart in the wind by comparison.

At least with RAM you'd be able to have something faster and lower latency, though obviously significantly lower capacity.

Who knows, maybe that Intel XPoint/Optane stuff is as good as they claim and will eliminate the distinction between SSD and RAM in the long term.

On the plus side since they've confirmed it's just using a PCIe switch on the board rather than some previously unknown extra interfaces on the GPU and you need to use special APIs it seems like the magic likely happens more in drivers than in silicon. There's probably not much if any die space dedicated to this capability compared to what would already be required for "swapping" to system RAM.

Accessing an SSD in the host system depends on where it's located in the system.

No matter what, the call would need to go out over the PCIe bus to the CPU to get in the queue. Then it depends on where/how the drive is hooked up.

Best-case scenario, you'd be talking to a PCIe NVMe drive in another slot with a direct link to the CPU, so you'd just have to go back through the bus to the other slot.

Otherwise the signal would then have to be routed to the chipset and then out to either an M.2/SFF-8643 port or (god forbid) a SATA port.

With the drive hooked up directly to the video card the call can be routed directly through the on-board switch chip without having to go anywhere else.

It certainly doesn't make sense for consumers but could be beneficial for big data clients who have multi-GB data sets.
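To put rough numbers on the bandwidth side of that argument, here's a back-of-envelope sketch using nominal PCIe 3.0 line rates (protocol and packet overhead ignored). The point is that each M.2 slot has roughly the same ~4 GB/s ceiling whether the drive hangs off the card's switch or the host, so the on-card switch is about the shorter path, not extra bandwidth.

```cpp
// Back-of-envelope: raw ceiling of a PCIe 3.0 x4 M.2 slot, and of two slots on
// the card combined. Nominal line rates only; overhead is ignored.
#include <iostream>

int main() {
    const double gt_per_s = 8.0;            // PCIe 3.0 per-lane rate (GT/s)
    const double encoding = 128.0 / 130.0;  // 128b/130b line encoding
    const int lanes = 4;                    // one M.2 slot

    double gb_per_s = gt_per_s * encoding * lanes / 8.0;  // bits -> bytes
    std::cout << "PCIe 3.0 x" << lanes << " raw ceiling: ~" << gb_per_s
              << " GB/s per M.2 slot\n";
    std::cout << "Two on-card slots combined: ~" << 2.0 * gb_per_s << " GB/s\n";
}
```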

Naffer
Oct 26, 2004

Not a good chemist

jm20 posted:

RX 480 announced at $200, third party is basically $260-270 and still has extremely high demand :iamafag: The value against a 1060 seems completely eroded unless you truly believe in the brand or DX12/Vulkan.

It seems likely that the 470 is going to be the real ~$200 card. Someone posted a link here a few days ago that had it rumored at $180, which would be OK if you didn't give up too much performance relative to the 480.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
I wonder how the 470 would compare to the 970/980/390, but we will have to wait for benchmarks or word that such a chip is being released before October.

EdEddnEddy
Apr 5, 2012



THE DOG HOUSE posted:

Yes, that's the shortest summary of AMD CPU problems. A $60 Intel Pentium could outperform it in games, and in fact in a lot of things (though some specific recent games have been bottlenecked even by an actual 2-core Intel, just as a reference point).

CPU synthetic benchmarks are unfortunately highly misleading for average use, especially for games.

It's actually kind of difficult to find benchmarks with AMD CPUs anymore, at least comprehensive ones with a focus on the CPU's effect on things. There was a lot of hope early on that DirectX 12 and Vulkan would close the gap between the common 8-core FX and the respective common 4-core non-HT Intel, but as far as I can tell it's not even remotely close. The AMD CPUs benefit, but so do the Intels, and the gap remains. I tried to look this up again now out of interest but couldn't find a whole lot, which is kind of surprising.

Here's another thing that really shocked me when I first experienced it. If you play games online, especially ones where latency is important or lots of things are happening, the difference between AMD and Intel is amazing. Even locked at exactly the same framerate, the AMD acts like a stuttering mess compared to an Intel. I went from an AMD Phenom II 965 BE with a maxed-out OC, a chip I highly regarded for a very long time, to a bone-stock 4670K and was absolutely blown away at the difference. In single player the fps boost was minor at the time, but once I went online it was clear as day. These were games like BF3 and the like. I thought the Intel hype was half bullshit too; I was having some pretty strong buyer's remorse before I booted it up. FX CPUs actually got worse, at least the ones right after the Phenoms. Good luck finding benchmarks on the difference between CPUs in single player vs. multiplayer; they just didn't happen.

Zen CPUs might fix the huge gap, but I'm rambling too much about CPUs. Just want anybody here to know: if you're riding on an old FX CPU and wondering why your fps doesn't exactly match what your new GPU says it should do, that's almost certainly why.

That almost sounds like a good Reviewer Site niche to fill.

Mikojan
May 12, 2010

Just wanted to pitch in my personal experience: the G1 was noticeably loud and I'm very glad I pulled the trigger and returned it for a Gaming X. Performance across all AIBs seems to be nearly identical, so noise/temps become pretty important when selecting a board.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
Ugh, motherfucker; earlier today I saw 1060s in stock on Newegg, but by the time I'd got through the checkout process, they'd already gone out of stock.

SlayVus
Jul 10, 2009
Grimey Drawer

SlayVus posted:

So I've always liked big cases for their expandability, but with the Vive I want to build something like an ITX build with a Titan XP, just so I can demo it around and have something not as flashy in the living room where I have my PC set up.

The Titan is probably overkill for VR, but not overkill for 3440x1440@100. Besides the $500-$600 100' Thunderbolt cable, what options do I have to have my Vive in the living room and my ultrawide setup in my upstairs bedroom without having to move my PC around?

Posted this on the pc building thread and got no traction there. Anyone have any thoughts on how I can go about doing this?

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

SlayVus posted:

Posted this on the pc building thread and got no traction there. Anyone have any thoughts on how I can go about doing this?

You could keep your PC in the living room with your Vive and then Steam Link to a lovely PC with your ultrawide. That isn't so great for twitch shooters and the like, but it's fine if you want to play, like, Stardew Valley or something.

Or you could keep your PC in the living room and run 100' of HDMI with some keyboard/mouse switching solution. Not cheap either.

I think the Titan is better on the Vive than on the monitor, to be honest. I'd rather get a 1070 or 1080 and lose some frames on the monitor and have the Vive rock solid.

Captain Hair
Dec 31, 2007

Of course, that can backfire... some men like their bitches crazy.
If you're building it in ITX, maybe lug it upstairs/downstairs when needed?

If it's directly upstairs then a wireless keyboard and mouse may work alongside an HDMI cable? Not sure how great wireless HDMI is, but it may be worth looking into?

I know my Logitech keyboard works from most rooms in my house, certainly in adjacent rooms.

EdEddnEddy
Apr 5, 2012



New gaming laptop with a 1070M and leave the desktop where you want it?

The 1070M looks to be almost equal to the non-M 1070, thanks to the core-count increase with only a minor clock reduction, but we'll have to wait and see what they do when they're finally out to know for sure.


Until then, I lug my huge desktop around to demo the Vive because it is just drat worth it. VR must be shared! :vrfrog:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

SlayVus posted:

Posted this on the pc building thread and got no traction there. Anyone have any thoughts on how I can go about doing this?

Thunderbolt is the best overall solution. You can do everything with a single cable.

You can also pull a pair of HDMI/DP cables and a USB 2.0 cable down to your living room. You can probably get about 15 meters before you will need "active extension"/"repeater" cables. This isn't the 100 ft you mentioned, but it might be enough if you pull a run through your walls. You can plug multiple HDMI/DP/USB active extensions together, but each link is basically a separate hub that adds latency, and at some point you will have issues with the Vive. It might be good enough for desktop gaming, though, since that's not as latency-sensitive. Depending on how far you want to go, this isn't particularly cheap either.

You can stream your display from your living room up to your computer room over a network (ethernet/powerline/wifi in order of preference) but you will incur a pretty decent latency penalty at the other end, and the streaming solutions are not as good as having a physically connected PC.

You can also use HDMI-over-ethernet and USB-over-ethernet adapters. There don't seem to be DP-over-ethernet solutions so you would have to use HDMI. I imagine the Vive wouldn't like the latency and you run into many of the same problems in terms of latency sending up to your gaming PC (in particular you won't get *sync and possibly not even your full refresh rate). However, since it's a physical approach it would probably beat software-based streaming solutions.

Paul MaudDib fucked around with this message at 23:48 on Jul 26, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo
Been reading up on the AMD news out of Siggraph.

And um.

I know we kind of semi-mocked the whole SSD-on-GPU thing, but an 8K timeline realtime scrubbing at 92 FPS? That's.... not nothing.

That's actually kind of absurd.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


EdEddnEddy posted:

New gaming laptop with a 1070M and leave the desktop where you want it?

The 1070M looks to be almost equal to the non-M 1070, thanks to the core-count increase with only a minor clock reduction, but we'll have to wait and see what they do when they're finally out to know for sure.


Until then, I lug my huge desktop around to demo the Vive because it is just drat worth it. VR must be shared! :vrfrog:

Even with sufficient power, there's kinks to work out with VR and notebooks. They tend to get goofed up by Optimus.

repiv
Aug 13, 2009

bull3964 posted:

Even with sufficient power, there's kinks to work out with VR and notebooks. They tend to get goofed up by Optimus.

Just get a G-Sync laptop, none of them have Optimus :shepspends:

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
AMD didn't make those SSD video cards on a whim. They almost certainly had at least one large scale customer that knew the approach would give dramatic performance benefits.

I imagine these probably make those servers full of GPUs easier to manage. The individual GPUs don't have to fight over PCIe bandwidth.

Bloodly
Nov 3, 2008

Not as strong as you'd expect.

Paul MaudDib posted:

The real benefit of buying a more expensive version is usually getting better stock clocks, but most Pascal cards are boosting to 2000-2100 MHz on their own anyway because of GPU Boost 3.0 so the differences on paper don't exist in real life. And especially on a $250 card it's really not worth spending a bunch extra anyway, because it's more efficient to use that money to step up to a 1070 that's 40% faster instead of a high-end 1060 that's 5% faster. If you want a reasonable card with an open cooler just look for the basic EVGA ACX 3.0 card (no SC or FTW in the name) and be done with it.

e: On that note, how in the world are AIB partners not pissed as hell about GPU Boost 3.0? Like, Maxwell would reliably OC to similar clocks, but at least you had to take an hour and figure out your peak overclock, so the increased clocks still meant something to users too scared/lazy to do that. Now the various models are literally indistinguishable because GPU Boost automatically squeezes nearly everything out of the card and the only real differentiation is a high-end aftermarket PCB and the cooler.

I appreciate your help; forgive my lack of understanding. I am, I suppose, a 'filthy casual' when it comes to performance. It doesn't help that my gaming isn't that intensive.

But half of that is words.

Currently, the 1060s are showing up as £250-300 (something like this: https://www.amazon.co.uk/Asus-NVIDIA-GeForce-1060-DUAL-GTX1060-6G/dp/B01IUA9Z52/ref=sr_1_5?ie=UTF8&qid=1469577469&sr=8-5&keywords=gtx+1060

or this: https://www.amazon.co.uk/NVIDIA-GeF...ywords=gtx+1060 ). Mostly out of stock, too.

The 1070s are £430+ (something like this: https://www.amazon.co.uk/EVGA-Nvidi...ywords=gtx+1070 or this: https://www.amazon.co.uk/Gigabyte-G...ywords=gtx+1070). About the same popularity. The major difference seems to be 'the 1060s have 6 GB of video RAM, the 1070s have 8 GB'. Is that all, and how important is that? It's a big purchase for me, so I'm basically terrified (as I always am). Certainly, I have enough money for either. Still, I'd like to know what my money's getting.

Bloodly fucked around with this message at 01:11 on Jul 27, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo

PBCrunch posted:

AMD didn't make those SSD video cards on a whim. They almost certainly had at least one large scale customer that knew the approach would give dramatic performance benefits.

I imagine these probably make those servers full of GPUs easier to manage. The individual GPUs don't have to fight over PCIe bandwidth.

They probably *did*.

Of course, in the time-honored tradition, let's pull a "but can it run Doom" and see what happens when you just copy an entire game directory to the onboard SSD (with the appropriate extensions and APIs to take advantage of it being there, of course).

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Previously, you had to overclock by hand to fully wring all the performance out of the card. Maxwell and Pascal are both really stable and reach consistent clocks, but with Maxwell you had to overclock by hand. The "base/boost" numbers didn't represent the maximum performance you could actually squeeze out of a card, and most of the lower cards would overclock almost as far as the high-end clocks.

With Pascal, there's basically an "auto-overclock". The card knows when it can apply more power and stay stable, and it will basically take itself to within about 5% of its max stable overclock all by itself (and there's a diminishing return/rapidly increasing heat past there).

So basically, the "base/boost" clock numbers marked on the box are meaningless with Pascal. The card will boost itself past those numbers if the chip can actually stay stable. So the only differentiations on the better cards are better cooling and better power delivery that might increase the stability a bit. This is a super minor difference, the difference between a GBP 250 card and a GBP 350 card might be 5%.

On the other hand, the 1070 is a legitimately faster card with a lot more cores on it (about 40% faster). It costs more, but you're actually getting something for your money and at least in the US it's more cost efficient to save your pennies and step up to the 1070 rather than try and chase that last 5% of performance. And at least here, if you chase around good sales you can knock another chunk off the typical retail prices since those are overpriced at the moment relative to demand.

Paul MaudDib fucked around with this message at 02:10 on Jul 27, 2016
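A quick perf-per-pound sketch of that step-up argument, using the ballpark figures from the posts above; the prices and the 5%/40% deltas are the thread's estimates, not measured results.

```cpp
// Rough perf-per-pound sketch (baseline 1060 = 100 performance units; all
// figures are the thread's ballpark estimates, not benchmarks).
#include <iostream>

struct Card {
    const char* name;
    double price_gbp;
    double relative_perf;  // baseline 1060 = 100
};

int main() {
    const Card base = {"basic GTX 1060", 250.0, 100.0};
    const Card upgrades[] = {
        {"high-end GTX 1060", 350.0, 105.0},  // ~5% faster on paper
        {"GTX 1070",          430.0, 140.0},  // ~40% faster
    };

    std::cout << base.name << ": baseline at " << base.price_gbp << " GBP\n";
    for (const Card& c : upgrades) {
        double extra_cost = c.price_gbp - base.price_gbp;
        double extra_perf = c.relative_perf - base.relative_perf;
        std::cout << c.name << ": +" << extra_perf << "% for +" << extra_cost
                  << " GBP (~" << extra_cost / extra_perf << " GBP per extra %)\n";
    }
}
```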

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Paul MaudDib posted:

Previously, you had to overclock by hand to fully wring all the performance out of the card. Maxwell and Pascal are both really stable and reach consistent clocks, but with Maxwell you had to overclock by hand. The "base/boost" numbers didn't represent the maximum performance you could actually squeeze out of a card, and most of the lower cards would overclock almost as far as the high-end clocks.

With Pascal, there's basically an "auto-overclock". The card knows when it can apply more power and stay stable, and it will basically take itself to within about 5% of its max stable overclock all by itself (and there's a diminishing return/rapidly increasing heat past there).

So basically, the "base/boost" clock numbers marked on the box are meaningless with Pascal. The card will boost itself past those numbers if the chip can actually stay stable. So the only differentiations on the better cards are better cooling and better power delivery that might increase the stability a bit. This is a super minor difference, the difference between a GBP 250 card and a GBP 350 card might be 5%.

On the other hand, the 1070 is a legitimately faster card with a lot more cores on it (about 40% faster). It costs more, but you're actually getting something for your money and at least in the US it's more cost efficient to save your pennies and step up to the 1070 rather than try and chase that last 5% of performance. And at least here, if you chase around good sales you can knock another chunk off the typical retail prices since those are overpriced at the moment relative to demand.

That explains why my Zotac 1070 AMP was able to clock itself all the way up to 1922MHz in Witcher 3 even though the boost spec is only 1797MHz.

Still, I wonder who would be *that* dumb to buy something like a $330 Asus Strix 1060 instead of any 1070 when one is available at ~$400.
