Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

SwissArmyDruid posted:

Gonna blow your mind here.

Adreno? Is an anagram for Radeon.

If AMD had kept Adreno around, instead of selling it off to Qualcomm, they'd be in much better straits, financially, so I doubt they're going to do THAT again.

It's another business with lousy profit margins: NV mostly abandoned the mobile SoC business, PowerVR died without Apple, and ARM's own GPU IP would have rendered (pun intended) the hypothetical AMD Adreno dead, since the former was giving it away for free IINW and was already competitive in performance by 2011. Qualcomm wouldn't really have cared one way or another, since 2/3 of their revenue comes from baseband IP licensing and they can just license GPU tech from ARM anyway.
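As an aside, the anagram claim in the quote is easy to verify mechanically:

```python
# Sanity check of the Adreno/Radeon anagram claim: two strings are
# anagrams if they contain the same letters in some order.
def is_anagram(a: str, b: str) -> bool:
    return sorted(a.lower()) == sorted(b.lower())

print(is_anagram("Adreno", "Radeon"))  # True
```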


Shrimp or Shrimps
Feb 14, 2012


The Slack Lagoon posted:

What is driving up the prices? Buttcoins?

Buttflation.

1gnoirents
Jun 28, 2014

hello :)

Twerk from Home posted:

Vega is about to start winning all value comparisons as 1070 prices just keep increasing.

Welllllllll

https://pcpartpicker.com/products/video-card/#c=404,405

Gyrotica
Nov 26, 2012

Grafted to machines your builders did not understand.

Usually I'm in favor of buttflation, but in this instance I just can't get behind it.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Palladium posted:

It's another business with lousy profit margins: NV mostly abandoned the mobile SoC business, PowerVR died without Apple, and ARM's own GPU IP would have rendered (pun intended) the hypothetical AMD Adreno dead, since the former was giving it away for free IINW and was already competitive in performance by 2011. Qualcomm wouldn't really have cared one way or another, since 2/3 of their revenue comes from baseband IP licensing and they can just license GPU tech from ARM anyway.
As long as Vega sells somewhere profitable (HPC, APUs, etc.) it's a win. That said, Raja clearly oversold this clunker for gaming, and Lisa Su has a track record of turning around flailing fabless silicon companies, so ideally she would kick the barriers down between RTG and AMD and then do a clean-sheet redesign for Navi, which sort of seemed to be happening anyway.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
Also Nvidia is deathly afraid that intel will stop supporting pcie or equivalent so has a vested interest in keeping amd CPUs relevant, which will hopefully fund an actual GPU redesign

Craptacular!
Jul 9, 2001

Fuck the DH

Paul MaudDib posted:

Software encoding is still vastly better at "archival-grade" recordings.

Yes, but we're not recording the family home movie collection. We're recording Rick and Morty, watching it eventually, and then deleting it. The DVR is a Haswell i3 in an ITX mobo/case so any help to speed things up seems like an improvement.

redeyes
Sep 14, 2002

by Fluffdaddy
Speaking of decoders, is AMD or nvidia better for video?

EVIL Gibson
Mar 23, 2001

Internet of Things is just someone else's computer that people can't help attaching cameras and door locks to!
:vapes:
Switchblade Switcharoo

Malcolm XML posted:

Also Nvidia is deathly afraid that intel will stop supporting pcie or equivalent so has a vested interest in keeping amd CPUs relevant, which will hopefully fund an actual GPU redesign

Stop supporting PCI-E? What the ... what do you mean?!

Removing a connector standard that both Nvidia and AMD use would be a death knell.

It would be a very spicy move to drop the connector and force everyone to go to Intel.

SwissArmyDruid
Feb 14, 2014

by sebmojo
I mean, if Intel has its way, you'll all be using Intel Omni-Path for everything.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Can't believe I bought my 1080ti around launch for less than most of these Vega 56s are going for right now.

Arivia
Mar 17, 2011

Zero VGS posted:

Can't believe I bought my 1080ti around launch for less than most of these Vega 56s are going for right now.

Look these are very good cards being badly treated by the mainstream tech media. They don't deserve that. What I'm saying is, there's good cards on both sides. Did anyone ask if Nvidia was at fault for this?

Anime Schoolgirl
Nov 28, 2002

The Slack Lagoon posted:

What is driving up the prices? Buttcoins?
yup which is part of why nvidia is going to put out a 1070ti with just one more bank of cuda cores and gddr5x memory, attempting to halve its usefulness to butt miners

it'll be hilarious if it becomes cheaper than the 1070 proper

The Slack Lagoon
Jun 17, 2008



Anime Schoolgirl posted:

yup which is part of why nvidia is going to put out a 1070ti with just one more bank of cuda cores and gddr5x memory, attempting to halve its usefulness to butt miners

it'll be hilarious if it becomes cheaper than the 1070 proper

Why does making a card better make it worse for buttmining?

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

The Slack Lagoon posted:

Why does making a card better make it worse for buttmining?

buttmining is optimized for GDDR5 but not GDDR5X

Anime Schoolgirl
Nov 28, 2002

The Slack Lagoon posted:

Why does making a card better make it worse for buttmining?
mining software currently runs gddr5x at effectively half of its bandwidth due to the nature of how blockchains work

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

The Slack Lagoon posted:

Why does making a card better make it worse for buttmining?

GDDR5x is bad for buttcoin mining.

E: beaten

1gnoirents
Jun 28, 2014

hello :)
For what it's worth, it's not bad enough to make faster cards produce less money if you use something like NiceHash (which picks the fastest algorithms). With the 1070 Ti, though, we will finally see a faster/better card do worse than its slightly slower counterpart. Until somebody writes new code, anyway.
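To make the "picks the fastest algorithms" bit concrete, here's a toy profit-switching sketch. The function and every number in it are made up for illustration; this is not NiceHash's actual code or real payout rates:

```python
# Toy sketch of what a profit-switching miner does: benchmark each
# algorithm's hashrate on the card, multiply by the current payout per
# hash, and mine whichever pays the most right now.

def best_algorithm(hashrates, payouts):
    """Return the algorithm with the highest hashrate * payout product."""
    return max(hashrates, key=lambda algo: hashrates[algo] * payouts[algo])

hashrates = {"ethash": 27e6, "equihash": 430}       # H/s and Sol/s (made up)
payouts   = {"ethash": 1.2e-12, "equihash": 9e-8}   # $ per hash/sol (made up)

print(best_algorithm(hashrates, payouts))  # → equihash
```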

Craptacular!
Jul 9, 2001

Fuck the DH

Anime Schoolgirl posted:

it'll be hilarious if it becomes cheaper than the 1070 proper

It wouldn't drop far before the reduced mining efficiency was offset by the lower bar on ROI.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
There is no driver optimization that is going to fix this one. It's inherent to the way GDDR5X works: GDDR5 is dual-pumped, so once you put in an address you get two words of output, while 5X moves to being quad-pumped, so you get four words out. That's fine for graphics, where you mostly want sequential data. The problem is that since Ethereum works in those 2-word chunks, the second half of that 4-word chunk is basically junk, so you are wasting half of it. You could change Ethereum itself to use quad-word chunks, but it would be a super contentious hardfork that directly screws over all the miners, so it's never gonna happen.

There are certainly altcoins that GDDR5X cards can do well at, even right now, but they don't have anywhere near the network hashrate of Ethereum and if you stuck a million GPUs on them their profitability would tank. Same as how people say "if ethereum crashes then miners will just mine something else", but even assuming they don't all tank at once there just isn't the "market depth" there (so to speak) to absorb millions of GPUs. Some of these altcoins are probably not technically sound either - I highly suspect some of them may be mineable with an ASIC relatively easily.

Regardless, at some point here GDDR5 is going to cross that magical threshold into being a "legacy product" and then prices start going up significantly. I wouldn't be surprised at all to see the lower-end Volta parts on GDDR5X this generation and the higher-end stuff on GDDR6, and once that happens I think it's essentially game over for plain GDDR5.
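As a rough sketch of the penalty described above: the burst lengths follow the description (2 words per access for GDDR5, 4 for GDDR5X), but the raw-bandwidth figures are arbitrary placeholders, not datasheet numbers:

```python
# Back-of-the-envelope: effective bandwidth when a workload only uses
# part of each memory burst. Ethash-style random 2-word reads use the
# whole GDDR5 burst but only half of a GDDR5X burst.

def effective_bandwidth(raw, burst_words, useful_words):
    """Fraction of raw bandwidth carrying data the workload actually uses."""
    return raw * min(useful_words, burst_words) / burst_words

gddr5  = effective_bandwidth(256, burst_words=2, useful_words=2)  # nothing wasted
gddr5x = effective_bandwidth(320, burst_words=4, useful_words=2)  # half the burst is junk

print(gddr5, gddr5x)  # → 256.0 160.0
```

So even with more raw bandwidth on paper, the 5X card can end up with less usable bandwidth for this particular access pattern.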

Paul MaudDib fucked around with this message at 22:47 on Sep 18, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
The good news is that the GPU bubble is over as far as rig-building is concerned though. Prices are slowly returning to sanity, even if there hasn't been the big pop and flood of used GPUs that we saw the first time (yet?).

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
I dunno, I've been mining on a 1070 for a while and it rarely if ever mines Ethereum; usually Zcash is the most profitable (and that works fine with GDDR5X). With AMD cards, on the other hand, Ethereum seems to almost always be the most profitable.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Paul MaudDib posted:

There is no driver optimization that is going to fix this one. It's inherent to the way GDDR5X works: GDDR5 is dual-pumped, so once you put in an address you get two words of output, while 5X moves to being quad-pumped, so you get four words out. That's fine for graphics, where you mostly want sequential data. The problem is that since Ethereum works in those 2-word chunks, the second half of that 4-word chunk is basically junk, so you are wasting half of it. You could change Ethereum itself to use quad-word chunks, but it would be a super contentious hardfork that directly screws over all the miners, so it's never gonna happen.

There are certainly altcoins that GDDR5X cards can do well at, even right now, but they don't have anywhere near the network hashrate of Ethereum and if you stuck a million GPUs on them their profitability would tank. Same as how people say "if ethereum crashes then miners will just mine something else", but even assuming they don't all tank at once there just isn't the "market depth" there (so to speak) to absorb millions of GPUs. Some of these altcoins are probably not technically sound either - I highly suspect some of them may be mineable with an ASIC relatively easily.

Regardless, at some point here GDDR5 is going to cross that magical threshold into being a "legacy product" and then prices start going up significantly. I wouldn't be surprised at all to see the lower-end Volta parts on GDDR5X this generation and the higher-end stuff on GDDR6, and once that happens I think it's essentially game over for plain GDDR5.

I don't know if GDDR5X is going to survive; a split between GDDR5 and GDDR6 seems most likely. Only Micron produces GDDR5X, and they're already ramping up for GDDR6. GDDR5X will last as long as Pascal is produced, IMHO.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Paul MaudDib posted:

The good news is that the GPU bubble is over as far as rig-building is concerned though. Prices are slowly returning to sanity, even if there hasn't been the big pop and flood of used GPUs that we saw the first time (yet?).

Hoping this happens sooner rather than later. I'd like to snag a Hawaii or Polaris card for the missus's rig so she can use my old FreeSync monitor.

1gnoirents
Jun 28, 2014

hello :)

Paul MaudDib posted:

The good news is that the GPU bubble is over as far as rig-building is concerned though. Prices are slowly returning to sanity, even if there hasn't been the big pop and flood of used GPUs that we saw the first time (yet?).

I thought this last little crash was going to "be the one" that floods the market, but values are already on their way back up. Still, I'm looking forward to this next wave of used GPUs, because it includes a very wide range of cards; you won't be stuck with your choice of 290 and 290. It has yet to come, though. I have this growing feeling it's really going to hit the fan when Volta releases.

Cheap Trick
Jan 4, 2007

Looking at MSI's 1080 cards - what's the difference between the Armor and the Gaming X? Slightly higher clock speeds and a fancier looking cooler? Doesn't seem worth it for an extra $100 Aussie dollarydoos.

Arivia
Mar 17, 2011

Cheap Trick posted:

Looking at MSI's 1080 cards - what's the difference between the Armor and the Gaming X? Slightly higher clock speeds and a fancier looking cooler? Doesn't seem worth it for an extra $100 Aussie dollarydoos.

The Armor's cooler is actually designed for 1070s and below. On the 1080 it makes the card perform terribly. Unless you're planning on replacing it with another cooler, get the Gaming X.

Cheap Trick
Jan 4, 2007

Arivia posted:

The Armor's cooler is actually designed for 1070s and below. On the 1080 it makes the card perform terribly. Unless you're planning on replacing it with another cooler, get the Gaming X.

Oh wow, thanks for the tip! Good thing I asked before putting in my order. Gaming X it is.

Surprise Giraffe
Apr 30, 2007
1 Lunar Road
Moon crater
The Moon

Cheap Trick posted:

Oh wow, thanks for the tip! Good thing I asked before putting in my order. Gaming X it is.

I got a 1080ti armor recently with the same cooler. If you look around online you'll see it makes a few fps difference in most situations and is a straight upgrade from stock blower at a much lower price than the gaming x.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Cheap Trick posted:

Doesn't seem worth it for an extra $100 Aussie dollarydoos.

Not sure how things are priced out in dollarydoos, but $100 seems like you might be getting close to the price of a G10/G12 + AIO (H55/H75 or similar), which would be an even better solution, and have the added benefit of likely being transferable to future cards.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Surprise Giraffe posted:

I got a 1080ti armor recently with the same cooler. If you look around online you'll see it makes a few fps difference in most situations and is a straight upgrade from stock blower at a much lower price than the gaming x.

The 1080ti Armor runs at 80% fan speed out of the box, making it very slightly louder than the FE blower cooler, though with better thermals. If you don't care about noise and you need to save every cent you can I can see it as a good purchase, or if you just want the PCB for liquid cooling as previously mentioned.

DrDork posted:

Not sure how things are priced out in dollarydoos, but $100 seems like you might be getting close to the price of a G10/G12 + AIO (H55/H75 or similar), which would be an even better solution, and have the added benefit of likely being transferable to future cards.

A Corsair H55 costs $72AUD and a G10 bracket about $50, so in total it would be $122 plus a tube of thermal paste, so say $130? I'd say it's worth it if they want really good thermals and don't mind doing the work to get the bracket and cooler on. But if they just want to slap it in their computer and be done with it I'd just get a card that is good out of the box.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

AVeryLargeRadish posted:

A Corsair H55 costs $72AUD and a G10 bracket about $50, so in total it would be $122 plus a tube of thermal paste, so say $130? I'd say it's worth it if they want really good thermals and don't mind doing the work to get the bracket and cooler on. But if they just want to slap it in their computer and be done with it I'd just get a card that is good out of the box.

I'd say the extra $30 and about 15 minutes of simple work would be worth it for a whisper-quiet setup that is transferable, and thus lets him continue to buy "cheap"-level cards in the future. But yeah, there's nothing wrong with the Gaming X, either.

SwissArmyDruid
Feb 14, 2014

by sebmojo

quote:

First up is the finest deal today: a selection of GeForce GTX 1080 cards for under $500. Newegg Business has a bunch of Nvidia's latest and next-greatest cards with a deep $50 discount when you use promo code "B2BEMCSEP3". The lowest-priced card on offer is this blower-cooled Gigabyte that comes out to $469 after the discount. You can also pick up this MSI Armor card with a twin-fan cooler for $479, and it has a $10 mail-in rebate too. The promo code should knock $50 off of any graphics card over $500, so take your pick.

http://techreport.com/news/32570/tuesday-deals-graphics-cards-a-mobo-storage-and-a-big-tv

In case people were still looking to grab one at this point, maybe even for hocking.

1gnoirents
Jun 28, 2014

hello :)

SwissArmyDruid posted:

http://techreport.com/news/32570/tuesday-deals-graphics-cards-a-mobo-storage-and-a-big-tv

In case people were still looking to grab one at this point, maybe even for hocking.

That article seems to have missed this one

http://www.ebay.com/itm/GIGABYTE-Ge...DkAAOSwFKZZiyjV

It's been that price there for a while, and you don't have to sacrifice cooling for the most part. It's not the *greatest* cooler (I own one, and have owned two this gen), but it's still in the premium cooler class. There is no throttling of any sort, and it runs at about ~55% fan speed; that isn't very quiet, but it's nothing like a blower or, apparently, the MSI Armor.

edit: Forgot to mention that deal is made much sweeter with various ebay coupons and whatnot

1gnoirents fucked around with this message at 18:14 on Sep 19, 2017

Gyrotica
Nov 26, 2012

Grafted to machines your builders did not understand.
It's too bad SLI is dying/finicky, otherwise I'd be tempted to get a second 1080.

1gnoirents
Jun 28, 2014

hello :)

Gyrotica posted:

It's too bad SLI is dying/finicky, otherwise I'd be tempted to get a second 1080.

I'm a big-time supporter of SLI still; I just never mention it anymore because single-card solutions cover most people's desires these days. Despite that, I'd still caution anybody to be just as wary as you're being. We are still waiting on the DX12 holy grail of multi-GPU support, but I wouldn't say SLI is dying either; it seems to be just as supported as it ever was (minus 3+ cards). That is to say, it's very limited. But when it works, oh boy, it's awesome. And now memory bottlenecking would be much less of a concern than before.

I lucked out in that every game I wanted to play that didn't support SLI wouldn't have benefited from it in the first place, based on the GPUs I was using. That won't always be the case, though. I'd only recommend it in the more traditional SLI scenarios, where the fastest card isn't as fast as you want to go (1080 SLI would qualify for that). Even if I were interested personally, I'd still wait until prices drop or the buttmining bubble bursts for some cheap SLI loving.

Gwaihir
Dec 8, 2009
Hair Elf

Truga posted:

AMD still has way too many compute customers waiting in line for their server APUs to be able to afford a third party doing their own poo poo with the GPUs. Like I said earlier in the thread, consumer dGPU is only a small part of the whole picture; there are other big customers like Apple around too. Most importantly, their APUs have been a flat-out better buy than Intel+Nvidia compute clusters for many applications for a few years now, and that's with lovely Kaveri APUs.

I keep repeating myself in this thread, but there are things going on consumers never even see. Unless you're ordering AMD's APUs at "if you have to ask, you can't afford it" prices right now, they care too much about you. GCN is a byproduct of that: it performs OK in games, but the arch's primary focus is low-power APUs in compute clusters. They're about 25% the price of Intel+Nvidia racks while eating roughly the same amount of power for comparable performance:


Again, this is a Kaveri benchmark. Zen APUs have some really cool new features compute customers have been begging for which Intel/Nvidia can't provide for obvious reasons, and they aren't based on Bulldozer, which is another plus. And Zen 2 and later are going to be better still.

It's time to face the writing on the wall: RTG, which is more or less a gaming-brand marketing thing, is going to shut down. Dehumanize yourself and face to GeForce if you want to game; AMD's next big thing isn't coming until Navi, which is like 2020 or something, and will still have all the compute things people here hate in it.

Maybe/sorta/we can't really know this based on AMD's public financial statements?

AMD breaks down their financials into "Computing and Graphics" vs. "Enterprise, Embedded, and Semi-Custom" (aka consoles), and then "Everything else". (And for AMD the first two categories were about equal, with ~$100M more revenue from Computing and Graphics.)

Consumer dGPU is not at all a small market, though. NV's Q2 gaming-only revenue ($1.18 billion) was about the same as AMD's revenue ($1.2 billion) for *everything* in Q2, and that's with AMD having a very good Q2 thanks to the Ryzen and Threadripper launches.

Yeah, datacenter stuff is *growing* far faster than consumer dGPU, which, for NV at least, is a flat market, but it's not really there yet, and it's sort of approaching meme status given the number of nerds that will repeat DATACENTER!!!! as if it's the only market that matters. (Q2 NV gaming group revenue: $1.186B; datacenter group revenue: $416M.)
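Lining up the quoted Q2 figures (numbers as posted in the thread, not independently verified):

```python
# Q2 2017 revenue figures quoted above, in $ millions.
nv_gaming     = 1186
nv_datacenter = 416
amd_total     = 1200

# NV's gaming segment alone is roughly the size of AMD's entire company...
print(round(nv_gaming / amd_total, 2))      # → 0.99
# ...while NV's datacenter segment is about a third of its gaming segment.
print(round(nv_datacenter / nv_gaming, 2))  # → 0.35
```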

Gwaihir fucked around with this message at 19:15 on Sep 19, 2017

codo27
Apr 21, 2008

Just fired up the new Forza demo. It runs okay with everything maxed @1440; I have a pair of 970s. But it's not good enough. God drat I want that 1080ti, but it's just not in the cards right now. It's probably foolish to sell them off and try to get a 1080 as a stopgap, isn't it?

wargames
Mar 16, 2008

official yospos cat censor

codo27 posted:

Just fired up the new Forza demo. It runs okay with everything maxed @1440; I have a pair of 970s. But it's not good enough. God drat I want that 1080ti, but it's just not in the cards right now. It's probably foolish to sell them off and try to get a 1080 as a stopgap, isn't it?

If you can get a cheap 1080, it's not in a bad spot.


Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

codo27 posted:

Just fired up the new Forza demo. It runs okay with everything maxed @1440; I have a pair of 970s. But it's not good enough. God drat I want that 1080ti, but it's just not in the cards right now. It's probably foolish to sell them off and try to get a 1080 as a stopgap, isn't it?

You can just drop the eye candy to Very High or so for more performance. There's a 99% chance you won't be able to see the difference in motion anyway without going pixel-hunting with a magnifying glass while being told exactly where to look.
