GRINDCORE MEGGIDO
Feb 28, 1985


Is AMD's agreement with GloFo going away anytime soon?


Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

AVeryLargeRadish posted:

Nvidia's Gaming GPU division is their most profitable segment by far; I doubt they would neglect it.

Strictly speaking this chart is revenue, not profit. I imagine the two businesses have different margins, although I've no idea how close that comes to making up the difference.

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
Ebuyer in the UK have 8GB Sapphire RX 470s for £190 today; are they worth a look at that price?

SwissArmyDruid
Feb 14, 2014

by sebmojo

Zero VGS posted:

What happened with that poo poo where AMD takes like 4 of the RX 480 chips and interposes them all together into a mega-card? That could still be a thing; I mean, if they can create multi-GPU cards without any of the drawbacks of SLI/CF then they can make cards as crazy as they want and don't have any jumbo-chip yield fiascoes like Fury anymore. I'm as gung-ho about energy efficiency as they come, but if I could have a guaranteed 4K crusher I'd gladly supply the 600W.

Well first, GDDR5X showing up kind of shoved a half-step into the roadmap. I mean, why move onto the really new, untested poo poo when you can more or less use GDDR5X as a drop-in replacement for GDDR5? It's not like anything currently being produced can take advantage of HBM1/2's main strength, an overwhelming amount of bandwidth, anyway; Fiji demonstrated that. So the industry uses GDDR5X for at least one or two architectures, growing its way into a world where HBM is the norm. Except that AMD wanted to make that changeover NOW NOW NOW, and Nvidia was like, "lol no".

Secondly, AMD is in no hurry to start building HBM chips again, now that they've basically shown the tech wasn't where it needed to be at the time. HBM1 isn't dense enough to put more than 4 GB on a card, as you saw, and HBM2 was just starting production. And AMD doesn't have any high-end chips that matter to put HBM on.

Thirdly, we'll get to our long-foretold HBM future sooner or later; Nvidia is going to be pushed there whether they like it or not. It probably depends on whether GDDR5X can continue to service Volta, which could be cheaper than moving on to HBM2.

GRINDCORE MEGGIDO posted:

Is AMD's agreement with GloFo going away anytime soon?

2020. Take heart, it was originally written to last until 2024.

SwissArmyDruid fucked around with this message at 17:08 on Dec 1, 2016

Anime Schoolgirl
Nov 28, 2002

Eletriarnation posted:

Strictly speaking this chart is revenue, not profit. I imagine the two businesses have different margins, although I've no idea how close that comes to making up the difference.
pretty much this, the margins on tesla and quadro cards are loving bananas while the margins for consumer gpus are surprisingly low (~10-20%) for the most part

Turd Eater
May 11, 2003

Fauxtool posted:

remember that guy who took out a bunch of loans to buy amd stock? How is he doing?

https://m.reddit.com/r/wallstreetbets/comments/4z1xi4/yolo_used_a_9k_balance_transfer_offer_to_buy_1750/

He's in the black at over $8/share.

Edit: oh he sold it off 2.5 months ago https://m.reddit.com/r/wallstreetbets/comments/53pt5f/comment/d7vyb24

That dude's post history though...

Turd Eater fucked around with this message at 20:33 on Dec 1, 2016

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Anime Schoolgirl posted:

pretty much this, the margins on tesla and quadro cards are loving bananas while the margins for consumer gpus are surprisingly low (~10-20%) for the most part

If you average the revenue and assume a 15% average margin, the gaming division still makes an average of $95m in profit per quarter. If we combine all the other divisions and assume they run at twice that margin (and I doubt the Auto and OEM divisions are running at a very high margin, but we'll include them anyway), you still end up with everything else turning a profit of $180m. Even if we assume an obscene 50% margin, it's $300m for the non-gaming divisions. One quarter to one third of your profits is not small potatoes for any business, and my original point, that Nvidia cannot treat the gaming market lightly, still stands.
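
Quick back-of-the-envelope in Python, if anyone wants to check me (the revenue figures here are my own guesses reverse-engineered from the chart, not official numbers):
code:
# Rough sketch of the margin math above. Revenue figures are assumptions
# reverse-engineered from the chart discussion -- not official numbers.
gaming_rev = 633e6   # assumed average quarterly gaming revenue
other_rev = 600e6    # assumed combined revenue of everything else

gaming_profit = gaming_rev * 0.15   # 15% margin        -> ~$95m
other_lo = other_rev * 0.30         # twice that margin -> ~$180m
other_hi = other_rev * 0.50         # "obscene" 50%     -> ~$300m

print(f"gaming share of profit: {gaming_profit / (gaming_profit + other_hi):.0%}"
      f" to {gaming_profit / (gaming_profit + other_lo):.0%}")  # 24% to 35%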

Also, if Nvidia does what everyone seems to expect and gouges the gaming market, that will only increase their margin in that market, thereby making it more important that they treat said market with respect.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

AVeryLargeRadish posted:

Also, if Nvidia does what everyone seems to expect and gouges the gaming market, that will only increase their margin in that market, thereby making it more important that they treat said market with respect.

The argument is that if you're a de facto monopoly you can gouge without really having to respect the market very much and still rake in the cash.

Between "FE" cards and the hilarious GSync tax on monitors, they've already discovered that people will pay silly amounts of money for their products. They can only take it so far, though, thanks to consoles, but it's still probably a good bit higher than what they price things at now.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

DrDork posted:

The argument is that if you're a de facto monopoly you can gouge without really having to respect the market very much and still rake in the cash.

Between "FE" cards and the hilarious GSync tax on monitors, they've already discovered that people will pay silly amounts of money for their products. They can only take it so far, though, thanks to consoles, but it's still probably a good bit higher than what they price things at now.

The original assertion I was responding to was:

Sinestro posted:

I definitely didn't tear up thinking about graphics cards, nope. I really don't see how we escape a future of $700 becoming the new mid-range price.

I think that $700 XX60-series cards and a tripling of prices in general would be more than enough to get people to look at consoles and AMD products, and more importantly it would hugely depress sales. If you have a 1070 and four years from now the 1260 comes out and is slightly faster but costs $700, are you going to buy it? Probably not; you'll wait until the XX50 at ~$300 is significantly faster than your 1070, and that might take 4-5 generations of GPUs, at least going by past trends, so Nvidia would not sell much for a long time if they jacked up prices too much. Like I said before, I think they will gradually raise prices to find what the market will bear, so this process will take 3-4 generations of GPUs. That gives a lot of time for other stuff to happen and for the whole situation to change. Nvidia is not going to announce that the 11 series will cost three times as much as the 10 series did, especially since their past behavior tends to indicate that they are a fairly conservative company that would rather rely on what works and makes a profit than take huge risks in favor of short-term gain.

Gwaihir
Dec 8, 2009
Hair Elf

Anime Schoolgirl posted:

pretty much this, the margins on tesla and quadro cards are loving bananas while the margins for consumer gpus are surprisingly low (~10-20%) for the most part

Even at a 20% margin, that's still more profit from gaming GPUs than the current datacenter division's entire *revenue*.



AVeryLargeRadish posted:

The original assertion I was responding to was:


I think that $700 XX60-series cards and a tripling of prices in general would be more than enough to get people to look at consoles and AMD products, and more importantly it would hugely depress sales. If you have a 1070 and four years from now the 1260 comes out and is slightly faster but costs $700, are you going to buy it? Probably not; you'll wait until the XX50 at ~$300 is significantly faster than your 1070, and that might take 4-5 generations of GPUs, at least going by past trends, so Nvidia would not sell much for a long time if they jacked up prices too much. Like I said before, I think they will gradually raise prices to find what the market will bear, so this process will take 3-4 generations of GPUs. That gives a lot of time for other stuff to happen and for the whole situation to change. Nvidia is not going to announce that the 11 series will cost three times as much as the 10 series did, especially since their past behavior tends to indicate that they are a fairly conservative company that would rather rely on what works and makes a profit than take huge risks in favor of short-term gain.

Anyone who thinks a "midrange" card will be priced at something outrageous like $700 is delusional and has completely fallen into the silly nerd bubble. Cards priced at $250 and under account for 10 of the ~13 million GPUs shipped in Q3 this year: http://www.anandtech.com/show/10864/discrete-desktop-gpu-market-trends-q3-2016/2
Obviously the higher-priced cards get better profits, but it's not like you can just magic up millions more people who are willing to drop $500+ on a GPU. There's just flat out not that large a market compared to the $250-and-under level.

Gwaihir fucked around with this message at 20:02 on Dec 1, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Honestly it's down to Vega, not Navi, for AMD in dGPU. Big Vega is touted as 4096 SPs, same as Fiji, at a 225W TDP. If their supposed charts are anything to go by, Vega should also be ~3-4x perf/watt as claimed (so ~2.5x in practice). That would put Vega at nearly 50% better clock for clock, basically beating best-case Fury X Crossfire. Based on Titan XP performance, that's where they need to be to stay competitive; they really can't fall short of that mark.
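
Napkin math in Python, if you want to see how I got there (Fury X's 275W and ~1050MHz are the commonly cited figures; the Vega clock is a pure guess on my part):
code:
# Napkin math for the perf/watt claim above. Fury X figures are the
# commonly cited ones (275W, ~1050MHz); the Vega clock is a guess, not a leak.
fury_x_power, fury_x_clock = 275.0, 1050.0
vega_power = 225.0
perf_per_watt_gain = 2.5            # the "realistic" ~2.5x figure

# Same 4096 SPs, so total perf scales as perf/watt times the power ratio:
vega_perf = perf_per_watt_gain * (vega_power / fury_x_power)
print(f"total perf vs Fury X: {vega_perf:.2f}x")                # ~2.05x

assumed_vega_clock = 1500.0         # pure guess
clock_ratio = assumed_vega_clock / fury_x_clock
print(f"clock-for-clock gain: {vega_perf / clock_ratio:.2f}x")  # ~1.43x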

If Vega 11 is, as rumored, a Polaris 10 replacement right down to the same stream processor count, that'd be roughly 1070 performance; again, about what they'd need to remain competitive. Not saying it's going to happen, not trying to hype.

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf
this is the nerdiest thread on all of SA and I love it :q:

penus penus penus
Nov 9, 2014

by piss__donald
No this is what cool people talk about i promise check out my gf she will set you straight plus now she's in 4k

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf
when i hop on the subway for work and see 20 new posts itt i get excited and tap it right away

i think buying a 1070 gave me some kind of bug

WanderingKid
Feb 27, 2005

lives here...

Measly Twerp posted:

Welp, this is the most depressing video on the GPU market ever:

https://www.youtube.com/watch?v=uN7i1bViOkU

Depressing as gently caress. I had a 3DFX Voodoo 2, then a 3DFX Voodoo 3 3000. Half-Life 2 launched and the tech demo was running on an ATI Radeon 9700 Pro. I couldn't afford one so I got the 9600 Pro. Around this time Catalyst was a mess and I was running a lot of these hacked 3rd-party driver packages. Catalyst got a lot better, but by that point I went prebuilt with a Dell XPS 8100. It had an unbranded GTX 460 in it, and not having to deal with hacked drivers was bliss. (edit: somewhere between the Radeon 9600 Pro and the GTX 460 I bought a Radeon 2600 XT, though I don't remember it at all). Then it was a Dell XPS 8700 with a GTX 760 in it. Both times nVidia was all over Dell prebuilts in the bang/buck arena. Maybe it's the OEMs that account for the enormous numbers of people buying inferior but more expensive cards.

I vaguely recall Intel being fined over a billion euros for anti-competitive practices by the European Commission. They were doing things like granting huge rebates to OEMs and demanding they go Intel-exclusive under threat of withdrawing their money and giving it to one of their competitors, thus making it impossible for AMD to even give away their CPUs for free to the big OEMs. Maybe something similar happened with GPUs too? It's like silicon NAFTA.

WanderingKid fucked around with this message at 21:53 on Dec 1, 2016

penus penus penus
Nov 9, 2014

by piss__donald
Not that I can exclude funny business in general, but the video is pointing to the fact that people bought millions of inferior GPUs for years, rather than to high-level payoffs and threats.

Cinara
Jul 15, 2007
A huge missed point, though, is that the 290/290X were not sold anywhere near retail price for a LONG time due to Bitcoin miners. During the mining craze those cards were sold out everywhere and you could only find them for double their retail price. Gamers barely had the option of going AMD if they wanted a high-end card, even though AMD had the best card at the time.

WanderingKid
Feb 27, 2005

lives here...
Were they really that hard to find? I didn't give a poo poo about graphics cards during those years because I was going prebuilt, so I missed the whole era where AMD slid off the edge of the world.

Perplx
Jun 26, 2004


Best viewed on Orgasma Plasma
Lipstick Apathy
I think a big part of AMD GPUs failing to sell was related to AMD CPUs being poo poo. I think the same cards branded as ATI would have sold better.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

WanderingKid posted:

Were they really that hard to find? I didn't give a poo poo about graphics cards during those years because I was going prebuilt, so I missed the whole era where AMD slid off the edge of the world.

Yeah, they were pushed up to nearly a 100% markup for cards that didn't have the lovely hairdryer blower cooler; it was like what we had with the Pascal cards over the summer, but much worse. The cool thing is that after the mining stuff ceased to be profitable they were dumped on eBay for super cheap, and I got a 290 for a little over $200.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

WanderingKid posted:

Were they really that hard to find? I didn't give a poo poo about graphics cards during those years because I was going prebuilt, so I missed the whole era where AMD slid off the edge of the world.

Yeah every time AMD comes out with a new card the buttcoin miners buy out all the stock for months. Happened to the RX 480 too.

FaustianQ posted:

Honestly it's down to Vega, not Navi, for AMD in dGPU. Big Vega is touted as 4096 SPs, same as Fiji, at a 225W TDP. If their supposed charts are anything to go by, Vega should also be ~3-4x perf/watt as claimed (so ~2.5x in practice). That would put Vega at nearly 50% better clock for clock, basically beating best-case Fury X Crossfire. Based on Titan XP performance, that's where they need to be to stay competitive; they really can't fall short of that mark.

If Vega 11 is, as rumored, a Polaris 10 replacement right down to the same stream processor count, that'd be roughly 1070 performance; again, about what they'd need to remain competitive. Not saying it's going to happen, not trying to hype.

The rumors I heard were that Vega 10 was 4096-core, so somewhat faster than a Fury X, and that Vega 11 was the new halo product with 6144 cores.

Regardless of specifics, most of the rumors agree that Vega 11 is the larger of the two Vegas (the opposite of the Polaris numbering), so I think you're off there. Also it wouldn't make any sense at all to replace/supplant Polaris after only a few months on the market. It would take something wildly unexpected, like giving GloFo the finger and dropping Polaris 10 production, which I just can't see.

We can guess all we want on the details - any math is speculation at this point but from a marketing perspective Vega 10-based cards really need to compete with the 1070, if not beat it with a full-fat chip. 4096 cores sounds about right. But I firmly expect Vega 10 to be the new 490X and Vega 11 to be the new halo card. I would be astonished to see anything else. They aren't going to slot a second card in the same market segment as another brand new card.

Paul MaudDib fucked around with this message at 22:53 on Dec 1, 2016

EdEddnEddy
Apr 5, 2012



I keep thinking about the time when AMD GPUs lost their competitive edge, and it can kinda be attributed to the AMD acquisition of ATI. The cards at the time (the 5000-6000 series) were still pretty competitive with Nvidia's, and the acquisition of ATI by AMD led to what I expected with the AMD APUs. However, that got screwed by the usual corporate-merger toxicity in R&D and all the other backend stuff that happens, and while the APU idea is grand vs. Intel's iGPUs, you have the problem of AMD CPUs sucking rear end in comparison, which does nothing to help keep ATI's parts going strong.

With the APU focus, the GPU side lost its edge and its ability to remain competitive even though GCN was really good tech. They probably had a lot of canceled/changed roadmaps that would have been OK if not for the APU business and AMD mismanagement.

Also, who was doing ATI's fab work before the AMD acquisition?


This is all pure speculation on my end, but I have been watching the CPU/GPU market since the early days and it's just sad how little excitement and competition we have now after the golden/bad days of the '90s and early 2000s, with all the different chips and makers at the time.

I still miss 3DFX and the promise of what the Voodoo 5 6000 was supposed to bring. I still have a Voodoo Banshee in my P3 933 rig that works like a charm for some old 3DFX-powered stuff. Remember UltraHLE and its Glide-only N64 emulation too? Good times.


I mean, even with the AMD takeover of ATI, of all things we are still carrying the Radeon and GeForce names from 2001. Back before then we were going through GPU naming schemes like candy.




Also, while AMD and everyone else had a lull today, I picked up some shares at $8.35 and hope it hits a good high around the Zen announcement so I can sell a little after, probably.

craig588
Nov 19, 2005

by Nyc_Tattoo

Paul MaudDib posted:

Yeah every time AMD comes out with a new card the buttcoin miners buy out all the stock for months. Happened to the RX 480 too.

It's not just AMD either. It took me about 3 months to get a 1080 after they launched. I've wanted videocard preorders for years; you can find articles from over a decade ago with people noting the lack of stock at launch. It really annoys me: game consoles, which never have stock issues after the first week or so, have months-long preorder campaigns, but videocards, which always have stock issues for months, never get preorder offers.

Bitcoin farms have the opposite effect on card prices when they shut down after a few months, once the cards aren't profitable anymore. They were flooding the used market so hard that retail had to respond, and it was possible to get a new 290 for 200 dollars flat just months after launch. I remember telling people to buy 290s all summer and they were suspicious, like "hmm, didn't you get a 680?" If I didn't have that 680 already I would have gotten a 200 dollar 290, no question.

NoEyedSquareGuy
Mar 16, 2009

Just because Liquor's dead, doesn't mean you can just roll this bitch all over town with "The Freedoms."
The conversation over the last two pages has me questioning the upgrade I was planning. I bought a PG278Q during the recent Black Friday/Cyber Monday sales and my 970 seems slightly underpowered for pushing 2560x1440. The 1080 being priced at ~$650 has seemed kind of obscene since it launched; I guess AMD no longer being competitive at the high end and Nvidia using that as a way to jack up prices explains it. I haven't really paid attention to any of AMD's cards in a long time, since I had bad experiences with their drivers long ago and started going with Nvidia for everything after that. Is Vega likely to be something worth waiting for, or am I still better off getting a 1070 or 1080 despite questionable business practices? From what I can tell, Volta-line cards aren't coming out until late 2017 at the earliest, and there aren't really any other good options at the moment.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

craig588 posted:

It's not just AMD either. It took me about 3 months to get a 1080 after they launched. I've wanted videocard preorders for years; you can find articles from over a decade ago with people noting the lack of stock at launch. It really annoys me: game consoles, which never have stock issues after the first week or so, have months-long preorder campaigns, but videocards, which always have stock issues for months, never get preorder offers.

Bitcoin farms have the opposite effect on card prices when they shut down after a few months, once the cards aren't profitable anymore. They were flooding the used market so hard that retail had to respond, and it was possible to get a new 290 for 200 dollars flat just months after launch. I remember telling people to buy 290s all summer and they were suspicious, like "hmm, didn't you get a 680?" If I didn't have that 680 already I would have gotten a 200 dollar 290, no question.

NVIDIA cards mostly aren't used for mining. Their architecture is better for floating-point operations, while AMD's is better at the integer operations that hashing tends to use.
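
To illustrate (a toy sketch in Python, emphatically not a real mining kernel): a hash round is wall-to-wall 32-bit integer rotates, XORs, and adds, with no floating point anywhere, which is exactly the kind of ALU work GCN has a lot of throughput for.
code:
# Toy sketch, NOT a real mining kernel: just showing that a hash round is
# nothing but 32-bit integer ops (rotate/xor/add), with zero floating point.
MASK32 = 0xFFFFFFFF

def rotr32(x, n):
    """Rotate a 32-bit value right by n bits."""
    return ((x >> n) | (x << (32 - n))) & MASK32

def toy_round(a, b, c):
    a = (a + rotr32(b ^ c, 7)) & MASK32
    b = rotr32(b, 11) ^ a
    c = (c + (a | b)) & MASK32
    return a, b, c

state = (0x6a09e667, 0xbb67ae85, 0x3c6ef372)  # arbitrary starting constants
for _ in range(64):
    state = toy_round(*state)
print([hex(v) for v in state])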

I don't know if we ever got an informed post-mortem on the 1080 launch, but my interpretation has always been that there were parts shortages. The 1070 is what it looks like when you just have too much demand for your product: stock would pop up fairly frequently, it just sold out very quickly. The 1080, on the other hand, would only show up like once a week at a single store and would be gone in a flash. Either yield problems on the GPU chip or shortages of GDDR5X would make sense, since it was a full-fat chip on a brand-new node (one that GloFo also really struggled with), and GDDR5X was a brand-new product with a single-source supplier.

Yeah, when the coin markets hit the limit the cards flood onto the used market. It can be a good time to pick up a deal, and coin mining isn't as hard on cards as people tend to imply. Coin mining is about turning electricity (ideally your parents'/housemates'/school's electricity) into hashes as efficiently as possible, and overclocking destroys the value proposition there. Also, a huge part of what destroys GPUs under normal usage is the physical expansion/contraction as the chip heats up under load. Assuming neither has been overclocked, the person who plays some games, then goes off to eat dinner, then comes back, then gets bored and surfs the net for a half hour, then starts up another game, etc., is doing way more damage to the card than someone who gets the card warm and keeps it there.

Mining GPUs are likely to have a lot of hours on the fan, but it's real easy to replace fans. And you can rip off the stock cooler and put an AIO liquid cooling bracket on for like less than $80 if you have the mount.

Kinda fucks AMD both coming and going, though, because actual gamers don't get the cards during the shortages, so they never show up in the Steam marketshare or any other measures people look at. And then when everyone flips the cards, the market gluts, gamers see their card's value destroyed, and other people end up buying used, so AMD's sales abruptly drop. The coin-mining thing may pay the bills in the short term, but I'm not of the opinion that the yo-yo effect is a net positive for them overall.

NoEyedSquareGuy posted:

The conversation over the last two pages has me questioning the upgrade I was planning. I bought a PG278Q during the recent Black Friday/Cyber Monday sales and my 970 seems slightly underpowered for pushing 2560x1440. The 1080 being priced at ~$650 has seemed kind of obscene since it launched; I guess AMD no longer being competitive at the high end and Nvidia using that as a way to jack up prices explains it. I haven't really paid attention to any of AMD's cards in a long time, since I had bad experiences with their drivers long ago and started going with Nvidia for everything after that. Is Vega likely to be something worth waiting for, or am I still better off getting a 1070 or 1080 despite questionable business practices? From what I can tell, Volta-line cards aren't coming out until late 2017 at the earliest, and there aren't really any other good options at the moment.

The 1080 isn't really priced at $650-700 anymore; it's more like $550 for the cheaper models and $600 for the really nice ones. I just paid $475 for an FE at Best Buy (plus two $20 gift cards from eBay), but that was a price mistake.

If you have a GSync monitor already you might as well stick with it. The 970 is definitely on the weak side for pushing 1440p, but GSync is pretty good at making it work and I would think you can find some setting that keeps you above 30fps or so. :raise: Please do double-check that GSync is turned on in the control panel for both full-screen and windowed games; sometimes driver updates bork the settings.

If you want to upgrade, the 980 Ti or the 1070 is what you need for good 1440p. Both cards are roughly the same performance - most sites do comparisons using the reference cards at base speeds, and the 1070 will self-overclock out of the box so it looks better on paper. But with an aftermarket card or overclocking, the 980 Ti is basically neck and neck with the 1070. The 1070 just uses a bit less power to do it, and has some minor updates thrown in. AMD doesn't really have a good card in this performance class. The Fury/Fury X are competitive on paper but they only have 4 GB of VRAM and games are starting to use more nowadays, and in some situations they have some funky microstutter problems.

Shameless self-plug here: I have a 980 Ti Classified (EVGA's second-highest card) and I decided not to do Step Up because EVGA's got some parts problems with their 10-series cards, so I ended up just going with the FE for now instead. It's 6 months old, never overclocked, and has 2.5 years left on a transferable warranty. I'd been meaning to post an SA-Mart thread and I finally got off my rear end and did it, so if you or anybody else wants a 980 Ti or some 780 Tis, come and get them.

Otakufag
Aug 23, 2004
I just hope AMD gets competitive again and drives prices down. Gently caress NVIDIA for making me buy an overpriced GSync monitor at gunpoint.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Paul MaudDib posted:

Yeah every time AMD comes out with a new card the buttcoin miners buy out all the stock for months. Happened to the RX 480 too.


The rumors I heard were that Vega 10 was 4096-core, so somewhat faster than a Fury X, and that Vega 11 was the new halo product with 6144 cores.

Regardless of specifics, most of the rumors agree that Vega 11 is the larger of the two Vegas (the opposite of the Polaris numbering), so I think you're off there. Also it wouldn't make any sense at all to replace/supplant Polaris after only a few months on the market. It would take something wildly unexpected, like giving GloFo the finger and dropping Polaris 10 production, which I just can't see.

We can guess all we want on the details - any math is speculation at this point but from a marketing perspective Vega 10-based cards really need to compete with the 1070, if not beat it with a full-fat chip. 4096 cores sounds about right. But I firmly expect Vega 10 to be the new 490X and Vega 11 to be the new halo card. I would be astonished to see anything else. They aren't going to slot a second card in the same market segment as another brand new card.

Vega 11 being the halo chip was only ever a rumor conjured from the aether; the only real tangible info on Vega 11 is that it's aimed to supplant Polaris 10 and will come after Vega 10 - http://videocardz.com/63715/amd-vega-and-navi-roadmap. This would put the Vega 11 release closer to summer 2017, and AMD (and Nvidia!) have done quicker release cycles than that.

Also, if GCN5 is to truly compete (I'd assume it would have to be GCN5; GCN4 isn't good enough), then one would assume a midrange Vega 11 would be supplanting Polaris 10's market position (as in X70/X80) but not its performance, and thus push Polaris 10 further down the stack, not eliminate it.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
After one valiant year of service, my MSI GTX 970 exploded 5 minutes ago. It sounded (and now smells) like someone set off fireworks in my room.

I see where a capacitor exploded on the GPU. Is the rest of my newly built computer safe?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Dali Parton posted:

After one valiant year of service, my MSI GTX 970 exploded 5 minutes ago. It sounded (and now smells) like someone set off fireworks in my room.

I see where a capacitor exploded on the GPU. Is the rest of my newly built computer safe?

Your machine is probably fine.

Ryuga Death
May 14, 2008

There's gotta be one more bell to crack
Fun Shoe
According to EVGA Precision, my GTX 1070 is showing 1037MHz on the GPU clock and 4006MHz on the memory clock while at the desktop and idle. The card is currently powering 3 monitors, two at 1080p@60Hz and one at 1440p@144Hz (though I only game on the 1440p one). Are these clock speeds or whatever normal?

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
If you don't have the latest driver, there is a bug where more than one DP port will throttle the card up. For reference, I have 3 devices (2 DP, 1 HDMI) and it's at like ~900MHz.

However, your browser can trigger the GPU to throttle up as well.
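
If you want to watch what the clocks actually do while you connect and disconnect displays, something like this works (a sketch that assumes nvidia-smi, which ships with the driver, is on your PATH):
code:
# Poll clocks and temperature every 2 seconds while plugging/unplugging
# monitors. Assumes nvidia-smi (installed with the driver) is on the PATH.
import subprocess
import time

for _ in range(15):
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=clocks.gr,clocks.mem,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout.strip())  # e.g. "215 MHz, 405 MHz, 30"
    time.sleep(2)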

Ryuga Death
May 14, 2008

There's gotta be one more bell to crack
Fun Shoe

incoherent posted:

If you don't have the latest driver, there is a bug where more than one DP port will throttle the card up. For reference, I have 3 devices (2 DP, 1 HDMI) and it's at like ~900MHz.

However, your browser can trigger the GPU to throttle up as well.

My three devices are connected by 1 HDMI, 1 DP, and 1 DVI. I assume from this response that my clocks are maybe a bit high?

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
Well, that doesn't fit the profile of the bug. What happens when you clock the 144Hz monitor down to 60?

teagone
Jun 10, 2003

That was pretty intense, huh?

Ryuga Death posted:

My three devices are connected by 1 HDMI, 1 DP, and 1 DVI. I assume from this response that my clocks are maybe a bit high?

Sup Rascal, does other OC/monitoring software, e.g., MSI Afterburner, report the same idle clock speed? [edit] I don't believe memory speed downclocks when idle.

teagone fucked around with this message at 07:09 on Dec 2, 2016

Ryuga Death
May 14, 2008

There's gotta be one more bell to crack
Fun Shoe

incoherent posted:

Well, that doesn't fit the profile of the bug. What happens when you clock the 144Hz monitor down to 60?

Same thing, but disabling the two other monitors reduces the GPU clock to 215MHz and the memory clock to 405MHz.

edit: Having only two monitors connected (one DP/1440p/144Hz and one HDMI/1080p/60Hz) leaves the GPU clock at 215MHz and the memory clock at 405MHz, and also lowers the temps by a few degrees Celsius.

edit 2: As soon as a third monitor is connected, the GPU and memory clocks shoot up to 1063MHz and 4006MHz respectively, and the idle temps rise to 32°C. At least I know what's going on.

edit 3: Is it a good idea to try connecting one of the side monitors (either the HDMI or the DVI one) to the motherboard instead of the 1070 itself, or should I just leave well enough alone and accept the relatively high clock speeds with 3 monitors connected to the 1070? I know this would require changing a setting or two in the BIOS. It seems like it could maybe fix the clock issue, but who knows.

Ryuga Death fucked around with this message at 08:20 on Dec 2, 2016

teagone
Jun 10, 2003

That was pretty intense, huh?

Can confirm this clock speed wonkiness with my setup as well: a GTX 1060 with three 60Hz 1080p displays connected to it (DVI, HDMI, DP). With all three monitors on and running, the idle core clock is ~800-850MHz, the memory clock stays consistent at 4007MHz, power % is at 28, and the idle temp generally hovers around 40°C. If I disconnect one of the displays (doesn't matter which one), the GPU core clock drops to ~600MHz, the memory drops to 810MHz, power % drops to between 7 and 8, and the idle temp drops to roughly 30°C. As soon as I connect a third monitor, clocks and idle temps rise.

I'm assuming this is a hardware limitation, right? No setting I can change will affect the clock speeds with 3 displays attached to my 1060? As per Ryuga Death's suggestion, would enabling the iGPU to use the mainboard's HDMI port to connect one of my displays prevent my 1060 from defaulting to higher clock speeds?

teagone fucked around with this message at 08:37 on Dec 2, 2016

Atomizer
Jun 24, 2007



incoherent posted:

If you don't have the latest driver, there is a bug where more than one DP port will throttle the card up. For reference, I have 3 devices (2 DP, 1 HDMI) and it's at like ~900MHz.

However, your browser can trigger the GPU to throttle up as well.

Oh hmm, I didn't know about this. I just installed a 1070 (Zotac Mini) and have 2x DP and 1x HDMI connected, with about those clocks (1GHz GPU, 4GHz RAM) at idle. I didn't get to play around with anything too much, and didn't try with just one or two displays connected, but the system seemed to be running fine and gaming was unproblematic. I do have the latest driver installed, though. Is this really a bug, however? Wouldn't you expect the card to have to run harder with more displays connected?

I dicked around a little with OCing but then backed off, because I don't really need the 1070's power for anything I'm doing right at this moment. I also played around with fan speeds; in my SFF case, with the card attached to a riser and mounted upside-down at the bottom of the case, it seems to hover at perhaps 55°C idle with the fan off and just minor desktop activity (albeit with Chrome open with a bunch of tabs), and with the fan running at 100% that drove the temp down to 45°C and below. This was after some benchmarking and gaming, though, so everything was pretty toasty at that point.

teagone
Jun 10, 2003

That was pretty intense, huh?

Atomizer posted:

Wouldn't you expect the card to have to run harder with more displays connected?

Yeah it does make sense. I'm just wondering if it's possible to offload the higher power usage by connecting one of my three displays to the iGPU instead. If not, no biggie :)

Atomizer
Jun 24, 2007



teagone posted:

Yeah it does make sense. I'm just wondering if it's possible to offload the higher power usage by connecting one of my three displays to the iGPU instead. If not, no biggie :)

I just didn't have an answer for your question as I can't try it myself due to not having onboard video, but I do have some insight into this situation.

According to the display on my UPS, my gaming system used about 60 watts when idle at the desktop (with Chrome open), and that was for the PC, main monitor, a USB hub, and a Logitech gaming keyboard, but not the two auxiliary displays (which I have plugged into surge protection but not battery backup). When gaming, the power draw was roughly in the 130-180 watt range (and the system has a 500W PSU). I don't remember the exact numbers because they were so low as not to concern me at all, gaming or idle, but the point is that the power usage [of the video card running higher at idle than would be possible with a single display] is probably negligible. I mean, the whole system plus one monitor and peripherals used 60W while idle, so how much less power would the GPU alone draw with two fewer displays to drive? It's certainly not going to cut that number in half, so I'm not too worried about the inefficiency.

I can do some more accurate tests in a few days if you're really interested. If you don't have a UPS that can report power draw or something along those lines, I also have one of these which I highly recommend.

Let's step back, though, and assume that the extra power draw is significant. If you just need displays for productivity and not gaming/video, you could go with the DisplayLink video-over-USB stuff. For example, these would let you use your existing displays:
http://www.newegg.com/Product/Product.aspx?Item=N82E16815101012
https://www.amazon.com/gp/product/B00ECDM78E

Or, you could save even more power with DisplayLink panels like this:
https://www.amazon.com/dp/B013XFJKGI/ref=twister_B01GKN32HI
There are cheaper versions (e.g. smaller and lower resolution) but they use only as much power as they get over USB, typically 10 W, so you'd be drawing only up to that much (my new 25" for example has a 30 W adapter and is claimed to use 24 W typically) without the extra workload on the AIC.


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

Vega 11 being the halo chip was only ever a rumor conjured from the aether; the only real tangible info on Vega 11 is that it's aimed to supplant Polaris 10 and will come after Vega 10 - http://videocardz.com/63715/amd-vega-and-navi-roadmap. This would put the Vega 11 release closer to summer 2017, and AMD (and Nvidia!) have done quicker release cycles than that.

Also, if GCN5 is to truly compete (I'd assume it would have to be GCN5; GCN4 isn't good enough), then one would assume a midrange Vega 11 would be supplanting Polaris 10's market position (as in X70/X80) but not its performance, and thus push Polaris 10 further down the stack, not eliminate it.

Well, if the best they can hope for is to compete with the 1070 then Vega is dead on arrival.

The reality is that the 1080 and even the Titan XP are not pushing anywhere near as hard as NVIDIA can go here. GP104 is basically the size of a GK104 (680/770), GP102 is significantly smaller than a GK110B (780 Ti), and the only consumer GP102 card in the lineup is a die harvest. There is a lot of room for faster cards on the chips they already have, let alone a second-gen respin of their architecture like they have done in the past.

If Vega 10 is the big chip and it doesn't hit until 2Q 2017 (when they say 1H they actually mean 2Q; otherwise they would say 1Q), then by that point there's very probably a 1080 Ti or 1180 or whatever NVIDIA wants to call it on the market. The compute market will be decently filled with full-fat GP102 chips, and there will be enough spillover to start looking at a full-fat Titan Black-style Pascal and pushing the die harvests down the line to an 1180. In fact the expectation is that there may be a 1080 Ti as soon as January.

There are at least two more possibilities. One is a gaming card built on GP100. I know people think the double-precision hardware on there means it's worthless at gaming, but I don't think the numbers back that up. GP100 is a significantly bigger chip than GP102 - 33% bigger. It's just about the same ratio as going from GK110B to GM204. Despite the fact that GK110B has disabled double-precision units wasting die space, it's still got more single-precision performance than GM204, and that's with Maxwell making architectural improvements that Pascal did not improve on (architecturally, and in IPC terms, Pascal is the same as Maxwell). I'm specifically thinking of the switch from Kepler's general-purpose/compute-oriented queue-based architecture to Maxwell's hardcoded single-queue graphics mode here - that really boosted Maxwell's framerates way above what you would expect from the pure single-precision throughput. To put it in a nutshell, I would expect a Pascal GP100 to fare better against GP102 than GK110B did against GM204, and GK110B was already coming out on top in a technical sense despite having double-precision hardware wasting space.
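
For reference, the commonly cited die sizes with the ratios worked out (approximate figures from memory, so treat them as ballpark):
code:
# Commonly cited die sizes in mm^2 -- approximate, from memory, just to
# sanity-check the size comparisons above.
die_mm2 = {
    "GK104 (680/770)":    294,
    "GK110B (780 Ti)":    561,
    "GM204 (980)":        398,
    "GP104 (1080)":       314,
    "GP102 (Titan XP)":   471,
    "GP100 (Tesla P100)": 610,
}

print(f"GP104 vs GK104:  {die_mm2['GP104 (1080)'] / die_mm2['GK104 (680/770)']:.2f}x")
print(f"GP100 vs GP102:  {die_mm2['GP100 (Tesla P100)'] / die_mm2['GP102 (Titan XP)']:.2f}x")
print(f"GK110B vs GM204: {die_mm2['GK110B (780 Ti)'] / die_mm2['GM204 (980)']:.2f}x")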

The other possibility is a GM200-style 600mm^2 gaming/deep-learning chip; let's call it GP200 or GP204 or something. Obviously there's some lead time there, and I don't know if there's really time left at this point before they move on to Volta. But the node will have matured and yields will be increasing, and they have a proven architecture at this point. Stamping down a few extra SM engines to fill out a bigger die isn't all that difficult conceptually, any more than GP102 was. It just takes time and money. Die harvests are hugely cheaper than full-fat chips, and you could probably hit full-fat GP102 performance fairly reasonably.

Anyway, the point here is that nothing really changes if there's not a big Vega 11. NVIDIA has several easy potential moves for faster hardware. If AMD is launching their 1070 competitor in 2Q 2017, then the best they can hope for is to compete against the 1160 (or however the model numbers shake out). NVIDIA pushes one or more new cards onto the top of their lineup, everything else gets pushed down a notch, and AMD is once again unable to compete above the lower midrange.

Paul MaudDib fucked around with this message at 16:36 on Dec 2, 2016
