Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy
You know what really doesn't make any sense?

"Poor Volta[ge]".

How could they think that was ok if the best they had was something near 1080 performance? Either it was a massive fuckup in that they literally didn't mean to compare their product to Volta, or it's a massive fuckup in that they are not getting anywhere near the performance they were expecting.


Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Measly Twerp posted:

You know what really doesn't make any sense?

"Poor Volta[ge]".

How could they think that was ok if the best they had was something near 1080 performance? Either it was a massive fuckup in that they literally didn't mean to compare their product to Volta, or it's a massive fuckup in that they are not getting anywhere near the performance they were expecting.

No, it was a brilliant marketing tactic, as at least some portion of the kool-aid-drinking r/amd base took it as 100% fact that Vega was going to crush Volta and preordered as soon as it was available. And they'll keep holding onto that belief even though they are currently holding a 300W GTX 1080 in their hands that is literally boiling their FineWine.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

sincx posted:

Think I can sell my RX480 for enough to buy a 1080?
They sell for $350-$400 right now, so no

repiv
Aug 13, 2009

Measly Twerp posted:

You know what really doesn't make any sense?

"Poor Volta[ge]".

How could they think that was ok if the best they had was something near 1080 performance? Either it was a massive fuckup in that they literally didn't mean to compare their product to Volta, or it's a massive fuckup in that they are not getting anywhere near the performance they were expecting.

Who knows what performance they were expecting, but there's evidence of a power consumption fuckup late in development at least. This slide leaked out in January:



They were expecting a 225W TDP at ~1.5GHz, and welp at how that turned out.

Geemer
Nov 4, 2010




I can't help but laugh at the pixelated-out watermark with a videocardz.com watermark slapped over it. Stealing a stolen slide. :allears:

repiv
Aug 13, 2009

Geemer posted:

I can't help but laugh at the pixelated out watermark with a videocardz.com slapped over it. Stealing a stolen slide. :allears:

VCZ posted those slides first, so they definitely weren't covering up another site's watermark. It's more likely covering up some identifying information that would get the source in trouble.

Cygni
Nov 12, 2005

raring to post

repiv posted:

They were expecting a 225W TDP at ~1.5ghz, and welp at how that turned out.

I wonder if this points to design issues in Vega itself, or weakness in the Samsung 14nm LPP process they are using. The Samsung 14LPP node is the same one being used for Ryzen, which seems pretty power efficient. I think it's also used for Polaris 20 / RX 580, so they have experience with it and knew its power characteristics. Seems strange that they would be caught so off guard once Vega started getting cut, unless the issue is with the design itself.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

sincx posted:

Think I can sell my RX480 for enough to buy a 1080?

Sell your RX 480 for $375 right now if you've got something else to use in the meantime (or won't be gaming much). Maybe buy something really cheap from EVGA B-stock like a 750 Ti or something the next time they refresh.

You'll lose ~20% due to fees/shipping/etc so you get $300. Then either spend the extra $150-200 and get a 1080 right now, or wait a month or two until the mining market crashes at which point you can probably get a 1080 for sub-$400.

But seriously it's not going to be long now, mining profit on a RX 480 is down from $5 to like $3 per day now. Sooner or later miners are going to realize those profits don't justify dropping $375 on a GPU so if you're thinking about it don't wait too long.
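The sell-now math above works out like this (the $375 price, ~20% fee loss, and $3-5/day mining figures are the ones quoted in the post; everything else is plain arithmetic):

```python
# Back-of-the-envelope: what a seller actually pockets after fees,
# and how long a miner needs to recoup the card's purchase price.

def net_proceeds(sale_price: float, fee_rate: float = 0.20) -> float:
    """Cash in hand after eBay/PayPal fees and shipping (~20% combined)."""
    return sale_price * (1 - fee_rate)

def payback_days(gpu_price: float, profit_per_day: float) -> float:
    """Days of mining needed just to break even on the card."""
    return gpu_price / profit_per_day

proceeds = net_proceeds(375)          # ~$300 in hand from a $375 sale
days_at_3 = payback_days(375, 3.0)    # 125 days before the card pays for itself
days_at_5 = payback_days(375, 5.0)    # 75 days at the earlier $5/day rate

print(f"net proceeds: ${proceeds:.0f}")
print(f"payback: {days_at_3:.0f} days at $3/day vs {days_at_5:.0f} at $5/day")
```

At $3/day a miner is looking at four-plus months just to break even, which is the "profits don't justify $375" point in a nutshell.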

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Paul MaudDib posted:

Sell your RX 480 for $375 right now if you've got something else to use in the meantime (or won't be gaming much). Maybe buy something really cheap from EVGA B-stock like a 750 Ti or something the next time they refresh.

You'll lose ~20% due to fees/shipping/etc so you get $300. Then either spend the extra $150-200 and get a 1080 right now, or wait a month or two until the mining market crashes at which point you can probably get a 1080 for sub-$400.

But seriously it's not going to be long now, mining profit on a RX 480 is down from $5 to like $3 per day now. Sooner or later miners are going to realize those profits don't justify dropping $375 on a GPU so if you're thinking about it don't wait too long.

it's already lower than $3, sell on ebay now, a lot of the local Craigslist demand has already started to die down and ebay won't last much longer than that
You can list the 480 and buy a new card today, then ship out as soon as the new one arrives

sauer kraut
Oct 2, 2004

Cygni posted:

I wonder if this points to design issues in Vega itself, or weakness in the Samsung 14nm LPP process they are using. The Samsung 14 LPP node is the same one getting used for Ryzen, which seems pretty power efficient. I think its also used for Polaris 20 / RX 580, so they have experience with it and knew its power characteristics. Seems strange that they would be caught so off guard once Vega started getting cut, unless the issue is with the design itself.

It may well be a 225W card if it's tuned sanely to peak out at ~1500 MHz, but of course they slammed full throttle to match a ref 1080 in reviews :smith:

repiv
Aug 13, 2009

Cygni posted:

I wonder if this points to design issues in Vega itself, or weakness in the Samsung 14nm LPP process they are using. The Samsung 14 LPP node is the same one getting used for Ryzen, which seems pretty power efficient. I think its also used for Polaris 20 / RX 580, so they have experience with it and knew its power characteristics. Seems strange that they would be caught so off guard once Vega started getting cut, unless the issue is with the design itself.

I wonder if it's pertinent that GP107 is built on Samsung 14nm but all the larger Pascal parts are built on TSMC 16nm. Does Samsung's process just suck for larger GPUs, and Nvidia knew? :tinfoil:

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
https://twitter.com/GamersNexus/status/883036964815032320
so I guess benchmarks with it like tomorrow or something

(re hacked together Vega FE w/ water)

sincx
Jul 13, 2012

furiously masturbating to anime titties

Fauxtool posted:

its already lower than 3, sell on ebay now, a lot of the local Craigslist demand has already started to die down and ebay wont last much longer than that
You can list the 480 and buy a new card today, ship out as soon as it arrives

Seems like too much trouble, especially since this card was an RMA replacement for an R9 290, so I've already lucked out. Better not push my luck. I'll wait for the 1180 or Vega 2.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

repiv posted:

I wonder if it's pertinent that GP107 is built on Samsung 14nm but all the larger Pascal parts are built on TSMC 16nm. Does Samsungs process just suck for larger GPUs, and Nvidia knew? :tinfoil:

They have much higher volume so it is entirely likely

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

sauer kraut posted:

It may well be a 225W card if it's tuned sanely to peak out at ~1500 MHz, but of course they slammed full throttle to match a ref 1080 in reviews :smith:

In the PCPer testing it was running at 1440MHz and still consuming close to 300W; I expect the cards with sufficient cooling to sustain 1600MHz to consume well over 300W.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

MaxxBot posted:

In the PCPer testing it was running at 1440MHz and still consuming close to 300W, I expect the cards with sufficient cooling to sustain 1600MHz to consume well over 300.

Regardless of the "micro-throttle" theory there is very clearly some kind of TDP throttle on the Vega FE blower cards.

Blower cards run hotter than liquid-cooled cards. Hotter chips have higher voltage leakage than the same chip at a cooler temperature. Higher voltage leakage means higher TDP. I'd probably expect 50-75W higher or so - the 295x2 saves 86W over a pair of 290Xs that have a 516W average gaming TDP.

The fact that a blower card is consuming 75W less than the AIO card says there must be a throttle in there somewhere, and has been right from the start. The only question is whether it's a regular old TDP throttle or a fancy Pascal-style micro-throttle.

If you assume that the Vega FE liquid-cooled version can sustain 1600 MHz at 375W (which - it had better be able to), and the RX Vega cards are going to clock a little higher (Videocardz spotted a test result that says 1630) then the RX Vega cards will probably consume a bit more than 375W. Probably like 400-425W at stock and then you can open up the power limit and do 500W+.
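The leakage arithmetic being leaned on here can be sketched quickly (the 86W savings and 516W pair figures are the ones cited above; treating the 295x2 as simply two liquid-cooled 290Xs is a simplification):

```python
# Estimate the per-card power saving from liquid cooling, using the
# 295x2-vs-2x290X comparison cited above as the data point.

PAIR_290X_TDP = 516.0   # W, average gaming draw of two air-cooled 290Xs
SAVINGS_295X2 = 86.0    # W saved by the liquid-cooled 295x2 vs that pair

per_card_saving = SAVINGS_295X2 / 2               # W saved per GPU from cooler silicon
liquid_pair_tdp = PAIR_290X_TDP - SAVINGS_295X2   # what the liquid-cooled pair draws

print(f"per-card saving from liquid cooling: ~{per_card_saving:.0f} W")
print(f"liquid-cooled pair draws: ~{liquid_pair_tdp:.0f} W")
```

That data point suggests roughly 40W per card from cooling alone, so a 50-75W gap between a hot blower card and an AIO card is in the plausible range.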

edit: nevermind about PCPer vs GN, that chart didn't say what I remembered, snipped

Paul MaudDib fucked around with this message at 21:12 on Jul 6, 2017

GRINDCORE MEGGIDO
Feb 28, 1985


Paul MaudDib posted:

then you can open up the power limit and do 500W+.

What an absolute trainwreck of a card.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
But that also proves how well built the AMD GPUs' VRMs are compared to NV's crappy only-300W ones on GP102 :smug:

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Paul MaudDib posted:

Regardless of the "micro-throttle" theory there is very clearly some kind of TDP throttle on the Vega FE blower cards.

Blower cards run hotter than liquid-cooled cards. Hotter chips have higher voltage leakage than the same chip at a cooler temperature. Higher voltage leakage means higher TDP. I'd probably expect 50-75W higher or so - the 295x2 saves 86W over a pair of 290Xs that have a 516W average gaming TDP.

The fact that a blower card is consuming 75W less than the AIO card says there must be a throttle in there somewhere and has right from the start. The only question is whether it's a regular old TDP throttle or a fancy Pascal-style micro-throttle.

If you assume that the Vega FE liquid-cooled version can sustain 1600 MHz at 375W (which - it had better be able to), and the RX Vega cards are going to clock a little higher (Videocardz spotted a test result that says 1630) then the RX Vega cards will probably consume a bit more than 375W. Probably like 400-425W at stock and then you can open up the power limit and do 500W+.

edit: nevermind about PCPer vs GN, that chart didn't say what I remembered, snipped

That 1630 is static, 3DMark does not report actual clock speed, just what the chip claims its max clock is.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

AVeryLargeRadish posted:

That 1630 is static, 3DMark does not report actual clock speed, just what the chip claims its max clock is.

I don't think it matters in this case. I think it has to be power-throttling at 300W, it was predictable from the specs of the two cards even before we saw results. It's stopping at exactly 300W for all the reviewers, this just has to be a power limit right now.

We already know from the early reviews that it's not a thermal throttle. Either PCPer or GN (don't remember) tested it with 100% fans and it was a jet turbine, but the card didn't break 60C during the test. That rules out a thermal throttle for me.

My assertion is that the 375W limit will let Vega FE sit at its rated clock limit, with all units enabled/whatever. Either it's power limited or this thing just doesn't even come close to its rated boost clocks during normal operation. (:jeb:)

Hopefully 375W is also enough for RX Vega to hit its full 1630 boost clocks, or they up the power limit a little further. Or at least, an extra 30 MHz should only eat a little more power (<50W). So in theory even if they shipped with a 375W limit you could cover the difference by increasing the power limit.
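The "an extra 30 MHz should only eat <50W" guess can be sanity-checked with the usual rule of thumb that dynamic power scales with f·V², i.e. close to f³ when voltage is raised along with clocks (the 375W/1600MHz baseline is from the post; the cube-law model is a common approximation, not a measurement, and it ignores static leakage):

```python
# Coarse dynamic-power scaling: P2 ≈ P1 * (f2/f1)**3 when voltage
# tracks frequency. A rough model only -- static leakage is ignored.

def scaled_power(p1_watts: float, f1_mhz: float, f2_mhz: float) -> float:
    return p1_watts * (f2_mhz / f1_mhz) ** 3

est = scaled_power(375, 1600, 1630)   # 375W @ 1600MHz scaled up to 1630MHz
extra = est - 375

print(f"estimated draw at 1630 MHz: {est:.0f} W (+{extra:.0f} W)")
```

The cube-law estimate lands around +20W for those extra 30 MHz, comfortably inside the <50W headroom claimed above.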

Paul MaudDib fucked around with this message at 22:06 on Jul 6, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Can someone who's used a G10 bracket tell me whether these heatsinks will clear the bracket?

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Paul MaudDib posted:

Can someone who's used a G10 bracket tell me whether these heatsinks will clear the bracket?
Not sure about a G10, but I used these with an Accelero cooler on my 6970. If you have some tin snips you can cut them down to fit very easily. Hell, you could probably do it with scissors.


Why do Nvidia cards not appear to have VRM sensors? Every AMD card I've owned since the 4870s had them, but this Gigabyte 1070 does not... Or is that specific to Gigabyte?

craig588
Nov 19, 2005

by Nyc_Tattoo
Temp sensors? They're not standard. They do have voltage and current sensors that aren't exposed to end users. (The voltage graphs in Afterburner are reporting requested voltage, not measured.) I think EVGA is the only company with VRM temperature sensors.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Actually you know what, Gelid makes a bolt-on kit for the VRMs, and according to the OC.net thread they work with the G10 (and the thread thinks they will even work on a 1080 Ti). There's a seller on eBay who has them for $17 shipped, I'll just grab something like these for the RAM (probably unnecessary) and call it a day. Too bad it'll be the slow boat from China, can't find anywhere in the US with that kit.

1gnoirents
Jun 28, 2014

hello :)

Paul MaudDib posted:

Can someone who's used a G10 bracket tell me whether these heatsinks will clear the bracket?

Yes I bought those exact ones 14mm is good

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

Can someone who's used a G10 bracket tell me whether these heatsinks will clear the bracket?

They should be fine, but they're also unnecessary. You can get a big ol' bag of aluminum ones for cheaper and they work just fine. These are the ones I use on my 1080:

http://www.ebay.com/itm/Lot-40-pcs-9x9x12mm-adhesive-Aluminum-blue-Heat-Sink-For-Memory-GPU-Chip-IC-PI-/141719145659

I threw these on the RAM because I had them and they're a little bigger and fit the RAM chips better:
http://www.ebay.com/itm/LOT-OF-40-PCS-Aluminum-VGA-Card-Xbox360-DDR-RAM-Cooler-self-adhesive-Heatsink-/141264980186

e; remember you're going to need at least 8 for the VRMs, and then another 11 for the RAM if you want to throw sinks on those, too, so you're talking $40 for copper sinks vs $12 for aluminum ones with a handful to spare.
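The cost comparison above, as arithmetic (the counts, prices, and 40-piece pack size are the ones quoted in the post and linked listings):

```python
# 8 sinks for the VRMs + 11 for the RAM, vs what one $12 pack of 40
# aluminum sinks covers, against ~$40 for copper equivalents.

SINKS_NEEDED = 8 + 11        # VRM sinks + RAM sinks
ALU_PACK_SIZE, ALU_PACK_COST = 40, 12.0
COPPER_COST = 40.0

spare = ALU_PACK_SIZE - SINKS_NEEDED
savings = COPPER_COST - ALU_PACK_COST

print(f"{SINKS_NEEDED} sinks needed, {spare} spares from one $12 pack, "
      f"${savings:.0f} saved vs copper")
```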

DrDork fucked around with this message at 01:50 on Jul 7, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

They should be fine, but they're also unnecessary. You can get a big 'ol bags of aluminum ones for cheaper and they work just fine. These are the ones I use on my 1080:

http://www.ebay.com/itm/Lot-40-pcs-9x9x12mm-adhesive-Aluminum-blue-Heat-Sink-For-Memory-GPU-Chip-IC-PI-/141719145659

I threw these on the RAM because I had them and they're a little bigger and fit the RAM chips better:
http://www.ebay.com/itm/LOT-OF-40-PCS-Aluminum-VGA-Card-Xbox360-DDR-RAM-Cooler-self-adhesive-Heatsink-/141264980186

e; remember you're going to need at least 8 for the VMRs, and then another 11 for the RAM if you want to throw sinks on those, too, so you're talking $40 for copper sinks vs $12 for aluminum ones with a hand-full to spare.

I'm actually going to go with the Gelid VRM cooler, they've been around for a couple generations now and people seem to like them. $15 a bracket ain't bad if it helps spread heat a little vs no heatsinks, that's no more expensive than getting some crappy heatsinks so I'll just wait the 2-4 weeks for slowboat shipping. I ordered 2 in case I put one on a card for my fiance (or for SLI :v:)





OK, so you would turn the bracket 90 degrees counterclockwise from the orientation in the pic to match the outline. You can see the middle of the three "arms" is a little longer than the others; that faces right on the board, I think.

It looks like it has a cutout in the middle between the high-stage and low-stage VRM sections. I assume that's a bigass filtering capacitor for power stabilization? Doesn't need to be heatsinked?

The DRAM chips are apparently 10x14mm according to the datasheet, so some of the cheap 0.5x0.5in heatsinks should be fine. Do the 14mm heatsinks fit the DRAM too, height-wise? Is there any real benefit to having the tall heatsinks, or is just having stubby ones on there more than enough? (The memory doesn't put out much heat compared to the VRMs.)

Paul MaudDib fucked around with this message at 03:51 on Jul 7, 2017

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

I'm actually going to go with the Gelid VRM cooler
Let us know how it works--if it fits well, I'll keep it in mind for the inevitable 1100-series upgrade (assuming they don't change up the VRM layout, which seems a reasonable expectation).

Paul MaudDib posted:

It looks like it has a cutout in the middle between the high-stage and low-stage VRM sections. I assume that's a bigass filtering capacitor for power stabilization? Doesn't need to be heatsinked?
It's a choke, and no, they don't really need to be cooled. It's the MOSFETs that pump out the heat.

Paul MaudDib posted:

Do the 14mm heatsinks fit the DRAM too, height-wise? Is there any real benefit to having the tall heatsinks or is just having stubby ones on there more than enough?
They should fit, yeah. At the very worst they're a tiny bit tall and you can sand/grind the tops down a little. There's no benefit to tall vs wide or whatever--VRAM doesn't really get hot enough to worry about anyway, so anything you bother slapping on them is just a bonus.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Then I won't worry about it, I'll just use the Gelid bracket and let the DRAM be. I gotta get one of those thermal vision addons for my phone.

lDDQD
Apr 16, 2006

Palladium posted:

But that also proves how well built the AMD GPU's VRMs are compared to NV's crappy only-300W ones on GP102 :smug:

The VRMs on the Vega FE at least are capable of delivering around 600W without breaking a sweat, and are roughly comparable to what you'd see on a Lightning/Kingpin/Hall of Fame card.

lDDQD fucked around with this message at 05:26 on Jul 7, 2017

Fruit Chewy
Feb 13, 2012
join whole squid
I wonder if I could put that Gelid bracket on my 1080 Ti VRMs underneath the Arctic Accelero. The little stick-on heatsinks are lackluster.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Looking forward to RX Vega HOF Edition with 5x 8-Pin PCI-E connectors.

NewFatMike
Jun 11, 2015

MaxxBot posted:

Looking forward to RX Vega HOF Edition with 5x 8-Pin PCI-E connectors.

"What's the most expensive part of your build?"

The electric bill.

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




Fruit Chewy posted:

I wonder if I could put that gelid bracket on my 1080ti vrms underneath the arctic accelero. The little stick on heatsinks are lackluster.

Find out, then let me know.

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Paul MaudDib posted:

This is correct. There is no "default display" in Windows, if you don't have a monitor or a dummy-dongle then the display is actually not rendered at all.

Steam would have to implement a "virtual monitor" that captures the output and streams it. What they actually have uses basically the same mechanism as Shadowplay/OBS - it framegrabs the output being rendered to another display.

Remote Desktop does provide some of this functionality though (you can RD into a headless machine, you cannot VNC into a headless machine). Maybe there would be a way to piggyback on that, but I doubt RD is optimized around gaming (it may actually even be a software renderer, I'm not sure).

You can get Windows RDP to render on the video card and encode back out (Windows 10 or later, natch) in a high-quality video format. Check out how to do it here. It's how I game from home, and it's solid on a ~30mbit or faster connection @ 1440p.

Running just fine on a 1060 6gb.

e: if you wanna get fancy, you can redirect USB as well, and a plugged-in Xbone or BT controller will work.
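For reference, the settings usually involved here are the "Use hardware graphics adapters for all Remote Desktop Services sessions" and "Prioritize H.264/AVC 444" group policies. A minimal sketch of setting the corresponding registry values (value names are the commonly documented ones for these policies; double-check them against your Windows build before relying on this):

```python
import sys

# Policy registry location + values commonly used to make RDP render on
# the GPU and stream with H.264/AVC 444. Verify on your Windows version.
RDP_POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services"
RDP_POLICY_VALUES = {
    "bEnumerateHWBeforeSW": 1,   # prefer the hardware GPU over the software renderer
    "AVC444ModePreferred": 1,    # prioritize H.264/AVC 444 graphics mode
}

def apply_rdp_gpu_policy() -> bool:
    """Write the policy values (needs admin). Returns False on non-Windows hosts."""
    if sys.platform != "win32":
        return False
    import winreg  # Windows-only stdlib module
    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, RDP_POLICY_KEY) as key:
        for name, value in RDP_POLICY_VALUES.items():
            winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)
    return True
```

The same two settings can also be flipped in gpedit.msc under Remote Desktop Session Host > Remote Session Environment, which is the less error-prone route.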

incoherent fucked around with this message at 06:27 on Jul 7, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

lDDQD posted:

The VRMs on the Vega FE at least are capable of delivering around 600W without breaking a sweat, and are roughly comparable to what you'd see on a Lightning/KIngpIN/Hall of Fame card.

This doesn't necessarily mean much when apparently a lot of RX 580s are built with VRMs capable of delivering ~320W, something a Polaris 10 can only achieve under LN2 IIRC. IMHO, this points to a maximum of ~400W power draw for Vega under normal conditions. Good loving lord, LMAO, that's beyond worse than Polaris.

lDDQD
Apr 16, 2006
It means you won't have to wait for a custom PCB if you want to do even the most extreme overclocking - reference design is probably as good as it's gonna get :)

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Think about the possibilities of Vega at 400w. You could open a crematorium that also mines buttcoins at the same time.

GRINDCORE MEGGIDO
Feb 28, 1985


Didn't they complete the testing of the watercooled FE yet?


MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

GRINDCORE MEGGIDO posted:

Didn't they complete the testing of the watercooled Fe yet?

Gamers Nexus finished their testing but the video hasn't been uploaded yet.

EDIT: They just put this video up.

https://www.youtube.com/watch?v=yudueaG5_rE
