|
You know what really doesn't make any sense? "Poor Volta[ge]". How could they think that was ok if the best they had was something near 1080 performance? Either it was a massive fuckup in that they literally didn't mean to compare their product to Volta, or it's a massive fuckup in that they are not getting anywhere near the performance they were expecting.
|
# ? Jul 6, 2017 19:12 |
|
Measly Twerp posted:You know what really doesn't make any sense? No, it was a brilliant marketing tactic: at least some portion of the kool-aid drinking r/amd base took it as 100% fact that Vega was going to crush Volta and preordered as soon as it was available. And they'll keep holding onto that belief even though they are currently holding a 300W GTX 1080 in their hands that is literally boiling their FineWine.
|
# ? Jul 6, 2017 19:18 |
|
sincx posted:Think I can sell my RX480 for enough to buy a 1080?
|
# ? Jul 6, 2017 19:19 |
|
Measly Twerp posted:You know what really doesn't make any sense? Who knows what performance they were expecting, but there's evidence of a power consumption fuckup late in development at least. This slide leaked out in January: They were expecting a 225W TDP at ~1.5ghz, and welp at how that turned out.
|
# ? Jul 6, 2017 19:24 |
|
I can't help but laugh at the pixelated-out watermark with a videocardz.com one slapped over it. Stealing a stolen slide.
|
# ? Jul 6, 2017 19:43 |
|
Geemer posted:I can't help but laugh at the pixelated out watermark with a videocardz.com slapped over it. Stealing a stolen slide. VCZ posted those slides first, so they definitely weren't covering up another site's watermark. It's more likely covering up some identifying information that would get the source in trouble.
|
# ? Jul 6, 2017 19:48 |
|
repiv posted:They were expecting a 225W TDP at ~1.5ghz, and welp at how that turned out. I wonder if this points to design issues in Vega itself, or weakness in the Samsung 14nm LPP process they are using. The Samsung 14LPP node is the same one being used for Ryzen, which seems pretty power efficient. I think it's also used for Polaris 20 / RX 580, so they have experience with it and knew its power characteristics. Seems strange that they would be caught so off guard once Vega started getting cut, unless the issue is with the design itself.
|
# ? Jul 6, 2017 19:55 |
|
sincx posted:Think I can sell my RX480 for enough to buy a 1080? Sell your RX 480 for $375 right now if you've got something else to use in the meantime (or won't be gaming much). Maybe buy something really cheap from EVGA B-stock like a 750 Ti or something the next time they refresh. You'll lose ~20% due to fees/shipping/etc, so you get $300. Then either spend the extra $150-200 and get a 1080 right now, or wait a month or two until the mining market crashes, at which point you can probably get a 1080 for sub-$400. But seriously, it's not going to be long now; mining profit on an RX 480 is down from $5 to like $3 per day. Sooner or later miners are going to realize those profits don't justify dropping $375 on a GPU, so if you're thinking about it, don't wait too long.
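The resale math above sketches out like this (the $375 sale price and ~20% fee figure come from the post; the GTX 1080 street price is an assumed, illustrative number):

```python
# Back-of-the-envelope resale math for the RX 480 -> GTX 1080 swap.
# Only the $375 sale price and ~20% fee figure come from the post;
# the 1080 street price below is an assumption for illustration.
sale_price = 375.0              # current going rate for an RX 480
fee_rate = 0.20                 # ~20% lost to eBay/PayPal fees + shipping
net = sale_price * (1 - fee_rate)
gtx_1080_price = 480.0          # assumed current GTX 1080 street price
print(f"Net from selling the 480: ${net:.0f}")                        # $300
print(f"Extra cash needed for a 1080 today: ${gtx_1080_price - net:.0f}")  # $180
```

With those assumed numbers the out-of-pocket cost lands right in the $150-200 range the post mentions.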
|
# ? Jul 6, 2017 20:05 |
|
Paul MaudDib posted:Sell your RX 480 for $375 right now if you've got something else to use in the meantime (or won't be gaming much). Maybe buy something really cheap from EVGA B-stock like a 750 Ti or something the next time they refresh. It's already lower than $3. Sell on eBay now; a lot of the local Craigslist demand has already started to die down, and eBay won't last much longer than that. You can list the 480 and buy a new card today, then ship out as soon as it arrives.
|
# ? Jul 6, 2017 20:16 |
|
Cygni posted:I wonder if this points to design issues in Vega itself, or weakness in the Samsung 14nm LPP process they are using. The Samsung 14 LPP node is the same one getting used for Ryzen, which seems pretty power efficient. I think its also used for Polaris 20 / RX 580, so they have experience with it and knew its power characteristics. Seems strange that they would be caught so off guard once Vega started getting cut, unless the issue is with the design itself. It may well be a 225W card if it's tuned sanely to peak out at ~1500 MHz, but of course they slammed it full throttle to match a ref 1080 in reviews.
|
# ? Jul 6, 2017 20:27 |
|
Cygni posted:I wonder if this points to design issues in Vega itself, or weakness in the Samsung 14nm LPP process they are using. The Samsung 14 LPP node is the same one getting used for Ryzen, which seems pretty power efficient. I think its also used for Polaris 20 / RX 580, so they have experience with it and knew its power characteristics. Seems strange that they would be caught so off guard once Vega started getting cut, unless the issue is with the design itself. I wonder if it's pertinent that GP107 is built on Samsung 14nm but all the larger Pascal parts are built on TSMC 16nm. Does Samsung's process just suck for larger GPUs, and did Nvidia know?
|
# ? Jul 6, 2017 20:29 |
|
https://twitter.com/GamersNexus/status/883036964815032320 so I guess benchmarks with it like tomorrow or something (re hacked together Vega FE w/ water)
|
# ? Jul 6, 2017 20:32 |
|
Fauxtool posted:its already lower than 3, sell on ebay now, a lot of the local Craigslist demand has already started to die down and ebay wont last much longer than that Seems like too much trouble, especially since this card was an RMA replacement for an R9 290, so I've already lucked out. Better not push my luck. I'll wait for the 1180 or Vega 2.
|
# ? Jul 6, 2017 20:36 |
|
repiv posted:I wonder if it's pertinent that GP107 is built on Samsung 14nm but all the larger Pascal parts are built on TSMC 16nm. Does Samsungs process just suck for larger GPUs, and Nvidia knew? They have much higher volume so it is entirely likely
|
# ? Jul 6, 2017 20:38 |
|
sauer kraut posted:It may well be a 225W card if it's tuned sanely to peak out at ~1500 MHz, but of course they slammed full throttle to match a ref 1080 in reviews In the PCPer testing it was running at 1440MHz and still consuming close to 300W, I expect the cards with sufficient cooling to sustain 1600MHz to consume well over 300.
|
# ? Jul 6, 2017 20:53 |
|
MaxxBot posted:In the PCPer testing it was running at 1440MHz and still consuming close to 300W, I expect the cards with sufficient cooling to sustain 1600MHz to consume well over 300. Regardless of the "micro-throttle" theory, there is very clearly some kind of TDP throttle on the Vega FE blower cards. Blower cards run hotter than liquid-cooled cards. Hotter chips have higher voltage leakage than the same chip at a cooler temperature. Higher voltage leakage means higher TDP. I'd probably expect 50-75W higher or so - the 295X2 saves 86W over a pair of 290Xs that have a 516W average gaming TDP. The fact that a blower card is consuming 75W less than the AIO card says there must be a throttle in there somewhere, and has been right from the start. The only question is whether it's a regular old TDP throttle or a fancy Pascal-style micro-throttle. If you assume that the Vega FE liquid-cooled version can sustain 1600 MHz at 375W (which - it had better be able to), and the RX Vega cards are going to clock a little higher (Videocardz spotted a test result that says 1630), then the RX Vega cards will probably consume a bit more than 375W. Probably like 400-425W at stock, and then you can open up the power limit and do 500W+. edit: nevermind about PCPer vs GN, that chart didn't say what I remembered, snipped Paul MaudDib fucked around with this message at 21:12 on Jul 6, 2017 |
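The leakage argument above can be roughed out from the 295X2 figures it cites (the 86W saving and 516W pair draw are from the post; splitting the saving evenly per GPU is my simplification):

```python
# Rough per-GPU leakage saving from liquid cooling, using the 295X2 figures
# quoted in the post: two air-cooled 290Xs average 516W in gaming, while
# the liquid-cooled 295X2 (the same two GPUs) saves 86W, mostly because
# cooler silicon leaks less.
pair_290x_draw = 516.0           # W, two air-cooled 290Xs
saving_295x2 = 86.0              # W saved by the liquid-cooled 295X2
per_gpu_saving = saving_295x2 / 2  # crude even split: ~43W per GPU
print(f"Leakage saving from liquid cooling: ~{per_gpu_saving:.0f}W per GPU")
# By that logic a hotter blower card should draw MORE than the AIO card,
# so a blower Vega FE drawing 75W *less* implies a power limit, not physics.
```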
# ? Jul 6, 2017 21:01 |
|
Paul MaudDib posted:then you can open up the power limit and do 500W+. What an absolute trainwreck of a card.
|
# ? Jul 6, 2017 21:08 |
|
But that also proves how well built the AMD GPUs' VRMs are compared to NV's crappy only-300W ones on GP102.
|
# ? Jul 6, 2017 21:31 |
Paul MaudDib posted:Regardless of the "micro-throttle" theory there is very clearly some kind of TDP throttle on the Vega FE blower cards. That 1630 is static; 3DMark does not report actual clock speed, just what the chip claims its max clock is.
|
|
# ? Jul 6, 2017 21:43 |
|
AVeryLargeRadish posted:That 1630 is static, 3DMark does not report actual clock speed, just what the chip claims its max clock is. I don't think it matters in this case. I think it has to be power-throttling at 300W; it was predictable from the specs of the two cards even before we saw results. It's stopping at exactly 300W for all the reviewers, so this just has to be a power limit right now. We already know from the early reviews that it's not a thermal throttle. Either PCPer or GN (don't remember) tested it with 100% fans and it was a jet turbine, but the card didn't break 60C during the test. That's enough to rule out a thermal throttle for me. My assertion is that the 375W limit will let Vega FE sit at its rated clock limit, with all units enabled/whatever. Either it's power limited or this thing just doesn't even come close to its rated boost clocks during normal operation. Hopefully 375W is also enough for RX Vega to hit its full 1630 boost clocks, or they up the power limit a little further. Or at least, an extra 30 MHz should only eat a little more power (<50W). So in theory even if they shipped with a 375W limit you could cover the difference by increasing the power limit. Paul MaudDib fucked around with this message at 22:06 on Jul 6, 2017 |
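The "an extra 30 MHz should only eat <50W" claim checks out with a crude dynamic-power scaling estimate (power scales roughly with V² × f; the 375W/1600MHz baseline and the ~2% voltage bump are assumptions, not measurements):

```python
# Crude estimate of the power cost of going from 1600 to 1630 MHz.
# Dynamic power scales roughly with V^2 * f. The 375W-at-1600MHz baseline
# and the ~2% voltage bump are assumed numbers, not measured ones.
base_clock_mhz, base_power_w = 1600.0, 375.0
target_clock_mhz = 1630.0
voltage_scale = 1.02             # assumed small voltage increase for the bump
est_power = base_power_w * (target_clock_mhz / base_clock_mhz) * voltage_scale**2
print(f"Estimated draw at {target_clock_mhz:.0f} MHz: ~{est_power:.0f}W")
print(f"Delta over baseline: ~{est_power - base_power_w:.0f}W")  # well under 50W
```

Even with a generous voltage bump the delta comes out around 20-25W, comfortably inside the post's <50W estimate.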
# ? Jul 6, 2017 22:01 |
|
Can someone who's used a G10 bracket tell me whether these heatsinks will clear the bracket?
|
# ? Jul 6, 2017 22:37 |
|
Paul MaudDib posted:Can someone who's used a G10 bracket tell me whether these heatsinks will clear the bracket? Why do Nvidia cards not appear to have VRM sensors? Every AMD card I owned since the 4870s had them but this Gigabyte 1070 does not... Or is that specific to Gigabyte?
|
# ? Jul 6, 2017 22:56 |
|
Temp sensors? They're not standard. They do have voltage and current sensors that aren't exposed to end users. (The voltage graphs in Afterburner are reporting requested voltage, not measured.) I think EVGA is the only company with VRM temperature sensors.
|
# ? Jul 6, 2017 23:26 |
|
Actually you know what, Gelid makes a bolt-on kit for the VRMs, and according to the OC.net thread they work with the G10 (and the thread thinks they will even work on a 1080 Ti). There's a seller on eBay who has them for $17 shipped, I'll just grab something like these for the RAM (probably unnecessary) and call it a day. Too bad it'll be the slow boat from China, can't find anywhere in the US with that kit.
|
# ? Jul 6, 2017 23:26 |
|
Paul MaudDib posted:Can someone who's used a G10 bracket tell me whether these heatsinks will clear the bracket? Yes, I bought those exact ones. 14mm is good.
|
# ? Jul 6, 2017 23:45 |
|
Paul MaudDib posted:Can someone who's used a G10 bracket tell me whether these heatsinks will clear the bracket? They should be fine, but they're also unnecessary. You can get a big ol' bag of aluminum ones for cheaper and they work just fine. These are the ones I use on my 1080: http://www.ebay.com/itm/Lot-40-pcs-9x9x12mm-adhesive-Aluminum-blue-Heat-Sink-For-Memory-GPU-Chip-IC-PI-/141719145659 I threw these on the RAM because I had them and they're a little bigger and fit the RAM chips better: http://www.ebay.com/itm/LOT-OF-40-PCS-Aluminum-VGA-Card-Xbox360-DDR-RAM-Cooler-self-adhesive-Heatsink-/141264980186 e; remember you're going to need at least 8 for the VRMs, and then another 11 for the RAM if you want to throw sinks on those, too, so you're talking $40 for copper sinks vs $12 for aluminum ones with a handful to spare. DrDork fucked around with this message at 01:50 on Jul 7, 2017 |
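The cost comparison above works out like this (counts and rounded prices are the ones from the post):

```python
# Heatsink count and cost comparison from the post: 8 sinks for the VRMs
# plus 11 for the RAM, copper sinks vs a 40-piece bag of cheap aluminum ones.
vrm_sinks, ram_sinks = 8, 11
total_needed = vrm_sinks + ram_sinks      # 19 sinks in total
copper_cost = 40.0                        # ~$40 for enough copper sinks
aluminum_bag_cost = 12.0                  # ~$12 for the 40-pc aluminum bag
spares = 40 - total_needed                # the aluminum bag leaves 21 spare
print(f"Need {total_needed} sinks: copper ~${copper_cost:.0f} "
      f"vs aluminum ~${aluminum_bag_cost:.0f} ({spares} spares left over)")
```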
# ? Jul 7, 2017 01:46 |
|
DrDork posted:They should be fine, but they're also unnecessary. You can get a big 'ol bags of aluminum ones for cheaper and they work just fine. These are the ones I use on my 1080: I'm actually going to go with the Gelid VRM cooler; they've been around for a couple generations now and people seem to like them. $15 a bracket ain't bad if it helps spread heat a little vs no heatsinks, and that's no more expensive than getting some crappy heatsinks, so I'll just wait the 2-4 weeks for slowboat shipping. I ordered 2 in case I put one on a card for my fiance (or for SLI ) OK, so you would turn the bracket 90 degrees counterclockwise from the orientation in the pic to match the outline. You can see the middle of the three "arms" is a little longer than the others; that faces right on the board, I think. It looks like it has a cutout in the middle between the high-stage and low-stage VRM sections. I assume that's a bigass filtering capacitor for power stabilization? Doesn't need to be heatsinked? The DRAM chips are apparently 10x14mm according to the datasheet, so some of the cheap 0.5x0.5in heatsinks should be fine. Do the 14mm heatsinks fit the DRAM too, height-wise? Is there any real benefit to having the tall heatsinks or is just having stubby ones on there more than enough? (the memory doesn't put out much heat compared to the VRMs) Paul MaudDib fucked around with this message at 03:51 on Jul 7, 2017 |
# ? Jul 7, 2017 03:49 |
|
Paul MaudDib posted:I'm actually going to go with the Gelid VRM cooler Paul MaudDib posted:It looks like it has a cutout in the middle between the high-stage and low-stage VRM sections. I assume that's a bigass filtering capacitor for power stabilization? Doesn't need to be heatsinked? Paul MaudDib posted:Do the 14mm heatsinks fit the DRAM too, height-wise? Is there any real benefit to having the tall heatsinks or is just having stubby ones on there more than enough?
|
# ? Jul 7, 2017 04:05 |
|
Then I won't worry about it, I'll just use the Gelid bracket and let the DRAM be. I gotta get one of those thermal vision add-ons for my phone.
|
# ? Jul 7, 2017 04:27 |
|
Palladium posted:But that also proves how well built the AMD GPU's VRMs are compared to NV's crappy only-300W ones on GP102 The VRMs on the Vega FE at least are capable of delivering around 600W without breaking a sweat, and are roughly comparable to what you'd see on a Lightning/Kingpin/Hall of Fame card. lDDQD fucked around with this message at 05:26 on Jul 7, 2017 |
# ? Jul 7, 2017 05:24 |
|
I wonder if I could put that Gelid bracket on my 1080 Ti VRMs underneath the Arctic Accelero. The little stick-on heatsinks are lackluster.
|
# ? Jul 7, 2017 05:27 |
|
Looking forward to RX Vega HOF Edition with 5x 8-Pin PCI-E connectors.
|
# ? Jul 7, 2017 05:30 |
|
MaxxBot posted:Looking forward to RX Vega HOF Edition with 5x 8-Pin PCI-E connectors. "What's the most expensive part of your build?" The electric bill.
|
# ? Jul 7, 2017 05:38 |
|
Fruit Chewy posted:I wonder if I could put that gelid bracket on my 1080ti vrms underneath the arctic accelero. The little stick on heatsinks are lackluster. Find out, then let me know.
|
# ? Jul 7, 2017 05:44 |
|
Paul MaudDib posted:This is correct. There is no "default display" in Windows, if you don't have a monitor or a dummy-dongle then the display is actually not rendered at all. You can get Windows RDP to render to the video card and back out (Windows 10 or later, natch) in a high quality video format. Check out how to do it here. It's how I can game from home, and it's solid on a ~30mbit or faster connection @ 1440p. Running just fine on a 1060 6GB. e: if you wanna get fancy, you can redirect USB as well, and a plugged-in Xbone or BT controller will work. incoherent fucked around with this message at 06:27 on Jul 7, 2017 |
# ? Jul 7, 2017 06:21 |
|
lDDQD posted:The VRMs on the Vega FE at least are capable of delivering around 600W without breaking a sweat, and are roughly comparable to what you'd see on a Lightning/Kingpin/Hall of Fame card. This doesn't necessarily mean much when apparently a lot of RX 580s are built with VRMs capable of delivering ~320W, something a Polaris 10 can only achieve under LN2 IIRC. IMHO, this points to a maximum of ~400W power draw for Vega under normal conditions. Good loving lord, LMAO that's beyond worse than Polaris.
|
# ? Jul 7, 2017 06:44 |
|
It means you won't have to wait for a custom PCB if you want to do even the most extreme overclocking - reference design is probably as good as it's gonna get
|
# ? Jul 7, 2017 07:05 |
|
Think about the possibilities of Vega at 400W. You could open a crematorium that also mines buttcoins at the same time.
|
# ? Jul 7, 2017 07:10 |
|
Didn't they complete the testing of the watercooled FE yet?
|
# ? Jul 7, 2017 07:16 |
|
GRINDCORE MEGGIDO posted:Didn't they complete the testing of the watercooled Fe yet? Gamers Nexus finished their testing but the video hasn't been uploaded yet. EDIT: They just put this video up. https://www.youtube.com/watch?v=yudueaG5_rE
|
# ? Jul 7, 2017 07:17 |