eames
May 9, 2009

I'd say Vega is running way, way, way past the point of diminishing returns just to compete with Nvidia, but it secretly scales down very well. Just like an undervolted 8-core Ryzen can get by with ~45W at 3 GHz but needs over 200W at 4.2 GHz/1.45V. If that is not the case then APUs are going to be a disaster.

OTOH maybe the rumors are true and Intel cooperates with AMD for an APU — SL-X + Vega would make for a nice 1kW TDP APU for the new iMac Pro.
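For what it's worth, the Ryzen numbers quoted above line up with the usual dynamic-power rule of thumb, P ≈ C·f·V². A back-of-the-envelope sketch (the constant C and the 0.90V undervolt figure are my own guesses, picked so the low point lands near ~45W; the interesting part is the ratio, not the absolute numbers):

```python
# Dynamic power scales roughly as P = C * f * V^2 (leakage ignored).
# C is a fudge constant chosen so the 3.0 GHz point lands near ~45W;
# what matters is the ratio between operating points, not the absolute watts.

def dynamic_power(freq_ghz, volts, c=18.5):
    """Rough dynamic-power estimate in watts."""
    return c * freq_ghz * volts ** 2

low = dynamic_power(3.0, 0.90)   # undervolted 8-core Ryzen
high = dynamic_power(4.2, 1.45)  # pushed to the wall

print(f"{low:.0f}W at 3.0 GHz/0.90V vs {high:.0f}W at 4.2 GHz/1.45V "
      f"-> {high / low:.1f}x the power for {4.2 / 3.0 - 1:.0%} more clock")
```

Roughly 3.6x the power for 40% more clock, which is why the "it scales down fine" hope isn't crazy on its face.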

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

eames posted:

I'd say Vega is running way, way, way past the point of diminishing returns just to compete with Nvidia, but it secretly scales down very well. Just like an undervolted 8-core Ryzen can get by with ~45W at 3 GHz but needs over 200W at 4.2 GHz/1.45V. If that is not the case then APUs are going to be a disaster.

OTOH maybe the rumors are true and Intel cooperates with AMD for an APU — SL-X + Vega would make for a nice 1kW TDP APU for the new iMac Pro.

My 7700K gets 4.2GHz with turbo disabled at 1.08V, which caps out at 50 watts, so yeah, Intel is still the better value.

Yaoi Gagarin
Feb 20, 2014

Zero VGS posted:

My 7700K gets 4.2GHz with turbo disabled at 1.08V, which caps out at 50 watts, so yeah, Intel is still the better value.

But you're getting 50W at 4.2GHz on just four cores; he's describing an 8-core CPU at 3.0GHz. That seems like a better deal to me.

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

Paul MaudDib posted:

Which it will be anyway, because :wtc:. Seriously AMD what the gently caress happened? No marketing talk, we need an honest answer: something went horribly wrong here, this is like at least double the design TDP from that slide, what the gently caress happened? Did someone get doritos on the masks?

To expand on my previously nutty theory:

Late last year they realised they had a problem with their silicon so they had to delay while they worked on a solution. February comes around and they still don't have a solution, and it is uncertain how much longer it will take to find one.

The marketing team meanwhile proposes at least meeting the 1H 2017 deadline by releasing a card for a limited use case that can be met by the bad silicon they can manufacture; this goes on to become the Vega FE, a card for nobody, desired by nobody.

Meanwhile March rolls around and the engineering team finally have a fix for the silicon so now they can start planning the release of RX Vega sometime in late August to early September.

So to reiterate, I don't actually know poo poo about how you go from having working silicon to mass manufacturing it or how long that takes. This just represents the best case scenario as I see it.

The worst case scenario is that someone in RTG saw bulldozer and said "so you think that's bad? watch this!"

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Measly Twerp posted:

To expand on my previously nutty theory:

Late last year they realised they had a problem with their silicon so they had to delay while they worked on a solution. February comes around and they still don't have a solution, and it is uncertain how much longer it will take to find one.

The marketing team meanwhile proposes at least meeting the 1H 2017 deadline by releasing a card for a limited use case that can be met by the bad silicon they can manufacture; this goes on to become the Vega FE, a card for nobody, desired by nobody.

Meanwhile March rolls around and the engineering team finally have a fix for the silicon so now they can start planning the release of RX Vega sometime in late August to early September.

So to reiterate, I don't actually know poo poo about how you go from having working silicon to mass manufacturing it or how long that takes. This just represents the best case scenario as I see it.

The worst case scenario is that someone in RTG saw bulldozer and said "so you think that's bad? watch this!"

Wrong. The worst case scenario is they think Vega is good as is and they will continue along this path because of that.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
How do you get silicon that works, but consumes 2x to 3x as much power as its original target? Like, from a design perspective how is it not molten slag trying to run that much more power? How is anything functioning at all?

I mean, needing that much power doesn't seem to jibe with PCPer's results, which show tiny performance changes as Vega loops back and forth between 200W and 320W while the reported shader clock doesn't budge. How the gently caress can something be this badly broken, yet still work?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Raja is really proud of the new diagnostic board for Vega because it means the engineers working on it will have more time to reach their bunker. Rest in peace engineering team six.

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

AVeryLargeRadish posted:

Wrong. The worst case scenario is they think Vega is good as is and they will continue along this path because of that.

*shudders*

Meanwhile I've been trying to read the AnandTech forum discussions of Vega and I have to say that the moderators there are truly the most awful, autistic (in the sense that they are incapable of relating to other humans) bunch of wankers I've ever seen moderate a forum. Post PCOS Bill style shitposts in the Vega thread? No problem! Suggest that a mod should do something about it? Watch it, or you'll get permabanned.

I really don't want to be there, but generally there's interesting discussion so...

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
PCPer kept the power limit at default, which limited it to around 300W and a clock speed around 1450MHz. Buildzoid put the limit up to the max, which let it draw 400W+ and reach 1650MHz, but it's still power throttling.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
vega is just way ahead of its time and modern day power delivery isn't enough. Really they are visionaries

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

FaustianQ posted:

How do you get silicon that works, but consumes 2x to 3x as much power as its original target? Like, from a design perspective how is it not molten slag trying to run that much more power? How is anything functioning at all?

I mean, needing that much power doesn't seem to jibe with PCPer's results, which show tiny performance changes as Vega loops back and forth between 200W and 320W while the reported shader clock doesn't budge. How the gently caress can something be this badly broken, yet still work?

Gotta remember that that is at default settings. It is probably doing some form of invisible throttling: these chips can update their clock rate 1000 times per second but software only samples once per second, so clocks can appear higher than they actually are. There is also the possibility of shutting down shader units to stay under certain power limits. The thing is, either it scales horrendously badly as clocks increase, or it is being power throttled at the cost of performance. The really high wattage figures assume it is power throttling and would scale better if it had enough power; it could just be that it scales like poo poo no matter what.
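The sampling mismatch described above is easy to demonstrate: if the chip flips clock states 1000 times a second and the monitoring tool polls once a second, the reported clock is just whichever state the sampler happens to catch, not the true average. A toy sketch (the clock values and 50% duty cycle are invented for illustration):

```python
import random

random.seed(1)

# Toy model: the GPU alternates between a boost state and a throttled state
# up to 1000 times per second; monitoring software reads the clock once per second.
BOOST, THROTTLED = 1600, 1200  # MHz, invented numbers

def clock_states(duty_cycle=0.5, n=1000):
    """One second of internal clock states at 1 kHz."""
    return [BOOST if random.random() < duty_cycle else THROTTLED for _ in range(n)]

states = clock_states(duty_cycle=0.5)
true_avg = sum(states) / len(states)
sampled = states[-1]  # the single state the 1 Hz poll happens to land on

print(f"true average clock: {true_avg:.0f} MHz, sampled reading: {sampled} MHz")
```

The 1 Hz reading is always one of the two extremes, so a tool can report a steady boost clock while the chip spends half its time throttled.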

Craptacular!
Jul 9, 2001

Fuck the DH
Someday all houses will come standard with a second breaker box for our AMD-powered gaming systems. We just need the driver team to catch up so the FineWine kicks in. #betterred

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Craptacular! posted:

Someday all houses will come standard with a second breaker box for our AMD-powered gaming systems. We just need the driver team to catch up so the FineWine kicks in. #betterred

Really it's just calling out our inferior US wiring standards. If we used 240V for everything like the UK this wouldn't be an issue!

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

DrDork posted:

Really it's just calling out our inferior US wiring standards. If we used 240V for everything like the UK this wouldn't be an issue!

Until RX Vega 2 comes out which is the same silicon running at 2GHz with a 1kW TDP, almost matching the 1080 Ti in performance!

rex rabidorum vires
Mar 26, 2007

KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN
Literally everything about the Vega launch is :tif:. Like, I have nothing to contribute other than thanks for validating buying a 980 Ti second hand as the 1070s disappeared to buttmining.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

MaxxBot posted:

Until RX Vega 2 comes out which is the same silicon running at 2GHz with a 1kW TDP, almost matching the 1080 Ti in performance!

Pfft. Most UK sockets are at least 13A (if not 20A), so they can happily pump out the required 3kW for your RX Vega 2 Crossfire setup!

Us sad ungrateful traitors, on the other hand, are lucky to be able to pull ~1.6kW off a circuit, so it'll be back to running extension cords from other rooms to make it all work.
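The breaker math here is just P = V × I; a quick sketch with my own derating assumption (US branch circuits are commonly limited to 80% of breaker rating for continuous loads):

```python
# P = V * I. The 0.8 derate models the common US 80% continuous-load rule.
def circuit_watts(volts, amps, derate=1.0):
    return volts * amps * derate

uk = circuit_watts(230, 13)        # standard UK 13A fused plug
us = circuit_watts(120, 15, 0.8)   # typical US 15A circuit, derated

print(f"UK 13A socket: {uk/1000:.2f} kW; US 15A circuit (derated): {us/1000:.2f} kW")
```

That gives ~2.99 kW on the UK side and ~1.44 kW on the US side; the exact US figure lands anywhere between ~1.4 and ~1.9 kW depending on breaker size and how you derate.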

GRINDCORE MEGGIDO
Feb 28, 1985


Holy poo poo
How is this better than fat Polaris

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

GRINDCORE MEGGIDO posted:

Holy poo poo
How is this better than fat Polaris

Probably a better heating solution? :shrug:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

How do you get silicon that works, but consumes 2x to 3x as much power as its original target? Like, from a design perspective how is it not molten slag trying to run that much more power? How is anything functioning at all?

Whatever hardware they need for the Primitive Shaders or Draw-Stream-Binning Rasterizer is broken. They thought they'd be pushing a buttload more geometry and they've either killed the PS/DSBR entirely or dropped it onto the shader array or something. They thought they would be making Maxwell-sized gains and instead they have Super Fiji, literally.

I think Polaris is probably their version of Maxwell so to speak - it's a little more stripped-down for gaming than they'd like for the compute market, and they went for broke with this attempt at a compute-centric architecture instead. I mean it literally does perform just like Super Fiji, even less optimized than Polaris core for core. Now with software emulation mode maybe.

I don't see what else it can be.

Paul MaudDib fucked around with this message at 03:08 on Jul 11, 2017

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

eames posted:

I'd say Vega is running way, way, way past the point of diminishing returns just to compete with Nvidia, but it secretly scales down very well. Just like an undervolted 8-core Ryzen can get by with ~45W at 3 GHz but needs over 200W at 4.2 GHz/1.45V. If that is not the case then APUs are going to be a disaster.

OTOH maybe the rumors are true and Intel cooperates with AMD for an APU — SL-X + Vega would make for a nice 1kW TDP APU for the new iMac Pro.

Not to mention full GP104/GP106 can go all the way down to 100W/60W while dropping only ~20% of performance. NV could easily have gotten way better perf/W on Pascal with a wider design but chose to save on die size.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
This thread deserves a better class of shitposting

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

PerrineClostermann posted:

This thread deserves a better class of shitposting

These things are gonna do great in Iceland. In the Ethereum farms :unsmigghh:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

rex rabidorum vires posted:

Literally everything about the Vega launch is :tif:. Like, I have nothing to contribute other than thanks for validating buying a 980 Ti second hand as the 1070s disappeared to buttmining.

Oh no, I'm locked into a system with consistent quality and performance while AMD does... a thing.


but my freesync

Craptacular!
Jul 9, 2001

Fuck the DH

Paul MaudDib posted:

These things are gonna do great in Iceland. In the Ethereum farms :unsmigghh:

You have them mixed up. Iceland can't cool a Vega, Greenland can.

NewFatMike
Jun 11, 2015

Pleeeeaaaase don't gently caress up the APUs, Raj. I don't think my heart can take it. :(

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
So does Raj deserve any blame for this or not? Was Vega basically finalized when he came on?

Cygni
Nov 12, 2005

raring to post

PerrineClostermann posted:

This thread deserves a better class of shitposting

kyro2 was the best videocard ever made

NewFatMike
Jun 11, 2015

MaxxBot posted:

So does Raj deserve any blame for this or not? Was Vega basically finalized when he came on?

That would make me happy, but I think he would have been involved with the 225W slide from earlier in the year.

wargames
Mar 16, 2008

official yospos cat censor

PerrineClostermann posted:

This thread deserves a better class of shitposting



But yeah, there are no redeeming features of Vega other than to teach AMD how not to make a card.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

wargames posted:



But yeah, there are no redeeming features of Vega other than to teach AMD how not to make a card.

Come on now, we haven't seen RX Vega yet and driver updates might make more difference than we think, there's still at least some hope.

:rimshot:

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
It's mind boggling how badly AMD is bungling Vega.

But hey, at least the used GPU market is heading toward a crash soon.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

GRINDCORE MEGGIDO posted:

Holy poo poo
How is this better then fat polaris

It's not; all signs point toward Fat Polaris being smaller, consuming less power, putting out less heat, and outperforming Vega. But, like Paul has mentioned several times, AMD/RTG does not have the cash to run two market-optimized chip lines, so they do one that does both poorly. Polaris seems to be much more gaming oriented, so when everyone at AMD piled into the board room to discuss what they could do to substantially increase revenue and market share, they had Fat Polaris, which could work decently in the gaming market but wouldn't really have a future in AI/datacenter/pro use. They dropped it in favor of Vega, which seems designed to specifically target those markets. Except it's a failure in that regard and the Pro Duo is still a better solution, so now they have no money and no way to recover.

It cannot be overstated how dead RTG and the graphics division are with this release. AMD the CPU maker and APU maker will be fine, but for dGPU it looks like Radeon is dead. It takes way too much money and effort to keep a dGPU division going.

Truga
May 4, 2014
Lipstick Apathy

Paul MaudDib posted:

but my freesync

This but unironically. If I can get 1080 performance out of this like reports suggest, and also save $150 on freesync vs gsync, I'll do it.

I guess it depends on how they price it, as usual?

wargames
Mar 16, 2008

official yospos cat censor

FaustianQ posted:

It cannot be overstated how dead RTG and the graphics division are with this release. AMD the CPU maker and APU maker will be fine, but for dGPU it looks like Radeon is dead. It takes way too much money and effort to keep a dGPU division going.

I doubt they will close down the dGPU side of things; at this point they are probably hoping Navi will turn out like Ryzen did.

nerdrum
Aug 17, 2007

where am I

Truga posted:

This but unironically. If I can get 1080 performance out of this like reports suggest, and also save $150 on freesync vs gsync, I'll do it.

I guess it depends on how they price it, as usual?

I absolutely think having a device that at nearly 500 watts of power draw is still slower than an overclocked 1070 optimized for a workflow that doesn't exist in any industry outside of motion graphics is a great choice as well and I definitely am very pumped to have 25hz of usable freesync range while my apartment's wall outlets are melting if I attempt stock clocks on a graphics card that shipped a year late competing with a mid tier product that generates 30% of its powerdraw overclocked to its maximum settings.

God bless the RTG team for this incredible product; this 1080p IPS monitor I bought five years ago is still going strong!

edit: also, LOL if you genuinely think this is a shippable product at anything under $449 with any margin that shareholders wouldn't crucify AMD for.

nerdrum fucked around with this message at 09:45 on Jul 11, 2017

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Truga posted:

This but unironically. If I can get 1080 performance out of this like reports suggest, and also save $150 on freesync vs gsync, I'll do it.

I guess it depends on how they price it, as usual?

Hey, actually that would be one way this could get worse: "Introducing RX Vega, the fastest AMD GPU ever! Available in stores now for only $699.99!" :v:

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

nerdrum posted:

I absolutely think having a device that at nearly 500 watts of power draw is still slower than an overclocked 1070 optimized for a workflow that doesn't exist in any industry outside of motion graphics is a great choice as well and I definitely am very pumped to have 25hz of usable freesync range while my apartment's wall outlets are melting if I attempt stock clocks on a graphics card that shipped a year late competing with a mid tier product that generates 30% of its powerdraw overclocked to its maximum settings.

God bless the RTG team for this incredible product; this 1080p IPS monitor I bought five years ago is still going strong!

The RX 580 is 100% viable for 1080p Freesync though, and the RX 580 isn't thermite concealed as a GPU. It's idiots like me who have a 1440p or even a 4K monitor who have problems.

nerdrum
Aug 17, 2007

where am I

FaustianQ posted:

The RX 580 is 100% viable for 1080p Freesync though, and the RX 580 isn't thermite concealed as a GPU. It's idiots like me who have a 1440p or even a 4K monitor who have problems.

https://www.amazon.com/ASUS-MG279Q-Screen-LED-Lit-Monitor/dp/B00ZOO348C/ref=zg_bsnr_1292115011_5?tag=amazon0606-20

freesync range: 30-90Hz
price: $561.80

https://www.amazon.com/Acer-Predato...ords=ips+g+sync

gsync range: 30-165hz
price: $699.00

$140 gets you a 45% increase in usable refresh rate range. This isn't even factoring in that the Dell 24/27 TN gsync setups at a $400 price point with 20 minutes of configuration are some of the most color accurate non-professional monitors on the market

but I guess it doesn't matter since there isn't a Radeon product that will crack 90fps in any possible scenario outside of the Ashes benchmark at 1440p :c00lbutt:

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

nerdrum posted:

https://www.amazon.com/ASUS-MG279Q-Screen-LED-Lit-Monitor/dp/B00ZOO348C/ref=zg_bsnr_1292115011_5?tag=amazon0606-20

freesync range: 30-90Hz
price: $561.80

https://www.amazon.com/Acer-Predato...ords=ips+g+sync

gsync range: 30-165hz
price: $699.00

$140 gets you a 45% increase in usable refresh rate range. This isn't even factoring in that the Dell 24/27 TN gsync setups at a $400 price point with 20 minutes of configuration are some of the most color accurate non-professional monitors on the market

but I guess it doesn't matter since there isn't a Radeon product that will crack 90fps in any possible scenario outside of the Ashes benchmark at 1440p :c00lbutt:

The MG279Q can be updated to do 30-110Hz or 54-144Hz: http://nils.schimmelmann.us/post/133778060542/modded-asus-mg279q-drivers-with-60-144-hz-freesync

But my point was more that I've had the MG279Q since 2015, was actually hoping to get something good with the 400 series release, then Vega, and now I'm stuck with this thing and nothing to properly push it. Like I'm not disputing the higher resolutions at all, so I'm not sure what the point of your post was in response to mine? I'm not throwing down another $700 just to switch to Gsync at this point, lmao.

Truga
May 4, 2014
Lipstick Apathy

nerdrum posted:

I absolutely think having a device that at nearly 500 watts of power draw is still slower than an overclocked 1070 optimized for a workflow that doesn't exist in any industry outside of motion graphics is a great choice as well and I definitely am very pumped to have 25hz of usable freesync range while my apartment's wall outlets are melting if I attempt stock clocks on a graphics card that shipped a year late competing with a mid tier product that generates 30% of its powerdraw overclocked to its maximum settings.

I'm terribly sorry about your power grid from literally 1890? I've had a >500W machine way back when multi-GPU was required for 2560x1600 to run smoothly, and I had zero issues with power. Power is super cheap here too :shrug:

Hell, my 980Ti draws >400W when I want to play elite in VR or some newer games on ultra smoothly.

quote:

edit: also, LOL if you genuinely think this is a shippable product at anything under $449 with any margin that shareholders wouldn't crucify AMD for.

Are you one of the people who isn't aware AMD probably pays under $100 per unit? A full RX480 card cost them $30 when it was new. If they want to sell it competitively against the 1080, they will. If they don't, too bad. This 980Ti can hold me over a month or two more until the buttcoin bubble crashes, and that's when I'll be shopping in either case.


nerdrum posted:

$140 gets you a 45% increase in usable refresh rate range.

lol no it doesn't https://www.newegg.com/Product/Product.aspx?Item=0JC-00A3-00001
