  • Locked thread
EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I would also point out that for most users' tasks, K10, Nehalem and Yorkfield still work; hell, drop an AIO on them and crank up the clocks and they're still very serviceable for gaming. This doesn't mean you won't get anything out of newer generations of processors, but the oomph required for a smooth experience even today arrived around 2006-2010, and for most use cases it's better to focus on getting that performance at lower cost/power draw.

SwissArmyDruid
Feb 14, 2014

by sebmojo
http://techreport.com/news/30631/globalfoundries-adds-a-12-nm-node-to-its-fd-soi-roadmap

On the one hand, the last SOI node that GloFo ran was 32nm, and that was what gave us those wonderful K10 chips, and AMD lost a lot in having to go to bulk silicon with 28nm.

On the other hand, there is probably zero trust with GloFo right now.

Arzachel
May 12, 2012

SwissArmyDruid posted:

http://techreport.com/news/30631/globalfoundries-adds-a-12-nm-node-to-its-fd-soi-roadmap

On the one hand, the last SOI node that GloFo ran was 32nm, and that was what gave us those wonderful K10 chips, and AMD lost a lot in having to go to bulk silicon with 28nm.

On the other hand, there is probably zero trust with GloFo right now.

Apparently they also have a 22nm FDSOI node? Either way, 2019 is probably a very optimistic estimate and by then AMD will be looking to move to a smaller node.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Yeah, you're right. Frankly, I expect the next generation of consoles to land no later than 2018, and 14nm Zen/Vega APUs will power those just fine, no need for 12 or 10nm.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Consoles specifically could move up in performance just by using better existing technology than the current offerings; die shrinks alone aren't going to change that.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Arzachel posted:

Apparently they also have a 22nm FDSOI node? Either way, 2019 is probably a very optimistic estimate and by then AMD will be looking to move to a smaller node.

fdsoi is apparently cool and good in the IoT space so glofo being a merchant foundry pursuing it makes sense

Grey Area
Sep 9, 2000
Battle Without Honor or Humanity

SwissArmyDruid posted:

Yeah, you're right. Frankly, I expect the next generation of consoles to land no later than 2018, and 14nm Zen/Vega APUs will power those just fine, no need for 12 or 10nm.
Why would they make upgraded versions of the PS4/XB1 if they intend to launch all-new consoles soon?

Arsten
Feb 18, 2003

Grey Area posted:

Why would they make upgraded versions of the PS4/XB1 if they intend to launch all-new consoles soon?

:10bux:
Sony and Microsoft have wanted to shorten the 5+ year generation gap for years. I could see it being a "New Console -> Upgraded Console -> New Console" step pattern every 2.5 years.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
The main thing that the upgraded PS4/Xbox One are targeting is the ability to reliably render games at better than 1080p resolution, because 4k is going to be the new high end for a while. It'll probably be quite some time before the upgraded models get replaced.

SCheeseman
Apr 23, 2003

Arsten posted:

:10bux:
Sony and Microsoft have wanted to shorten the 5+ year generation gap for years. I could see it being a "New Console -> Upgraded Console -> New Console" step pattern every 2.5 years.

I see it more as New Console -> Upgraded Console -> Upgraded Console -> Upgraded Console.

PPC is dead (at least in the consumer space) and it would be stupid to break hardware compatibility again, particularly considering the ridiculous effort and money put into backwards compatibility through emulation. Unless some amazing new architecture comes up down the track both consoles are probably just going to stick to slightly customised PC parts as they are now.

Arsten
Feb 18, 2003

SCheeseman posted:

I see it more as New Console -> Upgraded Console -> Upgraded Console -> Upgraded Console.

PPC is dead (at least in the consumer space) and it would be stupid to break hardware compatibility again, particularly considering the ridiculous effort and money put into backwards compatibility through emulation. Unless some amazing new architecture comes up down the track both consoles are probably just going to stick to slightly customised PC parts as they are now.

I agree, I was referencing versioning more than architecture: PS4 -> PS4 Pro -> PS5 -> PS5 DoublePlusGood and so forth. I don't expect an actual architecture change anytime soon, either.

fishmech posted:

The main thing that the upgraded PS4/Xbox One are targeting is the ability to reliably render games at better than 1080p resolution, because 4k is going to be the new high end for a while. It'll probably be quite some time before the upgraded models get replaced.

I disagree simply because they make a lot of money on their consoles. By the end of the 5 years, people aren't buying them anymore because everyone who wanted one already had a PS3 when the PS4 was released. Short of intentionally making lovely consoles, which didn't work out so great for Sony, they'll keep an upgrade cycle of some sort rolling along.

Arsten fucked around with this message at 15:07 on Sep 9, 2016

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Arsten posted:

I disagree simply because they make a lot of money on their consoles.

Was this true even for the PS3 after Sony had to do the dramatic price cuts?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

fishmech posted:

The main thing that the upgraded PS4/Xbox One are targeting is the ability to reliably render games at better than 1080p resolution, because 4k is going to be the new high end for a while. It'll probably be quite some time before the upgraded models get replaced.

Most games won't be rendering at better-than-1080p resolution, they'll be upscaling 1080p. The ones that do still won't be rendering more than 1080p each frame, they will be using something like checkerboard rendering that interleaves multiple frames into a 4K buffer.

The original PS4 and XB1 were something like a 7850 and often had to render at 720p and upscale to 1080p to hit good framerates. From what I've heard the console respins will have Polaris 11 graphics, which is still probably no faster than a R9 285 at best. It does depend on the title, I could do a few games at 4K at low/medium on a R9 280, but most modern stuff is going to have to be upscaled.
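The checkerboard idea above can be shown with a toy sketch. This is only the sample pattern — real console implementations also reproject prior frames with motion vectors, and the grid size here is a stand-in, not 3840x2160:

```python
# Toy illustration of checkerboard rendering: two alternating
# half-sample passes together cover a full-resolution grid, so each
# frame only shades half the pixels of the output buffer.
W, H = 8, 4  # tiny stand-in for a 4K buffer

frame = [[None] * W for _ in range(H)]

def render_pass(parity):
    """Shade only the pixels whose (x + y) parity matches this pass."""
    for y in range(H):
        for x in range(W):
            if (x + y) % 2 == parity:
                frame[y][x] = parity

render_pass(0)  # frame N shades half the pixels...
render_pass(1)  # ...frame N+1 fills the complementary half
assert all(p is not None for row in frame for p in row)
```

Each pass touches W*H/2 pixels, which is why the per-frame shading cost stays near 1080p even though the output buffer is 4K.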

Paul MaudDib fucked around with this message at 15:34 on Sep 9, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Paul MaudDib posted:

Most games won't be rendering at better-than-1080p resolution, they'll be upscaling 1080p. The ones that do still won't be rendering more than 1080p each frame, they will be using something like checkerboard rendering that interleaves multiple frames into a 4K buffer.

The original PS4 and XB1 were something like a 7850 and often had to render at 720p and upscale to 1080p to hit good framerates. From what I've heard the console respins will have Polaris 11 graphics, which is still probably no faster than a R9 285 at best. It does depend on the title, I could do a few games at 4K at low/medium on a R9 280, but most modern stuff is going to have to be upscaled.

They have Polaris 10, so 2304 shaders @ 911 MHz. It's apparently enough to run Battlefront with reduced shadows at 1080p60 native, and it can actually do 4K30 medium native, although the checkerboard rendering method easily pushes this to 4K40-50.

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me

Watermelon Daiquiri posted:

For 450mm wafers, think about how tiling works. If you increase the area, you can fit more dies on it, and there is less wasted edge space since as the size of the die relative to the wafer decreases, a tiled circle of the dies fits closer to a circle.

Couldn't a manufacturer just put smaller die chips on the outside of the wafer to minimize tiling losses? Like GTX 1080 chips in the middle and GTX 1050 chips kind of feathering the edges?
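For scale, the edge-waste argument in the quoted post can be put in rough numbers with the standard dies-per-wafer approximation. This is a back-of-the-envelope sketch, not any fab's actual yield model:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Classic approximation: usable dies on a circular wafer.
    First term is the ideal area ratio; the second subtracts the
    partial dies lost along the circular edge."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def edge_utilization(wafer_diameter_mm, die_area_mm2):
    """Fraction of the ideal (area-only) die count actually realized."""
    r = wafer_diameter_mm / 2
    ideal = math.pi * r**2 / die_area_mm2
    return dies_per_wafer(wafer_diameter_mm, die_area_mm2) / ideal

# Same 100 mm^2 die: the bigger wafer wastes proportionally less edge.
print(edge_utilization(300, 100))  # ~0.91
print(edge_utilization(450, 100))  # ~0.94
```

The gap widens as the die gets larger relative to the wafer, which is the whole appeal of 450mm for big GPUs.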

NewFatMike
Jun 11, 2015

AM4 platform stuff is starting to ship from HP and Lenovo.

This Bristol Ridge stuff isn't sounding too bad, and it's nice to see a modernized chipset from AMD as well. LTT said it's kind of a beta test to make sure all the platform kinks are ironed out for the Zen release, but the APUs actually look like they might have merit on their own? Which I didn't expect.

I would love to see when the benches hit how it performs on things like Overwatch, CS:GO and [HOT MOBA] on their own. It gives me hope for a Zen APU based laptop with an eGPU capability.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Arsten posted:

I disagree simply because they make a lot of money on their consoles. By the end of the 5 years, people aren't buying them anymore because everyone had a PS3 when the PS4 was released. Short of intentionally making lovely consoles, which didn't work out so great for Sony, they'll keep an upgrade cycle of some sort rolling along.

No, they usually don't make much money on the consoles themselves. The lion's share of the money comes from the royalties on software sales. Making money directly on the console sales takes multiple years, since it happens once they've had time to reduce costs of production. When you're making upgraded systems, you lose that ability for a while.

Potato Salad
Oct 23, 2014

nobody cares


PBCrunch posted:

Couldn't a manufacturer just put smaller die chips on the outside of the wafer to minimize tiling losses? Like GTX 1080 chips in the middle and GTX 1050 chips kindof feathering the edges?

Noap. The order in which you run a wafer through different machines changes based on the chip you're processing. Each step is hideously expensive, so I strongly doubt you'd want to run extra jobs just to make a little more money at the fringe of the wafer. Source: friend who used to work in a POWER fab.

Haquer
Nov 15, 2009

That windswept look...

fishmech posted:

No, they usually don't make much money on the consoles themselves. The lion's share of the money comes from the royalties on software sales. Making money directly on the console sales takes multiple years, since it happens once they've had time to reduce costs of production. When you're making upgraded systems, you lose that ability for a while.

The PS4 was profitable from launch; the XBox One All In One Entertainment System from Microsoft, however, had several severe price slashes right out of the gate, trying to loss-lead its way to victory like in the 360 days, and it fell flat since it's a piece of poo poo

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

NewFatMike posted:

AM4 platform stuff is starting to ship from HP and Lenovo.

This Bristol Ridge stuff isn't sounding too bad, and it's nice to see a modernized chipset from AMD as well. LTT said it's kind of a beta test to make sure all the platform kinks are ironed out for the Zen release, but the APUs actually look like they might have merit on their own? Which I didn't expect.

I would love to see when the benches hit how it performs on things like Overwatch, CS:GO and [HOT MOBA] on their own. It gives me hope for a Zen APU based laptop with an eGPU capability.

Awesome. 5 years later, they've really gotten the most they can out of this architecture and their process finally seems mature. OEM Desktop sales have got to be a market shrinking at a decent clip, and AMD is having to claw their way up from very low market share.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Haquer posted:

The PS4 was profitable from launch, however the XBox One All In One Entertainment System from Microsoft had several severe price slashes right off the front with them trying to loss-lead their way to victory like in the 360 days and it fell flat since it's a piece of poo poo

The Xbox One is doing plenty fine, also the PS4 is the first console Sony's released in a long time that had a margin of profit at launch - and even then it was a very small one.

If you want to talk about a console that's actually tanking, try the Wii U which still hasn't sold as much as the XBO despite a year extra on sale and always being the cheapest.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Grey Area posted:

Why would they make upgraded versions of the PS4/XB1 if they intend to launch all-new consoles soon?

Other people have pretty much covered everything, but of particular note:

The 360/PS3 generation VASTLY overstayed its welcome. Where in the past, console generations were around 5-6 years long, that one was eight. And I know it doesn't *really* feel like it yet, but the PS4 and XBONE are both three years old at this point. If they were going to shorten up the period between consoles, and not let another PS3/360 situation develop, then yes, 2017-2018 is the right time frame to expect a new console, and right about now is the proper timing for the Slim versions of consoles to come out. I file the "4K" and "VR" versions of these consoles into basically the same category.

(You would be forgiven for not realizing that the consoles are three years old, that was my reaction when I did some wikipediaing after E3 this year.)

That said, especially since both major console players have opted to move onto x86 SoCs, development time and budgets for the new consoles should be vastly less than they were in generations past, which now enables more incremental upgrades, a la Apple. In some bleak dystopian future, we might even see Slot A-type processors make a comeback.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

SwissArmyDruid posted:

The 360/PS3 generation VASTLY overstayed its welcome.

On the other hand, late in cycle titles like The Last Of Us showed that we could have had way better looking games on that generation all along!

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

That said, especially since both major console players have opted to move onto x86 SoCs, development time and budgets for the new consoles should be vastly less than they were in generations past, which now enables more incremental upgrades, a la Apple. In some bleak dystopian future, we might even see Slot A-type processors make a comeback.

Basically, design from the start with the entire SoC+memory on an add-in board, so you can effectively sell a case+storage+form factor+power supply and have regular 2-year upgrades, or even a tiered system (basic is 1080p60 medium, enhanced is 1080p60 high, premium is 1080p60 ultra, etc.) as long as all games can be played on basic and there are no architectural differences that gently caress with development. In theory it's more profitable than the current model, and honestly kind of attractive as long as the OS is good and has M&KB support. Limitations obviously come down to thermal and power draw design constraints, but I could see ways around that even still.

Wonder if AMD's GMI interconnect would work as a bus interface for such a system?

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.

Potato Salad posted:

Noap. The order in which you run a wafer through different machines changes based on the chip you're processing. Each step is hideously expensive, so I strongly doubt you'd want to run extra jobs just to make a little more money at the fringe of the wafer. Source: friend who used to work in a POWER fab.

Yeah, this is generally true. You could technically do it, but if there are certain steps that only work on one of them, such as depositing a certain oxide or metal layer, you'd have to block off the other one, and that would take the aforementioned expensive (money and time) litho step. Also, if you had two very dissimilar products with different BEOL layers, you'd have to somehow get the pads to go down to the shorter die, which would again take more litho steps, because I can think of no self-aligning possibilities here.
Fabs do do patched-together dies, but they all have to have the same processing steps, or very few differences.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

fishmech posted:

No, they usually don't make much money on the consoles themselves. The lion's share of the money comes from the royalties on software sales. Making money directly on the console sales takes multiple years, since it happens once they've had time to reduce costs of production. When you're making upgraded systems, you lose that ability for a while.

I think it's also a fairly cost-effective way to amortize the design and setup cost of other designs. You can take an off-the-shelf Jaguar laptop core and an off-the-shelf GCN engine and rubberstamp them onto a die, costing relatively little in the way of design/testing time and helping get you volume discounts with GloFo. Obviously it costs money to set up custom litho masks, but design engineering time and process development costs are in the mix too.

Paul MaudDib fucked around with this message at 23:30 on Sep 9, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo

Watermelon Daiquiri posted:

Yeah, this is generally true. You could technically do it, but if there are certain steps that only work on one of them, such as depositing a certain oxide or metal layer, you'd have to block off the other one and that would take the aforementioned expensive (money and time) litho step. Also, if you had two very dissimilar products with different BEOL layers, you'd have to somehow get the pads go down to the shorter die, which would again take more litho steps, cause I can think of no self aligning possibilities here.
Fabs do do patched together dies, but they all have to have the same processing steps, or very few differences.

AMD will have to figure that out themselves if they haven't already. The potential for problems with Vega/Navi/Zen dies not being the same thickness as HBM stacks is not insignificant, if they intend to build chips as MCMs like I expect.

SwissArmyDruid fucked around with this message at 04:25 on Sep 12, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo
Speaking of consoles, there's an article by The Guardian speaking with Playstation boss Andrew House, which hints at how Playstation is going to do their hardware going forward.

Naturally, if it works out well for them expect Microsoft to follow suit.

https://www.theguardian.com/technology/2016/sep/08/playstation-boss-andrew-house-ps4-pro-our-approach-isnt-reactive-this-time-around

It's just a shame that even if AMD wanted to really bring all their new technology to bear with whatever comes after the PS4 Pro, there's really no reason for variable refresh rates on non-gaming or non-interactive content.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SwissArmyDruid posted:

Other people have pretty much covered everything, but of particular note:

The 360/PS3 generation VASTLY overstayed its welcome. Where in the past, console generations were around 5-6 years long, that one was eight. And I know it doesn't *really* feel like it yet, but the PS4 and XBONE are both three years old at this point. If they were going to shorten up the period between consoles, and not let another PS3/360 situation develop, then yes, 2017-2018 is the right time frame to expect a new console, and right about now is the proper timing for the Slim versions of consoles to come out. I file the "4K" and "VR" versions of these consoles into basically the same category.

(You would be forgiven for not realizing that the consoles are three years old, that was my reaction when I did some wikipediaing after E3 this year.)

That said, especially since both major console players have opted to move onto x86 SoCs, development time and budgets for the new consoles should be vastly less than they were in generations past, which now enables more incremental upgrades, a la Apple. In some bleak dystopian future, we might even see Slot A-type processors make a comeback.

The slim version of the Xbox One came out over the summer, and it's pretty much just a die-shrink (which they took advantage of to overclock it a little which allows it to handle UHD Blu-Ray playback, but has minimal impact on gaming performance).

The upgraded consoles that are coming out next year though, are quite a bit beyond that. They'll have actual improved everything though they'll also take advantage of die-shrink/process-shrink stuff.

It's also worth remembering that even though the Wii U is a full 4 years old, coming out in 2012, its hardware is basically the 11 year old Xbox 360 hardware in terms of performance, albeit helped by having 4 GB of RAM to work with instead of 512 MB.

Anime Schoolgirl
Nov 28, 2002

not even that, it's 2gb, of which 1gb is used by the OS :downsgun:

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Anime Schoolgirl posted:

not even that, it's 2gb, of which 1gb is used by the OS :downsgun:

:psyduck:

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Anime Schoolgirl posted:

not even that, it's 2gb, of which 1gb is used by the OS :downsgun:

It's a lot more complicated than that (games can access up to 1.5 GB of the RAM in certain situations, and a lot of that "OS RAM" is actually used to make sure the tablet display can be handled at the same time as the main TV display, so the true amount of RAM reserved to the OS is lower than it first appears), but there was OS-reserved memory in the 360 as well. Specifically, the 360 always has at least 32 of its 512 MB of RAM reserved for OS use, to support the common "guide" UI that lets you switch games, return to dashboard, view acheivements, play external music, etc. Depending on some of those features it can then take additional RAM and the game has to deal with it.

The PS3 reserved a similar amount, and in certain conditions would use up to 96 MB of its own 512 MB of RAM for the OS while a game was in progress, though usually it hung around 32 MB like the 360.
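The reserved-vs-game split quoted in these posts works out like so. The figures are the approximate worst-case ones cited in the thread (the Wii U's flat 1 GB is the oversimplification the post above pushes back on), not official spec-sheet numbers:

```python
# Memory budgets as quoted upthread (approximate, worst case).
consoles = {
    "Xbox 360": {"total_mb": 512,  "os_mb": 32},
    "PS3":      {"total_mb": 512,  "os_mb": 96},
    "Wii U":    {"total_mb": 2048, "os_mb": 1024},
}

for name, m in consoles.items():
    game = m["total_mb"] - m["os_mb"]
    print(f"{name}: {game} MB for games "
          f"({100 * game / m['total_mb']:.0f}% of total)")
```

Even with half its RAM reserved, the Wii U leaves games roughly double what the 360 did, which is the point being made about its otherwise 360-class hardware.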

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Didn't the 360 have 512mb combined ram, and ps3 have 256mb system and 256mb vram?

Drakhoran
Oct 21, 2012

Yes, the PS3 had 256 MB of XDR DRAM system memory and 256 MB of GDDR3 video memory. The 360 had a single pool of 512 MB GDDR3.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
AMD has announced their high end chipset, it's the X370. I can't wait for the confusion with Cannonlake's H370 and Z370 chipsets (AMD you dicks). Wide availability by February, sounds like a Zen launch at CES 2017. Probably launches with Vega 10/11 as well?

SwissArmyDruid
Feb 14, 2014

by sebmojo
WELL I'VE BEEN WAITING FOR ALMOST TWO YEARS TO BUILD A NEW COMPUTER, WHAT'S ANOTHER TWO MONTHS. I mean, I am basically "I can feel my back teeth floating" levels of anticipation for Zen/Vega APUs.

(One benefit of the gimpy workstation GPU: I am not tempted to buy the new games as they come out, because I flat-out literally can't play them. Tried DOOM, it's a sludgey 24 FPS mess. And that's an example of a game with EXCELLENT optimization. Not even going to try with DXMD, and Dishonored 2 is probably a flat no.)

SwissArmyDruid fucked around with this message at 14:06 on Sep 12, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo
http://www.extremetech.com/extreme/235446-samsung-may-be-in-talks-with-amd-nvidia-to-license-graphics-ip

Salt now, so you're not salty later.

I would leave a reminder here that AMD sold their smartphone graphics off to Qualcomm, which is why "Adreno" is an anagram of "Radeon".

That said, my money's on AMD because they have the necessary embedded and semicustom SoC experience that Nvidia doesn't.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I'd also point out that Samsung can get a better deal out of AMD, they have a better working relationship, and AMD's already using their node/process. Nvidia's advantage is proven power efficiency (may not mean much for a FP16 mobile focused part anyway), and proven integration of graphics IP into an ARM SoC. I think a problem here might be that Nvidia and Samsung compete in the same sectors wrt mobile technology, AMD doesn't.

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
Nvidia's history in the console space is pretty poor, they are a competitor to Samsung in the mobile space (the tablet chip part at least), and Nvidia doesn't have a history of playing nice when it comes to sharing IP/chip designs. Samsung already has a relationship with AMD.

AMD seems to be the only logical partner for Samsung.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Also, pairing with Nvidia would probably mean getting your CPU bits from a different source, and since Intel doesn't want to touch custom silicon in any way, shape, or form, well.
