|
Cygni posted:Only TSMC/AMD know those numbers now, I bet. That's hilariously proprietary info, if that leaked or was released, it would tell Intel and all the foundry competitors a shitload about AMDs tech and TSMCs process technology. They might give some roundabout throw-away numbers, but any kind of detail will be 'lose your job, and then the lawsuit' tier proprietary.
|
# ? Jun 13, 2019 23:55 |
|
|
iospace posted:Would it be worth shelling out the extra 35 or so bucks for a 2200G over an Athlon 220G? What are your use-cases? If it's primarily going to be a single-tasking office, web-browsing, or HTPC system then I'd say probably no. If it's going to do any kind of multi-tasking or 3D-intensive application (games) then absolutely yes.
|
# ? Jun 14, 2019 00:32 |
|
To others running Linux on ASRock boards, heads up: I've updated to the most recent BIOS on 2 boards now (Fatality B450 MITX, and the A300 STX mini desktop), and in both cases my network interfaces have been re-enumerated. The B450 board's wireless interface moved from wlp36s0 to wlp7s0; the A300's wireless moved from wlp46s0 to wlp2s0. Not the worst thing in the world, but I spent several minutes wondering how the gently caress a BIOS update broke my internet.
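For anyone else bitten by this: the "predictable" interface names (wlp36s0 etc.) are derived from PCI bus topology, so a BIOS update that re-enumerates the PCI bus renames the interface. One way to make the name survive future updates is a systemd .link file that matches on MAC address instead. A rough sketch; the MAC and the name wlan0 are placeholders, and it writes to /tmp here so you can try it unprivileged (the real destination is /etc/systemd/network/):

```shell
# Pin the adapter's name by MAC address so PCI re-enumeration can't move it.
# aa:bb:cc:dd:ee:ff and wlan0 are placeholders; get the real MAC with:
#   ip link show <current-interface-name>
# Writing to /tmp so this sketch runs unprivileged -- the real destination
# is /etc/systemd/network/10-wifi.link
dest="${LINK_DIR:-/tmp}/10-wifi.link"

cat > "$dest" <<'EOF'
[Match]
MACAddress=aa:bb:cc:dd:ee:ff

[Link]
Name=wlan0
EOF
```

Once the file is in /etc/systemd/network/ and you reboot (or re-trigger udev), the interface keeps the pinned name no matter how the BIOS shuffles the PCI slots.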
|
# ? Jun 14, 2019 00:39 |
|
Mr.Radar posted:What are your use-cases? If it's primarily going to be a single-tasking office, web-browsing, or HTPC system then I'd say probably no. If it's going to do any kind of multi-tasking or 3D-intensive application (games) then absolutely yes. Probably going to be a capture PC if I do go with the upgrade.
|
# ? Jun 14, 2019 00:47 |
|
Cygni posted:The parts in the stack are all price engineered anyway. You aren't actually paying that price difference from top to bottom of the stack because of any real differences in production cost (with the exception of the 2 die parts obvi). You are paying that price difference because AMD thinks it can get you to pay it. The bill of materials for a 3800X will be something like $40 with the HSF and box, if that. But AMD needs to recoup those development costs, and you do that by charging a premium on the high end. This stuff is pretty fascinating to me. Like how the better parts of a wafer are used for higher end parts and the less-good chunks are put in lower end parts. Are there any videos about how modern processors are made and go through all this? How do they get those giant wafers, why is the quality variable, why are the dudes in white suits in some impossibly sterilized room, what makes a processor high end or low end, how did they fit billions of transistors into something when one was gigantic in the 1950s? Basically a more detailed How It's Made: Processors.
|
# ? Jun 14, 2019 00:59 |
|
I think it was someone in this thread actually that recommended this talk, but it's pretty good, and even this engineer describes the process as "indistinguishable from magic". Also there are a ton of trade secrets he has to talk around, which is pretty interesting. https://www.youtube.com/watch?v=NGFhc8R_uO4&t=2729s
|
# ? Jun 14, 2019 01:11 |
|
And just think, Intel hasn't improved much of anything since that lecture. lol
|
# ? Jun 14, 2019 01:22 |
|
BeastOfExmoor posted:Weird, I would've sworn I'd seen it reported as 8+4 at Computex, but Anandtech seems to agree with you. 6+6 lets you pick out the best 6 cores from each chiplet (or rather, the best 3 cores from each CCX). Not that I'm an expert or anything, but the binning strategy is obviously immensely more complex than "the best x% of chips become Epyc" like Reddit seems to think. It's probably not even a greedy strategy at all, apart from the handful of chips that are actually broken and need to be binned down (they're a minority; something like 70% of chips are fully functional even on 7nm). For example, Epyc has locked clocks, so shipping top silicon as Epyc is pointless if it exceeds the clock/voltage target; that silicon might be better off shipped as Threadripper (where it can be overclocked). And leakage is not actually that big a problem, since high-leakage chips usually clock better. It's at least a combinatorial optimization problem, and I wouldn't be surprised if they actually calculated it out for each wafer and just attempted to maximize profit within quotas (meeting order quantities) and certain guidelines (try to ship x% as Epyc, etc). Using the Epyc IO die as the chipset is wild, though. Ian Cutress and Wendell did a video where they're just talking about some of the possibilities that opens up, and Ian is really jazzed about it. tfw Wendell isn't even the smartest guy in the room. Paul MaudDib fucked around with this message at 01:58 on Jun 14, 2019 |
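The "maximize profit within quotas" framing can be sketched as a toy assignment problem. Everything below is invented for illustration (the die bins, SKU names, prices, and quotas); the real inputs are exactly the proprietary numbers discussed upthread, and a real fab would presumably solve this as an ILP per wafer rather than greedily:

```python
# Toy sketch: assign binned dies to SKUs to maximize revenue under quotas.
# All numbers here are made up for illustration.

# Each die: (working_cores, max_stable_clock_ghz)
dies = [(8, 4.6), (8, 4.2), (6, 4.4), (8, 4.5), (6, 4.0), (8, 4.3)]

# Each SKU: minimum cores, minimum clock, sale price, units we can sell
skus = {
    "flagship": {"cores": 8, "clock": 4.5, "price": 400, "quota": 1},
    "midrange": {"cores": 8, "clock": 4.2, "price": 300, "quota": 2},
    "budget":   {"cores": 6, "clock": 4.0, "price": 150, "quota": 3},
}

def assign(dies, skus):
    """Greedy: sell the best dies into the highest-priced SKU they qualify
    for, respecting quotas. (A real fab would treat this as an ILP.)"""
    remaining = {name: s["quota"] for name, s in skus.items()}
    by_price = sorted(skus, key=lambda n: -skus[n]["price"])
    result = []
    # Consider best dies first so top silicon lands in top SKUs
    for die in sorted(dies, key=lambda d: (-d[0], -d[1])):
        cores, clock = die
        for name in by_price:
            s = skus[name]
            if remaining[name] and cores >= s["cores"] and clock >= s["clock"]:
                remaining[name] -= 1
                result.append((die, name))
                break
        else:
            result.append((die, None))  # scrap, or hold for a future SKU
    return result

assignment = assign(dies, skus)
revenue = sum(skus[n]["price"] for _, n in assignment if n)  # 1450 here
```

Even this greedy version shows the effect Paul describes: the (8, 4.2) die is "good" silicon but ends up in the budget SKU because the midrange quota filled first, which is why the answer depends on order quantities and not just chip quality.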
# ? Jun 14, 2019 01:45 |
|
Paul MaudDib posted:Ian Cutress and Wendell did a video where they're just talking about some of the possibilities that opens up and Ian is really jazzed about it. The unheralded brilliance of AMD's chiplet approach didn't dawn on me until Cutress noted that the first people to hit a new process node are the mobile chip makers, who sort out all the problems with the new process through sheer volume, and then AMD comes along saying, "hey, we need a job done on that new process with high perf libraries," but whose chiplets are still around the same size as what TSMC is already making. God, I want AMD to get on Samsung fabs so goddamn bad, I've got a boner just thinking about the potential results.
|
# ? Jun 14, 2019 05:37 |
|
iospace posted:Would it be worth shelling out the extra 35 or so bucks for a 2200G over an Athlon 220G? I think the main benefit is the more powerful integrated graphics. I don't think there's a massive advantage the 4/4 has over 2/4 in daily use. So if you plan to do any gaming at all, I guess you want the 2200g as it could maybe handle 30 fps on low settings.
|
# ? Jun 14, 2019 06:54 |
|
iospace posted:Probably going to be a capture PC if I do go with the upgrade. As a video capture box, it'd probably be worth throwing an extra $30 at. Definitely worth $30 if your planned capture card doesn't have an onboard encoder.
|
# ? Jun 14, 2019 13:51 |
|
64 core Threadripper by the end of the year? https://wccftech.com/exclusive-amd-is-working-on-a-monster-64-core-threadripper-landing-as-early-as-q4-2019/
|
# ? Jun 14, 2019 15:38 |
|
I'll pay up to 1300bux for a 32C Zen2 Threadripper.
|
# ? Jun 14, 2019 15:53 |
|
If AMD prices these in any way aggressively I’m going to be building a ton of Threadripper workstations at work next year. It would be an incredible compute density upgrade for them.
|
# ? Jun 14, 2019 16:55 |
|
Paul MaudDib posted:Not that I'm an expert or anything, but the binning strategy is obviously immensely more complex than "the best x% of chips become Epyc" like Reddit seems to think.
|
# ? Jun 14, 2019 17:35 |
|
Combat Pretzel posted:I'll pay up to 1300bux for a 32C Zen2 Threadripper.
|
# ? Jun 14, 2019 18:13 |
|
Sub Rosa posted:64 core Threadripper by the end of the year? Please, Dr. Su. I can't take any more excitement.
|
# ? Jun 14, 2019 18:53 |
|
SwissArmyDruid posted:Please, Dr. Su. I can't take any more excitement. My penis can only get so erect!
|
# ? Jun 14, 2019 18:57 |
|
Wake me up when I can get four 64-core chips on the same motherboard.
|
# ? Jun 14, 2019 19:02 |
|
With 64 cores/128 threads in a single socket I imagine you'd end up hitting weird scaling issues with the rest of the board/system and end up with the cores mostly idle anyway?
|
# ? Jun 14, 2019 20:37 |
|
Progressive JPEG posted:With 64 cores/128 threads in a single socket I imagine you'd end up hitting weird scaling issues with the rest of the board/system and end up with the cores mostly idle anyway? Not on my machine! I routinely hit all 48 threads!
|
# ? Jun 14, 2019 20:41 |
|
Progressive JPEG posted:With 64 cores/128 threads in a single socket I imagine you'd end up hitting weird scaling issues with the rest of the board/system and end up with the cores mostly idle anyway? If it was quad channel it would be unbalanced but I have seen some rumors saying that it might have the full eight channels, basically just consumer Rome.
|
# ? Jun 14, 2019 21:01 |
|
MaxxBot posted:If it was quad channel it would be unbalanced but I have seen some rumors saying that it might have the full eight channels, basically just consumer Rome. probably can't do that without socket changes, maybe in an embedded form factor for server boards but why would AMD do that when they could sell you Rome instead?
|
# ? Jun 14, 2019 21:03 |
|
MaxxBot posted:If it was quad channel it would be unbalanced but I have seen some rumors saying that it might have the full eight channels, basically just consumer Rome.
|
# ? Jun 14, 2019 21:44 |
|
Yeah, I just meant bandwidth starved. I guess they might make such a chip anyways, since it wouldn't matter for some workloads. Paul MaudDib posted:probably can't do that without socket changes, maybe in an embedded form factor for server boards but why would AMD do that when they could sell you Rome instead? TR4 and SP3 are physically identical; there's just an ID pin that makes it so SP3 CPUs won't work in a TR4 socket. I think it would be possible, not sure if they actually would do it though.
|
# ? Jun 14, 2019 22:56 |
|
48 cores and 4 channels would be more or less ideal for a lot of workloads; 64/4 might be a bit bandwidth light for some, but that really depends on what all you have soaking up that CPU time. That and having 144 MB of cache on the chip is frankly hilarious.
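The bandwidth-starvation worry is easy to put rough numbers on. Assuming DDR4-3200 (3200 MT/s × 8 bytes per 64-bit channel = 25.6 GB/s per channel, peak theoretical), per-core bandwidth for the configurations being kicked around works out like this:

```python
def gbps_per_core(channels, cores, mts=3200, bus_bytes=8):
    """Peak theoretical DRAM bandwidth per core, in GB/s."""
    per_channel = mts * bus_bytes / 1000  # DDR4-3200: 25.6 GB/s per channel
    return channels * per_channel / cores

quad_64 = gbps_per_core(4, 64)   # 1.6 GB/s/core -- the "bandwidth light" case
octo_64 = gbps_per_core(8, 64)   # 3.2 GB/s/core -- full Rome-style 8 channels
quad_48 = gbps_per_core(4, 48)   # ~2.13 GB/s/core -- the 48-core middle ground
```

These are peak figures, so real sustained numbers land lower, but the ratios are the point: halving channels or going 48 → 64 cores cuts per-core bandwidth proportionally, which matters a lot for streaming workloads and barely at all for cache-resident ones.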
|
# ? Jun 14, 2019 23:01 |
|
SwissArmyDruid posted:The unheralded brilliance of AMD's chiplet approach did not dawn upon me until Cutress noted that the first people to hit a new process node are the mobile chip makers, who then sort out all the problems with the new process through sheer yield, and then AMD comes along saying, "hey, we need a job done on that new process with high perf libraries," but whose chiplets are still around the same size as that which TSMC is already making.
|
# ? Jun 15, 2019 00:12 |
|
ratbert90 posted:Not on my machine! I routinely hit all 48 threads! I've got a use case with some people demanding all our SQL data in an excel spreadsheet. So yeah, gimme cores.
|
# ? Jun 15, 2019 00:32 |
|
incoherent posted:I've got a use case with some people demanding all our SQL data in an excel spreadsheet. So yeah, gimme cores. I run multiple VMs and also routinely compile Buildroot and Yocto images. I could max out as many cores as you give me.
|
# ? Jun 15, 2019 00:50 |
|
ZobarStyl posted:I look back to how I reacted when AMD and GoFlo split, thinking 'what the hell kinda CPU design company doesn't even own its own fabs?' How naive that thought seems now, with fabless designers and pure-play foundries being the future for everyone but the biggest players. Intel has historically said that the cost of building out every new process node/fab functionally was betting the entire company. I have to wonder when that becomes prohibitive to the point that you see Intel designs fabbed on even tinier Samsung nodes as well. I feel like it's an ok business plan for a company that's selling as much silicon as intel to own their own fabs. What was dumb and super arrogant was them saying "gently caress all y'all, we're doing our own process generation! 10nm, we don't care what everybody else is standardizing on!"
|
# ? Jun 15, 2019 03:27 |
It’s not like Intel doesn’t have the cash to license process tech for use in their own fabs. Can anyone more knowledgeable in the workings of the semiconductor industry explain the business or technical reasons why they can’t or won’t? I know this is the AMD thread and all, and I don’t mean to derail, I’m just really curious.
|
|
# ? Jun 15, 2019 04:07 |
|
Laslow posted:It’s not like Intel doesn’t have the cash to license process tech for use in their own fabs. I think the biggest thing is, for the longest time, Intel was the best. Why license someone else's process when you can use your own, better one? There's a reason the 2500k is a meme. Everything from that era that came out of Intel's plants, even with mitigations applied, took a giant poo poo on AMD's contemporary offerings. AMD, under Dr. Su's leadership, has really put the pedal down.
|
# ? Jun 15, 2019 04:41 |
|
Laslow posted:It’s not like Intel doesn’t have the cash to license process tech for use in their own fabs. I'm pretty sure poo poo would have to fail even worse than it is currently for Intel to look elsewhere; 10nm might be a total flop, but 7nm might still be good. I think it would take total failure of both 10nm and 7nm for them to consider looking elsewhere.
|
# ? Jun 15, 2019 04:55 |
|
MaxxBot posted:I think it would take total failure of both 10nm and 7nm for them to consider looking elsewhere. If Intel's 7nm turns out to be as big of a shitshow as their 10nm has been then I think they'll be pretty much forced to look elsewhere for high performance parts at a minimum. They won't really have a choice anymore. So far though there hasn't been anything solid to suggest their 7nm will be that bad. Just hints and rumors that it isn't going to be as good as advertised or on time despite the current Intel PR amounting to "everything's fine guys".
|
# ? Jun 15, 2019 06:47 |
|
https://twitter.com/barronsonline/status/1139719100635070464
|
# ? Jun 15, 2019 06:47 |
|
I just had a random thought that is maybe happening in a parallel universe somewhere: Donald Trump is somehow CEO of Intel and on the eve of Ryzen 3x's release he throws a temper tantrum and revokes AMD's x86 license. AMD then responds by revoking Intel's x86-64 license. I'm not sure what happens after that but I bet it'd be funny as long as you don't work in tech.
|
# ? Jun 15, 2019 08:03 |
|
I don't know what would happen exactly, but I do know that licensing wars are good and easy to win.
|
# ? Jun 15, 2019 08:08 |
|
Simple, Intel would go back to their traditional business of selling memory.
|
# ? Jun 15, 2019 11:50 |
|
Digitimes posted:ASMedia has landed orders for AMD's B550 and A520 chipsets that support PCIe 3.0 and will kick off shipments to motherboard ODMs and OEMs in the fourth quarter of 2019.
|
# ? Jun 15, 2019 19:30 |
|
|
Has AMD indicated whether Zen2 will have their TSX equivalent? It's a really marginal use case, but the PS3 emulator can actually leverage it for notable performance boosts. Instruction sets don't exactly build hype when you're trying to sell your new stack, though.
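For what it's worth, you don't have to wait on marketing slides to find out: the TSX-related x86 feature flags are rtm and hle, and Linux exposes them in the "flags" line of /proc/cpuinfo. A small illustrative parser (the only assumption is the standard cpuinfo "flags :" line format):

```python
def tsx_flags(cpuinfo_text):
    """Return which TSX-related feature flags appear in a /proc/cpuinfo dump."""
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
    return {f for f in ("rtm", "hle") if f in flags}

# On a real Linux box:
# with open("/proc/cpuinfo") as f:
#     print(tsx_flags(f.read()) or "no TSX support")
```

Run that on launch-day Zen2 and the emulator question answers itself; RPCS3 does the equivalent check internally before enabling its TSX fast path.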
|
# ? Jun 15, 2019 20:15 |