|
ufarn posted:How fast are Noctua to deliver AM4 mounting parts when you send them your motherboard receipt for their heatsink program? I received my kit 3 weeks after it shipped to the US.
|
# ? Apr 27, 2018 16:38 |
|
|
|
fishmech posted:We just passed the 14th quarter of continuous decline in all tablet sales. Apple's highest quarter for iPad sales, their first quarter of fiscal 2014, saw just over 26 million iPads sold. Their first quarter of fiscal 2018 saw 13.17 million sold. I think it's more saturation than collapse, but yeah. People don't usually replace tablets every other year like they do with phones. The future market isn't going to be what it was until something else revolutionary happens. Hologram tablets or something, I dunno. I don't think Intel and AMD are losing sleep over this in particular.
|
# ? Apr 27, 2018 17:09 |
|
LegitMaan posted:I received my kit 3 weeks after it shipped to the US.
|
# ? Apr 27, 2018 18:09 |
|
ufarn posted:How fast are Noctua to deliver AM4 mounting parts when you send them your motherboard receipt for their heatsink program? Took me 19-21 days, if I remember. It’s a free thing sent from Austria, it takes time.
|
# ? Apr 27, 2018 18:30 |
|
fishmech posted:We just passed the 14th quarter of continuous decline in all tablet sales. Apple's highest quarter for iPad sales, their first quarter of fiscal 2014, saw just over 26 million iPads sold. Their first quarter of fiscal 2018 saw 13.17 million sold. I feel with tablets we're getting fewer repeat buyers as people are happy with what they have. It goes online, it still runs the newest apps. A tablet is likely closer to a game console in life cycle than an x86 computer. We also have large phones eating into tablets. I opted to get a Galaxy S9+ for the larger screen to avoid replacing my Nexus 10, which is having some kind of internal issue regulating power: it crashes if it gets under 80% charge (flashing display, garbled sound that is god awful and is damaging the speakers). I've assumed some kind of voltage issue. With current tech we've kind of hit a point where the average person doesn't see any advantage to upgrading anything. Devs are shying away from releasing stuff targeting only the newest hardware, so we're in a giant game of chicken, and VR might be what pulls us out of it, as there is a new factor there where people will make a program in VR because they can and want to, not just for money. Do I think tablets are going to take over desktops? Not really. Do I think ARM could give x64 a run for its money? Sure, on a long enough timeline. I wouldn't expect to see them standard in business until 2040 or so though, and that would require both AMD and Intel falling on their faces.
|
# ? Apr 27, 2018 18:30 |
|
Isn't x86 vs ARM more about whether the stronger x86 cores will lose out to ARM's ability to just leverage a hell of a lot more cores for a given wattage?
|
# ? Apr 27, 2018 18:41 |
|
fishmech posted:We just passed the 14th quarter of continuous decline in all tablet sales. Apple's highest quarter for iPad sales, their first quarter of fiscal 2014, saw just over 26 million iPads sold. Their first quarter of fiscal 2018 saw 13.17 million sold. Out of curiosity, what was the last year the iPad (not Pro) had SoC parity with the same year's iPhone? Cause I was going to buy one until they stopped that, and loving lmao if I'm going to pay a grand Canadian for a Pro to get the latest chip.
|
# ? Apr 27, 2018 18:50 |
|
Mister Facetious posted:Out of curiosity, what was the last year the iPad (not Pro) had SoC parity with the same year's iPhone? The iPad Air 2 of 2014 matched up with the iPhone 6 of the same year, while the iPad Mini 3 released that year was behind in the SoC. After that, yeah, you needed to be on a Pro to be at parity with a contemporaneous iPhone. FaustianQ posted:Isn't x86 vs ARM more about whether the stronger x86 cores will lose out to ARM's ability to just leverage a hell of a lot more cores for a given wattage? For servers, maybe. For consumer products that's really not an option, because a lot of what consumers do doesn't massively parallelize in a way that makes up for much slower individual cores. You could take a typical laptop with a quad core x86-64 CPU's power budget and maybe slap in 32 slow ARM cores in its place, but it's unlikely to help your web browsing or anything else. Take Chromebooks for example: there are 6 core ARM based ones out there that still have noticeably worse web browsing performance than recent Intel Core-based super mobile 2 core chips that are 5 watt TDP.
|
# ? Apr 27, 2018 19:20 |
|
Hey guys, I'm a dingus with an iPad Pro. Honestly it's an awesome machine and if I didn't need to do industrial design, it's distinctly possible it could be my only device.
|
# ? Apr 27, 2018 19:54 |
|
For my druthers I think I'd rather have an HP Chromebook x2 than an iPad Pro, but they both demonstrate devices that can or do run on ARM instead of x86, are simple, and can last a while without being replaced/upgraded.
|
# ? Apr 27, 2018 20:09 |
|
SamDabbers posted:I can't be the only one who wants to see a workstation-class ARM chip, right? Will ARM always be relegated to low power implementations? No, I've really wanted a Windows ARM laptop or desktop for a while; I can't even really articulate why, I just want one. Maybe it's because I've enjoyed tinkering around with ARM Linux SBCs for the past few years. FaustianQ posted:Isn't x86 vs ARM more about whether the stronger x86 cores will lose out to ARM's ability to just leverage a hell of a lot more cores for a given wattage? Well, current ARM designs are mostly low-power cores designed for mobile; if you wanted to compete with x86 you'd design an ARM core with a higher TDP. The big question is whether or not ARM offers much of a performance/watt advantage over x86 when scaled up to the level of performance needed for a laptop or desktop. MaxxBot fucked around with this message at 20:38 on Apr 27, 2018 |
# ? Apr 27, 2018 20:32 |
|
MaxxBot posted:
There are real-world examples of this that say yes, ARM on server does have considerable benefits even for CPU-intensive things like TLS termination in large hosting environments. The problem at the moment is most of the common libraries are optimized for x86 and run like dogshit on ARM, but with some amount of actual effort they were achieving workload parity with Intel on most tasks while being way under the traditional power budget, not to mention the lower capital costs. https://blog.cloudflare.com/arm-takes-wing/ There was a better article that had similar results but went into the library optimization issues and some of the preliminary work they did there, but I cannot for the life of me find it.
|
# ? Apr 27, 2018 21:16 |
|
BangersInMyKnickers posted:https://blog.cloudflare.com/arm-takes-wing/ MariaDB did a blog post: https://mariadb.com/resources/blog/write-optimizations-qualcomm-centriq-2400-mariadb-1035-release-candidate And there was this thing about using NEON instructions in place of Intel SSE: https://www.cnx-software.com/2018/04/14/optimizing-jpeg-transformations-on-qualcomm-centriq-arm-servers-with-neon-instructions/
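e: for anyone curious what "NEON in place of SSE" actually looks like at the source level, here's a toy 4-lane integer add written three ways (SSE2, NEON, plain scalar) and picked at compile time. Made-up example for illustration, not code from either article:

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

#if defined(__SSE2__)
#include <emmintrin.h>   /* x86 SSE2 intrinsics */
#elif defined(__ARM_NEON)
#include <arm_neon.h>    /* ARM NEON intrinsics */
#endif

/* Add two arrays of 32-bit ints, 4 lanes at a time where SIMD is
   available. Assumes n is a multiple of 4 for simplicity. */
static void add_i32(const int32_t *a, const int32_t *b, int32_t *out, size_t n) {
    for (size_t i = 0; i < n; i += 4) {
#if defined(__SSE2__)
        __m128i va = _mm_loadu_si128((const __m128i *)(a + i));
        __m128i vb = _mm_loadu_si128((const __m128i *)(b + i));
        _mm_storeu_si128((__m128i *)(out + i), _mm_add_epi32(va, vb));
#elif defined(__ARM_NEON)
        int32x4_t va = vld1q_s32(a + i);
        int32x4_t vb = vld1q_s32(b + i);
        vst1q_s32(out + i, vaddq_s32(va, vb));
#else
        /* scalar fallback for everything else */
        for (size_t j = i; j < i + 4; j++) out[j] = a[j] + b[j];
#endif
    }
}
```

The mapping here is mechanical (load/add/store have direct equivalents), which is why the porting pain those articles describe is mostly in the bigger hand-tuned kernels, not stuff like this.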
|
# ? Apr 27, 2018 22:05 |
|
Tangentially related to process news: https://www.engadget.com/2018/04/27/intel-delays-cannon-lake-chips-again/
|
# ? Apr 27, 2018 22:06 |
|
"Dancing on Intel's grave" is looking pretty justified after that earnings call. Unless GloFo fucks up or Intel pulls a rabbit out of the hat in the next couple months, AMD is going to have a process lead on Intel for at least a year, and that should give them the performance crown. Intel's position is looking rougher than it has in a long time, between the 10nm problems, Microsoft doing work on ARM Windows for mobile devices, and it being a couple years until their next clean-sheet uarch. At this point I'm more curious about NVIDIA, because what I said before still applies IMO. If they want to squeeze in a generation this year, they're running out of launch window, or else they will either be in the position of having a weird short generation that's around for less than a year, or letting AMD have a node advantage for 6-12 months. TBH it seems increasingly unlikely we'll see something this year, particularly if we start edging into fall.
|
# ? Apr 27, 2018 22:50 |
|
Like I said earlier: it's less "dancing on Intel's grave" and more "making hay while the sun shines". Like it or not, Intel is, and continues to be, "Chipzilla": they pulled in $16.1 billion in revenue and $4.5 billion in profit. That's not even counting what could be a watershed year in 2019 or 2020 for Intel following an actual, real, proper Cannon Lake/Ice Lake launch, where people upgrade from their existing Intel silicon to new, Spectre/Meltdown-free silicon, rather than deal with the logistics of switching to AMD.
|
# ? Apr 27, 2018 23:17 |
|
SwissArmyDruid posted:Like I said earlier: It's less "dancing on Intel's grave" and more "making hay while the sun shines". A few companies plan on big EPYC offerings once Zen 2 (the 2019 Zen 2, not Zen+) stuff starts launching as well. Khorne fucked around with this message at 23:20 on Apr 27, 2018 |
# ? Apr 27, 2018 23:18 |
|
Logistics of going out and seeing what the AMD server product stack is/figuring out what AMD X EPYC part to get to replace Intel Y Xeon, or going outside of established vendors to someone with an AMD server product line. It seems such a trivial thing, but inertia is the most powerful force in the universe, after all. And it's not just "motherboard + CPU", there's cooling as well. Which might necessitate a completely new rackmount case. And everybody knows how much fun THOSE are to work with. When deployments scale large enough, all those incidental costs and time incurred may make switching over cost-ineffective. SwissArmyDruid fucked around with this message at 23:31 on Apr 27, 2018 |
# ? Apr 27, 2018 23:22 |
|
Take a look at AMD's revenues compared to Intel's. AMD could dominate benchmarks across their offerings over the next couple years and still fail to make a dent on market share. Speaking as someone with a 1950X at home, cost-to-performance isn't the only metric that's important when it comes to big iron or enterprise deployment.
|
# ? Apr 27, 2018 23:27 |
|
MaxxBot posted:No I've really wanted a windows ARM laptop or desktop for a while, I can't even really articulate why I just want one. Maybe it's because I've enjoyed tinkering around with ARM linux SBCs for the past few years. I've wanted ARM on desktop too, just because it really opens up the competition compared to x86. It'd mean Samsung, Qualcomm, Nvidia, (Insert Chinese Competitor), and even AMD (I bet K12 is a high-power ARM design, since it was originally slated for AM4). But not Intel, because they sold off their ARM design labs lmao. They don't have to beat x86 though, just come within 10% for it to be worth it to switch over. But before that, AMD and Intel would probably open up the license for x86 again. SwissArmyDruid posted:Tangentially related to process news: So basically unless TSMC drops the ball (HA!), AMD gets a 6 month head start on the performance crown, and there is no guarantee Intel gets it back with Icelake. SwissArmyDruid posted:Logistics of going out and seeing what the AMD server product stack is/figuring out what AMD X EPYC part to get to replace Intel Y Xeon, or going outside of established vendors to someone with an AMD server product line. Then it'd be on the AMD marketing team to be aggressive in meeting these groups so no legwork on the buyer's part is needed. It's a huge investment, but a necessary one.
|
# ? Apr 27, 2018 23:48 |
|
Paul MaudDib posted:At this point I'm more curious about NVIDIA, because what I said before still applies IMO. If they want to squeeze in a generation this year, they're running out of launch window, or else they will either be in the position of having a weird short generation that's around for less than a year, or letting AMD have a node advantage for 6-12 months. TBH it seems increasingly unlikely we'll see something this year, particularly if we start edging into fall. Nvidia will be using the same TSMC 7nm as the Vega respin, I would wager.
|
# ? Apr 27, 2018 23:49 |
|
Cygni posted:Nvidia will be using the same TSMC 7nm as the Vega respin, i would wager. We've only heard about Nvidia using the more mature 12nm/10nm nodes though. I highly suspect Nvidia realizes their dominant position and is going for more mature process technology, because as long as they even have parity with whatever AMD does on 7nm, they're in no danger of losing marketshare. But they could lose a bunch of money jumping into an immature node, so
|
# ? Apr 28, 2018 00:10 |
|
FaustianQ posted:We've only heard about Nvidia using the more mature 12nm/10nm nodes though, I highly suspect Nvidia realizes their dominant position and is going for more mature process technology because as long as they even have parity with whatever AMD does on 7nm, they're in no danger of losing marketshare. But they could lose a bunch of money jumping into an immature node so If they do go for 12/10 then it's to save money, while AMD is willing to roll the dice.
|
# ? Apr 28, 2018 00:20 |
|
SwissArmyDruid posted:Logistics of going out and seeing what the AMD server product stack is/figuring out what AMD X EPYC part to get to replace Intel Y Xeon, or going outside of established vendors to someone with an AMD server product line. They have "Bronze", "Silver", and that kind of scheme on their Xeon line now. Khorne fucked around with this message at 00:58 on Apr 28, 2018 |
# ? Apr 28, 2018 00:41 |
|
Khorne posted:Intel rebranded everything over the past year or two. To an unidentifiable degree. I have no idea what Intel's lineup is anymore; Skylake/Kaby/Cannon all came out so incredibly soon after one another and on top of each other, with the -X versions being released right after another product launch.
|
# ? Apr 28, 2018 00:51 |
|
My bet for Nvidia is a 12nm TSMC consumer part with GDDR6 this year, and then a 7nm TSMC Vega-replacement part next year. This is based on over-analyzing dubious quotes and no real hard data. Hell yeah. With the way Nvidia has been playing their cards close to the vest, it's hard to say what they are doing. Another thing I was thinking about with AMD is that they have now spent the absurd redesign/spin-up costs to multi-source both their GPUs and CPUs. In a world where processes seem to be hanging on the ragged edge and every major player has had a failed process in the last 5ish years, that may turn out to be money well spent. If TSMC or GloFo can't deliver, AMD isn't stranded.
|
# ? Apr 28, 2018 00:51 |
|
wargames posted:If they do go for 12/10 then it's to save money while AMD is willing to roll the dice. AMD has to roll the dice, it's not about being willing at this point. It's why they've pushed for further refinement on the 14nm node rather than jumping to 10nm themselves: they just don't have the money to throw around to bother with a half node, and by the time they have the bank to do so, 7nm will be mature enough that it wouldn't make sense at all. Just do pipecleaning with 7nm Vega in the datacenter, where margins are high enough to absorb the poo poo yields, and push out 7nm Zen right after, because garbage yields are fine as long as they can be soaked up in EPYC plus a willingness to reuse even effectively trash silicon for desktop. I think the release schedule [2018-19] for AMD is going to look like this:

Zen+ (April)
B450/A(yyy)420 (May)
RX 600 series: Polaris 30, 31, 32 on refined 12nm with input from new team members (June-July)
TR+, Ryzen Pro and EPYC2 (August)
Raven Ridge+ (October)
Vega 7nm (November)

Zen2: 3800X, 3700X, 3700 (March)
X570 (March)
Zen2: 3600X, 3600, 3500X, 3500 (April-May)
B550/A520 (May)
RX 700 series: Navi (June-July)
TR2, Ryzen Pro, X599 and Rome (August)
Picasso: 3400G, 3300G, 3200G (October)

Best guess is the new GPU uarch still won't be ready for a mid 2020 release, late 2020 at very best, so 2020 is likely to be AMD's quiet year where they basically release Zen2+ and gear up for the switch over to Zen5, DDR5, etc. in 2021.
|
# ? Apr 28, 2018 00:57 |
|
FaustianQ posted:AMD has to roll the dice, it's not about being willing at this point. It's why they've pushed for further refinement on the 14nm node rather than jumping to 10nm themselves, they just don't have the money to throw around to bother with a half node, and by the time they have bank to do so 7nm will be mature enough it wouldn't make sense at all. They didn't jump to 10nm because GloFo didn't do 10nm. GloFo did form a partnership with IBM/Samsung up in New York to dev 7nm: https://arstechnica.com/gadgets/2015/07/ibm-unveils-industrys-first-7nm-chip-moving-beyond-silicon/ And 5nm seems like it could be on the path for GloFo and AMD: https://www.hpcwire.com/2017/06/05/ibm-clears-path-5nm-beyond-finfet/ wargames fucked around with this message at 01:18 on Apr 28, 2018 |
# ? Apr 28, 2018 01:14 |
|
SwissArmyDruid posted:Logistics of going out and seeing what the AMD server product stack is/figuring out what AMD X EPYC part to get to replace Intel Y Xeon, or going outside of established vendors to someone with an AMD server product line. You're not wrong about inertia being a big thing in corporate purchasing, but AMD did get both Dell EMC and HPE on board.
|
# ? Apr 28, 2018 01:59 |
|
Which is a massive leap forward! But some curmudgeon of a CTO out there is going to nix a small-to-medium AMD rollout, citing some absolutely bullshit reason or "because I said so" because the "choice" to buy Intel is so deeply ingrained.* Put together enough small-to-medium rollouts that re-up with Intel that don't need to re-up with Intel, and that's hundreds of millions in lost profits for AMD. *Or they're just so technologically incompetent that they went "Meh, Spectre and Meltdown aren't anything to worry about." Excuse me, I just got the chills and need to go sit in a corner and rock back and forth for a bit.
|
# ? Apr 29, 2018 22:23 |
|
Here are all the Zen+ SKUs that are on the way: https://videocardz.com/76064/amd-lists-ryzen-2100-2300x-2500x-and-2800u-ready-for-mass-market
|
# ? Apr 30, 2018 16:58 |
|
So a 4C8T 16MB L3, a 4C8T 8MB L3, and a 4C4T 8MB L3? $175, $120, and $80?
|
# ? Apr 30, 2018 19:06 |
|
Khorne posted:Intel rebranded everything over the past year or two. To an unidentifiable degree. My work laptop has a Xeon badge because what do words even mean anymore I guess?!
|
# ? May 1, 2018 12:48 |
|
Munkeymon posted:My work laptop has a Xeon badge because what do words even mean anymore I guess?! It means it's compatible with ECC RAM.
|
# ? May 1, 2018 12:55 |
|
Munkeymon posted:My work laptop has a Xeon badge because what do words even mean anymore I guess?! Xeon hasn't meant much for over a decade though, what's new? Like, you don't even need it for ECC support if you go old enough; my 875P motherboard is perfectly happy to run 4GB ECC DDR1 with a Pentium M in it.
|
# ? May 1, 2018 17:12 |
|
FaustianQ posted:So a 4C8T 16MB L3, a 4C8T 8MB L3 and a 4C4T 8MB L3? $175, $120, and $80? Those prices may be somewhat optimistic, but if they're around that, it would basically crush the Pentium Gold + i3 8100 in that price range. An $80 4c/4t Ryzen in particular would render the entire Pentium Gold line moot for even the most budget-conscious builders.
|
# ? May 1, 2018 17:26 |
|
Paul MaudDib posted:It means it's compatible with ECC RAM. https://www.dell.com/community/Laptops-General/Dell-Precision-5520-with-Xeon-support-ECC-RAM/td-p/5164067 pretty sure it's an upsell badge, but maybe newer models aren't as gimped e: yes, I understand the difference between CPU support and system support, but it sure looks like Dell just wanted a mostly-pointless premium tier. I'm guessing they wanted to make the, uh, regular top-tier look less expensive and make a few bucks off people who want impressive case badges. Munkeymon fucked around with this message at 21:23 on May 1, 2018 |
# ? May 1, 2018 21:20 |
|
Computerbase benchmarked Ryzen 2 in a few popular/e-sports titles at 1080p and found decent gains. https://www.computerbase.de/2018-04/ryzen-2600-2700-fortnite-pubg-overwatch/2/
|
# ? May 1, 2018 23:08 |
|
Happy_Misanthrope posted:Those prices may be somewhat optimistic, but if they're around that it would basically crush the Pentium Gold + i3 8100 in that price range. A 80$ 4c/4t Ryzen in particular would render the entire Pentium Gold line moot for even the most budget- conscious builders. Yadda yadda market segmentation, but it's kinda pointless when 2200G/2400G is there and 2600 non-X is already $200 and will be dropping more in the following months.
|
# ? May 2, 2018 02:55 |
|
|
|
I read that AMD's iGPU solution with the 2200G/2400G blew Intel's iGPU offerings out of the water, as far as graphics performance is concerned. Are there any indications of where this is going in the future? Even though I have a GTX 1070, it's really cool to see that you can actually game without a GPU and without relegating yourself to Flash-like Steam games. e: I mean yeah, it seems like the 2400G is comparable to a discrete GT 1030, which isn't much. But I bet I could play New Vegas or something on it. buglord fucked around with this message at 03:49 on May 2, 2018 |
# ? May 2, 2018 03:47 |