|
Intel is still going to hurt a lot more in the server and laptop markets once the newest architecture actually makes it into Epyc and mobile Ryzen chips, and the more high-end server sales they lose out on, the less they can subsidize their lower-end desktop chips. If Intel gets their manufacturing nodes working properly so they can actually make commercially sensible chips on something that isn't just another 14nm tweak, they should be able to bounce back, but that's still a big question mark. Looking at how bad their 10nm process still is years after it was supposed to be up and running, I hope they can at least get their act together with the next node (7nm?).
|
# ¿ Nov 22, 2020 17:05 |
|
They weren't even Bulldozer-derived, but Jaguar-ish small low-power cores on the level of Intel's Atom CPUs
|
# ¿ Dec 28, 2020 20:33 |
|
WhyteRyce posted:Excuse me but I need to transcode my Plex library to x265 while I’m playing games You joke, but... https://twitter.com/marcan42/status/1344289933108809730 (apparently it didn't work too well without manually sending them to separate NUMA nodes)
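For the curious, "manually sending them to separate NUMA nodes" boils down to restricting each process to a disjoint set of CPUs. A minimal sketch of that idea on Linux, using Python's affinity calls (the choice of CPU ids is an assumption; real node layouts come from tools like lscpu or numactl --hardware):

```python
import os

def pin_to_cpus(pid, cpus):
    """Restrict process `pid` (0 = the current process) to the given CPU ids.
    Linux-only: os.sched_setaffinity is not available on other platforms."""
    os.sched_setaffinity(pid, cpus)
    return os.sched_getaffinity(pid)

if hasattr(os, "sched_setaffinity"):  # Linux
    allowed = os.sched_getaffinity(0)        # CPUs we may currently run on
    got = pin_to_cpus(0, {min(allowed)})     # e.g. pin to the first allowed CPU
    os.sched_setaffinity(0, allowed)         # restore the original mask
```

In practice you'd do the same thing from the shell with numactl or taskset, one invocation per workload, so the game and the encode job never fight over the same node.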
|
# ¿ Dec 31, 2020 13:15 |
|
Inept posted:The heatsink slayer has arrived Peaking around 290W and 104C for AVX-512?!? Have there been any retail desktop CPUs even close to that kind of peak power? Maybe that ridiculous AMD FX-9590 did it, but that's still more power draw than my GPU. At least gaming and other normal workloads draw a lot less power than that, but even those results aren't impressive in performance or efficiency.
|
# ¿ Mar 6, 2021 15:39 |
|
gradenko_2000 posted:An FX-9590 could draw up to 370w when OC'd to 4.6Ghz all-core (at stock settings it could draw about 230w) Well, yeah. I should have been clearer that I was talking specifically about stock settings.
|
# ¿ Mar 6, 2021 20:11 |
|
gradenko_2000 posted:are they even halting 10th gen production? I guess that would depend on what contracts they have with big customers to guarantee supply/replacement parts for specific SKUs, but it will probably take a while before production is entirely phased out.
|
# ¿ Mar 15, 2021 08:16 |
|
MeruFM posted:I have a dumb question, if chiplets are so important, why are gpu chips still a single chip when they’re so much bigger than CPUs? There is talk that GPUs may be moving in this direction as well, though I guess only Nvidia and AMD can really answer that for certain. GPUs have also been sufficiently different from CPUs in terms of clock speeds, memory buses, APIs and so on that it's possible the benefits just haven't been worth it yet.
|
# ¿ Mar 18, 2021 18:38 |
|
Latency is generally a slightly smaller issue for GPUs, as I recall, since they can hide it better thanks to having less serial work?
|
# ¿ Mar 18, 2021 19:09 |
|
priznat posted:One of our main filter questions for new grads/coops is "what is a linked list" and I swear only 30% of them get it. I took one course in basic Ada and C++ programming before deciding that particular uni program wasn't my thing, and even I know what a linked list is. How can so many people who actually graduated from a similar program not know?
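For anyone following along at home, the kind of answer that filter question is fishing for fits in a few lines. A minimal singly linked list in Python (names are my own, just an illustration):

```python
# Minimal singly linked list: each node stores a value plus a pointer
# to the next node; the list itself is just a reference to the head.
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def to_list(head):
    """Walk the chain node by node and collect the values."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

# Build 1 -> 2 -> 3 by prepending, which is O(1) per insert.
head = Node(3)
head = Node(2, head)
head = Node(1, head)
print(to_list(head))  # [1, 2, 3]
```

The interview point is usually just that insertion at the head is O(1) while indexing is O(n), the opposite trade-off of an array.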
|
# ¿ Mar 21, 2021 14:34 |
|
If anything, I feel like the 11900K should be the real waste of sand. Fewer cores than the 10900K, and performance should only slightly exceed the 11700K, but at a noticeable ($140-ish?) price difference.
|
# ¿ Mar 23, 2021 14:19 |
|
At least the 11600K isn't bad at its price, though Hardware Unboxed is still recommending the 10600K for gaming because of its lower price. Since there isn't anything new below the 5600X on AMD's side, Intel might still be able to provide good value in the budget segment. The 11900K is just sad, especially if the instability issues that Anand reported keep plaguing it. Fabulousity posted:Intel CPU and Platform Chat: Sorry our performance is in another process node New thread title?
|
# ¿ Mar 31, 2021 14:28 |
|
Fantastic Foreskin posted:As someone who only has man-on-the-street level knowledge of chip fab, can someone explain to me what exactly it means for a node/process to fail, and how one does it for 5 years straight?

Much of Intel's issues seem to come from how they tried to approach smaller manufacturing nodes without waiting for EUV tools, likely in an attempt to maintain an edge in process nodes over other companies. They went for slightly different materials and methods than other foundries to push how small they could make things, and those choices ended up not working out that well in practice.

It should be mentioned that they previously had long-running issues getting yields (useful chips per wafer) on 14nm up to similar levels as their 22nm process. Their 10nm yields were so awful that for quite a while they did not manage to make commercially viable chips on it at all. If I recall correctly, the first commercially available 10nm chip was some dual-core part with the iGPU disabled that released in a China-only Lenovo laptop SKU. Basically, it was sent out to die just so they could tell shareholders they were shipping 10nm.

Even once things improved to the point where they could make some chips on 10nm, they did not seem able to clock them high enough to compete with their own 14nm process. They might be good enough for some laptop uses, for example, but not for desktop. Even their server CPUs scheduled for May this year seem like they will be pointless for many customers because the performance per watt figures will sometimes be worse than the existing ones, and that's not even counting the competition from Epyc.
It does sound like Intel have recently managed to simplify their manufacturing processes a bit due to how EUV tech is now so readily available compared to the more complex (and error prone) workarounds they were trying earlier, so they might finally get back into the swing of things with 7nm ready for production around mid/late 2023 or so.
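As a back-of-the-envelope illustration of why yield is such a big deal, the classic Poisson yield model ties the fraction of good dies to defect density and die area. The numbers below are made up for illustration, not Intel's actual figures:

```python
import math

def die_yield(defect_density_per_cm2, die_area_cm2):
    """Poisson yield model: Y = exp(-D0 * A), where D0 is the defect
    density (defects per cm^2) and A is the die area (cm^2)."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# The same hypothetical 1 cm^2 die on a mature vs a struggling process:
print(f"mature:     {die_yield(0.1, 1.0):.0%}")   # ~90% good dies
print(f"struggling: {die_yield(2.0, 1.0):.0%}")   # ~14% good dies
```

It also shows why bigger dies get hit harder: the area sits in the exponent, so a process that is merely bad for a small laptop chip can be catastrophic for a big server die.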
|
# ¿ Apr 2, 2021 19:26 |
|
Cygni posted:I think i remember reading that the early 10nm node that failed had some obscene amount of multi patterning (as in like 7 passes) that basically ensured they were never going to get yields, with the idea being that eventually they would brute force it and make it work and yields would improve but they... never did. I know Semiaccurate claimed Intel was doing quad patterning, which is tricky enough; if it was more than that, it would have been even worse. The problem with turning to TSMC would be that there's no capacity available anyway, so that's not going to be a quick fix.
|
# ¿ Apr 3, 2021 03:18 |
|
Beef posted:To be fair to Intel, it wasn't obvious that EUV would reach maturity in time for their 10nm. I was working at imec years back and EUV was a running joke, always a year or two in the future over a decade or so. Intel was a large investor in ASML and EUV tech, but simply chose to do it later rather than sooner. Well, yeah. It was just a gamble that didn't pay off. It happens.
|
# ¿ Apr 3, 2021 12:50 |
|
I would mostly be wondering about the power supply. Many (not all) GTX 1650 models run without an additional PCI-e power connector because they only need the power that the motherboard slot can provide, so it's hard to say what kind of power supply HP put in that thing and what it could handle in terms of future GPU upgrades. CPU should be fine, though.
|
# ¿ Apr 10, 2021 10:27 |
|
Ok Comboomer posted:not sure but it’s supposed to be rated for 400W and there’s at least a 500W version available from HP. When this chassis is/was available on HP’s BTO site it could be optioned with a 3060 or 3070 so I’m assuming that at minimum something like a midrange rtx card would work in the future Yeah, I mostly just wish there was more detail than "400 Watt". I don't think it's a major concern; it's just the one thing I would personally want to double-check. NewFatMike posted:You know drat well that's a statement about use case, not components. Nobody's doing spreadsheets on a PlayStation. Somewhere in the world, the one person running spreadsheets in Linux on a PS3 just felt a shiver down their spine
|
# ¿ Apr 10, 2021 17:11 |
|
Cygni posted:Also saw that little note that Sapphire Rapids is delayed to Q2 2022 from Q3/Q4 of this year. So if we follow the same schedule as Ice Lake SP, it will come out in approximately 2025. Even Charlie at Semiaccurate seems to think Q3 2022 is likely going to be the real availability date, and he would be shouting from the rooftops if it were going to slip any further than that.
|
# ¿ Jul 6, 2021 20:00 |
|
Cygni posted:i was goofin at intel's expense, a joke if you will My bad, friend
|
# ¿ Jul 8, 2021 08:33 |
|
Paul MaudDib posted:There were also some instances of games flipping a poo poo with Threadripper because they just couldn't comprehend the idea of a processor with 32 cores and giving you nonsensical "this game needs at least 2 cores!" messages or whatever, but that wasn't Windows' fault.
|
# ¿ Jul 22, 2021 05:53 |
|
Ika posted:Sounds like: Dirt Rally actually has separate XML config files for different core counts, so I don't know if it just can't find a suitable one for that many threads or what is going on. Worked fine on my older 4C/4T R3 1300X, though.
|
# ¿ Jul 23, 2021 12:10 |
|
Beef posted:It's dumb how good that branding works. There are constantly people saying that their laptop is still good because it's an i7 or i9 without having a clue that there are generations.

My favorite was a Steam user review I saw a few days ago of a somewhat recent Spider-Man game where someone complained about performance. Their included hardware list just claimed their PC specs were way over recommended and then listed "Intel Core i7 @ 2.60GHz, 16,0GB of RAM, NVIDIA GeForce RTX 2060, SSD 480GB."

The Wikipedia page on i7 CPUs gives 17 results (16 if you ignore the embedded 1255UL having it as the E-core frequency) when searching for processors with a base clock of 2.6 GHz, and it's mostly mobile or embedded CPUs, ranging from the i7-3720QM released in 2012 to the i7-13650HX released in 2023. Given the historical use of the @ to sometimes indicate an overclocked/underclocked frequency, the chip could also have a completely different base clock that the user is just running at manually. The RAM/SSD type or speed not being listed doesn't help either, so I can't even tell for sure from that post whether they are on an old desktop or a somewhat more recent laptop
|
# ¿ Jan 10, 2024 21:07 |
|
Someone should try it with some old VIA CPU and see what happens
|
# ¿ Feb 26, 2024 17:42 |
|
Worf posted:theyre gonna have u subscribe to your CPU monthly, trust me Not a subscription, but they essentially did try to sell CPU DLC to consumers 13 years ago (https://en.m.wikipedia.org/wiki/Intel_Upgrade_Service?s=31) and appear to have a similar thing going on in their Xeon line now (https://www.tomshardware.com/news/intel-officially-introduces-pay-as-you-go-chip-licensing) canyoneer posted:Every time I hear a news article talking about huge investments in AI and how Nvidia is the world leader supplying silicon for that I think about Saffron Technologies. There's probably a mountain of companies that have been killed off by similar pettiness over the years. Such a waste.
|
# ¿ Apr 4, 2024 17:58 |
|
7-segment displays seem like they should be an easy way to differentiate, given how relatively inexpensive they are and how few (none?) of the low-to-midrange boards have one these days.
|
# ¿ Apr 16, 2024 17:32 |