Bofast
Feb 21, 2011

Grimey Drawer
Intel is still going to hurt a lot more in the server and laptop markets once the newest architecture actually gets put into Epyc and mobile Ryzen chips, and the more high-end server sales they lose, the less they can subsidize their lower-end desktop chips.

If Intel gets their manufacturing nodes working properly so that they can actually make commercially sensible chips on something that isn't just another 14nm tweak, then they should be able to bounce back, but that's still a big question mark. Looking at how bad their 10nm process still is years after it was supposed to be up and running, I hope they can at least get their act together with the next node (7nm?).

Bofast
Feb 21, 2011

Grimey Drawer
They were not even Bulldozer-derived, but Jaguar-ish small low-power cores on the level of Intel's Atom CPUs.

Bofast
Feb 21, 2011

Grimey Drawer

WhyteRyce posted:

Excuse me but I need to transcode my Plex library to x265 while I’m playing games

You joke, but... https://twitter.com/marcan42/status/1344289933108809730
(apparently it didn't work too well without manually sending them to separate NUMA nodes)

Bofast
Feb 21, 2011

Grimey Drawer

Inept posted:

The heatsink slayer has arrived





Peaking around 290W and 104°C for AVX-512?!? Have there been any retail desktop CPUs even close to that kind of peak power? Maybe that ridiculous AMD FX-9590 did it, but that's still more power draw than my GPU :ohdear:

At least gaming and other normal workloads require a lot less power than that, but the results are still not impressive in either performance or power draw.

Bofast
Feb 21, 2011

Grimey Drawer

gradenko_2000 posted:

An FX-9590 could draw up to 370w when OC'd to 4.6Ghz all-core (at stock settings it could draw about 230w)

Well, yeah. I should have been clearer that I was talking specifically about stock settings.

Bofast
Feb 21, 2011

Grimey Drawer

gradenko_2000 posted:

are they even halting 10th gen production?

cuz like, 9th gen already has a discontinuation notice that went out in December of 2020, but the notice is only to announce that the last order date is for June 2021, and the final parts shipments to go out in December 2021

so there's presumably still going to be a lot of 10700s and 10850s floating around among the Rocket Lakes

I guess that would depend on what contracts they have with big customers to guarantee supply/replacement parts for specific SKUs, but it will probably take a while before production is entirely phased out.

Bofast
Feb 21, 2011

Grimey Drawer

MeruFM posted:

I have a dumb question, if chiplets are so important, why are gpu chips still a single chip when they’re so much bigger than CPUs?

There is talk that GPUs may be moving in this direction as well, though I guess only Nvidia and AMD can really answer that for certain.
They have also been sufficiently different from CPUs in terms of clock speeds, memory buses, APIs and so on that it's possible the benefits just haven't been worth it yet.

Bofast
Feb 21, 2011

Grimey Drawer
Latencies are generally a somewhat smaller issue for GPUs, as I recall, since they can hide them better thanks to having less serial work?

Bofast
Feb 21, 2011

Grimey Drawer

priznat posted:

One of our main filter questions for new grads/coops is "what is a linked list" and I swear only 30% of them get it.

I took one course in basic Ada and C++ programming before deciding that particular uni program wasn't my thing, and even I know what a linked list is. How could so many people who actually graduated from a similar program not know? :psyduck:

Bofast
Feb 21, 2011

Grimey Drawer
If anything, I feel like the 11900K should be the real waste of sand. Fewer cores than the 10900K, and its performance should only slightly exceed the 11700K's, but at a noticeable ($140-ish?) price difference.

Bofast
Feb 21, 2011

Grimey Drawer
At least the 11600K isn't bad at its price, though Hardware Unboxed is still recommending the 10600K for gaming because of its lower price.
Since there isn't anything new below the 5600X on AMD's side, Intel might still be able to provide good value in the budget segment.

The 11900K is just sad, especially if the instability issues that Anand reported keep plaguing it.

Fabulousity posted:

Intel CPU and Platform Chat: Sorry our performance is in another process node
:golfclap:
New thread title?

Bofast
Feb 21, 2011

Grimey Drawer

Fantastic Foreskin posted:

As someone who only has man-on-the-street level knowledge of chip fab, can someone explain to me what exactly it means for a node/process to fail, and how one does it for 5 years straight?

Many of Intel's issues seem to come from how they approached smaller manufacturing nodes without waiting for EUV tools, likely in an attempt to maintain their lead in process nodes over other companies. They went for slightly different materials and methods than other foundries to push how small they could make things, and those choices ended up not working out well in practice. It should be mentioned that they previously had long-running issues getting yields (useful chips per wafer) on 14nm up to the levels of their 22nm process.

Their 10nm yields were so awful that for quite a while they did not manage to make commercially viable chips on it. If I recall correctly, the first :airquote: commercially available :airquote: 10nm chip they had was some dual-core chip with the iGPU disabled, released in a China-only Lenovo laptop SKU. Basically, it was sent out to die just so they could tell shareholders they were shipping 10nm.

Even when things improved to the point where they could make some chips on 10nm, they did not seem able to clock them high enough to compete with their own 14nm process. They might be good enough for some laptop uses, for example, but not for desktop. Even their server CPUs scheduled for May this year seem like they will be pointless for many customers, because the performance-per-watt figures will sometimes be worse than the existing parts', and that's before counting the competition from Epyc.

It does sound like Intel has recently managed to simplify its manufacturing processes a bit, now that EUV tech is so readily available compared to the more complex (and error-prone) workarounds they were trying earlier, so they might finally get back into the swing of things with 7nm ready for production around mid/late 2023 or so.

Bofast
Feb 21, 2011

Grimey Drawer

Cygni posted:

I think i remember reading that the early 10nm node that failed had some obscene amount of multi patterning (as in like 7 passes) that basically ensured they were never going to get yields, with the idea being that eventually they would brute force it and make it work and yields would improve but they... never did.

We've talked about it before but it truly is remarkable how big a gently caress up that first 10nm failed node was. That they've been able to string 14nm out this long with out getting obliterated is a testament to how far ahead they were to start with, but that failure is really starting to hit home. Steam hardware survey has Intel hemorrhaging marketshare in gaming (they've lost 4% marketshare in the last 4 months), and Milan seems to have a ton of momentum in server world.

10nm SF really has to work, and 7nm really REALLY has to work, or the only option would be to start turning to TSMC in earnest.

I know Semiaccurate claimed Intel was doing quad patterning, which is tricky enough. If it was more than that :psyduck:

The problem with turning to TSMC would be that there's no capacity available anyway, so that's not going to be a quick fix.

Bofast
Feb 21, 2011

Grimey Drawer

Beef posted:

To be fair to Intel, it wasn't obvious that EUV would reach maturity in time for their 10nm. I was working at imec years back and EUV was a running joke, always a year or two in the future over a decade or so. Intel was a large investor in ASML and EUV tech, but simply chose to do it later rather than sooner.
It was still a mistake and hubris to think that quad patterning and new materials would make that 10nm node viable, but that's in hindsight.

Well, yeah. It was just a gamble that didn't pay off. It happens.

Bofast
Feb 21, 2011

Grimey Drawer
I would mostly be wondering about the power supply. Many (not all) GTX 1650 models run without an additional PCI-e power connector because they only need the power that the motherboard slot can provide, so it's hard to say what kind of power supply HP put in that thing and what it could handle in terms of future GPU upgrades. The CPU should be fine, though.

Bofast
Feb 21, 2011

Grimey Drawer

Ok Comboomer posted:

not sure but it’s supposed to be rated for 400W and there’s at least a 500W version available from HP. When this chassis is/was available on HP’s BTO site it could be optioned with a 3060 or 3070 so I’m assuming that at minimum something like a midrange rtx card would work in the future

Yeah, I mostly just wish there was more detail than "400 Watt".
I don't think it's a major concern, it's just the one thing that I would personally want to double check.

NewFatMike posted:

You know drat well that's a statement about use case, not components. Nobody's doing spreadsheets on a PlayStation.

Somewhere in the world, the one person running spreadsheets in Linux on a PS3 just felt a shiver down their spine :D

Bofast
Feb 21, 2011

Grimey Drawer

Cygni posted:

Also saw that little note that Sapphire Rapids is delayed to Q2 2022 from Q3/Q4 of this year. So if we follow the same schedule as Ice Lake SP, it will come out approximately in 2025.

Even Charlie at Semiaccurate seems to think Q3 2022 is likely going to be the real availability date, and he would be shouting from the rooftops if it were going to slip any further than that.

Bofast
Feb 21, 2011

Grimey Drawer

Cygni posted:

i was goofin at intel's expense, a joke if you will :(

My bad, friend :(

Bofast
Feb 21, 2011

Grimey Drawer

Paul MaudDib posted:

There were also some instances of games flipping a poo poo with Threadripper because they just couldn't comprehend the idea of a processor with 32 cores and giving you nonsensical "this game needs at least 2 cores!" messages or whatever, but that wasn't Windows' fault.

There are also games like Dirt Rally that still crash if you try to run them on CPUs with 9+ logical cores, which is probably a more likely issue than a game failing to handle 32 cores. :D

Bofast
Feb 21, 2011

Grimey Drawer

Ika posted:

Sounds like:

unsigned __int8 threadIDs[8]; // There will never be more than 8 cores in a system so we can use a static buffer.

Quite possibly. A bit like the settings autodetect in GTA IV panicking if there's more than 2 GB of VRAM and hardlocking the game to 640x480 minimum settings unless you start it with some extra parameter to disable the check.
Dirt Rally actually has separate XML config files for different core counts, so I don't know if it just can't find a suitable one for that many threads or what is going on. Worked fine on my older 4C/4T R3 1300X, though.

Bofast
Feb 21, 2011

Grimey Drawer

Beef posted:

It's dumb how good that branding works. There are constantly people saying that their laptop is still good because it's an i7 or i9 without having a clue that there are generations.

My favorite was a Steam user review I saw a few days ago of a somewhat recent Spider-Man game, where someone complained about performance. The included hardware list claimed their PC specs were way above the recommended ones, and then listed "Intel Core i7 @ 2.60GHz, 16,0GB of RAM, NVIDIA GeForce RTX 2060, SSD 480GB."

The Wikipedia page on i7 CPUs gives 17 results (16 if you ignore the embedded 1255UL, where it's the E-core frequency) when searching for processors with a base clock of 2.6 GHz, mostly mobile or embedded CPUs ranging from the i7-3720QM released in 2012 to the i7-13650HX released in 2023.
Given the historical use of @ to sometimes indicate an overclocked/underclocked frequency, it could also have a completely different base clock, with the user just running it at that speed manually.

The RAM/SSD type or speed not being listed also doesn't help, so I can't even tell for sure if they are on an old desktop or a somewhat more recent laptop by that post :psyduck:

Bofast
Feb 21, 2011

Grimey Drawer
Someone should try it with some old VIA CPU and see what happens

Bofast
Feb 21, 2011

Grimey Drawer

Worf posted:

theyre gonna have u subscribe to your CPU monthly, trust me

Not a subscription, but they essentially did try to sell CPU DLC to consumers 13 years ago (https://en.m.wikipedia.org/wiki/Intel_Upgrade_Service?s=31) and appear to have a similar thing going on in their Xeon line now (https://www.tomshardware.com/news/intel-officially-introduces-pay-as-you-go-chip-licensing)


canyoneer posted:

Every time I hear a news article talking about huge investments in AI and how Nvidia is the world leader supplying silicon for that I think about Saffron Technologies.

Intel bought it in 2015 and had it run sorta autonomously with ~100 employees without smothering it to death in the way that giant corporations who buy small firms with bleeding edge IP usually do. It was profitable and cash-flowing as a traditional software product and SaaS models. It also drove local high performance computing demand, because when customers bought the software it took a lot of horsepower to run the product and Xeons were the right answer for it. There were also some great synergies in applications developed for and sold to external customers in manufacturing that could be reused internally in the fab process.

Sounds great, right? This is where the good decisions end.

3 years later they folded it into another internal AI group who immediately killed it and fired everyone because it was Not Invented Here. One of the axe-men said that they didn't see it turning into a $10B market in the next 5 years.

It's the Intel M&A pattern of buying into a nascent technology really early, getting impatient and divesting, then trying to buy back in too late after it already went big. If they didn't have an empty suit in the driver's seat and kept it going, Saffron would probably be a double-digit percentage of net income these days or at the very least could be sold for a tidy profit

There's probably a mountain of companies that have been killed off by similar pettiness over the years. Such a waste. :(

Bofast
Feb 21, 2011

Grimey Drawer
A 7-segment display seems like it would be an easy way to differentiate, given how relatively inexpensive they are and how few (if any?) low-to-midrange boards have one these days.
