WhyteRyce
Dec 30, 2001

It's more alluring now because they aren't the leader like they were in the past, there is no clear path to regaining that leadership, and it will cost billions just to try and hold on to second or third place. And since the board brought on bean counter Bob to run the show, they don't seem particularly bold either

VorpalFish
Mar 22, 2007
reasonably awesome™

Tbh I bet they're pretty happy to have their own fabs right now even if they're not as good as tsmc. Something to be said for not having to fight with every other chipmaker on the planet for production capacity.

Kazinsal
Dec 13, 2011
I'm going to be running this 8700K + 1080Ti until the bombs fall and/or supercovid-23 ravages the Earth, I can feel it already.

My "cheap upgrade" path would I guess be dropping a 9900K in and hoping the Asus Prime Z370-A I have has enough VRM-y goodness for an all-core 5.0 GHz

Kazinsal fucked around with this message at 03:55 on Dec 30, 2020

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



WhyteRyce posted:

It's more alluring now because they aren't the leader like they were in the past, there is no clear path to regaining that leadership, and it will cost billions just to try and hold on to second or third place. And since the board brought on bean counter Bob to run the show, they don't seem particularly bold either

That's my thought on Intel: they have gobs of money and talent, there's no reason they shouldn't be able to become competitive again, except if that conflicts with what the shareholders and Bob Swan are interested in.

I feel like Intel and Boeing have become case studies for why it's best to still let engineers lead companies whose products and services derive from engineering/science/etc.

Edit:

VorpalFish posted:

Tbh I bet they're pretty happy to have their own fabs right now even if they're not as good as tsmc. Something to be said for not having to fight with every other chipmaker on the planet for production capacity.

It's less about having fab capacity and more about their struggles to get 10nm, and now 7nm, functional and capable of mass production. They've just about hit the limit of what 14nm can do for them, and we've seen what they can provide via 10nm with Ice Lake/Tiger Lake, but they desperately need to solve their 7nm and future issues before others such as Samsung also surpass them.

At this point I don't think Intel is going to catch TSMC in terms of annual node improvements, but they still could...

Canned Sunshine fucked around with this message at 04:00 on Dec 30, 2020

WhyteRyce
Dec 30, 2001

VorpalFish posted:

Tbh I bet they're pretty happy to have their own fabs right now even if they're not as good as tsmc. Something to be said for not having to fight with every other chipmaker on the planet for production capacity.

Yeah, when people say they have a problem with their fabs I act like a smart rear end and ask them to define what that is, because they are continuing to crank out gobs and gobs of wafers and manage to sell them all. It's a fine short-term situation, and if they get 10nm rolling they'll continue to have a great situation even if they aren't the performance leader. Meanwhile AMD has to pretty much pick and choose which markets they'll attack, which ones they'll make a nominal showing in, and which ones they just completely abandon.


SourKraut posted:

That's my thought on Intel: they have gobs of money and talent, there's no reason they shouldn't be able to become competitive again, except if that conflicts with what the shareholders and Bob Swan are interested in.

I feel like Intel and Boeing have become case studies for why it's best to still let engineers lead companies whose products and services derive from engineering/science/etc.

BK was supposed to be that engineering guy who would fix the fabs, but I guess they should specify actual engineers and not "engineers" who climb the ranks as managers or via politics

I have issues with what Bob has done, but I can't really argue with it; he's mostly doing what you'd expect a finance guy in that position to do, and doing it OK. It's not his fault BK and Murthy hosed over the pipeline of engineers to leadership positions, and he doesn't really have anyone decent he can lean on

WhyteRyce fucked around with this message at 04:06 on Dec 30, 2020

Not Wolverine
Jul 1, 2007

Cygni posted:

post through the wrongness, never admit you might be wrong. never stop, never surrender: the poster's creed.
From hell’s cubicle I stab at thee; for hate’s sake I spit my last breath at thee. Sink all cores and all benchmarks to one common post. Touche Goonesir, I tip my Fedora at thee.

I explicitly stated the 11700K would only be better at gaming, an advantage I think will diminish. In my opinion, it's a bad value specifically because comparing it to a Ryzen CPU with twice as many cores implies a need/desire for more cores, which ironically doesn't actually benefit gaming a whole lot, yet. Even more ironically, I think game engines will be able to take advantage of more cores in the future, unless MS, Sony, and Nintendo are just too high on crack to pick a decent CPU for their respective gaming consoles. Regardless, I think this horse is pretty well beaten.

VorpalFish
Mar 22, 2007
reasonably awesome™

Not Wolverine posted:

From hell’s cubicle I stab at thee; for hate’s sake I spit my last breath at thee. Sink all cores and all benchmarks to one common post. Touche Goonesir, I tip my Fedora at thee.

I explicitly stated the 11700K would only be better at gaming, an advantage I think will diminish. In my opinion, it's a bad value specifically because comparing it to a Ryzen CPU with twice as many cores implies a need/desire for more cores, which ironically doesn't actually benefit gaming a whole lot, yet. Even more ironically, I think game engines will be able to take advantage of more cores in the future, unless MS, Sony, and Nintendo are just too high on crack to pick a decent CPU for their respective gaming consoles. Regardless, I think this horse is pretty well beaten.

I... what? Sony and MS have already picked their console CPUs - they're 8c16t Zen 2, or they can be 8c8t with slightly higher clocks. Literally the same core count as last generation, just with vastly better per-core performance. Making games scale with core count is hard, and even when they do, they're still mostly performance-limited by 1 or 2 main threads and thus need very high per-core performance. I wouldn't recommend going lower than 6c12t, but the reality is modern 4c8t CPUs with high IPC and clocks still perform very well in the vast majority of games - faster than some 8c16t CPUs on slower uarchs. Look at benches of the 3300X vs the 2700X in games.

It's really weird how you invoke the word "value" while you continue to bring up the 5950X. If you're primarily a gamer, the 5950X is loving awful value. It costs $800 and does nothing for games that a $300 5600X doesn't do. You can't compare the 11700K to the 5950X and say it's bad value without knowing what it's going to cost, and I promise you it will be under $800. If Intel undercuts 5800X pricing, it will be good value for high-end gaming builds.

16c32t has its place - if you do media creation, work with a lot of VMs, do a lot of compiling - if you have a hobby or a job that benefits from the parallelism. And if you want to game on that machine as well, of course it will perform; it's still a high-IPC uarch that clocks well. But buying a 5950X just to game on is literally dumping money down the drain relative to the lower core count options.
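
To put rough numbers on the "limited by 1 or 2 main threads" point, here's a toy Amdahl's law sketch in Python (the 40% parallel fraction is invented for illustration, not a measured figure for any real engine):

```python
# Toy Amdahl's law calculation: if only a fraction p of a frame's work
# parallelizes, extra cores stop helping very quickly.
def amdahl_speedup(p: float, n: int) -> float:
    """Ideal speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 6, 8, 16, 32):
    print(f"{cores:2d} cores: {amdahl_speedup(0.4, cores):.2f}x")
# With a hypothetical p = 0.4, even 32 cores only get you ~1.63x over
# one core, which is why per-core performance still dominates in games.
```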

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord

gradenko_2000 posted:

There's this rumor about a hedge fund that's trying to push Intel to spin off its fabs, and I found it really annoying to read because it's just going to be another example of vulture financialization that's going to ruin a company if it succeeds.

lol no

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
"Hey Intel, you should totally do this really stupid thing that will make me and my firm a lot of money and totally cripple you in the long run. But I won't care because I'll own my own island."

Not Wolverine
Jul 1, 2007
I like the idea of another fab, but how can someone take a look at the resounding failure of GloFo and think "yes, this is a good idea"? I think Intel can catch up to TSMC's and Samsung's processes eventually, but even 14nm+++++++ ad infinitum will eventually reach a limit. Is 10nm still behind 7nm, or is Intel's 10nm as good as TSMC 7nm? Performance-wise, I think it could be similar, but I thought Intel's 10nm yields were still pretty bad.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
All I know is that if Raja somehow squirms his way into CEO-ship, he'd fall for stupid ideas like this in half a second if they massaged his ego even the slightest little bit.

WhyteRyce
Dec 30, 2001

Intel should probably start spending those gobs of money on employee compensation instead of deliberately not paying top market prices and scratching their heads over why so many people are leaving them to go to FAANG

Or you can do even more stock buybacks instead I guess

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

WhyteRyce posted:

Intel should probably start spending those gobs of money on employee compensation instead of deliberately not paying top market prices and scratching their heads over why so many people are leaving them to go to FAANG

Or you can do even more stock buybacks instead I guess

Semiconductor companies seem to pay a lot shittier than FAANG-type jobs; NVidia seems to be the one bucking that trend, but overall I haven't heard good things about any of the bigger semi companies, compensation-wise.

I would tell new grads now for god’s sake go into software not hardware.

VorpalFish
Mar 22, 2007
reasonably awesome™

Not Wolverine posted:

I like the idea of another fab, but how can someone take a look at the resounding failure of GloFo and think "yes, this is a good idea"? I think Intel can catch up to TSMC's and Samsung's processes eventually, but even 14nm+++++++ ad infinitum will eventually reach a limit. Is 10nm still behind 7nm, or is Intel's 10nm as good as TSMC 7nm? Performance-wise, I think it could be similar, but I thought Intel's 10nm yields were still pretty bad.

It's hard to say, but it seems like with the latest SuperFin variant Intel's 10nm is probably at parity with, or maybe even a little better than, TSMC 7 in terms of density and efficiency - but unfortunately for them, TSMC is shipping 5nm now.

Also, yields are maybe still a trashfire - who knows, but probably.

shrike82
Jun 11, 2005

Intel's been a backwater for engineering talent for a while now. Even a decade ago when I was in school, I remember the semicon companies pulling in the middling graduates. The comp gap's only ballooned since then.

WhyteRyce
Dec 30, 2001

priznat posted:

Semiconductor companies seem to pay a lot shittier than FAANG-type jobs; NVidia seems to be the one bucking that trend, but overall I haven't heard good things about any of the bigger semi companies, compensation-wise.

I would tell new grads now for god’s sake go into software not hardware.

Intel engineers are getting poached by FAANG and Microsoft. I don't know AMD's compensation levels, but Intel engineers are still getting compensation bumps moving there because of role changes and promos. So even if the hardware guys are getting less pay, Intel still isn't at the top end of that market. And they admittedly don't even try to be at the top.

So they've got a hard challenge ahead of them and have been bleeding talent and experience. It won't be as simple as leadership letting the engineers do what they do best, because a bunch of those people are gone. They are going to have to open their checkbooks for more than just splashy VPs and acquisitions

WhyteRyce fucked around with this message at 07:14 on Dec 31, 2020

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
The new consoles use eight-core CPUs, but between not all the cores being used for gaming (IIRC at least one is reserved for the OS and other miscellany) and the relatively modest clocks on them, you can probably keep up with a good six-core, and a "true" eight-core would almost assuredly be enough.

And even if gaming requirements escalated enough for a more-than-eight-core CPU to be desirable, by the time that happens (because devs still haven't really been maximizing console potential yet) we'll already be in 2021 and the DDR5 era, and the Rocket Lake/Zen 3 offerings won't matter.

gradenko_2000 fucked around with this message at 05:08 on Dec 31, 2020

FuturePastNow
May 19, 2014


I want more cores. But I don't feel I'm going to need more than six cores for another four or five years probably. At least not for games.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

FuturePastNow posted:

I want more cores. But I don't feel I'm going to need more than six cores for another four or five years probably. At least not for games.

At some point more cores becomes a liability. AMD has a tool that can shut down cores on their Threadrippers so they can also function as HEDT gaming CPUs because some games will refuse to run if there's like 48+ cores running.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BIG HEADLINE posted:

At some point more cores becomes a liability. AMD has a tool that can shut down cores on their Threadrippers so they can also function as HEDT gaming CPUs because some games will refuse to run if there's like 48+ cores running.

If you dig through the BIOS settings, most motherboards actually have this somewhere. Sometimes it lets you shut off a specific number of cores, sometimes it just lets you turn off all but one core.

I accidentally enabled that on a workstation I was using as a server and was very puzzled until I realized what might have happened. "Multiprocessing", hmm, nope, I just have a single processor! :downsgun:
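
For what it's worth, you can approximate the same trick in software without a reboot by shrinking a process's CPU affinity mask. A minimal sketch with psutil (the executable name and the 16-CPU cap are made-up examples):

```python
# Sketch: mimic "turning cores off" for one picky game by restricting its
# CPU affinity (psutil supports this on Windows and Linux). This only limits
# where its threads get scheduled; a game that crashes merely enumerating
# CPUs may still need the BIOS route.
import psutil

def restrict_affinity(exe_name: str, max_cpus: int = 16) -> None:
    """Pin every process matching exe_name to the first max_cpus logical CPUs."""
    allowed = list(range(max_cpus))
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == exe_name:
            proc.cpu_affinity(allowed)

restrict_affinity("picky_old_game.exe")  # hypothetical executable name
```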

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

BIG HEADLINE posted:

At some point more cores becomes a liability. AMD has a tool that can shut down cores on their Threadrippers so they can also function as HEDT gaming CPUs because some games will refuse to run if there's like 48+ cores running.

just gotta use the extra cores for software rendering :science:

WhyteRyce
Dec 30, 2001

Excuse me but I need to transcode my Plex library to x265 while I’m playing games

Bofast
Feb 21, 2011

Grimey Drawer

WhyteRyce posted:

Excuse me but I need to transcode my Plex library to x265 while I’m playing games

You joke, but... https://twitter.com/marcan42/status/1344289933108809730
(apparently it didn't work too well without manually sending them to separate NUMA nodes)
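
For the curious, the manual NUMA pinning amounts to something like this Linux-only sketch (the CPU ranges and PID are hypothetical - check lscpu for the real topology - and this only pins CPUs; binding memory too needs numactl/libnuma):

```python
# Keep the game and the x265 encode on separate NUMA nodes so they don't
# fight over the same caches and memory controller.
import os
import subprocess

NODE0_CPUS = set(range(0, 16))   # assumed: node 0 = logical CPUs 0-15
NODE1_CPUS = set(range(16, 32))  # assumed: node 1 = logical CPUs 16-31

game_pid = 12345  # hypothetical PID of the already-running game
os.sched_setaffinity(game_pid, NODE0_CPUS)  # game stays on node 0

encode = subprocess.Popen(
    ["ffmpeg", "-i", "in.mkv", "-c:v", "libx265", "out.mkv"]
)
os.sched_setaffinity(encode.pid, NODE1_CPUS)  # encoder lives on node 1
```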

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!
Apple’s gonna drop those rumored 32 and 64 core high end M-series chips this year and all you corecount queens are gonna be tripping over yourselves to talk about how cores never mattered.

BlankSystemDaemon
Mar 13, 2009



It's dumb to talk about whether core counts matter or not; it's all a question of what workloads we're talking about.
For anything that can be parallelized, scale-out is always preferable, because we hit the limits of Dennard scaling a decade ago and the only way we're going to get faster processors is by adding more cores.
If something is thought to be non-parallelizable (like video encoding and decoding once were), then someone will work on making it parallelizable.
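
The scale-out pattern in question is the bog-standard "split the work, fan it out" shape. A minimal Python sketch, where the squared-sum work is just a stand-in for any independent per-chunk job (chunked video encoding included, in spirit):

```python
# Embarrassingly parallel scale-out: cut the input into independent chunks
# and process them concurrently, one worker per core by default.
from multiprocessing import Pool

def crunch(chunk: range) -> int:
    return sum(i * i for i in chunk)  # stand-in for real per-chunk work

if __name__ == "__main__":
    chunks = [range(i, i + 1_000_000) for i in range(0, 8_000_000, 1_000_000)]
    with Pool() as pool:
        total = sum(pool.map(crunch, chunks))
    print(total)
```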

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ok Comboomer posted:

Apple’s gonna drop those rumored 32 and 64 core high end M-series chips this year and all you corecount queens are gonna be tripping over yourselves to talk about how cores never mattered.

In terms of games? Yeah, >8c really doesn't matter, and isn't likely to matter for years. Silly-wide ARM chips are interesting for other reasons, though--mostly the same reasons that wide-but-slow-thread chips have always been interesting--so none of that calculus is likely to change on whether or not moar cores is interesting/useful for a given application/person.

I mean, poo poo, I could give a 64c TR to my mother, whose entire use case centers on Facebook and Skype, but somehow I don't think she'd be suitably impressed by it vs her current system.

LRADIKAL
Jun 10, 2001

Fun Shoe
I don't think the apple chips (yum!) are slouches in single-threaded workloads either!

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

LRADIKAL posted:

I don't think the apple chips (yum!) are slouches in single-threaded workloads either!

They're not, but there are very legitimate questions about how high they'll be able to clock--the M1 is pretty damned good for the laptop power envelope it's designed for (and by semi-extension, servers), but how much of that translates over to desktop power envelopes remains to be seen. 2021 will be an interesting year for CPUs, if nothing else.

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

Ok Comboomer posted:

Apple’s gonna drop those rumored 32 and 64 core high end M-series chips this year and all you corecount queens are gonna be tripping over yourselves to talk about how cores never mattered.

I think you mean tripping over ourselves to get one.

CFox
Nov 9, 2005

DrDork posted:

They're not, but there are very legitimate questions about how high they'll be able to clock--the M1 is pretty damned good for the laptop power envelope it's designed for (and by semi-extension, servers), but how much of that translates over to desktop power envelopes remains to be seen. 2021 will be an interesting year for CPUs, if nothing else.

I mean, in benchmarks isn't it already beating everything except where it's competing against double the cores? No need to jack up the clock speed when they can just slap on more performance cores and win that way.

Perplx
Jun 26, 2004


Best viewed on Orgasma Plasma
Lipstick Apathy
My M1 at 3.2 GHz is faster than my 9900K at 5 GHz, so I'm not worried about clock speed; plus, Apple will have first dibs on the latest TSMC node for the foreseeable future.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



CFox posted:

I mean, in benchmarks isn't it already beating everything except where it's competing against double the cores? No need to jack up the clock speed when they can just slap on more performance cores and win that way.

Yeah, I think this is what most are expecting. The M1 can already ramp the performance cores up to 3.2 GHz, I believe, and we see what they're capable of. That's a bump from what the A14 does in the iPhone 12 and iPad Air 4, and the M1 was designed for minimal-active-cooling and passive-cooling situations. So I'd expect that Apple could crank the power up some more for systems built around active cooling and see an increase in speed. How much more, we don't know, but like you said, they're already excelling so much across the board that they could simply slap more cores on the M1X or M1 Pro or whatever it'll be.

I'd be shocked if the 16" MBP didn't get something like a 16 or 24 core chip with 8 or 16 high-performance cores and 8 efficiency cores, with the iMac getting similar, while the Mac Pro ends up as some 32 or 64 core behemoth.

(As a reminder, the M1 is just 4x high-performance and 4x efficiency cores.)

Edit: Also, the A14/M1 (and prior generations) are quite a bit wider architectures than both Intel's and AMD's x86 offerings; couple that with the highly efficient and aggressive scheduler in iOS and macOS, and that's why we see such amazing performance.

NewFatMike
Jun 11, 2015

DrDork posted:

I mean, poo poo, I could give a 64c TR to my mother, whose entire use case centers on Facebook and Skype, but somehow I don't think she'd be suitably impressed by it vs her current system.

Might actually get Zoom to run worth a drat :v:

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

CFox posted:

I mean, in benchmarks isn't it already beating everything except where it's competing against double the cores? No need to jack up the clock speed when they can just slap on more performance cores and win that way.

In the mobile realm, yeah--especially since x86 CPUs are designed with SMT in mind, meaning the M1 gets to go full-blast on a single thread on a single core, while Intel/AMD x86 running a single thread on a single core is basically hobbling itself vs its intended 2t-to-1c operation (which doesn't take away from the fact that, for anything that is truly single-thread limited, the M1 will be faster at finishing that specific thread by design).

But move over to desktops and it gets crushed by nicely clocked Zen3 and should get walked on by Intel's 11xxx series if preliminary reports are reflective of what they'll deliver.

Now, it's also entirely fair to point out that the M1 is, in its current iteration, designed for low power and non-OC operation, and so it should get stomped on by chips with no meaningful power limits and OCs cranking things up as high as they'll go. Which is why I think it'll be quite interesting to see what Apple is able to put out with their desktop versions, as until then it's mostly just speculation as to how they'll perform once given all the juice they can handle.

But, yeah, 5nm vs 14nm+++++++ should give Apple a pretty hefty advantage even if they can't crank the clocks much.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Even in the desktop space, I wouldn't say it gets "crushed": https://github.com/tuhdo/tuhdo.github.io/blob/83edd4600642dc7e2a1625e74790ebbf4da67cb0/emacs-tutor/zen3_vs_m1.org

It gets beaten, sure, but the fact that the M1 is even that close to a 5800X at 1/10th the TDP should be absolutely frightening to Intel and AMD, because in those tests the bulk of the workload for the M1 is being done by the 4 high-performance cores with no SMT, against the 5800X's 8 physical cores with 16 logical threads. And the benchmarks where x86 still outperforms the M1 handily have typically benefited from higher core counts.

So again, Apple could probably just double the M1's current big.LITTLE configuration and likely beat the 5800X. And probably stay within a 30W TDP envelope doing it...

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
30-40% differences in a lot of the tests meet my measure for "crushed," but you do you.

You're right, though--it's quite close considering the enormous gap in wattage and cores. Should get real interesting when we can compare a truly high-powered 16+ core M1 derivative against whatever AMD/Intel has out at that point.

Though it'll still be something of an interesting curiosity as long as the M1 is Apple-exclusive and no one else has any comparable ARM systems available for public consumption. It'd be real interesting to see what sort of server systems Apple could put together with them, but so far I haven't heard much about anything in that direction (yet--they'd be insane not to be pursuing it).

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Well, I just view "crushed" as basically similar to the Excavator-core days, where a chip that was designed to compete directly against Intel's best pretty much never did, and always lost handily, even at roughly equivalent (and typically greater) TDPs.

But for a low-TDP, passive-cooling-focused chip to even be as close as it is in multithreaded benchmarks, with a large core/SMT discrepancy, is pretty remarkable, and it's why I don't view it as "crushed" so much as "punching well above the weight one would expect it to punch".

But if you want to go solely off % difference in performance regardless of any other information, then yes, it gets crushed.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Ok Comboomer posted:

Apple’s gonna drop those rumored 32 and 64 core high end M-series chips this year and all you corecount queens are gonna be tripping over yourselves to talk about how cores never mattered.

Nah I'm gonna end up buying an overpriced iMac which is something I was confident would never, ever happen.

Kazinsal
Dec 13, 2011

MaxxBot posted:

Nah I'm gonna end up buying an overpriced iMac which is something I was confident would never, ever happen.

Same. I've genuinely considered grabbing an M1 mini for shits and giggles. I don't really *need* it but it looks incredibly cool. Might need to grab a KVM if I do though.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

MaxxBot posted:

Nah I'm gonna end up buying an overpriced iMac which is something I was confident would never, ever happen.

Same, unless Apple puts it in the mini as some upgraded config. But I’m betting a 27” iMac is going to be the only path to a desktop Big M Apple chip.
