silence_kit
Jul 14, 2011

by the sex ghost
If they are just using parts which were mainly developed for commercial use, why do they need to be manufactured in the US?

I am honestly uninformed of the situation w.r.t. Taiwan and China but could see why the situation might be precarious and why the American government might be worried.

In the case of other international suppliers, what is the big deal? Is this just the US gov’t contingency planning for the case where they are at war with the rest of the world?

hobbesmaster
Jan 28, 2008

silence_kit posted:

If they are just using parts which were mainly developed for commercial use, why do they need to be manufactured in the US?

I am honestly uninformed of the situation w.r.t. Taiwan and China but could see why the situation might be precarious and why the American government might be worried.

In the case of other international suppliers, what is the big deal? Is this just the US gov’t contingency planning for the case where they are at war with the rest of the world?

Yes, but also: with things like platform security modules, the Intel Management Engine, and similar, there is a potential "invisible" back door that could be tampered with in an untrusted supply chain.

Kibner
Oct 21, 2008

Acguy Supremacy

silence_kit posted:

In the case of other international suppliers, what is the big deal? Is this just the US gov’t contingency planning for the case where they are at war with the rest of the world?

Yes. The US government already mistrusts electronics produced overseas because of security concerns (backdoors or other foreign-government strategies to compromise devices). For anything that is security-critical, they likely only use US-sourced electronics and employ only natural-born US citizens to minimize the risk of foreign interference.

KYOON GRIFFEY JR
Apr 12, 2010



Runner-up, TRP Sack Race 2021/22
Also, what happens when your relationship with a foreign government changes and you can no longer source defense-critical materials? You don't need to be at war with someone to restrict their access to sensitive components.

silence_kit
Jul 14, 2011

by the sex ghost
Thanks to everyone for your posts on this topic.

Kibner posted:

Yes. The US government already mistrusts electronics produced overseas because of security concerns (backdoors or other foreign-government strategies to compromise devices). For anything that is security-critical, they likely only use US-sourced electronics and employ only natural-born US citizens to minimize the risk of foreign interference.

Are you saying that a computer on a classified network is totally made up of US designed and manufactured electronics? Is that even possible?

Kibner
Oct 21, 2008

Acguy Supremacy

silence_kit posted:

Thanks to everyone for your posts on this topic.

Are you saying that a computer on a classified network is totally made up of US designed and manufactured electronics? Is that even possible?

Maybe not 100%, but government contracts try to ensure there are as few foreign-produced products that could conceivably be used to compromise security as possible.

Yaoi Gagarin
Feb 20, 2014

For something like a resistor or a capacitor it doesn't matter. But a lot of embedded CPUs don't need to be on the newest manufacturing nodes anyway, so there's a decent number of US fabs doing older stuff IIRC. That goes into defense stuff too.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

Kazinsal posted:

Since the "Network and Edge" category is so vague and barely in the black, but every server board made still has a smattering of Intel controllers on it, I wonder if they're taking a bath on Barefoot.

Turns out nobody in white box land wants a programmable switching ASIC if you don't release the driver code, and nobody in black box land who doesn't already have their own custom ASICs cares enough to spend twice as much per ASIC as they would on a Broadcom or Mellanox. Who knew?

About that, https://www.tomshardware.com/news/intel-sunsets-network-switch-biz-kills-risc-v-pathfinder-program

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

I was just coming to post this; that's a real bummer. I think the local Intel office here does a lot of the network switch stuff too, so that would suck.

Them getting into RISC-V was surprising even if it was merely a token effort, but it doesn't surprise me that it's one of the first things they killed.

movax
Aug 30, 2008

silence_kit posted:

Does the US military/government ACTUALLY design custom ASICs for their applications? The only ones I am aware of are the RF chips for the RF front-end electronics in military radios/RADARs, which are comparatively much lower tech than computer chips.

Yes -- all the time, especially for rad-hard stuff that an FPGA just won't be performant enough for. Crypto HW is another major category. I thought at one point the NSA had its own little mini-fab... these things don't need to be cutting-edge deep-EUV though.

"90nm should really be enough for anyone" (it really is a sweet spot IMO for most ICs that aren't cutting-edge processors/logic/etc)

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
NSA did have a mini fab before, yes. Not sure if it was removed from Google Earth / Google Maps way back.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Paul MaudDib posted:

They're genuinely in the reverse position of AMD all those years ago, and coming into a recession just the same.

Intel has one thing going for it now that AMD did not back then: when Intel announces new product, not even good product, just "not bad", their stock does not drop 10%-15% because automated internet-scraping HFT algorithms are so dumb they can't tell the difference between Anne Hathaway and Berkshire Hathaway. Or they used to be that dumb, before the mainstream proliferation of machine learning.

Which is just another way of saying that the stock market is inherently irrational and Intel could go undead before people drop it as a blue chip.

SwissArmyDruid fucked around with this message at 00:38 on Jan 28, 2023

mobby_6kl
Aug 9, 2009

by Fluffdaddy
These are pretty bad results, but not the end of the world as long as they can actually unfuck things within the next generation or two.

There's a review of an Alder Lake-N N100 PC; the guy is a bit awkward but it's a decent overview.

https://www.youtube.com/watch?v=IL00YmcTV0g

There's also one in Chinese with subs:
https://www.youtube.com/watch?v=atFViD1-R7o

Seems like a pretty big boost, especially compared to the previous entry-level models. It does seem like they're all single-channel, but this model is also DDR4, so a fast DDR5 version might be able to offset that disadvantage.
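
For a rough sense of what single-channel costs you, here's a back-of-the-envelope sketch of peak memory bandwidth (the DDR4-3200 and DDR5-4800 speed grades below are just common examples for illustration, not necessarily what these boards actually ship with):

code:
# peak DRAM bandwidth = transfers per second * bytes per transfer
def peak_gb_per_s(mt_per_s, channels=1, bus_width_bits=64):
    return mt_per_s * 1e6 * channels * (bus_width_bits // 8) / 1e9

print(peak_gb_per_s(3200))  # single-channel DDR4-3200 -> 25.6 GB/s
print(peak_gb_per_s(4800))  # single-channel DDR5-4800 -> 38.4 GB/s

So even stuck at a single channel, a DDR5-4800 version would have roughly 50% more peak bandwidth than DDR4-3200.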

icantfindaname
Jul 1, 2008


I haven't been paying attention, Intel entered the consumer desktop GPU market to compete with AMD and Nvidia? Why? That seems like a gigantic investment for a limited market. Are they really betting on GPUs for buttcoin mining and AI stuff being a massive growth market?

Kibner
Oct 21, 2008

Acguy Supremacy

icantfindaname posted:

I haven't been paying attention, Intel entered the consumer desktop GPU market to compete with AMD and Nvidia? Why? That seems like a gigantic investment for a limited market. Are they really betting on GPUs for buttcoin mining and AI stuff being a massive growth market?

They are trying to get into the server compute market, and the desktop GPUs are a bit of a side hustle that could earn money if executed well. Same/similar silicon, different software stack.

mobby_6kl
Aug 9, 2009

by Fluffdaddy

icantfindaname posted:

I haven't been paying attention, Intel entered the consumer desktop GPU market to compete with AMD and Nvidia? Why? That seems like a gigantic investment for a limited market. Are they really betting on GPUs for buttcoin mining and AI stuff being a massive growth market?

It is a massive growth market; datacenter makes up the majority of Nvidia's revenue now, and Intel reasonably wants a part of that.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

icantfindaname posted:

I haven't been paying attention, Intel entered the consumer desktop GPU market to compete with AMD and Nvidia? Why? That seems like a gigantic investment for a limited market. Are they really betting on GPUs for buttcoin mining and AI stuff being a massive growth market?

As several people have mentioned, they're desperate for a datacenter GPU, but I'm still pretty sure the desktop GPUs will be shitcanned sooner rather than later. The driver development costs have to be astronomical, and getting to a point where the cards are actually competitive is going to require years and years of running at a massive loss.

Inept
Jul 8, 2003

mobby_6kl posted:

It is a massive growth market; datacenter makes up the majority of Nvidia's revenue now, and Intel reasonably wants a part of that.

"gaming"

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

TheFluff posted:

As several people have mentioned, they're desperate for a datacenter GPU, but I'm still pretty sure the desktop GPUs will be shitcanned sooner rather than later. The driver development costs have to be astronomical, and getting to a point where the cards are actually competitive is going to require years and years of running at a massive loss.

Intel can't afford to miss the boat again. They hosed up on mobile, they hosed up on wireless chipsets, Optane, and even flash memory. There's no way that Intel can continue to extract fat margins from x86 server chips in the future, because competition from AMD and ARM will prevent Intel from being able to keep high prices.

You know what has really high margins right now? GPUs. Desktop included.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Twerk from Home posted:

Intel can't afford to miss the boat again. They hosed up on mobile, they hosed up on wireless chipsets, Optane, and even flash memory. There's no way that Intel can continue to extract fat margins from x86 server chips in the future, because competition from AMD and ARM will prevent Intel from being able to keep high prices.

You know what has really high margins right now? GPUs. Desktop included.

This is true, but Intel is really good at making really poor decisions, and big companies love short-term thinking.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Paul MaudDib posted:

(except for that tiny national-security tsmc fab that will be years behind even intel let alone TSMC taiwan when it comes online)

Are you talking about the Arizona fab? Because TSMC pivoted to turning it into a leading-edge fab and it's further along than has been publicly indicated.

mmkay
Oct 21, 2010

TheFluff posted:

This is true, but Intel is really good at making really poor decisions, and big companies love short-term thinking.

Also they're godawful at timing: Optane and the GPUs each came to market just as the prices of DDR memory and of other graphics cards, respectively, were falling.

WhyteRyce
Dec 30, 2001

Optane failed for a variety of reasons, everything from business decisions to technical issues to bad luck/timing.

Even if they had managed to stick to the original plan and schedule, and even if it had scaled and yielded the way they hoped, CXL may still have killed it, and Optane on CXL is a much less interesting business proposition for Intel.

Assepoester
Jul 18, 2004
Probation
Can't post for 10 years!
Melman v2

icantfindaname posted:

I haven't been paying attention, Intel entered the consumer desktop GPU market to compete with AMD and Nvidia? Why? That seems like a gigantic investment for a limited market. Are they really betting on GPUs for buttcoin mining and AI stuff being a massive growth market?

It's gonna take them a while to catch up in the consumer GPU market even with their resources

https://i.imgur.com/zXNExip.mp4

https://www.youtube.com/watch?v=j6kde-sXlKg

Arivia
Mar 17, 2011

F.D. Signifier posted:

It's gonna take them a while to catch up in the consumer GPU market even with their resources

https://i.imgur.com/zXNExip.mp4

https://www.youtube.com/watch?v=j6kde-sXlKg

I feel like you didn’t watch that LTT video.

e: like most of that video is Luke and Linus going “yeah stuff was broken but they are obviously working very hard to fix it, even weird poo poo like active cables not working” and ending on a note of “most people seem to be happy with their cards and for good reason” which is the exact opposite of what would prove your point

Arivia fucked around with this message at 00:52 on Jan 31, 2023

Beef
Jul 26, 2004
Ordered an Arc A750, with the A770 being constantly out of stock. They appear to be selling well enough.
Please keep the updates and fixes rolling. I need an affordable option in this hosed up GPU market.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Are the Arc drivers completely from scratch? The UHD/Iris stuff seemed to work fine without any weird issues. Like I could run Cyberpunk slowly, but without crashes or glitches, on an 11th-gen laptop chip.

repiv
Aug 13, 2009

i forgot where they said it, but someone from intel said the arc implementation of directx was built from scratch, not derived from their old iGPU driver

wargames
Mar 16, 2008

official yospos cat censor

mobby_6kl posted:

Are the Arc drivers completely from scratch? The UHD/Iris stuff seemed to work fine without any weird issues. Like I could run Cyberpunk slowly, but without crashes or glitches, on an 11th-gen laptop chip.

pretty sure they were basically from scratch.

njsykora
Jan 23, 2012

Robots confuse squirrels.


TheFluff posted:

As several people have mentioned, they're desperate for a datacenter GPU, but I'm still pretty sure the desktop GPUs will be shitcanned sooner rather than later. The driver development costs have to be astronomical, and getting to a point where the cards are actually competitive is going to require years and years of running at a massive loss.

Also, the availability problems have almost entirely passed; the sole reason you'd buy an Intel GPU right now is if you need a cheap card that can do AV1 encoding. At the time of the initial announcement it felt like Intel was trying to come off as the savior of the GPU industry, when everything was out of stock, but by the time their cards came out there were better AMD cards available for less. Plus there are the driver issues, and everyone expects them to abandon the whole line within a few years.
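
For what it's worth, that AV1 use case is easy to exercise from software. Here's a minimal sketch that drives the Arc hardware encoder through ffmpeg's QSV path, assuming an ffmpeg build with QSV support and the Intel media stack installed (the filenames and bitrate are made-up examples):

code:
# minimal sketch: hardware AV1 encode on an Arc card via ffmpeg's QSV path
# assumes an ffmpeg build with QSV support; filenames/bitrate are examples
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.mkv",    # any source ffmpeg can decode
    "-c:v", "av1_qsv",    # Intel QSV hardware AV1 encoder
    "-b:v", "6M",         # target video bitrate
    "-c:a", "copy",       # pass the audio through untouched
    "output.mkv",
], check=True)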

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
It's difficult to put a lot of faith in Intel committing to these GPUs when they're bleeding so much money, especially given their prior track record. Even if the team at Intel is trying its best in 100% good faith, it doesn't matter if leadership/shareholders axe them. I would argue, however, that Intel should probably try to sell that division off instead of doing layoffs, because it would do more for their share value.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I read the posts about Intel being on the back foot, bleeding money, etc., but what does that really mean? Like, I'm pretty sure they're Too Big To Fail, so what happens if Intel stays on a downward trajectory? Do we get stagnation of products? Reduced availability as they can't produce as much?

mobby_6kl posted:

There's a review of an Alder Lake-N N100 PC; the guy is a bit awkward but it's a decent overview.

https://www.youtube.com/watch?v=IL00YmcTV0g

...

Seems like a pretty big boost, especially compared to the previous entry-level models. It does seem like they're all single-channel, but this model is also DDR4, so a fast DDR5 version might be able to offset that disadvantage.

thanks for this, btw - the benchmark numbers were interesting and I think it might be a worthy replacement for my old 2016 i3

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

gradenko_2000 posted:

I read the posts about Intel being on the back foot, bleeding money, etc., but what does that really mean? Like, I'm pretty sure they're Too Big To Fail, so what happens if Intel stays on a downward trajectory? Do we get stagnation of products? Reduced availability as they can't produce as much?

Realistically we get some type of protective government tariffs or subsidies, and also Intel becomes just one of a bunch of options for both client and server CPUs.

Honestly, the 2nd part of that is already happening. AMD is eating Intel's lunch in the server space, and Ampere Altra or Amazon's own ARM designs are really competitive too. In the client space, phones and tablets sell much more than laptops do, and ARM Chromebooks are a small but growing slice of the market.

Intel's stagnation of products already happened, that's what we call 2015-2021.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
I wouldn't be surprised if they slide into a state similar to Bulldozer-era AMD, selling lovely products and bleeding money for a number of years until they manage to restructure the organization into a less dysfunctional state. Big orgs have a ton of inertia, though, and take forever to turn around. One of the reasons I have so little faith in the desktop GPUs is that when you're under pressure to restructure and stop bleeding money, it's really hard for leadership to defend a long-term project like that, even if it's critically important to the future of the business, because everyone in the org will be trying to save their own rear end, and in doing so they'll attempt to get anything that isn't their own project killed.

redeyes
Sep 14, 2002

by Fluffdaddy
I'm an idiot so I bought an Arc A770 LE. I'll report back when I'm really pissed off.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

I have a theory that Intel killed a bunch of their lines over and over again (Phi, Optane, three or four network companies, etc.) because they weren't as profitable as the Xeon lines, plus other internal politics, but now that they're eating poo poo on Xeon, the GPUs are critical for high-margin growth and might have a longer runway.

Otoh rumors are that third-gen consumer GPU development is cancelled (the A770 being first gen).

mobby_6kl
Aug 9, 2009

by Fluffdaddy
I can see them killing the consumer GPUs if they think the driver situation is completely desperate, but it'd be pretty stupid to bail on a strategically important segment. So yeah, I can see some idiot shareholders pressuring them into that.

kliras
Mar 27, 2021
what are cloud companies likely to do for the switchover to av1 encoding? just roll their own hardware without using nvidia/amd/intel, or what's the move?

redeyes
Sep 14, 2002

by Fluffdaddy
I'm going to cry a lot if they shitcan the drat thing after I bought it.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

kliras posted:

what are cloud companies likely to do for the switchover to av1 encoding? just roll their own hardware without using nvidia/amd/intel, or what's the move?

On-device encoding is becoming all the rage, with the controllers on NVMe devices able to do differing levels of encoding/decoding on the fly and in the background. A lot of custom options exist already; it's pretty neat.
