feedmegin
Jul 30, 2008

simmyb posted:

I figure there are many layers of complexity between "this is a core" and "these are individual transistors" that might be interesting, but I don't even know where to start with terminology or what to look for?

The middle layer would be an FPGA and a Verilog compiler. You can pick up small ones pretty cheap these days if you want to play around with that sort of thing. You're working at, like, 'this is an AND gate (or a bunch of them)' rather than at the transistor level, but it's still hardware. Also means your CPU isn't the size of a room :shobon:


BlankSystemDaemon
Mar 13, 2009




gradenko_2000 posted:

seconding this as a great way to build your understanding of computer hardware from the ground up
Sure, except for the fact that, because of the microcode, the x86 ISA design (which is CISC) gets translated into a very RISC-like microarchitectural design that nobody except Intel knows anything about.

feedmegin
Jul 30, 2008

BlankSystemDaemon posted:

Sure, except for the fact that, because of the microcode, the x86 ISA design (which is CISC) gets translated into a very RISC-like microarchitectural design that nobody except Intel knows anything about.

I don't think the guy is asking 'how can I build a high performance x86-compatible CPU' so much as how they work in general.
I mean, I'm assuming he's not the CEO of AMD or something. I work in embedded; I don't give a poo poo how x86 micro-ops work, and an ARM Cortex-M0 doesn't need any of that stuff, but it is helpful to know how it works in any case.

Serotoning
Sep 14, 2010

D&D: HASBARA SQUAD
HANG 'EM HIGH


We're fighting human animals and we act accordingly
Thirding Ben Eater; he's an incredible resource. Another good mention is Branch Education, which has mainly done videos explaining peripheral hardware (SSDs, hard drives, and so on) but has promised an upcoming video covering CPU architecture in their latest video on general computer hardware: https://youtu.be/d86ws7mQYIg

A book recommendation would be But How Do It Know?, which walks the path from gates to simple general-purpose CPUs.

hobbesmaster
Jan 28, 2008

BlankSystemDaemon posted:

Sure, except for the fact that, because of the microcode, the x86 ISA design (which is CISC) gets translated into a very RISC-like microarchitectural design that nobody except Intel knows anything about.

The question sounded more like they want to learn the material that would be in a university “intro to computer organization” (ie Patterson & Hennessy) class to get some of the general basics of what computers do.

..btt
Mar 26, 2008
I like https://nandgame.com/ - it's very simplified, but lets you "discover" some of the concepts for yourself. Heavily inspired by https://www.nand2tetris.org/ I'm sure.
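The core trick both of those teach is that NAND is universal: every other gate, and from there adders and so on, can be built out of it. As a rough illustration only (not code from either site), here's what that looks like in plain C, modelling each gate as a function on 1-bit values:

code:
#include <assert.h>

/* The one primitive: a 1-bit NAND gate. Everything below is derived from it. */
static int nand(int a, int b) { return !(a && b); }

static int not_(int a)        { return nand(a, a); }
static int and_(int a, int b) { return not_(nand(a, b)); }
static int or_(int a, int b)  { return nand(not_(a), not_(b)); }   /* De Morgan */
static int xor_(int a, int b) { return and_(or_(a, b), nand(a, b)); }

/* Half adder: add two 1-bit inputs, producing a sum bit and a carry bit. */
static void half_adder(int a, int b, int *sum, int *carry)
{
    *sum   = xor_(a, b);
    *carry = and_(a, b);
}

int main(void)
{
    int s, c;
    half_adder(1, 1, &s, &c);
    assert(s == 0 && c == 1);   /* 1 + 1 = binary 10 */
    return 0;
}

Chain a couple of half adders into a full adder, chain full adders, and you've got the n-bit adder people keep mentioning, which is roughly the path nandgame walks you down.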

BlankSystemDaemon
Mar 13, 2009




simmyb posted:

I'll post this in one of the CPU threads because it seems like the least wrong thread for it...

I don't know poo poo about CPU architecture, but I recall seeing a few images like the one below recently, and being dragged into some PLC/industrial controls stuff at work lately has got my curiosity going.

I've seen some images like this:


And read an article or two like this:
https://www.techspot.com/article/1821-how-cpus-are-designed-and-built/

Obviously that article series is a reasonably quick read, and I guess what I'm looking for is an approachable explanation of what's going on between that high-level layout and the oodles of transistors and other devices.

I figure there are many layers of complexity between "this is a core" and "these are individual transistors" that might be interesting, but I don't even know where to start with terminology or what to look for?

Am I staring down a path of madness?
If you want the path of madness, watch Tom7's latest video:
https://www.youtube.com/watch?v=Ae9EKCyI1xU

feedmegin posted:

I don't think the guy is asking 'how can I build a high performance x86-compatible CPU' so much as how they work in general.
I mean, I'm assuming he's not the CEO of AMD or something. I work in embedded; I don't give a poo poo how x86 micro-ops work, and an ARM Cortex-M0 doesn't need any of that stuff, but it is helpful to know how it works in any case.
True enough.

hobbesmaster posted:

The question sounded more like they want to learn the material that would be in a university “intro to computer organization” (ie Patterson & Hennessy) class to get some of the general basics of what computers do.
Yeah, and for that, Ben Eater's playlist that's already been linked is basically the best resource.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

feedmegin posted:

The middle layer would be an FPGA and a Verilog compiler. You can pick up small ones pretty cheap these days if you want to play around with that sort of thing. You're working at, like, 'this is an AND gate (or a bunch of them)' rather than at the transistor level, but it's still hardware. Also means your CPU isn't the size of a room :shobon:

Yeah, as an undergraduate in computer engineering in the late-oughts I had an embedded systems class where we used little student FPGA kits to test our code. Verilog was introduced sometime in sophomore year and the final project for senior-level CPU design was to code out an entire 16-bit CPU (minus the memory controller, provided by the professor) and have it run through a test program successfully in the simulator. Mine was only around 600 lines if I recall correctly, but there wasn't nearly as much help from sophisticated IDEs as I was used to in software development so debugging was very laborious.

Eletriarnation fucked around with this message at 16:16 on May 2, 2023

Klyith
Aug 3, 2007

GBS Pledge Week

hobbesmaster posted:

The question sounded more like they want to learn the material that would be in a university “intro to computer organization” (ie Patterson & Hennessy) class to get some of the general basics of what computers do.

Yeah, and that is also potentially a lot more useful than knowing how an 8-bit chip from the 1970s is built with individual transistors. There's a whole lot of stuff between the transistors and how a modern CPU works. Like, if you know how a modern chip works from transistors up, you probably know enough to be a lead architecture engineer or some poo poo.


And particularly if you know some programming / are a programmer, knowing about what pipelines are and how a branch predictor functions is good stuff.

I have a friend who worked on stuff for the Nintendo DS, so a 32-bit ARM CPU that's not fast. One of the things they did was a standard addition to IF for when you needed every bit of speed, called IF UNLIKELY. That made sure the branch predictor wouldn't waste time on something that probably wouldn't happen. Pretty simple, and still very top-level for "how a CPU works", but much more useful than knowing how to build an 8-bit adder from NAND gates.
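For anyone curious what that looks like in practice: the macro names below are made up (the actual DS code isn't shown here), but on GCC/Clang the usual way to express the same hint is __builtin_expect, which tells the compiler which way a branch is expected to go so it can lay out the hot path accordingly. A minimal sketch:

code:
#include <stdio.h>

/* Hypothetical LIKELY/UNLIKELY helpers in the spirit of the "IF UNLIKELY"
 * idiom described above. On GCC/Clang, __builtin_expect feeds the compiler's
 * static branch prediction and code layout. */
#define LIKELY(x)   __builtin_expect(!!(x), 1)
#define UNLIKELY(x) __builtin_expect(!!(x), 0)

int process(int value)
{
    if (UNLIKELY(value < 0)) {
        /* Rare error path: the compiler keeps this off the hot path. */
        fprintf(stderr, "bad value %d\n", value);
        return -1;
    }
    /* Common case: laid out as straight-line code. */
    return value * 2;
}

int main(void)
{
    printf("%d\n", process(21));
    return 0;
}

(C++20 later standardized the same idea as the [[likely]] / [[unlikely]] attributes.)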



simmyb posted:

Am I staring down a path of madness?

absolutely yes, but that doesn't mean you shouldn't do it!

simmyb
Sep 29, 2005

Thank you everyone this is all excellent and gives me a few different places to dip my toe in 🙂

hobbesmaster
Jan 28, 2008

Klyith posted:

Yeah, and that is also potentially a lot more useful than knowing how an 8-bit chip from the 1970s is built with individual transistors. There's a whole lot of stuff between the transistors and how a modern CPU works. Like, if you know how a modern chip works from transistors up, you probably know enough to be a lead architecture engineer or some poo poo.

My EE/CompE program, and I believe all ABET programs, effectively require a semester of semiconductor (i.e. solid-state) physics, semiconductor circuits (diodes, transistors), digital logic (gates, K-maps, whatever), and computer organization.

This hits every part of how a CPU works from sand up and should plant an undergrad firmly in the “valley of despair” on those Dunning-Kruger charts.


quote:

And particularly if you know some programming / are a programmer, knowing about what pipelines are and how a branch predictor functions is good stuff.

I have a friend who worked on stuff for the nintendo DS, so a 32bit arm CPU that's not fast. One of the things they did was a standard addition to IF when you needed every bit of speed, called IF UNLIKELY. That made sure the branch predictor wouldn't waste time on something that probably wouldn't happen. Pretty simple, and still very top-level for "how a CPU works", but much more useful than knowing how to build an 8bit adder from nand gates.

absolutely yes, but that doesn't mean you shouldn't do it!

“There’s a lot of weird poo poo I know nothing about going on under my current layer of abstraction” is actually a very important baseline for systems programmers. Expectations should be set, but it’s a very reasonable line of thinking.

edit: Valley of despair is the Dunning-Kruger one, trough of disillusionment is the hype cycle one.

hobbesmaster fucked around with this message at 18:06 on May 2, 2023

Kibner
Oct 21, 2008

Acguy Supremacy

hobbesmaster posted:

My EE/CompE program, and I believe all ABET programs, effectively require a semester of semiconductor (i.e. solid-state) physics, semiconductor circuits (diodes, transistors), digital logic (gates, K-maps, whatever), and computer organization.

This hits every part of how a CPU works from sand up and should plant an undergrad firmly in the “trough of disillusionment” on those Dunning-Kruger charts.

“There’s a lot of weird poo poo I know nothing about going on under my current layer of abstraction” is actually a very important baseline for systems programmers. Expectations should be set, but it’s a very reasonable line of thinking.

Yeah, I had one or two classes that covered the very, very, very basics of this kind of stuff and it made me lean towards the side of "computers are magic". Like, I get kinda how they work, but the fact that they work at all is astounding to me.

WhyteRyce
Dec 30, 2001

hobbesmaster posted:

My EE/CompE program, and I believe all ABET programs, effectively require a semester of semiconductor (i.e. solid-state) physics, semiconductor circuits (diodes, transistors), digital logic (gates, K-maps, whatever), and computer organization.

This hits every part of how a CPU works from sand up and should plant an undergrad firmly in the “valley of despair” on those Dunning-Kruger charts.

“There’s a lot of weird poo poo I know nothing about going on under my current layer of abstraction” is actually a very important baseline for systems programmers. Expectations should be set, but it’s a very reasonable line of thinking.

edit: Valley of despair is the Dunning-Kruger one, trough of disillusionment is the hype cycle one.

I took two classes in parallel that were for the same thing. One was computer architecture for CS students and one was digital logic for EE/CE students. It was amusing going over the same type of work, but one class was just dragging and dropping muxes and flops onto a page and connecting them with lines, while the other had me doing bullshit transistor-level poo poo.

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!
There are open tools, like https://theopenroadproject.org/, for the whole ASCII-to-silicon pipeline. It's a process not unlike compiling source code, just that it emits geometry at the end.

Professional tools (Synopsys and Cadence) are very expensive. You can get research PDKs from the internet; real PDKs are a trade secret. TSMC has some kind of agreement with the PRC and their PDK is steganographically watermarked, so people are not so eager to share it. OpenROAD includes some PDKs.

Serotoning
Sep 14, 2010

D&D: HASBARA SQUAD
HANG 'EM HIGH


We're fighting human animals and we act accordingly
It's almost surely more of a software issue, but my main impetus for learning this stuff is figuring out how a modern computer can ever hitch doing basic stuff like navigating around the interface and typing stuff into input text boxes lol. Hell, even how stuff like opening new programs can cause visible hitching; just do it in the "background"? None of this is to mention the occasional bouts of instability. Probably easy to go "lol Windows" but what is really happening there? Is Windows/other software somewhere along the stack just grossly mismanaging the resources available to it? Is the humming of electricity coursing through millions of switching traces somehow an inherently volatile process that needs to be carefully managed lest something go horribly wrong, and so firmware needs to be light-footed in its demands, user responsiveness be damned? I just don't understand how something so discrete as clock cycles blasting away on a chip built by very smart people literally from the ground up can ever not be, like, perfect-ish?

I'm probably trying to make inferences through like 5+ too many layers of abstraction but what the gently caress is happening on that motherboard/chip/die/core when everything doesn't go as smoothly as I expect it to?

hobbesmaster
Jan 28, 2008

Pvt. Parts posted:

It's almost surely more of a software issue, but my main impetus for learning this stuff is figuring out how a modern computer can ever hitch doing basic stuff like navigating around the interface and typing stuff into input text boxes lol.

This is one of those things where the more you learn the less sure you are what the answer is.

At the lowest level, on the typical modern x86 processor it is actually impossible to make guarantees about determinism or jitter.

Edit: I shouldn’t have used “lowest level” there lol but still
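If anyone wants to see that jitter for themselves, here's a crude sketch (not a rigorous benchmark): time the exact same busy loop a thousand times on a desktop OS and look at the spread between the fastest and slowest run. Interrupts, SMIs, frequency scaling, cache misses and the scheduler all show up in that spread.

code:
/* Crude jitter demo: identical work every run, yet the elapsed time varies.
 * The volatile accumulator keeps the compiler from deleting the loop. */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

static uint64_t now_ns(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000ull + (uint64_t)ts.tv_nsec;
}

int main(void)
{
    uint64_t min = UINT64_MAX, max = 0;
    volatile uint64_t sink = 0;

    for (int run = 0; run < 1000; run++) {
        uint64_t start = now_ns();
        for (int i = 0; i < 100000; i++)
            sink += (uint64_t)i;        /* same work every single run */
        uint64_t elapsed = now_ns() - start;
        if (elapsed < min) min = elapsed;
        if (elapsed > max) max = elapsed;
    }
    printf("min %llu ns, max %llu ns, spread %llu ns\n",
           (unsigned long long)min, (unsigned long long)max,
           (unsigned long long)(max - min));
    return 0;
}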

hobbesmaster fucked around with this message at 18:54 on May 2, 2023

WhyteRyce
Dec 30, 2001

scores of post silicon validation engineers stake their entire careers on “doesn’t reproduce, people will blame Windows if it does”

mmkay
Oct 21, 2010

It can be a combination of 'your kernel/application is juggling multiple processes with not enough resources (CPU time/cache/RAM/storage)' and magic. If you have your own app you can try throwing it into something like VTune and it will spit out some hotspots it finds, but otherwise you can blame it on the sand wizards, I guess.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Pvt. Parts posted:

It's almost surely more of a software issue, but my main impetus for learning this stuff is figuring out how a modern computer can ever hitch doing basic stuff like navigating around the interface and typing stuff into input text boxes lol. Hell, even how stuff like opening new programs can cause visible hitching; just do it in the "background"? None of this is to mention the occasional bouts of instability. Probably easy to go "lol Windows" but what is really happening there? Is Windows/other software somewhere along the stack just grossly mismanaging the resources available to it? Is the humming of electricity coursing through millions of switching traces somehow an inherently volatile process that needs to be carefully managed lest something go horribly wrong, and so firmware needs to be light-footed in its demands, user responsiveness be damned? I just don't understand how something so discrete as clock cycles blasting away on a chip built by very smart people literally from the ground up can ever not be, like, perfect-ish?

I'm probably trying to make inferences through like 5+ too many layers of abstraction but what the gently caress is happening on that motherboard/chip/die/core when everything doesn't go as smoothly as I expect it to?

Start here: https://randomascii.wordpress.com/2017/07/09/24-core-cpu-and-i-cant-move-my-mouse/

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

hobbesmaster posted:

This is one of those things where the more you learn the less sure you are what the answer is.

At the lowest level, on the typical modern x86 processor it is actually impossible to make guarantees about determinism or jitter.

Edit: I shouldn’t have used “lowest level” there lol but still

WhyteRyce posted:

scores of post silicon validation engineers stake their entire careers on “doesn’t reproduce, people will blame Windows if it does”

Lmao to both of these. One thing that drives me up the wall now is junior engineers just saying the behaviour is “weird” or “strange” like it’s some kind of incomprehensible black magic. However, sometimes stuff is happening due to power fluctuations or clock jitter or any number of other causes, so it may as well be.

The funnest is when you spend weeks chasing some unexplained behaviour only to have it handwaved away by management or even customers/partners who have seen the same unexplainable things without your device in there, or with other similar devices. We coulda saved a lot of time and just stamped it “it’s a mystery!”

Getting it good enough to the point where you won’t have customers yelling at you is the main goal; everything else is just extra effort :haw:

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

That was fascinating, as were the follow-up articles.

Thanks for that. It's definitely appreciated that he's looking into delays and issues that have been introduced to Windows more recently.

HalloKitty fucked around with this message at 19:35 on May 2, 2023

Serotoning
Sep 14, 2010

D&D: HASBARA SQUAD
HANG 'EM HIGH


We're fighting human animals and we act accordingly
Awesome article, saving it for later. Think I encountered it back when but am definitely now much more qualified to make sense of it so thanks for posting it.

priznat posted:

Getting it good enough to the point where you won’t have customers yelling at you is the main goal; everything else is just extra effort :haw:
I think this might be the main thing where like, I expect the tech world to be some bastion of idealism where everything is ruthlessly optimized and reconsidered from the ground up until peak performance is reached, but that's just not how humans operate, even hardware/software nerds and their clients. Pragmatism is king across all walks of life; it running is several orders of magnitude more important than it running "ideally" or even "efficiently", software/hardware being no exception. I guess a very accessible example of this is the keyboard layout most of us are using to type out these posts (except for me, I'm on ortholinear Colemak-DH :smuggo:): it's based on 100+ year old layouts which staggered keys and arranged letters for various mechanical reasons, all of which are of course moot now.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
You can have all the idealistic plans you want but they rarely survive engagement with release cycles, to butcher the famous quote

The best option is make stuff you can iterate on so after something ships you have a framework to build on for the next thing!

Beef
Jul 26, 2004

priznat posted:

You can have all the idealistic plans you want but they rarely survive engagement with release cycles, to butcher the famous quote

The best option is make stuff you can iterate on so after something ships you have a framework to build on for the next thing!

So many chicken bits.

phongn
Oct 21, 2006

Pvt. Parts posted:

I think this might be the main thing where like, I expect the tech world to be some bastion of idealism where everything is ruthlessly optimized and reconsidered from the ground up until peak performance is reached, but that's just not how humans operate, even hardware/software nerds and their clients.

If you want to see a description of the effort it takes to "make this poo poo run properly to the best effort possible," read this article on the old Space Shuttle codebase: https://www.fastcompany.com/28121/they-write-right-stuff

JawnV6
Jul 4, 2004

So hot ...

BlankSystemDaemon posted:

Sure, except for the fact that, because of the microcode, the x86 ISA design (which is CISC) gets translated into a very RISC-like microarchitectural design that nobody except Intel knows anything about.

this post was engineered in a lab to enrage me. making GBS threads on someone's genuine curiosity and desire to learn? condescending and rude? well sure, that's easy though! being completely 100% wrong on the merits of CISC/RISC/microcode??? that's where the effort really shines

fucks sake microcode is not magic. go look at the PRM, 90% of it's right there in the pseudocode.

hobbesmaster posted:

This is one of those things where the more you learn the less sure you are what the answer is.

At the lowest level, on the typical modern x86 processor it is actually impossible to make guarantees about determinism or jitter.

Edit: I shouldn’t have used “lowest level” there lol but still

yea, the jitter on the gen3 eyes is fine? how much lower did u wanna go

hobbesmaster
Jan 28, 2008

JawnV6 posted:

yea, the jitter on the gen3 eyes is fine? how much lower did u wanna go

RTOS definition of jitter. I bet you could figure out the exact sequence of events that would end in a furious “that means we’d have to enter the design safe state every time an SMI occurs!”

Vanagoon
Jan 20, 2008


Best Dead Gay Forums
on the whole Internet!
A book that I always recommend to anyone who wants to know how the sausage is made is Jon Stokes - Inside the Machine. It's by one of the Ars Technica writers who always did CPU specific articles. It's really good and approachable and understandable by a doofus like me.

https://www.amazon.com/Inside-Machine-Introduction-Microprocessors-Architecture/dp/1593276680

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Eletriarnation posted:

Yeah, as an undergraduate in computer engineering in the late-oughts I had an embedded systems class where we used little student FPGA kits to test our code. Verilog was introduced sometime in sophomore year and the final project for senior-level CPU design was to code out an entire 16-bit CPU (minus the memory controller, provided by the professor) and have it run through a test program successfully in the simulator. Mine was only around 600 lines if I recall correctly, but there wasn't nearly as much help from sophisticated IDEs as I was used to in software development so debugging was very laborious.

Surprised the final project wasn't to run a test program successfully in the FPGA board, that's what they did in my school's equivalent CE class. Our test programs were pretty small though, IIRC we used assembly language rather than any kind of HLL.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Pvt. Parts posted:

It's almost surely more of a software issue, but my main impetus for learning this stuff is figuring out how a modern computer can ever hitch doing basic stuff like navigating around the interface and typing stuff into input text boxes lol. Hell, even how stuff like opening new programs can cause visible hitching; just do it in the "background"? None of this is to mention the occasional bouts of instability. Probably easy to go "lol Windows" but what is really happening there? Is Windows/other software somewhere along the stack just grossly mismanaging the resources available to it? Is the humming of electricity coursing through millions of switching traces somehow an inherently volatile process that needs to be carefully managed lest something go horribly wrong, and so firmware needs to be light-footed in its demands, user responsiveness be damned? I just don't understand how something so discrete as clock cycles blasting away on a chip built by very smart people literally from the ground up can ever not be, like, perfect-ish?

I'm probably trying to make inferences through like 5+ too many layers of abstraction but what the gently caress is happening on that motherboard/chip/die/core when everything doesn't go as smoothly as I expect it to?

You've gotten pretty good answers on this (it's almost certainly all software, and there's no need for software to be light-footed etc), but I'd like to point you to this blog post from Dan Luu, which should be very informative even though it's not directly about what you're asking.

https://danluu.com/input-lag/

I'd summarize Dan's post this way: Ancient 8-bit computers from the 1980s have better keypress-to-display latency than anything modern, because back then it took a few instructions and zero processes to detect a keypress and put a character glyph on screen. Modern systems have a shitload of complex software added between the keyboard and the display, so latency suffers even though the machine is goddamned fast compared to an Apple IIe. We are addicted to all the features this complexity buys us, so it's not going away.

One of the consequences of complexity is that every once in a while you get interactions like "process A held a lock too long and therefore process B, which was waiting on that lock, couldn't make forward progress on delivering an event to process C". Locks are a synchronization thing: to prevent bugs in this multiprocess multithreaded world, whenever two things working asynchronously from each other need to access the same resource, the first step is to acquire a "lock". When a thread successfully obtains a lock, it knows it's the only one which has it, meaning it is that thread's turn to do whatever it wants with the resource that lock protects. Once done with the resource, the thread must release the associated lock so someone else can take their turn.

In principle you're supposed to keep critical sections (the code which does stuff with a shared resource while holding a lock) as short and simple as possible, so that your thread never holds a lock for a long time. In practice, programmers often gently caress that up and hold locks longer than they should. There's even a class of fatal bug along these lines, deadlock. Assume thread 1 holds lock A and wants lock B, while thread 2 holds B and wants A. Can these threads make forward progress? Nope, they're hosed, they will sit there forever waiting for each other to release the lock the other thread has.
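To make that last scenario concrete, here's a deliberately broken little C/pthreads sketch (not from Dan's post) of exactly that deadlock: thread 1 grabs lock A and then wants B, thread 2 grabs B and then wants A. Run it and it prints two lines, then hangs forever, which is the whole point.

code:
/* Classic two-lock deadlock, on purpose. The sleep() just widens the window
 * so the hang happens essentially every run instead of occasionally. */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static pthread_mutex_t lock_a = PTHREAD_MUTEX_INITIALIZER;
static pthread_mutex_t lock_b = PTHREAD_MUTEX_INITIALIZER;

static void *thread1(void *arg)
{
    (void)arg;
    pthread_mutex_lock(&lock_a);
    printf("thread 1: holding A, now waiting for B\n");
    sleep(1);
    pthread_mutex_lock(&lock_b);    /* blocks forever: thread 2 holds B */
    pthread_mutex_unlock(&lock_b);
    pthread_mutex_unlock(&lock_a);
    return NULL;
}

static void *thread2(void *arg)
{
    (void)arg;
    pthread_mutex_lock(&lock_b);
    printf("thread 2: holding B, now waiting for A\n");
    sleep(1);
    pthread_mutex_lock(&lock_a);    /* blocks forever: thread 1 holds A */
    pthread_mutex_unlock(&lock_a);
    pthread_mutex_unlock(&lock_b);
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, thread1, NULL);
    pthread_create(&t2, NULL, thread2, NULL);
    pthread_join(t1, NULL);         /* never returns */
    pthread_join(t2, NULL);
    return 0;
}

The standard fix is a fixed lock ordering: if every thread that needs both locks always takes A before B, this particular deadlock can't happen. (Compile with -pthread.)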

Josh Lyman
May 24, 2009


BobHoward posted:

Surprised the final project wasn't to run a test program successfully in the FPGA board, that's what they did in my school's equivalent CE class. Our test programs were pretty small though, IIRC we used assembly language rather than any kind of HLL.
In my ECE undergrad (2006 grad 👴🏻 ), our sophomore digital design lab used VHDL. I recall having some exposure to Verilog but not using it much.

My group did FPGA stuff in our senior embedded systems elective, and for senior design we built a self driving robot, but I think we handled sensing on a laptop in C++ and the robot motor controls were serial commands or something.

I kinda miss that world—I’m an economist now. :lol: But the Intel/AMD/GPU megathreads scratch that itch better than any other place on the internet. :unsmith:

Josh Lyman fucked around with this message at 06:48 on May 3, 2023

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Josh Lyman posted:

In my ECE undergrad (2006 grad 👴🏻 ), our sophomore digital design lab used VHDL. I recall having some exposure to Verilog but not using it much.

I may have worded that in a misleading way - we designed our processors in AHDL (Altera's old proprietary HDL), then tested the CPUs we'd built by running test programs written in assembly.

I didn't learn VHDL or Verilog in school, they both came later on the job. But once you've learned one HDL you know what you need to learn any of them.

Wish you'd gotten to pursue that career path if it's the itch you want to scratch. Though if you're like me you end up being a professional hater of all the tools you have to use to do your job.

Kazinsal
Dec 13, 2011

BobHoward posted:

Wish you'd gotten to pursue that career path if it's the itch you want to scratch. Though if you're like me you end up being a professional hater of all the tools you have to use to do your job.

To be honest, I always wanted to be a low level programmer. So instead of doing it for work, which is a pretty low volume job market, I do datacenter engineering as a job and systems research as a hobby. The best part about doing it as a hobby is I've gotten to meet and hang out with a bunch of actual professional systems researchers at Google/Apple/Microsoft/etc. without actually having to grind leetcode and suckle at the taint of big tech for a rare as hell job that I don't have the credentials or experience for and never will.

feedmegin
Jul 30, 2008

Josh Lyman posted:

In my ECE undergrad (2006 grad 👴🏻 ), our sophomore digital design lab used VHDL. I recall having some exposure to Verilog but not using it much.

My understanding (hardware-adjacent, but software) is that historically Europe used VHDL (sort of Ada-like) and the US used Verilog (sort of C-like), but the whole world in the last few years is converging more and more on (System)Verilog. Like I say, hardware-adjacent, not actually someone who writes that stuff myself, though, so.

feedmegin
Jul 30, 2008

Pvt. Parts posted:

I think this might be the main thing where like, I expect the tech world to be some bastion of idealism where everything is ruthlessly optimized and reconsidered from the ground up until peak performance is reached

Mate we have desktop applications now which ship an entire web browser within themselves to present a UI to the user ( https://en.wikipedia.org/wiki/Electron_(software_framework) ) that would have been done in the past with like 100k of Win32 API or w/e code, just because it's easier to find people who know how to write web stuff than desktop stuff. However much you improve the hardware, software guys will find ways to fill it with more layers of useless crud.

The only place where what you're talking about is remotely true is deep embedded, and that's because saving 1k of SRAM per device can mean saving a cent per unit, which it turns out mounts up when you're making a million microwaves or whatever.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

BobHoward posted:

Wish you'd gotten to pursue that career path if it's the itch you want to scratch. Though if you're like me you end up being a professional hater of all the tools you have to use to do your job.

I’m pretty sure it is because software developers hate hardware developers and want to gently caress with them

phongn
Oct 21, 2006

Vanagoon posted:

A book that I always recommend to anyone who wants to know how the sausage is made is Jon Stokes - Inside the Machine. It's by one of the Ars Technica writers who always did CPU specific articles. It's really good and approachable and understandable by a doofus like me.

https://www.amazon.com/Inside-Machine-Introduction-Microprocessors-Architecture/dp/1593276680

That’s a great book for beginners and I wish there was an updated version (maybe discussing how modern GPUs work too). It sure is a lot more approachable than Hennessy and Patterson’s two textbooks.

Stokes is off doing other things these days, alas.

Phobeste
Apr 9, 2006

never, like, count out Touchdown Tom, man

Pvt. Parts posted:


I think this might be the main thing where like, I expect the tech world to be some bastion of idealism where everything is ruthlessly optimized and reconsidered from the ground up until peak performance is reached, but that's just not how humans operate, even hardware/software nerds and their clients.

it's true that this isn't how people operate and people are lazy but it's also true that you need to consider the constraints and goals of those people, which is: ship stuff so people can use it. you can consider this in capitalism terms (this is how you make money) but you don't have to, in general you'd want to deliver something when it improves life, right, which is maybe before it's perfect.

so given that, this

feedmegin posted:

Mate we have desktop applications now which ship an entire web browser within themselves to present a UI to the user ( https://en.wikipedia.org/wiki/Electron_(software_framework) ) that would have been done in the past with like 100k of Win32 API or w/e code, just because it's easier to find people who know how to write web stuff than desktop stuff.

is a consequence of wanting to deliver stuff faster and with broader applicability and that requires less work (from you, as opposed to from the rest of the field) to maintain, because
- like feedmegin says it's easier to find those people, and find lots of those people, and
- because web stuff took off more recently, it's easier to work with and make nice stuff in, and
- there's a whole ecosystem of supporting software here that there isn't in write your own win32 code land (and this was only very rarely the actual preferred path compared to qt, wxwidgets, java stuff, flash stuff, whatever) and
- the downsides are limited by how fast computers are now

so, electron is good enough while being faster and easier to make nice stuff on, so why wouldn't you? usually, you wouldn't use electron because
- you're an auteur or hobby developer where not using electron is the entire point
- you have all this stuff already written in not electron
- you're in a space where you really cannot actually stand the performance or size downside - embedded, games, hpc
- the supporting software you need isn't compatible with it or something

i think at some point somebody will make something that does a similar thing to electron (browser-style render layer with some shim for doing stuff outside browser constraints) that doesn't involve packing in all of chrome. in fact there are some already but it's a social system so they gotta take off before they can take off if you see what i'm saying

Kibner
Oct 21, 2008

Acguy Supremacy
^ a very on point and well-done post. Thanks!


WhyteRyce
Dec 30, 2001

priznat posted:

I’m pretty sure it is because software developers hate hardware developers and want to gently caress with them

One of my previous jobs involved getting very mad at our internal validation tools teams, because those teams would want to rewrite our working tool suites with new ones, prioritizing new software design paradigms, concepts, and whiz-bang features over requirements or the tools actually being able to do their job.

Hey, would you like to use this new config file format we created that bolts extra features onto the JSON format? What? No? What do you mean you want to use off-the-shelf parsers for JSON? We can give you a custom library instead.

Also one time I had one of those software engineers tell me they weren’t going to deliver a fix that was blocking PRQ because they had some Doxygen requirements that took priority.
