Jaded Burnout
Jul 10, 2004


Protocol7 posted:

Does that mean knowing how things like two's complement work, or understanding the parts that make up a binary representation of a floating-point number, is useless? Hardly, and I expect the people who bothered to learn that extra knowledge are doing well in their current roles. That said, you may as well learn some of that. Mostly because of this, which puts it much more eloquently than I could:

The specific teaching of it that I received was useless, which to be fair is all I said originally. I bothered to learn it because I had to, and it's made no difference to any job I've had in the last 20 years.

As for the specifics of floating point being useful all over the place, I mean, of the two, sure, but I've worked on two fully-fledged accounting applications that didn't require in-depth knowledge of it, because that's a problem we solved a long time ago. The one and only time even a vague understanding of how floating point numbers are represented in software has been helpful was when I was writing a government's calculator for benefit payments, because there was a lot of division and rounding going on. Still, 15 seconds reading the docs on the language's stdlib for floating point was enough to cover all the bases.

Really the reason I'm being a little bit of a stick in the mud is that "programmers should know [low level thing]" has been trotted out for my entire adult life and it's most often just not true.

Jeb Bush 2012
Apr 4, 2007

A mathematician, like a painter or poet, is a maker of patterns. If his patterns are more permanent than theirs, it is because they are made with ideas.
receiving a broad education that prepares you for a variety of different things you may encounter in the future necessarily means learning a bunch of stuff you won't end up using

Volte
Oct 4, 2004

woosh woosh
What you can and can't do and still get a pay cheque is a different matter from what should be taught in a computer science education. Knowing about twos complement and bit shifting behaviour is not just about applying those specific things in your job all day long, it's also about how you think about the values you're manipulating. Nothing is more important than a good mental model as far as I'm concerned. How many people have used unsigned integers to represent things that, logically, should not be negative, because that's what that means right? And then they subtract two ages and end up with four billion because whoops, their mental relationship with the values they were manipulating was incomplete. It pops up in lots of places you don't realize and you're interacting with it one way or the other. Might as well understand it.
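
A minimal C sketch of that exact trap (variable names and values invented for illustration): subtract a larger unsigned value from a smaller one and the result wraps around modulo 2^32 instead of going negative.

code:

#include <stdio.h>

int main(void) {
    unsigned int age_a = 25;   /* "ages can't be negative, so unsigned!" */
    unsigned int age_b = 30;

    /* unsigned subtraction wraps modulo 2^32 rather than going negative */
    unsigned int diff = age_a - age_b;
    printf("%u\n", diff);      /* prints 4294967291, not -5 */
    return 0;
}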

Jaded Burnout posted:

As for the specifics of floating point being useful all over the place, I mean, of the two, sure, but I've worked on two fully-fledged accounting applications that didn't require in-depth knowledge of it, because that's a problem we solved a long time ago. The one and only time even a vague understanding of how floating point numbers are represented in software has been helpful was when I was writing a government's calculator for benefit payments, because there was a lot of division and rounding going on. Still, 15 seconds reading the docs on the language's stdlib for floating point was enough to cover all the bases.

Really the reason I'm being a little bit of a stick in the mud is that "programmers should know [low level thing]" has been trotted out for my entire adult life and it's most often just not true.

Volte posted:

You should be able to intuitively understand why you shouldn't be using floating point numbers for currency calculations.
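
The standard demonstration fits in a few lines of C (the same holds for IEEE-754 doubles in any language; integer cents are one common fix, not the only one):

code:

#include <stdio.h>

int main(void) {
    /* 0.10 and 0.20 have no exact binary representation */
    double total = 0.1 + 0.2;
    printf("%.17f\n", total);        /* 0.30000000000000004 */
    printf("%d\n", total == 0.3);    /* 0: the "obvious" comparison fails */

    /* one common fix for currency: keep amounts in integer cents */
    long cents = 10 + 20;
    printf("%ld cents\n", cents);    /* exactly 30 */
    return 0;
}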

Jaded Burnout
Jul 10, 2004


Volte posted:

How many people have used unsigned integers to represent things that, logically, should not be negative, because that's what that means right? And then they subtract two ages and end up with four billion because whoops, their mental relationship with the values they were manipulating was incomplete. It pops up in lots of places you don't realize and you're interacting with it one way or the other. Might as well understand it.

None that I know personally, because I've never worked with systems that didn't already account for that.

Volte posted:

Nothing is more important than a good mental model as far as I'm concerned.

And as far as I'm concerned, I can explain that to a junior dev in 20 seconds and they can read Wikipedia if they're curious. Software education in schools is IMO too heavy on the technical details and too light on the reality of working with the tools available to us. Or at least, it was back then.

Edit: Honestly, I think pretty much all secondary education goes too far into the weeds at a point where the students don't have the context to usefully absorb it. But details and quantifiable procedures are easier to write standardised tests for.

Jaded Burnout fucked around with this message at 19:38 on Sep 28, 2020

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe
Logical and arithmetic right shift are really more to do with the difference between signed and unsigned numbers than anything about the signed-integer representation. Either way, bit-shifts — and bitwise operators more generally — are definitely corner-case features. There are two good reasons a general programmer might need to know about them: custom hash functions and manual bit-packing. For custom hash functions, you really just need a good hash-combine function, which you can write once (or better yet, take from some library) and then re-use everywhere. That doesn't work for bit-packing, which is actually a very useful and important trick in most languages — but it's also obnoxiously unreadable, and the abstractions that would fix that create big performance trade-offs in many of the languages where it would matter most.
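
For reference, such a hash-combine is only a few lines; this sketch follows the widely-copied boost::hash_combine recipe and assumes a 64-bit size_t:

code:

#include <stdio.h>
#include <stddef.h>

/* Golden-ratio mixing constant; this is the boost::hash_combine recipe,
   shown as a sketch rather than a recommendation of any specific library. */
static size_t hash_combine(size_t seed, size_t value) {
    return seed ^ (value + 0x9e3779b97f4a7c15ULL + (seed << 6) + (seed >> 2));
}

int main(void) {
    size_t h = 0;
    h = hash_combine(h, 42);     /* fold in the first field's hash */
    h = hash_combine(h, 1337);   /* then the next, and so on */
    printf("%zx\n", h);
    return 0;
}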

Two's complement is important if you want to understand some of the corner cases with signed integers or if you want to look at memory in a low-level debugger. I tend to think that schools should teach these things, but I can understand why some people disagree.
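
Concretely, the debugger case looks like this in C; print an int's bits and you are looking straight at the two's complement pattern:

code:

#include <stdio.h>
#include <limits.h>

int main(void) {
    int x = -2;
    /* what a low-level debugger shows you: the two's complement bit pattern */
    printf("%d = 0x%08x\n", x, (unsigned int)x);     /* -2 = 0xfffffffe */

    /* and the classic corner case: the range is asymmetric, so INT_MIN
       has no positive counterpart and negating it overflows */
    printf("INT_MIN = %d, INT_MAX = %d\n", INT_MIN, INT_MAX);
    return 0;
}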

Volte
Oct 4, 2004

woosh woosh
Are you talking about high school? I thought we were talking about CS education here.

more falafel please
Feb 26, 2005

forums poster

I'm so glad I haven't made the transition into enterprise business-case poo poo, writing node microservices or something, because the idea of working with people who don't know how floats work is terrifying to me

Jose Valasquez
Apr 8, 2005

Volte posted:

Are you talking about high school? I thought we were talking about CS education here.

We're talking about "programmers"

Volte posted:

I don't see how it's pointless to understand how the machine you're programming works. Programmers should have a basic grasp of two's complement, and IEEE-754 while we're at it.

A group in which CS degree holders are probably a minority at this point

Volte
Oct 4, 2004

woosh woosh
I somehow missed the post about being taught EBCDIC in high school and thought the discussion was about being taught it during higher level CS education. I actually have no opinion on it being taught in high school, but it should definitely be taught during CS education. I still say that everyone should have a working understanding and healthy respect for IEEE-754 regardless of their educational background, and twos complement just as a matter of understanding how your computer works. You should also have a basic understanding of virtual memory and caching and a bunch of other stuff too, because even if it never pops up at the top of your work list, understanding how your tools work is never truly unnecessary.

It's funny to me that computer science used to be more about learning computation in the abstract while working programmers focused more on the nitty-gritty details. Now we're in the situation where working programmers focus on programming in the abstract and the nitty-gritty is relegated to academia.

Jose Valasquez
Apr 8, 2005

Volte posted:

I somehow missed the post about being taught EBCDIC in high school and thought the discussion was about being taught it during higher level CS education. I actually have no opinion on it being taught in high school, but it should definitely be taught during CS education. I still say that everyone should have a working understanding and healthy respect for IEEE-754 regardless of their educational background, and twos complement just as a matter of understanding how your computer works. You should also have a basic understanding of virtual memory and caching and a bunch of other stuff too, because even if it never pops up at the top of your work list, understanding how your tools work is never truly unnecessary.

It's funny to me that computer science used to be more about learning computation in the abstract while working programmers focused more on the nitty-gritty details. Now we're in the situation where working programmers focus on programming in the abstract and the nitty-gritty is relegated to academia.

I too think that the things that I know are very important for everyone else to know but I know different things

Volte
Oct 4, 2004

woosh woosh

Jose Valasquez posted:

I too think that the things that I know are very important for everyone else to know but I know different things
The message I'm getting is "It's not important for people to know the things I don't know".

Foxfire_
Nov 8, 2010

Jaded Burnout posted:

In the case of two's complement I'd say that making us learn how to actually do it with pen and paper was not the most valuable use of our time, but then again they thought that Prolog was a good call for the main language we'd learn, so, academia.

Was the degree 'Computer Science' or 'Computer Engineering/Programming'? They aren't the same thing and any worthwhile CS program should include functional/logic programming. Saying it's not practically useful is like a statistician saying "I've never used group theory in my career, so it was a waste of time including it in my generic Mathematics degree". A CS degree is not a prep program for a job as a programmer.

(also two's complement is not a hard thing and shouldn't even take a full lecture. Yeah, you can probably skip it and lots of people will be fine, but it's small enough that skipping it would be weird. It'd be like skipping while loops in a programming class because you can get by with just for loops)


Jaded Burnout posted:

As for the specifics of floating point being useful all over the place, I mean, of the two, sure, but I've worked on two fully-fledged accounting applications that didn't require in-depth knowledge of it, because that's a problem we solved a long time ago. The one and only time even a vague understanding of how floating point numbers are represented in software has been helpful was when I was writing a government's calculator for benefit payments, because there was a lot of division and rounding going on. Still, 15 seconds reading the docs on the language's stdlib for floating point was enough to cover all the bases.

If those programs' usage of floats was anything besides "don't use them for anything besides display" (which is what accounting programs should be doing), they are probably either buggy in corner cases or functioning by happening to stay in a region where the difference between floats and true reals doesn't matter. Doing numerical things correctly with floats just inherently isn't a 15-second kind of thing to learn. It's a thing like threading, with deep dangerous waters, and it's easy to make something that seems fine but is actually subtly wrong.
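
As a C sketch of that "seems fine but subtly wrong" failure mode, with invented numbers: accumulate a penny at a time in a double and the total quietly drifts.

code:

#include <stdio.h>

int main(void) {
    /* add one cent, a million times */
    double total = 0.0;
    for (int i = 0; i < 1000000; i++)
        total += 0.01;

    /* each addition rounds, so the errors accumulate */
    printf("%.10f\n", total);          /* something close to, but not, 10000 */
    printf("%d\n", total == 10000.0);  /* almost certainly 0 */
    return 0;
}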

Foxfire_ fucked around with this message at 20:26 on Sep 28, 2020

Volte
Oct 4, 2004

woosh woosh

Foxfire_ posted:

Was the degree 'Computer Science' or 'Computer Engineering/Programming'? They aren't the same thing and any worthwhile CS program should include functional/logic programming. Saying it's not practically useful is like a statistician saying "I've never used group theory in my career, so it was a waste of time including it in my generic Mathematics degree". A CS degree is not a prep program for a job as a programmer.
They were originally talking about high school, which I missed, and that caused the whole derail, so that's my bad. A lot of the stuff my high school taught me was a complete waste of time as well, like teaching us matrix multiplication without a single iota of intuition or even an explanation of what matrices are even used for. I can sympathize with a lovely high school teacher making stuff feel like a waste of time, but that doesn't mean it's not still valuable knowledge if taught correctly.

Jaded Burnout
Jul 10, 2004


Yes, once you get into tertiary education, that feels like a much more appropriate point to hash out the details.

Arsenic Lupin
Apr 12, 2012

This particularly rapid💨 unintelligible 😖patter💁 isn't generally heard🧏‍♂️, and if it is🤔, it doesn't matter💁.


Volte posted:

You should be able to intuitively understand why you shouldn't be using floating point numbers for currency calculations.
It's funny; I learned all about shifts because they were fast. Most programmers today don't have to care about that sort of speed issue. The thing that absofuckinglutely should NOT be part of whiteboard interviews is asking what order various sorts are. If I want a sort, I pick one from the library. If I'm doing something really esoteric with sorts, I look at the published papers and possibly talk to an actual computer scientist. And I never, ever, ever use a bubble sort, so why would I remember either its order or how to code one?

Pure virtual-dick waving.

more falafel please
Feb 26, 2005

forums poster

Arsenic Lupin posted:

It's funny; I learned all about shifts because they were fast. Most programmers today don't have to care about that sort of speed issue. The thing that absofuckinglutely should NOT be part of whiteboard interviews is asking what order various sorts are. If I want a sort, I pick one from the library. If I'm doing something really esoteric with sorts, I look at the published papers and possibly talk to an actual computer scientist. And I never, ever, ever use a bubble sort, so why would I remember either its order or how to code one?

Pure virtual-dick waving.

I don't expect you to know the orders of various algorithms off the top of your head, but I do expect you to be able to reason about the orders of various algorithms, because if you can't, you're going to do some truly heinous poo poo.
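
The canonical heinous example barely even looks wrong. A C sketch (some compilers can hoist the call in simple cases, but don't count on it):

code:

#include <stdio.h>
#include <string.h>

/* Quadratic by accident: strlen() rescans the whole string on every
   iteration, so an O(n) check runs n times. */
size_t count_spaces_slow(const char *s) {
    size_t count = 0;
    for (size_t i = 0; i < strlen(s); i++)
        if (s[i] == ' ')
            count++;
    return count;
}

/* Same result in O(n): compute the length once. */
size_t count_spaces_fast(const char *s) {
    size_t count = 0;
    size_t n = strlen(s);
    for (size_t i = 0; i < n; i++)
        if (s[i] == ' ')
            count++;
    return count;
}

int main(void) {
    const char *s = "a b c";
    printf("%zu %zu\n", count_spaces_slow(s), count_spaces_fast(s));  /* 2 2 */
    return 0;
}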

Arsenic Lupin
Apr 12, 2012

This particularly rapid💨 unintelligible 😖patter💁 isn't generally heard🧏‍♂️, and if it is🤔, it doesn't matter💁.


more falafel please posted:

I don't expect you to know the orders of various algorithms off the top of your head, but I do expect you to be able to reason about the orders of various algorithms, because if you can't, you're going to do some truly heinous poo poo.

Valid, but what's often actually asked is the direct question of what order a given sort is, occasionally coupled with a demand to code it.

Jose Valasquez
Apr 8, 2005

Volte posted:

The message I'm getting is "It's not important for people to know the things I don't know".

It's not that I don't know two's complement or the problems with floats*; it's just that the things you say are very important for every programmer to know have not been at all important in my career :shrug:

A very specific subset of programmers need to know about those things, but your average programmer popping out CRUD APIs for enterprise apps or wrangling whatever new javascript framework is popular 9-5 is never in their life going to need to know this stuff.

If you want to argue that a Computer Science undergraduate education should include two's complement and IEEE-754, then I don't really have a problem with that, but that's not where this conversation started. It started with

Volte posted:

Programmers should have a basic grasp of two's complement, and IEEE-754 while we're at it.


* I couldn't explain the specifics of IEEE-754 off the top of my head, but I know enough to get the gist of what happens. I learned it at one point.

Vanadium
Jan 8, 2005

... I also learned that ones' and two's complement are a thing from C++ pedantry, but this knowledge has been nearly useless to me because I can never remember which one is which.

Really, there's a lot of stuff I picked up from arguments on the internet and reading ancient C tomes that has somehow never come up at work. I guess except for how my first job used floating point for currency amounts, which, like, I knew was wrong but seemed to work for them, so who was I to get into a fight.

Volte
Oct 4, 2004

woosh woosh

Jose Valasquez posted:

A very specific subset of programmers need to know about those things, but your average programmer popping out CRUD APIs for enterprise apps or wrangling whatever new javascript framework is popular 9-5 is never in their life going to need to know this stuff.
A very specific subset of programmers is going to need to know how numbers are represented, or understand the performance implications of reading non-sequential data from memory? Are we really to a point where literally any software other than enterprise CRUD web apps is considered esoteric academic poo poo that "most programmers" wouldn't need to worry about writing? Someone is writing all the software we use on a daily basis and I don't know about you, but I interact with a lot of stuff that's not CRUD web apps. You're actually the one talking about a very specific subset of programmers: the ones who specialize in writing CRUD apps and frontend Javascript stuff. If we're considering "programmer" to mean "anyone who can create at least one specific type of program in one particular environment" then I guess technically you don't need to know anything to be a programmer other than how to read a tutorial about how to do the one thing you need to do.
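
On the non-sequential-memory point, the textbook demonstration is a sketch like the following C; the arithmetic is identical, only the traversal order changes, and actual timings vary by machine:

code:

#include <stdio.h>

#define N 2048
static double a[N][N];

int main(void) {
    double sum = 0.0;

    /* C arrays are row-major, so this walks memory sequentially */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            sum += a[i][j];

    /* same work, but a stride of N doubles per access: cache-hostile,
       and typically several times slower on real hardware */
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            sum += a[i][j];

    printf("%f\n", sum);
    return 0;
}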

Maybe I should rephrase what I said originally to be less pithy: If you want to be a well-rounded and versatile programmer capable of approaching any programming problem that may arise, regardless of the domain, programming language, system architecture, etc., then you ought to know what twos complement is and have a basic grasp of IEEE-754. There's nothing wrong with making a career out of specializing in specific types of programming obviously, but the idea that the job performed by the proverbial Enterprise-CRUD-'N'-Javascript Code Ninja is programming and anything else is a specialized subset of that is ultra backwards to me. It's like saying artists these days only need to know how to do corporate graphic design in Adobe Illustrator and working with physical media is an esoteric subset of that.

Volguus
Mar 3, 2009
School = theory
Career = practice

In theory, theory and practice are the same. In practice, they are not.

Still, one should have a basic understanding of theory so that they know (at least) where to look for information when the need does arise.

more falafel please
Feb 26, 2005

forums poster

Wait knowing how floats work is theory now

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

more falafel please posted:

Wait knowing how floats work is theory now

Aside from "don't expect exact precision when doing arithmetic with floats or use them for handling money, you will regret it," effectively, yeah.

Volmarias fucked around with this message at 23:42 on Sep 28, 2020

Volte
Oct 4, 2004

woosh woosh

Volmarias posted:

Aside from "don't expect exact precision when doing arithmetic with floats or for handling money, you will regret it," effectively, yeah.
It's actually extremely easy to trigger catastrophic behaviour when doing floating point arithmetic on values of very different magnitudes; it's not just about "exact precision". The quadratic formula everyone learned in high school is a great example of a place where you can get just insanely wrong answers: https://www.johndcook.com/blog/2018/04/28/quadratic-formula/
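
Here is a C sketch of the failure mode from that link, with made-up coefficients: the textbook formula subtracts two nearly-equal numbers and loses half the answer, while the standard rearrangement recovers it.

code:

#include <stdio.h>
#include <math.h>

int main(void) {
    /* x^2 + 1e8*x + 1 = 0; the small root is about -1e-8 */
    double a = 1.0, b = 1e8, c = 1.0;
    double d = sqrt(b * b - 4.0 * a * c);

    /* textbook formula: -b + d cancels catastrophically */
    double naive = (-b + d) / (2.0 * a);

    /* stable version: take the big root first, then use x1*x2 = c/a */
    double big = (-b - d) / (2.0 * a);
    double stable = c / (a * big);

    printf("naive:  %.10e\n", naive);    /* ~ -1.49e-08, nearly 50% off */
    printf("stable: %.10e\n", stable);   /* ~ -1.00e-08 */
    return 0;
}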

spiritual bypass
Feb 19, 2008

Grimey Drawer
Two's complement is a big deal because everyone will, at some point in their career, need to look at a byte that might be a negative integer.
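
Which looks like this in practice (C sketch, byte value invented); the bits alone can't tell you whether you're holding 246 or -10:

code:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint8_t raw = 0xF6;   /* a byte out of a packet, file, or register */

    printf("unsigned: %u\n", (unsigned int)raw);   /* 246 */
    printf("signed:   %d\n", (int8_t)raw);         /* -10, via two's complement */
    return 0;
}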

fritz
Jul 26, 2003

Volte posted:

Maybe I should rephrase what I said originally to be less pithy: If you want to be a well-rounded and versatile programmer capable of approaching any programming problem that may arise, regardless of the domain, programming language, system architecture, etc., then you ought to ...have a basic grasp of IEEE-754.

I think every programmer should know https://floating-point-gui.de but not every programmer should have to know https://www.itu.dk/~sestoft/bachelor/IEEE754_article.pdf

There are way too many things out there for one person to know them all. I can't explain how databases work under the hood (and can barely write SQL), or how garbage collection is implemented, and UI is a complete mystery to me, but if I'm going to start insisting that everybody should know the stuff I know then I'm going to start feeling bad about not knowing the stuff they know.

Volte
Oct 4, 2004

woosh woosh

fritz posted:

I think every programmer should know https://floating-point-gui.de but not every programmer should have to know https://www.itu.dk/~sestoft/bachelor/IEEE754_article.pdf

There are way too many things out there for one person to know them all. I can't explain how databases work under the hood (and can barely write SQL), or how garbage collection is implemented, and UI is a complete mystery to me, but if I'm going to start insisting that everybody should know the stuff I know then I'm going to start feeling bad about not knowing the stuff they know.
Yeah, there's a difference between "be an expert on everything you use" and "have an understanding of everything you use". A decent understanding of floating point numbers would, to me, include knowing that consecutive representable values get spaced exponentially further apart the further away from zero you get, and that subtracting two numbers that are almost equal to each other is prone to catastrophic error. You should know enough about everything you use to be able to determine if you should even be using that thing at all. The idea of "knowing just enough to be dangerous" applies to floating point numbers for sure.
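
The spacing behaviour is easy to see directly; a short C sketch:

code:

#include <stdio.h>
#include <math.h>

int main(void) {
    /* the gap between adjacent doubles grows with magnitude */
    printf("gap near 1:    %g\n", nextafter(1.0, 2.0) - 1.0);     /* ~2.2e-16 */
    printf("gap near 1e16: %g\n", nextafter(1e16, 2e16) - 1e16);  /* 2 */

    /* so far from zero, adding 1 literally does nothing */
    printf("%d\n", 1e16 + 1.0 == 1e16);   /* 1 (true) */
    return 0;
}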

Arsenic Lupin
Apr 12, 2012

This particularly rapid💨 unintelligible 😖patter💁 isn't generally heard🧏‍♂️, and if it is🤔, it doesn't matter💁.


I think that for a lot of cases the most important thing is to know when a problem falls into "poo poo I should look up". Like, you don't have to have IEEE 754 on the tip of your tongue, but you do need to know that floating-point is black magic and that you must go hit some references if you're actually using it. Same with memory management; if you can rely on the manager built into your Smalltalk *, you're good, but if you're working in C++, there's a lot of prior art to absorb. And it's nice to recognize when the thing your manager just assigned you reduces to the traveling salesman problem.

Somebody should teach an undergraduate-level CS course called "poo poo that you probably expect to work, but doesn't."


* look, I can dream, okay?

zergstain
Dec 15, 2005

Volte posted:

I somehow missed the post about being taught EBCDIC in high school and thought the discussion was about being taught it during higher level CS education. I actually have no opinion on it being taught in high school, but it should definitely be taught during CS education. I still say that everyone should have a working understanding and healthy respect for IEEE-754 regardless of their educational background, and twos complement just as a matter of understanding how your computer works. You should also have a basic understanding of virtual memory and caching and a bunch of other stuff too, because even if it never pops up at the top of your work list, understanding how your tools work is never truly unnecessary.

It's funny to me that computer science used to be more about learning computation in the abstract while working programmers focused more on the nitty-gritty details. Now we're in the situation where working programmers focus on programming in the abstract and the nitty-gritty is relegated to academia.

I'm not sure EBCDIC was ever mentioned during the course of my CS education, and I don't see that my class was missing anything at all. I mean, I've heard of it, and I understand it to be an alternative encoding to ASCII used on IBM mainframes. If I ever need the details, I can read up on them.

Sure, I didn't go to a top tier school. I think it far from sucked, however.

Volte
Oct 4, 2004

woosh woosh

zergstain posted:

I'm not sure EBCDIC was ever mentioned during the course of my CS education, and I don't see that my class was missing anything at all. I mean, I've heard of it, and I understand it to be an alternative encoding to ASCII used on IBM mainframes. If I ever need the details, I can read up on them.

Sure, I didn't go to a top tier school. I think it far from sucked, however.
I never meant to imply that EBCDIC should ever be mentioned by anyone. I was actually referring to twos complement in that confusingly worded sentence.

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe
The only good reason to ever learn about EBCDIC, besides having to deal with some ridiculous IBM mainframe, is that learning to write a program that outputs something in EBCDIC when it’s not the system character set is a great way to actually internalize character encodings and start thinking about your programming language as something that isn’t just magic.

zergstain
Dec 15, 2005

Volte posted:

I never meant to imply that EBCDIC should ever be mentioned by anyone. I was actually referring to twos complement in that confusingly worded sentence.

EBCDIC was the only thing mentioned before that use of the word "it", so hopefully you can see why I thought you were saying EBCDIC should definitely be taught during CS education. Two's complement, I can agree on.

I believe cache locality and poo poo only came up in an elective course I took on parallel programming.

Bruegels Fuckbooks
Sep 14, 2004

Now, listen - I know the two of you are very different from each other in a lot of ways, but you have to understand that as far as Grandpa's concerned, you're both pieces of shit! Yeah. I can prove it mathematically.

Volte posted:

A very specific subset of programmers is going to need to know how numbers are represented, or understand the performance implications of reading non-sequential data from memory? Are we really to a point where literally any software other than enterprise CRUD web apps is considered esoteric academic poo poo that "most programmers" wouldn't need to worry about writing? Someone is writing all the software we use on a daily basis and I don't know about you, but I interact with a lot of stuff that's not CRUD web apps. You're actually the one talking about a very specific subset of programmers: the ones who specialize in writing CRUD apps and frontend Javascript stuff. If we're considering "programmer" to mean "anyone who can create at least one specific type of program in one particular environment" then I guess technically you don't need to know anything to be a programmer other than how to read a tutorial about how to do the one thing you need to do.

Maybe I should rephrase what I said originally to be less pithy: If you want to be a well-rounded and versatile programmer capable of approaching any programming problem that may arise, regardless of the domain, programming language, system architecture, etc., then you ought to know what twos complement is and have a basic grasp of IEEE-754. There's nothing wrong with making a career out of specializing in specific types of programming obviously, but the idea that the job performed by the proverbial Enterprise-CRUD-'N'-Javascript Code Ninja is programming and anything else is a specialized subset of that is ultra backwards to me. It's like saying artists these days only need to know how to do corporate graphic design in Adobe Illustrator and working with physical media is an esoteric subset of that.

IEEE-754 is relevant to CRUD-'N'-Javascript development because the only numeric type available in javascript is 64-bit floating point, following the IEEE-754 standard. So I would hope anyone who programs in Javascript knows what problems that can cause.
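
The core problem is reproducible outside Javascript too, since a JS number is exactly an IEEE-754 double; a C sketch with an invented ID:

code:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* above 2^53 a double can't represent every integer, which is how
       64-bit IDs get silently corrupted on their way through JS/JSON */
    int64_t id = 9007199254740993LL;   /* 2^53 + 1, e.g. a database ID */
    double as_js = (double)id;         /* what a JS number would hold */

    printf("original:   %lld\n", (long long)id);
    printf("round-trip: %.0f\n", as_js);   /* 9007199254740992, off by one */
    return 0;
}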

Volte
Oct 4, 2004

woosh woosh

zergstain posted:

EBCIDIC was the only thing mentioned before that use of the word "it", so hopefully you can see why I thought you were saying EBCDIC should definitely be taught during CS education. 2s-compliment, I can agree on.
Yeah oops, I botched the sentence structure. I hope this whole time people haven't been thinking that I'm advocating for EBCDIC to be basic programmer knowledge!

ultrafilter
Aug 23, 2007

It's okay if you have any questions.


Foxfire_ posted:

Saying it's not practically useful is like a statistician saying "I've never used group theory in my career, so it was a waste of time including it in my generic Mathematics degree".

I agree with you completely but this is a bad example. "What is a group?" is basic math that everyone in a statistics department would be expected to know and it shows up often enough that you can't completely forget it.

Volguus
Mar 3, 2009

Bruegels Fuckbooks posted:

IEEE-754 is relevant to CRUD-'N'-Javascript development because the only numeric type available in javascript is 64-bit floating point, following the IEEE-754 standard. So I would hope anyone who programs in Javascript knows what problems that can cause.

What problems? They're sending the input to the server to deal with it (which, hope to god, is not written in Javascript) or producing the UI/report/whatever from whatever thing the server spit out. Javascript doesn't need to think, just do.

Volte
Oct 4, 2004

woosh woosh

Volguus posted:

What problems? They're sending the input to the server to deal with it (which, hope to god, is not written in Javascript) or producing the UI/report/whatever from whatever thing the server spit out. Javascript doesn't need to think, just do.
Let's hope you don't need to round-trip any 64-bit integers. Also if you think the server is probably not written in Javascript, I have some tragic news.

susan b buffering
Nov 14, 2016

Volguus posted:

What problems? They're sending the input to the server to deal with it (which, hope to god, is not written in Javascript)

:allears:

Foxfire_
Nov 8, 2010

ultrafilter posted:

I agree with you completely but this is a bad example. "What is a group?" is basic math that everyone in a statistics department would be expected to know and it shows up often enough that you can't completely forget it.

I was thinking like someone working in drug trials, which I imagine is mostly knowing what tests to use and when, but I don't actually know anything about the day-to-day.

Jose Valasquez
Apr 8, 2005

My objection was to the claim that every programmer should know <insert whatever technical detail>. I think it is demeaning to the tons of people who are good programmers and make a living without a strong CS background. I think that most people will never have to worry about two's complement or the specific details of how floats work. Even a good chunk of people doing non-trivial things don't have to worry about them. I work on a global distributed system that handles literally a billion+ requests per day, and in my entire career there has not been a single instance where knowing how two's complement works has been useful. There has been one time in my career where I've helped solve a float problem, and "oh yeah, floats are weird, we shouldn't use them here" was enough knowledge to get past it. I think the people who never in their career have to consider "arithmetic bit shift vs logical bit shift" are the vast majority.

Obviously someone needs to know these things; somewhere in the stack of the system I work on, someone has given all of this lots of thought so that the rest of us don't have to worry about it. I'm also pretty confident that if I end up working on the lower-level details where that stuff matters, learning two's complement would not be a stumbling block even if I had never heard of it before in my life. "Flip the bits of the non-negative number and add 1; now you have two's complement." Congrats everyone, now the entire thread knows two's complement :v:
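
And to prove the point, the whole lecture fits in a couple of lines of C:

code:

#include <stdio.h>

int main(void) {
    int x = 5;
    int neg = ~x + 1;      /* flip the bits, add 1 */
    printf("%d\n", neg);   /* -5 */
    return 0;
}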

quote:

the idea that the job performed by the proverbial Enterprise-CRUD-'N'-Javascript Code Ninja is programming and anything else is a specialized subset of that is ultra backwards to me. It's like saying artists these days only need to know how to do corporate graphic design in Adobe Illustrator and working with physical media is an esoteric subset of that.
You're interpreting what I'm trying to say exactly backwards. The people who do enterprise CRUD are programmers too and don't need to know the fine details of how everything works. I've worked with some excellent programmers who fit that bill. If someone only does art in Adobe Illustrator they can still be a good artist.

Basically what Jaded Burnout said:

Jaded Burnout posted:

Really the reason I'm being a little bit of a stick in the mud is that "programmers should know [low level thing]" has been trotted out for my entire adult life and it's most often just not true.
