|
EBCDIC my balls
|
# ? Sep 27, 2020 00:12 |
|
|
# ? Jun 5, 2024 21:40 |
|
ultrafilter posted:Is there anyone under 40 who knows what EBCDIC is? I may have done some googling over the past couple pages
|
# ? Sep 27, 2020 00:16 |
|
Is there a list of safe characters to use if you are designing a language?
|
# ? Sep 27, 2020 01:07 |
|
CPColin posted:EBCDIC my balls EBCDICbutt was right there. And should also be my username.
|
# ? Sep 27, 2020 02:15 |
|
taqueso posted:Is there a list of safe characters to use if you are designing a language?
|
# ? Sep 27, 2020 02:44 |
|
taqueso posted:Is there a list of safe characters to use if you are designing a language? http://www.asciitable.com/
|
# ? Sep 27, 2020 03:08 |
|
taqueso posted:Is there a list of safe characters to use if you are designing a language? Just use the same characters as some other popular language. If they’re difficult to type somewhere, then at least the programmers in those countries will be used to it
|
# ? Sep 27, 2020 08:13 |
|
And if you don't want academia to touch your language, restrict them to plain ASCII and disallow identifiers with fewer than 4 characters. Academics love their Greek letter identifiers.
|
# ? Sep 27, 2020 12:29 |
|
whoops, you accidentally restricted us to four utf8 bytes rather than four ascii characters, here come my cuneiform variables
|
# ? Sep 27, 2020 13:04 |
|
taqueso posted:Is there a list of safe characters to use if you are designing a language? code:
|
# ? Sep 27, 2020 15:24 |
Remember how in the IRC protocol, at least historically, the {}| characters are considered lowercase versions of the []\ characters, for the purpose of channel names and nicknames? I'm pretty sure that's because an old 7-bit code page (probably used in Finland?) used those code points for letters, probably åäö/ÅÄÖ, perhaps in some other order.
|
|
# ? Sep 27, 2020 16:04 |
|
Pretty much, yeah. The bracket and brace code points are in the "national use" category in ISO 646, the generalization of ASCII that was meant to be modified for other Latin-alphabet languages. 7-bit encodings in Scandinavia were things like CP1018/ISO 646-SE, which put ÅÄÖ directly after A-Z, replacing what would be []\ in ASCII. Some real greybeards apparently learned to program with the replacement characters.
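That inheritance is easy to see in code. Here's a minimal sketch of the RFC 1459 casemapping rule; the `irc_lower` helper name is made up for illustration:

```python
# RFC 1459 casemapping: {}| are treated as the lowercase forms of []\,
# a leftover from Scandinavian ISO 646 variants where those same code
# points held Ä/ä, Ö/ö, Å/å.
def irc_lower(name: str) -> str:
    table = str.maketrans('[]\\', '{}|')
    return name.lower().translate(table)

assert irc_lower('[Foo]\\Bar') == '{foo}|bar'
assert irc_lower('Nick') == irc_lower('NICK')
```

So on a classic IRCd, the nicknames `[Foo]` and `{foo}` collide, exactly as `ÄFOOÅ` and `äfooå` would have under the old national code pages.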
|
# ? Sep 27, 2020 16:32 |
|
Xerophyte posted:Some real greybeards apparently learned to program with the replacement characters. Literally writing in mojibake, I love it!
|
# ? Sep 27, 2020 17:07 |
|
what I'm hearing here is that the only truly portable language is COBOL
|
# ? Sep 27, 2020 17:24 |
|
Falcorum posted:And if you don't want academia to touch your language, restrict them to plain ASCII and disallow identifiers with less than 4 characters. I felt a great disturbance in the Force, as if millions of C coders cried out in terror and were suddenly silenced.
|
# ? Sep 27, 2020 17:44 |
|
Arsenic Lupin posted:I felt a great disturbance in the Force, as if millions of C coders cried out in terror and were suddenly silenced. Not Bourne!
|
# ? Sep 27, 2020 17:45 |
|
xtal posted:Not Bourne!
|
# ? Sep 27, 2020 17:51 |
|
I think we should add a backtick key to the other keyboard layouts, they apparently hosed up when they invented them.
|
# ? Sep 27, 2020 18:31 |
|
I as a C++ pedantry enthusiast under 40 know about EBCDIC, but I can't imagine anyone knowing about it for other reasons.
|
# ? Sep 27, 2020 22:12 |
|
I know about EBCDIC because i was forced to endure a z series mainframe and PL/1 early in my career, in flagrant disregard for the Geneva Conventions
|
# ? Sep 28, 2020 06:05 |
|
Vanadium posted:I as a C++ pedantry enthusiast under 40 know about EBCDIC, but I can't imagine anyone knowing about it for other reasons. I read like, a lot of fortune files in like 2000 so I hope you're ready for jokes about EBCDIC, VAXen, and PL/1
|
# ? Sep 28, 2020 06:38 |
|
quote:I haven't made any functions private because I think programmer should have access to all of the functions. Anything not documented should be considered private with respect to the API and can change. Use at your own risk. If only there was some language feature that allowed you to tag functions as "private with respect to the API". Nevermind, eh?
|
# ? Sep 28, 2020 14:02 |
|
Soricidus posted:Cool, now explain how all the other programming languages manage without them I really can't. But I am spanish, and spanish seems to already have all the characters that most programming languages need. Can't speak for people with funny languages. I guess they learn to press [alt]+[1][2][6] to type ~
|
# ? Sep 28, 2020 14:29 |
|
Jaded Burnout posted:If only there was some language feature that allowed you to tag functions as "private with respect to the API". Nevermind, eh?
|
# ? Sep 28, 2020 15:53 |
|
ultrafilter posted:Is there anyone under 40 who knows what EBCDIC is? At my first job I had to port some stuff from an IBM mainframe to unix, so I unfortunately know what EBCDIC is. I don't recommend ever getting into the position where a job requires you to learn what EBCDIC is
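For anyone curious what that kind of port runs into: Python happens to ship codecs for the common EBCDIC code pages, so a minimal sketch (using code page 037, one widespread US EBCDIC variant) might look like:

```python
# Round-trip a string through EBCDIC code page 037.
ebcdic = 'Hello, world'.encode('cp037')
assert ebcdic != b'Hello, world'           # nothing lines up with ASCII
assert ebcdic.decode('cp037') == 'Hello, world'

# Unlike ASCII, EBCDIC letters are not contiguous: there is a gap
# between 'i' and 'j', which breaks naive ('a' <= c <= 'z') range checks.
i, j = 'i'.encode('cp037')[0], 'j'.encode('cp037')[0]
assert j - i == 8
```

The non-contiguous letters are a big part of why porting mainframe text-handling code is so unpleasant.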
|
# ? Sep 28, 2020 16:28 |
|
I'm under 40 and I was taught it in high school.
|
# ? Sep 28, 2020 16:47 |
|
Jaded Burnout posted:I'm under 40 and I was taught it in high school. Why?
|
# ? Sep 28, 2020 16:47 |
|
ultrafilter posted:Why? Education is always behind the curve, but even then it was more a historical note so as to compare it to ASCII. UTF8 was not taught, though it was relatively new at the time. We were taught how to do two's complement too. Pointless.
|
# ? Sep 28, 2020 17:00 |
|
Two's complement is a lot more relevant than EBCDIC if you go by what modern systems use.
|
# ? Sep 28, 2020 17:31 |
|
ultrafilter posted:Two's complement is a lot more relevant than EBCDIC if you go by what modern systems use. Number of times it's been at least a bit useful to know what two's complement is in the last 20 years: 0. Number of times it's been at least a bit useful to know what EBCDIC is in the last 20 years: I dunno, but more than 0. It was a real scattergun approach to a syllabus, is what I'm getting at.
|
# ? Sep 28, 2020 17:58 |
|
I don't see how it's pointless to understand how the machine you're programming works. Programmers should have a basic grasp of two's complement, and IEEE-754 while we're at it.
|
# ? Sep 28, 2020 18:15 |
|
Volte posted:I don't see how it's pointless to understand how the machine you're programming works. Programmers should have a basic grasp of two's complement, and IEEE-754 while we're at it. Why?
|
# ? Sep 28, 2020 18:24 |
|
What did you need to use EBCDIC for?

IEEE-754: because floating point numbers do not behave like real numbers, and many, many, many software bugs come from people thinking they do.

2's complement: you can get away with not knowing it if you accept the min/max of signed integral types as magic. It's still useful for things like looking at a binary file with a hex editor, looking at memory with a debugger, or doing stupid bit-manipulation tricks in embedded.
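The currency point is easy to demonstrate. A quick sketch of the classic failure and the usual fix:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so repeated
# addition drifts: ten dimes do not add up to a dollar.
assert 0.1 + 0.2 != 0.3
assert sum([0.1] * 10) != 1.0

# Decimal arithmetic keeps exact cents, which is why it (or plain
# integer cents) is the usual choice for money.
assert sum([Decimal('0.1')] * 10) == Decimal('1.0')
```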
|
# ? Sep 28, 2020 18:28 |
|
why ask why? use UTF-EBCDIC, guy
|
# ? Sep 28, 2020 18:29 |
|
In the case of two's complement, it's just basic operational knowledge. It's so you can answer basic questions like "what's the difference between a logical bit shift and an arithmetic bit shift" and evaluate when you should use those things if the need ever arose. It's so that you have an intuition about why the range of representable integers is what it is. In the case of IEEE-754 it's because they masquerade as real numbers and people like to treat them that way. You should be able to intuitively understand why you shouldn't be using floating point numbers for currency calculations.
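The shift distinction above can be sketched in a few lines, assuming an 8-bit two's-complement register; the helper names (`to_bits`, `lsr`, `asr`) are made up for illustration:

```python
BITS = 8
MASK = (1 << BITS) - 1   # 0xFF

def to_bits(x: int) -> int:
    """Two's-complement bit pattern of x in an 8-bit register."""
    return x & MASK

def lsr(x: int, n: int) -> int:
    """Logical shift right: fill with zeros from the left."""
    return to_bits(x) >> n

def asr(x: int, n: int) -> int:
    """Arithmetic shift right: replicate the sign bit. Python's >> on
    negative ints already sign-extends, so mask afterwards."""
    return to_bits(x >> n)

# -8 is 0b11111000; the two shifts disagree on negative values.
assert lsr(-8, 1) == 0b01111100     # zero-fill: no longer reads as negative
assert asr(-8, 1) == 0b11111100     # sign-fill: the bit pattern for -4
assert lsr(8, 1) == asr(8, 1) == 4  # identical for non-negative values

# The representable range falls out of the same picture:
assert to_bits(-128) == 0b10000000 and to_bits(127) == 0b01111111
```

Same input bits, two different answers: that's the whole reason the question "logical or arithmetic?" exists at all.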
|
# ? Sep 28, 2020 18:31 |
|
I'm a fairly successful software engineer with 13 years of experience, I learned two's complement in college and I have never once needed to know two's complement beyond that. There hasn't been a single situation where that knowledge has been useful at all. Knowing that floats can do weird things has been useful once or twice, but just knowing "floats are weird, be careful using them" has been enough knowledge without remembering the details of the technical standard.
|
# ? Sep 28, 2020 18:49 |
|
taqueso posted:why ask why? use UTF-EBCDIC, guy

I asked why because I've heard these sorts of Real Programmer arguments for years, and I'm interested in hearing some justifications for them.

Foxfire_ posted:What did you need to use EBCDIC for?

I didn't need it; it was vaguely useful background knowledge, and I would've been just fine not knowing it.

Volte posted:In the case of two's complement, it's just basic operational knowledge. It's so you can answer basic questions like "what's the difference between a logical bit shift and an arithmetic bit shift" and evaluate when you should use those things if the need ever arose. It's so that you have an intuition about why the range of representable integers is what it is.

Foxfire_ posted:2's complement you can get away with not knowing if you accept the min/max of signed integral types as being magic. It's still useful for things like looking at a binary file with a hex editor, looking at memory with a debugger, or doing stupid bit manipulation tricks in embedded.

Uh huh. And when do you think "programmers" in aggregate need to do any of these things? Or is C++ the only language to you?

Foxfire_ posted:IEEE-754 because floating point numbers do not behave like real numbers and many, many, many software bugs are from people thinking they do

Volte posted:In the case of IEEE-754 it's because they masquerade as real numbers and people like to treat them that way. You should be able to intuitively understand why you shouldn't be using floating point numbers for currency calculations.

If you're using IEEE-754 as shorthand for "floating points are a compromise and here are the risks", then I think that's valuable to most any programmer, yes, but I'd posit that the kernel of actually useful information there is "don't use them unless you really have to". If you mean IEEE-754 as the actual spec, then the only people who need to know it are the people who need to know it. Which is my point, really.

"Programmers" are as varied as musicians or construction workers, and when we're paid to work in teams to produce software of decent quality based on a world of programming languages, frameworks, and libraries, there's really no limit to how much or how little we need to know about any particular part of it. In the case of two's complement, I'd say that making us learn how to actually do it with pen and paper was not the most valuable use of our time, but then again they thought that Prolog was a good call for the main language we'd learn, so, academia.
|
# ? Sep 28, 2020 18:49 |
|
Volte posted:In the case of two's complement, it's just basic operational knowledge. It's so you can answer basic questions like "what's the difference between a logical bit shift and an arithmetic bit shift" and evaluate when you should use those things if the need ever arose. It's so that you have an intuition about why the range of representable integers is what it is. I've been working professionally as a programmer for 5 years and had never heard of "logical" and "arithmetic" shifts until you mentioned them in this post, prompting me to look up what they were. I guess I don't use the shift operator very often, once in a blue moon. I question the description of this as "basic operational knowledge". It's not something you're likely to have a need of unless you're doing low-level bit-twiddling stuff, I think. Understanding of how IEEE floating-point numbers work, on the other hand, matters in quite a lot of contexts and at quite a lot of different levels of abstraction and I think that's key.
|
# ? Sep 28, 2020 18:51 |
|
The overlap between computer science and computer programming is shrinking, in my opinion. You need less and less true CS understanding to be able to write a program. The amount of low-level knowledge you need varies from dev job to dev job too, so that doesn't really help. Does that mean knowing how things like two's complement work, or understanding the parts that make up a binary representation of a floating-point number is useless? Hardly, and I expect the people who bothered to learn that extra knowledge are doing well in their current roles. That said, you may as well learn some of that. Mostly because of this, which is said much more eloquently than I could phrase it: Hammerite posted:Understanding of how IEEE floating-point numbers work, on the other hand, matters in quite a lot of contexts and at quite a lot of different levels of abstraction and I think that's key.
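Those "parts" are easy to pull out with the standard library. A sketch decomposing a 64-bit double into its IEEE-754 fields (the `fields` helper is made up for illustration):

```python
import struct

def fields(x: float):
    """Split a double into its IEEE-754 sign/exponent/fraction fields."""
    bits = struct.unpack('>Q', struct.pack('>d', x))[0]
    sign = bits >> 63
    exponent = (bits >> 52) & 0x7FF       # biased by 1023
    fraction = bits & ((1 << 52) - 1)     # the implicit leading 1 isn't stored
    return sign, exponent, fraction

# -0.15625 is exactly -1.01b x 2^-3: sign 1, exponent -3 + 1023,
# fraction "01" at the top of the 52 stored bits.
sign, exponent, fraction = fields(-0.15625)
assert (sign, exponent) == (1, 1020)
assert fraction == 1 << 50

# 0.1 is NOT exact: its fraction field is a rounded binary approximation,
# which is where all the 0.1 + 0.2 != 0.3 surprises come from.
assert fields(0.1)[2] != 0
```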
|
# ? Sep 28, 2020 19:06 |
|
|
Probably about 95% of the time you can get away with not knowing much computer science, but you need to know enough to recognize the cases where you do.
|
# ? Sep 28, 2020 19:10 |