CPColin
Sep 9, 2003

Big ol' smile.
EBCDIC my balls


susan b buffering
Nov 14, 2016

ultrafilter posted:

Is there anyone under 40 who knows what EBCDIC is?

I may have done some googling over the past couple pages

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

Is there a list of safe characters to use if you are designing a language?

Gaukler
Oct 9, 2012


CPColin posted:

EBCDIC my balls

EBCDICbutt was right there. And should also be my username.

Impotence
Nov 8, 2010
Lipstick Apathy

taqueso posted:

Is there a list of safe characters to use if you are designing a language?

ultrafilter
Aug 23, 2007

It's okay if you have any questions.


taqueso posted:

Is there a list of safe characters to use if you are designing a language?

http://www.asciitable.com/

Soricidus
Oct 21, 2010
freedom-hating statist shill

taqueso posted:

Is there a list of safe characters to use if you are designing a language?

Just use the same characters as some other popular language. if they’re difficult to type somewhere then at least the programmers in those countries will be used to it

Falcorum
Oct 21, 2010
And if you don't want academia to touch your language, restrict them to plain ASCII and disallow identifiers with less than 4 characters.

Academics love their Greek letter identifiers. :v:

Soricidus
Oct 21, 2010
freedom-hating statist shill
whoops, you accidentally restricted us to four utf8 bytes rather than four ascii characters, here come my cuneiform variables

Doom Mathematic
Sep 2, 2008

taqueso posted:

Is there a list of safe characters to use if you are designing a language?

code:
><+-.,[]
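(Those eight characters are, of course, the entire Brainfuck instruction set. A minimal interpreter sketch in Python, for anyone who's never had the pleasure; input via `,` is omitted for brevity:)

```python
def bf(program: str, tape_len: int = 30000) -> str:
    """Run a Brainfuck program (><+-.[] only; ',' input omitted)."""
    tape = [0] * tape_len
    out = []
    # Precompute matching bracket positions so loops can jump both ways.
    stack, match = [], {}
    for i, c in enumerate(program):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            match[i], match[j] = j, i
    ptr = pc = 0
    while pc < len(program):
        c = program[pc]
        if c == '>':
            ptr += 1
        elif c == '<':
            ptr -= 1
        elif c == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.':
            out.append(chr(tape[ptr]))
        elif c == '[' and tape[ptr] == 0:
            pc = match[pc]  # skip loop body when cell is zero
        elif c == ']' and tape[ptr] != 0:
            pc = match[pc]  # repeat loop while cell is nonzero
        pc += 1
    return ''.join(out)

print(bf('++++++++[>++++++++<-]>+.'))  # A
```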

nielsm
Jun 1, 2009



Remember how in the IRC protocol, at least historically, the {}| characters are considered lowercase versions of the []\ characters, for the purpose of channel names and nicknames? I'm pretty sure that's because their code points in an old 7-bit code page (probably used in Finland?) represented the letters åäö/ÅÄÖ, perhaps in some other order.
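That mapping is actually codified: RFC 1459 defines a casemapping where {}| (and ~, in the strict reading) are the lowercase forms of []\ (and ^). A sketch of the nick-comparison rule in Python; the helper names are mine, not from any IRC library:

```python
# RFC 1459 casemapping: {}| and ~ are treated as the "lowercase" of []\ and ^,
# a leftover from Scandinavian ISO 646 variants where those code points
# were letters (åäö / ÅÄÖ).
_RFC1459 = str.maketrans(
    'ABCDEFGHIJKLMNOPQRSTUVWXYZ[]\\^',
    'abcdefghijklmnopqrstuvwxyz{}|~',
)

def irc_lower(nick: str) -> str:
    """Fold a nick or channel name to RFC 1459 'lowercase'."""
    return nick.translate(_RFC1459)

def same_nick(a: str, b: str) -> bool:
    """Two nicks collide if they fold to the same string."""
    return irc_lower(a) == irc_lower(b)

print(same_nick('[Olle]', '{olle}'))  # True
```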

Xerophyte
Mar 17, 2008

This space intentionally left blank
Pretty much, yeah. The bracket and brace code points are in the "national use" category in ISO 646, the generalization of ASCII that was meant to be adapted for other Latin-alphabet languages. 7-bit encodings in Scandinavia were things like CP1018/ISO 646-SE, which put ÅÄÖ directly after A-Z, replacing what would be []\ in ASCII.

Some real greybeards apparently learned to program with the replacement characters.

pokeyman
Nov 26, 2006

That elephant ate my entire platoon.

Xerophyte posted:

Some real greybeards apparently learned to program with the replacement characters.

Literally writing in mojibake, I love it!

Jeb Bush 2012
Apr 4, 2007

A mathematician, like a painter or poet, is a maker of patterns. If his patterns are more permanent than theirs, it is because they are made with ideas.
what I'm hearing here is that the only truly portable language is COBOL

Arsenic Lupin
Apr 12, 2012

This particularly rapid💨 unintelligible 😖patter💁 isn't generally heard🧏‍♂️, and if it is🤔, it doesn't matter💁.


Falcorum posted:

And if you don't want academia to touch your language, restrict them to plain ASCII and disallow identifiers with less than 4 characters.

Academics love their Greek letter identifiers. :v:

I felt a great disturbance in the Force, as if millions of C coders cried out in terror and were suddenly silenced.

xtal
Jan 9, 2011

by Fluffdaddy

Arsenic Lupin posted:

I felt a great disturbance in the Force, as if millions of C coders cried out in terror and were suddenly silenced.

Not Bourne!

Arsenic Lupin
Apr 12, 2012

This particularly rapid💨 unintelligible 😖patter💁 isn't generally heard🧏‍♂️, and if it is🤔, it doesn't matter💁.


xtal posted:

Not Bourne!
That which is not dead can never die.

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

I think we should add a backtick key to the other keyboard layouts, they apparently hosed up when they invented them.

Vanadium
Jan 8, 2005

I as a C++ pedantry enthusiast under 40 know about EBCDIC, but I can't imagine anyone knowing about it for other reasons.

Gaukler
Oct 9, 2012


I know about EBCDIC because I was forced to endure a zSeries mainframe and PL/I early in my career, in flagrant disregard for the Geneva Conventions

more falafel please
Feb 26, 2005

forums poster

Vanadium posted:

I as a C++ pedantry enthusiast under 40 know about EBCDIC, but I can't imagine anyone knowing about it for other reasons.

I read like, a lot of fortune files in like 2000 so I hope you're ready for jokes about EBCDIC, VAXen, and PL/1

Jaded Burnout
Jul 10, 2004


quote:

I haven't made any functions private because I think programmer should have access to all of the functions. Anything not documented should be considered private with respect to the API and can change. Use at your own risk.

If only there was some language feature that allowed you to tag functions as "private with respect to the API". Nevermind, eh?

Tei
Feb 19, 2011
Probation
Can't post for 4 days!

Soricidus posted:

Cool, now explain how all the other programming languages manage without them

I really can't. But I am Spanish, and Spanish seems to already have all the characters that most programming languages need. Can't speak for people with funny languages.

I guess they learn to press [alt]+[1][2][6] to type ~

Master_Odin
Apr 15, 2010

My spear never misses its mark...

ladies

Jaded Burnout posted:

If only there was some language feature that allowed you to tag functions as "private with respect to the API". Nevermind, eh?
Look, it's got a leading _ in its name to indicate private, what more do you want?

Jose Valasquez
Apr 8, 2005

ultrafilter posted:

Is there anyone under 40 who knows what EBCDIC is?

At my first job I had to port some stuff from an IBM mainframe to unix so I unfortunately know what EBCDIC is. I don't recommend ever getting into the position where a job requires you to learn what EBCDIC is

Jaded Burnout
Jul 10, 2004


I'm under 40 and I was taught it in high school.

ultrafilter
Aug 23, 2007

It's okay if you have any questions.


Jaded Burnout posted:

I'm under 40 and I was taught it in high school.

Why?

Jaded Burnout
Jul 10, 2004



Education is always behind the curve, but even then it was more a historical note so as to compare it to ASCII. UTF-8 was not taught, though it was relatively new at the time.

We were taught how to do two's complement too. Pointless.

ultrafilter
Aug 23, 2007

It's okay if you have any questions.


Two's complement is a lot more relevant than EBCDIC if you go by what modern systems use.

Jaded Burnout
Jul 10, 2004


ultrafilter posted:

Two's complement is a lot more relevant than EBCDIC if you go by what modern systems use.

Number of times it's been at least a bit useful to know two's complement in the last 20 years: 0
Number of times it's been at least a bit useful to know EBCDIC in the last 20 years: I dunno, but more than 0

It was a real scattergun approach to a syllabus, is what I'm getting at.

Volte
Oct 4, 2004

woosh woosh
I don't see how it's pointless to understand how the machine you're programming works. Programmers should have a basic grasp of two's complement, and IEEE-754 while we're at it.

Jaded Burnout
Jul 10, 2004


Volte posted:

I don't see how it's pointless to understand how the machine you're programming works. Programmers should have a basic grasp of two's complement, and IEEE-754 while we're at it.

Why?

Foxfire_
Nov 8, 2010

What did you need to use EBCDIC for?


IEEE-754 because floating point numbers do not behave like real numbers and many, many, many software bugs are from people thinking they do

2's complement you can get away with not knowing if you accept the min/max of signed integral types as being magic. It's still useful for things like looking at a binary file with a hex editor, looking at memory with a debugger, or doing stupid bit manipulation tricks in embedded.
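(Editor's aside: the hex-editor case is easy to demonstrate. The same byte pattern decodes to different values depending on whether you read it as unsigned or as signed two's complement; a quick Python illustration:)

```python
# The same bit pattern means different things: 0xFF is 255 unsigned
# but -1 in two's complement.
raw = bytes([0xFF, 0x80, 0x7F])

for b in raw:
    # Manual two's complement decode of one byte: values with the top
    # bit set represent negative numbers, offset by 256.
    signed = b - 256 if b >= 0x80 else b
    # Sanity check against the stdlib's signed decoder.
    assert signed == int.from_bytes(bytes([b]), 'big', signed=True)
    print(f'0x{b:02X}  unsigned={b:3d}  signed={signed:4d}')
```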

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

why ask why? use UTF-EBCDIC, guy

Volte
Oct 4, 2004

woosh woosh
In the case of two's complement, it's just basic operational knowledge. It's so you can answer basic questions like "what's the difference between a logical bit shift and an arithmetic bit shift" and evaluate when you should use those things if the need ever arose. It's so that you have an intuition about why the range of representable integers is what it is.
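(For the record, Python's `>>` on ints is the arithmetic shift; to see the logical one you have to mask to a fixed width first. A sketch assuming 32-bit values; the function names are made up for illustration:)

```python
def asr32(x: int, n: int) -> int:
    """Arithmetic shift right: copies of the sign bit are shifted in.
    This is what Python's >> does natively on ints."""
    return x >> n

def lsr32(x: int, n: int) -> int:
    """Logical shift right: zeros are shifted in, on a 32-bit view of x."""
    return (x & 0xFFFFFFFF) >> n

x = -8  # ...11111000 in two's complement
print(asr32(x, 1))  # -4: sign preserved
print(lsr32(x, 1))  # 2147483644: a zero lands in the top bit
```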

In the case of IEEE-754 it's because they masquerade as real numbers and people like to treat them that way. You should be able to intuitively understand why you shouldn't be using floating point numbers for currency calculations.
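(The classic demonstration in Python: binary floating point can't represent 0.1 exactly, so repeated addition drifts, which is exactly why decimal types exist for money:)

```python
from decimal import Decimal

# 0.1 has no exact binary representation, so ten of them don't sum to 1.
total = sum(0.1 for _ in range(10))
print(total == 1.0)  # False
print(total)         # 0.9999999999999999

# Decimal does the arithmetic in base 10, as currency code should.
total_d = sum(Decimal('0.1') for _ in range(10))
print(total_d == Decimal('1.0'))  # True
```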

Jose Valasquez
Apr 8, 2005

I'm a fairly successful software engineer with 13 years of experience, I learned two's complement in college and I have never once needed to know two's complement beyond that. There hasn't been a single situation where that knowledge has been useful at all.

Knowing that floats can do weird things has been useful once or twice, but just knowing "floats are weird, be careful using them" has been enough knowledge without remembering the details of the technical standard.

Jaded Burnout
Jul 10, 2004


taqueso posted:

why ask why? use UTF-EBCDIC, guy

I asked why, because I've heard these sorts of Real Programmer arguments for years, and I'm interested in hearing some justifications for it.

Foxfire_ posted:

What did you need to use EBCDIC for?

I didn't need it, it was vaguely useful background knowledge, would've been just fine not knowing it.

Volte posted:

In the case of two's complement, it's just basic operational knowledge. It's so you can answer basic questions like "what's the difference between a logical bit shift and an arithmetic bit shift" and evaluate when you should use those things if the need ever arose. It's so that you have an intuition about why the range of representable integers is what it is.

Foxfire_ posted:

2's complement you can get away with not knowing if you accept the min/max of signed integral types as being magic. It's still useful for things like looking at a binary file with a hex editor, looking at memory with a debugger, or doing stupid bit manipulation tricks in embedded.

Uh huh. And when do you think "programmers" in aggregate need to do any of these things? Or is C++ the only language to you?

Foxfire_ posted:

IEEE-754 because floating point numbers do not behave like real numbers and many, many, many software bugs are from people thinking they do

Volte posted:

In the case of IEEE-754 it's because they masquerade as real numbers and people like to treat them that way. You should be able to intuitively understand why you shouldn't be using floating point numbers for currency calculations.

If you're using IEEE-754 as a shorthand for "floating points are a compromise and here's the risks" then I think that's valuable to most any programmer, yes, but I'd posit that the kernel of actually useful information there is "don't use them unless you really have to". If you mean IEEE-754 as the actual spec then the only people who need to know it are the people who need to know it.

Which is my point, really. "programmers" are as varied as musicians or construction workers, and when we're paid to work in teams to produce some software of a decent quality based on a world of programming languages, frameworks, and libraries, then there's really no limit to how much or how little we need to know about any particular part of it.

In the case of two's complement I'd say that making us learn how to actually do it with pen and paper was not the most valuable use of our time, but then again they thought that prolog was a good call for the main language we'd learn, so, academia.

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

Volte posted:

In the case of two's complement, it's just basic operational knowledge. It's so you can answer basic questions like "what's the difference between a logical bit shift and an arithmetic bit shift" and evaluate when you should use those things if the need ever arose. It's so that you have an intuition about why the range of representable integers is what it is.

I've been working professionally as a programmer for 5 years and had never heard of "logical" and "arithmetic" shifts until you mentioned them in this post, prompting me to look up what they were. I guess I don't use the shift operator very often, once in a blue moon. I question the description of this as "basic operational knowledge". It's not something you're likely to have a need of unless you're doing low-level bit-twiddling stuff, I think.

Understanding of how IEEE floating-point numbers work, on the other hand, matters in quite a lot of contexts and at quite a lot of different levels of abstraction and I think that's key.

Macichne Leainig
Jul 26, 2012

by VG
The overlap between computer science and computer programming is shrinking, in my opinion. You need less and less true CS understanding to be able to write a program. The amount of low-level knowledge you need varies from dev job to dev job too, so that doesn't really help.

Does that mean knowing how things like two's complement work, or understanding the parts that make up the binary representation of a floating-point number, is useless? Hardly, and I expect the people who bothered to learn that extra knowledge are doing well in their current roles. So you may as well learn some of it. Mostly because of this, which is said much more eloquently than I could phrase it:

Hammerite posted:

Understanding of how IEEE floating-point numbers work, on the other hand, matters in quite a lot of contexts and at quite a lot of different levels of abstraction and I think that's key.


ultrafilter
Aug 23, 2007

It's okay if you have any questions.


Probably about 95% of the time you can get away with not knowing much computer science, but you need to know enough to recognize the cases where you do.
