Thermopyle
Jul 1, 2003

...the stupid are cocksure while the intelligent are full of doubt. —Bertrand Russell


That's great.

(Currently on the hate side of my love/hate Java relationship)

1337JiveTurkey
Feb 17, 2005

Using λ instead of halfLife or some other semantically meaningful variable name makes about as much sense as saying foo ∈ Z/256Z rather than unsigned byte foo.

PrBacterio
Jul 19, 2000

1337JiveTurkey posted:

Using λ instead of halfLife or some other semantically meaningful variable name makes about as much sense as saying foo ∈ Z/256Z rather than unsigned byte foo.
I'm fairly sure he was referring to the use of λ as the abstraction (i.e. anonymous function) operator in something like Haskell, i.e. that it'd be better to be able to write map (λx: 2*x) [1:] or something rather than map (\x: 2*x) [1:] (note: I don't actually know Haskell so my approximation to proper Haskell syntax might be somewhat off there), in which case your suggestion could be taken as about equivalent to a demand that people write plus or minus instead of the commonly accepted operator symbols + and - for those operations, or even something like thirtyfive times x. Which is to say that, in some contexts, just the bare symbol, λ, is the most semantically meaningful name that can be given to a particular entity in a program.
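
For reference, here's roughly what that example looks like in actual Haskell - a minimal sketch, since the syntax above is an admitted approximation. The backslash is the ASCII stand-in for λ, the arrow replaces the colon, and [1..] is the infinite list starting at 1:

code:
-- Double every element of the infinite list 1, 2, 3, ...
doubled :: [Integer]
doubled = map (\x -> 2 * x) [1..]

main :: IO ()
main = print (take 5 doubled)   -- [2,4,6,8,10]
Notably, GHC only accepts the backslash form; λ itself isn't valid syntax for the binder, which is exactly the kind of restriction being debated here.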

Doctor w-rw-rw-
Jun 24, 2008

Thermopyle posted:

That's great.

(Currently on the hate side of my love/hate Java relationship)
If you're doing Android, I'd say it's more Android, less Java, actually. What kind of stuff are you getting frustrated by?

Thermopyle
Jul 1, 2003

...the stupid are cocksure while the intelligent are full of doubt. —Bertrand Russell

Doctor w-rw-rw- posted:

If you're doing Android, I'd say it's more Android, less Java, actually. What kind of stuff are you getting frustrated by?

Well, I guess it's actually static typing rather than Java in particular, but since 95% of the time that I'm using a statically-typed language it's Java (and most of that time it's frustrating Android dev), Java gets the brunt of my hate.

I don't have any particular problem with it other than the fact that I'd rather just use Python or Ruby or whatever. If only they were the appropriate tool all of the time...

Hmm, I was about to submit this post, then thought about it some more, and I can't really put a finger on why I'd rather write in something like Python than something like Java. It may be something as simple as the fact that, years ago, I learned on Python. I'm certainly not under any illusion that Python and the like are objectively superior to Java in any general way.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

PrBacterio posted:

vvv Again, just write lambda or something, I don't see the problem.

It's longer and less legible?

It's like here I am telling you about for loops and you're wondering what's so good about them that you can't do with a while loop and a few extra statements.

Look Around You
Jan 19, 2009

Thermopyle posted:

Well, I guess it's actually static typing rather than Java in particular, but since 95% of the time that I'm using a statically-typed language it's Java (and most of that time it's frustrating Android dev), Java gets the brunt of my hate.

I don't have any particular problem with it other than the fact that I'd rather just use Python or Ruby or whatever. If only they were the appropriate tool all of the time...

Hmm, I was about to submit this post, then thought about it some more, and I can't really put a finger on why I'd rather write in something like Python than something like Java. It may be something as simple as the fact that, years ago, I learned on Python. I'm certainly not under any illusion that Python and the like are objectively superior to Java in any general way.

Python is also like 50x quicker to write and takes about 50% less code to do similar stuff.

Honestly that's one of the reasons I really like Scala, but it has a pretty big learning curve. It's also a lot pickier about types, but in practice you don't have to write out types 80% of the time. Plus functional programming can make algorithms look a lot more elegant, and it's easier to reason about immutable values than to keep track of a tangle of mutable state (to me, at least).
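
That type-inference-plus-immutability point isn't Scala-specific; here's a minimal sketch of the same idea in Haskell (the mean function is just an illustration, not anything from the thread). No type annotations are written, and every binding is immutable:

code:
import Data.List (foldl')

-- No signatures needed: the compiler infers the types.
-- All bindings are immutable, so there's no hidden state to track.
mean xs = total / fromIntegral count
  where
    (total, count) = foldl' step (0, 0 :: Int) xs
    step (s, n) x  = (s + x, n + 1)

main :: IO ()
main = print (mean [1, 2, 3, 4 :: Double])   -- 2.5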

PrBacterio
Jul 19, 2000

Jabor posted:

It's longer and less legible?

It's like here I am telling you about for loops and you're wondering what's so good about them that you can't do with a while loop and a few extra statements.
You know, that's not a fair representation of what I was saying when you're just taking that one sentence out of the whole paragraph I wrote there. But taking your argument to its logical conclusion, we'd have to demand that all languages be written in ideographic writing systems like Chinese, which clearly we don't. So clearly, there has to be some limit to the number of characters we're willing to use in our programming formalisms, unless you want to start calling variables, say, (and even so, Chinese characters have some internal structure to them beyond an arbitrary assignment of meaning to symbol). You'll notice that what I said wasn't in opposition to the use of any particular symbol but rather to the idea that the set of allowable symbols should be completely open-ended.

EDIT:

Jabor posted:

You started off saying that ASCII should be enough for everyone; can I assume you've changed your mind on that?
Nah, ASCII is a fine set of characters for use in a programming language, I'll stand by that (and actually, the kinds of characters I'd miss the most in ASCII are not letters but symbols like the ones I quoted above). I was just making the larger point that sticking with a finite, limited character set is good practice in principle, and from there I'd argue that it doesn't matter overly much which particular one it is, which is why (say) ASCII is fine, but it might as well be a different one that includes additional symbols for whatever your language requires. Fortran, for instance, uses an even more limited character set, consisting of only the letters A-Z, the digits 0-9 and some 10 or so punctuation and mathematical symbols (which don't even include comparison operators like <), and it does fine on that front.
EDIT again: And I'll just say that from this quoted comment of yours it seems you still didn't really 'get' what I was trying to say, which is probably my fault, not yours. Hopefully I've now managed to make my position somewhat clearer, though :smith:

PrBacterio fucked around with this message at 05:06 on Nov 16, 2012

Opinion Haver
Apr 9, 2007

ultramiraculous posted:

Edit: Those are some pretty simple examples that just happen to be more clear if you're used to thinking in deltas and whatnot. Scala can also become a real, real horror with this poo poo. When adding ★ as an operator starts to feel sane, it's time to step back. You're just creating a horror at that point.

I honestly don't think that making operators named <<= and =>= is that bad. But maybe that's just because I write a lot of Haskell.

The real horror in that first link is calling something a 'pimp'.
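
For anyone who hasn't seen it, defining a symbolic operator in Haskell takes a couple of lines plus a fixity declaration - the |+| below is a made-up example, not something from the linked Scala code:

code:
-- A hypothetical operator: pairwise addition on integer pairs.
infixl 6 |+|

(|+|) :: (Int, Int) -> (Int, Int) -> (Int, Int)
(a, b) |+| (c, d) = (a + c, b + d)

main :: IO ()
main = print ((1, 2) |+| (3, 4))   -- prints (4,6)
Whether a name like that stays readable is exactly the judgment call being argued over.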

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

PrBacterio posted:

You'll notice that what I said wasn't in opposition to the use of any particular symbol but rather to the idea that the set of allowable symbols should be completely open-ended.

You started off saying that ASCII should be enough for everyone; can I assume you've changed your mind on that?

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
Also goddamn man, it's not like you're charged 10¢ per post; make a new one if you're responding to someone who posted after you and you actually want them to see it.

PrBacterio
Jul 19, 2000

Jabor posted:

Also goddamn man, it's not like you're charged 10¢ per post; make a new one if you're responding to someone who posted after you and you actually want them to see it.
You're probably right that I should have put that last bit into a post of its own instead of an edit; it's just that at this point our discussion in this thread seems like a bit of a derail to me, so now that I've made that clarification I'll just leave it at that :shobon:

Doc Hawkins
Jun 15, 2010

Dashing? But I'm not even moving!


Thermopyle posted:

I don't have any particular problem with it other than the fact that I'd rather just use Python or Ruby or whatever. If only they were the appropriate tool all of the time...

I hear very, very good things about JRuby.

PrBacterio posted:

But taking your argument to its logical conclusion, we'd have to demand that all languages be written in ideographic writing systems like Chinese, which clearly we don't.

Unless such a system is part of the language we want to use to build the model of our domain (which happens with math stuff), or when we ourselves write and think in such a writing system.

It's fine if English is the language of your development team, as it is for tons of others and all the ones I've been a part of, but I still don't understand what you think is lost in the UTF-8 upgrade.

e: I could have sworn there was a jQuery wrapper that made '$' available as 'ಠ_ಠ' or something ridiculous like that, but I can't find it now.

Doc Hawkins fucked around with this message at 06:00 on Nov 16, 2012

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

yaoi prophet posted:

Are you saying you have non-Latin characters in your source code?
In strings or comments, yeah sometimes. But I don't see any reason for source code not to use UTF-8.

bobthecheese
Jun 7, 2006
Although I've never met Martha Stewart, I'll probably never birth her child.

Aleksei Vasiliev posted:

In strings or comments, yeah sometimes. But I don't see any reason for source code not to use UTF-8.

Because there are multiple characters that look the same?

Seriously, though, use UTF-8, but you probably shouldn't be straying far from the ASCII zone with any code other than output strings or comments (by which I mean, don't use non-ASCII characters for variable or function names).
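
A small sketch of the look-alike problem, in Haskell only because it happily accepts Unicode letters in identifiers (the variables are contrived):

code:
-- The first name below starts with CYRILLIC SMALL LETTER A (U+0430),
-- the second with LATIN SMALL LETTER A (U+0061). Most fonts render them
-- identically, but they are two different variables.
а :: Int
а = 1

a :: Int
a = 2

main :: IO ()
main = print (а + a)   -- prints 3, to the confusion of the next reader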

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
I will call an operator ∪ if that's what it does.

Of course I'll also set up a binding in my editor so that typing `union` automatically gets converted to ∪, because being able to input your stuff easily is really the important thing.
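
As a sketch of what that looks like where it's already allowed: GHC accepts Unicode symbols as operator names out of the box, so a ∪ alias for Data.Set.union is only a couple of lines (the alias is hypothetical, not part of the library):

code:
import qualified Data.Set as Set
import Data.Set (Set)

-- Name the operator after what it does: set union.
(∪) :: Ord a => Set a -> Set a -> Set a
(∪) = Set.union

main :: IO ()
main = print (Set.fromList [1, 2] ∪ Set.fromList [2, 3 :: Int])
-- prints: fromList [1,2,3]
The input side really is the easy part: most editors can expand an abbreviation like `union` or a TeX-style \cup into the character.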

Titan Coeus
Jul 30, 2007

check out my horn

Jabor posted:

I will call an operator ∪ if that's what it does.

Of course I'll also set up a binding in my editor so that typing `union` automatically gets converted to ∪, because being able to input your stuff easily is really the important thing.

Please no one listen to this guy. Custom bindings just to write code aren't okay.

Opinion Haver
Apr 9, 2007

Titan Coeus posted:

Please no one listen to this guy. Custom bindings just to write code aren't okay.

Which is why I manually comment out blocks of code by hand.

Titan Coeus
Jul 30, 2007

check out my horn

yaoi prophet posted:

Which is why I manually comment out blocks of code by hand.


Uh, if that isn't built into your IDE I'm not sure what to tell you.

PrBacterio
Jul 19, 2000
I think at this point this is really nothing more than another one of those "code formatting styles/braces on the same line vs. braces on the next line/tabs vs. spaces/which IDE/language/version control system to use" topics that come down to personal preference and style, and we should really leave it alone and let the thread go back to its original purpose, which is posting (and making fun of) examples of fantastically bad code.

Maluco Marinero
Jan 18, 2001

Damn that's a
fine elephant.

PrBacterio posted:

I think at this point this is really nothing more than another one of those "code formatting styles/braces on the same line vs. braces on the next line/tabs vs. spaces/which IDE/language/version control system to use" topics that come down to personal preference and style, and we should really leave it alone and let the thread go back to its original purpose, which is posting (and making fun of) examples of fantastically bad code.

Yeah, it's like whinging about a language you don't use because it doesn't work the way your preferred language works. Tools are meant to be used in certain ways and not in others, and in programming we just happen to have heaps of different tools for solving similar problems. It seems like the focus should be on languages that have terribly inconsistent designs, or on tools being used in absolutely ridiculous ways.

ninjeff
Jan 19, 2004

PrBacterio posted:

About the anglocentrism I'll just note that as long as we're using languages with English-language keywords there's no way around that anyway, which you'll note is also not a problem. Programming languages are formal languages defined on their own terms

"their own terms" just happens to be "on white English-speakers' terms" the vast majority of the time. Non-English speakers having to learn a foreign language before they can even start learning to program is a big problem.

Jewel
May 2, 2009

ninjeff posted:

"their own terms" just happens to be "on white English-speakers' terms" the vast majority of the time. Non-English speakers having to learn a foreign language before they can even start learning to program is a big problem.

I've always been interested in that. It's strange how English became "the" language to use when programming. It's always interesting seeing non-English-speaking people code with their comments in Chinese or whatnot.

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
Well then do I have the solution for you

http://www.babylscript.com/

Jonnty
Aug 2, 2007

The enemy has become a flaming star!

Suspicious Dish posted:

Well then do I have the solution for you

http://www.babylscript.com/

Tell me they include a tool which standardises all the names to a given language so that it's actually readable.

HORATIO HORNBLOWER
Sep 21, 2002

no ambition,
no talent,
no chance
The question of using non-ASCII characters in source files is germane to this thread in the sense that the real argument against them is that there are so few systems that sanely handle Unicode end-to-end. I don't have the link handy but Raymond Chen has posted a couple times about non-ASCII characters in Windows headers breaking builds in non-Western locales.

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



ninjeff posted:

"their own terms" just happens to be "on white English-speakers' terms" the vast majority of the time. Non-English speakers having to learn a foreign language before they can even start learning to program is a big problem.

It's not exactly that bad. They really 'only' have to learn to map the words they use for a language's keywords to the English equivalent. Well, as long as they aren't using someone's framework with English-only docs, but nothing is stopping a Russian from writing a great framework with Russian-only variable, method, and class names (for any language that supports Cyrillic) and then telling the world to go pound sand if they want it in English. Comparatively few people would use it, but nothing is explicitly preventing someone from doing it.

The main reason I wouldn't want a bunch of Unicode operators in a language is that they'd be a pain in the rear end to type. Otherwise, it'd be pretty sweet to use λ instead of lambda in Python :v:

Captain Capacitor
Jan 21, 2008

The code you say?

Doc Hawkins posted:

I hear very, very good things about JRuby.


I hope it's in a better state than Jython.

ninjeff
Jan 19, 2004

Munkeymon posted:

It's not exactly that bad. They really 'only' have to learn to map the words they use for a language's keywords to the English equivalent.

You can't assume every English-based programming language's keywords have a direct mapping to words in every other natural language. It's generally the case for other languages from around Europe, sure. But there's no Japanese (for example) word that maps to "while" or "if" alone.

Here's another hard-to-spot source of privilege in programming languages: being written left-to-right, top-to-bottom. Imagine growing up reading right-to-left and then being told you have to read the opposite way to read programs, and "that's just the way it is".

Don't even get me started on printf!

Doctor w-rw-rw-
Jun 24, 2008

Thermopyle posted:

Well, I guess it's actually static typing rather than Java in particular, but since 95% of the time that I'm using a statically-typed language it's Java (and most of that time it's frustrating Android dev), Java gets the brunt of my hate.

I don't have any particular problem with it other than the fact that I'd rather just use Python or Ruby or whatever. If only they were the appropriate tool all of the time...

Hmm, I was about to submit this post, then thought about it some more, and I can't really put a finger on why I'd rather write in something like Python than something like Java. It may be something as simple as the fact that, years ago, I learned on Python. I'm certainly not under any illusion that Python and the like are objectively superior to Java in any general way.

Having just returned to iOS from Android, I find it a breath of fresh air. I think the biggest switch in my style is that I write code first and then write the methods and types I need for it, whereas in Java the types and methods come first, then their usage (to a degree - it obviously involves switching between one and the other). Java's just so enterprisey, and while I like that for purer stuff like web work, where the only I/O you really need to think about is a text (or similar) request/response, it just feels bad for mobile.

BliShem
Dec 21, 2008

ninjeff posted:

You can't assume every English-based programming language's keywords have a direct mapping to words in every other natural language. It's generally the case for other languages from around Europe, sure. But there's no Japanese (for example) word that maps to "while" or "if" alone.

Here's another hard-to-spot source of privilege in programming languages: being written left-to-right, top-to-bottom. Imagine growing up reading right-to-left and then being told you have to read the opposite way to read programs, and "that's just the way it is".

Don't even get me started on printf!

I grew up reading right-to-left, and it wasn't a big deal at all. You really only need to know the same 30-40 English words to understand any programming language. Basic-level English is also very easy to learn, and if you spend any amount of time on the internet you probably know it already.

It also means that if you go abroad you can work as a programmer wherever you end up, that multinational teams can work on the same project easily, and that when Tokyo University invents a new language you don't have to bust out a Japanese dictionary just to understand the hello-world example.

I'm a Hebrew speaker, and PHP's T_PAAMAYIM_NEKUDOTAYIM (its parser token for ::, named with the Hebrew for 'double colon') still fills me with rage every time.

Hughlander
May 11, 2005

Hell I thought it was enough of a pain dealing with Sony sample code where it would be something like:

code:
{
    // J: <Long stream of Kanji characters>
    // E: <Two to three words>
    Long block of complex code
}
We'd usually have Google Translate open to paste the kanji into, because its translation was far better than the English comments.

Opinion Haver
Apr 9, 2007

Clearly we need to do all of our coding in Lojban.

karms
Jan 22, 2006

by Nyc_Tattoo
Yam Slacker
∪λ∑ != ëçæ

I'm fine with math symbols being available - any science symbol is OK in my book - but typing out accented characters is a pain in the loving rear end.

The Gripper
Sep 14, 2004
i am winner

KARMA! posted:

∪λ∑ != ëçæ

I'm fine with math symbols being available - any science symbol is OK in my book - but typing out accented characters is a pain in the loving rear end.
Not only that, characters represented in ASCII are very easily distinguished from each other - even by non-English-speaking people - and have the benefit of being very flexible as far as correctness goes (as a written and spoken language). Poor grammar, bad spelling, and even poor pronunciation can make something difficult to understand, yet still not stray obscenely far from its intention.

Those benefits are true of other similar languages, though; I guess English was just lucky to get its foot in the door first.

Carthag Tuek
Oct 15, 2005

Times shall come,
times shall pass away,
generation shall follow generations' course



The Gripper posted:

easily distinguished from each other

That's pretty facilIlIlIlIlIlIlIlIlIe

The Gripper
Sep 14, 2004
i am winner

Carthag posted:

That's pretty facilIlIlIlIlIlIlIlIlIe
*exceptions include when code is written by ulillillia

tef
May 30, 2004

-> some l-system crap ->

The Gripper posted:

Not only that, characters represented in ASCII are very easily distinguished from each other - even by non-English-speaking people

Dyslexia is a Myth. Wake up sheeple! (Also, English needs more than ASCII to write.)

quote:

and have the benefit of being very flexible as far as correctness goes (as a written and spoken language).

The spelling and pronunciation have nothing in common though? I thought you ought to know about this being tough and thoroughly challenging for newcomers to plough through, a linguistic hiccough or cough perhaps.

quote:

Poor grammar, bad spelling, and even poor pronunciation can make something difficult to understand, yet still not stray obscenely far from its intention.

This might have more to do with humans than with the language spoken.

quote:

Those benefits are true of other similar languages, though; I guess English was just lucky to get its foot in the door first.

Social and cultural factors are the major influence on the spread of English.

The Gripper
Sep 14, 2004
i am winner

tef posted:

The spelling and pronunciation have nothing in common though?
Oh I know, I was just generalizing that neither (bad) pronunciation nor (bad) spelling on its own makes the language impossible to understand (which doesn't help in learning English, but does aid English speakers in understanding improper use of English). A lot of other languages aren't as forgiving in that respect, namely ones where a single changed or transposed character can heavily affect entire phrases, like Mandarin and Japanese, and basically all languages where the written word doesn't directly map to pronunciation at all (I know there are cases in English where this can happen too).

e: none of the pronunciation crap is applicable to programming, though.

The Gripper fucked around with this message at 13:30 on Nov 18, 2012

tef
May 30, 2004

-> some l-system crap ->

The Gripper posted:

Oh I know, I was just generalizing that neither (bad) pronunciation nor (bad) spelling on its own makes the language impossible to understand (which doesn't help in learning English, but does aid English speakers in understanding improper use of English). A lot of other languages aren't as forgiving in that respect, namely ones where a single changed or transposed character can heavily affect entire phrases, like Mandarin and Japanese, and basically all languages where the written word doesn't directly map to pronunciation at all (I know there are cases in English where this can happen too).

I was just wondering if you had any pubic sources on the deanliness on the forgivingness of english.
