NihilCredo
Jun 6, 2011

Suppress anger in every possible way:
that one thing will defame you more than many virtues will commend you

TooMuchAbstraction posted:

You mean the constants that are almost always provided by a math library? The reason you make constants out of those kinds of things is to prevent typos.

I think Vanadium's original point was exactly that it's easy to typo 10000000000 instead of 1000000000, and therefore BILLION is a legitimate constant.

I don't think it's a great point because there are other ways to avoid typos. Even if your language doesn't allow thousand separators, you can type 1000 * 1000 * 1000.
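
A minimal C sketch of that suggestion, for illustration (the constant name is just an example):
code:
/* spell out the factors once instead of counting zeroes at every use */
static const long long ONE_BILLION = 1000LL * 1000 * 1000;   /* 1,000,000,000 */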

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

The main thing number constants do is make you wonder if the constant is secretly not what it says in the name.

CPColin
Sep 9, 2003

Big ol' smile.

zergstain posted:

I thought any decent compiler would emit a warning if you put a semicolon after an if like that. Actually, though, would anything be lost if they made it so that a semicolon by itself isn't a valid C statement?

You wouldn't be able to write for (;;) { ... } instead of while (true) { ... }, for one. Or, I guess, any for loop where you don't have an initialization or iteration step. (Only weirdos do for (;;), BTW.)

Languages that require braces, even for single-line blocks, have the right idea.
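
For reference, the two equivalent spellings being compared -- a minimal C sketch:
code:
void spin_forever(void)
{
    for (;;) {        /* relies on the empty init/condition/increment clauses being legal */
        /* ... */
    }
}

void spin_forever_too(void)
{
    while (1) {       /* equivalent; 'true' needs <stdbool.h> in pre-C23 C */
        /* ... */
    }
}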

brap
Aug 23, 2004

Grimey Drawer
Yeah, I am pretty much arguing for stuff like a 7000+ line let rec/and declaration to be something that could be split across files. Say, a couple of similarly named files in a folder. I don't think .fsi files should be on the file system, or at least they shouldn't be version controlled. If you manually modify the order of elements to convey relative importance to the reader, then you can't regen them, can you?

The biggest gripe I have with F#, though, is that you aren't required to annotate the parameters to every named function. Basically, F# should be doing what Swift does here. It can be very painful not just to skim through function declarations but to make changes, because your ability to view the intended parameters to a function can be destroyed in a cascading way when you're making edits and the project has build errors.

Jaded Burnout
Jul 10, 2004


Xerophyte posted:

The long scale has survived in the rest of Europe and half the Americas so there's still tons of people to confuse by using billion as a constant name.

Interesting. I did not know this fact nor did I know its name. But yeah looks like the UK gubmint adopted the short scale at the same time as decimalising our currency.

Also

quote:

Coueyte not his goodes
For millions of moneye

Volguus posted:

They have no value whatsoever, serve no purpose, and no developer should ever, under any circumstances, do this.

Disagree, in that I see value in anything which reduces errors, and, as mentioned elsewhere, it will reduce errors in languages which don't allow e.g. 1_000_000_000.

NihilCredo posted:

Even if your language doesn't allow thousand separators, you can type 1000 * 1000 * 1000.

Hope you enjoy your IDE moaning about statements which could be replaced by the result.

feedmegin
Jul 30, 2008

ratbert90 posted:

Big Endian AMD64. :psyduck:

Um whut

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.
Some company decided to break GCC to fake big-endian on x86, so, just in case someone did the same thing on AMD64, the OpenSSL devs decided to implement support.

ullerrm
Dec 31, 2012

Oh, the network slogan is true -- "watch FOX and be damned for all eternity!"


Around the time that AMD64 was being developed, all the other 64-bit architectures of the era (Itanium, Alpha, SPARC V9, etc) had switchable endianness in software -- they could fetch and store data in either LE or BE order, depending on OS preference.

So, some OpenSSL dev added support for that in case AMD64 ended up doing the same thing. (e: Or if you did stupid shenanigans with gcc, like Stratus did.) But it was never actually used.

(Ironically, Intel ended up adding the MOVBE instruction on Haswell and later cores, which allows you to read/write data in BE order.)
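
For the curious, roughly what that looks like from C -- a sketch using GCC/Clang's __builtin_bswap32 (whether the compiler actually emits MOVBE depends on the target flags):
code:
#include <stdint.h>
#include <string.h>

/* load a 32-bit big-endian value from a buffer on a little-endian host */
static uint32_t load_be32(const void *p)
{
    uint32_t v;
    memcpy(&v, p, sizeof v);       /* native-order load */
    return __builtin_bswap32(v);   /* byte swap; may compile down to a single MOVBE */
}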

Volguus
Mar 3, 2009

Absurd Alhazred posted:

Do you feel the same way about pi or e?

Defining billion like this
code:
#define BILLION 1000000000
is the same as defining PI as
code:
#define THREE_POINT_ONE_FOUR 3.14
Nonsensical, even if approximately the same thing. You don't define the number of seconds in a day (ignoring leap seconds and so on) as
code:
#define EIGHTY_SIX_THOUSAND_FOUR_HUNDRED 86400
but instead you say what they are
code:
#define SECONDS_IN_A_DAY 86400
or, even better:
code:
#define MESSAGE_CHECK_PERIOD_SECONDS 86400
With the last define it's clear what that value is and what it's used for. With the previous ones ... anything goes. And if tomorrow I decide to check the messages every 10 seconds, the change is simple, in one spot, and won't bother anyone. (Unless, of course, there's that guy who decides that my message check period value is exactly what he needs for his stuff. But the odds of that happening are much higher with BILLION than with VALUE_FOR_ME_ONLY.)
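
A sketch of the kind of usage being described (check_messages is a made-up stand-in):
code:
#include <unistd.h>   /* sleep() */

#define MESSAGE_CHECK_PERIOD_SECONDS 86400

extern void check_messages(void);   /* hypothetical application function */

void message_loop(void)
{
    for (;;) {
        check_messages();
        sleep(MESSAGE_CHECK_PERIOD_SECONDS);   /* the period changes in exactly one place */
    }
}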

Volguus fucked around with this message at 19:59 on Jul 16, 2017

eth0.n
Jun 1, 2012
So it should be ONES_IN_A_BILLION?

zergstain
Dec 15, 2005

CPColin posted:

You wouldn't be able to write for (;;) { ... } instead of while (true) { ... }, for one. Or, I guess, any for loop where you don't have an initialization or iteration step. (Only weirdos do for (;;), BTW.)

Languages that require braces, even for single-line blocks, have the right idea.

I was thinking that was parsed differently, but I suppose the standard says that the for construct has three statements (which I guess makes it a bit odd that the third statement isn't terminated by a semicolon). I suppose it could be special cased, but requiring braces with if, for, do and while would avoid that headache.

b0lt
Apr 29, 2005

Vanadium posted:

I'm gonna defend the BILLION define because I don't want to count the zeroes every time someone uses a huge number, and people don't use 1e9 etc for integers.

C++14 lets you do 1'000'000'000

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

Nippashish posted:

I'd be pretty upset if you had made a constant called THREE_POINT_ONE_FOUR_ONE_FIVE_NINE.

I think the question being asked there was really whether you'd insist that the constant instead be called something like "ratioOfCircleCircumferenceToDiameter"

Absurd Alhazred
Mar 27, 2010

by Athanatos

Hammerite posted:

I think the question being asked there was really whether you'd insist that the constant instead be called something like "ratioOfCircleCircumferenceToDiameter"

Further aliased as areaOfUnitCircle, with 2pi defined as radiansInCircle.

CPColin
Sep 9, 2003

Big ol' smile.

zergstain posted:

I was thinking that was parsed differently, but I suppose the standard says that the for construct has three statements (which I guess makes it a bit odd that the third statement isn't terminated by a semicolon). I suppose it could be special cased, but requiring braces with if, for, do and while would avoid that headache.

Yeah, you're probably right. Sounds like making empty statements be errors is the right idea!
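
For reference, the classic stray-semicolon bug that such a rule (or the warning mentioned earlier) catches -- a minimal C sketch:
code:
#include <stdio.h>

int main(void)
{
    int logging_enabled = 0;
    if (logging_enabled);            /* stray semicolon: the if controls an empty statement */
        puts("always printed");      /* runs unconditionally, despite the indentation */
    return 0;
}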

vOv
Feb 8, 2014

Hammerite posted:

I think the question being asked there was really whether you'd insist that the constant instead be called something like "ratioOfCircleCircumferenceToDiameter"

No, because it should obviously be called twiceSmallestPositiveZeroOfCosineFunction.

TooMuchAbstraction
Oct 14, 2012

I spent four years making
Waves of Steel
Hell yes I'm going to turn my avatar into an ad for it.
Fun Shoe
Or HALF_TAU.

Eela6
May 25, 2007
Shredded Hen

:argh:

TheBlackVegetable
Oct 29, 2006

fleshweasel posted:

The biggest gripe I have with F#, though, is that you aren't required to annotate the parameters to every named function.

That's fair enough, and I agree - I would prefer it if F# were more strict about its syntax (I always set the warning level to 5, set warnings as errors, and add --warnon:1182 to catch unused variables, to try to force as much as I can).

My solution to that gripe, though, was to simply write all my code with the annotations. For anything that doesn't have annotations, Visual Studio F# Power Tools (or maybe it's VS itself or ReSharper, I forget) gives you the signature on mouse-over.

It's nice to be able to leave it up to the type inferencer though.

Zemyla
Aug 6, 2008

I'll take her off your hands. Pleasure doing business with you!

Athas posted:

F# probably has better tooling than any other functional language. It has flaws compared to the big functional languages (Haskell, OCaml, SML, Scala), but unless you're experienced in them, you won't notice. Some of its uglier parts are rooted in compromises to enable good .NET compatibility, but that is probably also its main selling point, so somewhat unavoidable.

It is a good language and nobody will leave you out of their party invitations for using it.

So what is a good way of learning it? I know Haskell, but not C#.

gonadic io
Feb 16, 2011

>>=

Zemyla posted:

So what is a good way of learning it? I know Haskell, but not C#.

Think Haskell 98 but with worse syntax and no monads, and you'll find it very easy. You have ADTs, functions, main, etc. The C# bit only comes in when you start using domain libraries.

e: and strict evaluation, and you have to declare your functions in a file in dependency order, i.e. main is always the last function in a file. And recursive functions need to be declared as such.

But actually programming in it feels the same IMO. First-class functions, map/filter/fold, function composition, the same fundamental (HM) type system, the lists are singly linked and defined in the same way, and there's the same pattern matching.

gonadic io fucked around with this message at 22:20 on Jul 16, 2017

zergstain
Dec 15, 2005

CPColin posted:

Yeah, you're probably right. Sounds like making empty statements be errors is the right idea!

Maybe. I was actually agreeing that requiring blocks for loops and conditionals would be the better approach. On top of not having to make a special case in the standard for the for construct, it would also eliminate bugs like
code:
if (expression)
    if (expression)
        statement
else
    statement
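// note: the else binds to the inner if, despite what the indentation suggests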
I guess K&R didn't think of this stuff, which means it's too late for C.

I was wondering if Swift was one of the languages you were talking about so I looked it up. I see it also doesn't even have C-style for loops.

Vanadium
Jan 8, 2005

Volguus posted:

I'm sorry, it is not defensible in any situation (#define, const long, whatever). And that applies to numbers like 55, 100, or a googol. That constant/macro you define for a purpose: sleep time in an infinite loop, maybe a magic number, maybe ... whatever it may be. The entire reason to have it as a constant is so that you can change it easily should you need to, change it in only one spot, and give it a name that shows its purpose. You can never change BILLION to have the value 23; it wouldn't make any sense.

This kind of code

code:
final long FIVE = 5;
#define MILLION 1000000
does not provide any information about those constants/macros. They have no value whatsoever, serve no purpose, and no developer should ever, under any circumstances, do this.

Uh no I'm not only defining constants so I can easily change them, lol. When I go on and #define RANSOM_IN_USD or whatever, it's more readable as 100 * BILLION than 100000000000 or to some extent even 100*1000*1000*1000. If I come back to the code a year later I don't want to be counting zeroes to figure out the order of magnitude I was thinking in.
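
Spelled out, that comparison looks something like this (a sketch; an LL suffix added so the product fits in 64 bits):
code:
#define BILLION        1000000000LL
#define RANSOM_IN_USD  (100 * BILLION)   /* vs. 100000000000: no zero-counting needed */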

QuarkJets
Sep 8, 2008

You should encode that information in a comment, "// 100 billion", not in #define BILLION 1000000000.

If you need BILLION in a ton of places then you need to re-baseline to be thinking in gigas

CPColin
Sep 9, 2003

Big ol' smile.

zergstain posted:

I was wondering if Swift was one of the languages you were talking about so I looked it up. I see it also doesn't even have C-style for loops.

I was thinking about Ceylon, which both requires braces and treats empty statements as errors. Good times.

Bonfire Lit
Jul 9, 2008

If you're one of the sinners who caused this please unfriend me now.

zergstain posted:

I was thinking that was parsed differently, but I suppose the standard says that the for construct has three statements (which I guess makes it a bit odd that the third statement isn't terminated by a semicolon).

The third part isn't a statement, it's just an expression. (So is the second part; expression statements discard their value, so the condition can't be one.) The first part is an expression followed by a semicolon, or a declaration (which includes a semicolon, so it's not followed by one in the grammar).

C++ makes the whole thing a bit more complicated but basically follows the same rules.
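
Annotated, for reference (simplified from the C11 grammar):
code:
void count_to_ten(void)
{
    /* grammar: for ( clause-1 ; expression-2(opt) ; expression-3(opt) ) statement */
    for (int i = 0;    /* clause-1: here a declaration, which carries its own ';'  */
         i < 10;       /* expression-2: the controlling expression                 */
         i++)          /* expression-3: an expression, not a statement, so no      */
    {                  /* terminating semicolon appears in the grammar             */
        /* body */
    }
}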

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

QuarkJets posted:

You should encode that information in a comment, "// 100 billion", not in #define BILLION 1000000000.

I don't agree. Things shouldn't be in comments if they can be actively incorporated into the code - even if that's just through using variable names.

It is an unfortunate fact that comments rot; when editing, programmers tend to change the code while leaving comments unchanged. So there is a great risk that someone will come along and change "int ransomAmount = 100000000000; // 100 billion" to "int ransomAmount = 10000000000; // 100 billion". I think, although I have no evidence for it, that variable/constant names may tend to be a little less neglected.

Only put things in comments if you cannot manage to make them clear through the code itself. But having said that, variable names are in a sense halfway between comments and code since they have no meaning to the computer. They should still be used though.

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe
It's not an arbitrary statement anyway.

The better argument for allowing empty statements is that it allows things like assert to work, given that they're implemented using macros.

Eela6
May 25, 2007
Shredded Hen

rjmccall posted:

It's not an arbitrary statement anyway.

The better argument for allowing empty statements is that it allows things like assert to work, given that they're implemented using macros.

Would you mind going into more detail? I didn't know this. (I am very much not a systems programmer and don't know very much about how programming languages 'really work'.)

Zemyla
Aug 6, 2008

I'll take her off your hands. Pleasure doing business with you!

gonadic io posted:

Think Haskell 98 but with worse syntax and no monads, and you'll find it very easy. You have ADTs, functions, main, etc. The C# bit only comes in when you start using domain libraries.

e: and strict evaluation, and you have to declare your functions in a file in dependency order, i.e. main is always the last function in a file. And recursive functions need to be declared as such.

But actually programming in it feels the same IMO. First-class functions, map/filter/fold, function composition, the same fundamental (HM) type system, the lists are singly linked and defined in the same way, and there's the same pattern matching.

Does it have RankNTypes and GADTs somewhere? Because I use both of those a lot.

gonadic io
Feb 16, 2011

>>=

Zemyla posted:

Does it have RankNTypes and GADTs somewhere? Because I use both of those a lot.

No and no. Haskell has been a (drastically) changing language; F# has not been. Type providers are kinda like more general monads that can do some really cool things, and it has a really nice system for adding units to ints and floats that isn't just newtype, but compared to Haskell I personally feel it's a little dated. However, as much as it might seem like I'm bashing on F#, I'm not - I do like it, and were I to start a company/commercial enterprise with no legacy it's 100% the language I'd choose - the additional infrastructure, tooling and interoperability really do make a huge difference when it comes to use cases.

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe

Eela6 posted:

Would you mind going into more detail? I didn't know this. (I am very much not a systems programmer and don't know very much about how programming languages 'really work'.)

assert is a macro in C: its "arguments" are just sequences of tokens, and it expands to a sequence of tokens which is then subject to normal parsing. When assertions are disabled, assert is not supposed to do anything, and this is accomplished by having the macro expand to code that does not do anything. (This is why, unlike (say) Java, you cannot dynamically enable and disable assertions in C; you have to recompile.) In fact, it would be valid (I believe) for it to expand to the empty sequence of tokens, so that in code like this:

C code:
assert(x == y);
assert and its arguments would disappear completely from the preprocessed source code, and you'd just be left with the semicolon. That is what I was thinking when I said that assert relied on the validity of the empty statement. But in fact, I am wrong (at least on Darwin) and assert actually expands to ((void) 0), i.e. a non-empty statement that just happens to do nothing.
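
A simplified sketch of the two configurations (not the exact definition from any particular libc; the failure handler is made up):
code:
/* hypothetical failure handler, standing in for the libc-private one */
void my_assert_fail(const char *expr, const char *file, int line);

#ifdef NDEBUG
#  define assert(expr) ((void)0)   /* disabled: a do-nothing expression, not an empty expansion */
#else
#  define assert(expr) \
     ((expr) ? (void)0 : my_assert_fail(#expr, __FILE__, __LINE__))
#endif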

rjmccall fucked around with this message at 00:37 on Jul 17, 2017

b0lt
Apr 29, 2005

rjmccall posted:

assert is a macro in C: its "arguments" are just sequences of tokens, and it expands to a sequence of tokens which is then subject to normal parsing. When assertions are disabled, assert is not supposed to do anything, and this is accomplished by having the macro expand to code that does not do anything. (This is why, unlike (say) Java, you cannot dynamically enable and disable assertions in C; you have to recompile.) In fact, it would be valid (I believe) for it to expand to the empty sequence of tokens, so that in code like this:

C code:
assert(x == y);
assert and its arguments would disappear completely from the preprocessed source code, and you'd just be left with the semicolon. That is what I was thinking when I said that assert relied on the validity of the empty statement. But in fact, I am wrong (at least on Darwin) and assert actually expands to ((void) 0), i.e. a non-empty statement that just happens to do nothing.

If it was defined to nothing, you'd either have to admit assert(true) without a trailing semicolon, or have code that compiles with asserts on, but fails with them off.

QuarkJets
Sep 8, 2008

Hammerite posted:

I don't agree. Things shouldn't be in comments if they can be actively incorporated into the code - even if that's just through using variable names.

It is an unfortunate fact that comments rot; when editing, programmers tend to change the code while leaving comments unchanged. So there is a great risk that someone will come along and change "int ransomAmount = 100000000000; // 100 billion" to "int ransomAmount = 10000000000; // 100 billion". I think, although I have no evidence for it, that variable/constant names may tend to be a little less neglected.

Only put things in comments if you cannot manage to make them clear through the code itself. But having said that, variable names are in a sense halfway between comments and code since they have no meaning to the computer. They should still be used though.

Sure you run the risk of some idiot forgetting to change the comment after changing the value, but you're facing just as much risk of someone coming along and changing the value of BILLION.

And to your first point, the value is already incorporated into the code; you personally just don't have the eyes to count 9 zeros accurately. But the other part of my post addressed all of your concerns: define the variable as ransomAmountInMillions if you think that the value is typically going to contain a large number of zeros and you'd rather not mess with counting them.

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

QuarkJets posted:

Sure you run the risk of some idiot forgetting to change the comment after changing the value, but you're facing just as much risk of someone coming along and changing the value of BILLION.

I think the pro-billion define people are thinking something along the lines of:

code:
#define BILLION 1000000000ULL

#define DELAY_NANOSECONDS (5 * BILLION)   // this is OK.

GTEST(BILLION_DEFINE_UNMOLESTED) {
  ASSERT_EQ(BILLION, 1000*1000*1000);
}  // if you have to do this, you should institute employment review, not code review.

for (unsigned long long i = 0; i < BILLION; i++) { /* ... */ }  // instantly fail code review.
Context matters. I'd argue against re-normalizing against a larger unit, especially since it might lead to fractions and floating point math. Computers can add large numbers just fine.

zergstain
Dec 15, 2005

CPColin posted:

I was thinking about Ceylon, which both requires braces and treats empty statements as errors. Good times.

I'm not really seeing the value of doing both those things. Nobody is going to accidentally write if (a == b) { ; }

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe

b0lt posted:

If it was defined to nothing, you'd either have to admit assert(true) without a trailing semicolon, or have code that compiles with asserts on, but fails with them off.

Code that doesn't compile when assertions are suddenly enabled/disabled for the first time in years is not exactly a heretofore unencountered problem, but I agree that it's good for assert to at least verify that it's in a syntactically valid position.

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

rjmccall posted:

Code that doesn't compile when assertions are suddenly enabled/disabled for the first time in years is not exactly a heretofore unencountered problem, but I agree that it's good for assert to at least verify that it's in a syntactically valid position.

It's a language-breaking change, since old (really old) C programs used empty loops for timing.

It's one of the warts that it'd be nice to do something about in C and C-likes, but there's too much inertia against it. It'd be nice if there were an explicit nop keyword, the same way it'd be nice if switches required explicit fallthrough, assignment and equality were more than a typo apart, and we had a standardized buffer type instead of pointer-to-memory, length, and allocated length all being tracked separately.
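
A sketch of that buffer-type idea, roughly what people hand-roll today (field names are illustrative):
code:
#include <stddef.h>

/* the three things C currently makes you carry around separately */
struct buffer {
    char  *data;   /* pointer to memory */
    size_t len;    /* bytes in use      */
    size_t cap;    /* bytes allocated   */
};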

Everything that stemmed from null-terminated strings was just a bad idea.

CPColin
Sep 9, 2003

Big ol' smile.

zergstain posted:

I'm not really seeing the value of doing both those things. Nobody is going to accidentally write if (a == b) { ; }

I think the latter is probably more because the grammar doesn't specifically allow it, rather than a conscious choice to make empty statements errors. At least, I'm trying to think of an example where an empty statement could mess something up in Ceylon and I can't come up with one right away.

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe

Harik posted:

It's a language-breaking change, since old (really old) C programs used empty loops for timing.

That has not worked in a long time.
