leper khan
Dec 28, 2010
Honest to god thinks Half Life 2 is a bad game. But at least he likes Monster Hunter.

Hammerite posted:

what's wrong with that?

It's nonstandard


Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

leper khan posted:

It's nonstandard

the MS compiler supports it and idk what you're doing if you're building for Windows and not using the MS compiler.

Also I just checked on Wiki and like everything supports it.

Volguus
Mar 3, 2009

leper khan posted:

It's nonstandard

True, but pretty much everyone supports it: https://en.wikipedia.org/wiki/Pragma_once . And you can't deny its benefits (over the standard include guards). But now that I think about it, the only time I used #pragma once was when I used Visual Studio and the IDE itself wrote that for me.

Jewel
May 2, 2009

If you're not using it you're living in the past, hth. If you're making anything modern there's zero reason not to use it. We use it in every header in AAA gamedev and that's compiled over a boatload of compilers for a lot of different platforms.

Soricidus
Oct 21, 2010
freedom-hating statist shill

I'm the giant block of preprocessor macros to copy/paste into all your header files in order to determine, unreliably, whether or not #pragma once is supported, and if so, to use it completely redundantly in addition to #ifdef guards. This definitely belongs in an encyclopedia

Plorkyeran
Mar 22, 2007

To Escape The Shackles Of The Old Forums, We Must Reject The Tribal Negativity He Endorsed
#pragma once doesn't work if you have the same header in multiple locations and both end up getting included.

I've never actually seen this be a problem in practice, and I kinda feel like it may be a feature in that it'd help you discover that your build env is hosed.

Volguus
Mar 3, 2009

Soricidus posted:

I'm the giant block of preprocessor macros to copy/paste into all your header files in order to determine, unreliably, whether or not #pragma once is supported, and if so, to use it completely redundantly in addition to #ifdef guards. This definitely belongs in an encyclopedia

If you even remotely suspect that your code will need to be compilable on compilers written in 3000 BC, then using the guards is the right thing to do, no question about it. It's not even worth it otherwise. You deserve my compassion and condolences.

FlapYoJacks
Feb 12, 2009

Jewel posted:

If you're not using it you're living in the past, hth. If you're making anything modern there's zero reason not to use it. We use it in every header in AAA gamedev and that's compiled over a boatload of compilers for a lot of different platforms.

uclibc and musl don't support it. It's non-standard and shouldn't be used.

FlapYoJacks
Feb 12, 2009

Volguus posted:

If you even remotely suspect that your code will need to be compilable on compilers written in 3000BC, then using the guards is the right thing to do, no question about it. Is not even worth it otherwise. You deserve my compassion and condolences.

Has nothing to do with the compiler and everything to do with the C library the compiler is built against.

fritz
Jul 26, 2003

ratbert90 posted:

uclibc and musl don't support it. It's non-standard and shouldn't be used.

Why would a libc care about a preprocessor directive?

FlapYoJacks
Feb 12, 2009

fritz posted:

Why would a libc care about a preprocessor directive?

In the world of POSIX, preprocessors are separate libraries that the compiler can be built against.

Eela6
May 25, 2007
Shredded Hen
The C preprocessor is kind of a coding horror in and of itself.

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

ratbert90 posted:

uclibc and musl don't support it. It's non-standard and shouldn't be used.

Again - the original code snippet you complained about was explicitly and intrinsically Windows-specific. I looked those two things up and they're both some Linux bullshit.

Kazinsal
Dec 13, 2011


Magic Linux and POSIX poo poo is the reason we can't have nice things? Why, I never

Spatial
Nov 15, 2007

Even the bullshit embedded compilers I use at work support #pragma once. So lol.

Spatial
Nov 15, 2007

Eela6 posted:

The C preprocessor is kind of a coding horror in and of itself.

It's the ultimate horror. It's what allows C and C++ to remain substandard - by providing a means of avoiding any possible improvement with an extremely lovely workaround.

When I ascend the World Throne my first order will be the total elimination of the preprocessor under penalty of death

Spatial
Nov 15, 2007

Why can't I find out the name of an enum in C++? An almost modern language?

Well, you can! Just wrap every declaration in this grotesque macro and *FAAAAART*

FlapYoJacks
Feb 12, 2009

Kazinsal posted:

Magic Linux and POSIX poo poo is the reason we can't have nice things? Why, I never

glibc supports it, but some people really want to adhere strictly to the C/C++ standards. :shrug: In the interest of code portability, if it's not in the C standard you shouldn't use it.

In the case of Windows, if you have no interest in ever porting your code, I guess go nuts? If you do plan on ever porting your code, stick to the standards.

Spatial
Nov 15, 2007

It's not a real problem. It's an imaginary problem that will never happen.

eth0.n
Jun 1, 2012

ratbert90 posted:

In the world of POSIX, preprocessors are separate libraries that the compiler can be built against.

I've done a lot of Linux C and C++, and I've never heard of this, and cannot find anything to support this notion online. As far as I can tell, choice of libc has nothing to do with what the preprocessor supports.

You have a link supporting your assertions? Searches for "musl pragma once" and "uclibc pragma once" appear to turn up nothing, and every discussion of "pragma once" compatibility I can find only talks about compilers.

Pollyanna
Mar 5, 2005

Milk's on them.


JavaScript code:

let currentAgeDays = thisMoment.diff(date, "days"); // because this day next year is -364 days, not -1 year

The horror is that I'm not immediately sure if that's wrong.

Spatial
Nov 15, 2007

drat, I need to port this old code to 64-bit. hmmm...
code:
#define long long long

Plorkyeran
Mar 22, 2007

To Escape The Shackles Of The Old Forums, We Must Reject The Tribal Negativity He Endorsed

ratbert90 posted:

uclibc and musl don't support it. It's non-standard and shouldn't be used.

I suppose this is technically true, but only in that no libc supports #pragma once because it's a compiler feature, not a libc feature.

Plorkyeran
Mar 22, 2007

To Escape The Shackles Of The Old Forums, We Must Reject The Tribal Negativity He Endorsed

ratbert90 posted:

In the world of POSIX, preprocessors are separate libraries that the compiler can be built against.

I think you do not understand as much about what you are talking about as you think you do. musl, uclibc, and glibc are not preprocessors.

TooMuchAbstraction
Oct 14, 2012

I spent four years making
Waves of Steel
Hell yes I'm going to turn my avatar into an ad for it.
Fun Shoe

Spatial posted:

drat, I need to port this old code to 64-bit. hmmm...
code:
#define long long long

I had some friends back in college that were doing a coding competition involving astronomical distances (literally, something about the distances between stars in a galaxy or similar). When they ran into integer precision issues, they just slapped "#define int long long" at the top of the file.

They at least had the excuse of trying to program under extreme time pressure as they were in a competition.

NihilCredo
Jun 6, 2011

suppress anger in every way you can:
one outburst of it will defame you more than many virtues will commend you

Spatial posted:

drat, I need to port this old code to 64-bit. hmmm...
code:
#define long long long

I know nothing about C, so I like to imagine the C99 drafting committee as a bunch of dudes in the proverbial smoke-filled room, exchanging ideas for the new 64-bit integer format:

"Hmm... 'Bigint'?"

"Stinks of sql. 'Longer'?"

"Dumb. How about 'int64'?"

"Shut up Larry. This is C. We need something simple."

*the host's retarded kid kramers through the door* "HEEE HEEE. LONG LONG, DAD. HEEE. LONG LONG. *drools*"

*his father, glowing with barely restrained pride* "...Perfect."

Spatial
Nov 15, 2007

What do you think about this: instead of having a library/module system of any kind, in C we have a textual substitution step which copies and pastes whole files into each other. Added bonus: this system is stateful so it doesn't necessarily copy and paste the same thing if you do it twice. In fact it relies on this to work at all.

In C++ this causes so much reparsing of files it results in exponentially long compile times the larger a project is, and a special mechanism has to be used to un-cripple the entire system, even on a state-of-the-art machine.

Spatial fucked around with this message at 22:02 on May 31, 2017

Bonfire Lit
Jul 9, 2008

If you're one of the sinners who caused this please unfriend me now.

Spatial posted:

What do you think about this: instead of having a library/module system of any kind, in C we have a textual substitution step which copies and pastes whole files into each other. Added bonus: this system is stateful so it doesn't necessarily copy and paste the same thing if you do it twice. In fact it relies on this to work at all.

I suppose this might have made sense in 1972.

That there's still nothing better is probably IBM's fault. They're still mad that the C++ committee removed trigraphs.

NihilCredo
Jun 6, 2011

suppress anger in every way you can:
one outburst of it will defame you more than many virtues will commend you

Bonfire Lit posted:

I suppose this might have made sense in 1972.

That there's still nothing better is probably IBM's fault. They're still mad that the C++ committee removed trigraphs.

I looked up their criticism and I found what I think is the understatement of the century:

a paper written in TYOOL 2014 posted:

In reality, we realize the world is moving mostly to non-EBCDIC [...] We recognize the removal of trigraph will make C++ less surprising, easier to teach, and possibly improve its adaption for non-experts for a majority of users because the majority are in the ASCII or non-EBCDIC Unicode world, and do not know about the needs of the EBCDIC world. [..] We recognize that C++ is mostly an ASCII-centric language now. We will continue to oppose trigraph removal, because we feel someone must speak for the minority of users who cannot speak for themselves. This is not just taking a moral high ground, but being practical. We realize the tide is against the EBCDIC world and as such, whether trigraph is removed or not, IBM compiler, EBCDIC, and non-ASCII users must plan to operate in such a world and it is best to start now.

:shepface: Those drat kids and their newfangled 1963 ASCII :bahgawd:

Wikipedia posted:

Extended Binary Coded Decimal Interchange Code[1] (EBCDIC[1]) is an eight-bit character encoding used mainly on IBM mainframe and IBM midrange computer operating systems. EBCDIC descended from the code used with punched cards and the corresponding six bit binary-coded decimal code used with most of IBM's computer peripherals of the late 1950s and early 1960s.[2] It is also supported on various non-IBM platforms such as Fujitsu-Siemens' BS2000/OSD, OS-IV, MSP, and MSP-EX, the SDS Sigma series, and Unisys VS/9 and MCP.

Jargon File (1983) posted:

EBCDIC: /eb´s@·dik/, /eb´see`dik/, /eb´k@·dik/, n. [abbreviation, Extended Binary Coded Decimal Interchange Code] An alleged character set used on IBM dinosaurs. It exists in at least six mutually incompatible versions, all featuring such delights as non-contiguous letter sequences and the absence of several ASCII punctuation characters fairly important for modern computer languages (exactly which characters are absent varies according to which version of EBCDIC you're looking at). IBM adapted EBCDIC from punched card code in the early 1960s and promulgated it as a customer-control tactic (see connector conspiracy), spurning the already established ASCII standard. Today, IBM claims to be an open-systems company, but IBM's own description of the EBCDIC variants and how to convert between them is still internally classified top-secret, burn-before-reading. Hackers blanch at the very name of EBCDIC and consider it a manifestation of purest evil.

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

Bonfire Lit posted:

I suppose this might have made sense in 1972.

It didn't. General macro systems were a hot topic of research in the 1960s, and Lisp's macros date to 1963.

The creators of C were just throwing poo poo together and trying to get it to work on the super-constrained system they had access to at the time.

Foxfire_
Nov 8, 2010

:eng101: C/C++'s slow build time is mostly due to the grammar being stupid, not the preprocessor being stupid.

code:
dick * butts;
Is this declaring a new variable named butts that's a pointer to the dick type or multiplying the variable dick by the variable butts? There's no way to tell without parsing everything else!

KernelSlanders
May 27, 2013

Rogue operating systems on occasion spread lies and rumors about me.
Maybe text substitution is the real horror?

Why would you ever allow a declaration to be imported more than once?

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

Foxfire_ posted:

:eng101: C/C++'s slow build time is mostly due to the grammar being stupid, not the preprocessor being stupid.

Now now, there's plenty of blame to go around.

Sure everything has to be parsed to understand anything, but the preprocessor stupidity means that "everything" is orders of magnitude larger than it needs to be!

This is why precompiled headers and module systems exist.

Xarn
Jun 26, 2015
Well at least C++ is getting modules in 2014 2017 2020. :v:

Jewel
May 2, 2009

I've said it before, but C#'s "every single function must be in a class" is an incredibly good design: it makes you think about where functions belong instead of scattering them to the wind, keeps things together so you can easily remember "Math.<all the math functions>", and lets the compiler be so, so much nicer. There are all these weird edge cases in C++ that you just don't get. It's all about namespaces rather than dynamically copypasting code haphazardly into every file. I'd love just C++ but with a compiler that enforced every function being inside a class and linked via the C#-esque module namespace system.

b0lt
Apr 29, 2005

ratbert90 posted:

Has nothing to do with the compiler and everything to do with the C library the compiler is built against.

what

idiotmeat
Apr 3, 2010

Bonfire Lit posted:

I suppose this might have made sense in 1972.

That there's still nothing better is probably IBM's fault. They're still mad that the C++ committee removed trigraphs.

I ran into some code the other day that used trigraphs. I couldn't figure out why, since EBCDIC has bracket support, so I replaced them all with their respective characters...

Turns out EBCDIC has a few different flavors, and the brackets tend to move around. :v:

feedmegin
Jul 30, 2008

Jewel posted:

I've said it before, but C#'s "every single function must be in a class" is an incredibly good design: it makes you think about where functions belong instead of scattering them to the wind, keeps things together so you can easily remember "Math.<all the math functions>", and lets the compiler be so, so much nicer. There are all these weird edge cases in C++ that you just don't get. It's all about namespaces rather than dynamically copypasting code haphazardly into every file. I'd love just C++ but with a compiler that enforced every function being inside a class and linked via the C#-esque module namespace system.

Or you could just enforce every function being in a C++ namespace instead? That's literally what those are for.

feedmegin
Jul 30, 2008

ratbert90 posted:

In the world of POSIX, preprocessors are separate libraries that the compiler can be built against.

Others have said it, but, uh, :psyduck: what in God's green earth are you talking about. This is word salad.


Nude
Nov 16, 2014

I have no idea what I'm doing.
Speaking of namespaces (sorry if this is a little off topic), I'm curious why in C++ there seems to be a big debate over "using namespace ...", particularly "using namespace std", while in C# it seems accepted to write "using" rather than fully qualifying names as you would in C++. Is this down to any clear language difference? The reason given online seems to be fear of conflicting method names; so would some people who program in C++ view C#'s "using" as a coding horror of sorts?

Nude fucked around with this message at 13:19 on Jun 1, 2017
