Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

Pollyanna posted:

I just started here, so I can't quite :yotj: yet, but I'm certainly considering it as a way of easing the pain.

...If I get the chance, of course. I'm worried that my constant questions and occasional mistakes might lead them to think I'm not good enough for the job :(

Don't think that. If you JUST started, you can probably just not mention your current job in your resume while you go searching.

Also, constant questions are fine. It's great when a junior dev asks questions: it makes you feel awesomely superior and, more importantly, shows that the junior dev has an interest in growth and learning, at least as long as they don't keep asking the same question over and over again.

Deus Rex
Mar 5, 2005

SupSuper posted:

Sorry, should've clarified.
Computer programming/engineering courses normally just care about programming tools and IDEs so you probably won't even smell a command line, especially since you'll mostly be on Windows.
Computer science courses normally expect that you're already a Linux neckbeard using vim and sed to your heart's content, otherwise why would you even be here.
This is purely anecdotal, but the point is you usually have to teach yourself CLIs, so most people won't bother.

This is not my understanding or anecdotal experience at all. I have an unrelated degree but took a year of CS courses for kicks and all homework assignments were done in Java. You were expected to submit an Eclipse project for grading purposes. Naturally, I wouldn't extrapolate this to the rest of the program since I only did three courses. Looking ahead at the course catalog, it looks like several of the upper division courses are done with pen and paper.

My sister did a CS/Math degree and as far as I know never used Linux or vim or sed. She doesn't have a neckbeard, either.

Friends and family who have completed CE programs (MIT, UIUC, Purdue, UCLA, UC Irvine) certainly did not do most or even much (any) of their work in Windows or in an IDE.

I can't speak for computer programming courses because I haven't heard of those.

crazypenguin
Mar 9, 2005
nothing witty here, move along

The Laplace Demon posted:

Lisps are languages you should learn for your own benefit. If nothing else, they should help you think differently about problems.

FWIW, I think this advice has been replaced by Haskell. There are still a few things you can get only from a Lisp, but it's not clear to me anymore that those are actually all that generally beneficial to learn.

(e.g. I don't think multimethods are actually a good language feature, and I think most uses of them are either better off as pattern matching on algebraic datatypes, or with a different design altogether. Multimethods are hard to reason about. Haskell actually has a similar problem with overlapping multiparameter type classes, but they're not a standard feature, they're an extension. And the community discourages their use in favor of type families instead.)
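
To make the pattern-matching alternative concrete, here's roughly what it looks like in C++17 terms (a sketch; the Shape/Circle/Rect names are invented for illustration, not from any real codebase):
C++ code:
#include <iostream>
#include <variant>

// An algebraic datatype: a Shape is exactly a Circle or a Rect.
struct Circle { double r; };
struct Rect   { double w, h; };
using Shape = std::variant<Circle, Rect>;

// "Pattern matching": one case per alternative. Add a new alternative
// and this fails to compile until the new case is written, which is
// exactly the kind of reasoning multimethods make hard.
struct Area {
    double operator()(const Circle& c) const { return 3.14159265 * c.r * c.r; }
    double operator()(const Rect& r)   const { return r.w * r.h; }
};

int main() {
    Shape s = Rect{2.0, 3.0};
    std::cout << std::visit(Area{}, s) << "\n";  // prints 6
}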

ExcessBLarg!
Sep 1, 2001

SupSuper posted:

Sorry, should've clarified.

So, yeah, I can see how the CS and Comp E programs at a particular school might end up this way, but I wouldn't call it "the norm" across programs. But anecdotally? Yeah, that's happened.

Pavlov
Oct 21, 2012

I've long been fascinated with how the alt-right develops elaborate and obscure dog whistles to try to communicate their meaning without having to say it out loud
Stepan Andreyevich Bandera being the most prominent example of that
I'm just finishing the CS track at my college (a middle-of-the-road one) and even in the programming 101 course that half the campus took, you were required to submit your homework through a terminal command on one of the linux lab machines. One of the first things they taught was how to SSH into one remotely.

Pavlov fucked around with this message at 20:20 on Mar 25, 2015

carry on then
Jul 10, 2010

by VideoGames

(and can't post for 10 years!)

Deus Rex posted:

This is not my understanding or anecdotal experience at all. I have an unrelated degree but took a year of CS courses for kicks and all homework assignments were done in Java. You were expected to submit an Eclipse project for grading purposes. Naturally, I wouldn't extrapolate this to the rest of the program since I only did three courses. Looking ahead at the course catalog, it looks like several of the upper division courses are done with pen and paper.

My sister did a CS/Math degree and as far as I know never used Linux or vim or sed. She doesn't have a neckbeard, either.

Friends and family who have completed CE programs (MIT, UIUC, Purdue, UCLA, UC Irvine) certainly did not do most or even much (any) of their work in Windows or in an IDE.

I can't speak for computer programming courses because I haven't heard of those.

Those aren't standardized, though; they're up to the preferences of the professor/grader/whomever. To provide my anecdote, I was a grader/tutor for the data structures course, which was in Java but only required the source files, not an IDE project. They did, however, require a screenshot, and I noticed that the vast majority of students used their own PC (mostly Windows, a few OS X, one or two Linux) and almost all used the IDE used in the class--not because they were required to, but because it was what the professor did her demos in and the easiest to follow along in. I'd bet that if the professor had been doing everything in the shell, that's what I would have seen the most of. This diverged as we got into the higher levels and people became more confident in what they liked/used, but it was a rare class that actually dictated the tools used to do the programming (only really weird stuff like DrScheme).

Space Kablooey
May 6, 2009


Pollyanna posted:

I was hired on under the impression that I'd be in charge of refactoring it into something less insane (read: rewrite), but apparently that won't be for a while :( which makes sense, since it's still unclear to me what the architecture (and details of the goal) of the app is. Definitely doing BDD for that project.

...If I get the chance, of course. I'm worried that my constant questions and occasional mistakes might lead them to think I'm not good enough for the job :(

If you are the one who has been brought in to refactor that mess, they have absolutely no reason to doubt your abilities if you keep asking them stuff. If I brought someone in to refactor my messes and that someone didn't ask any questions, then I would start to become worried.

The same goes for occasional mistakes. If anything*, it shows that the newcomer is trying something that is a bit out of their league, and that also shows interest in growth.

*: As long as it's not really basic programming stuff or the same error repeated very frequently

No Safe Word
Feb 26, 2005

Pavlov posted:

I'm just finishing the CS track at my college (a middle-of-the-road one) and even in the programming 101 course that half the campus took, you were required to submit your homework through a terminal command on one of the linux lab machines. One of the first things they taught was how to SSH into one remotely.

Wanna shake the hands of your CS department faculty.

chutwig
May 28, 2001

BURLAP SATCHEL OF CRACKERJACKS

Pavlov posted:

I'm just finishing the CS track at my college (a middle-of-the-road one) and even in the programming 101 course that half the campus took, you were required to submit your homework through a terminal command on one of the linux lab machines. One of the first things they taught was how to SSH into one remotely.

When I was at Rutgers 14 years ago, the CS111 homework was auto-graded by a program on one of the CS department's servers. To submit, you also had to be able to SSH in and scp your homework over (or copy and paste it into vi/emacs if you just couldn't figure that part out). I never looked at Rutgers as being an even remotely progressive institution in any way, but the idea that a decade and a half later there are still CS departments that manually grade homework using TAs is a bit surprising to me.

Zopotantor
Feb 24, 2013

...und ist er drin dann lassen wir ihn niemals wieder raus...

Subjunctive posted:

Get off the main thread. Then tell me where to send my 5-figure consulting invoice.

gently caress every OS with a single-threaded event loop.
yes I'm still bitter that BeOS didn't go anywhere

JawnV6
Jul 4, 2004

So hot ...
God forbid a human actually look at code and provide comments back to the student. Much better to toss it into a machine to be ground down to a hard number. Gets rid of all those pesky conversations about how they could've done better.

Evil_Greven
Feb 20, 2007

Whadda I got to,
whadda I got to do
to wake ya up?

To shake ya up,
to break the structure up!?

1337JiveTurkey posted:

With CRC cards, I mentioned the classes first since they're the first C, but it's focused on what the responsibilities are. What needs to happen determines what classes do what. When you get into chess pieces or disabilities, the question is "Is this something we need to account for from the spec?" If you want a chess game which allows for variant chess pieces, putting the code to determine the valid moves in the respective pieces makes sense.

For a disabled person taking out the trash, some things will necessarily be different and some things will necessarily be the same with the responsibilities divvied up in a way that reflects that. However without some idea as to what limitations we have to expect on people trying to take out the garbage, we can't tell whether it warrants different people having different takeOutGarbage() methods or if it needs some sort of Job class that has a TakeOutGarbage subclass.

From a modelling perspective whatever has the most expressive power seems superior but if we're looking at requirements, there's a line where it's good enough. Remember that all models are wrong, some are useful.

Hmm... it still seems like the idea is to give multiple responsibilities to one entity after creating that entity. Let's look at a simplified structure that might result from these approaches for chess, then.

Either way of looking at it is going to come up with pieces and a board as they are integral parts of the game:
C++ code:
class Piece { ... }
class Board { ... }
A big difference is in how the members of these classes are filled out. The Who/What perspectives might make movement part of the Piece (and its children) object(s). Making a child object for each type of piece seems like a reasonable idea from the model perspective, because the movement for each type of piece is different. If the movement is moved into the Piece, then that object ought to know that information - a method should be something like move(toFile, toRank) (file in chess is columns, while rank is rows).

However, consider this - what is the Board object supposed to do? It becomes somewhat redundant - is there still a point to keeping track of positions in that object? It would be more sensible to simply encapsulate the pieces. This approach is forgetting a fundamental thing about the board game: pieces don't move by themselves - a human moves them. Pieces simply exist on the board. The board simply exists in the world. Both are inanimate things, so why are they doing things on their own in a model perspective?

We could model the player itself and determine that the player object is responsible for moving things. Yet, if we wanted to model how a player truly operates, it might be useful to consider that a player doesn't necessarily know or follow the rules. The rules themselves are an abstract, yet well-defined, concept - a fictional structure players generally agree to operate under. Shifting the onus to these rules, as you had alluded to earlier, makes more sense:
C++ code:
class ValidateMove { ... }
This seems better - it is responsible for rule enforcement with regard to movement. Having a whole bunch of methods - one for each type of piece - is doable and allows for extension (albeit a clumsy implementation). What goes in the Board and Piece objects, now? Piece can become much simpler: a type and its respective getter and setter. There's now little point in having child objects. Pieces might look different and be moved differently, but they're still just things sitting on a game board. Board might contain a two-dimensional array of spaces, each of which a Piece can occupy, with its respective getters and setters.

Yet, who now does movement? ValidateMove is clearly responsible for checking moves, thus something must ask it to do this. Board? Piece? We could just follow the above notion that led to the creation of ValidateMove - and makes some sense from the model perspective - that a player is responsible for a move:
C++ code:
class Player { ... }
For now, that seems sensible, but it might not in the long run. There are other concerns that lead towards this structure - chess, like many board games, is frequently timed. There are a number of variations on this, as there are variations of other rules. If any of these variations were added to the design spec, it makes even less sense to put the responsibility for movement in the individual pieces.
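
Sketching that structure out (the method names and the placeholder rule body below are invented, just to show where each responsibility lands):
C++ code:
#include <array>
#include <optional>

enum class PieceType { Pawn, Knight, Bishop, Rook, Queen, King };
enum class Color { White, Black };

// A piece is inert: just a type and a color sitting on the board.
struct Piece {
    PieceType type;
    Color color;
};

// The board only stores what occupies each square; it knows no rules.
class Board {
public:
    std::optional<Piece>& at(int file, int rank) { return squares[file][rank]; }
    const std::optional<Piece>& at(int file, int rank) const { return squares[file][rank]; }
private:
    std::array<std::array<std::optional<Piece>, 8>, 8> squares;
};

// Rule enforcement lives in one place, per-piece logic and all.
class ValidateMove {
public:
    bool check(const Board& b, int fromFile, int fromRank, int toFile, int toRank) const {
        return true;  // placeholder: one movement rule per PieceType would go here
    }
};

// The player initiates the move and defers to the rules;
// pieces never move themselves.
class Player {
public:
    Player(Color c, const ValidateMove& rules) : color(c), rules(rules) {}
    bool tryMove(Board& b, int ff, int fr, int tf, int tr) const {
        if (!rules.check(b, ff, fr, tf, tr)) return false;
        b.at(tf, tr) = b.at(ff, fr);  // the player moves the piece...
        b.at(ff, fr).reset();         // ...the square it left is now empty
        return true;
    }
private:
    Color color;
    const ValidateMove& rules;
};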

Volmarias
Dec 31, 2002

EMAIL... THE INTERNET... SEARCH ENGINES...

chutwig posted:

When I was at Rutgers 14 years ago, the CS111 homework was auto-graded by a program on one of the CS department's servers. To submit, you also had to be able to SSH in and scp your homework over (or copy and paste it into vi/emacs if you just couldn't figure that part out). I never looked at Rutgers as being an even remotely progressive institution in any way, but the idea that a decade and a half later there are still CS departments that manually grade homework using TAs is a bit surprising to me.

Same here. Hello from the class of '05

Of course, when I went back there for a recruiting event in TYOOL 2015 they still weren't teaching their students to use any form of source control, instead trusting them to just figure something out on their own.

There should be a mandatory "here are things you need to learn if you plan on programming for a living" class, but between ivory tower types who don't want to get their fingers dirty by writing actual code and professors who haven't done anything since the 80s, I doubt that will change.

Volmarias fucked around with this message at 21:37 on Mar 25, 2015

Pavlov
Oct 21, 2012

I've long been fascinated with how the alt-right develops elaborate and obscure dog whistles to try to communicate their meaning without having to say it out loud
Stepan Andreyevich Bandera being the most prominent example of that

JawnV6 posted:

God forbid a human actually look at code and provide comments back to the student. Much better to toss it into a machine to be ground down to a hard number. Gets rid of all those pesky conversations about how they could've done better.

In our courses we still had a TA give the code a once-over for feedback after the fact, but the grading program was great because the professors would configure it to run your code on a series of test cases and instantly give you back the diff results. You essentially had no excuse for formatting issues and the like.

The MUMPSorceress
Jan 6, 2012


^SHTPSTS

Gary’s Answer

Volmarias posted:

Same here. Hello from the class of '05

Of course, when I went back there for a recruiting event in TYOOL 2015 they still weren't teaching their students to use any form of source control, instead trusting them to just figure something out on their own.

There should be a mandatory "here are things you need to learn if you plan on programming for a living" class, but between ivory tower types who don't want to get their fingers dirty by writing actual code and professors who haven't done anything since the 80s, I doubt that will change.

The professor supervising my student lecturer for Intro to OSes bragged that he didn't learn x86 until last year, and he only did it because they were making him teach the OS class. So yes, academia definitely has some weird disconnect with reality.

chutwig
May 28, 2001

BURLAP SATCHEL OF CRACKERJACKS

Volmarias posted:

Same here. Hello from the class of '05

Sup fellow '05er. :cool: I only did CS111 and CS112 since I wasted 4 years in the linguistics program.

Re: academia being insular, I've worked at both Rutgers and UPenn. At Rutgers I was pretty far removed from the worst of the academic stuff and mostly just had to deal with the occasional stereotypical physics professor trundling over to my department's offices to inform us that his Sun pizzabox was unable to open the EPS file we had generated from an Illustrator document because pine wiped out the attachment or some crap. It was really bad at Penn, though, and very negatively influenced operational activities, because people who had been in the echo chamber for 15+ years would go on lengthy tangents about, among other things:
  • why virtualization is the devil and an unproven technology (I worked at Penn from 2009-2013)
  • SPARC is a better ISA than x86 and so we should order more poo poo from Sun rather than buying Dells that cost a quarter as much
  • nobody knows Git and we need to keep using RCS, or CVS if you're really daring
  • no host should be connected at gigabit ever and everything should run at 100/half
Basically people who lectured were also in charge of making decisions about ops stuff and it was a nightmarish disaster. Last I checked, most people evacuated the department about when I did and the VP was fired for having allowed such a pile of poo poo to fester.

chutwig fucked around with this message at 23:22 on Mar 25, 2015

Sauer
Sep 13, 2005

Socialize Everything!

Spent all day today creating milling programs on an ancient CNC controller running an ancient version of LynxOS (a real-time Unix that guarantees deterministic order of operations; it's slow as hell). You can define variables, but they have to be only four characters long, and none of those characters can belong to a symbol used by the controller itself, so forget X, Y, Z, A, T, G, M or L. You can retrieve machine values stored in the controller (such as current tool diameter), but you'd better have memorized the opcodes for them, because they certainly don't have intuitive names and the manual was lost years ago. And forget debugging. You're not stepping through the program line by line or doing simulated runs, so I hope you got it right the first time, or that endmill is going to cut something that is probably very expensive or impossible to replace. All of this I can forgive, since making a robot turn trees into useful products is a pretty awesome job and I love playing with these machines.

What I can't forgive is the variable width font in the program editor.

Pollyanna
Mar 5, 2005

Milk's on them.


HardDisk posted:

If you are the one who has been brought in to refactor that mess, they have absolutely no reason to doubt your abilities if you keep asking them stuff. If I brought someone in to refactor my messes and that someone didn't ask any questions, then I would start to become worried.

The same goes for occasional mistakes. If anything*, it shows that the newcomer is trying something that is a bit out of their league, and that also shows interest in growth.

*: As long as it's not really basic programming stuff or the same error repeated very frequently

Yeah...most of the reason I'm insecure is that I still refer to my teammates and direct superior often for relatively small things ("Do I understand these ticket requirements right?", "Where does this code I'm looking for live?") and it might be getting on their nerves :ohdear: But it's better than the alternative.

One thing I want to do when we for-real start the refactoring is to spend at least the first sprint doing no coding whatsoever. The sprint(s) will be dedicated to compiling the features and required behavior of the application, and documenting what the hell it does/is supposed to accomplish. Not having something like that weirded me the hell out at the beginning and it will make my job way easier in the future.

ExcessBLarg!
Sep 1, 2001

LeftistMuslimObama posted:

The professor supervising my student lecturer for Intro to OSes bragged that he didn't learn x86 until last year, and he only did it because they were making him teach the OS class.

Wow, he picked up real-mode segmentation, protected mode task structures, long mode (is it 64-bit?) simplifications, fast syscalls, the A20 gate (and at least 2 of the 3 ways to enable it) pretty darn quickly.

The PC systems architecture is like the antithesis of academic systems. It's an absurd mess.

Deus Rex
Mar 5, 2005

ExcessBLarg! posted:

Wow, he picked up real-mode segmentation, protected mode task structures, long mode (is it 64-bit?) simplifications, fast syscalls, the A20 gate (and at least 2 of the 3 ways to enable it) pretty darn quickly.

Getting back on topic, the x86 A20 gate is one of my favorite coding horrors. The 286 increased the address space to 24 bits, expanding on the 8086's 20-bit address space. Now, the 8086 only had 20 lines on the address bus, but you could nevertheless craft segment numbers and offsets that would generate a physical address > 2^20 - 1. The 8086 would silently wrap addresses mod 2^20, so that if you tried to access the physical address 2^20 + 1, you would end up accessing physical address 1.

The 286 introduced real mode, which was intended to emulate the 8086 for backwards compatibility. But initially, they failed to account for this quirk in real mode, and as it turned out there were many programs which relied on the 8086's wraparound behavior. So Intel decided to put a gate on the A20 line (address lines were 0-indexed) and leave it disabled by default to maintain backwards compatibility; in protected mode, a program would need to enable the A20 line in order to address memory at every odd megabyte.

Even today, every modern x86 operating system still has to go through the rigmarole of toggling the A20 gate during boot.
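
A quick worked example of the wraparound arithmetic (hypothetical segment:offset values, plain C++ just to show the math):
C++ code:
#include <cstdio>

int main() {
    // 8086 real mode: physical address = segment * 16 + offset.
    unsigned seg = 0xFFFF, off = 0x0011;
    unsigned full = (seg << 4) + off;  // 0x100001, i.e. 2^20 + 1: needs a 21st address line
    unsigned wrap = full & 0xFFFFF;    // the 8086 has only 20 lines, so it wraps to 0x00001
    std::printf("%04X:%04X -> %06X, the 8086 accesses %05X\n", seg, off, full, wrap);
}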

http://www.win.tue.nl/~aeb/linux/kbd/A20.html

ExcessBLarg!
Sep 1, 2001

I think it was IBM that added the A20 gate. Because they put it on the 8042 keyboard controller, it requires the OS to prod the controller in a somewhat slow and dreadful way in order to actually enable it. On the PS/2 they added it to "System Control Port A", which is much faster/easier to enable, but not universally supported. There's also a BIOS call to enable it, but not all BIOSes support the call. So you need to support at least 2 of the 3 ways for hardware compatibility.

The other fun detail is that the 486 added an A20 pin to the CPU, asserted by the same gate, to ensure that the wrap-around behavior was preserved in cache. I assume the pin is still there, unless this whole mess got virtualized somewhere along the way.

pseudorandom name
May 6, 2007

Wikipedia says Nehalem removed the dedicated pin in favor of a message over QPI.

omeg
Sep 3, 2012

x86 is the horror. I want my 6502 back.

Deus Rex
Mar 5, 2005

ExcessBLarg! posted:

I think it was IBM that added the A20 gate. Because they put it on the 8042 keyboard controller, it requires the OS to prod the controller in a somewhat slow and dreadful way in order to actually enable it. On the PS/2 they added it to "System Control Port A" which is much faster/easier to enable, but not universally supported. There's also a BIOS call to enable it, but not all BIOSes support the call. So you need to support at least 2/3 of the ways for hardware compatibility.

Ahh, it makes much more sense that IBM added the gate.

TheresaJayne
Jul 1, 2011

omeg posted:

x86 is the horror. I want my 6502 back.

I miss the 68000 + Gary and Angus?

Steve French
Sep 8, 2003

SupSuper posted:

Computer programming/engineering courses normally just care about programming tools and IDEs so you probably won't even smell a command line

What do you think the things people use the command line for are, if not programming tools?

SupSuper posted:

This is purely anecdotal, but the point is you usually have to teach yourself CLIs, so most people won't bother.

This was my experience; overall your point is spot on.

Look Around You
Jan 19, 2009

Steve French posted:

What do you think the things people use the command line for are, if not programming tools?


This was my experience; overall your point is spot on.

I use it for file management since it's way more efficient than GUI file managers.

Evil_Greven
Feb 20, 2007

Whadda I got to,
whadda I got to do
to wake ya up?

To shake ya up,
to break the structure up!?
So, I was flipping through an old book of mine circa 1998 (C Programming: A Modern Approach), and came across the ## operator, which I had probably seen before and then promptly forgotten about. It still exists, but it seems pretty impractical to me.

Some fun uses of the ## operator:
C++ code:
//A short guide on how to frustrate everyone
#define BUTT(n) butt##n
...
double BUTT(9), BUTT(11);	//this creates two double variables, butt9 and butt11
butt9 = 9;			//and you can use them like so
BUTT(11) = 11;			//or like this
cout << butt9/BUTT(11); 	//resulting in 0.818182 being printed
and:
C++ code:
//A short guide on wasting time using a multiline macro
#define GENERIC_MAX(T)		/
T T##_max(T x, T y)		/
{				/
	return x > y ? x : y;	/
}
...
GENERIC_MAX(double);	//defines function double double_max(double x, double y) with body from macro
...
cout<<double_max(9,11);	//this prints 11
Are these useful for anything beyond pissing people reading code off / wasting time, or are they simply the horror of the language as they appear?

Evil_Greven fucked around with this message at 05:05 on Mar 27, 2015

Plorkyeran
Mar 22, 2007

To Escape The Shackles Of The Old Forums, We Must Reject The Tribal Negativity He Endorsed

It's used all over the place in any code that does much of anything with the preprocessor.

WHERE MY HAT IS AT
Jan 7, 2011

Volmarias posted:

Same here. Hello from the class of '05

Of course, when I went back there for a recruiting event in TYOOL 2015 they still weren't teaching their students to use any form of source control, instead trusting them to just figure something out on their own.

This seems to be the norm at most schools as far as I can tell. Oddly enough, I'm in a program at a poo poo-tier community college and source control is mandatory for all our projects (i.e., they run a git log on your submission before grading it, and you get marks docked if you didn't use it or your commit messages are all "aldkfkdfjdj"), but that seems to be the exception rather than the rule. I feel sorry for most of the other interns at work, though, who've never used source control before and so now their only exposure to it is loving ClearCase.

KaneTW
Dec 2, 2011

Evil_Greven posted:

So, I was flipping through an old book of mine circa 1998 (C Programming: A Modern Approach), and came across the ## operator, which I had probably seen before and then promptly forgotten about. It still exists, but it seems pretty impractical to me.

Some fun uses of the ## operator:
C++ code:
//A short guide on how to frustrate everyone
#define BUTT(n) butt##n
...
double BUTT(9), BUTT(11);	//this creates two double variables, butt9 and butt11
butt9 = 9;			//and you can use them like so
BUTT(11) = 11;			//or like this
cout << butt9/BUTT(11); 	//resulting in 0.818182 being printed
and:
C++ code:
//A short guide on wasting time using a multiline macro
#define GENERIC_MAX(T)		/
T T##_max(T x, T y)		/
{				/
	return x > y ? x : y;	/
}
...
GENERIC_MAX(double);	//defines function double double_max(double x, double y) with body from macro
...
cout<<double_max(9,11);	//this prints 11
Are these useful for anything beyond pissing people reading code off / wasting time, or are they simply the horror of the language as they appear?

Preprocessor macros are pretty useful, especially when writing boilerplate code.
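
For example, the classic "X macro" trick (the COLOR_LIST name here is invented) generates an enum and its string table from a single list, so the two can never drift apart:
C++ code:
#include <iostream>

// List the entries once; X is redefined before each expansion.
#define COLOR_LIST \
    X(Red)         \
    X(Green)       \
    X(Blue)

enum Color {
#define X(name) Color_##name,
    COLOR_LIST
#undef X
};

const char* color_name(Color c) {
    switch (c) {
#define X(name) case Color_##name: return #name;
    COLOR_LIST
#undef X
    }
    return "?";
}

int main() {
    std::cout << color_name(Color_Green) << "\n";  // prints "Green"
}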

Hammerite
Mar 9, 2007

And you don't remember what I said here, either, but it was pompous and stupid.
Jade Ear Joe

Evil_Greven posted:

So, I was flipping through an old book of mine circa 1998 (C Programming: A Modern Approach), and came across the ## operator, which I had probably seen before and then promptly forgotten about. It still exists, but it seems pretty impractical to me.

Some fun uses of the ## operator:
C++ code:
//A short guide on how to frustrate everyone
#define BUTT(n) butt##n
...
double BUTT(9), BUTT(11);	//this creates two double variables, butt9 and butt11
butt9 = 9;			//and you can use them like so
BUTT(11) = 11;			//or like this
cout << butt9/BUTT(11); 	//resulting in 0.818182 being printed
and:
C++ code:
//A short guide on wasting time using a multiline macro
#define GENERIC_MAX(T)		/
T T##_max(T x, T y)		/
{				/
	return x > y ? x : y;	/
}
...
GENERIC_MAX(double);	//defines function double double_max(double x, double y) with body from macro
...
cout<<double_max(9,11);	//this prints 11
Are these useful for anything beyond pissing people reading code off / wasting time, or are they simply the horror of the language as they appear?

Reminds me of programming TeX

\def\x{tt}\expandafter\def\csname bu\x\endcsname{fart}\butt % fart

Spatial
Nov 15, 2007

KaneTW posted:

Preprocessor macros are pretty useful, especially when writing boilerplate code.

They are terrible and should be annihilated from the face of this otherwise good Earth.

Spatial
Nov 15, 2007

Macros are "Why make this nicer when we have this aggressively lovely workaround?" except at language level.

loving Dennis Ritchie is a loving pussy. I’m going to loving bury that guy, I have done it before and I will do it again. I’m going to loving kill C.

Dicky B
Mar 23, 2004

I remember pre-C++11 using token pasting to write fake variadic templates :allears:
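
One flavor of that trick, reconstructed from memory with invented names: paste the argument count onto a base name and let the preprocessor's rescanning do the dispatch.
C++ code:
#include <iostream>

// PRINT(2)(a, b) pastes to PRINT2, which then consumes (a, b) on rescan.
#define PRINT(n) PRINT##n
#define PRINT1(a)    std::cout << (a) << "\n"
#define PRINT2(a, b) std::cout << (a) << " " << (b) << "\n"

int main() {
    PRINT(1)("hello");    // expands to PRINT1("hello")
    PRINT(2)("x =", 42);  // expands to PRINT2("x =", 42)
}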

The MUMPSorceress
Jan 6, 2012


^SHTPSTS

Gary’s Answer
Forgive me if this is a dumb question, but I don't really get the point of function-like macros anyway. Is it just that they're inlined so you don't add a level to the stack for small computations? It seems like most of the time that level of optimization would be unnecessary, and if I'm doing that calculation often enough that I need a shortcut for it, I'd rather make it a function in a header file that I can use in other source files or projects.

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



WHERE MY HAT IS AT posted:

This seems to be the norm at most schools as far as I can tell. Oddly enough I'm in a program at a poo poo-tier community college and source control is mandatory for all our projects (ie they do a git log in your submission before grading it and you get marks docked if you didn't use it or your commit messages are all "aldkfkdfjdj") but that seems to be the exception rather than the rule. I feel sorry for most of the other interns at work though who've never used source control before and so now their only exposure to it is loving clearcase.

Your poo poo tier CC sounds like they have their poo poo together (in that area) and you should probably write someone a nice letter to that effect.

ShoulderDaemon
Oct 9, 2003
support goon fund
Taco Defender

LeftistMuslimObama posted:

Forgive me if this is a dumb question, but I don't really get the point of function-like macros anyway. Is it just that they're inlined so you don't add a level to the stack for small computations? It seems like most of the time that level of optimization would be unnecessary, and if I'm doing that calculation often enough that I need a shortcut for it, I'd rather make it a function in a header file that I can use in other source files or projects.

One "nice" thing about macros is that they can be non-syntactic. In our codebase at work, we have macros along the lines of:
C++ code:
#define ASSERT(c,e,m) do if (! (e)) {                    \
  ::std::ostringstream _msg;                             \
  _msg << __FILE__ ":" << __LINE__ << ": " #c ": " << m; \
  throw ::ducky::Assert##c(_msg.str());                  \
} while (false)
This lets you write assertions like:
C++ code:
ASSERT(ContractViolation, foo == bar, "Expected " << bar << " but got " << foo << " from " << dump_state());
This is doing a few neat things:
  • The different types of assertion exceptions are all handled by the same macro, using token pasting.
  • The assertion messages have filenames and line numbers, which you can't get if you generate them with a real function.
  • The non-syntactic message parameter allows us to build complex assertion messages at runtime without having to explicitly create temporaries at assertion sites.
  • The non-syntactic message parameter is evaluated only if the assertion fires, so if e.g. dump_state() is very slow, it will not be called in the normal case.

It's not like we couldn't work without it, but it makes some parts of the codebase a lot nicer to deal with.

That said, a lot of the "obvious" uses of macros are either terrible poo poo nobody should want to do, or are doable with just a little more work using either normal functions or templates. Macros have their place, mostly in reducing code that you would otherwise be forced to manually duplicate at every usage site.

ExcessBLarg!
Sep 1, 2001

Evil_Greven posted:

C++ code:
//A short guide on wasting time using a multiline macro
#define GENERIC_MAX(T)		/
T T##_max(T x, T y)		/
{				/
	return x > y ? x : y;	/
}
...
GENERIC_MAX(double);	//defines function double double_max(double x, double y) with body from macro
...
cout<<double_max(9,11);	//this prints 11

Wow, that's a lot of divisions being done there.

LeftistMuslimObama posted:

but I don't really get the point of function-like macros anyway.

So they can be used as a poor-man's inline function, and in the past had been used as such before compilers supported inlining (or did a good job of inlining). However, "function-like" macros can generate all sorts of expressions that wouldn't be valid (or otherwise couldn't be constructed) as a separate function. For example, code inside a macro can access variables in the scope where the macro is called, even though they aren't passed to the macro. Evil_Greven's example also demonstrates an implementation of poor-man's generics.
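
A tiny illustration of that scope trick (contrived names, of course):
C++ code:
#include <cstdio>

// "errors" is never passed in; the macro body lands in the caller's scope
// and quietly picks it up. A real function could never do this.
#define LOG_AND_COUNT(msg) do { std::printf("%s\n", msg); ++errors; } while (0)

int main() {
    int errors = 0;
    LOG_AND_COUNT("first problem");
    LOG_AND_COUNT("second problem");
    std::printf("%d errors\n", errors);  // prints "2 errors"
}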

LeftistMuslimObama posted:

I'd rather make it a function in a header file that I can use in other source files or projects.

Beware that tossing a function in a header file can cause issues. Unless you declare it "static", including the header in multiple translation units will result in multiple copies of the function being generated in object code with external linkage, and a symbol conflict upon linking. C99/C++ support the "inline" keyword to get around this, but "static" vs. "static inline" is something of a zen riddle, and non-static inline has weird semantics I always forget.
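
Something like this (hypothetical mymax.h) is the usual safe default:
C++ code:
// mymax.h -- safe to include from any number of translation units
#ifndef MYMAX_H
#define MYMAX_H

// "static" gives each translation unit its own private copy, so the linker
// never sees two definitions of the same external symbol; "inline" hints
// that the call can be folded away entirely.
static inline int mymax(int x, int y) {
    return x > y ? x : y;
}

#endif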

The MUMPSorceress
Jan 6, 2012


^SHTPSTS

Gary’s Answer
Interesting. I'm just going to post in this thread whenever I encounter things I don't get, because I get way more useful answers from y'all than I get from grad student lecturers who are just teaching to pay for their master's degrees.
