|
The Management posted: true on some architectures

how's the carter administration?
|
# ? Jun 24, 2017 14:44 |
|
Cocoa Crispies posted: i straight up never know what sizes c variables are anymore so last time i wrote c i typedef'd u8, s8, f64, etc. and just asserted their sizeof during startup tests

jfc. not knowing stdint is basically a fail during the interview.

Sweeper posted: which one

actually I can't seem to find any.
|
# ? Jun 24, 2017 15:30 |
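A minimal sketch of the approach Cocoa Crispies describes, with the runtime startup asserts promoted to compile-time checks. The alias names (u8, s8, u32, f64) come from the post; the assumption that double is 8 bytes holds on typical platforms but is not guaranteed by the standard.

```c
/* Project-local aliases over stdint.h types, with compile-time
   size checks instead of startup tests. */
#include <stdint.h>

typedef uint8_t  u8;
typedef int8_t   s8;
typedef uint32_t u32;
typedef double   f64;

/* C11 _Static_assert fires at compile time, so an unexpected
   platform fails the build rather than a startup test. */
_Static_assert(sizeof(u8)  == 1, "u8 must be 1 byte");
_Static_assert(sizeof(s8)  == 1, "s8 must be 1 byte");
_Static_assert(sizeof(u32) == 4, "u32 must be 4 bytes");
_Static_assert(sizeof(f64) == 8, "f64 must be 8 bytes");
```

With the fixed-width stdint types underneath, the typedefs are pure shorthand; the asserts only guard the f64 assumption and typos.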
|
z80 or something idk
|
# ? Jun 24, 2017 21:49 |
|
rt4 posted: z80 or something idk

ya, sega genesis runs into integer overflows at 255 in lots of games so i assume its base int was that small
|
# ? Jun 25, 2017 03:03 |
|
cis autodrag posted: ya, sega genesis runs into integer overflows at 255 in lots of games so i assume its base int was that small

genesis is 68000, a 16 bit processor. but developers frequently used bytes for various counters because RAM is extremely limited
|
# ? Jun 25, 2017 03:12 |
|
The Management posted: genesis is 68000, a 16 bit processor. but developers frequently used bytes for various counters because RAM is extremely limited

why did i think it was a z80. well, i never was much for memorizing hardware genealogies. i just tend to notice all the speed run tricks that revolve around overflows.
|
# ? Jun 25, 2017 03:25 |
|
cis autodrag posted: why did i think it was a z80. well, i never was much for memorizing hardware genealogies. i just tend to notice all the speed run tricks that revolve around overflows.

I think the master system was a z80?
|
# ? Jun 25, 2017 03:55 |
|
The Management posted: jfc. not knowing stdint is basically a fail during the interview.

i was going to say "int" in C is 16 bits at a minimum, I don't think any language calls something that can be 8 bits an "integer"
|
# ? Jun 25, 2017 04:05 |
|
hobbesmaster posted: i was going to say "int" in C is 16 bits at a minimum, I don't think any language calls something that can be 8 bits an "integer"

C does not define the size of an int. it defines that the size relationship must be: char <= short <= int <= long. sizeof(char) must equal 1, and a char must be at least 8 bits, which is enough to contain the ascii charset.

I used to work on a dsp where char, short, and int were all 16 bits and sizeof on all of them evaluated to 1.
|
# ? Jun 25, 2017 04:25 |
|
anyway, there are standard sizes. use them.

and if I see you converting a pointer to an int in an interview, know that I am emailing the recruiter to walk you out after I'm done.
|
# ? Jun 25, 2017 04:30 |
|
one of you is an idiot who's huffed too many farts but I don't program so I can't talk which one
|
# ? Jun 25, 2017 04:33 |
|
The Management posted: C does not define the size of an int. it defines that the size relationship must be:

http://port70.net/~nsz/c/c89/c89-draft.html#2.2.4.2
|
# ? Jun 25, 2017 04:37 |
|
The Management posted: and if I see you converting a pointer to an int in an interview know that I am emailing the recruiter to walk you out after I'm done.

what if it's an aligned pointer and I want to store some extra information in the low bits? what then smart guy????????
|
# ? Jun 25, 2017 05:05 |
|
The Management posted: and if I see you converting a pointer to an int in an interview know that I am emailing the recruiter to walk you out after I'm done.

what if I leave a comment right above it saying "yolo"?
|
# ? Jun 25, 2017 05:09 |
|
travelling wave posted: what if it's an aligned pointer and I want to store some extra information in the low bits? what then smart guy????????

then you better use a uintptr_t, not an int.
|
# ? Jun 25, 2017 06:07 |
|
what if i value an understanding of c semantics but i don't program in c regularly enough to remember all the proper typedef nonsense and i sure as gently caress don't want to either? and it's not relevant to my job in any way whatsoever? what then smart guy???
|
# ? Jun 25, 2017 07:06 |
|
cis autodrag posted: why did i think it was a z80. well, i never was much for memorizing hardware genealogies. i just tend to notice all the speed run tricks that revolve around overflows.

the genesis had both a 68k and a z80, and either one could act as the CPU. that was how master system backwards compatibility worked. this is also why the main sound chip was attached directly to the z80 -- that's how it was on the master system. all genesis games and emulators use the z80 for sound, if nothing else.
|
# ? Jun 25, 2017 08:01 |
|
JewKiller 3000 posted: what if i value an understanding of c semantics but i don't program in c regularly enough to remember all the proper typedef nonsense and i sure as gently caress don't want to either? and it's not relevant to my job in any way whatsoever? what then smart guy???

then I wouldn't be interviewing you because you are useless to me.
|
# ? Jun 25, 2017 15:31 |
|
Captain Foo posted: one of you is an idiot who's huffed too many farts but I don't program so I can't talk which one

The Management is correct here. I'll sometimes use an int for simple iterators like for(int i=0; i < SOME_CONST; i++) since I assume the compiler would warn me if the int somehow wasn't big enough, although "I assume the compiler would..." is a bad phrase when it comes to C.
|
# ? Jun 25, 2017 17:51 |
|
Wrt the stdint debate, can we just all agree that mistaking an int for a one byte char is instant grounds for removal from the premises/telephone line?
|
# ? Jun 26, 2017 00:26 |
|
If you honestly gently caress up on an ultra basic question in that way, then the only logical conclusion that can be drawn from that is "this person has absolutely no idea what they are talking about"
|
# ? Jun 26, 2017 00:30 |
|
my response would be: "sorry, I program in a modern OS and so don't concern myself with such trivialities" and then my interviewers will stand up and clap
|
# ? Jun 26, 2017 01:10 |
|
qhat posted: Wrt the stdint debate, can we just all agree that mistaking an int for a one byte char is instant grounds for removal from the premises/telephone line?

it's totally reasonable to assume that an int would be 8 bits on an 8 bit processor, though in practice that turns out not to be the case. the (C++) standard does not prevent this from being the case. moreover there are 8 bit integer types, so depending on context it may be fine to refer to these as ints.

I'm curious why you care that people have every type's size memorized when they should be using sizeof and/or be using objects with a well defined size anyway. I couldn't care less if someone has memorized how many bits of mantissa a float or double have, either.

when the correct answer to your weeder question is "it depends on the architecture, language, and in some languages, the compiler" (alternatively: "why are you using 'int'?"), then you've got a couple problems, and the candidate may not be one of them.
|
# ? Jun 26, 2017 02:50 |
|
leper khan posted: it's totally reasonable to assume that an int would be 8 bits on an 8 bit processor, though in practice that turns out not to be the case. the (C++) standard does not prevent this from being the case.

only in the most fishmechian "the c++ standard says that the integer limits are the same as C" way
|
# ? Jun 26, 2017 03:04 |
|
leper khan posted: it's totally reasonable to assume that an int would be 8 bits on an 8 bit processor, though in practice that turns out not to be the case. the (C++) standard does not prevent this from being the case.

In C++, without any length modifiers, the width of an int is guaranteed to be at least 16 bits. As for 8 bit processors, we tend to deal in reality, not fantasy. If you go for a C/C++ job and you don't know the size of the most fundamental datatypes in the language (obviously given a known architecture) then you are a terrible C programmer and an instant fail.
|
# ? Jun 26, 2017 03:18 |
|
employers just shouldn't ask any questions really since it's obviously impossible to gauge someone's understanding by asking them the most utterly basic questions about a language
|
# ? Jun 26, 2017 03:28 |
|
more so than bad interview questions, I wish it was easier to fire people once you realize they're poo poo, instead of letting them stay forever and making everyone else hate their job a bit more or leave.
|
# ? Jun 26, 2017 04:14 |
|
qhat posted: In C++, without any length modifiers, the width of an int is guaranteed to be at least 16 bits. As for 8 bit processors, we tend to deal in reality, not fantasy. If you go for a C/C++ job and you don't know the size of the most fundamental datatypes in the language (obviously given a known architecture) then you are a terrible C programmer and an instant fail.

tbh never ever use the "int" type for anything in c/c++. it's a code smell. always always always, in every circumstance, use (u)intXX_t. if I need to know what size an "int" is on my platform, I'll look it up and then change the code in question to use the correct type instead

e: size_t and friends are also acceptable

Arcsech fucked around with this message at 04:24 on Jun 26, 2017
|
# ? Jun 26, 2017 04:21 |
|
why are types in c so convoluted anyway? is the answer "int was once upon a time a thin wrapper over a hardware register size"?
|
# ? Jun 26, 2017 04:44 |
|
basically, yeah. most architectures implement "int" as "machine word" which means you can't rely on a specific size for int cross-platform
|
# ? Jun 26, 2017 04:55 |
|
I'd get nervous about answering the size of int immediately, I'd need to think for a bit (ok, char has 256 values, OBVIOUSLY because ascii and stuff, so 8 bits. then short is double that, 16, and int is double that... then 32. cue talk about architectures here)

I'm just so used to typing u16, i32, etc. that I never think about the basic types anymore
|
# ? Jun 26, 2017 05:07 |
|
qhat posted: employers just shouldn't ask any questions really since it's obviously impossible to gauge someone's understanding by asking them the most utterly basic questions about a language

this but unironically
|
# ? Jun 26, 2017 05:09 |
|
Symbolic Butt posted: I'd get nervous on answering immediately the size of int, I'd need to think for a bit

CHAR_BIT is at least 8. one of those factoids you follow up with "I hope this is never relevant"
|
# ? Jun 26, 2017 05:28 |
|
ascii only uses 7 bits so i hope you like signed char
|
# ? Jun 26, 2017 06:15 |
|
cis autodrag posted: why are types in c so convoluted anyway? is the answer "int was once upon a time a thin wrapper over a hardware register size"?

nearly everything in c is a thin wrapper over an idealized set of dec hardware from the 1970s. writing pdp-11 assembly looks a lot like C right off the top, even down to the dumbassery with array[idx] vs idx[array]
|
# ? Jun 26, 2017 06:21 |
|
JewKiller 3000 posted: ascii only uses 7 bits so i hope you like signed char

i prefer my chars autographed
|
# ? Jun 26, 2017 06:37 |
|
RISCy Business posted: i prefer my chars autographed

people just take selfies with the chars now
|
# ? Jun 26, 2017 06:48 |
|
FMguru posted: there's been a real downturn in signed chars since smartphones became a thing

millennials are killing the autograph industry
|
# ? Jun 26, 2017 06:52 |
|
Can we get this back on topic with more videogame chat
|
# ? Jun 26, 2017 06:54 |
|
|
qhat posted: Wrt the stdint debate, can we just all agree that mistaking an int for a one byte char is instant grounds for removal from the premises/telephone line?

what do you mean by 'mistaking' an int for a char? because in this very thread

The Management posted: I used to work on a dsp where char, short, and int were all 16 bits and sizeof on all of them evaluated to 1.
|
# ? Jun 26, 2017 10:25 |