|
also nicely solves issues like what's longer than a long and so forth
|
# ? Mar 16, 2015 06:12 |
|
one good thing about my degree was that i had to do two summers worth of work in the industry before graduating. of course, many of my classmates just didn't graduate and got a job anyway
|
# ? Mar 16, 2015 06:34 |
|
also 99% of dev work is copy pasting from snack overflow and jerkin off
|
# ? Mar 16, 2015 06:35 |
|
Bloody posted:sorry for not writing the ebnf grammar of what i consider to be an acceptable type yospos birch
|
# ? Mar 16, 2015 10:22 |
|
i do not know why i thought this was funny, i am going to bed
rjmccall fucked around with this message at 10:49 on Mar 16, 2015 |
# ? Mar 16, 2015 10:42 |
|
whats up with c and c++ having poo poo like uint32_t when unsigned int already exists? and whats with the _t bit?
|
# ? Mar 16, 2015 10:56 |
|
Bloody posted:types not of the form (u)int(#bits)_t suck
|
# ? Mar 16, 2015 14:23 |
|
Awia posted:whats up with c and c++ having poo poo like uint32_t when unsigned int already exists? "unsigned int" is not guaranteed to be any particular size. it could be 16 bits or 64 bits depending on platform/compiler details. the _t suffix is a posix-ism
|
# ? Mar 16, 2015 15:38 |
|
Awia posted:whats up with c and c++ having poo poo like uint32_t when unsigned int already exists? _t lets you know it's a typedef. it makes it surprisingly much easier to read
|
# ? Mar 16, 2015 18:43 |
|
Bloody posted:nah my complaints mostly involve numeric types and that their size is significantly important to their functionality and also very poorly conveyed. well, that seems pretty reasonable
|
# ? Mar 16, 2015 18:54 |
|
Bloody posted:nah my complaints mostly involve numeric types and that their size is significantly important to their functionality and also very poorly conveyed. int not being precisely defined was just because unix ran on 18-bit and 16-bit machines back in the day. when 18/36-bit disappeared, the undefinedness of the types really should have gone away too, but here we are~~
|
# ? Mar 16, 2015 19:26 |
|
16-bit vs 18-bit is all the more reason to strictly define it. if your code used the wrong kind for your platform the compiler should have just produced some tortured output to achieve the written effect. imagine how ownage it would be to have arbitrary-precision numerics built into the standard numeric types. drat.
|
# ? Mar 16, 2015 19:35 |
|
there are still plenty of embedded systems with 16-bit int. i mean, you could define it to be 32-bit, but then a bunch of standard functions would need to be rewritten to pass things as shorts (int_fast16_t?) for efficiency
|
# ? Mar 16, 2015 19:43 |
|
Cybernetic Vermin posted:int not being precisely defined was just because unix ran on 18-bit and 16-bit machines back in the day it wouldn't have helped anything. people would still assume sizeof(unsigned int) == sizeof(int*) and equally stupid poo poo
|
# ? Mar 16, 2015 19:46 |
|
Bloody posted:16-bit vs 18-bit is all the more reason to strictly define it. if your code used the wrong kind for your platform the compiler should have just produced some tortured output to achieve the written effect. imagine how ownage it would be to have arbitrary-precision numerics built into the standard numeric types. drat. the point is that 'int' was a way of saying "i want the native integer which is at least 16 bits", which was then put in the specification, but it was at the time said with the implication that it would be 16 bits or just slightly larger at 18 since those were the only relevant cases. the worst that happened is that you sometimes on 18-bit machines got two bits extra which you didn't strictly need. while, of course, making the alternatives possible. saying int16_t on a PDP-7 would have indeed created very inefficient machine code for no really good reason. as it happens, this is no longer an issue since all sane platforms which you would expect c code to be portable to does 8, 16 and 32 bit integers, and will for the most part either do 64 or emulate it pretty quick. so now specifying precisely what you mean is the sane way to go i am not really sure there is an argument to be had here about anything: the historical reason for the weird integer type choices in c are pretty easy to understand, but today it is all trivial and all the integer types will in fact have the sizes you expect them to everywhere forever, but you might as well do stdint stuff to be explicit. it was a bit weird that we managed to go to 32 bit ints, but that happened reasonably early and was sort of needed for unix itself anyway. exceptions to all statements for really custom embedded poo poo where you might as well not call it c for the purposes of standards and intreroperability.
|
# ? Mar 16, 2015 21:06 |
|
Cybernetic Vermin posted:exceptions to all statements for really custom embedded poo poo where you might as well not call it c for the purposes of standards and intreroperability.
|
# ? Mar 16, 2015 21:07 |
|
Bloody posted:16-bit vs 18-bit is all the more reason to strictly define it. if your code used the wrong kind for your platform the compiler should have just produced some tortured output to achieve the written effect. imagine how ownage it would be to have arbitrary-precision numerics built into the standard numeric types. drat. code:
|
# ? Mar 17, 2015 03:23 |
|
embedded poo poo has gotten better about this!
|
# ? Mar 17, 2015 04:20 |
|
Brain Candy posted:
yeah but that isn't c therefore i do not care even one bit
|
# ? Mar 17, 2015 04:22 |
|
Bloody posted:yeah but that isn't c therefore i do not care even whatever number of bits are in the machine's natural word size
|
# ? Mar 17, 2015 04:50 |
|
|
# ? Mar 17, 2015 04:59 |
|
so im trying to connect to a quadcopter over wifi. the quadcopter creates a wifi network, which my program seems to connect to fine regardless of what i do, but then it seems to fail opening ports depending on whether i step through the code or just run it normally. it works fine stepping through but fails every time if i run without debugging. anyone have any ideas why this might be happening? im not even entirely sure if it's opening the ports that is the problem
|
# ? Mar 17, 2015 17:40 |
|
if something works when debugging but not when running, it's usually a timing issue. if you are preparing resources before using them (ex: opening a port) you need to make sure those operations are complete before trying to use them. most of the time those kinds of APIs will just block while waiting for the io to complete, but if that's happening in another thread you may need to sync some stuff up between threads to prevent race conditions.
|
# ? Mar 17, 2015 17:58 |
|
Shaggar posted:if something works when debugging but not when running its usually a timing issue. if you are preparing resources before using them (ex: opening a port) you need to make sure those operations are complete before trying to use them. most of the time those kinds of APIs will just block while waiting for the io to complete, but if you're doing that in another thread you may need to sync some stuff up between threads to prevent race conditions. i get you, trying to find where this library creates threads is gonna be a ballache. is there a way to do that in vs?
|
# ? Mar 17, 2015 18:07 |
|
are you trying to connect to a port on the remote host or open a port locally for the remote host to connect to? im guessing you're connecting to the device, in which case you should probably be using TcpClient unless theres already a library for the thing. if you are using a 3rd party lib you can hit f12 to inspect classes + members, but you probably wont have the source.
|
# ? Mar 17, 2015 18:12 |
|
Shaggar posted:are you trying to connect to a port on the remote host or open a port locally for the remote host to connect to? i am trying to connect to a port on the remote host, then start sending strings to it. the library im using says it handles the connections itself, and i do have the source; i cant for the life of me find where it creates the ports though
|
# ? Mar 17, 2015 18:28 |
|
4 exams squeezed into 3 days, owned again by the exam timetable
|
# ? Mar 17, 2015 18:32 |
|
go to whatever your entry point is and then inspect from there to see how it sends it to the device. ex: if theres a method device.DoThing() then inspect the source of dothing and drill down until you find how its sending the commands to the device. then examine what objects its using to send those commands and how they were instantiated.
|
# ? Mar 17, 2015 18:33 |
|
Shaggar posted:go to whatever your entry point is and then inspect from there to see how it sends it to the device. ok, i seem to have found it, it creates a background worker that it passes a function "ConnectAsync" to, but then the connection seems to happen in an event handler much later in the code. threaded programming is hard
|
# ? Mar 17, 2015 18:53 |
|
code:
|
# ? Mar 17, 2015 18:59 |
|
Awia posted:ok, i seem to have found it, it creates a background worker that it passes a function "ConnectAsync" to. sleep(2000);
|
# ? Mar 17, 2015 22:25 |
|
bobbilljim posted:sleep(2000); always the right solution
|
# ? Mar 17, 2015 22:25 |
|
just realized that i disabled a bunch of tests last week and then forgot about them. seriously though, gently caress tests. i've been gradually moving over to doing way more assertion and validation in the actual code and it is much better than hoping that the house of cards known as test data can actually reflect anything meaningful DONT THREAD ON ME fucked around with this message at 00:11 on Mar 18, 2015 |
# ? Mar 18, 2015 00:09 |
|
Valeyard posted:4 exams squeezed into 3ndays, owned again by the exam timetable i only have 1 final all week
|
# ? Mar 18, 2015 00:09 |
|
MALE SHOEGAZE posted:just realized that i disabled a bunch of tests last week and then forgot about them unit tests are an ok part of *development* liek to help you design your code and what it should do. for finding bugs and poo poo? fat chance. code reviews & people actually employed to test is teh right way
|
# ? Mar 18, 2015 00:13 |
|
writing tests only works if you can actually think of the case that breaks your program. otherwise you write tests for obvious stuff, get about code coverage and act shocked when it's still broken.
|
# ? Mar 18, 2015 00:25 |
|
what is a test? im this thread
|
# ? Mar 18, 2015 00:32 |
|
tests can also be useful for catching bugs introduced to systems affected by the one you changed. but yeah, automated tests aren't foolproof and there's no such thing as 100% coverage
|
# ? Mar 18, 2015 00:32 |
|
who tests the tests?
|
# ? Mar 18, 2015 00:35 |
|
fleshweasel posted:writing tests only works if you can actually think of the case that breaks your program. otherwise you write tests for obvious stuff, get about code coverage and act shocked when it's still broken. that's what fuzzers and other test generators are for
|
# ? Mar 18, 2015 00:36 |