Bloody
Mar 3, 2013

also nicely solves issues like what's longer than a long and so forth

bobbilljim
May 29, 2013

this christmas feels like the very first christmas to me
:shittydog::shittydog::shittydog:
one good thing about my degree was that i had to do two summers worth of work in the industry before graduating

of course many of my classmates just didn't graduate, got a job anyway

bobbilljim
May 29, 2013

this christmas feels like the very first christmas to me
:shittydog::shittydog::shittydog:
also 99% of dev work is copy pasting from snack overflow and jerkin off

Symbolic Butt
Mar 22, 2009

(_!_)
Buglord

Bloody posted:

sorry for not writing the ebnf grammar of what i consider to be an acceptable type :jerkbag:

yospos birch

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe
i do not know why i thought this was funny, i am going to bed

rjmccall fucked around with this message at 10:49 on Mar 16, 2015

oh no blimp issue
Feb 23, 2011

whats up with c and c++ having poo poo like uint32_t when unsigned int already exists?
and whats with the _t bit?

Jerry Bindle
May 16, 2003

Bloody posted:

types not of the form (u)int(#bits)_t suck

Notorious b.s.d.
Jan 25, 2003

by Reene

Awia posted:

whats up with c and c++ having poo poo like uint32_t when unsigned int already exists?
and whats with the _t bit?

"unsigned int" is not guaranteed to be any particular size. it could be 16 bits or 64 bits depending on platform/compiler details

_t suffix is a posix-ism
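
a minimal C sketch of the difference (the plain-type sizes it prints vary by platform, which is the point; the stdint ones don't):
code:
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* plain types: the standard only promises minimum ranges, not exact widths */
    printf("unsigned int : %zu bits\n", sizeof(unsigned int) * 8);
    printf("unsigned long: %zu bits\n", sizeof(unsigned long) * 8);

    /* <stdint.h> fixed-width types: exactly the width in the name, everywhere they exist */
    printf("uint32_t     : %zu bits\n", sizeof(uint32_t) * 8);
    printf("uint64_t     : %zu bits\n", sizeof(uint64_t) * 8);
    return 0;
}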

MeruFM
Jul 27, 2010

Awia posted:

whats up with c and c++ having poo poo like uint32_t when unsigned int already exists?
and whats with the _t bit?

_t lets you know it's a typedef
it makes things surprisingly easier to read
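
in that spirit, a tiny made-up example of the convention:
code:
#include <stdint.h>

/* the _t suffix marks these as type names rather than variables or functions */
typedef uint32_t sample_count_t;
typedef int64_t  timestamp_us_t;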

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Bloody posted:

nah my complaints mostly involve numeric types and that their size is significantly important to their functionality and also very poorly conveyed

well that seems pretty reasonable :hf:

Cybernetic Vermin
Apr 18, 2005

Bloody posted:

nah my complaints mostly involve numeric types and that their size is significantly important to their functionality and also very poorly conveyed

int not being precisely defined was just because unix ran on 18-bit and 16-bit machines back in the day

when 18/36-bit disappeared the undefinedness of the types really should have gone away too, but here we are~~

Bloody
Mar 3, 2013

16-bit vs 18-bit is all the more reason to strictly define it. if your code used the wrong kind for your platform the compiler should have just produced some tortured output to achieve the written effect. imagine how ownage it would be to have arbitrary-precision numerics built into the standard numeric types. drat.

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe
there are still plenty of embedded systems with 16-bit int. i mean you could define it to be 32-bit, but then a bunch of standard functions would need to be rewritten to pass things as shorts (int_fast16_t?) for efficiency
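
for reference, <stdint.h> already has the "at least this wide, but whatever the target does fastest" types being gestured at here; a quick sketch:
code:
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* at least 16 bits, but whatever width is fastest: typically 16 on a
       16-bit micro, 32 or 64 on a desktop */
    int_fast16_t counter = 0;

    /* at least 16 bits, but the smallest such type (good for packing structs) */
    int_least16_t small = 0;

    printf("int_fast16_t  : %zu bits\n", sizeof(counter) * 8);
    printf("int_least16_t : %zu bits\n", sizeof(small) * 8);
    return 0;
}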

Notorious b.s.d.
Jan 25, 2003

by Reene

Cybernetic Vermin posted:

int not being precisely defined was just because unix ran on 18-bit and 16-bit machines back in the day

when 18/36-bit disappeared the undefinedness of the types really should have gone away too, but here we are~~

it wouldn't have helped anything

people would still assume sizeof(unsigned int) == sizeof(int*) and equally stupid poo poo
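
for what it's worth, the stdint answer to that particular assumption is uintptr_t; a minimal sketch:
code:
#include <assert.h>
#include <stdint.h>

int main(void) {
    int x = 42;

    /* on common LP64/LLP64 targets unsigned int is 32 bits while pointers are 64,
       so (unsigned int)&x would silently truncate there */
    uintptr_t bits = (uintptr_t)&x;   /* wide enough to hold any object pointer */
    int *back = (int *)bits;          /* and guaranteed to round-trip */

    assert(back == &x && *back == 42);
    return 0;
}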

Cybernetic Vermin
Apr 18, 2005

Bloody posted:

16-bit vs 18-bit is all the more reason to strictly define it. if your code used the wrong kind for your platform the compiler should have just produced some tortured output to achieve the written effect. imagine how ownage it would be to have arbitrary-precision numerics built into the standard numeric types. drat.

the point is that 'int' was a way of saying "i want the native integer, which is at least 16 bits", which was then put in the specification, but at the time it was said with the implication that it would be 16 bits, or just slightly larger at 18, since those were the only relevant cases. the worst that happened is that on 18-bit machines you sometimes got two extra bits you didn't strictly need, while, of course, keeping the alternatives possible.

saying int16_t on a PDP-7 would indeed have created very inefficient machine code for no really good reason. as it happens, this is no longer an issue since all sane platforms you would expect c code to be portable to do 8, 16 and 32 bit integers, and will for the most part either do 64 or emulate it pretty quickly. so now specifying precisely what you mean is the sane way to go

i am not really sure there is an argument to be had here about anything: the historical reasons for the weird integer type choices in c are pretty easy to understand, but today it is all trivial and all the integer types will in fact have the sizes you expect them to everywhere forever, so you might as well do stdint stuff to be explicit. it was a bit weird that we managed to go to 32 bit ints, but that happened reasonably early and was sort of needed for unix itself anyway. exceptions to all statements for really custom embedded poo poo where you might as well not call it c for the purposes of standards and interoperability.
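
and if code really does bake in size assumptions, C11 lets you pin them down at compile time instead of hoping; a quick sketch:
code:
#include <limits.h>

/* fail the build outright if the platform breaks the assumptions this code makes,
   rather than quietly miscomputing at runtime */
_Static_assert(CHAR_BIT == 8, "this code assumes 8-bit bytes");
_Static_assert(sizeof(int) == 4, "this code assumes 32-bit int");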

Bloody
Mar 3, 2013

Cybernetic Vermin posted:

exceptions to all statements for really custom embedded poo poo where you might as well not call it c for the purposes of standards and interoperability.

:negative:

Brain Candy
May 18, 2006

Bloody posted:

16-bit vs 18-bit is all the more reason to strictly define it. if your code used the wrong kind for your platform the compiler should have just produced some tortured output to achieve the written effect. imagine how ownage it would be to have arbitrary-precision numerics built into the standard numeric types. drat.

code:
Python 3.4.1 (default, May 19 2014, 13:10:29) 
[GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> d = 2**32 - 1
>>> d
4294967295
>>> d*d
18446744065119617025
>>> d*d*d
79228162458924105385300197375
>>> d*d*d*d*d*d
6277101726617670944954607416215071121746690847213956890625

hobbesmaster
Jan 28, 2008


embedded poo poo has gotten better about this!

Bloody
Mar 3, 2013

Brain Candy posted:

code:
Python 3.4.1 (default, May 19 2014, 13:10:29) 
[GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> d = 2**32 - 1
>>> d
4294967295
>>> d*d
18446744065119617025
>>> d*d*d
79228162458924105385300197375
>>> d*d*d*d*d*d
6277101726617670944954607416215071121746690847213956890625

yeah but that isn't c therefore i do not care even one bit

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Bloody posted:

yeah but that isn't c therefore i do not care even whatever number of bits are in the machine's natural word size

Bloody
Mar 3, 2013


:agreed:

oh no blimp issue
Feb 23, 2011

so im trying to connect to a quadcopter over wifi, the quadcopter creates a wifi network, which my program seems to connect to fine regardless of what i do, but then it seems to fail opening ports depending on whether i step through the code or just run it normally. it works fine stepping through but fails every time if i run without debugging
anyone have any ideas why this might be happening?

im not even entirely sure if it's opening the ports that is the problem

Shaggar
Apr 26, 2006
if something works when debugging but not when running its usually a timing issue. if you are preparing resources before using them (ex: opening a port) you need to make sure those operations are complete before trying to use them. most of the time those kinds of APIs will just block while waiting for the io to complete, but if you're doing that in another thread you may need to sync some stuff up between threads to prevent race conditions.
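
the general shape of the fix, sketched with POSIX threads (the actual library here is a .NET one, so this is just the idea): have the thread that opens the resource signal completion, and make the user wait for that signal instead of assuming it already happened:
code:
#include <pthread.h>
#include <stdbool.h>
#include <stdio.h>

static pthread_mutex_t lock  = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  ready = PTHREAD_COND_INITIALIZER;
static bool port_open = false;           /* guarded by lock */

static void *open_port(void *arg) {
    (void)arg;
    /* ... actually open the connection here ... */
    pthread_mutex_lock(&lock);
    port_open = true;                    /* publish "resource is ready" */
    pthread_cond_signal(&ready);
    pthread_mutex_unlock(&lock);
    return NULL;
}

int main(void) {
    pthread_t t;
    pthread_create(&t, NULL, open_port, NULL);

    /* without this wait, whether the send works depends on who wins the race --
       exactly the "fine in the debugger, broken at full speed" symptom */
    pthread_mutex_lock(&lock);
    while (!port_open)
        pthread_cond_wait(&ready, &lock);
    pthread_mutex_unlock(&lock);

    puts("port is open, safe to send now");
    pthread_join(t, NULL);
    return 0;
}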

oh no blimp issue
Feb 23, 2011

Shaggar posted:

if something works when debugging but not when running its usually a timing issue. if you are preparing resources before using them (ex: opening a port) you need to make sure those operations are complete before trying to use them. most of the time those kinds of APIs will just block while waiting for the io to complete, but if you're doing that in another thread you may need to sync some stuff up between threads to prevent race conditions.

i get you, trying to find where this library creates threads is gonna be a ballache
is there a way to do that in vs?

Shaggar
Apr 26, 2006
are you trying to connect to a port on the remote host or open a port locally for the remote host to connect to?
im guessing you're connecting to the device in which case you should probably be using TCP Client unless theres already a library for the thing.

if you are using a 3rd party lib you can hit f12 to inspect classes + members but you probably wont have the source.
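
for the "connect to the device and send it strings" case, the bare-bones version looks something like this; a POSIX sockets sketch with a made-up address and port rather than whatever the quadcopter library actually wraps:
code:
#include <arpa/inet.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_port   = htons(5556);                      /* hypothetical control port */
    inet_pton(AF_INET, "192.168.1.1", &addr.sin_addr);  /* hypothetical drone AP address */

    /* connect() blocks until the connection is actually established (or fails),
       so there's no guessing about whether it finished before you send */
    if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("connect");
        return 1;
    }

    const char *cmd = "hello quadcopter\n";
    send(fd, cmd, strlen(cmd), 0);
    close(fd);
    return 0;
}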

oh no blimp issue
Feb 23, 2011

Shaggar posted:

are you trying to connect to a port on the remote host or open a port locally for the remote host to connect to?
im guessing you're connecting to the device in which case you should probably be using TCP Client unless theres already a library for the thing.

if you are using a 3rd party lib you can hit f12 to inspect classes + members but you probably wont have the source.

i am trying to connect to a port on the remote host then start sending strings to it
the library im using says it handles the connections itself and i do have the source, i cant for the life of me find where it creates the ports though

Valeyard
Mar 30, 2012


Grimey Drawer
4 exams squeezed into 3 days, owned again by the exam timetable

Shaggar
Apr 26, 2006
go to whatever your entry point is and then inspect from there to see how it sends it to the device.

ex: if theres a method device.DoThing() then inspect the source of dothing and drill down until you find how its sending the commands to the device. then examine what objects its using to send those commands and how they were instantiated.

oh no blimp issue
Feb 23, 2011

Shaggar posted:

go to whatever your entry point is and then inspect from there to see how it sends it to the device.

ex: if theres a method device.DoThing() then inspect the source of dothing and drill down until you find how its sending the commands to the device. then examine what objects its using to send those commands and how they were instantiated.

ok, i seem to have found it, it creates a background worker that it passes a function "ConnectAsync" to.
but then the connection seems to happen in an event handler much later in the code
threaded programming is hard

oh no blimp issue
Feb 23, 2011

code:
CommunicationCenter.RegisterController(this);
CommandCenter.RegisterController(this);
ahhhhhhhhhh

bobbilljim
May 29, 2013

this christmas feels like the very first christmas to me
:shittydog::shittydog::shittydog:

Awia posted:

ok, i seem to have found it, it creates a background worker that it passes a function "ConnectAsync" to.
but then the connection seems to happen in an event handler much later in the code
threaded programming is hard

sleep(2000);

oh no blimp issue
Feb 23, 2011

bobbilljim posted:

sleep(2000);

always the right solution

DONT THREAD ON ME
Oct 1, 2002

by Nyc_Tattoo
Floss Finder
just realized that i disabled a bunch of tests last week and then forgot about them

seriously though gently caress tests i've been gradually moving over to doing way more assertion and validation in the actual code and it is much better than hoping that the house of cards known as test data can actually reflect anything meaningful
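
the in-code version of that, sketched in C with a made-up function: the checks live in the production path, so they run against real data rather than hand-built fixtures:
code:
#include <assert.h>
#include <stddef.h>
#include <stdio.h>

static double average(const double *xs, size_t n) {
    assert(xs != NULL && n > 0);   /* programmer error: fail loudly in debug builds */
    double sum = 0.0;
    for (size_t i = 0; i < n; i++)
        sum += xs[i];
    return sum / (double)n;
}

int main(void) {
    double samples[] = {1.0, 2.0, 3.0};
    printf("%f\n", average(samples, 3));
    return 0;
}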

DONT THREAD ON ME fucked around with this message at 00:11 on Mar 18, 2015

Star War Sex Parrot
Oct 2, 2003

Valeyard posted:

4 exams squeezed into 3 days, owned again by the exam timetable
ouch

i only have 1 final all week

bobbilljim
May 29, 2013

this christmas feels like the very first christmas to me
:shittydog::shittydog::shittydog:

MALE SHOEGAZE posted:

just realized that i disabled a bunch of tests last week and then forgot about them

seriously though gently caress tests i've been gradually moving over to doing way more assertion and validation in the actual code and it is much better than hoping that the house of cards known as test data can actually reflect anything meaningful

unit tests are an ok part of *development*, like to help you design your code and what it should do. for finding bugs and poo poo? fat chance. code reviews & people actually employed to test are the right way

brap
Aug 23, 2004

Grimey Drawer
writing tests only works if you can actually think of the case that breaks your program. otherwise you write tests for obvious stuff, get :smug: about code coverage and act shocked when it's still broken.

DimpledChad
May 14, 2002
Rigging elections since '87.
what is a test?


im this thread

DaTroof
Nov 16, 2000

CC LIMERICK CONTEST GRAND CHAMPION
There once was a poster named Troof
Who was getting quite long in the toof
tests can also be useful for catching bugs introduced to systems affected by the one you changed

but yeah automated tests aren't foolproof and there's no such thing as 100% coverage

gonadic io
Feb 16, 2011

>>=
who tests the tests?

:newlol:

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

fleshweasel posted:

writing tests only works if you can actually think of the case that breaks your program. otherwise you write tests for obvious stuff, get :smug: about code coverage and act shocked when it's still broken.

that's what fuzzers and other test generators are for
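
a toy version of the idea in C: generate piles of random inputs and check an invariant, instead of hand-picking cases (real fuzzers like libFuzzer or AFL do this much more cleverly); the encode/decode pair here is made up:
code:
#include <assert.h>
#include <stdint.h>
#include <stdlib.h>

/* hypothetical functions under test: encoding then decoding should round-trip */
static uint32_t encode(uint32_t x) { return (x << 16) | (x >> 16); }
static uint32_t decode(uint32_t y) { return (y << 16) | (y >> 16); }

int main(void) {
    srand(12345);                        /* fixed seed so any failure reproduces */
    for (int i = 0; i < 1000000; i++) {
        uint32_t x = ((uint32_t)rand() << 16) ^ (uint32_t)rand();
        assert(decode(encode(x)) == x);  /* the invariant, checked on generated input */
    }
    return 0;
}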
