taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

The language actually predates the concept of DRY*. Using a header file was a way to make things easier on the compiler. A bunch of the C++ warts are because it uses a text-based include system, instead of something semantic.

Imagine you want to ensure a compiler with only a few KB of RAM can compile a module in one pass, and you'll come up with something like C's way. Then decide you want to stay compatible with C but also want to pile a ton of features on top, and you get C++ and a compiler that wants GB of RAM to compile a module.


*The earliest use I can find is that DRY appeared in The Pragmatic Programmer (1999), but I didn't look too hard

Really, the preprocessor is supporting DRY; the alternative would be manually updating function prototypes in each file. It just doesn't get all the way there.
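A minimal sketch of what that buys you (file names and the function are invented): the prototype lives in one header, and the preprocessor pastes it into every translation unit that calls it.

```cpp
// util.h: hypothetical header; the guard makes repeated #includes harmless
#ifndef UTIL_H
#define UTIL_H
int add(int a, int b);   // prototype only: each .cpp that includes this knows the signature
#endif

// util.cpp
#include "util.h"
int add(int a, int b) { return a + b; }

// main.cpp
#include "util.h"        // textually pasted in by the preprocessor
int main() { return add(1, 2); }
```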

taqueso fucked around with this message at 00:15 on May 12, 2020


more falafel please
Feb 26, 2005

forums poster

taqueso posted:

The language actually predates the concept of DRY. Using a header file was a way to make things easier on the compiler. A bunch of the C++ warts are because it uses a text-based include system, instead of something semantic.

I know the preprocessor is actually defined in the standard, but I like to think of it as a separate, terrible system that is used in every C++ project.

Dren
Jan 5, 2001

Pillbug

Dominoes posted:

I think I have it mostly done, although can't get it to compile.

🤔

qsvui
Aug 23, 2003
some crazy thing

pseudorandom name posted:

For your purposes there's no reason not to use a std::vector; things like std::array are useful for when you want a statically allocated read-only C++ container, and even then it is a hassle to use because std::make_array() is still experimental.

Looks like std::to_array() is making it into C++20 at least.
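If it does land, usage would look roughly like this (C++20, assuming a standard library that ships it):

```cpp
#include <array>

int main() {
    // element type and size are deduced from the braced list
    constexpr auto a = std::to_array({1, 2, 3, 4});   // std::array<int, 4>
    static_assert(a.size() == 4);
    return a[0];
}
```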

Dominoes posted:

#2: Being unable to use the std::optional type due to embedded restrictions, how would you handle an optional type?

What embedded restrictions does std::optional have? Are you just using a toolchain that doesn't have it?

Dominoes
Sep 20, 2007

more falafel please posted:

So what "`f` was not declared in this scope" means is that the compiler's looking for a function called `f` in the scope you're trying to call it in. If `f` hasn't been declared yet in that CPP file, you can't reference it. I'm assuming that `f` is defined later in the file. Normally, to get around this, you either put a declaration at the top of the file or, to avoid repeating those declarations in every CPP file that might need to use that function, you put it in a separate header file which is #included at the top of the file.

Yes, it violates DRY. C (and by extension C++) was designed in the early nineteen seventies.

edit: gonna put this a different way: In order to reference a symbol of any kind (variable, class, function), the compiler has to have seen that symbol already, so it can know what kind of thing it is. Each C++ file is compiled separately, so if you don't want to put declarations of the external (or internal) symbols referenced in that C++ file at the top of the file, you need to declare them in a header which is #included.
I appreciate the detailed explanation! That puts the requirement in context.
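To check my understanding, here's a minimal sketch of the single-file version of the fix (`f` is just a placeholder name): declare it before the first call, define it wherever.

```cpp
#include <cstdio>

void f(int x);               // forward declaration: the compiler now knows f's signature

int main() {
    f(42);                   // fine: f was declared above, even though it's defined below
}

void f(int x) {
    std::printf("%d\n", x);  // the definition can come after the call site
}
```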

taqueso posted:

I've never witnessed someone going from Rust to C++, only the other way. It must be infuriating. Going from C++ to Rust, all the language design decisions are 'oh that's great, they fixed that bad default' type breaths of fresh air.
I feel like, getting into embedded, this was unavoidable. The immediate cause was Rust not having an easy-to-use Arduino driver for a hardware project, which seems like a mistake given Arduino's popularity. Long term, I'm cutting myself off from the large majority of libraries by only using Rust. E.g. I'm interested in lvgl, and may have to write the Rust bindings myself, or use FFI.

taqueso posted:

The language actually predates the concept of DRY*. Using a header file was a way to make things easier on the compiler. A bunch of the C++ warts are because it uses a text-based include system, instead of something semantic.

Imagine you want to ensure a compiler with only a few KB of RAM can compile a module in one pass, and you'll come up with something like C's way. Then decide you want to stay compatible with C but also want to pile a ton of features on top, and you get C++ and a compiler that wants GB of RAM to compile a module.


*The earliest use I can find is that DRY appeared in The Pragmatic Programmer (1999), but I didn't look too hard

Really, the preprocessor is supporting DRY; the alternative would be manually updating function prototypes in each file. It just doesn't get all the way there.
That makes sense given the history.

qsvui posted:

What embedded restrictions does std::optional have? Are you just using a toolchain that doesn't have it?
It's likely just for Arduino: it uses C++11. From what I understand, optional will work in the next version, but currently doesn't. I was also interested in `tuple` and couldn't make it work, but perhaps I just need to #include it.
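Something like this is what I was trying, assuming the Arduino toolchain actually ships the `<tuple>` header (which I haven't confirmed):

```cpp
#include <tuple>   // without this include, std::tuple isn't visible at all

// hypothetical sensor read returning two values, C++11-compatible
std::tuple<int, float> read_sensor() {
    return std::make_tuple(42, 3.3f);
}

int main() {
    auto r = read_sensor();
    int id = std::get<0>(r);
    float volts = std::get<1>(r);
    return (id > 0 && volts > 0.0f) ? 0 : 1;
}
```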

Dominoes fucked around with this message at 01:58 on May 12, 2020

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

Optional could be done with a magic number (0 or null or -1 or MAX_UINT) or with a tag field added to a struct.

The old-fashioned C way (I guess this would be for a Result equivalent, actually) would be to return a status and place the value in storage passed in as a pointer. In C++ you would pass a reference where you want the data to go and use the return value for status.
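A rough sketch of both shapes (function names invented): C-style status return plus out-pointer, and the C++ reference flavour of the same thing.

```cpp
#include <cstdio>

// C style: return a status code, write the result through a pointer
int read_temp_c(int* out) {
    if (out == nullptr) return -1;   // error status
    *out = 23;                       // the actual value
    return 0;                        // success
}

// C++ flavour of the same pattern: reference out-parameter, bool status
bool read_temp_cpp(int& out) {
    out = 23;
    return true;
}

int main() {
    int t = 0;
    if (read_temp_c(&t) == 0)  std::printf("%d\n", t);
    if (read_temp_cpp(t))      std::printf("%d\n", t);
}
```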

qsvui
Aug 23, 2003
some crazy thing
Or you can just use a library like this: https://github.com/martinmoene/optional-lite

Beef
Jul 26, 2004
For embedded projects I really like DasBetterC, the D subset that purely runs on libc. It's sort of halfway between C, C++ and Rust, but with a really powerful macro/CTFE system.

https://dlang.org/blog/category/betterc/


My personal take on learning C++ is to start with a limited subset, such as C with overloading. There are plenty of foot-bazookas and annoying idiosyncrasies when coming from another language, so it helps not to expose yourself to the entire language at once.

Beef fucked around with this message at 09:59 on May 12, 2020

Xarn
Jun 26, 2015
Only if you already know C. Learning C++ from scratch by starting with the C subset will actively cause you to learn poo poo habits.

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

As someone with poo poo C++ habits, I'll agree. Learn C++[highest possible year you can] and try to learn the modern ways to do things.

Computer viking
May 30, 2011
Now with less breakage.

This is kind of why I've bounced off C++ a few times - I'm OK at C, I've done Java and C# and softer things, so it feels like I should be able to pick up C++ well enough to do more than "stick to what Qt prefers". It's just that feeling I get that whatever I look at is likely to be either outdated, not what the cool kids do, "well known" to be a bad idea, "not yet well supported", or some combination of these. I'd honestly rather try to learn how to talk to baristas about coffee roasts.

Beef
Jul 26, 2004
I agree that learning the modern C++ style right away is the way to go; in no way did I mean to start with C++98 and work your way up. You want to avoid the giant vestigial organs and limbs each holding a footgun pointing at other parts of the language like a kind of body-horror Mexican standoff.
But I'm sure you also agree that you would want to teach even modern C++ incrementally. If you hack it down to a minimal subset for your "lesson 1", you naturally land on something that is more C-like. You can get most of a modern C++ style across without having to wade through class inheritance or exceptions.

Jeffrey of YOSPOS
Dec 22, 2005

GET LOSE, YOU CAN'T COMPARE WITH MY POWERS
Yeah, I think there's a pretty big open question as to how best to learn programming in general, and a big old crusty language like C++ in particular. Years of "object oriented as gospel" have led to some very strange priorities and, to me, staying far away from big inheritance hierarchies and UML diagrams is worthwhile until you can understand why someone might use them.

I know I learned via C, and I was privileged enough to learn in school, so it was more like "learn how a computer works -> learn a language that's 1:1 with machine language -> build the sort of abstractions that C++ provides in that language", and I think it went well, but I'm still no fan of big all-in inheritance hierarchies. If you are working and don't have time for that, learning modern idiomatic C++ directly is for sure the way to go. C++ is in a weird place: there's a ton of abstraction, so it's really hard to look at a piece of code in isolation and figure out how it will translate to machine code, but it still provides all of the low-level tools that let you hang yourself. C programmers looking for a few abstraction aids and higher-level language programmers looking for a bit more fine-grained control both might turn to C++ but are going to see very different things. Default initialization for ints is a perfect example of this conflict - both sides have pretty reasonable but opposite expectations, and the language has to choose.

Absurd Alhazred
Mar 27, 2010

by Athanatos

Jeffrey of YOSPOS posted:

Default initialization for ints is a perfect example of this conflict - both sides have pretty reasonable but opposite expectations, and the language has to choose.

It just chose both. :v:
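Roughly, as a sketch:

```cpp
int global_i;          // static storage duration: zero-initialized, reads as 0

int main() {
    int local_i;       // automatic storage: indeterminate value; reading it is UB
    int braced_i{};    // value-initialized: guaranteed 0
    (void)local_i;     // discarding without reading is fine
    return global_i + braced_i;   // 0
}
```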

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

What is a good modern C++ guide?

Beef
Jul 26, 2004
Fear and Loathing in Las Vegas

Private Speech
Mar 30, 2011

I HAVE EVEN MORE WORTHLESS BEANIE BABIES IN MY COLLECTION THAN I HAVE WORTHLESS POSTS IN THE BEANIE BABY THREAD YET I STILL HAVE THE TEMERITY TO CRITICIZE OTHERS' COLLECTIONS

IF YOU SEE ME TALKING ABOUT BEANIE BABIES, PLEASE TELL ME TO

EAT. SHIT.


taqueso posted:

What is a good modern C++ guide?

CppCoreGuidelines or the Google C++ style guide?

Ones I have bookmarked anyway.

roomforthetuna
Mar 22, 2005

I don't need to know anything about virii! My CUSTOM PROGRAM keeps me protected! It's not like they'll try to come in through the Internet or something!

Beef posted:

I agree that learning the modern C++ style right away is the way to go; in no way did I mean to start with C++98 and work your way up. You want to avoid the giant vestigial organs and limbs each holding a footgun pointing at other parts of the language like a kind of body-horror Mexican standoff.
But I'm sure you also agree that you would want to teach even modern C++ incrementally. If you hack it down to a minimal subset for your "lesson 1", you naturally land on something that is more C-like. You can get most of a modern C++ style across without having to wade through class inheritance or exceptions.
I like this point. I think what you're saying is: using modern C++ features is good; learning C++ the way it was taught in the 90s, where it's shaped like Java, is bad.

Like, C++ can be a functional language if you want it to be, and that's a great way to learn, because you can be doing things and getting it to do stuff you want without ever typing 'class'. The whole thing of Dog being a subclass of Animal, by contrast, is a terrible way to learn: you almost never want to actually use inheritance like that, and it absolutely gets in the way of getting stuff to happen. For the most part inheritance is just used to define an interface separately from the implementation[s] of the interface; you rarely make a tree of it.

But it's good to learn smart pointers rather than Pointers Classic, once you get to the point of learning pointers, and it's good to learn with std::vectors.
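Something like this is the shape I mean, as a sketch (the types are made up): one abstract base that only exists to define an interface, an implementation behind it, smart pointers and a vector doing the ownership, and no deep tree anywhere.

```cpp
#include <cstdio>
#include <memory>
#include <vector>

// The base class exists only to define an interface, not a taxonomy.
struct Renderer {
    virtual ~Renderer() = default;
    virtual void draw() const = 0;
};

struct ConsoleRenderer : Renderer {
    void draw() const override { std::puts("drawing to console"); }
};

int main() {
    std::vector<std::unique_ptr<Renderer>> renderers;          // vector owns the objects
    renderers.push_back(std::make_unique<ConsoleRenderer>());  // no new/delete in sight
    for (const auto& r : renderers) r->draw();
}
```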

more falafel please
Feb 26, 2005

forums poster

roomforthetuna posted:

But it's good to learn smart pointers rather than Pointers Classic, once you get to the point of learning pointers, and it's good to learn with std::vectors.

I struggle with this a bit, because "learning about pointers" was the point when, like, how a computer works clicked for me. It probably helped that I was learning C++ at the same time as I was teaching myself Z80 assembly for the TI-82, but realizing that this "pointer" thing was just a memory address, which is just a ding dang number that's a byte index in memory made everything make sense. Like "oh poo poo to run a program the OS probably just reads it into memory and then just jumps to the beginning of it, huh?" My fear is that not learning Pointers Classic means that programmers don't have a low-level mental model of how the machine works (or, really, how the C abstract machine works, obviously every stage of this is more complicated). I guess I don't know about everyone who works in C++ professionally, but over here in games you need a pretty solid low level understanding to get anywhere.

roomforthetuna
Mar 22, 2005

I don't need to know anything about virii! My CUSTOM PROGRAM keeps me protected! It's not like they'll try to come in through the Internet or something!

more falafel please posted:

I struggle with this a bit, because "learning about pointers" was the point when, like, how a computer works clicked for me. It probably helped that I was learning C++ at the same time as I was teaching myself Z80 assembly for the TI-82, but realizing that this "pointer" thing was just a memory address, which is just a ding dang number that's a byte index in memory made everything make sense. Like "oh poo poo to run a program the OS probably just reads it into memory and then just jumps to the beginning of it, huh?" My fear is that not learning Pointers Classic means that programmers don't have a low-level mental model of how the machine works (or, really, how the C abstract machine works, obviously every stage of this is more complicated). I guess I don't know about everyone who works in C++ professionally, but over here in games you need a pretty solid low level understanding to get anywhere.
There's still arguably a good space to learn about regular pointers too, for like moving around in data structures or under some conditions for referencing other owners' unique_ptrs. Smart pointers are just a good thing to learn rather than malloc/free which are a disaster area that you don't really need to understand even for pretty low-level things. Anything higher level than compilers and operating systems shouldn't need you to understand malloc.

I'm certainly not advocating that "how it all works under the hood" is a thing you shouldn't still learn. (And I'm *more* advocating that OO C++ is the worst place to begin, my objection to malloc/free is like 10% of the intensity of my objection to teaching people about inheritance before they're anywhere near a situation where it has value.)
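For the first case, a sketch of what I mean by a regular, non-owning pointer (names invented): the unique_ptr owns the lifetime, the raw pointer just observes.

```cpp
#include <cstdio>
#include <memory>

struct Node { int value = 7; };

int main() {
    auto owner = std::make_unique<Node>();   // unique_ptr owns the Node's lifetime
    Node* observer = owner.get();            // raw pointer: non-owning, never deleted here
    std::printf("%d\n", observer->value);    // valid only as long as 'owner' is alive
}
```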

Dominoes
Sep 20, 2007

OOP feels like obfuscation to me. I'm more a fan of classes/structs as a way to describe the data you're working with, especially as the program's foundation. From what I gather, the appeal of C++ over C is more the higher-level features and abstractions.

Re D: Looks good from a quick skim, but it seems like it wouldn't have the critical perk of C++'s embedded ecosystem. More generally as a language, I'm having a tough time grasping why it includes a GC: this eliminates some of the (until recently) unique utility of C/C++.

Dominoes fucked around with this message at 02:18 on May 14, 2020

Foxfire_
Nov 8, 2010

My experience teaching people pointer stuff is that the main hurdle is getting them to understand variables that hold "things" vs "names of things". I've found it easier to teach the concepts in C++ (where the names have physical meaning as memory addresses) than in Python (where the names are abstract).

Actually using them, you're way more likely to screw up using manually controlled lifetimes than garbage collected lifetimes, but conceptually the addresses are easier to teach.


malloc/free seems kind of orthogonal to me because if you understand how pointers/indirect names work already, "malloc=gives you an array of bytes" and "free=you give back the array of bytes" don't really have any complexity to them

roomforthetuna
Mar 22, 2005

I don't need to know anything about virii! My CUSTOM PROGRAM keeps me protected! It's not like they'll try to come in through the Internet or something!

Foxfire_ posted:

malloc/free seems kind of orthogonal to me because if you understand how pointers/indirect names work already, "malloc=gives you an array of bytes" and "free=you give back the array of bytes" don't really have any complexity to them
The concept of malloc/free itself isn't complex (until you get into the real nitty gritty with fragmentation and poo poo), but using it right so you don't end up forgetting to clean something up is complex. Though now I think about it more, maybe I'm actually in favor of teaching malloc/free first because you can't really appreciate smart pointers until you've hosed up with malloc.

A funny corollary to "you're more likely to screw up manually controlled lifetimes than garbage collected lifetimes", which I agree is true, is that later, when you're working on large projects with significant data flows, garbage collected lifetimes will merrily screw stuff up for you when you didn't screw up at all. I've had Chrome crash because when it hits the memory limit for a tab, it crashes without even trying a last-minute GC (which is incredibly stupid - I did some poo poo in the debugger so I could trigger GC manually, and doing the same sequence of events but with a manual GC right before the crash, it had actually only used 1/4 of the available memory - and it wasn't that it had no idle time, it just hadn't felt like doing GC yet). I've seen Go code trigger its process manager to kill it because it had used 2 GB of RAM in a process that was essentially just streaming data and should have needed less than 4 KB plus the base process allocation, but because it was allocating a temporary buffer for received blocks inside a loop, it could run into a situation where it didn't find time to trigger the GC between iterations. Manual lifetimes protect you from that. Being able to do stuff on the stack helps too.

more falafel please
Feb 26, 2005

forums poster

roomforthetuna posted:

There's still arguably a good space to learn about regular pointers too, for like moving around in data structures or under some conditions for referencing other owners' unique_ptrs. Smart pointers are just a good thing to learn rather than malloc/free which are a disaster area that you don't really need to understand even for pretty low-level things. Anything higher level than compilers and operating systems shouldn't need you to understand malloc.

I'm certainly not advocating that "how it all works under the hood" is a thing you shouldn't still learn. (And I'm *more* advocating that OO C++ is the worst place to begin, my objection to malloc/free is like 10% of the intensity of my objection to teaching people about inheritance before they're anywhere near a situation where it has value.)

Yeah, agreed on all counts. I guess my ideal curriculum would have basic programming concepts (control, functions, scope, probably light recursion) taught in a language like Python. Straight basics of imperative language concepts. Then a low-level class (we can probably start at the assembly level) that teaches about memory, binary/hex, how flow control in high-level languages is implemented in terms of jumps, the stack, etc, floating point, debugging in assembly, blah blah. Then an OS class taught in C, using Unix (either a toy one for teaching purposes, or just "install this CentOS virtualbox image"), covering processes, scheduling, i/o, threading and concurrency pitfalls, shells, and simultaneously the C model, safe string manipulation, basic data structures (maybe just linked lists), using makefiles and external libs (something simple like zlib). Then software design using modern C++, multiple teams working together on features in a shared (probably pre-existing) codebase with source control and project management. Data structures, algorithms, grammars, automata, all that poo poo can be taught concurrently in classes using either pseudocode (like the CLRS language) or something low-overhead like python.

more falafel please
Feb 26, 2005

forums poster

roomforthetuna posted:

which I agree is true,

I don't, at least for nontrivial programs. GC basically assumes infinite memory, and the least bit of complexity makes it fall rear end-backwards into "it'll probably collect this stuff eventually" or making easy mistakes (circular references, dangling references) into big problems. Again, most of my experience is in games, which have to run for a long time, doing lots of stuff with as much memory as possible, but the Chrome example is a good one too. Unreal Engine 3 was garbage collected. We never shipped a UE3 game where auto-GC was turned on -- we always had to just refactor every nontrivial allocated object to be reference counted (or better yet, have explicit lifetime and ownership) and only run GC at level loads. UE4 uses reference counting (basically just shared_ptr/unique_ptr/weak_ptr, but Make It Unreal) and it's much easier to deal with.

It's just as easy to write resource management bugs in GC languages/environments, and GC tells you you shouldn't worry your pretty little head about it. You still need to manage memory, lifetimes, ownership, etc in GC languages, it's just harder to tell that you're doing it right, plus you get a nice big 100ms hitch in the main thread every so often.

Foxfire_
Nov 8, 2010

more falafel please posted:

I guess my ideal curriculum would have basic programming concepts (control, functions, scope, probably light recursion) taught in a language like Python. Straight basics of imperative language concepts.

This is kind of what my school did (with a brief survey class between the intro to programming and topic ones) and I think one of the challenges with it is that pointers are a basic programming concept for anything with mutable state and are harder to teach in a language like Python where the pointer vs thing distinction isn't explicit and the addresses are hidden. Getting people in intro classes to understand when and why assignments sometimes make copies and sometimes not was the biggest challenge in tutoring.

more falafel please posted:

It's just as easy to write resource management bugs in GC languages/environments, and GC tells you you shouldn't worry your pretty little head about it. You still need to manage memory, lifetimes, ownership, etc in GC languages, it's just harder to tell that you're doing it right, plus you get a nice big 100ms hitch in the main thread every so often.

GC/reference counting fails much more nicely, though. When you mess up you get out-of-memory crashes as opposed to heisenbugs/data corruption/security holes.

Xerophyte
Mar 17, 2008

This space intentionally left blank
Re: horrors of undergraduate memory management, the first time I ever wrote C++ was in my bachelor's thesis. I'd written Java and C before, but my knowledge of C++ was basically hearing that it's like Java but without garbage collection. Since in Java you always use new to construct a class type, that's what I tried doing in C++, and when I noticed that new returned a pointer type in C++ I just figured that the way you used classes in C++ must clearly be to always declare them as pointers and then very carefully new/delete them to manually manage object lifetimes. I did at least manage to accidentally do some sort of RAII with new in constructors and delete in destructors, even if I'd never heard the term RAII. [E: Oh, and of course I never used references at all.]

I somehow wrote an entire D3D renderer that usually didn't explode while working like that. I felt very dumb when someone finally explained that you can allocate classes on the stack and non-fundamental class members don't actually need to be pointers.

I don't think I'd want to try teaching C++ to someone by just telling them to use smart pointers and not worry about the memory management stuff until later, even if it's certainly better than what I did. It seems easiest to first teach the basic memory model and then introduce the reference-counted standard library classes and how they work within that model. Doing the reverse means hoping your student doesn't stumble into the field of footguns that are the core language features while all they know is unique_ptr.
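For contrast, the thing I eventually learned, as a rough sketch (the type is made up): plain value members and automatic storage, no new/delete anywhere.

```cpp
#include <string>
#include <vector>

struct Mesh {
    std::vector<float> vertices;   // value member: no pointer, no manual delete
    std::string name;
};

int main() {
    Mesh m;                        // automatic storage: destructor runs when m goes out of scope
    m.name = "triangle";
    m.vertices = {0.f, 0.f, 1.f, 0.f, 0.f, 1.f};
    return static_cast<int>(m.vertices.size());   // 6
}                                  // no delete needed: m cleans up its members itself
```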

Xerophyte fucked around with this message at 12:00 on May 14, 2020

Dren
Jan 5, 2001

Pillbug
I think memory leaks from circular dependencies with smart pointers (or objects in GC languages) are subtler and worse to figure out than memory leaks from incorrect manual management of memory. edit: this is made worse by the garbage collector, because when it happens you never know if it's the garbage collector just not garbage collecting, like roomforthetuna talked about, or an error.

I’m kind of a fan of thoughtful passing of references. If the lifetimes are wrong the program will just explode and now you know what went wrong, and where.

Dren fucked around with this message at 12:23 on May 14, 2020

Jabor
Jul 16, 2010

#1 Loser at SpaceChem
Generally speaking, garbage collectors are perfectly capable of GCing object structures with circular references.

If you're using a specific collector that isn't, then yeah, you need to be real particular about things. It's also less of a "garbage collector" and more of an "everything is a shared_ptr by default".
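For the shared_ptr-by-default case, a sketch of the classic cycle problem and the usual way to break it (the types are invented):

```cpp
#include <memory>

struct Node {
    std::shared_ptr<Node> next;   // strong reference: two of these in a loop never hit zero
    std::weak_ptr<Node> parent;   // weak reference: observes without extending lifetime
};

int main() {
    auto a = std::make_shared<Node>();
    auto b = std::make_shared<Node>();
    a->next = b;
    b->parent = a;   // weak_ptr here breaks the would-be cycle; both Nodes get destroyed
    // if this were b->next = a instead, the reference counts would never reach zero
}
```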

Beef
Jul 26, 2004

Dominoes posted:

Re D: Looks good from a quick skim, but it seems like it wouldn't have the critical perk of C++'s embedded ecosystem. More generally as a language, I'm having a tough time grasping why it includes a GC: this eliminates some of the (until recently) unique utility of C/C++.

At the risk of sounding like a D groupie ...

You can ploink down D objects in any C/C++ project; that's what it was designed to do. D is ABI-compatible with C and C++, and it contains by default a ton of utilities to help you call into and be called from C++, such as name (de-)mangling. E.g. you can just #include C/C++ headers, extern the symbols yourself, and link the D-compiled .o file into your existing C/C++ project with GCC or Clang. It was used by Remedy Entertainment partly to provide hot-swappable code for its animation system, inside a pure C++ engine.
Personally, I've been using it in a large C code base for our experimental hardware, because gently caress putting my dick in a 1M-LOC C project with only emulated-printf debugging. It's worth it just for the compile-time checks and doing away with the forest of #include "template.c" and #ifndefs forever.

D's GC is more like adding a GC to C++: it's optional and avoidable, @nogc regions are enforceable at compile time, and only a few specific constructs generate garbage. Annoyingly, the standard library likes to generate garbage, although there seems to be an effort to @nogc as much as possible these days.
The use case for D's GC is a bit like in the Unreal example from an earlier post above: you use it when prototyping and refactor it out later from the critical parts.
D with the -betterC flag has no GC or any D runtime whatsoever.

Walter Bright used to write commercial C++ compilers (Zortech/Symantec/DigitalMars C++) and is a former game dev, and it really shows.

Beef
Jul 26, 2004

more falafel please posted:

GC basically assumes infinite memory

So does malloc on virtual memory :q:


more falafel please posted:

It's just as easy to write resource management bugs in GC languages/environments, and GC tells you you shouldn't worry your pretty little head about it. You still need to manage memory, lifetimes, ownership, etc in GC languages, it's just harder to tell that you're doing it right, plus you get a nice big 100ms hitch in the main thread every so often.

This. If performance is in any way a feature of your software product, GC-everything-by-default is a pain. Although the same goes for manual allocation if you are using the general allocator everywhere.
There are some areas where GC does beat manual allocation, e.g. a large number of small objects with unpredictable dynamic lifetimes, but again you have to be very aware of the tradeoffs.

more falafel please
Feb 26, 2005

forums poster

Foxfire_ posted:

GC/reference counting fails much nicer though. When you mess up you get out of memory crashes as opposed to heisenbugs/data corruption/security holes

OOM crashes are unacceptable in all the software I've worked on professionally. So are GC hitches. When I've used GC environments, about half the engineering effort for the project goes into trying to make sure the GC runs as little as possible. GC doesn't make it so you don't have to manage your memory; it just makes memory harder to manage :)

Ralith
Jan 12, 2011

I see a ship in the harbor
I can and shall obey
But if it wasn't for your misfortune
I'd be a heavenly person today

Beef posted:

Walter Bright used to write commercial C++ compilers (Zortech/Symantec/DigitalMars C++) and is a former game dev, and it really shows.
I know D is a bit of a mess, but that's just harsh.

Foxfire_
Nov 8, 2010

more falafel please posted:

OOM crashes are unacceptable in all the software I've worked on professionally. So are GC hitches. When I've used GC environments, about half the engineering effort for the project goes into trying to make sure the GC runs as little as possible. GC doesn't make it so you don't have to manage your memory; it just makes memory harder to manage :)

Neither failure mode is OK in a shipped product, but a crash is always a less bad and easier-to-fix failure mode than "calculate incorrect answers sometimes (but not always)" or "allow arbitrary code execution".

Foxfire_ fucked around with this message at 21:09 on May 14, 2020

Beef
Jul 26, 2004

Ralith posted:

I know D is a bit of a mess, but that's just harsh.

:golfclap:

Yeah, that sword definitely cuts both ways.

rjmccall
Sep 7, 2007

no worries friend
Fun Shoe

roomforthetuna posted:

There's still arguably a good space to learn about regular pointers too, for like moving around in data structures or under some conditions for referencing other owners' unique_ptrs. Smart pointers are just a good thing to learn rather than malloc/free which are a disaster area that you don't really need to understand even for pretty low-level things. Anything higher level than compilers and operating systems shouldn't need you to understand malloc.

I'm certainly not advocating that "how it all works under the hood" is a thing you shouldn't still learn. (And I'm *more* advocating that OO C++ is the worst place to begin, my objection to malloc/free is like 10% of the intensity of my objection to teaching people about inheritance before they're anywhere near a situation where it has value.)

Anyone learning to use a programming language needs to understand how they should actually write code. Anyone learning to do systems programming needs to understand the basics of how the systems they use work. In many languages, you can learn one and then the other. In C++, the way you should actually write code includes basic abstractions like references that require you to understand how they work in order to use them safely and effectively, and so it is very difficult to separate the two steps. It is much easier to first learn C in order to internalize the basic model, but then you have to carefully un-learn all of the C idioms (but not their underlying concepts) when you go back to learning how to actually write C++.

more falafel please
Feb 26, 2005

forums poster

Foxfire_ posted:

Neither failure mode is OK in a shipped product, but a crash is always a less bad and easier-to-fix failure mode than "calculate incorrect answers sometimes (but not always)" or "allow arbitrary code execution".

OOM crashes are much harder to fix than, like, a simple null deref. We literally had a perf engineer on a project for two years doing just memory fixes and optimizations. That was on a console port of a Unity project originally written by a couple of dinguses, so maybe the absolute worst-case scenario, but still.

darkforce898
Sep 11, 2007

Are there any resources that can help me implement encryption and decryption of files and communication from a client to a server?

I'm writing a program in C that will run on embedded ARM devices and has a configuration file that enables and disables certain features. How do I secure the configuration so that a user cannot just change it by hand?

Right now it has an obfuscated secret key in the binary that I am going to combine with an API key, and then I run AES-256-CBC on the file to read and write it. I know that putting the secret key in the binary isn't going to stop someone from running IDA Pro, but it will stop someone from running strings on it.

Is there a better way to do this? I don't want to reinvent the wheel. Also, is there a better resource for the OpenSSL API docs than just Stack Overflow and the wiki?

more falafel please
Feb 26, 2005

forums poster

darkforce898 posted:

Are there any resources that can help me implement encryption and decryption of files and communication from a client to a server?

I'm writing a program in C that will run on embedded ARM devices and has a configuration file that enables and disables certain features. How do I secure the configuration so that a user cannot just change it by hand?

Right now it has an obfuscated secret key in the binary that I am going to combine with an API key, and then I run AES-256-CBC on the file to read and write it. I know that putting the secret key in the binary isn't going to stop someone from running IDA Pro, but it will stop someone from running strings on it.

Is there a better way to do this? I don't want to reinvent the wheel. Also, is there a better resource for the OpenSSL API docs than just Stack Overflow and the wiki?

"Embedded ARM device" could mean a lot of things, is there an OS with file permissions? Is the program running at elevated privilege with respect to the user? Because my first thought is to use chmod to make the file readable only by the user that the program runs as.


Private Speech
Mar 30, 2011

I HAVE EVEN MORE WORTHLESS BEANIE BABIES IN MY COLLECTION THAN I HAVE WORTHLESS POSTS IN THE BEANIE BABY THREAD YET I STILL HAVE THE TEMERITY TO CRITICIZE OTHERS' COLLECTIONS

IF YOU SEE ME TALKING ABOUT BEANIE BABIES, PLEASE TELL ME TO

EAT. SHIT.


more falafel please posted:

"Embedded ARM device" could mean a lot of things, is there an OS with file permissions? Is the program running at elevated privilege with respect to the user? Because my first thought is to use chmod to make the file readable only by the user that the program runs as.

Pretty sure he's asking about a DRM implementation, so simply blocking other processes from accessing a plaintext file may not work terribly well for a number of reasons. It doesn't seem anywhere near as secure as what he's proposing in the first place, though it might be a reasonable extra to include.

Anyway, the described system sounds more than secure enough by embedded standards. Nobody has yet designed an unbreakable DRM scheme where the hardware is in customer hands, not even the likes of Sony, Nintendo and Microsoft throwing hundreds of millions and ASICs at it, so you probably won't design one either.

e: Or if it's an especially valuable high-volume item, do what everyone else does and make a security dongle, or hire a team and spend a lot of money on it. But that's very much a cost-benefit problem at that point.

Private Speech fucked around with this message at 01:20 on May 17, 2020
