|
The language actually predates the concept of DRY*. Using a header file was a way to make things easier on the compiler. A bunch of the C++ warts are because it uses a text-based include system, instead of something semantic. Imagine you want to ensure a compiler with only a few KB of RAM can compile a module in one pass, and you'll come up with something like C's way. Then decide you want to stay compatible with C but also want to pile a ton of features on top, and you get C++ and a compiler that wants GB of RAM to compile a module. *The earliest I can find is that DRY was used in The Pragmatic Programmer (1999), but I didn't look too hard. Really, the preprocessor is supporting DRY; the alternative would be manually updating function prototypes in each file. It just doesn't get all the way there. taqueso fucked around with this message at 00:15 on May 12, 2020
# ? May 11, 2020 23:33 |
|
|
taqueso posted:The language actually predates the concept of DRY. Using a header file was a way to make things easier on the compiler. A bunch of the C++ warts are because it uses a text-based include system, instead of something semantic. I know the preprocessor is actually defined in the standard, but I like to think of it as a separate, terrible system that is used in every C++ project.
|
# ? May 11, 2020 23:36 |
|
Dominoes posted:I think I have it mostly done, although can't get it to compile. 🤔
|
# ? May 11, 2020 23:52 |
|
pseudorandom name posted:For your purposes there's no reason not to use a std::vector; things like std::array are useful when you want a statically allocated read-only C++ container, and even then it's a hassle to use because std::make_array() is still experimental. Looks like std::to_array() is making it into C++20 at least. Dominoes posted:#2: Being unable to use the std::optional type due to embedded restrictions, how would you handle an optional type? What embedded restrictions does std::optional have? Are you just using a toolchain that doesn't have it?
|
# ? May 12, 2020 01:48 |
|
more falafel please posted:So what "`f` was not declared in this scope" means is that the compiler's looking for a function called `f` in the scope you're trying to call it in. If `f` hasn't been declared yet in that CPP file, you can't reference it. I'm assuming that `f` is defined later in the file. Normally, to get around this, you either put a declaration at the top of the file, or, to avoid repeating those declarations in every CPP file that might need to use that function, in a separate header file which is #included at the top of the file. taqueso posted:I've never witnessed someone going from Rust to C++, only the other way. It must be infuriating. Going the other way, all the language design decisions are 'oh, that's great, they fixed that bad default' type breaths of fresh air. taqueso posted:The language actually predates the concept of DRY*. Using a header file was a way to make things easier on the compiler. A bunch of the C++ warts are because it uses a text-based include system, instead of something semantic. qsvui posted:What embedded restrictions does std::optional have? Are you just using a toolchain that doesn't have it? Dominoes fucked around with this message at 01:58 on May 12, 2020
# ? May 12, 2020 01:56 |
|
Optional could be done with a magic number (0 or null or -1 or MAX_UINT) or with a tag added to a struct. The old-fashioned C way (I guess this would be for a Result equivalent, actually) would be to return a status and place the value in storage passed in as a pointer. In C++ you would pass a reference where you want the data to go and take a return value for status.
|
# ? May 12, 2020 02:32 |
|
Or you can just use a library like this: https://github.com/martinmoene/optional-lite
|
# ? May 12, 2020 02:42 |
|
For embedded projects I really like DasBetterC, the D subset that purely runs on libc. It's sort of halfway between C, C++ and Rust, but with a really powerful macro/CTFE system. https://dlang.org/blog/category/betterc/ My personal take on learning C++ is to start with a limited subset, such as C with overloading. There are plenty of foot-bazookas and annoying idiosyncrasies when coming from another language, so it helps not to expose yourself to the entire language at once. Beef fucked around with this message at 09:59 on May 12, 2020 |
# ? May 12, 2020 09:44 |
|
Only if you already know C. Learning C++ from scratch by starting with the C subset will actively cause you to learn poo poo habits.
|
# ? May 12, 2020 10:26 |
|
As someone with poo poo C++ habits, I'll agree. Learn C++[highest possible year you can] and try to learn the modern ways to do things.
|
# ? May 12, 2020 19:55 |
|
This is kind of why I've bounced off C++ a few times - I'm ok at C, I've done Java and C# and softer things, it feels like I should pick up C++ well enough to do more than "stick to what Qt prefers". It's just that feeling I get that whatever I look at is likely to be either outdated, not what the cool kids do, "well known" to be a bad idea, "not yet well supported", or some combination of these. I'd honestly rather try to learn how to talk to Baristas about coffee roasts.
|
# ? May 12, 2020 20:21 |
|
I agree that learning the modern C++ style right away is the way to go; in no way did I mean to start with C++98 and work your way up. You want to avoid the giant vestigial organs and limbs each holding a footgun pointing at other parts of the language like a kind of body-horror Mexican standoff. But I'm sure you also agree that you would want to teach even modern C++ incrementally. If you pare the language down to a minimal subset for your "lesson 1", you naturally land on something that is more C-like. You can get most of a modern C++ style across without having to wade through class inheritance or exceptions.
|
# ? May 12, 2020 20:24 |
|
Yeah I think there's a pretty big open question as to how best to learn programming in general, and a big old crusty language like C++ in particular. Years of "object oriented as gospel" have led to some very strange priorities and, to me, staying far away from big inheritance hierarchies and UML diagrams is worthwhile until you can understand why someone might use them. I know I learned via C, and I was privileged enough to learn in school, so it was more like "learn how a computer works -> learn a language that's 1:1 with machine language -> build the sort of abstractions that C++ provides in that language", and I think it went well, but I'm still no fan of big all-in inheritance hierarchies. If you are working and don't have time for that, learning modern idiomatic C++ directly is for sure the way to go. C++ is in a weird place where there's a ton of abstraction, it's really hard to look at a piece of code in isolation and figure out how it will translate to machine code, but it still provides all of the low level tools that let you hang yourself. C programmers looking for a few abstraction aids and higher level language programmers looking for a bit more fine-grained control both might turn to C++ but are going to see very different things. Default initialization for ints is a perfect example of this conflict: both sides have pretty reasonable but opposite expectations, and the language has to choose.
|
# ? May 12, 2020 20:58 |
|
Jeffrey of YOSPOS posted:Default initialization for ints is a perfect example of this conflict - both sides have pretty reasonable but opposite expectations, the language has to choose. It just chose both.
|
# ? May 12, 2020 21:14 |
|
What is a good modern C++ guide?
|
# ? May 12, 2020 21:22 |
|
Fear and Loathing in Las Vegas
|
# ? May 12, 2020 21:38 |
|
taqueso posted:What is a good modern C++ guide? CppCoreGuidelines or the Google C++ style guide? They're the ones I have bookmarked, anyway.
|
# ? May 12, 2020 21:57 |
|
Beef posted:I agree that learning the modern C++ style right away is the way to go, in no way I meant to start with C++98 and work your way up. You want to avoid the giant vestigial organs and limbs each holding a footgun pointing to other parts of the language like a kind of body-horror mexican standoff. Like, C++ can be a functional language if you want it to be, and that's a great way to learn because you can be doing things and getting it to do stuff you want without ever typing 'class', vs. the whole thing of Dog being a subclass of Animal is a terrible way to learn, you almost never want to actually use inheritance like that and it absolutely gets in the way of getting stuff to happen. For the most part inheritance is just used to define an interface separately from the implementation[s] of the interface, you rarely make a tree of it. But it's good to learn smart pointers rather than Pointers Classic, once you get to the point of learning pointers, and it's good to learn with std::vectors.
|
# ? May 13, 2020 04:35 |
|
roomforthetuna posted:But it's good to learn smart pointers rather than Pointers Classic, once you get to the point of learning pointers, and it's good to learn with std::vectors. I struggle with this a bit, because "learning about pointers" was the point when, like, how a computer works clicked for me. It probably helped that I was learning C++ at the same time as I was teaching myself Z80 assembly for the TI-82, but realizing that this "pointer" thing was just a memory address, which is just a ding dang number that's a byte index in memory made everything make sense. Like "oh poo poo to run a program the OS probably just reads it into memory and then just jumps to the beginning of it, huh?" My fear is that not learning Pointers Classic means that programmers don't have a low-level mental model of how the machine works (or, really, how the C abstract machine works, obviously every stage of this is more complicated). I guess I don't know about everyone who works in C++ professionally, but over here in games you need a pretty solid low level understanding to get anywhere.
|
# ? May 13, 2020 20:29 |
|
more falafel please posted:I struggle with this a bit, because "learning about pointers" was the point when, like, how a computer works clicked for me. It probably helped that I was learning C++ at the same time as I was teaching myself Z80 assembly for the TI-82, but realizing that this "pointer" thing was just a memory address, which is just a ding dang number that's a byte index in memory made everything make sense. Like "oh poo poo to run a program the OS probably just reads it into memory and then just jumps to the beginning of it, huh?" My fear is that not learning Pointers Classic means that programmers don't have a low-level mental model of how the machine works (or, really, how the C abstract machine works, obviously every stage of this is more complicated). I guess I don't know about everyone who works in C++ professionally, but over here in games you need a pretty solid low level understanding to get anywhere. I'm certainly not advocating that "how it all works under the hood" is a thing you shouldn't still learn. (And I'm *more* advocating that OO C++ is the worst place to begin, my objection to malloc/free is like 10% of the intensity of my objection to teaching people about inheritance before they're anywhere near a situation where it has value.)
|
# ? May 14, 2020 00:45 |
|
OOP feels like obfuscation to me. I'm more a fan of classes/structs as a way to describe the data you're working with, especially as the program's foundation. From what I gather, the appeal of C++ over C is more the higher-level features and abstractions. Re D: Seems good from a quick skim, but it seems like it wouldn't have the critical perk of C++'s embedded ecosystem. More generally as a language, I'm having a tough time grasping why it includes a GC: this eliminates some of the (until recently) unique utility of C/++. Dominoes fucked around with this message at 02:18 on May 14, 2020
# ? May 14, 2020 01:09 |
|
My experience teaching people pointer stuff is that the main hurdle is getting them to understand variables that hold "things" vs "names of thing". I've found it easier to teach concepts in C++ (where the names have physical meaning as memory addresses) than in python (where the names are abstract). Actually using them, you're way more likely to screw up using manually controlled lifetimes than garbage collected lifetimes, but conceptually the addresses are easier to teach. malloc/free seems kind of orthogonal to me because if you understand how pointers/indirect names work already, "malloc=gives you an array of bytes" and "free=you give back the array of bytes" don't really have any complexity to them
|
# ? May 14, 2020 03:55 |
|
Foxfire_ posted:malloc/free seems kind of orthogonal to me because if you understand how pointers/indirect names work already, "malloc=gives you an array of bytes" and "free=you give back the array of bytes" don't really have any complexity to them A funny corollary to "you're more likely to screw up manually controlled lifetimes than garbage collected lifetimes", which I agree is true, is that later, when you're working on large projects with significant data flows, garbage collected lifetimes will merrily screw stuff up for you when you didn't screw up at all. I've had Chrome crashing because when it hits the memory limit for a tab it crashes without even trying a last minute gc (which is incredibly stupid - I did some poo poo in the debugger so I could trigger gc manually, and doing the same sequence of events but with a manual gc right before it crashes, it actually had only used 1/4 of the available memory - and it wasn't that it had no idle time, it just hadn't felt like doing gc yet). I've seen Go code triggering its process-manager to kill it because it has used 2GB of RAM in a process that was essentially just streaming data, should have needed less than 4k + base process allocation, but because it was allocating a temporary buffer for received blocks inside a loop it could run into a situation where it didn't find time to trigger the gc in between iterations. Manual lifetimes protect you from that. Being able to do stuff on the stack helps too.
|
# ? May 14, 2020 05:32 |
|
roomforthetuna posted:There's still arguably a good space to learn about regular pointers too, for like moving around in data structures or under some conditions for referencing other owners' unique_ptrs. Smart pointers are just a good thing to learn rather than malloc/free which are a disaster area that you don't really need to understand even for pretty low-level things. Anything higher level than compilers and operating systems shouldn't need you to understand malloc. Yeah, agreed on all counts. I guess my ideal curriculum would have basic programming concepts (control, functions, scope, probably light recursion) taught in a language like Python. Straight basics of imperative language concepts. Then a low-level class (we can probably start at the assembly level) that teaches about memory, binary/hex, how flow control in high-level languages is implemented in terms of jumps, the stack, etc, floating point, debugging in assembly, blah blah. Then an OS class taught in C, using Unix (either a toy one for teaching purposes, or just "install this CentOS virtualbox image"), covering processes, scheduling, i/o, threading and concurrency pitfalls, shells, and simultaneously the C model, safe string manipulation, basic data structures (maybe just linked lists), using makefiles and external libs (something simple like zlib). Then software design using modern C++, multiple teams working together on features in a shared (probably pre-existing) codebase with source control and project management. Data structures, algorithms, grammars, automata, all that poo poo can be taught concurrently in classes using either pseudocode (like the CLRS language) or something low-overhead like python.
|
# ? May 14, 2020 05:48 |
|
roomforthetuna posted:which I agree is true, I don't, at least for nontrivial programs. GC basically assumes infinite memory, and the least bit of complexity makes it fall rear end-backwards into "it'll probably collect this stuff eventually" or making easy mistakes (circular references, dangling references) into big problems. Again, most of my experience is in games, which have to run for a long time, doing lots of stuff with as much memory as possible, but the Chrome example is a good one too. Unreal Engine 3 was garbage collected. We never shipped a UE3 game where auto-GC was turned on -- we always had to just refactor every nontrivial allocated object to be reference counted (or better yet, have explicit lifetime and ownership) and only run GC at level loads. UE4 uses reference counting (basically just shared_ptr/unique_ptr/weak_ptr, but Make It Unreal) and it's much easier to deal with. It's just as easy to write resource management bugs in GC languages/environments, and GC tells you you shouldn't worry your pretty little head about it. You still need to manage memory, lifetimes, ownership, etc in GC languages, it's just harder to tell that you're doing it right, plus you get a nice big 100ms hitch in the main thread every so often.
|
# ? May 14, 2020 05:57 |
|
more falafel please posted:I guess my ideal curriculum would have basic programming concepts (control, functions, scope, probably light recursion) taught in a language like Python. Straight basics of imperative language concepts. This is kind of what my school did (with a brief survey class between the intro to programming and topic ones) and I think one of the challenges with it is that pointers are a basic programming concept for anything with mutable state and are harder to teach in a language like Python where the pointer vs thing distinction isn't explicit and the addresses are hidden. Getting people in intro classes to understand when and why assignments sometimes make copies and sometimes not was the biggest challenge in tutoring. more falafel please posted:It's just as easy to write resource management bugs in GC languages/environments, and GC tells you you shouldn't worry your pretty little head about it. You still need to manage memory, lifetimes, ownership, etc in GC languages, it's just harder to tell that you're doing it right, plus you get a nice big 100ms hitch in the main thread every so often. GC/reference counting fails much nicer though. When you mess up you get out of memory crashes as opposed to heisenbugs/data corruption/security holes
|
# ? May 14, 2020 07:57 |
|
Re: horrors of undergraduate memory management, the first time I ever wrote C++ was in my bachelor's thesis. I'd written Java and C before, but my knowledge of C++ was basically hearing that it's like Java but without garbage collection. Since in Java you always use new to construct a class type, that's what I tried doing in C++, and when I noticed that new returned a pointer type in C++ I just figured that the way you used classes in C++ must clearly be to always declare them as pointers and then very carefully new/delete them to manually manage object lifetimes. I did at least manage to accidentally do some sort of RAII with new in constructors and delete in destructors, even if I'd never heard the term RAII. [E: Oh, and of course I never used references at all.] I somehow wrote an entire D3D renderer that usually didn't explode while working like that. I felt very dumb when someone finally explained that you can allocate classes on the stack and non-fundamental class members don't actually need to be pointers. I don't think I'd want to try teaching C++ to someone by just telling them to use smart pointers and not worry about the memory management stuff until later, even if it's certainly better than what I did. It seems easiest to first teach the basic memory model and then introduce the reference counted standard library classes and how they work within that model. Doing the reverse means hoping your student doesn't stumble into the field of footguns that are the core language features while all they know is unique_ptr. Xerophyte fucked around with this message at 12:00 on May 14, 2020
# ? May 14, 2020 11:58 |
|
I think memory leaks from circular dependencies with smart pointers (or objects in GC languages) are subtler and worse to figure out than memory leaks from incorrect manual management of memory. edit: this is made worse by the garbage collector because when it happens you never know if it's the garbage collector just not garbage collecting, like roomforthetuna talked about, or an error. I'm kind of a fan of thoughtful passing of references. If the lifetimes are wrong the program will just explode and now you know what went wrong, and where. Dren fucked around with this message at 12:23 on May 14, 2020
# ? May 14, 2020 11:58 |
|
Generally speaking, garbage collectors are perfectly capable of GCing object structures with circular references. If you're using a specific collector that isn't, then yeah, you need to be real particular about things. It's also less of a "garbage collector" and more of a "everything is a shared_ptr by default".
|
# ? May 14, 2020 14:06 |
|
Dominoes posted:Re D: Seems good from a quick skim, but seems like it wouldn't have the critical perk of C++'s embedded ecosystem. More generally as a language, I'm having a tough time grasping why it includes a GC: This eliminates some of the (until recently) unique utility of C/++. At the risk of sounding like a D groupie ... You can ploink down D objects in any C/C++ project, that's what it was designed to do. D is ABI compatible with C and C++, and it contains by default a ton of utilities to help you call into and be called from C++, such as name (de-)mangling. e.g. You can just #include C/C++ headers, extern symbols yourself, and link the D-compiled .o file into your existing C/C++ project with GCC or Clang. It was used by Remedy Entertainment partly to provide hot-swappable code for its animation system, inside a pure C++ engine. Personally, I've been using it in a large C code base for our experimental hardware, because gently caress putting my dick in a 1M-loc C project with only emulated-printf debugging. It's worth it just for the compile-time checks and doing away with the forest of #include "template.c" and #ifndefs forever. D's GC is more like adding a GC to C++: it's optional, avoidable, @nogc regions are enforceable at compile time, and only a few specific constructs generate garbage. Annoyingly, the standard library likes to generate garbage, although there seems to be an effort to @nogc as much as possible these days. The use case for D's GC is a bit like in the Unreal example from an earlier post above: you use it when prototyping and refactor it out later from the critical parts. D with the -betterC flag has no GC or any D runtime whatsoever. Walter Bright used to write commercial C++ compilers (Zortech/Symantec/DigitalMars C++) and is a former game dev, and it really shows.
|
# ? May 14, 2020 14:20 |
|
more falafel please posted:GC basically assumes infinite memory So does malloc on virtual memory more falafel please posted:It's just as easy to write resource management bugs in GC languages/environments, and GC tells you you shouldn't worry your pretty little head about it. You still need to manage memory, lifetimes, ownership, etc in GC languages, it's just harder to tell that you're doing it right, plus you get a nice big 100ms hitch in the main thread every so often. This. If performance is in any way a feature of your software product, GC-everything-by-default is a pain. Although the same goes for manual allocation if you are using the general allocator everywhere. There are some areas where GC does beat manual allocation, e.g. a large number of small objects with unpredictable dynamic lifetimes. But again you have to be very aware of the tradeoffs.
|
# ? May 14, 2020 14:30 |
|
Foxfire_ posted:GC/reference counting fails much nicer though. When you mess up you get out of memory crashes as opposed to heisenbugs/data corruption/security holes OOM crashes are unacceptable in all the software I've worked on professionally. So are GC hitches. When I've used GC environments, about half the engineering effort for the project goes into trying to make sure the GC runs as little as possible. GC doesn't make it so you don't have to manage your memory, it just makes memory harder to manage
|
# ? May 14, 2020 17:21 |
|
Beef posted:Walter Bright used to write commercial C++ compilers (Zortech/Symantec/DigitalMars C++) and a former game dev, and it really shows. I know D is a bit of a mess, but that's just harsh.
|
# ? May 14, 2020 17:41 |
|
more falafel please posted:OOM crashes are unacceptable in all the software I've worked on professionally. So are GC hitches. When I've used GC environments, about half the engineering effort for the project goes into trying to make sure the GC runs as little as possible. GC doesn't make it so you don't have to manage your memory, it just makes memory harder to manage Neither failure mode is ok in a shipped product, but a crash is always a less bad and easier-to-fix failure mode than "calculate incorrect answers sometimes (but not always)" or "allow arbitrary code execution" Foxfire_ fucked around with this message at 21:09 on May 14, 2020
# ? May 14, 2020 20:57 |
|
Ralith posted:I know D is a bit of a mess, but that's just harsh. Yeah, that sword definitely cuts both ways.
|
# ? May 14, 2020 22:28 |
|
roomforthetuna posted:There's still arguably a good space to learn about regular pointers too, for like moving around in data structures or under some conditions for referencing other owners' unique_ptrs. Smart pointers are just a good thing to learn rather than malloc/free which are a disaster area that you don't really need to understand even for pretty low-level things. Anything higher level than compilers and operating systems shouldn't need you to understand malloc. Anyone learning to use a programming language needs to understand how they should actually write code. Anyone learning to do systems programming needs to understand the basics of how the systems they use work. In many languages, you can learn one and then the other. In C++, the way you should actually write code includes basic abstractions like references that require you to understand how they work in order to use them safely and effectively, and so it is very difficult to separate the two steps. It is much easier to first learn C in order to internalize the basic model, but then you have to carefully un-learn all of the C idioms (but not their underlying concepts) when you go back to learning how to actually write C++.
|
# ? May 14, 2020 22:51 |
|
Foxfire_ posted:Neither failure mode is ok in a shipped product, but crash always is a less bad and easier to fix failure mode than "calculate incorrect answers sometimes (but not always)" or "allow arbitrary code execution" OOM crashes are much harder to fix than like, a simple null deref. We literally had a perf engineer on a project for two years doing just memory fixes and optimizations. That's on a port of a Unity project originally written by a couple of dinguses to console, so maybe the absolute worst case scenario, but still.
|
# ? May 14, 2020 23:44 |
|
Are there any resources that can help me implement encryption and decryption of files and communication from a client to a server? I'm writing a program in C that will run on embedded ARM devices and has a configuration file that enables and disables certain features. How do I secure the configuration so that a user cannot just change it by hand? Right now it has a secret key in the binary that is obfuscated that I am going to combine with an API key. I know that putting the secret key in the binary isn't going to stop someone from running IDA Pro, but it will stop someone from running strings on it. And then running AES-256-CBC on the file to read and write. Is there a better way to do this? I don't want to reinvent the wheel. Also, is there a better resource for OpenSSL API docs than just Stack Overflow and the wiki?
|
# ? May 17, 2020 00:29 |
|
darkforce898 posted:Are there any resources that can help me implement encryption and decryption of files and communication from a client to a server? "Embedded ARM device" could mean a lot of things, is there an OS with file permissions? Is the program running at elevated privilege with respect to the user? Because my first thought is to use chmod to make the file readable only by the user that the program runs as.
|
# ? May 17, 2020 00:33 |
|
|
|
more falafel please posted:"Embedded ARM device" could mean a lot of things, is there an OS with file permissions? Is the program running at elevated privilege with respect to the user? Because my first thought is to use chmod to make the file readable only by the user that the program runs as. Pretty sure he's asking about a DRM implementation, so simply blocking other processes from accessing a plaintext file may not work terribly well for a number of reasons. It doesn't seem anywhere near as secure as what he's proposing in the first place, though it might be a reasonable extra to include. Anyway, the described system sounds more than secure enough by embedded standards. Nobody has yet designed an unbreakable DRM scheme for hardware in customer hands, not even the likes of Sony, Nintendo and Microsoft throwing hundreds of millions and custom ASICs at it, so you probably won't design one either. e: Or if it's an especially valuable high-volume item, do what everyone else does and use a security dongle, or hire a team and spend a lot of money on it. But that's very much a cost-benefit problem at that point. Private Speech fucked around with this message at 01:20 on May 17, 2020
# ? May 17, 2020 00:52 |