|
B-Nasty posted:When I was in school, the submitted code was actually read (probably by overworked TAs), and code quality issues affected your grade as much as the 'did it run?' component. Not doing the human reading part would be like an English professor only running a spell-check and word-count on your essay to determine if you passed the assignment. "Yep, it's written in English, doesn't have obvious grammatical mistakes, and is 3000 words long...A+."

I don't fully disagree, but I'd like to point out that manual checking has huge scaling problems. Our CS1 course is relatively small in the grand scheme of things, and it still has something on the scale of 250 000 returned assignments each iteration. We run five or six of these each year. Manually checking all of that and giving sensible feedback is simply not possible, given that we're not a particularly rich university. We do employ code style checking tools that reject solutions that don't match the course's code style requirements, but there's no switch for "sane variable names" in them beyond "ensure it's at least three letters and is correctly capitalized".

B-Nasty posted:... claim that coding is part science, part art.

I agree, and this is actually one of the primary motivators for having the students' answers checked automatically: the only way to learn to program is to do a huge amount of programming. The only way to ensure that the students do a huge amount of programming is to have a huge amount of assignments, and the only way to check that they complete all the assignments is to check them automatically.

B-Nasty posted:I've come to appreciate that the code 'working' is just a small component of its (lack of) value. Coders that go the extra mile to make their code easy to reason about, open to new tests, DRY, and well-organized are worth their weight in gold.

Again, agreed. These skills just happen to be something that is extremely difficult to teach.
Partly because something like "easy to reason about" is so hard to define.

E: NihilCredo posted:Engineering students manage to learn the importance of stress-strain analysis without collapsing a shoddily-built bridge, and medical students don't need to go through the experience of extracting a leftover gauze from a patient's chest in order to really get why proper operating room procedures matter. They are taught the right techniques from the start, in excruciating detail, and get rigorously tested on their knowledge of those techniques.

That's an interesting thought. I believe that, at least in the surgical field, "leave a gauze in the patient's chest" is in reality a surprisingly big problem, to the point where in a modern surgical theatre you have a person whose only job is to ensure that that literal scenario does not happen, by constantly counting stuff and writing the process down in excruciating detail. I wonder what exactly the programming equivalent of that would be. Statistical code analysis?

Counter prompt: I think the programming equivalent of your scenarios is "code does not compile/has a bug". What's the surgeon's equivalent of a badly named variable?

Loezi fucked around with this message at 16:11 on Apr 23, 2017 |
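To make the limitation concrete, here is a hypothetical sketch (not the course's actual tool, and the rules and names are invented) of what a mechanical style gate like the one described above can check: identifier length and capitalization, but never whether a name is actually *sane*.

```python
import re

# Naive identifier extraction; a real checker would use the language's tokenizer.
IDENT = re.compile(r"\b[A-Za-z_][A-Za-z0-9_]*\b")
KEYWORDS = {"for", "if", "else", "while", "def", "return", "in", "range"}

def style_violations(source):
    """Flag identifiers shorter than three letters or not lower_snake_case."""
    bad = set()
    for name in IDENT.findall(source):
        if name in KEYWORDS:
            continue
        if len(name) < 3 or not re.fullmatch(r"[a-z][a-z0-9_]*", name):
            bad.add(name)
    return sorted(bad)

# 'xs' trips the length rule, but 'data_tmp_2' sails through every
# mechanical check despite telling a human reader nothing.
print(style_violations("for xs in range(10): data_tmp_2 = xs"))  # ['xs']
```

The point of the sketch: everything the gate can verify is syntactic, which is exactly why "sane variable names" has no switch.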
# ? Apr 23, 2017 16:01 |
|
Loezi posted:I don't fully disagree, but I'd like to point out that manual checking has huge scaling problems. Our CS1 course is relatively small in the grand scale of things and it has something on the scale of 250 000 returned assignments each iteration. We run five or six of these each year. Manually checking all of that stuff and giving sensible feedback is simply not possible, given that we're not a particularly rich university. We do employ code style checking tools that reject solutions that don't match the course's code style requirements but there's no switch for "sane variable names" in it beyond "ensure it's at least three letters and is correctly capitalized".

Maybe you don't need to have humans look at every problem. You can have two tiers of problems: common test-yourself ones, where the student just has to prove they can implement something that follows the functional specification (i.e. has a handle on the algorithm), and another tier of somewhat larger projects, of which you have maybe 2 or 3 each semester, which are all graded by humans in addition to automated tests.

In the initial courses it will probably be worth it to have one or two projects early on that are designed not to be difficult to solve, but that require the students to have good code quality -- i.e. make sure the students have an understanding of what "good code" entails and how it affects grading.

You could also have TAs select students at random and ask them to present their solution to last week's problem to the entire class. That may encourage the students to write code they are able to explain and aren't embarrassed about showing to others.

quote:What's the surgeon equivalent of badly named variable?
|
|
# ? Apr 23, 2017 16:15 |
|
Loezi posted:What's the surgeon equivalent of badly named variable?

Poorly done stitches that leave a nasty scar. Sure, it stopped the bleeding and healed up without infection, but now the patient has to stare at your rush-job for the rest of their life.
|
# ? Apr 23, 2017 16:20 |
|
My data structures course was in C and I tend to think that was a pretty bad choice since we spent most of our time debugging seg faults instead of learning the concepts.
|
# ? Apr 23, 2017 16:56 |
|
this whole train of thought is why pair programming exists and is very good (or, you know, code review). having taught, I don't really see either of those practices being super, uh, possible in an academic setting, which is unfortunate, because those are skills that are incredibly useful to have when you're starting out. if people were better at being reviewed, they'd be more likely to write reviewable code.
|
# ? Apr 23, 2017 17:04 |
|
uncurable mlady posted:this whole train of thought is why pair programming exists and is very good (or, you know, code review). having taught, I don't really see either of those practices being super, uh, possible in an academic setting which is unfortunate, because those are skills that are incredibly useful to have when you're starting out. if people were better at being reviewed, they'd be more likely to write reviewable code.

When I was in school there was a course with weekly assignments where we met up to discuss each other's solutions. The grade was split between the normal auto-grading and participation in the review. It was fairly effective.
|
# ? Apr 23, 2017 17:25 |
|
leper khan posted:When I was in school there was a course with weekly assignments that met up to discuss each other's solutions. The grade was split between the normal auto grading and participation in the review. It was fairly effective.

Everyone did individual solutions? I think that's a good way to go about it. Most places that do group projects, it seems to me, have everyone contribute to a single solution, which usually is frustrating for everyone involved because there's always that one person who tries to skate by, or is bad at version control and fucks everyone else's stuff up. Which, admittedly, is pretty much how the real world works too.
|
# ? Apr 23, 2017 17:27 |
|
The other alternative is to force the students to learn the pain of poorly-designed programs in the uni environment.

For my CS 200-level courses, we were expected to maintain a simple infix calculator program, similar to UNIX bc, through the entire semester. Each week, we were expected to add a new feature to our previously written program -- e.g. "do proper order of operations, support parentheses, support saving/loading to memory, support subroutines, etc." If you did a lovely job of writing it initially, you could expect to have at least one week where your task would turn into "rewrite this entire thing to support the new feature."

The teacher did not review the program for style (although we had a peer review session mid-semester) -- instead, he maintained a unit test suite and ran all students' apps against it. If your app crashed, you couldn't get any higher than a C for that turn-in; if it kept running but gave an incorrect answer, no higher than a B. And he was a sharp motherfucker who took great joy in making garbage parsers poo poo themselves.

At the end of the semester, our final exam was to write a paper on what we learned from having to maintain and repeatedly re-write the same product for 3 months, and compare/contrast our week-one code with our week-14 code.
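For readers who never took such a course, here is a minimal sketch of the kind of evaluator the assignment above grows week by week: recursive descent with operator precedence and parentheses. This is one common approach, not necessarily what the course used, and all names are my own; note that it does no error handling at all, which is exactly the part the instructor's "garbage input" test suite would attack.

```python
import re

def tokenize(expr):
    # Numbers and the single-character operators/parentheses.
    return re.findall(r"\d+\.?\d*|[-+*/()]", expr)

def evaluate(expr):
    tokens = tokenize(expr)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def take():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def parse_expr():          # + and - : lowest precedence
        value = parse_term()
        while peek() in ("+", "-"):
            op = take()
            rhs = parse_term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def parse_term():          # * and / : bind tighter
        value = parse_factor()
        while peek() in ("*", "/"):
            op = take()
            rhs = parse_factor()
            value = value * rhs if op == "*" else value / rhs
        return value

    def parse_factor():        # numbers and parenthesized subexpressions
        if peek() == "(":
            take()
            value = parse_expr()
            take()             # consume the closing ")"
        else:
            value = float(take())
        return value

    return parse_expr()

print(evaluate("2+3*4"))       # 14.0: * binds tighter than +
print(evaluate("(2+3)*4"))     # 20.0: parentheses override precedence
```

Each "new feature" week (memory, subroutines, bignums) stresses whether this structure was built to be extended or merely to pass last week's tests.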
|
# ? Apr 23, 2017 17:36 |
|
"ML languages and type inference are so expressive!"
|
# ? Apr 23, 2017 18:00 |
|
ullerrm posted:The other alternative is to force the students to learn the pain of poorly-designed programs in the uni environment.

What do you think about the outcomes?
|
# ? Apr 23, 2017 18:13 |
|
eschaton posted:Most sites don't need JavaScript. An ancient Mac II should be able to browse the average web site just fine. (As long as it's not using SSL anyway.)

The Raspberry Pi isn't several times the speed of a SPARCstation 20 though? One of those machines might easily be a quad core 90 MHz machine, or quad core 200 MHz with common upgrades available. The 700 MHz single core ARM CPU in an original Raspberry Pi is pretty lovely compared to that. It's hard to make direct comparisons, sure, but overall the original Raspberry Pi especially is a very slow computer.

It's also a stupid comparison anyway, as mid-90s sites typically had barely any graphics, let alone audio or video, largely because connection speeds were slow, and even if you had a fast connection, standards weren't really in place to handle them.

(Also, nah anyway: Mac IIs are way too dog slow for modern sites, even if they're mostly just text. They can also barely handle connecting to a modern cable modem, let alone using the 640x480 resolution with its 256-color limitation for real browsing. It was already difficult to impossible to use a Mac II for browsing websites 10 years ago, even ignoring the dumb Flash-based sites out there; it's hopeless now.)
|
# ? Apr 23, 2017 18:31 |
Things beyond JavaScript that also make modern websites heavy on CPU to render (at full fidelity):

- Large image files in full color, potentially with transparency as well.
- Complex text layout with support for any Unicode script, as well as the various layering and other effects you can do in CSS.
|
|
# ? Apr 23, 2017 18:43 |
|
Evil_Greven posted:This is a fascinating approach.

I thought it was pretty illuminating. When I went through it, I had to rewrite my code from scratch several times to replace the parser/lexer, replace basic types with a rational class or a bignum library, add subroutine support, etc. Each time, the overall code quality got measurably better. The emphasis on "above all else, do not crash" was nice too -- it became a point of pride that the instructor couldn't break your code this week.

Having to add features one at a time instead of "do the entire thing in one shot" felt a lot more like actual development than a mere homework assignment, especially since we were required to use a source control system as well.
|
# ? Apr 23, 2017 18:47 |
|
Out of curiosity, is it common for developers to document an alternate meaning of a structure as opposed to using a union, or do I work someplace special? For example:code:
Cue multiple copies of that comment everywhere in the code except for the 3 instances where the code breaks.
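For contrast, a union makes the dual meaning part of the type itself instead of a comment that has to be copied (and kept true) everywhere. Since the lost code sample above isn't recoverable, here is the general idea sketched with Python's ctypes, which mirrors C struct/union layout; the type and field names are invented for illustration.

```python
import ctypes

class Sample(ctypes.Union):
    # The same four bytes, with two declared interpretations.
    # No comment is needed to know 'raw' and 'scaled' share storage:
    # the type says so, and sizeof confirms the overlay.
    _fields_ = [("raw", ctypes.c_uint32),
                ("scaled", ctypes.c_float)]

s = Sample()
s.scaled = 1.0
print(hex(s.raw))                 # 0x3f800000: the IEEE-754 bits of 1.0
print(ctypes.sizeof(Sample))      # 4: both fields overlay the same storage
```

The comment-only approach stores the second meaning in prose the compiler never checks, which is precisely how you end up with "multiple copies of that comment everywhere" and three places where they're wrong.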
|
# ? Apr 23, 2017 18:55 |
|
ullerrm posted:The other alternative is to force the students to learn the pain of poorly-designed programs in the uni environment.

I was going to chime in and say this. A lot of "code smell" issues didn't really hit home for me until I had to maintain a real garbage fire of a program, where the previous devs took full advantage of the flexibility that Python gives you to make some of the worst-designed, -named, -documented, and -implemented code I have ever seen. Show that poo poo to a third-year CS student and tell them "when I call this top-level function in this 2000-line-long library, it crashes. Fix it." They will gain an appreciation for good variable names, accurate comments, proper use of namespaces, using built-in library functions instead of homegrown implementations, etc.
|
# ? Apr 23, 2017 18:56 |
|
fishmech posted:The Raspberry Pi isn't several times the speed of a SPARCstation 20 though? One of those machines might easily be a quad core 90 MHz machine, quad core 200 MHz with common upgrades available.

The base model was a 60MHz SuperSPARC, though an even slower 50MHz version was available. Measured by the BYTE UNIX benchmark, an original RPi is about 7× the speed of one of those, except for I/O, which is about a match. The Ross HyperSPARC was good, but not that good; going from a 60MHz SuperSPARC to a 125MHz HyperSPARC would get you about a 1.5× speedup, not 2×. So an SS20 with four 200MHz HyperSPARC CPUs might be faster than an RPi, but one with four 150MHz CPUs will probably just match it.
|
# ? Apr 23, 2017 19:04 |
|
nielsm posted:Things beyond JavaScript that also make modern websites heavy on CPU to render (at full fidelity):

Broadband ruined the web.
|
# ? Apr 23, 2017 20:06 |
|
Lumpy posted:Broadband ruined the web.

Web 2.0 is the worst. Every time I open a website I have to go through ScriptSafe/NoScript and add 5-10 different additional API/service providers just to get it to minimally work.
|
# ? Apr 23, 2017 20:09 |
|
idiotmeat posted:Out of curiosity, is it common for developers to document an alternate meaning of a structure as opposed to using a union, or do I work someplace special?
|
# ? Apr 23, 2017 20:29 |
|
ullerrm posted:I thought it was pretty illuminating. When I went through it, I had to rewrite my code from scratch several times to replace parser/lexer, replace basic types with a rational class or a bignum library, add subroutine support, etc. Each time, the overall code quality got measurably better. The emphasis on "above all else, do not crash" was nice too -- it became a point of pride that the instructor couldn't break your code this week.

My compilers and operating systems classes were the same way, and OS was probably my favorite class in the whole degree because of it. (I probably wouldn't've called it that on one of the nights I was having to rework my buggy-rear end, space-wasting heap implementation again so my OS could pass that week's tests, though.)
|
# ? Apr 23, 2017 20:48 |
|
Winter Stormer posted:My compilers and operating systems classes were the same way, and OS was probably my favorite class in the whole degree because of it. (I probably wouldn't've called it that on one of the nights I was having to rework my buggy-rear end, space-wasting heap implementation again so my OS could pass that week's tests, though.)

My Operating Systems prof did this, but also threw in an extra trap for morons: since it was all cumulative, if you got a higher mark on a later checkpoint or the final project, it would replace all earlier, lower marks on the project. There were people who tried to do the whole thing at once at the very end -- I'm guessing they did not do at all well.
|
# ? Apr 23, 2017 21:11 |
|
NihilCredo posted:Engineering students manage to learn the importance of stress-strain analysis without collapsing a shoddily-built bridge, and medical students don't need to go through the experience of extracting a leftover gauze from a patient's chest in order to really get why proper operating room procedures matter. They are taught the right techniques from the start, in excruciating detail, and get rigorously tested on their knowledge of those techniques.

A program that's frustrating to maintain is not an analogous level of engineering failure to a collapsing bridge. It's more like a bridge that's hard to get to from a nearby town that everyone knew was growing a lot when they put the bridge in, or a bridge where they used lovely concrete on the walkways, and now it's crumbling and maintenance crews have to come out every few months to patch it over. Those are engineering failures that notably happen with real bridges.

Even a lovely program that crashes whenever you do X isn't really like a collapsing bridge, because the crash itself might not matter, and it's entirely possible that a user can still use the program once they've learned not to do X. It's more like a highway where, if you merge onto it from a particular on-ramp, you basically have no chance of getting to a particular exit unless traffic is very light. And that is also a failure that notably happens with real highways.

The programming analogue of a collapsing bridge would be something like an SQL injection, where the failure can lead to catastrophic damage with irreversible consequences. For that, I would have to agree with your points, because while reputable sources do carefully stress the importance of the right techniques for avoiding SQL injection, our profession doesn't demand that you learn SQL programming from reputable sources, and there aren't really any professional consequences to loving it up.
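The "collapsing bridge" class of bug fits in a few lines: building SQL by string concatenation versus letting the driver bind parameters. The sketch below uses sqlite3 only because it ships with Python; the table and input are invented, and the principle is driver-agnostic.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the attacker's quote breaks out of the string literal,
# so the WHERE clause becomes "name = 'nobody' OR '1'='1'".
unsafe = "SELECT count(*) FROM users WHERE name = '%s'" % attacker_input
print(conn.execute(unsafe).fetchone()[0])   # 1 -- matches every row

# Parameterized: the input stays a plain value and is never parsed as SQL.
safe = "SELECT count(*) FROM users WHERE name = ?"
print(conn.execute(safe, (attacker_input,)).fetchone()[0])  # 0
```

The right technique is well documented everywhere, which is rather the point: nothing in the profession forces anyone to have read it.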
|
# ? Apr 23, 2017 21:29 |
|
eschaton posted:Most sites don't need JavaScript. An ancient Mac II should be able to browse the average web site just fine. (As long as it's not using SSL anyway.)

I think this is right and wrong at the same time. There are very few sites that need JavaScript, but many sites can be improved over their hypothetical non-JS versions with some amount of JS, just from a UX standpoint. The real problem is that so many sites go way beyond* that, and that's not really JavaScript's fault in the strictest sense. It's developers'/designers'/product managers' fault. I mean, you could make the argument that JS enables them to do that, but I can't think of a good version of that argument that I agree with.

* and by "beyond" I don't mean LoC. The total "amount" of JS isn't important; it's how the page performs.
|
# ? Apr 23, 2017 21:41 |
|
Loezi posted:I believe that at least in medical the field of surgery "leave a gauze in the patient's chest" in reality is a surprisingly big problem, up to a point where in a modern surgical theatre you have a dude who's only job is to ensure that that literal scenario does not happen by constantly counting stuff and writing the process down in excruciating detail. I wonder what exactly would be programming equivalent of that, statistical code analysis?

Running a site with no JavaScript is fine as long as no one's trying to grow their business with it. If they are, they're going to get annoyed at hearing "no" all the time.
|
# ? Apr 23, 2017 22:09 |
|
Thermopyle posted:I think this is right and wrong at the same time.

Bingo. My shop has just started to realize that serving all of the content through JS is a huge drag on average. New recommendation: use JS where needed instead of all the friggin' time.
|
# ? Apr 23, 2017 22:56 |
|
NihilCredo posted:Engineering students manage to learn the importance of stress-strain analysis without collapsing a shoddily-built bridge, and medical students don't need to go through the experience of extracting a leftover gauze from a patient's chest in order to really get why proper operating room procedures matter. They are taught the right techniques from the start, in excruciating detail, and get rigorously tested on their knowledge of those techniques.

I think human nature instinctively get the consequences of physical damage (and even then they still gently caress up). By comparison, the worse a shoddily built program will do 99% of the time is make someone's life miserable, usually the programmers, possibly the users too, and since jobs are dreary as it is, making your job harder is not immediately apparent. Unless your idea is to force everyone to shape up by working on life-threatening software, I don't think you can really compare the two.
|
# ? Apr 23, 2017 23:12 |
|
Thermopyle posted:
Fixed for everywhere I have ever worked / every client I have ever dealt with.
|
# ? Apr 24, 2017 00:10 |
|
Yeah, good point.
|
# ? Apr 24, 2017 00:27 |
|
Lumpy posted:Fixed for everywhere I have ever worked / every client I have ever dealt with.

Yeah, and every new hire the marketing department makes requires some other lovely analytics company's pixel and JS library be loaded.
|
# ? Apr 24, 2017 01:54 |
|
SupSuper posted:I think human nature instinctively get the consequences of physical damage (and even then they still gently caress up). By comparison, the worse a shoddily built program will do 99% of the time is make someone's life miserable, usually the programmers, possibly the users too, and since jobs are dreary as it is, making your job harder is not immediately apparent. Unless your idea is to force everyone to shape up by working on life-threatening software, I don't think you can really compare the two.

Putting that aside, I've been in the industry long enough to know that DoD-style "process" isn't going to make incompetent devs competent, but will make good devs miserable enough to find another line of work.

Gazpacho fucked around with this message at 06:11 on Apr 24, 2017 |
# ? Apr 24, 2017 05:59 |
|
Gazpacho posted:The Therac-25 was an effective cancer therapy machine, except when it wasn't.

The average student reaction to hearing about Therac-25 is probably going to be "poo poo, I better shape up my code when working with a literal cancer laser that is shot at people's heads", followed by not caring at all while working on a hot new React website for Mom'n'Pop's Improved Gizmos Inc.
|
# ? Apr 24, 2017 09:57 |
|
code:
I feel sure that if I continue to work with Oracle, it's going to involve acquiring increasing amounts of brain damage, to the point where this stuff starts to make a strange kind of sense, which I don't really want. Perhaps this is partly Windows's fault for not always being friendly to applications trying to use forward slashes instead of backslashes, but I'm drat well going to blame this on Oracle anyway, because SQL Plus has trained me to use forward slashes everywhere, since it hosed up when I tried to execute script files using backslashes in the path.
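Windows itself accepts forward slashes in most path APIs, which is presumably why Oracle's mixed-slash paths usually work anyway. Python's pathlib makes the equivalence easy to demonstrate from any OS (the path below is invented, not the actual Oracle path from the lost listing):

```python
from pathlib import PureWindowsPath

# A mixed-separator path parses fine and normalizes to backslashes.
mixed = PureWindowsPath("C:/oracle\\product/12.1.0")
print(str(mixed))  # C:\oracle\product\12.1.0

# The two spellings compare equal under Windows path semantics.
print(PureWindowsPath("C:/oracle/product") == PureWindowsPath(r"C:\oracle\product"))  # True
```

Tools that do their own string parsing (like SQL Plus treating a bare / as "run the buffer") are where the equivalence breaks down, not the filesystem.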
|
# ? Apr 24, 2017 11:05 |
|
Loezi posted:The average student reaction to hearing about Therac-25 is probably going to be "poo poo, I better shape up my code when working with a literal cancer laser that is shot at people's heads." and then not care while working on a hot new React website for Mom'n'Pop's Improved Gizmos Inc.

Some poor soul is developing a front end for a radiation therapy modality using Angular and Node.js at the very second you type these words.
|
# ? Apr 24, 2017 13:00 |
|
The whole JS / no-JS thing was embodied for me by that "joke" best-site-ever that was just text on a page. Then someone made a response to the page that used proper web fonts and better spacing etc., and the whole thing devolved into a slow-motion passive-aggressive slap fight over what was "needed" and what wasn't.

My life as a developer was infinitely easier when the backend was where the logic went and the front end just handled displaying content. But as a user, I've gotta say I much prefer it when a site pre-warns me that a form field is invalid before I send it.
|
# ? Apr 24, 2017 13:15 |
|
NtotheTC posted:My life as a developer was infinitely easier when the backend was where the logic went and the front end just handled displaying content. But as a user I've gotta say I much prefer it when a site pre-warns me that a form field is invalid before I send it.

Why can't you have both? Why is having it in the frontend that much harder?
|
# ? Apr 24, 2017 13:22 |
Hammerite posted:
The user account the process was running as did not have write access to C:\, so Windows user-account file virtualization kicked in and redirected the write to the user's personal VirtualStore directory instead. That's a feature of UAC that lets software write to locations like C:\Program Files\ and C:\Windows\ without actually touching system-global files or requiring local administrator permission. Works as intended.
|
|
# ? Apr 24, 2017 13:25 |
|
Volguus posted:Why can't you have both? Why is having it in the frontend that much harder?

It's not harder in terms of difficulty; you just have to use JavaScript to do it. And unless you enjoy pain, you're probably going to want to use at least one framework rather than write your own, and then it's too late: you're in the JS ecosystem and there's no turning back.
|
# ? Apr 24, 2017 13:27 |
|
Volguus posted:Why can't you have both? Why is having it in the frontend that much harder?

You have to have both. The frontend validation is for user convenience; the backend validation is for security and data integrity.
|
# ? Apr 24, 2017 13:48 |
Thermopyle posted:You have to have both. The frontend validation is for user convenience, the backend validation is for security and data integrity.

The actual hard part is keeping validation at both ends in sync, or ideally finding a way to use a single code base for both.
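One common way to get close to a single code base, sketched here in Python with invented field names and an invented rule format: declare the rules once as data, drive the server-side validation from that data, and serialize the very same spec for a frontend validator (which would interpret it in JS) to use.

```python
import json
import re

# The single source of truth for validation rules.
FORM_SPEC = {
    "email": {"required": True, "pattern": r"^[^@\s]+@[^@\s]+$"},
    "age":   {"required": False, "pattern": r"^\d{1,3}$"},
}

def validate(form):
    """Server-side validation driven entirely by FORM_SPEC."""
    errors = {}
    for field, rules in FORM_SPEC.items():
        value = form.get(field, "")
        if not value:
            if rules["required"]:
                errors[field] = "required"
            continue
        if not re.match(rules["pattern"], value):
            errors[field] = "invalid"
    return errors

# The exact same spec is what gets shipped to the browser's validator,
# so the two ends cannot drift apart.
client_spec = json.dumps(FORM_SPEC)

print(validate({"email": "not-an-email"}))  # {'email': 'invalid'}
print(validate({"email": "a@b.com"}))       # {}
```

This only covers rules expressible as data (required flags, patterns, ranges); anything needing a database lookup still lives server-side only, which is where the sync pain creeps back in.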
|
|
# ? Apr 24, 2017 14:20 |
|
Loezi posted:That's an interesting thought. I believe that at least in medical the field of surgery "leave a gauze in the patient's chest" in reality is a surprisingly big problem, up to a point where in a modern surgical theatre you have a dude who's only job is to ensure that that literal scenario does not happen by constantly counting stuff and writing the process down in excruciating detail. I wonder what exactly would be programming equivalent of that, statistical code analysis?

Fun fact: this is called the "charge nurse". Their actual job is to make sure that the charge is captured for everything used (you're getting charged 5bux per gauze in surgery). The whole "count it again to make sure it gets removed/thrown out" thing is basically a secondary function they've been given, because it's no good capturing all those charges if they get sued out from under the hospital afterward. In the US, at least, those nurses literally wouldn't exist if the hospital weren't permitted to bill per supply, because there'd be no financial incentive in it anymore. I'm assuming that in civilized countries with socialized medicine, having someone track supplies is instead mandated as a standard of care.
|
# ? Apr 24, 2017 15:02 |