|
pigdog posted:From Notch's perspective, counterintuitively, yes. It proves, with cold hard cash to back it up, that the faults in his game and coding style are relatively insignificant.

I am not Notch. I am not speaking for Notch. I am speaking for myself. I don't give a drat whether or not he's making money off of it, it doesn't affect me one bit. I'm a lot more interested in theory than in making money anyway.

quote:Ultimately you don't code something like this for coding's own sake. Succeeding in writing software that people would buy is a higher level goal than writing perfect code. The latter is ultimately the means to attain the former.

Again, this is extremely presumptuous. Writing "perfect code" is about more than just becoming profitable or "successful". It's also a way to improve your knowledge and skills. Sometimes I'll put in 5 hours on a school project that I could cobble together in 1 hour. Sure, I would probably get the same grade, but doing that won't make me any better of a programmer. If he's just out to make money, then fine, that's what matters to him, but that's not what this thread is about. This is not "Software Success: Post the products that make you laugh (or smile)". I'm not even being glib. You're missing the whole point of this thread.
|
# ? Apr 9, 2012 07:42 |
|
|
|
The whiteknight squad comes out whenever anyone mentions minecraft in any thread in any subforum, frankly it ought to get 1) banned from this thread and 2) sequestered to its own subforum. Didn't furries get obnoxious a few years back to the point they were quarantined? Do that again.
|
# ? Apr 9, 2012 07:42 |
|
Jewel posted:But this argument is like saying "Well I'm the owner of a candy company. We make the cheapest shittiest candy you can buy, but because it's so cheap, children buy it even though it tastes bad, because they can't tell the difference!"

I'm willing to let it slide and not outright say he's an awful developer since he originally didn't have these grand plans for the game, but it doesn't make bad code any less bad (though at a smaller scale, maybe the problems wouldn't be an issue).
|
# ? Apr 9, 2012 07:44 |
|
Look Around You posted:I am not Notch. I am not speaking for Notch. I am speaking for myself. I don't give a drat whether or not he's making money off of it, it doesn't affect me one bit. I'm a lot more interested in theory than in making money anyway.

On the contrary, I'm perfectly aware of what this thread is about. Minecraft's success story, in my opinion, augments it brilliantly. Developers may get so caught up in quality and gold-plating that they don't realize, or forget, what's actually important: for the code to provide some value for someone. Personally, I'm totally obsessed with writing clear, optimal and bulletproof code with test-driven methodologies, but there's no way to argue with Minecraft's success. Something like Skyrim has been and has to be lovingly, skillfully and painstakingly crafted, but at the other end, no one would gold-plate or test-drive a shell script they would only use once. Notch has found his own sweet spot between speed, skill and quality, and even if it's not quite as well made as Skyrim or not quite as hacked together as a single-use shell script, he has succeeded and thus done it right in his own way.
|
# ? Apr 9, 2012 07:56 |
|
Hey guys, how's this for a coding horror:code:
|
# ? Apr 9, 2012 08:05 |
|
pigdog posted:On the contrary, I'm perfectly aware of what this thread is about. Minecraft's success story in my opinion augments it brilliantly. Developers may get too caught up in quality and gold-plating that they don't realize, or forget, what's actually important: for the code to provide some value for someone. Personally, I'm totally obsessed with writing clear, optimal and bulletproof code with test-driven methodologies, but there's no way to argue with Minecraft's success. Something like Skyrim has been and has to be lovingly, skillfully and painstakingly crafted, but from the other end, noone would gold-plate or test-drive a shell script they would only use once. Notch has found his own sweet spot between speed, skill and quality, and even if it's not quite as well made as Skyrim or not quite as hacked together as a single-use shell script, he has succeeded and thus done it right in his own way.

The thing is that nobody was detracting from his product's success. At all. Nobody here has argued that Minecraft wasn't a successful product. There's a difference between a product's success and its technical correctness, and I think that most people posting in this thread realize that. I mean, I personally prefer spending my time perfecting my skills, learning new things and hopefully advancing the field eventually. Ideally I'll end up in academia, but that's a long way off. Honestly I don't care if I ever get rich; I'd rather advance the field and my own knowledge.

e: I recognize that I'm a bit uncommon and a bit of an idealist though.
|
# ? Apr 9, 2012 08:06 |
|
My kingdom for a new LOL PHP error.
|
# ? Apr 9, 2012 08:57 |
|
Here's an oldie but goodie.

Aristotle Palagaltzis posted:This is a tale of an integer overflow vulnerability (paraphrased for the purposes of the tale, as are all following snippets):
|
# ? Apr 9, 2012 09:48 |
|
Otto Skorzeny posted:Here's an oldie but goodie

The best part is when you remember that sizeof(char) == 1. Everywhere. If it isn't, then you're not actually coding in C.
|
# ? Apr 9, 2012 09:57 |
|
Jabor posted:The best part is when you remember that sizeof(char) == 1. Everywhere. If it isn't, then you're not actually coding in C. That's a style thing. Let's not start that again...
|
# ? Apr 9, 2012 10:13 |
|
Jabor posted:The best part is when you remember that sizeof(char) == 1. Everywhere. If it isn't, then you're not actually coding in C.

hobbesmaster posted:That's a style thing. Let's not start that again...

I like the (size * charsize) <= 0 test, purely because I can't figure out what edge-case it solves.

The Gripper fucked around with this message at 10:19 on Apr 9, 2012 |
# ? Apr 9, 2012 10:14 |
|
The Gripper posted:I like the (size * charsize) <= 0 test, purely because I can't figure out what edge-case it solves.

I wasn't looking at the bottom one, I blocked it out of my mind at first glance so was just looking at the calloc call. I can't even pretend to understand how that would make sense.
|
# ? Apr 9, 2012 10:30 |
|
hobbesmaster posted:That's a style thing. Let's not start that again...

It's a style thing in a malloc (a poor style, incidentally, although for other reasons), but not here. If you look up the commit, they actually thought they were making the code portable to systems where sizeof(char) != 1. That is to say, systems guaranteed not to exist.
|
# ? Apr 9, 2012 10:32 |
|
hobbesmaster posted:I wasn't looking at the bottom one, I blocked it out of my mind at first glance so was just looking at the calloc call. I can't even pretend to understand how that would make sense.

I didn't even notice the calloc, so I'll freely admit that doing it there is an ok style choice!
|
# ? Apr 9, 2012 10:37 |
|
The Gripper posted:I didn't even notice the calloc, so I'll freely admit that doing it there is an ok style choice!

Since it's come up twice now, I'll disagree. Doing foo = malloc(n * sizeof(*foo)) or foo = calloc(n, sizeof(*foo)) always makes more sense than foo = malloc(n * sizeof(whatever_type_*foo_is)).
|
# ? Apr 9, 2012 10:43 |
|
sizeof(var) can be better than sizeof(var_type), just in case var someday changes type and no one remembers to update that sizeof. (PS that guide is pretty useful in general)
|
# ? Apr 9, 2012 12:22 |
|
Strong Sauce posted:My kingdom for a new LOL PHP error.

Why not learn how to handle Unicode in PHP? http://www.phpwact.org/php/i18n/utf-8
|
# ? Apr 9, 2012 12:40 |
|
Otto Skorzeny posted:foo = malloc(n * sizeof(*foo))

assuming no integer overflow
|
# ? Apr 9, 2012 12:40 |
|
tef posted:assuming no integer overflow

Let's do it the PHP way! code:
|
# ? Apr 9, 2012 13:24 |
|
Otto Skorzeny posted:Here's an oldie but goodie

So why didn't even one of them just come up with (what seems to me to be the obvious and sane version of) code:
EDIT: vvv but there's no division happening here at run-time anywhere? it's all compile-time which even a straightforward and simple compiler should be able to optimize away during compile-time constant folding vvv PrBacterio fucked around with this message at 18:16 on Apr 9, 2012 |
# ? Apr 9, 2012 18:00 |
But division is slooooooowww!!
|
|
# ? Apr 9, 2012 18:05 |
|
Thanks for reposting that php thing, the link to the writeup in the post I had bookmarked apparently went dead.
|
# ? Apr 9, 2012 18:18 |
|
PrBacterio posted:EDIT: vvv but there's no division happening here at run-time anywhere? it's all compile-time which even a straightforward and simple compiler should be able to optimize away during compile-time constant folding vvv

There wouldn't be half as much crappy code floating around if people recognized when their 'optimizations' were absolutely pointless.
|
# ? Apr 9, 2012 18:27 |
|
I just came across this:code:
code:
|
# ? Apr 9, 2012 18:38 |
|
It's not necessarily unhelpful to define a value for zero even if you don't expect to use it. For example, assuming you're dealing with Objective-C, instance variables are initialized as zero, so you could easily see a value of 0 for something declared as that enum. I'm with you in not wanting 0 to be a legal non-falsy value though, and making the first in the list = 1 does that.
|
# ? Apr 9, 2012 19:11 |
|
The Dunning-Kruger effect. That is my only explanation for this stuff.

Anyone is entitled to write crappy code. We all do it when prototyping something or when the schedule demands we get a product out the door. When a good programmer gets reports of his code running slowly or realizes he can't add a feature due to a bad design decision... he fixes the code. He runs a profiler on it. He tries to research better algorithms. Especially when that person makes millions of dollars and now has his or her full time to devote to the product and the ability to hire additional developers. Code's history is excusable. Its present condition is not.

That's what gets me: people who don't just write crappy code - they are proud of it and continue to actively avoid learning a better way of doing things. The only explanation is that their incompetence is so pronounced they are unable to recognize just how incompetent they are.

edit: removed derail to focus on the real point

Simulated fucked around with this message at 20:30 on Apr 9, 2012 |
# ? Apr 9, 2012 19:15 |
|
Shut up about notch/Minecraft.
|
# ? Apr 9, 2012 19:37 |
|
Ender.uNF posted:People who don't just write crappy code - they are proud of it and continue to actively avoid learning a better way of doing things. The only explanation is that their incompetence is so large they are unable to recognize just how incompetent they are.

That's my last job in a nutshell. I had conversations along the following lines many times:

Other Developer: Why did you refactor my code to do <thing that's a best practice>?
Me: Because <thing> is a best practice for <reasons>.
OD: But we don't do <best practice> anywhere else! I'm going to change it back.
Me: *blank stare, begins mentally composing letter of resignation*

Meanwhile, if the roles were reversed, the conversation would go like this:

Me: Why did you refactor my code to do <thing that's a best practice>?
Other Developer: Because <thing> is a best practice for <reasons>.
Me: Ooh, that's really good to know, thanks! I'll keep that in mind in the future.
|
# ? Apr 9, 2012 19:39 |
|
Ithaqua posted:Other Developer: Why did you refactor my code to do <thing that's a best practice>?

This may be a case of the developer caring about "his" code too much. Devs getting possessive has been the #1 obstacle I've encountered when trying to do non-trivial scale refactoring, even trumping management deadline fuckery.
|
# ? Apr 9, 2012 22:29 |
|
You are not your code
|
# ? Apr 9, 2012 22:59 |
|
tef posted:assuming no integer overflow

Well, the compiler probably can (repeat after me: signed integer overflow in C is undefined...)
|
# ? Apr 9, 2012 23:46 |
|
McGlockenshire posted:This may be a case of the developer caring about "his" code too much.

I don't think it was out of ownership. I was the youngest developer on the team by decades, and I incorrectly thought that being around older devs would mean that they had a lot to teach me. Instead, I was the only person with a current skill set and exposure to modern software development practices, and they were dismissive of my suggestions because I'm "young" (by which I mean "barely under 30", with 8 years of professional experience).

There definitely were some areas where there was "ownership," of course. I suggested phasing out the horrible database-and-reflection-driven abomination they were using for XML generation in favor of simple serialization and was dismissed by the guy that wrote the monstrosity because it wasn't "flexible". Of course, implementing even a simple XML schema took weeks of loving around creating database rows, and generating the XML was so slow that processing 100 records took 5 minutes (and we were working with 10,000+ record data sets), but it was flexible.
|
# ? Apr 10, 2012 00:02 |
|
I forget if I've pasted this one before: Click "Comments", and then scroll down a bit. Here's something else by him.
|
# ? Apr 10, 2012 00:47 |
|
code:
|
# ? Apr 10, 2012 00:52 |
|
So a coding decision I made months ago is starting to drive me insane. I was writing a decoder for a certain messaging protocol and decided to interpret str(1)'s as chars. I can't count how many times a bug has been tracked down to a mistake that's essentially the following:

quote:if (x == 1) {...}

quote:if (x == '1') {...}

I'm not sure if the horror is that the implicit cast is legal, or that I decided to use a "weak type" for such a critical data structure. Either way, this has caused all sorts of fun bugs, especially since I'm no longer the only one working on this codebase. I should probably refactor before it's too late . . .
|
# ? Apr 10, 2012 01:07 |
|
PHP's rand() function seems to have some pretty obvious patterns.
|
# ? Apr 10, 2012 02:08 |
|
Nippashish posted:PHP's rand() function seems to have some pretty obvious patterns.

FWIW, I'm pretty sure that's only PHP on Windows. On Mac OS X the sample code Bo Allen posted displays a bitmap without any real patterns. I haven't tested it on Linux, but it looks like other people have and it appears to be Windows/PHP specific.
|
# ? Apr 10, 2012 02:38 |
|
Yeah, the linked article discusses that. It's due to PHP's rand function just being a wrapper for the system's random function.
|
# ? Apr 10, 2012 02:50 |
|
Perl, Ruby and Python all expose the same flawed rand by default, as does, say, C. This has been a known problem for decades. Don't use rand for anything other than trivial matters.
|
# ? Apr 10, 2012 02:55 |
|
|
|
Linux and OSX - at least - long ago either upgraded rand() itself or made it an alias for random(), which while not crypto-quality isn't laughably and obviously broken as rand() is. Windows still uses the old, terrible implementation, though.
|
# ? Apr 10, 2012 03:14 |