|
We should come up with the Goon language of choice... Suggestions for names:

shitface - You only code in this language once you have downed 12 cases of Stella.
GN - This language relies on strict grammar; the compiler hassles you about misused apostrophes and incorrect contractions.
NotOnMyOS (NOMOS) - This language will not run under any operating system, as it classes them all as a waste of space; it uses hardware directly.
|
# ? Jan 26, 2015 08:14 |
|
TheresaJayne posted:We should come up with the Goon language of choice.... We already have php for that e: wait sorry, php is twelve cases of cough syrup, my mistake
|
# ? Jan 26, 2015 10:34 |
|
I thought we settled on the Butts programming language, featuring the (_|_) operator.
|
# ? Jan 26, 2015 12:13 |
|
qntm posted:I thought we settled on the Butts programming language, featuring the (_|_) operator. Does it work this way? pre:32 10 x (__|__) (_|_) x *flush* unclog sum.butt >> 42
|
# ? Jan 26, 2015 12:22 |
|
Butts? What is this, the 90s? Program in dickbutt already
|
# ? Jan 26, 2015 14:44 |
|
TheresaJayne posted:NotOnMyOS (NOMOS)- this language will not run under any operating system as it classes them all as waste of space, uses hardware directly Otherwise known as embedded C?
|
# ? Jan 26, 2015 19:37 |
|
feedmegin posted:Otherwise known as embedded C? Nah, you can trick C into running in a unit test environment. I want a language runtime that catastrophically fails if it detects an OS, à la checking IDT bits to find a VM. JawnV6 fucked around with this message at 23:26 on Jan 26, 2015 |
# ? Jan 26, 2015 20:01 |
|
Here's a piece of awesome code that only contributed to the pain that was utf8 mysql databases with latin1 connections:code:
(Also, latin1 is traditionally ISO 8859-1, but MySQL uses Windows cp1252.) (gently caress character sets for life)
|
# ? Jan 26, 2015 22:18 |
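The mechanism behind that pain can be sketched in a few lines of Python; this is a generic illustration of the utf8-table-behind-a-latin1-connection failure mode, not the code the post refers to (and per the post above, MySQL's "latin1" is really Windows cp1252):

```python
# UTF-8 bytes read back over a connection that claims latin1 (really
# cp1252 in MySQL's case): each multi-byte character turns into debris.
original = "café"
stored = original.encode("utf-8")       # what actually sits in the table
mojibake = stored.decode("cp1252")      # what a latin1 connection shows you
print(mojibake)                         # cafÃ©

# The cruel part: re-encoding the debris the same way round-trips
# cleanly, so the corruption can hide until someone "fixes" the charset.
assert mojibake.encode("cp1252").decode("utf-8") == original
```

The round-trip property is why these databases limp along for years before anyone notices.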
|
ErIog posted:gently caress it, someone should really just write a bot that reposts Jonathan Blow's twitter feed directly into this thread.

Regardless of whether or not the language is good, or whether or not his games are good, Jonathan Blow is That Guy. The one who desperately wants to be a Real Programmer (in the vein of Mel). He thinks he can't do what he does, or that something bad will happen, unless he has perfect control over everything and uses only zero-cost abstractions, and blah blah.

He might have some sort of point if he were a core engine developer for Unreal or Crytek, but it starts to become silly when you look at actual semi-respected game programmers like John Carmack talking about how they kind of dig Common Lisp and Haskell and wish they had more time to learn them. What has Blow done? Oh, right, a decent platformer and an admittedly pretty adventure game.

There are certainly very peculiar and stringent concerns when it comes to writing real-time simulations. That's not in question. However, those are already being addressed in the places they matter most. Things like DX12, Mantle, and the next OpenGL are being designed to give better control over GPU device memory. That's well and good. He, on the other hand, always strikes me as that one guy on your project who writes a bloated mess and insists on using his own rewrite of half the standard library to "be faster" without actually benchmarking or profiling.
|
# ? Jan 27, 2015 05:43 |
|
GrumpyDoctor posted:It gives idiots a wrong answer to regurgitate on StackOverflow when someone asks how to kludge multiple dispatch in languages without multiple dispatch.
|
# ? Jan 27, 2015 06:20 |
|
ExcessBLarg! posted:So what's the right answer? Besides "don't". Use the strengths of your language rather than trying to kludge some feature from another language into it.
|
# ? Jan 27, 2015 06:45 |
|
ExcessBLarg! posted:So what's the right answer? Besides "don't". The one time I personally ran into it the language in question was C#, and I've since learned that you can use dynamic to do it passably.
|
# ? Jan 27, 2015 08:04 |
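The kludge being discussed generalizes beyond C#'s dynamic. A hedged sketch of the usual shape in Python (names here are illustrative, not from any library): dispatch on the runtime types of both arguments via a registry.

```python
# Minimal multiple-dispatch registry: handlers keyed on the exact
# runtime types of both arguments.
_handlers = {}

def register(ta, tb):
    """Decorator that files a handler under the type pair (ta, tb)."""
    def deco(fn):
        _handlers[(ta, tb)] = fn
        return fn
    return deco

def visit(a, b):
    """Look up a handler by the concrete types of both arguments."""
    fn = _handlers.get((type(a), type(b)))
    if fn is None:
        raise TypeError(f"no handler for ({type(a).__name__}, {type(b).__name__})")
    return fn(a, b)

@register(int, str)
def _repeat(a, b):
    return b * a        # visit(3, "ab") -> "ababab"

@register(str, int)
def _repeat_flipped(a, b):
    return a * b
```

Exact-type matching means subclasses fall through to the TypeError, which is part of why this counts as a kludge rather than real multiple dispatch.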
|
Jsor posted:Regardless of whether or not the language is good, or whether or not his games are good, Jonathan Blow is That Guy. It's doubly weird compared to Mike Bithell's talks about the games he's made and his process, which basically boil down to "loving make something. Does it work well enough? Good enough. Does it not? Keep working on it until it does." I think part of it is how they see themselves. It's a lot easier to say "Yeah, the code for TWA was kinda poo poo, I feel bad for the folks I paid to port it" if you're not the kind of person who feels that every aspect of their projects define them and their worth as a human being. I feel like the majority of the code we see in this thread is written by people with too much of their self worth invested in this code, that they're going to make it work all by themselves, dammit, that their ideas are right, that it is the requirements that are wrong. Unskilled and too afraid to admit it and ask for a loving code review or a pair.
|
# ? Jan 27, 2015 08:24 |
|
Storysmith posted:It's doubly weird compared to Mike Bithell's talks about the games he's made and his process, which basically boil down to "loving make something. Does it work well enough? Good enough. Does it not? Keep working on it until it does."

The crazy thing is that Jonathan Blow's stated goal is not to make it that much easier for independent programmers but to make life better for people making triple-A games. However, his background is in small teams where he is either the sole programmer or one of very few. He's done consulting on larger titles before, but who knows what that involved. He has these fantasies that this language is going to revolutionize how people make games, but he's ignoring what the real roadblocks to game development are. Products like GameMaker, Unity, and UE4 have done more to revolutionize the game industry than his language ever will.

I know, it's crazy. There are really successful games that were made in languages that used GC, and nobody really cared. Binding of Isaac was made in Flash, which was a horror, but that game went on to sell literally millions of copies and be enjoyed by tons of people.

Even if he gets his language to a somewhat usable state, I doubt the majority of the people who touch it are going to want to dick around writing their own display code so they can make their small-scale game. I also doubt the existing engine programmers for the commonly used engines are going to want to go through the pain of switching over. There was Quake 1 code floating around in engines until only a few years ago. Western developers have been pretty proactive about using off-the-shelf middleware and engines. They all come with caveats, but devs seem to enjoy the basic efficiencies that come with that stuff. This thing is apparently gonna give you blowjobs while you code in it because it's so efficient and awesome, but you're probably going to be spending a bunch of time writing file format loaders and reinventing the wheel.

On the other hand, you could load up even the pile of poo poo that Unity is rapidly becoming and have a prototype going in like a weekend if you're familiar with it. He seems to think of games programmers as an end unto themselves instead of the necessary cost outlay that needed to happen in order for designers and producers to put something together. This goes back to your point that all of Blow's talk seems oddly divorced from what game development actually is in 2014. It's like if he were devoting his time to developing a really great movie editing process that relies on cutting physical negatives while financially lucrative movies are being shot on what we would have called camcorders 15 years ago. He's dicking around in the technical weeds rather than focusing on what the actual goals of game development are. ErIog fucked around with this message at 09:02 on Jan 27, 2015 |
# ? Jan 27, 2015 08:58 |
|
The real problem is, a lot of game code is finely tuned over years. I've made a couple of games, but I mostly like toying with engine tech. They're pretty poo poo, but I learn a lot. I tend to work in languages without big game codebases (formerly Go, now Rust), and while I've managed to get simple things like graphics-oriented matrix math on par with known quantities like GLM, getting something like a physics engine to Bullet or Havok levels takes years of profiling, tuning, learning from mistakes, and reading.

Same thing with all the little special cases that run through graphics programming: hardware instancing, CSG tools, and so on. A lot of that stuff is probably a one-week project if you get the gist going into it, but getting it all written in a sane, performant manner is going to take a long time no matter what.

Like, even if this language is totally awesome, game developers are much better served by a black-box function call, magically implemented by someone else, that exploits cache locality and the new super fast, easy-to-tune constraint-based physics method described in [Dickbutt, 2013] than they are by having the power to manually play with the motherboard fan controller but still needing a year to reimplement known algorithms. Linear Zoetrope fucked around with this message at 09:50 on Jan 27, 2015 |
# ? Jan 27, 2015 09:44 |
|
Jsor posted:The real problem is, a lot of game code is finely tuned over years. I've made a couple of games, but I like toying with engine tech mostly. They're pretty poo poo, but I learn a lot. I tend to work in languages without big game codebases (formerly Go, now Rust) and while I've managed to get simple things like graphics-oriented matrix math on par with known quantities like GLM, getting something like a physics engine to Bullet or Havok levels takes years of profiling, tuning, learning from mistakes, and reading. Same thing with all the little special cases that run graphics programming. Hardware instancing, CSG tools, and so on. A lot of that stuff is probably a 1 week project if you get the gist going into it, but getting it all written in a sane, performant manner is going to take a long time no matter what. This doesn't only apply to games programming. There's a reason why things like scipy are basically fancy wrappers for FORTRAN routines.
|
# ? Jan 27, 2015 10:11 |
|
Jsor posted:The real problem is, a lot of game code is finely tuned over years. I've made a couple of games, but I like toying with engine tech mostly.

Right, and Blow's experience is mostly on set-top boxes where he has to reinvent the wheel; in very small teams where nobody cares if he reinvents the wheel as long as it's not a giant obvious waste of time; and in consulting to optimize specific pieces of larger codebases, where one programmer essentially has ownership of the black-box function they're writing. At no point is his experience at the global project level, with multiple team members, in an environment that has to hit a deadline 18 months out. So all the stuff he's talking about does kind of make sense in his own bubble, but falls apart when you start trying to generalize it to games programming as a whole. It's a kind of solipsism.
|
# ? Jan 27, 2015 12:14 |
|
Soricidus posted:There's a reason why things like scipy are basically fancy wrappers for FORTRAN routines. The reason people wrap Fortran routines is not simply because somebody already wrote and carefully optimized LAPACK in Fortran. Fortran is a fundamentally more suitable language for writing these routines than C, or indeed most more modern languages. It's much easier for compilers to perform alias analysis on Fortran than it is for C, allowing more aggressive loop kernel optimizations (among other things), and we don't have anything close to a mature, portable systems programming language that could replace C without suffering from these shortcomings. Technology has genuinely stagnated in this area, and as single-core performance plateaus and hardware manufacturers start considering more exotic architectures there will be ever more pressure to reconsider these foundations.
|
# ? Jan 27, 2015 15:52 |
|
Internet Janitor posted:Fortran is a fundamentally more suitable language for writing these routines than C, or indeed most more modern languages. It's much easier for compilers to perform alias analysis on Fortran Fortran? The language where everything is global and you can redefine a number literal with a SWAP procedure?
|
# ? Jan 27, 2015 18:59 |
|
Suspicious Dish posted:Fortran? The language where everything is global and you can redefine a number literal with a SWAP procedure? Fortran, which doesn't traditionally have a pointer type, meaning you can assume that variable a isn't going to be modified by someone doing b = &a; *b = 42. For optimisation within a function this is way more important.
|
# ? Jan 27, 2015 19:44 |
|
Suspicious Dish posted:Fortran? The language where everything is global and you can redefine a number literal with a SWAP procedure? I think you're a little confused about F-languages!
|
# ? Jan 27, 2015 20:10 |
|
I thought all values in Fortran were pass-by-reference.
|
# ? Jan 27, 2015 20:12 |
|
Suspicious Dish posted:you can redefine a number literal with a SWAP procedure? Is that a thing in modern fortrans anymore? ("modern" == "77 or later")
|
# ? Jan 27, 2015 20:28 |
|
Suspicious Dish posted:I thought all values in Fortran were pass-by-reference. They are, but Fortran has a strong aliasing rule on parameters: if the same object is passed by reference to two separate parameters, or passed by reference in a way that the original object is accessible to the subprogram, you aren't allowed to modify it through either reference (or else it's undefined behavior). Fortran 90 added pointers, but I think they're still pretty restricted about what they can randomly alias; this is still mostly hearsay, though, I haven't really studied the Fortran specs. Similarly, you can pass a constant as a by-reference argument, but if you do, you aren't allowed to assign to that parameter (or else it's undefined behavior). Fortran 90 made this type-checkable; a parameter that's explicitly INTENT(OUT) or INTENT(INOUT) can't be passed a constant, and a VALUE parameter is, actually, passed by value. There may also be a language rule somewhere that says you copy-in constants by default; that would be a sensible fix, but again, I can't claim to be an expert. rjmccall fucked around with this message at 20:39 on Jan 27, 2015 |
# ? Jan 27, 2015 20:35 |
|
Suspicious Dish posted:Fortran? The language where everything is global and you can redefine a number literal with a SWAP procedure? That was an old bug that actually happened (or, rather, wasn't disallowed) in old Fortran compilers, but it was fixed by Fortran 77 at the latest. Fortran's aliasing rules are what make it able to optimize so well. I think C can actually get similar speed by compiling with -fstrict-aliasing (on GCC), but nobody does. I wonder if Rust can get to Fortran speeds in a hypothetical future, since it has similar assumptions about ownership and mutability. I think it can in principle, but since Fortran was written by and for engineers and scientists, it had a focus on this stuff; I doubt Rust's authors are interested in doing the kind of analysis necessary for high-performance numerical computing (but then, it is open source, so maybe someone else is). Linear Zoetrope fucked around with this message at 21:05 on Jan 27, 2015 |
# ? Jan 27, 2015 20:59 |
|
From my newspaper's website: Does Ruby (RoR?) always spit out so much code with errors? Also, wouldn't the proper way to handle error messages in production be to log it and fail silently/show minimal data? I'd personally consider so much error info a security horror. Some really big newspapers use this same core website as well, so I wouldn't be surprised if the info here applies to those as well.
|
# ? Jan 28, 2015 00:59 |
|
Knyteguy posted:From my newspaper's website: Looks like a misconfigured web server that's serving up source code instead of executing it. An administration horror, if you will. There's also a lesser coding horror: using CGI in 2015.
|
# ? Jan 28, 2015 01:10 |
|
Jsor posted:I doubt Rust's authors are interested in doing the kind of analysis necessary for high performance numerical computing (but then, it is open source so maybe someone else is). So far they're all like "let's hope we can eventually pass all our aliasing info on to llvm for use with future, more advanced optimization passes", I think right now they aren't doing any optimizations themselves.
|
# ? Jan 28, 2015 01:44 |
|
Movac posted:There's also a lesser coding horror: using CGI in 2015. More horrific, a CGI script ostensibly intended to run as a server-side include that uses wget to write to a temp file on every invocation. quote:FIXME: Temporary override for Net::HTTP, until I can figure out why it causes an Internal Server Error on Pluck... weird.
|
# ? Jan 28, 2015 03:42 |
|
Knyteguy posted:From my newspaper's website: Yeah this has literally nothing to do with Ruby or RoR. It's like they File.read a .rb file or something. edit: and that is some awful code getting spit out. Gross.
|
# ? Jan 28, 2015 05:15 |
|
Knyteguy posted:From my newspaper's website: I like the comment in the middle that appears to say "remember to not insert the text here"
|
# ? Jan 28, 2015 07:31 |
|
I just wish that instead of BASIC the early Z80s had ALL used Forth. I still remember the amazement when I came across the Jupiter Ace, as well as, later on, the MUF programming language.
|
# ? Jan 28, 2015 08:19 |
|
Jsor posted:I think C can actually get similar speed by compiling with -fstrict-aliasing (on GCC), but nobody does. Nobody does because GCC has defaulted to it at -O2 and -Os for many years anyway. You'll find a lot of code turning it off, though, because most programmers can't write correct C and in most cases don't think they're supposed to either, so they rant like this: http://www.mail-archive.com/linux-btrfs@vger.kernel.org/msg01647.html Apple's GCC build turns it off by default. MSVC doesn't do it at all.
|
# ? Jan 28, 2015 10:08 |
|
Movac posted:There's also a lesser coding horror: using CGI in 2015. Is there an easier way to just quickly get something online? It's an additional advantage that I can just write my stuff in just about any language. (I don't really do webdev, but sometimes I want to put a quick hack online.)
|
# ? Jan 28, 2015 12:31 |
|
Athas posted:Is there an easier way to just quickly get something online? It's an additional advantage that I can just write my stuff in just about any language. Pretty much every language has a light-weight web framework at this point. e: or hell, use a heavy framework
|
# ? Jan 28, 2015 13:02 |
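For the "quickly get something online" case, even the standard library is enough; a minimal sketch (the path handling and response body are placeholders, and this is nowhere near a production server):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class QuickHack(BaseHTTPRequestHandler):
    """Serve one hard-coded response; enough for a quick hack."""

    def do_GET(self):
        body = b"hello from a quick hack\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the console quiet

# To actually serve it:
# HTTPServer(("127.0.0.1", 8000), QuickHack).serve_forever()
```

No CGI process fork per request, no framework to install, and the handler can be fleshed out in whatever direction the hack needs.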
|
Skuto posted:You'll find a lot of code turning it off though, because most programmers can't write correct C and in most cases don't think they are supposed to either, so they rant like this: http://www.mail-archive.com/linux-btrfs@vger.kernel.org/msg01647.html Unsurprising to see Torvalds advocating for cleansing and eugenics.
|
# ? Jan 28, 2015 14:11 |
|
KernelSlanders posted:Pretty much every language has a light-weight web framework at this point. But don't they just use CGI underneath? I haven't kept up that well with trends in Python web frameworks but I thought CGI was the way to get it working on Apache since the native module died.
|
# ? Jan 28, 2015 17:06 |
|
Munkeymon posted:But don't they just use CGI underneath? I haven't kept up that well with trends in Python web frameworks but I thought CGI was the way to get it working on Apache since the native module died. Nope, frameworks these days often act as web servers themselves. CGI is really dumb and slow and nobody uses it these days.
|
# ? Jan 28, 2015 17:12 |
|
So I got into a debate about silly 'design patterns', and I thought I'd share two of my favorite Java classes. One is AbstractSingletonProxyFactoryBean (and I still don't know what you'd use it for), and the other is RequestProcessorFactoryFactory (which is, technically, a factoryfactoryfactory).
|
# ? Jan 28, 2015 17:28 |
|
pigdog posted:Nope, frameworks these days often act as web servers themselves. CGI is really dumb and slow and nobody uses it these days. Turns out I was thinking of mod_wsgi anyway, which I guess the cool kids have abandoned for gunicorn or uWSGI. Neat!
|
# ? Jan 28, 2015 18:19 |
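The interface all of those servers speak (mod_wsgi, gunicorn, uWSGI) is WSGI, which is small enough to show whole. A minimal application callable, as a sketch rather than anything framework-specific:

```python
def application(environ, start_response):
    """Smallest useful WSGI app: a callable taking the request environ
    and a start_response function, returning an iterable of byte strings.
    Servers host this directly; no per-request process fork like CGI."""
    body = ("hello from " + environ.get("PATH_INFO", "/")).encode("utf-8")
    start_response("200 OK", [
        ("Content-Type", "text/plain; charset=utf-8"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```

Run it with something like `gunicorn module:application`, or with the stdlib's wsgiref.simple_server for local testing.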