|
Suspicious Dish posted:I wonder if Intel publishes "here's nops for all byte sizes that matter" which their processors will try to handle faster than other nop instructions. IA-32 Intel Architecture Optimization Reference Manual, page 2-81, from 2004 posted:NOPs They probably have more recent recommendations.
|
# ? Apr 4, 2013 22:37 |
|
|
# ? May 28, 2024 14:51 |
|
rjmccall posted:As I read the language spec, Java is actually required to constant-fold concatenations that involve only constant strings and primitives, so "abc" + 4 * 5 + "def" is required to be treated exactly like "abc20def". This seems likely; I know that C# has the same requirement.
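For anyone curious, the folding is easy to observe, since compile-time constant String expressions are also interned (class name here is just for illustration):

```java
public class FoldDemo {
    public static void main(String[] args) {
        // A constant expression: javac folds this at compile time, and the
        // folded result is interned, so it is the *same* object as the literal.
        final String folded = "abc" + 4 * 5 + "def";
        System.out.println(folded == "abc20def");      // reference equality
        System.out.println(folded.equals(\u0022abc20def"));
    }
}
```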
|
# ? Apr 4, 2013 22:50 |
|
One other thing is that 0x90 is also a true nop on amd64: xchg ebx, ebx has the net effect of clearing the upper 32 bits of rbx, while 0x90 doesn't affect rax at all.
|
# ? Apr 4, 2013 23:27 |
|
rjmccall posted:They probably have more recent recommendations. ...And that answers my question. quote: Because NOPs require hardware resources to decode and execute, use the least number of NOPs to achieve the desired padding.
|
# ? Apr 4, 2013 23:44 |
|
rjmccall posted:As I read the language spec, Java is actually required to constant-fold concatenations that involve only constant strings and primitives, so "abc" + 4 * 5 + "def" is required to be treated exactly like "abc20def". Yeah, I could have been wrong about that. Maybe it made a new StringBuffer for every + operation? I just remember that people used to freak out about seeing string concatenation with a + because of all the unnecessary objects it would create.
|
# ? Apr 5, 2013 00:01 |
|
Goat Bastard posted:Yea I could have been wrong about that. Maybe it made a new StringBuffer for every + operation? I just remember that people used to freak out about seeing string concatenation with a + because of all the unnecessary objects it would create. Well, the compiler definitely doesn't make much effort to be clever about avoiding new StringBuffers. For example, the bytecode generated for this is awful: Java code:
|
# ? Apr 5, 2013 00:12 |
|
rjmccall posted:They probably have more recent recommendations. Intel® 64 and IA-32 Architectures Optimization Reference Manual, June 2011 posted:1-byte: XCHG EAX, EAX
|
# ? Apr 5, 2013 00:54 |
|
rjmccall posted:Well, the compiler definitely doesn't make much effort to be clever about avoiding new StringBuffers. For example, the bytecode generated for this is awful: Yeah, using + on strings isn't really all that bad, because the compiler can mostly take care of it ... except when done in a loop.
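A minimal sketch of the loop case (illustrative class name; the bytecode comment describes classic javac, which desugared + to StringBuilder, or StringBuffer in older compilers):

```java
public class ConcatDemo {
    public static void main(String[] args) {
        // Each iteration of the += loop compiles to roughly:
        //   s = new StringBuilder().append(s).append(i).toString();
        // i.e. a fresh builder and a fresh String on every pass.
        String s = "";
        for (int i = 0; i < 5; i++) {
            s += i;
        }

        // Hoisting one StringBuilder out of the loop avoids the churn.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 5; i++) {
            sb.append(i);
        }

        System.out.println(s.equals(sb.toString()));  // both are "01234"
    }
}
```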
|
# ? Apr 5, 2013 01:11 |
|
Just found this: C# code:
|
# ? Apr 10, 2013 18:13 |
|
Gonna whine about IDL some more, as well as somebody else's code. Evidently IDL just inherently passes poo poo as pointers, so when I tried to code a little script that was just a FOR loop and pass the value of the FOR counter variable into the other person's code, I was actually passing a reference to it into the thing somebody else wrote that I was calling a lot. That person's code does things to the value of the input snapshot number (I don't know why exactly, but it ends up being on the order of a few thousand by the end). Then their code would finish, the counter variable would be on the order of a few thousand, and the loop would exit after processing the first snapshot, leaving the remaining ~50-200 untouched. This was a simple fix by just doing code:
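The pass-by-reference trap described above can be sketched in Java, with a mutable holder standing in for IDL's reference semantics (all names hypothetical):

```java
public class ByRefDemo {
    // Stand-in for the colleague's routine: it mutates its "snapshot number".
    static void process(int[] snap) { snap[0] += 4000; }

    public static void main(String[] args) {
        int[] counter = {0};
        // Passing the counter itself: the callee clobbers it (the IDL behavior).
        process(counter);
        System.out.println(counter[0]); // 4000: the loop would terminate early

        // The fix: hand the callee a copy, keep the real counter private.
        counter[0] = 0;
        int[] copy = {counter[0]};
        process(copy);
        System.out.println(counter[0]); // 0: the counter survives
    }
}
```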
|
# ? Apr 11, 2013 15:32 |
|
The coding horrors I come across every day are the coding question implementations that my phone screen candidates give me. Is it really so hard to do LinkedList implementations?
|
# ? Apr 11, 2013 15:41 |
|
Volmarias posted:The coding horrors I come across every day are the coding question implementations that my phone screen candidates give me. In what language?
|
# ? Apr 11, 2013 16:29 |
|
I think I found it. The worst C++ program ever written. I walked over into manufacturing for a moment today to help one of my coworkers figure out a problem with a calibration station he's making for the product we're both working on. While I was doing this, one of the production support guys came over and asked whether either of us knew C++. I foolishly answered yes. The system he showed me, written in 2008 (so there's no excuse for it), had one class with about 450 public member variables. The whole thing "worked" by having one method that is a giant state machine that implements all the functionality - manipulating I/O lines, performing GUI functions, etc. The rest of the code works by poking values into member variables, then manipulating the state variable. Did I mention that the author felt the need to implement all the GUI widgets he used himself? Do I need to mention that he did so very badly? The story I eventually teased out of another coworker familiar with the situation is that the program was written by a MechE who had gone to a VB course and written production support software in VB for a few years, who then bought one of those "Learn C++ in a weekend" books or something like that. *shudders*
|
# ? Apr 11, 2013 16:40 |
|
Shugojin posted:I hate IDL I loving despise IDL, and I hate the fact that I need to use it for my work. Usually, I can get around it by using Python, but I often have to use some of my team's programs, and astronomers LOVE IDL. You know how much physicists love loving FORTRAN? Yeah, astronomers love IDL like that. A part of it is that the astrolib library has been ingrained very deeply in the community. So, understandably, you are sometimes just stuck dealing with IDL. There's also just the fact that we all tend to like to code in the language we know, and scientists tend to be very much of the thought "eh, why bother" when it comes to learning new programming languages not named FORTRAN or IDL... and in some circumstances, Perl. I'm just so loving happy that Python has been (quickly) taking over the Universe (literally). Many, many astronomy libraries are being written and released for Python. The young astronomer community is embracing it, and hopefully, by the time I'm a crusty old scientist with my own horde of grad students and post docs, IDL will be a distant memory. This is all without bringing up how loving outrageously expensive the IDL licenses are. Only loving MATLAB surpasses it in terms of "hey bend over", and at least MATLAB has enough of an audience to justify their insane license costs.
|
# ? Apr 11, 2013 18:44 |
|
Hate to break it to you, but a lot of those Python libs probably have Fortran powering some part of them: http://www.scipy.org/Installing_SciPy/BuildingGeneral Fortran is also faster than C++ and possibly C for a lot of math operations. Not going away anytime soon - it's still got that niche locked down.
|
# ? Apr 11, 2013 19:49 |
|
Hard NOP Life posted:In what language? Any language the candidate wants; I tell them to use whatever they're most comfortable with. Most choose Java, though I've had a couple do C#. At this point, I'm pleasantly surprised if the method implementation they write would actually compile. Volmarias fucked around with this message at 20:15 on Apr 11, 2013 |
# ? Apr 11, 2013 20:01 |
|
Throw in a question with string manipulation, and you'll find that most of them are actually choosing a hypothetical Java variant with a mutable String class (which they call "Java" for short).
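A minimal demonstration of the immutability those candidates trip over (illustrative class name):

```java
public class ImmutableDemo {
    public static void main(String[] args) {
        String s = "hello";
        s.toUpperCase();        // result discarded: s itself never changes
        System.out.println(s);  // still "hello"
        s = s.toUpperCase();    // must reassign to keep the new String
        System.out.println(s);  // "HELLO"
    }
}
```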
|
# ? Apr 11, 2013 20:59 |
|
Doctor w-rw-rw- posted:Hate to break it to you, but a lot of those Python libs probably have Fortran powering some part of them: http://www.scipy.org/Installing_SciPy/BuildingGeneral I don't think anybody is complaining about battle-tested, bulletproof numeric Fortran libraries.
|
# ? Apr 11, 2013 21:16 |
|
GrumpyDoctor posted:I don't think anybody is complaining about battle-tested, bulletproof numeric Fortran libraries. There's a pretty big difference between working directly with lovely Fortran code and happening to depend on some very high quality Fortran code that you never need to know exists.
|
# ? Apr 11, 2013 21:31 |
|
Doctor w-rw-rw- posted:Fortran is also faster than C++ and possibly C for a lot of math operations. Not going away anytime soon - it's still got that niche locked down. And how exactly is it faster? Don't point to libraries with bindings for all languages involved.
|
# ? Apr 11, 2013 21:38 |
|
Otto Skorzeny: you may have discovered a compelling counterpoint to the argument that introductory programming courses should start with gate-logic and work their way up the stack. Any program running on a real computer clearly can be viewed as a finite state machine, it just isn't a sane level of abstraction for most tasks. edit: hobbesmaster: One thing I am aware of is that Fortran makes it easier to analyze loop kernels and prove that references do not alias against one another, which can pave the way for optimizations. In classical Fortran the only means of data indirection is array lookups, and data structures tend to be simply flat arrays. Internet Janitor fucked around with this message at 21:54 on Apr 11, 2013 |
# ? Apr 11, 2013 21:50 |
|
hobbesmaster posted:And how exactly is it faster? Don't point to libraries with bindings for all languages involved. Mostly because it is easier for a compiler to prove that Fortran code operating on a matrix doesn't violate strict aliasing, so it is free to perform a couple of optimizations that would render code that violated strict aliasing incorrect. C99 code written to use the restrict qualifier on pointers judiciously can often overcome this hurdle. (You can think of the Fortran approach of EQUIVALENCE vs the C99 approach of restrict as a whitelist for aliasing and a blacklist, respectively). Besides real reasons for Fortran to be faster than C, however, there are also stupid reasons, e.g. writing a matrix operation in Fortran and C and traversing in column-major order in both and wondering why the C is so slow. Blotto Skorzany fucked around with this message at 21:55 on Apr 11, 2013 |
# ? Apr 11, 2013 21:53 |
|
Why wouldn't that be slow in Fortran as well; surely you'd run into the same cache issues? (I know nothing about Fortran).
|
# ? Apr 11, 2013 22:52 |
|
yaoi prophet posted:Why wouldn't that be slow in Fortran as well; surely you'd run into the same cache issues? (I know nothing about Fortran). Fortran stores matrices in column major order.
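A quick sketch of what the two layouts mean for indexing, using a hypothetical 3x4 matrix (Java, like C, lays out row-major):

```java
public class MajorOrderDemo {
    public static void main(String[] args) {
        int rows = 3, cols = 4;
        // Row-major (C-style): consecutive elements of a ROW are adjacent.
        // Column-major (Fortran-style): consecutive elements of a COLUMN are.
        int i = 1, j = 2;
        int rowMajorIndex = i * cols + j;  // element (1,2) in C layout
        int colMajorIndex = j * rows + i;  // element (1,2) in Fortran layout
        System.out.println(rowMajorIndex + " " + colMajorIndex);
        // Walking down a column in a row-major layout strides by `cols` each
        // step, which is why a column-first loop ported naively from Fortran
        // to C touches a new cache line on nearly every access.
    }
}
```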
|
# ? Apr 11, 2013 23:21 |
|
Doctor w-rw-rw- posted:Hate to break it to you, but a lot of those Python libs probably have Fortran powering some part of them: http://www.scipy.org/Installing_SciPy/BuildingGeneral At what point did I deride Fortran?
|
# ? Apr 11, 2013 23:23 |
|
VanillaKid posted:Fortran stores matrices in column major order. Oh, well that makes sense. Do non-programming but still software-related horrors count? Because this website is absolutely atrocious.
|
# ? Apr 11, 2013 23:28 |
|
yaoi prophet posted:Oh, well that makes sense. I love the guy with the bananas.
|
# ? Apr 12, 2013 00:43 |
|
yaoi prophet posted:Oh, well that makes sense. Haha, no video with supported mime type found. You lose HTML5 Homies!
|
# ? Apr 12, 2013 04:26 |
|
More adventures in old and dodgy perl code. Found a script that takes user input, builds a query (by string concatenation) and executes it directly. But it's ok, because they escape quotes (in only one of the possible execution paths). This script runs as a SOAP service, but despite this someone copy pasted the code directly into another script running under apache on the same server. This copy doesn't sanitise user input at all. The scripts serve to look up staff contact details based on a few parameters, one of which is location. The location isn't stored in the database so instead they try to infer the location from the phone number (if it's been saved). The client side code reads (embedded in the perl source, naturally): code:
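A minimal Java sketch of why the concatenated query described above is dangerous (hypothetical table name and input; the parameterized fix is shown as a comment, since running it would need a live database):

```java
public class InjectionDemo {
    public static void main(String[] args) {
        String userInput = "x' OR '1'='1";
        // Concatenation: the input's quote breaks out of the string literal,
        // turning the WHERE clause into a tautology.
        String bad = "SELECT * FROM staff WHERE name = '" + userInput + "'";
        System.out.println(bad);
        // With java.sql the fix is a parameterized query, e.g.:
        //   PreparedStatement ps = conn.prepareStatement(
        //       "SELECT * FROM staff WHERE name = ?");
        //   ps.setString(1, userInput);
        // The driver then treats the input as data, never as SQL.
    }
}
```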
|
# ? Apr 12, 2013 06:13 |
|
JetsGuy posted:I loving despise IDL, and i hate the fact that I need to use it for my work. Usually, I can get around it by using Python, but I often have to use some of my team's programs, and astronomers LOVE IDL. You know how much physicists love loving FORTRAN? Yeah, astronomers love IDL like that. A part of it is that the astrolib library has been ingrained very deeply in the community. So, understandably, you are sometimes just stuck dealing with IDL. There's also just the fact that we all tend to like to code in the language we know, and scientists tend to be very much of the thought "eh, why bother" when it comes to learning new programming languages not named FORTRAN or IDL... and in some circumstances, Perl. FORTRAN and IDL are loved by dinosaurs; modern physicists have mostly made the switch to C++ or Python. There's a bunch of legacy poo poo written in FORTRAN, and there will always be a FORTRAN niche, but the language is quickly disappearing from actual scientific usage (as an example, everything at CERN is almost exclusively C++ and Python with a few things that are based in FORTRAN; for computational purposes there will be some FORTRAN libraries hanging around for a very long time, I think we all agree) It's exactly for the reasons that you say; the older crowd wants to keep using FORTRAN and IDL because that's what they've always used, whereas the younger crowd wants to use Python and C++ because it's what they learned in school (and a lot of other reasons)
|
# ? Apr 12, 2013 11:44 |
QuarkJets posted:FORTRAN and IDL are loved by dinosaurs; modern physicists have mostly made the switch to C++ or Python. There's a bunch of legacy poo poo written in FORTRAN, and there will always be a FORTRAN niche, but the language is quickly disappearing from actual scientific usage This has been my experience as well, and thank god. Although ROOT is pretty annoying sometimes, and in my experience it is used a ton by the particle physics community.
|
|
# ? Apr 13, 2013 04:06 |
|
The worst code I've yet encountered was produced by Astro researchers. The best theory I can come up with is that perhaps the industry is accustomed to one-off code that only needs to function until a paper gets published. This may also be why they're fine with using something like IDL in the first place.
|
# ? Apr 13, 2013 05:29 |
|
Progressive JPEG posted:The worst code I've yet encountered was produced by Astro researchers. The best theory I can come up with is that perhaps the industry is accustomed to one-off code that only needs to function until a paper gets published. This may also be why they're fine with using something like IDL in the first place. Most code written by scientists is terrible for exactly this reason.
|
# ? Apr 13, 2013 06:12 |
|
Yeah but you should see the quality of astrophysics done by computer scientists.
|
# ? Apr 13, 2013 07:22 |
|
VikingofRock posted:This has been my experience as well, and thank god. Although ROOT is pretty annoying sometimes, and in my experience it is used a ton by the particle physics community. Having used both ROOT and MATLAB for many years now, you should count your lucky stars that you get to use ROOT. It has a whole bunch of horrible problems and bizarre ways of doing things, but MATLAB is one hundred times worse and lacks a lot of the graphical power that ROOT has (although for writing little one-off projects that just produce results, MATLAB is better; what I'm saying is that ROOT is far better for producing plots and other pretty things, whereas MATLAB is more of a result workhorse with the presentation of data thrown in as an afterthought) Also, many of the problems with ROOT disappear if you start using PyROOT. Give that a shot
|
# ? Apr 13, 2013 08:37 |
|
A longstanding issue with godawful physicist code is that it's often so terrible that no one can hope to reproduce the results because of how loving fragile it is.
|
# ? Apr 13, 2013 08:40 |
|
Progressive JPEG posted:The worst code I've yet encountered was produced by Astro researchers. The best theory I can come up with is that perhaps the industry is accustomed to one-off code that only needs to function until a paper gets published. This may also be why they're fine with using something like IDL in the first place. You are exactly right. Even today most graduate physics/astronomy students have maybe one computational physics course before entering grad school, and most grad programs may only offer one additional computational course. There's not even an introduction to programming in these programs, you're told about these tools that are necessary for solving certain problems but nobody explains how to actually produce good code or anything like that. My graduate level computational class was basically just a class on Mathematica and was completely useless All that we have is our own experience and the experience of our peers, which often isn't much to go on. Legacy code becomes untouchable because it produces the results that we expect. CERN specifically has an advantage in that there are actual computer scientists working there alongside the physicists, and there are sometimes workshops to help people learn better coding skills. Many fields don't get that; you're in a basement lab with a bunch of other grad students who are just as clueless as you are. QuarkJets fucked around with this message at 08:44 on Apr 13, 2013 |
# ? Apr 13, 2013 08:41 |
|
Progressive JPEG posted:The worst code I've yet encountered was produced by Astro researchers. The best theory I can come up with is that perhaps the industry is accustomed to one-off code that only needs to function until a paper gets published. I do computer science research and my proof of concept code literally makes my skin crawl but I don't loving care because of the reason you just listed. MORE PAPERS
|
# ? Apr 13, 2013 15:07 |
|
QuarkJets posted:You are exactly right. Even today most graduate physics/astronomy students have maybe one computational physics course before entering grad school, and most grad programs may only offer one additional computational course. There's not even an introduction to programming in these programs, you're told about these tools that are necessary for solving certain problems but nobody explains how to actually produce good code or anything like that. My graduate level computational class was basically just a class on Mathematica and was completely useless I'm in the undergrad computational (astro)physics at my university at the moment, it's what I was bitching about. It's basically 3 hours a week of being told about algorithms and then getting thrown headfirst into actually coding an implementation. It's pretty rough. I've uh, implemented some code the professor has written and it's kinda godawful in a lot of ways. What I produced for this class is far from the best I've ever written, and as far as I can tell I'm one of the better people who aren't the compsci guy. I don't envy the TA who grades our poo poo. Also she loves Numerical Recipes, which is just terrible; she sometimes puts the code in slides like it's insightful and doesn't hide half the calculations in proprietary libraries.
|
# ? Apr 13, 2013 15:52 |
|
|
|
dis astranagant posted:A longstanding issue with godawful physicist code is that it's often so terrible that no one can hope to reproduce the results because of how loving fragile it is. Yep. This is an issue across a lot of fields in science and is part of the reason there's a crisis of sorts with regards to reproducibility. I'm in particular thinking about the more smooshy medical/health/social sciences.
|
# ? Apr 13, 2013 17:14 |