|
I've been fascinated by APL and its dialects for the past few years. My background is in classic functional programming, and I'm now a researcher working on optimising compilers for data-parallel functional programming, and the APL community has more experience with writing expressive data-parallel programs than anyone. I'm still more of an APL fanboy than a real APL programmer, but I like how naturally you can write a function that removes the inner vowels of words.
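In plain terms (a Python sketch of the idea, not the actual APL), the trick is to keep a word's first and last letters and drop the vowels in between:

```python
def devowel(word):
    """Drop vowels from the interior of a word, keeping its first and last letters."""
    if len(word) <= 2:
        return word
    inner = ''.join(c for c in word[1:-1] if c.lower() not in 'aeiou')
    return word[0] + inner + word[-1]

print(devowel("functional"))  # → "fnctnl"
```

In APL this is a one-liner over a boolean mask rather than an explicit loop, which is exactly the appeal.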
For learning K, can anyone recommend a good open source implementation? I've seen Kona before, and even played a little with it, but I was not aware that it implemented an older version of the language. I still need to learn J also. I don't care much about the ASCII-fied syntax, but I've heard that it has made some things cleaner (like working with multidimensional arrays). Some of its features, like trains, have made it into Dyalog APL, though. Finally, a shout out to GNU APL. It's a very good free software implementation of APL, which nobody seems to know about.
|
# Dec 2, 2015 13:21
|
Is there a good writeup of why K as a language is fast? I understand how a column-oriented in-memory database can be fast, but what about the language itself? As far as I understand, it's just an interpreter. Does it even use parallelism? My own theory is that most K code uses language features directly, without any pointless abstraction layers, which means that only a relatively small number of interpreter actions are actually performed (and those are of course hand-optimised), but I have never seen a large K program. Are there any benchmarks comparing K to C? I often see it mentioned that it's just so much faster, but nobody mentions exactly why (except of course for the K code being much shorter, thus making it easier to optimise).
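My theory can be illustrated even in Python: a single bulk primitive pays interpreter dispatch once, while an element-at-a-time loop pays it per element, which is roughly how hand-optimised array verbs amortise interpreter overhead. (A toy sketch of the idea, not a benchmark of K.)

```python
import timeit

data = list(range(100_000))

def scalar_sum(xs):
    # The interpreter dispatches bytecode for every element: ~100k loop iterations.
    total = 0
    for x in xs:
        total += x
    return total

def bulk_sum(xs):
    # One interpreter dispatch; the loop runs inside the C-coded primitive.
    return sum(xs)

t_scalar = timeit.timeit(lambda: scalar_sum(data), number=20)
t_bulk = timeit.timeit(lambda: bulk_sum(data), number=20)
print(f"loop: {t_scalar:.3f}s  bulk: {t_bulk:.3f}s")
```

If most of a K program is spent inside a handful of such primitives, the interpreter itself hardly matters.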
|
# Dec 2, 2015 22:22
|
BigBobio posted:

> So I actually use APL at work. Once you get past the syntax and the special characters, it's quite nice. Prototyping is a breeze.

Where do you work? Everyone I know who programs APL is either in finance or... idiosyncratic.
|
# Apr 29, 2016 09:00
|
Congratulations on the job! I've become involved in writing an APL compiler targeting GPU execution. As a result, I figured I should probably get some more experience with APL by itself. Is there a nice way to write stencils? Currently we do it with a mess of concatenations and transposes, for example here, for the core of a 2D stencil:

```apl
iter ← {
  temp ← ⍵
  m1 ← (⍉1↓⍉temp),row 1⍴0
  m2 ← (row 1⍴0),⍉¯1↓⍉temp
  x ← (m1 + m2) - c1 × temp
  n1 ← ⍉(⍉1↓temp),col⍴0
  n2 ← ⍉(col⍴0),⍉¯1↓temp
  y ← (n1 + n2) - c2 × temp
  delta ← (step ÷ Cap) × (power + (y ÷ Ry) + (x ÷ Rx) + (amb_temp - temp) ÷ Rz)
  temp + delta
}
```
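For anyone who doesn't read APL: here's my own rough Python rendering of that update step (with made-up scalar constants; the shifted-and-zero-padded sums are just the left/right and up/down neighbour contributions):

```python
# Hypothetical constants standing in for the free variables of the APL dfn.
c1 = c2 = 2.0
step, Cap, power, Rx, Ry, Rz, amb_temp = 0.1, 1.0, 0.5, 1.0, 1.0, 1.0, 20.0

def iter_step(temp):
    """One stencil update over a 2D grid (list of lists of floats)."""
    rows, cols = len(temp), len(temp[0])

    def at(i, j):
        # Out-of-range neighbours read as 0, matching the zero padding
        # ((row 1)⍴0 and col⍴0) in the APL version.
        return temp[i][j] if 0 <= i < rows and 0 <= j < cols else 0.0

    out = []
    for i in range(rows):
        row_out = []
        for j in range(cols):
            t = temp[i][j]
            x = at(i, j - 1) + at(i, j + 1) - c1 * t   # horizontal neighbours (m1+m2)
            y = at(i - 1, j) + at(i + 1, j) - c2 * t   # vertical neighbours (n1+n2)
            delta = (step / Cap) * (power + y / Ry + x / Rx + (amb_temp - t) / Rz)
            row_out.append(t + delta)
        out.append(row_out)
    return out
```

The APL version expresses the same neighbour access by shifting whole matrices (drop a row/column, append a zero row/column) rather than by indexing.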
|
# Jun 20, 2016 03:32
|
Oh yeah, that'll make it nicer. I wonder if it's supported by our parser. Won't do anything for performance, but we're already close to hand-written code anyway.
|
# Jun 21, 2016 11:42
|
You still have to fill out a form and wait for someone to approve it if you want to try it out. There is an "unregistered" version, but only for Windows. Why do they make trying it so hard? It's interesting that there have been relatively many array-programming posts on HN recently.
|
# Jul 1, 2016 22:07
|
taqueso posted:

> HN seems to do everything in spurts. Maybe array languages are finally going to get big because concurrency is becoming so important.

Are any of the array languages particularly parallel in practice? I noticed Dyalog added futures recently, which is something you could find in any old procedural language.
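For comparison, here is the same future-style deferral in ordinary Python, using the standard library's `concurrent.futures`: submit work, carry on, and block only when the value is actually needed. (My own illustration of the general mechanism, not of Dyalog's implementation.)

```python
from concurrent.futures import ThreadPoolExecutor

def slow_square(x):
    # Stands in for some expensive computation.
    return x * x

with ThreadPoolExecutor() as pool:
    fut = pool.submit(slow_square, 7)   # starts running in the background
    # ... other work could happen here, concurrently ...
    print(fut.result())                 # → 49, blocking only if not done yet
```

Nothing array-specific about it, which is rather the point.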
|
# Jul 1, 2016 22:14