Asymmetrikon
Oct 30, 2009

I believe you're a big dork!
Redux heavily borrows from Elm's Architecture, and Elm has an equivalent to React in elm-html. If you're comfortable in one, you'll very likely be comfortable in the other.


Arcsech
Aug 5, 2008

tekz posted:

How does Elm compare to, say, using React+Redux, which I really like?

The architecture and thought process are similar between the two. The main difference is that Elm has a strong static type system and encodes IO in the type system, kinda like Haskell - although it does so in a very different (and, in my opinion, easier to understand) way.
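
To make "encodes IO in the type system" concrete, here's a minimal Haskell sketch (the function names are made up for illustration): effects show up in a function's type, so the compiler keeps pure code and effectful code apart.

code:
import Data.Char (toUpper)

-- Pure: the type promises no side effects can happen in here.
shout :: String -> String
shout s = map toUpper s ++ "!"

-- Effectful: the IO in the type is the only place effects are allowed.
greet :: IO ()
greet = getLine >>= putStrLn . shout

main :: IO ()
main = greet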

Maluco Marinero
Jan 18, 2001

Damn that's a
fine elephant.
Working in React & Redux is similar but clunkier, because JavaScript doesn't have first-class support for the underlying principles (immutability and FRP) the way Elm does. The plus side is that it's easier to write 'escape hatch' imperative code for pragmatic reasons, though if there were less friction to doing things right, as in Elm, you'd need those escape hatches less often.

mila kunis
Jun 10, 2011
I guess I'll start with Elixir; http://elixir-lang.org/learning.html has a few options. I'm thinking of picking up the O'Reilly book, but if anyone has another recommendation please let me know.

Pollyanna
Mar 5, 2005

Milk's on them.


tekz posted:

I guess I'll start with Elixir; http://elixir-lang.org/learning.html has a few options. I'm thinking of picking up the O'Reilly book, but if anyone has another recommendation please let me know.

Programming Elixir has been pretty great so far. Most Pragprog books are good, in fact.

bobthenameless
Jun 20, 2005

I found Etudes for Elixir to be kinda neat too; I also enjoyed Etudes for Erlang.

MononcQc's Learn You Some Erlang can be helpful too; it's Erlang, but the chapters about OTP and the general Erlang ecosystem are still really useful for Elixir, and you may find yourself using some Erlang libraries here and there, so it helps on that front as well.

There's also Elixir Radar (and apparently another, Elixir Digest, that I'm sure is comparable) that collects library news, cool articles, etc. if you want a newsletter to pepper your inbox; I kinda enjoy the "weekly news" things for languages, but it might not be for everyone.

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
Elm just switched their model from using signals, addresses, and ports to a unifying model of "subscriptions". At first glance it seems pretty promising:

http://elm-lang.org/blog/farewell-to-frp

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



The 0.17 installer was flagged as Win32/Fathale.B!plock by Windows Defender, but only after I ran it. Hope it's a false positive :ohdear:

Pollyanna
Mar 5, 2005

Milk's on them.


So I hear a lot about Lisp's extensibility and metaprogramming capabilities, and how it's good for defining a problem in a language that maps directly to said problem domain. People talk about domain-specific languages and macros and bottom-up programming and "changing the language to suit the problem" which makes it seem a lot more like dark-arts wizardry than a programming language.

I don't really understand what this all means, and how it makes Lisp so powerful/versatile. I get the concept of domain specific languages, even if I have no idea how to make one or what makes a good DSL good, and I understand that homoiconicity means that everything is just data, including code. But I can't seem to connect those concepts to understanding and writing code that maps directly from problem domain to implementation.

It feels like there's supposed to be an aha-moment with Lisp that I haven't had yet, and I don't know what I'm missing. What am I looking for, and how do Lisp's killer features matter in programming it well?

tazjin
Jul 24, 2015


Pollyanna posted:

So I hear a lot about Lisp's extensibility and metaprogramming capabilities, and how it's good for defining a problem in a language that maps directly to said problem domain. People talk about domain-specific languages and macros and bottom-up programming and "changing the language to suit the problem" which makes it seem a lot more like dark-arts wizardry than a programming language.

I don't really understand what this all means, and how it makes Lisp so powerful/versatile. I get the concept of domain specific languages, even if I have no idea how to make one or what makes a good DSL good, and I understand that homoiconicity means that everything is just data, including code. But I can't seem to connect those concepts to understanding and writing code that maps directly from problem domain to implementation.

It feels like there's supposed to be an aha-moment with Lisp that I haven't had yet, and I don't know what I'm missing. What am I looking for, and how do Lisp's killer features matter in programming it well?

For me the lisp aha-moment happened when I started using structural editing for it (paredit in emacs); that's when I realised that the syntax and everything is just superficial, that data is code, and how the universe really works.

I went to find more information and was finally enlightened by this article about the nature of lisp.

Redmark
Dec 11, 2012

This one's for you, Morph.
-Evo 2013
The great thing about Lisps to me isn't anything philosophical, or even anything to do with its semantics. It's that you can copy an s-expression from a place, and put it in another place, and be assured that it means the same thing. You can press a key to slurp the sexp and plop it somewhere else, and it just works. When a lot of one's time is spent cutting bits out and moving them around in a large codebase, that's honestly a huge deal and saves a lot of typing and making sure you mouse over exactly the right code fragment. It's an ugly reason for liking Lisp, but as you've seen it's quite difficult to digest the pretty reasons :v:

raminasi
Jan 25, 2005

a last drink with no ice

Pollyanna posted:

So I hear a lot about Lisp's extensibility and metaprogramming capabilities, and how it's good for defining a problem in a language that maps directly to said problem domain. People talk about domain-specific languages and macros and bottom-up programming and "changing the language to suit the problem" which makes it seem a lot more like dark-arts wizardry than a programming language.

I don't really understand what this all means, and how it makes Lisp so powerful/versatile. I get the concept of domain specific languages, even if I have no idea how to make one or what makes a good DSL good, and I understand that homoiconicity means that everything is just data, including code. But I can't seem to connect those concepts to understanding and writing code that maps directly from problem domain to implementation.

It feels like there's supposed to be an aha-moment with Lisp that I haven't had yet, and I don't know what I'm missing. What am I looking for, and how do Lisp's killer features matter in programming it well?

Not to knock Lisp (I've dabbled, but I'm certainly not one of the enlightened), but before you get too depressed about your failure to ascend, remember that for every beautiful Lisp monument to metaprogramming there are a hundred boring-rear end, non-Lisp programs out there doing work that people rely on as well. For all its glories, it hasn't made itself widely indispensable, and after however many years, "well they all just haven't heard the Good Word!" becomes an excuse and not an explanation.

And when I read "Imagine the possibilities if the programmer could modify the abstract syntax tree himself!" the first "possibility" I think of is new heights of unmaintainability.

weird
Jun 4, 2012

by zen death robot

Redmark posted:

The great thing about Lisps to me isn't anything philosophical, or even anything to do with its semantics. It's that you can copy an s-expression from a place, and put it in another place, and be assured that it means the same thing. You can press a key to slurp the sexp and plop it somewhere else, and it just works. When a lot of one's time is spent cutting bits out and moving them around in a large codebase, that's honestly a huge deal and saves a lot of typing and making sure you mouse over exactly the right code fragment. It's an ugly reason for liking Lisp, but as you've seen it's quite difficult to digest the pretty reasons :v:

it's also not particularly accurate

Siguy
Sep 15, 2010

10.0 10.0 10.0 10.0 10.0
There have been a few different times over the years when I read long articles explaining clever macros and got briefly excited about the idea of diving into Common Lisp, but each time I found the language too barbaric to deal with. The function names in Common Lisp seem to have no logic or uniformity at all and the ecosystem feels ancient.

I'd be curious to try a more modern Lisp like Clojure but I've never had a personal project that fit.

Pollyanna
Mar 5, 2005

Milk's on them.


I think the "personal project that fits" part is what trips people up. It's hard to map the usefulness of code-as-data-as-code and macros if your typical use case is setting up a web application or making an app with a GUI.

tazjin posted:

For me the lisp aha-moment happened when I started using structural editing for it (paredit in emacs); that's when I realised that the syntax and everything is just superficial, that data is code, and how the universe really works.

I went to find more information and was finally enlightened by this article about the nature of lisp.

I got that first part, where everything's the same under the hood, but apart from that I'm a little lost.

I read the article, and I think I'm starting to get it. The XML and to-do list example is a good starting point, but it feels like that's just one particular application of it, and I'm not sure how to apply it to anything else. I can see flashes of brilliance there when you realize that the entire point is about transforming data in a functional manner, but that kind of loses its luster when you work in FP-ready languages already.

The one problem I have with macros vs. functions is that everything you can do with macros, you can do with functions. At least, as far as I can tell. As someone who mostly uses FP, I reach for a data-transforming function over a macro since that's what I'm used to. I don't feel the need to use macros, almost ever, so I don't really have a reason to understand them.

What could help is to have some sort of exercise where you give people some plain s-expressions meant to represent some kind of data structure, and have people write something that makes that chunk of data into a program. I think that's the crux of the matter - the fact that programs are data in Lisp confuses people, because everyone considers programs and data to be wholly separate things in their minds. Programs do things, whereas data are things. One becoming the other never happens, perceptually, and maybe that's the problem. "Code is data" is readily understood, but "data is code" might just be the missing piece.

Here, this is something awful:

Lisp code:
(forum "Something Awful"
  (board "Games"
    (thread "DO4M"
      (post (user "Pollyanna") "here a post about a doam")))
  (board "Serious Hardware/Software Crap"
    (board "The Cavern of Cobol"
      (thread "Functional Programming: This thread is like a burrito"
        (post (user "Pollyanna") "here a post about a lisp")))))
This whole thing is a chunk of data representing (part of) the forums, and this is all the input you get. Your job, as an evaluator of Lisp's potential, is to pretty-print this in two different ways:

1. Using only functions, and
2. Using only macros.

Forums, boards, threads, posts, and users all have different pretty-print formats. For example, I want all forums to be displayed in UPPERCASE LETTERS, all users to be printed like Pollyanna says:\n, all posts to be printed like this:

code:
-----------------
here a post about a lisp
-----------------
and so on. I gotta run, but I feel like implementing this functionality in the two different ways (and maybe even a third way, a combination of both) will help illuminate Lisp's strengths.
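
For a rough idea of what the "functions only" half looks like, here's the same data transplanted into Haskell as a sketch (names and layout improvised, and it obviously sidesteps the macro half, which only makes sense in a Lisp):

code:
import Data.Char (toUpper)

-- The forum snippet as plain data, and the pretty-printer as a plain function.
data Node
  = Forum  String [Node]
  | Board  String [Node]
  | Thread String [Node]
  | Post   String String          -- user, body

render :: Node -> [String]
render (Forum name kids)  = map toUpper name : concatMap render kids
render (Board name kids)  = name : concatMap render kids
render (Thread name kids) = name : concatMap render kids
render (Post user body)   =
  [ user ++ " says:"
  , "-----------------"
  , body
  , "-----------------"
  ]

main :: IO ()
main = mapM_ putStrLn $ render $
  Forum "Something Awful"
    [ Board "The Cavern of Cobol"
        [ Thread "Functional Programming: This thread is like a burrito"
            [ Post "Pollyanna" "here a post about a lisp" ] ] ]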

Pollyanna fucked around with this message at 16:24 on May 19, 2016

Banned King Urgoon
Mar 15, 2015

Pollyanna posted:

I gotta run, but I feel like implementing this functionality in the two different ways (and maybe even a third way, a combination of both) will help illuminate Lisp's strengths.

It won't, because macros and functions serve different purposes (and you should never use a macro when a function will suffice). The reason for macros is to implement new syntactic forms, with their own flow control. For instance, cond:

Lisp code:
(defn pos-neg-or-zero  [n]
  (cond
    (< n 0) (println "negative")
    (> n 0) (println "positive")
    :else (println "zero")))
cond takes a sequence of test/expression pairs, evaluates the tests in order, and, when one is true, evaluates the corresponding expression. You can't implement this as a function because functions are strict (that is, they evaluate all their arguments), whereas cond is short-circuiting and will only ever evaluate one of the bodies.

edit: Another (possibly better) example: unlike Haskell or ML, Lisp functions are not automatically curried. If foo is a function of 3 arguments, Haskell treats (foo a b) as a function of the one remaining argument, while in Lisp (foo a b) is an error. But since this syntax is so convenient, Clojure provides the threading macro:

Lisp code:
(-> 5 (+ 3) (/ 2) (- 1))
;; macroexpands to (- (/ (+ 5 3) 2) 1)
The -> macro lets us pretend we're composing partially-applied functions; behind the scenes, it deconstructs the syntax and fills in the "missing" argument. This is the power of macros: you can do arbitrary term-rewriting, control the order of execution, and generally implement whatever semantics you want.
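
For comparison, the same pipeline in Haskell needs no macro at all, because partial application is built in; a minimal sketch using (&) from Data.Function:

code:
import Data.Function ((&))

-- Each operator section is already a partially-applied function,
-- so plain application stands in for the threading macro.
pipeline :: Double
pipeline = 5 & (+ 3) & (/ 2) & subtract 1   -- ((5 + 3) / 2) - 1 == 3.0

main :: IO ()
main = print pipeline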

Banned King Urgoon fucked around with this message at 00:06 on May 20, 2016

weird
Jun 4, 2012

by zen death robot

Siguy posted:

The function names in Common Lisp seem to have no logic or uniformity at all and the ecosystem feels ancient.

can you elaborate on both of those, i don't know what you're talking about

Ralith
Jan 12, 2011

I see a ship in the harbor
I can and shall obey
But if it wasn't for your misfortune
I'd be a heavenly person today

Banned King Urgoon posted:

It won't, because macros and functions serve different purposes (and you should never use a macro when a function will suffice). The reason for macros is to implement new syntactic forms, with their own flow control.
This is very true. It can also be convenient to use macros to statically compile an EDSL (for example, a query language like LINQ or SQL, URL routing rules for a REST API, etc.) to the host language.

Siguy
Sep 15, 2010

10.0 10.0 10.0 10.0 10.0

weird posted:

can you elaborate on both of those, i don't know what you're talking about

I was talking a little out of my rear end since I didn't put much effort into learning Common Lisp, but all the built-in functions felt very old-school and weird to me, with lots of abbreviations and terminology I wasn't familiar with. That doesn't mean they're bad necessarily and I probably overstated saying they have no logic, but as someone new to the language I didn't understand why sometimes a common function would be a clear written out word like "concatenate" and other times a function would just be "elt" or "rplaca".

weird
Jun 4, 2012

by zen death robot

Siguy posted:

I was talking a little out of my rear end since I didn't put much effort into learning Common Lisp, but all the built-in functions felt very old-school and weird to me, with lots of abbreviations and terminology I wasn't familiar with. That doesn't mean they're bad necessarily and I probably overstated saying they have no logic, but as someone new to the language I didn't understand why sometimes a common function would be a clear written out word like "concatenate" and other times a function would just be "elt" or "rplaca".

i understand with elt. most of the odd abbreviated names like rplaca are low level / legacy functions that you don't really need to know about anyway, like you should never be writing (rplaca cons x), but always (setf (car cons) x), and even CAR should be FIRST if it's a list and not just a general cons. setf is really awesome by the way

weird
Jun 4, 2012

by zen death robot
also on ecosystem, the book you linked is a very good introduction to lisp but it was written before quicklisp came out, which is a biggy

Pollyanna
Mar 5, 2005

Milk's on them.


:shrug: I dunno, then. This is just kind of my impression of what that article was trying to teach me, and what the whole magic/zen enlightenment of Lisp is. I just wanna know how to make things really bullshit easy and dead simple, and macros seem like a good way to do that. How, I don't know, but it's what people seem to be saying about them.

Siguy posted:

I was talking a little out of my rear end since I didn't put much effort into learning Common Lisp, but all the built-in functions felt very old-school and weird to me, with lots of abbreviations and terminology I wasn't familiar with. That doesn't mean they're bad necessarily and I probably overstated saying they have no logic, but as someone new to the language I didn't understand why sometimes a common function would be a clear written out word like "concatenate" and other times a function would just be "elt" or "rplaca".

This is basically not a thing in Clojure, which is a newer dialect that runs on the JVM. Check it out: https://www.conj.io/

Ralith
Jan 12, 2011

I see a ship in the harbor
I can and shall obey
But if it wasn't for your misfortune
I'd be a heavenly person today

Siguy posted:

I was talking a little out of my rear end since I didn't put much effort into learning Common Lisp, but all the built-in functions felt very old-school and weird to me, with lots of abbreviations and terminology I wasn't familiar with. That doesn't mean they're bad necessarily and I probably overstated saying they have no logic, but as someone new to the language I didn't understand why sometimes a common function would be a clear written out word like "concatenate" and other times a function would just be "elt" or "rplaca".
Common Lisp is really really old. You should probably study something modern like Racket instead.

weird
Jun 4, 2012

by zen death robot
common lisp doesn't really compare to clojure or racket. if you're into lisp for the syntax, which is a good reason to be into it, they're all the same, but cl has a very different style, and there isn't really anything competing with it

xtal
Jan 9, 2011

by Fluffdaddy

Zemyla posted:

If you were compiling the code, you'd see a much more dramatic decrease in the memory used with foldl', because of something called list fusion. Basically, in GHC, a lot of functions in the Prelude, and a good number of functions in other modules, use rewrite rules to pretend a list is actually a function that produces elements of the list on demand.

The key here is GHC.Base.build, which uses the RankNTypes extension (which allows polymorphism in function arguments):
code:
build :: forall a. (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []
If you're familiar with the type of foldr, you may notice a similarity between the two. It's intentional, and there is in fact a rewrite rule that says:
code:
forall k z (g :: forall b. (a -> b -> b) -> b -> b)
foldr k z (build g) = g k z
This enables it to directly consume the list as it would have been produced, without producing an intermediate list at all in the first place.

As an example of what this lets you do, let's take a simple function from the Prelude, map:

code:
map :: (a -> b) -> [a] -> [b]
map _ [] = []
map f (x:xs) = f x : map f xs
Looks pretty simple, huh? Well, the rule-rewriting engine will trigger a transformation into:

code:
{-# RULES "map" [~1] forall f xs. map f xs = build (\c n -> foldr (mapFB c f) n xs #-}

mapFB :: (b -> r -> r) -> (a -> b) -> a -> r -> r
mapFB c f = \x ys -> c (f x) ys
This basically causes that map code to be turned into a build which uses a foldr on its argument, causing what would be an intermediate list to vanish. Note that this produces a list in the form of build g as well. If the incipient list is turned into an actual list, then another rule fires:
code:
{-# RULES "mapFB" [1] forall f. foldr (mapFB (:) f) [] = map f #-}
This is controlled by phase annotations: the [1] means "only use this rule in simplifier phases 1 and 0" and [~1] means "use this rule in every phase before 1". The phases count down to 0; by default GHC runs phases 2, 1, and 0 (the count is configurable with -fsimplifier-phases).

What all this means is that, if you were to use foldl' to do the summation, then the compiler would use rules, inlining, strictness analysis, arity analysis, and others (they're tedious to perform by hand, but not complicated or obscure) to turn:
code:
value :: Int
value = foldl' (+) 0 $ take 10000000 $ repeat 5
into
code:
value :: Int
value = let
    xs !m !z = case m of
        1 -> z + 5
        _ -> xs (m - 1) (z + 5)
    in xs 10000000 0
And a simple tail-recursive function with a strict index can easily be turned into a loop in assembly.

So the lesson is that if you're consuming a list, use either foldr (if you're producing a lazy structure or consuming a possibly-infinite list) or foldl' (if you're producing something strict like a number and you know the list is finite).

Side note: All of this happens only if you consume the list at the same time you produce it. If you have a list with a billion elements, calculate its length, and then later calculate its sum, then you're going to be hanging on to a list with a billion elements in memory. Sucks to be you.

This is all nice and good but does anybody else want to crosspost it to the coding horrors thread? Every pragma in GHC is just awful.

Athas
Aug 6, 2007

fuck that joker
Every optimising compiler belongs in the horror thread. It's always a mess of heuristics, hacks and "good-enough"s.

xtal
Jan 9, 2011

by Fluffdaddy
nm

xtal fucked around with this message at 02:08 on May 25, 2016

QuantumNinja
Mar 8, 2013

Trust me.
I pretend to be a ninja.

Athas posted:

Every optimising compiler belongs in the horror thread. It's always a mess of heuristics, hacks and "good-enough"s.

I disagree. Well, sort of. Which is to say, Kent Dybvig did it with one heuristic that properly-quantified "good enough". The rest of Chez Scheme (which was recently open-sourced) is actually a pretty direct process of efficient and direct syntactic rewriting. For example, here is the pass handling letrec.

Outside of Chez Scheme (and other, similar compilers written with Dybvig's philosophy, like Ikarus), however, yeah, most compilers, regardless of optimizations, are a horrorshow. Rust's compiler doesn't even do massive optimizations and it's already a horrorshow, GCC is a black box at this point, LLVM can't even decide which cross-compilation targets are available, and ICC is full of magic tricks.

QuantumNinja fucked around with this message at 04:47 on Jun 4, 2016

quiggy
Aug 7, 2010

[in Russian] Oof.


So I'm trying to Learn Me A Haskell (tm) and am doing so by doing Euler problems. I'm currently trying to do problem 5, which is to find the smallest number evenly divisible by 1 through 20. At first I tried to brute-force it, which turned out to be monstrously slow. Instead, my methodology is now to find the prime factors of 2 through 20 (since we get 1 for free), merge them, and then multiply those. The merging bit is what's killing me: ideally, I want the merged list to have as many 2s as the number with the most 2s in its prime factorization (which is 16, so four 2s), as many 3s as the number with the most 3s (9 or 18, both with two 3s), and so on. Unfortunately, the built-in union function doesn't quite work like this because the order of arguments is important: union [1, 1] [1] is [1, 1], but union [1] [1, 1] is [1].

How would I write a union function that works the way I need it to? I've gotten as far as being able to find the prime factorization of the numbers from 2 to 20 but I can't for the life of me figure out this intermediate step.

e: aaaand naturally I figured out at least one way to do this. Tips and improvements are more than welcome, though :)

code:
merge :: [Int] -> [Int] -> [Int]
merge [] [] = []
merge x  [] = sort x
merge [] y  = sort y
merge x  y
    | xh == yh = xh : merge xt yt
    | xh <  yh = xh : merge xt y
    | xh >  yh = yh : merge x  yt
        where (xh:xt) = sort x
              (yh:yt) = sort y

quiggy fucked around with this message at 19:10 on Jul 1, 2016

Nippashish
Nov 2, 2005

Let me see you dance!

quiggy posted:

So I'm trying to Learn Me A Haskell (tm) and am doing so by doing Euler problems. I'm currently trying to do problem 5

I did the same thing a few years ago and ended up with this:
code:
smallestDivisibleByAll xs = product $ sda xs []
    where sda [] lst = lst
          sda (x:xs) lst = sda xs $ lst ++ (factors \\ lst)
              where factors = primeFactors x

quiggy
Aug 7, 2010

[in Russian] Oof.


Nippashish posted:

I did the same thing a few years ago and ended up with this:
code:
smallestDivisibleByAll xs = product $ sda xs []
    where sda [] lst = lst
          sda (x:xs) lst = sda xs $ lst ++ (factors \\ lst)
              where factors = primeFactors x

The lst ++ (factors \\ lst) bit is pretty brilliant. Clearly I still have a lot of learning to do :v:

Banned King Urgoon
Mar 15, 2015

quiggy posted:

So I'm trying to Learn Me A Haskell (tm) and am doing so by doing Euler problems. I'm currently trying to do problem 5, which is to find the smallest number evenly divisible by 1 through 20.

Uhh...

code:
foldl1' lcm [1..20]
vvv Fair enough. Looking again, your implementation of merge has the right idea, except
  • You call sort on every iteration of merge! The logic is simpler (and more efficient) if the arguments were pre-sorted.
  • It is not tail recursive — xh : merge xt yt calls merge and then appends xh. To efficiently build a list, you should use an accumulator.
e.g.
code:
merge xs ys = merge' [] (sort xs) (sort ys)
  where
    merge' acc xs [] = reverse acc ++ xs
    merge' acc [] ys = reverse acc ++ ys
    merge' acc (x:xs) (y:ys) =
      case compare x y of
        EQ -> merge' (x:acc) xs ys
        LT -> merge' (x:acc) xs (y:ys)
        GT -> merge' (y:acc) (x:xs) ys

Banned King Urgoon fucked around with this message at 21:47 on Jul 1, 2016

quiggy
Aug 7, 2010

[in Russian] Oof.


Banned King Urgoon posted:

Uhh...

code:
foldl1' lcm [1..20]

I didn't realize lcm was a thing until after I had already solved the problem. In truth breaking it down into smaller pieces is probably better for me to try to learn the language regardless.
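
For reference, the Prelude's lcm is just the gcd identity (lcm a b == (a * b) `div` gcd a b for positive arguments), so folding it over the range keeps the highest power of each prime seen so far, which is exactly what the merge-the-factorisations approach computes. A small sketch:

code:
import Data.List (foldl')

-- Folding lcm accumulates the maximum prime multiplicities from 1 to 20.
euler5 :: Integer
euler5 = foldl' lcm 1 [1 .. 20]

main :: IO ()
main = print euler5   -- 232792560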

9-Volt Assault
Jan 27, 2007

Better two tits in the hand than ten in flight.
A new Haskell MOOC starts in about 2.5 weeks' time: https://www.futurelearn.com/courses/functional-programming-haskell. It seems quite basic, but hey, at least it's teaching Haskell instead of being yet another Python one.

xtal
Jan 9, 2011

by Fluffdaddy
Lately I've been learning Idris because Haskell has a bunch of warts that make me cry. The library support is, expectedly, lacking, so I'm considering writing Haskell with an alternate prelude instead. Has anybody worked with ClassyPrelude, Protolude or Foundation who can help me decide what to use? (Or tell me if this is even a good idea, because I expect I am going to need to add a whole lot of string conversions for compatibility with libraries.)
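
In case it helps, the mechanics are just a pragma and an import; here's a minimal sketch assuming Protolude (the greet helper is made up, and ClassyPrelude or Foundation slot into the same spot). The string-conversion pain then shows up at library boundaries, where you end up sprinkling T.pack/T.unpack around String-based APIs.

code:
{-# LANGUAGE NoImplicitPrelude #-}
{-# LANGUAGE OverloadedStrings #-}
module Main where

import Protolude                 -- or ClassyPrelude / Foundation
import qualified Data.Text as T  -- Text rather than String is the norm here

greet :: Text -> Text
greet name = "hello, " <> T.toUpper name

main :: IO ()
main = print (greet "cavern of cobol")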

sink
Sep 10, 2005

gerby gerb gerb in my mouf

xtal posted:

Lately I've been learning Idris because Haskell has a bunch of warts that make me cry. The library support is, expectedly, lacking, so I'm considering writing Haskell with an alternate prelude instead. Has anybody worked with ClassyPrelude, Protolude or Foundation who can help me decide what to use? (Or tell me if this is even a good idea, because I expect I am going to need to add a whole lot of string conversions for compatibility with libraries.)

I can't answer any of your questions but I'm curious as to what warts you consider to be intolerable in Haskell? Are any of these resolved in version 8 or with compiler extensions? And what is satisfactorily solved by Idris?

I've been writing (functional) Scala for the last few years, and I'm sick of suffering its warts. Haskell is closer to what I want, but it'll take a good bit of personal investment before I am useful with it. Lack of libraries aside, is Idris much better?

Ralith
Jan 12, 2011

I see a ship in the harbor
I can and shall obey
But if it wasn't for your misfortune
I'd be a heavenly person today

xtal posted:

Lately I've been learning Idris because Haskell has a bunch of warts that make me cry. The library support is, expectedly, lacking, so I'm considering writing Haskell with an alternate prelude instead. Has anybody worked with ClassyPrelude, Protolude or Foundation who can help me decide what to use? (Or tell me if this is even a good idea, because I expect I am going to need to add a whole lot of string conversions for compatibility with libraries.)
Speaking as a one-time contributor to Idris, I too am confused about what warts you find so intolerable in Haskell. Modern Haskell is great already (ever played with lenses?), and GHC 8 is bringing all kinds of fun toys on top of that.

Idris is fun to play with, but it doesn't supplant Haskell, no matter how excited I might be about what dependent types will do for software development long-term.
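
(For the lens aside, a tiny sketch with the lens package; the User type and field names are made up:)

code:
{-# LANGUAGE TemplateHaskell #-}
import Control.Lens

data User = User { _name :: String, _karma :: Int } deriving Show

makeLenses ''User   -- generates the 'name' and 'karma' lenses

bump :: User -> User
bump = over karma (+ 1)

main :: IO ()
main = print (bump (User "goon" 9))   -- User {_name = "goon", _karma = 10}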

Asymmetrikon
Oct 30, 2009

I believe you're a big dork!
I like Idris over Haskell because of some of the historical weirdness it got rid of (switching : and ::, returning Maybe instead of erroring in, e.g., List access functions, the whole Monad/Applicative thing, Int vs. Integer). I haven't looked too much into GHC 8, though.
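
A concrete example of the second point, sketched in Haskell with made-up names: the Prelude's head crashes on an empty list, while the Idris-flavoured version makes the possible failure part of the type.

code:
import Data.Maybe (fromMaybe)

-- Prelude-style: head [] throws "Prelude.head: empty list" at runtime.
-- Total, Idris-style: the possible failure is visible in the type.
headMay :: [a] -> Maybe a
headMay []      = Nothing
headMay (x : _) = Just x

firstOrZero :: [Int] -> Int
firstOrZero = fromMaybe 0 . headMay

main :: IO ()
main = print (firstOrZero [], firstOrZero [7, 8])   -- (0,7)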

xtal
Jan 9, 2011

by Fluffdaddy
I have a bunch of small complaints (Idris improves a lot more than the type system), but the lack of performance (String) and safety (totality) in the Prelude is what gets me. Having an opaque list type already pokes holes in type safety, so at the very least I want a Prelude that is total. Alternatives also tend to be implemented via type classes rather than concrete types, which I prefer.

xtal fucked around with this message at 02:16 on Sep 3, 2016


LOOK I AM A TURTLE
May 22, 2003

"I'm actually a tortoise."
Grimey Drawer

Asymmetrikon posted:

I like Idris over Haskell because of some of the historical weirdness it got rid of (switching : and ::, returning Maybe instead of erroring in, e.g., List access functions, the whole Monad/Applicative thing, Int vs. Integer). I haven't looked too much into GHC 8, though.

Applicative has been a superclass of Monad since GHC 7.10. The duplicate functions like return/pure and (>>)/(*>) are still there though, and I guess they'll probably never go away.
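
Roughly what the post-AMP hierarchy looks like, as a minimal sketch with a made-up Box type; since base 4.8, return defaults to pure, so a new instance never has to mention it.

code:
newtype Box a = Box a deriving Show

instance Functor Box where
  fmap f (Box a) = Box (f a)

instance Applicative Box where
  pure = Box
  Box f <*> Box a = Box (f a)

-- Applicative is now a superclass of Monad, and 'return' defaults to
-- 'pure', so this instance only needs to say what (>>=) means.
instance Monad Box where
  Box a >>= f = f a

demo :: Box Int
demo = do
  x <- Box 20
  pure (x + 1)   -- Box 21

main :: IO ()
main = print demo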
