 
  • Locked thread
Ganondork
Dec 26, 2012

Ganondork

loinburger posted:

I've been a Java (and to a lesser extent a C#) programmer for awhile, and about two years ago I took a job programming in Scala. I liked Scala the language, but I got frustrated with a small but extremely vocal portion of the Scala community (among which were some of my co-workers), namely Haskell programmers who wanted Scala to be exactly like Haskell. For example, we'd have a bunch of big rear end thread-local objects, and every time we updated a field we'd have to copy the entire object because Everything Must Be Immutable Just Like In Haskell, and then we'd wonder why the garbage collector was running too often. Or during a code review I'd be criticized for using classes instead of using whatever the Haskell-ish equivalent would be. Another problem is that we couldn't just use a mature Java library for something (the library might return nulls!), we had to hunt for an alpha Scala library instead. When the startup eventually went bankrupt I went back to Java.

It's the same where I work, but most of the OOP guys are slowly being converted to FP. I can appreciate both sides of the argument, but in our case the existing OOP implementations are horrible, multi-layered abstractions that are incredibly obtuse. That's not to say that OOP is horrible, but holy poo poo do I hate layers...and the cake pattern. :cripes:

Honestly, the myriad of styles possible in Scala is its worst attribute. There are just too many ways to write some horribly confusing code.

Also, I wish Haskell people didn't talk so drat much about Haskell.


loinburger
Jul 10, 2004
Sweet Sauce Jones
A couple months after I started working at the Scala shop they hired a guy who'd had prior experience with Scala, but up until then nobody really knew how to use it and my only prior FP experience was with Scheme, so I just treated it like Java with weird and inconvenient syntax.
code:
def whatIsThisOptionShit[T](t: Option[T]): T = {
  if (t.isEmpty) throw new NullPointerException("why is this null what the hell") else t.get
}
Once the experienced guy told us about using map/flatMap on Options, they made a helluva lot more sense.
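In case it helps anyone else, a minimal sketch of that style (names made up):

```scala
// map transforms the value if one is present; flatMap chains operations
// that themselves return Options; getOrElse replaces the throw-on-empty dance.
def half(n: Int): Option[Int] = if (n % 2 == 0) Some(n / 2) else None

val doubled = Some(4).map(_ * 2)                          // Some(8)
val halved  = Some(4).flatMap(half)                       // Some(2)
val safe    = (None: Option[Int]).map(_ * 2).getOrElse(0) // 0, no NPE
```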

Sedro
Dec 31, 2008

Ruzihm posted:

Argh. I am trying to do a "manual install" of sbt, but it's still trying to reach out to the internet. What gives?
SBT will check for updates to things, especially if you depend on SNAPSHOTs.

You could try offline mode
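For reference, that's the `offline` setting in the build; a sketch, assuming a reasonably recent sbt (it still needs the artifacts already in your local ivy cache):

```scala
// build.sbt
// tell sbt's dependency resolution not to hit remote repositories
offline := true
```

You can also flip it for a single run: sbt "set offline := true" compile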

Ruzihm
Aug 11, 2010

Group up and push mid, proletariat!


Sedro posted:

SBT will check for updates to things, especially if you depend on SNAPSHOTs.

You could try offline mode

I saw that and gave it a shot. SBT refused to listen to anything until it downloaded its necessary packages. I went ahead and just let it do that v:shobon:v. No big deal.

KernelSlanders
May 27, 2013

Rogue operating systems on occasion spread lies and rumors about me.

jpotts posted:

It's the same where I work, but most of the OOP guys are slowly being converted to FP. I can appreciate both sides of the argument, but in our case the existing OOP implementations are horrible, multi-layered abstractions that are incredibly obtuse. That's not to say that OOP is horrible, but holy poo poo do I hate layers...and the cake pattern. :cripes:

Honestly, the myriad of styles possible in Scala is its worst attribute. There are just too many ways to write some horribly confusing code.

Also, I wish Haskell people didn't talk so drat much about Haskell.

Cake pattern is terrible. I don't understand why anyone would choose it over constructor injection.

sink
Sep 10, 2005

gerby gerb gerb in my mouf

KernelSlanders posted:

Cake pattern is terrible. I don't understand why anyone would choose it over constructor injection.

Agree.

Fullets
Feb 5, 2009

KernelSlanders posted:

Cake pattern is terrible. I don't understand why anyone would choose it over constructor injection.

The rationale in Odersky's talk is that interfaces (well, traits) simply can't express the type constraints he required without casting, and he (unsurprisingly) preferred type safety. Although IIRC, Dotty's gone with a different abstraction.

It is quite useful in those situations but some people decided it should be used for everything.
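For anyone who hasn't seen it, a minimal sketch of the pattern (component names made up): each layer is a trait, dependencies are declared with a self-type, and one object mixes the whole cake together.

```scala
trait UserRepositoryComponent {
  def userRepository: UserRepository
  trait UserRepository { def find(id: Int): String }
}

// Self-type: this layer can only ever be mixed in alongside a repository layer,
// and that constraint is checked at compile time with no casting.
trait UserServiceComponent { this: UserRepositoryComponent =>
  def userService: UserService
  class UserService {
    def greet(id: Int): String = s"hello, ${userRepository.find(id)}"
  }
}

// The "cake": wire the layers together by mixing them into one object
object App extends UserServiceComponent with UserRepositoryComponent {
  val userRepository: UserRepository = new UserRepository {
    def find(id: Int): String = s"user-$id"
  }
  val userService = new UserService
}
```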

Steve French
Sep 8, 2003

I've experimented some with the cake pattern and really struggle to come up with solid justifications for doing so. Has anyone done much with macwire, or used it as outlined here?

http://di-in-scala.github.io/

I have to say I like the high level idea: start with dead simple constructor injection, and then make use of more advanced techniques and tools as you need them. Sharp contrast with the cake pattern, which is more "all boilerplate all the time", and makes things often very difficult to construct dynamically based on configuration.
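The starting point that guide advocates really is just this (hypothetical classes); macwire's wire[...] macro only generates the new calls for you:

```scala
class Database { def query(sql: String): String = s"rows for: $sql" }
class UserService(db: Database) {
  def userCount: String = db.query("select count(*) from users")
}

// Manual wiring: plain constructors, no framework, trivially swappable in tests
val database    = new Database
val userService = new UserService(database)
```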

Mellow_
Sep 13, 2010

:frog:
I've been mucking about with the Play framework and am looking for a half-decent ORM to use with it. I see it comes with Ebean; anyone have opinions on it?

If it turns out to be junk I might just use spring with jpa.

I also took a look at squeryl but it didn't seem that fantastic.

loinburger
Jul 10, 2004
Sweet Sauce Jones
I used Slick up until about six months ago - it may have gotten better since then, but at the time vanilla Slick was reliant on case classes and tuples and all of their attendant restrictions. However, it's got a code generation tool that lets you bypass those restrictions (generating classes instead of case classes to represent rows), and that also saved us a boatload of time in refactoring (this was a startup and so the database kept changing on us) because the code generation tool would read the metadata from the database tables and produce appropriately typed row classes.

Almost all of the database writes were from JSON in POST/PUT requests, and so the code generation tool let us generate code to:
1. Validate the JSON (ensure that each key corresponded to a valid column name)
2. Loop through the JSON, convert its untyped values to the appropriate type (or throw an exception), and update the appropriate row objects
3. Update the appropriate join tables to keep the database consistent (this was done with a hand-coded Map, we were planning on automatically generating this Map using foreign key constraints but then the company more or less went bankrupt)

I can give you some example code if you want. The learning curve was a bit steep, but in the end something like 25% of our back-end code was being generated by Slick.

Mellow_
Sep 13, 2010

:frog:

loinburger posted:

I used Slick up until about six months ago - it may have gotten better since then, but at the time vanilla Slick was reliant on case classes and tuples and all of their attendant restrictions. However, it's got a code generation tool that lets you bypass those restrictions (generating classes instead of case classes to represent rows), and that also saved us a boatload of time in refactoring (this was a startup and so the database kept changing on us) because the code generation tool would read the metadata from the database tables and produce appropriately typed row classes.

Almost all of the database writes were from JSON in POST/PUT requests, and so the code generation tool let us generate code to:
1. Validate the JSON (ensure that each key corresponded to a valid column name)
2. Loop through the JSON, convert its untyped values to the appropriate type (or throw an exception), and update the appropriate row objects
3. Update the appropriate join tables to keep the database consistent (this was done with a hand-coded Map, we were planning on automatically generating this Map using foreign key constraints but then the company more or less went bankrupt)

I can give you some example code if you want. The learning curve was a bit steep, but in the end something like 25% of our back-end code was being generated by Slick.

This is really interesting and I would definitely be interested in viewing some example code.

Thanks for the option, I'll look into it.

EDIT: Wait a minute, if Play and Slick are both made by TypeSafe, why the heck haven't they started using Slick with Play?

Mellow_ fucked around with this message at 03:36 on Jun 5, 2015

loinburger
Jul 10, 2004
Sweet Sauce Jones
We used Slick with Play, dunno why they're not bundled - maybe because vanilla Slick is still a bit flaky

Code - I've done a half-assed job of sanitizing it by replacing all instances of the company's name with "companyname" - I can provide more code if you want, this is just what I thought were the relevant parts

The slick-codegen folder has all of the code generation stuff - this was run as a separate project. SlickCodeGenerator.scala is where we read all of the database metadata and generate the code. ModelTransformations.scala has all of the customizations, e.g. each table has a "blacklistedColumns" set that contains the columns that are in the database but shouldn't be in the row object. slickTemplate.scala.txt has the code template. I was responsible for ModelTransformations and the template, we hired one of the Slick developers to write SlickCodeGenerator.

The store folder has some sample output in the models sub-folder; the generic sub-folder has the superclasses for the DAOs (RichTable) and the row objects (ScalatraRecord, from when we were using Scalatra instead of Play, which we never got around to renaming). The auto_generated folder has the output from slick-codegen, while the rest of the files in the models directory inherit from a file in the auto_generated directory (this is so that we can customize the auto-generated classes). So inheritance goes generic -> auto_generated -> hand_coded, where we usually only reference the hand_coded or generic classes everywhere else in the program.

loinburger fucked around with this message at 15:15 on Jun 5, 2015

Sedro
Dec 31, 2008
Play does support Slick.

https://www.playframework.com/documentation/2.4.x/PlaySlick

There's also Anorm, which is a lighter-weight alternative (plain SQL instead of Slick's query DSL).

KernelSlanders
May 27, 2013

Rogue operating systems on occasion spread lies and rumors about me.
I've been pretty happy with spray for my toy web apps although serving the original html is a bit clunky.

sink
Sep 10, 2005

gerby gerb gerb in my mouf

I'm sorry that project was so hosed up, even from the start. And it never really got much better. The codegen stuff helped with a lot of the suffering though.

loinburger
Jul 10, 2004
Sweet Sauce Jones
That was a fun project, it's just a shame we never got to finish it

sink
Sep 10, 2005

gerby gerb gerb in my mouf

loinburger posted:

That was a fun project, it's just a shame we never got to finish it

Hey you don't have PMs, can you please email me? scott at thereceptor dot net

You gotta sign your doc

DholmbladRU
May 4, 2006
Running into an issue with Slick and am unsure how to troubleshoot. I have a database table with a column which is of type Time. When I attempt to query this column I receive the below error message. Has anyone encountered this issue before? I know it's an issue with the type because if I change it to integer or something else the query works perfectly.

error
code:
ERROR "Illegal hour value XX for java.sql.Time type in value"
This error is picked up by the following, and what you see above is the localized message
code:
 def SQLException(e: SQLException) = CError(e.getErrorCode, e.getSQLState, e.getLocalizedMessage)
Below is the DAO

code:
protected case class SomeRow(id: Int, companyId: Int, someTime: Option[DateTime] = None)

protected implicit def GetResultRow(implicit e0: GR[Int], e1: GR[Option[DateTime]], e2: GR[Byte], e3: GR[DateTime]): GR[SomeRow] = GR{
prs => import prs._
  SomeRow.tupled((<<[Int],<<?[DateTime]))
}

protected class LU(_tableTag: Tag) extends Table[SomeRow](_tableTag, "SOME_TABLE") {
def * = (id, allotedTime) <> (SomeRow.tupled, SomeRow.unapply)
/** Maps whole row to an option. Useful for outer joins. */
def ? = (id.?, someTime).shaped.<>({r=>import r._; _1.map(_=> SomeRow.tupled((_1.get, _2.get)))}, (_:Any) =>  
throw new Exception("Inserting into ? projection not supported."))

/** Database column ID DBType(INT), AutoInc */
val id: Column[Int] = column[Int]("RECORD_ID", O.AutoInc)

/** Database column SOME_ROW DBType(TIME), Default(None) */

}
}
query

code:
DAL.SomeRow.filter(_.id === 1).firstOption

DholmbladRU fucked around with this message at 13:20 on Jun 16, 2015

Steve French
Sep 8, 2003

DholmbladRU posted:

Running into an issue with Slick and am unsure how to troubleshoot. Have a database table with a column which is of type Time. When I attempt to query this column I receive the above error message. Has anyone encountered this issue before? I know it's an issue with the type because if I change it to integer or something else the query works perfectly.

Below is the DAO

code:

protected case class SomeRow(id: Int, companyId: Int, someTime: Option[DateTime] = None)

protected implicit def GetResultRow(implicit e0: GR[Int], e1: GR[Option[DateTime]], e2: GR[Byte], e3: GR[DateTime]): GR[SomeRow] = GR{
prs => import prs._
  SomeRow.tupled((<<[Int],<<?[DateTime]))
}

protected class LU(_tableTag: Tag) extends Table[SomeRow](_tableTag, "SOME_TABLE") {
def * = (id, allotedTime) <> (SomeRow.tupled, SomeRow.unapply)
/** Maps whole row to an option. Useful for outer joins. */
def ? = (id.?, someTime).shaped.<>({r=>import r._; _1.map(_=> SomeRow.tupled((_1.get, _2.get)))}, (_:Any) =>  
throw new Exception("Inserting into ? projection not supported."))

/** Database column ID DBType(INT), AutoInc */
val id: Column[Int] = column[Int]("RECORD_ID", O.AutoInc)

/** Database column SOME_ROW DBType(TIME), Default(None) */

}
}


query

code:

DAL.SomeRow.filter(_.id === 1).firstOption

I see no above error message

DholmbladRU
May 4, 2006

Steve French posted:

I see no above error message

Oops, edited the original post.

loinburger
Jul 10, 2004
Sweet Sauce Jones
Is DateTime from joda.time? I'm pretty sure that Slick only supports the old standard library dates (util.Date, sql.Time, sql.Timestamp)

DholmbladRU
May 4, 2006

loinburger posted:

Is DateTime from joda.time? I'm pretty sure that Slick only supports the old standard library dates (util.Date, sql.Time, sql.Timestamp)

Yes DateTime is joda

loinburger
Jul 10, 2004
Sweet Sauce Jones
Try storing a "new java.sql.Time(someDate.getMillis())" instead, at least until Slick adds support for joda.time or java.time

loinburger fucked around with this message at 19:29 on Jun 16, 2015

DholmbladRU
May 4, 2006

loinburger posted:

Try storing a "new java.sql.Time(someDate.getMillis())" instead, at least until Slick adds support for joda.time or java.time

In the case class or in the class LU?

loinburger
Jul 10, 2004
Sweet Sauce Jones
In the case class - then you can do something like add implicit conversions for java.sql.Time <-> joda.time.DateTime to make your life easier, since java.sql.Time is a bit poo poo

Sedro
Dec 31, 2008
You can tell slick how to map types when you define your table. You can also extend the Driver and put this stuff in there.
code:
import driver.api._
implicit val dateTimeMapping = MappedColumnType.base[DateTime, Time](dt => new Time(dt.getMillis), t => new DateTime(t.getTime))

KernelSlanders
May 27, 2013

Rogue operating systems on occasion spread lies and rumors about me.
Has anyone had any success using Spark's DataFrame object? Am I missing something or is the whole thing just a horrendously designed API that can't possibly be useful for anything? Like in pandas you can df['c'] = df['a'] + df['b']. Is there a simple way to do that in spark Dataframes? What about df['idx'] = df.id.map(lambda x: np.where(ids == x)[0][0])?

Monkey Fury
Jul 10, 2001

KernelSlanders posted:

Has anyone had any success using Spark's DataFrame object? Am I missing something or is the whole thing just a horrendously designed API that can't possibly be useful for anything? Like in pandas you can df['c'] = df['a'] + df['b']. Is there a simple way to do that in spark Dataframes? What about df['idx'] = df.id.map(lambda x: np.where(ids == x)[0][0])?

Nope. You can try something like .withColumn(), I guess, but IIRC there's no straightforward way to do it.

How many other folks in here are doing Scala and Spark work? We just launched a small test cluster recently, and my life is now figuring out how we leverage that plus our existing Vertica (please kill me) store. Also now learning Scala because I've found functional programming in Python - my primary day to day language - a bit of a hassle, especially since most Spark docs and examples and use cases happen in Scala or apparently Clojure.

Cryolite
Oct 2, 2006
sodium aluminum fluoride
Why isn't ArrayBuffer[String] an Iterable[String]?

Let's say I have an ArrayBuffer like this:

code:
import scala.collection.mutable.ArrayBuffer
val a = ArrayBuffer[String]("AAA", "BBB", "CCC")
// I can make a string of the elements with some separator like this:
a.mkString(", ") // "AAA, BBB, CCC"
Coming from C# I'm used to being able to do string.Join(", ", a) which takes an IEnumerable<String>. If I pass a List<String> it works because that's an IEnumerable<String>.

In Scala I have to import asJavaIterable to get an implicit conversion from ArrayBuffer[String] to Iterable[String] so that the same thing works:

code:
import scala.collection.JavaConversions.asJavaIterable
String.join(", ", a)
Is this the only way to get this to work (for this overload of join)? Why was it built this way? Why couldn't ArrayBuffer[String] implement Iterable? Going through the inheritance hierarchy I see the Iterable trait in there, which after some confusion I realized is not the same as the Iterable interface.

I can also get it to work doing this:

code:
String.join(", ", a: _*)
...which converts a to a CharSequence... varargs argument for the other overload of join. What is that : _* doing? Is that an operator?

b0lt
Apr 29, 2005

Cryolite posted:

Why isn't ArrayBuffer[String] an Iterable[String]?

Let's say I have an ArrayBuffer like this:

code:
val a = ArrayBuffer[String]("AAA", "BBB", "CCC")
// I can make a string of the elements with some separator like this:
a.mkString(", ") // "AAA, BBB, CCC"
Coming from C# I'm used to being able to do string.Join(", ", a) which takes an IEnumerable<String>. If I pass a List<String> it works because that's an IEnumerable<String>.

In Scala I have to import asJavaIterable to get an implicit conversion from ArrayBuffer[String] to Iterable[String] so that the same thing works:

code:
import scala.collection.JavaConversions.asJavaIterable
String.join(", ", a)
Is this the only way to get this to work (for this overload of join)? Why was it built this way? Why couldn't ArrayBuffer[String] implement Iterable? Going through the inheritance hierarchy I see the Iterable trait in there, which after some confusion I realized is not the same as the Iterable interface.

I can also get it to work doing this:

code:
String.join(", ", a: _*)
...which converts a to a CharSequence... varargs argument for the other overload of join. What is that : _* doing? Is that an operator?

java.lang.Iterable is not the same as scala.collection.Iterable. Scala's Iterable's iterator method collides with the Java one, so classes can only implement one of them.

: _* is a type ascription that tells the compiler to expand the sequence into the individual arguments of a variadic (repeated) parameter, instead of passing it as one argument.
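Concretely (made-up function):

```scala
// A variadic (repeated) parameter: callers normally pass individual args
def joinAll(parts: String*): String = parts.mkString("-")

val xs = Seq("a", "b", "c")
// The ascription passes xs's elements as the repeated args
val joined = joinAll(xs: _*) // "a-b-c"
```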

KernelSlanders
May 27, 2013

Rogue operating systems on occasion spread lies and rumors about me.
Generally, you don't want to use the Java library if there's a Scala library that does the job. The interfaces just tend to be clunky in Scala code. I'm not sure why you feel String.join(",", a) is cleaner than a.mkString(",") but if you really want to do so, you can always define your own function.

code:
scala> def join[T](list: ArrayBuffer[T], delim: String): String = list.mkString(delim)
join: [T](list: scala.collection.mutable.ArrayBuffer[T], delim: String)String

scala> join(a, ",")
res1: String = AAA,BBB,CCC
I would, though, argue for a.mkString(",") as a much more Scala-like way of expressing the idea. We love calling methods on collections. You could imagine a def mkString(d: String) = this.tail.foldLeft(this.head){ (l, r) => l + d + r.toString }
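Spelled out as a standalone function (hypothetical name), plus the empty-collection guard the one-liner skips:

```scala
// foldLeft starts from the head and appends "delimiter + element" for the rest
def myMkString[A](xs: Seq[A], d: String): String =
  if (xs.isEmpty) ""
  else xs.tail.foldLeft(xs.head.toString)((acc, x) => acc + d + x)

val r = myMkString(Seq("one", "two", "three"), ",") // "one,two,three"
```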

Cryolite
Oct 2, 2006
sodium aluminum fluoride

KernelSlanders posted:

I'm not sure why you feel String.join(",", a) is cleaner than a.mkString(",")

Oh no, I definitely like it more, I was just trying to take a technique I did know and use it as a chance to learn more about how things work. :)

Cryolite
Oct 2, 2006
sodium aluminum fluoride
How would you generate the first 10 even fibonacci numbers in Scala?

I overheard someone say they use this as an interview problem so I did it in C# by generating an IEnumerable<int>, filtering to the even ones and then taking 10 like this:

code:
static IEnumerable<int> Fibonacci()
{
    int a = 0;
    int b = 1;
    while (true)
    {
        yield return a;
        var temp = b;
        b = a + b;
        a = temp;
    }
}

static void Main(string[] args)
{
    var nums = Fibonacci().Where(x => x % 2 == 0).Take(10);
}
Looking around online I found this way to do it, but it's recursive:

code:
def fib(a: Int = 0, b: Int = 1): Stream[Int] = Stream.cons(a, fib(b, a + b))
fib().filter(_ % 2 == 0).take(10).force
There's also this way which I guess is also recursive but I don't completely understand it:

code:
def fib2(): Stream[Int] = 0 #:: fib2.scanLeft(1)(_ + _)

fib2.filter(_ % 2 == 0).take(10).force
Is there an idiomatic way to do this in Scala like the C# example? There doesn't seem to be anything like yield return in Scala. The most similar way might be to create a new Iterator like in this example where hasNext would always be true and I could imperatively manage whatever state I want before returning a value in the next method. However, this doesn't seem like good Scala.

How would you do it?

Also, what exactly is the fib2 function above doing? I know that #:: creates a stream with a head of 0 and a tail of fib2.scanLeft(1)(_ + _), but if it recursively calls fib2 again won't that tail start with 0, and then the next call to fib2 starts with 0, and so on? It works, but it seems like it shouldn't to me.

KernelSlanders
May 27, 2013

Rogue operating systems on occasion spread lies and rumors about me.
Stream is an attempt at a functional approach to something like the C# yield return, although it's probably closer to a generator comprehension in Python. That's probably the closest to idiomatic of the approaches you listed. Does this example make more sense?

code:
val integers: Stream[Int] = 0 #:: integers.map(_ + 1)
Basically, it's a lazily evaluated list. The stream integers is defined (recursively) as the element 0 followed by all the elements of integers plus one. So the zeroth element is zero (per the initial case), then the first element of integers (which is the zeroth element of the thing to the right of the #::) is the zeroth element of integers plus one.

By the way, your implementation of fib2 isn't great because that scanLeft will end up recomputing the entire sequence for every element. Try this to see why:

code:
def addThem(a: Int, b: Int): Int = {
  println(s"add> $a + $b = ${a+b}")
  a + b
}

def fib2(): Stream[Int] = 0 #:: fib2.scanLeft(1)(addThem)

fib2.take(10).foreach(println(_))
There's a better implementation in the Stream scaladocs. If it's an interview question though, you probably can't go wrong with the tail-recursive approach.
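For reference, the scaladoc-style definition; the stream memoizes its cells, so unlike fib2 each element is computed once:

```scala
// Each element past the first two is the sum of the stream zipped with its tail
lazy val fibs: Stream[Int] =
  0 #:: 1 #:: fibs.zip(fibs.tail).map { case (a, b) => a + b }

// The original problem: first 10 even Fibonacci numbers
val firstTenEven = fibs.filter(_ % 2 == 0).take(10).toList
// List(0, 2, 8, 34, 144, 610, 2584, 10946, 46368, 196418)
```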

Steve French
Sep 8, 2003

Fill in the blank:

code:
def justDoingSomeThings(mapping: Map[Int, Int], key: Int): Int = mapping(key)

...

justDoingSomeThings(m, 1)
to make this throw:

java.util.NoSuchElementException: key not found: wut

(verbatim)

Steve French fucked around with this message at 03:01 on Sep 12, 2015

Steve French
Sep 8, 2003

Steve French posted:

Fill in the blank:

code:
def justDoingSomeThings(mapping: Map[Int, Int], key: Int): Int = mapping(key)

...

justDoingSomeThings(m, 1)
to make this throw:

java.util.NoSuchElementException: key not found: wut

(verbatim)

Here's a hint / secondary challenge. Same method definition, but different code in the blank and instead of the last line, this:

code:
assert(justDoingSomeThings(m, 1) == justDoingSomeThings(m, 1))
to make the assertion fail.

Sedro
Dec 31, 2008
Sure, just change the default value
code:
val m = Map[Int, Int]().withDefault(_ => throw new NoSuchElementException("key not found: wut"))
val m = Map[Int, Int]().withDefault(_ => Random.nextInt())

Steve French
Sep 8, 2003

Sedro posted:

Sure, just change the default value
code:
val m = Map[Int, Int]().withDefault(_ => throw new NoSuchElementException("key not found: wut"))
val m = Map[Int, Int]().withDefault(_ => Random.nextInt())

Hrm, I suppose the question was a bit underspecified, since clearly there are other ways than I had in mind to get that behavior.

This came up due to some surprising behavior of mapValues, which actually creates a view under the hood, even though its type signature doesn't really indicate that (e.g. you can't call force on its result).

code:
val m = Map(1 -> "wut").mapValues(Map("oh" -> 2))
val m = Map(1 -> "wut").mapValues(_ => scala.util.Random.nextInt)
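The view behavior is easy to see with a side effect; the mapping function runs again on every lookup:

```scala
// mapValues returns a view: nothing is materialized, so the function
// re-runs per access -- which is why a Random in there differs per lookup
var calls = 0
val m = Map(1 -> "wut").mapValues { v => calls += 1; v.toUpperCase }

m(1)
m(1)
// calls is now 2, not 1
```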

Sedro
Dec 31, 2008
That is odd. I guess I'd prefer it if mapValues were eager and Map.view could be used for laziness.


KICK BAMA KICK
Mar 2, 2009

Real dumb question but it's been impossible to Google. Looking at the official tutorials and in this example
code:
    override def equals(that: Any): Boolean =
      that.isInstanceOf[Date] && {
        val o = that.asInstanceOf[Date]
        o.day == day && o.month == month && o.year == year
      }
why are isInstanceOf and asInstanceOf called with square brackets instead of parentheses?

e: oh are those methods generics that actually take no arguments?
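e2: yeah, exactly; they take a type argument and no value arguments, like any generic method called with only a type parameter. Toy example (made-up helper):

```scala
// A method with only a type parameter list: called with [] and no ()
def nameOf[T](implicit tag: scala.reflect.ClassTag[T]): String =
  tag.runtimeClass.getSimpleName

val x: Any = "hi"
val isStr = x.isInstanceOf[String]      // true
val len   = x.asInstanceOf[String].length // 2
val n     = nameOf[String]
```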

KICK BAMA KICK fucked around with this message at 05:31 on Oct 10, 2015
