Clojure vs. Scala (programming-puzzler.blogspot.se)
129 points by iamtechaddict on Dec 24, 2013 | 132 comments


Scala guy here

Suits would surely be case objects rather than classes, or I'd be tempted to just use a Java enum (Scala could really do with an equivalent). With card ranks, either you want to be able to compare them - in which case integers are the correct representation - or they're just 13 "things", in which case again an enum-like representation is good; I can't imagine why you'd ever want to represent a card rank as int-or-string.

Case classes are java-serializable by default. Any number of other serializers (json, database...) exist that work seamlessly with case classes. I do think it would be nice to have an abstraction for something weaker than a class, something that was guaranteed to be inert data, but still typed.

And yeah, I might implement a deck as something that contains a sequence. But I could then derive typeclasses for sequence operations quite cheaply using lenses. (I do wish there was a nicer way of doing this kind of delegation though).
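To illustrate (the names here are my own, not from any library), a minimal sketch of a Deck that contains a sequence and forwards only the handful of operations a game actually needs:

```scala
import scala.util.Random

case class Card(rank: Int, suit: Int) // stand-in for a richer Card type

// A Deck that contains a Vector rather than being one, exposing a
// deliberately small surface instead of the full Seq API.
case class Deck(cards: Vector[Card]) {
  def draw: (Card, Deck)         = (cards.head, Deck(cards.tail))
  def take(n: Int): Vector[Card] = cards.take(n)
  def shuffled: Deck             = Deck(Random.shuffle(cards))
  def size: Int                  = cards.size
}
```

The typeclass/lens derivation mentioned above would mechanize exactly this kind of forwarding.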

Sure, Scala steers you towards making a type straitjacket for yourself; you can definitely still operate on stringly typed or maply typed data, as it sounds like Clojure encourages you to do all the time. And maybe it's not the best language for exploratory manipulation of semi-structured data. But when the time comes to write a production-ready system, the strong types that Scala gives you are a much faster way to achieve the level of reliability you need than the level of testing you need in a less typed language.

(And even if you don't need the safety, I find types are a great aid to figuring out WTF you were doing if you come back to the code in six months' time)


> I can't imagine why you'd ever want to represent a card rank as int-or-string.

I think this illustrates the author's point that languages structure our thinking. That the King is worth 13 and the Queen 12 is determined by the rules of the card game, not by the number of elements displayed on the face, as is the case for the non-face cards - e.g. the face cards are all worth 10 in blackjack.

Some languages better allow for the messy details of the world, and let the programmer treat the partial and occasional isomorphism between the names of cards and the values assigned to them in a particular context as uncomplected.

None of which is to say that strong static typing does not offer a particular set of advantages, only to suggest that it can become forced at a certain level of abstraction where real world objects are represented without a specific context for interpretation.

Imagine a program which begins by asking, "Which game do you want to play?" and includes a card guessing game, blackjack, Uno!, go fish, and crazy eights. We want the values of the cards to lie in their use within each game, not where we shuffle and deal and discard from hands.


> Imagine a program which begins by asking, "Which game do you want to play?" and includes a card guessing game, blackjack, Uno!, go fish, and crazy eights. We want the values of the cards to lie in their use within each game, not where we shuffle and deal and discard from hands.

I maintain that you gain nothing (except the possibility of errors) by representing the cards as int-or-string here. Think about how you'd actually use the cards in implementing those games. With my enum-like scala representation you could do anything you could do with the clojure version, and you'd get the advantage of e.g. compiler warnings if you defined incomplete match/case statements.


The author's point isn't that int-or-string is a good choice; it's that there are a lot of choices and none of them allows natural abstraction over a deck of cards - static typing pushes the abstraction down to a lower level, like ints, or into a mashup like string-or-int.

And this is the really important point: pushing the abstraction down to a lower level causes it to leak into the implementation of card games in ways that make the card games themselves harder to reason about.

Decks of cards have Kings, not 13's, and using 13 to represent the King is a leaky abstraction. It's a muddy road leading into the swamp.

When I implement blackjack I get to write little balls of mud:

    If card = 13 then 10
And in my card guessing game

    If card = 13 and guess = King then "Yes, the card is a King!"
All that for a warning when I don't have a card 7 - which is going to make Pinochle have to work around the abstraction, probably by bypassing the entire implementation when it chokes on two Jacks of Diamonds etc. in a deck.


I said the enum-style approach is what you want for something like that. "case KING | QUEEN | JACK | TEN => 10; case ValueCard(x) => x" is just as readable as anything you'd do in clojure, and the warning (not error - you can ignore it if your game really doesn't need to handle that card) if you forgot one is valuable.
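Spelled out as a minimal, self-contained sketch (the types and values here are illustrative):

```scala
sealed trait Rank
case object Jack  extends Rank
case object Queen extends Rank
case object King  extends Rank
case object Ace   extends Rank
case class ValueCard(n: Int) extends Rank

// Because Rank is sealed, deleting any case below produces a
// "match may not be exhaustive" warning at compile time.
def blackjackValue(rank: Rank): Int = rank match {
  case Jack | Queen | King => 10
  case Ace                 => 11 // or 1; hand logic decides
  case ValueCard(n)        => n
}
```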


The enum is a bad abstraction because when I play GoFish, the computer asks, "Do you have any 13's?" and when I ask, "Do you have any Jacks?", I am always told to GoFish. It's an abstraction that sometimes returns a meaningful value and other times doesn't. It is built to optimize exactly the cases that are easy for me to reason about when writing a program, and it returns something that only looks correct in the difficult cases, such as the Ace. So I wind up having to do all the heavy lifting myself anyway, but only after being misled into the wrong assumption that the abstraction was robust and well thought out.

There's no value to an irrelevant warning - and perhaps even negative value, if it encourages adding a wildcard last case just to stop it from interrupting my thoughts every time I build my program. Doing so opens the door to the very run-time errors it was supposed to prevent, or else creates the habit of ignoring compiler warnings in the very worst situation: when I think I know what I am doing.

If enums were good enough and I wanted to ignore compiler warnings, I could have used C.


> The enum is a bad abstraction because when I play GoFish, the computer asks, "Do you have any 13's?" and when I ask, "Do you have any Jacks?", I am always told to GoFish.

What? No it doesn't. Scala case objects don't have numerical values.


I'm also having an incredibly difficult time understanding why int-or-string cards are desirable. With enums you'd end up with multiple functions like heartsGameCardValue and euchreGameCardValue, but with int cards you'd need to write the same things anyway, just without a type system to catch errors. And if it were truly simpler to reason in terms of ints, you could write a cardIntegerRepresentation function, which would add very little length to the codebase.


> Scala gives you are a much faster way to achieve the level of reliability you need than the level of testing you need in a less typed language

I have serious doubts about this since practically everything in the Scala standard library is horribly broken (collections, views, everything in scala.io, scala.xml and scala.actors etc.). The number of bugs found by Paul Phillips alone is absolutely ludicrous.

As it stands now, both Haskell and Clojure have vastly superior libraries, build systems that actually work, and implementations that people actually understand and can modify.


How are collections broken? Views are deprecated/set to be deprecated, I don't really use scala.io, I find java io sufficient for my needs, scala.xml is being split out in 2.11 and macros will let people use a third party implementation, scala.actors is also deprecated, having been replaced by Akka.

Paul has definitely found a lot of "bugs", he has certainly contributed a huge amount to the cleanliness of the code base, and has been a big pusher for removing certain levels of unsoundness in the type system, but if you were looking for bug-finders + fixers, Simon Ochsenreither is probably a better bet.


Just a quick question - if views are set to be deprecated, what would be the way to do a (long) series of collection transformations without spawning multiple intermediate collections? So, essentially, the equivalent of .view, transformations, .force?


You use an iterator, which has its own caveats, or you simply accept that chaining collection transformations is going to be ridiculously inefficient.
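A quick sketch of the two options:

```scala
val xs = (1 to 1000000).toVector

// Strict chaining: each step builds a full intermediate Vector.
val strict = xs.map(_ * 2).filter(_ % 3 == 0).take(10)

// Iterator chaining: the steps are fused, elements are pulled on
// demand, and only the final call materialises a collection.
val fused = xs.iterator.map(_ * 2).filter(_ % 3 == 0).take(10).toVector
```

The iterator caveats alluded to above: iterators are single-pass and mutable, so the intermediate stages can't be safely shared or re-traversed.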


The scala collections library is the most powerful/flexible I've ever seen in a strongly typed language; it combines the safety of Haskell with the flexibility of Clojure. Given the amount of ongoing complaints I see around cabal I don't think it's fair to say it "actually works" (fwiw I'm very happily using maven to build my Scala projects, so I never really know what the sbt fuss is about). And the rate of features and other progress in recent scala releases clearly demonstrates there are plenty of people who understand the implementation well enough to modify it.


> it combines the safety of Haskell with the flexibility of Clojure

Foldable, Traversable, Monoid etc. are all far better abstractions than what Scala collections provide, and they don't lead to unmanageable inheritance hierarchies and ridiculous APIs. How many bugs have been caused by that? A few hundred maybe? Does Set equality work in the current release or is that broken again? Is it possible to do thread safe efficient merges in immutable collections yet?

The only thing Scala collections offers that Haskell doesn't is the CanBuildFrom travesty. At least in Clojure you can chain transformations together without generating tons of intermediate values. In Scala you're forced to choose between being ridiculously inefficient or using iterators.


Can you expand on the CanBuildFrom travesty?

(It seems like you've got interesting things to say, but you're being a bit strident. It bums me out that Scala discussions on HN take such a heated tone.)


CanBuildFrom is essentially a typeclass in Scala that is used for converting between collection types. Ignoring the coherency issues associated with using implicits for typeclasses, it makes the implementation of the collections library significantly more complicated and causes issues with type inference.

In the rare cases where you actually need to convert between collection types, it's usually trivial and more efficient to do it manually. Essentially they've added tons of complexity for negligible benefit. One of my main gripes with the Scala collections library is that it's overly complex and that this complexity has been the source of hundreds of bugs.

If there was any real benefit to the way it's currently done they wouldn't be hiding the true type signature for things like `List.map` in the official Scaladocs.
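For context, the standard pre-2.13 illustration of what CanBuildFrom buys: the result type of map depends on both the source collection and the mapped element type. (In 2.13 the mechanism changed, but the observable behaviour is the same.)

```scala
// Mapping a String to Chars can rebuild a String...
val upper = "abc".map(_.toUpper)   // "ABC": String

// ...but mapping to Ints cannot, so an IndexedSeq comes back instead.
val codes = "abc".map(_.toInt)     // IndexedSeq(97, 98, 99)
```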


You are either ignoring or simply unaware of the core use-case of CanBuildFrom. Why talk about things you don't understand?


The reasons other than the ones I've outlined are caused by implementation details and wouldn't exist if they chose the correct abstractions.


That's wrong.


You should try Haskell sometime; their version of collections is much better. Complaints about cabal are of the same nature as complaints about every package manager: "hey, I tried to do something that can't be done and it said it can't be done". Having used all three languages, I still use Haskell daily, I would use Clojure again if demanded without complaint, and I would never in a million years consider touching Scala again, ever.


The last time I looked chaining collection operations still required unnecessary boilerplate and was not even the default way to use collections. Did they fix that? As far as I know, they didn't.

Additionally, it seems that moving from lists to a different collection type still involves rewriting your whole code due to the insane idea of using different functions by default (fmap/map, mempty/[], mappend/++).

I just don't have to deal with any of these design mistakes in Scala. Having sane defaults and stuff that "just works" consistently are a huge benefit.


>The last time I looked chaining collection operations still required unnecessary boilerplate and was not even the default way to use collections. Did they fix that?

I can't even imagine what you are talking about, so I can't say if they "fixed it" or not.

>Additionally, it seems that moving from lists to a different collection type still involves rewriting your whole code due to the insane idea of using different functions by default

"I did something dumb and now I don't like it". So, don't do it? Why were you using map in the first place?

>I just don't have to deal with any of these design mistakes in Scala

I don't have to deal with them in haskell either. But I do have to deal with the hundreds of other mistakes scala made if I use it.


"build systems that actually work"

Maybe I'm feeding a troll here, but I've been building production software for over 4 years using SBT. It works.


Scala.xml and scala.actors are both deprecated. XML has been moved out of the language and scala.actors has been replaced with Akka actors (which are awesome, BTW).


They were (or are) being deprecated because they were broken beyond repair. That's pretty damning considering that all of the things I mentioned have been done properly in plenty of other languages. They aren't treading new ground here.

> Akka actors (which are awesome, BTW).

PartialFunction[Any, Unit]. Ridiculous.


They found a better solution so they switched to it?

That's bad?

As for the PartialFunction[Any, Unit] comment: well, yeah. That's the way actors work. They can be sent any message but may or may not respond. The company I work for has a very large Akka-based backend and that has honestly never been an issue for us. Moreover, it's necessary to facilitate hot-swapping (changing an actor's behavior dynamically at runtime). That is most certainly not a feature I would be willing to sacrifice. "context.become" is an incredibly powerful tool within Akka, especially in how it makes modeling a finite-state machine quite trivial in practice.


FWIW as a big scala fan I've given up on akka actors, until they manage to make typed ones that work (the current ones rely on an experimental reflection library that has threading bugs). The biggest advantage of Scala is type safety; if I wanted an unsafe language with actors I'd use Erlang.


Quite honestly, it's never a problem. In fact, I think it's a huge advantage. I simply would not be willing to give up the ability to change an actor's behavior dynamically.

We use the FSM helper [1] heavily in our backend at Conspire. For data-pipeline use cases, it's invaluable. Keeps our code concise, readable and testable. Even basic become/unbecome operations are big for us, I've written up one of their use cases on our blog [2].

[1] http://doc.akka.io/docs/akka/2.2.3/scala/fsm.html

[2] http://blog.goconspire.com/post/64901258135/akka-at-conspire...


So the Scala solution for this is: http://ideone.com/Vussjh

  object Cards {
    sealed trait Suit
    case object Spades extends Suit
    case object Clubs extends Suit
    case object Hearts extends Suit
    case object Diamonds extends Suit

    sealed trait Rank
    trait FaceCard extends Rank

    case object Jack  extends FaceCard
    case object Queen extends FaceCard
    case object King  extends FaceCard
    case object Ace   extends FaceCard

    case class ValueCard(n: Int) extends Rank

    case class Card(rank: Rank, suit: Suit)

    type Deck = IndexedSeq[Card]

    def numberToRank(n: Int): Rank = n match {
      case 1 => Ace
      case x if x <= 10 => ValueCard(x)
      case 11 => Jack
      case 12 => Queen
      case 13 => King
      case 14 => Ace
      case _ => throw new IllegalArgumentException
    }

    def numberToSuit(n: Int): Suit = n match {
      case 0 => Spades
      case 1 => Clubs
      case 2 => Hearts
      case 3 => Diamonds
    }

    def apply(): Deck = for(rank <- 1 to 13; suit <- 0 to 3) yield Card(numberToRank(rank), numberToSuit(suit))
  }
Now I have: normal operations bound to Deck (head, take, drop etc.) and the ability to pattern match on cards. Also strongly typed. This took about 5 minutes tops to write.

edit: I'd say the 'numberToRank' parts are probably code smells, but given that its Christmas Eve, I really can't be bothered to think of a better solution right now. Implementing ordering etc can be done on the traits very easily too, but that is game specific (for example are aces high or low, is a King ranked higher than a Jack etc).
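For reference, a quick usage sketch, assuming the Cards object exactly as defined above:

```scala
import Cards._

val deck: Deck = Cards()
deck.size          // 52
val hand = deck.take(5)

// Pattern matching on the strongly typed cards:
hand.head match {
  case Card(Ace, Spades)     => "ace of spades"
  case Card(ValueCard(n), _) => s"a $n"
  case Card(face, suit)      => s"$face of $suit"
}
```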


Some things are strongly typed and playing cards are among them. It is just that playing cards are strongly typed as playing cards, not integers or pairs of integers.

In blackjack:

    case 10 => Jack or Queen or King or 10
and

    case Ace => 1 or 11
and poker with "Dr Pepper"

    case 10 => ace or 2 or 3 or 4 or 5 or 6 or 7 or 8 or 9 or 10 or jack or queen or king
    case 2 => ace or 2 or 3 or 4 or 5 or 6 or 7 or 8 or 9 or 10 or jack or queen or king
    case 4 => ace or 2 or 3 or 4 or 5 or 6 or 7 or 8 or 9 or 10 or jack or queen or king
It is sometimes better if the data type provides abstractions over the messy details of the world rather than adding another one to it. And planning for the possibility of a Joker in the deck might make sense.


Heh, same time as my edit. Those methods should be private (i.e. they just exist for building the deck; it's the consumer's responsibility to shuffle / split into multiple decks etc., as well as assigning value to each card).


So what then does 13=king get me over king=king? And does forgetting to privatise implementation details suggest that the necessary language idioms make the data structure complex in a way that is orthogonal to the real-world complexity of the objects it represents? Wasn't avoiding that complexity the advantage suggested by the author? To put it another way, private and sealed and trait don't reflect characteristics of a deck of playing cards; they are overhead from the language. It is a case where a static type system, because it is biased toward ints and strings and objects, makes it more difficult to reason about our code.

Leaving jokers aside, how much of the Scala code makes sense if I want to play Pinochle, versus the data approach advanced by the author? Now I'll admit that Lisp has shaped the way I see the two approaches to representing cards, and this may be why I see implementing jokers and Pinochle as easy extensions of lists, and rather more complex under an object system with static type checking. But I will point out that this model got me thinking about card games rather than card data types pretty quickly, and card games are probably the reason we're draining the swamp.


Yep, valuation can be simply a function, but wrapping in a typeclass might work as well.
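A minimal sketch of that idea (all names hypothetical):

```scala
sealed trait Rank
case object King extends Rank
case object Queen extends Rank
case class ValueCard(n: Int) extends Rank

// Valuation as a typeclass-style trait: each game supplies its own
// instance, so Rank itself stays free of game-specific values.
trait Valuation { def value(r: Rank): Int }

object BlackjackValuation extends Valuation {
  def value(r: Rank): Int = r match {
    case King | Queen => 10
    case ValueCard(n) => n
  }
}

def handValue(hand: Seq[Rank], v: Valuation): Int = hand.map(v.value).sum
```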


Does that 'or' method actually exist in a library somewhere or was it just for illustration?


Just for illustration. But to the author of the article's point that programming languages structure our thinking, the idea that "or" must be either in the core language or part of some library illustrates the constraints imposed on expressing our ideas absent the ability to introduce new syntax via a lisp-style macro facility. In a sense, Clojure's light weight data structures can be seen as extending up into the language.


> I'd say the 'numberToRank' parts are probably code smells, but given that its Christmas Eve, I really can't be bothered to think of a better solution right now.

  object Cards {
    sealed trait Suit
    case object Spades extends Suit
    case object Clubs extends Suit
    case object Hearts extends Suit
    case object Diamonds extends Suit

    sealed trait Rank
    trait FaceCard extends Rank

    case object Jack  extends FaceCard
    case object Queen extends FaceCard
    case object King  extends FaceCard
    case object Ace   extends FaceCard

    case class ValueCard(n: Int) extends Rank

    case class Card(rank: Rank, suit: Suit)

    type Deck = IndexedSeq[Card]
    
    val faceCardRanks = IndexedSeq(Jack,Queen,King,Ace)
    val valueCardRanks = for (rank <- 2 to 10) yield ValueCard(rank)
    val ranks = IndexedSeq.concat(valueCardRanks,faceCardRanks)
    val suits = IndexedSeq(Spades,Clubs,Diamonds,Hearts)
    
    def apply(): Deck = for(rank <- ranks; suit <- suits) yield Card(rank, suit)
  }


I think people here are totally missing out on appreciating a very thoughtfully written post by focusing on the fact that he may be comparing programs written in his favorite language to your favorite language. He is not asking for Scala solutions to any problem - and he has not contrived a deceptive Scala solution to "prove" the superiority of his language of choice.

He's comparing the structure and design of programs between two great languages of similar capabilities (features) and philosophies. I think his point about lightweight data modeling is spot on - Clojure is a language that goes out of its way to make modeling your problems idiomatically painless and dramatically simpler than most other languages.

It's idiomatic in Clojure to use plain old maps to represent structures or entities in your program that you would otherwise represent with a class or equivalent construct in many Object Oriented languages. That isn't a code smell either - it's idiomatic and wonderful. Keywords (similar to Ruby symbols in both use and philosophy) are wonderful things that together with maps make enums as a language primitive irrelevant.


It's basically a kind of step back to the Lisp of the '60s and '70s. Unfortunately I don't think this approach scales well. For example, I prefer to look at an object in a debugger or inspector, and not at yet another map.


I'd contend that it's the one approach we know does scale well. The largest systems we've built, such as the web, are designed around bare data structures, rather than objects.


No, large heterogeneous systems like the web are built on serialization formats that different components within that same heterogeneous system can (and do) interpret as either bare data structures, objects, or just unstructured streams of text (or binary data).


Serialised data is still data - a sequence of bytes representing a JSON object is still semantically the same when it's decoded into a hash map. The information content remains the same.


I'm going to be honest - you are coming off as very contrarian for the sake of being contrarian. You seem to be consistently responding to my comments in this thread with the equivalent of "You're wrong, because I like to do things this way."

That isn't an argument against any point I've made. Please show me some code and give me some context that will make me rethink my assumptions and arguments if you feel the need to point out my flawed thinking.


No, plain maps are nothing more than the plain assoc lists (and similar structures) of old Lisp. We were modelling key/value data like that from the earliest Lisp. Sure, it's easy to use - up to a point. Lisp had an evolution; things changed. I've seen quite a bit of larger Lisp software over the years, and the systems not using explicit classes turned out to be harder to maintain.


Sounds like idiomatic Clojure has more in common with Python than it does with Java. I'm finding that to be true as I teach myself Clojure. My background is an ex-advanced-Java programmer who left that all behind and has built production systems in Python for the last ~5 years. I'm learning Clojure now because the Java ecosystem is important to me, but I simply refuse to use Java to interact with it :)

Lightweight data modeling is important and Java is truly terrible at this. I illustrated this in a gist about creating and iterating over a list and a map, and contrasting that to the equivalent Python: https://gist.github.com/amontalenti/8114359

The author says about Scala: "Due to its highly detailed static type system, Scala attracts the kind of programmers that like to carefully categorize and enumerate all the possible data structures."

It turns out, this describes expert Java programmers very well, too -- so it's no surprise that Scala is a very popular language with Java programmers. I'm finding that Clojure is the more attractive language if your sensitivities lean toward the "simplicity" and "dynamism" camp. I was reading some Scala code being used in production at Twitter and found this marvel: https://twitter.com/amontalenti/status/410977749629546496 -- you would simply never see anything like this in Clojure or Python codebases.

The point about multi-paradigm is interesting. It's very true that Clojure, unlike Python, does not support true multi-paradigm. Then again, Python does not support "true" functional programming. It's close, but no cigar, due to the lack of full-featured lambdas / blocks. If you have to pick one paradigm, functional is definitely the simpler and more essential one.

Illustration: imagine a variant of Python that forced all code to live in classes -- ick. But imagine a variant of Python without classes -- that's not so bad.

It's worth reading about Clojure's take on object-orientation-atop-functional using multimethods and hierarchies: http://clojure.org/multimethods


Great points all around.

I'll add one thing to your point - multimethods are a fantastic example of how Clojure allows you to structure programs in functional way that takes a programming feature (in this case polymorphic method dispatch based on the arguments at runtime) to the nth degree.

In Java and many other languages that have polymorphic method dispatch - the dispatch value is usually the type of the object for whom the method is being called on. Functional languages like Haskell take a much more powerful and expressive approach to this and extend it to matching the dispatch values for the called function on the type or pattern matched value of any argument.

Multimethods are essentially Clojure's take on this in a dynamic language. They allow the programmer to define a function that constructs the dispatch value however it wants - usually from the arguments given as input, in any order, on any condition.

To illustrate, I'll give an example I showed my wife yesterday. She's writing a roguelike game in ClojureScript at the moment that you can play in the browser - naturally she needed a way to represent the game world and the players position as they move around.

So, for illustration I'll omit the code for the world state, but she needed to bind event listeners for the keyboard to properly dispatch the proper movement logic. So the code she wrote looked like this:

https://gist.github.com/aamedina/8114944

She came to me with this code with the following problem: how did I think she should organize it? Clearly the handling logic is going to be more complex than a single line per direction (:up, :down, :left, :right) can handle; she has to account for terrain differences, buildings, monsters, what have you.

So I suggested using multimethods to accomplish this. This is what the multimethod solution I cooked up would look like in this case.

https://gist.github.com/aamedina/8114949

Now she has room in those multimethods to handle the potentially complex logic needed to make those arrow keys move the character properly. But another effect of this is that it hides the underlying state mutation from the implementation.

Now your move functions are pure and isolated from the coords atom, in addition to the fact that multimethods made the logic more organized, at the expense of a few more lines of code.

edit: moved code to gist


In my experience multimethods get you a long way when writing Clojure code. My projects use them often and add two or three macros. That is normally the only "overhead" I have to impose over regular functions.

That said, in the example you posted I don't think you are using all the power of the multimethods, because you did not replace the case statement. You went from a case statement to a multimethod + a case statement. In this case, why not let the multimethod dispatch do all the routing/case functionality for you? Just use (.-keyCode e) as your dispatch function and use the different values of KeyCodes.XXX for each implementation of the multimethod. And you can even add a :default implementation that leaves the coords untouched.

In my experience the natural use of multimethods arises when you can choose the dispatch function for an element based on either some piece of data from the element or a computed value derived from the element. But if you have to rely on case/cond/if statements, the value of the multimethod is lessened, as you still have to touch your dispatch function when your hierarchy of values changes.

edit:typos in the code


Very good point. This was an artifact of the way my wife originally structured the function (dispatching on the keywords :up, :down, :left, :right), and I just kept that.


The proximity of your handle to adriaanm's, Scala tech lead at Typesafe [1], made my first reading of your (good) posts very confusing.

[1] https://news.ycombinator.com/user?id=adriaanm


That's really funny. :)


A simple CASE statement would be better. It would replace complex machinery plus lots of character-level syntactic complication.

A typical case where a more complex construct does not help much.

Generic functions were introduced into Lisp, to replace a complex message sending mechanism with a more functional notation.


Did you miss the case statement both used in the original code and the dispatch function?


No, the original version was easier - though I would write it slightly different.


https://gist.github.com/aamedina/8115304

First version here: Uses only a case statement to dispatch the proper behavior, but as a pure function.

When she begins to take the state of the map and all of the other variables into account, this case statement will prove confusing to wade through - because it's one function that is doing four related but separate tasks. So version two is what you'd likely end up doing after the code got too unwieldy.

Oh wait... that's what a multimethod is! Now you've gone full circle. I hope you can see the point I was trying to make more generally, beyond the limited example I gave here.


That's because you're not considering the fact that the complexity of handling movement in each direction will increase -dramatically- as she continues to write out the game.

If the logic would forever stay a one liner, "increment the y coord!" when the up arrow key is pressed, I would agree with you - keep the case statement and go on your merry way. But that isn't the case, and my wife anticipated that being the case, hence the motivation for her question.

Spreading out complex logic into separate functions is usually considered good form. Like every design decision, you have to be wary not to overdo it - but I think my wife was spot on in identifying the need to do so in this case.

Putting the case statement aside for a second - you're also overlooking the fact that by using functions in this way the implementation becomes pure. The functions do not mutate anything and are completely oblivious to the fact that they are being used to swap! out the current position of the player for another.

Some benefits to writing pure functions include easier testing (you don't have to supply an atom to know if your function works, just call the function) and REPL use is vastly simplified. Additionally, I don't even need a browser or a keyboard handler to see if my movement functions work.


I can write it as multiple functions without the multimethods just as well.


If I understand that Scala code correctly, it's just enumerating a multiple-arity function manually. It's not that uncommon to see that in Clojure for performance reasons; clojure.core has plenty of examples, as does the Java-written compiler. It's a lot faster to enumerate the common arities than to use apply like:

    (defn f [& args]
      (apply g args))
And it usually just takes a couple of keystrokes in any decent text editor anyway.
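For illustration, the manual enumeration described above might look like this in Scala (a sketch; `Process` and the arithmetic are invented for this example, not taken from any real codebase):

```scala
object Process {
  // Common arities enumerated directly, avoiding varargs overhead on hot paths
  def apply(a: Int): Int = a
  def apply(a: Int, b: Int): Int = a + b
  def apply(a: Int, b: Int, c: Int): Int = a + b + c
  // Varargs fallback for the uncommon case, analogous to Clojure's (apply g args)
  def apply(a: Int, b: Int, c: Int, rest: Int*): Int = a + b + c + rest.sum
}
```

Calls like `Process(1, 2)` resolve to the exact-arity overload; only calls with more than three arguments pay for the varargs sequence.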


You can, however, use macros to generate the fixed-argument-count versions; I don't know if Scala is able to do that.


Macros are available but still experimental. They're already used in stable branches in a number of projects though, including the Play Framework.


It can, though the functionality is relatively recent.


> It turns out, this describes expert Java programmers very well, too -- so it's no surprise that Scala is a very popular language with Java programmers. I'm finding that Clojure is the more attractive language if your sensitivities lean toward the "simplicity" and "dynamism" camp. I was reading some Scala code being used in production at Twitter and found this marvel: https://twitter.com/amontalenti/status/410977749629546496 -- you would simply never see anything like this in Clojure or Python codebases.

Well done, you found some bad code written in Scala. I can assure you I've seen far worse in Python.


Out of curiosity, is that piece of code necessary? Is there no better way to condense that? I'm a beginner with Scala, so I have no clue on that front.


Well, that code exists at all for compatibility with tuples, which are a misfeature from the early days of scala. The "right" way to solve the problem is with shapeless' HLists. With HList you could write a simple generic method that would work with a HList of any length. (If you don't need to join a bunch of heterogeneously-typed futures while preserving all their type distinctions then you could just use the method that takes a Seq, above in the file)

(You could also generate exactly this code with macros, which were a new feature in scala 2.10; I imagine a newer version of the code will do that)


I'm surprised you can't map over tuple members and join. It should be a one-liner, though I'm a Scala newbie. Tuples were an unnecessary choice here.

Pointing to this kind of stuff as an example of scala shows a complete misunderstanding of the language quite honestly.


There are some similar ugly workarounds of that nature in Play's JSON macros. Quite ugly but also transparent to anyone not writing library code.


That is library code, not something you would write (or try to understand) normally.


[deleted]


That's part of Clojure's implementation in Java and the alternative would be to use reflection (obviously a terrible idea). There is similar code in the Scala implementation for dealing with functions, case classes and tuples. The only difference is that the Clojure implementation reflects that it's dynamically typed. I'd even bet there's similar code in CPython.


Actually that is Java, not Clojure.



Thanks for this. I also put together a Clojure example based on a starting point someone else shared on Github.

Clojure: https://gist.github.com/amontalenti/8117294


It is interesting to see small implementations like these - you can get a feel for a language in seconds.

https://gist.github.com/wting/77c9742fa1169179235f


I believe you could easily get the Scala version to be just as short, in terms of lines, as the Clojure version.


I actually had something similar a couple of weeks ago. My initial model was just:

  val cards = for {
    rank <- Seq('a, '2, [...], 'j, 'q, 'k)
    suit <- Seq('h, 'c, 'd, 's)
  } yield (rank, suit)
  val deck = scala.util.Random.shuffle(cards)
which gives you something to play with to validate your model. It's not an API I'd choose to publish, of course; there are many better type-safe ones written here, though I'm sad none of them have used Unicode for suits (I know someone working on a dating app who used a method called ❤ to compare two users: so something like "if (sappho ❤ rhodopis > 95) { ... }")

But my point is that Scala gives you the option, and is tolerant, of working in this slightly dirty way, and then lets you clean it up (and a type-safe compiler will flag up where you are using your old api). I don't know clojure well enough, so, what's the similar workflow there?


So some developers will love type-systems, and some will prefer to stay in Lisp-land forever. :-) The OP seems to be a lisp hacker at heart, and Clojure is the batteries-included Lisp du "jure". (sic)


The article starts by saying that Scala gives you lots of options while Clojure guides you to a certain style, but the Scala example takes the most complicated solution as if every programmer would solve it that way. He's contradicting himself.

Scala can solve it in a way quite similar to Clojure. The author is trying to hammer a point about Clojure which doesn't emerge from the example, as most Scala programmers can notice.

It would have been great if he had listed all the possible solutions and told us "Scala lets you solve it in all these very different ways without the language guiding you, Clojure guides you to the simpler solution". That's a fair point, and one Scala programmers will tend to agree with.


Agreed, any problem may be better solved by one or the other. What I notice, having never used Clojure, is that the Clojure devs may well finish the implementation while the Scala devs are still discussing the design. Of course, that level of flexibility will better suit some problems.

I read a high profile project's coding guidelines recently: "Scala is a very flexible language. Use restraint." Perl's TMTOWTDI is the reason I moved to Python. I've recently been using Go instead of Scala, partly again because of TMTOWTDI, whereas Go is supremely clean, but also, I think, because Scala is just too big for my needs, a skyscraper when I need a house.

This sounds a bit like sql vs nosql, rather than when should I use x vs y. I look forward to trying Clojure.


Have you considered F# (or Ocaml), or Haskell? I can see good reasons to move away from Scala, but I don't think you need to throw away all that wonderful functional typing goodness.


The example cited here is insulting to everyone's intelligence:

  The possibility of representing face cards with a name would likely never occur to you, because it would be too complicated to go through the effort of defining the type of a rank to be a "integer or a class comprised of four case classes -- jack,queen,king,ace".
I'm sure every competent Scala developer will see that the equivalent in Scala can be done with val King = 13; val Queen = 12; ..., which also means you get ordering for free as you're not mixing ints and keywords.

I do agree with the author's point that Clojure almost forces you to adopt a simpler design, but I feel that long-term maintainability requires a delicate balance of simplicity and structure that can be achieved with Scala, but takes more self-control with Clojure.


Using val King = 13; val Queen = 12; in Scala is not equivalent to using Clojure keywords, as their printed variant will be 13 and 12, not :king and :queen. Reading debugging output or logs would force you to manually convert those values.

Ordering for free is valuable, I guess, but it sort of depends on the situation. Sometimes face cards are worth 10, other times they are worth 11, 12, 13. If you use val King = 10;, then it suddenly is impossible to distinguish between face cards and tens.


Right, if you wanted to have an ordering around, you would use an enum. The enum's values would print as Ace, 2, 3, 4, ..., Jack, Queen, King, and then when you wanted to implement a particular game the game could define the mapping from rank -> set of possible values. You wouldn't want to map each rank to a single value, since in commonly played games aces are worth both 1 and 11 or both 1 and 14.
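A minimal sketch of that idea (the names `Rank`, `label`, and `blackjackValues` are mine, not from the thread, and most ranks are elided):

```scala
// Each rank is a distinct value that prints with its own label
sealed abstract class Rank(val label: String)
case object Ace   extends Rank("A")
case object Ten   extends Rank("10")
case object Jack  extends Rank("J")
case object Queen extends Rank("Q")
case object King  extends Rank("K")

// A game maps each rank to its *set* of possible values,
// so Ace can be worth 1 or 11 without baking that into Rank.
def blackjackValues(r: Rank): Set[Int] = r match {
  case Ace                       => Set(1, 11)
  case Ten | Jack | Queen | King => Set(10)
}
```

Because the trait is sealed, the compiler can warn when a game's mapping forgets a rank.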

If you didn't want the ordering (or the compiler warning when your blackjack implementation omits queens), you could use Scala's symbols, which are more or less the same as Clojure's keywords:

  scala> 'king
  res0: Symbol = 'king


Or use value classes and get the best of both worlds.


No, your first sentence is wrong. The following definition stores the ranks as unboxed integers, but prints them differently:

    case class Rank private (val rank : Int) extends AnyVal {
      override def toString() = {
        rank match {
          case 1 => "A" 
          case 11 => "J"
          case 12 => "Q"
          case 13 => "K"
          case _ => rank.toString()
        }
      }
    }   

    object Rank {
       val Ace = Rank(1)
       val Two = Rank(2)
       // ...
       val King = Rank(13)
    }


And you can create a new string interpolation to make card instances, like:

    card"Heart:Ace"
or

    card"$suite:Ace" if you want to reference a variable.
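For the curious, such an interpolator can be sketched as an extension of StringContext (the `Card` type and the colon-separated parsing rule are assumptions for illustration, not an established API):

```scala
object CardSyntax {
  case class Card(suit: String, rank: String)

  // Enables the card"..." literal syntax via an implicit extension of StringContext
  implicit class CardInterpolator(val sc: StringContext) {
    def card(args: Any*): Card = {
      // Build the interpolated string, then split "Suit:Rank"
      val Array(suit, rank) = sc.s(args: _*).split(":")
      Card(suit, rank)
    }
  }
}
```

With `import CardSyntax._` in scope, both `card"Heart:Ace"` and `card"$suite:Ace"` work, since the `s`-style interpolation runs before the split.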


Scala feels like the Perl of the modern era: there is more than one way to do it. There is value in that mode of thinking, and Perl is a better language than people give it credit for.

However, I think there's a reason Python and Ruby have overtaken Perl in the interpreted language space: Python values simpler solutions [1]. Ruby also values a certain type of beauty over Perl.

Clojure/Scala is a similar dichotomy. Scala has everything you might need, while Clojure is rooted in the spartan power lisp provides.

Also, I wonder: if they weren't both JVM languages, would we even be having this type of discussion? I mean, we aren't comparing Clojure to Cobra or D.

[1] See the Zen of Python http://www.python.org/dev/peps/pep-0020/


Back in my uni days, aka not that long ago, the scala (which I love)/perl (which I hate) comparison is what made me realise that language arguments are almost always pointless. Any choice you make in a certain class of language is probably going to be roughly equal in terms of ease of use, productivity, whatever; so you might as well just go with what you are comfortable with.


I also love scala and hate perl. I wonder how close the correlation actually is - do the same type of people really like both, or is this just something people who don't like scala say?


I am not as fluent in Scala as I am in Perl, but I do like both. Perhaps, I am an exception.


> so you might as well just go with what you are comfortable with

Very true. I was thinking of the ethos of Perl and Scala, and wasn't trying to compare the languages directly.


> Scala feels like the Perl of the modern era: there is more than one way to do it.

Scala is definitely a TMTOWTDI language.

> However, I think there's a reason Python and Ruby have overtaken Perl in the interpreted language space: Python values simpler solutions [1]. Ruby also values a certain type of beauty over Perl.

Ruby -- like Perl and Scala -- is also a TMTOWTDI language; it's probably the most important thing it inherited from Perl (the one Perlism it isn't shedding as it grows.)

So, "Ruby and Python have overtaken Perl" isn't an indicator of something wrong with the TMTOWTDI approach.


For me personally, if they weren't running in the JVM, we wouldn't be having this discussion.

The JVM is key for many, as operational stability trumps developer finger-candy. It's an easy sell to add a language when it's not really going to change how an entire part of the organization manages things.


  type Card = (Any, Any)
  type Deck = Seq[Card]
  val foo: Deck = Seq((4, 'clubs), ('king, 'diamonds))
It seems to work over here. I would have to be a bit out of my mind to actually want to write code this way, though.

As to the "Lightweight data modeling" pitch, there's a pretty large number of other languages that are better suited to this because they have TCO (when run on their main platforms, the analogues of the Sun JVM) and don't require you to debug apparently-endless stack traces in a different language when your completely untyped exploration/prototype inevitably explodes.


Looks like a slow news day.

A dynamically typed language is obviously easier to begin coding in, especially with a trivial example. The problems that statically typed language solve are usually found in larger code bases and in performance critical applications.


There is an orthogonal Clojure/Scala axis of interest for code bases of appreciable size: mutability.

Clojure programmers choose immutability by default. Scala, however, makes it just as easy to write "var" as it is to write "val", and just as easy to create a mutable collection as it is to create an immutable one.

In my experience, this is far more important for larger code bases than static typing. Luckily, Scala offers immutability as a feature, which is more than can be said about Java/C++/Go/etc.
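To make the "just as easy" point concrete, a plain illustration (not from the thread):

```scala
// Reassignment: one keyword of difference
val xs = List(1, 2, 3)   // immutable binding
var ys = List(1, 2, 3)   // mutable binding; the compiler won't complain
ys = 0 :: ys             // rebinding is allowed with var

// Collections: the mutable flavor is one import away
import scala.collection.mutable
val buf = mutable.ListBuffer(1, 2, 3)
buf += 4                 // in-place mutation

// The immutable counterpart returns a new list instead
val zs = xs :+ 4         // xs is untouched
```

Nothing but convention (and perhaps a linter) pushes a team toward the `val`/immutable column.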


Interesting! I suppose var and val are as easy to type, but in Scala my mental default is always val; var is a smell to me. Isn't that the case for most people writing in Scala?


This is true. But there are a lot of devs interested in combining the ease and speed of dynamic language syntax with the robust compilation of a static language...

This will be a popular topic for a while, I believe, because the JVM isn't going anywhere in the next decade (gut feeling).


> On the other hand, in Scala, you'd be more likely to create a card Class with a rank and suit field.

No, if I was trying to do something intended to be generic, I'd be tempted to start out with this definition of Card:

  trait Card
Scala makes it cheap to use descriptive type names while starting out making minimal assumptions. This helps avoid making too many assumptions up front, like the whole model-cards-as-ints one.

> For modeling the deck, you probably wouldn't say a Deck is-a sequence, because composition is favored over inheritance.

"X is favored over Y" does not mean "never use Y". And, in Scala, "favor composition over inheritance" most strongly applies when you are talking about classes, particularly classes as the thing being either composed or inherited from.

I'd probably say something like:

  trait Deck[C<:Card] extends Seq[C] {
    ...whatever special behavior Decks need to support in general...
  }


I find it interesting that there are about 50 comments in this thread discussing the best/better ways to model the problem in Scala, and no controversy at all about the Clojure implementation.

That's a feature of Clojure, IMO.


I am not sure I would agree that the deck example he gave represents the Scala idiom. I myself would probably start with something like

    object Cards {
      type Deck = Seq[Card]
      case class Card(r: Rank, s: Suite)
      type Rank = Int
      sealed trait Suite
      case object Spade extends Suite
      case object Heart extends Suite
      case object Diamond extends Suite
      case object Club extends Suite
    }

Granted, all this code is unnecessary in Clojure but there are two benefits of writing them:

1) compilation type check

2) for developers who have no context of playing cards, this provides a clearer picture, while in the Clojure code they will have to browse more code to get the full picture of the data domain.
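Under a model like the one sketched above (restated here so the snippet stands alone), building the full deck stays a one-liner:

```scala
// The parent comment's model, restated compactly
object Cards {
  type Deck = Seq[Card]
  type Rank = Int
  sealed trait Suite
  case object Spade   extends Suite
  case object Heart   extends Suite
  case object Diamond extends Suite
  case object Club    extends Suite
  case class Card(r: Rank, s: Suite)
}

import Cards._
// A 52-card deck as a for-comprehension over ranks and suits
val deck: Deck = for {
  r <- 1 to 13
  s <- Seq(Spade, Heart, Diamond, Club)
} yield Card(r, s)
```

So the type definitions cost a few lines up front, but the code that uses them is about as terse as the untyped version.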


I think that last part can be especially important.

I'm perfectly comfortable with both approaches, but as things get bigger, I value explicit structure more than convenience. When I'm dealing with implicit structure, I definitely notice the increased cognitive load.

That's fine for something small, short-lived, and personal. But if I'm doing something large, long-lived, and shared across many people, I think implicit structure gets more and more dangerous. It's hard to keep everybody's mental models aligned over time, which makes it easy to end up with a code base whose coherency declines.


After modeling Clojure's Card Deck as a simple sequence of maps, the author models Scala's card deck in an overly complex way using a Deck Class with a nested card class, and each card nesting four distinct case classes to represent suit (Yikes!). The author defends the terrible Scala design by writing:

> Sure, you could eschew objects in Scala and mimic Clojure by using generic maps/vectors/lists/sets for all your structured data needs, but that's clearly not how Scala is meant to be used.

Scala isn't "meant to be used" any way. Scala is a lots of things (often times to its detriment) but opinionated is not one. The author even alludes to this early in the post:

> Ten years ago, I would have said that my ideal dream language is one that provides the flexibility to program in any style. I want to be able to choose object-oriented or functional, immutable or mutable, high-level abstractions or low-level speed. With respect to this ideal, Scala clearly wins, supporting more programming styles than Clojure.

Classic fanboyism, as the author purposely created a shitty Scala example. I would advise that someone investigating functional languages on the JVM take this post with a grain of salt.


Not surprised by this comment. Scala has the most bitter community I've ever seen. Contrast that to the Clojure community


Why would you paint an entire programming community as bitter? That is extremely close minded.

The author clearly contradicted themselves. Did you read the article before downvoting? Also, why invoke a false dichotomy, as if being part of the Scala or Clojure community were mutually exclusive? For what it's worth, I have used and am a fan of both languages. That won't stop me from pointing out a flawed post, though.


Well, imagine if you could represent all your data as JSON, rather than a complex hierarchy of objects and methods, and the language was designed around making that kind of data super-easy to work with.

Then imagine that you could represent your code as JSON and also that a database existed that let you store and build queries on JSON directly (i.e., Datomic).


Come back to me when there's a standard for JSON schemas so my data can be actually strongly-typed + verified.


JSON was used in the original article as a placeholder for Clojure forms. You can (optionally) get strong typing using Clojure records[1] (see defrecord) or using core.typed[2]. Datomic also lets you specify types[3].

I'm not "coming back" to you, as I don't appreciate your snarky, know-it-all tone. I'm replying for the benefit of others reading who may not be familiar with Clojure.

[1] http://clojure.org/datatypes

[2] https://github.com/clojure/core.typed

[3] http://docs.datomic.com/schema.html


I think the parent is saying basically: Greenspun's eleventh rule

   Any sufficiently complex Lisp implementation contains 
   an ad hoc, informally-specified, bug-ridden, slow 
   implementation of half of Haskell's type system.


I actually think that Clojure's relatively unique feature which differentiates it from most languages (other than, perhaps, Erlang and Haskell) is its emphasis on managing state, especially in the face of concurrency.


Sort of. But I don't think most people are actually using the concurrency primitives in Clojure all that much. How many systems out there that people are writing make heavy, or even significant, use of concurrency primitives? I'd guess it's definitely the minority.

Immutability and programming with data structure literals are what Clojure's all about. Lisp, dynamic, and functional are obviously the other big choices that were made to achieve the overarching goal of the language - simplification. But imo, programming with performant, immutable persistent data structures is the essence of Clojure programming.


I agree that immutability is central to Clojure and the resulting code definitely reflects that. But managing state, as the OP put it, truly is what Clojure is all about - and immutability falls out of that very deliberate design decision.

Let's not mistake the scarcity of STM transactions with refs and their ilk for the totality of Clojure's concurrency story. Many Clojure libraries make use of atoms, agents, and dynamic scope (which is arguably a concurrency primitive, given the thread locality of the bindings, but not unique to Clojure).

While "concurrency is not parallelism", parallelism is a special case of concurrent programs that is often challenging. Clojure's offerings here too are compelling - we have the amazing Reducers library which allows you to write higher order functions for collections that, as Rich Hickey put it, "know how to reduce themselves" - and get parallelism (without locks, without even thinking about it really) on top of that.

And then there are the lazy parallel functions - pmap, pcalls, pvalues. Check them out if you haven't. I use Clojure often in a machine learning context with Hadoop/Storm; these abstractions are highly valuable in crafting solutions.

Futures/promises are also widely used in Clojure.

I'm confused as to why you think data structure literals are part of Clojure's core thesis. Perhaps I just don't understand what you mean - are you saying that the fact that the Lisp reader can read strings as data structures is part of what Clojure is about? If so, that statement would apply to all Lisps, not just Clojure.


>I'm confused as to why you think data structure literals are part of Clojure's core thesis.

I use refs, atoms, and agents very sparingly. You just don't need them very much when you have a suite of performant, immutable data structures (with literals). These data structures, and the assortment of polymorphic functions provided in the language to operate on them, imo are the crux of the language. They dominate the experience of the programmer. As far as managing state goes, they make it so that you rarely have to reach for the explicit constructs listed above.


Aren't immutability and persistent data structures means of managing state?

Note that pron said "its emphasis on managing state, especially in the face of concurrency", not "its emphasis on managing state in the face of concurrency". That is, concurrency is an important area where Clojure's state management really shines, but certainly not the only time that managing state is important. I would say managing state is a key aspect of any complex program, even in the absence of concurrency.

I think this actually fits nicely with TFA's assertion about "lightweight data modeling" because managing state and data modelling go hand in hand IMHO. The fact that it's lightweight and easy to do (data structure literals also help keep it easy) means that programmers will tend towards this in their solutions.

Actually, I believe Rich Hickey says in his talks that it's all about managing complexity and making solutions simple through a datacentric approach. I believe that this is, ultimately, what managing state and lightweight data modelling (and by extension Clojure) is all about, and it's this philosophy that draws me to Clojure.


Sure, we're in violent agreement. I took pron's statement as if he were referring to the explicit constructs in Clojure for managing state, i.e. refs, atoms, agents, vars. These are in fact about managing state, with or without concurrency in the mix. But in my experience, these represent a very small portion of Clojure programs. Whole libraries are built without using any of them.

I think Mark's right about "lightweight data modeling." It's central to Clojure programming.


We are definitely in agreement then :)


If object-oriented programmers (here represented by Scala-ites) are excessively fond of modelling the crap out of stuff, to the detriment of the code they're trying to write, there is also a problem with the opposite, "lightweight data modelling" approach: a weird fascination with representations of data, instead of the structure of it.

Here's an example from Land of Lisp:

    (defun limit-tree-depth (tree depth)
      (list (car tree) 
            (cadr tree) 
            (if (zerop depth)
                (lazy-nil)
                (lazy-mapcar (lambda (move)
                                (list (car move) 
                                      (limit-tree-depth (cadr move) (1- depth))))
                             (caddr tree)))))
and here's the version that I re-wrote, using your friend and mine, defstruct:

    (defun limit-tree-depth (tree depth)
      (labels ((limit-depth-move (move)
                 (let* ((next-tree (limit-tree-depth (get-tree move) (1- depth))))
                   (if (passing-move-p move)
                       (make-passing-move :tree next-tree)
                       (make-attacking-move :src (attacking-move-src move)
                                            :dst (attacking-move-dst move)
                                            :tree next-tree)))))
        (make-tree-node :player (tree-node-player tree)
                        :board (tree-node-board tree)
                        :moves (if (zerop depth)
                                   (lazy-nil)
                                   (lazy-mapcar #'limit-depth-move
                                                (tree-node-moves tree))))))
I submit that the second may be preferable, even if it is a bit more verbose.


agreed


> All or nearly all of the functional aspects of Clojure have counterparts in Scala. On top of that, Scala provides [...]

Scala doesn't provide the syntactic macros, or the same simpler approach to concurrency.


Scala macros are different. We deeply respect the Lisp tradition, in which we are rooted, but we also accommodate and empower our language's values - static types and rich syntax [1]. The end result is admittedly less flexible, but it also opens a number of interesting possibilities beyond those available in Lisp [2]. Our design is also not set in stone, and we're experimenting with different macro flavors [3], but it's too early to say something definitive about those.

[1] http://scalamacros.org/paperstalks/2013-09-19-PhilosophyOfSc...

[2] http://scalamacros.org/paperstalks/2013-12-02-WhatAreMacrosG...

[3] http://docs.scala-lang.org/overviews/macros/annotations.html


There seem to be 4 primary programming paradigms available on the JVM...

Clojure - concurrency (immutable by default), macros (lisp-style)

Scala - object/functional fusion, lazy evaluation

Ruby dialect (JRuby or Groovy) - "methodMissing" meta-object protocol

Java - low-level OOP (but functional lambdas and lazy streams coming in Java 8)

COMING SOON: Javascript (Rhino or Nashorn) - functional with prototype-style OOP


Clojure uses a lot of lazy evaluation in its collections library. Does that count?


Yes, I just listed the features of Clojure that weren't available in any other production-ready JVM language.


Why is Rhino coming soon?


That was "Javascript on the JVM is coming soon".

I don't think Rhino's being used much in production (correct me if I'm wrong), perhaps because of slow execution times, but the far racier Nashorn is likely to kick out any lingering use of languages used for JVM scripty stuff, e.g. Groovy, Xtend, Beanshell.


So it seems sometimes that being 'forced' to adhere to lightweight data structures and functional methods makes all the difference. I'd tend to agree with you, as I have noticed that it is hard to avoid OO and imperative techniques when I have that option.


Perhaps I'm missing something about Clojure, but it looks completely and utterly foreign. As someone who didn't have much trouble learning a "complex" language like Scala I find Clojure really hard to fathom.


Coming from a Java and Groovy background it took me a while to "unlearn" and I think this is what happens to you.

If you spend more than a few hours on Clojure (maybe you can create a small toy project) it makes much more sense.

I'm experiencing quite the opposite, I'm learning Scala now and it all feels too cumbersome, and even the syntax of Scala bothers me now.


Have you learned other Lisp-like languages before? Lisp exists truly in its own realm, syntax wise.

My experience so far is Logo when I was 6 and reading Godel, Escher, Bach, but I was thinking yesterday of investigating Clojure and that's my task for today. I'll check back.


> Lisp exists truly in its own realm, syntax wise.

It's fairly easy to put another syntax atop Clojure if you want, though in my experience the net effect is to restrict what Clojure can do, beginning with disallowing macros.


I'm looking forward to embracing its lispiness.


What parts do you find hard to understand?


It seems to me what he calls "lightweight data modeling" is a synonym for "untyped data". Use ints, strings, arrays and maps to model everything (his JSON example).

It's great for prototyping but when you start getting big, you really need to tighten these models down to real types.


Why vs? What a false choice, I love both of them.


There is no language I love more than Clojure but I would never describe it as "lightweight" (as I wait 5-10 seconds for lein repl to load).


Despite the "kumbaya, we're functional brothers...we all win", I know for a fact (very reliable sources) that the Typesafe guys absolutely hate the traction that Clojure is getting, especially since they need to get a ROI on their funding.



