HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Rails Conf 2012 Keynote: Simplicity Matters by Rich Hickey

Confreaks · Youtube · 134 HN points · 32 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Confreaks's video "Rails Conf 2012 Keynote: Simplicity Matters by Rich Hickey".
Youtube Summary
Rich Hickey, the author of Clojure and designer of Datomic, is a software developer with over 20 years of experience in various domains. Rich has worked on scheduling systems, broadcast automation, audio analysis and fingerprinting, database design, yield management, exit poll systems, and machine listening, in a variety of languages.
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Mar 03, 2022 · 1 point, 0 comments · submitted by ColinWright
Aug 26, 2020 · 1 point, 0 comments · submitted by jorgebucaran
Jul 23, 2020 · slifin on Nubank acquires Cognitect
If what you're comfortable with is Spring or Drupal or Rails and those work for your use case then more power to you

At the same time it's true there are products like Roam Research that are fundamentally harder to achieve in a cookie-cutter environment

This talk kind of alludes to why that is, but it's worth learning: there are many mature pieces of tech in this community that are astounding

If you need boring.lib then reach into Java/JavaScript with the easiest interop I've ever used
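To make the interop point concrete, here is a minimal sketch; the calls shown are standard JVM methods, not anything from a specific library:

```clojure
;; Instance method call: .method on the object
(.toUpperCase "interop")   ;; => "INTEROP"

;; Static method call: Class/method
(Math/abs -3)              ;; => 3

;; Constructors and imports work the same way
(import 'java.util.ArrayList)
(def xs (doto (ArrayList.) (.add 1) (.add 2)))
(.size xs)                 ;; => 2
```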

If you've spent years compiling your code and running it or changing it and pressing f5 or changing it and hoping hot code reload does the right thing it won't be apparent why a REPL in process would be helpful

If you've spent years editing your code character by character it won't be apparent why being able to use shortcuts or indentation to edit your code as forms would be a new editing default

I'd argue it's possible to teach everyone, but not motivate everyone

I've also seen a few smart people learn the mechanical parts of Clojure and then dismiss it, not understanding the hype, because they were just using it like a very imperative language that happened to be immutable

As a learner you should definitely go through its rationale and Rich Hickey's talks first before learning it:

Mar 28, 2020 · 2 points, 0 comments · submitted by s16h
Jan 16, 2020 · tosh on The Roots of Lisp (2001)
another angle:

while Clojure is a Lisp with its own distinct flavor it gives you …

* a large community of practitioners and professionals (great for asking questions, finding collaborators, …)

* easy access to libraries in the js/jvm/.net ecosystems

* a style that relies more on data and transformations of data (illuminating simplicity)

in any case: it is worth digging deeper, one thing that kept me away was not knowing where to start (analysis paralysis). in hindsight picking any Lisp would have been great (instead of postponing).

Find a thread and start pulling :)

Rich Hickey’s talks were a great entry point for me

e.g. the “Simplicity Matters” keynote at Rails Conf 2012

edit: Land of Lisp is a great book as well

Nov 05, 2019 · taffer on PostgREST
> much as you call property setters or getters in OOP languages

Good design is obvious and orthogonal[1]. If you write setters in an OOP language in such a way that they do surprising things, i.e. not just setting a value, then I would call that bad design.

> which is good for loose coupling, modularity, etc.

What do you gain by using triggers in this case? All you get is mental overhead, because whenever you use DML you have to keep in mind that there might be a trigger hiding somewhere that does strange things. If you call a procedure instead, you make it clear that you want to do more than just a simple update or insert.

> [...] it is possible for the method call to make use of concurrency and parallelism constructs [...] to do a unknown number of things in an unknown order

Why would I want this? I want my code simple[2], stupid and obvious, and not convoluted, clever and surprising[3].




Why do you think triggers must be astonishing (but OOP not so)?
There's nothing wrong with that if that's the logic you want!
Versioning: sorry I'm not sure what you mean. Same as if you had queries in an ORM and views using the results, you'd need to branch and make changes, then merge the branch together at once. I just do the same. If the database structure changes, I run the ALTER TABLE commands, when switching to that branch.

Other people : I guess like anyone choosing a language, you're excluding those who don't know it. I'm assuming more know SQL than Rust or Elixir or whatever. And more should.

Gain in practice : Two best improvements were:

(1) Having all data and data-logic (which might also be business-logic) in the same place. No worrying that some external code somewhere might have old data/business logic in it. One definitive source. Like Rich Hickey's classic "Simple/Complex Hard/Easy" talk, I like that this is simple, un-complex. The data and external code don't need to be braided/complected together to work.

(2) The freedom to switch my external code from Ruby to Go or Elixir or whatever, and not have to rewrite all that functionality. It's all just simple API calls.

Sorry I haven't looked into materialized views yet, so I don't know how this compares.

Thanks a lot for your reply!

It seems the main gain for you is to move between languages and allow initial iterations/bugfixes without touching the application code.

The database has more maintenance issues (like rolling txids) than the application code. I am not sure I want to add more complexity and potential issues to the database.

FYI, a materialized view is a potentially long or complex query whose results are cached, so you can say 'select * from complex_query_result' to get them, and refresh the complex_query_result whenever you feel like it. You can also update the query that generates complex_query_result.

In practice, MV can give you speed (as you can refresh the MV when you need/want, while keeping the results) and also put the data-logic inside the database (as the MV is defined initially, and can later be updated) if you don't need super fresh results. If you do, use a regular view.

In either case, you can use the view approach when parameters are needed, iff you can reduce your query to where parameter=something on the view. Otherwise, you need procedural languages like PL/pgSQL.

As the materialized view queries just return the results to be processed, and I have very little extra to do, your approach seems overkill for my use case.

Unless it changed very recently, be aware that you can't update the SQL definition of a materialized view as easily as you would update a view. There is currently no "CREATE OR REPLACE" option, so any dependency built onto a materialized view can quickly become a real pain (been there). Sometimes it might be easier to stick to the old trick of a table updated by a refresher function (possibly called as a trigger).
Personally I have update scripts for 'version control' of the MV and its dependencies; it is not very painful to do:

begin; drop ... cascade; create ...; commit;

Dependencies are a general problem, like changing the type of a table that has dependent views. It's a good idea if your database update scripts/migration software can handle something like this. Other databases don't tend to be as strict as Postgres here (I only found out that some rarely-accessed views never quite worked after migrating from Oracle)

I've had good luck with these functions:

Allows you to save-and-drop dependent views (materialized or regular) and then restore them after your updates.

Deeply agree about dependencies being a general problem.

Thanks for the script, that looks pretty clever #bookmarking. I particularly like the approach of "drop what you saved, no more no less". DROP CASCADE is simpler but can have undetected side effects; if this script fails to back up all dependencies, you will logically get an error when attempting to delete the target, and that rocks.

> When people talk about simplicity in software, they don’t necessarily refer to ease of use or number of lines of code, but instead it’s about how understandable a solution is given their shared knowledge.

Rich Hickey gave a great talk on the topic of "simple" vs "easy":

Aug 23, 2019 · cutler on Why Clojure?
Clojure is by far the best programming language I've ever used. Rich Hickey's Sermons On The Mount changed the game of programming once and for all. With Clojure you could finally have your Lisp cake and eat it. Witness the sheer chutzpah of the guy when he basically told Ruby devs they were doing it wrong at Rails Conf in 2012.
Oh man this talk... so incredibly good. The principle of simplicity vs easy is what drew me to Elixir, and what I'm starting to dislike about Rust.

In Erlang/Elixir, you have patterns, and interfaces which are fairly low level, but provide meaningful abstractions over common goals... I.e. we have OTP. OTP is fairly simple, and that doesn't mean it's easy, it's really not, but it can be simple once you're familiar with it.

Rust, on the other hand, is easy to get started with, the core of the language is fairly small (and relatively simple) but that's where the complexity starts to creep in.

It's so small, I have to constantly reach for a crate for common goals... There are tons, and they're fucking awesome, BUT they're easy, not simple, and my application's complexity grows exponentially relative to my familiarity each time I grab another crate.

I think it's easy ;) to say the Rust ecosystem has to contain the complexity it does, and making it easy is the best we can do, but with great challenges comes great reward... Batteries not included, yes, but at least you could know the size (spec) required for the ones you need? That's probably a shitty metaphor but I tried...


One more thought on this tirade, when Rich Hickey casually says that engineers should be working to remove complexity from the business... That hit me right in the feels. Too often do we get complex business decisions/goals and just accept them. We need to push back and help them simplify their goals... too often I've gotten some crazy request, where instead of implementing it, I pushed back to get to the root of the problem and ended up with something wildly simpler and better for both parties. When we push back, it shouldn't be to make something easy, it should be to make something simple. You can scale simplicity almost infinitely, but complexity will eventually come crashing down.

More on topic... Thanks for posting this talk, it and the article have inspired me, I'm gonna go try to write a bit of Clojure today. You should write some Elixir, or maybe LFE (Lisp Flavored Erlang), I think you'd quite like BEAM :)

FYI, Clojure can run on the BEAM:
> Too often do we get complex business decisions/goals and just accept them.

At my last job the client reps were notorious for saying “yes” to every client request and other devs were notorious for implementing those requests without question. I really tried hard to push back and find out what the client was trying to achieve. So many times after a brief discussion I would be able to inform them that what they wanted was already possible or doable with way less effort and expense. Sure, we didn’t get to bill them as much, but I’d much rather solve their pain points without adding code (especially when the code has already been written) than needlessly charging them.

> by far the best programming language I've ever used

Would love to hear why? What is it that makes Clojure such a good experience for you?

Dynamic and functional. Lazy evaluation. Real immutable data structures. Core.async. A modern lisp not constrained by the exclusive use of parens, ie. [] for vectors, {} for maps and #{} for sets, as with JS, Python and Ruby. Real REPL-driven development. Macros. Spacemacs with CIDER. Polymorphism which beats OOP at its own game. Massive Java ecosystem at your fingertips plus 25000 pure Clojure libraries and last, but by no means least - a rock 'n roll BDFL with a mullet.
Not original poster, but my take is:

- Immutable data-structures with concise literals for lists, vectors, maps, and sets. Having pure functions and immutable data-structures makes code easier to reason about, easier to test, and thread-safe.

(But clojure doesn't "force" you to be pure. The idea is that you write as much of your code in pure functions as you can, and push the IO and impure parts to the extremities. It's very pragmatic in this way, and you can get real work done.)
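A minimal sketch of that pure-core/impure-edge split; the order-handling names below are made up for illustration:

```clojure
;; Persistent data structures: "updating" returns a new value,
;; the original is left untouched.
(def base [1 2 3])
(def extended (conj base 4))
;; base     => [1 2 3]
;; extended => [1 2 3 4]

;; Pure core: a plain data-in, data-out function (hypothetical names)
(defn add-total [order]
  (assoc order :total (* (:price order) (:qty order))))

;; Impure edge: IO lives in a thin wrapper around the pure part
(defn save-order! [order]
  (println "saving" (add-total order)))
```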

- Simple syntax:

While the syntax may be intimidating at first and arithmetic looks weird to the untrained eye, once you realize that everything is a function (or a special form that also looks just like a function), the very light syntax and consistency feels amazing/refreshing.
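To make the uniformity concrete, a few forms side by side (a sketch; every one is the same (operator arguments...) shape):

```clojure
(+ 1 2 3)                  ;; function call            => 6
(if (pos? 1) "yes" "no")   ;; special form, same shape => "yes"
(map inc [1 2 3])          ;; higher-order call        => (2 3 4)
(defn greet [n]            ;; definition, still the same shape
  (str "hi " n))
(greet "there")            ;; => "hi there"
```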

- Macro system:

This is tied to the previous point. Since all your code is technically a "list", you have something akin to a pre-processor where the full library of Clojure functions is available to manipulate data. But in this pre-processing phase, your data is the code (the code is a list). Now you can dynamically re-arrange or re-write code. Look up homoiconicity and learn about the kinds of things you can do in macros that can't be done in other languages.

- Capabilities for general programming, abstractions, code re-use, etc.

- Performance is quite good. It's often nearly as fast as java, yet your codebase might be 10x smaller because the language is so expressive.

- Concise/expressiveness. Clojure comes with a nice built-in library with generic functions that you re-use again and again. Give a programmer a few dozen of these functions and it's amazing what can be composed to solve many problems succinctly and elegantly.
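As a small illustration of that composability (purely illustrative): sum of the squares of the odd numbers below 10, built from four core functions:

```clojure
(->> (range 10)     ;; (0 1 2 ... 9)
     (filter odd?)  ;; (1 3 5 7 9)
     (map #(* % %)) ;; (1 9 25 49 81)
     (reduce +))    ;; => 165
```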

- There are surely other benefits, but the last one I'll leave with is hard to explain unless you have felt it before. It's REPL driven development.

Clojure comes with some seriously awesome REPLs (i.e. a shell for interactive tinkering with the language). Other languages may have some form of REPL, but no other language in my experience has come close to the feel you get in a clojure REPL. I can best explain it as a freedom of very light-weight experimentation that you use to write your code. It's a playground for writing functions with very quick feedback to see if your code will work or not. One factor that makes the clojure repl experience so nice ties back again to the succinctness and expressiveness of the language. Typing commands in the repl is painless because it's concise, not a lot to type, and then the feedback is so instant.

Somehow when I write code in python, java, javascript, or other languages, I just don't use their REPL or shell as often. It's just not quite the same as the clojure repl experience.

> Macro system

Macros are actually my least favourite part of Clojure. Sure, its great to have them and there are some libraries that use them to excellent effect (instaparse, Hugsql, etc), but most of the time, I prefer tools that don't use macros. You can see by comparing libraries that were made in the earlier days of Clojure versus more recent ones: the earlier ones love to use macros while the newer ones prefer functions and data. The data-first libraries are, in my opinion, easier to test and easier to build complex things on top of. If I look at a readme and see that the expected way to interact with a library is a macro, I usually look for a data-first library instead and only use the macro one if I can't find an alternative.

Macros are technically cool, you can do some interesting things with them, but I find that in most cases where they're used, they are inferior to alternatives. Its technically very cool that core.async could be implemented as macros, but I feel that core.async greatly suffers from it versus being built into the runtime: you cannot use many operations in functions called by core.async/go because the macro can't see inside function calls. Also, I've had exceptions thrown by core.async where the stack trace did not mention any of my code. That was not fun to debug.

Don't get me wrong, macros do have their place: hugsql uses macros to parse the SQL file and generate functions for you, which is awesome! But for every great macro-based library, there are many more that I wish didn't use macros.

(This is user-facing macros in a libraries' API. I have no problem with using macros internally)

I do love Clojure overall, though.

I agree!

When I meet a girl, I ask her about macros and functions first.

I expect to hear something like: macro < function < data.

I do agree that you should prefer functions over macros. The rule of thumb I’ve heard is only use macros if they are absolutely necessary.

But having that option is better than not having the option at all IMHO.

Absolutely, I even mentioned some cases where I felt they were used to great effect. Having said that, though, in my experience, they do cause a “just build it with macros” attitude. For example, the issues I mentioned with core.async will likely never get fixed, as doing so would require compiler/runtime support, which is highly unlikely to happen.
The above poster's point on homoiconicity is a big one; here is a quick (and silly) example for anyone unfamiliar with the term.

Take the following clojure:

    (+ 1 2)
     => 3
Here the ( ) delimits a list, and as the blog post says, most lists are function calls. In this case the function is + and its parameters are 1 and 2.

It means if we do this

     (1 + 2)
We get an exception that 1 isn't a function...

However, we can tell Clojure to treat this code as data by quoting the list using the '

    '(1 + 2)
    => (1 + 2)
This is now a list, where the first item is the number 1, the second is the symbol +, and the last item is the number 2.

What if we write a function to swap the first two items in a list?

    (defn swap [x]
       (list (second x) (first x) (last x)))

    (swap '(1 + 2))
     => (+ 1 2)
We have to quote the parameter (1 + 2) as clojure evaluates arguments to functions (mostly..), by quoting it we are saying don't evaluate, instead treat it as data.

So you can see

     (+ 1 2)
Looks like Clojure code, even though it's a list.

We can eval it:

    (eval (swap '(1 + 2)))
    => 3
It's inconvenient to have to remember to quote the params and call eval.

Up step macros: they are expanded at compile time and don't evaluate their arguments. So we can rewrite swap as a macro

    (defmacro swap [x]
       (list (second x) (first x) (last x)))
Now we can call

    (swap (1 + 2))
    => 3
This lets us essentially extend the compiler and create DSLs specific to your domain problem. Creating new language constructs is incredibly easy.
Why not just do that as a function though? Why use a macro?
It was an admittedly simple example that was to show how easy it is in Clojure to treat code as data and data as code.

The main point of macros is when you use a regular function in Clojure you have applicative order evaluation. Macros do not, as macros are designed to transform and generate code.

A better example would have been the (when) macro.

In Clojure you have (cond) and (if) for conditional evaluations. If has the form

    (if test
      then-expr
      else-expr)

    (if (= 1 1)
      (println "1 = 1")
      (println "uh-oh the unviverse is broken!"))
    => "1 = 1"
Now what if we want to do more than one statement on the true path and we don't care about the false path?

    (if (= 1 1)
      (println "1 = 1")
      (println "also hello"))
This won't work, as now "also hello" is the false path. We can use (do) to specify multiple things to be done.

     (if (= 1 1)
       (do
         (println "1=1")
         (println "Also hello")))

      1=1
      Also hello
But writing (do) is a bit of a pain, so an alternative would be (when)

    (when (= 1 1)
       (println "1=1")
       (println "Also hello"))
Will print, when run:

    1=1
    Also hello
Could we implement this as a function? Sure. But we would have to quote the function calls when we pass them so they aren't evaluated, and then eval them if true. You could do it, but it would be messy.

Essentially, we want to write

    (when (= 1 1)
       (println "1=1")
       (println "Also hello"))
But want the code that gets compiled to be:

     (if (= 1 1)
       (do
         (println "1=1")
         (println "Also hello")))
So we want to extend the language so we can write (when) and the compiler will write us an (if (do))

This is incredibly easy in Clojure. We just need to create a list where the first item in the list is `if`, the second item is the conditional we pass to the macro, the 3rd item is a sublist, of which its first item is `do` followed by the list / functions to execute when our test evaluates to true!

    (defmacro when [test & body]
        (list 'if test (cons 'do body)))

This is just the source code for the actual clojure core when macro. But you can see how easy this is!

If we run the macro expansion on when we can see the code that it generates:

      (macroexpand '(when (= 1 1)
         (println "1=1")
         (println "Also hello")))

       => (if (= 1 1) (do (println "1=1") (println "Also hello")))

Another example would be the threading macro:

Say we have the nested function calls:

    (reduce + (filter even? (range 1 11)))
    => 30

After you have a lot of nesting this can get difficult to read, so you have the macro ->>, the threading macro. This takes a series of forms and threads the result of each one in as the last argument of the next.

     (->> (range 1 11)
          (filter even?)
          (reduce +))
The source code for this is almost as simple as (when)

     (defmacro ->>
       [x & forms]
       (loop [x x, forms forms]
         (if forms
           (let [form (first forms)
                 threaded (if (seq? form)
                            (with-meta `(~(first form) ~@(next form) ~x) (meta form))
                            (list form x))]
             (recur threaded (next forms)))
           x)))
The ` ~ and ~@ are just doing some quoting and quote splicing to determine when we want stuff evaluated.
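Those three reader characters can be tried in isolation (a quick sketch; `x` and `xs` are throwaway names):

```clojure
(def x 3)

;; ` (syntax-quote) keeps the form as data, fully namespace-qualified
`(+ 1 2)          ;; => (clojure.core/+ 1 2)

;; ~ (unquote) evaluates just that spot inside the quoted form
`(+ 1 ~x)         ;; => (clojure.core/+ 1 3)

;; ~@ (unquote-splicing) splices a sequence's elements in
(def xs [1 2 3])
`(max ~@xs)       ;; => (clojure.core/max 1 2 3)
```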

Basically, using macros you can do things like control symbol resolution time, extend the compiler to create a DSL specific to your domain, and reduce boilerplate code.

That's before you start getting into properly weird stuff like anaphoric macros.

Thank you! I think I've seen the "if" and "when" examples before, but it clicked better this time.

I've read a few Lisp books and dozens of internet blog posts, so I know about macros and why people use them without getting the full understanding which comes with actually writing code.

Because function arguments are evaluated before the function itself. In this case, that evaluation would fail because numbers do not implement IFn, and hence cannot be called.

Macro arguments are not evaluated, and so this works.

You could make it a function, and pass the arguments quoted, but it'd be more cumbersome.

I have just tried it in REPL and got this cryptic error:

    cljs.user=> (defmacro swap [x] (list (second x) (first x) (last x)))
    cljs.user=> (swap (1 + 2))
    Execution error (Error) at (<cljs repl>:1). is not a function
You couldn't write it as a function unless you pass the 1 + 2 part to an outer function as either a list of arguments (1, math.add, 2) or a string (if you have an eval fn).

At that point you're emulating Lisp without the elegance, and the first approach is only possible because functions are first-class objects. If you were trying to rearrange an expression that had control flow or keywords in it (e.g. modify the behaviour of an if), then you would have to reify the "if" (change the original code so it had a class to represent the if rather than use the keyword). So you're kind of reinventing Lisp by wrapping every part of your language as an object.

I'm apparently too stupid for you :). Please excuse my ignorance. Why can't you do something like the below?

(defun swap (x y) (y x))

Then you can call it like:

(swap (2 3)) => (3 2)

In Clojure, if the evaluator ever sees a list like `(2 3)` it evaluates it as a function call with the first item as the function, so you'd need `'(2 3)` or `(list 2 3)` to generate a list literal (or more often `[2 3]`, a vector, in Clojure).

You could do `(defn swap [the-list] ((second the-list) (first the-list)))` (which would invoke the second item as a function on the first item). It comes down to is the list a literal list as a parameter to something or is the first item a function to be invoked.

In the example we wanted to be able to call (1 + 2) and have it evaluate to 3.

   (defn swap [x]
      (list (second x) (first x) (last x)))

   (swap (1 + 2))
   =>Exception! 1 isn't a function.
Clojure tried to evaluate its argument to swap, and the argument was (1 + 2), which is a function call where the function is 1 and the arguments are + and 2.

So we quoted it in the function call by putting ' in front of the list '(1 + 2):

    (swap '(1 + 2))
    => (+ 1 2)
Here, we still didn't get 3 as our output... We got (+ 1 2), which is a list. Because the function returned a list, it didn't return code! It might look like code, but it's not code! It's a list.

So if I was to

   (+ (swap '(1 + 2)) (swap '(3 + 4)))
   => Crashes! Can't cast a list to a number.
Because what it actually runs is

   (+ '(+ 1 2) '(+ 3 4))
Whereas with the macro

   (+ (swap (1 + 2)) (swap (3 + 4)))
   => 10
Works because the macro gets expanded at compile time, and our swap call gets replaced with the code the macro generates!

    (swap (1 + 2))
actually compiles as:

    (+ 1 2)
So at runtime, that will be 3.
Others have answered already, but I remember this being confusing for me. And sometimes it still is. I think the crux of it comes to two things. The first is that the text (f a b c) is evaluated as a function call with function 'f' receiving arguments a b and c. The second thing is that arguments are evaluated before they are passed into a function. So a b and c are evaluated first, and then given to 'f'. This is done recursively.

So (+ 1 2) first evaluates 1 and 2, which each evaluate to themselves and thus can't be simplified any further. They are then passed into '+'.

in (+ 1 (* 2 3)) the + is a function and the arguments are 1 and (* 2 3). 1 evaluates to itself. Evaluating (* 2 3) starts by evaluating 2 and 3, then applying them to '*' resulting in 6. So if we displayed an intermediate step, our form would now look like (+ 1 6), finally evaluating to 7.

To prevent the form (f a b c) from being treated as a function call, you can "quote" it by prefixing with a single quote (aka tick): '(f a b c). The evaluator now knows we intend that to just be data and not to evaluate it.

The exception to evaluating (f a b c) as a function call is if 'f' is a macro. In that case, the arguments a b and c are not evaluated and instead are passed into the macro as data.

Where this comes into play is if you want to programmatically change the code around before it's executed, and particularly where the arguments have side effects when they are evaluated (changing the state of something, like adding data to a database).
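The two evaluation rules above can be seen side by side (toy names, nothing from a real library):

```clojure
;; A function's arguments are evaluated before the body runs,
;; even if the body never uses them:
(defn ignore-arg [_] :ok)
(ignore-arg (println "this prints first"))  ;; prints, then => :ok

;; A macro receives the raw form, unevaluated, as data:
(defmacro as-data [form]
  (list 'quote form))
(as-data (println "never runs"))  ;; => (println "never runs")
```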

Is there a way to simplify the `swap` function? E.g. in JavaScript it is just

    swap = (x, y) => [y, x]
Not the grandparent, but I realized the other day that I'm at nine years of clojure, so...

What's made clojure so great, imo, is its unicorn status as a principled-yet-practical language. That "principled" part is not worthless---it means that a lot of great minds are drawn to it. Before react took over the world, clojure folks were already taking steps in that direction.

A lot of other things. The "sequence" as a core abstraction is very powerful. Immutable data structures by default make functional programming performant and efficient.

Speaking of data structures, data structure literals have spoiled me for other languages. After Java, especially.

And none of this mentions clojure's lispiness. The ease of metaprogramming has allowed the community to build some of the best tooling out there, between CIDER for emacs, and Figwheel for the browser---oh, did I forget to mention clojurescript? Being able to reuse code on the front and backend is great for web applications.

Clojure isn't "everything." I think I'd still benefit from learning Haskell, APL, and Forth, and I wouldn't mind knowing Ruby and js a bit better. And I'll probably be dragged back to Python and R if I keep doing maths.

But if someone asked me which single language would probably do the most for them professionally, I'd say Clojure. It's the language people start startups just so they can use it.

What is GUI programming like in Clojure? What libraries exist, and what paradigms are used? E.g. is it more like React or is it more like Gtk/Qt?
Seesaw/Swing are the old stable solution. The latest library is cljfx, built on JavaFX. The developer is very responsive and open about its development

Check out the examples/mini-tutorial:

It's all React-like. You have a state atom and a GUI map data structure (which I think is equivalent to your DOM in React..) and then you hook up events that update the state and blah blah. It's all very clean and easy to read/use. The underlying JavaFX is also great for an OO GUI library so it's not gross to dive into if you need it

For frontend Reagent (React but with just Clojure functions and data structures. No classes, no jsx, just one language and immutable data.), and re-frame which does state management and event dispatching better than the competition. Redux is a poor poor imitation.
For web frontend (ClojureScript) there's reagent and re-frame, the Clojure versions of React and Redux.
There is a swing example on the getting started page (or there was a few years ago). It was a one liner to have a Swing pop up say "hello world" or something like that. I'm not sure how easy it is to write actual apps that way.
There's a really good UI framework for Clojure (not ClojureScript) called seesaw. If I recall correctly it wraps Swing, but it's a very nice data-focused, functional library. It really works well with the Clojure philosophy and interactive development. It's also VERY complete.

There are downsides, however. The UIs that it creates definitely look like Swing apps, i.e. ugly, at least on Linux. Also, the up-to-date documentation is difficult to find; I'll link to it here when I get home. The GitHub-linked docs work fine, but they do miss a few features.

I wish the last bit was true. Clojure jobs are very thin on the ground even here in startup London.
Ever play with Erlang or Elixir? If yes, I am curious what your thoughts are on them.
Yes, Elixir would be my second choice but although it has Lisp influences it doesn't capture the real magic of Lisp for me. Code as data belongs exclusively to real Lisps. Elixir also lacks vectors of its own and tends to be too niche compared with Clojure.
I've been doing a lot of Elixir recently, and I have a fair amount of experience with Scheme and Clojure. The biggest "aha" moment for me with Elixir came when I realized that everything, just like in a LISP, was an expression.

Then, when I learned about Elixir's macro system, I thought, "wait a minute, this is LISP! This is just a LISP in Ruby's clothing in Erlang's pasture!"

That was a delight. :) My heart still belongs to the LISP family, but right now I'm working as an Elixir developer.

I agree. In 2014.

It's 2019. Nobody gives a shit.

It's a lot of fun but as projects got larger and larger for me (thousands of lines, or even tens of thousands), I found the dynamic typing taking up more and more of my time. I've since moved on to statically typed systems where the compiler takes a big load off the cognitive requirements of maintaining and debugging software.
This has been my experience as well. I have a couple of Clojure(Script) applications that are approximately 3-5K lines each, and those have been a pleasure to work on. However, my latest project is now pushing past 20K lines and the mental load has jumped exponentially. There is a much greater need for spec, asserts, type hints, and comments just to keep everything straight.
That's interesting. Do you have any idea why the mental load jumped? Would a static analysis tool working with your type hints help?

Or are there many things to consider at once in the system, instead of many individual things?

The biggest issue I keep running into is just the concept of data "shape". I love that Clojure gives you so much freedom, but it can be quite the footgun in a large system: you see that a function expects a map with keys :foo, :bar, and :baz, but what are the values for those keys? Spec helps a little bit here for primitive values, but for complex nested structures (e.g. {:a [{:b [1 2]} {:c "bar"}]}), it doesn't do much. So, as data moves through the system, and as the system grows, it has become increasingly difficult to track the mutations to the underlying structures.

I do think that a static analysis tool would be of some help. Some sort of tooling to better handle tree structures would be very handy. I often find myself being off-by-one level with get-in calls on tree data (E.g. (get-in m [:a :b :d]) where m is {:a {:b {:c {:d 1}}}}), which is annoying because the NPE gets thrown 3 function calls up the stack.
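The off-by-one pitfall described above can be sketched like this (hypothetical data, purely to illustrate):

```clojure
;; Hypothetical nested map, as in the example above.
(def m {:a {:b {:c {:d 1}}}})

(get-in m [:a :b :c :d]) ;=> 1
(get-in m [:a :b :d])    ;=> nil; the mistake only surfaces later, e.g. as an NPE upstream

;; One mitigation: get-in's default-value arity makes the miss
;; explicit at the call site instead of propagating nil.
(get-in m [:a :b :d] ::missing) ;=> ::missing
```

The sentinel default is just one coping strategy; it localizes the failure but doesn't solve the underlying "shape" problem the comment describes.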

Totally not saying "you're holding it wrong", but maybe once you're more than a couple levels deep into a nested map it's time to look at an in-memory db like [Datascript]( Actual Datalog queries can replace get-in vectors growing out of hand, and you also get better mutations with transactions.
I'm late to the thread, but I've felt the same pain and wrote a library to help deal with the cognitive load of working on "shapes" of data [1].


How long have you been programming professionally?

Did you have prior experience with dynamic languages? If so, how much?

Did you have prior experience with statically typed languages? If so, how much?

I'm trying to see if Clojure requires more fundamental programming intuition and experience to sustain in large projects. I work with tens of thousands of lines of Clojure, don't feel these issues, can't relate, and love Clojure, so I'm curious to understand what context it best applies to. I wouldn't want to force it on a team that wouldn't benefit from it, so I'd like to learn these aspects well enough to recognize in which contexts it makes sense for me to influence a team to adopt it, and in which it doesn't.


12ish years (more as student & hobbyist); lots (mainly Python, JavaScript and Clojure); lots (mainly Java and C++, sadly I only got to tinker with better type systems). I consider myself a Clojure developer and have been using it on and off since summer '09. I'm using it for a large project now.

And yet... I agree with GP. I love Clojure, but over time I'm becoming less and less sold on dynamic typing. Spec helps, a little. Property-based generative tests (especially when used with spec) help a little too. Neither is a replacement for proper static types, though, especially an ML-esque type system with type inference. Bonus points if spec validation could pass type data to the compiler/type inference (e.g. the code path after validation can assume that the data is of the types described in the spec).
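For reference, the spec-plus-generative-tests combination mentioned here looks roughly like this (function and spec names are made up; generative checking also needs org.clojure/test.check on the classpath):

```clojure
(require '[clojure.spec.alpha :as s]
         '[clojure.spec.test.alpha :as stest])

;; A data spec and a function spec tying arguments and return together.
(s/def ::qty nat-int?)
(s/fdef add-qty
  :args (s/cat :a ::qty :b ::qty)
  :ret  ::qty)

(defn add-qty [a b] (+ a b))

;; Generative check: runs add-qty against many generated conforming inputs
;; and verifies the :ret spec holds. All checks happen at runtime, not compile time.
(stest/check `add-qty)
```

This illustrates the limitation being discussed: the checks are runtime/test-time only, which is exactly why the commenter wants the information fed into a compiler instead.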

I dream of a statically typed Clojure with type inference, spec-inferred types and optional dynamic typing for REPL experimentation and glue code.

Very interesting, and thanks for answering.

Could I ask what kind of project it is? I wonder if I'm just lucky that Clojure fits my use case perfectly, which is mostly a set of distributed systems of all kinds. So while as a whole there are tens of thousands of LOC, the components have very strong boundaries, as it's a set of services assembled together through RPC, pub/subs and DBs. Maybe that alleviates the lack of a static type checker.

We adopted Clojure about 3 years ago, team of 10. We've had a few people leave and join throughout. Only one person knew Clojure beforehand. Our stack is about 50% Clojure, 40% Java and 10% Scala. Of all three, Clojure has given us the fewest issues, has been pretty easy to maintain and generally has fewer defects. Java tends to have the most bugs, almost always related to some shared state. Scala I find the hardest to extend and maintain, but the code base we have for it I think does Scala wrong; it's like the worst mix of OOP and FP.

> I dream of a statically typed Clojure with type inference, spec-inferred types and optional dynamic typing for REPL experimentation and glue code

That's pretty much exactly core.typed:

That said, the project never managed to get more contributors.

If you're looking for a typed Lisp, I've been keeping my eyes out on Carp:

I’ve written a lot of different Clojure projects over the years. Some large, some small. Back in 2013–2015 I ran a startup entirely on Clojure(script). My current project is an automation service for cryptocurrency trading bots (that is, all the infrastructure, automation, configuration, dashboard etc for running a bot, but the bot strategy and signals are up to the user — hence automation tool, not bot). It’s a large system with multiple services (some running in different datacenters too). Backend is Clojure (based on duct) and frontend is Clojurescript with re-frame.

Don’t get me wrong, if I restarted the project again from scratch, I’d choose the same setup (or a very similar one). I love Clojure and am very productive in it, but that doesn’t mean I don’t think it could be better still.

> That's pretty much exactly core.typed

I haven't looked at it in a couple of years, maybe it's changed, but when I did, it didn't really do it for me. It's still a separate tool that lives apart from Clojure itself, and it felt very "heavy". In my personal opinion and experience, having certain things as separate entities (decomplected, as Rich would say) isn't always a good thing and can lead to an inferior result. I've played around with many programming languages in my time (I'm a bit of an enthusiast, I guess. I like trying out languages that are very different from what I already know; that's how I originally got into Clojure) and it seems like a common theme: the closer a feature is to the compiler/runtime, the better it works and the more seamless it is overall. Another Clojure example is the limitations core.async has because it's an external library: things like <! cannot be placed inside functions, since the go macro can't see into function calls to transform them (macros cannot look inside function calls). I've also encountered an exception recently where the stacktrace only showed core.async and clojure.core code; not a single stack frame referenced MY source files. These problems are hard to solve as an external library.

> If you’re looking for a typed Lisp

I’m not. I like Clojure’s particular mix of sensible syntax, immutability, sequence abstraction and general way of doing things. Other Lisps I’ve looked at don’t have the same emphasis on these things as Clojure does, so I don’t want another Lisp, I want a language that makes the exact same decisions and tradeoffs as Clojure, except on dynamic vs static types (and actually useful error messages). Maybe one day I’ll give it a try, I certainly don’t expect Cognitect to change their language because of what my preferences are.

It's not as simple as that. There are no established typed Lisps, even though Lisps have been around since the '60s. That's not just coincidental, in my opinion. The closest are gradually typed Lisps, like Typed Racket, and that is very similar to how core.typed does it. There is Shen as well. Carp is the first strongly typed Lisp I'm seeing, and it is experimental and might never take off.

The issue is how type definitions would be introduced, and what kind of types would be most appropriate. As you said yourself, the way core.typed did it felt too "heavy". Yet it isn't clear how to make a more lightweight variant for a Lisp like Clojure without rendering the types worthless.

The first, and one of the biggest issues in my mind, is that what everyone loves about Clojure is the data-oriented style. In that style, you represent entities and their relationships using heterogeneous collections. That's where in Clojure you model your domain with maps, lists, vectors, sets, etc. It is awesome, but no one has figured out a non-"heavy" way to statically type it. All the methods I know of have bad programmer ergonomics. In effect, adding types back almost kills the data-oriented style, and it ends up feeling a lot like modeling with classes instead. See Haskell's wiki section on this problem: they haven't solved it, they have multiple ways to possibly handle the scenario, and none is ideal.

> I want a language that makes the exact same decisions and tradeoffs as Clojure, except on dynamic vs static types

I would too, but with the caveat that the development experience would be the same, and the programming ergonomics and styles would be retained. And this, I'm afraid, is an open problem that no one has solved yet. It's not just a case of personal preference. Having a language which has the pros of Clojure and the pros of static types, without the cons of static types is hard. That's why for now, you need to choose one or the other.

May 25, 2019 · wa1987 on Murray Gell-Mann Has Died
Never seen the term 'plectics' before. Definitely seems like a source of inspiration for Rich Hickey's famous talk:
Mar 28, 2019 · stonewhite on I Miss Rails
This reminds me of the Rich Hickey keynote talk[1] at RailsConf 2012, which boils down to: out of sheer convenience, you no longer own your code; don't make the easy choice, take your time and make the simple choice.


Sep 09, 2018 · olieidel on Clojure is cool
For web projects, the benefits provided by Clojure on the backend side may not be very large; there are already good languages with extensive frameworks available (Django, Rails, etc.). It arguably seems hard to choose Clojure with its minimalistic libraries when these frameworks provide an "easy" [1] way to get running with a full-blown admin interface. Furthermore, backend code by itself tends to be dependency-heavy, in the sense that you need dependencies you'll definitely not write yourself: a library to interface with your database, something for cryptography/passwords, etc.

Looking at the frontend (React) side, however, things are different. The JavaScript ecosystem is a mess. From the viewpoint of a React developer, there are lots of libraries which vary widely in quality. react-router is an interesting example here: it has had 4 (?) breaking changes so far, each replacing the entire API. There's a ton of mental overhead for the normal React developer trying to write a "simple" app.

Ironically, developers start rolling their own stuff. Instead of using a form library which tightly couples your components to your redux state (redux-form), you start writing your own. Instead of coupling your entire views to graphql via apollo, you start doing it differently, your way.

This is where ClojureScript is a game changer. If your app differs just slightly from a (very) vanilla CRUD app and whipping some libraries together doesn't do the trick, you start writing custom stuff. When writing custom stuff, you want a programming language which is a) well thought through (great standard library, immutability, sane concurrency) b) predictable and c) productive. ClojureScript has all three while JS has none.

We (Merantix) are currently developing a medical image viewer in ClojureScript and had prototyped two separate versions: one in JS with React, another in ClojureScript with reagent and re-frame. Even though it is dependency-heavy (webgl stuff), ClojureScript turned out to be the superior choice: immutable data structures at its core, which ironically perform better than Immutable.js, and way higher developer productivity due to fewer random bugs and more interactive development (REPL).

Using Clojure on the backend now seemed like an obvious choice: We can reuse and share code from the frontend and more importantly, all our developers are "full-stack" in the sense that everyone can at least understand what's going on "on the other side" (backend / frontend) of the stack as it's literally the same codebase.

The learning curve is significant but the advantages are tremendous. I wholeheartedly recommend learning Clojure even if you're not allowed to use it at your job. It sounds cliché, but it will make you a better programmer for sure.

[1] (to understand the meaning of "easy" above)

Rich Hickey has a good point in his RailsConf 2012 "Simplicity Matters" talk[0]: JSON is just simpler than XML, as it clearly states which parts are maps and which are sequences. XML has an inherent, implicit order in everything, due to its history as a document markup language.

If you think of XML through the lens of a JSX-pragma-style transformation, an XML element and its attributes are more or less a (typed) map. Similarly, an element's children are an (ordered) sequence. Whereas in JSON you can have maps inside maps, in XML you always have to wrap maps inside an ordered sequence, as maps can only be passed to an element as its children.

You can totally use XML without paying attention to the order, but the order is still there, complicating things.
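A tiny made-up record shows the contrast: in the JSON form the map/sequence split is explicit, while in the XML form the children of <user> are an ordered sequence whether or not that order means anything.

```
JSON:  {"name": "Ada", "langs": ["clojure", "elixir"]}

XML:   <user name="Ada">
         <lang>clojure</lang>
         <lang>elixir</lang>
       </user>
```

Attributes (name="Ada") give XML a map-like slot, but anything structured, like the list of languages, has to ride in the ordered child sequence.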

[0] Rails Conf 2012 Keynote: Simplicity Matters by Rich Hickey -

> Folks are using "simple" and "easy" interchangeably here. That's probably inappropriate.

Agreed, see Rich Hickey's "Simplicity Matters" presentation on the difference [0].

Simple-Complex vs Easy-Hard


Rich Hickey is great. I remember his Simplicity Matters keynote at Rails Conf 2012. So clear and insightful.
Being able to explain a complex topic to diverse audiences is not easy to do. Rich does it very well.
Thanks. Forgot about that.
Rich Hickey - "simple made easy"

Rob Pike - "concurrency is not parallelism"

Uncle Bob Martin - "future of programming"

Martin Kleppmann - "transactions: myths, surprises and opportunities"

Simon Brown - "software architecture vs code"

That transaction talk is really good, thank you. I can now finally name the effect that I had noticed but had trouble explaining and referring to: write skew.

It's interesting that he doesn't mention phantom reads as the difference between repeatable read/snapshot isolation, and serializable, which is what other sources tend to do.

Snapshot isolation always seemed to me like cheating the intended meaning of repeatable read, insofar as some databases refer to their snapshot isolation level as repeatable read.

That is, in the strictest sense, if you read a row twice, you get the same value with snapshot isolation, but you don't actually know that the value will be the same when you commit, which as I understand is a case of a write skew.

In fact, if one thinks of the definition of these levels in terms of locking semantics, one would expect a repeatable read to have the same meaning as obtaining a read lock on the row you read, which I understand would prevent at least some types of write skew, since no modification would be possible on that row, because it would need a write lock. There could still be hazards related to phantom reads (and possibly other effects), such as making a decision based on a computed aggregate that can change if new rows are inserted. Still, this meaning of repeatable reads would already provide a useful isolation level for various cases, except that it doesn't work with snapshot isolation.

I have a suspicion that applications out there made incorrect assumptions as to the actual isolation provided by the DB they use.

The blog post barely gets into the idea of what simplicity means in programming.

If you have any interest in the topic, I recommend Rich Hickey's talk "Simplicity Matters".

My personal favorite is Living with Complexity (2010) by Don Norman (author of The Design of Everyday Things).

A thorough review by an experienced Angular developer. After having used Angular 2/4 in a greenfield project for the last 6 months, I agree with most of what he says. TypeScript is great, but Angular is sometimes easy, seldom simple and mostly complected [1].

1. Simplicity Matters by Rich Hickey:

Aug 23, 2017 · cstrahan on D as a Better C
I'll let the parent answer you (though I have similar sentiments as someone with language implementation experience, and FWIW, I have professional experience with Go, Ruby, JavaScript, Clojure, C#, and I've used Haskell, F#, and much more in my free time).

I will, however, point out that you're likely using a different definition of simple ("easily understood or done; presenting no difficulty") than is often used in technical circles ("composed of a single element; not compound"). Simplicity and ease are distinct, and one does not necessarily imply the other; it's not hard to find convoluted designs that feel easy due to familiarity, while a much smaller, consistent design can feel difficult because the concepts (while fewer in number) are foreign.

Rich Hickey (creator of Clojure) gave a great talk on easy vs simple:

Here's something familiar to most professional web developers: JavaScript.

But it's also complicated:

Here's the spec:

And here's something unfamiliar and often considered difficult: lambda calculus.

And yet it's very simple. Here's the entire definition of lambda expressions (lifted from Wikipedia):


Lambda expressions are composed of:

* variables v1, v2, ..., vn, ...

* the abstraction symbols lambda 'λ' and dot '.'

* parentheses ( )

The set of lambda expressions, Λ, can be defined inductively:

1. If x is a variable, then x ∈ Λ

2. If x is a variable and M ∈ Λ, then (λx.M) ∈ Λ

3. If M, N ∈ Λ, then (M N) ∈ Λ

Instances of rule 2 are known as abstractions and instances of rule 3 are known as applications.
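For instance (a standard illustration, not part of the quoted definition), the three rules are enough to build and combine terms:

```
(λx.x)               abstraction by rule 2: the identity function
((λx.x) y)           application by rule 3; it β-reduces to y
(((λx.(λy.x)) a) b)  nested abstractions applied one at a time; reduces to a
```

The reduction step (β-reduction) is defined separately from the syntax above, but the entire grammar really is just those three rules.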


So, while JavaScript programmers might find the lambda calculus unapproachable, it would be difficult to argue that the former has fewer gotchas than the latter, as the former is wildly complicated and the latter is so simple that you could fit its definition on a business card. Once you appreciate the difference between simplicity and ease you can better evaluate your options, as it might be worth investing time in learning an intimidating, foreign solution if you believe that it will provide better stability/correctness guarantees/etc. Also, simple things are generally easier to work with compared to complex things, assuming a similar degree of experience -- so investing in unfamiliar-yet-simple things will often save you a lot of effort in the long run. Strictly adhering to familiar things is a recipe for being stuck in a local optimum.

To take this full circle, Go being easy doesn't mean that it's simple.

Thank you, this was excellent and very informative.

You weren't kidding about the lambda calculus definition!

With that said, your comment seems quite technical, about the single point of simplicity. You don't talk much about Go at all, and although the other poster has also answered, I'd be curious in your answer as well (if you've worked with Go):

You've given a definition of simplicity (correcting mine). Did you find Go simple? What other thoughts did you have?

This is a better answer than the one I gave. And is also a good explanation of why leaving Clojure for anything else always feels so icky. :)
Sure, no arguments there. It does save time in JavaScript and a large part of that is because the language has been designed around mutability.

Part of that trade-off is that JavaScript can't make the same guarantees about what happens when you pass an object into a function. It's harder to be confident that a given program is correct.

Immutability is just a part of the "simple made easy"[1] ethos of Clojure and I think most Clojure programmers will argue that taking the time to understand that philosophy _is_ worth the investment.


Ah, so you see why there are not many clojure programmers
He gave the Simplicity Matters talk at RailsConf ( and it opened my eyes to a lot of important issues. He's accessible yet challenging.

It didn't make me want to do Clojure, but I definitely wanted to fully understand the message.

Procedural code can also be well compartmentalized and tested.

I was also until recently quite convinced OOP was the only way to go, but I'm seeing signs everywhere that a lot of the design-problems I've met over the past few years are at least magnified by OOP.

The abstraction promised by OOP is a good thing; however, very few people are able to consistently make good, reusable and maintainable abstractions, to the point where it becomes a weakness rather than a strength. I don't want a billion wrapper objects that obfuscate my code and make its surface area bigger than it has to be. Often I struggle to understand code more because of how it was separated than because of the complexity of what it actually does.

I liked Rich Hickey's talk "Simple Made Easy" [1] and "Brian Will: Why OOP is Bad" [2]

[1]: [2]:

Rich Hickey - Simplicity Matters
There are a number of objective measures of cyclomatic complexity in software. These metrics show that the higher the complexity, the lower the cohesion of the code[1].

Code that is complex, and has low cohesion, is harder to understand, and therefore harder to change. It's the elephant you have to push on every time you want your program to do something new[2].

"Dependency hell" might be subjective, but tools that reduce the upfront cost of increased dependencies don't remove the other burdens from you, the developer. In fact they often allow you to produce an impenetrable, unrecoverable tangle more quickly than doing without them.

Edit: Just realized who I responded to... "But, you knew all that."


  > Edit

So, what's interesting is, I would often consider many small bits to have a _lower_ cyclomatic complexity number. That is, whenever I've used tools that measure this kind of thing, the solution is always to take the big things and break them up into many, smaller bits. It's possible that this is bias in the tooling, though.

I think some of this comes down to individual preference as well. It's like that joke, would you rather fight one horse-sized duck or 100 duck-sized horses? In this admittedly very stretched metaphor, the former are relatively monolithic codebases, and the latter are relatively modular ones. I know that I used to prefer one hundred-line class to ten ten-line classes, but now, much much prefer the latter. My experience talking to people about this is that people fall somewhere on this line, often in different places, and that makes it harder to understand. The action that I'm taking to reduce complexity can often be perceived as increasing complexity, depending on where the other person falls on that line.

Lots-of-little-ones (LOLO?) is the right answer in a number of cases. One big god service? No; microservices (lots of little ones). Large many-line functions with copious branching and conditionals? No: LOLO. Big teams with multi-hour status meetings? No: LOLO. One big integration at the end of the project? No...

That dependency diagram for Servo actually looked relatively clean. You can clearly see which elements are library or utility code. The visualization would probably be better as a three-dimensional model with weighting.

For OO practice, state of the art is SOLID (with a sprinkling of RAII if you happen to be using C++). SOLID leads you straight down the path of LOLO. Small increments FTW.

Jul 21, 2016 · ClashTheBunny on Zenzizenzizenzic
This is the difference between easy and simple. Easy to use is often hard to learn. This is simple. Many of the best tools are this way. Violins are another great example. I like Clojure for a similar reason. Here's Rich Hickey explaining the difference at RailsConf:
"Good design is about taking things apart." --Rich Hickey

I think that statement captures several of these. He says it in the context of methodology and "architectural agility" in a great talk called "Simplicity Matters." [0]


The original post (not this OP) was clearly a response to Rich Hickey's talk "Simplicity Matters," which was given at a Ruby conference.[0] Particularly the part about "1,000 gems," which Hickey calls "hairballs" in the talk. I've started calling all packages hairballs since watching that.


To be fair, given the quality of the code from most Rails developers, the gems are more like turds than hairballs.
Complication kills, not complexity. Complex systems are made up of simple independent pieces. Those pieces work together naturally often leading to new behaviors. Because they are independent, they can also adapt easily to changes in the environment.

Complication is different. Complicated systems have dependencies that are more tightly bound together. Because of this, they can't adapt as easily to environmental changes. Consequently, this creates conflict between system and environment that can often be destructive or violent.

Or at least that's my social hypothesis. I based my idea on the Rich Hickey "Simple vs. Easy" talk on software complexity. I think his ideas could be generalized to societies as well.

The title reminded me of my favorite Rich Hickey talk:

A lot of problems in software arise from choosing the easy route (i.e. gem install hairball), with no regards for the complexity it could bring into your project. Do this enough times and your project could easily end up like one of the projects described here: a massive mess of complexity that's excruciatingly difficult and risky to change and improve.

Aug 29, 2015 · 2 points, 1 comments · submitted by dkarapetyan
Note that this is an excerpt from Rich Hickey's (creator of Clojure) keynote talk entitled "Simplicity Matters" at Rails Conf 2012.
Aug 28, 2015 · 126 points, 69 comments · submitted by hharnisch
The simplicity culture is what drove me to Python, and the Clojure community's simplicity culture is on a whole other level. It's not just the code and syntax that need to be simple. The abstractions in the program need to be simple as well, and Clojure gives you the tools to make simple abstractions easy to build and understand.

Go learn Clojure.

I just started BraveClojure today. I've said "next week" enough times.
Or learn another Lisp. I recommend Guile Scheme and Racket.
You may get just as much pedagogical advantage from learning one of those, but Clojure will be much more practical with its JVM base.

Besides, I know lispers hate the idea, and it may not even be necessary, but uniting behind a single good-enough Lisp like Clojure will reap more rewards than advocating for multiple (still good) Scheme implementations.

Here we go again with the "any Lisp that isn't Clojure isn't practical" argument. I disagree. Guile and Racket are plenty practical, and both have dynamic FFIs to any C library you want so I couldn't care less about using Java and its terrible VM boot time and build systems. I'm not interested in "good enough" when I've already been using something better for years.
Curious if you could tell us more about what you've worked on with Guile/Racket in a production environment?
I'm not the persuasive person that convince the bosses to use Scheme, so I don't have a triumphant success story, but some minor victories. I'm one of the core developers of GNU Guix (a package manager, distro, and associated set of tools) that recently was deployed successfully in a large production HPC cluster in Germany, which I think is a pretty nice achievement. At work, I use Guix and an init system called DMD (also written in Guile) a lot for development, and I'm trying to slowly work them into production via Docker.

It's an uphill battle to advocate for something that isn't the status quo, so I'm proud of the little victories.

If you build systems that are part of an existing ecosystem, specifically an ecosystem built on Java technologies, then Guile or Racket would be terrible choices.
Sure, that's fair, and for that Clojure is a fine choice.
I'm a Clojure developer and I agree that people shouldn't use the argument that other Lisps aren't practical (I'm just happy people are using lisp based languages). The point we should make is that for any shop that's using the JVM or Javascript already it's easy to introduce Clojure/ClojureScript into that environment without a ton of risk.
I spoke with a "grizzled old Lisp hacker" at my Clojure meetup here in Minneapolis. He let it drop that "If I didn't know the problem in advance, I would choose Racket over any other language. I love Clojure, though. I just don't get as much opportunity to use it." There's room for everyone here.
Many choices are 'sacrifices' that bring so much simplicity. Syntax, persistent data structures, ... I hope I can jump back to clojure soon.
Yeah, Python : 1999 :: Clojure : 2015

edit: state of adoption and perceived coolness, not birth year. Coincidentally, Python 1.0 released in 1994 and Clojure 1.0 was released in 2009, and both had a couple years of unstable releases predating 1.0

Clojure is actually from 2006 after two years of planning and development.
Yeah, but Python is not from 1999, so they were referencing something else: either personal adoption or language take-off.
> Clojure gives you the tools to make simple abstractions easy to build and understand.

I understand if you don't want to take the time, but I would love it if you could give an example of an abstraction that you can build in Clojure that you would consider "easy to understand", yet which couldn't be built just with, say, anonymous functions, structs, arrays, and simple loops?

I think it comes down to expressing your intent in terms of that intent rather than confining it to imperative constructs. Pretty much any work on collections of items is generally simpler and more in line with the semantics of your intentions when working with Clojure or other functional languages or structures, especially when parallel code is involved.

I'd say a specific example would be pmap. It's very difficult to parallelize code as simply as pmap does without functional programming constructs.
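As a sketch of what that looks like in use (a made-up slow function; timings are approximate, not measured):

```clojure
;; A deliberately slow function: ~100 ms per element.
(defn slow-inc [x] (Thread/sleep 100) (inc x))

;; Sequential map: roughly 8 x 100 ms. doall forces the lazy seq.
(time (doall (map slow-inc (range 8))))

;; pmap: same call shape, but elements are processed on a pool of
;; threads, with no explicit thread management in user code.
(time (doall (pmap slow-inc (range 8))))
```

The point isn't that parallel mapping is impossible elsewhere; it's that swapping map for pmap changes nothing else about the code.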

Hm. I'm looking at the implementation[1] of pmap.... I don't fully understand the implementation (can anyone point me to something that explains what rets is? Googling "clojure rets" doesn't return anything useful) but it appears to just be looping through the list and parcelling work out to multiple processors. That seems like something simple enough to do in any language with threads and function references, no?

I'll dig in some more though. Thanks for the reference.


To me it comes down to what my intent with the code is.

I could write a few simple routines in Go that certainly get the job done without much code, but when I read the code I have to perform more mental translation from how the code is written to what my intent was.

In contrast, pmap or other constructs such as PLINQ get the same work done with less code that expresses my intent more clearly.

In this code, "rets" is a local variable used in that function, which is why it wouldn't show up anywhere in Google. It's just a variable name.
The simple syntax makes things like Hiccup and macros more convenient.

Macros allow things like core.async (like Go's channels), which is just a normal library; you didn't have to upgrade your Clojure version or anything.

Immutable datastructures make your life simpler because you're not worried about values mutating suddenly. Keeps you from cloning or locking an object. And undo is simpler: you don't destroy old state by mutating it, so you can just hold onto old versions.
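A tiny sketch of that last point (hypothetical data):

```clojure
(def v1 {:doc "draft"})
(def v2 (assoc v1 :doc "edited"))

;; assoc returned a new map; v1 is untouched and the two versions
;; share structure internally, so keeping both is cheap.
v1 ;=> {:doc "draft"}

;; "Undo" is simply going back to the reference you already hold.
```

No defensive copying, no locks: any code holding v1 can never observe the "edit".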

While simplicity may matter, I believe Clojure to be a poor example of it -- it's a lot of functions all thrown together in basically one namespace, with poor error handling, and a tendency to throw a 50 line traceback with a lot of random symbols in it.

Clojure macros are the antithesis of simple, and the need to indent a scope for every new variable actually fights against TDD in my experience.

I recently wrote a good chunk of a RESTful SQL-backed application in Python in two days that took a team of 3 people in Clojure over 2 months to just get to the basic level of library support one would expect from things like sqlalchemy and flask.

Clojure isn't simple -- it's basically a step up from assembler in how little it provides.

Simplicity is having all the power tools and being able to put them together and be instantly productive, and to support programming in multiple paradigms.

While it's not the norm, I sometimes feel many FP purists spend so much time debating purity and giving basic concepts complex names - when they could be using something else and getting much more done.

Side effects aren't the devil and are sometimes necessary to get real work done. Bad code can be written in anything, and it just takes experience.

I'd much rather see a language focus on readability, maintenance, and rapid prototyping than side effects.

Functional programming concepts have benefits - I love list comprehensions and functools.partial in python is pretty neat, but when you can also have a decent object system, and embrace imperative when steps are truly imperative, you can get a whole lot more done.

It's unfortunate that you got downvoted, because while people may not agree with this there is definitely something to this.

My approaches to Clojure have been seriously hampered by the fact that some of the abstractions above those that are "simple" are remarkably complex, and that the tools that surround the ecosystem are still pretty frail.

Macro bugs are certainly something that have scared me away for a while.

Yeah, exactly.

My experiences were around looking for a quality ORM, job scheduler, and web framework - things like korma and ring exist - but they lack a large number of features compared to equivalents found in /most other/ languages.

I came to the conclusion that Clojure is an acceptable way to call Java SDKs if you want a bit higher Java velocity AND Lisp fits your brain already, but I'd rather pick up Scala or Groovy instead for that purpose.

Libs in pure clojure, of which I tried dozens, were usually incomplete and error-prone even if they were community favorites, which I attribute in part to the fact that it's a small circle of developers using it, and the language is still newish.

Macros are for extending the language. In 2 years of professional clojure dev, I have never used a single macro in problem domain code. I've written a few libraries that are supposed to extend the language, but I've only used macros rarely there too, maybe under a dozen times.
Sure, but when there's bugs in the caller of macros I have to figure out what exactly is going on, and that's where the rubber meets the road.
> I recently wrote a good chunk of a RESTful SQL-backed application in Python in two days that took a team of 3 people in Clojure over 2 months to just get to the basic level of library support one would expect from things like sqlalchemy and flask.

Nonsense. Java library interop is great. If you could write it in 2 days in Python then it wouldn't take more than a week in clojure. In the very worst case scenario you could simply write "Java in Clojure" and directly use Java libraries.

> Clojure isn't simple -- it's basically a step up from assembler in how little it provides.

It gives you full access to the JVM and its tens of thousands of man years worth of high quality libraries.

As for the language itself it has macros, first class functions, built in vectors and maps, structural editing, fantastic REPL support, and a top rate concurrency story to name a few features.

You might want to look at clojure again. That comment reads like babble.
Rich uses a very precise and archaic definition of simplicity, which he describes in his "Simple Made Easy" talk. In a nutshell, when Rich talks about making something simple, he's talking about limiting the number of things that can affect it.

For example, under this definition, the simplest thing possible is immutable data, since nothing can affect it. The next simplest things are pure functions, because they're affected only by their arguments. Clojure is a language built around Rich's idea of simplicity, so Clojure prefers data over pure functions, and pure functions over side-effectful functions.

What you're describing is what Rich would likely term "easy". Something is easy if you can do it with little effort. Something is simple if few things affect it.

"talk-transcripts : Rich Hickey "

- Inside Transducers (11/2014)

- Transducers (09/2014)

- Implementation details of core.async Channels (06/2014)

- Design, Composition and Performance (11/2013)

- Clojure core.async Channels (09/2013)

- The Language of the System (11/2012)

- The Value of Values (07/2012)

- Reducers (06/2012)

- Simple Made Easy (9/2011)

- Hammock Driven Development (10/2010)

- Are we there yet? (09/2009)

I love Rich Hickey and I will watch any talk he ever gives. I believe I have seen them all to date.

Here's what drives me nuts about this one, though - it gets passed around a lot where I work, and people say how strongly they agree with him. There are real, concrete things he claims are not simple here! Things like for loops. And these people I'm talking about, they say they love this talk and then they say they love for loops. Really! I don't get it.

It's one thing to recognize simplicity versus ease of use, it's a wholly different thing to apply it. Ease of use is what is most trivially observable by an end user. Actually understanding the nature of the problem domain and properly evaluating architectures beyond making sweeping inferences from what the interface looks like, that is significantly more difficult. Despite all the lip service that gets paid towards simplicity, most people do not want it, and it will be met by derision and scorn if the simple solution imposes a higher learning curve or does not make policy and integration decisions for the user. Especially considering the culture of coding bootcamps, DevOps, "get shit done" and "move fast and break things" by and large promotes an anti-intellectualism that is in stark opposition to deeply evaluating problem domains so you can come up with simple solutions.
Right, and so what you get is people saying something is "simple" when what they really mean is that they like it. Which is of course not Hickey's fault, it just irks me for some reason.
It's perfectly reasonable to agree wholeheartedly with his core idea (easy vs simple) but disagree with his particular pronouncements on what is and isn't simple. Which could involve for loops.

Now, personally, I don't have much to say about for loops one way or the other. But I do disagree with him on types, which can be far less complex than he intimates. System F, say, which largely covers F#, OCaml and Haskell, can be completely defined as a handful of self-evident rules. A single page for both checking and inference, if in somewhat dense notation. That's not complex at all, especially since it follows fairly naturally from the way the lambda calculus works even without types.

To me, that seems like a perfectly consistent view. Nothing forces you to take everything he says or leave it: you can find it accurate piecemeal. Take the mental framework but apply it with your own knowledge and experience and you could very well come up with your own conclusions.

Seems like the perfect way to use ideas like this.

Carl Sassenrath is another programming language designer who has long been a proponent of simplicity.

Some links:

* Definition of Simple (2011) -

* Fight Software Complexity Pollution (2010) -

* Contemplating Simplicity (2005) -

I was at this talk and I disagree with his fundamental statement that simple + simple = simple. I program in Ruby, and one of the biggest mistakes beginners make is not creating proper data structures where they are needed. Instead they pass around hashes of hashes of hashes. Why? Hashes are simple; they're easy to understand and work with. Unfortunately this initial simplicity introduces unexpected complexity: things like deep dup and deep merge are now needed. Every part of your system must now know the hash structure you're using, and often reproduce logic in many places, like dealing with optional keys that may be missing. By not isolating the complexity to one part of the app, it must now be shared by the entire app. simple + simple !(always)= simple.
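For example, here's the kind of helper a hash-of-hashes design forces you to write, sketched in Python for brevity (the Ruby version would be analogous):

```python
# The kind of helper nested-hash designs force on you: recursively
# merge two nested dicts, with values from `b` winning on conflicts.
def deep_merge(a, b):
    out = dict(a)
    for k, v in b.items():
        if isinstance(out.get(k), dict) and isinstance(v, dict):
            out[k] = deep_merge(out[k], v)  # recurse into shared keys
        else:
            out[k] = v
    return out

merged = deep_merge({"db": {"host": "x", "port": 5432}},
                    {"db": {"host": "y"}})
# merged["db"] keeps "port" while taking the new "host"
```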

If I had a time machine and could make one change in an open source project, it would be to go back and remove the design choice of using hashes as internal config objects in Rails. It makes understanding all the possible edge cases almost impossible unless you're really familiar with the project.

The second is the claim of speed that this "simplicity" buys you. I agree that functional programming is extremely fast when it's parallelized and it's extremely easy to make a functional program parallelizable. When we're dealing with sequential tasks, mutating data is much faster than allocating new data structures. I think clojure helps with the speed of allocating new immutable structures by using copy on write strategies behind the scenes.

I think Rich is an extremely smart and very accomplished programmer. I think functional programming is really good at some things, but I don't feel like many people talk about the things it's not good at. To me, if we're not embracing and exploring all of a new concept/language/paradigm's strengths and weaknesses, we're not growing by being exposed to that thing.

>I think clojure helps with the speed of allocating new immutable structures by using copy on write strategies behind the scenes.

No, it's not copy on write in the sense that COW means in other languages. Data in clojure is persistent, so usually even an altered piece of data is not actually copied, only the tiny bit that changed is added, if necessary. This is very different than, for example, Swift that implements copy on write, but the whole data structure is copied even if the change is minor.
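A toy sketch of the idea in Python, with a cons-style list standing in for Clojure's much more sophisticated tries (illustration only):

```python
# Toy persistent (immutable) list: "adding" an element builds one new
# node and shares the entire old tail -- nothing is copied. Clojure's
# real structures are trees (tries), but the sharing principle is the same.
class Node:
    __slots__ = ("head", "tail")
    def __init__(self, head, tail=None):
        self.head, self.tail = head, tail

base = Node(2, Node(3))   # the list (2 3)
a = Node(1, base)         # the list (1 2 3)
b = Node(0, base)         # the list (0 2 3)
# Both new versions share the very same tail object: no copy happened.
```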

Additionally, you say:

>When we're dealing with sequential tasks, mutating data is much faster than allocating new data structures

That would be true in cases where the entire data piece is new; but so much of what you do in any language involves iterative changes over existing data, and again, because of clojure's persistence, new allocations are often not happening at all in many cases.

...However, I do agree with most of your other points :)

I was introduced to this not too long ago on HN. I don't use clojure so I was paraphrasing my somewhat limited understanding. Thanks for the clarification.

That part about speed was more about my general frustration with the "immutable is fast" and "mutable is slow" meme that I hear too frequently. While that can be the case, it isn't always 100% true. Your language and how you use it can play a huge part.

I'm hoping to learn more about clojure data structures in the coming months, so much good stuff.

My guess would be that immutable clojure is still faster than say mutable ruby though.

I don't think anybody is arguing that trie lookup is faster than array lookup. But when it comes to passing data between parts of the system, zero cost copies are much faster than copying buffers or objects.

> Unfortunately this initial simplicity introduces unexpected complexity, things like deep dup and deep merge are now needed.

Not if your data structures are immutable, which is exactly what he is advocating.

Exactly; the whole idea of "deep dup" doesn't even exist in Clojure. It's a non-issue. Some languages go so far as to encourage actually serializing and then reading back in a data structure as a form of doing a deep duplication! (I've done this in Objective-C, based on advice in popular books.) This is an extreme example of the pitfalls of mutable data in many circumstances.
Excellent point, I honestly hadn't considered that. I've spent more time dealing with functional programming than in actual languages with immutable structures. Thanks for pointing that out.

I think my other point still stands, things like optional keys, or missing or required keys still must be shared throughout the app. Everything that touches that hash needs to know its structure.

Yeah, but that structure is also easy to explore and understand.

And in case you want some security you can use libraries like prismatic/schema, that allow you to declaratively describe your expected data-structures akin to a type system.
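A crude sketch of the idea in Python (this is not prismatic/schema's actual API, just an illustration of checking a plain map against a declared shape):

```python
# Toy declarative schema check: each key maps to the type its value
# must have. A crude stand-in for what libraries like prismatic/schema
# do, not their real API.
user_schema = {"name": str, "age": int}

def check(schema, data):
    for key, typ in schema.items():
        if key not in data:
            raise ValueError(f"missing key: {key}")
        if not isinstance(data[key], typ):
            raise TypeError(f"{key} should be {typ.__name__}")
    return data

check(user_schema, {"name": "Rich", "age": 50})  # passes, returns data
```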

The web is built on JSON and not CORBA because plain maps and vectors are far easier to work with than domain-specific objects.

There is a discussion of deep vs shallow hashes. Take Datomic or Datascript for instance. Very shallow datastructures, with a defined schema. Indexes to make this fast.

There’s tradeoffs. Shallow datastructures are under-used.

But OOP objects share fundamental flaws with deep hashes. (And add some more.)

Having to know the structure of a hash/map/dict is strictly superior to having to know the methods and fields of an object. For any given task the two are equivalent, except that one is easily modified/reused/passed to other callers without creating an extremely brittle class hierarchy you're only going to have to rearchitect when your requirements change. As an IBM researcher put it, looking at Watson waaaay before Watson actually existed: "OOP requires you to treat all extrinsic information as if it were intrinsic to the object itself." This is tantamount to saying you must have a complete and consistent programme design before you write a single line. Which is crazy (and probably the reason UML exists.) But don't take my word for it, think it through, it's the only conclusion you can reach, I think.
There is actually a lot of complexity to hashes in Ruby.

The two major sources of complexity are the fact that they are mutable and the fact that keys are mutable but their hash codes are computed when the object is first added.

So if you construct a valid hash, then call three functions passing them the hash, you have no idea if the hash is still valid for the second and third functions without reading the code for all the preceding functions.

What this tends to mean in practice is you are always having to check all over the place that your hash is valid.
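A minimal sketch of the hazard (in Python for brevity, but Ruby hashes behave the same way):

```python
# The hazard in miniature: a function that quietly mutates the dict it
# was handed, invalidating the caller's "valid" hash for later callers.
def normalize(opts):
    opts.pop("debug", None)  # mutates the caller's dict!
    return opts

config = {"host": "localhost", "debug": True}
normalize(config)
# config has silently lost "debug"; anything called after this point
# can no longer trust the structure the caller originally built.
```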

They might look the same as clojure hashes but in practice they are used very differently.

> There is actually a lot of complexity to hashes in Ruby.

I agree here

> So if you construct a valid hash, then call three functions passing them the hash, you have no idea if the hash is still valid for the second and third functions without reading the code for all the preceding functions.

I don't find that to be true, even in the Rails codebase. You pass a hash to a function (method) and expect that it won't be mutated. If you're writing a method and you mutate a hash, you're expected to dup the argument so it won't be mutated. This is the convention. There are times when hashes are mutated, but generally that's reserved for methods whose purpose is to mutate their arguments.

Where Rails gets into trouble in its functional passing of hashes isn't in the mutability of the hashes, it's in the composability of the functions. I've never written about this problem before, so i'm not sure I have a great example but it's definitely a problem.

One example in rails is `url_for`; it takes a hash argument. The problem is that there are multiple `url_for` methods that all do slightly different things. Some are needed for generating links in email, some work in your views, and others are designed for you to use programmatically outside of a view context. One of the hardest things about this method is that one `url_for` can call another `url_for`. Since we can never be guaranteed the order of the calls, things like default values or optional keys become really hard. You have to replicate logic in different functions since some may never be called in the order you might expect. This significantly impacts our ability to refactor, which in turn impacts our ability to make performance improvements.

I recently did a bunch of perf work, and some of my biggest perf improvements were getting rid of duping and merging of hashes. If Ruby had an immutable and performance-efficient hash then it would have helped a bunch; however, I don't think it would make the general awfulness that is an entirely hash-based API to a very complex action (such as url_for) that significantly better to work with.

Take a look at ring for clojure, which is a web server interface very similar to ruby rack.

Because it passes every request as a map you can hook functions (middleware) in between that change the behaviour of the request handling, for example they could add a field for passed params or fields for user authentication.

This is possible because maps are easily extendable and functions down the line don't need to know about additional keys. You can't do that with OO properly.
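A minimal sketch of the pattern in Python (hypothetical names; real Ring passes Clojure maps through pure functions, but the shape is the same):

```python
# Ring-style sketch: a handler is a function from request-map to
# response-map; middleware wraps a handler and enriches the map.
def handler(request):
    user = request.get("user", "anonymous")
    return {"status": 200, "body": f"hello {user}"}

def wrap_auth(next_handler):
    def wrapped(request):
        # add a key; downstream code needn't know about the other keys
        return next_handler({**request, "user": "alice"})
    return wrapped

app = wrap_auth(handler)
resp = app({"uri": "/"})
```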

You are sort of arguing a strawman. First you say that hashes are simple, but then go on to argue how they are not. Maybe they were not simple in the first place? Simplicity does not mean easy. It was "easy" to use a hash. Simplicity can actually be incredibly hard to design properly. I think this is what Rich alludes to in the video.

Edit: It's one of the first thing he says in the video actually - "It's about the interleaving, not the cardinality."

Actually, parent gets it and is neither arguing a strawman nor mistakenly labelling hashes as simple rather than easy.

Rich absolutely argues that basic data structures like hashes are simple -- something that is generally agreed upon. They are simpler than 'objects' because you can use basic comparison operators on them, and you can operate on them with higher-order functions etc.

The question which parent is digging at is whether a larger, complex app, that leans heavily on hashes actually results in an app that is on the whole simpler, i.e. does 'a simple thing plus a simple thing equal a simple thing'. This is something I've heard discussed well on the Ruby Rogues podcast -- I think that either David Brady or Josh Susser may have a good blog post on the subject of 'simple + simple != simple' but I'm struggling to track it down.

Whilst I love Rich Hickey's talks I do find myself coming round to the same conclusion as parent -- if you forgive my possibly incorrect interpretation of their argument -- that the idea that simple data structures are simpler is fairly useless if programmers use that fact naively.

PS. Everyone should watch the linked talk, it's brilliant, and one of my favourite programming talks ever. I recommend it to every programmer I meet.

Why ignore the message of simplicity in abstractions and design only to nitpick on hashes? I am not a fan of using hashes either. I also disagree with Rich's stance on static typing. But still think the video has a great message that is being lost here.
I think you nailed my original intent quite well, thanks for the post.
> Every part of your system must now know the hash structure you're using and often reproduce logic in many places like dealing with optional keys that may be missing.

Why not just write accessors and mutators for your hashes? Also, Python has a function dict.get(key, default). Does Ruby not have this?

> I think functional programming is really good at some things, I don't feel like many people talk about the things it's not good at.

This is a wonderful tenet in general: let's think about what X is bad at, rather than what it's good for.

While I'm not much of a Rubyist (I've hacked a little in my time, but I fall on the Python side of the divide), working with trees (hashes of hashes are a kind of tree) has never been easier than in Clojure in my experience. I suspect this is true of any lisp.

Could you expand on how Clojure makes it easier to deal with the frustrations your parent comment mentions (deep merges, shared reliance on hash structure, reproduced logic, and handling of optional values)? I've only used Clojure a bit, but I'm not sure how it solves any of these problems (which I also find frustrating in Ruby).
Hickey has stated elsewhere that optional values don't really make sense in a dynamically-typed language.
Immutability is a big part of it, as is the wide array of operators for maps and map-like data structures.
Persistent data structures are part of it, for sure. But on a higher level. On a lower level, if you look at how assoc-in works in Clojure, the scales will fall from your eyes (they certainly fell from mine). Consider:

user=> (def g {:x "a" :y {:k 71}})

user=> (assoc-in g [:y :k] 72)

{:y {:k 72}, :x "a"}

With this you can walk and mod deeply nested datastructures with ease (and others can reason about it well).
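A rough Python analog, for readers who don't know Clojure (hypothetical helper, and without Clojure's structural sharing -- each level is shallow-copied):

```python
# A Python analog of Clojure's assoc-in: return a *new* nested dict
# with the value at `path` replaced, leaving the original untouched.
# Unlike Clojure, this shallow-copies each level rather than sharing
# structure persistently.
def assoc_in(d, path, value):
    key, *rest = path
    new = dict(d)  # shallow copy of this level only
    new[key] = assoc_in(d.get(key, {}), rest, value) if rest else value
    return new

g = {"x": "a", "y": {"k": 71}}
g2 = assoc_in(g, ["y", "k"], 72)
# g2 is {"x": "a", "y": {"k": 72}}; g is unchanged
```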

I may be misunderstanding your point, but that's just a well-done function which could be written in virtually any language, the advantage isn't inherent to Clojure.
> which could be written in virtually any language

What natrius says is completely right on this point (which explains why those functions don't exist commonly outside of lisp) but it should by now make you wonder: if that's true, why haven't they?

OH, I see your point. Thanks for the enlightenment.
It's not true. ImmutableJS is becoming very common, especially for React users:
People like to talk about immutable.js. It's not common.
8k stars and counting. We used it at the last place I worked (a YC startup) and several of my friends report the same.
The difference is that in Clojure, `assoc-in` is effectively a deep copy as well since the data structures are immutable. Most of the complexity involved with nested maps is about the effects of modifying them. In practice, this doesn't exist in Clojure.
Your comment seems more about Clojure's type system and fundamental data structures than about functional programming in general. e.g. Haskell solves some/most of the problems you mention, while introducing different ones and presenting other trade-offs to consider.
Using data structures without a schema is user error, and isn't the fault of any particular language.

The rest of your points have some merit. :)

> I think Rich is an extremely smart and very accomplished programmer. I think functional programming is really good at some things, I don't feel like many people talk about the things it's not good at.

As someone who used to swear by Ruby but now prefers FP langs, part of learning FP is as much about learning what FP is good at as it is about learning what FP isn't good at. Given FP's penchant for declarative style and immutability, clearly FP isn't good at domains where imperative styles are important or really, really, really[0] CPU-intensive work where mutability is needed.

Although, thanks to compiler research (which is mostly done by FP language users, ahem) languages like OCaml are bridging this gap when you take a look at projects like MirageOS. You get to write your code in a declarative, immutable style that then gets compiled to very fast native code. It's having your cake and eating it, too.

As far as your original complaint of everything in Ruby either being a hash of hashes or a class that just wraps a hash of hashes, I completely agree. That is what drove me away from the language. I'd rather just blow that hash + class relationship up completely. Not to mention, in ML-dialect languages you can use features like Enum/Sum/Product types to define the configuration/syntax of your programs which is a lot better than reading arbitrary keys in a hash, as you have stated.

0 - No, I mean really, really, really, really. "I think I need C to do this fast enough for my use case" is the "I need Cassandra for my 2GB database of 'Big Data'" of the FP world.

He didn't watch 'Simple Made Easy'. He watched a different Hickey talk, 'Simplicity Matters':
Very interesting related talk about complecting things - "Simple Made Easy" - by Rich Hickey, inventor of Clojure:

If you're a Ruby person, maybe watch this version instead, since it's almost the same talk but for a Rails Conference, with a few references:

Jul 20, 2012 · halgari on Choosing an ORM strategy
just use hash maps:
An ORM at the basic level just populates some data structure (could be a hash map) from your database.

At some point when an app gets complex programmers tend to keep SQL organized so it's not all throughout the code, just like keeping view logic separate. To me that is just a simple ORM as well.

There seems to be an anti-ORM trend but I suspect it's more of an opposition to certain established ORMs that are really complicated to work with.

If that's not true, I'm also curious to hear about other strategies that people are using these days. Having no strategy at all seems a little sloppy to me, but I'm always open to new ideas.

ORMs either arise from need (homegrown) or are bought with way more features than you need and the associated complexity that brings.

If there were a way to scale an ORM so it can modularly add features/complexity as you need it (ie, customize the metamodel complexity), that would be the most ideal approach.

On the other hand, testing thousands to millions of possible use cases in combined codebases is a nightmare.

I agree. I've written many home-grown ORMs as well as used a few well-known ones. It's definitely easier in some ways writing your own because you understand all of the details & it has nothing in it that is unneeded. The downside is that you don't have the community support maintaining the code and providing support resources. On the flip-side I've spent days researching and tweaking hibernate mapping files, occasionally making me want to throw my laptop out the window!

One thing is for sure, no matter what ORM or similar solution you use, if you don't know how to observe and understand the SQL that is generated then you won't be able to use it efficiently.

May 08, 2012 · 2 points, 0 comments · submitted by ms4720
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.