HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Clojure for Java Programmers Part 1 - Rich Hickey

ClojureTV · Youtube · 8 HN points · 6 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention ClojureTV's video "Clojure for Java Programmers Part 1 - Rich Hickey".
Youtube Summary
Part 1 of a presentation by Rich Hickey to the NYC Java Study Group. A gentle introduction to Clojure, part 1 focuses on reader syntax, core data structures, code-as-data, evaluation, special operators, functions, macros and sequences. No prior exposure to Lisp is presumed.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Nov 05, 2022 · 3 points, 0 comments · submitted by tosh
> So that's how I view Lisp macros - creating code on the fly and executing it.

This is an incorrect assumption. In compiled Lisps[0], macros are a compile-time construct, which manipulate the data structures that represent your code[1]. All manipulation occurs at compile-time. In interpreted Lisps, macro evaluation is temporally intertwined with program execution, but the two phases are logically distinct.

[0]: Compilation time in a Lisp is not necessarily similar to compilation in other languages. Working at a REPL does not imply that interpretation is happening. Clojure is an example of a Lisp that is exclusively compiled. Even when sending individual expressions to the REPL, they are compiled before being evaluated.

[1]: From a reply elsewhere in this thread: https://youtu.be/P76Vbsk_3J0?t=1333 See this video for an overview of the core of Clojure. The link starts with an overview of Clojure's data types and data structures (skipping the intro for the reasons behind Clojure). This leads directly into the evaluation model, which is based on the data structures of Clojure. Watch through ~1h:30m to get a feel for what macros are used for in Clojure's core. This gives a very good overview of the evaluation model for macros; this is generally applicable to Lisp macros outside of Clojure as well, though there are nuances to different dialects. I recommend that you start at the linked time, but if you want to jump straight to the evaluation model, you can jump to ~47m.

mwattsun
Thank you for taking the time to answer! Compile time vs. runtime is an important distinction that I need to think about. I'll start the video from the beginning because I'm curious about Clojure and most things Rich Hickey does, since I liked his Simple Made Easy video so much.
mwattsun
So Clojure programs are data structures, and you can pass a structure to a macro to rewrite it into something the compiler can evaluate. In this way you can extend the language. You can build your own DSL that solves your particular problem in an organic way that grows as you go along.

If I understand that correctly, I still don't get it, because I would still have to write the macro. In C# I would write it as a function that I call in my source code, which doesn't grow the syntax of C# but does grow the program into a DSL that solves my problem, albeit not as nicely as in a REPL.

Maybe this is all about the REPL and I don't use them or see any appeal in them. I like to write my text, look at it, and hit run. You write a REPL line and it disappears.

I'll keep learning. I'm sure the light bulb will go off eventually.

greggyb
As for REPL driven development, it's not about typing code at a REPL. This video provides a pretty good example of working at a REPL in Clojure. Note that with the exception of one doc reference at the beginning, all code is typed directly into a file buffer. The editor provides shortcuts to send various forms to the REPL for evaluation.

https://vimeo.com/230220635

greggyb
A canonical example of macro capabilities that methods/functions cannot recreate is a short-circuiting conditional.

Clojure's only built-in conditional operator is 'if'. Despite this, we have nice short-circuiting or, and, when, when-not, cond, and other conditional operations in Clojure, all defined as macros.

Clojure (like C# and most languages) eagerly evaluates its function arguments, so you cannot write a function that short-circuits. Let's consider the following Clojure:

    (when false (infinite-loop))
If when were a function, this would compile to the appropriate instructions to evaluate both arguments at call time, and our program would hang on (infinite-loop).

But when is a macro, which means that at compile time the compiler defers to the when macro. The when macro is near-trivial:

    (defmacro when
      "Evaluates test. If logical true, evaluates body in an implicit do."
      {:added "1.0"}
      [test & body]
      (list 'if test (cons 'do body)))
So the compiler has encountered when and sends the data structure of the form to this macro to evaluate. The result of this macro evaluation is passed back to the compiler.

So, when receives a list of its arguments, '(false (infinite-loop)). when returns a list to the compiler, '(if false (do (infinite-loop))). The compiler sees that there is no more macro expansion to be done, so it emits the appropriate code to execute that operation. When we get to run time, the if happily short-circuits and this ends up being a no-op.
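
You can see the same expansion yourself at a REPL (a small illustrative sketch; the result shown is what the when definition above produces):

    ;; macroexpand shows the form the compiler will actually compile
    (macroexpand '(when false (infinite-loop)))
    ;;=> (if false (do (infinite-loop)))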

This sort of conditional macro tends to be trivial in implementation, but highlights the difference between an eagerly evaluated argument to a function and a syntactic form passed to a macro for transformation at compile time. And if Clojure lacked a conditional construct, you could happily add it.

Continuing the comparison between Clojure and C#: C# gained async as a new keyword, which required compiler modifications to implement. Clojure got core.async as a library; anyone could have implemented core.async on their own. Clojure's core.async is CSP-style (similar to Go), while C#'s async rewrites your code into a state machine on your behalf, which basically pretties up callback handlers for you. If you prefer the C# async style to CSP, you can introduce that construct in Clojure yourself with something that looks as "native" as core.async. If you want to use CSP in C#, you cannot create any syntax for it, and so you will never be able to make it feel as native as async does.
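
For a flavor of what the library approach looks like, here is a minimal sketch (assuming core.async is on the classpath; go and the channel operations are ordinary library macros and functions, not compiler features):

    (require '[clojure.core.async :as a])

    (let [c (a/chan)]
      (a/go (a/>! c (+ 1 2)))  ; go is a macro that rewrites its body into a state machine
      (a/<!! c))               ;=> 3, a blocking take on the calling thread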

Or another example would be something like C#'s using construct for IDisposables. If you wanted to implement using in a version of C# without it, you couldn't; you cannot create new control flow constructs. Something similar for Java AutoCloseables is implemented as a macro in Clojure as well (again, if it didn't exist in the language, you could add it yourself, just like below). You can implement arbitrary control flow in Lisp macros in a way that would require syntax and compiler modifications in other languages.

    (defmacro with-open
      "bindings => [name init ...]
      Evaluates body in a try expression with names bound to the values
      of the inits, and a finally clause that calls (.close name) on each
      name in reverse order."
      {:added "1.0"}
      [bindings & body]
      (assert-args
         (vector? bindings) "a vector for its binding"
         (even? (count bindings)) "an even number of forms in binding vector")
      (cond
        (= (count bindings) 0) `(do ~@body)
        (symbol? (bindings 0)) `(let ~(subvec bindings 0 2)
                                  (try
                                    (with-open ~(subvec bindings 2) ~@body)
                                    (finally
                                      (. ~(bindings 0) close))))
        :else (throw (IllegalArgumentException.
                       "with-open only allows Symbols in bindings"))))

I hope these examples help to illustrate the differences. If you're looking for more examples, you can take a look at the Clojure source to see what functionality is implemented by macros. https://github.com/clojure/clojure/search?q=defmacro&type=co...
mwattsun
Thank you for the examples. I'm really curious to get to the bottom of this. My inspiration has been Erik Meijer, specifically his article "Confessions of a Used Programming Language Salesman. Getting the Masses Hooked on Haskell" [1]. If I understand you correctly, you are implementing lazy evaluation in your when macro, something you call short-circuiting.

By dropping laziness in strict functional languages such as Scheme, SML, OCaml, Scala, and F#, the purity and semantic beauty of true functional programming is lost. When programming in Haskell, laziness is one of those things that you rely on all the time without noticing it; it is something you only realize once you miss it.

As for async/await, Erik led the effort to put it in C# and then Dart. I'm assuming he wanted to bring Haskell Continuation Monads [2] to the popular languages.

I'm building a hobby website, so I got into Blazor because I like C# and was interested in doing it client side. I gave up on it, not because of C# or bad implementation (it's actually great) but because Microsoft has a noisy and bureaucratic way of configuring code that drives me nuts. I went back to Dart because it is amazingly productive on the client side, but now this video [3] shared in this thread makes me think I should give ClojureScript a try.

[1] http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.72....

[2] https://www.haskellforall.com/2012/12/the-continuation-monad...

[3] https://vimeo.com/230220635

greggyb
I'm happy to chat more if it's helpful (contact details in profile). I'll be honest that I didn't get Lisp for a long time, and still wouldn't claim true expertise. I attempted SICP a few times based on the glowing reviews from folks I admire. One of my paths to tech was via Paul Graham's writings, so I had a bias for Lisp early on.

Two parts here. First, and briefly, a note on short-circuiting and lazy evaluation. Second, and longer, a further discussion of Lisp macros.

First. Short circuiting is a special case of lazy evaluation, but is the most common exposure to lazy evaluation among programmers in the C family of languages. It is the property of conditional and boolean operations that evaluate only the minimum necessary to return a result. (or true IGNORED) only needs to evaluate the first operand to or, true, because the result of the or is true regardless of the value of IGNORED. This is common to all mainstream strictly evaluated languages I am aware of. https://en.wikipedia.org/wiki/Short-circuit_evaluation
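
A quick way to see this in Clojure (a tiny sketch; the println is only there to prove the second operand never runs):

    (or true (do (println "never evaluated") false))
    ;;=> true, and nothing is printed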

Macros allow you to implement lazy evaluation in an otherwise strict language, but this is not a comprehensive explanation of their utility.

Second. I've watched pretty much all of Rich Hickey's presentations a half dozen times or more. Somewhere in the mix of more practice, another time through SICP, and watching Rich Hickey I started getting what homoiconicity means and the value prop of Lisp.

It's simultaneously very prosaic and very deep. Macros are pretty simple:

1. Access to the AST of a chunk of code for arbitrary processing.

2. At compile-time.

This is not exclusive to Lisps. I think pretty much any mature language offers a library or a built-in mechanism for getting at the AST of arbitrary code in that language. The "at compile-time" part is not necessarily built in to other languages. You absolutely can have a build system that has two (or more) passes, the first building a macro expansion framework and library of macros and the second processing the rest of your source code and rewriting the AST of the code in those files to do macro expansion.

Where Lisps are different:

1. The AST of the language is represented directly in the core data structures of the language. An operation in Lisp[0] is simply a list whose first element is an operator (one of a special operator, a macro, or a function in most Lisps) and whose remaining elements are operands to that operator. This makes AST manipulation easier: you use the same functions/abstractions/interfaces as in normal programming, because you're working directly with familiar data structures.

2. Separation of reader and evaluator (interpreter or compiler, or both). The reader transforms textual representations into in-memory values and data structures, and the evaluator only ever receives these in-memory representations. This makes programmatic manipulation of an AST easier: you never have to do code-gen by emitting source-code text to feed into a compiler that expects text input. Once past the reader, everything is an in-memory representation, so you never have to get back to text. That said, it's trivial to get back to text if you want, because Lisps also include a printer, which takes an in-memory representation and prints the glyphs of its readable (i.e., can be read by the reader) textual representation. (A small sketch follows this list.)

3. The evaluation model explicitly supports tagging some code as a macro to go through macro expansion before being evaluated "normally".

4. It is idiomatic to use macros to solve code re-use problems that cannot be readily handled by standard function calls.[1]
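
Here is the small sketch promised above, illustrating points 1 and 2 at a Clojure REPL (results shown as comments):

    (def form (read-string "(+ 1 2)")) ; reader: text -> a plain list of symbols and numbers
    (first form)                       ;=> +
    (cons '* (rest form))              ;=> (* 1 2)    manipulated with ordinary list functions
    (eval (cons '* (rest form)))       ;=> 2          the evaluator consumes data, not text
    (pr-str form)                      ;=> "(+ 1 2)"  the printer gets back to text when needed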

These four Lisp traits are not exclusive to Lisp, necessarily. As I mentioned above, you could build a macro expansion and code generation library that is part of a build process for any compiled language. Eval, in interpreted languages, is as expressive as Lisp macros, but is incredibly poorly supported and unergonomic in comparison.

It's an ergonomic and feasibility thing more than a capability thing. That said, POSIX shell is Turing-complete, so you don't need anything like C# or Lisp or Haskell or anything else we've discussed to get your computation done. The reaction you might have to comparing shell scripting to Haskell is not dissimilar to the reaction an experienced Lisp programmer might have to comparing AST manipulation and code gen through a library to Lisp's macro facilities.

[0]: Clojure, specifically, abstracts the idea of a callable, and there are more callable things than special operators, macros, and functions.

[1]: Languages such as Haskell and OCaml continue to push the boundaries of what can be handled by "standard function calls" (a term I hesitate to use, given its imprecision). That said, given that macros allow the use of a Turing-complete language to generate inputs to the evaluator, they allow you to implement any language construct that is computable without modifying the evaluation infrastructure.

mwattsun
> homoiconicity... is simultaneously very prosaic and very deep

That's what drives me on. That and Rich Hickey's enthusiasm and the sense that he knows something really valuable that I don't. I watch a lot of videos as well, so I can't remember who said that programming languages are user interfaces between human and machine, an attempt to talk to machines to get them to do things. In that sense, programming languages are a lot deeper than syntax convenience or efficiency. A good language allows us to talk to the machine at a much higher level, and I get the sense from functional programmers that this is what they find so attractive.

I will contact you if I get really stuck and when I finally get it just to share it with someone. Thanks!

edit: I've heard a lot about SICP but never looked into it. I think it's time, now that I have time

bcrosby95
Homoiconicity is neat, but I think the really nice thing is the S-expressions and the general lack of special keywords. It makes it easy to take a chunk of code in your source file and seamlessly send it to the REPL and have it work.

Compared to Elixir (another language I like): it is homoiconic, but it doesn't have this feature. I can't, for example, take an arbitrary function definition and update or define it in the REPL, because all functions must be defined in a module. However, I like those modules because I think they help enable discovery of functions: it's much easier to find all the operations you can do on lists or maps in Elixir than in Clojure.

Also, I wouldn't necessarily call macros more powerful than functions. They're just different, and they enable different things. You cannot, for example, pass a macro around like you can a function. Personally I've never written them, but I use them every day because the core library defines macros. I view them as a way for people smarter than me to give me tools I can use to write my programs.

As far as driving you on, I don't think Clojure is the end-all be-all language. I happen to like it. I also like Elixir. I know people who really love other, more standard languages though. I don't think you need to force yourself to love or "get" Clojure, but playing around with it can expand your mind a bit, especially if you've never used an FP language before.

But if you want an FP language that (IMHO) would be easier to transition to, Erlang was the first FP language that I "got", after which I tried Clojure again. The syntax is a lot more like traditional imperative programs, which can ease the transition. Nowadays I would recommend Elixir over Erlang for several reasons: it's more modern with modern tooling, has fewer syntax quirks, a more active online community, and lots of active development.

For me, the main benefits of Clojure are (other FP languages have some of these too):

1. Sharable, immutable state as the default.

2. A set of higher order functions that can do all sorts of data transformation.

3. Being able to use the same language on the web in client + server (without having to use Javascript).

4. Clojure gives me the ability to, more than other languages I've used, target the level of abstraction that I need for the problem at hand, which removes a lot of cruft.

5. Lack of special syntax: helps enable REPL driven development, makes it easier to seamlessly adopt new features from other languages.

6. It's hosted on two ecosystems I'm very familiar with - Java and Javascript.

greggyb
I'll recommend the videos from the 1980s and this version of the text, which is set nicely and has working footnotes.

Videos: https://ocw.mit.edu/courses/electrical-engineering-and-compu...

Text: https://github.com/sarabander/sicp (links to epub and html versions at the top of the README)

mwattsun
> I attempted SICP a few times

I don't have a computer science degree and I think that has held me back some, but I try to overcome it by always trying to learn and never thinking I should already know something or feeling like I'm an expert. I don't have the patience at my age to go back to school and get a degree. That you attempted SICP a few times is a good sign that you don't give up and speaks to your success in finally understanding.

p_l
Consider that macros generally don't expand syntax - because there is very little syntax anyway, and what you have is a tree of data and function calls.

A macro just lets you write a function that will be executed by the compiler to manipulate such a tree. That means you can get the equivalent of adding syntax like "with/using" from some languages by writing a function that takes the object and a block of code as arguments, then re-emits the same block of code with an object.Dispose() call added at the end. Or whatever else you need.

But they are still normal functions - it's the time of execution that changes.
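
In Clojure terms, that "with/using" idea could be sketched roughly like this (with-disposable and .dispose are hypothetical names, chosen only to mirror the C# Dispose idea):

    (defmacro with-disposable
      "Binds name to init, runs body, and always calls .dispose on name afterwards."
      [[name init] & body]
      `(let [~name ~init]
         (try
           ~@body
           (finally
             (.dispose ~name)))))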

mwattsun
I generally never used macros when I wrote C because the C macro language is lacking, there are a lot of pitfalls, and saving function-call overhead is usually not that important.

I think I'll try to get a new understanding of Lisp macros by coming up with a language feature in C# or Dart that I really want but can't implement without a lot of non-intuitive difficulty, yet could implement easily with a Lisp-style macro that the compiler could use to build the new feature in at compile time.

From a Clojure perspective, here are a few things to think about and explore to help understand where the power and value of macros lie.

1. Clojure does not have all of its various conditionals as language built-ins. There is a single conditional construct built in to the language, if. The rest are all defined in terms of if. Function arguments are eagerly evaluated in Clojure. How do you write a short-circuiting function in Clojure? (when predicate consequent) as a function must evaluate both the predicate and the consequent. If consequent is slow to calculate then your whole when expression is as slow as consequent, even if predicate is false and we do not need consequent. when is a Clojure macro based on the if special form.

2. core.async is a mere library.[0] It implements CSP (the same idea as Go's channels) and async programming (the sort of thing that needs compiler support in other languages such as C# or Go) purely as a library. If Rich Hickey hadn't written core.async, you could have built this control flow yourself. In fact, you can create arbitrary control flow mechanisms using macros. Common Lisp's entire object system can be implemented with macros.

3. The only special forms in Clojure are def, if, fn, let, loop, recur, do, new, ., throw, try, set!, quote, and var. The rest of what you would consider the core of Clojure consists of functions and macros. Much of the control flow functionality is implemented in macros.[1]

The core of macros is that they allow you to evaluate arbitrary user code at compile time. Clojure code is represented as native Clojure data structures (the same is true for all Lisps). The entire standard library which you are accustomed to using to manipulate standard data structures is available at compile time to transform the data structures that represent your code. I can't explain this better than Rich Hickey does in [1], so I encourage you to watch the linked section of that talk.
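
As a small illustration of that last point, here is a hypothetical defconstants macro (not part of Clojure): ordinary functions like map, list, and cons run at compile time inside the macro to build the code the compiler then compiles.

    (defmacro defconstants
      "Expands (defconstants [pi 3.14159] [e 2.71828]) into a do of def forms."
      [& pairs]
      (cons 'do (map (fn [[name value]] (list 'def name value)) pairs)))

    (defconstants [pi 3.14159] [e 2.71828])
    pi ;;=> 3.14159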

[0]: https://youtu.be/yJxFPoxqzWE?t=560 See this video for an overview of core.async. Later on he talks about the stuff that is implemented as macros in the library. This link starts up when he's talking about C# style async.

[1]: https://youtu.be/P76Vbsk_3J0?t=1333 See this video for an overview of the core of Clojure. The link starts with an overview of Clojure's data types and data structures (skipping the intro for the reasons behind Clojure). This leads directly into the evaluation model, which is based on the data structures of Clojure. Watch through ~1h:30m to get an overview of what macros are used for in Clojure's core.

> I'm explaining why Lisp is hard to learn.

It is not hard to learn in general; there are too many counterexamples of people of all sorts and skill levels learning it without the issues you are having - learning it as their first language, learning it later in their careers... (CL is big, I'll give you that, but each piece can be chewed one at a time and none seems individually that difficult.) So I can only hypothesize why it's hard to learn for you. My first guess is still that you aren't meeting Lisp on its own terms and are trying to make Lisp fit your mental model of some other language and its ecosystem. This isn't going to work, and the failure mode (if I'm right) has nothing to do with Lisp but with biases in general heuristics of learning.

You don't need to "unlearn" the way you're used to doing things, but you do need to separate yourself from it for a bit until you actually grasp the Lisp (and its ecosystem's) way(s) and can then see the connections, tenuous and non-existent as some may be. I'll risk raising some of those connections below, but really, things need to be understood on their own terms first or you wind up with misconceptions. (I've seen it multiple times with Java/Python programmers getting confused at the other's "import" statement.)

> strange terminology

It's not Lisp's fault that it's old. Being old means it will have terminology that may be alien to what we use today, with nothing modern being a perfect fit; alternatively, it sometimes expresses what we say today in a different way. (And when you talk about concepts, not just pure language features, you run into incommensurability. See https://dreamsongs.com/Files/Incommensurability.pdf for an example with Mixins.)

CLOS for example has these things called "slots" which are more or less "fields" or "class/instance variables" in later OOP languages. But it's best to think of them as CLOS slots, so that you don't bring along any (mis)conceptions from other languages. It's sometimes amusing how different other OOP systems are given CLOS was the first to be part of an ANSI standard.

> that took place in 2008

I rather like that I can often find things about Lisp that are decades old and still relevant. Systems go back at least as far as the Lisp Machine (1979). Perhaps it's worth going to the original description there? I also sort of like this history diving, even if modern sources obsolete the history in every way that matters. Here is the Lisp Machine manual on the topic: https://hanshuebner.github.io/lmman/maksys.xml "The way it works is that you define a set of files to be a system, using the defsystem special form, described below."

Is that unclear? Systems are first and foremost sets of files. Systems also let you define relations between files, primarily which files depend on which other files, as further sentences reveal.

If you're talking about this page as the first result: https://common-lisp.net/~mmommer/asdf-howto.shtml I agree it's an awful guide. I've never seen it before, it's a shame it's the #1 result. You'd be best served by ignoring it. Systems have nothing to do with CLOS, I assume that guide started trying to use animal parts as a way to elucidate dependencies. (A tail doesn't actually depend on legs though?)

I don't recall exactly how I learned systems. Probably some mix of ASDF's manual (https://common-lisp.net/project/asdf/), the best-practices guide linked from ASDF's home page (https://gitlab.common-lisp.net/asdf/asdf/blob/master/doc/bes...), just looking at example asd files of popular libraries and applications (https://github.com/CodyReichert/awesome-cl), a short ebook I occasionally shill (https://www.darkchestnut.com/book-common-lisp-application-de...), and actually writing enough lisp and making my own library was, in part or all together, enough. When you're actually writing software, you can just incrementally compile and load your files (or just eval chunks you've selected with the editor), or, if you've made a bunch of changes across the whole project, you can just reload your system (if you've defined it) and have everything recompiled and reloaded for you.

It's worth noting that systems are not part of the ANSI Lisp standard, though. Just as Maven is not part of Java. If you jump straight to Maven without first understanding how to create a Java program with multiple files using javac and jars, you might have a hard time later. Lord knows plenty of people get by relatively fine skipping around though (and might not even touch Maven directly, just do everything from an IDE); you can do this with Lisp too, but you seem to want to understand things more fundamentally.

Probably what you need to understand most is the core function "load" (http://www.lispworks.com/documentation/HyperSpec/Body/f_load...). After that you probably want to know about packages (which are akin to namespaces -- bags of symbols, if you already know what 'symbol' means in the CL context). You may wish to know about the "modules" mini-feature enough to know it's deprecated and doesn't really provide anything like what you think a module system should, so no one uses it, and that this mini-feature is independent from something else you might find rarely mentioned in ASDF that is unhelpfully called a 'module'.

> packaging in lisp?

Lisp itself doesn't have a notion of "packaging" in the sense I'm inferring from you, so the question is wrong at the outset. It just has things like compile-file and load. If you like, you can loosely refer to a directory of files you load one by one as a "package" or a "library". You can define a system for those files with ASDF, and treat "loading the system" as something like "loading the library". When you want to get other "libraries" (directories of files -- just like python eggs or java jars, but lisp libs aren't compressed into zips) from the internet, or publish your own, the community standard is https://www.quicklisp.org/beta/

Sorry if this is just more unhelpful text; I can feel your frustration even if I don't fully understand it. I do agree that being alien is a detriment to Lisp's popularity, but that's not going to change. You might find some leverage by working toward CL from the less alien Clojure as a starting point. I always thought this two-part talk on Clojure for Java Programmers was really good, but the last person I sent it to dozed off watching it, so your results may vary: https://www.youtube.com/watch?v=P76Vbsk_3J0

Oct 26, 2020 · pgt on An Intuition for Lisp Syntax
Lisp has only one rule called the Operational Form:

   (operator arg1 arg2 arg3 ...)
The Operational Form is just a list denoted by parentheses. The operation or 'function' comes first, followed by its arguments or operands, e.g.

   (+ 1 2 3)
   => 6
The + symbol resolves to the plus function, which is passed the arguments 1, 2 and 3. Nested forms are evaluated inside-out:

   (+ 5 (- 10 6))
   => 9
Traditionally, Lisp has only one collection literal, the `(linked list)`, but Clojure adds `[square brackets]` for vectors and `{:curly "braces"}` for hash maps. The colon denotes a keyword, which only ever resolves to itself and is commonly used for labeling things, e.g. keys in a map.

Notice how the operational form is just a list with some symbols and data literals. In Lisp, the syntax for writing data structures and the syntax for writing code is the same syntax. And since a function operates on data and produces new data, what if a function could operate on code and produce new code, since code is just data?

We call such a function a macro (and I don't mean Excel macros). Macros run at "compile time" (more precisely, at macro-expansion time) and the code they output is executed at "run time" (during evaluation). The benefit of macros is that if your language is missing a feature, you can add it.
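
For example, here is a minimal sketch of adding a construct Clojure doesn't ship with - call it unless (Clojure's own equivalent is when-not):

    (defmacro unless
      "Runs body only when test is logical false."
      [test & body]
      (list 'if test nil (cons 'do body)))

    (unless false (println "runs"))    ; prints "runs"
    (unless true  (println "skipped")) ; body is never evaluated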

You can see this in practice by looking at the source code for the `and` and `or` logic operators in Clojure. They are typically built-ins, but in Clojure they are just macros bootstrapped on top of `if` and `let`: https://github.com/clojure/clojure/blob/38bafca9e76cd6625d8d...

Clojure has only 13 primary special forms (plus the Java interop forms `.`, `new`, and `set!`):

    [def if do let quote var fn loop recur
     throw try monitor-enter monitor-exit]
Everything else is built on top of that.

When I started learning Clojure, I found the ClojureScript Koans to be very helpful in getting a feel for the semantics and to become familiar with the argument placement: http://clojurescriptkoans.com/

If you come from a traditional OO background, my condolences, and I recommend starting with Rich Hickey's 2-hour talk, "Clojure for Java Programmers": https://www.youtube.com/watch?v=P76Vbsk_3J0

kyberias
> If you come from a traditional OO background, my condolences

Is this condescending attitude really necessary or useful?

Aug 12, 2018 · 5 points, 0 comments · submitted by tosh
Same here. I never understood Java interfaces, abstract classes, and a ton of other "features", but picking up Clojure was a breeze. I don't understand how complicating things that are supposed to be simple helps you in any way. On top of that, I have seen several cases where Java programmers tripped over their own code because of complexity they thought they understood, until a particular case did something other than what they expected.

Reasoning about Clojure (LISP) code is always easy because if you follow best practices you have small functions with very little context to understand.

On top of that, I see LOC ratios of 50:1 (worst case even higher) for Java : Clojure code that does the same thing. Usually people get triggered and ask why it matters, but in reality less code is better for everybody: easier to understand, less chance of errors, etc. Correctness was lost long ago for the majority of Java developers; just put out a survey and you can see it for yourself.

It is also pretty common practice not to handle exceptions well and to just let a DNS-not-found error explode as an IOException, and good luck tracking down what caused it (this literally happened to me).

I understand that the average Java dev does not see any value in LISP (Clojure), but it is silly to expect the average member of any group to lead the scientific advancement of any field, including computer science.

One tendency you can see, if you walk around with open eyes, is that people who have spent significant time developing procedural code in an imperative language understand the importance of functional language features and the power of LISP. One can pretend it does not matter; see you in 5-10 years to see how much that changes.

https://twitter.com/id_aa_carmack/status/577877590070919168

https://www.youtube.com/watch?v=8X69_42Mj-g

https://www.youtube.com/watch?v=P76Vbsk_3J0

Indeed. Rich Hickey's 2-part presentation "Clojure for Java Programmers" was a really slick presentation of a Lisp. It focuses on the 'code is data' aspect from the beginning (at least once he starts talking about the language specifics themselves; he does spend a while motivating dynamic languages first) rather than talking a lot about just higher-order functions and macros and so on. He goes over the evaluation model and its difference from a Java/C++ model, he shows how to interop with Java... It's just a great talk.

Part 1: http://www.youtube.com/watch?v=P76Vbsk_3J0

Part 2: http://www.youtube.com/watch?v=hb3rurFxrZ8

Slides: http://www.slideshare.net/adorepump/clojure-an-introduction-...
