HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
The Value of Values with Rich Hickey

InfoQ · Youtube · 11 HN points · 15 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention InfoQ's video "The Value of Values with Rich Hickey".
Youtube Summary
In this keynote speech from JaxConf 2012, Rich Hickey, creator of Clojure and founder of Datomic, gives an awesome analysis of the changing way we think about values (not the philosophical kind) in light of the increasing complexity of information technology and the advent of Big Data. The broad subject of the talk makes it worth watching for almost anyone in the programming world, and it was one of the highlights of the JaxConf lineup.

If you like the talk, check out Hickey's other appearance at JaxConf2012 here: http://marakana.com/s/rich_hickey_on_deconstructing_the_database,1261/index.html
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
is caching the correct analogy?

the stars you see died eons ago. special relativity is your friend. we witness "immutability" casually.

nothing in the relational model that i am aware of says that a materialized view cannot be a simple immutable table of "facts". same input same output. unambiguous.

simply use the dynamic queries to summarize the "facts" into, typically, small materialized-view tables and your gui will never lie.

https://www.youtube.com/watch?v=-6BsiVyC1kM

Simple Made Easy (https://www.youtube.com/watch?v=LKtk3HCgTa8)

The Language of the System (https://www.youtube.com/watch?v=ROor6_NGIWU)

The Value of Values (https://www.youtube.com/watch?v=-6BsiVyC1kM)

All by Rich Hickey (Clojure creator) - strongest programming speaker I've found to date.

Who knows what bowl.stir() actually does to the internal state of bowl, and what methods should have been run up until this point to get into that state? Is the bowl ready to stir? What methods should have already run for that? So much of the article's code is crutched on good naming.

I like to think of this in terms of the Charizard Pokemon card

For context in this example I have this card and I'm sensitive about damage to it

so in this OO example I put the card in a box and allow you to interact with it in a very limited way, you cannot use anything you're used to to interact with it like your own gloves or hands etc

Just my "methods". So I might give you a tiny hole to look at it. You could still damage it through the hole, so I have lots of logic to ensure you cannot poke it incorrectly. Hopefully the verbosity on both your side and mine is/was worth it, bug free, and not missing cases, and hopefully my hole was in the right place for your uses

Obviously I can't give you too many holes in the box otherwise what's the point in the box? I need the box to maintain my sanity

The other alternative is I just give you the card and take the risk that you might damage it, which is a disaster for my well-being. OR I duplicate the card perfectly and give you the duplicate, in which case I don't care what happens to the duplicate. MUCH easier in my opinion, so please

Stop creating hellish boxes with holes for other developers to peek through. Just choose a language with efficient immutability as the default, or use pass-by-value semantics with mostly pure functions

Reserve your classes for things that are truly data structures in the general sense, not bs domain stuff like "bowl". Bowl is not a fundamental type of computer science like integer; bowl is just data and it should be treated as such https://www.youtube.com/watch?v=-6BsiVyC1kM so it can have schema and such, but don't put it in some kind of anal worry box, otherwise your program may end up more about managing boxes and peek holes than about pokemon cards
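The box-with-holes analogy above can be sketched in a few lines of JavaScript. The `CardBox` class and its methods are illustrative names, not from the talk; the point is the contrast between a guarded mutable object and simply handing out an immutable value (or a copy):

```javascript
// 1. The "box with holes": a class hides the card and exposes narrow methods.
class CardBox {
  #condition = "mint";                     // private state; callers can't touch it directly
  inspect() { return this.#condition; }    // a "tiny hole" to look through
  poke() { this.#condition = "damaged"; }  // every hole needs its own guard logic
}

// 2. The value approach: hand out an immutable value, mutate only copies.
const card = Object.freeze({ name: "Charizard", condition: "mint" });
const yourCopy = { ...card, condition: "damaged" }; // your copy, your problem

console.log(card.condition);     // the original stays "mint"
console.log(yourCopy.condition); // the duplicate is "damaged"
```

With the value approach there is no interface to get wrong: the original cannot be harmed, so no defensive holes are needed.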

bendbro
I think this argument can also be extended to cloud services, where the "OOP interface" analog is the CRUD site, queue, or other more horrible infrastructure used to give access from a client to a service.

> not bs domain stuff like "bowl", bowl is not a fundamental type of computer science like integer, bowl is just data and it should be treated as such

To an extreme: if your abstraction isn't formally verified, kill it?

Assuming as truth the idea that abstractions follow organizational structure, should you then only divide an organization when you have a formal abstraction for each division?

I wish there were a way to reason about this stuff that isn't so artful. I intuitively understand things like DRY, SOLID, etc., but being absolutely confident that they are true, or that they have been applied correctly, is art, and I would prefer it to be math.

UncleMeat
> Who knows what bowl.stir() actually does to the internal state of bowl, and what methods should have been run up until this point to get into that state, is the bowl ready to stir?

A major benefit of OO is that you can actually enforce this. Encapsulation is useful for data objects where some configurations of bits are valid and some are invalid. Careful interfaces let you ensure that the object is always in a valid state and does not permit you to do a thing when it is not valid. The fact that you'd be unsure of these questions is an indication that your interface is done poorly.

Granted, this is really hard to get right. Doing it badly leads to the nightmare of easily mutated state that isn't easily visible.

"Copy everything" can be a really compelling option for many programs and there are persistent data types that help do this in a mostly scalable fashion. But there are plenty of cases where it just won't work. In my job our system primarily works on a data object that is too large to meaningfully copy everywhere. The solution is extremely judicious use of "const" and clear rules for automatically invalidating certain dependent program state when the underlying state we are working with changes. Lots of work, but in the end you get a ton of very strong invariants that make it really easy to work with the data.
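UncleMeat's point that a careful interface can make invalid states unreachable can be sketched with the thread's own bowl example (the `Bowl` class and its methods here are hypothetical, just to show the shape of the idea):

```javascript
// A careful interface: every method checks that the object is in a valid
// state for that operation, so callers never have to wonder.
class Bowl {
  #ingredients = [];
  #stirred = false;
  add(ingredient) {
    if (this.#stirred) throw new Error("already stirred; add ingredients first");
    this.#ingredients.push(ingredient);
    return this; // allow chaining
  }
  stir() {
    if (this.#ingredients.length === 0) throw new Error("nothing to stir");
    this.#stirred = true;
    return this;
  }
  isStirred() { return this.#stirred; }
}

const bowl = new Bowl().add("flour").add("eggs");
bowl.stir(); // fine: the interface guaranteed the bowl was ready
```

Calling `new Bowl().stir()` throws instead of silently producing a nonsense state, which is the invariant-enforcement benefit being described.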

Sep 01, 2020 · josephg on The database I wish I had
Most software isn’t FAANG. In the long tail of database size, I’d guess 95%+ of databases are smaller than a few hundred megs. And at that size, storage is cheap and storing a full historical record of changes should absolutely be the default.

Rich Hickey gave a great talk on this a few years ago, about the difference between Place and Value. He says accountants learned this lesson hundreds of years ago with bookkeeping. Users just don't generate much data relative to the size of our disks. Well worth a watch, especially if you disagree:

https://youtu.be/-6BsiVyC1kM

Someone1234
So immutable records only makes business sense in databases too small to benefit from immutable records (where you can just version the entire database, for a "few hundred megs")?

I don't understand what FAANG has to do with this. Medium or large non-tech companies commonly have business critical databases in the hundreds of gig range.

Acting like medium to large databases aren't common in business is just outright odd. The argument also doesn't address what I asked (business case for this vis-a-vis cost).

josephg
> So immutable records only makes business sense in databases too small to benefit from immutable records

My claim is that discarding historical data is an optimization. It's an optimization that should be off by default and turned on when needed. Archiving and compacting history should be something you only need to do when your database size gets out of control.

For small databases there's no reason to throw away historical records at all - and having an immutable log of records and updates should be the default.
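The "immutable log of records and updates" idea can be shown with a toy append-only store: updates append new facts instead of overwriting old ones, so any past state remains queryable. All names here (`record`, `valueAsOf`, the monotonic `at` counter standing in for a timestamp) are illustrative:

```javascript
// A toy append-only store: every update is a new fact, nothing is destroyed.
const log = [];
const record = (key, value) =>
  log.push({ key, value, at: log.length }); // log.length before push = this entry's "time"

// Reconstruct the value of a key as of any point in time by replaying the log.
const valueAsOf = (key, time) => {
  let result;
  for (const entry of log) {
    if (entry.at <= time && entry.key === key) result = entry.value;
  }
  return result;
};

record("email", "old@example.com"); // at 0
record("email", "new@example.com"); // at 1
console.log(valueAsOf("email", 0)); // "old@example.com" — the past is still there
console.log(valueAsOf("email", 1)); // "new@example.com"
```

A destructive UPDATE would have erased the ability to answer the first query at all; here, "what was it then?" is just another read.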

ascar
> Most software isn’t FAANG. In the long tail of database size, I’d guess 95%+ of databases are smaller than a few hundred megs.

I think that statement is plainly untrue for most databases that generate enough value to pay for a developer working on them. I would bet it can be reversed: at least 95% of business-relevant databases are larger than a few hundred MB.

I've managed a small 200-player old-school browser game for a few years, and our database creates a few hundred megabytes per month. A major source of this data is a user trace log ('what action they performed when', compacted into a timestamp and 2 integers), which we clear on every new round (about 2 months). Keeping every update (instead of the small trace log) would easily scale to gigabytes per month. And we're talking 200 users. I also worked for a small 150-employee web service company, where the development database snapshot was about 5 GB (a heavily trimmed-down and anonymised version of the production DB).

Now, cold storage is cheap. But fast-access database storage isn't the same as a disk. That's why we keep hot data in memory, only write to disk when necessary, and back up to cheaper storage.

Lastly, a relational database's WAL is a complete historical record of all writes, not just a snapshot of the current state. It's used by default and is in fact necessary to accomplish ACID and consistency guarantees. Keeping this log gives you the full history that allows restoring the full database state to any given point in time without polluting the actual data with historical records. Granted, access to this data is much harder than a simple query.

There are also many other options to keep historical data rather than keeping it in the OLTP database, like OLAP data lakes.

The Value of Values with Rich Hickey

https://youtu.be/-6BsiVyC1kM

Immutability has been pretty hot of late, though still controversial. There are several popular languages which encourage or even force immutability (Elixir, Clojure). What sold me on the idea was Rich Hickey's talk, "The Value of Values": https://www.youtube.com/watch?v=-6BsiVyC1kM

I look at it as mostly a safety net: I'd rather use const by default, unless I have a good reason to be able to mutate the variable. (To flip it around: what problem does being mutable-by-default solve?) 9 times out of 10, I don't need to change the value for the lifecycle of the function; and often if I do, the code is cleaner anyway if I use an immutable pattern (such as `const foo = bar.map()` rather than `for (x in bar) foo.push()`). Thanks to the magic of structural sharing in persistent data structures, we can generally make fresh values without it costing much memory overhead.
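The map-versus-push contrast in that comment, spelled out (the variable names are just placeholders):

```javascript
const bar = [1, 2, 3];

// Mutable accumulation: foo is built up step by step via intermediate states.
const fooMutable = [];
for (const x of bar) fooMutable.push(x * 2);

// Immutable style: one expression, one binding, no intermediate states.
const foo = bar.map(x => x * 2);

console.log(foo); // [2, 4, 6]
```

Both produce the same array, but in the second form `foo` never exists in a half-built state, which is what makes the code easier to reason about.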

sunseb
I agree, lukifer. Immutability is useful and interesting, but `const` in JS is not really about immutability, it's about variable reassignment (which is trivial to handle really, even for a noob programmer).

const arr = ['foo'];

arr[0] = 'bar';

assert(arr[0] === 'bar'); // !!!

Something like that would make sense though IMHO:

const PHI = 1.618;

I don't get how JS is designed really; they try to solve a bad design (`var`) but introduce another bad design (`const`) and make things even messier. Many devs ship JavaScript through TypeScript these days, so I hope Microsoft can clean this mess up one day.

lukifer
Yeah, that's fair, I've been bitten by "not actually immutable" bugs before (works well enough for primitives, but not objects or arrays, which is most things in JS-land!). Libraries like Immutable.js can help bridge the gap.
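The gap being described — `const` prevents rebinding, not mutation — is what `Object.freeze` partly covers. A small sketch, echoing sunseb's `arr` example (and showing why freezing alone isn't enough, which is where libraries like Immutable.js come in):

```javascript
// `const` only prevents reassignment; Object.freeze blocks (shallow) mutation.
const arr = Object.freeze(['foo']);
try {
  arr[0] = 'bar'; // throws in strict mode, silently ignored otherwise
} catch (e) { /* either way, the write does not land */ }
console.log(arr[0]); // still 'foo'

// But freezing is shallow — nested objects remain mutable:
const obj = Object.freeze({ nested: { n: 1 } });
obj.nested.n = 2; // works! obj is frozen, obj.nested is not
console.log(obj.nested.n); // 2
```

Deep immutability needs either recursive freezing or persistent data structures, which is the gap Immutable.js (and languages like Clojure) fill.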

> they try to solve a bad design (`var`) but they introduce another bad design (`const`)

`let` is intended to be the actual `var` replacement, and it's okay for that. I think there's an argument that `const` is a bad design, in the sense that it's deceptive, under the not-that-uncommon case you describe. Still, I'd rather have it than not, and I personally still like defaulting to `const` and opting-in to `let` in the (rarer and rarer) event that it's needed. (I'm also on 100% TypeScript at this point, which doesn't remove the potential for problems, but does provide an extra layer of guard rails in general.)

sunseb
Yes, `let` is fine. I use `let` 99% of the time in my personal projects, it's more convenient, but I often must use `const` when working on an existing codebase. :(

Most JavaScript devs do as you do: they default to `const` and, if they need to reassign the variable, they switch to `let`.

To me it feels more like a ceremony for declaring a variable than a rational and useful practice. :)

madeofpalk
What about const/let is poorly designed?
sunseb
Have a look here:

https://jamie.build/const

https://twitter.com/dan_abramov/status/1208369896880558080 (React core team member)

https://twitter.com/littlecalculist/status/91787524189167616... (Ember creator)

madeofpalk
I don't think it's poorly designed, it's just perhaps not all that useful?
karatestomp
Some people seem really unhappy that it might mean a reference is constant, rather than a value, if the value happens to be a reference. If that makes sense.
TechBro8615
The issue I have with `const` is not your original premise, but the one you allude to here: people confuse reassignment and immutability. Because of this, they end up using const/let to "signify that they're not changing a variable," when it doesn't actually matter anyway, because they're thinking of immutability but actually signaling reassignment capability. You can usually tell who has only a superficial understanding of the language by reading how `const` and `let` are used in their code.

Personally, I default to const, and use it pretty much 99.9% of the time. I use `let` in the few cases where it's actually useful to have a block-scoped variable. Ironically, this probably does a better job of "communicating" intent than the ritualistic obsession with "immutability" does, simply because you know if you're deviating from the default, there is a good reason for it.

I think Rich Hickey's (the creator of Clojure) talk, The Value of Values, is a great 'splainer: https://youtu.be/-6BsiVyC1kM
Rich Hickey has a talk on this (The Value of Values) here: https://youtu.be/-6BsiVyC1kM
thomk
Thank you, this talk was paradigm shifting AND familiar for me at the same time.
Rich Hickey's "Value of Values"[0] is what finally sold me on the benefits of pure functional programming and immutable data structures. (It remains horrifying to continue working with MySQL in my day job, knowing that every UPDATE is potentially a destructive action with no history and no undo.)

[0]: https://www.youtube.com/watch?v=-6BsiVyC1kM

jamie_ca
That feature's "coming" - part of SQL:2011, called System Versioning, currently supported in DB2 and SQL Server, and available in MariaDB 10.3 Beta according to this talk:

https://youtu.be/xEDZQtAHpX8?t=2278

juangacovas
I thought the MySQL binlog was the history and mysqlbinlog the tool to help with the undo.
lbj
Well put - I came here to say the same :)
spudlyo
Oftentimes MySQL is set up with auto-commit set to true, where every DML statement (like UPDATE) is wrapped in an implicit BEGIN and COMMIT. It doesn't have to be that way, though: you can manage the transaction yourself, and you don't have to COMMIT if you don't want to; you can undo (ROLLBACK) if necessary.
lukifer
It’s true, transactions in MySQL work great. But once the change is committed, the previous value is overwritten permanently. If the user wants to undo five minutes later, or I want to audit based on the value a month ago, we’re hosed unless we’ve done backflips to bake versioning into the schema.

I think Hickey’s comparison to git is apt: we don’t stand for that in version control for our code, why should we find that acceptable for user data?

vp8989
Because there is vastly more state in the world than there is code. And frankly, most state is not that important. Just wait until you work with a sufficiently large immutable system, they are an operational nightmare.

You should opt-in to immutability when the state calculations are very complex and very expensive to get wrong.

I do wish mainstream languages had better tools for safe, opt-in immutability. Something like a "Pure" attribute you assign to a function: it can only call other pure functions, and the compiler can verify that it has no state changes in its own code.
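Lacking such a compiler-checked attribute, a crude runtime approximation is possible: deep-freeze a function's inputs so attempted mutations fail fast (in strict mode) rather than pass silently. This is a sketch only — the `deepFreeze`/`pureish` names are hypothetical, and a runtime check is nowhere near the static verification the comment asks for:

```javascript
// Recursively freeze an object and everything reachable from it.
const deepFreeze = (obj) => {
  for (const value of Object.values(obj)) {
    if (value && typeof value === 'object') deepFreeze(value);
  }
  return Object.freeze(obj);
};

// Wrap a function so its object arguments are frozen before the call:
// any mutation inside throws in strict mode instead of corrupting state.
const pureish = (fn) => (...args) =>
  fn(...args.map((a) => (a && typeof a === 'object' ? deepFreeze(a) : a)));

const total = pureish((xs) => xs.reduce((sum, x) => sum + x, 0));
console.log(total([1, 2, 3])); // 6
```

It catches accidental mutation of arguments, but not reads of global state or other side effects, which is exactly why language-level support would be valuable.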

Nov 05, 2018 · 1 points, 0 comments · submitted by tosh
Sep 24, 2018 · 1 points, 0 comments · submitted by palerdot
Anything by Rich Hickey, especially "The Value of Values" https://www.youtube.com/watch?v=-6BsiVyC1kM .
Mar 06, 2018 · 1 points, 0 comments · submitted by tosh
Clojure's implementation did not have to be so complicated, there are simpler implementations of persistent data structures.

Immutability is obviously the more elegant option as far as I'm concerned: Hickey's The Value of Values[0] lays out the case quite well.

[0]: https://www.youtube.com/watch?v=-6BsiVyC1kM

Dec 30, 2016 · JMStewy on Why Clojure? (2010)
That talk was "The Value of Values," which I enjoyed as well.

Links for anyone interested:

https://www.infoq.com/presentations/Value-Values (1 hour version)

https://www.youtube.com/watch?v=-6BsiVyC1kM (1/2 hour version)

cle
That's a great talk and has had a lot of influence. It's specifically referenced in Project Valhalla's proposal: http://openjdk.java.net/jeps/169
agumonkey
Haa, good catch. That's a strong sign, if Oracle/Java is using it as an inspiration.
It's funny how catnaroek was completely vendetta-downvoted in this thread for speaking valid things. For those who don't understand the difference between objects and values, this video may be of help: https://www.youtube.com/watch?v=-6BsiVyC1kM
Hi! I have been a programmer for a couple years and started picking up clojure after watching a few of rich hickeys talks. This is the one that best satisfies your question, imho: http://youtu.be/-6BsiVyC1kM
Jul 03, 2015 · 2 points, 0 comments · submitted by RKoutnik
If you have the slightest interest in the issue of concurrency then you should study Erlang and Clojure. Since Moore's Law has largely faded, and we now gain speed by putting more CPUs in each server (rather than having the individual CPUs be faster) concurrency is an increasingly unavoidable aspect of programming. Joe Armstrong (Erlang) and Rich Hickey (Clojure) have both spent an enormous amount of time thinking about how to support concurrency in a programming language. Armstrong talks about a paradigm that he calls "Concurrency Oriented Programming". Armstrong's thesis is surprisingly readable and I recommend it:

http://www.erlang.org/download/armstrong_thesis_2003.pdf

Rich Hickey's video presentation "The value of values" is also a good overview of some concepts that are important to Functional Programming:

https://www.youtube.com/watch?v=-6BsiVyC1kM

darthVapor
How do you feel about Go?
lkrubner
I think Go is very interesting. One thing that I like about Go is something for which it is often criticized, which is the ability to create a binary that holds all dependencies. Some people are critical of these "fat binaries", but I have lost countless hours dealing with dependency issues, in various languages, and dependency issues are my least favorite kind of computer problem, so any language (and/or eco-system) that gives me an easy way to solve dependency problems is interesting. Likewise, when I work in Clojure I often go with the "uberjar" option to deploy, which often means the only outside dependency is the JVM itself. But Go can go further, as there is no need for some VM, like the JVM.
Cyther606
I agree, shipping the dependencies in a single binary is the easiest way to package software. Nim is the same as Go in this respect: native code generation via compilation to C without a VM. And for me, Nim's Pythonic syntax is hands down more pleasant than Go's or even D's. It's the only systems programming language I've come across where it feels like I'm coding at the speed of thought.
darthVapor
I haven't heard much about Nim (until reading this post). I saw that someone was comparing it to python in a way? What are your thoughts on it?
codygman
I believe Nim also has generics and macros.
kyrra
I think you misunderstand Moore's law. It states nothing about speed, but about chip complexity (the number of components in an integrated circuit). Moore's law has held pretty closely since 1965 (and it looks like we will maintain it until at least 2020 at this point).

What we have hit is the max frequency the CPU can operate at. This is not something Moore claimed (as far as I'm aware). Physics got in the way when the transistors got so small that they can't operate any faster without causing problems.

EDIT: reddit discussion about MHz cap from 3 years ago: http://www.reddit.com/r/askscience/comments/ngv50/why_have_c...

Jul 07, 2014 · 6 points, 0 comments · submitted by tosh
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.