HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Alan Kay - Normal Considered Harmful

powerfuloutlet · Youtube · 107 HN points · 21 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention powerfuloutlet's video "Alan Kay - Normal Considered Harmful".
Youtube Summary
A talk given at UIUC - Fall 2009
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Richard Feynman is the best teacher we know (explaining electricity): https://www.youtube.com/watch?v=nYg6jzotiAc

He also explains seeing, heat, electromagnetism, elasticity and mirrors, among other things.

His academic lectures are just as good but too long and hard for laymen to follow.

- Every time I switch on an electrical device I hear Feynman say 'Zzzzinggg' and I see the copper bars jiggling across town.

- Every time I see a cup of hot liquid I hear Feynman say "jiggling atoms"

- Watch his hands and fingers telling the more accurate science story, simulating the electrons and atoms.

- I would say that this is the most important video for any human being on the planet to see. The second most important thing would be half of Alan Kay's lectures https://youtu.be/FvmTSpJU-Xc?t=2067

Great videos to watch with your kids (from ages 3-4 and up)!

Definitely! Although I'm not sure if Morphic[1] is the way forward for UX on the web, despite it only being 10kloc. XD

[Anybody stumbling upon this comment, and wondering what Alan Kay is/was up to, this might be a good philosophical starter: https://www.youtube.com/watch?v=FvmTSpJU-Xc]

1. https://lively-kernel.org

Jul 30, 2021 · jhgb on The Itanic Has Sunk
I love the function call timing story:

> The iAPX 432 failure could best be summed up by a meeting of Intel marketroids and Tandem engineers who wanted to use the chip in their next-generation machine. After the slides, the senior engineer asked: “How long does it take to execute a procedure call?”

> The presenter looked it up. “Two hundred and fifty microseconds.”

> Tom immediately walked out, followed by the majority of the Tandem software department. The presenter was poleaxed. “What did I say?”

Found in comments to https://www.youtube.com/watch?v=FvmTSpJU-Xc , but I saw it somewhere else too, I just don't recall where. Does anyone know?
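For scale, 250 microseconds per call caps a program at about 4,000 procedure calls per second. A rough sketch of how one might measure per-call overhead today (the numbers are machine-dependent; the point is only the contrast in magnitude):

```python
import time

def noop():
    pass

def measure_call_overhead(n=1_000_000):
    # Time a tight loop of plain function calls and return the
    # approximate cost per call in microseconds.
    start = time.perf_counter()
    for _ in range(n):
        noop()
    elapsed = time.perf_counter() - start
    return elapsed / n * 1e6

# Even an interpreted language's call overhead today is orders of
# magnitude below the iAPX 432's reported 250 us per procedure call.
print(f"~{measure_call_overhead():.3f} us per call")
```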

mst
Interestingly, it appears based on https://en.wikipedia.org/wiki/Intel_iAPX_432#The_project's_f... that the procedure call instruction was a heavyweight thing designed to maximise features, while the chip could branch much more quickly - but their compilers didn't take advantage of that out of the box.
It's funny you say that. This is something Alan Kay talks about at length. For example, watch the first 20 mins of this talk - https://www.youtube.com/watch?v=FvmTSpJU-Xc

Essentially, he asserts that computing (or even computer science) isn't a real field (like biology or physics). Most people can't name the early computing pioneers whose work they build on (not true in physics or biology, where we celebrate the pioneers), nor are they familiar with the work that was done in the Xerox PARC days.

It's gotten so bad that a lot of computer science research just assumes that what we have now is what's going to remain forever. In the 70's, all kinds of interesting ideas and experiments were tried.

Hardware-implemented VMs, direct-manipulation programming, OS-free computing environments, highly reconfigurable computing a la FPGAs, and hundreds more.

Nowadays, we think that creating custom ASICs to run machine learning algorithms more quickly is innovative and novel.

All this to say that many of the ideas in Dynamicland aren't new. They're rooted in ideas that are decades old. If you look at Bret's papers he uses as references frequently, you'll notice how many of them are over 10 years old - http://worrydream.com/refs/

m_mueller
Just a personal anecdote, I gave a talk about D. C. Engelbart last week to my research group and none of them knew him beforehand.
skadamat
Then you'd especially love Alan's talk!
Alan Kay - Normal Considered Harmful: https://www.youtube.com/watch?v=FvmTSpJU-Xc
Alan Kay's 'Normal Considered Harmful' is a good place to start (more about computing in general): https://www.youtube.com/watch?v=FvmTSpJU-Xc
> as in other fields

Well maybe it isn't a field after all. Maybe it's "pop culture" like this cranky old guy says https://www.youtube.com/watch?v=FvmTSpJU-Xc.

You're all over the place here and your arguments make no sense whatsoever when examined.

First, you conflate mass appeal with some sort of objective "better" criterion, which is of course bonkers. To use one of your own examples against you, there are hundreds of thousands of Java monkeys out there using glue other people made to tie together rocks to build stone walls, which do fail as soon as the weather stops being nice. Security (you should look into Java deserialization bugs), reliability, performance: what do they know about any of these things?

Second, you conflate late-binding as present in Lisp and Smalltalk with late-binding present in other dynamic languages. The two are not equivalent, a perfect example of the whole is greater than the sum of its parts.

Lisp and Smalltalk will never become popular (read my previous comment), but that does not mean that they do not sit on an apex and still have a lot to give. To anyone interested in the "craft of programming", "the Art", there is nothing better period. Here are some references for you, from the masters themselves:

[1] https://www.infoq.com/presentations/We-Really-Dont-Know-How-...

[2] https://www.youtube.com/watch?v=YyIQKBzIuBY

[3] https://www.youtube.com/watch?v=FvmTSpJU-Xc

"The Mess We're In" by Joe Armstrong: https://www.youtube.com/watch?v=lKXe3HUG2l4

"Normal Considered Harmful" by Alan Kay: https://www.youtube.com/watch?v=FvmTSpJU-Xc

Dec 26, 2016 · armitron on Nurturing Genius
That depends on your definition of genius, which seems to differ widely from mine, especially if one takes "genius" to mean someone who does things a non-genius would not be able to do.

There have been attempts in recent years to roll back and debase the definition of genius to be more compatible with egalitarianism (especially in parts of Europe) and the tenet of homogeneity but this is not what genius originally stood for.

Fabian Tasano writes: "Regarding the version of “genius” that is currently in retreat but still occasionally used: many people seem to have a simplistic idea of what it takes to be one. According to one popular model, all that is required is an increase in the magnitude of certain qualities which everyone already possesses in some measure. Make the particular qualities pronounced enough, and you get to genius. But a better way to understand the concept — assuming we’re applying the word to (say) Gauss or Picasso, rather than John Cleese or Wayne Rooney — may be that a genius has a particular capacity, which on a certain level can seem obvious or unremarkable, but which no one else has. A genius, on this understanding, is a person uniquely capable of making a leap ‘off the path’. With hindsight the leap may seem simple or obvious, but at the time no one else was, apparently, capable of making it. A potential leap of this kind is made possible by preceding leaps. Nevertheless its actual occurrence may go on not happening for decades. During that time there may be clear pointers towards it. Yet it is not until a genius comes along that the leap actually happens."

I could not agree more with this. Alan Kay also outlines an extremely similar point of view:

https://www.youtube.com/watch?v=FvmTSpJU-Xc

Also see his blue/red world metaphor.

Nov 09, 2016 · 1 points, 0 comments · submitted by tosh
Aug 18, 2016 · avindroth on “It's The Future”
Of course not. I am in undergrad.

But I think Alan Kay has been "exposed" to computer science, and I follow his logic, based on my limited scope of knowledge.

https://www.youtube.com/watch?v=FvmTSpJU-Xc

This is obviously entirely false if one takes "genius" to mean someone who does things a non-genius would not be able to do.

There have been attempts in recent years to roll back and debase the definition of genius to be more compatible with egalitarianism (especially in parts of Europe) and the tenet of homogeneity but this is not what genius originally stood for.

Fabian Tasano writes:

"Regarding the version of “genius” that is currently in retreat but still occasionally used: many people seem to have a simplistic idea of what it takes to be one. According to one popular model, all that is required is an increase in the magnitude of certain qualities which everyone already possesses in some measure. Make the particular qualities pronounced enough, and you get to genius.

But a better way to understand the concept — assuming we’re applying the word to (say) Gauss or Picasso, rather than John Cleese or Wayne Rooney — may be that a genius has a particular capacity, which on a certain level can seem obvious or unremarkable, but which no one else has.

A genius, on this understanding, is a person uniquely capable of making a leap ‘off the path’. With hindsight the leap may seem simple or obvious, but at the time no one else was, apparently, capable of making it.

A potential leap of this kind is made possible by preceding leaps. Nevertheless its actual occurrence may go on not happening for decades. During that time there may be clear pointers towards it. Yet it is not until a genius comes along that the leap actually happens."

I could not agree more with this. Alan Kay also outlines an extremely similar point of view: https://www.youtube.com/watch?v=FvmTSpJU-Xc (Normal considered harmful).

Programming is an artistic/creative process. The crowd will never beat genius in this game since the "vision" of the crowd will be diluted by each participating individual. The obvious exception here lies in crowds that are ruled by geniuses with an iron fist (thus not really committees), something which can only strengthen my point. Examples that support this from the past 30 years in technology are countless. We even have sayings about software designed-by-committee.

The point Hintjens is trying to make with ZeroMQ is flawed because he equates the evolution of an idea that has already manifested to the inception of the idea itself.

PieterH
You're simply shifting the goal posts to try to keep others from scoring. There is no "idea". There are hundreds of thousands of small incremental steps.
Interestingly, this is what Alan Kay was advocating for the web: "A bytecode VM or something like X for graphics" [1].

When Lars Bak and Kasper Lund launched Dart [2], I found it sad that they weren't more bold - they could have left CSS and the DOM alone and created an alternative Content-Type. Then you could choose to Accept 'application/magic-bytecode' [3] before text/html, if your client supported it. Sadly, we ended up with WebAssembly, which from the few talks I've seen appears to cater only to graphics/game developers, with no support for dynamic or OO languages.

[1] https://www.youtube.com/watch?v=FvmTSpJU-Xc [2] https://www.dartlang.org [3] Or in Dart lingo, a VM snapshot.
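The Accept-header idea above can be sketched in a few lines. Note that 'application/magic-bytecode' is the commenter's hypothetical media type, not a registered one, and the negotiation here is deliberately simplified (no q-value weighting, which real HTTP negotiation per RFC 9110 would include):

```python
# Hypothetical server-side content negotiation: prefer the imagined
# 'application/magic-bytecode' type when the client lists it, falling
# back to text/html. This honors only the client's listed order.
SUPPORTED = ["application/magic-bytecode", "text/html"]

def negotiate(accept_header):
    # Strip parameters like ';q=0.9' and whitespace from each entry.
    accepted = [part.split(";")[0].strip() for part in accept_header.split(",")]
    for media_type in accepted:
        if media_type in SUPPORTED:
            return media_type
        if media_type == "*/*":
            return SUPPORTED[0]
    return None
```

A client advertising `Accept: application/magic-bytecode, text/html` would get the bytecode payload, while today's browsers would keep getting HTML.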

elviejo
Yes, I think of Dart as a missed opportunity: it isn't Smalltalk for the web, and neither is it strongly typed... I think this lack of character means that nobody hates it, but also nobody loves it.

Go doesn't have generics, some hate it, some love it. But it took a strong stand on that point.

Aug 04, 2015 · 93 points, 52 comments · submitted by e12e
talles
"I don't think computing is a real field, it acts like a pop culture, it deals in fads and it doesn't know its own roots."

Harshly put, but there's some truth there.

fit2rule
I too think it's very cynical, but extremely accurate.
coldtea
How is it cynical?

Devs go for fads (just watch HN homepage over time), and there are tons of snakeoil salesmen pushing their wares (e.g. Mongo) and millions of programmers without the basic scientific and engineering rigor.

It would only be cynical if it wasn't an extremely accurate description (which you agree it is).

fit2rule
It's possible to address the subject without cynicism.
coldtea
If something is an accurate description, it's not cynicism.

Cynicism implies that something is in a condition X, and the cynic describes it as a much worse condition Y.

fit2rule
Actually, that's not quite right.

"Cynicism is an attitude or state of mind characterized by a general distrust of others' motives."

Perhaps you are conflating the word cynic with critic.

coldtea
That's just part of the full definition.

And even in this case: it's only "distrust" if others' motives aren't bad in the first place.

If people in general have bad motives in IT (laziness, profit, unprofessionalism, etc.), it's not "cynicism" to say so.

It's merely calling a spade a spade.

I'm more referring to the "bitterly or sneeringly distrustful, contemptuous, or pessimistic" meaning of the lemma, though.

Where again, if contempt is warranted and the situation is dire, it's not cynical to be "contemptuous" or "pessimistic"; it's just a realistic description.

noobermin
It's funny how he chides computer scientists for being ignorant of their founders. As a physicist, I personally know physicists who know little about the field's history and care even less. Moreover, I'd argue that in physics, the push toward normal is much stronger than it is in the tech world. Maybe I'm just thinking the grass is greener on the other side, however.
agumonkey
Maybe because physics is much more constrained than computers. In the former, natural laws are quite few and strict; in the latter, everybody is free to invent his own little abnormal world with nothing to push back against it.
a-nikolaev
Computing natural laws are much much much stricter and simpler than the laws of physics.

Whereas in physics you have many fundamentally distinct fields with their own models and views of the world: mechanics, thermodynamics, statistical physics, electronics, optics, quantum physics, and all sorts of fundamental theories like string theory. This is a very rich system, with many complex models. Scientists only hope for some unification there, but for the most part you have to deal with many diverse parts of a huge multi-scale puzzle, and the pieces don't always fit together nicely.

When you look at computation, what it really is, well, a primitive Turing machine is all you can hope for really. Some researchers push it a little bit into infinite-time computability, but that's not a realistic model of computation anyway.

What you can do with computation are conditions, loops, and variable assignment. Or in lambda calculus, it's just substitution and name binding. Even worse, name binding is not really necessary if you express your program with combinators. So, fundamentally computation is substitution, a rewriting system.

Computation is simple and trivial, but still, it's a great model, and you can create amazing things in it, even though the laws of computation are really trivially simple.
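The claim that name binding is dispensable can be made concrete with the S and K combinators; here is a minimal sketch in Python, using host-language closures to stand in for term rewriting:

```python
# With just K and S, any computable function can be written without
# variables: computation reduces to repeatedly applying rewrite rules.
def K(x):
    return lambda y: x                      # K x y -> x

def S(x):
    return lambda y: lambda z: x(z)(y(z))   # S x y z -> x z (y z)

# The identity function need not be defined; it can be derived as
# I = S K K, since S K K x -> K x (K x) -> x.
I = S(K)(K)
```

So `I(42)` rewrites to `42` without the programmer ever binding a variable, which is the sense in which computation is "just" a rewriting system.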

agumonkey
Well, nobody lives at the TM or LC level. I agree that these theories are as strict as a theory can be. But we stack so many layers that the theory disappears and it's all politics. As if, in the end, programming were more about running a city than building an engine. And then simplicity feels like a pipe dream.
a-nikolaev
Of course no one in their right mind writes code for complex Turing machines, but don't be confused by all the layers of abstractions.

Programming at any level is fundamentally the same, it's about iteration, branching, composition of smaller pieces, and packaging smaller pieces into bigger ones. No matter at what level you write your code, it's always like this. The differences in API are not important.

EDIT: I think that concurrency is a bit different in this respect. It really requires a somewhat different perspective, but again it reduces to simple basic elements (like the agent model, or pi calculus) which are reiterated and reimplemented many times.

nly
As much as I agree with the sentiment presented by Alan Kay, in this talk and others, his presentations often feel bogged down in philosophical fluff and flaky analogy. If you want to see what I mean, mute and scan forward in the video, paying attention only to the slides. It's almost impossible to tell what this talk is trying to convey from the slides alone. Nothing is concise. It's very lofty, interleaved with seemingly random stories. If this talk were given by someone without a name, we'd consider it completely whacked.

The tl;dr:

    * Smalltalk did everything better
    * Software was better in the 60s and 70s
    * Alan Kay really dislikes the web.
I'd like to see a lot more talk from his progeny/ilk about modern revivals of the philosophy from these computing heydays, alongside practical examples of how we can do modern applications better.
coldtea
> If you want to see what I mean, mute and scan forward in the video paying attention only to the slides. It's almost impossible to tell what this talk is trying to convey from the slides alone.

That's why it's a speech and not a slideshow.

(Besides, who said you HAVE to summarize the speech in the slides? You could use them as supplementary material, and have people pay attention to your speech as opposed to PowerPoint BS.)

And it's a generic speech, not him presenting some paper etc. Have you seen his research papers?

>* Smalltalk did everything better

Well, compared to most things we have today, especially JS, it really did.

>* Software was better in the 60s and 70s

No, but thinking about software was better. We got far more breakthroughs in those days compared to today -- and not just because they were "low hanging fruit".

>* Alan Kay really dislikes the web.

Who doesn't? Compared to what it could be?

e12e
It is most definitely a "mindset" talk more than a "concrete solutions" talk. That said, bear in mind this talk is from 2009 -- "JavaScript: The Good Parts" came out in 2008. So while what he says about the web might seem entirely redundant today, with the focus on single-page apps, that misses (at least) two things: it wasn't quite that obvious across the field in 2009, and perhaps more importantly, one of the concrete things he does link to, Lively Kernel[1], was already a real (though new, and not 1.0-stable) working system -- with persistence via WebDAV and the possibility to solve real problems.

[1] http://www.lively-kernel.org/

brudgers
> scan forward in the video paying attention only to the slides. It's almost impossible to tell what this talk is trying to convey

I don't find it particularly surprising that not listening to a talk makes it hard to divine what the speaker said. If I didn't already believe it, the tl;dr [for a video?] would be evidence upon which I might come to such a belief.

TorbjornLunde
This talk could fairly be described as one where he points out problems but doesn't show solutions.

Anyway, he has a talk which at least shows some more recent examples of what he considers a better way: Is it really "Complex"? Or did we just make it "Complicated"?: https://www.youtube.com/watch?v=ubaX1Smg6pY

In it he talks about two programming languages, Nile and OMeta, which allow them to solve some problems in very few LOCs (among other things).

11thEarlOfMar
I'd agree that there is a pretty long way between him tossing the ball and the audience catching and running with it. I take his purpose to be to exhort the viewers to challenge their own thinking and their own purpose, rather than trying to achieve a specific improvement.
bitwize
I'm a bit surprised that Dr. Kay didn't fact-check the boiling frog story; modern biologists don't believe it has any basis in reality.
mflindell
I was worried about that too, but I'm sure someone with a degree in biology would have figured that out, right? Pretty sure it's meant as a metaphor.
dwmtm
Do you really need to fact-check a metaphor?
bitwize
I mean it's a cute story and an idiom that is pretty much indelible from the language, but Kay was putting it in terms like it's a fact about frogs' biology when it is not. He has a background in biology as well as CS, so I am indeed surprised he didn't check this.
e12e
Appears to be inconclusive: "None of these modern rebuttals – Melton, Zug, or Hutchison's – attempts to replicate an extremely slow-heating experiment as cited by Goltz or Scripture: only Hutchison did a thermal trial at over five times Goltz's slow rate and over nine times Scripture's 0.002 °C per second temperature rise." [1]

More interesting is perhaps the question of how you get the frog to sit still -- or maybe change the experiment so there's a grid over the top of the pan -- and try to see if the frog gets more and more "desperate" (flight from danger as opposed to "just a frog jumping around") as the water slowly heats?

[1] https://en.wikipedia.org/wiki/Boiling_frog

26cf805ae26f
You don't need to be a "modern biologist" to assert those claims. I just tried once and it worked (I mean the frog did die)
mgrennan
This talk describes the problem with software development today: lots of young people wanting to create the next new thing with no idea about what they are creating.

TOO MUCH DOING, NOT ENOUGH THINKING.

"THINK" is a motto coined by Thomas J. Watson. If you don't know who he was... again, you fail.

If you want to create the next new thing, go talk to someone who has been working in IT for 40+ years. Remember, IT works in dog years. That means the person you are consulting has 320 years of experience.

marcusarmstrong
I clicked on this talk only to then realize, "Wait, that looks familiar". Yeah. I was at this talk when it was given. Whoops.
therealmarv
best usage of "considered harmful" in a title
DyslexicAtheist
If I see one more "considered harmful" post/presentation I'm gonna lose my sh1t, no matter if the author is Alan Kay, Dijkstra or Wirth.

maybe I should do a "Using Considered Harmful, considered harmful"

EDIT: seems somebody already beat me to it http://meyerweb.com/eric/comment/chech.html

DyslexicAtheist
same goes for "eating your own dog food".
danbruc
Has already been done.

http://meyerweb.com/eric/comment/chech.html

e12e
Just randomly came across this Alan Kay talk that I hadn't seen before. Some very interesting points on the challenge of real innovation:

"Normal is the greatest enemy with regard to creating the new. And the way of getting around this, is you have to understand normal, not as reality, but just a construct. And a way to do that, for example, is just travel to a lot of different countries -- and you'll find a thousand different ways of thinking the world is real, all of which is just stories inside of people's heads. That's what we are too. Normal is just a construct -- and to the extent that you can see normal as a construct inside yourself, you've freed yourself from the constraint of thinking this is the way the world is. Because it isn't. This is the way we are."

agumonkey
I'm curious about systems' evolution and their tendency to 'normalize'. I too like diversity and originality, but constant newness, as in 'JavaScript client framework', is exhausting... surely there's a balance, and hopefully some kind of theory on where to place the center.
brudgers
Kay draws a distinction between news and new. A new JavaScript framework is news...a person can explain the idea in 5 minutes or so because it is a normal incremental change. It's incremental because it's projecting normal as the future. Evaluating JavaScript frameworks is exhausting because the differences are mostly mundane, not world wide web versus gopher.
random_2azkXJ
So the difference between "new" and "news" is the difference in the perceived scope? Is this what you've got from Alan Kay's philosophy?
loup-vaillant
I think it's about inferential distance. "News" is only 1 or 2 inferential steps from common knowledge. "New" is several steps removed, and as such looks alien, crazy, or pointless.
brudgers
No and of course not.
e12e
It's the difference between looking at something from a different angle and looking at it from space. It's much harder to go to space than to turn your head (well, it was much harder before anyone did it - now it's easy).
random_2azkXJ
Yeah, that's what I thought too!
CmonDev
Most of those JS libraries are same MVC/MVP/MVVM done over and over again.
agumonkey
That was the point, you want a certain kind of new, not running around in circles at high speed.
random_2azkXJ
Philosophical question: what happens when "running around in circles at high speed" becomes normal? Will "normal" then become "new"?
agumonkey
I deeply believe that, as Kay said, we are relative and will stop perceiving things after a while, forgetting... anything old will be new, even if it's normal, slow and ~sensical. Cycles. Or orbitals, as I like to see them.
anon4
> balance

This is a funny word if you think about it. It's the property where several forces on a point are in equilibrium. This definition generalises to very abstract situations, and especially to political ones. Several sides are all pulling (or pushing... the metaphor doesn't really matter) in their own direction and where the final decision falls is where the powers are in equilibrium. It's the point which is as dissatisfactory for every party as that party is powerless. Sometimes, that's what you want.

Often, balance evokes the image of the halfway point. The balance between killing every Jewish baby and no Jewish baby is not killing half of all newborns. It's also not killing exactly zero babies. If you want a balance between killing all Jewish babies and no Jewish babies, the neo-nazis all have a say and you end up having to kill one of every hundred million or so babies. Which is unacceptable. This example was an extremist straw-man to illustrate the following point: you're not looking for balance, you're looking to maximise a utility function.

The utility function may be maximised by looking for balance between forces, in fact it often is, or we approach something close to the maximum by finding the balance. However, you mustn't mistake the balance for the utility function. You're not maximising balance. You're looking for a solution to a problem. You want fewer javascript client frameworks? Ok. You still want some javascript client frameworks? Ok. Why?

Because you want to maximise the utility of javascript client frameworks (I assume). This utility might be different for different people, i.e. they may want to do different things with the frameworks, etc. But, I can agree with you that having a constant stream of new ones serves no utility, except the tautological one of "let's have as many frameworks as we can write".

Now we come back to balance. At some rate of new frameworks being made, we balance out the amount of work spent on maintenance of the old ones with the influx of new ideas that make our work and life easier and from time to time we completely overhaul and forget an old technology. Which, in turn, maximises our utility of these frameworks. It does look like balance <=> utility, but not quite. This is a balance in the context of a specific utility function. You've defined the utility function as "I want X and Y with preference weights α and β", which looks very much like a linear programming[1] problem, the solution to which is... weighted balance.

Others would not agree to your utility function. They maybe don't want new ideas so much. Now you're balancing utility functions. But those people still agree that they want to "find the balance". It's just that their equilibrium is not your equilibrium.

So, please don't just throw out "there's balance" like that. Everyone can agree to balance and it's at a different point for different people. It's a word that causes... well, muddled thinking. If you can, avoid using it at all[2]. Get to the root of your problem as much as you can, then say what ails you. "I like seeing new approaches tried, but it's hard to keep up with the influx of largely samey ones that are constantly cropping up. I wish they would slow down." -- or something along these lines, I'm not saying this is what you specifically want.

Sorry for the long rant. I've had to listen to people wanting to balance a lot lately.

[1] https://en.wikipedia.org/wiki/Linear_programming a type of math problem concerned with finding the point where the given variables all satisfy a set of constraints expressed as linear inequalities and a given function achieves a maximum. [2] maybe there is a balance to abusing "balance"...
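The "maximise a utility function, don't seek balance" point can be put in miniature. Assume a toy utility over the rate of new frameworks, with hypothetical preference weights (nothing here is empirical; it only shows that the optimum moves with the weights):

```python
# Toy model: utility(r) = alpha * r - beta * r^2, where r is the rate
# of new frameworks, alpha weights fresh ideas, and beta weights churn
# cost (growing superlinearly). The weights are hypothetical; the point
# is that the optimum is a property of the utility function, not of
# some universal halfway "balance" point.
def utility(r, alpha, beta):
    return alpha * r - beta * r * r

def best_rate(alpha, beta, step=0.01):
    rates = [i * step for i in range(1001)]  # search r in [0, 10]
    return max(rates, key=lambda r: utility(r, alpha, beta))
```

Someone who weighs new ideas three times as heavily ends up at a different equilibrium (r = 1.5) than someone who weighs them equally against churn (r = 0.5), which is exactly the commenter's point: everyone agrees on "balance" while meaning different equilibria.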

agumonkey
Well, you actually spelled out my thoughts, so thank you :) I could have been more precise about what I meant, but I wanted to stay abstract, not specific to some trendy example, keeping a general view on what structures and shapes forces in a 'new' context and how evolution goes. Maybe it's just an entropy thing blended with the memory limit of a generation that will have the same energy as its predecessor but without a clear understanding of the state of the art, meaning they will walk the same paths thinking they're new instead of recognizing the old.
TheOtherHobbes
Devs like to solve problems for their own sake. Building a completely new anything is a completely different challenge to solving a problem like "Build a version of existing thing X using language Y and/or running in environment Z."

The second option is well-bounded and safe. It requires technical skill, not creativity. It's a legible, comprehensible challenge.

The first option is unbounded and unsafe. It can't be done without creativity, originality, and technical skill.

I'm becoming convinced there should be a much stronger creative and artistic element in developer training. Invention and innovation are so much more valuable than wheel reinvention that turning out people who are only comfortable with the latter is selling everyone short.

vezzy-fnord
You can't really teach such a thing, not in any profound way. The programmer in question must have an intrinsic or otherwise self-conditioned drive to read computing history and papers, and to be interested in actually doing research before starting a project.

We have largely crafted a culture where doing research before writing code is considered slow and ineffectual for whatever reason. Instead, we value "moving fast and breaking things" and whipping up the quick hack, which encourages people to propagate their computing biases and never step out of their comfort zone.

This is one of the reasons why I mostly scoff at attempts to make computer programming a compulsory schooling subject. Coding as "the new literacy" devalues computing and is the very embodiment of the programming pop culture that Alan Kay has warned about.

widdershins
It's hard to cultivate a good literary culture until you have a large literate and critical audience. Perhaps the same is true for programming culture.

I agree with your other points though.

agumonkey
Moving fast is valuable sometimes, as much as going away in a hammock as Hickey would say. Learning how to alternate both scales is missing though. And I often thought smart people had that down, they could think ideals, then get down through the stack to effectively make things, then get back to the abstraction without losing focus or getting lost. Quite often I'm stuck at one level or the other.

> This is one of the reasons why I mostly scoff at attempts to make computer programming a compulsory schooling subject. Coding as "the new literacy" devalues computing and is the very embodiment of the programming pop culture that Alan Kay has warned about.

Especially since the people behind this idea have zero idea about what programming is. Some want people to learn HTML, which has pretty much nothing to do with programming.

simonh
Great stuff. This immediately made me think of Steve Jobs and the way he approached - I was going to say product design, but really everything he did. It's a very Buddhist way of thinking about the world.
branchless
Love Alan Kay. Went to a series of talks as part of a week of education for an old employer. We had several very highly qualified people speak to us about various challenges in IT. They were all quite "normal". Alan Kay's talk blew me away because it was so left-field. Interesting guy.
brudgers
More Alan Kay:

http://mythz.servicestack.net/blog/2013/02/27/the-deep-insig...

branchless
The guy is sickening! Not only did he come across as very likeable he also dropped in that as a child he had to decide between going into computer science or becoming a professional jazz musician (I think he played guitar).

Choosing between musician and computer genius is bad enough, but to remain likable on top of all that is frankly too much :-)

Jun 07, 2015 · tnorgaard on HTML is done
Maybe we should listen to Alan Kay's proposal: https://youtu.be/FvmTSpJU-Xc?t=1082.

It might be a bad example, but if the Java SE runtime libraries were just a dependency like any other mvn artifact (module), Sun / Oracle would, for example, have been in a position to introduce immutability to the Collections framework and fix other crazy stuff[1] without breaking backward compatibility. But now we are stuck with those legacies forever[2], since they standardized on too high a level of abstraction and chose fewer layers. Luckily, Java had a layer underneath, JVM bytecode, which is why I predict Java will still be relevant for many years to come.

The next generation of the WWW needs a very low-level common abstraction, e.g. bytecode. Something similar to X Windows.

[1] See Effective Java and Java Puzzlers. [2] Project Jigsaw might change that.
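The "stuck with those legacies" point about immutability is concrete: pre-Java-9 Collections can only be made immutable by a runtime wrapper, so the compiler happily accepts a mutating call that then blows up when executed. A minimal sketch (the class and helper names are my own, purely illustrative):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class LegacyCollections {
    // Probes whether a list is "immutable" in the legacy sense:
    // the type system can't tell, so we can only find out by
    // calling add() and seeing whether it throws at runtime.
    public static boolean isRuntimeImmutable(List<String> xs) {
        try {
            xs.add("oops");
            return false;
        } catch (UnsupportedOperationException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        List<String> mutable = new ArrayList<>(List.of("a", "b"));
        List<String> wrapped = Collections.unmodifiableList(mutable);
        System.out.println(isRuntimeImmutable(mutable));  // false
        System.out.println(isRuntimeImmutable(wrapped));  // true
    }
}
```

Java 9's `List.of(...)` added truly unmodifiable factory collections without breaking old code, but the `List` interface still carries `add()` and friends forever, which is exactly the kind of frozen abstraction the comment is lamenting.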

davidy123
You don't see the inherently 'open' nature of HTML and its common accessories as a huge benefit? Sure, they can get messy, but they are still bound to be more parseable, and they draw people along to the transformative idea of a giant shared graph with dev tools (inspection) support, compared to byte streams. Over time Flash, Java applets and other approaches have come along but been deprecated, which I consider emblematic of the search for a more transparent information commons beyond any one technical requirement (even if the logical conclusion of a 'semantic web' has been elusive so far). The fact that "view source" is available in all major browsers is, to me, incredibly meaningful and positive compared to the alternative visions.

Or would your byte code blaster have similar properties?

tnorgaard
I think we can build it to be however closed or open we would like. The sharing of source code, I suspect, is more social than technical, so I don't expect much change there.

Java Applets, Flash and Silverlight were always at a disadvantage, not being native to the browser runtime. Startup time and the need for a "plugin" hurt them. And they all kinda sucked in their own way. I hoped for a long time that Dart / a native DartVM in Chrome could make strides, but sadly they were fighting an uphill battle against the sheer volume of JS developers resisting change. I'm convinced that it's never about the language[1], but about the VM and its abstraction layer. We need a layer that we can compile JavaScript(/HTML/CSS), Scala, Haskell, C#, F# etc. down to, not more languages.

I wonder how many man-years have been wasted on vertically aligning divs inside a div. :)

[1] Guy Steele, amazing talk on "Growing a Language", https://www.youtube.com/watch?v=_ahvzDzKdB0

It's not a field it's a pop-culture just as Kay said (https://www.youtube.com/watch?v=FvmTSpJU-Xc).
Alan Kay suggested we have these problems because computer science is more like pop culture than science. http://www.youtube.com/watch?v=FvmTSpJU-Xc
Very well said. The focus on convenience and cool features over safety makes me really sad, and want to force automotive engineers to watch some Alan Kay talks. He loves to talk how people who don't know the history and basics of their craft will arrive at inferior solutions, for example in this one: http://www.youtube.com/watch?v=FvmTSpJU-Xc
Dec 29, 2012 · 1 points, 0 comments · submitted by fumar
Similar claim was made by Alan Kay in his lecture Normal Considered Harmful [1]. However he didn't complain about lower barriers to entry. Instead he pointed out that the people in the 90s ignored the work and experience of the past generations of programmers. He especially criticized the Web.

[1] http://www.youtube.com/watch?v=FvmTSpJU-Xc

I think Mr. Granger should refrain from using inaccurate remarks about the history of programming. Light Table is neat but it is not the first attempt at an interactive development environment. His ideas were quite common 30+ years ago (20 years ago would have been right about the time when this style of programming started falling out of vogue). Alan Kay does a far better job than I can in explaining some of this history:

http://www.youtube.com/watch?v=FvmTSpJU-Xc&feature=youtu...

Update I wasn't being terribly fair. Granger might simply not have been aware. In that case I hope he sees my comment and watches the above video. :)

endlessvoid94
I was at that talk at UIUC!
seanmcdirmid
When I gave my talk on live programming at OOPSLA/Onward! 2007, one of the Gang of Four (of Design Patterns fame) criticized me for not mentioning Smalltalk enough. You know, I think we are all aware of Smalltalk and its ability to reload code on the fly, and of the "live" eToys demo Alan Kay was giving in the late 90s/early 00s (I saw it for the first time at his ECOOP 2000 banquet speech). But there really is something special in Light Table (and other projects, including my own) that goes beyond that: they are working on a practical tool for programmers, and they are really focusing on an interactive UX, which, to be clear, Smalltalk environments never really had.

Yes, get your history and citation right to avoid being flamed. But no, it didn't end with Smalltalk (or Lisp, or whatever) and we are not just going back to that.

agentultra
Appeals to emotion don't really tell me what makes Light Table innovative. You say there is, "something special," about it. What is it?

It takes either hubris or ignorance to believe your ideas are completely original. Whether Chris is aware of it or not, Light Table is not a new idea. I'm not suggesting it isn't a good one because I believe it is. However the way he worded his post suggests to me that he is probably not aware of the giants upon whose shoulders he now stands.

Nobody just picked up a guitar and invented rock-and-roll. The theory of relativity didn't just come to Einstein one day. No good idea is born in a vacuum.

I only rain on the parade a little in the hopes that his writing will improve and to raise awareness in people who aren't aware that these are good ideas and that there have been many attempts to develop them.

seanmcdirmid
Light Table's layout and association between code and UI is much more direct than Smalltalk's; I would say it's even better when you consider Morphic. The way they've reified execution is also nicely done (seeing what objects exist as they are created).

There are a lot of innovations here, and yes, they are probably building on 20 years of related work THAT GOES WAY BEYOND SMALLTALK. If he gives Smalltalk credit, why not Lisp machines? Why not VPLs? Why not every little research project that has attempted this in the past (including my own work)? And he is still correct: the vast majority of programmers haven't been influenced by this at all, we failed, and he is trying again, hopefully with something that will work! If he were writing a technical paper, he could discuss everything in a related-work section; but it was a Strange Loop talk plus a blog post, so what kind of rigor are you expecting?

agentultra
Why not simply be more aware of those things?

His blog post made sweeping generalizations about what programming was like twenty years ago that are clearly inaccurate at their worst or at least misleading at best.

He doesn't need to cite anything. He could just make fewer generalizations of the sort in his post. Many programmers from twenty years ago were working on the very kinds of projects he is today. His might be better for many reasons than those early attempts but he does himself and his audience a disservice by not being aware of them.

seanmcdirmid
What does "many" mean? 10? 20? Out of how many programmers who were active back then?

20 years ago most programmers were not using Smalltalk or Lisp machines or whatever; even where those things existed they weren't used often, so the author is technically correct about the "state of the art", with the caveat that there were many attempts to go beyond it but nothing got any traction.

Also, you are incorrect about 20 years ago. The first example of liveness and directness in programming was in...1962 with the Sutherland/Sketchpad demo. 50 years ago.

agentultra
Indeed! The Sketchpad demo. The Mother of All Demos (Engelbart, who basically conceived of hypermedia and the future of knowledge work). The Symbolics Lisp machines.

It's all really cool stuff. I'm really glad that Light Table is making it popular again. Maybe this time we can at least give a nod to where these ideas come from so that more people can be aware. New instead of novel.

tluyben2
I think 20 years ago would've been a good time to really market those environments hard. As a new student with a profitable software company, I would've been interested in this. However, I never heard about these technologies until much later; C(++), Pascal (Modula) and 'rapid' friends like Clipper (for business applications) were the popular things then. Not sure how it was in the US, but in the EU I didn't see much promotion going on for Smalltalk/Lisp, and I didn't know anyone outside uni using those languages. So I think Chris is right there: most programmers will never have seen anything like this, nor do they know it exists, even if they were around in the 80s/90s or earlier when they could've known.
tisme
> interactive UX, which to be clear, Smalltalk environments never really had

Smalltalk had an interactive UX in 1986 or so when I used it first for some CAD stuff. Or is there a special definition of 'interactive UX' which would rule out what smalltalk offered back then?

seanmcdirmid
You know, people sometimes say Visual C++ is an interactive UX because of the freaking form builder. I guess my argument is about a certain level of interactivity that is missing; it's not black and white, to be sure.
klibertp
Don't know about "back then", but if Pharo and Morphic are not "interactive UX" then I have no idea what could be called that...
jgrahamc
"Yes, get your history and citation right to avoid being flamed."

That's the wrong motivation for understanding the history of computing as it relates to what you are working on. There are things to learn from history. It's not the ability to correctly cite Smalltalk etc. to quiet some older programmers; it's the ability to compare, contrast and learn from what came before. The really debilitating thing about not understanding the history is that you are doomed to repeat the same mistakes; it's better if you can make new mistakes.

For example, you say "But there really is something special in Light Table". Great. What is it?

seanmcdirmid
I've mentioned in a peer post that his design is very well done; they are focusing on awareness and code topology issues that I think are long overdue. Now, do I know exactly what the differences from Smalltalk are yet? No, not at all. But I've never seen Smalltalk demos like this (the closest would be eToys, which was a not-very-comparable end-user programming experience).

Again, it's a blog post and a talk; I don't think we should expect rigor in previous/related-work coverage at this point. That will come with time.

Aug 10, 2012 · 3 points, 1 comments · submitted by fogus
david927
This is the most important video you'll see for a while. Please, watch.
Aug 08, 2012 · 1 points, 0 comments · submitted by signa11
You might find this entertaining - I'm watching a speech by Alan Kay at the moment where he says that computing is more pop-culture than science: http://www.youtube.com/watch?v=FvmTSpJU-Xc
May 21, 2012 · 2 points, 0 comments · submitted by da02
Apr 28, 2012 · 6 points, 1 comments · submitted by siteshwar
kristianp
This has been on news.yc before, but it's new to me. Even if you don't watch the full hour, the first 15 minutes or so makes an interesting point about our lack of knowledge of the pioneers of the field of computer science.
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.