HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Alan Kay at OOPSLA 1997 - The computer revolution hasn't happened yet

Jeff Gonis · Youtube · 42 HN points · 56 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Jeff Gonis's video "Alan Kay at OOPSLA 1997 - The computer revolution hasn't happened yet".
Youtube Summary
Alan Kay's seminal 1997 OOPSLA keynote. Originally hosted on Google Video, copies are now, as far as I can find, available only from the squeak.org website. Putting it on YouTube is my attempt to preserve a really important talk in computer science and computing in general.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Sep 29, 2022 · 3 points, 1 comment · submitted by kixxauth
kixxauth
I watched this a few years ago, and it reminded me that sometimes we try to solve problems that someone else tried to solve before us. Maybe they solved it better, maybe not. But, there is a lot to learn.

Refreshing my brain and watching it again today.

A true philosopher ;-)

> accuse Kay of making with regards to OOP

You likely mean this talk: https://youtu.be/oKg1hTOQXoY?t=636 He says: "actually I made up the term object-oriented and I can tell you I did not have C++ in mind, so the important thing here is I have many of the same feelings about smalltalk". Have a look at https://news.ycombinator.com/item?id=32649622, https://news.ycombinator.com/item?id=32651863, https://news.ycombinator.com/item?id=32654863 and https://news.ycombinator.com/item?id=32651366 and you will (hopefully) understand why he said that. It also explains well why Ingalls was the sole author on the Smalltalk-76 paper.

Aug 16, 2022 · 4 points, 1 comment · submitted by sebastianconcpt
rektide
Computing has been retreating off the personal & away into the cloud, to neo-mainframes. It feels like the revolution gets further off.
Alan Kay's lectures on YouTube are mostly about (critical) thinking and how to educate your own thinking. Each of his talks is aimed at a different audience, but all address your question. For our HN audience the 1997 OOPSLA talk [1] is a good place to start, as are the "How to Invent the Future I - Stanford CS183F: Startup School" talks [2].

There are many recommendations of books to read to develop your thinking.

[1] https://www.youtube.com/watch?v=oKg1hTOQXoY

[2] https://www.youtube.com/watch?v=id1WShzzMCQ

BOOSTERHIDROGEN
Mind sharing the book list? Thanks!
morphle
Alan mentions all the books in the talks and shows their covers. Sorry, it takes too much time right now to order, edit and prioritise the list of books for you. I'll try to append that to this thread later this week. You can contact me on morphle at ziggo dot nl and we can have an interactive session where I can shower you with critical thinking resources from Alan Kay, Marvin Minsky and a few others.

Book references:

Human Blindspots and “Bad Brains”:

Francis Bacon, “Novum Organum”

Daniel Kahneman, “Thinking, Fast and Slow”

Robert Meyer, Howard Kunreuther, “The Ostrich Paradox”

https://en.wikipedia.org/wiki/List_of_cognitive_biases

Alfred Korzybski, “Science and Sanity”

YouTube: “How To Catch A Baboon” https://www.youtube.com/watch?v=ctol7JwpcuQ

Exemplary Practices and Perspectives:

Amory Lovins, “Reinventing Fire”

E. F. Schumacher, “Small Is Beautiful”

Christopher Alexander, “Notes on the Synthesis of Form”

from: http://www.vpri.org/pdf/Kay_How.pdf

http://www.squeakland.org/resources/books/readingList.jsp

https://mostrecommendedbooks.com/alan-kay-books

https://wiki.c2.com/?AlanKaysReadingList

I know there must be lots more if you search online, there is even a HN post with a book list written by Alan himself.

I have gathered the books on these lists and all the videos of the talks in a 500 GB torrent. Maybe you can help organise it so we can publish it here?

Talks:

https://tinlizzie.org/IA/index.php/Talks_by_Alan_Kay

https://www.mprove.de/visionreality/media/kay.html

> operating systems which are totally transparent and easily understood by their users

I sort of glossed over this part, so now I have a chance to elaborate. Alan Kay has put a ton of thought into this issue [1]. He firmly believes that we can build an operating system and application software with an extremely small footprint (in lines of code), so that a single person can understand the whole thing.

Since he gave that talk, we've moved further and further away from Kay's vision. We've made things more and more complex, opaque, centralized, and difficult to change. We've given away our future to big tech companies. Heck, we've even given away the past. We've lost much of the freedom we had back in the 90's, let alone the 70's and 80's when Kay did so much of his work. We're going to have to work incredibly hard just to regain what we've lost.

[1] https://www.youtube.com/watch?v=oKg1hTOQXoY

jamiek88
No one human mind could ever grok the entirety of say, iOS or Android.

Do we just go back to the software Middle Ages?

100% agree. This is the reference, for those who are interested -- watch for a few minutes starting at https://www.youtube.com/watch?v=oKg1hTOQXoY&t=1280s
This observation from 1997 always rings true. Watch a few minutes starting at https://www.youtube.com/watch?v=oKg1hTOQXoY&t=1280s
The Art of the Metaobject Protocol

In his 1997 talk at OOPSLA, Alan Kay called it "the best book anybody's written in ten years", and contended that it contained "some of the most profound insights, and the most practical insights about OOP", but was dismayed that it was written in a highly Lisp-centric and CLOS-specific fashion, calling it "a hard book for most people to read; if you don't know the Lisp culture, it's very hard to read" [2]

[1] https://mitpress.mit.edu/books/art-metaobject-protocol

[2] https://www.youtube.com/watch?v=oKg1hTOQXoY

OOPSLA talks of that era were fantastic. Alan Kay has often paid tribute to Christopher Alexander in his talks and I cannot resist mentioning this from Alan from the following year https://www.youtube.com/watch?v=oKg1hTOQXoY
seumars
Too bad Kay has never paid enough tribute to Nygaard and Dahl in the same manner. His comments from that '97 OOPSLA talk on Simula being incomprehensible are absurdly misleading.
gjvc
will investigate. any and all pointers welcome.
seumars
Simula 67 was already known in academic circles by the early 70s. Donald Knuth had first taken an interest in Simula I after a visit to Oslo, but couldn't manage to bring it to Stanford because of the very high licensing fees the Norwegian Computing Center was charging for it. That first iteration of the language was more focused on being a general-purpose "system description" language for different kinds of real-life simulations (this would also kickstart Nygaard's work on the social impact of technology and the beginnings of user-oriented system development, then participatory design). Simula 67, however, wasn't the "incomprehensible" language that Kay describes. The second iteration was much more focused and had all the elements of modern object-oriented patterns: classes, subclasses and inheritance, objects, object references and attributes, object dot notation, polymorphism, etc.

So Kay's take on Simula being this strange language he had to "make sense of" for his work on Smalltalk, or even his claim that he coined the term "object-oriented", is total BS. Here's a more nuanced and impartial account of Simula by James Gosling, if you want to know more: https://www.youtube.com/watch?v=ccRtIdlTqlU.

Mar 13, 2022 · danuker on .NET Myths Dispelled
> Alan Kay’s 97 OOPSLA keynote

Is it this one?

https://www.youtube.com/watch?v=oKg1hTOQXoY

Edit: I suspect it is this one, at 10:33 Alan says "Actually I made up the term <<object-oriented>>, and I can tell you I did not have C++ in mind."

ncmncm
Alan Kay has been redefining what he calls OO for decades. He has lost all credibility.

He cribbed OO for Smalltalk from Simula, same as Stroustrup did for C++. (Smalltalk-72 was not OO.) "OO" is just a name for something that already existed.

Anyway there is nothing very meritorious about OO: it is sometimes the right thing for part of a problem. But organizing a whole language around it is as idiotic as organizing a car design around making left turns.

These days it's easier to create an OS from scratch than a web browser.

This is an excellent observation, and gives me an excuse to recommend Alan Kay's seminal OOPSLA 1997 keynote "The computer revolution hasn't happened yet" -- the link to the whole thing is below, but I've set the timestamp to the start of what I think is the immediately relevant point (he takes a minute or so to explain it)

https://www.youtube.com/watch?v=oKg1hTOQXoY&t=1330s

Kay is said to have coined the term "object-oriented", but his conception of object orientation differs considerably from that of Dahl and Nygaard as well as from all current OO languages. Dahl and Nygaard invented the first object-oriented programming language (Simula 67) but didn't use the term. And yes, active objects reacting to messages (events) passed to them were already part of Simula I (1962).

The referenced article - like many others - moreover quotes Kay incompletely; in the sentence following the one quoted in the article, Kay said "the important thing here is that I have many of the same feelings about Smalltalk"; see https://www.youtube.com/watch?v=oKg1hTOQXoY&t=633s. Smalltalk-76 and -80 (in contrast to Smalltalk-72) implement inheritance and virtual methods as it was done in Simula 67, just like C++ and Java do.
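The inheritance-plus-virtual-dispatch lineage described above can be illustrated with a minimal sketch (Python here purely for illustration; the class names are invented, and Python resolves every method dynamically, which makes the Simula-style behaviour easy to see):

```python
# Simula-style inheritance with virtual dispatch: the method that runs
# is chosen by the receiver's actual class at call time. This is the
# mechanism Smalltalk-76/80, C++ (via virtual functions), and Java all
# inherited from Simula 67.
class Process:
    def describe(self):
        return "generic process"

class Simulation(Process):
    def describe(self):  # overrides the parent's method
        return "simulation process"

def report(p: Process) -> str:
    # The caller only knows it holds a Process; dispatch is resolved
    # at run time on the concrete object it was handed.
    return p.describe()

print(report(Process()), "|", report(Simulation()))
```

In a statically typed descendant like C++ the `describe` method would need to be declared `virtual` to get this behaviour; in Smalltalk (and Python) every send is late-bound by default.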

May 17, 2021 · 2 points, 0 comments · submitted by tosh
You could look it up, it’s a famous quip. From his talk at OOPSLA, 1997.

https://youtu.be/oKg1hTOQXoY

The OOP/C++ line is at 10:34. Good talk.

OK, I'll bite. :-)

In particular, dynamic linking is suggested by many as being a bad thing, for example...

https://www.google.com/search?q=dynamic+linking+is+bad

http://users.eecs.northwestern.edu/~kch479/docs/khale_rtld_2...

http://harmful.cat-v.org/software/dynamic-linking/

In general, as Alan Kay said in his 1997 OOPSLA talk [1] (which was not about Unix, but Smalltalk), "Think of how you can obsolete the damn thing". The Unix wars of the 1990s led (perhaps ironically) to Unix becoming and remaining a local maximum, not obsoleted by something radically new, different, and better [2] in some dimension. This is perhaps a "meta-answer" to your question.

You may wish to study the features of both DEC OpenVMS (released in 1977, a few years after Unix's initial release in 1969) and "Plan 9 from Bell Labs" [3] (released in 1992) for some contrasting approaches.

[1] https://www.youtube.com/watch?v=oKg1hTOQXoY

[2] the word "better" only suggests one dimension of analysis or measurement

[3] yes, that's its full name

gjvc
Also, I should add that NeXTSTEP / OpenStep were probably the most exciting (but, for many people, prohibitively expensive) tools of their day.
agomez314
Definitely planning on studying Plan 9. I was slightly discouraged after reading the manual's overview section, since the authors just glossed over the problems they perceive in Unix without elaborating.
Mar 03, 2021 · jgon on Flutter 2
Time to post the evergreen thoughts of Alan Kay from his 1997 OOPSLA talk: https://youtu.be/oKg1hTOQXoY?t=1491

We've spent the last 25 years hacking a document renderer and a scripting language aimed at 5-line form-validation logic into the be-all and end-all of modern application development. Browsers sit at tens of millions of lines of code, with thousands of people working on them. We depend on a standardization committee to get things like the common controls you could find in a Smalltalk-80 UI, and these are still insufficient for many purposes, such that developers end up cramming together a million divs to create the controls they want, in the style they want.

The writing was on the wall once we had a canvas element, aka a drawing context, and WebAssembly, aka a VM target you can compile to. We can keep building up a giant garbage heap of complexity and saying that's what we meant to do ( https://youtu.be/oKg1hTOQXoY?t=814 ), or we can take a step back and ask if we can provide simpler, lower-level tools on top of that.

With that said, I feel somewhat ambivalent about all of this. Of course the web is an open standard and has provided an incredible explosion of knowledge and utility for everyone. Any sweeping changes to that should elicit feelings of caution I think. Things like accessibility are important, although I think that enough people recognize this that we don't need to worry about it being left behind with this approach and I would also say the ocean of divs approach isn't exactly great for accessibility either.

Will this take off and be the first of a new kind of application development? Maybe, or maybe we'll all look back on it in 10 years and wonder what the hype was. For my part I do feel like some sort of change is going to happen: the current approach of piling more crap on top of the current HTML app-dev paradigm feels unscalable, while the strengths of how it allows for app distribution feel like something we can never get away from. Will Flutter retain those strengths and leave behind the weaknesses? I hope so, and it makes me excited for the future of web development in a way I haven't been previously.

Jul 21, 2020 · 1 point, 0 comments · submitted by tosh
Yup, it's a really good exposition of the process most math goes through before it is fully "rigorous". In one of Alan Kay's talks he mentions how Euler basically got all his proofs wrong but his intuitions were almost all correct and some people got PhDs by making his arguments more rigorous [0].

Gian-Carlo Rota has a similar story about most of his proofs being wrong but generally having the right intuitions and being on the right track. The paper is titled "Ten Lessons I Wish I Had Been Taught" and it's really good. He gives advice applicable to any domain where sharing knowledge is the key marker of progress: http://www.ams.org/notices/199701/comm-rota.pdf.

--

0: https://www.youtube.com/watch?v=oKg1hTOQXoY&feature=emb_titl...

[~ 8min 15sec] There was a mathematician by the name of Euler, whose speculations about what might be true formed 20 large books. Most of them were true. Most of them were right. Almost all of his proofs were wrong. And many PhDs in mathematics in the last, and this, century have been formed by mathematicians going to Euler's books, finding one of his proofs, showing it was a bad proof, then guessing that his insight was probably correct and finding a much more convincing proof.

>Over the past ~25 years, a document delivery system has been perverted into a application delivery system.

...and a protocol for inter-object communication. That's probably the most important part, because that's what creates most of the complexity and security holes today. There is no end to this in sight.

Alan Kay warned web developers about this in 1997:

https://www.youtube.com/watch?v=oKg1hTOQXoY

Nobody listened. The majority of web devs still don't get it. (Can't wait for the arrogant replies here.) We ended up reinventing and re-implementing "internet objects" in the shittiest way possible.

ng12
Truly spoken like someone who's never done any GUI development in a professional setting.
> What happened in most of the world starting in the 70s was abstract data types, which was really staying with an assignment-centered way of thinking about programming, and I believe that... in fact when I made this slide C++ was just sort of a speck on the horizon. It's one of those things like MS-DOS that nobody ever took seriously, because who would ever fall for a joke like that.

- Alan Kay

I don't hate on object-oriented programming, but when I come across videos like this https://www.youtube.com/watch?v=oKg1hTOQXoY&feature=youtu.be... it is reason enough for me to question OOP and explore alternatives.

I will say though... in my limited sample size of coworkers, it is more fun to work with people who attack OOP than those who defend it. But the best people to work with are those who don't get emotional about programming ideologies.

Alan Kay at OOPSLA 1997 - The computer revolution hasn't happened yet

https://www.youtube.com/watch?v=oKg1hTOQXoY

Jun 20, 2019 · 13 points, 1 comment · submitted by tosh
tosh
inlining the pinned comment by Joel Jakubovic from YouTube as it is easy to miss:

  07:00 "On the Fact that the Atlantic Has Two Sides"
        Kay: "On the Fact that Most of the World's Software Is Written On One Side Of The Atlantic!"
  07:30 Two types of math: computers = practical math
        structures of a much larger kind, consistent
  08:55 Debugging goes on in math as well
  10:00 What's being done out in the world under the name of "OOP"?
  10:33 Infamous C++ quip
  10:50 Same feelings about Smalltalk! Meant to be improved. Not syntax, or classes. Taken as given...
  13:00 Math as gears
  13:35 Analogy to 60s programs: SCALING A DOGHOUSE!
  18:00 Pink plane / blue plane
  21:30 Data abstraction in 1961 US Air Force
  23:25 Contrast that with what you have to do with HTML on the internet! Dark Ages!
          Presupposes that there should be a "browser" that should understand its format.
          "This has to be one of the worst ideas since MS-DOS"
          What happens when physicists decide to play with computers :P
        Browser wars, which are 100% irrelevant [to the big picture]
  25:30 The internet is starting to move in that direction as people invent ever more complex HTML formats, ever more intractable.
       This same mistake made over and over again
  26:00 Sketchpad: very much an object-oriented system
  26:50 Simula
  27:50 Better Old Thing vs. New Thing
  29:20 Molecular biology: cell physiology and embryology (morphogenesis)
  30:00 First assay of an entire living creature: E.Coli; biological info processing
  34:10 Only takes 50 (40) cell division [cycles?] to make a baby!
  35:20 Computers: slow, small, stupid. How can we get them to realise their destiny?
  36:00 Doghouses, clocks don't scale 100x very well. Cells scale by 1,000,000,000,000x.
        How do they do it, how might we adapt this for building complex systems?
  36:40 Simple idea that C++ has not figured out.
        "No idea so simple and powerful that you can't get zillions of people to misunderstand it"
        Must not allow the INTERIOR to be a factor in the computation of the whole
        Cell membrane keeps stuff OUT as much as it keeps stuff IN
        Confusion with objects from noun+verb-centered language. "Our process words stink"
  37:50 Apologies for "Object-Oriented". Should have been Process/Message/間 (ma) -Oriented
        Stuff IN-BETWEEN objects. What you DON'T see.
  39:25 An object can act like anything. You have encapsulated a COMPUTER!! The universal simulator!
        Take the powerful thing you're working on and not LOSE it by PARTITIONING UP your design space!
        That's the bug in data/procedures.
        Pernicious thing in C++ and Java by looking like the OLD THING as much as possible.
  40:35 Virtual Machine
  41:00 UNIX processes: like objects, but too much overhead. "3" ought to be an object; but it can't be a UNIX process.
  41:50 Bio-cells can't share their DNA. OUR "cells" can! No need to e.g. engineer a special virus to change the DNA (e.g. cystic fibrosis)
  43:00 "An object is a virtual server" Every object should at least have a "URL"
        I believe every object should have an IP
  44:10 So far we've been CONSTRUCTING software; soon we'll have to GROW it
        Easy to grow a baby 6 inches; never have to take it down for maintenance!!
        Can't grow a Boeing 747! Simple mechanical world; only object was to MAKE the artefact, NOT to grow it
  44:55 How many people STILL[1997] use a dev system that forces you to develop OUTSIDE of the lang?
        Edit/compile/run? Even if it is "fast". Cannot possibly be other than a DEAD END for building complex systems!
        [20 years later, in 2017: has anything changed?]
  45:50 ARPANET, the precursor to the Internet.
        From the time ARPANET started running, expanded by 8 orders of magnitude.
        Not ONE PHYSICAL ATOM in the internet today that was in the original ARPANET!
        Not ONE LINE OF CODE of the original remains in the system!
        System expanded by 100,000,000x, has changed every atom and every bit and has NEVER HAD TO STOP!
        That is the metaphor we ABSOLUTELY MUST apply to what we THINK are "smaller" things!
        When we think programming is "small", that's why our programs are so BIG.
  47:45 LISP, meta-reflection. "Maxwell's Equations" of software
        Saddest thing about Java (possibly before reflection API?)
        Java originally for programming toasters, not internet, but still --
        How can you hope to cope with stuff without a meta-system?
        Represents a real failure of people to understand what the larger picture is / will be.
  50:10 Meta-programming. Bootstrapping on top of the language itself.
        Tyranny of a single implementation.
  52:15 Dozens and dozens of different object systems, all with very similar semantics, but different pragmatic details.
        Think of what a URL is, a HTTP message is, an object is, an obj-oriented pointer is...
        Should be clear that objects should not have to be local to the computer.
        Interop is possible basically for free under this stance.
        Things like JavaBeans and CORBA will not suffice.
        Objects will have to DISCOVER and NEGOTIATE what each other can do.
        Prod and poke each other to see response.
        Must automate this!!
  55:20 1970s: Abstract Data Types. Assignment-centered thinking.
        C++ like MS-DOS: no-one took seriously, because who would ever fall for a joke like that? ...
  57:35 McLuhan: "I don't know who discovered water, but it wasn't a fish!"
  58:10 3 stages of new ideas:
        Dismissed as mad.
        Then: totally obvious.
        Finally: original denouncers claim to have invented it!
        Smalltalk, sadly, went this way.
        We don't know how to make systems yet!
        Don't let what we DON'T know become a religion!
        Before coming out into the world, Smalltalk was very good at making obsolete previous versions of itself.
        Squeak: bootstrap something better than Smalltalk!
        Think of how you can obsolete the damn thing!
  1:01:30 Pipe organ: "Play it grand."

  Let us combine all this with Bret Victor's work, and lay the foundations for the real computer revolution yet to come.
> Here's my pet theory on why Kay's vision for OOP didn't win in the marketplace of ideas: The software industry achieves "extreme late-binding" via network protocols like TCPIP/HTTP instead of a single programming language's "message bus".

Sounds precisely like his vision.

https://youtu.be/oKg1hTOQXoY

In this video he refers to the notion of treating each object as a complete computer, with its own URL.

jasode
>Sounds precisely like his vision.

Yes, I've seen that well-known video many times.

When I say Kay's vision, I'm talking about his often repeated comment, "I made up the term object-oriented, and I can tell you I did not have C++ in mind."

His "vision" in my comment being the whole Smalltalk programming environment and language that emphasizes the messages and very late binding. (Hence, the "talk" in "Smalltalk".)

What datacenters like the ones at Google,Facebook,Amazon,etc did is have a bunch of C++/Java/Go/PHP programs "talk" to each other by sending data over the wire via serialization protocols such as protobufs/thrift/Kafka/JSON/XML/etc.

Many companies independently arrived at the idea that they wanted "late binding" ... but they didn't call it "late binding". Instead they might say they want to "decouple" one microservice from another. Instead of calling Linux executables "objects", they might say "service endpoints" or "REST endpoints".

The key is that instead of buying a Smalltalk license for programmers to implement "extreme late binding", they just used C++ and shared data over the network with one of the flavors of data serialization. Again, C++ isn't the "real" OOP according to Alan Kay -- but that's one of the alternatives that "won" in the marketplace of ideas over Smalltalk. Or they may have used PHP to write a web backend (the "object") and the urls as REST APIs to act as "messages". PHP is another alternative that "won" over Smalltalk.

Many companies implemented "cell biology evolution" via constant software updates, and "late binding" via network protocols. With hindsight, we see that the late-binding dynamic advantages Kay laid out for Smalltalk were achieved organically by the software industry in multiple programming languages. It didn't even matter that some of those other languages had static type systems.
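The parallel drawn above — serialized messages sent to an endpoint instead of in-language message sends — can be sketched in a few lines. This is a toy illustration, not anything from the comment: the `Service` class, the `selector` message format, and the `do_` naming convention are all invented; the point is that the receiver dispatches on the message content at run time, exactly the "extreme late binding" being discussed.

```python
import json

class Service:
    """A toy 'object as service endpoint': it receives a serialized
    message (think JSON over HTTP) and dispatches on the selector at
    run time -- the caller never binds to a concrete method."""

    def handle(self, raw: str) -> str:
        msg = json.loads(raw)  # the 'wire' format standing in for HTTP
        # Late binding: look the handler up by name when the message
        # arrives, with a 'message not understood' fallback.
        method = getattr(self, "do_" + msg["selector"], self.do_unknown)
        return json.dumps(method(*msg.get("args", [])))

    def do_add(self, a, b):
        return {"result": a + b}

    def do_unknown(self, *args):
        return {"error": "message not understood"}

svc = Service()
print(svc.handle('{"selector": "add", "args": [2, 3]}'))
```

Swap the in-process call for an HTTP POST and `Service` for a microservice, and the shape is the REST-endpoint world described above; swap it for a Smalltalk object and `do_unknown` becomes `doesNotUnderstand:`.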

May 16, 2019 · 5 points, 0 comments · submitted by ohjeez
WASM makes some sense in the browser, but what horrifies me about it is that it is going the way of Electron.

The usual pattern:

1. Instead of figuring out how to do things well in the native space, developers hack something together from web technologies, because it's "simpler".

2. They realize it doesn't work all that well and reinvent some aspect of the native environment to fill the gaps.

3. They backport all the complexity back to the web, because they want to run one codebase everywhere.

This is incredibly backwards. We get bloated faux-native apps that use abstractions that make zero sense outside of the web context. We get web technologies that are bad approximations of what has been done reasonably well in the native space. Finally, we get more and more complexity with each iteration.

---

BTW, does anyone remember how Alan Kay said[1] browsers should be designed to be like mini operating systems? Does anyone remember how many people said he is wrong and doesn't understand the web? Well, browsers increasingly are used as an operating system, except it's a system formed by accretion, not by deliberate design. So maybe, just maybe, people should pause, listen to Kay and read up on how similar problems were solved by very smart people before us, instead of just adding more and more web-ducktape to everything.

https://www.youtube.com/watch?v=oKg1hTOQXoY&t=1405s

pcwalton
OK, but wasm is a pretty "boring" VM; it's a reasonable design you could easily arrive at by only considering native code. People are acting like this result indicates that wasm is fundamentally misdesigned, rather than that it simply doesn't yet support SIMD and has a relatively immature compiler compared to GCC and Clang.
Most of Alan Kay's presentations I've seen were worth watching. Tons of great ideas in each. They really improved how I reason about systems.

There is this famous talk from OOPSLA 1997:

https://www.youtube.com/watch?v=oKg1hTOQXoY

If you want to see what "practical" OOP looked like in the 80s, this is a good video:

https://www.youtube.com/watch?v=QjJaFG63Hlo

From his talk at OOPSLA [1], I'd say that only Smalltalk and Common Lisp (through CLOS) were able to do OO as Kay envisioned it.

Erlang is another story: it would probably be better to call it just actor model. But again, Kay himself says[2] that there is not a lot of difference between actor model and his OOP.

[1] https://www.youtube.com/watch?v=oKg1hTOQXoY

[2] https://www.quora.com/What-is-the-difference-between-Alan-Ka...

madhadron
Add Self to your list, which was even purer than Smalltalk since it got rid of the classes and let the objects stand by themselves.

It's not just the actor model, though. In the actor model, all the actors are contributors to the conversation. The subjects of conversation are something else. In object oriented programming, the subjects of conversation are the same category of things as the agents of conversation, and may move back and forth freely.

The simplest way to see the difference is to think about how a new contributor to the conversation is added. In the actor model, some actor forks a new actor in response to a message. If I have built up the state to describe a new actor, there is still a step to bring it to life that changes it from one thing to another.

In object oriented programming, when you build up that state, it may become a contributor to the conversation at any point in time and then go back to being a subject of conversation, or do both at once. There is no switch.
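The distinction described above — in OOP the same thing can be a subject of conversation and a contributor, with no spawn step in between — can be made concrete with a small sketch (Python; `Order` and `Auditor` are invented names, used only to illustrate the point):

```python
# In the actor model, state must be explicitly spawned into an actor
# before it can receive messages. With objects there is no such switch:
# the same object is freely passed around as data (a subject of
# conversation) and sent messages (a contributor), in any order.
class Order:
    def __init__(self, total):
        self.total = total

    def apply_discount(self, pct):
        # Receiving a message: the order acts as a *contributor*.
        self.total *= (1 - pct)

class Auditor:
    def inspect(self, order):
        # The order passed in as plain data: here it is a *subject*.
        return order.total

order = Order(100.0)
order.apply_discount(0.1)        # contributor
print(Auditor().inspect(order))  # subject -- no "bring to life" step
```

In an actor system the `Order` state would have to be handed to a spawn primitive before it could respond to `apply_discount`; here the object is live from the moment it exists.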

The best arguments against OOP, as it is done in most languages today, are made by Alan Kay, the creator of Smalltalk and the term "object-oriented". Alan should have used the term "message passing" so people wouldn't miss this component of Smalltalk that made it so great.

"I invented the term object-oriented, and I can tell you that C++ wasn't what I had in mind." - Alan Kay, Source: https://youtube.com/watch?v=oKg1hTOQXoY&t=634s

hota_mazi
It's not an argument against OOP, it's an attempt to redefine the term to be more in line with something that Kay invented.

And the world has moved on. OOP is not about message passing.

Jun 19, 2018 · leoc on Qt for WebAssembly
Isn't progress wonderful? 2010: https://news.ycombinator.com/item?id=1056391

Speaking of which, the linked page is no longer up, so here's a Wayback Machine impression: https://web.archive.org/web/20100304101843/http://labs.troll...

... and ofc the Google Video link in my comment https://news.ycombinator.com/item?id=1056393

> 1 point by leoc on Jan 16, 2010 | parent | favorite | on: QT on Google Native Client

> 1997^H^H^H^H1961 is calling; someone just picked up the rotary phone. http://video.google.com/videoplay?docid=-2950949730059754521....

is now broken too, so here's a link into a YouTube upload: https://www.youtube.com/watch?v=oKg1hTOQXoY&t=612s

This is an excellent piece of writing. The aphorism "Go slow and fix things." is brilliant. I'm stealing that.

I implore people reading this thread to watch this Alan Kay video from 1997, where he rather predicts (albeit perhaps implicitly, if not explicitly) that we will reach the situation we are in today.

https://www.youtube.com/watch?v=oKg1hTOQXoY

There is a quote from the talk which rings true with many of the comments here: "HTML on the internet has gone back to the dark ages, because it presupposes that there should be a browser which understands its formats"

I have found the odd nit with the article:

"In one way, it is easier to be inexperienced: you don’t have to learn what is no longer relevant."

I think he might mean unlearn, but it's a bit ambiguous. :-)

devxpy
That talk is truly awesome. Thanks.
gjvc
I'm glad you found it useful. Here's a related bonus link :-)

https://www.quora.com/Is-there-a-programming-language-thats-...

Dr Kay has been loudly telling us for a long time that we are on the wrong track whilst having already delivered a complete alternative. One can only wonder what his perspective feels like.

Alan Kay has an excellent and relevant talk as to why we shouldn't have ended up here (from 1997), which I think will answer your questions.

https://www.youtube.com/watch?v=oKg1hTOQXoY

Oct 23, 2017 · 2 points, 0 comments · submitted by gjvc
"The more interesting/optimized ways to map this would be where a single object in the software internet somehow maps to multiple computers, either doing parallel computation or partitioned computation on each. I feel the semantics of mapping the object onto a physical computer would have to be encoded in the object itself."

You might be interested in Alan Kay's '97 OOPSLA presentation. He talked in a similar vein to what you're talking about: https://youtu.be/oKg1hTOQXoY?t=26m45s

Inspired by what he said there, I tried a little experiment in Squeak, which worked, as far as it went (scroll down the answer a bit, to see what I'm talking about, here): https://www.quora.com/What-features-of-Smalltalk-are-importa...

I only got that far with it, because I realized once I did it that I had more work to do in understanding what to do with what I got back (mainly translating it into something that would keep with the beauty of what I had started)...

"Maybe there is a feedback loop where the growth of Unix leads to hardware vendors thinking 'lets optimize for C', which then feeds the growth further? OTOH, even emulated machines are faster than hardware machines used to be."

There is a feedback loop to it, though as development platforms change, that feedback gets somewhat attenuated.

As I recall, what you describe with C happened, but it began in the late '90s, and into the 2000s. I started hearing about CPUs being optimized to run C faster at that point.

I once got into an argument with someone on Quora about this re. "If Lisp is so great, why aren't more people using it?" I used Kay's point about how bad processor designs were partly to blame for that, because a large part of why programmers make their choices has to do with tradition (this gets translated to "familiarity"). Lisp and Smalltalk did not run well on the early microprocessors produced by these companies in the 1970s. As a consequence, programmers did not see them as viable for anything other than CS research, and higher-end computing (minicomputers).

A counter to this was the invention of Lisp machines, with processors designed to run Lisp more optimally. A couple of companies got started in the '70s to produce them, and they lasted into the early '90s. One of these companies, Symbolics, found a niche in producing high-end computer graphics systems. The catch, as far as developer adoption went, was that these systems were more expensive than your typical microcomputer, and their system stuff (the design of their processors, and their system software) was not "free as in beer."

Unix, by contrast, was distributed for free by AT&T for about 12 years. Once AT&T's long-distance monopoly was broken up, they started charging a licensing fee for it. Unix eventually ran reasonably well on the more popular microprocessors, but I think it's safe to say this was because the processors got faster at what they already did, not that they were optimized for C. This effect eventually occurred for Lisp as well by the early '90s, which is one reason the Lisp machines died out. A second cause for their demise was the "AI winter" that started in the late '80s. By then, however, the "tradition" of using C, and later C++, for programming most commercial systems had been set in the marketplace.

The pattern that seems to repeat is that languages become popular because of the platforms they "rode in on," or at least that's the perception. C came on the coattails of Unix. C++ seems to have done this as well. This is the reason Java looks the way it does. It came out of this mindset. It was marketed as "the language for the internet," and it piggybacked on C++ for its syntax and language features. At the time the internet started becoming popular, Unix was seen as the OS platform on which it ran (which had a lot of truth to it).

However, a factor that had to be considered when writing software for Unix was portability, since even though there were Unix standards, every Unix system had some differences. C was reasonably portable between them, if you were careful in your implementation, basically sticking to POSIX-compliant libraries. C++ was not so much, because different systems had C++ compilers that implemented different subsets of the language specification well, and didn't implement some features at all. C++ was used for a time in building early internet services (combined with Perl, which also "rode in" on Unix).

Java was seen as a pragmatic improvement on C++ among software engineers, because, "It has one implementation, but it runs on multiple OSes. It has all of the familiarity, better portability, better security features, with none of the hassles." However, it completely gave up on the purpose of C++ (at the time), which was to be a macro language on top of C, in a similar way to how Simula was a macro language on top of Algol. Despite this, it kept C++'s overall architectural scheme, because that's what programmers thought you used for "serious work."

From a "power" perspective, one has to wonder why programmers, when looking at the prospect of putting services online, didn't look at the programming architecture, since they could see some problems with it pretty early, and say to themselves, "We need something a lot better"? Well, this is because most programmers don't think about what they're really dealing with, and modeling it in the most comprehensive way they can, because that's not a concept in their heads. Going back to my first point about hardware, for many years, the hardware they chose didn't give them the power so they could have the possibility to think about that. As a result, programmers mostly think about traits, and the community that binds them together. That gives them a sense of feeling supported in their endeavors, scaling out the pragmatic implementation details, because they at least know they can't deal with that on their own. Most didn't think to ask (including myself at the time), "Well, gee. We have these systems on the internet. They all have different implementation details, yet it all works the same between systems, even as the systems change... Why don't we model that, if for no other reason than we're targeting the internet, anyway? Why not try to make our software work like that?"

On one level, the way developers behave is tribal. Looked at another way, it's mercantilistic. If there's a feedback loop, that's it.

"OTOH, even emulated machines are faster than hardware machines used to be."

What Kay is talking about is that the Alto didn't implement a hard-coded processor. It was soft-microcoded. You could load instructions for the processor itself to run on, and then load your system software on top of that. This enabled them to make decisions like, "My process runs less efficiently when the processor runs my code this way. I can change it to this, and make it run faster."

This will explain Kay's use of the term "emulated." I didn't know this until a couple years ago, but at first, they programmed Smalltalk on a Data General Nova minicomputer. When they brought Smalltalk to the Alto, they microcoded the Alto so that it could run Nova machine code. So, it sounds like they could just transfer the Smalltalk VM binary to the Alto, and run it. Presumably, they could even transfer the BCPL compiler they were using to the Alto, and compile versions of Smalltalk with that. The point being, though, that they could optimize performance of their software by tuning the Alto's processor to what they needed. That's what he said was missing from the early microprocessors. You couldn't add or change operators, and you couldn't change how they were implemented.

shalabhc
Thanks for the long write up. I found it very interesting.

> You might be interested in Alan Kay's '97 OOPSLA presentation

Oh yeah I have actually seen that - probably time to watch it again.

> Well, this is because most programmers don't think about what they're really dealing with

Agree with that. Most people are working on the 'problem at hand' using the current frame of context and ideas and focus on cleverness, optimization or throughput within this framework. When changing the frame of context may in fact be much better.

> What Kay is talking about is that the Alto didn't implement a hard-coded processor. It was soft-microcoded.

Interesting. I wonder if FPGAs could be used for something similar - i.e. program the FPGAs to run your bytecode directly. But I'm speculating because I don't know too much about FPGAs.

alankay1
Yes re: FPGAs -- they are definitely the modern placeholder of microcode (and better because you can organize how the computation and state are hooked together). The old culprit -- Intel -- is now offering hybrid chips with both an ARM and a good size patch of FPGA -- combine this with a decent memory architecture (in many ways the hidden barrier these days) and this is a pretty good basis for comprehensive new designs.
alankay1
Actually ... only the first version of Smalltalk was done in terms of the NOVA (and not using BCPL). The subsequent versions (Smalltalk-76 on) were done by making a custom virtual machine in the Alto's microcode that could run Smalltalk's byte codes efficiently.

The basic idea is that you can win if the microcode cycles are enough faster than the main memory cycles so that the emulations are always waiting on main memory. This was generally the case on the Alto and Dorado. Intel could have made the "Harvard" 1st level caches large enough to accommodate an emulator -- that would have made a big difference. (This was a moot point in the 80s)
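The "win if microcode cycles are enough faster than main memory" point above can be put as a back-of-the-envelope calculation; the cycle times below are invented for illustration, not actual Alto or Dorado figures:

```python
# Hypothetical numbers only: if each emulated bytecode costs a few fast
# microcode cycles, and every bytecode also touches main memory, the
# emulation overhead hides behind the memory wait and is effectively free.

micro_cycle_ns = 50       # assumed microcode cycle time
mem_cycle_ns = 400        # assumed main-memory cycle time
micro_ops_per_bytecode = 6

emulation_ns = micro_ops_per_bytecode * micro_cycle_ns  # 300 ns of decode/dispatch
hidden = emulation_ns <= mem_cycle_ns                   # fits inside one memory wait?
print(emulation_ns, hidden)   # 300 True
```

When the inequality holds, the emulator is always waiting on memory anyway, which is the condition Kay describes for the Alto and Dorado.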

mmiller
I know this is getting nit-picky, but I think people might be interested in getting some of the details in the history of how Smalltalk developed. Dan Ingalls said in "Smalltalk-80: Bits of History":

"The very first Smalltalk evaluator was a thousand-line BASIC program which first evaluated 3 + 4 in October 1972. It was followed in two months by a Nova assembly code implementation which became known as the Smalltalk-72 system."

The first Altos were produced, if I have this right, in 1973.

I was surprised when I first encountered Ingalls's implementation of an Alto on the web, running Smalltalk-72, because the first thing I was presented with was, "Lively Web Nova emulator", and I had to hit a button labeled "Show Smalltalk" to see the environment. He said what I saw was Nova machine code from a genuine ST-72 image, from an original disk platter.

I take it from your comment that you're saying by the time ST-76 was developed, the Alto hardware had become fast enough that you were able to significantly reduce your use of machine code, and run bytecode directly at the hardware level.

I could've sworn Ingalls said something about using BCPL for earlier versions of Smalltalk, but quoting out of "Bits of History" again, Ingalls, when writing about the Dorado and Smalltalk-80, said of BCPL that the compiler you were using compiled to Alto code, but ...

"As it turned out, we only used Bcpl for initialization, since it could not generate our extended Alto instructions and since its subroutine calling sequence is less efficient than a hand-coded one by a factor of about 3."

alankay1
The Alto didn't get any faster, and there was not a lot of superfast microcode RAM (if we'd had more it would have made a huge difference). In the beginning we just got Smalltalk-72 going in the NOVA emulator. Then we used the microcode for a variety of real-time graphics and music (2.5 D halftone animation, and several kinds of polytimbral timbre synthesis including 2 keyboards and pedal organ). These were separate demos because both wouldn't fit in the microcode. Then Dan did "Bitblt" in microcode which was a universal screen painting primitive (the ancestor of all others). Then we finally did the byte-code emulator for Smalltalk-76. The last two fit in microcode, but the music and the 2.5 D real-time graphics didn't.

The Notetaker Smalltalk (-78) was a kind of sweet spot in that it was completely in Smalltalk except for 6K bytes of machine code. This was the one we brought to life for the Ted Nelson tribute.

Sep 18, 2017 · 2 points, 0 comments · submitted by tosh
The computer revolution hasn't happened yet by Alan Kay https://youtu.be/oKg1hTOQXoY
also:

Alan Kay: "OOPSLA 1997 - The computer revolution hasnt happened yet"

https://www.youtube.com/watch?v=oKg1hTOQXoY

and +1 to anything by bret victor

Dec 02, 2016 · gnipgnip on Let’s Stop Bashing C
Alan Kay in his famous OOPSLA talk mentions how the architecture itself leads to these language choices.

https://www.youtube.com/watch?v=oKg1hTOQXoY

He mentions how Xerox PARC machines with tagged architectures from the '80s ran at only 1/50th the speed of the x86 systems of '97 -- this in contrast to the 50,000x increase in raw computing power over the same time frame.

adamnemecek
Can anyone recommend any resources on older architectures? Alan also talks about the Burroughs CPUs but idk how one would go about learning about those these days.
nickpsecurity
I'm at work, but these two should be a nice start:

https://www.smecc.org/The%20Architecture%20%20of%20the%20Bur...

http://homes.cs.washington.edu/~levy/capabook/

Bitsavers and Wikipedia are good resources. The summaries of most on Wikipedia are accurate enough, with references to more authoritative information. I found a lot of it that way.

I believe that switching to a stack machine is short-sighted and a big mistake:

The AST format could open up new possibilities for software, some of which are observable in Lispy languages like Scheme (I won't list them here). Instead, we're looking at locking the software world back into this 1960's model for another 50 years out of a misguided concern for optimization over power.

It's like forgoing the arch because it's more work to craft, and instead coming up with a REALLY efficient way to fit square blocks together. Congratulations: we can build better pyramids, but we will never grasp the concept of a cathedral.

To really grasp my point, I BEG you all to watch the following two videos in full and think hard about what Alan Kay & Douglas Crockford have to say about new ideas, building complex structures, and leaving something better for the next generation:

https://youtu.be/oKg1hTOQXoY

https://www.youtube.com/watch?v=PSGEjv3Tqo0

As Alan Kay states, what is simpler: something that's easier to process, but for which the software written on top of it is massive; or one that takes a bit more overhead, but allows for powerful new ways to model software and reduce complexity?

I believe that an AST model is a major start in inventing "the arch" that's been missing in software, and with something that will proliferate the whole web ... how short-sighted it would be to give that up in favor of "optimizing" the old thing.

Imagine if, instead of JavaScript, the language of the web had been Java: lambdas would not be mainstream, new ways of doing OOP would not have been thought of, and all the amazing libraries that exist because of the ad-hoc object modeling JavaScript offers would never have been written. JavaScript is probably one of the messiest and most inefficient languages ever created, yet one of the most powerful ever given. C'mon, let's take it a step further by making it binary, homoiconic, and self-modifying.

Thanks.
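As a toy illustration of the contrast argued above (all names invented): the same expression can be kept as an AST, where its structure stays available for analysis and transformation, or flattened to stack-machine code, where the structure is erased in favor of linear execution. Both compute the same answer; the question is what the representation leaves you free to do later.

```python
# The expression (2 + 3) * 4, first as a tree, then flattened.
ast = ("mul", ("add", ("num", 2), ("num", 3)), ("num", 4))

def eval_ast(node):
    """Walk the tree directly; the structure is still there to inspect."""
    tag = node[0]
    if tag == "num":
        return node[1]
    _, left, right = node
    l, r = eval_ast(left), eval_ast(right)
    return l + r if tag == "add" else l * r

def compile_to_stack(node):
    """Flatten the tree into linear stack-machine code (postorder)."""
    tag = node[0]
    if tag == "num":
        return [("push", node[1])]
    _, left, right = node
    return compile_to_stack(left) + compile_to_stack(right) + [(tag, None)]

def run_stack(code):
    """Execute the flat code; the original structure is gone."""
    stack = []
    for op, arg in code:
        if op == "push":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if op == "add" else a * b)
    return stack[0]

print(eval_ast(ast))                      # 20
print(run_stack(compile_to_stack(ast)))   # 20
```

The stack code is trivially easy to execute, but recovering "this is a multiplication of a sum" from it requires decompilation, which is the kind of loss the comment above is worried about.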

shkaboinka
...and if you're brave enough, think about how Christopher Alexander's philosophy of "unfolding wholeness" applies so much more to an AST than to the stack-machines of the 1960's:

https://youtu.be/98LdFA-_zfA

No, compared to the early vision (1950s/60s) of computers as metatools that humans can use to augment their cognition. The current crop of mobile devices (and even most desktop devices) is nowhere near that glorious goal. See his OOPSLA talk "The computer revolution hasn't happened yet": https://www.youtube.com/watch?v=oKg1hTOQXoY
I recently (re)discovered Alan Kay's 1997 OOPSLA talk:

https://www.youtube.com/watch?v=oKg1hTOQXoY

For others that have been following his work (and talks), it doesn't really contain much new stuff, but the context of a talk to an audience of programmers brings out some different angles. He touches on the fact that they didn't "just" spend extra money to prototype the Dynabook in the form of the Alto -- they built 2000 machines -- partly, of course, so they could invite (among others) whole classes of school children to come and play with the tech.

For the record, the title was taken from Alan Kay's 1997 OOPSLA keynote of the same name, which is well worth a watch.

https://www.youtube.com/watch?v=oKg1hTOQXoY

Research for this paper clearly wasn't very thorough.

This quote, which is key to the title, is 10 years wide of the mark: "As the computer scientist and graphical interface pioneer Alan Kay likes to say, 'the real computer revolution hasn't happened yet' (“The Computer Revolution,” 2007)". This title, as far as I can tell, first appeared in the 1997 OOPSLA talk of the same name: https://www.youtube.com/watch?v=oKg1hTOQXoY

gozo
Eh what? It's a reference. The year has no other meaning than to account for when the referenced work was published, which was in 2007.
carussell
There's no rule that says the author must find Kay's talk to be more compelling than his VPRI memo, and it isn't evidence that the paper was poorly researched.
stcredzero
Besides Case and Hart's piece, we have a few other concrete examples of how computing could be a medium for understanding systems.

Only a few? There are a lot of them, actually. Games are starting to become these, particularly ones which encourage exploration and making, like Minecraft. I think we're only at the beginning of our exploration of the craft of emergent sandbox worlds. We're probably at an analogous stage in emergent sandbox worlds as the cartoon doodles found on Japanese temple beams from around 1200 years ago. One day, we'll get to the sandbox world equivalent of "The Watchmen."

In case anyone is looking for the video of the keynote Kay gave before this post. There's a dead link to it at the bottom of the page.

  https://www.youtube.com/watch?v=oKg1hTOQXoY
A naive attempt to do the same thing we do now, just waaaaaaay more inflexibly.

The author needs to watch this and meditate on why Alan Kay said what he said about the web architecture: http://youtu.be/oKg1hTOQXoY

It looks like you're making a point that these things are all inherently complex.

Isn't it more probable that the tools we have today are just inadequate to deal with those problems? And maybe they are still inadequate after all these years because our industry is very stubborn and doesn't learn from its mistakes?

I see nothing complex about drawing interactive elements on the screen. Smalltalk with its Morphic interface offers a much richer and flexible GUI toolkit and that was out when, in the 60s? How many GUI toolkits have learned from those lessons? And Morphic/Smalltalk was cross-platform before Java, in ways Java isn't to this day. It seems to me what hampers evolution is the technology we choose (Java, C) and not the problem itself (drawing interactive elements on the screen).

For anyone interested in this discussion, I recommend Alan Kay's "The Computer Revolution hasn't happened yet": https://www.youtube.com/watch?v=oKg1hTOQXoY

mwcampbell
My point is that the inherent complexity of a UI toolkit is greater than many programmers realize. It's quite easy to implement your own UI that directly draws its controls on the screen and handles mouse clicks. Now, how are you going to make that accessible to blind users? Users with mobility impairments who can't use a mouse? How about right-to-left languages and non-Latin input methods? And there's probably more that I'm not aware of.
jamii
That's true, but there is still plenty of incidental complexity: poor layout languages with no composition, stateful manipulation of the UI, no way to express components. Tools like React.js and Apple's Auto Layout show that much of this complexity is not actually needed.
mwcampbell
Agreed. I like the ideas behind React.js in particular.
Jul 30, 2014 · 2 points, 0 comments · submitted by noveltysystems
He gives some really good points about this in a talk delivered to OOPSLA in 97 [1].

1. http://www.youtube.com/watch?v=oKg1hTOQXoY

Of course the full talk also has some merit: https://www.youtube.com/watch?v=oKg1hTOQXoY
May 06, 2014 · gjvc on Is HTML Too Big To Fail?
quoting Alan Kay "HTML on the Internet has gone back to the dark ages because it presupposes that there should be a browser that should understand its formats."

video: https://www.youtube.com/watch?v=oKg1hTOQXoY

transcription: http://blog.moryton.net/2007_12_01_archive.html

Apr 15, 2014 · 2 points, 0 comments · submitted by dharmatech
Feb 16, 2014 · thangalin on Stack Overflow is down
While writing ConTeXt code (similar to LaTeX), I will reference the StackExchange network:

    % @see http://tex.stackexchange.com/a/128858/2148
Bret Victor asks, "How do you get communication started between uncorrelated sentient beings?" to introduce the concept of automatic service discovery using a common language.[1]

Alan Kay had a similar idea: that objects should refer to other objects not by their memory space inside a single machine but by their URI.[2]

When programmers copy/paste StackOverflow snippets, in a way they are actually closer to realizing Alan Kay's vision of meta-programming than those who subscribe to the "tyranny of a single implementation" -- or "writing" code as some would mock, expressing a narrow view of what they think "programming" a computer must entail.

The StackExchange network provides a feature-rich interface to document source code snippets that perform a specific task. What's missing is a formal, structured description of these snippets and a mechanism to provide semantic interoperability that leads to a universal prototyping language for deep messaging interchange.[3]

How else are we going to go from Minecraft[4] to Holodeck[5]?

[1]: http://www.youtube.com/watch?v=8pTEmbeENF4#t=790

[2]: http://www.youtube.com/watch?v=oKg1hTOQXoY#t=2940

[3]: http://www.youtube.com/watch?v=oKg1hTOQXoY#t=3125

[4]: http://www.youtube.com/watch?v=k1h1U5jHh6U

[5]: http://www.youtube.com/watch?v=7OCKDEdtWys#t=39
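A minimal sketch of the idea in [2], with hypothetical names: objects are addressed by URI rather than by memory pointer, and a `send` function routes messages to whatever object the URI resolves to. Here the "network" is just an in-process dictionary; a real system would resolve the URI over the wire.

```python
# Objects addressed by URI instead of memory address. Late-bound dispatch:
# the receiver alone decides what a message means, in the spirit of Kay's
# "messaging, not objects" framing.

registry = {}   # stand-in for network-wide URI resolution

class Counter:
    def __init__(self):
        self.n = 0

    def receive(self, selector, *args):
        if selector == "increment":
            self.n += 1
            return self.n
        if selector == "value":
            return self.n
        # The object, not the sender, handles the unknown case.
        return ("does-not-understand", selector)

def send(uri, selector, *args):
    """Route a message to the object the URI names."""
    return registry[uri].receive(selector, *args)

registry["object://example/counter/1"] = Counter()
send("object://example/counter/1", "increment")
print(send("object://example/counter/1", "value"))   # 1
```

Because the sender only holds a URI and a message, the object behind it could be moved, replicated, or reimplemented without the sender changing, which is the property the comment above finds missing from the "tyranny of a single implementation."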

You should check out Alan Kay's OOPSLA keynote "The Computer Revolution Hasn't Happened Yet" at http://www.youtube.com/watch?v=oKg1hTOQXoY, specifically about 20 minutes in, where he says that a browser should be exactly what you say here. On the one hand, he was talking about this almost 20 years ago; on the other hand, you're in good company, because he's a pretty smart guy!

I encourage all developers to watch that talk and do some thinking about the current state of our practice and where we want to go from here.

The Alan Kay reference is from his OOPSLA'97 talk [2], where he says that he's "apologized profusely over the last 20 years for introducing the term 'object oriented'" and suggests that the Japanese notion of "Ma" or "the unseen stuff that goes between objects" as what is important.

(edit) [3] is a quote from a communication with Kay in 2003 -

"OOP to me means only messaging, local retention and protection and hiding of state-process, and extreme late-binding of all things. It can be done in Smalltalk and in LISP. There are possibly other systems in which this is possible, but I'm not aware of them."

[2] https://www.youtube.com/watch?v=oKg1hTOQXoY (around 38 minutes)

[3] http://userpage.fu-berlin.de/~ram/pub/pub_jf47ht81Ht/doc_kay...

I should just probably say, clearly you haven't seen Common Lisp's defpackage, modules are actually first class objects there and are completely decoupled from the file system.

But most importantly, as Barbara Liskov mentions in this video[1], we don't know what a module is exactly, or how to use them, yet. This is a specific statement aligned with Alan Kay's famous "We don't know how to design systems, so let's not turn it into a religion yet."[2]

tl;dr: 1) Innovation is good. 2) JavaScript's module is a half-assed implementation of Common Lisp's defpackage. (Don't get me wrong, it's still way better than Python's abhorrent ninja goto: import.)

[1]: http://www.infoq.com/presentations/programming-abstraction-l... [2]: https://www.youtube.com/watch?v=oKg1hTOQXoY

Apr 28, 2013 · 1 points, 0 comments · submitted by StylifyYourBlog
Summary of the links shared here:

http://blip.tv/clojure/michael-fogus-the-macronomicon-597023...

http://blog.fogus.me/2011/11/15/the-macronomicon-slides/

http://boingboing.net/2011/12/28/linguistics-turing-complete...

http://businessofsoftware.org/2010/06/don-norman-at-business...

http://channel9.msdn.com/Events/GoingNative/GoingNative-2012...

http://channel9.msdn.com/Shows/Going+Deep/Expert-to-Expert-R...

http://en.wikipedia.org/wiki/Leonard_Susskind

http://en.wikipedia.org/wiki/Sketchpad

http://en.wikipedia.org/wiki/The_Mother_of_All_Demos

http://io9.com/watch-a-series-of-seven-brilliant-lectures-by...

http://libarynth.org/selfgol

http://mollyrocket.com/9438

https://github.com/PharkMillups/killer-talks

http://skillsmatter.com/podcast/java-jee/radical-simplicity/...

http://stufftohelpyouout.blogspot.com/2009/07/great-talk-on-...

https://www.destroyallsoftware.com/talks/wat

https://www.youtube.com/watch?v=0JXhJyTo5V8

https://www.youtube.com/watch?v=0SARbwvhupQ

https://www.youtube.com/watch?v=3kEfedtQVOY

https://www.youtube.com/watch?v=bx3KuE7UjGA

https://www.youtube.com/watch?v=EGeN2IC7N0Q

https://www.youtube.com/watch?v=o9pEzgHorH0

https://www.youtube.com/watch?v=oKg1hTOQXoY

https://www.youtube.com/watch?v=RlkCdM_f3p4

https://www.youtube.com/watch?v=TgmA48fILq8

https://www.youtube.com/watch?v=yL_-1d9OSdk

https://www.youtube.com/watch?v=ZTC_RxWN_xo

http://vimeo.com/10260548

http://vimeo.com/36579366

http://vimeo.com/5047563

http://vimeo.com/7088524

http://vimeo.com/9270320

http://vpri.org/html/writings.php

http://www.confreaks.com/videos/1071-cascadiaruby2012-therap...

http://www.confreaks.com/videos/759-rubymidwest2011-keynote-...

http://www.dailymotion.com/video/xf88b5_jean-pierre-serre-wr...

http://www.infoq.com/presentations/Are-We-There-Yet-Rich-Hic...

http://www.infoq.com/presentations/click-crash-course-modern...

http://www.infoq.com/presentations/miniKanren

http://www.infoq.com/presentations/Simple-Made-Easy

http://www.infoq.com/presentations/Thinking-Parallel-Program...

http://www.infoq.com/presentations/Value-Identity-State-Rich...

http://www.infoq.com/presentations/We-Really-Dont-Know-How-T...

http://www.mvcconf.com/videos

http://www.slideshare.net/fogus/the-macronomicon-10171952

http://www.slideshare.net/sriprasanna/introduction-to-cluste...

http://www.tele-task.de/archive/lecture/overview/5819/

http://www.tele-task.de/archive/video/flash/14029/

http://www.w3.org/DesignIssues/Principles.html

http://www.youtube.com/watch?v=4LG-RtcSYUQ

http://www.youtube.com/watch?v=4XpnKHJAok8

http://www.youtube.com/watch?v=5WXYw4J4QOU

http://www.youtube.com/watch?v=a1zDuOPkMSw

http://www.youtube.com/watch?v=aAb7hSCtvGw

http://www.youtube.com/watch?v=agw-wlHGi0E

http://www.youtube.com/watch?v=_ahvzDzKdB0

http://www.youtube.com/watch?v=at7viw2KXak

http://www.youtube.com/watch?v=bx3KuE7UjGA

http://www.youtube.com/watch?v=cidchWg74Y4

http://www.youtube.com/watch?v=EjaGktVQdNg

http://www.youtube.com/watch?v=et8xNAc2ic8

http://www.youtube.com/watch?v=hQVTIJBZook

http://www.youtube.com/watch?v=HxaD_trXwRE

http://www.youtube.com/watch?v=j3mhkYbznBk

http://www.youtube.com/watch?v=KTJs-0EInW8

http://www.youtube.com/watch?v=kXEgk1Hdze0

http://www.youtube.com/watch?v=M7kEpw1tn50

http://www.youtube.com/watch?v=mOZqRJzE8xg

http://www.youtube.com/watch?v=neI_Pj558CY

http://www.youtube.com/watch?v=nG66hIhUdEU

http://www.youtube.com/watch?v=NGFhc8R_uO4

http://www.youtube.com/watch?v=Nii1n8PYLrc

http://www.youtube.com/watch?v=NP9AIUT9nos

http://www.youtube.com/watch?v=OB-bdWKwXsU&amp;playnext=...

http://www.youtube.com/watch?v=oCZMoY3q2uM

http://www.youtube.com/watch?v=oKg1hTOQXoY

http://www.youtube.com/watch?v=Own-89vxYF8

http://www.youtube.com/watch?v=PUv66718DII

http://www.youtube.com/watch?v=qlzM3zcd-lk

http://www.youtube.com/watch?v=tx082gDwGcM

http://www.youtube.com/watch?v=v7nfN4bOOQI

http://www.youtube.com/watch?v=Vt8jyPqsmxE

http://www.youtube.com/watch?v=vUf75_MlOnw

http://www.youtube.com/watch?v=yJDv-zdhzMY

http://www.youtube.com/watch?v=yjPBkvYh-ss

http://www.youtube.com/watch?v=YX3iRjKj7C0

http://www.youtube.com/watch?v=ZAf9HK16F-A

http://www.youtube.com/watch?v=ZDR433b0HJY

http://youtu.be/lQAV3bPOYHo

http://yuiblog.com/crockford/

ricardobeat
And here are them with titles + thumbnails:

http://bl.ocks.org/ricardobeat/raw/5343140/

waqas-
how awesome are you? thanks
Expez
Thank you so much for this!
X4
This is cool :) Btw. the first link was somehow (re)moved. The blip.tv link is now: http://www.youtube.com/watch?v=0JXhJyTo5V8
The answer for me is pretty easy: The Computer Revolution Hasn't Happened Yet by Alan Kay. The OOPSLA '97 keynote speech.

https://www.youtube.com/watch?v=oKg1hTOQXoY

Great for repeated watching, I get something new from it every time I watch it. It is also great for recalibrating your point of view from amazement at whatever the current trend is in technology, to a more long-term outlook as well as encouraging higher standards for what is currently available.

I think this shift in outlook is important for technologists like us, because it easy to become immersed in the day to day goings-on of tech and become myopic in a way. Using the invention of the printing press and literacy, etc, etc is a great way to reorient your attitude towards technology and what it can/should do.

chubot
FWIW I just watched this talk, and tried to judge it apart from my opinion that he is totally wrong about the web.

I think Kay is one of the most interesting speakers in CS; he's a learned man. But I didn't find the content that great. If you want some thoughts on software architecture, programming languages, with a lot of great cultural references outside the field, I much prefer the thoughts of Richard Gabriel:

http://www.dreamsongs.com/Essays.html

If you search for his work on "ultra large scale systems", it is a lot more thought provoking than what Kay offers, and it is a lot more specific at the same time.

When watching the Kay talk, I was struck that he seems to be lumping every "good" in software engineering under the term "object oriented". Virtual machines are objects. Servers are objects. Numbers should be objects. All abstraction is objects. It was not really a useful or enlightening way of thinking about the world.

I know OOP has a different definition than what he intended with Smalltalk, but he didn't invent modularity. For example, Unix is highly modular but not object-oriented, and it predates his ideas.

chubot
Is this the one where he says the web was designed by amateurs and it should have been a virtual machine instead? He says they did a better web 30 years before or something to that effect.

If so I'll commit blasphemy and say he's wronger than wrong. I was especially offended that a curious person would display such ignorance in understanding the work of others.

Without going into too much detail: if he believes this, then he can't possibly understand Tim Berners-Lee's principle of least power. And the web would never have become ubiquitous, e.g. made the jump from PCs to mobile, in his bizarre universe.

coldtea
>If so I'll commit blasphemy and say he's wronger than wrong. I was especially offended that a curious person would display such ignorance in understanding the work of others.

I don't see how he can be wrong. The point he made wasn't that the things they did in the 70s were better than today's web or UIs. Just that people (and companies) should have invested in THAT line of work, and building on THOSE principles, instead of making a mess with reinventing the wheel, competing on BS differentiators and cutting off research.

I'd take my OS X or an Ubuntu desktop over any Xerox GUI of the time, any day of the week. But I'd also take the volume of research, innovation, insight and coherence that they had at the time over today's ho-hum idea of development.

Despite having all those things 30 years ago, it took 30 years for them to become mainstream -- and far less new stuff has emerged since. And not for lack of CPU resources. Even something like BeOS could offer a 2005-class desktop experience in 1996.

The biggest innovation mover in the '00s was Apple. And even they mostly worked on market innovation (i.e. repackaging older niche stuff in a way that finally made people want to use it, instead of a jumbled mess), not on research innovation.

chubot
Read my response above; you're not addressing the point he makes that the web should have been based on a VM. That's what I'm saying is wronger than wrong.

If their line of research is based on distributing content over VMs, then that's completely different than the web's design philosophy, and it's probably THE reason his ideas never were adopted while the web was.

Those design ideas are diametrically opposed. The design of the web won, for that specific reason. It was BETTER rather than worse, as he claims.

I was astonished during the talk that he never reflected on why his ideas in this area weren't successful while the web was. It is NOT because the designers of the web are idiots or amateurs. That is specifically what he claims.

coldtea
Your whole premise is that his ideas "weren't successful", whereas I see it more as a case of "Greenspun's tenth rule".

That is: it's not that the VMs lost and the web won.

It's that the web started from something that wasn't adequate for what we wanted, and after all these years it has been adding all kinds of kludgy, warty, and half-baked implementations of VM features.

In essence, instead of designing a proper web-as-VM from the start (and slowly evolving it as the capabilities of networks and CPUs increase), we started with Berners-Lee's document-based thing, which we soon found was barely enough, and now we have to have video, sound, app capabilities, process isolation, maybe a common bytecode, networking, real-time, etc.

I.e. we started from dumb documents with links, when what we wanted was a VM. And now we have twisted the dumb document model to sorta kinda look like a VM.

How many times have you heard that "the browser is the new OS" and such?

So it's Greenspun's tenth rule, only instead of Lisp we build an ad hoc, bug-ridden, clumsy version of Alan Kay's, Doug Engelbart's, etc., ideas.

chubot
Right, so you are arguing exactly what Kay argued, and what the Adobe guys argued. I am saying that is wrong for the reasons above (i.e., exactly what Tim Berners-Lee argued).

Anyway, Kay basically attributed it to ignorance and stupidity on the part of TBL and the web's designers, which I think is incredibly arrogant, especially since he is wrong.

You didn't really address the arguments above -- you're basically saying "it would have been nicer and cleaner if we started with a VM".

I can sort of see how it would have been nicer in some theoretical world, just like using PDF/PS instead of CSS would have been nicer (in a way).

But I am arguing that it would have never happened. If you started with a VM-based system, there would be technical problems that prevented it from being widely adopted.

Actually, Java was pretty much exactly that. It was supposed to be a VM that ran anything and to disenfranchise Microsoft's OS. And we know JS is more popular than Java, despite way more marketing muscle behind Java in the early years.

It's already been tried. I'd argue that it wasn't just beyond the abilities of Java's designers, but actually impossible (this is as much a social problem as a technical one).

You make the same vague statement Kay does -- that "we should have started with a VM" -- but it would be impossible to come up with a specific VM design (actual running code) that actually worked as well and as widely as the Web does.

jules
You're absolutely right that historically, it wouldn't have worked. But the web has since moved beyond plain old static documents. What was good for a 1989 web isn't what's good for a 2013 web.

The design by committee of the W3C has resulted in overcomplicated yet at the same time underpowered designs. The model where an ivory tower mandates new things from the top down rarely works well. How many man-years does it take to implement a new browser? It's basically an impossible task unless you have millions of dollars to spend.

A much better model for innovation is one where multiple competing technologies get implemented, and the best one wins organically. The web as a VM would have enabled that. You no longer need a committee that mandates HTML5. It can be implemented as a library, and if people like it they will use it. The W3C has a chicken-and-egg problem: before it standardizes something, there is not much experience with the features it standardizes, so it's hard to do good designs. On the other hand, once you standardize something, you're stuck with it. It can (practically) never die. If you have multiple "HTML5 libraries" then it's not a problem to try something out and pick the best thing. The best solution can win, and the others can slowly fall out of use.

chubot
I agree with the problems, i.e. that web standards and implementations are messy, but not with the solutions. It just seems like a big pipe dream -- "oh I wish we could start over from scratch, rewrite the web, and make things clean". Will never happen.

I also don't agree in 2013 that moving say HTML5 to a VM is a good idea. There's nothing stopping anyone from releasing a VM now. You could release a new VM and an application platform. I mean that is essentially what, say, Android is. But even Android clients will always be a subset of all web clients.

I think people proposing this mythical VM don't have a clear understanding of what VMs are. They are fairly specific devices. Do you think Dalvik, something like v8, or the JVM, or .NET, can form the basis for all future computing? It's an impossible task for a single VM. Even Microsoft is taking more than 10 years to move a decent amount of their own software to their own VM, on a single platform. All VMs necessitate design decisions that are not right for all devices. Even Sun had diverging VMs for desktop and mobile.

jules
Do you think HTML+CSS+JS+(...) can form the basis of all future computing? You have to apply the same standards to both sides. It's hard to be perfect, yes, but it's easy to do better than the web mess. If I had to choose between HTML+CSS+JS+(...) and either the JVM or CLR as the basis of the future of all computing, I would definitely choose one of the latter options. Both the JVM and .NET VM would work okayish, though obviously not ideal. You want a simple bare bones low level VM with the sole purpose of building things on top of it. The JVM and .NET have much cruft of their own. That stuff should go into libraries, not into standards.

You don't need to start from scratch and force everybody to adopt, W3C style. You can build the initial version as an abstraction layer on top of web technologies, so that only the VM implementors have to take care of hiding the mess, and application developers can build on top of that. With current improvements in JS VMs and things like WebGL this is slowly becoming feasible. If that is successful, you can implement a high performance version natively.

This is inevitable. We won't be using HTML+CSS+JS in 100 years, other than for archaeological purposes. The question is how soon will it happen.

chubot
HTML and company aren't the basis of all future computing, but they or their non-Turing complete descendants absolutely will exist in 100 years. They don't exist for lack of imagination; they exist for timeless and fundamental reasons.

Do you think plain text will exist in 100 years? If so, then it's not much of a stretch to say that HTML will. It astonishes me that people think that only code, and not data, will be transported over networks. That seems to be what you are claiming -- that it's preferable to transmit code rather than data in all circumstances?

The argument isn't symmetric because I fully believe that VMs are necessary for the web. I just don't agree with Kay that the designers of the web are idiots (he really says this) because they didn't start with a VM. VMs will come and go as hardware and devices and circumstances change. Data has much more longevity; it encodes fewer assumptions.

Assumptions that are no longer true are generally the reason a technology dies. It is pretty easy to imagine a VM (unknowingly or not) encoding preferences for keyboard and mouse input; that technology would have died with the advent of touch screens. Likewise, a VM that provides affordances for touch screens will likely be dated in 10 years when we're using some other paradigm.

HTTP and HTML are foundational to the web. They are basically the simplest thing that could possibly work. You can reimplement them in a few days using a high level language. They will be around for a LONG LONG time.

More complicated constructs like JS and VMs will have shorter lifetimes. I guarantee you that HTML will be around long after whatever comes after JavaScript, just like HTML will outlive Java Applets and Flash.

The layered architecture of the web is absolutely the right thing. Use the least powerful abstraction for the problem, which gives the client -- which necessarily has more knowledge -- more choices. You could distribute all text files as shell scripts, but there's no reason to, and a lot of reasons why you shouldn't.

jules
Sure some kind of markup construct will exist in 100 years. But not HTML. I do not believe plain text will exist in 100 years, but that's another discussion. I am claiming something far weaker than what you seem to think I'm claiming. I do not think each and every website will be written from scratch on top of the VM. I'm simply claiming that the distribution mechanism for new features of the web will change.

Currently the features are mandated by the W3C, then implemented by all browser vendors. The browser vendors send out updates to all users. Instead what will happen is that some future markup language that's better than the then-current HTML will be distributed as a library, running on top of the VM. This way if you are a site owner, you can immediately start using it. You do not need to wait until (1) the W3C recognizes that this markup language is a good thing and standardizes it, (2) the browser vendors have implemented it, and (3) your visitors have updated their browser. You simply include the library and start using the new markup language.

Note that this is already happening in the small. Javascript libraries like knockout.js are already changing the fundamental model of building web applications. Instead of waiting for the W3C to standardize some kind of web components with data binding, people implemented it as a library. 20 years ago people would have thought such a thing impossible. They would have thought that something like that surely has to be built in to the browser. As JS gets more powerful, more and more features can be implemented this way, instead of through standardization. Note that things are still flowing over the network in markup language (in this case, knockout.js template language). A similar thing happened with form elements. Remember the xforms standardization effort? Nobody cares anymore because JS libraries offer far better rich forms elements. The thing that changed is where it's implemented: ON the web, rather than IN the web. This organic model is far more in line with the principles of the internet, rather than the centralized way it's done with the web standards. Instead of giving somebody a fish, give him the tools to fish.

> HTTP and HTML are foundational to the web. They are basically the simplest thing that could possibly work. You can reimplement them in a few days using a high level language.

A few days?! This is absolute nonsense. You can't even read the spec in a few days, let alone all the specs it depends on, like PNG, JPG, etc. Maybe you can implement some tiny subset of HTML in a few days, but the whole thing is massively complicated. In comparison a VM is far far simpler.

100 years is a very long time. The web is 23 years old.

chubot
So either TBL is an idiot for designing HTML instead of a VM, or he's not and Alan Kay is an idiot for calling him such. Which is true? Maybe you are not defending Alan Kay's stance, but you haven't said that.

Plain text has existed for 50+ years; I'm sure it will exist in 100. I'm pretty surprised you don't think so. Actually Taleb's Antifragile talks about this exact fact -- things that have stood the test of time will tend to stick around. For example, shoes, chairs, and drinking glasses have been around for thousands of years; they likely will be around for thousands more. An iPad has maybe another decade. HTML has already stood the test of time, because it has gone through drastic evolution and remained intact.

Your knowledge of how web standards are developed isn't quite correct. The W3C didn't invent SSL, JavaScript, XMLHttpRequest, HTML5, or HTTP 2 (SPDY), to name a few. Browser vendors generally implement proprietary extensions, and then they are standardized after the fact.

I agree that the JS developments you list are interesting. JavaScript is certainly necessary for the web because it lets it evolve in unexpected directions. AJAX itself is a great example of that.

I'm talking about HTTP and HTML 1.0 -- they are conceptually dead simple, and both specs still use the exact same concepts and nothing more. I don't know if HTML 1.0 had forms -- if it did not then you could certainly implement HTTP + HTML in a couple days. I'm talking something like Lynx -- that is a valid web browser that people still use.
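To make the "dead simple" claim concrete, here is a minimal sketch (mine, not from the thread) of how little an HTTP/1.0 exchange involves: the request is a few lines of text, and the response is a status line, headers, a blank line, and the body. The hostname and canned response below are invented for illustration.

```python
def build_request(host: str, path: str = "/") -> bytes:
    """Serialize an HTTP/1.0 GET request: request line, headers, blank line."""
    return (f"GET {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"
            f"\r\n").encode("ascii")

def parse_response(raw: bytes):
    """Split a raw HTTP response into (status_code, headers, body)."""
    head, _, body = raw.partition(b"\r\n\r\n")
    lines = head.decode("iso-8859-1").split("\r\n")
    status_code = int(lines[0].split()[1])          # "HTTP/1.0 200 OK" -> 200
    headers = dict(line.split(": ", 1) for line in lines[1:])
    return status_code, headers, body

# A canned response, in place of actually hitting the network:
canned = (b"HTTP/1.0 200 OK\r\n"
          b"Content-Type: text/html\r\n"
          b"\r\n"
          b"<h1>hello</h1>")
status, headers, body = parse_response(canned)
print(status, headers["Content-Type"], body.decode())
```

Pipe `build_request(...)` into a TCP socket and you have a working client; the point is only how small the conceptual surface of HTTP/1.0 is compared to a full modern browser stack.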

Lynx can still view billions of web pages because the web degrades gracefully, because semantic information is conveyed at a high level. The problem with VMs is they don't degrade. Suppose everyone gets sick of CSS. You can throw out all your CSS tomorrow, and write MyStyleLanguage, but your HTML will still be useful. If you encode everything in a VM, then the whole page breaks. It's all or nothing.

An analogy is that HTML vs VMs is similar to compiler IR vs assembly language. You can't perform a lot of optimizations on raw assembly code. The information isn't there anymore; it's been "lowered" in to the details of machine code. Likewise the client is able to do interesting things with non-Turing languages, because it can understand them. Once it's lowered into a VM language, the client can't do anything but slavishly execute the instructions. The semantic info that enables flexibility is gone by that point.
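A rough illustration of this point (my sketch, not chubot's): because HTML is declarative, any client can recover structure from a page -- say, pull out the headings to build an outline -- without executing anything, whereas a bytecode blob that merely draws pixels offers no such handle. The sample page is made up.

```python
from html.parser import HTMLParser

class OutlineExtractor(HTMLParser):
    """Collect the text content of <h1>..<h3> tags from a page."""
    def __init__(self):
        super().__init__()
        self.in_heading = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self.in_heading = True

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self.in_heading = False

    def handle_data(self, data):
        if self.in_heading and data.strip():
            self.headings.append(data.strip())

page = "<h1>Intro</h1><p>body text</p><h2>Details</h2>"
ex = OutlineExtractor()
ex.feed(page)
print(ex.headings)  # → ['Intro', 'Details']
```

A search engine, a screen reader, and a text browser like Lynx all rely on exactly this kind of reinterpretation, which is only possible while the semantic information has not been "lowered" away.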

If you think markup will still exist in 100 years, then it's not too much more of a claim to say that markup will be transmitted by web servers and browsers. Do you agree with that? If that is true then TBL is not an idiot. Alan Kay's claim is basically ridiculous. A weaker version of it is still untrue.

I would say that in 100 years, HTML will still exist -- i.e. a thing that lets you separate plain text in to paragraphs, make it bold, etc. In contrast, we will have already gone through dozens of more VMs like the JVM, Flash, JS, Dart, etc. Certainly the JVM and Flash will be gone long before that point. They will have lived and died, but HTML will still be there.

jules
I'm not calling anybody an idiot, and Alan Kay isn't either. You are the only one calling people idiots here I'm afraid.

The fact that some things stood the test of time has no predictive value. It's just confirmation bias. There are plenty of things that existed for far more than 50 years that did not stand the test of time. The horse cart, the abacus, and the spear for example. I believe that as IDEs and other programming tools become more sophisticated, it starts to make more and more sense for them to work directly on abstract syntax trees rather than plain text strings. But this is another discussion.

I'm well aware that the W3C did not originally create the standards (including the original HTML), but it's where they evolve and accumulate cruft, and it's how we get stuck with them. Standardization can only ever accumulate complexity on the web. Things cannot easily, or at all, be removed. With an organic model, as soon as things fall out of use they disappear from consideration.

Perhaps you can implement HTTP and HTML 1.0 in a few days if you are a hero programmer. I'm not sure what your point is. We are living in 2013 not 1989.

Yes, the client can't do as much high-level optimization, and that's a good thing. That kind of optimization belongs at the library level, not at the client level (and certainly not duplicated several times for every different browser vendor, each with a slightly different and inconsistent implementation).

I agree that markup will be transmitted in 100 years, and I'm pretty sure so does Alan Kay. He, like I do, simply believes that the building blocks should be inverted. The thing that interprets the markup doesn't belong hard coded in the client. The client should simply be universal (i.e. Turing complete) and the thing that interprets and renders the markup should be built on top of that.

I agree that in 100 years we can still separate text into paragraphs and make things bold, but there are plenty of things that let you do that which are not HTML, and neither is the main point of HTML separating text into paragraphs and making things bold (certainly not in 2013). So this is no reason why HTML will exist in 100 years.

chubot
Sorry, you simply didn't watch the video then. He does this in the video linked, and also in Scaling and Computation (which are more than 10 years apart; it's something he deeply believes).

One quote is, "this is what happens when you let physicists play with computers" -- but that's not all. I am paraphrasing "idiot", but he certainly heaps scorn on them for not understanding something obvious and fundamental, when he is the one who doesn't understand something obvious and fundamental.

jules
A tongue in cheek remark and calling somebody an idiot are two very different things.
espeed
That line is in the "Programming and Scaling" talk: http://www.tele-task.de/archive/video/flash/14029/
Confusion
I either don't understand your point or I don't understand Kay's idea, because: javascript is Turing complete and runs in a VM/sandbox?
chubot
If you have JavaScript, "technically" HTML, CSS, XML, and JSON are unnecessary. You can write programs to print and layout text (that's essentially what a PDF file is), and programs can emit and compute data as well.

Kay specifically calls out HTML as a "step backward" for this reason (although he is simply wrong as I've said). He wants a web based entirely on executable code. Yes, it sounds dumb prima facie but that's what he has said consistently over a long period of time.

jgon
That exact quote is from a more recent talk of his, but he does mention the web and his current distaste for the design principles it embodies. I would encourage you to still watch the talk, as his criticism of the web is an incredibly small portion of the overall talk.

I wish that you had included more substance in your criticisms so I could have either agreed with you, disagreed, or explained my interpretation of his remarks. I will say that I think Kay explicitly agrees with the principle of least privilege (Rule of Least Power according to Wikipedia) and calls that out in his talk. I see no way in which the ever-enlarging HTML, CSS, and JavaScript standards embody what Berners-Lee was talking about, especially when they still don't offer nearly the functionality that an OS offers currently. You can tell this because there are really only 4 organizations capable of creating a full browser, and they happen to be mostly the largest tech corporations on the planet. If you need those kinds of resources to be able to implement the web today, how can this ever be considered "Least Power"? I suspect that in 10 years the W3C will have added so much more functionality to the various standards, in an attempt to reach some parity with native development, that browsers will collapse under their own weight as they try to implement what is basically an OS (this really became explicit when Chrome started using processes to "protect" tabs. Sound familiar?).

The web as it exists today is a huge pile of hodge-podge conflicting standards, none of which are remotely close to offering the level of development performance for apps rather than pages that you can get with any desktop toolkit. Alan is fully behind the "idea" of the web, but is taking the long-term, and I believe correct, view that this cannot continue as it is, and we would be well served by trying to do something about it by conscious effort rather than evolving half-heartedly in that direction.

dasil003
And yet no desktop toolkit has ever approached the ubiquity and deployability of the web. However much of a hodgepodge the web is, it can and will continue to improve incrementally, whereas a GUI toolkit, no matter how technically superior, can never cross the chasm to ubiquity, wring your hands though you might.
saintx
I am enjoying this part of the discussion. One of the things that struck me a few years ago was how the design of the web was so heavily oriented around trying to reproduce the print media industry in a browser. Thus, we get the "markup" of HTML, which provides layout at the expense of structure. Later on, we added CSS, which ideally should provide layout, but bad habits learned early on, or instilled by tools that came about in the '90s, still persist in many cases to this day.

There's no concept of "line 4 of chapter 3 of book X" in the web, at least not down at its core. You have to shim it in with anchor tags or something similar, and as a result we don't have ubiquitous browser apps that let me highlight a snippet of text in a book, write some notes in the margin (example: "this is a reference to Hamlet's soliloquy"), and then on a whim search for all of Noam Chomsky's (or whoever's) notes on Hamlet, or perhaps search for all public notes on this book in the context of Hamlet. The lack of structure means that everything's this haphazard soup of markup, with the focus being so much on how it looks that structure of the text and the relationship between parts gets lost. I'm not sure the "semantic web" will make up for that lack, but it's probably a good example of how the "next web will be built on the bones of this one" (to paraphrase someone else's comment in this thread).

jules
The web can, and will continue to accumulate stuff. You can't clean up a mess by accumulating more stuff.

There is no technical reason why a proper GUI framework couldn't be as ubiquitous as HTML+CSS+JS+SVG+etc, except that historically it didn't happen that way. It's definitely not inconceivable that somebody will build Kay's VM vision as an abstraction layer on top of web technologies, so that at least we no longer have to deal with the mess in each and every web application separately. If such a VM becomes popular enough we can implement said VM directly instead of on top of existing web technologies.

dasil003
> You can't clean up a mess by accumulating more stuff.

You can't clean up a mess by adding layers, but you can replace layers (eg. you can add web sockets and start driving real time apps that way). Therefore incremental improvement is possible essentially forever.

> There is no technical reason why a proper GUI framework couldn't be as ubiquitous as HTML+CSS+JS+SVG+etc, except that historically it didn't happen that way.

Agreed. However it's not just accidental either. The web is many things to many people. It's not like cross-platform GUI toolkits failed for lack of trying, but the fact is that they could not reproduce the advantages that the web had in the very beginning. Specifically, it was comically easy to create content and even a browser on any platform in those early days. Obviously over the next decade web tech was severely abused to shoehorn application functionality where it was never intended, and ends up being more complex in the long run. However the number and utility of the dead simple documents is absolutely massive. That, combined with a defacto standard for creating server apps without installation that is perfectly accessible to a layman (theoretically with any disability as well) was the wedge that GUI toolkits never came close to having.

Your idea about building a new abstract layer on top of existing web technologies and then reimplementing it natively is the most credible idea I've heard about how the web could be supplanted, and I certainly wouldn't bet against that happening over the coming decades. However I think the web should be a textbook example of how worse is better is not just a conciliatory platitude for disgruntled engineers, but as an actual requirement for certain classes of technology to gain critical mass.

DanBC
I used to think differently, but my time on HN has persuaded me that I'm wrong. I now agree with you.

I do miss the "Write it according to the specs and publish it, and anyone on any browser, any OS, any machine, any display type, can access it" attitude. Some people really do need to re-learn the universality of the www, instead of fencing off content inside apps. (Why do newspapers insist on really poor online versions, and mobile versions, and app versions?)

coldtea
>And yet no desktop toolkit has ever approached the ubiquity and deployability of the web.

Actually, there are more Windows installations than there are internet users.

(If you count mobile phones the numbers might skew a little, but the vast majority of those in third-world countries are not used for the web anyway.)

InclinedPlane
Perfect is the enemy of good.

If we'd waited for perfect we'd never have linux, we'd still be waiting on hurd. If we'd waited for perfect we'd never have the internet and the web, we'd still be waiting on xanadu.

jgon
Well I'm not wringing my hands, and I don't think you really have the ability to comment on my body language. Let's keep this debate focused solely on the contents of the talk.

The ubiquity and deployability of the web is not an inherent quality of the design of html, css or javascript. The ubiquity and deployability seem to me to be largely due to the internet which is one of the comments that Kay makes. People confuse the workings of the internet with the workings of the web. Right now we send documents that are interpreted, but there is nothing to say we can't send objects which are "interpreted" (or JITed more likely). But his point in the talk I posted was that the amount of accidental complexity that is piling up on the web as we speak (and which will only get worse and worse) will eventually collapse upon itself. You can already see the complexity of things like javascript and css increasing as the web attempts to offer applications, not just documents, and doing this by having a committee standardize high-level behaviour like layout seems to be a poor approach in the long run.

Most of the talk is focused on thinking in the long-term as well as working to decrease the complexity of the software edifices we are currently creating, before they become too large to improve in any revolutionary way.

Who knows, maybe the next web will be built on top of the web, the way the web was built on top of the internet. The point of his comments in the talk is that we should explicitly think about the design of what we are building before we rush off to pile up code.

dasil003
You'll have to forgive me not watching the talk yet. I'm not in a position to comment on Kay's entire vision.

My comment was aimed squarely at the refrain that we've been hearing from serious engineers for a long time about the utter unsuitability of the web for apps. I find this tiresome because technology doesn't win by being better, it wins by being adopted. Of course there's no technical reason we can't have a better basis than the web for internet apps. It's not about technical limitations, it's about the adoption curve. The perfect cross-platform GUI toolkit in 1992 would have still failed if it required you to write C++ to render your class schedule.

The fact that any person with almost no technical expertise can access an app from any computer they sit down at anywhere in the world without requiring installation is actually an amazing achievement that far outweighs the kludginess of the apparatus.

chubot
I don't think your response addresses the principle of least power. That it takes millions of lines of code to create a web browser is a separate issue.

The principle of least power basically means that you should convey meaning across the web using the least expressive language or abstraction possible. The reason is that you want to give the client more choices in how to interpret the data/content, rather than forcing it to conform.

http://www.w3.org/DesignIssues/Principles.html

Google, or any other search engine, would not be possible without the principle of least power. Mobile phones wouldn't have been able to use the web if the semantics were over-fitted for specific devices.
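A toy illustration of that tradeoff (my sketch, not from the thread): the same record shipped as data versus as a program. A consumer like a search engine can inspect the data form directly, while the program form would have to be executed to learn anything. The record contents are made up.

```python
import json

# The same blog post, shipped two ways.
as_data = '{"title": "Hello", "body": "First post"}'   # least power: plain data
as_code = "print('<h1>Hello</h1><p>First post</p>')"   # full power: a program

# The data form can be indexed without running anything:
record = json.loads(as_data)
index_entry = record["title"].lower()
print(index_entry)  # → hello

# The code form is opaque: short of executing it (eval/exec, with all the
# attendant malware risk), a consumer can't safely ask it for a title.
```

This is why the least-power argument and the search-engine argument are the same argument: indexing, accessibility, and device adaptation all depend on consumers being able to read the content rather than run it.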

I've heard what is essentially Kay's argument from other sources too. Some computer scientist at Adobe (may have even been one of the founders) lamented that the web wasn't built on PDF. PDF is a Turing-complete language that can describe any page layout.

Certainly after one has messed with CSS enough, you can see why he would say that (aside from the obvious favoritism). But it is a terrible idea, in that it overfits content for specific devices and is a vector for malware, etc. Once you are Turing complete, it's very hard to avoid these pitfalls.

Anyway, as I recall, Kay is making essentially the exact same argument as the Adobe guy, and the way he said it in that talk (Programming and Scaling, thanks below) displays astonishing ignorance for such a brilliant guy.

If Kay is right, then he should just create a web that is based on a VM. And then he can see how far it goes. It will never go anywhere because it will be broken by design.

That we are developing more VMs now for the web doesn't negate the point. The web should be layered; each kind of content should choose the least powerful channel to transmit it. If you need arbitrary code, then the least powerful abstraction is a Turing complete VM. But you don't need a Turing machine to display a blog post.

Single best is difficult, here are some favourites of mine:

"You and your research" by Richard Hamming:

http://www.youtube.com/watch?v=a1zDuOPkMSw

"How to design a good API and why it matters" by Joshua Bloch:

http://www.youtube.com/watch?v=aAb7hSCtvGw

Google TechTalk on Git by Linus Torvalds:

http://www.youtube.com/watch?v=4XpnKHJAok8

All talks ever given by Alan Kay, for example:

http://www.youtube.com/watch?v=oKg1hTOQXoY

stblack
Double thumbs-up to Google TechTalk on Git by Linus Torvalds.
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.