HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Lecture 1A | MIT 6.001 Structure and Interpretation, 1986

MIT OpenCourseWare · Youtube · 7 HN points · 46 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention MIT OpenCourseWare's video "Lecture 1A | MIT 6.001 Structure and Interpretation, 1986".
Youtube Summary
Overview and Introduction to Lisp
Despite the copyright notice on the screen, this course is now offered under a Creative Commons license: BY-NC-SA. Details at http://ocw.mit.edu/terms

Subtitles for this course are provided through the generous assistance of Henry Baker, Hoofar Pourzand, Heather Wood, Aleksejs Truhans, Steven Edwards, George Menhorn, and Mahendra Kumar.
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
For CS, it was SICP. I read it in high school (accompanied by the lecture series[0]) and it led to me learning about functional and logic programming, compilation, algorithms and data structures, and more. It was more than enough to prepare me for university, and even in my third year some topics have not been covered to the depth they are in SICP!

For math, it was Gödel, Escher, Bach. I was into mathematics in high school as well, but did not have a proper teacher at the time, so I was stuck trying to absorb some of the more technical undergraduate texts in abstract algebra. However, GEB was at the right level and had a nice layman's explanation of the second incompleteness theorem, along with beautiful expositions on philosophy, computer science, math, art, music and cognition. I enjoyed it so much that I even wrote a compiler to C for the programming language in the book.[1] The exposition on formal reasoning also got me started learning the Coq theorem prover, which I continue to use to this day.

[0] https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PLE18841CAB...

[1] https://github.com/siraben/meta-yacc

For perspective, SICP was written to accompany a course for MIT students. Students who not only had the background for MIT, but also access to its authors, MIT grad student teaching assistants, and soft resources like study groups. Students who were not usually working full-time jobs.

On top of that, it was written to challenge even the best of those students (including grad students).

SICP is really really hard for just about everyone...at least in parts...by design. That's what makes it a good book. You can keep going back to it for years and years and learning something as you gain relevant experience and knowledge.

The good news is nobody cares how long it takes you to learn from it, and anything you learn from it is valuable even if it is just a small part. You have decades to open it and read from time to time. There's no test in a few weeks and no grade in May.

It might be worth checking out the original Abelson and Sussman lectures on Youtube: https://www.youtube.com/watch?v=2Op3QLzMgSY

It is also worth considering How to Design Programs (https://htdp.org/) as a supplemental text, and moving into the Racket ecosystem.

Good luck.

The videos from a class the authors gave in 1986 are available online (https://m.youtube.com/watch?v=2Op3QLzMgSY). I highly recommend those, if reading the text is not your style.
tartoran
This is a gem!!
... probably that learning Smalltalk and Lisp would elevate my thinking and make it much easier to think about languages I've had to learn since.

Also, watch the SICP video series https://www.youtube.com/watch?v=2Op3QLzMgSY if you haven't yet read the book -- it's quite a gentle introduction.

Good luck!

The very first thing that Hal Abelson says, and puts up on the chalkboard, in his very first SICP talk (Lecture 1A, 6.001 'Structure and Interpretation') in 1986 is that "computer science is not about either science or computers". https://www.youtube.com/watch?v=2Op3QLzMgSY
Wow, what a MASSIVE missed opportunity to quote the best variation of this idea by far, delivered by Hal Abelson in the legendary "6.001 - Introduction to Computer Science" class at MIT in the 80s.

You can check it here: https://www.youtube.com/watch?v=2Op3QLzMgSY

Abelson, in the first minute, crosses out both "computer" AND "science", and, referencing the also-legendary SICP, says that "computer so-called science actually has a lot in common with magic".

Honestly, this omission alone makes the article feel that much emptier.

https://www.youtube.com/watch?v=2Op3QLzMgSY&t=15s
neonological
Good find. My post is voted down, but this authoritative source is literally saying the same exact thing.
Physics is not a subfield of math. It has entirely non-mathematical origins (see Aristotelian physics [1]).

Computer science was developed by mathematicians as a study of algorithms, procedures for computing, and methods of abstraction. In the words of Hal Abelson, computer science is not a science and it's not really about computers in the same sense that geometry is not about surveying instruments [2].

[1] https://en.wikipedia.org/wiki/Aristotelian_physics

[2] https://www.youtube.com/watch?v=2Op3QLzMgSY

Such a timeless quote. I believe this is the source:

https://youtu.be/2Op3QLzMgSY

While there, SICP is not a bad way to augment a CS degree :-)

"The natural language which has been effectively used for thinking about computation, for thousands of years, is mathematics."

I'm not sure if this is true. Harold Abelson draws a distinction[1] between mathematics being the study of truth and computing being the study of process. It seems to me that these really are different things, and that mathematics isn't the "natural language" to discuss computation, but rather truth and patterns. But of course process (computing) can only happen within the boundaries of mathematical truths and patterns.

[1] https://www.youtube.com/watch?v=2Op3QLzMgSY the first few minutes

justinmeiners
I believe this is a comment about the interests of the fields, akin to the differences between math and physics.

They still approach computation using mathematical reasoning methods. Note how they define car and cdr and how they approach problems in those videos.

I believe Abelson and Sussman use the kind of mathematical reasoning I am talking about in all their work. SICP being a prime example.

May 29, 2019 · 5 points, 1 comments · submitted by whack
HNLurker2
Will watch instead of reading and solving the problems. What am I missing?
Most of the languages are constantly changing, but the JavaScript world moves way faster than anybody else.

Although it might be over-hyped, I personally believe it's more of a good thing that moves the community, and even the industry, forward. React kind of brought functional and reactive patterns to the mainstream developer community.

In contrast, most of the back-end communities still struggle with data synchronization between microservices, while all modern front-end frameworks are able to update all of the view, or other read models, almost magically.

If anyone wants a more specific example: it's quite shocking to me that only very few back-end architectures know clearly how to design a whole large-scale architecture that pushes WebSocket notifications to a given user -- just like typical Google suite stuff. WebSocket has been around for years, yet so few people know how to develop and operate a stateful service... unbelievable.

I agree that shipping business functionality is the most important value in technology. However, I feel most of the back-end communities worry too much about the problems they are encountering, and don't really care about the art of programming itself. Yet advanced architectures do require language- or platform-level constructs: for example, immutable data structures are much easier to make concurrency-safe, and pure computations can be safely rescheduled on another core or machine, etc.

In the front-end world, I think, for example, React and MobX, which I'm using, take care of most of the accidental complexity of software development. In the back-end world, I can imagine something similar to a full-blown framework like Rails, but built on top of something like Orleans or Lasp, helping us get rid of all the accidental complexity so we can express just the domain itself. We're still too far from there.

A quote from SICP lecture: https://youtu.be/2Op3QLzMgSY?list=PLE18841CABEA24090&t=81

"And to the Egyptians who did that, geometry really was the use of surveying instruments. Now, the reason that we think computer science is about computers is pretty much the same reason that the Egyptians thought geometry was about surveying instruments."

Agreed. This is also why I adore the SICP lectures from the 1980s [1] (and the book in general).

It broke programming down to its most fundamental and important building blocks without all the baggage of machine/operating system/application/dependency/etc specific stuff.

[1] https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PL8FE88AA54...

Dec 01, 2018 · shakna on Go 2, here we come
If you've never met Scheme, then SICP [0, 1, 2] may be something that can change the way you program. It certainly made me better, or at least gave me a deeper understanding.

[0] https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PLE18841CAB...

[1] https://www.amazon.com/Structure-Interpretation-Computer-Pro...

[2] https://web.mit.edu/alexmv/6.037/sicp.pdf

For anyone who wants to feel what this might have felt like (and you've yet to watch them), MIT posted videos of Abelson and Sussman presenting the course on YouTube [0]. I felt a similar sense of magic the first time I saw the derivative section.

[0] https://www.youtube.com/watch?v=2Op3QLzMgSY

Although not strictly a talk, I would highly recommend the first of the SICP lectures[0], if only to expand your thinking about what computer science is:

"I'd like to welcome you to this course on Computer Science. Actually that's a terrible way to start. Computer science is a terrible name for this business. First of all, it's not a science. It might be engineering or it might be art. We'll actually see that computer so-called science actually has a lot in common with magic. We will see that in this course. So it's not a science. It's also not really very much about computers. And it's not about computers in the same sense that physics is not really about particle accelerators. And biology is not really about microscopes and petri dishes. And it's not about computers in the same sense that geometry is not really about using surveying instruments."

[0]: https://www.youtube.com/watch?v=2Op3QLzMgSY

mlevental
Can someone please convince me that the use of the word "magic" is justified? Yes, I know SICP is universally exalted in the CS/programming community, and even though I haven't read it yet, I support the principle of a principled approach to computation (it's math, after all, in the purest sense). But I can't support the infantilism of words like "magic" and the wizard on the cover of the book, because the two themes are directly in opposition. There is no magic, and clear, rigorous analysis of programs is very fruitful.
enkiv2
Crowley defines magick as the execution of Will upon the world. Programming is actually a better fit for this particular definition than most of the western-occult-tradition ritual stuff Crowley himself was doing (including the enochian stuff).

Somebody with only a shallow pop-culture understanding of occult tradition is bound to associate it with fuzzy thinking & children's media. The use of alchemical metaphors in SICP indicates that the authors have at least some familiarity with the history of magick, though.

The most important figures in the western occult tradition were mathematicians (like Dee) or invented early computational or permutational devices (like Llull). Magick is very firmly bound up in this kind of mathematical thinking. On the other side, the mathematical foundations of computing come out of mathematicians who had occult justifications: Gödel (and Cantor before him) was a mathematical platonist who dabbled in gematria, and his work on computability was part of the ars magna for him. (In case you're not up on the history, Turing's work on universal Turing machines was an attempt to rephrase Gödel's incompleteness in a way that was accessible to non-mathematicians, and his later work with Church proving the equivalence between Gödel's model and lambda calculus was built upon this work. While computing machines predated this formal basis, the formal basis is pretty important -- we all learned it in college, after all!)

gmiller123456
Maybe it's just me (I haven't actually read the book), but I don't see the guy on the cover as a wizard. He's holding a pair of dividers, which I associate much more with a craftsman or the study of geometry. I just assumed the clothing was a style I didn't understand.
Lordarminius
You're right, it occurred to me as well. He looks more like a 14th/15th-century European scholar than a necromancer.
enkiv2
He's an alchemist. The original version of the illustration says solve/coagula rather than eval/apply.
brilee
There's an aphorism: "objects are a poor man's closure; closures are a poor man's object". When I got to the point in SICP where I understood that, the word "magic" was very definitely justified :)
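The aphorism is concrete enough to sketch in a few lines of Python (an illustration only; the names `make_counter` and `Counter` are mine, not from SICP):

```python
def make_counter():
    """A closure acting as a poor man's object: the closed-over
    variable plays the role of an instance field."""
    count = 0
    def increment():
        nonlocal count
        count += 1
        return count
    return increment

class Counter:
    """An object acting as a poor man's closure: the instance
    field plays the role of a closed-over variable."""
    def __init__(self):
        self.count = 0
    def increment(self):
        self.count += 1
        return self.count

# Both expose exactly the same behaviour to a caller:
closure_counter = make_counter()
object_counter = Counter()
print(closure_counter(), closure_counter())                    # 1 2
print(object_counter.increment(), object_counter.increment())  # 1 2
```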
yesenadam
even though I haven't read it yet

Maybe there's your answer.

lcuff
Watching the SICP lectures, I had a moment of "I totally did not see that coming", with triple exclamation marks. The ability to do astonishing things with so little code justifies the word magic to me. Also, magic as sleight-of-hand is not infantile to me, either: it's ingenuity, and practice, practice, practice, in order to astonish the audience.
The Mother Of All Demos is an absolute classic: https://youtu.be/yJDv-zdhzMY https://en.m.wikipedia.org/wiki/The_Mother_of_All_Demos

I also very much enjoy Feynman’s talks. Here’s one on imagining physics: https://youtu.be/4zZbX_9ru9U

For computer science in particular, I highly recommend the first lecture in the SICP series, especially on the naming of so called “computer science”: https://www.youtube.com/watch?v=2Op3QLzMgSY

Along those lines, the accompanying MIT class (MIT 6.001 Structure and Interpretation, 1986) is available on YouTube[0] and MIT OCW[1]. It's worth watching, if only for the following:

"I'd like to welcome you to this course on Computer Science. Actually that's a terrible way to start. Computer science is a terrible name for this business. First of all, it's not a science. It might be engineering or it might be art. We'll actually see that computer so-called science actually has a lot in common with magic. We will see that in this course. So it's not a science. It's also not really very much about computers. And it's not about computers in the same sense that physics is not really about particle accelerators. And biology is not really about microscopes and petri dishes. And it's not about computers in the same sense that geometry is not really about using a surveying instruments."

[0]: https://www.youtube.com/watch?v=2Op3QLzMgSY

[1]: https://ocw.mit.edu/courses/electrical-engineering-and-compu...

If you'd like to go further along that path, I'd really recommend trying to learn a Lisp. How to Design Programs[0] is a good, gentle introduction if you're new to the ideas of Lisp or functional programming, and SICP[1][2] is the old classic.

[0] http://www.ccs.neu.edu/home/matthias/HtDP2e/index.html

[1] http://web.mit.edu/alexmv/6.037/sicp.pdf

[2] https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PLE18841CAB...

dorfsmay
Which one though?

I have looked at LISPs, but it's difficult to pick one to invest time in. Because I am so restricted on time, I have to stick to languages which are used in the industry (so my fun has real-life ROI), which I think boils down to CL, which is often touted as old-fashioned and whose market seems to shrink slowly, and Clojure, which means the JVM, mainly Java libraries, and high memory requirements.

For pure fun and practicality, Racket does seem up on the "list", but not used at all in the industry. A lot of languages (Python, Rust) are adopting the good parts of lisp (except for the minimal syntax, sadly), and I'm getting better mileage from spending time learning and becoming better at those than starting from scratch in a language that I won't be able to use during work hours.

opnitro
So, I really like How To Design Programs, because the thing it got across for me is how to appropriately think in a functional way. It walks you through building data structures and full programs in a really nice way. I've now been going through SICP, and nothing has seemed out of place, as the notation is almost identical, and the few places where it isn't are not hard to translate.
dorfsmay
My question was more about which program language, which Lisp to choose in 2018?
opnitro
Oh, so How To Design Programs uses "teaching" languages, which are sub-languages of Racket. Going from that book to learning the full features of Racket has been fun.
Reminds me of the opening lines of the SICP lectures[0]:

"I'd like to welcome you to this course on Computer Science. Actually that's a terrible way to start. Computer science is a terrible name for this business. First of all, it's not a science. It might be engineering or it might be art. We'll actually see that computer so-called science actually has a lot in common with magic. We will see that in this course. So it's not a science. It's also not really very much about computers. And it's not about computers in the same sense that physics is not really about particle accelerators. And biology is not really about microscopes and petri dishes. And it's not about computers in the same sense that geometry is not really about using surveying instruments."

[0]: https://www.youtube.com/watch?v=2Op3QLzMgSY

crimsonalucard
It's really more math than anything: building concepts from atomic axioms.
jonny_eh
I like to think of it as dynamic math. Calculus is also a form of "math over time", but is more about a single equation to describe that change over time. Computer science is more like "math with discrete logical steps over time". It allows you to answer questions like "which algorithm can sort a given array in the fewest steps"?
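That kind of question can be made concrete by instrumenting two sorting algorithms and counting their comparison steps (a toy Python sketch; the function names are mine):

```python
def bubble_sort_steps(xs):
    """Sort a copy of xs with bubble sort, counting comparisons."""
    xs, steps = list(xs), 0
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            steps += 1
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs, steps

def merge_sort_steps(xs):
    """Sort a copy of xs with merge sort, counting comparisons."""
    if len(xs) <= 1:
        return list(xs), 0
    mid = len(xs) // 2
    left, ls = merge_sort_steps(xs[:mid])
    right, rs = merge_sort_steps(xs[mid:])
    merged, steps, i, j = [], ls + rs, 0, 0
    while i < len(left) and j < len(right):
        steps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, steps

data = [5, 3, 8, 1, 9, 2, 7, 4, 6, 0]
print(bubble_sort_steps(data)[1])  # 45: bubble sort always does n(n-1)/2 comparisons
print(merge_sort_steps(data)[1])   # strictly fewer on this input
```

On any 10-element input this bubble sort makes 45 comparisons, while merge sort stays at roughly n log n, which is the "fewest steps" flavour of question the comment is gesturing at.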
crimsonalucard
Math is building formal logic systems from a small set of axioms. Computer science is essentially an aspect of this.
First 10 mins of the first Structure and Interpretation lecture (by Harold Abelson) [1]

Famous first words ...

"I'd like to welcome you to this course on computer science. Actually, that's a terrible way to start. Computer science is a terrible name for this business."

... and this is where it all clicked to me ....

"Well, similarly, I think in the future people will look back and say, yes, those primitives in the 20th century were fiddling around with these gadgets called computers, but really what they were doing is starting to learn how to formalize intuitions about process"

[1] https://www.youtube.com/watch?v=2Op3QLzMgSY

The entire series of 20 lectures (uploaded by MIT) is available here: https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PLE18841CAB...
The quote is from Hal Abelson, and one of his lectures at MIT disagrees with you [1]. The introduction is amusing and opens with the claim that computer science has nothing to do with computers and little to do with science -- it's really a poorly named field. I believe it's a bit tongue-in-cheek, and not defining science entirely by "uses the scientific method".

[0] https://en.wikiquote.org/wiki/Hal_Abelson

[1] https://www.youtube.com/watch?v=2Op3QLzMgSY

> Interesting. Maybe we should compare notes sometimes. I have a bit of a story with finding alternative-type stuff.

I really like the theory of programming, and am constantly looking for languages that are

a) easier to write

b) amazingly expressive

Its always finding what's out there.

> - I've been contemplating tackling Python, in spite of the 2.x/3.x thing. There's the perfect vacuum where versioning shouldn't be a problem, and the practical real world where knowing Python and the differences between the two will let me actually do interesting things.

Thankfully, that's mostly gone now. Few enterprises still holding out, but most of the big libraries are 2/3 or 3 only.

And 3 is a lot more thought-out. Most of the warts are gone.

> - I completely forgot to mention Awk (!) mostly because I have no idea at what point I began to pick it up, but I did get a basic idea of how it works a while ago, and I've been factoring it into shell pipelines more frequently lately. It's great for when I need to do something that sed can't express well (and I just properly read about how match() a) returns the index and b) also works apropos to grep recently, which is great)

Awk is different, but great.

I had a boss at a previous job who'd used it for at least a decade for everything, so I had to learn the nitty-gritty.

I ended up building a CGI application that basically ran the entire website in Awk. Big learning experience, but rather cool.

> (As an aside, Lua is ~200K and awk is ~632K. That is just wrong. [Mumbling about rewrite])

Mmm. That does hurt.

But, at least Lua is an amazing language too, if a bit odd at times.

> I'm reasonably familiar with the "not wanting to count to 5" thing... heh. Interesting about the Scheme thing in that context. I'll definitely have to give it a look.

Scheme is stupid simple. Which is awesome.

Everything looks like:

( function arg arg arg )

So, Hello, World!

(display "Hello, World!")

Also, if a function ends with !, then it modifies state, and isn't really functional.

(set! varname value)

And finally, Scheme's macros are hygienic (don't interfere with variables that exist), and fairly simple:

    (define-syntax swap!
      (syntax-rules ()
        ((swap! x y)
         (let ([tmp x])
           (set! x y)
           (set! y tmp)))))
After working a little while with it, those parenthesis won't be as scary, and you probably won't even notice them.

SICP also has lectures on Youtube [0], if you want a more formal way to get to grips with Scheme.

[0] https://www.youtube.com/watch?v=2Op3QLzMgSY

i336_
>> I've been contemplating tackling Python, in spite of the 2.x/3.x thing. (...)

> Thankfully, that's mostly gone now. Few enterprises still holding out, but most of the big libraries are 2/3 or 3 only.

Oh okay. I think I got mildly tangled by https://news.ycombinator.com/item?id=13019819 a little while back.

> Awk is different, but great.

> I had a boss at a previous job who'd used it for at least a decade for everything, so I had to learn the nitty-gritty.

> I ended up building a CGI application that basically ran the entire website in Awk. Big learning experience, but rather cool.

Oh nice. That's kind of awesome.

(I was referring to gawk when I mentioned the 632K thing.)

> But, at least Lua is an amazing language too, if a bit odd at times.

I seriously don't understand why they won't allow ++ and -- and similar conveniences we've grown to almost expect from other languages...

(info about (Scheme))

This syntax looks fairly familiar (Lisp syntax is after all incredibly reductionist) and I remember ! from Ruby.

> After working a little while with it, those parenthesis won't be as scary, and you probably won't even notice them.

Haha, I remember someone telling me that about {} in C-like languages (PHP) a decade ago. I still notice them... but I know where to use them and what they're for now.

I don't think my brain will sprout a number-of-closing-parens-needed checker anytime soon; https://en.wikipedia.org/wiki/Subitizing is generally established to be mentally impossible for >4 items, so... (putting (everything) in parens) is fun, but s)t))ac))))ki))n)g) the closing ones onto single lines is... well all my brain's coming up with is a slightly incredulous blank stare.

--

By the way: I'm not quite sure how relevant this is to wherever you are (which I haven't been able to concretely ascertain), but I'm in Sydney. Curious to learn how far away I am from you, perhaps via email.

(I may have poked through some of your tales... and now I have to deal with a brain that's scrambling around trying to find a signup button so I can trail along with you sometime. =P)

PS. Just hit HN's post ratelimit for the first time, which was why this took a while to appear. Sorry!

shakna
> which I haven't been able to concretely ascertain

I do have fun befuddling geo stuff online. But these days, somewhere around Melbourne.

> I may have poked through some of your tales...

There's a website, with RSS in my HN account. Have fun!

i336_
>> which I haven't been able to concretely ascertain

> I do have fun befuddling geo stuff online. But these days, somewhere around Melbourne.

Ah, cool.

>> I may have poked through some of your tales...

> There's a website, with RSS in my HN account. Have fun!

Will definitely be reading through all of them ^^

Thanks very much for explaining what you did. I really appreciated that - I've never been able to compare languages by "easiest to understand when not mentally having a good day" before, which was great.

(NB. It's currently tricky for me to meet people IRL, so I'm always on the lookout when I say hi to people in the same country :) No worries!)

i336_
Oh, also - by "alternative-type stuff" I was referring to alternative health. But I also like alternative/left-of-center technologies too :D
Reminds me of the MIT SICP lecture videos from the 80s. The concepts of black-box abstraction and the simplicity of using LISP like Lego building blocks blew my mind and made me switch from being a UX designer dabbling in Rails to a full-blown programmer.

It is still entirely relevant today, even though it is a few decades old, as the fundamentals of computer science are still fundamental.

https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PL8FE88AA54...

Hearing that intro music still brings a smile to my face.

I just happen to be relearning math right now as I dive deeper into data science and this is perfect timing. Going to watch this series once I get through my math proofs book ("Book of Proof" by Richard Hammack which I recommend to people getting into math https://www.amazon.com/Book-Proof-Richard-Hammack/dp/0989472...).

theoh
The Book of Proof is freely available online: http://www.people.vcu.edu/~rhammack/BookOfProof/
I took 6.001 in 1999 (I think). Hal Abelson taught it along with a professor who took off his sweatshirt to reveal a Microsoft tee at his final lecture (he went to Microsoft).

What was great about Scheme (Lisp) is that most programs are basically words from your vocabulary, parentheses, and cars and cdrs. With procedural languages, there is always a sense that the language provides all the tools, and the magic happens somewhere under the hood. But with Scheme it feels as if it's happening right before you. It makes you feel like a wizard, not a monkey: it requires knowing all the spells, knowing how to make magic happen. Or not. Once you know it, none of it is magic. But that's the point!

Python is arguably easier and more practical for both science and work. It's already popular on the web server, and is used everywhere else -- unlike Scheme.

But the recent resurgence of functional programming is super exciting with Elixir and Elm and the like.

I'm looking forward to building my next project using Elixir.

Here is the classic course: https://ocw.mit.edu/courses/electrical-engineering-and-compu...

Original video lecture on YouTube (linked from above): https://www.youtube.com/watch?v=2Op3QLzMgSY

kough
Also, the current grad-level class in the same area (taught by Sussman) is online here: http://groups.csail.mit.edu/mac/users/gjs/6.945/

By far the most rewarding class I've taken at MIT. Check out the psets, they're pretty fun.

brians
Miller? He advocated turning off Athena in the 90s, because NT had obviously won. Linux must have been a Hell of a surprise.
taeric
My favorite thing, having finally taken a dive into Lisp and SICP, is not that it is very friendly to functional programming; it is that it is very friendly to showing how it all works.

My favorite section is where they go over making a constraint-based system that will either calculate degrees F or degrees C, or validate that the two given values are consistent, all depending on what you have entered. And this is done from the ground up; by and large, no hidden magic of the system has to support you. (Not strictly true, as it does make use of GC and such throughout.)
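SICP builds that example as a real constraint-propagation network, from primitive connectors up. As a much-simplified Python illustration of just the behaviour described (derive whichever value is missing, or validate a given pair) -- not the book's implementation:

```python
def fahrenheit_celsius(f=None, c=None):
    """Toy stand-in for SICP's F/C constraint example, based on
    the relation 9*C = 5*(F - 32). Given one value, derive the
    other; given both, report whether they satisfy the constraint."""
    if f is not None and c is not None:
        return abs(9 * c - 5 * (f - 32)) < 1e-9   # validate the pair
    if f is not None:
        return 5 * (f - 32) / 9                    # derive Celsius
    if c is not None:
        return 9 * c / 5 + 32                      # derive Fahrenheit
    raise ValueError("need at least one temperature")

print(fahrenheit_celsius(f=212))      # 100.0
print(fahrenheit_celsius(c=100))      # 212.0
print(fahrenheit_celsius(f=32, c=0))  # True
```

The book's version is far more interesting: the relation is expressed once, as a network of adder and multiplier constraints, and information flows in whichever direction the known values allow.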

If you hadn't seen it, https://www.infoq.com/presentations/We-Really-Dont-Know-How-... is a great video showing Sussman's style. He also showcases a program they wrote that could solve circuits based on this general idea.

mrcsparker
Thank you so much for linking this video. I had no idea it existed. Watching that lecture was so much fun.

I wish that there was a camera pointing to the crowd when he said the more controversial things.

I didn't know that Sussman hired Stallman. Their work changed our lives.

gavinpc
This is excerpted (as the "technology" segment) from another great video, Bret Victor's "The Humane Representation of Thought" (at about thirty minutes in).

https://vimeo.com/115154289

Mar 21, 2017 · keithnz on Beautiful Online SICP
you all with your fancy pants "reading". Nothing beats the movie, not only showing the height of computing power, but the awesome fashion of uber geeks.

https://www.youtube.com/watch?v=2Op3QLzMgSY

Jan 01, 2017 · shakna on Ask HN: Learning Lisp?
The lectures for SICP are also up on youtube: [0]

[0] https://www.youtube.com/watch?v=2Op3QLzMgSY

If you haven't watched the SICP video lectures [1], they are really great.

[1]: https://www.youtube.com/watch?v=2Op3QLzMgSY

It's simpler than it seems, and how strange people make it out to be put me off for a long time.

There are cons cells. A cons cell is basically an untyped tuple. The syntax for a cons cell containing a number looks like this: (1 nil) (where nil is the zero byte (I think, please correct me if I'm wrong)).

You can nest them, like this: (1 (2 (3 nil))).

Because nobody can be bothered to type those parens, the part of the compiler/interpreter called the Reader compiles syntax like this

(1 2 3)

to this

(1 (2 (3 nil))).

After being parsed by "the reader" (you could also call it "parser"), those nested tuples/cons cells are sent to the eval function. The eval function takes the first element of the list (aka `car` in lisp-speak; my mnemonic is that "a" is to the left on the keyboard, and therefore means the first element). That first element is regarded as the function, and the second element in the cons cell is the argument to the function.

The function to get the second item in the cons cell is `cdr`. I remember that by having `d` be to the right on my keyboard. (car 1 2) => 1. (cdr 1 2) => 2. (cdr 1 (2 (3 nil))) => (2 (3 nil)). That is how things are nested.
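For what it's worth, the pair structure described here is easy to model in Python, with a two-element tuple standing in for a cons cell (an illustration only, not how any real Lisp stores pairs):

```python
def cons(a, d):
    """Build a pair; a is the car, d is the cdr."""
    return (a, d)

def car(p):
    return p[0]   # first element of the pair

def cdr(p):
    return p[1]   # rest of the pair

# The list (1 2 3) as nested pairs, with None standing in for nil:
lst = cons(1, cons(2, cons(3, None)))
print(car(lst))       # 1
print(car(cdr(lst)))  # 2
```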

I digress. The eval function evaluates every symbol (aka atom) in these nested cons cells, and looks up their corresponding meanings in a big table of defined symbols. So the + symbol might map to an addition function, and the string-join symbol might map to a function for joining strings.

What is returned from eval is, you could say, the true AST, which still looks pretty darn similar to the lisp syntax. The function is then sent to the apply function, which applies the argument to the function. Remember that the argument can be a nested set of cons cells, so it can nest infinitely.

That's it! I'm no lisp expert, so if someone is, please correct me. But that's the gist of it. Syntax is read by the read function, the symbols are interpreted (looked up in a big key-value table) by the eval function, and these functions get apply-ed with their arguments. Very few primitives are needed to build a system from that. Here is a list of the primitives needed - these can be implemented in binary, C, assembly, Java, Javascript or whatever: http://stackoverflow.com/a/3482883.

BONUS SYNTAX:

If you want eval to skip over a cons cell (and, of course, its children), you can prepend it with a `'` symbol. So (cdr 1 '(this is never going to be evaled, but is just a list)) => (this is never going to be evaled, but is just a list)

This is called a 'quote'. Any Javascript array basically works like a quoted list. If the first item in a Javascript array were a function, and you applied the cdr of that array as the argument to that function, it would be like using lisp with a non-quoted list.
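The whole read -> eval -> apply story, quote included, fits in a few lines. Here is a hedged Python sketch: Python lists stand in for cons chains, strings for symbols, and a dict for the big table of defined symbols. All the names and modelling choices are mine, not any real Lisp's.

```python
from functools import reduce

# The "big table of defined symbols": symbol name -> function.
ENV = {
    "+": lambda *args: sum(args),
    "*": lambda *args: reduce(lambda x, y: x * y, args, 1),
}

def lisp_eval(expr, env=ENV):
    if isinstance(expr, str):                      # a symbol: look it up
        return env[expr]
    if not isinstance(expr, list):                 # a number: self-evaluating
        return expr
    if expr[0] == "quote":                         # quote: skip evaluation
        return expr[1]
    fn = lisp_eval(expr[0], env)                   # car: the function position
    args = [lisp_eval(e, env) for e in expr[1:]]   # cdr: the arguments
    return lisp_apply(fn, args)

def lisp_apply(fn, args):
    return fn(*args)

print(lisp_eval(["+", 1, ["*", 2, 3]]))    # 7
print(lisp_eval(["quote", ["+", 1, 2]]))   # ['+', 1, 2]  (unevaluated)
```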

BONUS VIDEO:

This is just lovely. I'm not American, and do not have a CS degree, so attending a place like this feels like a distant dream. But I'm happy to be able to enjoy it over the interwebs. Incredibly thankful to the giants of computing on whose shoulders we can stand: https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PLF4E3E1B72...

dragonwriter
A note on notation:

The standard notation for a cons cell with car "x" and cdr "y" is (x . y), not (x y). The latter is a list, the equivalent of (x . (y . nil)).

So (1 2 3) is equivalent to (1 . (2 . (3 . nil))), not (1 (2 (3 nil))). The latter is a nested list, which is equivalent to (1 . ((2 . ((3 . (nil . nil)) . nil)) . nil))
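Keeping the tuple-as-cons-cell convention from above, a small Python printer makes the dotted-pair notation concrete (a sketch; the function name is mine):

```python
def show(cell):
    """Render a cons structure in dotted-pair notation."""
    if cell is None:
        return "nil"
    if not isinstance(cell, tuple):
        return str(cell)
    return f"({show(cell[0])} . {show(cell[1])})"

# The list (1 2 3) as nested cells:
print(show((1, (2, (3, None)))))   # (1 . (2 . (3 . nil)))
```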

yjgyhj
Good catch, thank you.

To anyone who wants to play around with lisp, I recommend downloading the emacs text editor. Just open it and navigate around with the arrow keys. Type an s-expression, and place the point after the ")". Then type Ctrl-x Ctrl-e, and the expression will be sent through read -> eval -> apply. Fun to just play around with.

(+ 1 2)_ ;; _ means the cursor, ";" means a comment

yjgyhj
BONUS FUN:

If you want to play around with lisp interactively, I recommend checking out the program Emacs. It's a little lisp interpreter written in C that comes with a text editor and such. Download it and run it. When you write some lisp code into the editor, place your cursor after the expression and hit Ctrl-x Ctrl-e. That will read, eval, and apply the expression, and display the result at the bottom of the window.

Here is a screenshot of me doing that with this code (apply (quote +) (quote (1 2))) ;; http://i.imgur.com/SrhMxS8.png

Play around with that - it's fun enough for an evening.

Some other fun code to run is this

    (reduce '+ '(1 2 3 4))

    (defun say-hello (&optional name) ;; a wild lambda appeared!
      (if (stringp name)
          (concat "hello " name)
        "I'm a lonely program"))
    (say-hello "Martin")
    (say-hello)
Watch Hal Abelson explain this on video, right at the start of the first 1986 SICP lecture: https://www.youtube.com/watch?v=2Op3QLzMgSY
SICP is the best programming course, hands down. https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PLB63C06FAF...
ChicagoBoy11
Dat music, tho.
>"Computer science isn't a science and it isn't about computers"

Earliest use of this quote that I know of is from Abelson in his 1986 lecture teaching SICP: https://youtu.be/2Op3QLzMgSY?t=24

amelius
Hence, we should call it "Information Technology" instead.
> Note that you can be a programmer without ever having touched a computer, people have been coming up with algorithms and integrating them into systems for ages.

Slightly different, but the first episode of SICP videos helped me to reach the same conclusion:

https://www.youtube.com/watch?v=2Op3QLzMgSY

We could even say Newton or Euclid wrote programs. Maybe not by sitting in front of a computer, but they described a process to calculate something.

dalke
"... wrote programs". I think it's better to say they wrote algorithms. A program is an algorithm instantiated for use by a machine, which might be analog or digital.

I can accept the argument that a human following an algorithm emulates a machine, but I consider it an imperfect analogy.

Big programs are made by writing a lot of small programs. Any other way fails.

This is about the only result of software engineering.

I would advise more small programs, but perhaps with the perspective of gathering them into bigger programs: SICP

https://mitpress.mit.edu/sicp/

also comes in video:

https://www.youtube.com/watch?v=2Op3QLzMgSY&list=PLB63C06FAF...

http://htdp.org/

Jul 20, 2014 · r-s on Interactive SICP
One set of the original lectures is freely available via MIT's OpenCourseWare youtube channel: https://www.youtube.com/watch?v=2Op3QLzMgSY

It would be quite difficult to beat these lectures; while they appear a bit dated, the content is great.

SICP is a great way to learn about programming, although I am very surprised they got through the whole book in an introductory course. Chapters 4 and 5 cover some topics I was barely aware of nearing the end of my formal CS education.

atratus
I had no idea these existed, you just made my week
ikura
The 4-part FM synth theme tunes really make it special!
I think this overabstraction is largely a result of abstraction being excessively glorified (mostly) by academics and formal CS curricula. In some ways, it's similar to the OOP overuse that has thankfully decreased somewhat recently but was extremely prevalent throughout the 90s. In software engineering, we're constantly subjected to messages like: Abstraction is good. Abstraction is powerful. Abstraction is the way to solve problems. More abstraction is better. Even in the famously acclaimed SICP lecture series [1] there is this quote:

"So in that sense computer science is like an abstract form of engineering. It's the kind of engineering where you ignore the constraints that are imposed by reality."

There is an implication that we should be building more complex software just because we can, since that is somehow "better". Efficiency is only thought of in strictly algorithmic terms, constants are ignored, and we're almost taught that thinking about efficiency should be discouraged unless absolutely necessary because it's "premature optimisation". The (rapidly coming to an end) exponential growth of hardware power made this attitude acceptable, and lower-level knowledge of hardware (or just simple things like binary/bit fields) is undervalued "because we have these layers of abstraction" - often leading to adding another layer on top just to reinvent things that could be easily accomplished at a lower level.

The fact that many of those in the demoscene produce amazing results yet have never formally studied computer science leads me to believe that there's a certain amount of indoctrination happening, and I think to reverse this there will need to be some very massive changes within CS education. The demoscene is all about creative, pragmatic ways to solve problems by making the most of available resources, and that often leads to very simple and elegant solutions, which is something that should definitely be encouraged more in mainstream software engineering. Instead the latter seems more interested in building large, absurdly complex, baroque architectures to solve simple problems. Maybe the "every byte and clock cycle counts" attitude might not be ideal for all problems either, but not thinking at all about the amount of resources really needed to do something is worse.

> how much layers is too much layers?

Any more than is strictly necessary to perform the given task.

[1] http://www.youtube.com/watch?v=2Op3QLzMgSY#t=10m28s

lloeki
The period up to the ~'60s gave us a vast theoretical foundation, and from then on we toyed with it, endlessly rediscovering it (worst case) or slightly prodding forward (best case), trying to turn this body of knowledge into something useful while accreting it into platforms of code, copper and silicon. My hope is that the next step will eventually be for some of us to stop our prototyping, think about what matters, and build stuff this time: not as a hyperactive yet legacy-addicted child, but as a grown-up, forward-thinking body that understands it's not just about a funny toy or a monolithic throwaway tool that will end up lasting decades, but a field that has a purpose and a responsibility.

To correct the quote:

Computer science is not an abstract form of engineering. Software (and hardware in the case it's made to run software) engineering is leveraging CS in the context of constraints imposed by reality.

> Any more than is strictly necessary to perform the given task.

Easy to say, but hard to define up front when 'task' is an OS + applications + browser + the hardware that supports it ;-)

This[0] is the typical scenario I'm hoping we would build a habit of doing.

[0]: http://www.folklore.org/StoryView.py?story=Negative_2000_Lin...

chillingeffect
> abstraction being excessively glorified (mostly) by academics and formal CS curricula.

It's not just academics, it's many developers, too.

We're in an old-school thread. We like what's really going on. Hang out in the Web Starter Kit from last night though, and you'll find tons of people who glorify abstraction.

The reality is that competing forces spread out the batter in different directions: the abstractionists write Java-like stuff. The old-schoolers exploit subtle non-linearities.

Actual commercial shipments rely on a complex "sandwich" of these opposed practices.

> Demoscene is all about creative, pragmatic ways to solve problems

Yes, and I grew up with the demoscene (C64 and Amiga 500); it's also about magic, misdirection, being isolated for long winters, and celebrating a peculiar set of values. Focus is shifted toward things that technologists know are possible, such as tight loops running a single algorithm that connects audio or video with pre-rendered data, not on what people want or need, such as CAD software or running mailing lists. Flexibility, integration and portability are eschewed in favor of performance.

Don't get me wrong, I LOVE the demoscene - it's the path that got me to love music. And I have near-total apathy for functional programming. I only code in Javascript when weapons are pointed at my heart, but with the proper balance, there are some very real reasons to make use of abstraction. It's not just academics, it's people solving real problems. The trick is to act strategically with respect to the question: which parts will you optimize and which parts will you offload to inefficient frameworks?

Scuds
> I think to reverse this there will need to be some very massive changes within CS education.

For instance, starting it elementary school. A surprisingly large amount of the mathematical portion of CS has very little in the way of prerequisites.

wiz21
Having been in the demoscene (Imphobia) for a long time, and having also worked on more abstract stuff (quad-tree construction optimizations), I can say that writing a demo is not the same as computing theory. Writing a demo is most often about exploiting a very narrow area of a given technology to produce a seductive effect (more often than not, to fake something thought impossible so that it looks possible). So you're basically constraining the problem to fit your solution.

On the other hand, designing pure algorithms is about figuring out a solution for a given, canonical, and often unforgiving problem (quicksort, graph colouring?). To me, this is much harder. It involves roughly the same amount of creativity, but somehow it's harder on your brain: no, you can't cheat; no, you can't linearize n² that easily :-)

To take an example: you can make "convincing" 3D on a C64 in a demo because you can cheat, precalculate, and optimize in various ways for a given 3D scene. Now, if you want the same level of 3D but for a video game, where the user can look at your scene from unplanned points of view, then you need more flexible algorithms such as BSP trees. So you end up working at the algorithm/abstract level...

A very good middle ground here was Quake's 3D engine. It used a BSP engine optimized with regular techniques (including the very smart idea of potentially visible sets), but it also used techniques found in demos (M. Abrash's work on optimizing texture mapping is a nice "cheat" -- and super clever).

Now don't get me wrong, academia is not more impressive than the demoscene (though certainly a bit more "useful" for society as a whole). These are just two different problems, and there are bright minds making super impressive stuff in both of them...

resu_nimda
> I think to reverse this there will need to be some very massive changes within CS education.

Well, I mean, that is most definitely true regardless. But, in my experience getting my BS in CS a few years ago, it had nothing to do with "mainstream software engineering" either. I had classes on formal logic and automata, algorithms (using CLRS), programming language principles (where we compared the paradigms of Java, Lisp, Prolog, and others), microprocessor design (ASM, Verilog, VHDL), compilers, linear algebra, and so on. Very little in the way of architecting and implementing large, abstracted, real-world business applications or anything remotely web-related. In my experience I did not meet anyone interested in glorifying heaps of whiz-bang abstraction; they seemed to be more in line with the stereotypical "stubbornly resisting all change and new development" camp of academics.

al2o3cr
"Demoscene is all about creative, pragmatic ways to solve problems by making the most of available resources"

It probably doesn't hurt that nobody expects a demo scene app to adapt to radical changes in requirements, or to interoperate with other things that are changing as well - for that matter, to even conform to any specific requirements other than "being epic".

For instance, the linked 8088 demo encodes video in a format that's tightly coupled to both available CPU cycles and available memory bandwidth. Its goal is "display something at 24fps".

Not that I'm a fan of abstraction-for-its-own-sake, but putting scare-quotes around real problems like premature optimization is an excessive counter-reaction.

Apr 15, 2013 · felipebueno on Lisp Is Too Powerful
Here is a nice 1986 MIT lecture on Lisp given by... Harrison Ford? :) - http://www.youtube.com/watch?v=2Op3QLzMgSY
Heh, that's interesting - I just watched 106A yesterday to get an idea of how the programming introduction course compares to my university's (utwente.nl).

I actually was a bit disappointed to see that it is very much like our introduction course, except that this teacher is a bit more enthusiastic, and perhaps the course starts at an even more basic level.

I think for real CS enthusiasts, 6.001 at MIT is much more interesting (and fun) to watch: http://www.youtube.com/watch?v=2Op3QLzMgSY

(it teaches scheme and the lecturer has an awesome way of explaining it)

AustinGibbons
While I can't comment on 6.001, at Stanford there is an alternative for "enthusiasts" called 106X, which is a sort of more intense version of the 106B class and is taught in C++, I believe. Some students skip A/B and just take X. A, B, and X all get really good reviews.
yifanlu
I took CS106X this quarter, and while I enjoyed the lecturer and the assignments, as an "enthusiast" I did not feel challenged. It was more of "let me get some more experience with programming" and not "wow, that's nice to know". The class just teaches some more data structures not covered in CS106A and recursion, and touches on inheritance. From what I hear, though, CS140/143 (OS & compilers) seem more interesting to me.
rawatson
I'm one of the TAs (section leaders) for 106X this quarter. We get a pretty wide diversity of skill levels in the class, so it's tough to ensure that we challenge the experienced students while not scaring off everyone else.

However, the goal of 106X isn't to teach all of the complexities and nuances of algorithms (take CS161), fundamental principles of computing (take 103), or OS/low level understanding (take 107/140/143). Sure, we'd like students to think about these things, but we really want to make sure that students know how to program with

* Proper decomposition

* Sensible commenting/documentation

* Use of appropriate data structures/knowing when to employ recursion

If you want to get more "huh, that's cool", try taking 107 in the winter (Jerry is teaching that course as well).

lotso
106A is meant to make programming accessible to people who aren't or weren't considering being a CS major. I would say it is one of the most well-thought-out and enjoyable classes at Stanford.
Read these books and watch these video lectures.

* Introduction to Algorithms (SMA 5503) http://ocw.mit.edu/courses/electrical-engineering-and-comput...

* Structure and Interpretation of Computer Programs (aka SICP) http://www.youtube.com/watch?v=2Op3QLzMgSY

Feb 26, 2010 · 2 points, 0 comments · submitted by sown
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.