HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Richard Feynman Computer Heuristics Lecture

Muon Ray · Youtube · 260 HN points · 34 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Muon Ray's video "Richard Feynman Computer Heuristics Lecture".
Youtube Summary
Donate and Support this Channel: https://www.paypal.com/donate/?cmd=_s-xclick&hosted_button_id=BLJ283JMTMT7S
Introduction Article to Heuristics and Metaheuristics: http://muonray.blogspot.ie/2016/04/meta-heuristics-and-universal-power-law.html
Richard Feynman, winner of the 1965 Nobel Prize in Physics, gives us an insightful lecture about computer heuristics: how computers work, how they file information, how they handle data, how they allocate processing over a finite amount of time to solve problems, and how they actually compute values of interest to human beings. These topics are essential in the study of which processes reduce the amount of work computers do in solving a particular problem, giving them problem-solving speeds that can outmatch humans in certain fields but that have not yet reached the complexity of human intelligence. The question of whether human thought is a series of fixed processes that could, in principle, be imitated by a computer is a major theme of this lecture, and, in Feynman's trademark style of teaching, he gives us clear yet very powerful answers for a field that has gone on to consume so much of our lives today.

No doubt this lecture will be of crucial interest to anyone who has ever wondered about the process of human or machine thinking, and whether a synthesis between the two can be made without violating logic.
HN Theater Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Reminds me of Richard Feynman's explanation of a computer as an extremely large and fast filing system

https://youtu.be/EKWGGDXe5MA

Apr 03, 2022 · 15 points, 0 comments · submitted by Anon84
I followed some of the links in the comments and found this talk by Feynman given from the same period he was working at thinking machines.

This part of the talk amazes me: people in 1985 were having the exact same discussions we are having today about the computer and its effect on privacy, with respect to "big brother" and totalitarian governments' obsession with collecting information on people.

https://youtu.be/EKWGGDXe5MA?t=3785 (timestamped at the correct location).

ebcode
https://en.wikipedia.org/wiki/Colossus%3A_The_Forbin_Project -- from 1970
iSnow
>that people in 1985 were having the exact same discussions we are today about the computer and its effect on privacy with respect to "big brother" and totalitarian governments' obsession with collecting information on people.

Sure did. The 1970s and early 1980s were the birth of the computer privacy movement; that's the backdrop for the iconic Apple commercial: https://www.youtube.com/watch?v=2zfqw8nhUwA

Of course, today everyone is spending their time enjoying indoctrination in front of a miniature Apple screen instead of getting called into a center...

Great response, thank you!

> I've been a compiled-language programmer for 35 or so years.

40 or so for me. :-)

> I perceive [procedure calls] as a fundamental mechanism for organizing code.

Yes, I've been arguing this for some time: that the call/return architectural style has been so incredibly dominant for so long that it is not really seen as a specific architectural style, but instead as simply all there is. It is the mechanism for organising code.

A paradigm in the true sense.

I call this The Gentle Tyranny of Call/Return[1]. "Gentle" because, as architectural styles go, it's probably one of the better ones to have, particularly if you only get one.

Anyway, if you study software architecture, you will find that procedure calls[2] are just one of many ways to organise software. However, they are given special status, because they have special language support. So if you use procedure calls to organise your code, to glue your components together, you get to write code naturally. If not, it starts to look messy. And it turns out that procedure calls are not always ideal, because...

> Glue is a way to move data (and perhaps other stuff too) across code boundaries that are imposed

And a lot of what we do today is moving data around. Computers don't really compute all that much[3].

>But your TFA seems to be talking about a much higher level of glue than this, I think.. protobufs, RPC, serialized JSON, even ye olde CORBA/OLE models

Yes, these are also glue. But so are procedure calls.

> In that sense, procedure calls are low level mechanism that will support a higher level solution.

One might even say they are the "Assembly Language" for connecting: [2]

[1] https://2020.programming-conference.org/details/salon-2020-p...

[2] https://resources.sei.cmu.edu/library/asset-view.cfm?assetid...

[3] https://www.youtube.com/watch?t=278&v=EKWGGDXe5MA&feature=yo...

Richard Feynman's lecture is a great resource for this: https://www.youtube.com/watch?v=EKWGGDXe5MA

He proposes a file clerk that gets progressively dumber and faster until they get so dumb that they can be simulated by an electronic circuit.
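The clerk analogy maps naturally onto a fetch-execute loop. Here is a minimal sketch of the idea in Python; the card format and operation names are invented for illustration (they are not Feynman's), but a clerk this dumb, able only to copy, add, decrement, and jump, can already multiply by repeated addition:

```python
# A toy version of Feynman's "very dumb, very fast file clerk":
# the clerk looks at one numbered card at a time and blindly follows it.
# Each card is (operation, source cell, destination cell).

def run_clerk(cards, store):
    pc = 0  # which card the clerk is currently looking at
    while pc < len(cards):
        op, src, dst = cards[pc]
        if op == "COPY":
            store[dst] = store[src]
        elif op == "ADD":            # dst += src
            store[dst] = store[dst] + store[src]
        elif op == "DEC":            # dst -= 1 (src unused)
            store[dst] -= 1
        elif op == "JNZ":            # go back to card `dst` if src cell != 0
            if store[src] != 0:
                pc = dst
                continue
        pc += 1
    return store

# Multiply 4 * 3 by repeated addition, clerk-style.
store = run_clerk(
    [("COPY", "zero", "acc"),   # card 0: acc = 0
     ("ADD", "x", "acc"),       # card 1: acc += x
     ("DEC", None, "n"),        # card 2: n -= 1
     ("JNZ", "n", 1)],          # card 3: repeat from card 1 while n != 0
    {"x": 4, "n": 3, "acc": 0, "zero": 0})
```

Each operation is simple enough to be done by a logic circuit, which is the lecture's punchline: keep making the clerk dumber and faster until the clerk is hardware.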

One thing to remember about this book that is easy to miss is that it's not a memoir or autobiography in the usual sense. You may have seen this book, and its sequel, What Do You Care What Other People Think?, described as being "by Richard Feynman and Ralph Leighton" or "edited by Ralph Leighton" or "as told to Ralph Leighton". It turns out that it was not written as a book: his friend Ralph Leighton (son of the physicist Robert B. Leighton) recorded several hours of conversations with Feynman and selected parts of them to go into the book.

So although the book is sold as "by Richard Feynman" (which is true in some sense: it's in the first person, published when he was still alive, and he really did say everything that's in the book), it would be more accurate to call it a book by Ralph Leighton, and an appropriate title may be "Things my friend Richard Feynman told me about himself that I thought were fun". (This also explains the subtitle "Adventures of a Curious Character"—this is not Feynman calling himself that, but Leighton describing his friend that way.)

Now consider that (1) Ralph Leighton was not a physicist but Feynman's "close friend and drumming partner", and (2) Feynman was a natural conversationalist, automatically adapting his style depending on whom he was speaking to, whether it was a friend, or undergraduate physics students, or he was talking about computers to a New Age crowd at Esalen (https://www.youtube.com/watch?v=EKWGGDXe5MA), and you have this result. The friendly conversational style increases the appeal of the book and gives some insight into the speaker, but the technical detail of Feynman's core work, which was very important to him, gets diluted.

(One of my professors disliked the book for giving the impression that you get a Nobel Prize just living a life of having fun and playing around, while in fact Feynman was known to work very hard, at all hours of day and night. He liked to re-derive by himself anything he learned till he was satisfied he understood it, and that took prodigious amounts of pen-and-paper calculation. This may partly be Feynman projecting an aura of effortless brilliance, but I think it's more likely a combination of the fact that hard work doesn't seem hard if you enjoy it enough, and that going in detail about how hard you worked does not make for very good conversation.)

It is true that many memoirs are written this same way (dictating to someone else), but I think this book shows the effects more than most: including the selection of topics, as you observed.

(BTW the audio material that went into the books is available too, as "The Feynman Tapes", and listening to it may give a different impression than listening to an audiobook of someone else reading the text of a book itself transcribed from audio: https://kongar-olondar.bandcamp.com/ )

jacobmischka
That explains a lot, I did not know that. Thank you, that changes my opinion a bit.
It reminds me how Feynman described a computer as a superfast filing system https://youtu.be/EKWGGDXe5MA

Programming, though, is less about precise, detailed instructions and more about gluing together mostly existing components in a manner understood by your team.

Nov 14, 2020 · max_ on Simulating RAM in Clojure
There is also Richard Feynman's computer heuristics lecture[0].

[0]:https://www.youtube.com/watch?v=EKWGGDXe5MA

Jun 07, 2020 · 62 points, 5 comments · submitted by ColinWright
kumarvvr
I just love Feynman videos.

The way he talks shows how much of the subject he has internalized. I just love people who can flow with their thoughts. You know, at some level, that the discoveries he made, the math he did, the thought process he used, were effortless.

I wish the world encouraged and nurtured more such people.

dhimes
You know, at some level, that the discoveries he made, the math he did, the thought process he used, were effortless.

Probably in the same way that a great athlete makes something look effortless. It takes years and years of dedicated practice and love to make it "effortless."

kolmogorov_opt
Diligent in preparation, effortless in action.
eismcc
There’s a related set of videos about the Quantum Mechanical nature of reality that are just amazing.

https://www.youtube.com/playlist?list=PLW_HsOU6YZRkdhFFznHNE...

endlessvoid94
Happy to see these videos posted, they are exceptional. He clearly understood this stuff at a very deep level. I recall reading a story somewhere about his collaboration with Danny Hillis at Thinking Machines (I think?) and applying his unique problem solving skills to the issue of coordinating so many individual processors to perform a job. He wasn't a "computer scientist" but he apparently applied some unique ideas that helped significantly.

Edit: Here's the link: http://longnow.org/essays/richard-feynman-connection-machine...

I can recommend to get a copy of

* Datapoint: The Lost Story of the Texans Who Invented the Personal Computer Revolution, by Lamont Wood.

* The Soul of A New Machine, by Tracy Kidder.

* Books by George Dyson (son of ...): Darwin Among the Machines, Turing's Cathedral. (unrelated, but also give Project Orion: The Atomic Spaceship a go)

* The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley, by Leslie Berlin.

* Richard Feynman Computer Heuristics Lecture at https://www.youtube.com/watch?v=EKWGGDXe5MA

* Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing, by Thierry Bardini.

Related: Richard Feynman, "Computers Don't Compute"

"One of the miseries of life is that everyone names everything a little bit wrong, and so it makes everything a little harder to understand in the world than it would be if it were named differently. A computer does not primarily compute in the sense of doing arithmetic. Strange. Although they call them computers, that's not what they primarily do. They primarily are filing systems. People in the computer business say they're not really computers, they are "data handlers". All right. That's nice. Data handlers would have been a better name because it gives a better idea of the idea of a filing system."

https://youtu.be/EKWGGDXe5MA?t=278

77544cec
In French (and some other romance language speaking countries), a computer is called an ordinateur(fr)/ordenador(es) as in 'to ordinate' (put in order).
Oct 22, 2019 · 1 points, 0 comments · submitted by argsv
"One of the miseries of life is that everyone names everything a little bit wrong, and so it makes everything a little harder to understand in the world than it would be if it were named differently. A computer does not primarily compute in the sense of doing arithmetic. Strange. Although they call them computers, that's not what they primarily do. They primarily are filing systems. People in the computer business say they're not really computers, they are "data handlers". All right. That's nice. Data handlers would have been a better name because it gives a better idea of the idea of a filing system."

Richard Feynman

https://youtu.be/EKWGGDXe5MA?t=278

Feynman is one of the few dead people I wish I could have had many conversations with while on a long hike.

His talk describing computers to a lay audience at the Esalen Institute [1] is my go-to resource whenever someone asks me how computers work. It's from the same era as the referenced article; he's even wearing the same Thinking Machines shirt.

[1] https://www.youtube.com/watch?v=EKWGGDXe5MA

Nov 24, 2018 · 142 points, 13 comments · submitted by espeed
tosh
I like how he describes chess about 44 minutes in. Makes me wonder how modern chess engines like Stockfish rate a certain board configuration. I expect it is rather more nuanced than his description of dead/not dead and summing up the material on both sides.
thanosnose
> I expect it is rather more nuanced than his description of dead/not dead and summing up the material on both sides.

It is a bit more nuanced. There are heuristics (applied during breadth and depth searches) which assign positional values, and also opening and endgame database lookups. Using remaining material values is the most basic form of chess engine. If that's all Stockfish did, most chess players would beat it.

I built a very simple chess engine for my AI class. I started off with the basic "material values". Then added basic heuristics. Then added database lookups.

Now with neural networks and machine learning, chess engines are even more sophisticated.
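The "material values plus search" starting point described above can be sketched in a few lines. This is a toy, not a real engine: the piece values are the conventional ones, but the two-ply game tree and the board strings (uppercase = our pieces, lowercase = opponent's) are invented for illustration.

```python
# Material-count evaluation plus plain minimax over a tiny game tree.
PIECE_VALUES = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9}

def material(board):
    """Our material minus the opponent's (lowercase = opponent)."""
    score = 0
    for piece in board:
        if piece.upper() in PIECE_VALUES:
            value = PIECE_VALUES[piece.upper()]
            score += value if piece.isupper() else -value
    return score

def minimax(node, depth, maximizing):
    """A node is either a board string (leaf) or a list of child nodes."""
    if depth == 0 or isinstance(node, str):
        return material(node)
    scores = [minimax(child, depth - 1, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Two of our moves; for each, two opponent replies leading to a leaf position.
tree = [["QPPp", "RPpp"],
        ["NBPP", "Qppq"]]
best = minimax(tree, 2, True)  # opponent minimizes, we pick the best worst-case
```

Everything the comment adds on top (positional heuristics, opening/endgame databases, and now learned evaluations) slots into the `material` function or prunes this search.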

jointpdf
There is a fascinating (and charming) paper by Alan Turing that describes his "Turochamp" chess 'engine'. Apparently, it was the first program capable of playing a complete game of chess, and the first program that could be described as a computer game (although it sadly only ever existed on paper). The general pattern he outlines (a heuristic evaluation function with hand-tuned weights, along with minimax game tree search--i.e. backwards induction) has formed the basis of most chess engines, both ancient and modern. Here's the original copy: https://docs.google.com/file/d/0B0xb4crOvCgTNmEtRXFBQUIxQWs/...

Intriguingly, Turing posed the question, "Could one make a machine to play chess, and to improve its play, game by game, profiting from its experience?" This reinforcement learning approach to chess did not enjoy much success until AlphaZero. That story has been well told in many places, but perhaps best by David Silver in this recently released lecture from DeepMind: https://www.youtube.com/watch?v=ld28AU7DDB4. The first ~40 mins are a lucid explanation of the classical methods, and the rest covers RL/MCTS/AlphaZero.

opo
>..."Could one make a machine to play chess, and to improve its play, game by game, profiting from its experience?" This reinforcement learning approach to chess did not enjoy much success--until AlphaZero.

Don't forget Samuel's computer checkers program from 1959. It was among the world's first successful self-learning programs.

https://en.wikipedia.org/wiki/Arthur_Samuel

mrtnmcc
Interesting that he noted pattern recognition as the limitation for computers.
sweezyjeezy
It would be really interesting to know what Feynman would make of the current state of machine learning. It's awesome what tasks can be performed with it, but I imagine he would be disappointed with our level of understanding of how these systems work.
laichzeit0
Why would you say we don't understand how these systems work? Stochastic gradient descent, for example, is not particularly enigmatic. Pattern Recognition and Machine Learning by Christopher Bishop is a good place to start if you want to gain an understanding of how and why machine learning algorithms work.
EGreg
Because of emergent phenomena. That's like saying you understand how organisms work because you understand how molecules work. Look at Wolfram's A New Kind of Science: even finite automata can have amazing patterns.
faceplanted
He's talking about understanding their process, not their mechanics. Machine learning systems are usually black boxes with no guarantees; we run into issues with this fact regularly, because all we can do is train them and then study the results. They can't tell us anything for certain.
dang
A small discussion from 2014: https://news.ycombinator.com/item?id=7457172
harry8
Note the way he picks on a woman in the audience for no reason, to her surprise and bemusement. If you've read "Surely You're Joking..." you'll recognise it as one of his pick-up techniques. I'm a massive Feynman fan; he has such insight, wisdom and humour. All of us have foibles, and we see (IMHO) one of his in this one. They're all adults and I can completely forgive him for it, just as I could if it was him putting his index finger up his left nostril and having a good dig while on camera. Not super pleasant to watch, though. Adults deciding or not to have sex is fine. Being jerks to try and do it, eh, no more than yuk, but, yeah, yuk.
zorga
If you're going to make a comment like that, about a single moment in a video over an hour long, you could try and be kind to the reader and at least say where in the video this occurs.
goldenkey
At 1:08:35 Feynman tries to put his glasses in his t-shirt, thinking he has on a dress shirt with a pocket. He plays it off by rubbing the glasses against his t-shirt. Pretty awesome how he's still making me smile and chuckle from the grave.

Hats off to you Mr. Feynman. Your output may have been finite, but its effect is limitless.

https://www.youtube.com/watch?v=EKWGGDXe5MA&t=1h8m35s

Doing the "wrong things" in parallel for a net benefit is already a very common theme in all parallel computing, especially in GPUs. All interesting parallel algorithms (more than just a parallel-for over tasks) do more total work than their serial versions, but have shorter critical sections.

I found Feynman's lecture and it's a good analogy for the differences between GPUs and CPUs. The GPU has thousands of dumb clerks who are fast at using their scratch paper and doing a few different arithmetic operations, but there's only one boss giving the whole group their instructions.

https://youtu.be/EKWGGDXe5MA?t=6m
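A small, concrete instance of that trade-off (a sketch, not GPU code): the Hillis-Steele parallel prefix sum performs roughly n·log2(n) additions versus n for the serial scan, yet needs only about log2(n) dependent steps, because every addition within a step could be done by a separate "clerk" at once.

```python
def serial_scan(xs):
    """Running totals: ~n additions, but an n-deep dependency chain."""
    out, total = [], 0
    for x in xs:
        total += x
        out.append(total)
    return out

def hillis_steele_scan(xs):
    """Same result with ~n*log2(n) total additions but only ~log2(n)
    steps; each outer iteration is one 'parallel step' in which all
    additions are independent of each other (simulated serially here)."""
    out = list(xs)
    stride = 1
    while stride < len(out):
        out = [out[i] + (out[i - stride] if i >= stride else 0)
               for i in range(len(out))]
        stride *= 2
    return out
```

More arithmetic in total, fewer dependent steps: exactly the "wrong things in parallel for a net benefit" pattern.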

Most recently I really liked "The Unreasonable Effectiveness of Dynamic Typing for Practical Programs": https://vimeo.com/74354480

The ones that had the most profound effect for me would be Linus's talk about git and Carsten Dominik on org-mode.

Also, Richard Feynman explaining how computers work: https://www.youtube.com/watch?v=EKWGGDXe5MA This actually changed the way I think about it after years of programming.

Apr 06, 2018 · 3 points, 1 comments · submitted by haxiomic
Eridrus
The comments about the war games are interesting, because I have heard of these same results from defense blogs, where the opinions are not usually so dismissive (though of course they're citing them to advance their point): a lot of people find small attack boats with a single gun a compelling strategy that the US could never use (for PR/morale reasons), but posit that these are exactly the sort of tactics that others like Iran would be more willing to use effectively against the US.
> Since when have computers been “basically built for copying”

Digital computing (i.e. Shannon's information theory) was a solution to the signal-to-noise ratio getting worse every time a signal was amplified. Shannon's digital circuits allowed information to be copied without that limit. An important consequence was the marginal cost of copying information falling very close to zero. Or, stated in economic terms, information was no longer scarce.

> computer

You're using a very narrow definition of "computer". Colloquial use of the word obviously includes a variety of technologies and concepts, such as Shannon's digital circuits, not just a model of computation (Turing or otherwise).

> modern computers are effective communication device

You don't even need to consider {,inter}networking; the modern devices commonly called "computers" spend most of their time managing data. Very little time is spent doing any actual computation. I recommend this[1] lecture by Feynman, where he explains how what we call "computers" are really closer to a filing system that stores and copies data (including the stored program and most of the mechanics of computation).

[1] https://www.youtube.com/watch?v=EKWGGDXe5MA

qubex
> You’re using a very narrow definition of ”computer”

Yes, I am, and that’s because I have a mathematical background. I’m quite comfortable in my knowledge that the common-parlance item is actually a ’computer’ as I intend it (narrowly) fused with a communicator that allows it to internetwork with other similarly compound devices.

What's fascinating to me is that index cards are a kind of spiritual predecessor to modern database systems and computers - perhaps even more closely related than counting devices like the abacus. Richard Feynman touches on this in one of his lectures [1] that's been linked many times on HN.

The theory of information and computing seems pretty fundamental, and not necessarily tied to what we typically think of as a computer, with CPUs, RAM, SSDs, etc. In a way, a card catalog full of index cards and run by a bunch of people is a computer too. Maybe this isn't an incredible revelation, but it's still interesting to think about.

[1] https://m.youtube.com/watch?v=EKWGGDXe5MA

I believe it was Feynman who introduced the analogy:

desktop : filing cabinet :: RAM : hard drive

Here's a video: "Richard Feynman Computer Heuristics Lecture" (1985) https://youtu.be/EKWGGDXe5MA

Somewhere in my comments here, I talk about topologically sorting CS concepts; in what little time I spent, I think I suggested "Constructor Theory" (Deutsch 201?) as a first physical principle. https://en.wikipedia.org/wiki/Constructor_theory

westurner
> Constructor Theory

https://en.wikipedia.org/wiki/Constructor_theory#Outline

Task, Constructor, Computation Set, Computation Medium, Information Medium, Superinformation Medium (quantum states)

The filing cabinet and disk storage are information mediums / media.

How is the desktop / filing cabinet metaphor mismatched or limiting?

There may be multiple desktops (RAM/Cache/CPU; Computation mediums): is the problem parallelizable?

Consider a resource scheduling problem: there are multiple rooms, multiple projectors, and multiple speakers. Rooms and projectors cost so much. Presenters could use all of an allotted period of time, or they could take more or less time. Some presentations are logically sequenceable (SHOULD/MUST be topologically sorted). Some presentations have a limited amount of time for questions afterward.

Solution: put talks online with an infinite or limited amount of time for asynchronous questions/comments

Solution: in between attending a presentation, also research and share information online (concurrent / asynchronous)

And, like a hash map, make the lookup time for a given typed resource ~O(1), with URLs (URIs) that don't change. (Big-O notation for computational complexity.)

Resource scheduling (SLURM,): https://news.ycombinator.com/item?id=15267146

Most likely this one: https://youtu.be/EKWGGDXe5MA

Check out Feynman's very cool "Thinking Machines" t-shirt. He was working with Danny Hillis at the time.

Sep 10, 2017 · 13 points, 1 comments · submitted by mhasbini
raw23
Watched this the other day; a good layman's introduction to what computers actually do. Feynman has a serious skill for explaining complex topics in simple terms that anyone can understand.

I love to watch Feynman's lectures even when I have a fairly good understanding of the topic; he's such an entertaining lecturer.

Dec 01, 2016 · 2 points, 0 comments · submitted by deanmen
Aug 15, 2016 · 2 points, 0 comments · submitted by kornish
You might enjoy his talk on what a computer actually is: https://www.youtube.com/watch?v=EKWGGDXe5MA
smaddox
Great talk; hilarious how he keeps almost falling over.
May 24, 2016 · 2 points, 0 comments · submitted by Extigy
Sure. I don't have anything to link on the spot, but this was/is/has been foreseeable for some time. Although it's all very cool and shiny, most business applications of machine learning remain squarely in the territory of classic algos like GLMs and forests (random, boosted trees, etc.). As a fun note, advances like these highlight that data scientists etc. will not be beaten by more complex automated methods, but simply by speed. Much like the filing system that 'runs' whatever you're using to see these words (https://www.youtube.com/watch?v=EKWGGDXe5MA).

Edit: to elaborate... single model training runs can be done quite fast now, but knowing how to tune hyperparameters remains the 'voodoo' of the field. But the best hyperparameters are also possible to discover through brute force: try every combination you can! Today, you can use various heuristics to improve this process, but either way, being able to train X times faster just means we can search the hyperparameter space that much faster. The robots are coming :)
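The brute-force idea above, try every combination and keep the best, can be sketched in a few lines. The scoring function here is a made-up stand-in with a known optimum; in practice any train-and-evaluate routine would slot in:

```python
from itertools import product

def grid_search(score_fn, grid):
    """Try every combination in `grid` (dict of name -> list of values)
    and return the highest-scoring parameter set."""
    names = sorted(grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(grid[n] for n in names)):
        params = dict(zip(names, values))
        score = score_fn(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical "validation score" peaking at lr=0.1, depth=4.
def toy_score(p):
    return -(p["lr"] - 0.1) ** 2 - (p["depth"] - 4) ** 2

grid = {"lr": [0.01, 0.1, 1.0], "depth": [2, 4, 8]}
params, score = grid_search(toy_score, grid)
```

Note the cost grows multiplicatively with each added hyperparameter, which is exactly why faster training translates directly into a bigger searchable space.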

visarga
They could run 10x more experiments for the same cost and experiment with many more configurations, but soon enough there will probably be an algorithm that can do the same on a single computer. I am waiting for the moment neural networks become as good as people at designing neural networks.
This seems like a good companion (probably as a sequel) to:

"Richard Feynman - Computer Heuristics Lecture": https://www.youtube.com/watch?v=EKWGGDXe5MA

In general, Feynman might be a poster child for using simple language to explain complex things.

> It doesn't really matter what sets you on a particular part, what matters is the end result.

To put it in other words: Birds are limited by evolution. They are not an optimal design - they are a successful reproductive design in a wide ecosystem where flying is a tool.

Our intelligence is no different.

This is something Feynman addressed in this beautiful talk (in the Q&A iirc): https://www.youtube.com/watch?v=EKWGGDXe5MA

Unreal. Watching him speak about computers, while wearing his logo'd t-shirt, makes me imagine him giving this talk at a recent meetup. https://www.youtube.com/watch?v=EKWGGDXe5MA
I am always amazed at how Richard Feynman can craft such simple stories and comparisons to explain a situation or theory in "explain like I'm 5" style.

It seems that in every lecture or interview he gives, he can explain the situation very fluently and in a way that, probably, even a 5-year-old could understand. The best one I saw is where R. Feynman explains how a computer works [1]. Even though I knew everything he told, I was astonished at how intriguing and simple his way of presenting was.

[1] https://www.youtube.com/watch?v=EKWGGDXe5MA

rimantas
Wasn't he the one who said that if you cannot explain something to a child you do not really understand it?
trequartista
At the time of his death, the following 2 quotes were found on his blackboard:

"What I cannot create, I do not understand" and right beneath this "Know how to solve every problem that has been solved".

_gqkh
That second quote is very powerful. Often enough we build on previous knowledge without understanding its foundations. Often enough, you really don't need to understand the previous steps to make a leap to the next. That's simply progress: taking the next logical step.

However, some have argued that by deriving foundational knowledge you attain a discipline, and through this process the decisions you make about taking the next step are made completely differently. You sincerely evaluate whether to take that next step.

huhtenberg
There's an expression that goes "to know is to be able to explain". Einstein had a saying to that effect, as did Feynman and ... Aristotle :)
rimantas
Well, Aristotle likely had it backwards, thinking that explaining can pass for knowledge ;)
trymas
I think he was. I understood this way too late after my studies.

Now when I am learning something, I always try to explain it in my mind (or write it down) to an imaginary 'student'. It's so effective it's amazing. It lets me catch many knowledge 'holes' in the topic I've just studied, whereas I used to just read the material (multiple times) and imagine that I knew it. If you don't know what I am talking about, just try to explain some topic you recently studied.

P.S. AFAIK, teaching is the best way to study/learn new topics.

RogerL
This is some of the best advice available on HN.

Don't (just) read books. Put down the highlighter. Explain them. Get in front of a whiteboard and explain it to an imaginary audience. Imagine the questions you get back. Try to answer them. Sometimes you can do that nearly sentence by sentence, other times you have to do it section by section.

I was lucky enough to have a Feynman level (in terms of teaching) EE professor. I banged my head against modelling electron behavior in NPN junctions until I recognized that his teaching style was exactly how I needed to approach the material. From there on out everything became much easier.

scoggs
I find sometimes that people around me think I'm strange or crazy for getting excitable or long-winded while explaining or going through a story I've recently read that gave me some insight into something new and cool, to me.

I'm just realizing now that it's possible that those people don't have that same inner spark that I have. That same passion and drive to learn and understand. I really still feel like a kid in so many ways in life but even when I feel that twinge of embarrassment for geeking out on a topic too hard I will now and forever feel content in the fact that I'm always trying to teach that inner kid in me.

fnordfnordfnord
A bit like rubber duck debugging.
HardDaysKnight
Michael L. Jones discusses the teaching out loud technique in, The Overnight Student (1990)[1]. He says that in high school he was a B and C student, and in college his first semester GPA was 1.9. The next semester he failed every class. He was expelled. When allowed to re-enroll he again got all F's. A few years later he tried again with success. He credits teaching out loud as the technique that made him an A student.

[1] https://docs.google.com/viewer?a=v&pid=sites&srcid=ZGVmYXVsd...

masklinn
"QED: The Strange Theory of Light and Matter" is absolutely brilliant. Made my feel like I actually understood the subject.
rexignis
I think even Feynman would admit this is not the case, he had no simple way of explaining magnetism because it is inherently complicated.

https://www.youtube.com/watch?v=MO0r930Sn_8

trymas
Though he had a point that magnetism is difficult, he explained thoroughly and clearly where the problem lies.

I do not know what R. Feynman's university lectures sounded like, but his popular lectures seem to follow a pattern of setting up a casual situation and using it to pull the listener, as well as himself, into explaining something with an analogy.

I liked that R. Feynman explained knowledge as 'a framework that allows something to be true': not everything can be explained using the 'common knowledge framework', and sometimes you need to understand more of the details.

rimunroe
The lectures that were put together into the Feynman Lectures on Physics were fabulous. I believe they actually originally thought the audio was lost, but later someone found a bunch of tape reels in storage. The audio is heavily distorted, but a lot of work was done to restore them. I highly recommend anyone with an interest in physics listen to them while reading along in the book. It feels close to actually being in class. He was an exceptionally good teacher.
fnordfnordfnord
>I do not know what R. Feynman's university lectures sound like

Enjoy. You might also find the audio in mp3/ogg/etc format.

  Video http://www.feynmanphysicslectures.com/
  YT https://www.youtube.com/user/FeynmanVideoLectures
  Another video source http://research.microsoft.com/apps/tools/tuva/
  Text http://www.feynmanlectures.info/
  Links to more stuff http://www.feynmanlectures.info/links.html
Here's him explaining how computers work (first 35 minutes): https://www.youtube.com/watch?v=EKWGGDXe5MA
Mar 05, 2015 · 1 points, 0 comments · submitted by ilitirit
I'm pretty sure this is what Feynman wouldn't do. What he would do is be extremely confrontational from the start if he had to be there, e.g. his army psychological assessment, or just walk out if he didn't. If there was one defining characteristic of the man on the subject, it was not suffering this kind of foolishness.

Nobody mentioned Feynman's work for Thinking Machines. A couple more Feynman computing links: https://www.youtube.com/watch?v=EKWGGDXe5MA http://www.scribd.com/doc/52657907/Feynman-Lectures-on-Compu...

Intermernet
True, but I think he may have had some fun with the interviewer, and thought it may be worth taking him down a peg or two.

Feynman was amazingly good at giving serious smackdowns in a convivial way. He would be cheerful, accepting and charming while creating a logical argument that was almost always faultless.

He had some negatives, but I think he was one of the most amazing people to have ever lived. I encourage everyone out there to read and watch his publications and lectures. From his time as a grad student until his death, he forced people to rethink the hardest and most important problems around, and he had fun doing it.

He was also a prankster, a lock-picker and safe-cracker, and apparently a pretty good musician.

</hero_worship>

Oct 19, 2014 · acqq on Simple CPU
Here Feynman explains the basic logic behind the building of all computers. He is inspiring even if you already know the details of the topics he covers, because he stimulates a fresh view. The YouTube title is, IMHO, misleading: the lecture is about basic hardware concepts, not about heuristics, so I think it fits perfectly with the OP.

http://www.youtube.com/watch?v=EKWGGDXe5MA

Edit: At the start the original title is visible: "Richard Feynman: The Computers From the Inside Out", and the lecture is from 1985.

The major quote:

"Computers are not primarily computing machines, they are actually filing systems."
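Feynman's filing-clerk picture can be sketched in a few lines of code. This is purely illustrative (the class and method names below are my own, not from the lecture): memory is a cabinet of numbered cards, and "computation" is nothing more than fetching cards, marking them up, and filing them back, one trip at a time.

```python
class FilingClerk:
    """A toy model of a computer as a very fast filing clerk."""

    def __init__(self, num_cards):
        # The filing cabinet: numbered slots, each holding one value.
        self.cards = {i: 0 for i in range(num_cards)}

    def fetch(self, slot):
        # Pull a card out of the cabinet and read it.
        return self.cards[slot]

    def file(self, slot, value):
        # Write a new value on the card and put it back.
        self.cards[slot] = value

    def add(self, src_a, src_b, dst):
        # "Addition" is just three trips to the cabinet:
        # fetch two cards, write their sum on a third.
        self.file(dst, self.fetch(src_a) + self.fetch(src_b))


clerk = FilingClerk(num_cards=4)
clerk.file(0, 2)
clerk.file(1, 3)
clerk.add(0, 1, dst=2)
print(clerk.fetch(2))  # -> 5
```

The point of the analogy is that nothing here is intrinsically "mathematical"; the speed of the clerk is the only thing separating this from a person shuffling index cards.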

AlexanderDhoore
Listening to Feynman talking about computers on a sunday evening. Life is good.
ca98am79
Listening to the Q&A, I am surprised by how little has changed, despite how much computers have changed in the last 30 years:

https://www.youtube.com/watch?v=EKWGGDXe5MA&feature=youtu.be...

kemist
Thank you
tinco
Thanks for that. Man, Robin Williams would've played the heck out of a Feynman movie.
rdc12
Saying that is torturous
Here's a talk by Richard Feynman where he explains how computers work https://www.youtube.com/watch?v=EKWGGDXe5MA
Jul 14, 2014 · evanb on What Problems to Solve
You might be interested in this lecture that Feynman gave at a meeting called "Idiosyncratic Thinking":

https://www.youtube.com/watch?v=EKWGGDXe5MA

Jun 04, 2014 · 7 points, 0 comments · submitted by luu
I haven't finished it yet, but this video starts out amazingly. https://www.youtube.com/watch?v=EKWGGDXe5MA

Of course, Feynman was an amazing "explainer" of complex things. I can't think of a modern analog. I'm hoping other commenters will chime in and prove me wrong.

Nov 19, 2013 · 3 points, 0 comments · submitted by proksoup
Jun 12, 2013 · 3 points, 0 comments · submitted by crescendo
Good catch re: 'late antiquity' -> I should have been a bit more rigorous in my research there.

I like your point about "how little knowledge is a perilous thing?" Completely agree that programming is far from the only important skill. I just generally think that demystifying what is popularly considered to be "esoteric" knowledge is almost always good for a healthy society. The less we know about how the mysterious machines work, the more we fear them and the more trust and power we give to the people who can manipulate them. I think this mystification is hogwash and completely unnecessary, since a computer is really just a super-fast filing clerk [http://www.youtube.com/watch?v=EKWGGDXe5MA].

I can't stand the sans serif typesetting and cramped mathematical formulas. The tone is kind of obnoxious, too.

When he introduces group theory:

Group theory basics. It is time to note that our one-parameter symmetries are groups in the sense of modern algebra. Why? To masturbate with nomenclature as you do in an abstract algebra class? No. Because, as you will soon see, studying the group structure of a symmetry of a differential equation will have direct relevance to reducing its order to lower order, and will have direct relevance to finding some, possibly all of the solutions to the given differential equation—ordinary, partial, linear, or nonlinear. So what is a group?

I don't get the pedagogical purpose of calling what one does in an abstract algebra class "masturbating with nomenclature." I think every word in a textbook should be crafted with a pedagogical goal in mind. Making the material more light-hearted and less daunting is a valid purpose, but this tone just seems sour.

In fact, I count three uses of the word "masturbate" in the notes.

I prefer something like Richard Feynman's style, where he makes a subject accessible while still respecting the subject.

Here's a fantastic example of Feynman explaining how a computer works, using an analogy of an ever-faster filing clerk: http://www.youtube.com/watch?v=EKWGGDXe5MA

crntaylor
What really struck me about that lecture is that he only uses one blackboard throughout the entire talk, and he doesn't start writing on it until 20 minutes in. I wish more lectures and talks were like that.
Dec 30, 2012 · 4 points, 0 comments · submitted by solipsist
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.