HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
Turing's Cathedral: The Origins of the Digital Universe

George Dyson · 5 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "Turing's Cathedral: The Origins of the Digital Universe" by George Dyson.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
A Wall Street Journal Best Business Book of 2012. A Kirkus Reviews Best Book of 2012. In this revealing account of how the digital universe exploded in the aftermath of World War II, George Dyson illuminates the nature of digital computers, the lives of those who brought them into existence, and how code took over the world. In the 1940s and ‘50s, a small group of men and women—led by John von Neumann—gathered in Princeton, New Jersey, to begin building one of the first computers to realize Alan Turing’s vision of a Universal Machine. The codes unleashed within this embryonic, 5-kilobyte universe—less memory than is allocated to displaying a single icon on a computer screen today—broke the distinction between numbers that mean things and numbers that do things, and our universe would never be the same. Turing’s Cathedral is the story of how the most constructive and most destructive of twentieth-century inventions—the digital computer and the hydrogen bomb—emerged at the same time.
HN Books Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
Feb 20, 2021 · calhoun137 on Native Type Theory
> None of Zuse, Babbage, Torres y Quevedo, Ludgate, Dickinson, Desch, Atanasoff–Berry, Mauchly / Eckert, nor many of the other pioneers of computing came via Hilbert and FOM problems

I don't think this is a fair comparison. The modern computer is genuinely distinct from everything that came before, because it was built according to the theory of Turing machines.

One of the most important historical papers for the development of modern computers was von Neumann's First Draft of a Report on the EDVAC[1].

Von Neumann took the ideas of others working in the field and applied his understanding of mathematical logic to formulate the principles behind the first working modern computer, which he built in the basement of the IAS. At that time, there was a major debate at the IAS between Einstein and von Neumann over whether the Institute should do only pure mathematics, the objection being that building a computer was part of experimental science. [2]

> Regarding the arrow of influence: a Fields medalist spent a decade coming up with a new foundation of mathematics (or at least algebraic topology), only to realise that the computer science department already teaches Coq to undergraduates!

LOL! That is an interesting and funny story. However, I don't think this example demonstrates that in the future, mathematics will not be the source of improvements to code writing standards.

Question: if code-writing standards improve, where else will those improvements come from other than pure mathematics? I consider this question to be of a similar type to maximum-compression problems, i.e. a question whose solution can only be verified in the language of pure mathematics. It therefore seems likely that such improvements will have their roots in pure mathematics as well. At the least, I would not say it "seems unlikely".

[1] https://en.wikipedia.org/wiki/First_Draft_of_a_Report_on_the...

[2] https://www.amazon.com/Turings-Cathedral-Origins-Digital-Uni...

GregarianChild
Regarding your question, I think mathematics and programming are about the same thing: being precise!

Programming is much more demanding, much more unforgiving in this regard, because you are talking to a stupid machine rather than a smart human. The powers automation gives mathematicians (once you've climbed the mountain of initial unfamiliarity) are so spectacular, and take the average Joe so far beyond what even Gauss, Newton, Grothendieck or Archimedes had at their disposal, that I expect that over the next century mathematics will be rewritten to suit computing, rather than vice versa. Kevin Buzzard's work is one step in this direction; scriptable notebooks like Jupyter or Mathematica are another.

calhoun137
> I expect that mathematics will be rewritten to suit computing, rather than vice versa

I agree with this. I believe pure mathematics is suffering greatly because many mathematicians refuse to fully embrace the computational power of modern technology.

My belief is that the age of pretty formulas is coming to an end, that mathematics will focus more and more on the computational aspects of the subject, and that problem sets in pure math courses will be done using programs far more advanced than anything that exists today, programs that everyone will think no more of than we do of calculators.

Apologies for the self-plug, but this has been my vision with mathinspector[1]. I've been working very hard on it, which is why I got so interested in your statement. Thank you for clarifying your thinking here; it makes sense to me, and you could be right.

[1] https://github.com/MathInspector/MathInspector

GregarianChild
MathInspector is nice. It reminds me of the "Incredible Proof Machine" [1], which I find to be a neat tool for teaching logic.

[1] https://incredible.pm/

calhoun137
Thank you!!! It's so funny that you mentioned "scriptable notebooks like Jupyter or Mathematica" since I have been spending all of my time on mathinspector recently.

I think our points of view are actually very strongly aligned. However I believe the next big idea is likely to come from outside of computer science.

Personally, I am betting on biology. Many of the most sophisticated techniques are based on biology, e.g. neural nets and genetic algorithms. I have done a lot of work on extending the theory of computation with a new axiom that gives Turing machines the ability to self-replicate[1][2].

In many parts of science there is cross-pollination, where new ways of thinking about subject X come from a new discovery in subject Y. Typically, research follows a groupthink pattern until it hits a brick wall, and then you need a really big breakthrough idea. This line of reasoning leads me to conclude that the next big idea is roughly equally likely to come from pure computer science, pure mathematics, or somewhere else.

[1] https://math.stackexchange.com/questions/3605352/what-is-the...

[2] https://medium.com/swlh/self-replicating-computer-programs-8...
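As a concrete (if toy) illustration of program self-replication in ordinary code, here is the classic quine construction in Python. This is just the standard trick, not the axiomatic Turing-machine extension described in the comment above:

```python
# A quine: a program whose output is exactly its own source code.
# The template string s describes the whole program; formatting s
# with the repr of itself reproduces both lines of the source.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints its own two lines of source verbatim, so feeding the output back in reproduces the program again, the fixed-point property that self-replication arguments lean on.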

GregarianChild
It's hard to predict the future.

Alan Turing invented neural nets in a little-known 1948 paper entitled Intelligent Machinery (see [1]). Since then, the use of NNs has moved decisively away from inspiration by nature. I reckon nature's last big win in AI was convolutional NNs: Kunihiko Fukushima's neocognitron was published in 1980 and was inspired by the 1950s work of Hubel and Wiesel [2]. Modern deep learning is largely an exercise in distributed systems: how do you feed stacks and stacks of tensor cores and TPUs with floating-point numbers while minimising data movement (the real bottleneck of all computing)?

Not unlike, I think, how airplanes were originally inspired by birds, but nowadays the two have mostly parted ways, for solid technical reasons.

[1] http://www.alanturing.net/turing_archive/pages/Reference%20A...

[2] https://en.wikipedia.org/wiki/Neocognitron

calhoun137
> take the average Joe so much beyond what even Gauss, Newton, Grothendieck or Archimedes had at their disposal

I think this comment really sums up very well what is at the core of our discussion: the future of mathematics and science.

My strong belief is that thousands of years from now, Archimedes and Gauss will still be remembered, while everything we think is great now will be forgotten. That tells me they were much farther ahead of their time than we are, even though they didn't have modern computers.

Mathematicians and computer scientists both have it totally backwards, imo. On the one hand, mathematicians think they have something to teach us about computer science, yet they refuse to use technology properly. On the other hand, when we write code, it is all governed by mathematical laws, and there are many questions (though maybe not coding standards or the philosophy of writing good code) where we could really use the guiding hand of mathematicians. They need to catch up with the times, and we programmers need to accept that they have something valuable to offer and to teach us.

randomNumber7
> I agree with this. I believe pure mathematics is suffering greatly because many mathematicians refuse to fully embrace the computational power of modern technology.

They can't even give meaningful names to their variables. When showing code to mathematicians, we should always rename all the variables to single characters, then look down on them because they can't understand it ;)

calhoun137
I think the reason mathematicians use single-letter variables is that mathematics is about the way people think about patterns, understand patterns, and the relationships between them. The letter is there to help our brains grasp the connection between abstract concepts (Categories).

As opposed to programming languages, where the goal is to do something practical, in pure mathematics the goal is to create a language capable of helping our brains understand the infinite complexities of nature.

https://math.stackexchange.com/questions/24241/why-do-mathem...
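As a hypothetical side-by-side (my own illustration, not from the linked thread), the quadratic formula reads naturally with one-letter names that mirror the written notation, while fully descriptive names can bury the pattern:

```python
import math

# Mathematician's style: names mirror x = (-b +/- sqrt(b^2 - 4ac)) / 2a
def roots(a, b, c):
    d = math.sqrt(b * b - 4 * a * c)  # square root of the discriminant
    return ((-b + d) / (2 * a), (-b - d) / (2 * a))

# Programmer's style: the same computation with descriptive names
def quadratic_roots(quadratic_coeff, linear_coeff, constant_term):
    discriminant_root = math.sqrt(
        linear_coeff ** 2 - 4 * quadratic_coeff * constant_term
    )
    return (
        (-linear_coeff + discriminant_root) / (2 * quadratic_coeff),
        (-linear_coeff - discriminant_root) / (2 * quadratic_coeff),
    )
```

Both return the same roots; which version is "clearer" depends on whether the reader already carries the formula's notation in their head.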

Most of mine are going to be books by philosophers or scientists (about philosophy or other things).

- The Conquest of Happiness by Bertrand Russell was a pretty good one. He has a lot of ideas that were ahead of their time (positive psychology, etc). You can see a lot of parallels between his ideas and modern Stoicism (although Russell criticized it elsewhere, I think he came to some of the same conclusions).

- Introduction To Mathematical Philosophy by Bertrand Russell. Another Russell one. I think this is probably the clearest and easiest-to-understand explanation I've ever read of the underpinnings of mathematical foundations. It's written in a style that should be accessible to almost anyone with a high school education. He wrote it while he was in prison during WW1 (for his anti-war activism). Apparently he left a copy of it to the prison warden.

- An Enquiry Concerning Human Understanding by David Hume. This is worth reading because it is the motivation for basically all of modern philosophy of science (at least in the west). It's also pretty easy to read and if you read it you'll be able to more easily understand other books and papers that are responses to it.

- Turing's Cathedral by George Dyson. This book should be required reading for every programmer or aspiring programmer IMO. I learned so much about the history of computing that I didn't know before reading this. You will not regret buying this one.

- I Am A Strange Loop by Douglas Hofstadter. Obviously everyone knows about GEB, but he also wrote a shorter follow-up that, in my opinion, expresses his ideas much more clearly. I think that even if you disagree with him, it's worth reading because there are so many things you can take away from this book. For example, he talks about his wife's death and ties that into his theory of mind, explaining the unstated purposes of why we have funerals/wakes for people.

- An Introduction to Information Theory by John R. Pierce. For someone like me who doesn't really have a very strong math background, this was a very clear intro to the ideas behind information theory, and why they're important historically. I would recommend this to anyone who feels like they need a gentle intro to the ideas and motivation for them. Dover mathematics books in general are great.

- Borrow: The American Way of Debt by Louis Hyman. This is a fantastic historical overview of personal credit in the US that covers the past 120 years or so. I learned a ton from reading this that I had no clue about. Recommended to anyone who wants to understand the origins of credit cards / loans, and how society came to embrace being in debt.

https://archive.org/details/in.ernet.dli.2015.222834/page/n7

https://people.umass.edu/klement/imp/imp-ebk.pdf

https://archive.org/details/humeenquiry00humerich/page/n7

https://www.amazon.com/Turings-Cathedral-Origins-Digital-Uni...

https://www.amazon.com/Am-Strange-Loop-Douglas-Hofstadter/dp...

https://www.amazon.com/Introduction-Information-Theory-Symbo...

https://www.amazon.com/Borrow-American-Debt-Louis-Hyman/dp/0...

Gleick's "The Information" and Dyson's "Turing's Cathedral" are two other good pop-sci books on the origins of CS.

https://www.amazon.com/Information-History-Theory-Flood/dp/1...

https://www.amazon.com/Turings-Cathedral-Origins-Digital-Uni...

Turing's Cathedral covers von Neumann's wartime work, the IAS machine and the purpose for which it was built, and many other aspects of his life.

http://www.amazon.com/Turings-Cathedral-Origins-Digital-Univ...?

Seconding Dream Machine. Three more:

The Information: A History, A Theory, A Flood http://www.amazon.com/The-Information-History-Theory-Flood/d...

Turing's Cathedral: The Origins of the Digital Universe http://www.amazon.com/Turings-Cathedral-Origins-Digital-Univ...

Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers http://www.amazon.com/Nine-Algorithms-That-Changed-Future/dp...

HN Books is an independent project and is not operated by Y Combinator or Amazon.com.