HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
Code: The Hidden Language of Computer Hardware and Software

Charles Petzold · 68 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
What do flashlights, the British invasion, black cats, and seesaws have to do with computers? In CODE, they show us the ingenious ways we manipulate language and invent new means of communicating with each other. And through CODE, we see how this ingenuity and our very human compulsion to communicate have driven the technological innovations of the past two centuries. Using everyday objects and familiar language systems such as Braille and Morse code, author Charles Petzold weaves an illuminating narrative for anyone who’s ever wondered about the secret inner life of computers and other smart machines. It’s a cleverly illustrated and eminently comprehensible story—and along the way, you’ll discover you’ve gained a real context for understanding today’s world of PCs, digital media, and the Internet. No matter what your level of technical savvy, CODE will charm you—and perhaps even awaken the technophile within.
HN Books Rankings
  • Ranked #2 this year (2021)
  • Ranked #1 all time

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
I don't think there's an easy answer to this question. Software engineers still don't know how to exactly or even efficiently communicate with each other. It's still an evolving field and process. In general, it is helpful to understand software development as a sub-field of systems theory and design, so any book that discusses systems should help one better understand software development.

In general, I do also echo some of the other comments. If you are helping to design the app, you shouldn't necessarily need to understand the implementation details. In my experience, clients (whether external, internal, or colleagues) getting too involved in what they think the implementation should be is usually a disaster. It puts pressure on the system to conform to their preconceptions, which are not necessarily how it should be, and it adds unnecessary constraints. The real constraints should be what the software should do and the specifications around that, including how the software is intended to be maintained and extended.

Some thoughts on some specific courses and books that I think would be helpful to better understand the goals of software development and design and ways to think about it all:

Programming for Everyone - An Introduction to Visual Programming Languages:

I think this course should be taken by managers, designers, and even software engineers. The primary result is that you'll come out of it knowing state charts, which are an extension to state machines, and this will be very useful for thinking about software and organizing what the software should do. Handling state is one of the primary problems in software, and you might notice that all of the various paradigms (OOP, functional, imperative, actors, etc.) in computer programming relate to the various ways people think about handling state in a computing system.
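
The state-machine idea the course builds on can be sketched in a few lines. This is a hypothetical turnstile example, not from the course itself; state charts extend this flat transition table with hierarchy and parallel states.

```python
# A minimal finite state machine as a transition table.
# States ("locked"/"unlocked") and events ("coin"/"push") are illustrative.
TRANSITIONS = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

def step(state, event):
    """Return the next state for an event, staying put if undefined."""
    return TRANSITIONS.get((state, event), state)

state = "locked"
for event in ["push", "coin", "push"]:
    state = step(state, event)
print(state)  # -> locked
```

Making every state/event pair explicit like this is exactly what helps when reasoning about what the software should do: missing entries in the table are missing requirements.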

How to Code: Simple Data and Complex Data:

These courses are taught by a designer of the Common Lisp language and based upon the excellent book How to Design Programs. They are essentially language-agnostic courses that use Racket to build up design paradigms, teaching you how to sort out your domain problem and designs into data and the functions that operate on that data. The courses are part of a MicroMasters program, so if you really want to get into Java, that is taught in the follow-on courses.

Based upon your last comment, here are some book suggestions on how computers work:

Code: The Hidden Language of Computer Hardware and Software:

The Pattern On The Stone: The Simple Ideas That Make Computers Work:

But How Do It Know? - The Basic Principles of Computers for Everyone:

The Elements of Computing Systems: Building a Modern Computer from First Principles:

I've heard good things about Code by Petzold although I haven't read it myself.

I've read it; it's a phenomenal book to understand more about how computers work on a hardware and conceptual level, but it doesn't really teach programming. Still strongly recommended.
I really loved this book:

Which goes from bits on up without shying away from circuit diagrams. It's also really well written and you can read it from start to finish.

It puts it in a historical context too which makes it fun to read.

Once you read that, there'll be fewer unknown unknowns.

This is a far better idea than "Operating Systems: Three Easy Pieces" as suggested by the OP.
"This is not my hat":

The humour is subversive, the illustration is lovely, and these ("This is not my hat" is another) are great books for younger children. My child loved it, and the people I've given this to have gone on to buy other books by the writer or illustrator.

"Mr Birdsnest and the House Next Door":

Little Gems are a set of books printed on reduced contrast paper, with a large clear font. They're short, simple, but fun. They're good for younger readers or for slightly older reluctant readers. My child enjoyed reading this book, and loved the illustration. The other child I gave this to took out other books in the Little Gems series from the library, and bought other Julia Donaldson books with her pocket money.

"Code: The Hidden Language of Computer Hardware and Software" I had a friend who knew a lot about software and knew some hardware, but their hardware knowledge was a bit patchy. Code helped solidify their knowledge. If I could have afforded it I would have given them The Art of Electronics and the companion Student Manual. (This was in the 1990s. I haven't read the new version and I don't know how well it works today.)

"Bomber Command" I liked this book because it describes how we (the UK) went into world war 2 with ethical notions around not bombing civilian populations and ended up fire-bombing several heavily populated German cities. It's also eye-opening about the scale of this part of the war, and the cost in lives of aircrew.

I would surely read "Code: The Hidden Language of Computer Hardware and Software"
I found the "Hello world from scratch"[1] series from Ben Eater incredibly helpful in connecting the dots between electricity and modern computers. Strictly speaking it is about electronics, but it is superbly presented and incredibly enlightening when coming from a "normal" software engineering perspective.

What actually got me there was the book "Code" by Charles Petzold[2], which traces the development from early circuitry like light bulbs and telegraph wires to modern digital logic. I found that after being introduced to these concepts, learning about the fundamental physics was much more accessible, since it was framed in the context of contemporary applications.



Try "Code: The Hidden Language of Computer Hardware and Software." It provides a very simple introduction to electricity. Beyond that, it's just a great introductory book on computing.

I absolutely love this book, but I'd say it's more an intro to computing (like you said) than electricity. Electricity is in there, but IIRC it doesn't go much further than the "water analogy" style of thinking about electricity.
I recommend Code: The Hidden Language of Computer Hardware and Software by Charles Petzold [1]. It is far more comprehensive than the OP, goes from pre-computer code, to electrical circuits, to an overview of assembly. No prior knowledge needed except how to read.


Code's good but it doesn't cover Kleisli categories and Kleisli composition, Peano arithmetic, parametric polymorphism, sum types, pattern matching, or any of numerous other things covered in Maguire's How These Things Work. So it's not accurate to say Code is "far more comprehensive"; Code mentions FORTRAN, ALGOL, COBOL, PL/1, and BASIC, but the example programs it contains are written in assembly, ALGOL, and BASIC. It doesn't contain any programs you can actually run except for a three-line BASIC program and some even simpler assembly programs.
"The Architecture of Symbolic Computers" by Kogge comes close.
It's very depressing that I've been a developer for 8 years and I've never heard any of those terms you mentioned. I'm self taught but I've always felt like I should go back and really learn the fundamentals.
I don't think you can be much more efficient than learning over a period of 8 years if you maintain so-called work-life balance and are an autodidact. Remember that there are 4-year degrees for this.
I'm not saying you shouldn't always strive to learn new things (for your own personal growth and curiosity), but I think it's important to point out that the link between being a developer and knowing about these things-- esoteric topics of applied Mathematics-- is pretty weak.

Imagine a carpenter spending their time getting a chemistry degree in order to better understand how wood glue works.

Those aren't esoteric topics of applied mathematics if you're programming in Haskell or using formal methods. Moreover, some of them will improve your ability to write working Python. (The others I don't understand. Maybe they will too once I understand them.)
I don't think so. Understanding what goes on underneath the hood is really what differentiates decent coders from great engineers. Compare the data structures of Subversion to those of git. Or look at some of the work John Carmack did in video games. That requires depth.

If your goal is to be a carpenter who puts together housing frames, you absolutely don't need depth. You're also interchangeable and get paid union blue collar wages. On the other hand, if you want to be a craftsman who invents new wooden things, you need depth in some direction, be that structural engineering, artistic, or otherwise.

There's a ceiling you hit unless you learn much more of this stuff. The direction is your choice (but new APIs ain't it -- we're talking depth).

Not everyone needs to be great.

What I actually want to say is that OP shouldn't feel guilty about not knowing those things. It's okay to want to master these things, if it's what you want. But it's pointless to feel bad about not knowing them.

Of course there is no necessity for excellence. The only necessary thing about human life is death; everything else is optional. Before your death, you can cultivate excellence in yourself, or not — many people instead cultivate hatred, addiction, or greed. There are many ways to cultivate excellence; learning is only one of them, and there are many things to learn. Mathematics, and in particular logic (which is what we are talking about here) are the foundation of all objective knowledge, but objective knowledge is not the only kind that has value.

The true philosopher, motivated by love for the truth rather than pride, is so noble in spirit that when she sees evidence that she may be in error, she immediately investigates it rather than turning away; and if she discovers that the evidence is valid, she immediately changes her position. I see such nobility so routinely among mathematicians and logicians that it is noteworthy in the rare cases where it is absent. I see it rarely outside of that field; in some fields, like psychology and theology, I do not see it at all. So I conclude — tentatively — that excellence in mathematics and logic promotes humility and nobility of spirit, which is the highest and most praiseworthy kind of excellence.

So, while I do not think the OP should feel guilty about not knowing those things, I also do not agree with the implication that there is nothing praiseworthy about knowing them.

Well, I agree with you. I think that pursuing our interests in mathematics, music, literature or whatever strikes our fancy is admirable. And I think it makes us happier, wiser and more humble as you say.

At the same time, I maintain that we shouldn't feel guilty if we aren't doing that, for whatever reason. Sure, sometimes we actually want to pursue some of these things, but don't. Maybe it's because we have a messy schedule and can't organize ourselves to prioritize passions.

Feeling guilty does little to actually make you pursue your passions. You're better off learning about habits and how to pick ones that serve you.

Agreed, except that I don't think pursuing our interests in whatever strikes our fancy is admirable.
As long as it's not hurting someone, anything goes as far as I'm concerned.
You are correct, Code does not contain all of the knowledge relevant to computer science. In fact, no book does, as far as I'm aware. But it is far more comprehensive than the OP because it covers a greater breadth of subjects and in greater depth with more accessibility. You're comparing 50 pages of blog posts to 300 pages of book.
I think there's more computer science in those 50 pages, really. It's an easy win.
I'm reading Code right now and it's fantastic. I'm a bit more than 1/2 the way through and so far it's only been about how computers work and not really about computer science.

I heard an expression this weekend that I think is apt - a computer is to computer science as a telescope is to astronomy.

Dijkstra is supposed to have said something like this though the origin is disputed.

"Computer science is no more about computers than astronomy is about telescopes."

"Calling it 'computer science' is like calling surgery 'knife science'."

(Also, "CS could be called the post-Turing decline in the study of formal systems." But I don't know for sure if that was Dijkstra. It's one of my favorite jokes.)

Code is what got me as a teenager interested in tech. It is an awesome book.
This is part of what allowed me to get into programming. The "no prior" knowledge part is absolutely true.

I did start getting lost around the second half of the book.

The amazing thing about Code is how it traces the connection of formal logic (in the Aristotelian sense) to the, as you say, pre-computer code of braille and even flag signals to form the foundations of modern computing.

I am a self-taught developer and probably had 10 years experience in web development when I first read Code. I would have these little moments of revelation where my mind would get ahead of the narrative of the text because I was working backwards from my higher level understanding to Petzold's lower level descriptions. I think of this book fairly often when reading technical documentation or articles.

I recently listened to Jim Keller relate engineering and design to following recipes in cooking [1]. Most people just execute stacks of recipes in their day-to-day life and they can be very good at that and the results of what they make can be very good. But to be an expert at cooking you need to achieve a deeper understanding of what is food and how food works (say, on a physics or thermodynamic level). I am very much a programming recipe executor but reading Code I got to touch some elements of expertise, which was rewarding.

Code: The Hidden Language of Computer Hardware and Software

From binary to a full computer

I constantly found myself having to put the book down, puff my cheeks out and say "whoa." It really blew my mind.
I was inspired to change my career (was originally studying physics) because of this book
This book made me understand pointers. As I read it, I followed along building all the circuits in Logisim [1] from half-adders to latches to multiplexers all the way up to a full CPU.
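
The half-adder-to-CPU progression described above can also be simulated in plain Python, no Logisim required. This is a hypothetical sketch of the first rung of that ladder:

```python
# Half adder: adds two bits, producing a sum bit and a carry bit.
def half_adder(a, b):
    return a ^ b, a & b  # (sum, carry)

# Full adder: chains two half adders to also accept a carry-in.
def full_adder(a, b, cin):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, c1 | c2  # (sum, carry-out)

# Ripple-carry addition of two 4-bit numbers, least significant bit first.
def add4(xs, ys):
    carry, out = 0, []
    for a, b in zip(xs, ys):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry

# 5 + 3 = 8, i.e. [1,0,1,0] + [1,1,0,0] -> [0,0,0,1] with no carry-out
print(add4([1, 0, 1, 0], [1, 1, 0, 0]))
```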

Many will probably recognize the author, Charles Petzold [2], from his Windows programming books.



I would say the Telegraph.

In the book CODE: The Hidden Language of Computer Hardware and Software, Charles Petzold talks about how it's foundational to the eventual invention of the computer.

Back then, it also meant coast to coast communications were almost instantaneous. And soon after, transatlantic cable-enabled telegraph boosted commerce between America and Europe.

I recently enjoyed the chapter in "Code" about Morse code [1]. After some googling around I also found this Morse chat room, which is a good laugh for anyone looking to kill some time struggling to spell profanities in Morse :) [2]
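
For the curious, Morse encoding is a straightforward table lookup. This sketch uses only a hypothetical handful of letters; the real International Morse table covers all letters, digits, and punctuation.

```python
# A small slice of the International Morse Code table.
MORSE = {
    "E": ".", "T": "-", "A": ".-", "N": "-.",
    "S": "...", "O": "---", "H": "....",
}

def encode(text):
    """Encode known letters to Morse, separating letters with spaces."""
    return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

print(encode("SOS"))  # -> ... --- ...
```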



[2] was strange, keyboard straight key is the worst.
If you would like to start at a slightly lower level, Charles Petzold has written a great book on the subject:

Thank you. Going to see if I can get it from my local library.
Using everyday objects and familiar language systems such as Braille and Morse code, author Charles Petzold weaves an illuminating narrative for anyone who’s ever wondered about the secret inner life of computers and other smart machines.

... cleverly illustrated and eminently comprehensible story ... No matter what your level of technical savvy, CODE will charm you—and perhaps even awaken the technophile within.

Code doesn’t sound “slightly lower level”.

Code goes into how logic gates like the NAND gate are built, so it does start at a lower level than Nand to Tetris, which starts with NAND gates as primitives.
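
NAND's role as a primitive comes from its functional completeness: every other gate can be derived from it. A quick sketch (illustrative, not taken from either book):

```python
# NAND is functionally complete: NOT, AND, and OR all fall out of it.
def nand(a, b):
    return 1 - (a & b)

def not_(a):          # NOT x == x NAND x
    return nand(a, a)

def and_(a, b):       # AND == NOT(NAND)
    return not_(nand(a, b))

def or_(a, b):        # x OR y == (NOT x) NAND (NOT y)
    return nand(not_(a), not_(b))

# Verify against Python's built-in bit operators over all inputs.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
print("all gates check out")
```
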
I think you and the GP are using "lower level" in different ways. GP means with a lower barrier to entry, you mean "closer to the electrons" if I understand both of you correctly.
> I'm not really sure such a thing exists (or can exist). Verilog isn't "programming" in any way most developers are familiar with, it's a way of describing how logic gates are interconnected.

I've just finished reading Code, which builds up a RAM array and 8-bit CPU starting from relays. I'm familiar with the concepts. I'm just looking for something that explains how to express these concepts using Verilog.

No disrespect to Petzold intended (I'm a huge fan of his technical works) but "Code" is at best at a pop-sci level. It prepares one for doing digital logic as well as "A Brief History Of Time" prepares one for doing coursework in cosmology.

Try the Brown book or Mano's "Digital Design" (my preference; took a while to remember his name) and see if it works for you. Good luck.

What three books? (Especially interested in data transfer protocols - I got a little bit into it for game development, but never found anything resembling a good source).

I learned a lot of these things from taking Harvard's CS50 online and reading the blog posts I linked (and doing problems), and I would imagine this book would help out - a friend recommended it, but I never got around to it.

^Covers logic gates and the basics of how a computer is built to an incredible degree, but it's VERY time intensive. Still, I really recommend it.

These days CS50 lectures are online. The TCP/IP and Linux books I can't recall. That was some 20+ years ago. I think the TCP/IP/Protocols book was from Sams and the Linux book(s) were from O'Reilly. Algorithms/Data was The Algorithm Design Manual by Skiena. Cracking the Coding Interview is great too. Crypto was just a bunch of reading online, I haven't found a good book on that I would recommend, same for data transfer protocols. I didn't just read three books, I'm saying, 20/20 hindsight I could have read about 3 books to garner most of the stuff I didn't learn by skipping college.

Maybe some of us self taught developers should get together and curate a clear path for CS knowledge and ancillary helpful things.

"Maybe some of us self taught developers should get together and curate a clear path for CS knowledge and ancillary helpful things."

I wouldn't mind spending some time contributing to something like that. There is an endless amount of material to read and classes to take, but narrowing it down and prioritizing is the hard part.

This might seem like a roundabout way to start, but I'd recommend Code by Charles Petzold[0].

It starts out with just wires, switches, and relays, and as the book progresses he goes through the process of building up a simple CPU and RAM one step at a time. The book even walks you through coming up with opcodes and assembly language.

Even if you already know this stuff, I found the book was helpful in developing an intuitive feel for how everything works and fits together.

After reading it, you'd probably have a good mental model of how you'd want to approach writing an emulator.
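
As a hint of what such an emulator looks like, here is a toy fetch-decode-execute loop. The three opcodes are invented for illustration; Petzold's book and real instruction sets are richer, but the shape is the same.

```python
# A toy fetch-decode-execute loop with three made-up opcodes.
LOAD, ADD, HALT = 0, 1, 2

def run(program):
    acc, pc = 0, 0                   # accumulator and program counter
    while True:
        op, arg = program[pc]        # fetch
        pc += 1
        if op == LOAD:               # decode + execute
            acc = arg
        elif op == ADD:
            acc += arg
        elif op == HALT:
            return acc

print(run([(LOAD, 2), (ADD, 3), (HALT, 0)]))  # -> 5
```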

The Nand to Tetris courses and their accompanying textbook would probably be helpful here too[1][2].

[0] [1] [2]

Chapter 22 of "Code: The Hidden Language of Computer Hardware and Software" uses CP/M to illustrate the inner workings of operating systems. I highly recommend this book overall:
The errata page: which is pretty small. It seems to already be getting traffic (HN hug?) so you might need to reload it a couple times.
Given that I've just found a PDF copy of this book on the internet archive and it hasn't been taken down from there, I'm going to assume the following is okay. If there are any issues, I'm happy to delete this, or a mod can. Hopefully it's all good.

- PDF (9.2MB):

- The seemingly-legit-looking account of the uploader who put it on

- The front page for the item on

- The directory the item is in:


I also found a 176MB scanned copy of the book as pure images:

Teaching CS is hard but I think the root of it is our ability to teach people how to think. Learning to program is easy compared to the problem of learning how to think algorithmically.

IMHO this is because a lot of CS courses start at a very high level with very abstract concepts, which might also be because they can't afford to start with the very basics of "how does a computer work" due to the limited time available.

On the other hand, I think CS should always start with a book like this one, which truly does start at the basics:

A large part of why beginners fail is that they expect too much of the machine and don't comprehend how it works. I believe that once they see how this complexity is actually the result of a large number of very, very simple operations, they will have a better understanding of what it means to instruct the machine.

I'd recommend Petzold's too.
But software is so pervasive that within a generation or two, not understanding how it works will put you at a severe disadvantage.

Unfortunately the corporations seem determined to put a stop to that sort of pervasive knowledge, if only for the purpose of controlling and monetising their users. They don't want people to know how easy it is to do things like strip DRM or remove artificial limitations from software. [See Stallman's famous story, or the Cory Doctorow articles on the demise of general-purpose computing.]

And thus most of the "learn to code" efforts I've seen have seemed to focus on some ultra-high-level language, removed from reality and often sandboxed, so that while people do learn the very-high-level concepts of how computers work, they are no less unclear about the functioning of the actual systems they use daily --- or how to use them fully to their advantage. In some ways, they're trying to teach how to write programs without understanding existing ones.

The document said children should be writing programs and understand how computers store information by their final years of primary or intermediate school.

However, this sounds more promising. Especially if they're starting more along the lines of this book:

Actually no, the nature of "learn to code" efforts has nothing to do with some corporations' monetization schemes.
I'm extremely grateful and was not at all expecting such an explanation.

I want to explain a few things.

Let me rephrase what I meant by "minimize the time wasting". You see, there is a lot of great advice available online. You ask something on a subreddit or here, and people will share great resources. I love this kind of learning. My concern is that sometimes these resources and advice are given along the lines of "although it's not completely necessary, it'll still be an experience in itself".

The problem here is that this kind of learning sometimes wastes too much time and leaves you with confusion. People ask so many questions on CompSci daily, and you'll find books ranging from the complete basics of computing, like Code or the Nand2Tetris course, to something very sophisticated like AI. I hope you can understand that if a person spends too much time on these kinds of things, given that he's got a job or is a student in a university with a sweet CompSci curriculum (you know what I mean), then it's a problem. Although the above-mentioned resources are exceptional, there are others too which teach the same thing. Can a person read all of them one by one "just to satisfy his curiosity, thinking that it'll help him in the future"?

RE is already an extremely sophisticated and vast field which requires computer mastery. I'm in college and it has made me hate things I loved. I'm an extremely curious guy and can spend 10-20 hours in front of a PC easily. I've ~6 years of experience with Linux. Now I'm literally not in a state to read 2-3 400-800 page books on a single topic which I don't even know would be required in RE. There are some topics which are quite difficult, but at least if we have an idea that a topic IS mandatory for RE, then we can be sure and refer to other resources. If you don't even know what your syllabus is, how can you concentrate on it and master it, let alone learn it? RE requires you to study every minute detail of a computer system, but wasting too much time on those horrible digital logics and designs is really not worth it.

So my purpose is to make it completely clear what I actually need to know so that I can focus on it, instead of reading each and every topic in complete detail thinking that if I miss the direction of even a single electron in I/O I won't be able to do efficient reversing. I'm literally fed up with those architecture diagrams with arrows, and with cramming those definitions of ROM, EEROM, EEPROM.............. again and again for tests and assignments.

I've few questions for you:

You mentioned Computer Organization and Design, which I think is authored by Patterson and Hennessy and is used by almost all universities. I'm just curious about its not-so-good-looking Amazon reviews. Also, what's your opinion on Tanenbaum's books, which you mentioned in that Reddit link?

Now let's summarize what I've understood (PLEASE help me correct if I'm wrong)

>>>> UNDERSTANDING the system you want to hack

> Learn the most used fundamental programming languages (the way we TALK with computers): 1. C (also C++ in some cases) 2. Python or Ruby (given their dominance in industry right now thanks to their productive nature; also used in exploit writing) 3. Java or C# (object-oriented programming, which along with the above languages completes our programming fundamentals) 4. Assembly (obviously needed in RE). I think it need not be mentioned that we need a good grasp of Data Structures and Algorithms with the above languages (obviously not all).

> Understand each and every data flow and HOW a computer system work

Computer Organization and Design and Architecture

(OS fundamentals, memory management, virtual memory, paging, caching etc, Linux(macOS too) and Windows internals part I think comes here)

You restored my faith in humanity when you said I can skip the hardware and microcode part (please explain what specific topics, I swear I won't look at them again until I'm done with required topics.)

> Network Fundamentals and Programming Basics of http, TCP/IP and other protocols.... Socket programming


> Learning WHAT loopholes are there in this above process of data read write Types of attacks (buffer overflows, heap overflows....)

> HOW those loopholes are exploited

>Reverse Engineering (Learning tools of trade: IDA, gdb.....) learning and practising reversing. Fuzzing

>Exploiting the bugs making exploits.

Please review and correct. Thanks again.

EDIT: I don't know why the edit is not updating.

"Basics of http" and "making exploits" are from next line. Thanks for bearing with me. ;)

Shameless self-promotion. I have a YouTube channel where I basically try to offer a path for learning exploitation. I'm done covering all the basics, and we will soon move to more advanced stuff. I have videos on various different security topics, but here is the probably more relevant playlist:
I know your channel very well. Its praised everywhere because of such good content. I will be happy if you go through my main concern in the details and read the above discussion. Thanks again for such a wonderful channel. I'll surely learn from it when I'll cover the prereqs to understand what you're saying in those videos.
> I want to understand what are the ACTUALLY NECESSARY topics required and in RIGHT ORDER to MINIMIZE the TIME WASTING and wandering in between topics so that the knowledge aqcuired is more practical in context of current vulnerabilities rather than being more theoretical.

To be honest with you? I consider that sentence almost offensive. I hear you, but I think you have absolutely the wrong expectations. You want to learn something that is not a profession like plumbing, where a really good expert can teach you everything you need to know with all the little tricks learned over the years. The field is so huge, diverse, and complicated that this won't work. And I think my playlist offers a rough outline that you can follow, but without going down rabbit holes left and right, and getting stuck many many times, you won't become good at it.

I understand the frustration that you don't want to "waste time" and that you are busy already. But everybody I know who is good in this field, including my own experience shows me, that nobody learns this stuff through a straight path. And everybody knows that most of the time will be spent chasing rabbits through a labyrinth and getting stuck.

Also there is no clear path. It's a complicated web you have to learn to traverse. For example, take "Learn C" - what the f* does that even mean? To what extent? Hello World? Drivers? Or an operating system? "Learn assembler" - which assembler? Have you looked into the Intel instruction spec once? I doubt any human knows every instruction. Also, who said that Intel is the way to go - why not ARM or AVR? All of these fields offer a lifetime of studying in themselves.

The "art" in becoming good at security and RE is to get a broad knowledge of a lot of things and try to simultaneously go deeper 'n deeper in all of them. And if you are interested in a specific field, put more weight on those topics.

You know how long it takes to reverse engineer something? People stare at IDA for weeks or months at a time. You can't learn RE just by reading a book or a blog. You've got to just start doing it, and hopefully find a few blogs and people to keep up the spirit.

Why is it that K&R is referred to as the greatest book on C but is never recommended to a complete beginner, only treated as a reference book?

Why is it that several resources exist on buffer overflows, yet we ask questions about which one is better?

Why is it that you started your channel even though resources like Art of exploitation and Shellcode Handbook already exist?

Why is it that there are people asking questions like "computer science books you wish you had read earlier"?

Is the one who is asking, or the one who is answering, demanding or offering a short trick to become a super h4x0r?

Internet forums exist for a reason. It is always wise to take the advice of someone more experienced than you. I don't see anything wrong with that.

The people who are on top are there for a reason. The root of hacking lies in outsmarting a coder by exploiting the mistakes in his code. Now even a field like this has become a corporate profession.

But there's something that differentiates a hacker from the rest. I think learning from somebody else's mistakes is one of the smartest things you can do.

I still haven't read anything better than Code by Charles Petzold [1] and it's not even close.


It is a fantastic book. It doesn't take you into typical algorithms (at least that I recall); rather, it explains as intuitively as possible how a computer is built up from flip-flops and binary logic to assembly, intermediate languages, and on to full-on compilation of a usable language.

Basically, beginner programmers can acquire a broad understanding of the foundation their programs are built on by reading this book. It reads more like a non-fiction exposé than a programming tutorial, which is to say, given its subject, it's an easy read you can do on the couch. Depending on your skill and knowledge level, there may be a few sections you have to re-read several times until you understand them, but you won't feel as though you need to go over to your computer chair and try something to fully grasp it.

If you can do basic arithmetic, you can get through this book. That seems to be its hidden premise: that computers are easy and should be easy to understand. This book is a testament to that, though I'm sure some will find it doesn't go deep enough.

But the point is: learning this generally will create many entry points for you to follow up on in your journey into programming and computer science. It will clear up many things and make the path seem less scary and out of reach. This book achieves that really well. High-level programmers will come away feeling far less insecure about their lack of knowledge of the underpinnings of whatever it is they are developing. I know I did.

I can't say enough about this book. It's the real deal. I'm sure those with a computer science degree might have more to say (they likely think it's a cursory overview), but for everyone else it's a computer science degree in a book you can read in one or two weeks. At least half the degree. For the second half, I recommend Algorithms in a Nutshell. And done! Go back to programming your high-level JavaScript React app and get on with your life.

On a side note: it's my opinion that theory first is the wrong way. Application first, theory as needed, is the right approach. Otherwise it's like learning music theory before you know whether you even like to play music. You might not even like being a programmer, or be a natural at it, and if you spend 4 years studying theory first, you will have spent a lot of time discovering what you could have discovered in about a month. It can also suck the joy and fun out of exploring programming and computer science. It's natural and fun to learn as you dive into real problems, and everything you could want to learn is on the internet. It's very rewarding, and often faster, to learn things in order to attain a specific goal; the theory you do learn makes much more sense in the face of something you are trying to apply it to. In short, over your computing career you can learn the same material far faster and far more enjoyably if you pair it with actual problems.

That said, sometimes you do have to step back and allocate time for fundamentals, even when you have no specific problem they relate to. You will know when it's time to brush up on algorithms, or to finally learn how the computer works below your day-to-day level of abstraction. Just know that a larger and larger share of us programmers went the applied route rather than the theory-first, formal-education route; it's probably the majority at this point. In short, you are not alone in learning this as you go. Learn to enjoy that early on and save yourself the pain of feeling insecure about not knowing everything. This is an exploration and an investigation, and perhaps you will make discoveries nobody else has, long before you have mastered everything there is to know about the computer. Perhaps that's the biggest selling point: you don't have to know everything before you can contribute to the computing world! So enjoy your pursuits in programming, knowing that in your unique exploration you may come up with something highly novel and valuable at any time.

Is this book relevant only to beginners? Do you think a programmer with about 7 years of experience (in C, C++, C#, Java) will find it useful?
Absolutely - the book manages to stay fairly general while going into just enough depth not to bewilder you.
Yes! I read it about 5 years after finishing my degree (which already covered many of the topics in depth), and it was very enjoyable. It gives a very good, succinct (if simplified) overview of computer architecture.
It's a really short and interesting read. It's difficult to think anyone could read it and find it to be a waste of time.
Thank you for your inspiring words!
I don't think theory first is wrong by itself.

I think that everyone will get new information better when it is something that fixes an immediate issue or clarifies an immediate doubt.

But this is not a contradiction. Theory can be presented in such a way that you want and need to know the next piece of information, the way mystery novels work.

It can be easier for the writer to create this need with examples instead of narrative, no doubt about it.

But let's not fall in the opposite direction of having only examples and no theory, so common with blogs now. I feel empty when I read such materials.

After quitting a job with a long commute, and having failed to monetize a side project, I finally landed a teaching position at a local technical university.

I always loved learning and teaching, and a side effect of this is that now I've regained the curiosity I always had about the fundamentals of our industry (I've a CS PhD). So now I'm back reading about the fundamentals of electricity and building 8-bit digital adders with basic AND/OR/XOR logic gates [0].
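For anyone curious, the gate-level adder mentioned above is small enough to sketch in a few lines. This is my own minimal illustration (not from the linked resource): an 8-bit ripple-carry adder built from nothing but AND, OR, and XOR, the same construction Petzold walks through.

```python
def full_adder(a, b, carry_in):
    """One-bit full adder using only AND, OR, XOR gates."""
    s = a ^ b ^ carry_in                         # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
    return s, carry_out

def add8(x, y):
    """Add two 8-bit numbers by chaining eight full adders."""
    carry, result = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result  # the final carry is discarded, so sums wrap modulo 256

print(add8(100, 55))   # 155
print(add8(200, 100))  # 44 (300 wraps around modulo 256)
```

The wrap-around at 256 falls straight out of the circuit: there is simply no ninth wire to carry into.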

There are still lots of fundamentals I want to re-learn, and for 2017 I'm thinking of writing a book about learning programming from exercises (with just enough theoretical concepts), starting from flow charts and pseudocode and working up to some basic algorithms and abstract data structures/types (probably using Python). My idea is that there are lots of students out there who could benefit from learning to program by solving focused exercises, and who could learn enough about algorithms and structures to feel capable of doing more complex things (i.e., not feel the "impostor" syndrome).

[0] -

I agree, the book isn't very "bottom-up" at all, perhaps with the exception of "Binary and Number Representation" being the second chapter; the rest of it looks like OS stuff.

This is what I'd consider "bottom up":

My choice for bottom-up book: The Elements of Computing Systems: Building a Modern Computer from First Principles
This is also known as the "nand2tetris" course:

Great book!

I have a physics background but not an EE background. I found verilog pretty easy to grasp. VHDL took me a lot longer.

To get some basic ideas, I always recommend the book Code by Charles Petzold:

It walks you through everything from the transistor to the operating system.

(Apparently I need to add that I work for AWS on every message so yes I work for AWS)

Thank you
I always recommend this book for those who would like to know how computers really work:

While I like Petzold's "Code", it's really aimed at nontechnical audiences. Jon Stokes' "Inside the Machine" or Nisan and Schocken's textbook "The Elements of Computing Systems" are far better if you have a technical background.
I second the Stokes recommendation. I have very little hardware background, but quite a bit of science and tech (engineering). He provided the right amount of background and introduction for me.
Code by Charles Petzold would be the best in my opinion.


+1. Great gift for non-technical friends/relatives. It does reach a technical depth on computer organization and such, but with a nice gradual intro. If OP's focus is computer science history, though, there may be better options (I don't know which ones; Code shares some historical detail, but not much). It should get your niece interested in computer science, though.
I don't think we're near that many layers of abstraction (yet --- and hopefully never?), but certainly more than ten. Indeed it's all about switching binary signals eventually. There's a great book about that too:

Well, let's see how many layers of abstraction the Alto has. 1) Software (a BCPL program). 2) Library / OS code. 3) Machine code. 4) Microcode. 5) Logic signals (TTL chips). 6) Analog signals (e.g. rise time and voltage of the logic signals). 7) Transistors inside the chips. 8) Quantum mechanics. (The last two levels are hidden to us.) If you're using Smalltalk, you could add another layer for the interpreter.
Funny, I was thinking of Petzold's book when I wrote that. +1 for mentioning it!

I still say there are that many layers of abstraction, maybe more. Individual bits, ganged into words, with logic to treat them as such, and the ability to store them and move them along a bus. Then you add the ability to act on data: control logic, arithmetic units, caches, and the interpretation of data as instructions - microcode running in the CPU, and processor instructions. Now move up to main memory, storage subsystems, networking, graphics (GPUs!), etc. That gets you through the hardware. Now add software: BIOS, operating system, drivers, user-space code, compilers, interpreters; it just goes on and on. And for every one of these giant groups, there are so many logical groupings and abstractions.

It's like individual cells in a body, forming organelles, forming organs, forming creatures. So many layers of organization. Amazing beauty.

But always at the bottom of the pile, there is a switch.

> "Finally, an honorable mention to three papers that don't qualify, but which I think you should read anyway."

If we're going for papers, then I'm guessing books are allowed too. If so, for anyone interested in giving themselves a grounding in the fundamentals, it's worth checking out Code by Charles Petzold. I've been going through it, it's excellently written, and has helped me fill in gaps in my understanding of how computers work.

SICP, Land of Lisp, Exploding the Phone, and The Cuckoo's Egg, while I haven't finished all of them, were all instrumental in making me who I am today.
Haven't read the book you're recommending, but I feel it's more or less close to Code[0] by Charles Petzold, which in itself is a fascinating read.


I read both books a long time ago. My recollection is that Code spends more time building context and explaining why something works as it does. "Elements of Computing Systems" gives more detail on how to implement many of the same concepts in a simple way. Both are great books.
The two are not at all similar. Petzold's "Code" is good but is aimed at non-technical readers while "The Elements of Computing Systems" is more or less a textbook that encapsulates a longitudinal slice of a 4-year computer engineering program, complete with exercises. It's really quite impressive in what it manages to cover (although the massive amount of material glossed over or omitted does make me wince).
This is the sort of thread that hits me right in the wallet.

Here are some books I've given as gifts recently:

* The Knowledge: How to Rebuild Civilization in the Aftermath of a Cataclysm, Lewis Dartnell[1]

* The Black Swan, Nassim Taleb[2]

* Siddhartha, Hermann Hesse[3]

* The Happiness Trap, Russ Harris and Steven Hayes[4]

* Code, Charles Petzold[5]






The Black Swan is on my list too.
Is it really that good?

I've started it a few times. Nassim Nicholas Taleb seems to make sure never to use one word when ten could possibly be used, especially if some of them are about himself.

No, it is not. The idea behind the book is as sound as it is simple: shoehorning normal distributions into places where they shouldn't go, just to make problems tractable, will end in disaster due to an excess of fat tails in the real world.

However, Taleb has been pontificating on that single idea for fifteen years now and has parlayed twenty pages worth of ideas into three books, a collapsed hedge fund and numerous academic positions.

Skim the first three chapters of any of his three books, and you will have learnt all there is to learn from him.

While I can't disagree too much with my sibling comments, I do believe that the shift in mental model is worth the criticisms.

It's a shame his style, wordiness and pretension sometimes gets in the way of communicating a really significant and fundamental concept that I believe everyone should incorporate into their world view.

I thought fooled by randomness was much better. Higher information density.
Yep, I haven't been able to finish the book either, and what I've read didn't stand up to all the hype.

Taleb's Antifragile I did, unfortunately, finish, and it's way, way worse.

Now that I think about it, both books have a similar pattern: the first dozen or so pages present an interesting idea, which does give you a fresh and useful mental model in understanding the world. The rest of the book, unfortunately, meanders off into superficial redundant applications of it and pounding into the reader's head how anti-establishment Taleb is.

Have you read a book called Code by Charles Petzold [0], by chance? It walks you through how a computer works, from low-level relays all the way up to building a processor and running an OS. He talks about both the Z80 and the 6502, and peppers in a lot of history along the way. The book has me really itching to play around with one of the two, and it sounds like the Z80 might be the way to go.


I started it a few weeks ago, yeah. Definitely looking forward to churning through all of it.
I'm a programmer without a computer science degree and I'm quite aware that CS is a bit of a blind spot for me so I've tried to read up to rectify this a little.

I found The New Turing Omnibus[1] to give a really nice overview of a bunch of topics, some chapters were a lot harder to follow than others but I got a lot from it.

Code by Charles Petzold[2] is a book I recommend to anyone who stays still long enough; it's a brilliant explanation of how computers work.

Structure and Interpretation of Computer Programs (SICP)[3] comes up all the time when this kind of question is asked and for good reason; it's definitely my favourite CS/programming book, and it's available for free online[4].

I'm still a long way off having the kind of education someone with a CS degree would have but those are my recommendations. I'd love to hear the views of someone more knowledgable.

[1] [2] [3] [4]

> no one has time to learn everything

Nonsense. The fundamentals don't take a long time to learn. And once you know them, everything else becomes much easier to learn. That's the reason that learning the fundamentals matters: it's a huge lever.

Here's a single, small, very accessible book that takes you all the way from switches to CPUs:

SICP gets you from CPUs to Scheme (well, it goes in the other direction, but the end result is the same). That's two books to get from switches to compilers. Anyone who thinks they don't have time for that needs to learn to manage their time better.

> SICP gets you from CPUs to Scheme

I don't recall anything about CPUs in SICP. It's more about data-driven programming and the writing of interpreters.

What I liked about SICP and Scheme programming was that it's a pretty good environment for tinkering - the REPL makes it easy to combine functions and to work in a bottom-up manner. (BTW, you had less of that in Common Lisp, and in most other environments, which teach you to work in a top-down way; however, you can still work with the Python REPL.)

Maybe this bottom-up approach is what Sussman really has in mind when he is talking about first principles, because SICP is really not about working up from the atoms to the scheme interpreter/compiler.

Sorry, why do you feel there is less of that in Common Lisp? Surely the REPLs offer equal facilities...
> I don't recall anything about CPUs in SICP.

yes, the compiler target. I stand corrected.
I bought this book for my son who will be starting a CS program in the fall. He seems to have enjoyed it, and I'm hopeful it will give him a good grasp of the fundamentals that you might miss by starting out with Java.
Yes, learning the fundamentals is a huge lever. I absolutely agree. But I still stand by the assertion that "no one has time to learn everything" - especially at the beginning of their career.

As the old saying goes "if I had 3 days to cut down a tree, I'd spend the first 2.5 days sharpening my axe". Sure, but at some point you have to actually start chopping. By analogy, at some point you have to quit worrying about fundamentals and learn the "stuff" you need to actually build systems in the real world.

By and large I'm in favor of spending a lot of time on fundamentals, and being able to reason things out from first principles. And when I was younger, I thought that was enough. But the longer I do this stuff, and the larger the field grows, the more I have to concede that, for some people, some of the time, it's a smart tradeoff to spend more of their time on the "get stuff done" stuff.

>> "no one has time to learn everything" - especially at the beginning of their career.

I wish I had this book at the beginning of my career. Makes you design the hardware, then the software for that hardware.

Should not take more than 8-12 weeks alongside school work or a day job.

"no one has time to learn everything"

right. No one has time to learn endless js frameworks, injections and reinventions of the wheel. So (1) read the fucking SICP book, (2) learn about the business problem you are trying to solve, put 1 and 2 together and "get stuff done".

This is what co-op programs address. 5 year degree, learn all the fundamentals from silicon to applications, with 6 co-op placements of 4 months each interspersed throughout. That was the Waterloo formula when I went through their CS program and it works tremendously well. Sure, it's a lot to learn in 5 years, and there's always more to learn, but it does give you a very solid foundation to build on.
especially at the beginning of their career.

That's why you spend four years in a university before you start your career. If you're not going to learn all the fundamentals, you might as well go to an 8 week code camp and save your time.

But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful. So what do we do, have people do a 4 year degree, and then go spend 8 weeks, or 16 weeks, or a year, learning to actually build systems?

I don't know, maybe that is the answer. But I suspect the MIT guys have a point in making some small concessions in favor of pragmatism. Of course, one acknowledges that college isn't meant to be trade school... Hmmm... maybe there is no perfect answer.

> But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful

A minor point to make here: college isn't about learning practical skills; that's what trade schools and on-the-job training are for. College is about learning the fundamentals. However, computer science is not software engineering, and you don't need to learn computer science to be a good software engineer, because software engineers don't do much in the way of computer science anyway.

A minor point to make here, college isn't about learning practical skills;

Yes, on paper that is true. And I alluded to that point in my answer above (or at least I meant to). But from a more pragmatic and real world point of view, colleges are at least partially about teaching practical skills. I don't know the exact percentage, but I'm pretty sure a large percentage of college students go in expecting to learn something that will pointedly enable them to get a job. Of course one can argue whether that's a good idea or not, and I don't really have a position on that. But I don't think we can ignore the phenomenon.

> even 4 whole years isn't enough to learn "all the fundamentals"

No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental." It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn.

" It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn."

Thanks, you succeeded in succinctly verbalizing the agony of so many software projects. However, I would claim it's not just ignorance of the fundamentals. It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems. I call it cargo-cult progress.

> It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems.

Spolsky calls these 'architecture astronauts'. (As a mathematician, I have a certain leaning toward this mindset, so I clarify that I am not criticising it, just reporting on existing terminology.)

It is absolutely essential to have a theory of the system while implementing it. The software system itself, however, should exhibit only a simple subset of the whole theory, in as concise a form as possible - usually the full theory becomes obvious only while writing the software, and in practice one uses only a tiny part of it for the implementation at hand.

I think one facet of this is that people have an ambition to build huge systems that encompass the entire universe, while in fact most software systems only need to utilize the already existing capabilities of the ecosystem.

It's as if, since people are eager tinkerers, they approach software projects with the mindset of miniature-railroad engineers - while in fact the proper way to attack the task is to be as brutally simple as possible.

The reason huge theoretical systems are a problem in software engineering is that a) they tend to be isomorphic to things that already exist, b) while not implementing anything new, and c) they obfuscate the software system through the introduction of system-specific interfaces (e.g. they invent a new notation for algebra that is isomorphic in all aspects to the well-known one). And the worst sin is that this method destroys profitability and value.


How could they ever understand "simple and easy"? Their concept of simple is not based in reality.

There seems to be this idea (I wish it had a name) that one can just apply a new abstraction layer and the lower layers magically disappear or are rendered insignificant.

And of course producing 100's or 1000's of pages of "documentation". This is disrespectful toward the reader's time. The best documentation is and will always be the source code. If I cannot read that, then for my purposes there is no documentation.

This is not to say that some of these old-new-thing, higher-level-of-abstraction solutions don't work well, aren't easy to use, or don't look great. Many people love them, I'm sure. But until I see more common sense being applied, it's just not worth my time to learn them. I would rather focus on more basic skills that I know will always be useful.

No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental."

I think what we actually disagree about is just the exact definition of "fundamentals". I personally would not say "the fundamentals are very simple and easy to learn". At least not if you're learning them to the level of depth that I would want to learn them.

Now if we're just saying that people only need a very high level overview of "the fundamentals", then that might swing the equation back the other way. But that, to me, is exactly the knob that the MIT guys were twiddling... moving away from SICP and using Python doesn't mean they aren't still teaching fundamentals, it just sounds like they aren't going quite as deep.

Anyway, it's an analog continuum, not a binary / discrete thing. We (the collective we) will probably always be arguing about exactly where on that scale to be.

> I think what we actually disagree about is just the exact definition of "fundamentals".

That may well be, but as I am the one who first introduced the word into this conversation, my definition is the one that applies here.

Fair enough... from that point of view, I think we agree more than we disagree.

And please don't take anything I'm saying here as an argument against learning fundamentals.. all I'm really saying is that I can understand, even appreciate, the decision MIT made. Whether or not I entirely agree with it is a slightly different issue. But I will say that I don't think they should be excoriated over the decision.

I disagree. The field has exploded. It's becoming more and more difficult to take vertical slices of every sub-field. What should we consider fundamental?

Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating system, parallel and distributed programming, natural language processing, web design, network architecture, databases.

There are plenty of core fields that I've missed. Which are fundamental? Which do we teach? It simply isn't possible to adequately teach everyone the core fundamentals of all of these fields during the course of an undergraduate degree while also conveying the basic design fundamentals that a software developer needs to know. There is just too much in the field to expect every software developer to have a complete understanding of all of the fundamentals in every sub-field. Our field is getting a lot wider and a lot deeper, and with that, people's expertise is forced to narrow and deepen.

> What should we consider fundamental?

A fair question, and a full answer would be too long for a comment (though it would fit in a blog post, which I'll go ahead and write now since this seems to be an issue). But I'll take a whack at the TL;DR version here.

AI, ML, NLP, and web design are application areas, not fundamentals. (You didn't list computer graphics, computer vision, robotics, embedded systems - all applications, not fundamentals.)

You can cover all the set theory and graph theory you need in a day. Most people get this in high school. The stuff you need is just not that complicated. You can safely skip category theory.

What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically. This is the part that the vast majority of people are missing nowadays, and it can be a little tricky to wrap your brain around at first. You need to understand what a fixed point is and why it matters.
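To make the fixed-point remark concrete, here's a minimal sketch (my example, not the commenter's): a fixed point of f is an x where f(x) == x, and a classic SICP-style illustration is computing square roots by iterating toward one.

```python
def fixed_point(f, guess, tol=1e-10):
    """Iterate f from guess until successive values stop changing."""
    while True:
        nxt = f(guess)
        if abs(nxt - guess) < tol:
            return nxt
        guess = nxt

# sqrt(2) is a fixed point of x -> (x + 2/x) / 2 (Newton's method).
sqrt2 = fixed_point(lambda x: (x + 2 / x) / 2, 1.0)
print(round(sqrt2, 6))  # 1.414214
```

The same idea, applied to programs rather than numbers, is what makes recursive definitions mathematically respectable - which is the point of the remark above.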

You need automata theory, but again, the basics are really not that complicated. You need to know about Turing-completeness, and that in addition to Turing machines there are PDAs and FSAs. You need to know that TMs can do things that PDAs can't (like parse context-sensitive grammars), that PDAs can do things that FSAs can't (like parse context-free grammars), and that FSAs can parse regular expressions - and that's all they can do.
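As a hedged illustration of that hierarchy (my own example): balanced parentheses form a context-free language. A pushdown automaton - here with its stack collapsed to a simple counter - recognizes it, while no finite-state machine can, since matching requires unbounded memory. That's also the standard reason regexes can't parse arbitrarily nested HTML.

```python
def balanced(s):
    """PDA-style recognizer for balanced parentheses."""
    depth = 0  # the PDA's stack, specialized to one symbol
    for ch in s:
        if ch == '(':
            depth += 1
        elif ch == ')':
            depth -= 1
            if depth < 0:   # a ')' with no matching '('
                return False
    return depth == 0

print(balanced("(()())"))  # True
print(balanced("(()"))     # False
```

An FSA has only finitely many states, so for any FSA there is some nesting depth it can no longer distinguish - the pumping-lemma argument in miniature.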

You need some programming language theory. You need to know what a binding is, and that there are two types of bindings that matter: lexical and dynamic. You need to know what an environment is (a mapping between names and bindings) and how environments are chained. You need to know how evaluation and compilation are related, and the role that environments play in both processes. You need to know the difference between static and dynamic typing. You need to know how to compile a high-level language down to an RTL.
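A small sketch of the lexical-binding and environment-chaining point (my example, using Python closures rather than anything from the thread): an inner function resolves names through the environment where it was defined, not where it is called.

```python
def make_counter():
    count = 0  # binding lives in make_counter's environment
    def bump():
        nonlocal count  # resolved lexically, through the environment chain
        count += 1
        return count
    return bump

c1, c2 = make_counter(), make_counter()
print(c1(), c1(), c2())  # 1 2 1 -- each closure carries its own environment
```

Under dynamic binding, `count` would instead be looked up in the caller's environment, which is exactly the distinction the paragraph above is after.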

For operating systems, you need to know what a process is, what a thread is, some of the ways in which parallel processes lead to problems, and some of the mechanisms for dealing with those problems, including the fact that some of those mechanisms require hardware support (e.g. atomic test-and-set instructions).

You need a few basic data structures. Mainly what you need is to understand that what data structures are really all about is making associative maps that are more efficient for certain operations under certain circumstances.
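A hedged illustration of that framing (my own, hypothetical example): a hash map and a sorted key list both implement the same associative-map idea; they just make different operations cheap.

```python
import bisect

pairs = [("ada", 1815), ("alan", 1912), ("grace", 1906)]

# Hash map: O(1) expected lookup by exact key, but no ordering.
by_name = dict(pairs)
print(by_name["grace"])  # 1906

# Sorted keys: O(log n) lookup, but range queries come for free.
keys = sorted(by_name)
lo, hi = bisect.bisect_left(keys, "a"), bisect.bisect_left(keys, "b")
print(keys[lo:hi])  # ['ada', 'alan']  (all keys starting with 'a')
```

Same abstract map, different cost profiles - which is the "certain operations under certain circumstances" trade-off named above.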

You need a little bit of database knowledge. Again, what you really need to know is that what databases are really all about is dealing with the architectural differences between RAM and non-volatile storage, and that a lot of these are going away now that these architectural differences are starting to disappear.

That's really about it.

I'm of two minds about this. Everything you mention is great background to have. (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I think this deep background is a great goal. But in another way programming is a craft. You can learn as you go. There are millions of bright people who could do useful, quality work without an MIT-level CS education. They just need some guidance, structure, and encouragement.

> (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I wouldn't be so sure. I once applied to a company whose business was proving various safety properties of control and signalling applications (for trains) - sizeable applications. They have problems with the halting problem and combinatorial explosions, but they get around those and do it anyway.

A very vaguely related question: are bindings lexical or dynamic in R? Or would it be fair to say that it's actually both at the same time? Or do we need a new term altogether?

For those unfamiliar with it: in R, environments are first-class objects, and identifier lookup in them is dynamic. But the way those environments are chained for lookup purposes is defined by the lexical structure of the program (unless modified at runtime - which is possible, but unusual) - i.e., if a function's environment doesn't have a binding with a given name, the next environment inspected is the one from the enclosing function, not from the caller. So R functions have closure semantics, despite the dynamic nature of bindings.

It would appear that this arrangement satisfies the requirements for both dynamic scoping and for lexical scoping.

> AI, ML, and NLP and webdesign are application areas

On first thought, I do agree. However, they are fundamental applications. Category theory is categorizing the fundamentals. It uses a lot of the fundamentals on itself, I guess, but that doesn't mean to me that it's inaccessible or useless.

Don't confuse "important" with "fundamental". He probably meant foundational to begin with.

The web for instance is an unholy mess. We can do better for mass electronic publishing. We don't because that huge pile of crap is already there, and it's so damn useful.

Web design, IMHO, is an extreme example of formatted output. I/O is a fundamental concept.

> What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically.

Yes please.

> You need automata theory... Turing-completeness... PDAs and FSAs...

Why? I know that stuff inside and out, and across multiple jobs I have used none of it ever. What chance to use this favourite bit of my education did I miss? (Or rather, might I have missed, so that you might speak to the general case)

> You need to know how to compile a high-level language down to an RTL.

Why? Same comment as above.

> You need to understand what a fixed point is and why it matters.

Well, I don't, and I don't. I request a pointer to suitable study material, noting that googling this mostly points me to a mathematical definition which I suspect is related to, but distinct from, the definition you had in mind.

Otherwise... as I read down this thread I was all ready to disagree with you, but it turns out I'd jumped to a false conclusion, based on the SICP context of this thread. Literacy, what a pain in the butt.

In particular, the idea that ((the notion of an environment) is a fundamental/foundational concept) is a new idea to me, and immediately compelling. I did not learn this in my academic training, learned it since, and have found it to be very fruitful. Likewise with lexical vs dynamic binding, actually.

>> You need automata theory... Turing-completeness... PDAs and FSAs...

> Why?

So you can write parsers. To give a real-world example, so you can know immediately why trying to parse HTML with a regular expression can't possibly work.
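A quick illustration of that point (a Python sketch): a regular expression can't count nesting depth, so it pairs the wrong tags as soon as elements nest.

```python
import re

html = "<div>outer <div>inner</div> tail</div>"

# A naive "match a div" regex: the non-greedy match stops at the
# FIRST </div>, wrongly pairing the outer <div> with the inner
# close tag.
naive = re.search(r"<div>(.*?)</div>", html).group(1)
print(naive)  # 'outer <div>inner' -- not the outer element's content

# Matching arbitrarily nested tags requires keeping a count (a
# stack), i.e. at least a pushdown automaton, not a finite one --
# which is exactly what automata theory tells you up front.
```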

>> You need to know how to compile a high-level language down to an RTL.


So that when your compiler screws up you can look at its output, figure out what it did wrong, and fix it.

>> You need to understand what a fixed point is and why it matters.

> Well, I don't, and I don't. I request a pointer to suitable study material

Particularly lectures 2A and 7A.
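For a concrete taste of why fixed points matter, SICP's classic demonstration (from those lectures, if I recall correctly) is that sqrt(a) is a fixed point of x ↦ (x + a/x)/2, found by just iterating the function until it stops moving. A Python transcription of the Scheme original:

```python
def fixed_point(f, guess, tol=1e-9):
    """Repeatedly apply f until the value stops changing; the
    result x satisfies f(x) == x (to within tol)."""
    while True:
        nxt = f(guess)
        if abs(nxt - guess) < tol:
            return nxt
        guess = nxt

# sqrt(a) is a fixed point of the average-damped map x -> (x + a/x)/2:
# if x = (x + a/x)/2 then x**2 = a.
sqrt2 = fixed_point(lambda x: (x + 2 / x) / 2, guess=1.0)
print(sqrt2)  # ~1.41421356
```

The same "iterate until nothing changes" idea shows up all over the place: recursive definitions, dataflow analysis in compilers, the Y combinator.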

I don't know about your university, but mine had at least some coverage of all those categories.

At a minimum an education should give you a strong enough base that you can teach yourself those other things should you so desire.

> Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating system, parallel and distributed programming, natural language processing, web design, network architecture, databases.

That was exactly the curriculum of my CS degree, minus the web design, and I didn't even go to a first-rate CS program like MIT (Penn State, 20 yrs ago; no idea what the curriculum is now).

There is the actual complexity, and then there is the accidental complexity lamented by the poster you responded to. I would claim both are real. Especially in projects where the true complexity is not that great and the theoretical basis of the solution is not well documented, people have a tendency to create these onion-layered monstrosities (the mudball effect).

If we look at just traditional CRUD-ish programs (i.e. database, view, domain logic), these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.

> If we look at just traditional CRUD-ish programs (i.e. database, view, domain logic), these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.

Perhaps because this type of program is so old, it had so much time to stick lots of mud on it. :-)

Code: The Hidden Language of Computer Hardware and Software[0]

My kids enjoyed this book; similar topic, but fairly playful in how it was put together, and an extremely gentle introduction that doesn't shy away from how things actually work. It's hard to imagine a reader not coming away with a much better understanding of what computing is all about. It starts at gates and works up to actual (machine) code at the end of the book. Very good diagrams throughout.

Despite being from 2000, I don't think it's become outdated. I'd love it if there was a sequel that covered putting things together with a cheap FPGA.


This book was something I found very fun to read:

N.B. I did read the book when I was in university studying CS, but I felt like it was a good balance of history and tech information.

Jun 11, 2015 · spb on What Is Code?
> Intro articles like this do a lot to reveal biases and misunderstandings.

This is one of the reasons I barely recommend any intro articles in Lean Notes: almost every single one is just a stream of incomplete and incorrect statements about how the world works, based on the author's myopic personal experiences.

Rather than properly generalizing and consolidating what needs to be said to convey a full understanding of the topic, most intros settle for the first example they can think of that could be remotely construed as related to the words they've previously used for whatever subject, regardless of whether it has meaning in any context. (Example: saying that type safety prevents you from trying to "multiply seven by cats".)

It seems like a pretty Dunning-Kruger thing: the less broad your knowledge is, the more justified you feel in writing an introductory text to the field.

The only time I've ever seen somebody actually qualified to write an introductory text actually doing so (as I can immediately recall) is Charles Petzold's Code: The Hidden Language of Computer Hardware and Software (although I suspect, from the few excerpts of it I've seen, that Brian Kernighan's "D is for Digital" is good, too).


May 23, 2015 · barking on Nand to Tetris 1
A great book and accompanying software imo lacks just 2 things.

1: the building of the NAND gate itself, and 2: the building of a flip-flop.

Both these tasks can be easily accomplished with reference to the book 'Code' by Charles Petzold

and software such as this

Code is a great book. What I think is the most powerful and interesting point is that a computer can be built from (a sufficiently large number of) any component that can perform logic functions.
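That point is easy to make concrete (a Python sketch, standing in for relays, vacuum tubes, or transistors): give yourself nothing but NAND, and every other gate, and hence arithmetic, falls out of it.

```python
def nand(a, b):
    """The one primitive: any component that can compute NAND will do."""
    return 0 if (a and b) else 1

# Every other gate can be expressed in terms of NAND alone:
def not_(a):     return nand(a, a)
def and_(a, b):  return not_(nand(a, b))
def or_(a, b):   return nand(not_(a), not_(b))
def xor_(a, b):  return and_(or_(a, b), nand(a, b))

# With enough gates you get arithmetic: a half adder adds two bits.
def half_adder(a, b):
    return xor_(a, b), and_(a, b)  # (sum, carry)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

This is exactly the Nand2Tetris premise, and Petzold plays the same game with telegraph relays.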

I stopped reading Nand2Tetris in any great amount of detail and just skimmed the rest after I saw that it confused Harvard (separate code and data) and Von Neumann (combined) architectures.

I haven't read 'Code' yet, but I think starting from NAND gates is right; it has to stop at some level. I feel CMOS, MOSFETs, semiconductor physics, etc. are all interesting topics, but this book is already too broad.

Code actually starts with relays.
Highly recommend reading Code: The Hidden Language of Computer Hardware and Software. The author basically starts from how a telegraph works and extrapolates modern computing from that starting point. Great read, and it helped me understand exactly what you're referring to: understanding classical computers down to the metal.

Here's a book that I had a random encounter with as a teenager, that gave me an excellent understanding of how computers work at the lowest levels:

It basically starts with a "code" of two friends talking to each other between houses at night via blinking flashlights, and gradually builds up from there to a full, if somewhat barebones, microprocessor, logic gate by logic gate. And it does so in a way that teenage me was able to follow.

Came here to post this. It's a great and approachable book about how information has been codified historically, why that matters, and how foundational it is to computing, and therefore the world at large.
The Soul of a New Machine by Tracy Kidder, the classic book following the development of a new minicomputer in the late 70s.

Stealing The Network: How to Own the Box. This is a collection of fictional accounts of "hacking" written by hackers. Real-world techniques are described, though in lightweight detail; the aim of the book is more to give an insight into how an attacker thinks. It's quite an enjoyable read too.

Kingpin: How One Hacker Took Over the Billion-Dollar Cybercrime Underground by Kevin Poulsen. This one's a true story.

Code: The Hidden Language of Computer Hardware and Software By Charles Petzold. I still have to read this one, but I expect it would fit in with what you're after quite well.

Oct 19, 2014 · bprater on Simple CPU
If you want a comprehensive read on this topic, check out the book Code by Charles Petzold:
I haven't read that one, but I'll definitely check it out. I'd also recommend Charles Petzold's book called "Code". One of the best technical books I've ever read!

Besides "Code," I highly recommend "The Pattern on the Stone" by Danny Hillis, who also happens to be the designer of Thinking Machines' massively parallel supercomputer, the Connection Machine. It covers more material than Code in a lot less time.

In "Code," Petzold makes sure to cover the material extremely slowly so that you won't miss any details, so it may get boring for some, but I would also read Code if you have the time and interest.

I was part of that generation. IIRC most kids didn't give a damn about it. At that age everything is just a weird random novelty. It was exciting as a device, though; the TO7 and its lightpen were cute. Besides, in the 80s computers weren't a thing; even video games were barely established at home. And LOGO didn't feel like programming: turtling felt more like geometry (left is down if facing left) than anything else, at least to me. We didn't really go into iteration and such.

I hope the new effort will use books like Code or something similar that doesn't take a MacBook Air for granted but instead uses down-to-earth first principles that can be shown, built and tested with kids' hands.

For slightly older kids, wishing for HtDP inspired classes.

One of the best books I've seen takes this approach:


Starting from either extreme (pure maths or pure electrical engineering) is quite healthy--starting in the middle, though, does a disservice.

Jan 03, 2014 · sanderjd on I, Health Insurer
You may enjoy a book he wrote called CODE[0], which is one of my absolute favorites, but I doubt it will convince you to share his political opinions.


Some resources on making tiny Hello World programs down to the kernel level that may be useful:

A wee bit heavy, but it's comprehensive. It deals with what happens when you run code, how the architecture of the computer works (by and large) including at the logic level:

If you want to go lower (and higher).. look at Understanding the Linux kernel for a good understanding of how an OS is put together, with specific examples i.e. Linux.

Code, by Petzold, deals with logic and computers from the ground up. It starts with relays and builds them up into gates and usable arithmetic blocks.

The physics is fairly simple, at least from a CRT or LED display perspective. Gets more tricky dealing with interconnecting microprocessors because a good chunk is vendor specific.

I think this kind of project is well suited to a guide on how to build a computer from the ground up, starting with logic gates, writing a real time OS and developing a scripting language that will run and compile on it. Then you can skip a lot of largely extraneous stuff and have a solid understanding of how the hardware works.

Sep 16, 2013 · planckscnst on Gift for a 10 year old
Petzold's "Code: The Hidden Language of Computer Hardware and Software" aids in the understanding of how computers work at the lowest levels.

Cool article! "Code" by Charles Petzold[0] talks about morse code while covering ways that we encode data. May be a good read for anyone interested in this topic.


I would have probably said (with some imprecision): electrical circuits either carry a current (binary 1) or they don't (binary 0). [Ignoring the fact that you could also measure the AMOUNT of current carried here.] Because of this underlying limitation of electronics you need to make do with just a binary system in computers.

Depending on the amount of time remaining I would either go into more depth or point him towards "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold.
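The one step worth showing in the "more depth" case is place value (a Python sketch): a row of on/off switches read as a base-2 numeral, each position carrying twice the weight of the one to its right.

```python
# Eight switches, each either carrying current (1) or not (0),
# most significant bit first.
switches = [1, 0, 0, 1, 0, 1, 1, 0]

# Read them as a base-2 numeral: each position doubles in weight.
value = 0
for bit in switches:
    value = value * 2 + bit

print(value)  # 150, i.e. 128 + 16 + 4 + 2
```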

I read that book as a child and loved it, but it is so hard to find a book named "Code" when that is all you remember. That link just made my day.
Thanks. The low level stuff certainly seems like a good thing to learn—I made an effort to read Code[0] a while back and most of it went over my head. The high level programming languages most people work in today are abstracted to a point where it's extremely hard to see their relation to the low level stuff. It would be nice to understand that link better.


The best bottom up approach to this that I have seen is Charles Petzold's "Code: The Hidden Language of Computer Hardware and Software" [1] which starts with using a flashlight to send messages and walks up the abstraction chain (switch, relay, alu, memory, cpu...) to most of the components of a modern computer. It's very accessible.


The book Algorithmics: The Spirit of Computing doesn't read like a textbook to me, and it's quite interesting.

The New Turing Omnibus

is also good, as is Code by Charles Petzold.

AFTER EDIT: While I thought about the first three books I mentioned, I thought of another, Write Great Code, Volume 1: Understanding the Machine by Randall Hyde.

I've seen Code in B&N but never did more than quickly skim through it. And I've never heard of the other two. Thanks!
A great book for anyone curious about this topic (going from simple logic gates to more modern processing technology):
Jun 22, 2012 · showerst on Ask HN: Summer reading
If you'd like a great non-technical tour of how computers really work conceptually, starting from simple morse-code switches through to assembler, Charles Petzold's "Code" is awesome:

Even having understood for years how computers work in principle, nothing quite put it together for me like this book.

There's a similarly great book on the history/methods of cryptography called "The Code Book" by Simon Singh that I recommend too. It's great because it traces the history but also walks you through how the ciphers actually worked, and provides the best intros I've ever seen to public-key and quantum cryptography.
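For a flavour of the early chapters, one of the oldest ciphers Singh covers is the Caesar shift: slide every letter a fixed number of places along the alphabet. A minimal Python sketch:

```python
def caesar(text, shift):
    """Shift each letter by `shift` places, wrapping around at
    the end of the alphabet; non-letters pass through unchanged."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('a') if ch.islower() else ord('A')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

msg = caesar("attack at dawn", 3)
print(msg)              # 'dwwdfn dw gdzq'
print(caesar(msg, -3))  # decryption is just the opposite shift
```

Trivially breakable (only 25 keys to try), which is exactly why the book's history gets interesting from there.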

Simon Singh's book was my introduction to crypto and the wonderful Mathematics and mathematicians behind its vivid history. I had it with me all the time while doing the online course a couple of months ago for a good historic perspective supporting Prof. Boneh's hardcore crypto topics. Highly recommended. Actually, his Fermat's Last Theorem book is also fascinating if you are interested in Math history.
Not sure if this is the right direction or not. In different ways they are all, essentially, about programming...

Good luck!

Ditto on "Code." As a non-programmer learning to program, it was everything I needed to know to wrap my head around how everything actually works.
This is absolutely the right direction--thanks! The one book on the list that might get kicked back is "Structure and Interpretation of Computer Programs". Too bad, since it's probably the most concrete. This is a great starting point, thanks again.
Recently, I've come to know "Code: The Hidden Language of Computer Hardware and Software"[1], and I really enjoyed it. I don't think it's necessarily the first book every programmer should read, but I do think it's a book every programmer should read. It's an easy read, it's fun, and really does provide what it promises. Highly recommended.


I really enjoyed it. Being self-taught, I found it a really handy primer on computer-architecture fundamentals, logic gates, etc. that I imagine I'd otherwise have gotten in school.
I love stuff like this. I really hope keen engineering youth are able to get involved with building toy CPUs. Maybe not that complex, but enough to grasp what memory mapping and registers actually are.

Anyone interested could read either: (the intro is too gentle for too long, then bam, it's too hard for many people)

( the student lab manual for The Art of Electronics. Probably best paired with AoE, which is showing its age but still excellent.

If you want to learn it from the bottom up, read Code.

It's a page-turner and you'll know more about computers than many developers. You still won't be a programming guru from this, but it's a great holistic approach that you can then supplement.

Thanks. I've seen Code at my local B&N (and of course, on Amazon), and I've come close to buying it.

I'm getting it tomorrow.

I don't necessarily know of any one book that meets all of your friend's requirements, but...

Tracy Kidder's The Soul of a New Machine might be good for your friend.

Another good option might be Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

Or, how about Coders at Work?

Another one that I have (but haven't had time to read yet) is Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software by Scott Rosenberg. It might have something that your friend would find interesting.

Another one that may be inspirational, although it's more about personalities than computer science per se, would be Steven Levy's Hackers: Heroes of the Computer Revolution.

thanks for the references! i really appreciate you taking the time to reply to my question.

btw "Dreaming in Code" is the only one of those that I've read, and I don't think it's a good fit for my friend because it's basically the story of software project management gone awry ... hardly inspirational for someone aspiring to learn about the beauty of CS :)

Jul 18, 2010 · pan69 on Ask HN: Should I learn C?
While you're at it you might want to pick up a copy of "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold, one of the most compelling and well-written books on the subject I've read. Reading it alongside learning C and assembly language will make it all "click". It's a "general interest non-fiction" book with lots of historical stuff, e.g. coverage of the 8080 and 6500.

I can't find it now, but it reminds me of a MIT Press book for the educated layman that covered electronics, the various generations of semi-conductors (and how at that period TI was the only company to negotiate all of them, this was written at the dawn of the LSI or VLSI era), the critical details of wafer yield and resultant profitability, etc.

Perhaps not the right book for the original poster, but for many people it could be very useful.

OK. Depending on your programming background you're facing a steep learning curve. That's why I recommend a bottom up approach for you.

First read "Code" by Charles Petzold. This book will get you "in the mood" and in the right frame of mind:

Then I suggest you pick up a good book on Assembler. This might be a good choice:

Start writing some drivers for Linux. Like a memdrive or something. Do it all in Assembler! Oh, you need to read other books on how to do this...

Then pick up the K&R book on C. Now write your memdrive driver in C.

That should get you started. I think it will take you up to two years before you're past the learning curve and comfortable with this level of programming.

Oh, you need to be willing to do it for the love of it because it's highly unlikely that you will make a living using these sort of technologies (nowadays).

Good luck!

PS: I miss the old days...

Sounds good, thanks bro, interesting approach ;D
I wouldn't start off by writing a driver in assembler, probably better to write it in C first, then use the -S flag to get the intermediate and study that until you drop.
Writing device drivers in C, and then using the -S flag ... have you seen the output gcc -S returns on Linux (Ubuntu) for a trivial hello world program? ;)
Sure. And for non-trivial programs as well.

Low-level assembly is what you want, and that presumably includes interfacing with your favorite C code.

Calling conventions, stack frames and so on.

Besides, the boilerplate runtime stuff has nothing to do with writing kernel code, so a 'trivial hello world' program will have a lot of cruft added to it:

        .file   "hello.c"
        .section        .rodata
  .LC0:
        .string "hello, world!"
        .text
  .globl main
        .type   main, @function
  main:
        leal    4(%esp), %ecx
        andl    $-16, %esp

        pushl   -4(%ecx)
        pushl   %ebp
        movl    %esp, %ebp
        pushl   %ecx

        subl    $20, %esp
        movl    $.LC0, (%esp)
        call    puts
        addl    $20, %esp

        popl    %ecx
        popl    %ebp

        leal    -4(%ecx), %esp
        ret
        .size   main, .-main
        .ident  "GCC: (Ubuntu 4.3.3-5ubuntu4) 4.3.3"
        .section        .note.GNU-stack,"",@progbits
That's not that bad.

Note the stack frame alignment trick, the fact that 'printf' was specified in the source but puts is being called!

The -S flag was invaluable for me when I was moving from C to assembly. Then again, I used it in an iterative fashion: do something simple with -S, do the same thing with -S -O1, -S -O2, -S -O3, and then a smattering of the different optimization options to see how it converted the C to assembly. I miss those days (did I just admit to being old?).
Well I'll admit it right along with you, I miss those days too.

I think I must have coded myself to the moon and back by now if you'd print it all out on fanfold paper (in C, mostly), but the joy of getting a little OS to boot up from nothing is never going to pale in comparison to installing framework X and building some web-app. No matter how successful.

Web-apps make substantially more $ though...

I can assure you that if you want to learn C AND Assembler and you start with C, you will never ever get to the Assembler bit. A memdrive isn't that difficult and if you never solve a 'real' problem, you never learn anything because you don't have to push yourself. Just my 2 cents...
It really is your two cents, what goes for you does not go for everybody else.

Some people will learn just for the fun of learning.

You are obviously missing my point. But that's alright...
For this kind of thing, I recommend the book

"Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold.

Being a CS major, I bought this book for the sheer amazement at how the gist of everything I had learned about computers in the university could be conveyed in one book in such an easy and fun manner. The learning curve is so gentle, it just couldn't be easier. I cannot recommend this book enough to _every_ person who really wants to understand how computers work.

HN Books is an independent project and is not operated by Y Combinator or
~ [email protected]