HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
Code: The Hidden Language of Computer Hardware and Software

Charles Petzold · 72 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
What do flashlights, the British invasion, black cats, and seesaws have to do with computers? In CODE, they show us the ingenious ways we manipulate language and invent new means of communicating with each other. And through CODE, we see how this ingenuity and our very human compulsion to communicate have driven the technological innovations of the past two centuries. Using everyday objects and familiar language systems such as Braille and Morse code, author Charles Petzold weaves an illuminating narrative for anyone who’s ever wondered about the secret inner life of computers and other smart machines. It’s a cleverly illustrated and eminently comprehensible story—and along the way, you’ll discover you’ve gained a real context for understanding today’s world of PCs, digital media, and the Internet. No matter what your level of technical savvy, CODE will charm you—and perhaps even awaken the technophile within.
HN Books Rankings
  • Ranked #7 this year (2024)
  • Ranked #1 all time

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
There's a book for that:

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

unethical_ban
I own both editions and am re-reading with 2e.

Thank you for the recommendation, it's a good one. I stand by my position that the video seems confusing without context. "Start with Why".

lisper
It's hard for me to imagine anyone with internet access not knowing why transistors are important.
Koshkin
Way back when, everyone knew transistors were important, since that's what portable radios were called.
Jun 10, 2022 · localhost on “Code” 2nd Edition
Code is my favorite technical book of all time [1]. Charles does an amazing job of building a computer up from basic principles ("two young boys who want to communicate after their parents tell them to go to bed at night") all the way to modern (for 1999) computers. He layers abstraction on top of abstraction all the way to a working computer. My only (slight) disappointment in the book is that he tries to cover everything from operating systems to object-oriented programming in a couple of chapters at the end. That could have been a multi-volume series in its own right.

It goes really well with Elements of Computing Systems (2nd ed) [2] which I kind of think of as a "lab manual" where you get to build a computer from first principles.

[1] https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

[2] https://www.amazon.com/Elements-Computing-Systems-second-Pri...

greymalik
When the book “builds” a computer, I’m assuming that’s a virtual computer? If so, what language does it use?
jagged-chisel
GP said “builds up to.” It’s more about the abstractions adding up so that the reader understands how electrons can be programmed. There is no implementation of any kind of virtual or hardware machine in the book.

He goes: electricity, relays, logic gates, circuits (like adders), CPU, RAM …

It's been a while, so I might have missed a step or two. You'll come away knowing how we used electricity to get from lightning bolts to pocket computers. You will not come away with a programmable machine.
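
To make the relay-to-gate step concrete, here's a minimal sketch (my own illustration, not code from the book) of the idea in Python. A relay is just a switch: two in series behave like AND, two in parallel behave like OR, and a normally-closed relay inverts its input.

    def AND(a, b):   # two relays in series: current flows only if both are closed
        return a and b

    def OR(a, b):    # two relays in parallel: current flows if either is closed
        return a or b

    def NOT(a):      # a normally-closed relay: energizing it breaks the circuit
        return not a

    def NAND(a, b):  # the gate Nand2Tetris treats as the single primitive
        return NOT(AND(a, b))

    # Each gate's behavior is just its truth table:
    for a in (False, True):
        for b in (False, True):
            print(a, b, "->", AND(a, b), OR(a, b), NAND(a, b))

Everything later in the book (adders, latches, the CPU) is built by wiring a handful of gates like these together.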

Closi
It’s more a book that actually shows you how a computer really works, from first principles.

It’s not a book about implementing a particular computer - it’s more about giving you the rough thought process about how it actually works at the level of electricity and wires, and how this gets built up into something that actually calculates stuff.

It’s the first book that took me from “computers are mystical magic” to “computers are understandable magic”.

cptnapalm
There is a much older textbook called The Art of Digital Design. If you buy and solder the necessary components as you work through it, you get a fully working PDP-8.

Jtsummers
If you're actually looking for something that does build up to a virtual computer, Nand2Tetris (the second edition of its companion book, The Elements of Computing Systems, is now out) is a great companion to Code: https://www.nand2tetris.org/.

Code is more high-level; Nand2Tetris and Elements are project-based but cover some similar territory.

nonima
Code is not more high-level; it starts from as low as you can get in terms of abstraction. It explains what electricity is and how it moves through wires, then moves on to creating simple logic gates using switches.
overgrownzygote
Highly recommend Nand2Tetris. I completed it a few months ago as a relatively non-technical person (I had only taken CS50 as my first CS class ever in 2020) and learned a ton.
greymalik
I was actually asking about The Elements of Computing Systems in my comment, not Code. What language does it use to build a virtual computer?
localhost
You program in HDL using the Nand2Tetris software suite [1]. Super fun - I went through the book on my vacation last year. Highly recommended.

[1] https://www.nand2tetris.org/software

BobLaw
Quoted from the 2021 edition:

"Project 4 is written in the computer’s assembly language, and projects 9 and 12 (a simple computer game and a basic operating system) are written in Jack—the Java-like high-level language for which we build a compiler in chapters 10 and 11."

ayushka
Nand2Tetris uses its own version of a Hardware Description Language (HDL). You will also write a compiler for its own language, called Hack.
virissimo
The hardware exercises expect you to use a simple HDL, you write an assembler in your language of choice (I used Ruby, but others have used Python or JavaScript), and you write programs for the computer you "built" in earlier chapters using Jack (a language with Java-like syntax).
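
To give a flavor of the assembler step (this is a toy, hypothetical mini-ISA of my own, not the actual instruction format from the book), the whole job is translating mnemonic lines into fixed-width binary words; here in Python:

    OPCODES = {"LOAD": "00", "ADD": "01", "STORE": "10", "JMP": "11"}  # made-up opcodes

    def assemble(lines):
        """Turn 'MNEMONIC operand' lines into 8-bit words: 2-bit opcode + 6-bit operand."""
        words = []
        for line in lines:
            line = line.split("//")[0].strip()   # strip comments and whitespace
            if not line:
                continue                         # skip blank lines
            mnemonic, operand = line.split()
            words.append(OPCODES[mnemonic] + format(int(operand), "06b"))
        return words

    program = ["LOAD 5  // value at address 5", "ADD 6", "STORE 7"]
    print(assemble(program))  # ['00000101', '01000110', '10000111']

The real project adds symbols and labels on top of this, but it stays about this mechanical.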
cptnapalm
I wrote mine in awk :)
danielvaughn
I picked it up years ago, got through the first few chapters, but then never finished it. I loved the early buildup and still want to go back and keep reading.
unethical_ban
Same! Still one of my favorite books ever read, but other than the floating point chapter it started getting over my head.
https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
ideakitchen
Thanks, just picked this up, look forward to reading this.
imdsm
Thanks. Seems there is a second edition due October 26, 2022.

https://www.amazon.co.uk/Code-Language-Computer-Hardware-Sof...

I feel tempted to buy the current edition, but also to wait for the second.

blowski
Oh that's brilliant, I'm looking forward to that!

There is still so much of the tech stack that hasn't changed since this book was published in 2000 - TCP, IP, DNS, binary, logic gates, etc. But there's also much that has changed that could be written about - virtualisation, containers, wifi, fibre, cloud infrastructure, USB, GPUs.

imdsm
I've gone ahead and bought the first edition, and then when the other is out, I'll add that to my collection too. Looking forward to reading it. No matter how much I learn, I find going over the basics again, reading introductions again, really helps to keep everything active up there. Let's face it, we learn so much but we also seem to lose so much as well.
(Including Amazon links, but just for convenience, buy wherever you want)

Code by Petzold (https://www.amazon.com/Code-Language-Computer-Hardware-Softw...) - non-technical (in the sense it isn't something to "work through"), covers a lot of interesting topics. Especially approachable for that age.

Elements of Computing Systems by Nisan & Schocken (https://www.amazon.com/Elements-Computing-Systems-second-Pri...) - more technical (has content to work through). I've read the first edition, not the second. Has a companion site: https://www.nand2tetris.org. It's well-written, and a motivated high schooler could work through it.

The Code Book by Singh (https://www.amazon.com/Code-Book-Science-Secrecy-Cryptograph...)

The Codebreakers by Kahn (https://www.amazon.com/Codebreakers-Comprehensive-History-Co...)

I was always interested in ciphers and such as a kid so those two books got my attention when I found them in high school/college. I'm a bit fuzzy, now, about which one I was more interested in but both were good books. (I still have them, may give them a re-read next month.)

There are a few others I have in mind, but just can't recall the titles at the moment.

I don't think there's an easy answer to this question. Software engineers still don't know how to exactly or even efficiently communicate with each other. It's still an evolving field and process. In general, it is helpful to understand software development as a sub-field of systems theory and design, so any book that discusses systems should help one better understand software development.

In general, I do also echo some of the other comments. If you are helping to design the app, you shouldn't necessarily need to understand the implementation details. In my experience, when clients (whether external, internal, or colleagues) get too involved in what they think the implementation should be, it is usually a disaster. It puts pressure on the system to conform to how they think it should be, which is usually not how it actually should be, and it adds unnecessary constraints. The real constraints should be what the software should do and the specifications around that, including how the software is intended to be maintained and extended.

Some thoughts on some specific courses and books that I think would be helpful to better understand the goals of software development and design and ways to think about it all:

Programming for Everyone - An Introduction to Visual Programming Languages: https://www.edx.org/course/programming-for-everyone-an-intro...

I think this course should be taken by managers, designers, and even software engineers. The primary result is that you'll come out of it knowing state charts, which are an extension to state machines, and this will be very useful for thinking about software and organizing what the software should do. Handling state is one of the primary problems in software, and you might notice that all of the various paradigms (OOP, functional, imperative, actors, etc.) in computer programming relate to the various ways people think about handling state in a computing system.
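
For a rough idea of what that buys you, here is a minimal sketch of a plain finite state machine (my own illustration, not anything from the course): states and events go in an explicit transition table, and statecharts then add hierarchy and parallel regions on top of this.

    # Transition table for a toy media player: (state, event) -> next state
    TRANSITIONS = {
        ("idle", "play"):     "playing",
        ("playing", "pause"): "paused",
        ("paused", "play"):   "playing",
        ("playing", "stop"):  "idle",
        ("paused", "stop"):   "idle",
    }

    def step(state, event):
        # Events with no matching transition are simply ignored
        return TRANSITIONS.get((state, event), state)

    state = "idle"
    for event in ["play", "pause", "play", "stop"]:
        state = step(state, event)
        print(event, "->", state)

Writing the table forces you to decide, up front, what the software should do for every state/event pair.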

How to Code: Simple Data and Complex Data:

https://www.edx.org/course/how-to-code-simple-data

https://www.edx.org/course/how-to-code-complex-data

https://www.edx.org/micromasters/ubcx-software-development

These courses are taught by a designer of the Common Lisp language and are based upon the excellent book How to Design Programs. It is essentially a language-agnostic course that uses Racket to build up design paradigms, teaching you how to sort your domain problem and design into data and functions that operate on that data. The courses are part of a MicroMasters program, so if you really want to get into Java, that is taught in the follow-on courses.
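
A rough rendering of that design recipe in Python (my paraphrase; the course itself uses Racket): write the data definition first, state the signature and purpose, give examples, and let the shape of the data drive the function body.

    from dataclasses import dataclass
    from math import pi

    # Data definition: a Shape is either a Circle(radius) or a Rect(width, height).
    @dataclass
    class Circle:
        radius: float

    @dataclass
    class Rect:
        width: float
        height: float

    # Signature: Shape -> float
    # Purpose: compute the area of the shape, with one case per variant of the data.
    def area(shape) -> float:
        if isinstance(shape, Circle):
            return pi * shape.radius ** 2
        return shape.width * shape.height

    # Examples double as tests:
    assert area(Rect(3, 4)) == 12
    assert abs(area(Circle(1)) - pi) < 1e-9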

Based upon your last comment, here are some book suggestions on how computers work:

Code: The Hidden Language of Computer Hardware and Software: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

The Pattern On The Stone: The Simple Ideas That Make Computers Work: https://www.amazon.com/Pattern-Stone-Computers-Science-Maste...

But How Do It Know? - The Basic Principles of Computers for Everyone: https://www.amazon.com/But-How-Know-Principles-Computers/dp/...

The Elements of Computing Systems: Building a Modern Computer from First Principles: https://www.amazon.com/Elements-Computing-Systems-Building-P...

I've heard good things about Code by Petzold although I haven't read it myself.

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

cmrdsprklpny
I've read it; it's a phenomenal book to understand more about how computers work on a hardware and conceptual level, but it doesn't really teach programming. Still strongly recommended.
I really loved this book: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

Which goes from bits on up without shying away from circuit diagrams. It's also really well written and you can read it from start to finish.

It puts it in a historical context too which makes it fun to read.

Once you read that, there'll be fewer unknown unknowns.

alexdowad
This is a far better idea than "Operating Systems: Three Easy Pieces" as suggested by the OP.
"This is not my hat": https://www.amazon.co.uk/This-Not-Hat-Jon-Klassen/dp/1406353...

The humour is subversive, the illustration is lovely, and these ("This is not my hat" is another) are great books for younger children. My child loved it, and the people I've given this to have gone on to buy other books by the writer or illustrator.

"Mr Birdsnest and the House Next Door": https://www.amazon.co.uk/Birdsnest-House-Next-Door-Little/dp...

Little Gems are a set of books printed on reduced contrast paper, with a large clear font. They're short, simple, but fun. They're good for younger readers or for slightly older reluctant readers. My child enjoyed reading this book, and loved the illustration. The other child I gave this to took out other books in the Little Gems series from the library, and bought other Julia Donaldson books with her pocket money.

"Code: The Hidden Language of Computer Hardware and Software" https://www.amazon.co.uk/Code-Language-Computer-Hardware-Sof... I had a friend who knew a lot about the software, and knew a lot about hardware but their hardware knowledge was a bit patchy. Code helped solidify their knowledge. If I could have afforded it I would have given them The Art of Electronics and the companion Student Manual. (This was in the 1990s. I haven't read the new version and I don't know how well it works today.)

"Bomber Command" https://www.amazon.co.uk/Bomber-Command-Pan-Military-Classic... I liked this book because it describes how we (the UK) went into world war 2 with ethical notions around not bombing civilian populations and ended up fire-bombing several heavily populated German cities. It's also eye-opening about the scale of this part of the war, and the cost in lives of aircrew.

notoriousarun
I would surely read "Code: The Hidden Language of Computer Hardware and Software"
I found the "Hello world from scratch" [1] series from Ben Eater incredibly helpful in connecting the dots between electricity and modern computers. Strictly speaking it is about electronics, but it is superbly presented and incredibly enlightening when you come at it from a "normal" software engineering perspective.

What actually got me there was the book "Code" by Charles Petzold [2], which traces the development from early circuitry like light bulbs and telegraph wires to modern digital logic. I found that after being introduced to these concepts, learning about the fundamental physics was much more accessible since it was framed in the context of contemporary application.

1: https://youtu.be/LnzuMJLZRdU

2: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

Try "Code: The Hidden Language of Computer Hardware and Software." It provides a very simple introduction to electricity. Beyond that, it's just a great introductory book on computing.

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

coldpie
I absolutely love this book, but I'd say it's more an intro to computing (like you said) than electricity. Electricity is in there, but IIRC it doesn't go much further than the "water analogy" style of thinking about electricity.
I recommend Code: The Hidden Language of Computer Hardware and Software by Charles Petzold [1]. It is far more comprehensive than the OP, goes from pre-computer code, to electrical circuits, to an overview of assembly. No prior knowledge needed except how to read.

1. https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

kragen
Code's good but it doesn't cover Kleisli categories and Kleisli composition, Peano arithmetic, parametric polymorphism, sum types, pattern matching, or any of numerous other things covered in Maguire's How These Things Work. So it's not accurate to say Code is "far more comprehensive"; Code mentions FORTRAN, ALGOL, COBOL, PL/1, and BASIC, but the example programs it contains are written in assembly, ALGOL, and BASIC. It doesn't contain any programs you can actually run except for a three-line BASIC program and some even simpler assembly programs.
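
(For anyone wondering what one of those topics even looks like, here's a Peano-arithmetic sketch in Python, my own illustration rather than an example from either book: a number is either zero or the successor of another number, and addition is recursion on that structure.)

    from dataclasses import dataclass

    @dataclass
    class Zero:
        pass

    @dataclass
    class Succ:
        pred: "Zero | Succ"

    def add(m, n):
        if isinstance(m, Zero):
            return n
        return Succ(add(m.pred, n))   # S(m) + n = S(m + n)

    def to_int(n):
        return 0 if isinstance(n, Zero) else 1 + to_int(n.pred)

    two = Succ(Succ(Zero()))
    three = Succ(two)
    print(to_int(add(two, three)))  # 5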
bordercases
"The Architecture of Symbolic Computers" by Kogge comes close.
tenaciousDaniel
It's very depressing that I've been a developer for 8 years and I've never heard any of those terms you mentioned. I'm self taught but I've always felt like I should go back and really learn the fundamentals.
bordercases
I don't think you can be much more efficient than learning over a period of 8 years while maintaining so-called work-life balance and teaching yourself. Remember that they have 4-year degrees for this.
eat
I'm not saying you shouldn't always strive to learn new things (for your own personal growth and curiosity), but I think it's important to point out that the link between being a developer and knowing about these things-- esoteric topics of applied Mathematics-- is pretty weak.

Imagine a carpenter spending their time getting a chemistry degree in order to better understand how wood glue works.

kragen
Those aren't esoteric topics of applied mathematics if you're programming in Haskell or using formal methods. Moreover, some of them will improve your ability to write working Python. (The others I don't understand. Maybe they will too once I understand them.)
murgindrag
I don't think so. Understanding what goes on underneath the hood is really what differentiates decent coders from great engineers. Compare the data structures of Subversion to those of git. Or look at some of the work John Carmack did in video games. That requires depth.

If your goal is to be a carpenter who puts together housing frames, you absolutely don't need depth. You're also interchangeable and get paid union blue collar wages. On the other hand, if you want to be a craftsman who invents new wooden things, you need depth in some direction, be that structural engineering, artistic, or otherwise.

There's a ceiling you hit unless you learn much more of this stuff. The direction is your choice (but new APIs ain't it -- we're talking depth).

elbear
Not everyone needs to be great.

What I actually want to say is that OP shouldn't feel guilty about not knowing those things. It's okay to want to master these things, if it's what you want. But it's pointless to feel bad about not knowing them.

kragen
Of course there is no necessity for excellence. The only necessary thing about human life is death; everything else is optional. Before your death, you can cultivate excellence in yourself, or not — many people instead cultivate hatred, addiction, or greed. There are many ways to cultivate excellence; learning is only one of them, and there are many things to learn. Mathematics, and in particular logic (which is what we are talking about here) are the foundation of all objective knowledge, but objective knowledge is not the only kind that has value.

The true philosopher, motivated by love for the truth rather than pride, is so noble in spirit that when she sees evidence that she may be in error, she immediately investigates it rather than turning away; and if she discovers that the evidence is valid, she immediately changes her position. I see such nobility so routinely among mathematicians and logicians that it is noteworthy in the rare cases where it is absent. I see it rarely outside of that field; in some fields, like psychology and theology, I do not see it at all. So I conclude — tentatively — that excellence in mathematics and logic promotes humility and nobility of spirit, which is the highest and most praiseworthy kind of excellence.

So, while I do not think the OP should feel guilty about not knowing those things, I also do not agree with the implication that there is nothing praiseworthy about knowing them.

elbear
Well, I agree with you. I think that pursuing our interests in mathematics, music, literature or whatever strikes our fancy is admirable. And I think it makes us happier, wiser and more humble as you say.

At the same time, I maintain that we shouldn't feel guilty if we aren't doing that, for whatever reason. Sure, sometimes we actually want to pursue some of these things, but don't. Maybe it's because we have a messy schedule, or we can't organize ourselves to prioritize passions.

Feeling guilty does little to actually make you pursue your passions. You're better off learning about habits and how to pick ones that serve you.

kragen
Agreed, except that I don't think pursuing our interests in whatever strikes our fancy is admirable.
elbear
As long as it's not hurting someone, anything goes as far as I'm concerned.
grumple
You are correct, Code does not contain all of the knowledge relevant to computer science. In fact, no book does, as far as I'm aware. But it is far more comprehensive than the OP because it covers a greater breadth of subjects and in greater depth with more accessibility. You're comparing 50 pages of blog posts to 300 pages of book.
kragen
I think there's more computer science in those 50 pages, really. It's an easy win.
criddell
I'm reading Code right now and it's fantastic. I'm a bit more than 1/2 the way through and so far it's only been about how computers work and not really about computer science.

I heard an expression this weekend that I think is apt - a computer is to computer science as a telescope is to astronomy.

berkeleyjunk
Dijkstra is supposed to have said something like this though the origin is disputed.

"Computer science is no more about computers than astronomy is about telescopes."

carapace
"Calling it 'computer science' is like calling surgery 'knife science'."

(Also, "CS could be called the post-Turing decline in the study of formal systems." But I don't know for sure if that was Dijkstra. It's one of my favorite jokes.)

abnry
Code is what got me as a teenager interested in tech. It is an awesome book.
throw1234651234
This is part of what allowed me to get into programming. The "no prior" knowledge part is absolutely true.

I did start getting lost around the second half of the book.

look_lookatme
The amazing thing about Code is how it traces the connection of formal logic (in the Aristotelian sense) to the, as you say, pre-computer code of braille and even flag signals to form the foundations of modern computing.

I am a self-taught developer and probably had 10 years' experience in web development when I first read Code. I would have these little moments of revelation where my mind would get ahead of the narrative of the text because I was working backwards from my higher-level understanding to Petzold's lower-level descriptions. I think of this book fairly often when reading technical documentation or articles.

I recently listened to Jim Keller relate engineering and design to following recipes in cooking [1]. Most people just execute stacks of recipes in their day-to-day lives, and they can be very good at that; the results of what they make can be very good too. But to be an expert at cooking you need to achieve a deeper understanding of what food is and how it works (say, on a physics or thermodynamic level). I am very much a programming recipe executor, but reading Code I got to touch some elements of expertise, which was rewarding.

https://youtu.be/Nb2tebYAaOA?t=1351

Code: The Hidden Language of Computer Hardware and Software

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

From binary to a full computer

klondike_klive
I constantly found myself having to put the book down, puff my cheeks out and say "whoa." It really blew my mind.
ibeckermayer
I was inspired to change my career (was originally studying physics) because of this book
btmills
This book made me understand pointers. As I read it, I followed along building all the circuits in Logisim [1] from half-adders to latches to multiplexers all the way up to a full CPU.

Many will probably recognize the author, Charles Petzold [2], from his Windows programming books.

[1]: http://www.cburch.com/logisim/

[2]: https://en.wikipedia.org/wiki/Charles_Petzold
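
To make the half-adder step concrete, here's a quick gate-level sketch in Python (my own illustration, not code from the book or a Logisim export): XOR produces the sum bit, AND/OR produce the carry, and chaining full adders gives you a multi-bit adder.

    def half_adder(a, b):
        return a ^ b, a & b                  # (sum, carry)

    def full_adder(a, b, carry_in):
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, c1 | c2                   # (sum, carry_out)

    def ripple_add(x_bits, y_bits):
        """Add two little-endian bit lists by chaining full adders."""
        carry, out = 0, []
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            out.append(s)
        return out + [carry]

    # 3 (011) + 6 (110) = 9 (1001), shown little-endian:
    print(ripple_add([1, 1, 0], [0, 1, 1]))  # [1, 0, 0, 1]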

I would say the Telegraph.

In the book CODE: The Hidden Language of Computer Hardware and Software (https://www.amazon.com/Code-Language-Computer-Hardware-Softw...), Charles Petzold talks about how it was foundational to the eventual invention of the computer.

Back then, it also meant coast to coast communications were almost instantaneous. And soon after, transatlantic cable-enabled telegraph boosted commerce between America and Europe.

I recently enjoyed the chapter in "Code" about Morse code [1]. After some googling around I also found this Morse chat room, which is a good laugh for anyone looking to kill some time struggling to spell profanities in Morse :) [2]

[1] https://www.amazon.co.uk/Code-Language-Computer-Hardware-Sof...

[2] http://morsecode.me/?room=1
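
(If you want to cheat in that chat room, the encoding side is just a lookup table; a tiny sketch of my own, with an abbreviated alphabet:)

    MORSE = {
        "a": ".-", "b": "-...", "c": "-.-.", "d": "-..", "e": ".",
        "h": "....", "l": ".-..", "o": "---", "r": ".-.", "w": ".--",
        # abbreviated; the full code covers 26 letters plus digits
    }

    def to_morse(text):
        return " ".join(MORSE[ch] for ch in text.lower() if ch in MORSE)

    print(to_morse("hello world"))  # .... . .-.. .-.. --- .-- --- .-. .-.. -..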

lightlyused
[2] was strange, keyboard straight key is the worst.
If you would like to start at a slightly lower level, Charles Petzold has written a great book on the subject:

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

bigmit37
Thank you. Going to see if I can get it from my local library.
WalterGR
Using everyday objects and familiar language systems such as Braille and Morse code, author Charles Petzold weaves an illuminating narrative for anyone who’s ever wondered about the secret inner life of computers and other smart machines.

... cleverly illustrated and eminently comprehensible story ... No matter what your level of technical savvy, CODE will charm you—and perhaps even awaken the technophile within.

Code doesn’t sound “slightly lower level”.

sergex
Code goes into how logic gates like the NAND gate are built, so it does start at a lower level than Nand to Tetris, which starts with NAND gates as primitives.
mrunkel
I think you and the GP are using "lower level" in different ways. GP means with a lower barrier to entry, you mean "closer to the electrons" if I understand both of you correctly.
> I'm not really sure such a thing exists (or can exist). Verilog isn't "programming" in any way most developers are familiar with, it's a way of describing how logic gates are interconnected.

I've just finished reading Code (https://www.amazon.com/Code-Language-Computer-Hardware-Softw...) that builds up a RAM array and 8-bit CPU starting from relays. I'm familiar with the concepts. I'm just looking for something that explains how to express these concepts using Verilog.

ThrowawayR2
No disrespect to Petzold intended (I'm a huge fan of his technical works) but "Code" is at best at a pop-sci level. It prepares one for doing digital logic as well as "A Brief History Of Time" prepares one for doing coursework in cosmology.

Try the Brown book or Mano's "Digital Design" (my preference; took a while to remember his name) and see if it works for you. Good luck.

What three books? (Especially interested in data transfer protocols - I got a little bit into it for game development, but never found anything resembling a good source).

I learned a lot of these things from taking Harvard's CS50 online, reading the blog posts I linked (and doing problems), and I would imagine this book (https://bigmachine.io/products/the-imposters-handbook/) would help out - a friend recommended it, but I never got around to it.

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

^Covers logic gates and the basics of how a computer is built to an incredible degree, but it's VERY time intensive. Still, I really recommend it.

ryanmarsh
These days CS50 lectures are online. The TCP/IP and Linux books I can't recall. That was some 20+ years ago. I think the TCP/IP/protocols book was from Sams and the Linux book(s) were from O'Reilly. Algorithms/data structures was The Algorithm Design Manual by Skiena. Cracking the Coding Interview is great too. Crypto was just a bunch of reading online; I haven't found a good book on that I would recommend, same for data transfer protocols. I didn't just read three books; I'm saying that, with 20/20 hindsight, I could have read about 3 books to garner most of the stuff I missed by skipping college.

Maybe some of us self taught developers should get together and curate a clear path for CS knowledge and ancillary helpful things.

madeuptempacct
"Maybe some of us self taught developers should get together and curate a clear path for CS knowledge and ancillary helpful things."

I wouldn't mind spending some time contributing to something like that. There is an endless amount of material to read and classes to take, but narrowing it down and prioritizing is the hard part.

This might seem like a roundabout way to start, but I'd recommend Code by Charles Petzold[0].

It starts out with just wires, switches, and relays, and as the book progresses he goes through the process of building up a simple CPU and RAM one step at a time. The book even walks you through coming up with opcodes and assembly language.

Even if you already know this stuff, I found the book was helpful in developing an intuitive feel for how everything works and fits together.

After reading it, you'd probably have a good mental model of how you'd want to approach writing an emulator.
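
For instance, the skeleton of such an emulator is just a fetch-decode-execute loop. A bare-bones sketch in Python (a made-up toy machine of my own, not the book's machine or a real Z80/6502):

    def run(memory):
        acc, pc = 0, 0                      # accumulator and program counter
        while True:
            op, arg = memory[pc], memory[pc + 1]
            pc += 2
            if op == 0x01:                  # LOAD addr: acc = memory[addr]
                acc = memory[arg]
            elif op == 0x02:                # ADD addr: acc += memory[addr]
                acc += memory[arg]
            elif op == 0x03:                # STORE addr: memory[addr] = acc
                memory[arg] = acc
            elif op == 0xFF:                # HALT
                return memory

    # Program: load m[8], add m[9], store the result in m[10], halt.
    # The data (2, 3, 0) lives right after the code.
    mem = [0x01, 8, 0x02, 9, 0x03, 10, 0xFF, 0, 2, 3, 0]
    print(run(mem)[10])  # 5

Real emulators mostly differ in the size of that if/elif table and in the registers and flags they track.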

The Nand to Tetris courses and their accompanying textbook would probably be helpful here too[1][2].

[0] https://www.amazon.com/Code-Language-Computer-Hardware-Softw... [1] https://www.coursera.org/learn/build-a-computer [2] https://www.coursera.org/learn/nand2tetris2

Chapter 22 of "Code: The Hidden Language of Computer Hardware and Software" uses CP/M to illustrate the inner workings of operating systems. I highly recommend this book overall: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...
exikyut
The errata page: http://www.charlespetzold.com/code/ which is pretty small. It seems to already be getting traffic (HN hug?) so you might need to reload it a couple times.
exikyut
Given that I've just found a PDF copy of this book on the internet archive and it hasn't been taken down from there, I'm going to assume the following is okay. If there are any issues, I'm happy to delete this, or a mod can. Hopefully it's all good.

- archive.org PDF (9.2MB): https://ia801607.us.archive.org/7/items/CodeTheHiddenLanguag...

- The seemingly-legit-looking account of the uploader who put it on archive.org: https://archive.org/details/@archiver849271

- The front page for the item on archive.org: https://archive.org/details/CodeTheHiddenLanguageOfComputerH...

- The directory the item is in: https://archive.org/download/CodeTheHiddenLanguageOfComputer...

--

I also found a 176MB scanned copy of the book as pure images: http://learning.caitlinmorris.net/sfpc/CharlesPetzold_Code.p...

Teaching CS is hard, but I think the root of it is our ability to teach people how to think. Learning to program is easy compared to learning how to think algorithmically.

IMHO this is because a lot of CS courses start at a very high level with very abstract concepts, which might also be because they can't afford to start with the very basics of "how does a computer work" due to the limited time available.

On the other hand, I think CS should always start with a book like this one, which truly does start at the basics:

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

A large part of why beginners fail is that they expect too much of the machine and don't comprehend how it works. I believe that once they see how this complexity is actually the result of a large number of very, very simple operations, they will have a better understanding of what it means to instruct the machine.

I'd recommend Petzold's https://www.amazon.com/Code-Language-Computer-Hardware-Softw... too.
But software is so pervasive that within a generation or two, not understanding how it works will put you at a severe disadvantage.

Unfortunately the corporations seem determined to put a stop to that sort of pervasive knowledge, if only for the purpose of controlling and monetising their users. They don't want people to know how easy it is to do things like strip DRM or remove artificial limitations from software. [See Stallman's famous story, or the Cory Doctorow articles on the demise of general-purpose computing.]

And thus most of the "learn to code" efforts I've seen have seemed to focus on some ultra-high-level language, removed from reality and often sandboxed, so that while people do learn the very-high-level concepts of how computers work, they are no less unclear about the functioning of the actual systems they use daily --- or how to use them fully to their advantage. In some ways, they're trying to teach how to write programs without understanding existing ones.

The document said children should be writing programs and understand how computers store information by their final years of primary or intermediate school.

However, this sounds more promising. Especially if they're starting more along the lines of this book: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

SamReidHughes
Actually no, the nature of "learn to code" efforts has nothing to do with some corporations' monetization schemes.
I'm extremely grateful and was not at all expecting such an explanation.

I want to explain a few things.

Let me rephrase what I meant by "minimize the time wasting". You see, there is a lot of great advice available online. You ask something on a subreddit or here, and people will share great resources. I love this kind of learning. My concern is that sometimes these resources and this advice are given along the lines of "although it's not completely necessary, it'll still be an experience in itself".

The problem here is that this kind of learning sometimes wastes too much time and leaves you with confusion. People ask so many questions about CompSci every day, and you'll find recommendations ranging from the complete basics of computers, like Code https://www.amazon.com/dp/0735611319 and the Nand2tetris course http://www.nand2tetris.com, to something very sophisticated like AI. I hope you can understand that if a person spends too much time on these kinds of things, given that he has a job or is a student in a university with a sweet CompSci curriculum (you know what I mean), then it's a problem. Although the above-mentioned resources are exceptional, there are others that teach the same thing. Can a person read all of them one by one, "just to satisfy his curiosity, thinking that it'll help him in the future"?

RE is already an extremely sophisticated and vast field which requires computer mastery. I'm in college and it has made me hate things I loved. I'm an extremely curious guy and can easily spend 10-20 hours in front of a PC. I have ~6 years of experience with Linux. But I'm really not in a state to read two or three 400-800 page books on a single topic that I don't even know is required for RE. Some topics are quite difficult, but at least if we know a topic IS mandatory for RE, then we can be sure about it and refer to other resources. If you don't even know what your syllabus is, how can you concentrate on and master it, let alone learn it? RE requires you to study every minute detail of a computer system, but wasting too much time on those horrible digital logic and design courses is really not worth it.

So my purpose is to make it completely clear what I actually need to know, so that I can focus on it instead of reading each and every topic in complete detail, thinking that if I miss the direction of even a single electron in I/O I won't be able to do efficient reversing. I'm fed up with those architecture diagrams with arrows, and with cramming those definitions (ROM, EEROM, EEPROM...) again and again for tests and assignments.

I've few questions for you:

You mentioned Computer Organization and Design, which I think is authored by Patterson and Hennessy and used by almost all universities. I'm just curious about its not-so-good-looking Amazon reviews. Also, what's your opinion on Tanenbaum's books, which you mentioned in that reddit link?

Now let's summarize what I've understood (PLEASE help correct me if I'm wrong):

>>>> UNDERSTANDING the system you want to hack

> Learn the most used fundamental programming languages (the way we TALK with computers): 1. C (also C++ in some cases); 2. Python or Ruby (given their dominance in industry right now thanks to their productive nature, and their use in exploit writing); 3. Java or C# (object-oriented programming, which along with the above languages completes our programming fundamentals); 4. Assembly (obviously needed in RE). It goes without saying that we need a good grasp of Data Structures and Algorithms in the above languages (obviously not all of them).

> Understand each and every data flow and HOW a computer system works

Computer Organization and Design and Architecture

(OS fundamentals, memory management, virtual memory, paging, caching, etc.; the Linux (macOS too) and Windows internals part comes here, I think)

You restored my faith in humanity when you said I can skip the hardware and microcode part (please explain which specific topics; I swear I won't look at them again until I'm done with the required topics).

> Network Fundamentals and Programming Basics of http, TCP/IP and other protocols.... Socket programming

>>>> THE HACKING PART

> Learning WHAT loopholes there are in the above process of reading and writing data. Types of attacks (buffer overflows, heap overflows....)

> HOW those loopholes are exploited

> Reverse Engineering (learning the tools of the trade: IDA, gdb.....), learning and practising reversing. Fuzzing

> Exploiting the bugs, making exploits.

Please review and correct. Thanks again.

Qrius
EDIT: I don't know why the edit is not updating.

"Basics of http" and "making exploits" are from next line. Thanks for bearing with me. ;)

LiveOverflow
Shameless self-promotion. I have a YouTube channel where I basically try to offer a path for learning exploitation. I'm done covering all the basics, and we will soon move to more advanced stuff. I have videos on various different security topics, but here is the probably more relevant playlist: https://www.youtube.com/playlist?list=PLhixgUqwRTjxglIswKp9m...
Qrius
I know your channel very well. It's praised everywhere because of its good content. I'd be happy if you went through my main concern in the details above and read the discussion. Thanks again for such a wonderful channel. I'll surely learn from it once I've covered the prereqs needed to understand what you're saying in those videos.
LiveOverflow
> I want to understand what are the ACTUALLY NECESSARY topics required and in RIGHT ORDER to MINIMIZE the TIME WASTING and wandering in between topics so that the knowledge aqcuired is more practical in context of current vulnerabilities rather than being more theoretical.

To be honest with you? I consider that sentence almost offensive. I hear you, but I think you have absolutely the wrong expectations. You want to learn something that is not a profession like plumbing, where a really good expert can teach you everything you need to know, with all the little tricks learned over the years. The field is so huge, diverse, and complicated that this won't work. I think my playlist offers a rough outline that you can follow, but without going down rabbit holes left and right, and getting stuck many, many times, you won't become good at it.

I understand the frustration that you don't want to "waste time" and that you are busy already. But everybody I know who is good in this field, and my own experience, shows me that nobody learns this stuff through a straight path. And everybody knows that most of the time will be spent chasing rabbits through a labyrinth and getting stuck.

Also, there is no clear path. It's a complicated web you have to learn to traverse. For example, "Learn C" - what the f* does that even mean? To what extent? Hello World? Drivers? An operating system? "Learn assembler" - which assembler? Have you looked at the Intel instruction spec once? I doubt any human knows every instruction. And who said that Intel is the way to go - why not ARM or AVR? Each of these fields offers a lifetime of study in itself.

The "art" in becoming good at security and RE is to get a broad knowledge of a lot of things and try to simultaneously go deeper 'n deeper in all of them. And if you are interested in a specific field, put more weight on those topics.

You know how long it takes to reverse engineer something? People stare at IDA for weeks or months at a time. You can't learn RE just by reading a book or a blog. You gotta just start doing it, and hopefully find a few blogs and people to keep up the spirit.

Qrius
Why is it that K&R is referred to as the greatest book on C, but is never recommended to a complete beginner and is only seen as a reference book?

Why is it that several resources exist on buffer overflows, yet we ask questions about which one is better?

Why is it that you started your channel even though resources like Art of exploitation and Shellcode Handbook already exist?

Why is it that there are people asking questions like "computer science books you wish you had read earlier"?

Is the one asking, or the one answering, looking for or offering a short trick to become the super h4x0r?

Internet forums exist for a reason. It is always wise to take the advice of someone more experienced than you. I don't see anything wrong with that.

The people who are on top are there for a reason. The root of hacking lies in outsmarting a coder by exploiting the mistakes in his code. Now even a field like this has become a corporate profession.

But there's something that differentiates a hacker from the rest. I think learning from somebody else's mistakes is one of the smartest things you can do.

I still haven't read anything better than Code by Charles Petzold [1] and it's not even close.

[1] https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

faceyspacey
It is a fantastic book. It doesn't take you into typical algorithms (at least not that I recall), but rather explains as intuitively as possible how a computer is built up from flip-flops and binary logic to assembly, intermediate languages, and on to full-on compilation of a usable language.

Basically, beginner programmers can acquire a broad understanding of the foundation their programs are built on by reading this book. It reads more like a non-fiction exposé than a programming tutorial, which is to say, given its subject, it's an easy read you can do on the couch. Depending on your skill and knowledge level, there may be a few sections you have to re-read several times until you understand them, but you won't feel as though you need to go over to your computer chair and try something to fully grasp it.

If you can do basic arithmetic, you can get through this book. That seems to be the hidden premise: that computers are easy and should be easy to understand. This book is a testament to that, though I'm sure some will find it doesn't go deep enough. But the point is: learning so generally will create many entry points for you to follow up on in your journey into programming and computer science. It will clear up many things and essentially make the path seem less scary and out of reach. This book achieves that really well. High-level programmers will come away feeling far less insecure about their lack of knowledge of the underpinnings of whatever it is they are developing. I know I did. I can't say enough about this book. It's the real deal. I'm sure those with a computer science degree might have more to say (that is, they likely think it's a cursory overview), but I think for everyone else it's a computer science degree in a book you can read in one or two weeks. At least half the degree. For the second half, I recommend Algorithms in a Nutshell. And done! Go back to programming your high-level JavaScript React app and get on with your life.

On a side note: it's my opinion that theory first is the wrong way. Application first, theory as needed, is the right approach. Otherwise it's like learning music theory before you know you even like to play music. You might not even like being a programmer, or be a natural at it. And if you spend 4 years studying theory first, you will have spent a lot of time to discover what you could have discovered in about a month. In addition, it can suck the joy and fun out of the exploration of programming and computer science. It's natural and fun to learn as you dive into real problems. Everything you can learn is on the internet. It's very rewarding and often faster to learn things when you are learning them to attain a specific goal. The theory you do learn seems to make much more sense in the face of some goal you are trying to apply it to. In short, over your computing career you can learn the same stuff far faster and far more enjoyably if you do so paired with actual problems.

That said, sometimes you do have to step back and allocate time for fundamentals, even if you have no specific problem they are related to. You will know when it's time to brush up on algorithms, or finally learn how the computer works below your day-to-day level of abstraction. Just know that a larger and larger percentage of us programmers went the applied route rather than the theory-first, formal-education route. It's probably the majority of programmers at this point in time. In short, you are not alone in learning this as you go. Learn to enjoy that early on and save yourself the pain of feeling insecure about not knowing everything. This is an exploration and investigation, and perhaps you will make some discoveries nobody else has been able to make, long before you have mastered and understood everything there is to know about the computer. Perhaps that's its biggest selling point: you don't have to know everything before you can contribute to the computing world! So enjoy your pursuits in programming, knowing that in your unique exploration you may at any time come up with something highly novel and valuable.

bluesilver07
Is this book relevant only to beginners? Do you think a programmer with about 7 years of experience (in C, C++, C#, Java) will find it useful?
djhworld
Absolutely. The book stays fairly general but goes into just enough depth without bewildering you.
nvarsj
Yes! I read it about 5 years after finishing my degree (which already covered many of the topics in depth), and it was very enjoyable. It gives a very good, succinct (if simplified) overview of computer architecture.
rickhanlonii
It's a really short and interesting read. It's difficult to think anyone could read it and find it to be a waste of time.
spraak
Thank you for your inspiring words!
Shorel
I don't think theory first is wrong by itself.

I think that everyone will get new information better when it is something that fixes an immediate issue or clarifies an immediate doubt.

But this is not a contradiction. Theory can be presented in such a way that you want and need to know the next piece of information, the way mystery novels work.

It can be easier for the writer to create this need with examples instead of narrative, no doubt about it.

But let's not fall into the opposite extreme of having only examples and no theory, so common with blogs now. I feel empty when I read such material.

After quitting a job with a lot of commute time, and having failed to monetize a side project, I finally landed a teaching position at a local technical university.

I always loved learning and teaching, and a side effect of this is that now I've regained the curiosity I always had about the fundamentals of our industry (I've a CS PhD). So now I'm back reading about the fundamentals of electricity and building 8-bit digital adders with basic AND/OR/XOR logic gates [0].

There are still lots of fundamental things that I want to re-learn, and for 2017 I'm thinking of writing a book about learning programming from exercises (with just enough theoretical concepts), starting from flow charts and pseudocode and going up to some basic algorithms and abstract data structures/types (probably using Python). My idea is that there are lots of students out there who could benefit from learning how to program by solving focused exercises, and who could learn enough about algorithms and structures to feel capable of doing more complex things (i.e., not feel the "impostor" syndrome).

[0] - https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

I agree, the book isn't very "bottom-up" at all, perhaps with the exception of "Binary and Number Representation" being the second chapter; the rest of it looks like OS stuff.

This is what I'd consider "bottom up":

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

MattSteelblade
My choice for bottom-up book: The Elements of Computing Systems: Building a Modern Computer from First Principles https://www.amazon.com/Elements-Computing-Systems-Building-P...
impappl
This is also known as the "nand2tetris" course: http://www.nand2tetris.org/

Great book!

I have a physics background but not an EE background. I found verilog pretty easy to grasp. VHDL took me a lot longer.

To get some basic ideas I always recommend the book Code by Charles Petzold: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

It walks you through everything from the transistor to the operating system.

(Apparently I need to add that I work for AWS on every message so yes I work for AWS)

technological
Thank you
I always recommend this book for those who would like to know how computers really work:

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

RandomOpinion
While I like Petzold's "Code", it's really aimed at nontechnical audiences. Jon Stokes' "Inside the Machine" or Nisan and Schocken's textbook "The Elements of Computing Systems" are far better if you have a technical background.
grzm
I second the Stokes recommendation. I have very little hardware background, but quite a bit of science and tech (engineering). He provided the right amount of background and introduction for me.
Code by Charles Petzold would be the best in my opinion.

link: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

napsterbr
+1. Great gift for non-technical friends/relatives. It does get into real technical depth on computer organization, but it has a nice gradual intro. If OP's focus is computer science history, though, there may be better options (I don't know which ones; Code does share some historical detail, but not that much). It should get your niece interested in computer science regardless.
I don't think we're near that many layers of abstraction (yet --- and hopefully never?), but certainly more than ten. Indeed it's all about switching binary signals eventually. There's a great book about that too:

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

kens
Well, let's see how many layers of abstraction the Alto has. 1) Software (a BCPL program). 2) Library / OS code. 3) Machine code. 4) Microcode. 5) Logic signals (TTL chips). 6) Analog signals (e.g. rise time and voltage of the logic signals). 7) Transistors inside the chips. 8) Quantum mechanics. (The last two levels are hidden to us.) If you're using Smalltalk, you could add another layer for the interpreter.
jackhack
Funny, I was thinking of Petzold's book when I wrote that. +1 for mentioning it!

I still say there are that many layers of abstraction, maybe more. Individual bits, ganged as words, with logic to treat them as such, and the ability to store them and move them along a bus. Then you add the ability to act on data: control logic, arithmetic units, caches, and the interpretation of data as instructions -- microcode running in the CPU, and processor instructions. Now move up to main memory, storage subsystems, networking, graphics (GPUs!!) etc. etc. etc. That gets you through the hardware. Now add software: BIOS, operating system, drivers, user-space code, compilers, interpreters; it just goes on and on. And for every one of these giant groups of something, there are just so many logical groupings and abstractions.

It's like individual cells in a body, forming organelles, forming organs, forming creatures. So many layers of organization. Amazing beauty.

But always at the bottom of the pile, there is a switch.

> "Finally, an honorable mention to three papers that don't qualify, but which I think you should read anyway."

If we're going for papers, then I'm guessing books are allowed too. If so, for anyone interested in giving themselves a grounding in the fundamentals, it's worth checking out Code by Charles Petzold. I've been going through it, it's excellently written, and has helped me fill in gaps in my understanding of how computers work.

https://www.amazon.co.uk/Code-Language-Computer-Hardware-Sof...

qwertyuiop924
SICP, Land of Lisp, Exploding the Phone, and The Cuckoo's Egg, while I haven't finished all of them, were all instrumental in making me who I am today.
Haven't read the book you're recommending, but I feel it's more or less close to Code[0] by Charles Petzold, which in itself is a fascinating read.

[0]: https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

lukeck
I read both books a long time ago. My recollection is that Code spends more time building context and explaining why something works as it does. "Elements of Computing Systems" gives more detail on how to implement many of the same concepts in a simple way. Both are great books.
GrumpyYoungMan
The two are not at all similar. Petzold's "Code" is good but is aimed at non-technical readers while "The Elements of Computing Systems" is more or less a textbook that encapsulates a longitudinal slice of a 4-year computer engineering program, complete with exercises. It's really quite impressive in what it manages to cover (although the massive amount of material glossed over or omitted does make me wince).
This is the sort of thread that hits me right in the wallet.

Here are some books I've given as gifts recently:

* The Knowledge: How to Rebuild Civilization in the Aftermath of a Cataclysm, Lewis Dartnell[1]

* The Black Swan, Nassim Taleb[2]

* Siddhartha, Hermann Hesse[3]

* The Happiness Trap, Russ Harris and Steven Hayes[4]

* Code, Charles Petzold[5]

[1] https://www.amazon.com/Knowledge-Rebuild-Civilization-Afterm...

[2] https://www.amazon.com/Black-Swan-Improbable-Robustness-Frag...

[3] https://www.amazon.com/Siddhartha-Hermann-Hesse/dp/161382378...

[4] https://www.amazon.com/Happiness-Trap-Struggling-Start-Livin...

[5] https://www.amazon.com/Code-Language-Computer-Hardware-Softw...

Frogolocalypse
The Black Swan is on my list too.
nl
Is it really that good?

I've started it a few times. Nassim Nicholas Taleb seems to make sure never to use one word when ten could possibly be used, especially if some of them are about himself.

MagnumOpus
No, it is not. The idea behind the book is as sound as it is simple: shoehorning normal distributions in places where they shouldn't go just to make problems tractable will end in disaster due to an excess of fat tails in the real world.

However, Taleb has been pontificating on that single idea for fifteen years now and has parlayed twenty pages worth of ideas into three books, a collapsed hedge fund and numerous academic positions.

Skim the first three chapters of any of his three books, and you will have learnt all there is to learn from him.

gooseus
While I can't disagree too much with my sibling comments, I do believe that the shift in mental model is worth the criticisms.

It's a shame that his style, wordiness, and pretension sometimes get in the way of communicating a really significant and fundamental concept that I believe everyone should incorporate into their world view.

b_emery
I thought Fooled by Randomness was much better. Higher information density.
sundarurfriend
Yep, I haven't been able to finish the book either, and what I've read didn't stand up to all the hype.

Taleb's Antifragile I did, unfortunately, finish, and it's way, way worse.

Now that I think about it, both books have a similar pattern: the first dozen or so pages present an interesting idea, which does give you a fresh and useful mental model in understanding the world. The rest of the book, unfortunately, meanders off into superficial redundant applications of it and pounding into the reader's head how anti-establishment Taleb is.

Have you read a book called Code by Charles Petzold [0] by chance? It walks you through how a computer works from low-level relays all the way up to building a processor and running an OS. He talks about both the Z80 and 6502, and peppers in a lot of the history along the way. The book has me really itching to play around with one of the two, and it sounds like the Z80 might be the way to go.

[0] https://www.amazon.com/gp/aw/d/0735611319/

mindcrime
I started it a few weeks ago, yeah. Definitely looking forward to churning through all of it.
I'm a programmer without a computer science degree and I'm quite aware that CS is a bit of a blind spot for me so I've tried to read up to rectify this a little.

I found The New Turing Omnibus[1] to give a really nice overview of a bunch of topics, some chapters were a lot harder to follow than others but I got a lot from it.

Code by Charles Petzold[2] is a book I recommend to anyone who stays still long enough; it's a brilliant explanation of how computers work.

Structure and Interpretation of Computer Programs (SICP)[3] comes up all the time when this kind of question is asked and for good reason; it's definitely my favourite CS/programming book, and it's available for free online[4].

I'm still a long way off having the kind of education someone with a CS degree would have, but those are my recommendations. I'd love to hear the views of someone more knowledgeable.

[1] https://www.amazon.co.uk/New-Turing-Omnibus-K-Dewdney/dp/080... [2] https://www.amazon.co.uk/Code-Language-Computer-Hardware/dp/... [3] https://www.amazon.co.uk/Structure-Interpretation-Computer-E... [4] https://mitpress.mit.edu/sicp/full-text/book/book.html

> no one has time to learn everything

Nonsense. The fundamentals don't take a long time to learn. And once you know them, everything else becomes much easier to learn. That's the reason that learning the fundamentals matters: it's a huge lever.

Here's a single, small, very accessible book that takes you all the way from switches to CPUs:

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

SICP gets you from CPUs to Scheme (well, it goes in the other direction, but the end result is the same). That's two books to get from switches to compilers. Anyone who thinks they don't have time for that needs to learn to manage their time better.

MichaelMoser123
> SICP gets you from CPUs to Scheme

I don't recall anything about CPUs in SICP. It's more about data-driven programming and writing interpreters.

What I liked about SICP and Scheme programming was that it is a pretty good environment for tinkering - the REPL makes it easy to combine functions and to work in a bottom-up manner. (Btw, you had less of that in Common Lisp and most other environments that teach you to work in a top-down way; however, you can still work with the Python REPL.)

Maybe this bottom-up approach is what Sussman really has in mind when he is talking about first principles, because SICP is really not about working up from the atoms to the scheme interpreter/compiler.

vithlani
Sorry, why do you feel there is less of that in Common Lisp? Surely the REPLs offer equal facilities...
lisper
> I don't recall anything about CPUs in SICP.

https://mitpress.mit.edu/sicp/full-text/book/book-Z-H-30.htm...

MichaelMoser123
yes, the compiler target. I stand corrected.
quicklyfrozen
I bought this book for my son who will be starting a CS program in the fall. He seems to have enjoyed it, and I'm hopeful it will give him a good grasp of the fundamentals that you might miss by starting out with Java.
mindcrime
Yes, learning the fundamentals is a huge lever. I absolutely agree. But I still stand by the assertion that "no one has time to learn everything" - especially at the beginning of their career.

As the old saying goes "if I had 3 days to cut down a tree, I'd spend the first 2.5 days sharpening my axe". Sure, but at some point you have to actually start chopping. By analogy, at some point you have to quit worrying about fundamentals and learn the "stuff" you need to actually build systems in the real world.

By and large I'm in favor of spending a lot of time on fundamentals, and being able to reason things out from first principles. And when I was younger, I thought that was enough. But the longer I do this stuff, and the larger the field grows, the more I have to concede that, for some people, some of the time, it's a smart tradeoff to spend more of their time on the "get stuff done" stuff.

oumua_don17
>> "no one has time to learn everything" - especially at the beginning of their career.

I wish I had this book at the beginning of my career. http://www.amazon.com/Elements-Computing-Systems-Building-Pr.... Makes you design the hardware, then the software for that hardware.

Should not take more than 8-12 weeks alongside school work or a day job.

shitgoose
"no one has time to learn everything"

right. No one has time to learn endless js frameworks, injections and reinventions of the wheel. So (1) read the fucking SICP book, (2) learn about the business problem you are trying to solve, put 1 and 2 together and "get stuff done".

jkaunisv1
This is what co-op programs address. 5 year degree, learn all the fundamentals from silicon to applications, with 6 co-op placements of 4 months each interspersed throughout. That was the Waterloo formula when I went through their CS program and it works tremendously well. Sure, it's a lot to learn in 5 years, and there's always more to learn, but it does give you a very solid foundation to build on.
ktRolster
especially at the beginning of their career.

That's why you spend four years in a university before you start your career. If you're not going to learn all the fundamentals, you might as well go to an 8 week code camp and save your time.

mindcrime
But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful. So what do we do, have people do a 4 year degree, and then go spend 8 weeks, or 16 weeks, or a year, learning to actually build systems?

I don't know, maybe that is the answer. But I suspect the MIT guys have a point in terms of making some small concessions in favor of pragmatism. Of course, one acknowledges that college isn't meant to be trade school... Hmmm... maybe there is no perfect answer.

gnaritas
> But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful

A minor point to make here: college isn't about learning practical skills; that's what trade schools and on-the-job training do. College is about learning the fundamentals. However, computer science is not software engineering, and you don't need to learn computer science to be a good software engineer, because software engineers don't do much in the way of computer science anyway.

mindcrime
A minor point to make here, college isn't about learning practical skills;

Yes, on paper that is true. And I alluded to that point in my answer above (or at least I meant to). But from a more pragmatic and real world point of view, colleges are at least partially about teaching practical skills. I don't know the exact percentage, but I'm pretty sure a large percentage of college students go in expecting to learn something that will pointedly enable them to get a job. Of course one can argue whether that's a good idea or not, and I don't really have a position on that. But I don't think we can ignore the phenomenon.

lisper
> even 4 whole years isn't enough to learn "all the fundamentals"

No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental." It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn.

fsloth
" It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn."

Thanks, you succeeded in verbalizing succinctly the agony of so many software projects. However, I would claim it's not just ignorance of the fundamentals. It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems. I call it cargo-cult progress.

JadeNB
> It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems.

Spolsky calls these 'architecture astronauts' (http://www.joelonsoftware.com/articles/fog0000000018.html). (As a mathematician, I have a certain leaning to this mindset, so I clarify that I am not criticising it, just reporting on existing terminology.)

fsloth
It is absolutely essential to have a theory of the system while implementing it. The software system itself, however, should exhibit only a simple subset of the whole theory, in as concise a form as possible. Because usually the full theory becomes obvious only while writing the software. And, in practice, one uses only a tiny part for the implementation at hand.

I think one facet of this thing is that people have an ambition to build huge systems that encompass the entire universe - while in fact, most software systems only need to utilize the already existing capabilities in the ecosystem.

It's as if, since people are eager tinkerers, they approach software projects with the mindset of miniature railroad engineers - while in fact, the proper way to attack the task is to be as brutally simple as possible.

The reason huge theoretical systems are a problem in software engineering is that a) they tend to be isomorphic to things that already exist, b) while not implementing anything new, and c) they obfuscate the software system by introducing system-specific interfaces (e.g. they invent a new notation for algebra that is isomorphic in all aspects to the well-known one). And the worst sin is that this method destroys profitability and value.

genop
Amen.

How could they ever understand "simple and easy"? Their concept of simple is not based in reality.

There seems to be this idea (I wish it had a name) that one can just apply a new abstraction layer and the lower layers magically disappear or are rendered insignificant.

And of course producing 100's or 1000's of pages of "documentation". This is disrespectful toward the reader's time. The best documentation is and will always be the source code. If I cannot read that, then for my purposes there is no documentation.

This is not to say some of these old new thing higher-level of abstraction solutions do not work well or are not easy to use or do not look great. Many people love them, I'm sure. But until I see more common sense being applied it is just not worth my time to learn. I would rather focus on more basic skills that I know will always be useful.

mindcrime
No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental."

I think what we actually disagree about is just the exact definition of "fundamentals". I personally would not say "the fundamentals are very simple and easy to learn". At least not if you're learning them to the level of depth that I would want to learn them.

Now if we're just saying that people only need a very high level overview of "the fundamentals", then that might swing the equation back the other way. But that, to me, is exactly the knob that the MIT guys were twiddling... moving away from SICP and using Python doesn't mean they aren't still teaching fundamentals, it just sounds like they aren't going quite as deep.

Anyway, it's an analog continuum, not a binary / discrete thing. We (the collective we) will probably always be arguing about exactly where on that scale to be.

lisper
> I think what we actually disagree about is just the exact definition of "fundamentals".

That may well be, but as I am the one who first introduced the word into this conversation (https://news.ycombinator.com/item?id=11630205), my definition (https://news.ycombinator.com/item?id=11632368) is the one that applies here.

mindcrime
Fair enough... from that point of view, I think we agree more than we disagree.

And please don't take anything I'm saying here as an argument against learning fundamentals.. all I'm really saying is that I can understand, even appreciate, the decision MIT made. Whether or not I entirely agree with it is a slightly different issue. But I will say that I don't think they should be excoriated over the decision.

gizmo385
I disagree. The field has exploded. It's becoming more and more difficult to take vertical slices of every sub-field. What should we consider fundamental?

Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating systems, parallel and distributed programming, natural language processing, web design, network architecture, databases.

There are plenty of core fields that I've missed. Which are fundamental? Which do we teach? It simply isn't possible to adequately teach everyone the core fundamentals of all of these fields during the course of an undergraduate degree while also conveying the basic design fundamentals that a software developer needs to know. There is just too much in the field to expect every software developer to have a complete understanding of all of the fundamentals in every sub-field. Our field is getting a lot wider and a lot deeper, and with that, people's expertise is forced to narrow and deepen.

lisper
> What should we consider fundamental?

A fair question, and a full answer would be too long for a comment (though it would fit in a blog post, which I'll go ahead and write now since this seems to be an issue). But I'll take a whack at the TL;DR version here.

AI, ML, NLP, and web design are application areas, not fundamentals. (You didn't list computer graphics, computer vision, robotics, embedded systems -- all applications, not fundamentals.)

You can cover all the set theory and graph theory you need in a day. Most people get this in high school. The stuff you need is just not that complicated. You can safely skip category theory.

What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically. This is the part that the vast majority of people are missing nowadays, and it can be a little tricky to wrap your brain around at first. You need to understand what a fixed point is and why it matters.
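
For what it's worth, a toy sketch in Python of both senses of that (purely illustrative): a fixed point of f is a value x where f(x) == x, and a recursive definition like factorial can be read as the fixed point of a "one more layer" transformation.

    import math

    def fixed_point(f, guess, tolerance=1e-9):
        """Iterate f from an initial guess until the value stops changing."""
        while True:
            next_guess = f(guess)
            if abs(next_guess - guess) < tolerance:
                return next_guess
            guess = next_guess

    # cos has a fixed point near 0.739: cos(0.739...) == 0.739...
    print(fixed_point(math.cos, 1.0))

    # A recursive definition is the fixed point of a "build one more layer"
    # transformation: feed step any partial factorial and it returns a
    # slightly better one. Stacking it four times works for n up to 3;
    # the (limiting) fixed point of step is factorial itself.
    def step(fact):
        return lambda n: 1 if n == 0 else n * fact(n - 1)

    fact3 = step(step(step(step(lambda n: None))))
    print(fact3(3))   # 6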

You need automata theory, but again, the basics are really not that complicated. You need to know about Turing-completeness, and that in addition to Turing machines there are PDAs and FSAs. You need to know that TMs can do things that PDAs can't (like parse context-sensitive grammars), that PDAs can do things that FSAs can't (like parse context-free grammars), and that FSAs can parse regular expressions, and that's all they can do.
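
As a toy illustration of the weakest of those machines, here is an FSA written as a Python transition table (state names are made up); it recognizes binary strings ending in "01", and, having no stack or tape, this kind of fixed-memory pattern is all it can ever recognize.

    # States: 'start', 'seen0', 'accept'. Input alphabet: '0' and '1'.
    TRANSITIONS = {
        ('start', '0'): 'seen0', ('start', '1'): 'start',
        ('seen0', '0'): 'seen0', ('seen0', '1'): 'accept',
        ('accept', '0'): 'seen0', ('accept', '1'): 'start',
    }

    def ends_in_01(s):
        """Run the FSA: no stack, no tape, just the current state."""
        state = 'start'
        for ch in s:
            state = TRANSITIONS[(state, ch)]
        return state == 'accept'

    print(ends_in_01('1101'))   # True
    print(ends_in_01('0110'))   # False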

You need some programming language theory. You need to know what a binding is, and that there are two types of bindings that matter: lexical and dynamic. You need to know what an environment is (a mapping between names and bindings) and how environments are chained. You need to know how evaluation and compilation are related, and the role that environments play in both processes. You need to know the difference between static and dynamic typing. You need to know how to compile a high-level language down to an RTL.
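
A rough Python sketch of two of those ideas (class and variable names are illustrative, not any particular language's machinery): an environment as a mapping from names to bindings, chained to an enclosing environment, plus lexical binding as Python's own closures exhibit it.

    class Environment:
        """A mapping from names to values, chained to an enclosing environment."""
        def __init__(self, bindings, enclosing=None):
            self.bindings = bindings
            self.enclosing = enclosing

        def lookup(self, name):
            if name in self.bindings:
                return self.bindings[name]
            if self.enclosing is not None:
                return self.enclosing.lookup(name)   # walk the chain outward
            raise NameError(name)

    global_env = Environment({'x': 10})
    inner_env = Environment({'y': 2}, enclosing=global_env)
    print(inner_env.lookup('x'))   # 10, found one link up the chain

    # Lexical binding in Python itself: n is resolved in the environment
    # where the lambda was written, not in the environment of the caller.
    def make_adder(n):
        return lambda x: x + n

    add3 = make_adder(3)
    print(add3(4))   # 7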

For operating systems, you need to know what a process is, what a thread is, some of the ways in which parallel processes lead to problems, and some of the mechanisms for dealing with those problems, including the fact that some of those mechanisms require hardware support (e.g. atomic test-and-set instructions).
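
A small illustrative sketch of the parallelism problem in Python (a real kernel leans on hardware support such as atomic test-and-set rather than a library lock): threads doing an unprotected read-modify-write can lose updates, and a lock serializes the critical section.

    import threading

    counter = 0

    def work():
        global counter
        for _ in range(100_000):
            tmp = counter         # read
            counter = tmp + 1     # write: another thread may have run in between

    threads = [threading.Thread(target=work) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)                # can come up short of 400000: lost updates

    # The fix: make the read-modify-write atomic by holding a lock.
    counter = 0
    lock = threading.Lock()

    def safe_work():
        global counter
        for _ in range(100_000):
            with lock:
                counter += 1

    threads = [threading.Thread(target=safe_work) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)                # 400000 every time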

You need a few basic data structures. Mainly what you need is to understand that what data structures are really all about is making associative maps that are more efficient for certain operations under certain circumstances.
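
A minimal sketch of that point (the data is made up): the same association implemented as a list of pairs, which you scan, versus a hash table, which jumps straight to the entry. Both are "the same map"; they just pay different costs for lookup.

    pairs = [('alice', 30), ('bob', 25), ('carol', 41)]

    def assoc_lookup(pairs, key):
        # Association list: scan every pair until the key matches, O(n).
        for k, v in pairs:
            if k == key:
                return v
        raise KeyError(key)

    table = dict(pairs)           # hash table: roughly O(1) lookup, no scan
    print(assoc_lookup(pairs, 'bob'), table['bob'])   # 25 25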

You need a little bit of database knowledge. Again, what you really need to know is that what databases are really all about is dealing with the architectural differences between RAM and non-volatile storage, and that a lot of these are going away now that these architectural differences are starting to disappear.

That's really about it.

jim-greer
I'm of two minds about this. Everything you mention is great background to have. (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I think this deep background is a great goal. But in another way programming is a craft. You can learn as you go. There are millions of bright people who could do useful, quality work without an MIT-level CS education. They just need some guidance, structure, and encouragement.

loup-vaillant
> (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I wouldn't be so sure. I once applied to a company whose business was proving various safety properties of control and signalling applications (for trains). Sizeable applications. They have problems with the halting problem and combinatorial explosions, but they get around those and do it anyway.

int_19h
A very vaguely related question: are bindings lexical or dynamic in R? Or would it be fair to say that it's actually both at the same time? Or do we need a new term altogether?

For those unfamiliar with it, in R, environments are first-class objects, and identifier lookup in them is dynamic. But the way those environments are chained for lookup purposes is defined by the lexical structure of the program (unless modified at runtime - which is possible, but unusual) - i.e. if a function's environment doesn't have a binding with a given name, the next environment that is inspected is the one from the enclosing function, and not from the caller. So R functions have closure semantics, despite the dynamic nature of bindings.

It would appear that this arrangement satisfies the requirements for both dynamic scoping and for lexical scoping.

conceit
> AI, ML, and NLP and webdesign are application areas

On first thought, I do agree. However, they are fundamental applications. Category theory is categorizing the fundamentals. It uses a lot of the fundamentals on itself, I guess, but that doesn't mean to me that it's inaccessible or useless.

loup-vaillant
Don't confuse "important" with "fundamental". He probably meant foundational to begin with.

The web for instance is an unholy mess. We can do better for mass electronic publishing. We don't because that huge pile of crap is already there, and it's so damn useful.

conceit
Web design IMHO is an extreme example of formatted output. I/O is a fundamental concept.
ktRolster
>What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically.

Yes please.

jholman
> You need automata theory... Turing-completeness... PDAs and FSAs...

Why? I know that stuff inside and out, and across multiple jobs I have used none of it ever. What chance to use this favourite bit of my education did I miss? (Or rather, might I have missed, so that you might speak to the general case)

> You need to know how to compile a high-level language down to an RTL.

Why? Same comment as above.

> You need to understand what a fixed point is and why it matters.

Well, I don't, and I don't. I request a pointer to suitable study material, noting that googling this mostly points me to a mathematical definition which I suspect is related to, but distinct from, the definition you had in mind.

Otherwise... as I read down this thread I was all ready to disagree with you, but it turns out I'd jumped to a false conclusion, based on the SICP context of this thread. Literacy, what a pain in the butt.

In particular, the idea that ((the notion of an environment) is a fundamental/foundational concept) is a new idea to me, and immediately compelling. I did not learn this in my academic training, learned it since, and have found it to be very fruitful. Likewise with lexical vs dynamic binding, actually.

lisper
>> You need automata theory... Turing-completeness... PDAs and FSAs...

> Why?

So you can write parsers. To give a real-world example, so you can know immediately why trying to parse HTML with a regular expression can't possibly work.
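
A toy version of that point in Python (the tag soup is invented for the example): nesting can be arbitrarily deep, so the checker needs a stack - exactly the memory a finite automaton, and therefore a plain regular expression, doesn't have.

    import re

    def balanced(markup):
        """PDA-style check: the stack remembers arbitrarily deep context."""
        stack = []
        for tag in re.findall(r'</?\w+>', markup):
            if not tag.startswith('</'):
                stack.append(tag[1:-1])            # push the opening tag's name
            elif not stack or stack.pop() != tag[2:-1]:
                return False                       # close with no/wrong open tag
        return not stack                           # everything opened was closed

    print(balanced('<a><b><c></c></b></a>'))   # True, at any depth
    print(balanced('<a><b></a></b>'))          # False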

>> You need to know how to compile a high-level language down to an RTL.

>Why?

So that when your compiler screws up you can look at its output, figure out what it did wrong, and fix it.

>> You need to understand what a fixed point is and why it matters.

> Well, I don't, and I don't. I request a pointer to suitable study material

http://ocw.mit.edu/courses/electrical-engineering-and-comput...

Particularly lectures 2A and 7A.

ktRolster
I don't know about your university, but mine gave at least some coverage of all those categories.

At a minimum an education should give you a strong enough base that you can teach yourself those other things should you so desire.

infinite8s
> Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating system, parallel and distributed programming, natural language processing, web design, network architecture, databases.

That was exactly the curriculum of my CS degree, minus the web design, and I didn't even go to a first-rate CS program like MIT (Penn State, 20 years ago; no idea what the curriculum is now).

fsloth
There is the actual complexity, and then there is the accidental complexity lamented by the poster you responded to. I would claim both are a thing. Especially in projects where the true complexity is not that great and the theoretical basis of the solution is not that well documented, people have a tendency to create these onion-layered monstrosities (the mudball effect).

If we look at just traditional CRUD-kind (i.e. database, view, domain logic) programs, these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.

wolfgke
> If we look at just traditional CRUD-kind (i.e. database, view, domain logic) programs, these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.

Perhaps because this type of program is so old, it has had so much time for lots of mud to stick to it. :-)

Code: The Hidden Language of Computer Hardware and Software[0]

My kids enjoyed this book; similar topic, but fairly playful in how it was put together, and an extremely gentle introduction without shying away from how things actually work. It's hard to imagine a reader not coming away with a much better understanding of what computing is all about. It starts at gates and works up to actual (machine) code at the end of the book. Very good diagrams throughout.

Despite being from 2000, I don't think it's become outdated. I'd love it if there was a sequel that covered putting things together with a cheap FPGA.

[0] http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

This book was something I found very fun to read: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

N.B. I did read the book when I was in university studying CS, but I felt like it was a good balance of history and tech information.

Jun 11, 2015 · spb on What Is Code?
> Intro articles like this do a lot to reveal biases and misunderstandings.

This is one of the reasons I barely recommend any intro articles in Lean Notes (http://www.leannotes.com/): almost every single one is just a stream of incomplete and incorrect statements about how the world works, based on the author's myopic personal experiences.

Rather than properly generalizing and consolidating what needs to be said to convey a full understanding of the topic, most intros settle for the first example they can think of that could be remotely construed as related to the words they've previously used for whatever subject, regardless of whether it has meaning in any context. (Example: saying that type safety prevents you from trying to "multiply seven by cats".)

It seems like a pretty Dunning-Kruger thing: the less broad your knowledge is, the more justified you feel in writing an introductory text to the field.

The only case I can immediately recall of somebody actually qualified to write an introductory text actually doing so is Charles Petzold's [Code: The Hidden Language of Computer Hardware and Software][Code] (although I suspect, from the few excerpts of it I've seen, that Brian Kernighan's "D is for Digital" is good, too).

[Code]: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

May 23, 2015 · barking on Nand to Tetris 1
A great book and accompanying software; imo it lacks just 2 things.

1: the building of the NAND gate itself, and 2: the building of a flip-flop.

Both these tasks can be easily accomplished with reference to the book 'Code' by Charles Petzold (http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...)

and software such as this http://logic.ly/

userbinator
Code is a great book. What I think is the most powerful and interesting point is that a computer can be built from (a sufficiently large number of) any component that can perform logic functions.
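
A quick sketch of that universality (Python used here only as a notation for the truth tables, and the function names are made up): pick one primitive that can perform logic, say NAND, and every other gate, including the adders such books build, can be composed from it.

    def nand(a, b):
        return 0 if (a and b) else 1

    def not_(a):      return nand(a, a)
    def and_(a, b):   return not_(nand(a, b))
    def or_(a, b):    return nand(not_(a), not_(b))
    def xor_(a, b):   return and_(or_(a, b), nand(a, b))

    # A half adder: the first rung on the ladder from switches to CPUs.
    def half_adder(a, b):
        return xor_(a, b), and_(a, b)    # (sum bit, carry bit)

    print(half_adder(1, 1))   # (0, 1): one plus one is binary 10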

I stopped reading Nand2Tetris in any great amount of detail and just skimmed the rest after I saw that it confused Harvard (separate code and data) and Von Neumann (combined) architectures.

sepeth
I haven't read 'Code' yet, but I think starting from NAND gates is right; it has to stop at some level. I feel CMOS, MOSFETs, semiconductor physics.. are all interesting topics, but this book is already too broad.
arundelo
Code actually starts with relays.
Highly recommend reading Code: The Hidden Language of Computer Hardware and Software. The author basically starts from how a telegraph works and extrapolates modern computing from that starting point. Great read, and helped me understand exactly what you're referring to: understanding classical computers down to the metal.

http://amzn.com/0735611319

Here's a book that I had a random encounter with at a teenager, that gave me an excellent understanding of how computers work at the lowest levels: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

It basically starts with a "code" of two friends talking to each other between houses at night via blinking flashlights, and gradually builds up from there to a full, if somewhat barebones, microprocessor, logic gate by logic gate. And it does so in a way that teenage me was able to follow.

sanderjd
Came here to post this. It's a great and approachable book about how information has been codified historically, why that matters, and how foundational it is to computing, and therefore the world at large.
The Soul of a New Machine by Tracy Kidder, the classic book following the development of a new minicomputer in the late 70s.

http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/03164...

Stealing The Network: How to Own the Box. This is a collection of fictional accounts of "hacking" written by hackers. Real-world techniques are described, though in lightweight detail; the aim of the book is more to give an insight into how an attacker thinks. It's quite an enjoyable read too.

http://www.amazon.co.uk/Stealing-Network-How-Own-Cyber-Ficti...

Kingpin: How One Hacker Took Over the Billion-Dollar Cybercrime Underground by Kevin Poulsen. This one's a true story.

http://www.amazon.co.uk/Kingpin-Hacker-Billion-Dollar-Cyberc...

Code: The Hidden Language of Computer Hardware and Software By Charles Petzold. I still have to read this one, but I expect it would fit in with what you're after quite well.

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

Oct 19, 2014 · bprater on Simple CPU
If you want a comprehensive read on this topic, check out the book Code by Charles Petzold: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
I haven't read that one, but I'll definitely check it out. I'd also recommend Charles Petzold's book called "Code". One of the best technical books I've ever read!

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

drivers99
Besides "Code," I highly recommend "The Pattern on the Stone" by Danny Hillis, who also happens to be the designer of Thinking Machines' massively parallel supercomputer, the Connection Machine. It covers more material than Code in a lot less time.

In "Code," Petzold makes sure to cover the material exteremely slowly so that you won't miss any details, so it may get boring for some, but I would also read Code if you have the time and interest.

I was part of that generation. IIRC most kids didn't give a damn about it. At that age everything is just a weird random novelty. The device itself was exciting though; the TO7 and its lightpen were cute. Besides, in the 80s computers weren't a thing; even video games were barely established at home. And LOGO didn't feel like programming; turtling felt more like geometry (left is down if facing left) than anything else, at least to me. We didn't really go into iterations and such.

I hope the new effort will use books like Code http://www.amazon.com/Code-Language-Computer-Hardware-Softwa... or something similar that doesn't take a MacBook Air for granted but instead uses down-to-earth first principles that can be shown, built, and tested with kids' hands.

For slightly older kids, I'm wishing for HtDP-inspired classes.

One of the best books I've seen takes this approach:

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

~

Starting from either extreme (pure maths or pure electrical engineering) is quite healthy--starting in the middle, though, does a disservice.

Jan 03, 2014 · sanderjd on I, Health Insurer
You may enjoy a book he wrote called CODE[0], which is one of my absolute favorites, but I doubt it will convince you to share his political opinions.

0: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

Some resources on making tiny Hello World programs down to the kernel level that may be useful:

https://blogs.oracle.com/ksplice/entry/hello_from_a_libc_fre...

http://www.muppetlabs.com/~breadbox/software/tiny/teensy.htm...

http://timelessname.com/elfbin/

A wee bit heavy, but it's comprehensive. It deals with what happens when you run code, how the architecture of the computer works (by and large) including at the logic level:

http://www.amazon.co.uk/Computer-Systems-Programmers-Randal-...

If you want to go lower (and higher), look at Understanding the Linux Kernel for a good understanding of how an OS is put together, with specific examples, i.e. Linux.

Code, by Petzold, deals with logic and computers from the ground up. It starts with relays and builds them up into gates and usable arithmetic blocks.

http://www.amazon.co.uk/Code-Language-Computer-Hardware-Soft...

http://www.amazon.co.uk/Understanding-Linux-Kernel-Daniel-Bo...

The physics is fairly simple, at least from a CRT or LED display perspective. It gets more tricky dealing with interconnecting microprocessors because a good chunk is vendor-specific.

I think this kind of project is well suited to a guide on how to build a computer from the ground up, starting with logic gates, writing a real-time OS, and developing a scripting language that will run and compile on it. Then you can skip a lot of largely extraneous stuff and have a solid understanding of how the hardware works.

Sep 16, 2013 · planckscnst on Gift for a 10 year old
Petzold's "Code: The Hidden Language of Computer Hardware and Software" aids in the understanding of how computers work at the lowest levels.

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

Cool article! "Code" by Charles Petzold[0] talks about Morse code while covering ways that we encode data. May be a good read for anyone interested in this topic.

[0]: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

I would have probably said (with some imprecision): electrical circuits either carry a current (binary 1) or they don't (binary 0). [Ignoring the fact that you could also measure the AMOUNT of current carried here.] Because of this underlying limitation in electrical circuits, you need to make do with just a binary system in computers.

Depending on the amount of time remaining I would either go into more depth or point him towards "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold (http://amzn.com/0735611319)
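
Going just one step deeper with that explanation, a tiny illustrative sketch: read each wire as carrying current (1) or not (0), and a row of such wires is simply a number written in base two.

    wires = [1, 0, 1, 1]          # current, no current, current, current

    value = 0
    for bit in wires:             # most significant wire first
        value = value * 2 + bit   # same place-value rule as decimal, but base two
    print(value)                  # 11, i.e. binary 1011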

dfritsch
I read that book as a child and loved it, but it is so hard to find a book named "Code" when that is all you remember. That link just made my day.
Thanks. The low level stuff certainly seems like a good thing to learn—I made an effort to read Code[0] a while back and most of it went over my head. The high level programming languages most people work in today are abstracted to a point where it's extremely hard to see their relation to the low level stuff. It would be nice to understand that link better.

[0] http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

The best bottom-up approach to this that I have seen is Charles Petzold's "Code: The Hidden Language of Computer Hardware and Software" [1], which starts with using a flashlight to send messages and walks up the abstraction chain (switch, relay, ALU, memory, CPU...) to most of the components of a modern computer. It's very accessible.

[1] http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

The book Algorithmics: The Spirit of Computing doesn't read like a textbook to me, and it's quite interesting.

http://www.amazon.com/Algorithmics-Spirit-Computing-David-Ha...

The New Turing Omnibus

http://www.amazon.com/The-New-Turing-Omnibus-Excursions/dp/0...

is also good, as is Code by Charles Petzold.

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

AFTER EDIT: While I thought about the first three books I mentioned, I thought of another, Write Great Code, Volume 1: Understanding the Machine by Randall Hyde.

http://www.amazon.com/Write-Great-Code-Understanding-ebook/d...

jh3
I've seen Code in B&N but never did more than quickly skim through it. And I've never heard of the other two. Thanks!
A great book for anyone curious about this topic (going from simple logic gates to more modern processing technology): http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
Jun 22, 2012 · showerst on Ask HN: Summer reading
If you'd like a great non-technical tour of how computers really work conceptually, starting from simple morse-code switches through to assembler, Charles Petzold's "Code" is awesome:

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

Even having understood for years how computers work in principle, nothing quite put it together for me like this book.

There's a similarly great book on the history/methods of cryptography called "The Code Book" by Simon Singh that I recommend too - http://www.amazon.com/The-Code-Book-Science-Cryptography/dp/... It's great because it traces the history but also walks you through how the cyphers actually worked, and provides the best intros I've ever seen to public key and quantum cryptography.

buzain
Simon Singh's book was my introduction to crypto and the wonderful Mathematics and mathematicians behind its vivid history. I had it with me all the time while doing the crypto-class.com online course a couple of months ago for a good historic perspective supporting Prof. Boneh's hardcore crypto topics. Highly recommended. Actually, his Fermat's Last Theorem book is also fascinating if you are interested in Math history.
Not sure if this is the right direction or not. In different ways they are all, essentially, about programming...

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

http://www.amazon.com/Elements-Computing-Systems-Building-Pr...

http://www.amazon.com/Structure-Interpretation-Computer-Prog...

http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/...

http://www.amazon.com/Pragmatic-Programmer-Journeyman-Master...

Good luck!

dlf
Ditto on "Code." As a non-programmer learning to program, it was everything I needed to know to wrap my head around how everything actually works.
syedkarim
This is absolutely the right direction--thanks! The one book on the list that might get kicked back is "Structure and Interpretation of Computer Programs". Too bad, since it's probably the most concrete. This is a great starting point, thanks again.
Recently, I've come to know "Code: The Hidden Language of Computer Hardware and Software"[1], and I really enjoyed it. I don't think it's necessarily the first book every programmer should read, but I do think it's a book every programmer should read. It's an easy read, it's fun, and really does provide what it promises. Highly recommended.

1. http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

famousactress
I really enjoyed it. Being self-taught, it provided a really handy primer on computer-architecture fundamentals, logic gates, etc. that I imagine I'd have otherwise gotten in school.
I love stuff like this. I really hope keen engineering youth are able to get involved with building toy CPUs. Maybe not that complex, but enough to grasp what memory mapping and registers actually are.

Anyone interested could read either: (http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...) {the intro is too gentle for too long, then bam, it's too hard for many people.}

(http://www.amazon.com/Art-Electronics-Student-Manual/dp/0521...) the student lab manual for the art of electronics. Probably best with AoE, which is showing its age but still excellent.

If you want to learn it from the bottom up, read Code.

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

It's a page-turner and you'll know more about computers than many developers. You still won't be a programming guru from this, but it's a great holistic approach that you can then supplement.

jasondrowley
Thanks. I've seen Code at my local B&N (and of course, on Amazon), and I've come close to buying it.

I'm getting it tomorrow.

I don't necessarily know of any one book that meets all of your friend's requirements, but...

Tracy Kidder's The Soul of a New Machine might be good for your friend.

http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/03164...

Another good option might be Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

Or, how about Coders at Work?

http://www.amazon.com/Coders-Work-Reflections-Craft-Programm...

Another one that I have (but haven't had time to read yet) is Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software by Scott Rosenberg. It might have something that your friend would find interesting.

http://www.amazon.com/Dreaming-Code-Programmers-Transcendent...

Another one that may be inspirational, although it's more about personalities than computer science per se, would be Steven Levy's Hackers: Heroes of the Computer Revolution.

http://www.amazon.com/Hackers-Computer-Revolution-Steven-Lev...

pgbovine
thanks for the references! i really appreciate you taking the time to reply to my question.

btw "Dreaming in Code" is the only one of those that I've read, and I don't think it's a good fit for my friend because it's basically the story of software project management gone awry ... hardly inspirational for someone aspiring to learn about the beauty of CS :)

Jul 18, 2010 · pan69 on Ask HN: Should I learn C?
While you're at it you might want to pick up a copy of "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold. One of the most compelling and well-written books on the subject I've read. Reading this alongside learning C and assembly language will make it all "click".

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa..., it's a "general interest non-fiction" book with lots of historical stuff, e.g. coverage of the 8080 and 6500.

I can't find it now, but it reminds me of an MIT Press book for the educated layman that covered electronics, the various generations of semiconductors (and how at that period TI was the only company to negotiate all of them; this was written at the dawn of the LSI or VLSI era), the critical details of wafer yield and resultant profitability, etc.

Perhaps not the right book for the original poster, but for many people it could be very useful.

OK. Depending on your programming background you're facing a steep learning curve. That's why I recommend a bottom up approach for you.

First read "Code" by Charles Petzold. This book will get you "in the mood" and in the right frame of mind: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

Then I suggest you pick up a good book on Assembler. This might be a good choice: http://www.amazon.com/Professional-Assembly-Language-Program...

Start writing some drivers for Linux. Like a memdrive or something. Do it all in Assembler! Oh, you need to read other books on how to do this...

Then pick up the K&R book on C. Now write your memdrive driver in C.

That should get you started. I think it will take you at least up to two years before you're past the learning curve and comfortable with this level of programming.

Oh, you need to be willing to do it for the love of it, because it's highly unlikely that you will make a living using these sorts of technologies (nowadays).

Good luck!

PS: I miss the old days...

hilarious
Sounds good, thanks bro, interesting approach ;D
jacquesm
I wouldn't start off by writing a driver in assembler, probably better to write it in C first, then use the -S flag to get the intermediate and study that until you drop.
hilarious
Writing device drivers in C, and then using the -S flag ... have you seen the output gcc -S on Linux (Ubuntu) returns for a trivial hello world program? ;)
jacquesm
Sure. And for non-trivial programs as well.

Low-level assembly is what you want; that presumably includes interfacing with your favorite C code.

Calling conventions, stack frames and so on.

Besides, the boilerplate runtime stuff has nothing to do with writing kernel code, so a 'trivial hello world' program will have a lot of cruft added to it:

        .file   "hello.c"
        .section        .rodata
  .LC0:
        .string "helo, world!"
        .text
  .globl main
        .type   main, @function 
  main:
        leal    4(%esp), %ecx
        andl    $-16, %esp

        pushl   -4(%ecx)
        pushl   %ebp
        movl    %esp, %ebp
        pushl   %ecx

        subl    $20, %esp
        movl    $.LC0, (%esp)
        call    puts
        addl    $20, %esp

        popl    %ecx
        popl    %ebp

        leal    -4(%ecx), %esp
        ret
        .size   main, .-main
        .ident  "GCC: (Ubuntu 4.3.3-5ubuntu4) 4.3.3"
        .section        .note.GNU-stack,"",@progbits
that's not that bad.

Note the stack frame alignment trick, the fact that 'printf' was specified in the source but puts is being called!

ismarc
The -S flag was invaluable for me when I was moving from C to assembly. Then again, I used it in an iterative fashion: do something simple with -S, do the same thing with -S -O1, -S -O2, -S -O3, and then a smattering of the different optimization techniques to see how it converted the C to assembly. I miss those days (did I just admit to being old?).
jacquesm
Well I'll admit it right along with you, I miss those days too.

I think I must have coded myself to the moon and back by now if you'd print it all out on fanfold paper (in C, mostly), but the joy of getting a little OS to boot up from nothing is never going to pale in comparison to installing framework X and building some web-app. No matter how successful.

Web-apps make substantially more $ though...

pan69
I can assure you that if you want to learn C AND Assembler and you start with C, you will never ever get to the Assembler bit. A memdrive isn't that difficult and if you never solve a 'real' problem, you never learn anything because you don't have to push yourself. Just my 2 cents...
jacquesm
It really is your two cents; what goes for you does not go for everybody else.

Some people will learn just for the fun of learning.

pan69
You are obviously missing my point. But that's alright...
For this kind of thing, I recommend the book

"Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold.

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

Being a CS major, I bought this book out of sheer amazement at how the gist of everything I had learned about computers at university could be conveyed in one book in such an easy and fun manner. The learning curve is so gentle, it just couldn't be easier. I cannot recommend this book enough to _every_ person who really wants to understand how computers work.

HN Books is an independent project and is not operated by Y Combinator or Amazon.com.