HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics)

John R. Pierce · 5 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics)" by John R. Pierce.
View on Amazon [↗]
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
"Uncommonly good...the most satisfying discussion to be found." — Scientific American. Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permitted the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future. To give a solid introduction to this burgeoning field, J. R. Pierce has revised his well-received 1961 study of information theory for a second edition. Beginning with the origins of the field, Dr. Pierce follows the brilliant formulations of Claude Shannon and describes such aspects of the subject as encoding and binary digits, entropy, language and meaning, efficient encoding, and the noisy channel. He then goes beyond the strict confines of the topic to explore the ways in which information theory relates to physics, cybernetics, psychology, and art. Mathematical formulas are introduced at the appropriate points for the benefit of serious students. A glossary of terms and an appendix on mathematical notation are proved to help the less mathematically sophisticated. J. R. Pierce worked for many years at the Bell Telephone Laboratories, where he became Director of Research in Communications Principles. His Introduction to Information Theory continues to be the most impressive nontechnical account available and a fascinating introduction to the subject for lay readers.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
Most of mine are going to be books by philosophers or scientists (about philosophy or other things).

- The Conquest of Happiness by Bertrand Russell was a pretty good one. He has a lot of ideas that were ahead of their time (positive psychology, etc.). You can see a lot of parallels between his ideas and modern Stoicism (although Russell criticized Stoicism elsewhere, I think he came to some of the same conclusions).

- Introduction to Mathematical Philosophy by Bertrand Russell. Another Russell one. I think this is probably the clearest, easiest-to-understand explanation I've ever read of the foundations of mathematics. It's written in a style that should be accessible to almost anyone with a high school education. He wrote it while he was in prison (for his anti-war activism) during WW1. Apparently he left a copy of it with the prison warden.

- An Enquiry Concerning Human Understanding by David Hume. This is worth reading because it is the motivation for basically all of modern philosophy of science (at least in the West). It's also pretty easy to read, and once you've read it you'll find it easier to understand other books and papers that are responses to it.

- Turing's Cathedral by George Dyson. This book should be required reading for every programmer or aspiring programmer IMO. I learned so much about the history of computing that I didn't know before reading this. You will not regret buying this one.

- I Am a Strange Loop by Douglas Hofstadter. Obviously everyone knows about GEB, but he also wrote a shorter follow-up that in my opinion expresses his ideas much more clearly. I think that even if you disagree with him, it's worth reading because there are so many things you can take away from this book. For example, he talks about his wife's death, ties that into his theory of mind, and explains the unstated purposes of the funerals/wakes we hold for people.

- An Introduction to Information Theory by John R. Pierce. For someone like me who doesn't really have a very strong math background, this was a very clear intro to the ideas behind information theory, and why they're important historically. I would recommend this to anyone who feels like they need a gentle intro to the ideas and motivation for them. Dover mathematics books in general are great.

- Borrow: The American Way of Debt by Louis Hyman. This is a fantastic historical overview of personal credit in the US that covers the past 120 years or so. I learned a ton from reading this that I had no clue about. Recommended to anyone who wants to understand the origins of credit cards / loans, and how society came to embrace being in debt.

https://archive.org/details/in.ernet.dli.2015.222834/page/n7

https://people.umass.edu/klement/imp/imp-ebk.pdf

https://archive.org/details/humeenquiry00humerich/page/n7

https://www.amazon.com/Turings-Cathedral-Origins-Digital-Uni...

https://www.amazon.com/Am-Strange-Loop-Douglas-Hofstadter/dp...

https://www.amazon.com/Introduction-Information-Theory-Symbo...

https://www.amazon.com/Borrow-American-Debt-Louis-Hyman/dp/0...

I was introduced to Shannon's theory through Pierce's text, which is also surprisingly good and cheap [1].

[1] https://www.amazon.com/Introduction-Information-Theory-Symbo...

anongraddebt
I was introduced to Shannon through the same text, and also found it good and cheap. I second the recommendation!
The information theory (mathematical) definition of randomness is explained nicely in Chapter V of An Introduction to Information Theory:

http://www.amazon.com/Introduction-Information-Theory-Symbol...

In the particular case of a series of n equally probable, independent events, the entropy is given as H = -log2(1/n) = log2(n), measured in bits. For example, the entropy of a fair die is log2(6) ≈ 2.58 bits per throw.

In this case, the random event is words chosen from a word list. Four words are chosen from fifty thousand, with each word having an equal probability of being chosen. So the entropy (measure of randomness) is log2(50,000) ≈ 15.6 bits per word, or about 62.4 bits per four-word combination. (The script also adds random numbers or symbols, to bring the total up to 90 bits.)
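
For anyone who wants to check the arithmetic, here is a minimal Python sketch (my own illustration, not from the book; the 50,000-word list size and 90-bit target are the figures from the comment above):

    import math

    def uniform_entropy_bits(n):
        # Entropy of one draw from n equally probable outcomes: H = log2(n) bits.
        return math.log2(n)

    print(uniform_entropy_bits(6))        # fair die: ~2.58 bits per throw
    per_word = uniform_entropy_bits(50_000)
    print(per_word)                       # ~15.6 bits per word
    print(4 * per_word)                   # ~62.4 bits per four-word combination
    print(90 - 4 * per_word)              # ~27.6 bits left for the added numbers/symbols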

If you want an actual introduction to Information Theory, I'd recommend "An Introduction to Information Theory: Symbols, Signals and Noise".

It deals with complicated information theory topics similar to Kolmogorov complexity (albeit under a different name) in an easy-to-approach way. Highly recommended.

[1]: http://www.amazon.com/An-Introduction-Information-Theory-Mat...

Information theory is a really cool, really practical form of math / computer science. I highly suggest reading up on it (I'm using this book to get through the basic ideas: http://www.amazon.com/Introduction-Information-Theory-Symbol...).

Basically it has to do with measuring entropy / uncertainty in a message, and I'm finding it has applications almost everywhere, from file compression to image editing to highway design to just the way I talk and communicate information. Fun stuff!

chaosinorder
Will add it to the list. Thanks!
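
To make the "measuring entropy / uncertainty in a message" idea above concrete, here is a short, self-contained Python sketch (my own example, not from the book) that estimates the entropy of a string from its character frequencies:

    import math
    from collections import Counter

    def message_entropy_bits(text):
        # Empirical Shannon entropy: H = -sum(p * log2(p)) over observed symbol frequencies.
        counts = Counter(text)
        total = len(text)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(message_entropy_bits("hello world"))  # ~2.85 bits per character
    print(message_entropy_bits("aaaab"))        # ~0.72 bits: mostly predictable

A highly predictable message carries little entropy, which is the gap that file compression exploits.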
HN Books is an independent project and is not operated by Y Combinator or Amazon.com.