Hacker News Comments on
An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics)
5 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this book.

Most of mine are going to be books by philosophers or scientists (about philosophy or other things).

- The Conquest of Happiness by Bertrand Russell was a pretty good one. He has a lot of ideas that were ahead of their time (positive psychology, etc.). You can see a lot of parallels between his ideas and modern Stoicism (although Russell criticized Stoicism elsewhere, I think he came to some of the same conclusions).
- Introduction To Mathematical Philosophy by Bertrand Russell. Another Russell one. I think this is probably the clearest and easiest-to-understand explanation I've ever read of the foundations of mathematics. It's written in a style that should be accessible to almost anyone with a high school education. He wrote it while he was in prison (for refusing to be drafted) during WW1. Apparently he left a copy of it with the prison warden.
- An Enquiry Concerning Human Understanding by David Hume. This is worth reading because it is the motivation for basically all of modern philosophy of science (at least in the west). It's also pretty easy to read and if you read it you'll be able to more easily understand other books and papers that are responses to it.
- Turing's Cathedral by George Dyson. This book should be required reading for every programmer or aspiring programmer IMO. I learned so much about the history of computing that I didn't know before reading this. You will not regret buying this one.
- I Am A Strange Loop by Douglas Hofstadter. Obviously everyone knows about GEB, but he also wrote a shorter follow-up that in my opinion expresses his ideas much more clearly. I think that even if you disagree with him, it's worth reading because there are so many things you can take away from this book. For example, he talks about his wife's death, ties that into his theory of mind, and explains the unstated purposes of funerals/wakes.
- An Introduction to Information Theory by John R. Pierce. For someone like me who doesn't really have a very strong math background, this was a very clear intro to the ideas behind information theory, and why they're important historically. I would recommend this to anyone who feels like they need a gentle intro to the ideas and motivation for them. Dover mathematics books in general are great.
- Borrow: The American Way of Debt by Louis Hyman. This is a fantastic historical overview of personal credit in the US that covers the past 120 years or so. I learned a ton from reading this that I had no clue about. Recommended to anyone who wants to understand the origins of credit cards / loans, and how society came to embrace being in debt.
https://archive.org/details/in.ernet.dli.2015.222834/page/n7
https://people.umass.edu/klement/imp/imp-ebk.pdf
https://archive.org/details/humeenquiry00humerich/page/n7
https://www.amazon.com/Turings-Cathedral-Origins-Digital-Uni...
https://www.amazon.com/Am-Strange-Loop-Douglas-Hofstadter/dp...
https://www.amazon.com/Introduction-Information-Theory-Symbo...
https://www.amazon.com/Borrow-American-Debt-Louis-Hyman/dp/0...
I was introduced to Shannon's theory through Pierce's text, which is also surprisingly good and cheap [1].

[1] https://www.amazon.com/Introduction-Information-Theory-Symbo...
⬐ anongraddebt
I was introduced to Shannon through the same text, and also found it good and cheap. I second the recommendation!
The information theory (mathematical) definition of randomness is explained nicely in Chapter V of An Introduction to Information Theory: http://www.amazon.com/Introduction-Information-Theory-Symbol...
In the particular case of a series of n equally probable, independent events, the entropy is given as H = -log₂(1/n) = log₂ n, measured in bits. For example, the entropy of a fair die is log₂ 6 ≈ 2.58 bits per throw.
In this case, the random event is words chosen from a word list. Four words are chosen from fifty thousand, with each word having equal probability of being chosen. So the entropy (measure of randomness) is log₂ 50,000 ≈ 15.6 bits per word, or about 62.4 bits per four-word combination. (The script also adds random numbers or symbols, to bring the total up to 90 bits.)
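The arithmetic in this comment can be checked with a minimal sketch (the 50,000-word list and four-word count are the comment's figures; the helper name is mine):

```python
import math

def entropy_bits(n: int) -> float:
    """Entropy in bits of one draw from n equally likely outcomes: -log2(1/n) = log2(n)."""
    return -math.log2(1 / n)

# A fair die: 6 equally likely outcomes per throw.
die = entropy_bits(6)            # about 2.58 bits per throw

# One word drawn uniformly from a 50,000-word list.
word = entropy_bits(50_000)      # about 15.6 bits per word

# Entropies of independent events add, so four independent word choices:
phrase = 4 * word                # about 62.4 bits per four-word combination

print(f"die: {die:.2f} bits, word: {word:.2f} bits, phrase: {phrase:.1f} bits")
```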
If you want an actual introduction to Information Theory, I'd recommend "An Introduction to Information Theory: Symbols, Signals and Noise". It deals with complicated information theory topics similar to Kolmogorov Complexity (albeit by a different name) in an easy-to-approach way. Highly recommended.
[1]: http://www.amazon.com/An-Introduction-Information-Theory-Mat...
Information theory is a really cool, really practical form of math / computer science. I highly suggest reading up on it (I'm using this book to get through the basic ideas: http://www.amazon.com/Introduction-Information-Theory-Symbol...). Basically it has to do with measuring entropy / uncertainty in a message, and I'm finding it has applications almost everywhere, from file compression to image editing to highway design to just the way I talk and communicate information. Fun stuff!
⬐ chaosinorder
Will add it to the list. Thanks!
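The "measuring entropy / uncertainty in a message" idea mentioned above can be sketched with the empirical Shannon entropy of a string (the function name and example strings are illustrative, not from the comments):

```python
import math
from collections import Counter

def message_entropy(msg: str) -> float:
    """Empirical Shannon entropy in bits per symbol:
    H = -sum(p * log2(p)) over the observed symbol frequencies."""
    counts = Counter(msg)
    total = len(msg)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A repetitive message carries less information per symbol than a varied one:
low = message_entropy("aaaaaaaa")   # one distinct symbol -> 0 bits per symbol
high = message_entropy("abcdefgh")  # 8 equally frequent symbols -> 3 bits per symbol
print(low, high)
```

This is the same quantity that bounds lossless compression: a file whose symbols have low empirical entropy compresses well, which is why the comment connects entropy to file compression.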