Hacker News Comments on Deep Learning (Adaptive Computation and Machine Learning series)
37 HN points · 11 HN comments
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this book.

How would you compare this book to the first part of the "Deep Learning" book (by Ian Goodfellow, Yoshua Bengio, and Aaron Courville)? https://www.amazon.com/gp/product/0262035618
⬐ zer0sugar
I feel obligated to interject here, as I have not read the book OP linked to, but I have attempted to read the paperweight calling itself a book that you linked to. I have it right here, actually. It's basically total trash. They claim to show you how to do the math, but at the very best all they do is restate random formulas without any explanation. It's not even good enough to serve as a refresher if you already know the math. It relies very heavily on you mentally decompiling mathematical notation. I can't believe I got fooled into buying that book.
If you just want to learn the math, there's no easier way than to pick up some math books from Half Price Books. They're $10 a pop. It's an affordable way to learn at your own pace.
⬐ random314
As a counterpoint, I have the book and found it to be really helpful.
⬐ disgruntledphd2
I enjoyed the book, but have a background in this area. I was amused by the suggestion that computer science undergrads could handle the book, as clearly the authors and I have met very different computer science undergrads.
⬐ mlthoughts2018
The Courville, Goodfellow and Bengio book is definitely suitable for undergraduates. In my current job, we often have new junior-level (bachelor's grads) ML hires work through that book and present chapters in the team reading group. In my experience, both as a TA in my PhD program and in industry, that book is fairly easy to read through for anyone with a solid understanding of linear algebra and vector calculus, which are freshman/sophomore-level college math courses.
⬐ disgruntledphd2
Cool, I'm glad it's been working out for you. Don't get me wrong, I enjoyed that book, even the start, which wasn't focused so much on deep learning specifically. I just don't know many computer science undergrads who'd have the background to make that book useful, as the presentation leans towards the terse.
⬐ zer0sugar
Here's a photo of a random page: https://i.imgur.com/vv1CRLv.jpg
You can trust me when I say the entire book is about as unreadable as that, and often worse. I'm not afraid of math, either. But the book certainly is not teaching anyone anything.
⬐ random314
I am astounded by how you continue to insist that the book doesn't teach anyone anything, when I have already stated that I learned something from it! And of course the book has equations in it. What did you expect?
⬐ zer0sugar
⬐ r-zip
What did you learn from it?
None of my math books are as obtuse as it is. The equations are presented on their own, without explanations. On that page alone they're using quite a bit of mathematical notation that I, at least, have never seen before, and I suspect it's largely unnecessary.
What did I expect? I expected a book that explained the concepts in plain English as well as mathematically. I expected the authors to be mature enough not to heavily decorate every single equation with as much mathematical notation as possible, sort of like how bad coders make their code hard to read. That's the vibe I'm getting from the book.
⬐ random314
I learnt eigendecomposition, Hessians, PCA, backpropagation, CNNs, dropout, max-pooling, etc. The page you linked above is the derivation of PCA using linear algebra.
The first part derives the encoding matrix from the decoding matrix. The second part derives the encoding matrix by minimizing the L2 norm.
If you find the math too heavy, you should take Andrew Ng's course on Coursera (not his Stanford lectures, which follow a pattern similar to this book's). Or pick up any book targeting programmers: Machine Learning for Hackers, etc.
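The derivation described above condenses to a few lines of numpy. This sketch is mine, not the book's (the toy data, variable names, and choice of k are illustrative): with the decoder D taken as the top-k eigenvectors of the covariance, the optimal encoder is simply Dᵀ, and the resulting L2 reconstruction error is the sum of the discarded eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: 200 centered points in 3-D with very unequal variances.
X = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.1])
X = X - X.mean(axis=0)

k = 3 - 1                                # keep 2 of 3 directions
cov = X.T @ X / len(X)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
D = eigvecs[:, ::-1][:, :k]              # decoder: top-k principal directions

codes = X @ D                            # encoder is just D^T: c = D^T x
X_hat = codes @ D.T                      # reconstruction: x_hat = D c

# Mean squared L2 reconstruction error equals the sum of discarded eigenvalues.
err = np.sum((X - X_hat) ** 2) / len(X)
assert np.isclose(err, eigvals[:3 - k].sum(), rtol=1e-6)
```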
I disagree. If you aren't interested in the math, then you don't have to read it in that detail. But it's nice to have it there if you care about it.
There's a trend toward "democratizing" ML, which often just means learning a watered-down version of the concepts. This book gives a stronger foundation than many other books out there, but in my experience it was quite an easy read compared to books for related fields (signal/image processing, information theory, data compression).
Whether it's suitable for undergrads is another question.
Ian Goodfellow's Deep Learning book is pretty much useless. I own it and have read through most parts of it. I couldn't explain it better than the top Amazon reviews: https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...
And I'm surprised not to find Aurélien Géron's absolute masterpiece listed below. I believe it is the best machine learning book ever, although Statistical Learning, mentioned in the article, is really good as well:
https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-T...
⬐ whymauri
This is pretty harsh. I use Goodfellow as a reference text and then supplement the mathematics behind it with more comprehensive texts like Hastie or Wasserman. Maybe if you sit down and read it cover-to-cover, it will seem disjointed. But I usually read chapters independently - I recently read the convolutional neural network chapter in preparation for an interview and I thought it was fine.
⬐ spectramax
⬐ a_bonobo
I much prefer reading Karpathy's notes and watching Stanford CS230 than to delve into Goodfellow. It sits on my shelf collecting dust.
⬐ yasp
What are Wasserman and Hastie?
⬐ esfandia
⬐ platz
- Wasserman has a book called "All of Statistics" that gives a lot of the background required to understand modern machine learning.
- Hastie is a co-author of two machine learning books: "The Elements of Statistical Learning", which is very comprehensive, and "An Introduction to Statistical Learning", which is more approachable for people without too much background in stats.
So it's a reference, not a pedagogical tool, then? A reference implies you already know the topic and just want an index to jog your memory for things you can't hold in your head all at once.
That is different from a pedagogical tool. If so, you shouldn't recommend it to those who want to learn the topic.
⬐ whymauri
Phrased like this, I agree with you. I didn't think about it like that. I agree with the poster below. Outside of classes, lecture notes, the books I listed, and Sutton/Barto (Intro. to Reinforcement Learning) have taught me the material. I use Goodfellow to brush up before interviews or jog my memory about topics I don't work with very often (like computer vision).
FYI, there's a new edition of Géron's book coming out in August which will include Keras: https://www.goodreads.com/book/show/40363665-hands-on-machin...
⬐ hooloovoo_zoo
I think the Amazon review is rather dramatic, and probably not in a position to comment on style. I thought both Goodfellow and Géron were good. Goodfellow is deep learning for academics coming from a different field; Géron is deep learning for software engineers.
⬐ spectramax
⬐ Tarq0n
That's the thing: even as an academic book it falls short. It feels disjointed, unorganized and poorly written - and most frustratingly, incomplete. The reason for Goodfellow's popularity is that it was published in 2014, right at the turn of exponential interest in deep learning after AlexNet. It took off and became popular, but readers now feel it is stale for the aforementioned reasons.
⬐ hooloovoo_zoo
I certainly agree it could be better, but I also think "disjointed, unorganized and poorly written - and most frustratingly, incomplete" more or less sums up the field of deep learning :). I can second the recommendation for Géron's book; it's absolutely stellar.
⬐ hsikka
Do you think there is a need for a better-written DL textbook? I definitely agreed with the review you linked. I've always thought that Hands-On ML by Géron was great implementation-wise, but lacking in mathematical rigor and depth. While I would have a general sense of what is going on after reading it, and I'd certainly be able to structure and implement a model, I don't know if I would have any deep intuitions.
⬐ spectramax
⬐ ausbah
Géron's book is more of a tutorial/cookbook coalesced with important insights into the practice of machine learning. So I recommend reading Introduction to Statistical Learning (and Elements of Statistical Learning for theoretical background) before jumping into Géron's book. As engineers, I agree we need to have some theoretical background, but at the same time we are applying this knowledge to real-world problems. Géron's book is invaluable, and I hope he publishes more; it is a gem.
⬐ sharcerer
The 2nd edition is coming in August. Preorders opened a few days ago; he posted on Twitter. Some preview chapters are available on O'Reilly's site. I've heard poor things about Goodfellow's Deep Learning book as well; what is a good alternative?
⬐ spectramax
⬐ DanielleMolloy
Take CS230. I believe it used to be taught by Fei-Fei followed by Karpathy, and I don't know who teaches it now.
⬐ glial
For what it's worth, I read Goodfellow's book cover to cover and loved it. It answers "why" questions rather than "how" questions, but those are the questions I had, and you can find "how" questions answered for your framework of choice on the internet.
Géron's book is both entertaining and educational; I really enjoy it so far.
⬐ nilkn
For what it's worth, I disagree quite strongly with that review. The book is aimed at those with a pretty mature appetite for abstract mathematical reasoning, but not much specific knowledge in the areas of statistics, machine learning, and neural networks. It's an actual graduate-level book, and one must approach it with the appropriate background and education. The Goodfellow book is not complete as an academic intro, but no one book can be. It's not very useful as a practical tutorial, but no book seeking that could cover the mathematical arguments that Goodfellow's book does.
I found Goodfellow's book extremely useful for consolidating a lot of handwaving that I'd seen elsewhere and putting it in a slightly more rigorous framework that I could make sense of and immediately work with as a (former) mathematician.
Goodfellow's treatment is especially useful for mathematicians and mathematically-trained practitioners who nevertheless lack a background in advanced statistics. The Elements of Statistical Learning, for instance, is extremely heavy on statistics-specific jargon, and I personally found it far more difficult to extract useful insights from that book than I did from Goodfellow's.
⬐ freyr
The best textbooks, the ones considered classic gems in their field, are careful about what they say and what they leave out. You get the sense that every word is in its right place. By comparison, Goodfellow and his co-authors seemed to just dump everything they know onto the page. It's fragmented, bloated, and it meanders all over the place. Goodfellow was on a recent podcast where he seemed to acknowledge that the book straddles an awkward place between tutorial and reference.
I don't mean to sound too harsh. I appreciate its scope, and I've certainly read much worse textbooks.
⬐ nilkn
⬐ spectramax
Considering how much is not in the book -- and that the book is not even that large as far as textbooks go -- part of me feels this criticism is somewhat disingenuous. I'd agree that the direction of the final part on research topics feels fragmented, but the rest of the book certainly doesn't. It's very clearly focused on developing deep neural networks and doesn't actually meander at all. If the worst you can say is that it's not a classic text, that's really not saying much at all. I feel weird defending the book so much when to me it's just a book I found useful, and I don't even feel that strongly towards it. But the strength of some of the criticism here doesn't seem motivated by the book itself.
⬐ freyr
I have to disagree. There are several cases where, in a chapter's introductory section for example, he veers off onto an unnecessarily detailed tangent. The problem with Goodfellow's book is that it is half-baked. I don't know why he has to introduce a linear algebra section for a third of the book (which ends abruptly) but then moves on to the ANNs. If Goodfellow intended this book for mathematicians, that whole section on LA could be omitted with literally no loss in the book's value. The whole book arguably feels rushed. So, no amount of praise and mathematician's justification makes sense. I agree that it is inclined toward mathematicians, but this book is overrated and it is terribly due for a rewrite, an update, and frankly, in my personal view, a change of writing style.
I am curious which specific parts of the book you found valuable.
⬐ jhanschoo
Perhaps restructuring the LA part into an appendix would be preferable. For mature readers, it nevertheless serves as a way to focus and agree on notation that is used for the rest of the book.
⬐ nilkn
The section on linear algebra moves quickly. I don't see how a book including extra prerequisite material is an example of it being half-baked or rushed. Surely that would actually be an example of the opposite? That section is a nice reference to have immediately available for things like SVD and PCA.
The list does not describe why they are the best books, except for a very short blurb. We read the Deep Learning book by Goodfellow, Bengio, and Courville in our reading group when it came out. Even though it contains useful information, it is written in a very haphazard fashion. It is also very unclear what its target audience is. Some sections start as a foundational description, then suddenly change into something that is only for readers with a strong maths background. No one in the reading group was enthusiastic about the book, and most actively recommend against it (some called it 'the deep learning book for people who already know deep learning'). The highest-rated Amazon reviews seem to have come to the same conclusion: https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...
Put differently, a list such as the linked one may attract a lot of visitors, but without critical, in-depth reviews it is not very useful and might set potential learners on the wrong path.
⬐ dna_polymerase
It's a great book if you need references on the basic deep learning stuff for publications or your thesis. However, for getting started it is horrible.
⬐ ericpts
What books did your group find well written? It would be very helpful for outsiders to know what people who know their stuff consider good learning material.
⬐ akg_67
I find it ironic that none of the DL promoters ever apply DL in their own promotions. This book list would have been much better if the promoters had taken the time to apply DL to reviews of the promoted books and share the results.
The whole DS/AI/DL/ML area is infested with such lack of application of their own stuff.
⬐ freyir
> it is written in a very haphazard fashion
I felt the same way. Knowledgeable authors, loads of information, but quite poorly written.
That said, I don't know of another book that's as up to date or comprehensive, so I guess we're stuck with it till something better comes along.
I ordered a copy of Deep Learning [1] from Amazon last week. On Barnes & Noble [2] the book costs $76.80; on Amazon it is just $28.00. I received the book a couple of days ago. The pages look like they were printed using a low-resolution printer, and the ink color is uneven across pages. I am returning the book - a possible counterfeit, sold by a third-party seller. On the other hand, this book is also available free online [3]. Maybe it is legal to print it and sell it?
[1] https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...
[2] https://www.barnesandnoble.com/w/deep-learning-ian-goodfello...
⬐ WaltPurvis
> On the other hand this book is also available free online [3]. Maybe it is legal to print it and sell it?
No, it's not legal.
I found that Amazon sells what is presumably a genuine edition of the book for $67.
I did not realize that Amazon had started displaying other sellers as the default seller for brand-new books that Amazon itself sells. It used to be that Amazon was always the default seller if they carried a book, and the only way you would see other sellers was to click on the link for other sellers. Which is the way it should be. It's insane that they've changed this.
⬐ ZoomStop
If your price is lower than Amazon's by a magic percentage, you have a good sales record, and probably other unknown factors, Amazon will now grant the buy box to other sellers on Amazon-carried items.
⬐ dspillett
> Maybe it is legal to print it and sell it?
The first entry in the FAQ doesn't directly address that issue but strongly suggests an answer, morally if not legally:
Q: Can I get a PDF of this book? A: No, our contract with MIT Press forbids distribution of too easily copied electronic formats of the book.
I don't see any obvious link to a licence that might confirm or contradict this impression, so if you need/want to know for sure you'll have to contact them to ask for clarification if you can't find it elsewhere.
⬐ ghaff
It's likely not really about licensing. They have a contract with the publisher that presumably allows them to do certain things and doesn't allow them to do others. (I know that's been the case when I've signed a book contract.) Presumably, in this case, they're allowed to publish the full contents in web form on their site but not to publish them in a form that would allow someone to easily print the entire contents in book form.
⬐ joshvm
Often these are low-cost prints for students in developing countries, for example India. They're not counterfeit as such, but the quality is low to keep the price down. Not sure if that's what you got, though, as they're all super cheap on Amazon.com. Normally if it's a marketplace seller it'll say where it ships from. Deep Learning RRPs at about £50 in the UK.
I'm not sure what the print status of the book is. There seems to be an official MIT Press print, but my university library struggled to get a copy through the inter-library loan network because the eBook has the same ISBN.
⬐ jbay808
I ordered a textbook on control theory from Amazon. It arrived, but the material inside the covers was from a small-animal veterinary textbook, by a different publisher. I took a closer look at the cover; it was poorly printed and didn't quite fit.
I'm still confused about how the vendor expected to get away with it. Did they just guess that half of people won't bother opening the textbook? And why not fill it with blank pages, then?
⬐ vonmoltke
Might just be that a large-scale counterfeit printing operation mixed up the covers and bound materials. There are probably counterfeit veterinary textbooks floating around that contain control theory material.
⬐ lostapathy
Quite possible. It seems likely that the people assembling the counterfeits don't even read English, let alone have enough interest in the material to notice.
Would you enjoy something that gives a broad overview? Norvig's AI book https://www.amazon.com/Artificial-Intelligence-Modern-Approa... should give you a very broad perspective of the entire field. There will be many course websites with lecture material and lectures to go along with it that you may find useful.
The book website http://aima.cs.berkeley.edu/ has lots of resources.
But it sounds like you are specifically interested in deep learning. A Google researcher wrote a book on deep learning in Python aimed at a general audience - https://www.amazon.com/Deep-Learning-Python-Francois-Chollet... - which might be more directly relevant to your interests.
There's also what I guess you would call "the deep learning book". https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...
(People have different preferences for how they like to learn and as you can see I like learning from books.)
(I apologize if you already knew about these things.)
⬐ mlejva
Thank you for the tips. The Deep Learning Book (http://deeplearningbook.org) was one of my main studying materials. How would you compare the other DL book you mentioned (https://www.amazon.com/Deep-Learning-Python-Francois-Chollet...) against this one?
In machine learning, hands down these are some of the best related textbooks:
- [0] Pattern Recognition and Machine Learning (Information Science and Statistics)
and also:
- [1] The Elements of Statistical Learning
- [2] Reinforcement Learning: An Introduction by Barto and Sutton
- [3] Deep Learning by Aaron Courville, Ian Goodfellow, and Yoshua Bengio
- [4] Neural Network Methods for Natural Language Processing (Synthesis Lectures on Human Language Technologies) by Yoav Goldberg
Then some math tid-bits:
[5] Introduction to Linear Algebra by Strang
----------- links:
- [0] [pdf](http://users.isr.ist.utl.pt/~wurmd/Livros/school/Bishop%20-%...)
- [0] [amz](https://www.amazon.com/Pattern-Recognition-Learning-Informat...)
- [2] [amz](https://www.amazon.com/Reinforcement-Learning-Introduction-A...)
- [2] [pdf](http://incompleteideas.net/book/bookdraft2017nov5.pdf)
- [3] [amz](https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...)
- [3] [site](https://www.deeplearningbook.org/)
- [4] [amz](https://www.amazon.com/Language-Processing-Synthesis-Lecture...)
- [5] [amz](https://www.amazon.com/Introduction-Linear-Algebra-Gilbert-S...)
⬐ Thriptic
+1 for Elements. I started with Introduction to Statistical Learning and then graduated to Elements as I learned more and grew more confident. Those are fantastic books.
⬐ caliber
⬐ larrydag
Could you elaborate on how you switched to Elements? I am curious whether it makes sense to go through both books in sequence.
⬐ jrumbut
If reading Elements is difficult for you, then I would recommend Introduction. I'm not sure that reading Introduction will prepare you for Elements so much as it will give you some knowledge you can use, let you see whether it makes sense for you and what you want to do, and point you to the math tidbits you need to (re)learn for Elements.
⬐ turingcompetemeAs an engineer who hadn't studied that type of math in quite a while, Elements was pretty tough and I was getting stuck a lot.ISLR introduces you to many of the same topics in a less rigorous way. Once I was familiar with the topics and had worked through the exercises, Elements became much easier to learn from.
For regression I really like Frank Harrell's Regression Modeling Strategies: http://biostat.mc.vanderbilt.edu/wiki/Main/RmS
⬐ rwilson4
⬐ dajohnson89
I recently read Seber and Lee, Linear Regression Analysis, and highly recommend it: https://www.amazon.com/Linear-Regression-Analysis-George-Seb...
⬐ jrumbut
Frank Harrell writes a lot of great stuff, and his answers on the Cross Validated Stack Exchange site are worth reading even if you didn't think you wanted to ask the question they reply to.
His blog, http://www.fharrell.com, also contains interesting posts.
> [5] Introduction to Linear Algebra by Strang
People seem to love this textbook - and understandably so, because it's very approachable. But I really struggled with how informal and friendly the tone was. Perhaps I'd grown too accustomed to the typical theorem -> proof -> example -> problem set format.
⬐ cbHXBY1D
I have to disagree with the Deep Learning book. I don't find it a good book for anyone: for beginners it's too advanced/theoretical, and for experienced ML scientists it's entirely too basic. I very much agree with this review on Amazon [1].
For the former, I would recommend Hands-On Machine Learning with Scikit-Learn and TensorFlow.
[1] https://www.amazon.com/gp/customer-reviews/R1XNPL1BX5IVOM/re...
⬐ phonebucket
> For beginners it's too advanced/theoretical and for experienced ML scientists it's entirely too basic.
As a scientist coming to deep learning from another field, I found Courville et al. to be pitched at the perfect level.
I made the same transition earlier in my career. One book on deep learning that meets your requirements is [0]. It's readable, covers a broad set of modern topics, and has pragmatic tips for real use cases.
For general machine learning, there are many, many books. A good intro is [1], and a more comprehensive, reference sort of book is [2]. Frankly, by this point, even the documentation and user guide of scikit-learn have a fairly good mathematical presentation of many algorithms. Another good reference book is [3].
Finally, I would also recommend supplementing some of that stuff with Bayesian analysis, which can address many of the same problems, or be intermixed with machine learning algorithms, but which is important for a lot of other reasons too (MCMC sampling, hierarchical regression, small data problems). For that I would recommend [4] and [5].
Stay away from bootcamps or books or lectures that seem overly branded with “data science.” This usually means more focus on data pipeline tooling, data cleaning, shallow details about a specific software package, and side tasks like wrapping something in a webservice.
That stuff is extremely easy to learn on the job and usually needs to be tailored differently for every different project or employer, so it’s a relative waste of time unless it is the only way you can get a job.
[0]: < https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma... >
[1]: < https://www.amazon.com/Pattern-Classification-Pt-1-Richard-D... >
[2]: < https://www.amazon.com/Pattern-Recognition-Learning-Informat... >
[3]: < http://www.web.stanford.edu/~hastie/ElemStatLearn/ >
⬐ soVeryTired
+1 for Gelman, but I hate Bishop's book [2]. It was an early go-to reference in the field, but there are better books out there now.
⬐ hikarudo
⬐ Iwan-Zotow
What do you hate about Bishop's book? I'm genuinely curious.
⬐ soVeryTired
Honestly, I don't understand the way he explains things. The maths is difficult to follow, and it just never clicks for me. Maybe he's writing for someone with a physics background or something, but I feel stupid when I read Bishop. I just read over his description of how to transform a uniform random variable into a variable with a desired distribution (p. 526). It's a fairly easy trick, but if I didn't already know it, I wouldn't understand his explanation.
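For context, the trick being discussed is inverse-transform sampling: push a Uniform(0, 1) draw through the inverse CDF of the target distribution. A minimal Python sketch (mine, not Bishop's; the exponential distribution is chosen here purely as an example):

```python
import numpy as np

rng = np.random.default_rng(0)

# If U ~ Uniform(0, 1) and F is the target CDF, then F^{-1}(U) has CDF F.
# For Exponential(rate): F(x) = 1 - exp(-rate * x), so F^{-1}(u) = -ln(1 - u) / rate.
def sample_exponential(rate, size, rng):
    u = rng.uniform(size=size)
    return -np.log1p(-u) / rate

samples = sample_exponential(rate=2.0, size=100_000, rng=rng)
# Sanity check: the mean of Exponential(rate) is 1/rate.
assert abs(samples.mean() - 0.5) < 0.02
```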
⬐ bllguo
I'm trying to read through it, and I have to agree: his math isn't that clear to me. What do you recommend?
⬐ soVeryTired
David Barber!
The Goodfellow book [0] is available for free: http://www.deeplearningbook.org/
Our vision is to be the tool that starts with great settings for beginners but lets you graduate into the internals as you become more expert. At the lowest level you can interactively create computation graphs and see their results as you change settings, sort of like eager mode for ML frameworks on steroids (or the visual computation-graph programs that designers use, like Origami/Quartz Composer).
The lobes in the UI are all essentially functions that you double-click into to see the graph they use, all the way down to the theory/math.
If you want more comprehensive ways to learn the theory, I highly recommend Stanford's 231n course (http://cs231n.stanford.edu/) and the Goodfellow/Bengio/Courville Deep Learning book (https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...)
⬐ ghosthamlet
@mbeissinger thanks, great to see Lobe has low levels for play.
Here's the book that's mentioned: http://www.deeplearningbook.org/
Seems to have good reviews on Amazon:
https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...
⬐ colmvp
It's a pretty great general book that I go back to every now and then when I'm reviewing something I'm learning (e.g. GANs), but it isn't something I would recommend newbies just pick up and try to understand without at least spending some time learning math/stats, since it's a bit more technical than what non-technical readers might be prepared for. There's better introductory material out there for deep learning - e.g. fast.ai, Ng's Coursera course, Thrun's introductory Udacity course, tutorials on Medium - where they explain the math/stats behind something, but you can get away with learning the process first and playing around with code.
Furthermore, for some newbies, I think it's a little easier to understand the material when they can play with it in notebooks (as is the case with books like Hands-On Machine Learning with Scikit-Learn and TensorFlow) than when trying to just memorize statements on a page.
But certainly, for those here who are more technically oriented, it's an excellent book to pore over. I just like to caution friends interested in deep learning that it's not the be-all, end-all way of getting into it, and that there are more gradual learning curves elsewhere to get their feet wet before committing to going deeper.
⬐ gregatragenet3
Sadly, no Kindle/digital format is available. I did find that the online version, run through pdfcreator (a printer driver) and then through k2pdfopt, produced something surprisingly readable on smartphones.
⬐ travisglines
Note that the text of the book is available free here:
Deep Learning (Adaptive Computation and Machine Learning series) by Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Came out in November 2016. Split into 3 parts:
Part I: Applied Math and Machine Learning Basics (Linear Algebra, Probability and Information Theory, Numerical computation)
Part II: Deep Networks: Modern Practices (Deep Feedforward Networks, Regularization, CNNs, RNNs, Practical Methodology & Applications)
Part III: Deep Learning Research (Linear Factor Models, Autoencoders, Representation Learning, Structured Probabilistic Models, Monte Carlo Methods, Inference, Partition Function, Deep Generative Models)
https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...
⬐ m-i-l
Also available at http://www.deeplearningbook.org/ .
I'll give you a couple. Note that some of these are rehashes of my earlier comments.

# Elements of Programming
https://www.amazon.com/Elements-Programming-Alexander-Stepan...
This book proposes how to write C++-ish code in a mathematical way that makes all your code terse. In this talk, Sean Parent, at that time working on Adobe Photoshop, estimated that the PS codebase could be reduced from 3,000,000 LOC to 30,000 LOC (=100x!!) if they followed ideas from the book https://www.youtube.com/watch?v=4moyKUHApq4&t=39m30s
Another point of his is that the explosion of written code we are seeing isn't sustainable, and that so much of this code is algorithms or data structures with overlapping functionalities. As codebases grow and these functionalities diverge even further, pulling the reins in on the chaos becomes gradually impossible.
Bjarne Stroustrup (aka the C++ OG) gave this book five stars on Amazon (in what is his one and only Amazon product review lol).
This style might become dominant because it's only really possible in modern successors of C++ such as Swift or Rust, not so much in C++ itself.
https://smile.amazon.com/review/R1MG7U1LR7FK6/
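To give a flavor of the book's approach (this sketch and its names are mine, in Python rather than the book's C++): a single generic `power` routine, written against nothing but an associative operation, subsumes integer exponentiation, string repetition, and Fibonacci via matrix powers.

```python
# EOP-style generic "power": combine x with itself n times under ANY
# associative operation op, in O(log n) steps via repeated squaring.
def power(x, n, op):
    assert n >= 1
    result = None
    while n > 0:
        if n & 1:                                   # this bit contributes x^(2^i)
            result = x if result is None else op(result, x)
        x = op(x, x)                                # square for the next bit
        n >>= 1
    return result

# The same few lines cover very different "multiplications":
assert power(3, 5, lambda a, b: a * b) == 243       # integer exponentiation
assert power("ab", 3, lambda a, b: a + b) == "ababab"  # string repetition

def matmul2(a, b):
    """Multiply two 2x2 matrices given as nested tuples."""
    return ((a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]),
            (a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]))

# [[1,1],[1,0]]^n = [[F(n+1), F(n)], [F(n), F(n-1)]]
fib = power(((1, 1), (1, 0)), 10, matmul2)
assert fib[0][1] == 55  # F(10)
```

The point of the EOP style is exactly this: state the minimal algebraic requirement (associativity) and write the algorithm once against it.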
# Grammar of graphics
https://www.amazon.com/Grammar-Graphics-Statistics-Computing...
This book changed my perception of creativity, aesthetics and mathematics and their relationships. Fundamentally, the book provides all the diverse tools to give you confidence that your graphics are mathematically sound and visually pleasing. After reading this, Tufte just doesn't cut it anymore. It's such a weird book, because it talks about topics as disparate as Bayes' rule, OOP, color theory, SQL, chaotic models of time (lolwut), style-sheet language design and a bjillion other topics, but always, somehow, all of these are very relevant. It's like if Bret Victor were a book: a tour de force of polymathical insanity.
The book is in full color, and it has some of the nicest-looking and most instructive graphics I've ever seen, even for things that I understand, such as the Central Limit Theorem. It makes sense that the best graphics would be in the book written by the guy who wrote a book on how to do visualizations mathematically. The book is also interesting if you are doing any sort of UI work, because UI interfaces are definitely just a subset of graphical visualizations.
# Scala for Machine Learning
https://www.amazon.com/Scala-Machine-Learning-Patrick-Nicola...
This book almost never gets mentioned, but it's a superb intro to machine learning if you dig types, scalable back-ends or the JVM.
It’s the only ML book that I’ve seen that contains the word monad so if you sometimes get a hankering for some monading (esp. in the context of ML pipelines), look no further.
It discusses the setup of actual large-scale ML pipelines using modern concurrency primitives such as actors, via the Akka framework.
# Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques for Building Intelligent Systems
https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-T...
Not released yet, but I've been reading the drafts and it's a nice intro to machine learning using modern ML frameworks: TensorFlow and Scikit-Learn.
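To give a feel for the kind of workflow the book teaches, here's a minimal scikit-learn sketch. The dataset and model choice are my own illustration, not taken from the drafts:

```python
# Minimal scikit-learn workflow: load data, split, fit, evaluate.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# A plain logistic-regression baseline; the book works up from
# models like this to full TensorFlow networks.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

The same fit/score pattern carries over to nearly every estimator in scikit-learn, which is a big part of why the library works so well as a teaching vehicle.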
# Basic Category Theory for Computer Scientists
https://www.amazon.com/gp/product/0262660717/ref=as_li_ss_tl...
I'm not done with the book, but despite its age it's hands down the best intro to category theory if you care about it only for CS purposes, as it tries to show how to apply the concepts. Very concise (~70 pages).
# Markov Logic: An Interface Layer for Artificial Intelligence
https://www.amazon.com/Markov-Logic-Interface-Artificial-Int...
Have you ever wondered what the relationship between machine learning and logic is? If so, look no further.
# Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series)
https://www.amazon.com/gp/product/0262018020/ref=as_li_ss_tl...
Exhaustive overview of the entire field of machine learning. It's engaging and full of graphics.
# Deep Learning
https://www.amazon.com/gp/product/0262035618/ref=as_li_ss_tl...
http://www.deeplearningbook.org/
You have probably heard about this whole "deep learning" meme. This book is a pretty self-contained intro to the state of the art of deep learning.
# Designing for Scalability with Erlang/OTP: Implement Robust, Fault-Tolerant Systems
https://www.amazon.com/Designing-Scalability-Erlang-OTP-Faul...
Even though this is an Erlang book (and I don't really know Erlang), a third of it is devoted to designing scalable and robust distributed systems in a general setting, which alone made the book worth it for me.
# Practical Foundations for Programming Languages
https://www.amazon.com/gp/product/1107150302/ref=as_li_ss_tl...
Not much to say, probably THE book on programming language theory.
# A First Course in Network Theory
https://www.amazon.com/First-Course-Network-Theory/dp/019872...
Up until recently I didn't know the difference between graphs and networks. But look at me now, I still don't but at least I have a book on it.
⬐ bad_user Amazon links with your affiliate tag, seriously?
⬐ adamnemecek What about them?
⬐ kranner I see nothing wrong with GP providing their affiliate tag. They are referring customers to Amazon, and customers don't pay extra.
⬐ bad_user As an ex-Amazon Affiliate myself, I disagree, because the incentive to post those links is not aligned with the reader's expectations. Do you enjoy viewing commercials and product placements without the proper disclaimer? Because that is exactly what this is. I surely don't appreciate hidden advertising, not because of the quality of the advertised products, but because I cannot trust such recommendations, as a salesman can say anything in order to sell his shit.
Notice how this is the biggest list of recommendations in this thread. Do you think that's because the author is very knowledgeable or is it because he has an incentive to post links?
⬐ adamnemecek
> As an ex-Amazon Affiliate myself, I disagree because the incentive to post those links is not aligned with the reader's expectations.
Please don't project your behavior onto others. I take book recommendations seriously. I actually really enjoy it; people have told me IRL that my recommendations helped them a lot.
> Notice how this is the biggest list of recommendations in this thread.
They are all books that I've read in the last ~4 months or so (not all in their entirety). Just FYI, I'm not sure how much money you think I'm making off this, but for me it's mostly about the stats; I'm curious what people are interested in.
> Do you think that's because the author is very knowledgeable
I'm more than willing to discuss my knowledgeability.
> or is it because he has an incentive to post links?
It's the biggest list because, due to circumstances, I have the luxury of being able to read a ton. I own all the books on the list, I've read all of them, I stand by all of them, and some of these are really hidden gems that more people need to know about. I'd written some of the reviews before. Just FYI, I've posted extensive non-affiliate Amazon links before; I only started using affiliate links very recently.
Furthermore, HN repeatedly upvotes blog posts that contain affiliate links. Why is that any different?
Practical Foundations for Programming Languages by Bob Harper is really good, plus there's a free draft of the second edition on the author's site: http://www.cs.cmu.edu/~rwh/pfpl.html
I always go to the book author's page first, not only to get the errata but also to discover things such as free lectures, as in the case of Skiena's The Algorithm Design Manual.