Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this course.
The Georgia Tech online masters?
Depending on what stats you want to do, there are some pretty decent MOOCs. No one is going to claim that Daphne Koller's PGM course is weak in any way, for example.
Nowadays, there are a couple of really excellent online lectures to get you started.
The list is too long to include them all. Every one of the major MOOC sites offers not only one but several good Machine Learning classes, so please check [coursera](https://www.coursera.org/), [edX](https://www.edx.org/), [Udacity](https://www.udacity.com/) yourself to see which ones are interesting to you.
However, there are a few that stand out, either because they're very popular or are done by people who are famous for their work in ML. Roughly in order from easiest to hardest, those are:
* Andrew Ng's [ML-Class at coursera](https://www.coursera.org/course/ml): Focused on application of techniques. Easy to understand, but mathematically very shallow. Good for beginners!
* Hastie/Tibshirani's [Elements of Statistical Learning](http://statweb.stanford.edu/~tibs/ElemStatLearn/): Also aimed at beginners and focused more on applications.
* Yaser Abu-Mostafa's [Learning From Data](https://www.edx.org/course/caltechx/caltechx-cs1156x-learnin...): Focuses a lot more on theory, but also doable for beginners
* Geoff Hinton's [Neural Nets for Machine Learning](https://www.coursera.org/course/neuralnets): As the title says, this is almost exclusively about Neural Networks.
* Hugo Larochelle's [Neural Net lectures](http://www.youtube.com/playlist?list=PL6Xpj9I5qXYEcOhn7TqghA...): Again mostly on Neural Nets, with a focus on Deep Learning
* Daphne Koller's [Probabilistic Graphical Models](https://www.coursera.org/course/pgm): A very challenging class, but it covers a lot of good material that few of the other courses do.
Yes, I did my research, but there is no interactive tutorial online like Treehouse or Codecademy for this. There are plenty of tutorials, but none of them shows you the whole path.
Here are the resources I found useful:
========================================== Advice from OpenAI and Facebook AI leaders
Yaser Abu-Mostafa’s Machine Learning course, which focuses much more on theory than the Coursera class but is still relevant for beginners. (https://work.caltech.edu/telecourse.html)
Neural Networks and Deep Learning (Recommended by Google Brain Team) (http://neuralnetworksanddeeplearning.com/)
Probabilistic Graphical Models (https://www.coursera.org/learn/probabilistic-graphical-model...)
Computational Neuroscience (https://www.coursera.org/learn/computational-neuroscience)
Statistical Machine Learning (http://www.stat.cmu.edu/~larry/=sml/)
From OpenAI CTO Greg Brockman on Quora
Deep Learning Book (http://www.deeplearningbook.org/) ( Also Recommended by Google Brain Team )
It contains essentially all the concepts and intuition needed for deep learning engineering (except reinforcement learning). (Greg Brockman)
2. If you’d like to take courses: Linear Algebra — Stephen Boyd’s EE263 (Stanford) (http://ee263.stanford.edu/) or Linear Algebra (MIT)(http://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebr...)
Neural Networks for Machine Learning — Geoff Hinton (Coursera) https://www.coursera.org/learn/neural-networks
Neural Nets — Andrej Karpathy’s CS231N (Stanford) http://cs231n.stanford.edu/
Advanced Robotics (the MDP / optimal control lectures) — Pieter Abbeel’s CS287 (Berkeley) https://people.eecs.berkeley.edu/~pabbeel/cs287-fa11/
Deep RL — John Schulman’s CS294–112 (Berkeley) http://rll.berkeley.edu/deeprlcourse/
From Director of AI Research at Facebook and Professor at NYU Yann LeCun on Quora
In any case, take Calc I, Calc II, Calc III, Linear Algebra, Probability and Statistics, and as many physics courses as you can. But make sure you learn to program.
⬐ atarian What does physics have to do with ML/AI?
⬐ kevinphy "The Extraordinary Link Between Deep Neural Networks and the Nature of the Universe" https://www.technologyreview.com/s/602344/the-extraordinary-...
⬐ JJarrard Thank you!
You're welcome! FWIW I mostly agree with argonaut's point elsewhere in this thread - very few people successfully self-teach ML from a textbook alone. So whichever book(s) you choose, it might also be worth working through some course materials. I've already suggested Stanford's CS229 for solid foundations, but depending on your interests in bioinformatics, Daphne Koller's Coursera course on probabilistic graphical models (https://www.coursera.org/learn/probabilistic-graphical-model...) might be especially relevant. Koller literally wrote the book on PGMs, has done a lot of work in comp bio, and her MOOC is apparently the real deal: very intense but well-reviewed by the people that make it through.
Gain background knowledge first, it will make your life much easier. It will also make the difference between just running black box libraries and understanding what's happening. Make sure you're comfortable with linear algebra (matrix manipulation) and probability theory. You don't need advanced probability theory, but you should be comfortable with the notions of discrete and continuous random variables and probability distributions.
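As a concrete self-check of that background (a minimal NumPy sketch; the numbers are made up purely for illustration), you should be comfortable reading code like this:

```python
import numpy as np

# Matrix manipulation: a linear model's predictions are just X @ w
X = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # two samples, two features
w = np.array([0.5, -1.0])    # weight vector
print(X @ w)                 # -> [-1.5 -2.5]

# Discrete random variable: empirical mean of a biased coin
rng = np.random.default_rng(0)
flips = rng.binomial(n=1, p=0.7, size=10_000)
print(flips.mean())          # close to 0.7

# Continuous random variable: samples from a standard Gaussian
samples = rng.normal(loc=0.0, scale=1.0, size=10_000)
print(samples.std())         # close to 1.0
```

If each of those three steps reads naturally, you have roughly the linear algebra and probability footing the comment is describing.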
Khan Academy looks like a good beginning for linear algebra: https://www.khanacademy.org/math/linear-algebra
MIT 6.041SC seems like a good beginning for probability theory: https://www.youtube.com/playlist?list=PLUl4u3cNGP60A3XMwZ5se...
Then, for machine learning itself, pretty much everyone agrees that Andrew Ng's class on Coursera is a good introduction: https://www.coursera.org/learn/machine-learning
If you like books, "Pattern Recognition and Machine Learning" by Chris Bishop is an excellent reference of "traditional" machine learning (i.e., without deep learning).
"Machine Learning: a Probabilistic Perspective" book by Kevin Murphy is also an excellent (and heavy) book: https://www.cs.ubc.ca/~murphyk/MLbook/
This online book is a very good resource to gain intuitive and practical knowledge about neural networks and deep learning: http://neuralnetworksanddeeplearning.com/
Finally, I think it's very beneficial to spend time on probabilistic graphical models. Here is a good resource: https://www.coursera.org/learn/probabilistic-graphical-model...
Courses You MUST Take:
2. Yaser Abu-Mostafa’s Machine Learning course, which focuses much more on theory than the Coursera class but is still relevant for beginners. (https://work.caltech.edu/telecourse.html)
3. Neural Networks and Deep Learning (Recommended by Google Brain Team) (http://neuralnetworksanddeeplearning.com/)
4. Probabilistic Graphical Models (https://www.coursera.org/learn/probabilistic-graphical-model...)
5. Computational Neuroscience (https://www.coursera.org/learn/computational-neuroscience)
6. Statistical Machine Learning (http://www.stat.cmu.edu/~larry/=sml/)
If you want to learn AI: https://medium.com/open-intelligence/recommended-resources-f...
⬐ pedrosorio If you want to get started with machine learning you MUST take computational neuroscience? I don't think so.
Sure, a couple things.
(I'm assuming you're comfortable with multivariable calculus.)
Andrew Ng's coursera course is good.
PRML (pattern recognition and machine learning) by bishop is good, and has a useful introduction to probability theory.
You also want a good grounding in linear algebra. Strang is basically the authority on linear algebra: http://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-...
You want a strong grounding in probability theory and statistics. (This is the basic language and intuition of the entire field.) I don't have as many preferences here (although it's the most important); someone in this thread pointed to a course on statistical learning @ Stanford that's good.
A good understanding of optimization is helpful. Here's a link that leads to a useful MOOC for that: http://stanford.edu/~boyd/cvxbook/
There's a lot of other useful stuff (Markov decision processes, Gaussian processes, and Monte Carlo methods come to mind) that I'm not pointing to, but if you've covered the material above, you'll probably be able to find those things on your own.
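Monte Carlo methods, for instance, boil down to estimating a quantity by averaging over random samples. A toy sketch (estimating pi from uniform random points; purely illustrative, not tied to any particular course):

```python
import random

def mc_pi(n_samples: int, seed: int = 0) -> float:
    """Estimate pi by sampling uniform points in the unit square
    and counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # Area of quarter circle / area of square = pi/4
    return 4.0 * inside / n_samples

print(mc_pi(100_000))  # close to 3.14
```

The same sample-and-average idea scales up to expectations under complicated distributions, which is where it earns its place on the list above.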
If you're into it, https://www.coursera.org/course/pgm is good but not vital.
You may want to know about reinforcement learning. This answer does better than I can: https://www.quora.com/What-are-the-best-books-about-reinforc...
Deep learning seems popular these days :) (http://www.deeplearningbook.org/)
Otherwise, it depends on the domain.
For NLP, there's a great stanford course on deep learning + NLP (http://cs224d.stanford.edu/syllabus.html), but there's a ton of domain knowledge for most NLP work (and a lot of it really centers around data preparation).
For speech, theoretical computer science matters (weighted finite state transducers, formal languages, etc.)
For vision, again, stanford: (http://cs231n.stanford.edu/syllabus.html)
For other applications, well, ask someone else? :)
arxiv.org/list/cs.CL/recent
arxiv.org/list/cs.NE/recent
arxiv.org/list/cs.LG/recent
arxiv.org/list/cs.AI/recent
EDIT: unfortunately, there's also a lot of practitioner's dark art; I picked a lot up as a research assistant, and then my first year in industry felt like being strapped to a rocket.
⬐ hiddencost Oh no! I forgot about information theory! I don't have a specific recommendation, but it's very useful background.
There's a coursera course on Probabilistic Graphical Models: https://www.coursera.org/course/pgm
I would guess that's more approachable than a text book, but who knows.
Andrew Ng's ML Class - https://www.coursera.org/learn/machine-learning
Daphne Koller's PGM Class - https://www.coursera.org/course/pgm
Dan Jurafsky's and Christopher Manning's NLP Class - https://www.coursera.org/course/nlp
Thanks for this. Incidentally, the second paper you link to is co-authored, among others, by Daphne Koller, who teaches [this great course on probabilistic graphical models](https://www.coursera.org/course/pgm), and Andrew Ng, who teaches [the best-known intro MOOC in machine learning](https://www.coursera.org/learn/machine-learning/home/welcome).
The title sounds familiar; it's also a course on Coursera:
Last session was in 2013 though.
Some good books on Machine Learning:
Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Flach): http://www.amazon.com/Machine-Learning-Science-Algorithms-Se...
Machine Learning: A Probabilistic Perspective (Murphy): http://www.amazon.com/Machine-Learning-Probabilistic-Perspec...
Pattern Recognition and Machine Learning (Bishop): http://www.amazon.com/Pattern-Recognition-Learning-Informati...
There are some great resources/books for Bayesian statistics and graphical models. I've listed them in (approximate) order of increasing difficulty/mathematical complexity:
Think Bayes (Downey): http://www.amazon.com/Think-Bayes-Allen-B-Downey/dp/14493707...
Bayesian Methods for Hackers (Davidson-Pilon et al): https://github.com/CamDavidsonPilon/Probabilistic-Programmin...
Doing Bayesian Data Analysis (Kruschke), aka "the puppy book": http://www.amazon.com/Doing-Bayesian-Data-Analysis-Second/dp...
Bayesian Data Analysis (Gelman): http://www.amazon.com/Bayesian-Analysis-Chapman-Statistical-...
Bayesian Reasoning and Machine Learning (Barber): http://www.amazon.com/Bayesian-Reasoning-Machine-Learning-Ba...
Probabilistic Graphical Models (Koller et al): https://www.coursera.org/course/pgm http://www.amazon.com/Probabilistic-Graphical-Models-Princip...
If you want a more mathematical/statistical take on Machine Learning, then the two books by Hastie/Tibshirani et al are definitely worth a read (plus, they're free to download from the authors' websites!):
Introduction to Statistical Learning: http://www-bcf.usc.edu/~gareth/ISL/
The Elements of Statistical Learning: http://statweb.stanford.edu/~tibs/ElemStatLearn/
Obviously there is the whole field of "deep learning" as well! A good place to start is with: http://deeplearning.net/
⬐ alexcasalboniThose are great resources!
In case you are interested in MLaaS (Machine Learning as a Service), you can check these as well:
Amazon Machine Learning: http://aws.amazon.com/machine-learning/ (my review here: http://cloudacademy.com/blog/aws-machine-learning/)
Azure Machine Learning: http://azure.microsoft.com/en-us/services/machine-learning/ (my review here: http://cloudacademy.com/blog/azure-machine-learning/)
Google Prediction API: https://cloud.google.com/prediction/
OpenML: http://openml.org/
⬐ yedhukrishnan I went through the links and your review. They are really good. Thanks!
⬐ yedhukrishnan Those are really useful. Thank you. Books are pricey though!
⬐ shogunmike I know... some of them are indeed expensive!
At least the latter two ("ISL" and "ESL") are free to download though.
I don't know how he learned, but I studied it through the very demanding, and worth every second, course from Coursera:
⬐ rndn I read somewhere that this is one of the hardest Coursera courses.
⬐ waterlesscloud It's definitely the hardest one I've taken there. Most of the difficulty comes from the density of the lectures. She moves fast and takes it for granted that you're piecing everything together as you go. You're probably not, but at least you can go back and watch it again if necessary!
Hinton's Neural Network class was very challenging for me too, mostly because many of the concepts were unfamiliar to me. But again, I could re-watch whatever I needed to in order to get it.
⬐ aylons Indeed, it is demanding, but fascinating and very well taught.
Too bad they haven't offered it since 2013. I didn't finish it back then for personal reasons. :(
⬐ mtraven
⬐ sonabinu 15-20 hours a week is pretty demanding!
I heartily recommend the notebooks published in this course as excellent applied reference material to estimation and optimization.
I love how code and coursework are intermingled, reminding me of Knuth's Literate Programming.
My beef with many other courses offered (including on Coursera) is that they use Matlab when it's clearly advantageous to use IPython Notebook as a better experimenting environment. For example, Daphne Koller's PGM course is still in Matlab, and no matter what you do, the code looks extremely clumsy and hard to read. N.B. I've written tens of thousands of lines of Matlab code, including GUI programs, but that doesn't mean it's a good language to use, especially in cases like this.
Something like PGM (this is not a lightweight class) helps to understand the concepts. But it still seems like more of a niche domain right now than a general programming technique.
When one can apply it though, it really shines.
I understand the current implementation of matchmaking for Xbox Live is a big mess of imperative code - this is one area where knowledge of math can actually simplify the programming.
"Online gaming systems such as Microsoft’s Xbox Live rate relative skills of players playing online games so as to match players with comparable skills for game playing. The problem is to come up with an estimate of the skill of each player based on the outcome of the games each player has played so far. A Bayesian model for this has been proposed..."
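The quoted problem is what Microsoft's TrueSkill model addresses. As a rough sketch of the idea (my own simplified single-game, one-vs-one update in the spirit of that model, ignoring draws and teams; the prior mu=25, sigma=25/3 and beta=25/6 follow the conventional defaults, and `rate_1v1` is a name I made up):

```python
import math

def _pdf(x: float) -> float:
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def _cdf(x: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def rate_1v1(winner, loser, beta=25.0 / 6):
    """One Bayesian update of (mu, sigma) skill estimates after a
    decisive 1-vs-1 game: shift the means toward the observed
    outcome and shrink the uncertainties."""
    (mu_w, sig_w), (mu_l, sig_l) = winner, loser
    c = math.sqrt(2 * beta**2 + sig_w**2 + sig_l**2)
    t = (mu_w - mu_l) / c
    v = _pdf(t) / _cdf(t)   # mean correction factor
    w = v * (v + t)         # variance correction factor, 0 < w < 1
    new_winner = (mu_w + sig_w**2 / c * v,
                  sig_w * math.sqrt(max(1 - sig_w**2 / c**2 * w, 1e-9)))
    new_loser = (mu_l - sig_l**2 / c * v,
                 sig_l * math.sqrt(max(1 - sig_l**2 / c**2 * w, 1e-9)))
    return new_winner, new_loser

# Two brand-new players with the default prior; A beats B.
a, b = rate_1v1((25.0, 25.0 / 3), (25.0, 25.0 / 3))
print(a, b)  # A's mean rises, B's falls, both sigmas shrink
```

The point of the comment stands: once the model is written down, the entire update is a handful of closed-form lines rather than a pile of imperative special cases.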