HN Academy

Online courses recommended by Hacker News users. [about]

Learn Machine Learning

Coursera · Stanford University · 22 HN points · 42 HN comments

Machine learning is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, practical speech recognition, ...

HN Academy may receive a referral commission when you make purchases on sites after clicking through links on this page. Most courses are available for free with the option to purchase a completion certificate.

Hacker News Comments about Learn Machine Learning

All the comments and stories posted to Hacker News that reference this course.

Nowadays, there are a couple of really excellent online lectures to get you started.

The list is too long to include them all. Every one of the major MOOC sites offers not only one but several good Machine Learning classes, so please check [coursera]( https://www.coursera.org/ ), [edX]( https://www.edx.org/ ), [Udacity]( https://www.udacity.com/ ) yourself to see which ones are interesting to you.

However, there are a few that stand out, either because they're very popular or are done by people who are famous for their work in ML. Roughly in order from easiest to hardest, those are:

* Andrew Ng's [ML-Class at coursera]( https://www.coursera.org/course/ml ): Focused on application of techniques. Easy to understand, but mathematically very shallow. Good for beginners!

* Hastie/Tibshirani's [Elements of Statistical Learning]( http://statweb.stanford.edu/~tibs/ElemStatLearn/ ): Also aimed at beginners and focused more on applications.

* Yaser Abu-Mostafa's [Learning From Data]( https://www.edx.org/course/caltechx/caltechx-cs1156x-learnin... ): Focuses a lot more on theory, but also doable for beginners

* Geoff Hinton's [Neural Nets for Machine Learning]( https://www.coursera.org/course/neuralnets ): As the title says, this is almost exclusively about Neural Networks.

* Hugo Larochelle's [Neural Net lectures]( http://www.youtube.com/playlist?list=PL6Xpj9I5qXYEcOhn7TqghA... ): Again mostly on Neural Nets, with a focus on Deep Learning

* Daphne Koller's [Probabilistic Graphical Models]( https://www.coursera.org/course/pgm ) is a very challenging class, but has a lot of good material that few of the other courses cover.

I recently wrote an article collecting the best AI resources:

https://medium.com/@rayalez/best-deep-learning-resources-76b...

Specifically, I would recommend AIMA as the best introduction to AI in general, and a fantastic video course from Berkeley:

https://www.youtube.com/channel/UCshmLD2MsyqAKBx8ctivb5Q/vid...

and also Andrew Ng's course on coursera:

https://www.coursera.org/course/ml

For neural networks there's an awesome course by Hinton:

https://class.coursera.org/neuralnets-2012-001/lecture

and UFLDL tutorial:

http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutori...

I think you could play around as a hobby. You might try Theano as a place to start (for LSTM: http://deeplearning.net/tutorial/lstm.html ). If you become passionate about neural networks you might find yourself in grad school simply because that's a great place for diving in more deeply. It's really really helpful to know machine learning. Andrew Ng's Coursera is a great place to start: https://www.coursera.org/course/ml

Someone linked to a coursera course on machine learning in that thread, but the URL seems to have changed. It's now found at https://www.coursera.org/course/ml

"Andrew Ng's Standford course has been a god send in laying out the mathmatics of Machine Learning. Would be a good next step for anybody who was intrigued by this article."

jjoonathan

I don't remember Andrew Ng's coursera class giving a satisfying introductory mathematical treatment. I remember frequent handwaving away of the calculus intuition in favor of just dropping the "shovel-ready" equations into our laps so that we could do the homeworks. If you wanted a better treatment you had to dig it up for yourself (which wasn't too hard if you visited the forums but still).

Has it been supplemented since then?
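
For anyone wondering what calculus is being handwaved here, a minimal sketch (assuming the standard squared-error cost for linear regression; the variable names are illustrative) of the analytic gradient, checked against finite differences:

```python
import numpy as np

def cost(theta, X, y):
    """J(theta) = 1/(2m) * sum((X @ theta - y)^2), the usual squared-error cost."""
    m = len(y)
    r = X @ theta - y
    return (r @ r) / (2 * m)

def gradient(theta, X, y):
    """The calculus result: dJ/dtheta = (1/m) * X^T (X @ theta - y)."""
    return X.T @ (X @ theta - y) / len(y)

# Tiny synthetic problem: 5 examples, an intercept column plus 2 features.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(5), rng.normal(size=(5, 2))])
y = rng.normal(size=5)
theta = rng.normal(size=3)

# Finite-difference check that the "shovel-ready" equation matches the calculus.
eps = 1e-6
numeric = np.array([(cost(theta + eps * e, X, y) - cost(theta - eps * e, X, y)) / (2 * eps)
                    for e in np.eye(3)])
print(np.allclose(gradient(theta, X, y), numeric))  # True
```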

thanks! I am currently doing the Andrew Ng one ( https://www.coursera.org/course/ml?from_restricted_preview=1... ), saw a Geoff Hinton talk on youtube and didn't even know about this course.

I think these courses are awesome as well:

6.006 Introduction to Algorithms from MIT

http://ocw.mit.edu/courses/electrical-engineering-and-comput...

Machine Learning from Stanford: Learn about the most effective machine learning techniques, and gain practice implementing them and getting them to work for yourself. https://www.coursera.org/course/ml

krat0sprakhar

My main aim with this list was to have a collection of lesser known (but awesome) courses. That's one reason why I stayed away from adding MIT's OCW or a MOOC on the list.

Yadi

awesome! Yeah the list is super cool!

My guess is that you won't find any course that explains all the prerequisite math. It's probably more useful to build a solid foundation in probability theory (and therefore calculus) before going on.

For machine learning, a good place to start is Andrew Ng's course on Coursera:

https://www.coursera.org/course/ml

It's pretty light on math, while at the same time giving you experience in implementing and understanding these techniques.

From there, I might recommend Learning from Data and the associated video lectures:

https://work.caltech.edu/telecourse.html

It is a bit of a jump, but it is a great course in presenting the field of machine learning and explaining the mathematical and statistical underpinnings in a systematic way.

jamra

I just finished the Coursera course by Andrew Ng. It was great. The only hand waving done with math was when calculus was necessary. You can take some extra time to do that work yourself if you like, but you will not be missing the underpinnings of why things work statistically. The introduction to neural networks was what finally gave me that aha moment.

It is a very self contained course that is quite easy to follow. You can skip the programming exercises if you don't have the time.

crazypyro

For anyone interested in more about the specific math of neural networks, http://www.iro.umontreal.ca/~bengioy/dlbook has a couple good introductory chapters that give overviews of most of the necessary topics for NNs, but also provides additional resource suggestions if you need more in-depth info on a certain subject.

I think you should start working on your math (Khan Academy courses) and ML foundations (Andrew Ng's Coursera course). Then Geoffrey Hinton's Coursera course on neural networks could be a gentle introduction to neural networks, deep learning and their applications. Last but not least, do a small project on deep learning or try out a few Kaggle competitions to deepen your understanding.

Links:

https://www.coursera.org/course/ml

https://www.khanacademy.org/math/linear-algebra

https://www.coursera.org/course/neuralnets

www.iro.umontreal.ca/~bengioy/papers/ftml.pdf

http://www.iro.umontreal.ca/~bengioy/dlbook/

http://techtalks.tv/talks/deep-learning/58122/

I will try to list resources in a linear fashion, in a way that each one naturally builds on the previous (in terms of knowledge).

[PREREQUISITES]

First things first, I assume you went to high school, so you don't need a full pre-calculus course. This would assume you, at least intuitively, understand what a function is; you know what a polynomial is; what rational, imaginary, real and complex numbers are; you can solve any quadratic equation; you know the equation of a line (and of a circle) and you can find the point where two lines intersect; you know the perimeter, area and volume formulas for common geometrical shapes/bodies; and you know trigonometry in the context of a triangle. The Khan Academy website (or simple googling) is good for filling any gaps in this.

[BASICS]

You would obviously start with calculus. Jim Fowler's Calculus 1 is an excellent first start if you don't know anything about the topic. Calculus: Single Variable https://www.coursera.org/course/calcsing is the more advanced version which I would strongly suggest, as it requires very few prerequisites and goes into some deeper practical issues.

By far the best resource for Linear Algebra is the MIT course taught by Gilbert Strang http://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebr... If you prefer to learn through programming, https://www.coursera.org/course/matrix might be better for you, though this is a somewhat lightweight course.

[SECOND STEP]

After this point you might want to review single variable calculus through a more analytical approach on MIT OCW http://ocw.mit.edu/courses/mathematics/18-01sc-single-variab... as well as venture into multivariable calculus http://ocw.mit.edu/courses/mathematics/18-02sc-multivariable...

An excellent book for single variable calculus (though in reality it's a book on mathematical analysis) is Spivak's "Calculus" (depending on where you are, legally or illegally obtainable here http://libgen.org/ (as are the other books mentioned in this post)). A quick and dirty run through multivariable analysis is Spivak's "Calculus on Manifolds".

Another excellent book (that covers both single and multivariable analysis) is Walter Rudin's "Principles of Mathematical Analysis" (commonly referred to as "baby Rudin" by mathematicians), though be warned, this is an advanced book. The author won't cradle you with superfluous explanations and you may encounter many examples of "magical math" (you are presented with a difficult problem and the solution is a clever idea that somebody magically pulled out of their ass in a stroke of pure genius, making you feel like you would never have thought of it yourself and should probably give up math forever. Obviously don't; this is common in mathematics. Over time proofs get perfected until they reach a very elegant form, and are only presented that way, obscuring the decades/centuries of work that went into the making of that solution).

At this point you have all the necessary knowledge to start studying Differential Equations http://ocw.mit.edu/courses/mathematics/18-03sc-differential-...

Alternatively you can go into Probability and Statistics https://www.coursera.org/course/biostats https://www.coursera.org/course/biostats2

[FURTHER MATH]

If you have gone through the above, you already have all the knowledge you need to study the areas you mentioned in your post. However, if you are interested in further mathematics you can go through the following:

The actual first principles of mathematics are propositional and first-order logic. It would, however, (imo) not be natural to start your study of maths with them. A good resource is https://www.coursera.org/course/intrologic and possibly https://class.stanford.edu/courses/Philosophy/LPL/2014/about

For Abstract Algebra and Complex Analysis (two separate subjects) you could go through Saylor's courses http://www.saylor.org/majors/mathematics/ (sorry, I didn't study these in English).

You would also want to find some resource to study Galois theory, which would be a nice bridge between algebra and number theory. For number theory I recommend the book by G. H. Hardy.

At some point in life you'd also want to go through Partial Differential Equations, and perhaps Numerical Analysis. I guess check them out on Saylor http://www.saylor.org/majors/mathematics/

Topology by Munkres (it's a book)

Rudin's Functional Analysis (this is the "big/adult rudin")

Hatcher's Algebraic Topology

[LIFE AFTER MATH]

It is, I guess, natural for mathematicians to branch out into:

[Computer/Data Science]

There are, literally, hundreds of courses on edX, Coursera and Udacity so take your pick. These are some of my favorites:

Artificial Intelligence https://www.edx.org/course/artificial-intelligence-uc-berkel...

Machine Learning https://www.coursera.org/course/ml

The 2+2 Princeton and Stanford Algorithms classes on Coursera

Discrete Optimization https://www.coursera.org/course/optimization

Convex Optimization https://itunes.apple.com/itunes-u/convex-optimization-ee364a... https://itunes.apple.com/us/course/convex-optimization-ii/id...

[Physics]

http://theoreticalminimum.com/courses

http://ocw.mit.edu/courses/physics/

Dec 07, 2014 · bennetthi on Statistical Learning

I found the Stanford ML class on Coursera ( https://www.coursera.org/course/ml ) really amazing. Although it uses Octave, not R.

In red is your model whereas in green is the real one, M being the number of parameters.

The technical term for the last one is "overfitting" if I remember correctly. But in cases where you have an enormous amount of data, it is unlikely to happen.

It reminds me of this awesome course: https://www.coursera.org/course/ml

edit: The parent's parent's parent mentions overfitting for the MIT work; I don't think that would be the case if you have that amount of data in hand.
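
To make the red/green picture above concrete, here is a small, hypothetical sketch: fit noisy samples of a known function with polynomials of increasing M and compare training error against error on fresh points. The high-M fit typically has the lowest training error and the highest test error, i.e. overfitting.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 10)
y_true = np.sin(2 * np.pi * x)                       # the "green" curve (the real one)
y = y_true + rng.normal(scale=0.2, size=x.size)      # noisy training data

x_new = np.linspace(0, 1, 200)
y_new = np.sin(2 * np.pi * x_new)                    # fresh points from the real curve

for M in (1, 3, 9):                                  # M controls the number of parameters
    coeffs = np.polyfit(x, y, deg=M)                 # the "red" curve (your model)
    train_mse = np.mean((np.polyval(coeffs, x) - y) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_new) - y_new) ** 2)
    print(f"M={M}: train MSE={train_mse:.4f}, test MSE={test_mse:.4f}")
```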

Houshalter

It's entirely possible to overfit with enormous amounts of data, as people are now creating models with enormous numbers of parameters.

barisser

Here's another way to think of it.

If the parameter space for my model includes, let's say 10 binary decisions (which is very conservative), that's 1024 possible states of my model. If I tested all 1024 states against historical data, it is likely that some of them might do very well (depending on the general architecture of the model of course). What if I then selected the successful minority and held them up as clever strategies? Their success would very likely have been arbitrary. By basically brute-forcing enough strategies, I will inevitably come across some that were historically successful. But these same historically successful strategies are unlikely to outperform another random strategy in the future. It's not impossible you'll find a nugget of wisdom hidden from everyone else, just much less likely than the more simple explanation I'm offering.

So to your point, it's not just the size of the parameter space versus the data set that matters. Brute-forcing the former alone will likely produce a deceptive minority of winners.
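
A quick simulation of that point (hypothetical numbers, pure noise by construction): generate 1024 random "strategies", keep the minority that happened to do best on historical data, and check them on fresh data.

```python
import numpy as np

rng = np.random.default_rng(42)
n_strategies, n_days = 1024, 250       # e.g. 10 binary decisions -> 1024 model states

# Every strategy's daily return is pure noise: there is no real edge anywhere.
history = rng.normal(size=(n_strategies, n_days))
future = rng.normal(size=(n_strategies, n_days))

past_perf = history.mean(axis=1)
winners = past_perf >= np.quantile(past_perf, 0.99)   # the "successful minority"

print("winners, historical mean return:", history[winners].mean())  # looks impressive
print("winners, future mean return:    ", future[winners].mean())   # ~0, just chance
```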

Enzolangellotti

There is a fun chapter on this topic in Jordan Ellenberg's latest book "How not to be wrong". It's called the "Baltimore stockbroker fraud".

Sep 22, 2014 · 4 points, 0 comments · submitted by thrush

Do this Coursera course; it's targeted towards beginners and taught by Andrew Ng (co-founder of Coursera) - https://www.coursera.org/course/ml

brudgers

The videos are available here: https://class.coursera.org/ml-005/lecture

I think it is worth mentioning that "beginners" should be understood as beginners in machine learning and not beginners in computer science. Ng's course more or less assumes students have the equivalent of an upper division undergraduate or graduate background in computer science.

While it may be a valuable learning experience for a person who isn't comfortable knocking out algorithms, academic success will be rather difficult to achieve.

There is a nice tutorial here [1] about max-ent classifiers. Ultimately in neural networks there are usually a number of cost functions you can use for the last layer - softmax or cross-entropy are other possibilities that may be easier to understand, though possibly less performant for NLP tasks.

I thought the introduction to neural networks from Andrew Ng's Coursera course (even though it meant writing MATLAB) was quite good, and allows you to implement backprop, cost functions, etc. while still having some other helper code to make things easier. I highly recommend working through that course if you are interested in ML in general [2].

[1] http://www.cs.berkeley.edu/~klein/papers/maxent-tutorial-sli...

[2] https://www.coursera.org/course/ml
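
As a rough, illustrative sketch (not code from either link) of a softmax output layer with a cross-entropy cost, as mentioned above:

```python
import numpy as np

def softmax(z):
    """Stable softmax over the last axis: turns raw scores into probabilities."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, labels):
    """Average negative log-likelihood of the correct class."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

# Raw scores for 3 examples over 4 classes, plus their true class indices.
scores = np.array([[2.0, 1.0, 0.1, -1.0],
                   [0.5, 2.5, 0.3,  0.0],
                   [1.0, 1.0, 1.0,  1.0]])
labels = np.array([0, 1, 3])

probs = softmax(scores)
print(cross_entropy(probs, labels))
# A handy fact for backprop: the gradient of this loss w.r.t. the scores
# is simply (probs - one_hot(labels)) / len(labels).
```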

Textbooks? Really?

How about start with a great lecturer like -

Nando de Freitas - https://www.youtube.com/channel/UC0z_jCi0XWqI8awUuQRFnyw

David Mackay - http://videolectures.net/course_information_theory_pattern_r...

or the (sometimes too dense) Andrew Ng - https://www.coursera.org/course/ml

burkaman

Not sure how you saw the books and missed the explanation, but

"But why textbooks? Because they're one of the few learning mediums where you'll really own the knowledge. You can take a course, a MOOC, join a reading group, whatever you want. But with textbooks, it's an intimate bond. You'll spill brain-juice on every page; you'll inadvertently memorize the chapter titles, the examples, and the exercises; you'll scribble in the margins and dog-ear commonly referenced areas and look for applications of the topics you learn -- the textbook itself becomes a part of your knowledge (the above image shows my nearest textbook). Successful learners don't just read textbooks. Learn to use textbooks in this way, and you can master many subjects -- certainly machine learning."

orasis

I saw the explanation and I don't buy it. In pretty much every occasion that I've found a textbook valuable, I've found lectures by the authors far more valuable.

David Mackay's lectures are incredible and go much further in explaining the material in an understandable fashion than his excellent book "Information Theory, Inference and Learning Algorithms".

incision
>'Textbooks?'

The article addresses this almost immediately.

It's fine to disagree, but crapping a 'Really?' plus some contextless links onto someone who put forth general reasoning for the nature of the recommendations and spent 1300+ words describing expectations, key takeaways and projects for those recommendations is just lame.

orasis

On a given topic, I believe the best textbook is an inferior medium to the best lecture. Rather than blathering for 1300 words, I provided links to some excellent machine learning lectures.

beejiu

You sound like you've had some bad experiences with books. If you learn better with lectures, great. However, it's not for everybody -- I personally think spending time away from the computer (until I'm programming something), with a book, paper and a pen is very good time spent.

Chronic29

But time spent with a computer is time efficiently spent. You can't say that about books.

Houshalter

I read textbooks on my computer actually. Not by choice though, textbooks are difficult to find and too expensive.

I used to hate lectures when I was in school but now I sort of prefer them. It's easier for some reason. It's more passive; you just sit there and listen rather than actively read. It doesn't seem to be slower like others complain, and may even be faster. I read difficult texts very slowly and methodically, and often have to reread stuff.

HamSession

It's because textbooks usually go more in depth than lectures can due to time considerations; this is why any graduate program is 90% papers and textbooks.

A great example of this is Andrew Ng's course: even though he is the co-inventor of LDA (a complicated Bayesian network model), he does not explain Bayesian analysis in his course.

Coursera has a nice course on Machine Learning[0] and the 4th and 5th week deal with Neural Networks specifically if anyone wants to learn more and get his hands dirty with octave/matlab code.

[0] https://www.coursera.org/course/ml

dehrmann

I'm in that course! What's interesting is comparing it to the Stanford AI course from a few years ago. The professors have very different approaches.

mikejholly

Nice! I've seen this and will add it to my list :)

valdiorn

Yeah I'm doing that right now. I find it relatively easy, it's a good introduction course, gives a good overview, but you probably want to take a follow-up course that builds on top of that one to really get into ML. (I'm aiming for the Neural Networks course on Coursera, and then I want to look into decision trees and Bayesian networks)

Mar 04, 2014 · 2 points, 0 comments · submitted by cryptolect

There's a great Machine Learning course up on Coursera from Stanford: https://www.coursera.org/course/ml

Some I can recommend that are still available on Coursera:

- Introduction to mathematical thinking [1]

- Introduction to Mathematical Philosophy [2]

- Machine Learning (actually a CS course, but involves linear algebra and some calculus) [3]

- Calculus: Single Variable [4]

[1] https://www.coursera.org/course/maththink

[2] https://www.coursera.org/course/mathphil

[3] https://www.coursera.org/course/ml

[4] https://www.coursera.org/course/calcsing

mitochondrion

Many thanks!

You can learn a lot about machine learning from this course https://www.coursera.org/course/ml

OFFER TO VOLUNTEER - Machine Learning, Artificial Intelligence, Scientific and Open Source projects

I'm an engineer with experience working for startups doing web development. Currently I'm one of the Community TAs for the Startup Engineering class and for the Machine Learning class at Coursera (Stanford).

I'm looking for opportunities to volunteer, preferably Machine Learning, Artificial Intelligence or Scientific projects. Being a Community TA has been a great experience and an opportunity to get a deep understanding of these topics. I'm eager to contribute to scientific, open source projects and the like.

Drop me a line: [email protected]

Startup Engineering: https://www.coursera.org/course/startup

Machine Learning: https://www.coursera.org/course/ml

Github: https://github.com/ccarpenterg

ML code: https://github.com/ccarpenterg/ML


Machine Learning. I thought it was a really tough class and took more time than they say (they say workload: 5-7 hours/week -- maybe if you are perfect and your code never has bugs that you need to spend time debugging -- the course is based on programming assignments in Octave where you have to demonstrate mastery of machine learning concepts), but I put in a lot of extra time, mastered everything, finished with a 100. Andrew Ng is a top-notch teacher, even though his speaking style is very low-key. https://www.coursera.org/course/ml


OFFER TO VOLUNTEER - Machine Learning, Artificial Intelligence, Scientific and Open Source projects

I'm an engineer with experience working for startups doing web development. Currently I'm one of the Community TAs for the Startup Engineering class and for the Machine Learning class at Coursera (Stanford).

I'm looking for opportunities to volunteer, preferably Machine Learning, Artificial Intelligence or Scientific projects. Being a Community TA has been a great experience and an opportunity to get a deep understanding of these topics. I'm eager to contribute to scientific, open source projects and the like.

Drop me a line: [email protected]

Startup Engineering: https://www.coursera.org/course/startup

Machine Learning: https://www.coursera.org/course/ml

Github: https://github.com/ccarpenterg

pknerd

Me too. If anyone is offering something then do contact me as well!

SteveMorin

pknerd, I tried to reach you on LinkedIn but email me at [email protected]

artmageddon

I'm a student in the Stanford machine learning class at the moment and wanted to take a moment to say thank you for all the contribution that the community TAs have put in to help us out!

Nov 14, 2013 · kot-behemoth on Deep Learning 101

Link for the impatient https://www.coursera.org/course/ml Looks great indeed!

Nov 14, 2013 · mbeissinger on Deep Learning 101

Definitely a solid foundation in linear algebra and statistics (mostly Bayesian) is necessary for understanding how the algorithms work. Check out the wiki portals ( http://en.wikipedia.org/wiki/Machine_learning ) and ( http://en.wikipedia.org/wiki/Artificial_intelligence ) for overviews of the most common approaches.

Also, Andrew Ng's coursera course on machine learning is amazing ( https://www.coursera.org/course/ml ) as well as Norvig and Thrun's Udacity course on AI ( https://www.udacity.com/course/cs271 )

Yes, I'm taking it now.

Edit: https://www.coursera.org/course/ml

MLfan

There's also a more sophisticated course on ML by Hinton: https://www.coursera.org/course/neuralnets Have you tried it as well?

Anon84

I've browsed it a bit. I'm hoping they will offer it again.

I've been following the self paced AI class in Udacity https://www.udacity.com/course/cs271

OFFER TO VOLUNTEER - Machine Learning, Artificial Intelligence, Scientific and Open Source projects

I'm a Civil Industrial Engineer ( http://en.wikipedia.org/wiki/Industrial_engineering ) with some experience working for startups doing web development. Currently I'm one of the Community TAs for the Startup Engineering class and for the Machine Learning class at Coursera (Stanford).

I'm looking for opportunities to volunteer, preferably Machine Learning, Artificial Intelligence or Scientific projects. Being a Community TA has been a great experience and an opportunity to get a deep understanding of these topics. I'm eager to contribute to scientific, open source projects and the like.

Drop me a line: [email protected]

Startup Engineering: https://www.coursera.org/course/startup

Machine Learning: https://www.coursera.org/course/ml

If you're interested in Machine Learning, the outstanding Coursera course on machine learning just started a couple of days ago. It covers a variety of machine learning topics, including image recognition. The first assignment isn't due for a couple of weeks, so it's a perfect time to jump in and take the machine learning course!

https://www.coursera.org/course/ml

Here's a comment which I had written earlier on another article. The context was about learning ML with Python - of course the objective of Hal is more generic but some parts of it apply here too.

https://news.ycombinator.com/item?id=5802968

The book details building ML systems with Python and does not necessarily teach ML per se. It is a good time to write a ML book in Python particularly keeping in mind efforts to make Python scale to Big Data [0].

What material you want to refer to depends entirely on what you want to do. Here are some of my recommendations-

Q : Do you want to have an "Introduction to ML", some applications with Octave/Matlab as your toolbox?

A: Take up Andrew Ng's course on ML on Coursera [1].

Q : Do you want to have a complete understanding of ML with the mathematics, proofs and build your own algorithms in Octave/Matlab?

A : Take up Andrew Ng's course on ML as taught in Stanford; video lectures are available for free download [2]. Note - This is NOT the same as the Coursera course. For textbook lovers, I have found the handouts distributed in this course far better than textbooks with obscure and esoteric terms. It is entirely self contained. If you want an alternate opinion, try out Yaser Abu-Mostafa's ML course at Caltech [3].

Q : Do you want to apply ML along with NLP using Python ?

A : Try out Natural Language Tool Kit [4]. The HTML version of the NLTK book is freely available (Jump to Chapter 6 for the ML part) [5]. There is an NLTK cookbook available as well which has simple code examples to get you started [6].

Q: Do you want to apply standard ML algorithms using Python?

A : Try out scikit-learn [7]. The OP's book also seems to be a good fit in this category (Disclaimer - I haven't read the OP's book and this is not an endorsement).

[0] http://www.drdobbs.com/tools/us-defense-agency-feeds-python/... .

[1] https://www.coursera.org/course/ml

[2] http://academicearth.org/courses/machine-learning/

[3] http://work.caltech.edu/telecourse.html

[4] http://nltk.org

[5] http://nltk.org/book/

[6] http://www.amazon.com/Python-Text-Processing-NLTK-Cookbook/d... .

[7] http://scikit-learn.org
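
For the scikit-learn route in [7], a minimal illustrative example (the dataset and classifier choices are arbitrary; any reasonably recent scikit-learn version should work):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a toy dataset, hold out a test split, fit a standard classifier, score it.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(clf.score(X_test, y_test))  # held-out accuracy, typically well above 0.9
```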

Sep 15, 2013 · tga on A Course in Machine Learning

If you are interested in this you might want to also look at Andrew Ng's (Stanford) Machine Learning course that is starting soon on Coursera.

https://www.coursera.org/course/ml

kyro

Is this worth going through over picking up a textbook or two? I've found that Coursera courses are actually quite bloated. Lots and lots of empty talking, and very little substance.

Discordian93

Depends on the course, but other than a select few, I tend to agree. Not the case with edX courses, though; I think they are up to a much higher standard. (At least the programming/math ones.)

sampo

I've taken the Coursera ML class. It's very easy if you have the adequate math background. And it's not very comprehensive, there are lots of machine learning methods that are not covered. So it's more like an introductory course to machine learning.

But it's absolutely commendable how Andrew Ng takes the topic to such an understandable level that a clever high schooler who knows a little about programming could take the course. There are even extra videos serving as a crash course in linear algebra and Octave programming in the first week.

So he really manages to make the topic accessible to a very large audience.

dominotw

>adequate math background

Do you know what kind of math is needed other than linear algebra?

Discordian93

Haven't done the course, but from the preview of the videos, they cover the math you need and it's just basic, high-school level knowledge of linear algebra.

sampo

Basic vector and matrix operations. The first half of a typical freshman linear algebra course is more than enough. But like I said, there is a matrix review in the beginning, so if you're willing to study those extra lectures, then almost no prior knowledge is needed.

Also being able to take derivatives helps in a couple of places, but is not necessary.

yelnatz

That's linear algebra.

nrmn

Some calculus and linear algebra. The majority is linear algebra.

rm999

For a beginner to machine learning I'd recommend Andrew Ng's course notes and lectures over any textbook I've seen. But I prefer his Stanford CS 229 notes to Coursera for exactly the reasons you state: they are watered down. After you really can understand Andrew Ng's course notes I'd recommend a textbook because they go in more detail and cover more topics. My two favorites for general statistical machine learning are:

* Pattern Recognition and Machine Learning by Christopher M. Bishop

* The Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani and Jerome Friedman

Both are very intensive, perhaps to a fault. But they are good references and are good to at least skim through after you have baseline machine learning knowledge. At this stage you should be able to read almost any machine learning paper and actually understand it.

reyan

Isn't Murphy's book more up to date and comprehensive as a reference?

Edit: Andrew Ng's Coursera course is CS229A ( http://cs229a.stanford.edu/ ), not really watered down.

achompas

I'm a big fan of Murphy but its comprehensiveness means you lose some detailed explanations. Bishop really gets at those details (so does EoSL).

denzil_correa

Here's a comment which I had written earlier on another article. The context was about learning ML with Python - of course the objective of Hal is more generic but some parts of it apply here too.

https://news.ycombinator.com/item?id=5802968

The book details building ML systems with Python and does not necessarily teach ML per se. It is a good time to write a ML book in Python particularly keeping in mind efforts to make Python scale to Big Data [0].

What material you want to refer to depends entirely on what you want to do. Here are some of my recommendations-

Q : Do you want to have an "Introduction to ML", some applications with Octave/Matlab as your toolbox?

A: Take up Andrew Ng's course on ML on Coursera [1].

Q : Do you want to have a complete understanding of ML with the mathematics, proofs and build your own algorithms in Octave/Matlab?

A : Take up Andrew Ng's course on ML as taught in Stanford; video lectures are available for free download [2]. Note - This is NOT the same as the Coursera course. For textbook lovers, I have found the handouts distributed in this course far better than textbooks with obscure and esoteric terms. It is entirely self contained. If you want an alternate opinion, try out Yaser Abu-Mostafa's ML course at Caltech [3].

Q : Do you want to apply ML along with NLP using Python ?

A : Try out Natural Language Tool Kit [4]. The HTML version of the NLTK book is freely available (Jump to Chapter 6 for the ML part) [5]. There is an NLTK cookbook available as well which has simple code examples to get you started [6].

Q: Do you want to apply standard ML algorithms using Python?

A : Try out scikit-learn [7]. The OP's book also seems to be a good fit in this category (Disclaimer - I haven't read the OP's book and this is not an endorsement).

[0] http://www.drdobbs.com/tools/us-defense-agency-feeds-python/... .

[1] https://www.coursera.org/course/ml

[2] http://academicearth.org/courses/machine-learning/

[3] http://work.caltech.edu/telecourse.html

[4] http://nltk.org

[5] http://nltk.org/book/

[6] http://www.amazon.com/Python-Text-Processing-NLTK-Cookbook/d... .

[7] http://scikit-learn.org

Exactly ;) Damn I should have paid more attention in https://www.coursera.org/course/ml

The book details building ML systems with Python and does not necessarily teach ML per se. It is a good time to write a ML book in Python particularly keeping in mind efforts to make Python scale to Big Data [0].

What material you want to refer to depends entirely on what you want to do. Here are some of my recommendations-

Q : Do you want to have an "Introduction to ML", some applications with Octave/Matlab as your toolbox?

A: Take up Andrew Ng's course on ML on Coursera [1].

Q : Do you want to have a complete understanding of ML with the mathematics, proofs and build your own algorithms in Octave/Matlab?

A : Take up Andrew Ng's course on ML as taught in Stanford; video lectures are available for free download [2]. Note - This is NOT the same as the Coursera course. For textbook lovers, I have found the handouts distributed in this course far better than textbooks with obscure and esoteric terms. It is entirely self contained. If you want an alternate opinion, try out Yaser Abu-Mostafa's ML course at Caltech [3].

Q : Do you want to apply ML along with NLP using Python ?

A : Try out Natural Language Tool Kit [4]. The HTML version of the NLTK book is freely available (Jump to Chapter 6 for the ML part) [5]. There is an NLTK cookbook available as well which has simple code examples to get you started [6].

Q: Do you want to apply standard ML algorithms using Python?

A : Try out scikit-learn [7]. The OP's book also seems to be a good fit in this category (Disclaimer - I haven't read the OP's book and this is not an endorsement).

[0] http://www.drdobbs.com/tools/us-defense-agency-feeds-python/...

[1] https://www.coursera.org/course/ml

[2] http://academicearth.org/courses/machine-learning/

[3] http://work.caltech.edu/telecourse.html

[4] http://nltk.org

[5] http://nltk.org/book/

[6] http://www.amazon.com/Python-Text-Processing-NLTK-Cookbook/d...

[7] http://scikit-learn.org

For the curious, I think nothing beats the introduction to ML class from Stanford's Andrew Ng [1]. He lectures and explains with a clarity and consistency I don't often see.

[1]: https://www.coursera.org/course/ml

yankoff

Taking this course right now. Awesome.

winter_blue

I really want to take the course -- but I wish there was a textual version of it. I strongly strongly prefer text to audio/video.

Also, the voices of some of these lecturers have a sort of monotone to them that tends to let your mind wander off. They're just not "arresting" enough.

For instance, I took the Crypto I class part-way on Coursera, and had this experience. The instructor's voice was slow, drawn-out and kind of put you to sleep. I actually downloaded the videos and just played them in VLC at 1.25x or 1.5x speed (because he spoke so annoyingly slowly).

On the other hand, Tim Roughgarden, who teaches an Algorithms class on Coursera, has an amazing "video personality". Just the way he speaks -- it catches your attention. His passion and enthusiasm for the topic really come across. Now, I'm not saying the other professors aren't as passionate about what they teach -- it's just that some of these lecturers have a really good way of bringing it through (their love for the topic) on video. Not everyone can (or is) doing it.

derpadelt

You get annotated as well as original PPT-slides along with clear text transcripts of what he says in the videos. Can be a bit awkward as it is not a textbook text but it gets the job done. I honestly think it is hard to do a better course than what you get from Ng's Machine learning on Coursera.

I took this one https://www.coursera.org/course/ml and it started on Apr 22 and ends around July 1st I think. So not sure what course you are talking about.

pfg

Well, this is weird. I'm taking the exact same course and here's what I see for the first programming assignment: http://i.imgur.com/GyEwcUG.png (same with review questions)

ivanist

Sorry, my bad. I was under that assumption since I saw that the hard deadline for the review questions was today, so I thought that must hold for the programming assignment as well.

That's awesome to hear! I'm taking the Coursera class now, and it's been great so far. It just started a couple weeks ago, so it definitely isn't too late to join! https://www.coursera.org/course/ml

primaryobjects

I'm also taking the class. Just finished the logistic regression programming assignment. Great stuff. You can take those algorithms and kind of add a bit of magic to your software.

psbp

I wish Coursera followed the Udacity model. I always find out about these classes after they're already weeks in progress or over.

berberous

Check out class-central.com for a list of all current and upcoming classes. Coursera also lets you star a specific class and get notified if they are repeated in a new cycle.

pfg

You can star any Coursera class to receive notifications whenever new sessions are announced.

Also, I believe it's still possible to join the current session (first assignment was due this weekend, but you can turn it in late with just a 20% penalty.)

visarga

I take new courses at any time, even if they have ended. Later on, when they recycle, I can do them all over again with ease.

nkraft

Unless you're attached to getting a certificate of completion, you can pretty much follow the Udacity model. As long as the course hasn't finished, sign up and get around to the videos and assignments when you get to them. There isn't the same discussion forum interchange, and your homework isn't graded, but they don't drop the class from your list even if you do nothing during the run.

packetslave

Note that you'll want to be careful to cache the materials offline if you do this, especially if you plan on "catching up" after the formal end date for the class. Some of the courses (notably the Princeton algorithms ones) disable access to the materials once the official course ends.

etherealG

thanks very much for pointing this out, joined the class now, hope i can catch up

ivanist

Haha me too late to the party. I just joined. Frantically watching the video lectures since the assignments are due today (hard deadline)!

pfg

Okay, now you got me scared. The hard deadlines for all my assignments are on July 8th 8:59AM (that's CEST, so it's probably July 7th in PST.)

ivanist

I took this one https://www.coursera.org/course/ml and it started on Apr 22 and ends around July 1st I think. So not sure what course you are talking about.

pfg

Well, this is weird. I'm taking the exact same course and here's what I see for the first programming assignment: http://i.imgur.com/GyEwcUG.png (same with review questions)

ivanist

Sorry, my bad. I was under that assumption since I saw that the hard deadline for the review questions was today, so I thought that must hold for the programming assignment as well.

guiambros

Keep it going! I attended the very first one (~75%), and then did it a second time last year. It was then easy to get a perfect 100% score, having done most of the exercises the first time.

It is one of the best Coursera classes. I had a blast, and strongly recommend it. I decided to continue learning ML, mostly because of Prof. Ng.

Andrew Ng's course on Machine Learning ( https://www.coursera.org/course/ml ) has rave reviews whenever I see it mentioned. It just started again a few weeks ago and I had hoped to join this time, but other commitments made that impossible.

As for the other part of your question... You may as well be asking if it's too late to research 'science'. Machine Learning may have been studied for a fair few years now, but it is still very much in its infancy. The possible developments in this area that we can't yet imagine dwarf the ones we can, which in turn dwarf the 'major contributions' so far.

Feb 10, 2013 · 7 points, 4 comments · submitted by posharma

Does anyone here know the difference between these 3 machine learning courses?

Coursera (Washington Univ.): https://www.coursera.org/course/machlearning

Coursera (Stanford): https://www.coursera.org/course/ml

Stanford SEE: http://see.stanford.edu/see/courseinfo.aspx?coll=348ca38a-3a6d-4052-937d-cb017338d7b1

Thanks.

dragonbonheur

Stanford won DARPA's self-driving car race, that's all we need to know.

ankitml

Research and development success doesn't accurately determine the quality of teaching. All of these are good, but which one is best can be told by students' course reviews. Winning a car race doesn't help with this.

dvdhsu

Though Coursera's Stanford course and SEE's course are taught by the same professor, they are different. The Coursera one is 229A(pplied) and doesn't really explore the math behind the techniques; the SEE one is the original 229 that most Stanford students take, and is significantly more math-intensive. The 229A page explains the differences well [1]:

> Q: How does CS229A relate to CS229? Which should I take?

> A: CS229A is complementary to CS229, and provides more applied skills. It's okay to take both, though enrollment in CS229A is limited, and we may give priority to students who have not taken and who are not taking CS229. If your goal is a deep mathematical understanding of machine learning, or if your goal is to do research in AI or machine learning, you should definitely take CS229 (either instead of, or in addition to, CS229A). CS229 has a more difficult set of prerequisites. If you are interested in machine learning but aren't sure if you're ready for the mathematical depth that CS229 requires, then consider taking CS229A instead.

I haven't taken Washington's course, but it seems to be more comprehensive than both of Stanford's. It's currently only available in preview mode, though, so you won't have access to quizzes and programming assignments, which are vital for comprehension and retention.

If you're interested in getting started with machine learning and want to solve problems with it, I'd suggest the Coursera Stanford one. If you're interested in theory, go with the SEE one. I'm not familiar with the Washington one, but I don't recommend it as your primary course as it's still in preview mode, and only the lectures are available.

1. http://cs229a.stanford.edu/faq.html

posharma

Thanks a lot!

Your brain is a neural network of neural networks; during sleep, a cost function is applied across the entire grid. Important aspects of your day are done and redone at high velocity, simultaneously (leading to dreams).

Cost-benefit analyses are run against what you might have done and the results of that, and actions that would have caused more desirable outcomes are projected, as best as it can see, and the habits and motor neurons are reconfigured accordingly. This explains why, when you get good sleep and wake up, you find yourself much better able to do tasks than had you not slept. If you don't sleep, you die.

Source of these points:

http://www.amazon.com/How-Create-Mind-Thought-Revealed/dp/06...

https://www.coursera.org/course/ml

The title is misleading; this function also has to do with encoding short-term memories into long-term memories. Since the mind has only limited space (a limited number of neurons to configure), only the most useful memories are stored to permanent disk. Disruption of the 7-to-9-hour sleep cycle garbage-collects the memories that were about to be stored. The mind queues them up to be dealt with the following day, but sometimes they are displaced or missed by more passionate things in the present.

Sleep is one of the most important things you can do to maintain your mind and keep it in top running condition for as long as possible: not too little, not too much, and sleep in intervals of 90 minutes. If you consume garbage knowledge on a daily basis, your mind will encode that garbage to permanent disk, and you will become that garbage.

Conspiracy theorists suffer from a mental misconfiguration where the cost function applied to the neural network of neural networks suffers from "overfitting": finding patterns in randomness and drawing conclusions that are not valid. A lambda term can be applied to the cost function which will alleviate this. I can do it in software, and when I discover the operating principles of the neocortex, I will be able to fix all the conspiracy nuts in the local nut house. Take care not to take for granted the fresh slate of your mind while you are young, because when you are old it will be mostly full, and encoding new skills to disk will be much more difficult; the cost function is more reluctant to modify the grids, since doing so would damage your ability to consume resources, find mates, and create more of you. Fill your mind with timeless wisdom and get good sleep before your hard disks become full.

rgbrenner

"If you don't sleep, you die."

No human has ever died from simply not sleeping (excluding accidents, etc caused by lack of sleep)

http://www.scientificamerican.com/article.cfm?id=how-long-ca...

http://www.abc.net.au/health/talkinghealth/factbuster/storie...

fuzzythinker

See also http://en.wikipedia.org/wiki/Thai_Ngoc http://en.wikipedia.org/wiki/Al_Herpin

elteto

I can't say if you are entirely accurate, but you couldn't have explained that in better terms!

etherael
> Conspiracy theorists suffer from a mental misconfiguration where the cost function applied to the neural network of neural networks suffers from "overfitting": finding patterns in randomness and drawing conclusions that are not valid. A lambda term can be applied to the cost function which will alleviate this. I can do it in software, and when I discover the operating principles of the neocortex, I will be able to fix all the conspiracy nuts in the local nut house.

Wouldn't this work the other way, too? Not finding patterns in what turns out to not be randomness sometimes ends up getting you killed. It's a fine line between paranoia and attention to detail. Anyone with aspirations to "fix" this should probably take that into consideration.

maeon3

Correct, the opposite of overfitting is underfitting. Knowing that whenever you talk to Joe you get punched, and you have 10 training examples, but this time is different, he's wearing his brown shirt, so it's probably safe now.

Not finding the signal in the noise, because Joe hitting me 10 times in a row is not conclusive, because most humans never hit me, and Joe is wearing new clothing, so it's safe because Joe is a human.

Dec 26, 2012 · ajdecon on Why use SVM

Andrew Ng, who teaches the CS 229 Machine Learning course at Stanford, has his lecture notes online: http://cs229.stanford.edu/materials.html . I have found these useful in the past.

He also teaches the Coursera machine learning course: https://www.coursera.org/course/ml

Cool stuff... Andrew Ng's excellent Machine Learning course on Coursera also had a programming exercise that involved using k-means to reduce an image's palette (in Octave, though, rather than Python), so if you find this interesting you might consider signing up for his course the next time it comes around:

https://www.coursera.org/course/ml
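
A rough sketch of the same palette-reduction idea in Python instead of Octave (this is not the course's assignment code; the filename is made up):

```python
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

img = np.asarray(Image.open("photo.png").convert("RGB"))   # hypothetical input image
pixels = img.reshape(-1, 3).astype(float)

# Cluster all pixels into k colors, then replace each pixel with its cluster centroid.
k = 16
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
reduced = km.cluster_centers_[km.labels_].reshape(img.shape).astype(np.uint8)

Image.fromarray(reduced).save("photo_16_colors.png")
```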

You can use the same techniques used in linear regression (with multiple features) to do polynomial regression. For example, suppose you have two features x1 and x2; you can add higher-order features like x1*x2, x1^2 or x2^2, or a combination of these. While doing linear regression you treat these terms as individual features, so x1*x2 becomes a feature, say x3. This way you can fit non-linear data with a non-linear curve. However, there is a problem of overfitting, i.e. your curve may try to be too greedy and fit the data perfectly, but that's not what you want. So regularization is used to lower the contributions of the higher-order terms.

Wikipedia has an article on Polynomial Regression: http://en.wikipedia.org/wiki/Polynomial_regression

P.S I'm doing this course https://www.coursera.org/course/ml so my knowledge may not be entirely correct so take everything I've said with a pinch of salt. :)
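
A small sketch of the idea described above: expand the input into higher-order features, fit an ordinary linear model on them, and use an L2 (ridge) penalty as the regularization that tames the higher-order terms. Illustrative only; the data and the degree are made up.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(50, 1))
y = 0.5 * x[:, 0] ** 2 - x[:, 0] + rng.normal(scale=0.5, size=50)   # non-linear data

# PolynomialFeatures turns x into [1, x, x^2, ..., x^8]; Ridge is linear regression
# with an L2 penalty (alpha) that shrinks the higher-order coefficients.
model = make_pipeline(PolynomialFeatures(degree=8), Ridge(alpha=1.0))
model.fit(x, y)

print(model.predict(np.array([[1.5]])))   # should land near 0.5*1.5^2 - 1.5 = -0.375
```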

The class is starting again in three weeks: https://www.coursera.org/course/ml .

I'm pretty much in the same boat as you. Looking forward to it!

I am doing the https://www.coursera.org/course/ml from Stanford by Andrew Ng & I definitely recommend it.

I'm really excited by all of this free university level material flooding the web as I never even started college due to financial concerns (aka I didn't want to get any loans).

mikhael

Do you know whether Prof. Ng has updated the material since the first run of the class?

We are still in the honeymoon phase of free, online university courses, so I think there's been relatively little criticism of what's available now, but I'll go for it: I was disappointed by the Coursera/Stanford ML class. It was obviously watered down, the homeworks were very (very) easy, and I retained little or nothing of significance.

In contrast, the Caltech class was clearly not watered down, and, as the material was much more focused (with a strong theme of generalization, an idea almost entirely absent from the Stanford class, as I recall) I feel I learned far more.

Another big difference: the Caltech class had traditional hour-long lectures, a simple web form for submitting answers to the multiple-choice* homeworks, and a plain vBulletin forum. The lectures were live on ustream, but otherwise, no fancy infrastructure.

So I think that some interesting questions will come up. Do we need complex (new) platforms to deliver good classes? For me, the answer right now is no -- what clearly matters is the quality and thoughtfulness of the material and how well it is delivered. Can a topic like machine learning be taught effectively to someone who doesn't have a lot of time, or who doesn't have the appropriate background (in CS, math)? Can/should it be faked? I don't think so, but I think there are certainly nuances here.

* Despite being multiple-choice, the homeworks were not easy -- they typically required a lot of thought, and many required writing a lot of code from scratch.

binarysolo

The Coursera ML class is nowhere near the Stanford-level class in terms of academic rigor.

That being said, several of my peers who didn't go to the school really appreciated it for its accessibility.

I think the expectation of that class is to render ML education accessible and palatable, not to train everyone at an elite level. As this field grows, I'm sure the needs of various parties would be filled to an extent.

plafl

I think the courses are great for getting an idea of what the subject is about. If you face a related problem, at least you will know whether it can be efficiently solved. It will allow you to speak to an expert in the field at a basic level at least. That said, they certainly can be greatly improved.

DaveInTucson

Somewhat of a side-topic, but I just finished the Coursera compilers class. It didn't seem watered down to me, covering regular expression (including NFA and DFA representations), parsing theory and various top-down and bottom-up parsing algorithms, semantics checking (including a formal semantics notation), code generation (with formal operational notation), local and global optimization, register allocation and garbage collection.

I guess it was partially watered down in that the programming part of the class was optional.

bgilroy26

One of the conscious aims of the undergraduate coursera classes has been to lower the bar (in terms of assumed prerequisites, pace, and scope) in order to increase participation.

Daphne Koller's Probabilistic Graphical Models was their first graduate class and it was definitely tougher than other Coursera offerings have been.

lightcatcher

This. The Coursera PGM class is the only free online class that I've enrolled in that felt similar in difficulty to a slightly harder-than-average undergrad course at Caltech (where I go to school).

auston

I don't think he has. In hindsight, I guess it does seem watered down - but personally, that is ok, I enjoy the pace / difficulty level right now.

However, I'm glad you pointed it out, because I'm eager to learn about ML & hope to use this (CalTech) material to augment the foundation I get from the coursera class.

baotiao

I'm also taking the course, and I also definitely recommend it.

abhgh

Don't miss out on the original (i.e. pre-Coursera) Andrew Ng lectures, starting here: http://tinyurl.com/6uqeoo2 These are also mathematically more rigorous.

Have you seen the online courses?

https://www.coursera.org/course/ml (From one of the authors of this paper!)

https://www.coursera.org/course/vision

https://www.coursera.org/course/computervision

Prof. Hinton's videos are very watchable:

http://www.youtube.com/watch?v=AyzOUbkUf3M

http://www.youtube.com/watch?v=VdIURAu1-aU

sown

Yes, often. Thanks for the links.

magoghm

If you like math, Caltech's "Learning from Data" is awesome http://work.caltech.edu/telecourse.html

Why is this the top link on HN? There are already numerous courses available that will allow you to learn this stuff for free from very highly ranked universities, including Stanford [1] and CMU [2], among others. This will just teach you similar things while also taking your money and giving you a "certificate".

I guess if you want to enter a new field and you need to have some certifiable expertise, this may be a good option. That being said, if the field you plan on entering really does require some documented education, having this certificate will not even put you in the same playing field as those with actual degrees in the field, not to mention those with advanced degrees.

[1]: https://www.coursera.org/course/ml

[2]: http://www.cs.cmu.edu/~tom/10701_sp11/lectures.shtml

phren0logy

I would think this would be more appropriate to add to an existing skill set in another field.

swGooF

Yes, there are some good options available online. However, the certificate program and data science are more than just machine learning.

dundun

BigDataUniversity ( http://bigdatauniversity.com/ ) also has free courses and cover hadoop and some other stuff.

The site appears to push sponsor products and doesn't really talk about alternatives. Expect to see lots of endorsements of IBM products.

dbecker

The Coursera courses are excellent, but Coursera, Udacity, CMU, etc. are offering a different set of courses than the UW progam. For instance, I don't think any of the current players are offering Hadoop courses... In general, it looks like the UW program is more technology-specific and applied than the other programs.

Personally, I'd prefer the less technology-specific topics already on offer. But, my employer would be much more likely to hire someone with UW's course-mix. So, there should be some demand for that.

And, if we are talking about a career decision, $3,000 is small potatoes compared to the value of getting the right topics.

ivan_ah

> Why is this the top link on HN?

Because this is the response of the "university system" trying to protect its cash flow. Perhaps you missed it, but Udacity recently said it will do some certification program for "minimal costs", which was the beginning of the conversation. Education should be free (Khan style), says Udacity.

The University establishment's response is "here we will give you certificates, but you have to give us three grand."

Money for certificates, knowledge for free.

sown

For me, I'm wondering if I'm learning the right material. If I self teach, how do I really know if I'm doing this right unless I get feedback? Also, many hiring managers might not recognize self-taught expertise. I might not want to work for those managers, but unfortunately, this includes a very large swath of potential jobs.

Of course, I wonder if this certificate gets me anywhere as far as employment goes.

Apr 23, 2012 · 2 points, 0 comments · submitted by coconutrandom
Apr 23, 2012 · 7 points, 0 comments · submitted by fooyc
HN Academy is an independent project and is not operated by Y Combinator, Coursera, edX, or any of the universities and other institutions providing courses.
~ [email protected]
yahnd.com ~ Privacy Policy ~