Created by: University of Washington

Taught by:

  • Emily Fox, Amazon Professor of Machine Learning

  • Carlos Guestrin, Amazon Professor of Machine Learning, Computer Science and Engineering
Basic Info
Course 2 of 4 in the Machine Learning Specialization.
Commitment: 6 weeks of study, 5-8 hours/week
How To Pass: Pass all graded assignments to complete the course.
User Ratings
Average User Rating: 4.8 stars

How It Works

Each course is like an interactive textbook, featuring pre-recorded videos, quizzes and projects.

Help from Your Peers

Connect with thousands of other learners and debate ideas, discuss course material, and get help mastering concepts.


Earn a Certificate

Earn official recognition for your work, and share your success with friends, colleagues, and employers.

University of Washington
Founded in 1861, the University of Washington is one of the oldest state-supported institutions of higher education on the West Coast and is one of the preeminent research universities in the world.
Audit vs. Purchase Course

  • Access to course materials: included with both
  • Access to graded materials: not available when auditing
  • Receive a final grade: not available when auditing
  • Earn a shareable Course Certificate: not available when auditing

Ratings and Reviews
Rated 4.8 out of 5 based on 2,846 ratings


A little bit boring and hard to focus on, sometimes

Interesting course. However, I have some mixed feelings:

I have a BS in mathematics from Mexico (a "licenciatura", which sits somewhere between a BS and an MS).

So I'd say I have pretty good knowledge of statistics. Now it is "training" instead of "fitting", and "overfitting" instead of "multicollinearity". There are some algorithms to remove/add features (Ridge/Lasso), which, as noted, induce bias in the parameters. However, more formal methods such as stepwise regression and Bayesian sequences are completely ignored.
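The Ridge/Lasso behaviour the reviewer alludes to (coefficients shrunk toward zero, with Lasso dropping features entirely, at the cost of biased estimates) can be sketched with scikit-learn. The dataset below is synthetic and invented purely for illustration; it is not from the course:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# True model uses only the first two features; the other three are noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

lasso = Lasso(alpha=0.5).fit(X, y)
ridge = Ridge(alpha=0.5).fit(X, y)

# Lasso sets the irrelevant coefficients exactly to zero (feature removal)
# and shrinks the relevant ones below their true values (induced bias).
# Ridge shrinks all coefficients but generally keeps them nonzero.
print("Lasso coefficients:", lasso.coef_.round(2))
print("Ridge coefficients:", ridge.coef_.round(2))
```

The shrinkage is the bias the reviewer mentions: the Lasso estimate of the first coefficient lands noticeably below its true value of 3.0, in exchange for a sparser, lower-variance model.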

That'd be fine, except that there is not even the slightest attempt to address statistical significance, either for the model as a whole or for the individual parameters.

Some other methods (moving averages, Henderson MA, Mahalanobis distances) should also be covered.

So, in summary, an interesting course in the sense that it gives an idea of where the state of the art lies, but a little disappointing in the sense that, except for some new labels for the same tricks and humongous computing power, there is still nothing new under the sun. Still, worth the time invested.

Excellent course. Emily and Carlos are fantastic teachers and have clearly put a huge amount of effort into making a great course. Thanks guys!