
Learner reviews and feedback for Mathematics for Machine Learning: Multivariate Calculus by Imperial College London

5,115 ratings
911 reviews

About the Course

This course offers a brief introduction to the multivariate calculus required to build many common machine learning techniques. We start at the very beginning with a refresher on the "rise over run" formulation of a slope, before converting this to the formal definition of the gradient of a function. We then start to build up a set of tools for making calculus easier and faster. Next, we learn how to calculate vectors that point uphill on multidimensional surfaces, and even put this into action using an interactive game. We take a look at how we can use calculus to build approximations to functions, as well as to quantify how accurate we should expect those approximations to be. We also spend some time talking about where calculus comes up in the training of neural networks, before finally showing you how it is applied in linear regression models. This course is intended to offer an intuitive understanding of calculus, as well as the language necessary to look up concepts yourself when you get stuck. Hopefully, without going into too much detail, you'll still come away with the confidence to dive into some more focused machine learning courses in the future.
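The "rise over run" idea mentioned in the description can be sketched numerically in a few lines. This is a minimal illustration of my own (the function f and step size h are arbitrary choices, not taken from the course):

```python
# Estimate the slope of f(x) = x**2 at x = 3 using "rise over run":
# slope ~ (f(x + h) - f(x)) / h, which approaches the true derivative 2x as h -> 0.

def f(x):
    return x ** 2

def rise_over_run(f, x, h=1e-6):
    """Forward-difference estimate of the slope of f at x."""
    return (f(x + h) - f(x)) / h

slope = rise_over_run(f, 3.0)
print(slope)  # close to the exact derivative f'(3) = 6
```

Shrinking h toward zero is exactly the limit that turns this informal slope into the formal definition of the gradient.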

Top Reviews


Nov 12, 2018

Excellent course. I completed this course with no prior knowledge of multivariate calculus and was successful nonetheless. It was challenging and extremely interesting, informative, and well designed.


Aug 3, 2019

Very well explained. Good content and great explanation of it. Complex topics are also covered in a very easy way. Very helpful for learning much more complex machine learning topics in the future.

Filter by:

676 - 700 of 915 Reviews for Mathematics for Machine Learning: Multivariate Calculus

by Felix G S S

Mar 26, 2021


by Sinatrio B W M

Mar 2, 2021


by Md. R Q S

Aug 21, 2020


by Kailun C

Jan 25, 2020


by Doni S

Mar 27, 2022


by Burra s g

Jan 18, 2022


by 李由

Aug 23, 2021


by Dwi F D S M

Mar 23, 2021


by Ahmad H N

Mar 16, 2021


by Habib B K

Mar 12, 2021


by Indah D S

Feb 27, 2021



Jul 25, 2020


by Nat

Mar 6, 2020


by Zhao J

Sep 11, 2019


by Harsh D

Jun 26, 2018


by Amini D P S

Mar 26, 2022


by Roberto

Mar 25, 2021


by Angel E E V

Nov 30, 2021


by Omar D

May 5, 2020


by Гончарова П В

May 10, 2022


by Aidana P B

Apr 26, 2021


by Naga V B G

Aug 7, 2020


by Kaushal K K

Apr 23, 2022

A good, brief overview of the topics in multivariate calculus relevant to machine learning and optimisation. It may not necessarily go deep enough to make you an expert in solving problems in multivariate calculus that might be seen at the university level; rather, it goes just deep enough to enable you to understand how multivariable calculus operates in various machine learning scenarios. Some of these scenarios include:

(1) The process of backpropagation in basic neural networks.

(2) Using the Newton-Raphson method to find the roots of a function in the multivariate case.

(3) Use of the Taylor series to approximate a function in the multivariate case, and how such an approximation can be used for optimisation.

(4) Using gradient descent to reach the nearest minimum points in the parameter space, so as to optimise the parameters in a machine learning model with multiple parameters.
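As a rough sketch of scenario (4) above, here is gradient descent minimising a simple two-parameter function. The function, starting point, and learning rate are illustrative choices of mine, not material from the course:

```python
# Minimise f(a, b) = (a - 1)**2 + (b + 2)**2 by stepping against the gradient.
# The gradient is (2*(a - 1), 2*(b + 2)); the minimum sits at (a, b) = (1, -2).

def grad(a, b):
    """Partial derivatives of f with respect to a and b."""
    return 2 * (a - 1), 2 * (b + 2)

a, b = 0.0, 0.0   # initial guess for the parameters
lr = 0.1          # learning rate (step size)
for _ in range(200):
    da, db = grad(a, b)
    a, b = a - lr * da, b - lr * db

print(round(a, 4), round(b, 4))  # converges to (1.0, -2.0)
```

The same update rule, applied to a model's loss surface instead of this toy function, is what optimises the parameters of a machine learning model.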

The quizzes provide a few example problems for us to work on, but as mentioned earlier, they are of the more basic variety; undergraduate courses rarely have examples this straightforward. However, I feel that this is a good thing, given that their aim is only to let us get a feel for multivariable calculus without bogging us down with needless complexity.

The overall aim of the course is to build intuition, which I think it accomplishes.

However, compared to the previous course in this specialization, it is harder to draw the links between the material covered in one week and the next, and to see how each week fits into the overall picture. In the previous course, concepts from earlier weeks were seamlessly integrated into the current week's material. Here there seems to be an unspoken expectation that participants will refer to external resources to fill in the blanks and find the coherence within the material themselves. I feel the instructors could do better at integrating the concepts taught across the weeks, so that the course does not feel quite so fragmented.

by Rinat T

Aug 1, 2018

The part about neural networks needs improvement (some more examples of simple networks, and an explanation of where the sigmoid function comes from). The exercises on partial derivatives should focus more on the various aspects of partial differentiation rather than on taking partial derivatives of complicated functions. I felt there was too much of the latter, which is not very efficient, because the idea of partial differentiation is easy to master but its applications are not always. Taking partial derivatives of sophisticated functions (be it for the sake of Jacobian or Hessian calculation) turns into doing lots of algebra whose underlying idea has long been understood. So while some of the existing exercises on partial differentiation, Jacobians and Hessians should be retained, about 50 percent of them should be replaced with exercises that are light on algebra and instead demonstrate the different ways and/or applications in which partial differentiation is used. Otherwise all good.
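For readers who want to check their partial-derivative work on the Jacobians this review mentions, one option is a numerical estimate with central differences. This is a sketch of my own, not part of the course material:

```python
# Central-difference estimate of the Jacobian of f: R^n -> R^m.

def jacobian(f, x, h=1e-5):
    """Return J with J[i][j] ~ d f_i / d x_j, evaluated at the point x."""
    m = len(f(x))
    n = len(x)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x); xp[j] += h   # nudge coordinate j up
        xm = list(x); xm[j] -= h   # nudge coordinate j down
        fp, fm = f(xp), f(xm)
        for i in range(m):
            J[i][j] = (fp[i] - fm[i]) / (2 * h)
    return J

# Example: f(x, y) = (x*y, x + y**2) has exact Jacobian [[y, x], [1, 2*y]].
f = lambda v: [v[0] * v[1], v[0] + v[1] ** 2]
print(jacobian(f, [2.0, 3.0]))  # approximately [[3, 2], [1, 6]]
```

Comparing such a numerical estimate against a hand-computed Jacobian is a quick sanity check when the algebra gets heavy.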

by yarusx

Apr 8, 2020

1) The course is entirely in British English, with many words and phrases that are rarely used globally. 2) The pace of the course was just not suitable for me. If you don't have a strong maths or engineering background, you will need to search for explanations elsewhere (Khan Academy is a great resource, etc.). Closer to the end of the course I stopped having a full understanding of what was going on and why. I could calculate things, but I don't feel I will still be able to in 1-2 weeks, because I didn't have the time and opportunity to consolidate the skills I gained. 3) I also don't understand why the instructors (especially David) don't visualize what they say the way Sal or Grant do, drawing on the board and on the plots. Sometimes it feels like you are just listening to an audiobook about maths.

I will take the Stanford ML course after this one, and also review what I've learned here with Khan Academy resources.