## Video: Why is Probability Important to Machine Learning?

Probability underpins many aspects of machine learning, though this may not be obvious at the outset. In this video we look at several ways probability informs ML. This is part of a series of videos for COS 302: Mathematics for Numerical Computation and Machine Learning, replacing lectures after the course went remote due to the COVID-19 pandemic.

## Video: Why is the Gradient the Direction of Steepest Ascent?

We often talk about the gradient of a scalar function as being the direction of steepest ascent. Rather than taking that for granted, let’s convince ourselves that it is true. This is part of a series of videos for COS 302: Mathematics for Numerical Computation and Machine Learning, replacing lectures after the course went remote due to the COVID-19 pandemic.
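The core of the argument can be sketched in a few lines (this is the standard Cauchy–Schwarz derivation, not a transcript of the video):

```latex
% Directional derivative of f along a unit vector u:
D_{\mathbf{u}} f(\mathbf{x}) = \nabla f(\mathbf{x})^\top \mathbf{u}, \qquad \|\mathbf{u}\| = 1.
% Cauchy--Schwarz bounds this by the gradient norm:
\nabla f(\mathbf{x})^\top \mathbf{u} \le \|\nabla f(\mathbf{x})\|\,\|\mathbf{u}\| = \|\nabla f(\mathbf{x})\|,
% with equality exactly when u points along the gradient:
\mathbf{u}^\star = \frac{\nabla f(\mathbf{x})}{\|\nabla f(\mathbf{x})\|}.
```

So among all unit directions, the rate of increase is maximized by the normalized gradient, which is what "direction of steepest ascent" means.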

## Video: Derivative as the Best Affine Approximation

A useful way to think about derivatives (and gradients/Jacobians more generally) is as the maps that give you the best affine approximation at a point. This is part of a series of videos for COS 302: Mathematics for Numerical Computation and Machine Learning, replacing lectures after the course went remote due to the COVID-19 pandemic.
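One way to see this numerically: if the Jacobian gives the best affine approximation, the approximation error should shrink faster than the step size. A minimal sketch, using an arbitrary example function (the map `f` and the evaluation point below are illustrative choices, not taken from the video):

```python
import math

def f(x, y):
    # An arbitrary nonlinear map R^2 -> R^2, chosen for illustration.
    return (x**2 + y, math.sin(x) * y)

def jacobian(x, y):
    # Analytic Jacobian of f at (x, y).
    return ((2.0 * x, 1.0),
            (y * math.cos(x), math.sin(x)))

def affine_approx(x0, y0, dx, dy):
    # Best affine approximation at (x0, y0): f(x0) + J(x0) @ (dx, dy).
    f0 = f(x0, y0)
    J = jacobian(x0, y0)
    return (f0[0] + J[0][0] * dx + J[0][1] * dy,
            f0[1] + J[1][0] * dx + J[1][1] * dy)

def error_ratio(x0, y0, dx, dy):
    # ||f(x0 + h) - affine(h)|| / ||h||, which should vanish as h -> 0.
    exact = f(x0 + dx, y0 + dy)
    approx = affine_approx(x0, y0, dx, dy)
    err = math.hypot(exact[0] - approx[0], exact[1] - approx[1])
    return err / math.hypot(dx, dy)

# The ratio drops roughly linearly with the step size, as the
# "best affine approximation" view predicts.
for eps in (1e-1, 1e-2, 1e-3):
    print(eps, error_ratio(1.0, 2.0, 0.3 * eps, -0.4 * eps))
```

The key observation is that dividing by `||h||` still sends the error to zero, which is exactly the defining property of the derivative as an affine (first-order) approximation.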

## Video: Partial Derivatives

We think about gradients a lot in machine learning. This video talks about partial derivatives in general and Jacobian matrices, which specialize to gradients for scalar functions. This is part of a series of videos for COS 302: Mathematics for Numerical Computation and Machine Learning, replacing lectures after the course went remote due to the COVID-19 pandemic.
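As a quick companion to the definition, partial derivatives can be estimated one coordinate at a time with finite differences and stacked into a gradient. A minimal sketch (the test function and evaluation point are illustrative assumptions, not from the video):

```python
def partial(f, x, i, h=1e-6):
    # Central-difference estimate of the i-th partial derivative of f at x:
    # perturb only coordinate i, holding the others fixed.
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return (f(xp) - f(xm)) / (2.0 * h)

def grad(f, x):
    # Stack the partials into the gradient (one row of the Jacobian
    # when f is scalar-valued).
    return [partial(f, x, i) for i in range(len(x))]

# Example: f(x, y) = x^2 y + y^3, whose gradient is (2xy, x^2 + 3y^2).
f = lambda v: v[0]**2 * v[1] + v[1]**3
g = grad(f, [1.0, 2.0])  # analytic value: (4.0, 13.0)
```

For a vector-valued function, repeating this for each output component fills in the rows of the Jacobian matrix; for a scalar function the single row is the gradient, matching the specialization mentioned above.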