We will split this module into an applied and a theoretical part, since different courses are better suited to each of these aspects. You can start with either one or work through both in parallel.
If you are looking for an approachable and super practical intro to topics like decision trees and random forests, this is the course to take.
If you are looking for a more comprehensive course (or additional material), go for the Columbia course. Both are quite complementary, so you can easily take both at the same time.
In typical fast.ai fashion, this course introduces topics such as model validation and logistic regression in a bottom-up way, e.g. by building random forests from scratch.
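To get a feel for what "from scratch" means here, the toy sketch below (my own illustration, not code from the course) finds the best split point for a single feature by minimizing squared error, which is the basic building block of a regression tree:

```python
import numpy as np

def best_split(x, y):
    """Return the threshold on x that minimizes total squared error of the two halves."""
    best_thresh, best_score = None, float("inf")
    for thresh in np.unique(x):
        left, right = y[x <= thresh], y[x > thresh]
        if len(left) == 0 or len(right) == 0:
            continue  # skip splits that leave one side empty
        score = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if score < best_score:
            best_thresh, best_score = thresh, score
    return best_thresh

# Tiny made-up dataset: the best split separates the two clusters of targets.
x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([1.0, 1.2, 0.9, 5.0, 5.2, 4.9])
print(best_split(x, y))  # -> 3.0
```

A full tree repeats this greedy split recursively on each half, and a random forest averages many such trees fit on bootstrapped samples.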
Taught by one of the core devs of scikit-learn, this is a very hands-on course. It is more comprehensive than the fast.ai ML course and covers topics such as data preprocessing, gradient boosting, calibration, model inspection and time series.
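As a flavour of the topics listed above, here is a minimal scikit-learn sketch (my own, not taken from the course) that combines data preprocessing and gradient boosting in one pipeline; the column names and dataset are hypothetical placeholders:

```python
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical column names -- substitute those of your own dataset.
numeric_cols = ["age", "income"]
categorical_cols = ["city"]

preprocess = ColumnTransformer([
    ("num", StandardScaler(), numeric_cols),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_cols),
])

model = Pipeline([
    ("preprocess", preprocess),
    ("clf", GradientBoostingClassifier()),
])

# With a feature table X and labels y:
# scores = cross_val_score(model, X, y, cv=5)
```

Wrapping preprocessing and the model in a single `Pipeline` keeps cross-validation honest, since the scaler and encoder are refit on each training fold.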
For the most comprehensive introduction, take Cornell's CS4780. While the course doesn't have any published exercises, you should be able to work through Stanford's CS229 exercises with the knowledge from this course.
If you prefer Andrew Ng's style of teaching, are looking for a shorter course or want to explore RL as well, take a look at Stanford's CS229.
The Bloomberg course is also excellent: it is a bit more approachable than the ones above and slightly more practically oriented.
This is the Stanford lecture version of Andrew Ng's famous ML MOOC. While it is not as comprehensive as the Cornell course, it goes through topics such as RL that are not discussed in CS4780. Furthermore, it also has coding and written exercises.
While not as comprehensive as the above courses, this course offers some of the best intuitive explanations of topics such as convex optimization and maximum likelihood estimation.
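For a concrete sense of how these two topics connect, the sketch below (my own illustration, with made-up coin-flip data) poses maximum likelihood estimation of a Bernoulli parameter as a convex optimization problem and checks the numerical answer against the closed-form one:

```python
import numpy as np
from scipy.optimize import minimize_scalar

data = np.array([1, 0, 1, 1, 0, 1])  # hypothetical coin-flip observations

def neg_log_likelihood(p):
    # Bernoulli negative log-likelihood; convex in p on (0, 1).
    return -np.sum(data * np.log(p) + (1 - data) * np.log(1 - p))

result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(result.x)     # numerical MLE, approximately 0.667
print(data.mean())  # closed-form MLE: the sample mean
```

Minimizing the negative log-likelihood is exactly the kind of convex problem such courses spend time building intuition for.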
Peter Bloem, the author of this incredible blog post on Transformers, also teaches this ML course at VU Amsterdam. In the same style as his blog, Peter uses simple visuals to express ideas, which makes them much more digestible.
ML Blinks is one of the best visual ML resources online. Islem Rekik has a gift for teaching through beautiful illustrations. Her videos are basically what you'd imagine if 3blue1brown and mathematicalmonk collaborated.
One of the "classic" books in statistical learning, covering topics like regression, regularization, boosting trees, SVMs, random forests and undirected graphical models.
One of the best ways to see ML in practice is Kaggle's Kernels section, which has a myriad of high-quality tutorials and examples of using ML on various datasets.