Phase 3: Mathematics for ML
This folder provides the mathematical intuition behind the rest of the curriculum. The goal is not to turn this repo into a pure math degree. The goal is to give you enough fluency to understand optimization, probability, embeddings, attention, and evaluation without treating them as magic.
Folder Map
foundational/: the main starting point for most learners
mml-book/course/: Mathematics for Machine Learning notebook sequence
mml-book/exercises/: additional practice
islp-book/: statistical learning and classical ML foundations
cs229-course/course/: Stanford-style ML theory and algorithms
mlpp-book/: probabilistic modeling depth
advanced/: research-level topics; selective, not required on a first pass
resources/: PDFs and reference material
Recommended First Pass
Strong Follow-On Paths
For ML engineer depth: mml-book/course/ and cs229-course/course/
For data science depth: islp-book/ and selected mlpp-book/ notebooks
For research curiosity: selected topics in advanced/
Practical Learning Rules
Learn the intuition before the notation.
Re-derive small examples by hand when possible.
If a symbol-heavy notebook feels abstract, reconnect it to one concrete downstream use case: gradient descent, cosine similarity, cross-entropy, PCA, or attention.
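As one example of that reconnection, the dot products and norms from linear algebra become cosine similarity, which is how embedding lookups rank "nearby" vectors. A minimal pure-Python sketch (the function names and toy vectors here are illustrative, not taken from any notebook in this folder):

```python
from math import sqrt

def dot(u, v):
    # Inner product: sum of elementwise products.
    return sum(a * b for a, b in zip(u, v))

def cosine_similarity(u, v):
    # cos(theta) = (u . v) / (||u|| * ||v||)
    # Ranges from -1 (opposite) through 0 (orthogonal) to 1 (same direction).
    return dot(u, v) / (sqrt(dot(u, u)) * sqrt(dot(v, v)))

# Parallel vectors score ~1.0; orthogonal vectors score 0.0.
print(cosine_similarity([1.0, 2.0], [2.0, 4.0]))
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))
```

Note that cosine similarity ignores magnitude, which is why it is a common choice for comparing embeddings of different lengths.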
High-Value Modules
Linear algebra: embeddings, PCA, attention, matrix ops
Calculus and optimization: training dynamics and backprop
Probability and statistics: uncertainty, evaluation, inference, Bayesian thinking
Information theory: cross-entropy and KL divergence
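To make the information-theory entry concrete: cross-entropy is the standard classification loss, and KL divergence measures how far a model's predicted distribution is from the target. A small sketch under the usual definitions (the one-hot label and the predicted distribution below are made-up toy values):

```python
from math import log

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log(q_i); terms with p_i = 0 contribute nothing.
    return -sum(pi * log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # KL(p || q) = sum_i p_i * log(p_i / q_i) = H(p, q) - H(p)
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [1.0, 0.0, 0.0]   # one-hot true label
q = [0.7, 0.2, 0.1]   # model's predicted distribution

# For a one-hot target, cross-entropy reduces to -log(prob of the true class).
print(cross_entropy(p, q))
# KL divergence of a distribution against itself is zero.
print(kl_divergence(q, q))
```

For a one-hot target the two quantities coincide, since H(p) = 0; in general they differ by the entropy of the target distribution.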
What To Avoid
Do not try to finish every notebook before continuing the curriculum.
Do not spend weeks on theorem-level depth if your goal is applied AI engineering.
Do not skip probability and statistics just because you prefer neural networks.
Best Next Step
After finishing the foundational notebooks, continue into 05-embeddings/ and 06-neural-networks/, then return here as needed.