Phase 3: Mathematics for ML

This folder provides the mathematical intuition behind the rest of the curriculum. The goal is not to turn this repo into a pure math degree. The goal is to give you enough fluency to understand optimization, probability, embeddings, attention, and evaluation without treating them as magic.

Folder Map

  • foundational/: the main starting point for most learners

  • mml-book/course/: Mathematics for Machine Learning notebook sequence

  • mml-book/exercises/: additional practice

  • islp-book/: statistical learning and classical ML foundations

  • cs229-course/course/: Stanford-style ML theory and algorithms

  • mlpp-book/: probabilistic modeling depth

  • advanced/: research-level topics; selective, not required on a first pass

  • resources/: PDFs and reference material

Strong Follow-On Paths

Practical Learning Rules

  • Learn the intuition before the notation.

  • Re-derive small examples by hand when possible.

  • If a symbol-heavy notebook feels abstract, reconnect it to one downstream use case: gradient descent, cosine similarity, cross-entropy, PCA, or attention.
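Two of those downstream use cases fit in a few lines each. The sketch below (a minimal illustration, not taken from any notebook in this repo) computes cosine similarity between two vectors and runs plain gradient descent on a one-dimensional quadratic; the function names and learning rate are arbitrary choices for the example.

```python
import numpy as np

def cosine_similarity(a, b):
    # Dot product of the vectors divided by the product of their lengths.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Similar direction -> similarity near 1; orthogonal -> near 0.
u = np.array([1.0, 2.0, 3.0])
v = np.array([2.0, 4.0, 6.1])
sim = cosine_similarity(u, v)

# Gradient descent on f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
x, lr = 10.0, 0.1
for _ in range(100):
    x -= lr * 2 * (x - 3)   # step against the gradient
# x ends up very close to the minimizer, 3.
```

Re-deriving the gradient of `f` by hand before running the loop is exactly the kind of small exercise the rules above recommend.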

High-Value Modules

  • Linear algebra: embeddings, PCA, attention, matrix ops

  • Calculus and optimization: training dynamics and backprop

  • Probability and statistics: uncertainty, evaluation, inference, Bayesian thinking

  • Information theory: cross-entropy and KL divergence
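The information-theory bullet can be made concrete in a few lines. This is a small illustrative sketch (the distributions are made up for the example): it computes entropy, cross-entropy, and KL divergence for two discrete distributions and checks the identity H(p, q) = H(p) + KL(p || q), which is why minimizing cross-entropy against a fixed target is the same as minimizing KL divergence.

```python
import numpy as np

def entropy(p):
    # H(p) = -sum p * log p, in nats.
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    # H(p, q) = -sum p * log q: expected code length using q for data from p.
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    # KL(p || q) = sum p * log(p / q); zero iff p == q.
    return np.sum(p * np.log(p / q))

p = np.array([0.7, 0.2, 0.1])  # "true" distribution
q = np.array([0.5, 0.3, 0.2])  # model's predicted distribution

ce = cross_entropy(p, q)
# Identity: cross-entropy = entropy of p + KL divergence from p to q.
assert np.isclose(ce, entropy(p) + kl_divergence(p, q))
```

Note the functions assume strictly positive probabilities; practical implementations clip or smooth q to avoid log(0).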

What To Avoid

  • Do not try to finish every notebook before continuing the curriculum.

  • Do not spend weeks on theorem-level depth if your goal is applied AI engineering.

  • Do not skip probability and statistics just because you prefer neural networks.

Best Next Step

After finishing the foundational notebooks, continue into 05-embeddings/ and 06-neural-networks/, then return here as needed when a concept calls for deeper math.