AI 501 Mathematics for Artificial Intelligence

Fall 2024

Syed Babar Ali School of Science and Engineering
Lahore University of Management Sciences



Course Overview

This course offers an in-depth exploration of the mathematical principles that form the foundation of machine learning (ML) and artificial intelligence (AI). Aimed at graduate students and industry professionals, it provides a rigorous grounding in the concepts needed to develop, implement, and evaluate ML and AI algorithms.

In broad-brush terms, the course covers the following topics:

  • Vector and Matrix Operations: Understanding vectors, matrices, and their operations is fundamental to data representation and transformations in ML. The course covers vector spaces, linear transformations, eigenvalues, and eigenvectors, focusing on their practical applications in data analysis and feature extraction.

  • Linear and Logistic Regression: Students will explore regression techniques, crucial for prediction and classification tasks. The course delves into the least squares method, regularization techniques, and logistic regression, linking theoretical concepts to practical applications in supervised learning (a short illustrative sketch follows this list).

  • Matrix Decompositions: The course includes detailed discussions on eigenvalue decomposition (EVD) and singular value decomposition (SVD), highlighting their importance in dimensionality reduction, data compression, and noise reduction.

  • Dimensionality Reduction: Techniques such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are explored, providing tools to manage and visualize high-dimensional data.

  • Optimization Techniques: Essential for training ML models, this part of the course covers gradient descent, convex optimization, and advanced methods such as Newton's method. Students will learn to implement these techniques to minimize cost functions effectively.

  • Probability and Statistics: Fundamental probabilistic concepts are covered, including random variables, distributions, Bayes’ theorem, and inference methods. This foundation is crucial for understanding probabilistic models and Bayesian inference.

  • Machine Learning Algorithms: The course provides a comprehensive overview of key ML algorithms, including Support Vector Machines (SVMs), decision trees, and neural networks. Advanced topics such as deep learning and convolutional neural networks are also introduced.

  • Real-World Applications and Case Studies: Throughout the course, theoretical concepts are linked with practical applications through case studies and real-world examples, providing students with insights into how ML and AI are applied across various domains.
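
The sketch below ties several of these topics together: it fits a linear regression both by the pseudo-inverse (least squares) and by gradient descent, and then computes a PCA projection via the SVD. It is a minimal illustration in Python with NumPy; the synthetic data, step size, and iteration count are illustrative choices, not course material.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic regression data: y = X w_true + noise (illustrative, not from the course)
    n, d = 200, 3
    X = rng.normal(size=(n, d))
    w_true = np.array([2.0, -1.0, 0.5])
    y = X @ w_true + 0.1 * rng.normal(size=n)

    # Least squares via the pseudo-inverse: w = pinv(X) y
    w_ls = np.linalg.pinv(X) @ y

    # The same problem solved by gradient descent on the cost
    # J(w) = (1/(2n)) ||X w - y||^2, whose gradient is (1/n) X^T (X w - y)
    w_gd = np.zeros(d)
    step = 0.1
    for _ in range(2000):
        grad = X.T @ (X @ w_gd - y) / n
        w_gd -= step * grad

    print("pseudo-inverse solution:", np.round(w_ls, 3))
    print("gradient-descent solution:", np.round(w_gd, 3))

    # PCA via the SVD: centre the data, project onto the top-2 right singular vectors
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    X_proj = Xc @ Vt[:2].T              # n x 2 low-dimensional representation
    explained = S**2 / np.sum(S**2)     # fraction of variance per principal component
    print("explained variance ratio:", np.round(explained, 3))

On this synthetic data both solvers recover weights close to w_true, and the explained-variance ratios show how much of the centred data's variance each principal direction captures.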

Announcements

  • (Oct 12) The midterm exam will be held on October 26. Further details will be posted on Slack.

  • (Sep 12) Homework 01 has been released.

  • (Aug 28) Welcome to AI 501. The course outline has been posted.

Administrative Details

  • Course Outline (download)

  • Suggested Books:

    • S. Boyd and L. Vandenberghe, Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares, Cambridge University Press, 2018 (PDF available for download)

    • M. P. Deisenroth, A. A. Faisal, and C. S. Ong, Mathematics for Machine Learning, Cambridge University Press, 2020 (PDF available for download)

    • A. H. Sayed, Inference and Learning from Data, Volume 1: Foundations, Cambridge University Press, 2022

    • G. Strang, Introduction to Linear Algebra, Wellesley-Cambridge Press, 2016

    • J. A. Gubner, Probability and Random Processes for Electrical and Computer Engineers, Cambridge University Press, 2006

    • S. L. Miller and D. Childers, Probability and Random Processes: With Applications to Signal Processing and Communications

    • A. Papoulis and S. U. Pillai, Probability, Random Variables, and Stochastic Processes

    • Class notes will be provided to supplement these readings.

  • Office Hours and Contact Information

    • Instructor: Zubair Khalid (zubair.khalid@lums.edu.pk), Office hours: Tuesday and Thursday, 3-4 pm

    • Lead Teaching Assistant: Muhammad Salaar Arif Khan (muhammad.salaar@lums.edu.pk), Office hours: Thursday, 5-6 pm

    • Teaching Assistant: Fatima Abaid (25100139@lums.edu.pk), Office hours: Thursday, 12-1 pm

    • Teaching Assistant: Muhammad Ibrahim Farrukh (25100227@lums.edu.pk), Office hours: Sunday, 5-6 pm

    • Teaching Assistant: Umer Raja (26100063@lums.edu.pk), Office hours: Tuesday, 6-7 pm

    • Teaching Assistant: Muhammad Ayyan Ahmed (25100163@lums.edu.pk), Office hours: Saturday, 5-6 pm

Grading Distribution

  • Homeworks, 25%

  • Quizzes (one per week), 20%

  • Class Participation (CP), 5%

  • Midterm Exam, 20%

  • Final Exam, 30%

Homeworks

Homework Solutions

  • Homework 01 Solutions
  • Homework 02 Solutions
  • Homework 03 Solutions
  • Homework 04 Solutions
  • Homework 05 Solutions

Quizzes

Quiz Solutions

  • Quiz 01 Solutions
  • Quiz 02 Solutions
  • Quiz 03 Solutions
  • Quiz 04 Solutions
  • Quiz 05 Solutions

Schedule

  • Week 01 (Notes)

    • Course Introduction

    • Overview of AI and ML (Brief)

  • Week 02 (Notes)

    • Operations on vectors: Addition, Scaling, Linear Combination, Affine and Convex Combination, Norm, Distance, Angle

    • Linear Independence, Span, Basis, Orthonormal vectors

  • Week 03 (Notes)

    • Gram-Schmidt orthogonalization (See Week 02 notes)

    • Vector space and subspaces (See Week 02 notes)

    • Matrices

    • Matrix-vector product

  • Week 04 (Notes)

    • Linear system of equations (See Week 03 notes)

    • Matrix Inverses and Pseudo-inverse (See Week 03 notes)

    • Supervised Learning Overview

    • kNN Algorithm for Classification

  • Week 05a (Notes)

    • Classifier Performance Evaluation: Confusion Matrix, ROC, AUC, F1-Score

  • Week 05b (Notes)

    • Regression Set-up

    • Linear Regression

  • Week 06a (See Week 05b notes)

    • Linear Regression Solution

    • Polynomial Regression

    • Underfitting/Overfitting

    • Regularization

  • Week 06b (Notes)

    • Eigenvalue Decomposition (EVD) Overview

  • Week 07a (Notes)

    • Eigenvalue Decomposition (EVD) Example

    • Singular Value Decomposition (SVD)

  • Week 07b (Notes)

    • Principal Component Analysis