CISC271, Linear Data Analysis: Lectures

Description:

These lectures are approximately aligned with the classes in the course notes. There may be differences between the notes and the videos because both evolve over time.

Prerequisite material is reviewed in a series of videos available under the "Prerequisites" topic in the navigation bar, to the left of this text.

The lectures were produced using technology that is described in this video:
https://youtu.be/ltOxgb28ZKY

No.  Title
Week 1
1 Class #01 Introduction To Linear Methods for AI
1a Course Overview
2 Class #02 Eigenvalues, Eigenvectors
2a Matrix Columns
2b Eigenfacts
3 Class #03 Graphs: Adjacency Matrix and Laplacian Matrix
3a Introduction To Graphs
3b Relevant Definitions For Graphs
3c The Adjacency Matrix
3d Non-Bipartite Graphs
3e The Degree Matrix
3f A Laplacian Matrix
3g Properties Of A Laplacian Matrix
3h The Fiedler Vector
Week 2
4 Class #04 Vector Spaces
4a Introduction To Vector Spaces
4b Block Partitioning A Matrix
4c Vector-Space Properties
4d The Column Space
4e The Null Space
5 Class #05 Spanning Sets And Basis Vectors
5a Introduction To Vector Spaces
5b Relevant Definitions For Basis Vectors
5c Basis Vectors For A Column Space
5d Orthogonal Subspaces
6 Quiz #1 Matrix Properties and Vector Spaces
Week 3
7 Class #07 Diagonalizable Matrices
7a Similar Matrices
7b Diagonalizability
7c Examples of Diagonalizability
7d The Matrix Square Root
8 Class #08 Spectral Decomposition and Positive [Semi-]Definite Matrices
8a Real Normal Matrices
8b Real Orthogonal Matrices
8c Real Symmetric Matrices
8d The Spectral Theorem
8e Positive Definite Matrices
8f The Quadratic Form
8g Mean And Variance Of Data
8h The Covariance Matrix
9 Test #1 Matrix Properties and Vector Spaces
Week 4
10 Class #10 Design Matrix And Standardized Data
10a Variables And Observations
10b Variables As Vectors
10c Zero-Mean Data
10d Unit-Variance Data
10e Standardized Data For Regression
10f Measuring Standard Deviations
11 Class #11 Orthogonal Projection
11a Concepts In Orthogonal Projection
11b Projecting A Vector To A Vector
11c Projecting A Vector To A Vector Space
11d The Normal Equation
11e Overdetermined Linear Equations
12 Class #12 Patterns - Linear Regression
12a Concepts In Statistical Regression
12b Concepts In Linear Regression
12c Examples Of Linear Regression
12d Data Standardization For Linear Regression
12e Residual Error In Linear Regression
Week 5
13 Class #13 Cross-Validating Linear Regression
13a Validation Of Linear Regression
13b Training Sets And Testing Sets
13c K-Fold Cross-Validation Of Linear Regression
13d Examples Of 5-Fold Cross-Validation
14 Class #14 Singular Value Decomposition, Or SVD
14a Introduction To The SVD
14b The Left-Transpose Product
14c The Right-Transpose Product
14d Structure Of The SVD
15 Quiz #2 Matrices and Linear Regression
Week 6
16 Class #16 Orthonormal Basis Vectors And The SVD
16a I Did Not Shoot The SVD
16b Examples Of The SVD
16c Matrix Spaces And The SVD
16d The Null Space And The SVD
16e Orthonormal Basis Vectors And The SVD
17 Class #17 Principal Components Analysis, Or PCA
17a Introduction To PCA
17b PCA From Covariance Matrix
17c PCA As Spectral Decomposition
17d Computing Data Scores Using PCA
17e Matrix Norms
17f L2 Matrix Norm And Frobenius Matrix Norm
17g Matrix Series From The SVD
17h Scree Plot Of Singular Values
18 Test #2 Matrices and Linear Regression
Reading Week
Holiday Special
Week 7
19 Class #19 PCA - Algebra, Dimensionality Reduction
19a Revisiting PCA
19b The Scatter Matrix Of Variables For PCA
19c PCA As Matrix Approximation
19d PCA As Dimensionality Reduction
19e Low-Rank Approximations
20 Class #20 Unsupervised Learning - K-Means Clustering
20a A Conceptual Hierarchy Of Machine Learning
20b Hyperplane of Separation
20c Basics Of Vector Clustering
20d A K-Means Clustering Algorithm
20e Clustering The Iris Data
21 Quiz #3 The SVD, PCA, And Dimensionality Reduction
Week 8
22 Class #22 Classification - Linear Separability
22a Separating Two Clusters
22b A Hyperplane From Cluster Centroids
22c Hyperplanes For Multiple Clusters
22d The Davies-Bouldin Index For Clusters
23 Class #23 Classification - Assessment With Confusion Matrix
23a Data Labels As Dependent Variables
23b Confusion Matrix For Binary Labels
23c Example Of Confusion Matrix
24 Test #3 The SVD, PCA, And Dimensionality Reduction
Week 9
25 Class #25 Classification - Assessment With ROC Curve
25a Receiver Operating Characteristic, Or ROC
25b ROC And The Confusion Matrix
25c Example ROC Curve For Fictitious Virus
26 Class #26 Odds Of Occurrence And Probability
26a Odds Of Hyperplane Classification
26b Odds And Probability
26c The Logistic Function For Odds
26d Properties Of The Logistic Function
27 Quiz #4 Classification Assessment, Odds Of Occurrence
Week 10
28 Class #28 Elementary Numerical Optimization
28a Stationary Points
28b Iteration With Steepest Descent
29 Class #29 Artificial Neuron - Learning Weights
29a Artificial Neuron - Simple Model
29b Data Flow And Computations For Neurons
29c Hyperplane Separation For Neurons
29d Steepest Descent For Artificial Neurons
29e Hyperplane Classification Of Iris Data
30 Test #4 Classification Assessment, Odds Of Occurrence
Week 11
31 Class #31 Classification - Logistic Regression
31a Shortcomings Of Perceptrons
31b Scores From Logistic Activation
31c Residual Error Of Scores
31d Logistic Regression For Iris Data
32 Class #32 Nonlinear Separation - Embeddings, Gram Matrix
32a Some Data Are Nonlinearly Separable
32b Embedding A Vector Space
32c Gram Matrix For An Embedding
33 Class #33 Nonlinear Separation - Kernel PCA
33a High-Dimensional PCA
33b Scatter Matrix Of Observations
33c Kernel PCA Using The Gram Matrix
33d Kernel PCA For Iris Data
Week 12
34 Class #34 Spectral Clustering Of Data
34a Fiedler Vector And Spectral Decomposition
34b Spectral Clustering Using Eigenvectors
34c Distance Matrix And Spectral Clustering
35 Class #35 Course Summary
35a Course Summary
Unplanned Recording Events
Extra Material
Table of Contents
References


Queen's University is situated on the territory of the Haudenosaunee and Anishinaabek.

Ne Queen's University e'tho nón:we nikanónhsote tsi nón:we ne Haudenosaunee táhnon Anishinaabek tehatihsnonhsáhere ne onhwéntsya.

Gimaakwe Gchi-gkinoomaagegamig atemagad Naadowe miinwaa Anishinaabe aking.

