CISC271, Linear Data Analysis: Lectures

Description:

These lectures are approximately aligned with the classes in the course notes. The notes and the videos may differ slightly, because both evolve over time.

Prerequisite material is reviewed in a series of videos available under the "Prerequisites" topic in the navigation bar to the left of this text.

The lectures were produced using technology that is described in this video:
https://youtu.be/ltOxgb28ZKY

Week 1
1 Class #01 Introduction To Linear Data Analysis
1a Course Overview
1b Course Organization
1c Matrix Columns
1d Eigenfacts
2 Class #02 The Graph Adjacency Matrix
2a Introduction To Graphs
2b Relevant Definitions For Graphs
2c The Adjacency Matrix
2d Non-Bipartite Graphs
3 Class #03 A Graph Laplacian Matrix
3a The Degree Matrix
3b A Laplacian Matrix
3c Properties Of A Laplacian Matrix
3d The Fiedler Vector
Week 2
4 Class #04 Vector Spaces
4a Introduction To Vector Spaces
4b Block Partitioning A Matrix
4c Vector-Space Properties
4d The Column Space
4e The Null Space
5 Class #05 Spanning Sets And Basis Vectors
5a Introduction To Vector Spaces
5b Relevant Definitions For Basis Vectors
5c Basis Vectors For A Column Space
5d Orthogonal Subspaces
6 Class #06 Diagonalizable Matrices
6a Similar Matrices
6b Diagonalizability
6c Examples Of Diagonalizability
6d The Matrix Square Root
Week 3
7 Class #07 Normal Matrices And Spectral Decomposition
7a Real Normal Matrices
7b Real Orthogonal Matrices
7c Real Symmetric Matrices
7d The Spectral Theorem
8 Class #08 Positive [Semi-]Definite Matrices
8a Positive Definite Matrices
8b The Quadratic Form
8c Mean And Variance Of Data
8d The Covariance Matrix
9 Class #09 Design Matrix And Standardized Data
9a Variables And Observations
9b Variables As Vectors
9c Zero-Mean Data
9d Unit-Variance Data
9e Standardized Data For Regression
9f Measuring Standard Deviations
Week 4
10 Test #1 Basic Linear Analysis
11 Class #11 Orthogonal Projection
11a Concepts In Orthogonal Projection
11b Projecting A Vector To A Vector
11c Projecting A Vector To A Vector Space
11d The Normal Equation
11e Overdetermined Linear Equations
12 Class #12 Patterns - Linear Regression
12a Concepts In Statistical Regression
12b Concepts In Linear Regression
12c Examples Of Linear Regression
12d Data Standardization For Linear Regression
12e Residual Error In Linear Regression
Week 5
13 Class #13 Cross-Validating Linear Regression
13a Validation Of Linear Regression
13b Training Sets And Testing Sets
13c K-Fold Cross-Validation Of Linear Regression
13d Examples Of 5-Fold Cross-Validation
14 Class #14 Singular Value Decomposition, Or SVD
14a Introduction To The SVD
14b The Left-Transpose Product
14c The Right-Transpose Product
14d Structure Of The SVD
15 Class #15 Orthonormal Basis Vectors And The SVD
15a I Did Not Shoot The SVD
15b Examples Of The SVD
15c Matrix Spaces And The SVD
15d The Null Space And The SVD
15e Orthonormal Basis Vectors And The SVD
Week 6
16 Test #2 Matrices And Linear Regression
17 Class #17 Matrix Approximation
17a Matrix Norms
17b L2 Matrix Norm And Frobenius Matrix Norm
17c Matrix Series From The SVD
17d Low-Rank Approximations
17e Scree Plot Of Singular Values
18 Class #18 Principal Components Analysis, Or PCA
18a Introduction To PCA
18b PCA From Covariance Matrix
18c PCA As Spectral Decomposition
18d Computing Data Scores Using PCA
Reading Week
Holiday Special
Week 7
19 Class #19 Unsupervised Learning - K-Means Clustering
19a A Conceptual Hierarchy Of Machine Learning
19b Hyperplane Of Separation
19c Basics Of Vector Clustering
19d A K-Means Clustering Algorithm
19e Clustering The Iris Data
20 Class #20 Classification - Linear Separability
20a Separating Two Clusters
20b A Hyperplane From Cluster Centroids
20c Hyperplanes For Multiple Clusters
20d The Davies-Bouldin Index For Clusters
21 Class #21 PCA - Matrix Algebra And Dimensionality Reduction
21a Revisiting PCA
21b The Scatter Matrix Of Variables For PCA
21c PCA As Matrix Approximation
21d PCA As Dimensionality Reduction
Week 8
22 Test #3 The SVD, PCA, And Dimensionality Reduction
23 Class #23 PCA And The Rayleigh Quotient
23a PCA, Scores, And The SVD
23b PCA Maximizes A Linear Transformation
23c The Rayleigh Quotient And PCA
24 Class #24 Patterns - Linear Discriminant Analysis, Or LDA
24a Finding Patterns In Labeled Data
24b Example Of PCA For Labeled Data
24c Fisher's Linear Discriminant, Or LDA
24d Example Of LDA For Labeled Data
24e Example Of LDA For Iris Data
Week 9
25 Class #25 Classification - Assessment With Confusion Matrix
25a Data Labels As Dependent Variables
25b Confusion Matrix For Binary Labels
25c Example Of Confusion Matrix
26 Class #26 Classification - Assessment With ROC Curve
26a Receiver Operating Characteristic, Or ROC
26b ROC And The Confusion Matrix
26c Example ROC Curve For Fictitious Virus
27 Class #27 Classification - Single Artificial Neuron
27a Artificial Neurons
27b Data Flow And Computations For Neurons
27c Hyperplane Separation For Neurons
Week 10
28 Test #4 LDA, Assessment, Odds Of Occurrence
29 Class #29 Odds Of Occurrence And Probability
29a Odds Of Hyperplane Classification
29b Odds And Probability
29c The Logistic Function For Odds
29d Properties Of The Logistic Function
30 Class #30 Supervised Learning - Perceptron Rule
30a Perceptron Model Of Neurons
30b Deriving The Perceptron Rule
30c Pseudocode For The Perceptron Rule
30d Perceptron Rule For Iris Data
Week 11
31 Class #31 Classification - Logistic Regression
31a Shortcomings Of Perceptrons
31b Scores From Logistic Activation
31c Residual Error Of Scores
31d Logistic Regression For Iris Data
32 Class #32 Nonlinear Separation - Embeddings And Gram Matrix
32a Some Data Are Nonlinearly Separable
32b Embedding A Vector Space
32c Gram Matrix For An Embedding
33 Class #33 Nonlinear Separation - Kernel PCA
33a High-Dimensional PCA
33b Scatter Matrix Of Observations
33c Kernel PCA Using The Gram Matrix
33d Kernel PCA For Iris Data
Week 12
34 Test #5 Machine Learning
35 Class #35 Spectral Clustering Of Data
35a Fiedler Vector And Spectral Decomposition
35b Spectral Clustering Using Eigenvectors
35c Distance Matrix And Spectral Clustering
36 Class #36 The Curse Of Dimensionality
36a The Curse Of Dimensionality
36b Dimensionality And Hypercube Vertices
36c Dimensionality And Uniform Distributions
36d Dimensionality And Gaussian Distributions
37 Class #37 Course Summary
37a Course Summary
37b Unplanned Recording Events
Extra Material
Table of Contents
References

Queen's University is situated on traditional Anishinaabe and Haudenosaunee Territory.
