Section 15 Weeks 7-8 Learning Goals
Here are the knowledge and skills you should master by the end of week 8.
15.1 Orthogonality and SVD
I should be able to do the following tasks (code sketches illustrating several of them follow the list):
- Find the length of a vector
- Find the distance between two vectors
- Normalize a vector
- Find the cosine of the angle between two vectors
- Find the orthogonal projection of one vector onto another
- Find the orthogonal projection of one vector onto a subspace (using an orthogonal basis)
- Find the orthogonal complement of a subspace
- Find the least squares approximation for an inconsistent system
- Formulate a curve fitting problem as an inconsistent linear system \(A \mathsf{x} = \mathsf{b}\)
- Orthogonally diagonalize a symmetric matrix as \(A=PDP^{\top}\)
- Find the spectral decomposition \(A = \lambda_1 \mathsf{v}_1 \mathsf{v}_1^{\top} + \lambda_2 \mathsf{v}_2 \mathsf{v}_2^{\top} + \cdots + \lambda_n \mathsf{v}_n \mathsf{v}_n^{\top}\) of a symmetric matrix \(A\)
- Use an orthogonal diagonalization to find the best rank \(k\) approximation of a symmetric matrix \(A\)
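The sketches below are minimal illustrations of several tasks above; NumPy and the specific example vectors are my own choices, not part of the course materials. First, lengths, distances, normalization, angles, and orthogonal projections:

```python
import numpy as np

v = np.array([3.0, 4.0, 12.0])
w = np.array([1.0, 0.0, 2.0])

length_v = np.linalg.norm(v)       # ||v|| = sqrt(v . v) = 13
dist_vw = np.linalg.norm(v - w)    # distance between v and w
unit_v = v / length_v              # normalize: unit vector in the direction of v
cos_vw = (v @ w) / (np.linalg.norm(v) * np.linalg.norm(w))  # cosine of the angle

# Orthogonal projection of v onto w: proj_w(v) = (v.w / w.w) w
proj_onto_w = (v @ w) / (w @ w) * w

# Orthogonal projection onto a subspace W, given an ORTHOGONAL basis {u1, u2}:
# proj_W(v) = (v.u1 / u1.u1) u1 + (v.u2 / u2.u2) u2
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])    # u1 . u2 = 0, so the basis is orthogonal
proj_onto_W = (v @ u1) / (u1 @ u1) * u1 + (v @ u2) / (u2 @ u2) * u2

# The difference v - proj_W(v) lies in the orthogonal complement of W.
r = v - proj_onto_W
assert abs(r @ u1) < 1e-12 and abs(r @ u2) < 1e-12
```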
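Next, curve fitting posed as an inconsistent system and solved by least squares (the data points are made up for illustration):

```python
import numpy as np

# Fit a line y = c0 + c1*t through (0,1), (1,2), (2,2), (3,4).
# Each point gives one equation c0 + c1*t_i = y_i, i.e. A x = b
# with unknowns x = (c0, c1); four equations in two unknowns is inconsistent.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0, 4.0])
A = np.column_stack([np.ones_like(t), t])

# Least squares solution via the normal equations A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # here (0.9, 0.9)

# The residual vector b - A x_hat measures the quality of the fit;
# it is orthogonal to the column space of A.
residual = b - A @ x_hat
assert np.allclose(A.T @ residual, 0)
```

`np.linalg.lstsq(A, b, rcond=None)` returns the same least squares solution directly.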
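Finally, orthogonal diagonalization, the spectral decomposition, and a best rank \(k\) approximation of a symmetric matrix:

```python
import numpy as np

A = np.array([[3.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 3.0]])   # symmetric, eigenvalues 2, 2, 5

# np.linalg.eigh handles symmetric matrices: the columns of P are
# orthonormal eigenvectors, so A = P D P^T with P orthogonal.
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)
assert np.allclose(A, P @ D @ P.T)
assert np.allclose(P.T @ P, np.eye(3))

# Spectral decomposition: A = sum of lambda_i * (v_i v_i^T),
# a sum of rank-1 outer products.
A_rebuilt = sum(lam * np.outer(v, v) for lam, v in zip(eigvals, P.T))
assert np.allclose(A, A_rebuilt)

# Best rank-k approximation: keep the k terms with largest |lambda_i|.
k = 1
order = np.argsort(-np.abs(eigvals))
A_k = sum(eigvals[i] * np.outer(P[:, i], P[:, i]) for i in order[:k])
```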
15.2 Vocabulary
I should know and be able to use and explain the following terms or properties; a short sketch after the list contrasts the dot and outer products.
- dot product of two vectors \(\mathsf{v} \cdot \mathsf{w} = \mathsf{v}^{\top} \mathsf{w}\) (aka scalar product, inner product)
- length (magnitude) of a vector
- angle between vectors
- normalize
- unit vector
- orthogonal vectors
- orthogonal complement of a subspace
- orthogonal projection
- orthogonal basis
- orthonormal basis
- normal equations for a least squares approximation
- least squares solution
- residual vector
- symmetric matrix
- orthogonally diagonalizable
- outer product of two vectors \(\mathsf{v} \, \mathsf{w}^{\top}\)
- spectral decomposition of a symmetric matrix
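To keep the two products straight (the tooling and example vectors are again my own choices): the dot product \(\mathsf{v}^{\top} \mathsf{w}\) is a scalar, while the outer product \(\mathsf{v}\, \mathsf{w}^{\top}\) is a matrix:

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

dot = v @ w             # v^T w: a scalar (32.0)
outer = np.outer(v, w)  # v w^T: a 3x3 matrix, every row a multiple of w
assert np.linalg.matrix_rank(outer) == 1   # rank 1, since v and w are nonzero
```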
15.3 Conceptual Thinking
I should understand and be able to explain the following concepts (a quick verification sketch follows the list):
- The dot product gives an algebraic encoding of the geometry (lengths and angles) of \(\mathbb{R}^n\)
- If two vectors are orthogonal, then they are perpendicular, or one of them is the zero vector
- An orthogonal projection is a linear transformation
- The row space of a matrix is the orthogonal complement of its nullspace
- The inverse of an orthogonal matrix \(A\) is its transpose \(A^{\top}\)
- Cosine similarity is a useful way to compare vectors, especially in high-dimensional vector spaces
- The residual vector measures the quality of fit of a least squares solution
- The outer product \(\mathsf{v}\, \mathsf{w}^{\top}\) of two nonzero vectors in \(\mathbb{R}^n\) is an \(n \times n\) matrix with rank 1
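Several of these facts lend themselves to a quick numerical check; the sketch below (the example matrices are mine) verifies two of them:

```python
import numpy as np

# An orthogonal matrix satisfies Q^T Q = I, so its inverse is Q^T.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation matrix
assert np.allclose(Q.T, np.linalg.inv(Q))

# Row space orthogonal to nullspace: if A x = 0, then every row of A
# dots with x to give 0, which is exactly what A x = 0 says row by row.
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])
x = np.array([1.0, -1.0, 1.0])                    # x is in Nul(A)
assert np.allclose(A @ x, 0)
assert all(abs(row @ x) < 1e-12 for row in A)
```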