This is an evolving note; new concepts will be added over time.

- Dot Product
- the dot product of two vectors can be seen as linearly projecting one onto the 1D line defined by the other
- $u \cdot v = 0$ if and only if $u$ and $v$ are orthogonal to each other
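Both facts can be checked numerically; a minimal NumPy sketch (the vectors are my own examples, not from the note):

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([2.0, 0.0])

# Dot product as projection: u . v = (scalar projection of u onto v) * |v|
proj_len = np.dot(u, v) / np.linalg.norm(v)
assert np.isclose(np.dot(u, v), proj_len * np.linalg.norm(v))

# Orthogonality: the dot product is zero iff the vectors are perpendicular
w = np.array([0.0, 5.0])  # perpendicular to v
print(np.dot(v, w))  # -> 0.0
```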

- Matrix Multiplication
- Transpose
- Vector Differentiation
- Determinants
- the determinant of two 2D vectors is the signed area of the parallelogram formed by these two vectors
- determinant of three 3D vectors is the signed volume of the parallelepiped formed by these three vectors
- if the determinant of a matrix $A$ is 0, then $A$ is singular. Some more properties of the determinant: $\det(AB) = \det(A)\det(B)$, $\det(A^\top) = \det(A)$, and $\det(A^{-1}) = 1/\det(A)$ when $A$ is invertible
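These determinant facts are easy to verify with NumPy; a small sketch with my own example matrices:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
# Signed area of the parallelogram spanned by columns (2,0) and (0,3)
print(np.linalg.det(A))  # 2 * 3 = 6

# Swapping the two column vectors flips the sign of the area
B = A[:, ::-1]
print(np.linalg.det(B))  # -6

# Linearly dependent columns -> zero determinant -> singular matrix
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(S))  # 0 (up to floating-point error)
```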

- Norms
- $p$-norm ($\ell_p$-norm): for $p \ge 1$ and $x \in \mathbb{R}^n$, $\|x\|_p = \left( \sum_{i=1}^{n} |x_i|^p \right)^{1/p}$
- Uniform norm (sup norm): for $x \in \mathbb{R}^n$, $\|x\|_\infty = \max_i |x_i|$
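NumPy's `np.linalg.norm` computes these norms directly via its `ord` parameter; a quick sketch:

```python
import numpy as np

x = np.array([3.0, -4.0])

print(np.linalg.norm(x, ord=1))       # 1-norm: |3| + |-4| = 7
print(np.linalg.norm(x, ord=2))       # 2-norm: sqrt(9 + 16) = 5
print(np.linalg.norm(x, ord=np.inf))  # sup norm: max(|3|, |-4|) = 4
```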

- Rank of a matrix (for a real matrix $A \in \mathbb{R}^{m \times n}$)
- Symmetric Matrices
- A symmetric matrix $A$ is positive semidefinite if $x^\top A x \ge 0$ for all vectors $x$
- $A$ is positive semidefinite iff all eigenvalues of $A$ are nonnegative
- for any matrix $A$, $A^\top A$ is positive semidefinite
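A small NumPy check of these claims, assuming the standard definition $x^\top A x \ge 0$ (the example matrix is mine):

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A = B.T @ B  # B^T B is symmetric positive semidefinite for any B

# Definition check: x^T A x >= 0 for many random x
rng = np.random.default_rng(0)
for _ in range(100):
    x = rng.standard_normal(2)
    assert x @ A @ x >= -1e-12  # tiny tolerance for floating point

# Equivalent check: all eigenvalues of a symmetric PSD matrix are >= 0
print(np.linalg.eigvalsh(A))  # both eigenvalues are nonnegative
```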

- Trace (for a square matrix $A$)
- $\operatorname{tr}(A) = \sum_{i=1}^{n} a_{ii}$, where $a_{ii}$ is the $i$-th diagonal entry of $A$
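A quick sketch; the identity $\operatorname{tr}(AB) = \operatorname{tr}(BA)$ is a standard extra fact, not from the note:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(np.trace(A))  # sum of diagonal entries: 1 + 4 = 5

# A standard identity: tr(AB) = tr(BA)
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.trace(A @ B), np.trace(B @ A))  # equal
```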

- Linear Transformation
- Invertibility of a matrix
- A square matrix is invertible iff it is full rank

- Orthogonal Matrix
- “an orthogonal matrix is a square matrix with real entries whose columns and rows are orthogonal unit vectors” (from wikipedia)
- “The rows of an orthogonal matrix are an orthonormal basis. That is, each row has length one, and are mutually perpendicular. Similarly, the columns are also an orthonormal basis. In fact, given any orthonormal basis, the matrix whose rows are that basis is an orthogonal matrix. It is automatically the case that the columns are another orthonormal basis.” (from Wolfram MathWorld)
- Let $U$ and $V$ be orthogonal matrices:
- $U^\top U = U U^\top = I$, i.e. $U^{-1} = U^\top$
- $\det(U) = \pm 1$ (when it equals $+1$, $U$ is a rotation matrix; o.w. $U$ is a reflection matrix)
- $UV$ and $U^\top$ are both orthogonal matrices
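These properties can be verified numerically; a sketch using a 2D rotation and a reflection (my own examples):

```python
import numpy as np

theta = np.pi / 4
# Rotation by theta: orthogonal with determinant +1
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(U.T @ U, np.eye(2)))  # True: columns are orthonormal
print(np.linalg.det(U))                 # ~ +1 -> rotation

# Reflection across the x-axis: orthogonal with determinant -1
R = np.array([[1.0, 0.0],
              [0.0, -1.0]])
print(np.linalg.det(R))  # -1 -> reflection

# Products and transposes of orthogonal matrices are orthogonal
Q = U @ R
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
```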

- Eigenvectors and Eigenvalues
- when a transformation only scales or reverses a vector but doesn’t change its direction (other than reversing it), we say the vector is an eigenvector of the transformation
- $Av = \lambda v$, where $\lambda$ is the eigenvalue associated with the eigenvector $v$
- when we assume there is at least one eigenvector, we can use this equation to find it: $\det(A - \lambda I) = 0$
- eigenvalue decomposition
- Definition: “Let $P$ be a matrix of eigenvectors of a given square matrix $A$ and $D$ be a diagonal matrix with the corresponding eigenvalues on the diagonal. Then, as long as $P$ is a square matrix, $A$ can be written as an eigen decomposition $A = PDP^{-1}$. Furthermore, if $A$ is symmetric, then the columns of $P$ are orthogonal vectors.” (from Wolfram MathWorld)
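A sketch of the decomposition with NumPy, using a symmetric example matrix of my own (`np.linalg.eigh` is the routine for symmetric/Hermitian matrices):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric -> orthogonal eigenvectors

eigvals, P = np.linalg.eigh(A)  # columns of P are eigenvectors
D = np.diag(eigvals)

# A = P D P^{-1}; since A is symmetric, P is orthogonal and P^{-1} = P^T
print(np.allclose(A, P @ D @ P.T))  # True
# The eigenvector equation A v = lambda v, checked one column at a time
print(np.allclose(A @ P[:, 0], eigvals[0] * P[:, 0]))  # True
```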

- Properties

- Singular Value Decomposition
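This section has no notes yet; as a starting point, a minimal NumPy sketch of the decomposition $A = U \Sigma V^\top$ (example matrix mine):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# Thin SVD: A = U @ diag(s) @ Vt, with orthonormal columns in U and rows in Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s)  # singular values, sorted in decreasing order, all >= 0

# Reconstruction check
print(np.allclose(A, U @ np.diag(s) @ Vt))  # True
```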