Matrices and vector spaces

Review: Vector algebra

  • Def: Hilbert space: an inner-product vector space that is complete with respect to the norm induced by its inner product.
  • Def: Linearly dependent: the vectors $v_1, \dots, v_n$ are linearly dependent if there exist scalars $c_1, \dots, c_n$, NOT ALL ZERO, such that $c_1 v_1 + \cdots + c_n v_n = 0$.
  • Linearly independent: $c_1 v_1 + \cdots + c_n v_n = 0$ holds only when every $c_i = 0$.

Inner Product

  • Def: $\langle a | b \rangle$: a scalar-valued product of two vectors satisfying $\langle a | b \rangle = \langle b | a \rangle^*$.

  • Properties: $\langle a | \lambda b + \mu c \rangle = \lambda \langle a | b \rangle + \mu \langle a | c \rangle$, and $\langle a | a \rangle \ge 0$ with equality only for $a = 0$.
  • Def: norm: $\|a\| = \sqrt{\langle a | a \rangle}$.

  • Orthogonal: $a$ and $b$ are orthogonal if $\langle a | b \rangle = 0$.

  • Kronecker delta symbol: $\delta_{ij} = 1$ if $i = j$ and $\delta_{ij} = 0$ otherwise; an orthonormal basis satisfies $\langle e_i | e_j \rangle = \delta_{ij}$.

Some useful inequalities

  • Schwarz’s inequality: $|\langle a | b \rangle| \le \|a\| \, \|b\|$,

    where the equality holds when a is a scalar multiple of b, i.e. when $a = \lambda b$.

    Proof: (p. 246)

    Hint: expand $\langle a - \lambda b \,|\, a - \lambda b \rangle \ge 0$ and pick the $\lambda$ that makes the bound tightest; a sketch follows this list.

  • The triangle inequality: $\|a + b\| \le \|a\| + \|b\|$.

    Hint: same technique as for Schwarz’s inequality (square both sides).

  • Bessel’s inequality: for an orthonormal set $\{e_i\}$, $\|a\|^2 \ge \sum_i |\langle e_i | a \rangle|^2$;

    equality holds when the sum runs over a complete basis.

    Proof: (p.247)

    Hint: measure the residual, i.e. expand $\big\| a - \sum_i \langle e_i | a \rangle e_i \big\|^2 \ge 0$.

  • The parallelogram equality: $\|a + b\|^2 + \|a - b\|^2 = 2\left(\|a\|^2 + \|b\|^2\right)$.

    Proof: by definition of the norm, expand both terms on the left-hand side.
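
The Schwarz proof hinted at above, reconstructed as a minimal sketch (the standard argument; $\lambda$ is the free scalar from the hint):

```latex
% For any scalar \lambda, non-negativity of the norm gives
0 \le \langle a - \lambda b \,|\, a - \lambda b \rangle
    = \|a\|^2 - \lambda \langle a | b \rangle - \lambda^* \langle b | a \rangle + |\lambda|^2 \|b\|^2 .
% Choosing \lambda = \langle b | a \rangle / \|b\|^2 (for b \ne 0) yields
0 \le \|a\|^2 - \frac{|\langle a | b \rangle|^2}{\|b\|^2}
\quad\Longrightarrow\quad
|\langle a | b \rangle| \le \|a\| \, \|b\| .
```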

Basic matrix algebra

  • Def: a matrix arises from a linear transformation: $y = A x$, with components $y_i = \sum_j A_{ij} x_j$ once a basis is chosen.

Example: –

  • Def: Null matrix: the matrix with all entries zero, $A_{ij} = 0$.

  • Def: identity matrix: $I_{ij} = \delta_{ij}$.

  • Def: Function of matrix: defined through a power series in $A$.

    Example: Taylor expansion, $e^A = \sum_{n=0}^{\infty} \frac{A^n}{n!}$ (a numeric sketch follows at the end of this section).

  • Def: Transpose of matrix: $(A^T)_{ij} = A_{ji}$.

  • Def: Complex conjugate: $(A^*)_{ij} = (A_{ij})^*$.

For a square matrix $A$:

  • Def: Hermitian conjugate: $A^\dagger = (A^T)^* = (A^*)^T$.

  • Def: Trace: $\operatorname{Tr} A = \sum_i A_{ii}$.

    Properties: $\operatorname{Tr}(A + B) = \operatorname{Tr} A + \operatorname{Tr} B$ and $\operatorname{Tr}(AB) = \operatorname{Tr}(BA)$.
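
To make the Taylor-expansion definition concrete, here is a short Python sketch (an illustrative example, not from the source) that approximates $e^A$ by a truncated series and checks it against SciPy's `expm`:

```python
import numpy as np
from scipy.linalg import expm

def matrix_exp_taylor(A, terms=30):
    """Approximate e^A by the truncated Taylor series sum_{n < terms} A^n / n!."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for n in range(1, terms):
        term = term @ A / n            # builds A^n / n! incrementally
        result = result + term
    return result

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])            # generator of 2D rotations

print(np.allclose(matrix_exp_taylor(A), expm(A)))   # True
```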

Determinant

  • Def: via cofactor (Laplace) expansion: $\det A = \sum_j A_{ij} C_{ij}$, where the cofactor $C_{ij} = (-1)^{i+j} M_{ij}$ and the minor $M_{ij}$ is the determinant of the matrix left after removing row $i$ and column $j$.

    Properties: $\det(AB) = \det A \, \det B$, $\det A^T = \det A$, $\det(\lambda A) = \lambda^N \det A$, and exchanging two rows (or columns) changes the sign.

The inverse of a matrix

The determinant tells us whether the inverse exists, and it appears in the formula for the inverse.

  • Def: For square matrix A, if $\det A = 0$, we say A is singular

If $A$, $B$ are non-singular, then $AB$ is non-singular and $(AB)^{-1} = B^{-1} A^{-1}$.

  • Def: $A^{-1}$: the matrix satisfying $A A^{-1} = A^{-1} A = I$.

Find $A^{-1}$:

use cofactors: $\left(A^{-1}\right)_{ij} = \frac{C_{ji}}{\det A}$, i.e. $A^{-1} = \frac{C^T}{\det A}$ with $C$ the matrix of cofactors.

Example: Find $A^{-1}$ for a given matrix $A$ (see the numeric sketch at the end of this section).

For a $2 \times 2$ matrix: $\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$.

Properties: $(A^{-1})^{-1} = A$, $(A^T)^{-1} = (A^{-1})^T$, $(A^\dagger)^{-1} = (A^{-1})^\dagger$, and $\det(A^{-1}) = 1/\det A$.
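
A numeric sketch of the cofactor construction (the $3 \times 3$ matrix below is an illustrative choice, not the one from the original example):

```python
import numpy as np

def inverse_via_cofactors(A):
    """Invert A using A^{-1} = C^T / det(A), where C is the cofactor matrix."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            # minor M_ij: delete row i and column j, then take the determinant
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T / np.linalg.det(A)

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(np.allclose(inverse_via_cofactors(A), np.linalg.inv(A)))   # True
```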

Rank of matrix

  • Def: R(A): the rank of $A$, written $R(A)$, is the number of linearly independent vectors in the set of its columns.

    Remark: If we write $A$ in terms of its rows instead, we get the same definition (row rank equals column rank).

  • Def: submatrix: any matrix formed from $A$ by ignoring one or more columns or rows.

  • Def: Rank(A): the size of the largest square submatrix of A whose determinant is not zero.
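A quick numeric check of the definition (an illustrative matrix whose third row is the sum of the first two, so $R(A) = 2$):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])           # row 3 = row 1 + row 2

print(np.linalg.matrix_rank(A))           # 2
print(np.isclose(np.linalg.det(A), 0.0))  # True: the full 3x3 determinant vanishes
```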

Orthogonal matrix

  • Def: $A^T A = I$, or $A^T = A^{-1}$.

  • Properties:

    1. $A$ is orthogonal $\Rightarrow$ $A^T$ (i.e. $A^{-1}$) is orthogonal.
  • Def: Hermitian: $A^\dagger = A$.

    Skew(anti)-hermitian: $A^\dagger = -A$.

  • Def: Unitary: $A^\dagger A = I$, i.e. $A^\dagger = A^{-1}$.

  • Def: Normal: $A A^\dagger = A^\dagger A$.
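
A small NumPy sanity check of these definitions (the matrices are illustrative):

```python
import numpy as np

theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])     # a rotation matrix: orthogonal

H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])                   # Hermitian by construction

print(np.allclose(Q.T @ Q, np.eye(2)))              # True: Q^T Q = I (orthogonal)
print(np.allclose(H, H.conj().T))                   # True: H^dagger = H (Hermitian)
print(np.allclose(H @ H.conj().T, H.conj().T @ H))  # True: Hermitian implies normal
```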

Eigenvalue problem

  • Def: For a square matrix $A$, if a non-zero vector $x$ satisfies $A x = \lambda x$ for some scalar $\lambda$, then $\lambda$ is called an eigenvalue.

    $(\lambda, x)$ is called an eigenpair of A.

    Equivalently, $A x = \lambda x$ or $(A - \lambda I) x = 0$.

    Remark: if $(\lambda, x)$ is an eigenpair, then $(\lambda, c x)$ for any scalar $c \ne 0$ is also an eigenpair.

Exercise: for $A x = \lambda x$, what is the eigenpair for $A^n$ or $A^{-1}$? (Answer: $(\lambda^n, x)$ and $(1/\lambda, x)$, respectively.)

For a normal matrix, $A x = \lambda x$ implies $A^\dagger x = \lambda^* x$.

If $\lambda_i \ne \lambda_j$, then the corresponding eigenvectors $x_i$ and $x_j$ are orthogonal.

Given $A$, find the eigenpairs $(\lambda, x)$ for A:

  1. Def: Characteristic equation: $\det(A - \lambda I) = 0$; solve it for the eigenvalues $\lambda$.

  2. By plugging each $\lambda$ into $(A - \lambda I) x = 0$, solve for the corresponding eigenvector $x$.
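
A worked sketch of this two-step recipe in Python (illustrative symmetric matrix; the null-space step is done with an SVD):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Step 1: characteristic equation det(A - lambda*I) = 0.
# For a 2x2 matrix this is lambda^2 - tr(A)*lambda + det(A) = 0.
lams = np.roots([1.0, -np.trace(A), np.linalg.det(A)])   # -> 3.0 and 1.0

# Step 2: plug each eigenvalue back in and solve (A - lambda*I) x = 0.
# The eigenvector spans the null space of (A - lambda*I).
for lam in lams:
    _, _, Vt = np.linalg.svd(A - lam * np.eye(2))
    x = Vt[-1]                                  # direction with singular value ~0
    print(lam, np.allclose(A @ x, lam * x))     # True for each eigenpair
```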

 

Similarity transformation

If we have the relation $y = A x$ under a given basis set, what is the relation under another basis set?

Let $S$ be the change-of-basis matrix; then $x = S x'$, $y = S y'$.

By substitution: $S y' = A S x'$, so $y' = A' x'$, where $A' = S^{-1} A S$.

Def: the above transformation is called a similarity transformation.

Properties:

  1. $A'$ has the same eigenvalues, determinant, and trace as $A$ (a numeric check follows below).
  2. If $S$ is a Unitary matrix, then $A' = S^\dagger A S$.
  3. If $S$ is unitary and $A$ is Hermitian/Unitary, then $A'$ is also Hermitian/Unitary.

Consequence: a unitary transformation transfers one orthonormal basis to another orthonormal basis.
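
A short numeric check of property 1 (illustrative random matrices; a generic $S$ is invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))

Ap = np.linalg.inv(S) @ A @ S        # similarity transformation A' = S^{-1} A S
print(np.isclose(np.linalg.det(Ap), np.linalg.det(A)))   # True
print(np.isclose(np.trace(Ap), np.trace(A)))             # True
print(np.allclose(np.sort(np.linalg.eigvals(Ap)),
                  np.sort(np.linalg.eigvals(A))))        # True: same eigenvalues
```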

Diagonalisation of a matrix

Given a matrix A, if we construct the matrix C that has the eigenvectors of A as its columns, then the matrix $A' = C^{-1} A C$ is diagonal and has the eigenvalues of A as its diagonal elements.

Remark:

  1. Any matrix with distinct eigenvalues can be diagonalised
  2. If $A' = C^{-1} A C$ is diagonal, then $\det A = \prod_i \lambda_i$ and $\operatorname{Tr} A = \sum_i \lambda_i$

Diagonalise the matrix:

This matrix is symmetric, so it may be diagonalised by the form $A' = S^T A S$, where $S$ is an orthogonal matrix whose columns are the normalised eigenvectors of $A$.
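
A numeric sketch of the procedure (the matrix from the original example was not preserved, so the symmetric matrix below is an illustrative stand-in):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])               # symmetric; eigenvalues are 3 and -1

lams, S = np.linalg.eigh(A)              # columns of S: orthonormal eigenvectors
Ap = S.T @ A @ S                         # A' = S^T A S
print(np.allclose(Ap, np.diag(lams)))    # True: A' is diagonal with eigenvalues
```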

Quadratic and Hermitian forms

  • Def: A quadratic form is a scalar function of a real vector $x$ given by $Q(x) = \langle x | A x \rangle = x^T A x$ for a linear operator $A$.

    Remark: We only care about symmetric A, since the antisymmetric part contributes nothing to $Q$.

  • Def: Hermitian form: $H(x) = \langle x | A x \rangle = x^\dagger A x$, where $A$ is hermitian and $x$ may be complex.

    Remark: H is real

  • Def: Positive definite: a Quadratic/Hermitian form with $Q(x) > 0$ for all $x \ne 0$.

If $x$ is a normalised eigenvector, then $Q(x) = x^T A x = \lambda$, since $A x = \lambda x$.

However, if $x_i$ and $x_j$ are eigenvectors corresponding to different eigenvalues, then they are orthogonal, so the eigenvectors form an orthonormal basis in which $Q$ separates (see the derivation below).

Quadratic surface:

$x^T A x = \text{const}$ defines a surface whose radius has stationary values along the eigenvector directions.
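
The separation in the eigenvector basis, written out (a standard one-line derivation, reconstructed here):

```latex
Q(x) = x^T A x
     = \Big( \sum_i x_i' e_i \Big)^{T} A \Big( \sum_j x_j' e_j \Big)
     = \sum_{i,j} x_i' x_j' \, \lambda_j \, e_i^T e_j
     = \sum_i \lambda_i x_i'^{\,2},
\qquad \text{using } A e_j = \lambda_j e_j \text{ and } e_i^T e_j = \delta_{ij}.
```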

Simultaneous linear equations

In applications, we have $M$ equations in $N$ unknowns: $A x = b$, with $A$ an $M \times N$ matrix.

  1. If

    all $b_i = 0$, the system is homogeneous

    otherwise, inhomogeneous

  2. If

    M>N: overdetermined system

    M=N: determined system

    M<N: underdetermined system

The range and null space of a matrix

  1. $y = A x$: A maps a vector $x$ in the domain $V$ to a vector $y$ in a subspace $W$; this W is called the range of A.

  2. If A is singular, then there is a subspace of $V$ that A maps onto the zero vector in $W$. This subspace is called the null space of A, and its dimension is called the nullity of $A$.

    • Def: for a square matrix $A$, if

      $\det A = 0$, A is called Singular

      $\det A \ne 0$, A is called Non-singular

      Remark: If $A x = b$ with $A$ non-singular, then $x = A^{-1} b$ is unique
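
Both subspaces can be computed numerically; a sketch with SciPy (illustrative singular matrix):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])           # singular: the second row is twice the first

N = null_space(A)                    # orthonormal basis for the null space of A
print(N.shape[1])                    # 1 -> the nullity of A
print(np.allclose(A @ N, 0.0))       # True: A maps the null space to zero
```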

Ways to solve linear equations

Computational complexity (big-$O$)

Problem size: $N$

Vector–vector multiplication: $O(N)$

Matrix–vector multiplication: $O(N^2)$

Matrix–matrix multiplication: $O(N^3)$

Algebraic Multigrid Method: an algorithm descended from the multigrid methods first introduced in the Soviet Union (CCCP)

  1. Gauss elimination (complexity: $O(N^3)$)

  2. Direct inversion ($O(N^3)$): compute $A^{-1}$, then $x = A^{-1} b$

  3. LU decomposition ($O(N^3)$ to factor; each subsequent solve costs only $O(N^2)$; a sketch follows this list)

    L: Lower triangular

    U: Upper triangular

    If A is SPD (symmetric positive definite), then $A = L L^T$ (Cholesky decomposition)

  4. Cramer’s rule

    If $\det A \ne 0$:

    the unique solution of $A x = b$ is given by $x_i = \det(A_i) / \det A$, where $A_i$ is $A$ with its $i$-th column replaced by $b$.

  5. Singular Value Decomposition

    Works whatever the relation between M and N:

    1. $A$ is an $M \times N$ matrix (can be complex). Suppose that $A = U S V^\dagger$, where $U$ is $M \times M$, $S$ is $M \times N$, $V$ is $N \times N$:

      1. $U$: Unitary matrix

      2. $S$ is diagonal ($S_{ij} = s_i \delta_{ij}$) and $s_i \ge 0$

        $s_i$ is called a singular value

      3. $V$: Unitary matrix

    2. Find $U$ and $V$:

      1. $U$: the columns of $U$ are the eigenvectors of $A A^\dagger$

      2. $V$: the columns of $V$ are the eigenvectors of $A^\dagger A$

  6. Rayleigh-Ritz method
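
A sketch of methods 3 and 5 with the standard SciPy/NumPy routines (`lu_factor`/`lu_solve`, `cholesky`, and `svd`), on an illustrative SPD system:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve, cholesky

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])       # symmetric positive definite
b = np.array([1.0, 2.0, 3.0])

# 3. LU decomposition: factor once (O(N^3)), then each solve is O(N^2).
lu, piv = lu_factor(A)
x = lu_solve((lu, piv), b)
print(np.allclose(A @ x, b))          # True

L = cholesky(A, lower=True)           # SPD case: A = L L^T
print(np.allclose(L @ L.T, A))        # True

# 5. Singular value decomposition: A = U S V^dagger.
U, s, Vh = np.linalg.svd(A)
print(np.allclose(U @ np.diag(s) @ Vh, A))   # True
```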
