## Review: Vector algebra

**Def:** Hilbert space: a complete vector space equipped with an inner product.

**Def:** Linearly dependent: a set of vectors $a_1, a_2, \dots, a_N$ is linearly dependent if there exist scalars $c_1, c_2, \dots, c_N$, NOT ALL ZERO, such that $c_1 a_1 + c_2 a_2 + \cdots + c_N a_N = 0$.

- Linearly independent: the only solution of $\sum_i c_i a_i = 0$ is $c_1 = c_2 = \cdots = c_N = 0$.

## Inner Product

**Def:** the inner product of two vectors $a$ and $b$ is written $\langle a|b\rangle$.

**Properties:** $\langle a|b\rangle = \langle b|a\rangle^*$; linearity, $\langle a|\lambda b + \mu c\rangle = \lambda\langle a|b\rangle + \mu\langle a|c\rangle$; and $\langle a|a\rangle \ge 0$, with equality only for $a = 0$.

**Def:** norm: $\|a\| = \langle a|a\rangle^{1/2}$. Orthogonal: $a$ and $b$ are orthogonal if $\langle a|b\rangle = 0$.

Kronecker delta symbol: $\delta_{ij} = 1$ if $i = j$ and $0$ otherwise; an orthonormal basis $\{\hat e_i\}$ satisfies $\langle \hat e_i|\hat e_j\rangle = \delta_{ij}$.
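A quick numerical sketch of these definitions with numpy (the vectors are arbitrary examples, not from the notes); note that the inner product conjugates the first argument:

```python
import numpy as np

# Illustrative vectors (chosen here for the example, not from the notes).
a = np.array([1.0 + 2.0j, 3.0 - 1.0j])
b = np.array([2.0 - 1.0j, 0.5 + 0.5j])

# Inner product <a|b> = sum_i a_i^* b_i (conjugate on the first argument).
inner = np.vdot(a, b)

# Norm ||a|| = <a|a>^(1/2); always real and non-negative.
norm_a = np.sqrt(np.vdot(a, a).real)

# The standard basis is orthonormal: <e_i|e_j> = delta_ij.
e = np.eye(2, dtype=complex)
gram = np.array([[np.vdot(e[i], e[j]) for j in range(2)] for i in range(2)])
```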

## Some useful inequalities

**Schwarz’s inequality:** $|\langle a|b\rangle| \le \|a\|\,\|b\|$, where the equality holds when a is a scalar multiple of b, i.e. when $a = \lambda b$.

Proof: (p. 246)

Hint: square the equivalent statement and expand $\langle a + \lambda b|a + \lambda b\rangle \ge 0$ for a suitable $\lambda$.

**The triangle inequality:** $\|a + b\| \le \|a\| + \|b\|$. Hint: same as Schwarz’s inequality (square both sides, then apply Schwarz).

**Bessel’s inequality:** for an orthonormal set $\{\hat e_i\}$, $\|a\|^2 \ge \sum_i |\langle \hat e_i|a\rangle|^2$; equality holds when $\{\hat e_i\}$ forms a complete basis.

Proof: (p. 247)

Hint: expand $\big\|a - \sum_i \langle \hat e_i|a\rangle\, \hat e_i\big\|^2 \ge 0$.

**The parallelogram equality:** $\|a + b\|^2 + \|a - b\|^2 = 2\big(\|a\|^2 + \|b\|^2\big)$. Proof: by expanding each norm with the definition of the inner product.
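The three identities above can be spot-checked numerically on random complex vectors (a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=4) + 1j * rng.normal(size=4)
b = rng.normal(size=4) + 1j * rng.normal(size=4)

norm = lambda v: np.sqrt(np.vdot(v, v).real)

# Schwarz: |<a|b>| <= ||a|| ||b||
schwarz_ok = abs(np.vdot(a, b)) <= norm(a) * norm(b)

# Triangle: ||a+b|| <= ||a|| + ||b||
triangle_ok = norm(a + b) <= norm(a) + norm(b)

# Parallelogram: ||a+b||^2 + ||a-b||^2 = 2(||a||^2 + ||b||^2)
lhs = norm(a + b) ** 2 + norm(a - b) ** 2
rhs = 2 * (norm(a) ** 2 + norm(b) ** 2)
```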

## Basic matrix algebra

**Def:** a matrix arises from a linear transformation: $y = Ax$ means $y_i = \sum_j A_{ij} x_j$.

Example: rotation of a vector in the plane is a linear transformation represented by a $2 \times 2$ matrix.

**Def:** null matrix: all entries zero, $A + 0 = A$. **Def:** identity matrix $I$: $I_{ij} = \delta_{ij}$, so $AI = IA = A$. **Def:** function of a matrix: defined through the Taylor expansion. Example: $e^A = \sum_{n=0}^{\infty} \frac{A^n}{n!}$.

**Def:** transpose of a matrix: $(A^{\mathrm T})_{ij} = A_{ji}$. **Def:** complex conjugate: $(A^*)_{ij} = (A_{ij})^*$.

For products, $(AB)^{\mathrm T} = B^{\mathrm T} A^{\mathrm T}$ and $(AB)^* = A^* B^*$.

**Def:** Hermitian conjugate: $A^\dagger = (A^{\mathrm T})^*$, so $(AB)^\dagger = B^\dagger A^\dagger$. **Def:** trace: $\operatorname{Tr} A = \sum_i A_{ii}$. **Properties:** $\operatorname{Tr}(A + B) = \operatorname{Tr} A + \operatorname{Tr} B$ and $\operatorname{Tr}(AB) = \operatorname{Tr}(BA)$.

## Determinant

**Def:** the determinant can be evaluated by cofactor (Laplace) expansion: $\det A = \sum_j A_{ij} C_{ij}$, where the cofactor is $C_{ij} = (-1)^{i+j} M_{ij}$ and the minor $M_{ij}$ is the determinant of the submatrix obtained by removing row $i$ and column $j$. **Properties:** $\det(AB) = \det A \, \det B$, $\det A^{\mathrm T} = \det A$, $\det A^\dagger = (\det A)^*$, and exchanging two rows (or columns) changes the sign of $\det A$.
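The cofactor expansion can be written directly as a recursive function; the example matrix is an assumption for this sketch, and the comparison with `np.linalg.det` is the check:

```python
import numpy as np

def det_cofactor(A):
    """Determinant by Laplace expansion along the first row.
    O(N!) cost -- for illustration only, not for large matrices."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        # Minor M_0j: remove row 0 and column j.
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        # Cofactor C_0j = (-1)^(0+j) * M_0j.
        total += (-1) ** j * A[0, j] * det_cofactor(minor)
    return total

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
d = det_cofactor(A)
```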

## The inverse of a matrix

The determinant decides whether the inverse exists, and the cofactors are used to construct it.

**Def:** for a square matrix A, if $\det A = 0$, we say A is **singular**; otherwise A is non-singular and $A^{-1}$ exists.

If A, B are non-singular, then $(AB)^{-1} = B^{-1} A^{-1}$.

**Def:** the inverse $A^{-1}$ satisfies $A A^{-1} = A^{-1} A = I$.

To find $A^{-1}$, use cofactors:

$$(A^{-1})_{ij} = \frac{C_{ji}}{\det A},$$

i.e. the transposed matrix of cofactors (the adjugate) divided by the determinant.

Example: find the inverse of a general $2 \times 2$ matrix. If

$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix},$$

then for $\det A = ad - bc \ne 0$,

$$A^{-1} = \frac{1}{ad - bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}.$$

**Properties:** $(A^{-1})^{-1} = A$, $(A^{\mathrm T})^{-1} = (A^{-1})^{\mathrm T}$, $(A^\dagger)^{-1} = (A^{-1})^\dagger$, and $\det(A^{-1}) = 1/\det A$.
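The $2 \times 2$ adjugate formula can be checked against the library inverse (the test matrix is an arbitrary example):

```python
import numpy as np

def inv2x2(A):
    """Inverse of a 2x2 matrix from the cofactor (adjugate) formula."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    # Adjugate (transposed cofactor matrix) divided by the determinant.
    return np.array([[d, -b], [-c, a]]) / det

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
Ainv = inv2x2(A)
```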

## Rank of matrix

**Def:** the rank of A, written R(A), is the number of linearly independent vectors in the set of column vectors of A. **Remark:** if we write A as a set of row vectors, we get the same definition. **Def:** submatrix: any matrix formed from A by ignoring one or more columns or rows. **Def:** equivalently, R(A) is the size of the largest square submatrix of A whose determinant is not zero.
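Both characterisations of the rank can be checked on a small example (the matrix below, with one dependent column, is an assumption for this sketch):

```python
import numpy as np

# Third column is the sum of the first two, so only two columns are independent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

rank = np.linalg.matrix_rank(A)

# Consistent with the determinant criterion: det A = 0 (no non-singular
# 3x3 submatrix), but the top-left 2x2 submatrix has non-zero determinant.
detA = np.linalg.det(A)
det2 = np.linalg.det(A[:2, :2])
```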

## Orthogonal matrix

**Def:** A is orthogonal if $A^{\mathrm T} A = I$, or equivalently $A^{-1} = A^{\mathrm T}$. **Properties:**

- A is orth. $\Rightarrow$ $A^{\mathrm T}$ (and $A^{-1}$) is orth.
- A is orth. $\Rightarrow$ $\det A = \pm 1$, and A preserves inner products: $\langle Ax|Ay\rangle = \langle x|y\rangle$.

**Def:** Hermitian: $A^\dagger = A$. Skew(anti)-Hermitian: $A^\dagger = -A$.

**Def:** unitary: $A^\dagger A = I$, i.e. $A^{-1} = A^\dagger$. **Def:** normal: $A A^\dagger = A^\dagger A$ (Hermitian, skew-Hermitian, orthogonal, and unitary matrices are all normal).
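These classes are easy to test numerically; the rotation matrix and symmetric matrix below are standard examples chosen for this sketch:

```python
import numpy as np

dagger = lambda M: M.conj().T

def is_unitary(M):
    """A^dagger A = I (for real M this is orthogonality)."""
    return np.allclose(dagger(M) @ M, np.eye(M.shape[0]))

def is_hermitian(M):
    """A^dagger = A."""
    return np.allclose(dagger(M), M)

def is_normal(M):
    """A A^dagger = A^dagger A."""
    return np.allclose(M @ dagger(M), dagger(M) @ M)

# A rotation matrix is orthogonal (a real unitary matrix).
th = 0.3
R = np.array([[np.cos(th), -np.sin(th)],
              [np.sin(th),  np.cos(th)]])

# A real symmetric matrix is Hermitian.
H = np.array([[2.0, 1.0],
              [1.0, 3.0]])
```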

## Eigenvalue problem

**Def:** for a square matrix A, if a non-zero vector $x$ satisfies $Ax = \lambda x$ for some value $\lambda$, then $\lambda$ is called an eigenvalue and $(\lambda, x)$ is called an eigenpair of A; equivalently, $(A - \lambda I)x = 0$.

**Remark:** if $(\lambda, x)$ is an eigenpair, then $(\lambda, cx)$ for any scalar $c \ne 0$ is also an eigenpair.

Exercise: for a non-singular A with eigenpair $(\lambda, x)$, what is the corresponding eigenpair of $A^{-1}$? (Apply $A^{-1}$ to both sides of $Ax = \lambda x$ to get $A^{-1}x = \lambda^{-1}x$.)

For a **normal** matrix, eigenvectors belonging to distinct eigenvalues are orthogonal; if A is Hermitian, its eigenvalues are real.

Given A, the eigenpairs are found from the characteristic equation.

**Def:** characteristic equation: $\det(A - \lambda I) = 0$. Solve it for the eigenvalues $\lambda$; by plugging each $\lambda$ back into $(A - \lambda I)x = 0$, solve for the eigenvector $x$.

## Similarity transformation

If we have the relation $y = Ax$ under a given basis set, what is the relation under another basis set?

Let $C$ be the change-of-basis matrix, so $x = C x'$ and $y = C y'$. Then:

$$y' = C^{-1} A C \, x' = A' x', \quad \text{where } A' = C^{-1} A C.$$

**Def:** the above transformation is called a **similarity transformation**.

**Properties:**

- If $A' = C^{-1} A C$, then $\det A' = \det A$, $\operatorname{Tr} A' = \operatorname{Tr} A$, and $A'$ has the same characteristic equation (hence the same eigenvalues) as $A$.
- If $C$ is a **unitary matrix**, then $A' = C^\dagger A C$.
- If $C$ is unitary and $A$ is Hermitian/unitary, $A'$ is also Hermitian/unitary.

C transfers one orthonormal basis set to another orthonormal basis set.
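The invariants in the first property can be verified numerically (random matrices are used as examples; adding $I$ to $C$ just makes it generically non-singular):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3))

# Any non-singular C defines a similarity transformation A' = C^{-1} A C.
C = rng.normal(size=(3, 3)) + np.eye(3)
Ap = np.linalg.inv(C) @ A @ C

# Trace, determinant, and eigenvalues are invariant under the transformation.
tr_same = np.isclose(np.trace(Ap), np.trace(A))
det_same = np.isclose(np.linalg.det(Ap), np.linalg.det(A))
eig_same = np.allclose(np.sort_complex(np.linalg.eigvals(Ap)),
                       np.sort_complex(np.linalg.eigvals(A)))
```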

## Diagonalisation of a matrix

Given a matrix A, if we construct the matrix C that has the eigenvectors of A as its columns, then the matrix $A' = C^{-1} A C$ is diagonal, with the eigenvalues of A as its diagonal elements.

**Remark:**

- Any matrix with distinct eigenvalues can be diagonalised.
- If $A' = C^{-1} A C$ is diagonal, then $\det A = \prod_i \lambda_i$ and $\operatorname{Tr} A = \sum_i \lambda_i$.

Example: diagonalise a symmetric matrix. A symmetric matrix may be diagonalised in the form $C^{\mathrm T} A C = \Lambda$, where $C$ is orthogonal (its columns are the normalised eigenvectors) and $\Lambda$ is diagonal.
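A worked instance with numpy (the symmetric matrix is an example chosen for this sketch; `eigh` is the symmetric/Hermitian eigensolver):

```python
import numpy as np

# Example symmetric matrix (chosen for this sketch, not from the notes).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For a symmetric matrix, eigh returns real eigenvalues and an orthogonal
# eigenvector matrix C (eigenvectors as columns).
lam, C = np.linalg.eigh(A)

# C^T A C is the diagonal matrix of eigenvalues.
Lambda = C.T @ A @ C
```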

## Quadratic and Hermitian forms

**Def:** a **quadratic form** is a scalar function of a real vector $x$ given by $Q(x) = x^{\mathrm T} A x$ for a linear operator $A$. **Remark:** we only care about symmetric A (the antisymmetric part does not contribute to $Q$). **Def:** **Hermitian form**: $H(x) = x^\dagger A x$, where $A$ is Hermitian and $x$ may be complex. **Remark:** H is real. **Def:** **positive definite**: a quadratic/Hermitian form with $Q(x) > 0$ for every $x \ne 0$.

If $Q(x) > 0$ for every $x \ne 0$, all eigenvalues of A are positive. However, if only $Q(x) \ge 0$ holds, the form is positive semi-definite and some eigenvalues may be zero.

**Quadratic surface**: $x^{\mathrm T} A x = \text{const}$ is a surface that has stationary values of its radius along the eigenvector directions of A.
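The eigenvalue criterion for positive definiteness can be checked directly (the symmetric matrix is an assumption for this sketch):

```python
import numpy as np

# Example symmetric matrix (an assumption for this sketch).
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

# Positive definite <=> all eigenvalues positive (for symmetric A).
eigvals = np.linalg.eigvalsh(A)
pos_def = np.all(eigvals > 0)

# Spot-check the quadratic form Q(x) = x^T A x on a few random vectors.
rng = np.random.default_rng(3)
xs = rng.normal(size=(5, 2))
Q = np.einsum('ni,ij,nj->n', xs, A, xs)
```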

## Simultaneous linear equations

In applications we have $M$ equations in $N$ unknowns, $Ax = b$, with A an $M \times N$ matrix.

If all $b_i = 0$, the system is **homogeneous**; otherwise it is **inhomogeneous**.

- $M > N$: **overdetermined system**
- $M = N$: **determined system**
- $M < N$: **underdetermined system**

The range and null space of a matrix: A maps a vector $x \in V$ to a vector $y = Ax \in W$; this subspace $W$ is called the range of A, and its dimension equals $R(A)$. If A is singular, there are non-zero vectors $x$ that are mapped onto the zero vector, $Ax = 0$. This subspace is called the null space of A, and its dimension is called the nullity of A; they satisfy $R(A) + \text{nullity}(A) = N$. **Def:** for square A, if $\det A = 0$, A is called **singular**; if $\det A \ne 0$, A is called **non-singular**. **Remark:** if $Ax = b$ with A non-singular, then the solution $x$ is unique.
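The rank–nullity relation can be demonstrated on a deliberately singular matrix (the example matrix and the SVD-based null-vector extraction are choices made for this sketch):

```python
import numpy as np

# Singular 3x3 matrix: the third row is the sum of the first two.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 1.0]])

rank = np.linalg.matrix_rank(A)          # dimension of the range
nullity = A.shape[1] - rank              # rank-nullity theorem: R(A) + nullity = N

# A non-zero null-space vector x with A x = 0, read off from the SVD:
# the row of V^dagger for a zero singular value spans the null space.
_, s, Vt = np.linalg.svd(A)
x_null = Vt[-1]
residual = A @ x_null
```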

## Ways to solve linear equations

Computational complexity (big-$O$ notation), with problem size $N$:

- Vector–vector multiplication: $O(N)$
- Matrix–vector multiplication: $O(N^2)$
- Matrix–matrix multiplication: $O(N^3)$

Algebraic multigrid method: an algorithm first introduced in the Soviet Union (USSR).

**Gauss elimination** (complexity $O(N^3)$). **Direct inversion** ($O(N^3)$). **LU decomposition** ($O(N^3)$ to factorise, then $O(N^2)$ per right-hand side): $A = LU$, where

L: lower triangular matrix

U: upper triangular matrix

If A is SPD (symmetric positive definite), then $A = LL^{\mathrm T}$ (Cholesky decomposition).

**Cramer’s rule**: if $\det A \ne 0$, the unique solution of $Ax = b$ is given by $x_i = \det A_i / \det A$, where $A_i$ is A with its $i$-th column replaced by $b$.

**Singular value decomposition**: for any $M$ and $N$, let A be an $M \times N$ matrix (can be complex). Then $A = U \Sigma V^\dagger$, where $U$ is an $M \times M$ unitary matrix, $V$ is an $N \times N$ unitary matrix, and $\Sigma$ is diagonal ($\Sigma_{ii} = s_i \ge 0$); the $s_i$ are called the **singular values**.

To find the decomposition: the columns of $U$ are the eigenvectors of $A A^\dagger$, and the columns of $V$ are the eigenvectors of $A^\dagger A$; the $s_i^2$ are the corresponding eigenvalues.
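Both the factorisation and the eigenvector connection can be checked with numpy on a random rectangular matrix (a real example, so $\dagger$ reduces to transpose):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(4, 3))              # a rectangular (M x N) example

U, s, Vh = np.linalg.svd(A)              # A = U @ Sigma @ V^dagger

# Reassemble Sigma as a 4x3 matrix with s on the diagonal and check A = U Sigma V^dagger.
Sigma = np.zeros((4, 3))
np.fill_diagonal(Sigma, s)
ok_factor = np.allclose(A, U @ Sigma @ Vh)

# s_i^2 are the eigenvalues of A^dagger A (here A^T A, since A is real).
eigvals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
ok_eigs = np.allclose(eigvals, s ** 2)
```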

**Rayleigh–Ritz method**: estimates the extreme eigenvalues of a Hermitian matrix from the stationary values of the Rayleigh quotient $\lambda(x) = \dfrac{x^\dagger A x}{x^\dagger x}$.
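A minimal sketch of the idea: power iteration drives a trial vector toward the dominant eigenvector, and the Rayleigh quotient of that vector approximates the largest eigenvalue. The matrix and iteration count are assumptions chosen for this example:

```python
import numpy as np

# Example symmetric matrix (an assumption for this sketch).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

def rayleigh(A, x):
    """Rayleigh quotient lambda(x) = x^T A x / x^T x."""
    return (x @ A @ x) / (x @ x)

# Power iteration: repeated multiplication by A aligns x with the dominant
# eigenvector, so the Rayleigh quotient converges to the largest eigenvalue.
x = np.array([1.0, 0.0])
for _ in range(50):
    x = A @ x
    x /= np.linalg.norm(x)               # renormalise to avoid overflow
lam_est = rayleigh(A, x)
```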

Last updated 2 years ago by Yichen Liu