# Matrices and vector spaces

## Review: Vector algebra

• Def: Hilbert space: a complete vector space equipped with an inner product $\langle a|b\rangle$
• Def: Linearly dependent: suppose we have vectors $v_1, v_2, \dots$; if $\exists$ scalars $a_1, a_2, \dots$, NOT ALL ZERO, such that $a_1v_1+a_2v_2+\dots=0$
• Linearly independent: $a_1v_1+a_2v_2+\dots=0$ only if $a_1=a_2=\dots=0$

## Inner Product

• Properties: $\langle a|b\rangle=\langle b|a\rangle^*$ (conjugate symmetry); linearity, $\langle a|\lambda b+\mu c\rangle=\lambda\langle a|b\rangle+\mu\langle a|c\rangle$; $\langle a|a\rangle\ge 0$, with equality only for $a=0$
• Def: norm: $\|a\|=\langle a|a\rangle^{\frac{1}{2}}$

• Orthogonal: $\langle a|b\rangle=0$

• Kronecker delta symbol: $\langle \hat{e}_i|\hat{e}_j\rangle=\delta_{ij}$, where $\delta_{ij}=1$ for $i=j$ and $\delta_{ij}=0$ for $i\neq j$
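A minimal Python sketch of these definitions (the example vectors are illustrative, not from the notes):

```python
import math

def inner(a, b):
    """Inner product <a|b> for real vectors (a complex version would conjugate a)."""
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    """Norm ||a|| = <a|a>^(1/2)."""
    return math.sqrt(inner(a, a))

a = [3.0, 4.0]
b = [-4.0, 3.0]
print(norm(a))      # 5.0
print(inner(a, b))  # 0.0, so a and b are orthogonal
```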

## Some useful inequalities

• Schwarz’s inequality: $|\langle a|b\rangle|\le\|a\|\,\|b\|$,

where the equality holds when a is a scalar multiple of b, i.e. when $a=\lambda b$ .

Proof: (p. 246)

Hint: square both sides of the inequality

• The triangle inequality: $\|a+b\|\le\|a\|+\|b\|$

Hint: same as Schwarz’s inequality
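Both inequalities can be checked numerically; a small sketch with arbitrary example vectors:

```python
import math

def inner(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(inner(a, a))

a = [1.0, 2.0, 3.0]
b = [4.0, 5.0, 6.0]

# Schwarz: |<a|b>| <= ||a|| ||b||
assert abs(inner(a, b)) <= norm(a) * norm(b)

# Triangle: ||a+b|| <= ||a|| + ||b||
s = [x + y for x, y in zip(a, b)]
assert norm(s) <= norm(a) + norm(b)

# Schwarz equality when a = λb
c = [2.0 * x for x in b]
assert math.isclose(abs(inner(c, b)), norm(c) * norm(b))
```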

• Bessel’s inequality: $\|a\|^2\ge\sum_i|\langle\hat{e}_i|a\rangle|^2$,

where the equality holds when the orthonormal set $\{\hat{e}_i\}$ forms a complete basis

Proof: (p.247)

Hint: measure $\| a - \sum _ { i } \langle \hat { e } _ { i } | a \rangle \hat { e } _ { i } \| ^ { 2 }$

• the parallelogram equality: $\|a+b\|^2+\|a-b\|^2=2\left(\|a\|^2+\|b\|^2\right)$

Proof: by definition (expand each norm as an inner product)
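A quick numerical check of the parallelogram equality (example vectors are arbitrary):

```python
import math

def norm2(a):
    """Squared norm ||a||^2 = <a|a>."""
    return sum(x * x for x in a)

a = [1.0, -2.0, 0.5]
b = [3.0, 1.0, -1.0]
plus = [x + y for x, y in zip(a, b)]
minus = [x - y for x, y in zip(a, b)]

lhs = norm2(plus) + norm2(minus)
rhs = 2 * (norm2(a) + norm2(b))
assert math.isclose(lhs, rhs)
```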

## Basic matrix algebra

• Def: a matrix represents a linear transformation under a chosen basis: $y_i=\sum_j A_{ij}x_j$

Example: –

• Def: Null matrix

• Def: identity matrix

• Def: Function of matrix: $B=f(A)$

Example: Taylor expansion

• Def: Transpose of matrix: $A^T$

$(AB)^T=B^TA^T$

$(AB)_{ij}^T=(AB)_{ji}$

• Def: Complex conjugate: $A^*$

For an $n\times n$ square matrix $A$:

• Def: Hermitian conjugate: $A^\dagger=(A^*)^T$

• Def: Trace: $Tr(A)=\sum_i A_{ii}$

Properties: $Tr(A+B)=Tr(A)+Tr(B)$, $Tr(AB)=Tr(BA)$, $Tr(A^T)=Tr(A)$, $Tr(A^\dagger)=Tr(A^*)=[Tr(A)]^*$
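The linearity and cyclic properties of the trace can be verified on small examples; a sketch in pure Python (the matrices are illustrative):

```python
def trace(A):
    """Tr(A) = sum of diagonal elements."""
    return sum(A[i][i] for i in range(len(A)))

def matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, -2]]
S = [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

assert trace(S) == trace(A) + trace(B)             # Tr(A+B) = Tr(A) + Tr(B)
assert trace(matmul(A, B)) == trace(matmul(B, A))  # Tr(AB) = Tr(BA)
```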

## Determinant

• Def: $|A|$, computed by expansion in minors (cofactors)

Properties:

## The inverse of a matrix

The determinant is used to construct the inverse.

• Def: For a square matrix $A$, if $\det A \neq 0$, we say $A$ is non-singular

If $A$, $B$ are non-singular and $AB=P$, then $B=A^{-1}P$, $A=PB^{-1}$

• Def: $A^{-1}$ : $AA^{-1}=A^{-1}A=I$

Find $A^{-1}$:

use the cofactor matrix $C$, with $C_{ik}=(-1)^{i+k}M_{ik}$ ($M_{ik}$ the minor): $A^{-1}=\frac{C^T}{|A|}$

Example: Find $A^{-1}$ if $A = \left( \begin{array} { c c c } { 2 } & { 4 } & { 3 } \\ { 1 } & { - 2 } & { - 2 } \\ { - 3 } & { 3 } & { 2 } \end{array} \right)$

1. $|A|=11$
2. $C = \left( \begin{array} { c c c } { 2 } & { 4 } & { - 3 } \\ { 1 } & { 13 } & { - 18 } \\ { - 2 } & { 7 } & { - 8 } \end{array} \right) \quad \text { and } \quad C ^ { T } = \left( \begin{array} { c c c } { 2 } & { 1 } & { - 2 } \\ { 4 } & { 13 } & { 7 } \\ { - 3 } & { - 18 } & { - 8 } \end{array} \right)$
3. $A ^ { - 1 } = \frac { C ^ { T } } { | A | } = \frac { 1 } { 11 } \left( \begin{array} { c c c } { 2 } & { 1 } & { - 2 } \\ { 4 } & { 13 } & { 7 } \\ { - 3 } & { - 18 } & { - 8 } \end{array} \right)$
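The worked example can be reproduced with a short cofactor-inverse routine (a sketch for $3\times 3$ matrices only):

```python
def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def cofactor_inverse(M):
    """A^-1 = C^T / |A| for a 3x3 matrix, C being the cofactor matrix."""
    detM = det3(M)
    C = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            # minor M_ij: delete row i and column j
            rows = [r for k, r in enumerate(M) if k != i]
            minor = [[v for l, v in enumerate(r) if l != j] for r in rows]
            mdet = minor[0][0] * minor[1][1] - minor[0][1] * minor[1][0]
            C[i][j] = (-1) ** (i + j) * mdet
    return [[C[j][i] / detM for j in range(3)] for i in range(3)]

A = [[2, 4, 3], [1, -2, -2], [-3, 3, 2]]
print(det3(A))              # 11
print(cofactor_inverse(A))  # (1/11) * [[2, 1, -2], [4, 13, 7], [-3, -18, -8]]
```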

for a $2\times 2$ matrix: $A^{-1}=\frac{1}{|A|}\left( \begin{array} { c c } { A_{22} } & { -A_{12} } \\ { -A_{21} } & { A_{11} } \end{array} \right)$

Properties:

## Rank of matrix

• Def: Rank: the rank of $A$, written $R(A)$, is the number of linearly independent vectors in the set of its columns $\{V_1, V_2, \dots, V_n\}$

Remark: If we replace $A$ by $A^T$, we get the same definition (row rank equals column rank).

• Def: submatrix: any matrix obtained from $A$ by deleting one or more columns or rows

• Def: Rank(A): the size of the largest submatrix of A whose determinant is not zero.

## Orthogonal matrix

• Def: $A^T=A^{-1}$ or $A^TA=I$

• Properties:

1. $A$ is orthogonal $\Rightarrow$ $A^{-1}$ is orthogonal.
2. $A^TA = I \Rightarrow |A^T||A|=|A|^2=1$, so $|A|=\pm 1$
3. $A^TA = I$, $\vec{y}=A\vec{x} \Rightarrow \langle y|y\rangle=(Ax)^TAx=x^Tx=\langle x|x\rangle$ (lengths are preserved)
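Length preservation can be checked with a rotation matrix, which is orthogonal; a small sketch (angle and vector are arbitrary):

```python
import math

theta = 0.7  # arbitrary rotation angle
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]  # rotation matrices satisfy A^T A = I

x = [3.0, -1.0]
y = [A[0][0] * x[0] + A[0][1] * x[1],
     A[1][0] * x[0] + A[1][1] * x[1]]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert math.isclose(det, 1.0)  # |A| = ±1 (here +1)
# <y|y> = <x|x>: the transformation preserves lengths
assert math.isclose(sum(v * v for v in y), sum(v * v for v in x))
```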
• Def: Hermitian: $A=A^\dagger$

Skew(anti)-hermitian: $-A=A^\dagger$

• Def: Unitary: $AA^\dagger=I$

• Def: Normal: $AA^\dagger = A^\dagger A$

## Eigenvalue problem

• Def: For an $n\times n$ matrix $A$, any non-zero vector $x$ satisfying $Ax=\lambda x$ for some scalar $\lambda$ is called an eigenvector, and $\lambda$ is the corresponding eigenvalue.

$(\lambda, x)$ is called eigenpair of A.

or $(A-\lambda I)x=0$

Remark: if $(\lambda, x)$ is an eigenpair, then $(\lambda, \mu x)$ for any scalar $\mu \neq 0$ is also an eigenpair.

For $A$ with $Ax=\lambda x$, what is the eigenpair for $A^{-1}$? Applying $A^{-1}$ to both sides gives $x=\lambda A^{-1}x$, so $(\frac{1}{\lambda}, x)$ is an eigenpair of $A^{-1}$.

for a normal matrix, the eigenvalues of $A^\dagger$ are $\lambda(A)^*$, with the same eigenvectors

If $\lambda_i \neq \lambda_j$ (and $A$ is normal), then $x^i, x^j$ are orthogonal

Given $A$, find $(\lambda, x)$ for A

1. Def: Characteristic equation: $|A-\lambda I|=0$

2. Plug each $\lambda$ into $(A-\lambda I)x=0$ and solve for $x$
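For a $2\times 2$ matrix the two steps reduce to solving a quadratic; a sketch with an illustrative symmetric matrix:

```python
import math

A = [[2.0, 1.0], [1.0, 2.0]]  # illustrative matrix

# Step 1: |A - λI| = λ^2 - Tr(A)λ + |A| = 0 for a 2x2 matrix
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(tr * tr - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
print(lam1, lam2)  # 3.0 1.0

# Step 2: plug λ = 3 into (A - λI)x = 0; here that gives x = (1, 1)
x = [1.0, 1.0]
Ax = [A[0][0] * x[0] + A[0][1] * x[1], A[1][0] * x[0] + A[1][1] * x[1]]
assert Ax == [lam1 * v for v in x]
```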

## Similarity transformation

If we have the relation $y=Ax$ under a given basis set, what is the relation under another basis set?

$y=Ax$, $y'=A'x'$,

By: $y=Cy' \Rightarrow C^{-1}Cy'=C^{-1}ACx' \Rightarrow y'=C^{-1}ACx'$, where $A'=C^{-1}AC$

Def: the above transformation is called similarity transformation.

Properties:

1. If $A=I$, $A'=I$
2. $|A|=|A'|$
3. $\lambda(A)=\lambda(A')$
4. $Tr(A)=Tr(A')$
5. If $C$ is a Unitary matrix, then $A'=C^\dagger AC$
6. If $A$ is Hermitian/Unitary, $A'$ is also Hermitian/Unitary

C transforms one orthonormal basis set into another.

## Diagonalisation of matrix

Given a matrix $A$, if we construct the matrix $C$ that has the eigenvectors of $A$ as its columns, then the matrix $A'=C^{-1}AC$ is diagonal and has the eigenvalues of $A$ as its diagonal elements.

Remark:

1. Any matrix with distinct eigenvalues can be diagonalised
2. If $A'=C^{-1}AC$, then $Tr(A)=Tr(A')$
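This construction can be sketched in Python for an illustrative symmetric $2\times 2$ matrix, with eigenvalues 3 and 1 and eigenvectors $(1,1)/\sqrt{2}$, $(1,-1)/\sqrt{2}$:

```python
import math

A = [[2.0, 1.0], [1.0, 2.0]]  # illustrative symmetric matrix

s = 1 / math.sqrt(2)
C = [[s, s], [s, -s]]  # columns are the normalised eigenvectors

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

CT = [[C[j][i] for j in range(2)] for i in range(2)]  # C orthogonal: C^-1 = C^T
D = matmul(CT, matmul(A, C))  # A' = C^-1 A C
print(D)  # approximately [[3, 0], [0, 1]]: the eigenvalues on the diagonal
```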

Diagonalise the matrix:

This matrix is symmetric so may be diagonalised by a transformation of the form $C^\dagger AC$.

• Def: A quadratic form $Q$ is a scalar function of a real vector $x$ given by $Q(x)=\langle x|Ax\rangle=x^TAx=\sum_{i,j} A_{ij}x_ix_j$ for a linear operator $A$.

Remark: We only care about the symmetric A

• Def: Hermitian form: $H(x) = x^\dagger Ax$, where $A$ is hermitian, $x$ may be complex

Remark: $H^*=H \Rightarrow H$ is real

• Def: Positive definite: a quadratic/Hermitian form with $Q>0$ (or $H>0$) for all $x\neq 0$

If $x$ is a normalised eigenvector ($x^Tx=1$), then $Q=x^TAx=\lambda x^Tx=\lambda$ since $Ax=\lambda x$
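A quick numerical check, using an illustrative symmetric matrix with eigenvalue $\lambda=3$ and unit eigenvector $(1,1)/\sqrt{2}$:

```python
import math

A = [[2.0, 1.0], [1.0, 2.0]]  # illustrative symmetric matrix, eigenvalue 3
s = 1 / math.sqrt(2)
x = [s, s]  # normalised eigenvector, x^T x = 1

Ax = [A[0][0] * x[0] + A[0][1] * x[1], A[1][0] * x[0] + A[1][1] * x[1]]
Q = x[0] * Ax[0] + x[1] * Ax[1]  # Q = x^T A x
print(Q)  # approximately 3.0, the eigenvalue
```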

However, if $x$ and $y$ are eigenvectors corresponding to different eigenvalues, then they are orthogonal; along these directions (the principal axes) the quadric surface $x^TAx=\text{const}$ has stationary values of its radius.

## Simultaneous linear equations

In applications, we need to solve systems of $M$ simultaneous linear equations in $N$ unknowns:

1. If

all $b_i=0$, the system is homogeneous

otherwise, inhomogeneous

2. If

M>N: overdetermined system

M=N: determined system

M<N: underdetermined system

3. $Ax=b$

The range and null space of a matrix

1. $Ax=b$ asks for the vector $x$ that the transformation $y=Ax$ maps onto $b$

2. $Ax=b$: $A$ maps a vector $x\in V$ to a vector $b\in W$; the set of all images is called the range of $A$, and $Rank(A)=\dim(\text{range of }A)$

3. If $A$ is singular, then $\exists x\in V_{null}\text{ (a subspace of }V)$ that maps onto the zero vector in $W$. This subspace is called the null space of $A$, and its dimension is called the nullity of $A$.

4. $Rank(A)+Nullity(A)=N\text{(number of unknowns)}$

• Def: for $A: N\times N$, if

$|A|=0$, A is called singular

$|A|\neq 0$, A is called non-singular

Remark: If $Ax=b$ with $|A|\neq 0$, then x is unique

## The way to solve linear equation

Computational complexity (big-$O$ notation)

Problem size: $N$

Vector inner product $a^Ta$: $O(N)$

Matrix-vector multiplication $y=Ax$: $O(N^2)$

Matrix-matrix multiplication $C=AB$: $O(N^3)$

Algebraic Multigrid Method: an algorithm first introduced in the CCCP (USSR)

1. Gauss Elimination (complexity: $O(N^3)$)

2. Direct inversion ($A^{-1}$)

$Ax=b \Rightarrow A^{-1}Ax=A^{-1}b \Rightarrow x=A^{-1}b$

3. LU decomposition ($A=LU$)

L: Lower triangle

U: Upper triangle

$Ax=b \Rightarrow LUx=b \Rightarrow y=Ux, Ly=b \Rightarrow Ux=y$

If A is SPD (symmetric positive definite), then $A=LL^\dagger$ (Cholesky decomposition)
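The $Ly=b$, $Ux=y$ steps above can be sketched with a minimal Doolittle-style LU factorisation without pivoting (assumes non-zero pivots; the system is illustrative):

```python
def lu_decompose(A):
    """Doolittle LU factorisation A = LU without pivoting (assumes non-zero pivots)."""
    n = len(A)
    L = [[float(i == j) for j in range(n)] for i in range(n)]  # unit lower triangle
    U = [row[:] for row in A]
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]
    return L, U

def lu_solve(L, U, b):
    """Solve Ax = b via Ly = b (forward), then Ux = y (backward) substitution."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][j] * y[j] for j in range(i))
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

A = [[2.0, 1.0], [4.0, 5.0]]
b = [3.0, 9.0]
L, U = lu_decompose(A)
print(lu_solve(L, U, b))  # [1.0, 1.0]
```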

4. Cramer’s rule

If $Ax=b$:

If $|A|\neq 0$, the unique solution of $Ax=b$ is given by $x_i=\frac{\Delta_i}{|A|}$, where $\Delta_i$ is $|A|$ with the $i$-th column replaced by $b$.
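A $2\times 2$ sketch of Cramer's rule (the system is illustrative):

```python
def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def cramer_2x2(A, b):
    """x_i = Δ_i / |A|, with Δ_i the determinant of A
    after replacing column i by b."""
    d = det2(A)
    x = []
    for i in range(2):
        Ai = [row[:] for row in A]
        for r in range(2):
            Ai[r][i] = b[r]
        x.append(det2(Ai) / d)
    return x

A = [[2.0, 1.0], [1.0, 3.0]]
b = [5.0, 10.0]
print(cramer_2x2(A, b))  # [1.0, 3.0]
```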

5. Singular Value Decomposition

This works for arbitrary $M$ and $N$:

1. $A$ is an $M\times N$ matrix (can be complex). Suppose that $A=USV^\dagger$, where:

1. $U: M\times M$ Unitary matrix

2. $S: M \times N$ and diagonal ($S_{ij}=0$ for $i\neq j$)

$S_{ii}=s_i$ is called a singular value, for $i \le \min(M,N)$

3. $V: N\times N$ Unitary matrix

2. Find $USV^\dagger$

1. $s_i = \sqrt{\lambda_i}$, where $\lambda_i$ are the eigenvalues of $A^\dagger A$

2. the columns $u^i$ of $U$ are the eigenvectors of $AA^\dagger$

3. the columns $v^i$ of $V$ are the eigenvectors of $A^\dagger A$
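For a real $2\times 2$ matrix the singular values can be found by hand from the eigenvalues of $A^\dagger A$; a sketch with an illustrative matrix:

```python
import math

A = [[3.0, 0.0], [4.0, 5.0]]  # illustrative real matrix

# A^T A is symmetric 2x2; its eigenvalues follow from the quadratic formula
AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
tr = AtA[0][0] + AtA[1][1]
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
disc = math.sqrt(tr * tr - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2

s1, s2 = math.sqrt(lam1), math.sqrt(lam2)  # singular values s_i = sqrt(λ_i)
print(s1, s2)  # sqrt(45) and sqrt(5)

# sanity check: |det A| equals the product of the singular values
assert math.isclose(s1 * s2, abs(A[0][0] * A[1][1] - A[0][1] * A[1][0]))
```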

6. Rayleigh-Ritz method

Last updated 2 years ago by Yichen Liu