Linear Algebra
We won't worry here about how to compute the inverse; the procedure is very similar to the standard elimination method for solving linear equations. We will use numpy to compute it.
import numpy as np
A = np.array([[4, -5],[-2, 3]])
print(A)
b = np.array([[-13],[9]])
print(b)
Compute $x = A^{-1} b$:
x = np.linalg.inv(A).dot(b)
print(x)
A = np.asmatrix(A)
b = np.asmatrix(b)
# with np.matrix, * is matrix multiplication and .I is the inverse
x = A.I*b
print(x)
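In practice, you rarely need the explicit inverse: np.linalg.solve solves $Ax = b$ directly and is generally faster and more numerically stable. A quick check with the same system:

import numpy as np

A = np.array([[4, -5],
              [-2, 3]])
b = np.array([[-13],
              [9]])

# solve Ax = b without forming the inverse explicitly
x = np.linalg.solve(A, b)
print(x)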
In general, a system of $m$ linear equations in $n$ unknowns can be written $y = Ax$ with
$$
y=
\begin{bmatrix}
y_{1} \\
y_{2} \\
\vdots \\
y_{m}
\end{bmatrix}
\qquad A = \begin{bmatrix}
a_{11}&a_{12}&\cdots&a_{1n} \\
a_{21}&a_{22}&\cdots&a_{2n} \\
\vdots&\vdots&\ddots&\vdots\\
a_{m1}&a_{m2}&\cdots&a_{mn} \\
\end{bmatrix}
\qquad x=
\begin{bmatrix}
x_{1} \\
x_{2} \\
\vdots \\
x_{n}
\end{bmatrix}
$$
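A minimal sketch of the shapes involved, with assumed sizes $m = 3$ and $n = 2$:

import numpy as np

m, n = 3, 2                         # assumed sizes for illustration
A = np.arange(m*n).reshape(m, n)    # an m x n matrix
x = np.ones((n, 1))                 # an n-vector
y = A.dot(x)                        # y = Ax is an m-vector
print(y.shape)                      # (3, 1)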
The inner product of two vectors is $x^T y$:

x = np.array([[1],
              [1]])
y = np.array([[2],
              [3]])
print(x.T.dot(y))

x = np.asmatrix(x)
y = np.asmatrix(y)
print(x.T*y)

z = x.T*y
print(z.A)      # .A returns the underlying ndarray
A vector norm is any function $f : \mathbb{R}^{n} \rightarrow \mathbb{R}$ with
1. $f(x) \geq 0$ for all $x \in \mathbb{R}^n$ (non-negativity)
2. $f(x) = 0$ if and only if $x = 0$ (definiteness)
3. $f(tx) = \lvert t \rvert \, f(x)$ for all $t \in \mathbb{R}$ (homogeneity)
4. $f(x + y) \leq f(x) + f(y)$ (triangle inequality)
x = np.array([[4],
              [3]])
print(np.linalg.norm(x, 2))     # L2 norm: sqrt(4^2 + 3^2) = 5.0
print(np.linalg.norm(x, 1))     # L1 norm: |4| + |3| = 7.0
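As a quick numerical sanity check of the triangle inequality, a sketch with randomly drawn vectors:

import numpy as np

x = np.random.randn(5)
y = np.random.randn(5)

# triangle inequality: ||x + y|| <= ||x|| + ||y||
print(np.linalg.norm(x + y, 2) <= np.linalg.norm(x, 2) + np.linalg.norm(y, 2))   # True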
Two vectors $x, y \in \mathbb{R}^n$ are orthogonal if
$$x^Ty = 0$$
They are orthonormal if, in addition,
$$\lVert x \rVert _{2} = \lVert y \rVert _{2} = 1 $$
x = np.matrix([[1],[2]])
y = np.matrix([[2],[-1]])
print(x.T*y)    # 0, so x and y are orthogonal
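These two vectors are orthogonal but not orthonormal; dividing each by its length makes them orthonormal. A minimal sketch:

import numpy as np

x = np.matrix([[1], [2]])
y = np.matrix([[2], [-1]])

u = x/np.linalg.norm(x)     # normalize to unit length
v = y/np.linalg.norm(y)

print(u.T*v)                                    # still 0: orthogonal
print(np.linalg.norm(u), np.linalg.norm(v))     # both 1.0: orthonormal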
$$\begin{array}{ccc}
\text{Given} & & \text{Interpret}\\
\text{linear transformation} & \longrightarrow & \text{matrix}\\
\text{matrix} & \longrightarrow & \text{linear transformation}\\
\end{array}$$
$$
\begin{array}{c}
\vec x\\
\text{input}
\end{array}
\quad
\begin{array}{c}
\text{linear transformation}\\
\implies
\end{array}
\quad
\begin{array}{c}
\vec y\\
\text{output}
\end{array}
$$
$$\text{transformation} =\text{rotate + stretch/compress}$$
import numpy as np
theta = 90/180*np.pi                            # rotate by 90 degrees
R = np.matrix([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])
x = np.matrix([[1],[0]])
y = R*x
print(y)                                        # approximately [[0], [1]]
Example
$T$: stretch by $a$ along the $\hat x$-direction and by $b$ along the $\hat y$-direction
Compute the corresponding matrix $A$
$$\vec y = A \vec x$$
$$\begin{array}{ll}
\begin{bmatrix}ax_1\\ bx_2\end{bmatrix} & = A\begin{bmatrix}x_1\\ x_2\end{bmatrix} \Longrightarrow A = \,?\\\\
& = \begin{bmatrix}a & 0\\ 0 & b\end{bmatrix}\begin{bmatrix}x_1\\ x_2\end{bmatrix}
\end{array}$$
$$\begin{array}{ll}
A\begin{bmatrix}1\\0\end{bmatrix} & = \begin{bmatrix}a\\0\end{bmatrix} \\
A\begin{bmatrix}0\\1\end{bmatrix} & = \begin{bmatrix}0\\b\end{bmatrix} \\\\
A\begin{bmatrix}1 & 0\\ 0 & 1\end{bmatrix} & = A = \begin{bmatrix}a & 0\\0 & b\end{bmatrix}
\end{array}$$
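A quick numerical check of this stretch matrix, with assumed values $a = 2$ and $b = 3$:

import numpy as np

a, b = 2, 3                 # assumed stretch factors for illustration
A = np.matrix([[a, 0],
               [0, b]])
x = np.matrix([[1], [1]])
print(A*x)                  # [[2], [3]]: stretched by a along x and b along y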
More importantly, by looking at $A = \begin{bmatrix}a & 0\\0 & b\end{bmatrix}$, can you infer the transformation $T$?
Example
$P$: projection onto the $\hat x$-axis. $P$ sends $\hat x$ to itself and $\hat y$ to $\vec 0$:
$$
\begin{array}{ll}
P \begin{bmatrix} 1 \\ 0 \end{bmatrix} & = \begin{bmatrix} 1 \\ 0 \end{bmatrix}\\
P \begin{bmatrix} 0 \\ 1 \end{bmatrix} & = \begin{bmatrix} 0 \\ 0 \end{bmatrix}\\\\
P \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} & = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}
\end{array}
$$
import numpy as np
P = np.matrix([[1, 0],
               [0, 0]])
x = np.matrix([[1],[1]])
y = P*x
print(y)        # [[1], [0]]: the y-component is projected away
Write $\vec x$ in a basis $\{\vec v_1, \vec v_2\}$ and let $\vec \omega_i = T(\vec v_i)$:
$$\begin{array}{lll}
\vec x & = a_1\vec v_1 + a_2\vec v_2 & (a_1 \;\text{and } a_2 \;\text{unique})\\ \\
T(\vec x) & = T(a_1\vec v_1 + a_2\vec v_2) \\
& = a_1T(\vec v_1) + a_2T(\vec v_2)\\
& = a_1\vec {\omega}_1 + a_2\vec {\omega}_2\\
\end{array}$$
This is why linearity makes our lives much easier: the only thing we need is to observe how the basis vectors are transformed, as the sketch below shows.
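A sketch of this idea with an assumed matrix and the standard basis: knowing only $T(\vec v_1)$ and $T(\vec v_2)$ determines $T(\vec x)$ for every $\vec x$.

import numpy as np

A = np.array([[2, 1],
              [0, 3]])                  # assumed linear transformation
v1 = np.array([[1], [0]])               # assumed basis vectors
v2 = np.array([[0], [1]])
a1, a2 = 4, -1                          # coordinates of x in this basis
x = a1*v1 + a2*v2

# T(x) computed two ways: directly, and from the transformed basis vectors
print(A.dot(x))
print(a1*A.dot(v1) + a2*A.dot(v2))      # identical, by linearity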
An eigenvector $\vec v$ of $A$ is a nonzero vector whose direction is unchanged by $A$; the eigenvalue $\lambda$ is the stretch factor:
$$
A \vec v = \lambda \vec v$$
$$
\begin{array}{lcl}
\lambda & = &\begin{cases}
\text{positive}\\
0\\
\text{negative}
\end{cases}\\
\lambda \vec v & : & \text{stretched vector}\\
&&\text{(same direction as } \vec v)\\
A \vec v & : &\text{linearly transformed vector}\\
&&(\text{generally rotate + stretch})
\end{array}$$
If the basis vectors are eigenvectors of $A$, so that $T(\vec v_i) = \lambda_i \vec v_i$, the same expansion becomes:
$$\begin{array}{lll}
\vec x & = a_1\vec v_1 + a_2\vec v_2 & (a_1 \;\text{and } a_2 \;\text{unique})\\ \\
T(\vec x) & = T(a_1\vec v_1 + a_2\vec v_2) \\
& = a_1T(\vec v_1) + a_2T(\vec v_2)\\
& = a_1 \lambda_1\vec {v}_1 + a_2 \lambda_2 \vec {v}_2\\
& = \lambda_1 a_1 \vec {v}_1 + \lambda_2 a_2 \vec {v}_2\\
\end{array}$$
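A numerical sketch of this eigenvector expansion, using an assumed symmetric matrix so that the eigenvectors form a convenient basis:

import numpy as np

A = np.array([[2, 1],
              [1, 2]])          # assumed matrix for illustration
D, V = np.linalg.eig(A)         # eigenvalues in D, eigenvectors in the columns of V

x = np.array([[1], [3]])
a = np.linalg.solve(V, x)       # coordinates of x in the eigenvector basis

# Ax equals the eigenvalue-weighted recombination of the eigenvectors
print(A.dot(x))
print(V.dot(D[:, None]*a))      # lambda_1 a_1 v_1 + lambda_2 a_2 v_2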
Example
$A = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$ : projection onto the $\hat x$-axis
Find eigenvalues and eigenvectors.
import numpy as np
A = np.array([[1, 0],
              [0, 0]])
D, V = np.linalg.eig(A)
print('D :', D)     # eigenvalues: 1 and 0
print('V :', V)     # eigenvectors in the columns of V
Example
$P$: projection onto a plane. Find the eigenvalues and eigenvectors.
For any $\vec x$ in the plane, $P\vec x = \vec x \Rightarrow \lambda = 1$
For any $\vec x$ perpendicular to the plane, $P\vec x = \vec 0 \Rightarrow \lambda = 0$
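A sketch with the $xy$-plane in $\mathbb{R}^3$ as the assumed plane:

import numpy as np

P = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 0]])   # projection onto the xy-plane
D, V = np.linalg.eig(P)
print('D :', D)             # eigenvalues: 1, 1, 0
print('V :', V)             # two eigenvectors span the plane, one is perpendicular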
Consider the linear system
$$\large AX = B$$
Square system (two equations, two unknowns):
$$
\begin{bmatrix}
a_{11} & a_{12}\\
a_{21} & a_{22}\\
\end{bmatrix}
\begin{bmatrix}
x_{1}\\
x_{2}
\end{bmatrix} =
\begin{bmatrix}
b_{1}\\
b_{2}\\
\end{bmatrix}
$$
Under-determined system (fewer equations than unknowns):
$$
\begin{bmatrix}
a_{11} & a_{12}\\
\end{bmatrix}
\begin{bmatrix}
x_{1}\\
x_{2}
\end{bmatrix}=b_1
$$
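An under-determined system has infinitely many solutions. One common choice is the minimum-norm solution, which the pseudo-inverse np.linalg.pinv returns; a sketch with assumed coefficients:

import numpy as np

A = np.array([[1., 2.]])        # assumed: one equation, two unknowns
b = np.array([[3.]])

x = np.linalg.pinv(A).dot(b)    # minimum-norm solution among infinitely many
print(x)                        # [[0.6], [1.2]]
print(A.dot(x))                 # reproduces b exactly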
Over-determined system (more equations than unknowns):
$$
\begin{bmatrix}
a_{11} & a_{12}\\
a_{21} & a_{22}\\
a_{31} & a_{32}\\
\end{bmatrix}
\begin{bmatrix}
x_{1}\\
x_{2}
\end{bmatrix} =
\begin{bmatrix}
b_{1}\\
b_{2}\\
b_{3}\\
\end{bmatrix}
$$
For an over-determined linear system, there is in general no exact solution:
$$
\begin{align*}
\begin{bmatrix}
a_{11} & a_{12}\\
a_{21} & a_{22}\\
a_{31} & a_{32}
\end{bmatrix}
\begin{bmatrix}
x_{1}\\
x_{2}
\end{bmatrix} &\neq
\begin{bmatrix}
b_{1}\\
b_{2}\\
b_{3}
\end{bmatrix} \quad\text{ or }\quad AX \neq B \\ \\ x_1
\begin{bmatrix}
a_{11} \\
a_{21} \\
a_{31}
\end{bmatrix} + x_2
\begin{bmatrix}
a_{12} \\
a_{22} \\
a_{32}
\end{bmatrix} &\neq
\begin{bmatrix}
b_{1}\\
b_{2}\\
b_{3}
\end{bmatrix}
\end{align*}
$$
Define the error $E = AX - B$. Find the $X^*$ that minimizes $\lVert E \rVert$ or $\lVert E \rVert^2$, i.e., solve an optimization problem (least squares).
First consider projecting a vector $X$ onto the line spanned by $Y$. Let $W = \omega \frac{Y}{\lVert Y \rVert}$ be the projection, where $\omega$ is its signed length. Orthogonality gives:
$$
\begin{align*}
Y & \perp \left( X - W \right)\\
\implies & Y^T \left( X - W \right) = Y^T \left( X - \omega \frac{Y}{\lVert Y \rVert} \right) = 0\\
\implies & \omega = \frac{Y^T X}{Y^T Y}\lVert Y \rVert\\
& W = \omega \frac{Y}{\lVert Y \rVert} = \frac{Y^TX}{Y^TY}Y = \frac{\langle X, Y \rangle}{\langle Y, Y \rangle}Y
\end{align*}
$$
import numpy as np
X = np.matrix([[1],[1]])
Y = np.matrix([[2],[0]])
print(X)
print(Y)
print(Y.T*Y)

# here omega is the coefficient Y^T X / Y^T Y (the signed length divided by ||Y||)
omega = (X.T*Y)/(Y.T*Y)
omega = float(omega)
print(omega)

W = omega*Y
print(W)        # projection of X onto Y: [[1], [0]]
Projection of $B$ onto the subspace $U = \text{span}\{A_1, A_2\}$, the column space of $A$. Orthogonality requires $A^T(B - AX^*) = 0$, so
$$X^* = (A^TA)^{-1}A^TB, \qquad B^* = AX^*$$
import numpy as np
A = np.matrix([[1,0],[0,1],[0,0]])
B = np.matrix([[1],[1],[1]])
X = (A.T*A).I*A.T*B     # least-squares solution X* = (A^T A)^{-1} A^T B
print(X)
Bstar = A*X             # projection of B onto the column space of A
print(Bstar)
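The same least-squares solution is available directly from np.linalg.lstsq, which avoids forming $A^TA$ and is numerically preferable:

import numpy as np

A = np.array([[1., 0.], [0., 1.], [0., 0.]])
B = np.array([[1.], [1.], [1.]])

X, residuals, rank, sv = np.linalg.lstsq(A, B, rcond=None)
print(X)    # same X* as the normal-equation solution above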
The projection is the closest point in $U$: $\lVert \omega - P_U\omega \rVert \leq \lVert \omega - u \rVert$ for every $u \in U$. Furthermore, if $u \in U$ and the inequality above is an equality, then $u = P_U\omega$.
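A numerical sketch of this closest-point property, reusing $A$ and $B$ from above: $B^* = AX^*$ is at least as close to $B$ as any other point $Au$ in the column space.

import numpy as np

A = np.matrix([[1, 0], [0, 1], [0, 0]])
B = np.matrix([[1], [1], [1]])
Bstar = A*(A.T*A).I*A.T*B                   # projection of B onto the column space

for _ in range(5):
    u = np.matrix(np.random.randn(2, 1))    # coordinates of an arbitrary point in U
    print(np.linalg.norm(B - Bstar) <= np.linalg.norm(B - A*u))     # always True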