Machine Learning for Mechanical Engineering

Linear Algebra

Prof. Seungchul Lee
Industrial AI Lab at KAIST
  • For your handwritten solutions, please scan or take a picture of them. Alternatively, you can write them in markdown if you prefer.
  • Only .ipynb files will be graded for your code.
    • Ensure that your NAME and student ID are included in your .ipynb files. ex) SeungchulLee_20241234_HW01.ipynb
  • Compress all the files into a single .zip file.
    • In the .zip file's name, include your NAME and student ID.
    • Submit this .zip file on KLMS.
  • Do not submit a printed version of your code, as it will not be graded.

Problem 01

Let $x$ be a block vector with two vector elements, $x = \begin{bmatrix}a\\b\end{bmatrix}$ where $a$ and $b$ are vectors of size $n$ and $m$, respectively. Show that

$$\lVert x \rVert = \left(\lVert a \rVert^2 + \lVert b \rVert^2 \right)^{\frac{1}{2}} = \left\lVert \begin{bmatrix} \lVert a \rVert \\ \lVert b \rVert \end{bmatrix} \right\rVert$$

(Note that the norm on the right-hand side is that of a 2-vector.)
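As a quick numerical sanity check (not a substitute for the proof), the identity can be verified in NumPy for arbitrarily chosen sizes, say $n = 3$ and $m = 2$:

```python
import numpy as np

# Sanity check of ||x|| = ||(||a||, ||b||)|| for sample sizes n = 3, m = 2
# (arbitrary choices; the proof itself must hold for all n, m).
rng = np.random.default_rng(0)
a = rng.standard_normal(3)
b = rng.standard_normal(2)

x = np.concatenate([a, b])        # block vector x = [a; b]
lhs = np.linalg.norm(x)           # ||x||
rhs = np.linalg.norm([np.linalg.norm(a), np.linalg.norm(b)])  # norm of the 2-vector

print(np.isclose(lhs, rhs))
```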

Problem 02

Show that the hyperplane $g(X) = \omega^TX + b = 0$ is perpendicular to $\omega$.

Problem 03

Prove that any projection matrix $P$ for an orthogonal projection in $\mathbb{R}^n$ satisfies these two properties:

(a) $P^2 = P$

(b) $P$ is symmetric

Problem 04

Let $R = R(\theta)$ be a rotation matrix with a rotational angle of $\theta$ in $\mathbb{R}^n$.

(a) Prove that $$R^TR=I \;\text{in}\; \mathbb{R}^n$$

(b) Prove that $$R^T(\theta) = R^{-1}(\theta) = R(-\theta)\;\text{in}\;\mathbb{R}^n$$

(c) Show that column vectors in $R$ are orthogonal in $\mathbb{R}^n$.
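All three properties can be checked numerically in the $\mathbb{R}^2$ case before proving them in general. The snippet below uses an arbitrary angle; it is an illustration, not the requested proof.

```python
import numpy as np

# Numerical check of (a), (b), (c) for a 2D rotation matrix R(theta).
theta = 0.7                      # arbitrary angle
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])
R_neg = np.array([[ c, s],
                  [-s, c]])      # R(-theta)

print(np.allclose(R.T @ R, np.eye(2)))       # (a) R^T R = I
print(np.allclose(R.T, np.linalg.inv(R)))    # (b) R^T = R^{-1}
print(np.allclose(R.T, R_neg))               # (b) R^T = R(-theta)
print(np.isclose(R[:, 0] @ R[:, 1], 0.0))    # (c) columns are orthogonal
```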

Problem 05

In this problem, we want to explore a reflection transformation of $X$, which produces a mirror vector $Z$ with respect to vector $V$.

(a) Is a reflection transformation linear?

(b) If yes, find matrix $M$ such that $Z = MX$

(c) For $V = \begin{bmatrix} 1 & 1 \end{bmatrix}^T$, compute $M$ and its eigenvalues/eigenvectors. (Here, vector $V$ is at a 45-degree angle to the $x$-axis.)

Problem 06

A linear map $P: X \rightarrow X$ acting on a vector space $X$ is called a projection if $P^2 = P$.

(a) Show that the matrix $P = \begin{bmatrix} 0& 1\\ 0& 1 \end{bmatrix}$ is a projection. Draw a sketch of $\mathbb{R}^2$ showing the vectors $[1, 2]^T$, $[-1, 0]^T$, and $[0, 3]^T$ and their images under the map $P$.

(b) Repeat this for $Q = I − P$.
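A quick numerical check of the defining property $P^2 = P$ (a sanity check only, not the sketch or argument the problem asks for) might look like:

```python
import numpy as np

# Verify P^2 = P for the given matrix, and likewise for Q = I - P.
P = np.array([[0, 1],
              [0, 1]])
Q = np.eye(2) - P

print(np.allclose(P @ P, P))   # P is a projection
print(np.allclose(Q @ Q, Q))   # Q = I - P is also a projection
```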

Problem 07

Consider the projection transformation of a vector $x$ onto a given vector $y$. Let $y$ be a given $n$-vector, and consider the function $f : \mathbb{R}^n \to \mathbb{R}^n$, defined as

$$f(x) = \frac{x^Ty}{\lVert y\rVert^2}y$$

We know that $f(x)$ is the projection of $x$ onto the (fixed, given) vector $y$.

Is $f$ a linear function of $x$?

  • If your answer is yes, provide an $n \times n$ matrix $A$ such that $f(x) = Ax$ for all $x$.

  • If your answer is no, show with an example that $f$ does not satisfy the definition of linearity $f(\alpha u + \beta v) = \alpha f(u) + \beta f(v)$.

Problem 08

Let $v \in \mathbb{R}^n$ be a unit vector and $Px$ the orthogonal projection of $x \in \mathbb{R}^n$ in the direction of $v$, that is, if $x = cv$ for some real constant $c$, then $Px = x$, while if $x \perp v$, then $Px = 0$. Show that $P = v v^T$
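The two defining properties can be checked numerically for a sample unit vector before deriving $P = vv^T$ in general. The vector $v$ below is an arbitrary choice for illustration.

```python
import numpy as np

# Check the two defining properties of P = v v^T for a sample unit vector.
v = np.array([3.0, 4.0]) / 5.0        # unit vector (||v|| = 1)
P = np.outer(v, v)                    # candidate projection matrix v v^T

x_parallel = 2.5 * v                  # x = c v      =>  P x should equal x
x_perp = np.array([-4.0, 3.0]) / 5.0  # x ⟂ v        =>  P x should be 0

print(np.allclose(P @ x_parallel, x_parallel))
print(np.allclose(P @ x_perp, np.zeros(2)))
```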

Problem 09

Suppose $T_1$ and $T_2$ are linear transformations from a vector space $V$ to itself. Note that $T_1$ and $T_2$ are not given in matrix form.

(a) Prove that mapping $T_1 + T_2$ is a linear transformation.

(b) Prove that mapping $T_1 \cdot T_2$ is a linear transformation.

Problem 10

Represent each of the following three functions $f: \mathbb{R}^2 \rightarrow \mathbb{R}^2$ as a matrix-vector product $f(x) = Ax$

(a) $f(x)$ is obtained by reflecting $x$ about the $x_1$ axis

(b) $f(x)$ is $x$ reflected about the $x_1$ axis, followed by a counterclockwise rotation of 30 degrees.

(c) $f(x)$ is $x$ rotated counterclockwise over 30 degrees, followed by a reflection about the $x_1$ axis

Problem 11

Let $A$ be an invertible matrix with eigenvalues $\lambda_1, \lambda_2,\cdots,\lambda_k$ and corresponding eigenvectors $v_1, v_2, \cdots ,v_k$. What can you say about the eigenvalues and eigenvectors of $A^{-1}$? Justify your response.

Problem 12

The Fibonacci sequence is defined as follows:

$$F_{k+2} = F_{k+1} + F_{k} \qquad \text{with} \;F_0 = 0, \;F_1 = 1$$

It is known that the ratio of $F_{k+1}$ to $F_{k}$ approaches the 'golden mean'. That is,

$$\lim_{k \to \infty} \frac{F_{k+1}}{F_{k}} = \frac{\left(1 + \sqrt{5}\right)}{2}$$

Show the above equation using eigenvectors and eigenvalues.

  • Hint: write the Fibonacci sequence in matrix form.
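As an empirical check of the limit (this does not replace the eigenvalue argument the problem asks for), one can iterate the matrix form of the recurrence and watch the ratio converge:

```python
import numpy as np

# [F_{k+2}, F_{k+1}]^T = A [F_{k+1}, F_k]^T with A = [[1, 1], [1, 0]].
A = np.array([[1, 1],
              [1, 0]])
state = np.array([1, 0])             # [F_1, F_0]

for _ in range(30):
    state = A @ state                # advance the sequence one step

ratio = state[0] / state[1]          # F_{k+1} / F_k
golden = (1 + np.sqrt(5)) / 2
print(np.isclose(ratio, golden))
```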

Problem 13

Let's assume a transformation $T : \mathbb{R}^2 \to \mathbb{R}^2$ projects an arbitrary vector onto a space spanned by $x = [1, 1]^T$.

Find the corresponding matrix $A$ by using

(a) Eigen decomposition (in other words, see how two eigenvectors are transformed by $T$) and

(b) Projection matrix.

Problem 14

Consider the following system of equations:

$$ \begin{align*} 83x + 12y + 10z &= 141 \\ -68x + 84y + 95z &= 145 \\ 32x + 23y + 43z &= 89 \end{align*} $$

(a) Write the equations in a matrix form. ($AX = B$)

In [ ]:
# Write your code here
# A =
# B =

(b) Does this system of equations have a solution?

  • If yes, find the solution of the system of equations.
  • If no, find the solution that minimizes $\lVert AX - B \rVert$.

In [ ]:
# Write your code here
# X_opt =

Problem 15

Solve the following optimization problem:

$$\begin{align*} \min \quad &\lVert x \rVert \\ \text{subject to}\quad &2x_1 - x_2 + x_3 - x_4 = 3\\ &\qquad \,\, x_2 - x_3 - x_4 = 1 \end{align*}$$