# Vector Space

A non-empty set of vectors $V$ is called a `vector space` (or linear space) if:

- For all $\boldsymbol{x}, \boldsymbol{y} \in V$, $\boldsymbol{x} + \boldsymbol{y} \in V$, i.e. $V$ is closed under addition.
- For all $\boldsymbol{x} \in V$, $k \in \mathbb{R}$, $k\boldsymbol{x} \in V$, i.e. $V$ is closed under scalar multiplication.

The second condition implies that the zero vector is always included in any vector space $V$ (take $k = 0$). Alternatively, we may say that $V$ is closed under linear combination, i.e.

$$ k_1 \boldsymbol{x}_1 + \cdots + k_n \boldsymbol{x}_n \in V $$

for all $\boldsymbol{x}_1, \cdots, \boldsymbol{x}_n \in V$ and $k_1, \cdots, k_n \in \mathbb{R}$.

For example, below are some examples and non-examples of vector spaces:

- $V = \{(x, y)^\prime: x \in \mathbb{R}, y \in \mathbb{R}\}$ is a vector space.
- $V = \{(x, y)^\prime: x > y\}$ is not a vector space; it is not closed under scalar multiplication (multiplying by $-1$ reverses the inequality).
- $V = \{(x, y, 0)^\prime: x \in \mathbb{R}, y \in \mathbb{R}\}$ is a vector space.
- $V = \{(x, y, z)^\prime: 2x = y, x \in \mathbb{R}, z \in \mathbb{R}\}$ is a vector space.
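These closure checks can be sketched numerically. Below is a small NumPy sketch (the helper name `in_set_xy_gt` is mine, not from the text) showing why $\{(x, y)^\prime: x > y\}$ fails closure under scalar multiplication:

```python
import numpy as np

# A set is NOT a vector space if we can exhibit one vector whose
# scalar multiple (or one sum) leaves the set.

def in_set_xy_gt(v):
    # membership test for V = {(x, y)': x > y}
    return v[0] > v[1]

u = np.array([2.0, 1.0])      # 2 > 1, so u is in the set
print(in_set_xy_gt(u))        # True
print(in_set_xy_gt(-1 * u))   # False: (-2, -1) has -2 < -1, so the set
                              # is not closed under scalar multiplication
```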

## Subspace

A set $W$ is a `subspace` of a vector space $V$ if $W \subset V$ and $W$ itself is a vector space.

For example, $V = \mathbb{R}^2 = \{(x, y)^\prime \mid x \in \mathbb{R}, y \in \mathbb{R}\}$ is a vector space. Let

$$ W_1 = \{ (x, y)^\prime: y-2x = 0, x, y \in \mathbb{R} \} $$

We first check that $W_1$ is a subset of $V$: any element $\boldsymbol{u} = (u_1, u_2)^\prime \in W_1$ (which satisfies $u_2 = 2u_1$) is a $2 \times 1$ real vector, so $\boldsymbol{u} \in V = \mathbb{R}^2$.

Then we need to show that $W_1$ itself is a vector space by showing that $W_1$ is closed under linear combination. For all $\boldsymbol{u}, \boldsymbol{v} \in W_1$, $\boldsymbol{u} = (u_1, u_2)^\prime$ such that $u_2 = 2u_1$, $\boldsymbol{v} = (v_1, v_2)^\prime$ such that $v_2 = 2v_1$, consider

$$ a \boldsymbol{u} + b \boldsymbol{v} = (au_1 + bv_1, au_2 + bv_2)^\prime \quad \forall a, b \in \mathbb{R}. $$

We need to show that $2(au_1 + bv_1) = au_2 + bv_2$, which follows from $a(2u_1) + b(2v_1) = au_2 + bv_2$. Therefore $W_1$ is a subspace.

Let

$$ W_2 = \{ (x, y)^\prime: y - 2x + 1 = 0 \} $$

Following the same procedure, we can show that $W_2$ is not closed under addition and therefore is not a subspace. For instance, $(0, -1)^\prime$ and $(1, 1)^\prime$ are both in $W_2$, but their sum $(1, 0)^\prime$ is not. Equivalently, note that $\boldsymbol{0} \notin W_2$.
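The two membership arguments above can be checked numerically; a minimal NumPy sketch (the helper names `in_W1` and `in_W2` are mine, not from the text):

```python
import numpy as np

def in_W1(v):
    # W1 = {(x, y)': y - 2x = 0}
    return np.isclose(v[1] - 2 * v[0], 0)

def in_W2(v):
    # W2 = {(x, y)': y - 2x + 1 = 0}
    return np.isclose(v[1] - 2 * v[0] + 1, 0)

u = np.array([1.0, 2.0])      # in W1
v = np.array([3.0, 6.0])      # in W1
print(in_W1(2 * u + 5 * v))   # True: W1 is closed under linear combination

p = np.array([0.0, -1.0])     # in W2, since -1 - 0 + 1 = 0
q = np.array([1.0, 1.0])      # in W2, since 1 - 2 + 1 = 0
print(in_W2(p + q))           # False: (1, 0)' gives 0 - 2 + 1 = -1 != 0
```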

## Span

Let $V$ be an ambient vector space ($\mathbb{R}^2, \mathbb{R}^3, \cdots$). Let $S = \{\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m\}$[^1] where $\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m \in V$. The `(linear) span` of $\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m$, or of $S$, is the set that contains all linear combinations of $\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m$. The span is denoted:

$$
\begin{aligned}
W &= \mathcal{L}(\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m) = \operatorname{span}(\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m) \\
&= \left\{ \boldsymbol{u} \,\middle|\, \boldsymbol{u} = \sum_{i=1}^m k_i \boldsymbol{u}_i,\ k_i \in \mathbb{R} \right\}
\end{aligned}
$$

We say $W$ is spanned or generated by $S$, or by $\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m$. For example,

$$ \boldsymbol{u}_1 = (1, 0, 0)^\prime, \quad \boldsymbol{u}_2 = (0, 1, 1)^\prime $$

$W = \mathcal{L}(\boldsymbol{u}_1, \boldsymbol{u}_2)$ contains all possible linear combinations of $\boldsymbol{u}_1$ and $\boldsymbol{u}_2$, such as $(0, 0, 0)^\prime$, $(1, 2, 2)^\prime$, etc.
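Whether a given vector lies in $\mathcal{L}(\boldsymbol{u}_1, \boldsymbol{u}_2)$ can be tested numerically by solving a least-squares problem and checking the residual; a sketch using NumPy (the `in_span` helper is my own naming, not from the text):

```python
import numpy as np

# Test whether w lies in span(u1, u2): solve A k = w in the
# least-squares sense and check the fit is exact.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([u1, u2])   # 3x2 matrix with u1, u2 as columns

def in_span(w):
    k, *_ = np.linalg.lstsq(A, w, rcond=None)
    return np.allclose(A @ k, w)

print(in_span(np.array([1.0, 2.0, 2.0])))   # True: 1*u1 + 2*u2
print(in_span(np.array([0.0, 1.0, 0.0])))   # False: any combination has
                                            # equal second and third entries
```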

**A span is a subspace.** Let $W = \mathcal{L}(\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m)$ where $\boldsymbol{u}_i \in V$. For any $\boldsymbol{w} \in W$, $\boldsymbol{w} =\sum_{i=1}^m c_i \boldsymbol{u}_i \in V$ because $V$ is a vector space. Also for any $\boldsymbol{w}, \boldsymbol{v} \in W$,

$$ a\boldsymbol{w} + b\boldsymbol{v} = a\sum_{i=1}^m c_i \boldsymbol{u}_i + b\sum_{i=1}^m d_i \boldsymbol{u}_i = \sum_{i=1}^m (ac_i + bd_i)\boldsymbol{u}_i, $$

which is a linear combination of the $\boldsymbol{u}_i$’s, hence $a\boldsymbol{w} + b\boldsymbol{v} \in W$. So $W$ is closed under linear combination and is a subspace of $V$.

### Spanning set

Let $V$ be a vector space (an ambient space or a subspace of one). Suppose there exists a set of vectors $\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m$ in $V$ such that any $\boldsymbol{u} \in V$ can be expressed as a linear combination of $\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m$. Then we say $\{ \boldsymbol{u}_1, \cdots, \boldsymbol{u}_m \}$ is a `spanning set` of $V$.

Suppose $S = \{\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m\}$ is a spanning set of $V$; then $\mathcal{L}(\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m) = V$. The LHS is the set of all possible linear combinations of the $\boldsymbol{u}_i$’s. As the $\boldsymbol{u}_i$’s are in $V$ and $V$ is closed under linear combination, LHS $\subset$ RHS. Conversely, for any $\boldsymbol{v} \in V$, as $S$ is a spanning set, $\boldsymbol{v}$ can be expressed as a linear combination of the $\boldsymbol{u}_i$’s. Since the LHS contains all such linear combinations, $\boldsymbol{v} \in$ LHS, thus RHS $\subset$ LHS.

A finite-dimensional vector space can be generated by a *finite* number of vectors. For example, $\mathbb{R}^3$ has the spanning set $\{(1, 0, 0)^\prime, (0, 1, 0)^\prime, (0, 0, 1)^\prime\}$. It’s important to know that a spanning set is not unique: $\{(1, 1, 1)^\prime, (1, 1, 0)^\prime, (0, 1, 1)^\prime, (1, 0, 0)^\prime\}$ is also a valid spanning set for $\mathbb{R}^3$. In fact, there is an infinite number of spanning sets for any non-null vector space.
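One way to verify that a set spans $\mathbb{R}^3$ is to check that the matrix with those vectors as columns has rank $3$; a NumPy sketch of both spanning sets above:

```python
import numpy as np

# A set of vectors spans R^n iff the matrix with those vectors
# as columns has rank n.
S1 = np.column_stack([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
S2 = np.column_stack([[1, 1, 1], [1, 1, 0], [0, 1, 1], [1, 0, 0]])

print(np.linalg.matrix_rank(S1))   # 3 -> spans R^3
print(np.linalg.matrix_rank(S2))   # 3 -> also spans R^3, with 4 vectors
```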

## Basis

A `basis` is nothing but a linearly independent spanning set: $\{\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m\}$ is a basis of $V$ if $\{\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m\}$ is linearly independent and $\mathcal{L}(\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m) = V$, i.e. it’s a spanning set of $V$.

In general, a vector space $V$ has infinitely many bases, but all bases have the same number of vectors, called the `dimension` of $V$. Equivalently, the dimension of $V$ is the maximum number of linearly independent vectors in $V$.

For example, $\mathbb{R}^3$ has a basis $\{(1, 1, 1)^\prime, (1, 1, 0)^\prime, (0, 1, 1)^\prime\}$. We can easily show that this set of vectors is linearly independent (LIN). We can also show that any $(x, y, z)^\prime \in \mathbb{R}^3$ can be expressed as a linear combination of the three vectors:

$$
\begin{aligned}
(x, y, z)^\prime &= a_1 \boldsymbol{u}_1 + a_2 \boldsymbol{u}_2 + a_3 \boldsymbol{u}_3 \\
&= (a_1 + a_2, a_1 + a_2 + a_3, a_1 + a_3)^\prime \\
&\Rightarrow a_3 = y - x, \quad a_1 = z - y + x, \quad a_2 = y - z
\end{aligned}
$$
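The same coefficients can be obtained by solving the linear system $B\boldsymbol{a} = (x, y, z)^\prime$, where $B$ has the basis vectors as columns; a NumPy sketch with an arbitrary test vector:

```python
import numpy as np

# Coordinates of (x, y, z)' in the basis {u1, u2, u3}: solve B a = v.
B = np.column_stack([[1, 1, 1], [1, 1, 0], [0, 1, 1]])  # u1, u2, u3 as columns
v = np.array([2.0, 5.0, 4.0])                           # an arbitrary (x, y, z)'

a = np.linalg.solve(B, v)
print(a)        # [1. 1. 3.], matching a1 = z-y+x, a2 = y-z, a3 = y-x
print(B @ a)    # recovers v
```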

Given a vector space $W$, e.g. $W = \{(x, y, z)^\prime \mid y - 2x = 0\}$, to find a basis of $W$ we first find two LIN vectors in $W$. Then, show that

$$ (x, 2x, z)^\prime = x(1, 2, 0)^\prime + z(0, 0, 1)^\prime $$

to prove $\{(1, 2, 0)^\prime, (0, 0, 1)^\prime\}$ is a basis of $W$.

**Theorem:** Every vector in $V$ has a unique representation in terms of a given basis.

Suppose $\{\boldsymbol{u}_1, \cdots, \boldsymbol{u}_n\}$ is a basis of $V$, and suppose there exist two sets of coefficients $\{a_1, \cdots, a_n\}$ and $\{b_1, \cdots, b_n\}$ such that $\boldsymbol{x} = \sum a_i\boldsymbol{u}_i$ and $\boldsymbol{x} = \sum b_i \boldsymbol{u}_i$ where $a_i \neq b_i$ for some $i$. Subtracting one from the other,

$$ \boldsymbol{0} = \sum_{i=1}^n (a_i - b_i) \boldsymbol{u}_i = \sum_{i=1}^n c_i \boldsymbol{u}_i $$

As the $\boldsymbol{u}_i$’s are linearly independent, $c_i = 0$ for all $i$, i.e. $a_i = b_i$ for all $i$, contradicting the assumption. Hence the representation is unique.

## Dimension

The dimension of a vector space is not the same as the number of entries in its vectors. For $V = \{\boldsymbol{0}\}$, the dimension of $V$ is $dim(V) = 0$. Otherwise, the `dimension` of $V$ is the number of vectors in any basis of $V$.

For $V = \mathcal{L}(\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m)$, $dim(V) \leq m$, with equality if and only if $\boldsymbol{u}_1, \cdots, \boldsymbol{u}_m$ are linearly independent.

### Examples

The dimension of $\mathbb{R}^n$ is $n$. A possible basis is

$$ \{(1, 0, \cdots, 0)^\prime, (0, 1, 0, \cdots, 0)^\prime, \cdots, (0, 0, \cdots, 0, 1)^\prime\} $$

For $V = \mathcal{L}(\boldsymbol{u}_1, \boldsymbol{u}_2, \boldsymbol{u}_3)$ where

$$ \boldsymbol{u}_1 = (1, 1, 1)^\prime, \boldsymbol{u}_2 = (1, 0, -1)^\prime, \boldsymbol{u}_3 = (3, 2, 1)^\prime $$

We construct linear combinations such that $\sum \alpha_i \boldsymbol{u}_i = \boldsymbol{0}$:

$$
\begin{cases}
\alpha_1 + \alpha_2 + 3\alpha_3 = 0 \\
\alpha_1 + 2 \alpha_3 = 0 \\
\alpha_1 - \alpha_2 + \alpha_3 = 0
\end{cases} \Rightarrow (\alpha_1, \alpha_2, \alpha_3) = (2\alpha_2, \alpha_2, -\alpha_2)
$$

There are infinitely many nonzero solutions, such as $(2, 1, -1)$, $(4, 2, -2)$, etc., so $\{\boldsymbol{u}_1, \boldsymbol{u}_2, \boldsymbol{u}_3\}$ is linearly dependent, and $dim(V) < 3$.

We can see that $\{\boldsymbol{u}_1, \boldsymbol{u}_2\}$ is linearly independent, and $\boldsymbol{u}_3 = 2\boldsymbol{u}_1 + \boldsymbol{u}_2$, so $\{\boldsymbol{u}_1, \boldsymbol{u}_2\}$ spans $V$ and forms a basis: $dim(V) = 2$.
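This matches a rank computation: the dimension of a span equals the rank of the matrix whose columns are the spanning vectors. A NumPy sketch:

```python
import numpy as np

u1 = np.array([1, 1, 1])
u2 = np.array([1, 0, -1])
u3 = np.array([3, 2, 1])
A = np.column_stack([u1, u2, u3])

# dim of the span = rank of the matrix of spanning vectors
print(np.linalg.matrix_rank(A))       # 2
# u3 is indeed a combination of u1 and u2: u3 = 2*u1 + u2
print(np.allclose(u3, 2 * u1 + u2))   # True
```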

The third example is $W = \{(x, y, z)^\prime \mid 2x - y = 0 \}$. We have a linear restriction on $x$ and $y$, so the dimension of $W$ is $3-1=2$. Two possible methods are

- find two LIN vectors in $W$ that form a basis, such as $\{(0, 0, 1)^\prime, (1, 2, 0)^\prime\}$. We need to show that these are LIN, and they span $W$.
- $(x, y, z)^\prime = (x, 2x, z)^\prime = x(1, 2, 0)^\prime + z(0, 0, 1)^\prime$.

The second method is generalizable. Let $W = \{(x, y, z, w)^\prime \mid x - y + w = 0 \}$.

$$ (y-w, y, z, w)^\prime = y(1, 1, 0, 0)^\prime + z(0, 0, 1, 0)^\prime + w(-1, 0, 0, 1)^\prime $$

so $dim(W) = 3$.

If we add another constraint of $z = 2w$, then

$$ (y-w, y, 2w, w)^\prime = y(1, 1, 0, 0)^\prime + w(-1, 0, 2, 1)^\prime $$

and the dimension drops to $2$.

In general, the dimension is the dimension of the ambient space, here $dim(\mathbb{R}^4) = 4$, minus the number of LIN restrictions.
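This rule is rank-nullity in disguise: each subspace here is the null space of a constraint matrix $C$, so its dimension is $n$ minus the rank of $C$. A NumPy sketch for the two subspaces above:

```python
import numpy as np

# Each constraint is a row of C; the subspace is the null space of C,
# so its dimension is n - rank(C) (rank-nullity).
C1 = np.array([[1, -1, 0, 1]])       # x - y + w = 0 in R^4
C2 = np.array([[1, -1, 0, 1],
               [0, 0, 1, -2]])       # adds z - 2w = 0

print(4 - np.linalg.matrix_rank(C1))   # 3
print(4 - np.linalg.matrix_rank(C2))   # 2
```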

### Facts

- Every non-null vector space $V$ of finite dimension has a basis.
- Any two bases of $V$ must have the same number of vectors.
- If $S = \{\boldsymbol{u}_1, \cdots, \boldsymbol{u}_n\}$ is linearly independent, $\boldsymbol{u}_i \in V$, and the dimension of $V$ is $n$, then $S$ is a basis.
- If $S = \{\boldsymbol{u}_1, \cdots, \boldsymbol{u}_n\}$ spans $V$ and $dim(V) = n$, then $S$ is linearly independent and a basis of $V$.
- If $W$ is a subspace of $V$ and $dim(W) = dim(V)$, then $W = V$.
- What if $W_1$ and $W_2$ are both subspaces of $V$ and $dim(W_1) = dim(W_2)$? Is $W_1 = W_2$? The answer is **no**. For example, two distinct lines through the origin in $\mathbb{R}^2$ are both subspaces of dimension $1$.
- If $dim(V) = n$ and $\{\boldsymbol{u}_1, \cdots, \boldsymbol{u}_r\} \subset V$ is linearly independent, then we can find a basis of $V$ that contains this set as a subset.

[^1]: The difference between $V$ and $S$ is that $S$ is a finite set.
