#### Span
$\text{Span}(f_{1},\dots, f_{n})$ is the set of all functions $c_{1}f_{1}+\dots +c_{n}f_{n}$, where $c_{1},\dots,c_{n}$ are real numbers.
#### Vector Space
A set $S$ of vectors is a real vector space if all of the following are true:
1. There is a zero vector in $S$
There is a vector $\mathbf{0}$ in $S$ such that $\mathbf{v} + \mathbf{0} = \mathbf{v}$ for every vector $\mathbf{v}$ in $S$.
2. Closed under scalar multiplication
Multiplying any one vector in $S$ by a real number gives another vector in $S$
3. Closed under vector addition
Adding any two vectors in $S$ gives another vector in $S$
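The three conditions above can be spot-checked numerically. A minimal sketch, using a hypothetical example $S = \{x \in \mathbb{R}^{3} : x_{1}+x_{2}+x_{3}=0\}$ (a plane through the origin), which is a vector space:

```python
import numpy as np

# Hypothetical example: S = { x in R^3 : x1 + x2 + x3 = 0 }.
a = np.array([1.0, 1.0, 1.0])

def in_S(x):
    return bool(np.isclose(a @ x, 0.0))

# 1. The zero vector is in S.
assert in_S(np.zeros(3))

rng = np.random.default_rng(0)
v = rng.normal(size=3); v -= a * (a @ v) / (a @ a)  # project a random vector into S
w = rng.normal(size=3); w -= a * (a @ w) / (a @ a)

# 2. Closed under scalar multiplication.
assert in_S(2.5 * v)
# 3. Closed under vector addition.
assert in_S(v + w)
```

Random samples can only suggest, not prove, closure; here the defining equation is linear, which is why the checks pass for every choice.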
#### Linear Independence
Vectors are linearly **dependent** (think redundant) if at least one of them is a linear combination of the others; otherwise they are linearly **independent**.
#### Dimension
The dimension of a vector space is the number of vectors in any basis.
The dimension of the space of solutions to an $n^\text{th}$ order homogeneous ODE with constant coefficients is $n$.
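As a concrete check of the ODE claim (my own example, not from the notes): $y'' + y = 0$ is second order, so its solution space should be 2-dimensional, with basis $\{\cos t, \sin t\}$. The sketch below verifies both are solutions via a finite-difference second derivative and checks independence through the Wronskian $W = fg' - f'g$ at $t=0$:

```python
import numpy as np

h = 1e-4
def second_deriv(f, t):
    # Central finite-difference approximation of f''(t).
    return (f(t + h) - 2 * f(t) + f(t - h)) / h**2

t = 0.7
for f in (np.cos, np.sin):
    assert abs(second_deriv(f, t) + f(t)) < 1e-4  # y'' + y ~ 0

# Wronskian of (cos, sin) at t = 0: cos*cos - (-sin)*sin = 1 != 0,
# so the two solutions are linearly independent and span the space.
W = np.cos(0) * np.cos(0) - (-np.sin(0)) * np.sin(0)
assert np.isclose(W, 1.0)
```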
#### Basis
A basis of a vector space $S$ is a list of vectors $v_1, v_2, …, v_n$, such that
1. $\text{Span}(v_1, v_2, …, v_n) = S$
2. The vectors $v_1, v_2, …, v_n$ are linearly independent.
A set of vectors $\{ v_{1}, v_{2}, \dots , v_{k}\}$ **generate** subspace $S$ if each $v_{i}$ is in $S$ and $\text{span}(\{ v_{1}, v_{2}, \dots , v_{k}\}) = S$.
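Both basis conditions can be tested at once with matrix rank: stacking the vectors as columns, rank equal to the number of vectors means independence, and rank equal to the ambient dimension means the span is all of $\mathbb{R}^{n}$. A sketch with example vectors of my own choosing:

```python
import numpy as np

# Columns of V are the candidate basis vectors v1, v2, v3 in R^3.
V = np.column_stack([np.array([1.0, 0.0, 1.0]),
                     np.array([0.0, 1.0, 1.0]),
                     np.array([0.0, 0.0, 1.0])])

rank = np.linalg.matrix_rank(V)
assert rank == V.shape[1]  # independent: no column is redundant
assert rank == V.shape[0]  # spanning: the columns generate R^3
# Both hold, so the columns form a basis of R^3.
```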
#### Vector Operations
Inner (dot) product: $\langle a,b \rangle = a\cdot b= a^{\mathrm{T}}b$
Outer product: $ab^{\mathrm{T}}$
Length:
$|v| = \sqrt{ v_{1}^{2}+v_{2}^{2}+\dots+v_{d}^{2} }$
Angle between two vectors:
$\cos\theta= \frac{\langle v,w \rangle}{ \lvert v \rvert ~\lvert w \rvert}$
Cross product:
$\vec{a} \times \vec{b} = \begin{vmatrix}\hat{i} & \hat{j} & \hat{k} \\ a_{1} & a_{2} & a_{3} \\ b_{1} & b_{2} & b_{3}\end{vmatrix} =\begin{bmatrix}0 & -a_{3} & a_{2} \\ a_{3} & 0 & -a_{1} \\ -a_{2} & a_{1} & 0 \end{bmatrix}\vec{b}$
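A quick NumPy sketch of the operations above (example vectors are mine), including the identity that the cross product equals the skew-symmetric matrix $[a]_{\times}$ applied to $\vec{b}$:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

assert np.isclose(a @ b, np.dot(a, b))                       # inner product a^T b
outer = np.outer(a, b)                                       # outer product a b^T (3x3)
assert outer.shape == (3, 3)
assert np.isclose(np.linalg.norm(a), np.sqrt(np.sum(a**2)))  # length
cos_theta = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Cross product as the skew-symmetric matrix [a]_x times b.
A = np.array([[0.0, -a[2], a[1]],
              [a[2], 0.0, -a[0]],
              [-a[1], a[0], 0.0]])
assert np.allclose(A @ b, np.cross(a, b))
```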
#### Inner Product
Vectors $u$ and $v$ are orthogonal if their inner product is zero:
$u \cdot v = \sum_{i=1}^{n}u_{i}v_{i}$
The inner product is symmetric and linear.
Any set of non-zero, pairwise orthogonal vectors $\{ v_{1}, v_{2}, \dots , v_{k}\}$ is linearly independent.
*Proof:* Suppose $\sum_{i=1}^{k}\alpha_{i}v_{i}=0$. Taking the inner product with $v_{j}$ gives $v_{j}\cdot \sum_{i=1}^{k}\alpha_{i}v_{i}=\alpha_{j} \lvert v_{j} \rvert^{2}=0$; since $v_{j}\neq 0$, this forces $\alpha_{j}=0$ for each $j$.
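A numeric illustration of this fact with three pairwise orthogonal non-zero vectors of my own choosing:

```python
import numpy as np

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])

# Pairwise orthogonal...
for x, y in [(v1, v2), (v1, v3), (v2, v3)]:
    assert np.isclose(x @ y, 0.0)

# ...hence linearly independent: full column rank.
V = np.column_stack([v1, v2, v3])
assert np.linalg.matrix_rank(V) == 3
```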
#### Generalized Pythagorean Theorem
Let $\{ v_{1}, v_{2}, \dots , v_{k}\}$ be pairwise orthogonal. Then
$\lvert v_{1} + \dots + v_{k} \rvert^{2} = \lvert v_{1} \rvert^{2} + \dots + \lvert v_{k} \rvert^{2}$
*Proof:* expand the left side as an inner product: $\lvert \sum_{i} v_{i} \rvert^{2} = \sum_{i,j} v_{i}\cdot v_{j}$, and the cross terms ($i \neq j$) vanish by orthogonality, leaving $\sum_{i} \lvert v_{i} \rvert^{2}$.
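A quick numeric spot-check of the identity on pairwise orthogonal vectors (example vectors are mine):

```python
import numpy as np

v1 = np.array([3.0, 4.0, 0.0])
v2 = np.array([-4.0, 3.0, 0.0])
v3 = np.array([0.0, 0.0, 12.0])  # pairwise orthogonal by construction

s = v1 + v2 + v3
lhs = np.linalg.norm(s)**2
rhs = sum(np.linalg.norm(v)**2 for v in (v1, v2, v3))
assert np.isclose(lhs, rhs)  # 25 + 25 + 144 = 194 on both sides
```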
#### Orthonormal Bases
A set of vectors $\{ v_{1}, v_{2}, \dots , v_{k}\}$ is an orthonormal basis for the subspace $S$ if they generate $S$, all have length one, and are pairwise orthogonal.
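The standard way to produce an orthonormal basis from an independent list is Gram–Schmidt: subtract from each vector its projections onto the basis built so far, then normalize. A minimal sketch (my own implementation, not from the notes):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormal basis for span(vectors)."""
    basis = []
    for v in vectors:
        w = v - sum((v @ q) * q for q in basis)  # remove components along earlier q's
        basis.append(w / np.linalg.norm(w))      # normalize to length one
    return basis

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])

for i, qi in enumerate(Q):
    assert np.isclose(qi @ qi, 1.0)      # unit length
    for qj in Q[i + 1:]:
        assert np.isclose(qi @ qj, 0.0)  # pairwise orthogonal
```

This assumes the input list is linearly independent; a dependent vector would give $w = 0$ and the normalization would fail.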
#### Orthogonal Decomposition
Given a subspace $U$, every vector $w$ can be written uniquely as
$w = u + v$, where $\begin{cases}u \in U \\ v \in U^{\perp}\end{cases}$ and $u \cdot v=0$
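One way to compute the decomposition numerically is least squares: the projection $u$ of $w$ onto $U = C(B)$ is $B\hat{x}$ where $\hat{x}$ minimizes $\lvert Bx - w\rvert$, and $v = w - u$ lands in $U^{\perp}$. A sketch with example data of my own:

```python
import numpy as np

B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])        # columns span the subspace U
w = np.array([1.0, 2.0, 3.0])

coeffs, *_ = np.linalg.lstsq(B, w, rcond=None)
u = B @ coeffs                    # component in U
v = w - u                         # component in U-perp

assert np.allclose(u + v, w)
assert np.allclose(B.T @ v, 0.0)  # v is orthogonal to every column of B
assert np.isclose(u @ v, 0.0)
```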
#### Columnspace
The columnspace $C(A)$ of an $m \times n$ matrix $A$ is the span of its $n$ columns.
$Ax=b$ has a solution if and only if $b \in C(A)$
If $A$ is an $n \times n$ matrix, $A$ is invertible if and only if $C(A)=\mathbb{R}^{n}$
*Proof:*
- If $A$ is invertible, then for every $b\in \mathbb{R}^{n}$, $Ax=b$ has the solution $x=A^{-1}b$, so $C(A)=\mathbb{R}^{n}$
- If $C(A)=\mathbb{R}^{n}$, then for each standard basis vector $e_{1},e_{2}, \dots, e_{n}$ there is a solution $x_{i}$ with $Ax_{i}=e_{i}$. Stacking these, $A\begin{bmatrix}x_{1} & x_{2} & \dots & x_{n}\end{bmatrix}= I$, which gives us $\begin{bmatrix}x_{1} & x_{2} & \dots & x_{n}\end{bmatrix} = A^{-1}$
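The solvability criterion $b \in C(A)$ can be tested with rank: appending $b$ as an extra column leaves the rank unchanged exactly when $b$ is already a combination of the columns. A sketch (example matrices are mine):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 1.0]])

b_in  = A @ np.array([1.0, 3.0])   # by construction, a combination of the columns
b_out = np.array([0.0, 1.0, 0.0])  # not in the span of the columns

def solvable(A, b):
    # Ax = b has a solution  <=>  rank([A | b]) == rank(A)
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

assert solvable(A, b_in)
assert not solvable(A, b_out)
```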
#### Nullspace
The nullspace $N(A)$ is the set of all $x$ such that $Ax = 0$
If $A$ is $n \times n$, then $C(A) = \mathbb{R}^{n}$ if and only if $N(A) = \{0\}$
*Proof:*
- If $N(A)\neq \{0\}$, then the row echelon form of $A$ has a row of all zeros, making $A$ singular, so $C(A) \neq \mathbb{R}^{n}$
- If $N(A)=\{0\}$, the row echelon form of $A$ has $n$ nonzero pivots, making $A$ invertible, so $C(A)=\mathbb{R}^{n}$
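By rank–nullity, $\dim N(A) = n - \operatorname{rank}(A)$, which gives a quick numeric test of the equivalence above. A sketch with two small example matrices of my own:

```python
import numpy as np

def nullspace_dim(A):
    # Rank-nullity: dim N(A) = (number of columns) - rank(A)
    return A.shape[1] - np.linalg.matrix_rank(A)

A_inv  = np.array([[2.0, 1.0],
                   [1.0, 1.0]])   # invertible: det = 1
A_sing = np.array([[1.0, 2.0],
                   [2.0, 4.0]])   # singular: second row is 2x the first

assert nullspace_dim(A_inv) == 0   # N(A) = {0}  <=>  C(A) = R^2
assert nullspace_dim(A_sing) == 1  # nontrivial nullspace, so C(A) != R^2
```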