Let $V$ be a vector space with basis $B$ . The dimension of $V$ , denoted $\mathrm{dim}(V)$ , is the cardinality of $B$ .
Every vector space has a basis.
Every basis for a vector space has the same cardinality.
$\implies \mathrm{dim}(V)$ is well-defined.
If $\mathrm{dim}(V)<\infty$ , we say $V$ is finite dimensional .
vector space | field of scalars | dimension |
---|---|---|
$\mathbb{R}^{N}$ | $\mathbb{R}$ | $N$ |
$\mathbb{C}^{N}$ | $\mathbb{C}$ | $N$ |
$\mathbb{C}^{N}$ | $\mathbb{R}$ | $2N$ |
Every subspace is a vector space, and therefore has its own dimension.
Suppose $S=\{{u}_{1}, \dots , {u}_{k}\}\subseteq V$ is a linearly independent set. Then $$\mathrm{dim}(<S>)=k$$
Let $V$ be a vector space, and let $S\subseteq V$ and $T\subseteq V$ be subspaces.
We say $V$ is the direct sum of $S$ and $T$ , written $V=S\oplus T$ , if and only if for every $v\in V$ , there exist unique $s\in S$ and $t\in T$ such that $v=s+t$ .
If $V=S\oplus T$ , then $T$ is called a complement of $S$ .
$$V={C}^{0}=\{f:\mathbb{R}\to \mathbb{R} \mid f \text{ is continuous}\}$$ $$S=\text{even functions in } {C}^{0}$$ $$T=\text{odd functions in } {C}^{0}$$ $$f(t)=\frac{1}{2}(f(t)+f(-t))+\frac{1}{2}(f(t)-f(-t))$$ If $f=g+h={g}'+{h}'$ , with $g\in S$ , ${g}'\in S$ , $h\in T$ , ${h}'\in T$ , then $g-{g}'={h}'-h$ is both even and odd, hence identically zero, which implies $g={g}'$ and $h={h}'$ .
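The even/odd decomposition above is easy to check numerically. A minimal sketch (function names are hypothetical; numpy assumed available):

```python
import numpy as np

# Any f: R -> R splits uniquely into even + odd parts:
#   f(t) = [f(t) + f(-t)]/2  +  [f(t) - f(-t)]/2
def even_odd_parts(f):
    g = lambda t: 0.5 * (f(t) + f(-t))   # even part (in S)
    h = lambda t: 0.5 * (f(t) - f(-t))   # odd part (in T)
    return g, h

f = lambda t: np.exp(t)                  # e^t = cosh(t) + sinh(t)
g, h = even_odd_parts(f)

t = np.linspace(-2.0, 2.0, 9)
assert np.allclose(g(t), np.cosh(t))     # even part of e^t is cosh
assert np.allclose(h(t), np.sinh(t))     # odd part of e^t is sinh
assert np.allclose(g(t) + h(t), f(t))    # the parts sum back to f
```
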
Invoke a basis.
Let $V$ be a vector space over $F$ . A norm is a mapping $\|\cdot \| : V\to \mathbb{R}$ such that for all $u\in V$ , $v\in V$ , and $\alpha \in F$ :
1. $\|u\|\ge 0$ , with $\|u\|=0$ if and only if $u=0$
2. $\|\alpha u\|=\left|\alpha \right|\|u\|$
3. $\|u+v\|\le \|u\|+\|v\|$ (triangle inequality)
Euclidean norms:
$x\in \mathbb{R}^{N}$ : $$\|x\|=\left(\sum_{i=1}^{N} {x}_{i}^{2}\right)^{\frac{1}{2}}$$ $x\in \mathbb{C}^{N}$ : $$\|x\|=\left(\sum_{i=1}^{N} \left|{x}_{i}\right|^{2}\right)^{\frac{1}{2}}$$
Every norm induces a metric on $V$ $$d(u, v)\equiv \|u-v\|$$ which leads to a notion of "distance" between vectors.
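As a quick numerical illustration of the induced metric (a sketch, assuming numpy):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([1.0, -1.0, 2.0])

# Euclidean norm ||x|| = (sum |x_i|^2)^(1/2)
norm = lambda x: np.sqrt(np.sum(np.abs(x) ** 2))

# Induced metric: d(u, v) = ||u - v||
d = norm(u - v)

assert np.isclose(norm(u), 3.0)   # sqrt(1 + 4 + 4) = 3
assert np.isclose(d, 3.0)         # u - v = (0, 3, 0), so d = 3
```
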
Let $V$ be a vector space over $F$ , $F=\mathbb{R}$ or $\mathbb{C}$ . An inner product is a mapping $V\times V\to F$ , denoted $u\cdot v$ , such that for all $u, v, w\in V$ and $a\in F$ :
1. $u\cdot v=\overline{v\cdot u}$ (conjugate symmetry; ordinary symmetry in the real case)
2. $u\cdot (av+w)=a(u\cdot v)+u\cdot w$ (linearity in the second argument)
3. $u\cdot u\ge 0$ , with $u\cdot u=0$ if and only if $u=0$
$\mathbb{R}^{N}$ over $\mathbb{R}$ : $$x\cdot y=x^Ty=\sum_{i=1}^{N} {x}_{i}{y}_{i}$$
$\mathbb{C}^{N}$ over $\mathbb{C}$ : $$x\cdot y=x^Hy=\sum_{i=1}^{N} \overline{{x}_{i}}{y}_{i}$$
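The conjugation in the complex case matters: it is what makes $x\cdot x$ real and nonnegative. A sketch (assuming numpy; `np.vdot` conjugates its first argument, matching the definition above):

```python
import numpy as np

x = np.array([1 + 1j, 2j])
y = np.array([1 - 1j, 1 + 0j])

# Complex inner product: conjugate the first argument (x^H y)
ip = np.sum(np.conj(x) * y)

assert np.isclose(ip, np.vdot(x, y))          # vdot applies the same conjugation
xx = np.sum(np.conj(x) * x)
assert np.isclose(xx.real, 6.0)               # |1+i|^2 + |2i|^2 = 2 + 4
assert abs(xx.imag) < 1e-12                   # x . x is always real
```
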
If $(x=\left(\begin{array}{c}{x}_{1}\\ \\ {x}_{N}\end{array}\right))\in \mathbb{C}$ , then $$(x)\equiv \left(\begin{array}{c}\overline{{x}_{1}}\\ \\ \overline{{x}_{N}}\end{array}\right)^T$$ is called the "Hermitian," or "conjugatetranspose" of $x$ .
If we define $\|u\|=\sqrt{u\cdot u}$ , then $$\|u+v\|\le \|u\|+\|v\|$$ Hence, every inner product induces a norm.
For all $u\in V$ , $v\in V$ , $$\left|u\cdot v\right|\le \|u\|\,\|v\|$$ In inner product spaces, we have a notion of the angle between two vectors: $$\angle (u, v)=\arccos \left(\frac{u\cdot v}{\|u\|\,\|v\|}\right)\in \left[0 , \pi \right]$$
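Cauchy-Schwarz is exactly what guarantees the $\arccos$ argument lies in $[-1, 1]$, so the angle is well-defined. A small check (a sketch, assuming numpy):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

norm = np.linalg.norm
cos_angle = np.dot(u, v) / (norm(u) * norm(v))

# Cauchy-Schwarz: |u . v| <= ||u|| ||v||, so |cos_angle| <= 1
assert abs(cos_angle) <= 1.0

angle = np.arccos(cos_angle)
assert np.isclose(angle, np.pi / 4)   # 45 degrees between (1,0) and (1,1)
```
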
$u$ and $v$ are orthogonal if $$u\cdot v=0$$ Notation: $u\perp v$ .
If in addition $(u)=(v)=1$ , we say $u$ and $v$ are orthonormal .
In an orthogonal (orthonormal) set , each pair of vectors is orthogonal (orthonormal).
An orthonormal basis is a basis $\{{v}_{i}\}$ such that $${v}_{i}\cdot {v}_{j}={\delta }_{ij}=\begin{cases}1 & \text{if $i=j$}\\ 0 & \text{if $i\neq j$}\end{cases}$$
The standard basis for $\mathbb{R}^{N}$ or $\mathbb{C}^{N}$
The normalized DFT basis $${u}_{k}=\frac{1}{\sqrt{N}}\left(\begin{array}{c}1\\ e^{-i 2\pi \frac{k}{N}}\\ \vdots \\ e^{-i 2\pi \frac{k}{N}(N-1)}\end{array}\right)$$
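Orthonormality of the normalized DFT basis can be verified directly: stacking the vectors $u_k$ as columns of a matrix $U$, orthonormality is the statement $U^H U = I$. A sketch (assuming numpy):

```python
import numpy as np

N = 8
n = np.arange(N)

# Column k is u_k, with entries u_k[m] = exp(-i 2 pi k m / N) / sqrt(N)
U = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

# Orthonormality: U^H U = I
assert np.allclose(U.conj().T @ U, np.eye(N))

# In particular, every basis vector has unit norm
assert np.allclose(np.linalg.norm(U, axis=0), 1.0)
```
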
If the representation of $v$ with respect to $\{{v}_{i}\}$ is $$v=\sum {a}_{i}{v}_{i}$$ then $${a}_{i}={v}_{i}\cdot v$$
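The formula ${a}_{i}={v}_{i}\cdot v$ is the whole payoff of having an orthonormal basis: coefficients come from inner products, with no linear system to solve. A sketch in $\mathbb{R}^{2}$ (assuming numpy):

```python
import numpy as np

# An orthonormal basis for R^2 (the standard basis rotated by 45 degrees)
v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([3.0, 1.0])

# Coefficients via inner products: a_i = v_i . v
a1, a2 = np.dot(v1, v), np.dot(v2, v)

# The coefficients reconstruct v exactly
assert np.allclose(a1 * v1 + a2 * v2, v)
```
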
Every inner product space has an orthonormal basis. Any (countable) basis can be made orthonormal by the Gram-Schmidt orthogonalization process.
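A minimal sketch of classical Gram-Schmidt (function name hypothetical; numpy assumed; no safeguards against loss of orthogonality for ill-conditioned inputs):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # Subtract the components of v along the already-built orthonormal vectors
        w = v - sum(np.dot(q, v) * q for q in basis)
        basis.append(w / np.linalg.norm(w))   # normalize the remainder
    return basis

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
Q = gram_schmidt(vs)

# Result is orthonormal: q_i . q_j = delta_ij
G = np.array([[np.dot(qi, qj) for qj in Q] for qi in Q])
assert np.allclose(G, np.eye(3))
```

In practice the "modified" Gram-Schmidt variant (or a QR factorization) is preferred for numerical stability, but the classical loop above matches the textbook construction.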
Let $S\subseteq V$ be a subspace. The orthogonal complement of $S$ is $${S}^{\perp }=\{u\in V\colon u\cdot v=0 \ \forall v\in S\}$$ ${S}^{\perp }$ is easily seen to be a subspace.
If $\mathrm{dim}(V)<\infty$ , then $V=S\oplus {S}^{\perp }$ .
Loosely speaking, a linear transformation is a mapping from one vector space to another that preserves vector space operations.
More precisely, let $V$ , $W$ be vector spaces over the same field $F$ . A linear transformation is a mapping $T:V\to W$ such that $$T(au+bv)=aT(u)+bT(v)$$ for all $a\in F$ , $b\in F$ and $u\in V$ , $v\in V$ .
In this class we will be concerned with linear transformations between (real or complex) Euclidean spaces , or subspaces thereof.
$$\mathrm{image}(T)=\{w\in W\colon w=T(v)\text{ for some }v\in V\}$$
Also known as the kernel: $$\mathrm{ker}(T)=\{v\in V\colon T(v)=0\}$$
Both the image and the nullspace are easily seen to be subspaces.
$$\mathrm{rank}(T)=\mathrm{dim}(\mathrm{image}(T))$$
$$\mathrm{null}(T)=\mathrm{dim}(\mathrm{ker}(T))$$
$$\mathrm{rank}(T)+\mathrm{null}(T)=\mathrm{dim}(V)$$
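The rank-nullity theorem can be checked numerically for a concrete matrix. A sketch (assuming numpy; the kernel dimension is read off from the singular values, since the number of nonzero singular values equals the rank):

```python
import numpy as np

# A: R^4 -> R^3 with a one-dimensional kernel spanned by (1, 1, -1, 0)
A = np.array([[1., 0., 1., 0.],
              [0., 1., 1., 0.],
              [0., 0., 0., 1.]])

N = A.shape[1]                                # dimension of the domain V = R^4
rank = np.linalg.matrix_rank(A)

s = np.linalg.svd(A, compute_uv=False)
nullity = N - np.count_nonzero(s > 1e-12)     # nullity = N - (number of nonzero singular values)

assert rank == 3 and nullity == 1
assert rank + nullity == N                    # rank(T) + null(T) = dim(V)
```
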
Every linear transformation $T$ has a matrix representation . If $T:{\mathbb{E}}^{N}\to {\mathbb{E}}^{M}$ , $\mathbb{E}=\mathbb{R}$ or $\mathbb{C}$ , then $T$ is represented by an $M\times N$ matrix $$A=\begin{pmatrix}{a}_{11} & \cdots & {a}_{1N}\\ \vdots & & \vdots \\ {a}_{M1} & \cdots & {a}_{MN}\\ \end{pmatrix}$$ where $\left(\begin{array}{c}{a}_{1i}\\ \vdots \\ {a}_{Mi}\end{array}\right)=T({e}_{i})$ and ${e}_{i}=\left(\begin{array}{c}0\\ \vdots \\ 1\\ \vdots \\ 0\end{array}\right)$ is the $i^{\mathrm{th}}$ standard basis vector.
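The recipe above — column $i$ of $A$ is $T(e_i)$ — is directly executable. A sketch (function and map names hypothetical; numpy assumed):

```python
import numpy as np

# Build the matrix of a linear map T: R^n -> R^m by applying T to each
# standard basis vector; column i of A is T(e_i).
def matrix_of(T, n):
    return np.column_stack([T(e) for e in np.eye(n)])

# An example linear map on R^2
T = lambda x: np.array([2 * x[0] + x[1], x[0] - x[1]])
A = matrix_of(T, 2)

assert np.allclose(A, [[2., 1.], [1., -1.]])

# Matrix multiplication reproduces the transformation
x = np.array([3.0, 4.0])
assert np.allclose(A @ x, T(x))
```
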
$$\mathrm{colspan}(A)=<A>=\mathrm{image}(A)$$
If $A:{\mathbb{R}}^{N}\to {\mathbb{R}}^{M}$ , then $$\mathrm{ker}(A)^{\perp }=\mathrm{image}(A^T)$$
If $A:{\mathbb{C}}^{N}\to {\mathbb{C}}^{M}$ , then $$\mathrm{ker}(A)^{\perp }=\mathrm{image}(A^H)$$
The linear transformation/matrix $A$ is invertible if and only if there exists a matrix $B$ such that $AB=BA=I$ (identity).
Only square matrices can be invertible.
Let $A:{\mathbb{F}}^{N}\to {\mathbb{F}}^{N}$ be linear, $\mathbb{F}=\mathbb{R}$ or $\mathbb{C}$ . The following are equivalent:
1. $A$ is invertible
2. $\mathrm{rank}(A)=N$
3. $\mathrm{ker}(A)=\{0\}$
4. The columns of $A$ form a basis for ${\mathbb{F}}^{N}$
5. $\det (A)\neq 0$
If $A^{-1}=A^T$ (or $A^H$ in the complex case), we say $A$ is orthogonal (or unitary ).
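Both conditions are cheap to verify numerically: the inverse of an orthogonal (unitary) matrix is just its (conjugate) transpose. A sketch (assuming numpy):

```python
import numpy as np

# Orthogonal example: a 2-D rotation matrix
theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q.T @ Q, np.eye(2))           # Q^T Q = I, so Q^{-1} = Q^T
assert np.allclose(np.linalg.inv(Q), Q.T)

# Unitary example: the normalized 2-point DFT matrix
U = np.array([[1., 1.], [1., -1.]]) / np.sqrt(2) + 0j
assert np.allclose(U.conj().T @ U, np.eye(2))    # U^H U = I, so U^{-1} = U^H
```
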