
Section 7.3 Understanding linear transformations

In this section, we will learn how to work with linear transformations between abstract vector spaces.
If you are already wondering how to define a matrix for such a linear transformation, I will be proud of you.

Subsection 7.3.1 Matrix relative to bases

Definition: Matrix relative to bases.

Let \(V\) and \(W\) be vector spaces with bases \(\alpha=\{\alpha_1,\alpha_2,\ldots,\alpha_n\}\) and \(\beta=\{\beta_1,\beta_2,\ldots,\beta_m\}\text{,}\) respectively. Let \(T\) be a linear transformation from \(V\) to \(W\text{.}\) Then the matrix
\begin{equation*} [T]_{\alpha}^{\beta} = \Big[[T(\alpha_1)]_\beta\quad [T(\alpha_2)]_\beta\quad \ldots\quad [T(\alpha_n)]_\beta\Big] \end{equation*}
is called the matrix of \(T\) relative to the bases \(\alpha\) and \(\beta\text{.}\)
If \(V=W\) and \(\alpha=\beta\text{,}\) then \([T]_{\alpha}^{\beta}\) will be written as \([T]_{\alpha}.\)
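One useful way to remember this definition (a standard fact, recorded here for reference) is that the matrix is built precisely so that coordinate vectors are related by matrix multiplication:
\begin{equation*} [T(\mathbf{v})]_{\beta} = [T]_{\alpha}^{\beta}\,[\mathbf{v}]_{\alpha} \quad \text{for every } \mathbf{v}\in V. \end{equation*}
In other words, once we pass to coordinate vectors, \(T\) acts simply as multiplication by \([T]_{\alpha}^{\beta}\text{.}\)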
Let us first use an example to understand this definition.
Example: Let \(T:\mathbb{R}^{3}\rightarrow \mathbb{R}^{3}\) be a linear transformation such that
\begin{equation*} T\left(\begin{array}{r} x_1 \\ x_2 \\ x_3 \end{array}\right)=\left(\begin{array}{r} x_1+3x_2 \\ 3x_1+x_2\\ -2x_3 \end{array}\right)\ \text{and}\ \alpha=\beta=\left\{\left(\begin{array}{r} 1\\ 1\\ 0 \end{array}\right), \left(\begin{array}{r} 1\\ -1\\ 0 \end{array}\right), \left(\begin{array}{r} 0\\ 0\\ 1 \end{array}\right)\right\}. \end{equation*}
Compute the matrix \([T]_{\alpha}\) of the linear map \(T\text{.}\)
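As a sketch of how the computation might go (each step can be verified directly from the formula for \(T\)): apply \(T\) to each basis vector and express the result in terms of \(\alpha\text{.}\) Here
\begin{equation*} T(\alpha_1)=\left(\begin{array}{r} 4\\ 4\\ 0 \end{array}\right)=4\alpha_1,\qquad T(\alpha_2)=\left(\begin{array}{r} -2\\ 2\\ 0 \end{array}\right)=-2\alpha_2,\qquad T(\alpha_3)=\left(\begin{array}{r} 0\\ 0\\ -2 \end{array}\right)=-2\alpha_3, \end{equation*}
so the coordinate vectors relative to \(\alpha\) are \((4,0,0)\text{,}\) \((0,-2,0)\text{,}\) and \((0,0,-2)\text{,}\) and therefore
\begin{equation*} [T]_{\alpha}=\left[\begin{array}{rrr} 4 & 0 & 0\\ 0 & -2 & 0\\ 0 & 0 & -2 \end{array}\right]. \end{equation*}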
Exercise: Find the matrix \([T]_{\alpha}\) of the linear map in Example 7.1.4, where \(\alpha=\{1, 1-\frac{1}{2}x,1-x+\frac{1}{4}x^{2}\}\text{.}\)
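For readers who would like to check the example above numerically, the following is a minimal sketch in Python with NumPy (not part of the text's development; the variable names are only illustrative). It builds \([T]_{\alpha}\) column by column by solving for the coordinates of each \(T(\alpha_i)\) relative to the basis \(\alpha\text{:}\)

    import numpy as np

    # Columns of A are the basis vectors alpha_1, alpha_2, alpha_3.
    A = np.array([[1.0, 1.0, 0.0],
                  [1.0, -1.0, 0.0],
                  [0.0, 0.0, 1.0]]).T

    # Standard matrix of T from the example: T(x) = (x1 + 3x2, 3x1 + x2, -2x3).
    T_std = np.array([[1.0, 3.0, 0.0],
                      [3.0, 1.0, 0.0],
                      [0.0, 0.0, -2.0]])

    # Column i of [T]_alpha is the solution x of A x = T(alpha_i);
    # solving against T_std @ A handles all three columns at once.
    T_alpha = np.linalg.solve(A, T_std @ A)

    print(np.round(T_alpha, 6))  # diagonal matrix with entries 4, -2, -2

The same check works for any linear map and bases given numerically, including the coordinate version of the polynomial exercise above.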