Orrite, let's just get down to business - these are some answers (not solutions, just answers) to last week's (tute 9) lab sheets:

2 A Linear Transformation in \(\mathcal{P}_3\)

1)a)\begin{align}T(1)=&12\cdot1\\ T(x)=&10\cdot x\\ T(x^2)=&2\cdot 1+6\cdot x^2 \\ T(x^3)=&6\cdot x.\end{align}

1)b)\begin{align}\left[\begin{array}{c}12 \\ 0 \\ 0 \\ 0 \end{array}\right],\left[\begin{array}{c}0 \\ 10 \\ 0 \\ 0 \end{array}\right],\left[\begin{array}{c} 2 \\ 0 \\ 6 \\ 0 \end{array}\right],\left[\begin{array}{c}0 \\ 6 \\ 0 \\ 0 \end{array}\right]. \end{align}

1)c)\begin{align}\left[\begin{array}{cccc}12 & 0 & 2 & 0 \\ 0 & 10 & 0 & 6 \\ 0 & 0 & 6 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right].\end{align}

2)

A basis for the kernel of \(T\) is given by: \(\{3x-5x^3\}\).

A basis for the image of \(T\) is given by: \(\{1,x,x^2\}\).
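Not part of the tute, but if you'd like to double-check these bases numerically, here's a quick numpy sketch. It uses the matrix of \(T\) from 1)c) with fourth column \((0,6,0,0)^T\) (matching \(T(x^3)=6x\) and the coordinate vectors in 1)b)), with coordinates taken with respect to the standard basis \(\{1,x,x^2,x^3\}\):

```python
import numpy as np

# Matrix of T w.r.t. the standard basis {1, x, x^2, x^3};
# the fourth column is (0, 6, 0, 0)^T since T(x^3) = 6x.
A = np.array([[12,  0, 2, 0],
              [ 0, 10, 0, 6],
              [ 0,  0, 6, 0],
              [ 0,  0, 0, 0]], dtype=float)

# Kernel check: 3x - 5x^3 has coordinate vector (0, 3, 0, -5).
k = np.array([0, 3, 0, -5], dtype=float)
print(A @ k)  # the zero vector, so 3x - 5x^3 is in the kernel

# Image check: rank 3, and every column has last entry 0,
# so the image is spanned by {1, x, x^2}.
print(np.linalg.matrix_rank(A))  # 3
```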

3)a)

Everything in the kernel, i.e. every polynomial of the form \(\alpha(3x-5x^3)\).

3)b)

No, there is no solution to

*blah blah blah*\(=x^3\).

Yes, anything of the form \(\frac{5}{6}+\alpha(3x-5x^3)\) will be a solution to

*blah blah blah*\(=10\).
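Again not required, but you can verify both claims numerically with the matrix from 1)c) (with fourth column \((0,6,0,0)^T\)) — a quick numpy sketch:

```python
import numpy as np

A = np.array([[12,  0, 2, 0],
              [ 0, 10, 0, 6],
              [ 0,  0, 6, 0],
              [ 0,  0, 0, 0]], dtype=float)

target = np.array([10, 0, 0, 0], dtype=float)  # the constant polynomial 10

# 5/6 + alpha*(3x - 5x^3) has coordinates (5/6, 3*alpha, 0, -5*alpha):
for alpha in (0.0, 1.0, -2.5):
    p = np.array([5/6, 3*alpha, 0, -5*alpha])
    print(np.allclose(A @ p, target))  # True each time

# And there is no solution to T(p) = x^3: the last row of A is zero,
# so the x^3-coordinate of T(p) is always 0.
```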

Note: it is crucial that the answer is given in polynomial form, since the column vectors are merely representatives of these polynomials and hence not appropriate answers to 1)a), 3)a) and 3)b).

3 Change of Basis

4)a)

I...personally think that this is a slightly silly question. I suppose that an answer might be something like: "Coordinate changes are linear transformations, and transition matrices are the matrices of these transformations."

4)b)\begin{align}P_{B\rightarrow S}=\left[\begin{array}{cccc}1 & 1 & 0 & 0 \\ 1 & -1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & -1 \end{array}\right],\,P_{S\rightarrow B}=\frac{1}{2}\left[\begin{array}{cccc}1 & 1 & 0 & 0 \\ 1 & -1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & -1 \end{array}\right].\end{align}

4)c)\begin{align}P_{C\rightarrow B}=\left[\begin{array}{cccc}\frac{1}{2} & 1 & \frac{1}{2} & 0 \\ \frac{1}{2} & 0 & \frac{-1}{2} & 0 \\ 0 & 0 & \frac{1}{2} & 1 \\ 0 & 0 & \frac{1}{2} & 0\end{array}\right].\end{align}
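A nice feature of the 4)b) answer is that \(P_{B\rightarrow S}\) is (half of) its own inverse, which you can sanity-check in numpy if you like:

```python
import numpy as np

P_BS = np.array([[1,  1, 0,  0],
                 [1, -1, 0,  0],
                 [0,  0, 1,  1],
                 [0,  0, 1, -1]], dtype=float)

# The transition matrix the other way is just the inverse:
P_SB = np.linalg.inv(P_BS)
print(np.allclose(P_SB, 0.5 * P_BS))  # True: P_BS squared is 2*I
```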

4)d)\begin{align}[3+x-10x^2+x^3]_S=\left[\begin{array}{c} 3 \\ 1 \\ -10 \\ 1 \end{array}\right],\,[3+x-10x^2+x^3]_B=\left[\begin{array}{c} 2 \\ 1 \\ \frac{-9}{2} \\ \frac{-11}{2} \end{array}\right].\end{align}
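To double-check the \(B\)-coordinates here (again, a numpy sketch rather than anything you'd write in the tute), apply \(P_{S\rightarrow B}\) to the \(S\)-coordinate vector — or equivalently, solve against \(P_{B\rightarrow S}\):

```python
import numpy as np

P_BS = np.array([[1,  1, 0,  0],
                 [1, -1, 0,  0],
                 [0,  0, 1,  1],
                 [0,  0, 1, -1]], dtype=float)

p_S = np.array([3, 1, -10, 1], dtype=float)  # 3 + x - 10x^2 + x^3 in basis S

# Solving P_BS @ p_B = p_S is the same as applying P_{S->B}:
p_B = np.linalg.solve(P_BS, p_S)
print(p_B)  # -> (2, 1, -9/2, -11/2)
```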

4 Linear Transformations and Eigenthings

5)a)\begin{align}A=\left[\begin{array}{cc}5 & -3 \\ 6 & -4 \end{array}\right].\end{align}

5)b)\begin{align}T(2,2)=(4,4),\, T(1,2)=(-1,-2).\end{align}

5)c)\begin{align}T(2,0)&=T((4,4)+(-2,-4))=T(2(2,2)-2(1,2)) \\ &=2T(2,2)-2T(1,2)=(8,8)+(2,4)=(10,12).\end{align}
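Both routes to \(T(2,0)\) — via the decomposition into \((2,2)\) and \((1,2)\), or directly with \(A\) — are easy to confirm numerically (a numpy sketch, not part of the answer):

```python
import numpy as np

A = np.array([[5, -3],
              [6, -4]], dtype=float)

# Via the decomposition (2,0) = 2*(2,2) - 2*(1,2) and linearity:
v = 2 * (A @ np.array([2, 2])) - 2 * (A @ np.array([1, 2]))
print(v)  # [10. 12.]

# Direct check, applying A to (2,0):
print(A @ np.array([2, 0]))  # [10. 12.]
```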

5)d)

Ewww...

*so gross*. There's no way I'mma answer this one at all. I mean, you might need to do three pictures just to show what's going on...

*bleh.*

5)e)

*The*(*) eigenvectors of \(A\) are \(\{\left[\begin{array}{c}2 \\ 2\end{array}\right],\,\left[\begin{array}{c}1 \\ 2\end{array}\right]\}\) with respective eigenvalues \(\lambda=2,-1\).
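If you'd like to verify the eigenpairs without redoing the characteristic polynomial, just check \(Av=\lambda v\) directly (a numpy sketch):

```python
import numpy as np

A = np.array([[5, -3],
              [6, -4]], dtype=float)

# Check A v = lambda v for each claimed eigenpair:
for v, lam in (((2, 2), 2.0), ((1, 2), -1.0)):
    v = np.array(v, dtype=float)
    print(np.allclose(A @ v, lam * v))  # True both times
```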

5)f)\begin{align}P_{B\rightarrow S}=\left[\begin{array}{cc} 2 & 1 \\ 2 & 2 \end{array}\right] \Rightarrow P_{S\rightarrow B}=\frac{1}{2}\left[\begin{array}{cc} 2 & -1 \\ -2 & 2 \end{array}\right].\end{align}

5)g)\begin{align}[T]_B=P_{S\rightarrow B}[T]_S P_{B\rightarrow S}=\left[\begin{array}{cc} 2 & 0 \\ 0 & -1 \end{array}\right]. \end{align}

5)h)

I've noticed that my answer to (g) is a diagonal matrix whose diagonal values correspond to the eigenvalues I obtained in part (e) (**).

5)i) The basis \(B\) tells me that the linear transformation \(T\) acts on a vector in \(\mathbb{R}^2\) by doubling the \((2,2)\) "component" of this vector and flipping the \((1,2)\) "component" of this same vector. I use the term "component" somewhat loosely here - I just mean that any vector in \(\mathbb{R}^2\) may be written uniquely as a linear combination of \((2,2)\) and \((1,2)\), and not something to do with vector-resolutes.

Note: just as with the polynomials, it's important to put in row vectors instead of column vectors in 5)b) and 5)c) - but not 5)e) as it asks for eigenvectors to \(A\), not \(T\).

Additional note: these eigenstuff calculations are generally pretty straightforward, although a lot of people will screw up part (g) just because they have the matrices in the wrong order. I personally avoid memorising what you have to put down, and just think through the matrix multiplications - making sure that I remember that when you apply a matrix to a column vector, you start from the RIGHT HAND SIDE!
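To see concretely why the order matters, here's a quick numpy sketch: the correct product diagonalises \(A\), while the reversed one gives you some non-diagonal mess.

```python
import numpy as np

A    = np.array([[5, -3], [6, -4]], dtype=float)  # [T]_S
P_BS = np.array([[2, 1], [2, 2]], dtype=float)    # eigenvectors as columns
P_SB = np.linalg.inv(P_BS)

# Correct order: [T]_B = P_{S->B} [T]_S P_{B->S} = diag(2, -1).
print(P_SB @ A @ P_BS)

# Wrong order: not diagonal at all.
print(P_BS @ A @ P_SB)
```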

*: This *doesn't* mean that the eigenvectors of \(A\) are unique; we're simply writing down representatives (up to linear combinations of eigenvectors with the same eigenvalue).

**: In particular, the position of these diagonal values line up with which eigenvector I've set as the columns of the matrix \(P_{B\rightarrow S}\) I formed in part (f).
