Thursday 30 May 2013

Maths on Steroids: 10 + 11

"This cosmic dance of bursting decadence and withheld permissions twists all our arms collectively, but if sweetness can win, and it can, then I'll still be here tomorrow to high-five you yesterday, my friend. Peace."

-Royal Tart Toter (Adventure Time).

Hey guys...and so it came to pass that the semester drew to an end, but not so your edification. I hope that you all had a fun semester doing some maths. I know that I had fun tutoring you guys. And I guess that I just wanna point out and remind you that ultimately, you kinda learnt everything by yourselves. All I did was facilitate some discussion and thoughts every so often, and this is meant to be one of the take-home messages of university education: there won't always be teachers, and you guys can and will have to teach yourselves. Which is good, because it turns out that most of the time you're better at teaching yourselves than we (teachers) are.

Having said that, during this week's tute, I got a request for some harder eigen-stuff questions. I couldn't really find any hard ones on the net, so I just made some up. I'll only provide the answer for the first one (for now). If you wanna know if you got something right, ask one of your friends to try the problem independently and see if your answers match up. Oh, and I should warn you that I haven't put the questions in order of difficulty =P

  1. What do you call an eigen-sheep? (Ans: a lamb, duh).
  2. Consider the (infinite dimensional) vector space \(\mathcal{P}\) of real polynomials in \(x\). Show that the following function is a linear transformation, and find its eigenvalues and eigenvectors: \begin{align}T:\mathcal{P}&\rightarrow\mathcal{P}\\ p(x)&\mapsto x\cdot\frac{\mathrm{d}p}{\mathrm{d}x}.\end{align}
  3. Generalise the above setup/result/calculations to vector spaces of real polynomials in two or more variables.
  4. Once you've figured out the right generalisation in question 3 of the linear transformation \(T\) defined in question 2, notice that it works on smooth functions in general. So then try to find non-polynomial eigenvectors, with eigenvalue \(2\), in the vector space of smooth functions. (If you want something concrete to experiment with, there's a little sympy sketch just after this list.)
  5. The weather in the Ice Kingdom in the land of Ooo sucks hard. If it snows today, then the probability that it will snow tomorrow is \(\frac{3}{4}\). And if it doesn't snow today, then the chance that it will snow tomorrow is \(\frac{1}{5}\). Given that there is no climate change in the Ice Kingdom, in the long term, for what proportion of the days in the Ice Kingdom will it snow? (There's a numerical sketch after this list that you can check your answer against.)
  6. Most of the time we've dealt with matrices that can be diagonalised. Can you find a \(2\times 2\) matrix that can't be diagonalised? And how might you generalise to a \(k\times k\) matrix?
  7. Consider the following transformation matrix on \(\mathbb{R}^3\): \begin{align}A=\left[\begin{array}{ccc}0 & -\frac{\sqrt{3}}{4} & -\frac{1}{4} \\ \frac{1}{\sqrt{3}} & \frac{1}{2} & -\frac{\sqrt{3}}{2} \\ \frac{1}{3} & -\frac{\sqrt{3}}{2} & \frac{3}{2} \end{array}\right],\end{align} and also consider the following set in \(\mathbb{R}^3\): \begin{align}S=\left\{\left[\begin{array}{c} u \\ \frac{v\sqrt{3}}{2}-\frac{1}{2} \\ \frac{v}{2}+\frac{\sqrt{3}}{2} \end{array}\right]: 0\leq u,v\leq 1\right\}.\end{align} First show that \(S\) is a unit square, then figure out what \(A\cdot S\), \(A^2\cdot S\) and so forth look like. In particular, calculate the area of \(A^n\cdot S\); hence find the total area of all the \(A^n\cdot S\)s as \(n\) ranges over the non-negative integers, and figure out what the union of all of these shapes looks like geometrically (*). (There's a sketch after this list for checking your area calculations, too.)
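Oh, and since not everyone has a friend handy to cross-check with, here are a few rough Python sketches you can use to poke at some of these by computer. They're quick checks set up under my own assumptions, not model answers. First, a sandpit for questions 2 to 4: it just applies \(T\) to whatever you feed it, so you can eyeball which inputs come back as scalar multiples of themselves. The example inputs are only things I happened to pick.

```python
# A sandpit for questions 2-4, not a solution: apply T(p) = x * dp/dx to
# whatever you like and see what comes out.
import sympy as sp

x = sp.symbols('x')

def T(p):
    # the linear transformation from question 2
    return sp.expand(x * sp.diff(p, x))

# example inputs are just my picks; swap in your own candidates
for p in [x**3, 1 + x + x**2, sp.exp(x), x**2 * sp.log(x)]:
    print(p, ' |--> ', T(p))
```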
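For question 5, once you've got an answer by hand, here's one way to brute-force a check with numpy: encode the weather as a column-stochastic transition matrix (the \(\frac{3}{4}\) and \(\frac{1}{5}\) come straight from the question) and repeatedly apply it to a starting distribution until things settle down. The 200-day cutoff is just me picking a big-enough number.

```python
# Question 5, brute-forced: iterate the transition matrix on a starting distribution.
import numpy as np

P = np.array([[3/4, 1/5],   # P[0,0] = P(snow tomorrow | snow today), etc.
              [1/4, 4/5]])  # columns sum to 1: state 0 = snow, state 1 = no snow

dist = np.array([1.0, 0.0])  # suppose it snows on day 0
for _ in range(200):
    dist = P @ dist          # tomorrow's distribution from today's

print('long-run proportion of snowy days is roughly', dist[0])
```

Of course, the whole point of this week's material is that you can read the same number off an eigenvector of the matrix without iterating at all; that bit is up to you.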
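And for question 7, a numbers-only check of the areas: assuming you've already shown that \(S\) is a unit square spanned by the two direction vectors below (plus a translation, which doesn't change areas), the area of \(A^n\cdot S\) is the length of the cross product of the images of those two vectors under \(A^n\). The 30-term cutoff for the running total is just my choice.

```python
# Question 7, numbers only: areas of A^n . S via cross products.
import numpy as np

s3 = np.sqrt(3)
A = np.array([[0,    -s3/4, -1/4 ],
              [1/s3,  1/2,  -s3/2],
              [1/3,  -s3/2,  3/2 ]])

e = np.array([1.0, 0.0, 0.0])   # the 'u' direction of S
w = np.array([0.0, s3/2, 1/2])  # the 'v' direction of S

An = np.eye(3)                  # A^n, starting at n = 0
total = 0.0
for n in range(30):
    area = np.linalg.norm(np.cross(An @ e, An @ w))
    print(f'area of A^{n} . S = {area:.6f}')
    total += area
    An = A @ An                 # bump up to the next power of A

print('running total of the areas:', total)
```

This will happily confirm (or contradict) your area formula, but it says nothing about what the union of all those squares looks like, or why the hint (*) is there; for that you really do want to think about eigen-things.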
Methinks that's enough for now. Plus, if you can do those, you really should be set for any eigenvalue stuff you're likely to see in first year maths. =)

*: you might need to be slightly more subtle in your approach to eigenvectors and eigenvalues for this problem. It can be tricksy...
