But first, *a very short story*.

There was a period, during the summer holidays, when my friends and I used to play 20 Questions (I did *not* know that it was made into a TV show) after lunch every day. And being the sound logicians that we are, we asked all the regular questions like:

- "Are you real?"
- "Are you male?" and
- "Do most of the letters constituting your name lie in the first half of the alphabet?" (where "most" is taken to mean "at least half")

We soon figured out that either he'd given us a really philosophical response, or - as was the case - we were dealing with Alexander Grothendieck.

And that's the end of my story. (Note: Grothendieck has since passed away. =/ )

Enter: vector resolutes vs. projections (à la our tutorial).

So, one of you asked me why we've renamed "vector resolutes" as "vector projections", and I made an offhanded remark about mathematicians loving the term "projection" partially due to Grothendieck. Well, that was at least half-false. =P I don't really know why we like "projection" - although it is certainly a more conceptually meaningful word (*) than "resolute". I was sorta hinting at the mathematical construction called "projectivisation" (usually just Proj) that's used like, a hella lot in algebraic geometry. And somewhere in the deep dark recesses of my mind, there's a small footnote that says that Grothendieck had a lot to do with projectivisations.

*Anyhoo*, let's talk about proving the following vector identity:\begin{align}\vec{a}\times(\vec{b}\times\vec{c})=(\vec{a}\cdot\vec{c})\vec{b}-(\vec{a}\cdot\vec{b})\vec{c}.\end{align}Now, you all got pretty close to bashing out this identity as per the hint that was given in the yellow book, and I think that's probably the *safest* proof of this identity that I've seen. To my understanding, most of you will probably just expand out the left-hand side, expand out the right-hand side and go - O look, they're the same.

That's fine, but you should have a think about how to get straight from the left all the way to the right via factorisation near the end. Hint: you'll need to add and subtract the term \(a_ib_ic_i\) for each coordinate.
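Before writing up the bash, you can sanity-check the identity numerically. Here's a minimal sketch in plain Python (the test vectors are arbitrary made-up values, nothing special about them):

```python
# Numerical sanity check of a x (b x c) = (a.c)b - (a.b)c,
# using hand-rolled 3D dot and cross products (no libraries needed).

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

# Arbitrary integer test vectors, so equality is exact.
a, b, c = (1, -2, 3), (4, 0, -1), (2, 5, -3)

lhs = cross(a, cross(b, c))
rhs = tuple(dot(a, c)*bi - dot(a, b)*ci for bi, ci in zip(b, c))

print(lhs == rhs)  # True
```

Of course, one numerical check isn't a proof - but it's a cheap way to catch a sign error before you commit to three coordinates' worth of algebra.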

I'm going to explain an alternate way of proving this result, which - although algebraic - is slightly more geometric in feel. To begin with, let's talk about the standard unit vectors in \(\mathbb{R}^3\):\begin{align}\vec{e}_1=[1,0,0],\;\vec{e}_2=[0,1,0]\text{ and }\vec{e}_3=[0,0,1].\end{align}Now, these vectors are mutually perpendicular (*ortho*) + they're unit vectors (*normal*), and in maths, we call them (together) orthonormal vectors. And you guys can easily show that\begin{align}\vec{e}_1\times\vec{e}_2&=-\vec{e}_2\times\vec{e}_1=\vec{e}_3\\\vec{e}_2\times\vec{e}_3&=-\vec{e}_3\times\vec{e}_2=\vec{e}_1\\\vec{e}_3\times\vec{e}_1&=-\vec{e}_1\times\vec{e}_3=\vec{e}_2.\end{align}

Thanks to these cross-product identities, it's really really easy to verify our vector identity above for these three vectors, right? In fact, by the right-hand rule, if we think about what's geometrically happening to three generic orthogonal vectors \(\vec{u},\vec{v},\vec{w}=\vec{u}\times\vec{v}\), then we should certainly expect the following identities:\begin{align}\vec{u}\times\vec{v}&=-\vec{v}\times\vec{u}=\vec{w}\\\vec{v}\times\vec{w}&=-\vec{w}\times\vec{v}=\vec{u}\\\vec{w}\times\vec{u}&=-\vec{u}\times\vec{w}=\vec{v},\end{align}and we'd have no problems checking our result. So, wouldn't it be great if somehow the vectors \(\vec{a},\vec{b},\vec{c}\) were sorta close to being orthonormal-ish?
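The cyclic relations for \(\vec{e}_1,\vec{e}_2,\vec{e}_3\) are easy enough to check by hand, but here's a quick plain-Python confirmation if you want one (just the standard basis vectors as tuples):

```python
# Checking the cyclic cross-product relations for the standard
# orthonormal vectors e1, e2, e3.

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)

print(cross(e1, e2) == e3)  # True
print(cross(e2, e3) == e1)  # True
print(cross(e3, e1) == e2)  # True
```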

So here's a way to make that *almost* happen. Observe (i.e.: stare really hard) that the *truthiness* of our identity isn't affected by us multiplying any of the three vectors by a non-zero constant. So, what I mean is that if we know that \begin{align}\vec{a}\times(\frac{\vec{b}}{|\vec{b}|}\times\vec{c})=(\vec{a}\cdot\vec{c})\frac{\vec{b}}{|\vec{b}|}-(\vec{a}\cdot\frac{\vec{b}}{|\vec{b}|})\vec{c}\end{align} is true/false, then our desired identity must be correspondingly true/false - this is of course assuming that \(\vec{b}\neq\vec{0}\), but if \(\vec{b}=\vec{0}\), the identity is really easy to prove anyway. Therefore, we may assume *without loss of generality (wlog)* that \(\vec{b}\) is a unit vector.

Now, this scaling business means that we can make any of the vectors a unit vector (unless it's the \(\vec{0}\) vector, but the identity is really easy if any of them is the zero vector anyway). But that's not quite good enough - I mean, orthogonality is sort of the more important bit.
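If you'd like to see the scaling observation in action: both sides of the identity are linear in \(\vec{b}\), so rescaling \(\vec{b}\) rescales both sides by the same factor. A quick check with made-up integer vectors (a sketch, not a proof):

```python
# Both sides of a x (b x c) = (a.c)b - (a.b)c are linear in b,
# so rescaling b by t rescales both sides by t.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def lhs(a, b, c):
    return cross(a, cross(b, c))

def rhs(a, b, c):
    return tuple(dot(a, c)*bi - dot(a, b)*ci for bi, ci in zip(b, c))

a, b, c = (1, -2, 3), (4, 0, -1), (2, 5, -3)
t = 7
bt = tuple(t*bi for bi in b)  # b scaled by a non-zero constant

print(lhs(a, bt, c) == tuple(t*x for x in lhs(a, b, c)))  # True
print(rhs(a, bt, c) == tuple(t*x for x in rhs(a, b, c)))  # True
```

The same linearity holds in \(\vec{a}\) and \(\vec{c}\), which is exactly why rescaling any of the three vectors can't change the identity from true to false.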

*If only* we could somehow replace \(\vec{c}\) with \[\vec{c}'=\vec{c}-(\vec{b}\cdot\vec{c})\vec{b},\] which we know is orthogonal to \(\vec{b}\) (since \(\vec{b}\cdot\vec{c}'=\vec{b}\cdot\vec{c}-(\vec{b}\cdot\vec{c})|\vec{b}|^2=0\) for our unit vector \(\vec{b}\)). Well, we can certainly do this on the left-hand side, because\[\vec{b}\times\vec{c}'=\vec{b}\times\vec{c}-(\vec{b}\cdot\vec{c})\vec{b}\times\vec{b}=\vec{b}\times\vec{c}.\]

And we can do so on the right hand side because\begin{align}&(\vec{a}\cdot(\vec{c}-(\vec{b}\cdot\vec{c})\vec{b}))\vec{b}-(\vec{a}\cdot\vec{b})(\vec{c}-(\vec{b}\cdot\vec{c})\vec{b})\\=&(\vec{a}\cdot\vec{c})\vec{b}-(\vec{b}\cdot\vec{c})(\vec{a}\cdot\vec{b})\vec{b}-(\vec{a}\cdot\vec{b})\vec{c}+(\vec{a}\cdot\vec{b})(\vec{b}\cdot\vec{c})\vec{b}\\=&(\vec{a}\cdot\vec{c})\vec{b}-(\vec{a}\cdot\vec{b})\vec{c}.\end{align}So, we may assume wlog that \(\vec{c}\) is orthogonal to \(\vec{b}\). In fact, we can also assume that \(\vec{c}\) is normal by normalising \(\vec{c}'\), and again I should say something like if \(\vec{c}'\) is the zero vector, then everything is easy.
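Here's a quick numerical confirmation that swapping \(\vec{c}\) for \(\vec{c}'\) changes neither side (a sketch with made-up integer vectors; note the algebra above shows this works for any \(\vec{b}\), not just a unit one):

```python
# Swapping c for c' = c - (b.c)b changes neither side of the identity.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def lhs(a, b, c):
    return cross(a, cross(b, c))

def rhs(a, b, c):
    return tuple(dot(a, c)*bi - dot(a, b)*ci for bi, ci in zip(b, c))

a, b, c = (1, -2, 3), (4, 0, -1), (2, 5, -3)
c_prime = tuple(ci - dot(b, c)*bi for bi, ci in zip(b, c))

print(lhs(a, b, c_prime) == lhs(a, b, c))  # True
print(rhs(a, b, c_prime) == rhs(a, b, c))  # True
```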

So, we've got orthonormal vectors \(\vec{b},\;\vec{c}\) and \(\vec{d}:=\vec{b}\times\vec{c}\), and if I re-position my \(x,y,z\)-axes in \(\mathbb{R}^3\) to line up with these three vectors, I can see that any vector in \(\mathbb{R}^3\) may be written in a unique way as a sum of scalar multiples of these three vectors. In fact, you'll learn after the Easter break that this is what we call a *basis* for \(\mathbb{R}^3\). In particular, our vector \(\vec{a}\) may be written in the form:\[\vec{a}=\alpha_b\vec{b}+\alpha_c\vec{c}+\alpha_d\vec{d}.\]

Now, let's calculate the two sides of our desired identity and prove it!

On the left-hand side, we have:\begin{align}&\vec{a}\times(\vec{b}\times\vec{c})\\&=(\alpha_b\vec{b}+\alpha_c\vec{c}+\alpha_d\vec{d})\times\vec{d}\\&=-\alpha_b\vec{c}+\alpha_c\vec{b}.\end{align}Note that this makes use of that stuff we were saying about cross-products of orthonormal vectors at the start of this proof. And on the right-hand side, we have:\begin{align}(\vec{a}\cdot\vec{c})\vec{b}-(\vec{a}\cdot\vec{b})\vec{c}=\alpha_c\vec{b}-\alpha_b\vec{c},\end{align} where we've made use of the orthonormality of the vectors \(\vec{b},\vec{c}\) and \(\vec{d}\) (which gives \(\vec{a}\cdot\vec{b}=\alpha_b\) and \(\vec{a}\cdot\vec{c}=\alpha_c\)).
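And here's the final computation played out numerically, with a made-up orthonormal pair \(\vec{b},\vec{c}\) and an arbitrary \(\vec{a}\) (a sketch; floats, so we compare with a tolerance):

```python
# With orthonormal b, c and d = b x c, writing a = ab*b + ac*c + ad*d,
# the left side a x (b x c) collapses to ac*b - ab*c, i.e. the RHS.
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

# A made-up orthonormal pair b, c; then d = b x c completes the frame.
b = (0.6, 0.8, 0.0)
c = (-0.8, 0.6, 0.0)
d = cross(b, c)

a = (1.0, 2.0, 3.0)            # arbitrary test vector
ab, ac = dot(a, b), dot(a, c)  # the coefficients alpha_b and alpha_c

lhs = cross(a, d)                                         # a x (b x c)
collapsed = tuple(ac*bi - ab*ci for bi, ci in zip(b, c))  # alpha_c b - alpha_b c

print(all(math.isclose(x, y, abs_tol=1e-12) for x, y in zip(lhs, collapsed)))  # True
```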

Orrite, to be honest, that did seem kinda long as a proof, but I think that it's nicer than bashing it out directly because you get this additional understanding to do with scaling these vectors and making them orthogonal and jazz. But I have to warn you: I did make use of the right-hand rule, and we had a physical-intuition-based argument for why the cross products of any orthonormal vectors \(\vec{u},\vec{v},\vec{w}\) should be as I described. And for that reason alone, if this question crops up in an assignment or (God-forbid?) the exam, just bash it out. It's actually not so painful the second time you do it - I think that my second time took about 2~3 minutes?

So...ugh...happy Easter guys!

(*): I'm thinking of movie projectors here, but I think that might actually be kinda backwards, because the peeps who first invented and made projectors might actually be semi-familiar with the word "projection".
