I have basically zero cold tolerance. I wonder if it has something to do with my consistently running off to South East Asia during the last few Aussie winters. Maaan, those were good trips: got more than fifty mosquito bites on just one arm on the first night (in a malarial region, the guy I shared a room with and the two kids next door had malaria), spent hours cleaning toilets in a house of guys, nearly rode a motorbike off a cliff...etc, and yet still some of my happiest and most relaxing holidays (*) ever! =D
So yea, sometimes things aren't quite what you expect them to be - and today, I'mma write down one example of something that you mightn't have expected to be a vector space.
A vector space is essentially a whole bunch of vectors on which you may perform vector addition and scalar multiplication to your heart's content. So why are these two operations important? Well, they're the basis of row-reduction, right? If you can add and subtract vectors and multiply them by scalars, then you can do row-reduction, which in turn means that you can solve linear equations...etc, and you basically get all of linear algebra.
We formalise this intuition with a semi-clunky definition that says that a vector space is a set \(V\) with a notion of addition \(\oplus\) and scalar multiplication \(\otimes\) that satisfy eight axioms. Most of the time, the addition and scalar multiplication are pretty obvious, but if I choose my vector space to be based on the set \(V:=\mathbb{R}^{+}=(0,\infty)\), then the usual addition and scalar multiplication will not suffice to make \(V:=\mathbb{R}^+\) a vector space - for example, scalar multiplying by \(-1\) in the usual way sends \(2\in V\) to \(-2\), which isn't even in \(V\).
But what if we took \begin{align}u\oplus v:=u\times v=uv\text{ and }\alpha\otimes v:=v^\alpha?\end{align} Turns out this will make \((V,\oplus,\otimes)\) a real vector space.
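Just to see these operations in action, take \(u=2\), \(v=3\) and \(\alpha=-1\):\begin{align}2\oplus 3=2\times 3=6\quad\text{and}\quad(-1)\otimes 2=2^{-1}=\tfrac{1}{2},\end{align}and notice that both answers land back inside \(V=(0,\infty)\), unlike with the usual operations.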
Let's check this for all eight axioms. We're going to assume throughout that \(u,v,w\) are arbitrary vectors in \(V\), and \(\alpha,\beta\) are arbitrary real scalars in \(\mathbb{R}\).
- Associativity of \(\oplus\): \begin{align}u\oplus(v\oplus w)=u\oplus (vw)=u(vw)=uvw=(uv)w=(uv)\oplus w=(u\oplus v)\oplus w.\end{align}
- Commutativity of \(\oplus\): \begin{align}u\oplus v=uv=vu=v\oplus u.\end{align}
- Additive identity element: this is actually slightly tricky. The identity (or zero) element in this vector space isn't the number \(0\) - which isn't even an element in \(V=\mathbb{R}^+\). It is in fact the number \(1\in V\)! And here's the proof that \(1\) is the zero-element (**):\begin{align}1\oplus v=1\times v=v.\end{align}
- (Additive) Inverse element: this is also slightly tricky, but I guess if you've figured out the last point, this should make sense. Now, since \(v\in\mathbb{R}^+\), its reciprocal \(v^{-1}\) is also in \(V=\mathbb{R}^+\) and is indeed the inverse element of \(v\):\begin{align}v\oplus v^{-1}=v\times v^{-1}=1.\end{align}
- Distributivity I:\begin{align}\alpha\otimes(u\oplus v)=\alpha\otimes (uv)=(uv)^{\alpha}=u^{\alpha}v^{\alpha}=(\alpha\otimes u)(\alpha\otimes v)=(\alpha\otimes u)\oplus(\alpha\otimes v).\end{align}
- Distributivity II:\begin{align}(\alpha+\beta)\otimes v=v^{\alpha+\beta}=v^{\alpha}v^{\beta}=(\alpha\otimes v)(\beta\otimes v)=(\alpha\otimes v)\oplus(\beta\otimes v).\end{align}
- Compatibility of scalar multiplication:\begin{align}\alpha\otimes(\beta\otimes v)=\alpha\otimes(v^{\beta})=(v^\beta)^{\alpha}=v^{\alpha\beta}=(\alpha\beta)\otimes v.\end{align}
- Scalar identity element: in this case, the scalar identity element is \(1\in\mathbb{R}\) - note that we're distinguishing this from \(1\in V=\mathbb{R}^+\), which is totally a vector. =)\begin{align}1\otimes v=v^{1}=v.\end{align}
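By the way, if you'd rather let a computer spot-check all of this, here's a little Python sketch - a numerical sanity-check of the eight axioms on random samples, not a proof, and since floating point is what it is, we compare with a tolerance:

```python
import math
import random

# The two operations on V = (0, infinity) from the post.
def oplus(u, v):
    return u * v          # "vector addition": ordinary multiplication

def otimes(a, v):
    return v ** a         # "scalar multiplication": exponentiation

def close(x, y):
    return math.isclose(x, y, rel_tol=1e-9)

random.seed(0)
for _ in range(1000):
    u, v, w = (random.uniform(0.1, 10.0) for _ in range(3))  # random vectors in V
    a, b = (random.uniform(-3.0, 3.0) for _ in range(2))     # random real scalars
    assert close(oplus(u, oplus(v, w)), oplus(oplus(u, v), w))               # associativity of oplus
    assert close(oplus(u, v), oplus(v, u))                                   # commutativity of oplus
    assert close(oplus(1.0, v), v)                                           # 1 is the zero vector
    assert close(oplus(v, 1.0 / v), 1.0)                                     # 1/v is the additive inverse of v
    assert close(otimes(a, oplus(u, v)), oplus(otimes(a, u), otimes(a, v)))  # distributivity I
    assert close(otimes(a + b, v), oplus(otimes(a, v), otimes(b, v)))        # distributivity II
    assert close(otimes(a, otimes(b, v)), otimes(a * b, v))                  # compatibility
    assert close(otimes(1.0, v), v)                                          # scalar identity
print("All eight axioms check out on the sampled points.")
```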
Now, if you really think about it, what's REALLY going on is that I've simply disguised the usual one-dimensional real vector space \((\mathbb{R},+,\times)\) as \((V,\oplus, \otimes)\) by exponentiating, and then re-interpreting what addition and scalar multiplication mean once we've exponentiated (***).
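To spell the disguise out (using base \(e\) here, though by (***) any valid base does the job): the map \(\exp:\mathbb{R}\rightarrow V\) is a bijection, and\begin{align}\exp(x+y)=\exp(x)\exp(y)=\exp(x)\oplus\exp(y)\quad\text{and}\quad\exp(\alpha x)=(\exp x)^{\alpha}=\alpha\otimes\exp(x),\end{align}i.e. \(\exp\) turns the usual operations on \(\mathbb{R}\) into \(\oplus\) and \(\otimes\) on \(V\). So here's an exercise: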
Make \((0,1)\) into a vector space!
*: surprisingly more relaxing than spending a lot of time onsen-ing in Japan?
**: yes, it sounds stupid, but the truth often sounds stupid. =)
***: it doesn't matter which base you exponentiate with, so long as it's positive and not \(1\).
There's a bijection $\varphi: \mathbb{R}^+ \rightarrow (0,1)$ given by $x \mapsto \frac{x}{x+1}$.
Define vector addition by $x + y = \varphi( \varphi^{-1}(x) \oplus \varphi^{-1}(y))$ where $\oplus$ is the sum you defined in your post.
Scalar multiplication is defined similarly. You've checked all the axioms for me!
Was there a cooler way to do this?
Oh you should probably delete / not post this comment if you prefer to leave the question open for your students.
Hey Anonymous! It's great to finally meet you - you seem to be omnipresent on the interwebs, how are you so prolific? :P
I don't really know of a cooler way to do this actually - I mean, you could do it more directly with any bijection \(\mathbb{R}\rightarrow(0,1)\) and do the same trick but using the usual addition and scalar multiplication on \(\mathbb{R}\). And for a function like \(\mathrm{Arctan}\) there might be a semi-nice interpretation of the operations in terms of trigonometric identities and geometry...=/
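For what it's worth, if you chase through your \(\varphi(x)=\frac{x}{x+1}\) (whose inverse is \(\varphi^{-1}(u)=\frac{u}{1-u}\)), the transported operations on \((0,1)\) come out as reasonably clean formulas - writing \(\boxplus\) and \(\boxdot\) for them:\begin{align}u\boxplus v=\frac{uv}{uv+(1-u)(1-v)}\quad\text{and}\quad\alpha\boxdot v=\frac{v^{\alpha}}{v^{\alpha}+(1-v)^{\alpha}},\end{align}with \(\varphi(1)=\frac{1}{2}\) playing the role of the zero vector.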
Btw: you can TeX using backslash+open-bracket, backslash-close bracket in MathJax.