# multilineal lección 2

## on multilinear algebra of inner product vector spaces

this lecture is devoted to revisiting, in concrete terms, the abstract treatment of lección 1.

on PDF

An inner product vector space over the field $\mathbb{R}$ is an abelian group $(V,+)$ which is a vector space possessing a 2-variable map

$g:V\times V\to\mathbb{R}$

which satisfies:

1. $g$ is bilinear
2. $g(v,w)=g(w,v)$, i.e. $g$ is symmetric
3. $g(v,v)\ge 0$, i.e. $g$ is positive-definite
4. $g(v,v)=0$ if and only if $v=\stackrel{\to}0$

Here we study things like the famous raising and lowering of indices in a tensor, which yields several isomorphisms called musical isomorphisms — a device to reduce the paraphernalia of multi-indexed quantities.

An example is the inner product $\langle\quad ,\quad\rangle$ in $\mathbb{R}^n$:

$\left\langle\left(\begin{array}{c}v^1\\v^2\\\vdots\\v^n \end{array}\right),\left(\begin{array}{c}w^1\\w^2\\\vdots\\w^n \end{array}\right)\right\rangle=v^1w^1+v^2w^2+\cdots+v^nw^n=v^sw^s=v^sw^t\delta_{st}$

A bilinear map $B:V\times V\to\mathbb{R}$ is also dubbed a bilinear form. It is possible to give many examples simply by choosing a matrix $M$ and setting

$(x,y)\mapsto x'My=[x]'[M][y]$

$x'My=x^sy^tM_{st}$
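As a quick numerical sketch (the matrix $M$ and the vectors below are arbitrary choices, not from the lecture), the matrix form $x'My$ and the index form $x^sy^tM_{st}$ agree:

```python
import numpy as np

# Any matrix M determines a bilinear form (x, y) |-> x^T M y.
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

x = np.array([1.0, 2.0])
y = np.array([4.0, -1.0])

matrix_form = x @ M @ y    # x' M y

# Index form: x^s y^t M_st, summing over the repeated indices s, t.
index_form = sum(x[s] * y[t] * M[s, t]
                 for s in range(2) for t in range((2)))

print(matrix_form, index_form)   # both equal 1.0
```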

An inner product vector space is also dubbed a Euclidean vector space, and the map $g$ is called a metric on $V$.

Other inner products. A vector space with a bilinear form determined by a symmetric matrix with positive eigenvalues gives another (non-standard) way to measure lengths of vectors, angles between them, areas of polygonal regions, lengths of curves, areas of surfaces, etc.

Gram matrix. Given a Euclidean vector space $V$ and a basis in it, say $b_1,b_2,...,b_n$, the numbers

$g(b_i,b_j)=g_{ij}$

obtained by the use of the metric $g:V\times V\to\mathbb{R}$, can be arranged into a matrix

$[g_{ij}]=[\langle b_i,b_j\rangle]$

$[g_{ij}]=\left(\begin{array}{cccc}g_{11}&g_{12}&\cdots&g_{1n}\\g_{21}&g_{22}&\cdots&g_{2n}\\\vdots&&&\vdots\\g_{n1}&g_{n2}&\cdots&g_{nn}\end{array}\right)$

$[g_{ij}]^{-1}=\left(\begin{array}{cccc}g^{11}&g^{12}&\cdots&g^{1n}\\g^{21}&g^{22}&\cdots&g^{2n}\\\vdots&&&\vdots\\g^{n1}&g^{n2}&\cdots&g^{nn}\end{array}\right)=[g^{ij}]$

$[g_{ij}][g^{ij}]=[g_{is}g^{sj}]=[{\delta_i}^j]$

$[g^{ij}][g_{ij}]=[g^{is}g_{sj}]=[{\delta^i}_j]$

The assignment $b^i=g^{is}b_s$ is a change of basis in $V$; this change of basis is called the raising-index law relating the initial basis and its reciprocal one.

$V={\rm{span}}\{b^1,b^2,...,b^n\}$, and $b^1,b^2,...,b^n$ is called the reciprocal basis

$[g^{ij}]:V\to V$ is a musical isomorphism

$\langle b^i,b_j\rangle={\delta^i}_j$

Riesz’s representation lemma

$\forall f\in V^*\ \exists!\, a\in V$ such that $f(x)=\langle a,x\rangle$ for all $x\in V$

Proof:

First observe that for $\beta^i(\quad)=\langle b^i,\quad\rangle$ we have

$\beta^i(b_j)=\langle b^i,b_j\rangle=\langle g^{is}b_s,b_j\rangle=g^{is}\langle b_s,b_j\rangle=g^{is}g_{sj}={\delta^i}_j$

Second, write $f(b_k)=f_k$; on the other hand, $\langle a,b_k\rangle=\langle a^sb_s,b_k\rangle=a^sg_{sk}=a_k$

Then, writing $f$ as a linear combination $f=f_s\beta^s$, we must take $a_s=f_s$, that is $f=a_s\beta^s$, since

• $f(b_k)=a_k$

and with $a=a_sb^s$

• $\langle a,b_k\rangle=a_s\langle b^s,b_k\rangle=a_s{\delta^s}_k=a_k$

and, by linear extension, also $f(x)=\langle a,x\rangle$ for each $x\in V$

To see uniqueness, suppose that $f(x)=\langle c,x\rangle$ as well as $f(x)=\langle a,x\rangle$. Evaluating on basis vectors gives $\langle c,b_l\rangle=\langle a,b_l\rangle$, i.e. $\langle c_sb^s,b_l\rangle=\langle a_sb^s,b_l\rangle$, hence $c_s{\delta^s}_l=a_s{\delta^s}_l$, so $c_l=a_l$ for each $l=1,...,n$. Hence $c=a$

$\Box$ (end of Riesz’ lemma proof)

———————————————-

The two ways of writing $a=a^sb_s$ and $a=a_sb^s$ are related by

$a_s=g_{sk}a^k$

and this is called the lowering-index law for the components of a vector, relating the basis $b_i$ and its reciprocal $b^i$
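A small numerical sketch of the raising and lowering laws (the basis below is an arbitrary choice, not one from the lecture):

```python
import numpy as np

# Arbitrary non-orthonormal basis of R^2, stored as the columns of B.
B = np.array([[2.0, 1.0],
              [0.0, 1.0]])

G = B.T @ B                # Gram matrix g_ij = <b_i, b_j>
G_inv = np.linalg.inv(G)   # inverse Gram matrix g^ij

a_up = np.array([3.0, -1.0])   # contravariant components a^k
a_down = G @ a_up              # lowering: a_s = g_sk a^k
a_up_again = G_inv @ a_down    # raising:  a^s = g^sk a_k

# The same vector expanded in the basis and in its reciprocal basis:
vec_from_up = B @ a_up              # a = a^s b_s
B_recip = B @ G_inv                 # reciprocal vectors b^s = g^{sk} b_k
vec_from_down = B_recip @ a_down    # a = a_s b^s

print(a_down, np.allclose(vec_from_up, vec_from_down))
```

Note that the matrix of reciprocal vectors is obtained by multiplying the basis matrix by $[g^{ij}]$ — exactly the raising-index law $b^i=g^{is}b_s$ in coordinates.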

———————————————–

$\S\S$ Do you remember the problem of finding the dual basis (in $\mathbb{R}^2$, say) when we are given an oblique basis $b_1,b_2$ (not orthonormal like $e_1,e_2$)? That is, finding a pair of covectors

$\beta^1,\beta^2$

such that

$\beta^i(b_j)={\delta^i}_j$?

This problem is solved more quickly by building the reciprocal basis $b^1,b^2$ and representing the covectors $\beta^i$ in the Riesz way.

Let us explain with an example: if $b_1=\left(\begin{array}{c}1\\1\end{array}\right)$ and $b_2=\left(\begin{array}{c}0\\1\end{array}\right)$, then we have the corresponding Gram matrix

$[g_{ij}]=\left(\begin{array}{cc}2&1\\1&1\end{array}\right)$

this is because $\langle b_1,b_1\rangle=2$, $\langle b_1,b_2\rangle=1$, $\langle b_2,b_1\rangle=1$ and $\langle b_2,b_2\rangle=1$, so the corresponding inverse is

$[g^{ij}]=\left(\begin{array}{cc}1&-1\\-1&2\end{array}\right)$

from which one can construct

$b^1=g^{11}b_1+g^{12}b_2$ and $b^2=g^{21}b_1+g^{22}b_2$

which in our case gives $b^1=\left(\begin{array}{c}1\\ 0\end{array}\right)$ and $b^2=\left(\begin{array}{c}-1\\1\end{array}\right)$.

Then $\beta^1(\quad)=\left\langle\left(\begin{array}{c}1\\ 0\end{array}\right),\qquad\right\rangle$ and $\beta^2(\quad)=\left\langle\left(\begin{array}{c}-1\\1\end{array}\right),\qquad\right\rangle$ form the covector basis dual to our original $b_1,b_2$

$\Box$ (end of the example)
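The example can also be checked numerically; a minimal sketch (using numpy) that reproduces the Gram matrix, its inverse, the reciprocal basis, and the duality relation $\beta^i(b_j)={\delta^i}_j$:

```python
import numpy as np

# The oblique basis from the example, as columns: b_1 = (1,1), b_2 = (0,1).
B = np.array([[1.0, 0.0],
              [1.0, 1.0]])

G = B.T @ B                # Gram matrix: [[2, 1], [1, 1]]
G_inv = np.linalg.inv(G)   # its inverse: [[1, -1], [-1, 2]]

# Reciprocal basis b^i = g^{is} b_s, as columns: b^1 = (1,0), b^2 = (-1,1).
B_recip = B @ G_inv

# Duality check: <b^i, b_j> = delta^i_j, i.e. B_recip^T B is the identity.
print(B_recip.T @ B)
```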

———————————————–

$\S\S$ Example of an inner product with a non-diagonal matrix

We are going to see an example of how a pairing

$\langle v,w\rangle_A=v^{\top}[A]w$

where $[A]=\left(\begin{array}{ccc}1&0&0\\ 0&2&1\\ 0&1&5\end{array}\right)$, gives a positive-definite and non-degenerate quadratic form:

In general, $\langle v,w\rangle_A=[v^1,v^2,v^3]\left(\begin{array}{ccc}1&0&0\\ 0&2&1\\ 0&1&5\end{array}\right)\left(\begin{array}{c}w^1\\w^2\\w^3\end{array}\right)$, which is  $\langle v,w\rangle_A=v^1w^1+2v^2w^2+v^2w^3+v^3w^2+5v^3w^3$

But for a generic $u=\left(\begin{array}{c}x\\y\\z\end{array}\right)\in\mathbb{R}^3$ we have  $\langle u,u\rangle_A=x^2+2y^2+2yz+5z^2$

Now, to settle the positivity of the previous expression, we make a change of basis in $\mathbb{R}^3$ so that the matrix $[A]$ takes a simpler form; the answer comes from a diagonalization process, which achieves:

$\langle u,u\rangle_A=u^{\top}Au=(JJ^{-1}u)^{\top}A(JJ^{-1}u)$

$\langle u,u\rangle_A=u^{\top}Au=(J^{-1}u)^{\top}J^{\top}AJ(J^{-1}u)$

$\langle u,u\rangle_A=u^{\top}Au=(J^{-1}u)^{\top}D(J^{-1}u)$

that is, $J^{\top}AJ=D$ is a diagonal matrix. This is possible when we take $J=[v_1,v_2,v_3]$, the $3\times 3$ matrix whose columns are the eigenvectors $v_i$ of the matrix $[A]$:

$Av_1=\lambda_1v_1,\qquad Av_2=\lambda_2v_2,\qquad Av_3=\lambda_3v_3$

Or, in matrix form, $A[v_1,v_2,v_3]=[\lambda_1v_1,\lambda_2v_2,\lambda_3v_3]$, which is the same as

$AJ=J\left(\begin{array}{ccc}\lambda_1&0&0\\ 0&\lambda_2&0\\ 0&0&\lambda_3\end{array}\right)$

But $J^{\top}J$ is diagonal and positive too (eigenvectors of the symmetric matrix $A$ belonging to distinct eigenvalues are mutually orthogonal), so $J^{\top}AJ=J^{\top}J\left(\begin{array}{ccc}\lambda_1&0&0\\ 0&\lambda_2&0\\ 0&0&\lambda_3\end{array}\right)=\left(\begin{array}{ccc}\mu_1&0&0\\ 0&\mu_2&0\\ 0&0&\mu_3\end{array}\right)=D$ with $\mu_i >0$

So the matrix $[A]=\left(\begin{array}{ccc}1&0&0\\ 0&2&1\\ 0&1&5\end{array}\right)$ has eigenvalues $\lambda_1=1,\lambda_2=\frac{7-\sqrt{13}}{2}, \lambda_3=\frac{7+\sqrt{13}}{2}$

Once we see that the eigenvalues are positive, the quadratic form is represented as

$\langle u,u\rangle_A=\langle J^{-1}u,J^{-1}u\rangle_D$

$=(J^{-1}u)^{\top}\left(\begin{array}{ccc}\mu_1&0&0\\ 0&\mu_2&0\\ 0&0&\mu_3\end{array}\right)J^{-1}u$

$=\mu_1\tilde x^2+\mu_2\tilde y^2+\mu_3\tilde z^2$, where $(\tilde x,\tilde y,\tilde z)^{\top}=J^{-1}u$

Hence $\langle u,u\rangle_A>0$ if and only if $\mu_1,\mu_2,\mu_3>0$

Clearly $\langle u,u\rangle_A=0$ if and only if $u=\stackrel{\to}0$.

You can see HERE the calculation done with Mathematica

$\Box$ (end of example)
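A numerical sketch of this example (numpy here stands in for the Mathematica computation; with `numpy.linalg.eigh` the eigenvectors come out orthonormal, so $J^{\top}J=I$ and $\mu_i=\lambda_i$):

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 1.0, 5.0]])

# Eigen-decomposition of the symmetric matrix A; eigh returns the
# eigenvalues in ascending order and orthonormal eigenvectors as columns of J.
eigvals, J = np.linalg.eigh(A)

print(eigvals)   # approx [1.0, 1.697, 5.303] -- all positive

# Since this J is orthogonal, the congruence J^T A J is exactly
# the diagonal matrix of eigenvalues.
D = J.T @ A @ J
print(np.allclose(D, np.diag(eigvals)))   # True
```

The printed eigenvalues match $1$, $\frac{7-\sqrt{13}}{2}$ and $\frac{7+\sqrt{13}}{2}$ from the text, confirming positive definiteness.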

Here the challenge is to find where else the formula

$J^{\top}AJ=J^{\top}J{\rm{diag}}[\lambda_1,...,\lambda_n]={\rm{diag}}[\mu_1,...,\mu_n]$

is…

But also observe that

$\langle u,u\rangle_A=u^{\top}Au$

$=(JJ^{-1}u)^{\top}A(JJ^{-1}u)$

$=(J^{-1}u)^{\top}J^{\top}AJ(J^{-1}u)$

$=u^{\top}J^{-\top}J^{\top}J\Lambda J^{-1}u$, using $J^{\top}AJ=J^{\top}J\Lambda$ with $\Lambda={\rm{diag}}[\lambda_1,\lambda_2,\lambda_3]$

$\langle u,u\rangle_A=u^{\top}J\Lambda J^{-1}u$

Beware: the matrix $J\Lambda J^{-1}$ (which is just $A$ again) need not be diagonal, so to determine the signature of the quadratic form it is better to use $J^{\top}AJ$, which surely is diagonal

• the 4 ways to see a bilinear map on a Euclidean vector space $V$, with an example:

to be continued…

### 13 responses to “multilineal lección 2”

1. Lexgo

Hi professor, where can I get lección 2 as a PDF?

So, are the reciprocal vectors the transposes of the covectors?

• reciprocal vectors live in $V$ and covectors in $V^*$…

3. c-kit

$A,B\in\Lambda^3(V)$ then $\langle A,B\rangle_3=g^{ij}g^{kl}g^{rs}A_{ikr}B_{jls}$

———

$A,B\in\Lambda^k(V)$ then $\langle A,B\rangle_k=g^{i_1j_1}g^{i_2j_2}\cdots g^{i_kj_k}A_{i_1i_2...i_k}B_{j_1j_2...j_k}$

4. Hey maestro, what's the book you mention in class, gravitation q…?, and do you know where I can download the one by Murray R. Spiegel…? I've already looked for it a ton…

• c-kit

“Gravitation” by Misner, Thorne and Wheeler, easy to find on google; as for the Spiegel I don't know where to download it, but honestly it's worth (\$) owning… maybe borrow it or get it second-hand… regards

5. David D

What's up maestro… how's guanajuato? Did you see something new on friday? I couldn't go because of the summer-scholarship paperwork; if you saw something new, tell us so we can start reading it, prof……

• In Gto. everything is going well toward the PhD… on friday we completed the details of the critical points of the “projection onto the axes” functions on the torus… we also did examples illustrating the Riesz Representation Lemma for covectors… on tuesday we will see the proof of the Lemma…

6. Not just any matrix works: it needs to be symmetric with positive eigenvalues.