A vector can be written as a sum of its components along a set of basis vectors.
\(v=\sum_i e_i v^i\)
The Einstein summation convention omits the \(\sum_i\) symbol: whenever an index appears twice in a term, summation over it is implied.
For example, we would instead write the vector as:
\(v=e_iv^i\)
Addition works the same way. For vectors \(v=e_iv^i\) and \(w=f_iw^i\):
\(v+w=(\sum_i e_i v^i)+(\sum_i f_iw^i)\)
\(v+w=\sum_i (e_iv^i+f_iw^i)\)
\(v+w=e_iv^i+f_iw^i\)
If the bases are the same then:
\(v+w=e_i (v^i+w^i)\)
Scalar multiplication also keeps the convention:
\(cv=c\sum_ie_iv^i\)
\(cv=\sum_i ce_iv^i\)
\(cv=ce_iv^i\)
Matrix multiplication drops the explicit sum over the repeated index:
\((AB)_{ik}=\sum_jA_{ij}B_{jk}\)
\((AB)_{ik}=A_{ij}B_{jk}\)
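As a sanity check, numpy's einsum implements exactly this convention; a minimal sketch with arbitrary matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# (AB)_ik = A_ij B_jk: the repeated index j is summed automatically.
AB = np.einsum('ij,jk->ik', A, B)
assert np.allclose(AB, A @ B)
```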
\(\langle v, w\rangle =\langle \sum_i e_iv^i, \sum_j f_jw^j\rangle\)
\(\langle v, w\rangle =\sum_iv^i\langle e_i, \sum_j f_jw^j\rangle\)
\(\langle v, w\rangle =\sum_i \sum_jv^i\overline {w^j}\langle e_i, f_j\rangle\)
If the two bases are the same then:
\(\langle v, w\rangle =\sum_i \sum_jv^i\overline {w^j}\langle e_i, e_j\rangle\)
We can define the metric as:
\(g_{ij}:=\langle e_i,e_j\rangle\)
\(\langle v, w\rangle =v^i\overline {w^j}g_{ij}\)
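A minimal numpy sketch of this formula, assuming a hypothetical Gram matrix \(g\) and hypothetical component values:

```python
import numpy as np

g = np.array([[2.0, 0.5], [0.5, 1.0]])  # assumed Gram matrix g_ij = <e_i, e_j>
v = np.array([1.0 + 1.0j, 2.0])         # assumed components v^i
w = np.array([0.5, 1.0 - 2.0j])         # assumed components w^j

# <v, w> = v^i conj(w^j) g_ij
inner = np.einsum('i,j,ij->', v, np.conj(w), g)
print(inner)
```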
In component form we write a vector as:
\(v=e_iv^i\)
The indices are raised and lowered to reflect whether the value is covariant or contravariant.
\(v^i\) is contravariant. If the basis transforms one way, the components transform the inverse way.
\(e_i\) is covariant. It transforms with the basis.
We have vector spaces \(V\) and \(W\) over a field \(F\). A bilinear map which takes a vector from each space and returns a scalar from the underlying field is an element of the tensor product of the dual spaces, \(V^*\otimes W^*\).
For example, if we have two vectors:
\(v=e_iv^i\)
\(w=e_jw^j\)
A tensor would take these and return a scalar.
There are three types, depending on where the two arguments come from.
Both arguments are from the vector spaces:
\(T_{ij}v^iw^j\)
\(T \in V^*\otimes W^*\)
Both arguments are from the dual spaces:
\(T^{ij}v_iw_j\)
\(T \in V\otimes W\)
One argument is from each:
\(T_i^jv^iw_j\)
\(T \in V^*\otimes W\)
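Numerically, all three pairings are the same contraction of component arrays; the index placement records how each object transforms under a change of basis, not how the arithmetic is done. A minimal numpy sketch (values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3))  # components of T (illustrative values)
v = rng.standard_normal(3)       # components v^i
w = rng.standard_normal(3)       # components w^j (arrays do not record variance)

# T_ij v^i w^j: contract both slots of T against the two arguments.
scalar = np.einsum('ij,i,j->', T, v, w)
```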
The tensor product space is itself a vector space: we can add tensors together and multiply them by scalars.
Not every element of the tensor product space is itself a simple product \(v\otimes w\); a general element is a sum of such products.
We can define homomorphisms in terms of tensor products.
\(\operatorname{Hom}(V, V) = V \otimes V^*\)
Its elements have components \(T_j^i\).
We use the dual space for the second argument because it ensures that a change of basis does not affect the map itself.
\(w^j=T_i^j v^i\)
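A sketch of applying such a map with einsum (values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))  # components T_i^j, stored as T[i, j]
v = rng.standard_normal(3)       # components v^i

# w^j = T_i^j v^i: contract the lower index of T against v.
w = np.einsum('ij,i->j', T, v)
assert np.allclose(w, T.T @ v)   # the same thing in ordinary matrix algebra
```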
We showed that the inner product between two vectors with the same basis can be written as:
\(\langle v, w\rangle =\langle \sum_i e_iv^i, \sum_j e_jw^j\rangle\)
\(\langle v, w\rangle =v^i\overline {w^j}\langle e_i, e_j\rangle\)
Defining the metric as:
\(g_{ij}:=\langle e_i,e_j\rangle\)
\(\langle v, w\rangle =v^i\overline {w^j}g_{ij}\)
We can use this to define the inverse metric \(g^{ij}\), the matrix inverse of \(g_{ij}\):
\(g^{ij}g_{jk}=\delta_i^k\)
We can use the metric and its inverse to raise and lower indices.
\(v_i:=v^jg_{ij}\)
\(v^i:=v_jg^{ij}\)
If we have a tensor:
\(T_{ij}\)
We can define:
\(T_i^k=T_{ij}g^{jk}\)
\(T^{kl}=T_{ij}g^{ik}g^{jl}\)
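A numpy sketch of raising indices with an assumed metric:

```python
import numpy as np

g = np.array([[2.0, 0.5], [0.5, 1.0]])  # assumed metric g_ij
g_inv = np.linalg.inv(g)                # g^ij, so that g^ij g_jk = delta^i_k
assert np.allclose(np.einsum('ij,jk->ik', g_inv, g), np.eye(2))

v_up = np.array([1.0, 2.0])             # v^i
v_down = np.einsum('ij,j->i', g, v_up)  # v_i = g_ij v^j

T_down = np.array([[1.0, 2.0], [3.0, 4.0]])             # T_ij
T_mixed = np.einsum('ij,jk->ik', T_down, g_inv)         # T_i^k = T_ij g^jk
T_up = np.einsum('ik,jl,ij->kl', g_inv, g_inv, T_down)  # T^kl = g^ik g^jl T_ij
```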
If we have:
\(T_{ij}x^j\)
We can contract it to:
\(T_{ij}x^j=v_i\)
Similarly we can have:
\(T^{ij}x_j=v^i\)
Consider multiplication by the identity matrix, \(AI\).
We have:
\((AI)_{ik}=A_{ij}I_{jk}\)
We write this instead as:
\((AI)_{ik}=A_{ij}\delta_{jk}\)
Where \(\delta_{jk}=0\) if \(j\ne k\) and \(\delta_{jk}=1\) if \(j=k\).
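In numpy the Kronecker delta is just the identity matrix, and contracting against it changes nothing:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
delta = np.eye(2)  # delta_jk

# (AI)_ik = A_ij delta_jk = A_ik
assert np.allclose(np.einsum('ij,jk->ik', A, delta), A)
```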
For second order tensors we have:
\(T_j^i\)
\(T_{ij}\)
\(T^{ij}\)
For each of these we can define an inverse:
\(T_i^jU_j^k=\delta_i^k\)
\(T_{ij}U^{jk}=\delta_i^k\)
\(T^{ij}U_{jk}=\delta_i^k\)
Rather than introducing a new symbol \(U\) in \(T_{ij}U^{jk}=\delta_i^k\), we often denote the inverse by the same letter with raised indices:
\(T_{ij}T^{jk}=\delta_i^k\)
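A quick check with an assumed invertible component matrix:

```python
import numpy as np

T = np.array([[2.0, 1.0], [1.0, 3.0]])  # assumed components T_ij
U = np.linalg.inv(T)                    # components of the inverse, written T^jk above

# T_ij T^jk = delta_i^k
assert np.allclose(np.einsum('ij,jk->ik', T, U), np.eye(2))
```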
We have a vector \(v\in V\) and a dual vector \(w\in V^*\).
\(\mathbf v=\sum_i v^i \mathbf e_i\)
\(\mathbf w=\sum_i w_i \mathbf f^i\)
\(\mathbf w\mathbf v=[\sum_i v^i \mathbf e_i][\sum_j w_j \mathbf f^j]\)
\(\mathbf w\mathbf v=\sum_i \sum_j [v^i \mathbf e_i][w_j \mathbf f^j]\)
\(\mathbf w\mathbf v=\sum_i \sum_j v^i w_j \mathbf e_i\mathbf f^j\)
We use the dual basis so:
\(\mathbf w\mathbf v=\sum_i \sum_j v^i w_j \mathbf e_i\mathbf e^j\)
\(\mathbf w\mathbf v=\sum_i \sum_j v^i w_j \delta_i^j\)
\(\mathbf w\mathbf v=\sum_i v^i w_i\)
We can see that this value is unchanged by a change of basis.
What if these were both from \(V\)?
\(\mathbf v=\sum_i v^i \mathbf e_i\)
\(\mathbf w=\sum_i w^i \mathbf e_i\)
\(\mathbf w\mathbf v=[\sum_i v^i \mathbf e_i][\sum_j w^j \mathbf e_j]\)
\(\mathbf w\mathbf v=\sum_i \sum_j v^i w^j \mathbf e_i\mathbf e_j\)
The term \(\mathbf e_i\mathbf e_j\) is dependent on the basis, and so we do not contract.
So if we have \(v_iw^i\), we can contract, because the result (calculated from the components) does not depend on the basis.
But if we have \(v_iw_i\), the result (calculated from the components) will change depending on the choice of basis.
We define a new object:
\(c=\sum_i w^iv_i\)
This new term, \(c\), does not depend on \(i\), and so we have contracted the index.
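We can check this numerically. A sketch, assuming a random invertible change-of-basis matrix \(P\) (under \(e'_j = e_i P^i_j\), contravariant components transform by \(P^{-1}\) and covariant components by \(P^T\)):

```python
import numpy as np

rng = np.random.default_rng(2)
P = rng.standard_normal((3, 3))  # change-of-basis matrix (invertible with probability 1)
P_inv = np.linalg.inv(P)

v_up = rng.standard_normal(3)    # contravariant components v^i
w_down = rng.standard_normal(3)  # covariant components w_i
u_up = rng.standard_normal(3)    # a second contravariant vector

# New-basis components: contravariant pick up P^{-1}, covariant pick up P^T.
v_up_new, u_up_new = P_inv @ v_up, P_inv @ u_up
w_down_new = P.T @ w_down

# The contraction w_i v^i is unchanged by the change of basis...
assert np.isclose(w_down @ v_up, w_down_new @ v_up_new)
# ...but summing the products of two contravariant components is not.
print(np.isclose(u_up @ v_up, u_up_new @ v_up_new))  # generally False
```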
Consider a tensor, e.g. \(T_{abc}\).
In general, this is not symmetric, that is:
\(T_{abc}\ne T_{bac}\)
We can write the symmetric part of this with regard to \(a\) and \(b\).
\(T_{(ab)c}=\dfrac{1}{2}(T_{abc}+T_{bac})\)
Clearly, \(T_{(ab)c}=T_{(ba)c}\)
We can also have an antisymmetric part with regard to \(a\) and \(b\).
\(T_{[ab]c}=\dfrac{1}{2}(T_{abc}-T_{bac})\)
Clearly, \(T_{[ab]c}=-T_{[ba]c}\)
\(T_{(ab)c}+T_{[ab]c}=\dfrac{1}{2}(T_{abc}+T_{bac})+\dfrac{1}{2}(T_{abc}-T_{bac})\)
\(T_{(ab)c}+T_{[ab]c}=T_{abc}\)
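A numpy check of this decomposition for an arbitrary rank-3 array:

```python
import numpy as np

rng = np.random.default_rng(3)
T = rng.standard_normal((3, 3, 3))  # arbitrary components T_abc

T_sym = 0.5 * (T + np.swapaxes(T, 0, 1))   # T_(ab)c
T_anti = 0.5 * (T - np.swapaxes(T, 0, 1))  # T_[ab]c

assert np.allclose(T_sym, np.swapaxes(T_sym, 0, 1))     # symmetric in a, b
assert np.allclose(T_anti, -np.swapaxes(T_anti, 0, 1))  # antisymmetric in a, b
assert np.allclose(T_sym + T_anti, T)                   # the parts recombine to T
```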
We can create higher-order tensor products. For example:
\(V\otimes V \otimes V\otimes V^* \otimes V^*\)
We write elements of these as:
\(T_{j_1\dots j_q}^{i_1\dots i_p}\)
Tensors generalise matrices to higher dimensions.
A matrix \(A\) has components \(a_{ij}\).
A tensor \(T\) can have components \(t_{ijk}\), for example.
Rank-\(0\) tensor: scalar
Rank-\(1\) tensor: vector
Rank-\(2\) tensor: matrix
A tensor with \(p\) contravariant (upper) indices and \(q\) covariant (lower) indices is said to be of type \((p,q)\).
The outer product is a bilinear map taking two vectors to an element of the tensor product space.
\(V\times W \rightarrow V\otimes W\)
\(u\otimes v = w\)
\(w_{ij}=u_iv_j\)
\(\dim (V \otimes W)= \dim V \times \dim W\)
The dot product is the trace of the outer product.
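Both facts are easy to verify in numpy (vector values are arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

W = np.outer(u, v)        # w_ij = u_i v_j
assert W.shape == (3, 3)  # dim(V ⊗ W) = dim V * dim W

# The dot product is the trace of the outer product.
assert np.isclose(np.trace(W), np.dot(u, v))
```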
The Kronecker product takes the concept of the outer product and applies it to matrices.
We essentially replace every element of the left matrix with that element multiplied by the entire right matrix.
Like outer products, Kronecker products are written as:
\(u\otimes v=w\)
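A small numpy sketch (matrix values are arbitrary):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])

# Each element a_ij of A is replaced by the block a_ij * B.
K = np.kron(A, B)
assert K.shape == (4, 4)
assert np.allclose(K[:2, :2], A[0, 0] * B)  # top-left block is a_11 * B
```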
The dot product is a bilinear form, a mapping from two vectors in the same vector space to the underlying field.
\(V\times V \rightarrow F\)
This is calculated by multiplying corresponding elements and summing the results.
\(u\cdot v =\sum_{i=1}^nu_iv_i\)
Over the complex numbers the desirable properties fail: a non-zero vector can have dot product zero with itself, and the dot product of a vector with itself can be complex.
Inner products handle complex fields better by conjugating one argument; the price is that they are sesquilinear rather than bilinear.
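A concrete numpy illustration of both failures, using the vector \((1, i)\):

```python
import numpy as np

u = np.array([1.0, 1.0j])  # a non-zero complex vector

# The bilinear dot product of u with itself vanishes: 1 + i^2 = 0.
print(np.dot(u, u))   # 0j
# The inner product conjugates one argument and is positive: |1|^2 + |i|^2 = 2.
print(np.vdot(u, u))  # (2+0j)
```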
A tensor is an element of a tensor product.