# Tensor product

In linear algebra and abstract algebra, the tensor product of two vector spaces is introduced to reduce the study of bilinear operators to that of linear operators. Formally, the tensor product of two vector spaces V and W over the same base field F is defined by the following universal property: it is a vector space T over F, together with a bilinear operator ⊗: V x W -> T, such that for every bilinear operator B: V x W -> X there exists a unique linear operator L: T -> X with B = L o ⊗, i.e. B(x,y) = L(x⊗y) for all x in V and y in W.

The tensor product is uniquely specified, up to a unique isomorphism, by this requirement, and we may therefore write V ⊗ W instead of T. Using a rather involved construction, one can show that the tensor product of any two vector spaces exists. The space V ⊗ W is generated by the image of ⊗, and even more: if S is a basis of V and T is a basis of W, then { s ⊗ t : s in S and t in T} is a basis for V ⊗ W. The dimension of the vector space V ⊗ W is therefore given by the product of the dimensions of V and W.
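The basis and dimension claims can be checked in coordinates. The sketch below (dimensions chosen for illustration) realizes the tensors s ⊗ t via the Kronecker product `np.kron` and verifies that they are linearly independent and span a space of dimension dim(V)·dim(W):

```python
import numpy as np

dim_V, dim_W = 3, 4

# Standard bases e_i of V and f_j of W.
basis_V = np.eye(dim_V)
basis_W = np.eye(dim_W)

# The vectors e_i (x) f_j, realized via np.kron, form a basis of V (x) W.
tensor_basis = np.array([np.kron(e, f) for e in basis_V for f in basis_W])

# dim(V (x) W) = dim(V) * dim(W), and the basis tensors are independent.
assert tensor_basis.shape == (dim_V * dim_W, dim_V * dim_W)
assert np.linalg.matrix_rank(tensor_basis) == dim_V * dim_W
```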

Note that the space (V⊗W)* (the dual space of V⊗W, containing all linear functionals on that space) corresponds naturally to the space of all bilinear functionals on V x W. In other words, every bilinear functional is a functional on the tensor product, and vice versa.
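A minimal coordinate sketch of this correspondence (the particular dimensions and random data are illustrative): a bilinear functional on R^3 x R^4 is given by a matrix A via B(x, y) = x^T A y, and the same matrix, flattened, acts as a linear functional on the tensor x ⊗ y:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # matrix of the bilinear functional B
x = rng.standard_normal(3)
y = rng.standard_normal(4)

bilinear_value = x @ A @ y                      # B(x, y)
linear_on_tensor = A.flatten() @ np.kron(x, y)  # L(x (x) y)

# The bilinear functional and the induced linear functional agree.
assert np.isclose(bilinear_value, linear_on_tensor)
```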

For finite-dimensional spaces, there is a natural isomorphism between V*⊗W* and (V⊗W)*. So, tensor products of linear functionals are bilinear functionals. This gives us a new way to look at the space of bilinear functionals: as a tensor product itself.

It is possible to generalize the definition to a tensor product of any number of spaces. The tensor product is associative: (V ⊗ W) ⊗ Z is naturally isomorphic to V ⊗ (W ⊗ Z). The universal property of V⊗W⊗X is that every trilinear operator on V x W x X corresponds to a unique linear operator on V⊗W⊗X. Tensor spaces allow us to use the theory of linear operators to study multilinear operators.
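In coordinates, the associativity statement is mirrored by the associativity of the Kronecker product. A quick sketch (the specific vectors are arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0])        # a vector in V
v = np.array([3.0, 5.0, 7.0])   # a vector in W
w = np.array([2.0, 4.0])        # a vector in Z

left = np.kron(np.kron(u, v), w)   # (u (x) v) (x) w
right = np.kron(u, np.kron(v, w))  # u (x) (v (x) w)

# Both groupings give the same element of the triple tensor product,
# whose dimension is the product of the three dimensions.
assert np.allclose(left, right)
assert left.shape == (2 * 3 * 2,)
```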

It is also possible to generalize the definition to tensor products of modules over the same ring. If the ring is non-commutative, we need to be careful to distinguish between right modules and left modules. We will write RM for a left module, and MR for a right module. If a module M has both a left module structure over a ring R and a right module structure over a ring S, and in addition r(ms) = (rm)s for every m, r, s, then we say M is a left-right module (a bimodule), and denote it by RMS. Note that every left module is a left-right module with Z as the right ring, and vice versa.

When defining the tensor product, we also need to be careful about the ring: most modules can be considered as modules over several different rings.

The most general form of the definition is as follows: let SMR and RNK be modules; then the tensor product over R is an R-bilinear operator T: M x N -> P such that for every R-bilinear operator B: M x N -> O there is a unique linear operator L: P -> O with L o T = B. There is a unique left-right module structure on P such that T is left-linear over S and right-linear over K; if B is left-linear over S then so is L, and if B is right-linear over K then so is L. The pair (P, T) is unique up to a unique isomorphism, and P and T are called the "tensor space" and "tensor product" respectively.

Note that for a commutative ring R, and in particular for a field, a module is both a right module and a left module. Hence, the tensor product of two modules over a commutative ring is again a module over that ring. Also note that this definition is naturally associative, and we can use it to define the tensor product of any number of spaces.

Example: Consider the rational numbers Q and the integers modulo n, Zn. Both can be considered as modules over the integers, Z. Let B: Q x Zn -> M be a bilinear operator. Then B(q,i) = B(n*(q/n), i) = B(q/n, n*i) = B(q/n, 0) = 0, since n*i = 0 in Zn; so every bilinear operator is identically zero. Therefore, if we define P to be the trivial module and T to be the zero bilinear operator, we see that the properties of the tensor product are satisfied. Therefore, the tensor product of Q and Zn is {0}.

The tensor product is useful when we wish to deal with bilinear operators as if they were linear operators. However, linear subspaces of bilinear operators (or, in general, multilinear operators) determine natural quotient spaces of the tensor space, which are frequently useful.

An anti-symmetric multilinear operator is an operator m: V^n -> X such that whenever there is a linear dependence between its arguments, the result is 0. Note that the sum of two anti-symmetric operators, or a scalar multiple of one, is still anti-symmetric -- so these operators form a vector space.

The most famous example of an anti-symmetric operator is the determinant.
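Both defining properties are easy to observe numerically for the determinant, viewed as an operator on the rows of a matrix. A small sketch (the matrix entries are arbitrary):

```python
import numpy as np

# The determinant is n-linear and anti-symmetric in the rows.
rows = np.array([[1.0, 2.0, 3.0],
                 [0.0, 1.0, 4.0],
                 [5.0, 6.0, 0.0]])

# Swapping two arguments (rows) flips the sign.
swapped = rows[[1, 0, 2]]
assert np.isclose(np.linalg.det(swapped), -np.linalg.det(rows))

# A linear dependence between the arguments forces the value to zero.
dependent = rows.copy()
dependent[2] = 2 * rows[0] + 3 * rows[1]
assert np.isclose(np.linalg.det(dependent), 0.0)
```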

The nth wedge space W of a module V over a commutative ring R, together with the anti-symmetric n-linear wedge operator w: V^n -> W, is defined by the requirement that for every n-linear anti-symmetric operator m: V^n -> X there exists a unique linear operator l: W -> X such that m = l o w. The wedge space is unique up to a unique isomorphism.

One way of defining the wedge space constructively is by dividing the tensor space by the subspace generated by all tensor products of n-tuples which are linearly dependent.

The dimension of the kth wedge space of a free module of dimension n is n!/(k!(n-k)!). In particular, this means that up to a scalar constant, there is a single anti-symmetric functional whose arity equals the dimension of the space. Also note that every linear functional is (trivially) anti-symmetric.
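The binomial dimension count can be tabulated directly; `math.comb(n, k)` computes n!/(k!(n-k)!). A short sketch for n = 5 (the rank is chosen for illustration):

```python
from math import comb

# Rank of the k-th wedge power of a free module of rank n is C(n, k):
# a basis is given by e_{i1} ^ ... ^ e_{ik} with i1 < ... < ik.
n = 5
dims = [comb(n, k) for k in range(n + 1)]
assert dims == [1, 5, 10, 10, 5, 1]

# In top arity k = n the space is one-dimensional: up to a scalar,
# there is a single anti-symmetric n-linear functional (the determinant).
assert comb(n, n) == 1
```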

Note that the wedge operator is compatible with the dual space operator *. In other words, we can define a wedge on functionals such that the result is an anti-symmetric multilinear functional. In general, the wedge of an n-linear anti-symmetric functional and an m-linear anti-symmetric functional is an (n+m)-linear anti-symmetric functional. Since this operation turns out to be associative, we can also define powers of an anti-symmetric linear functional.
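For two linear (1-linear) functionals f and g, the usual convention gives the wedge as the anti-symmetrized product (f ∧ g)(u, v) = f(u)g(v) - f(v)g(u); the formula and the specific vectors below are an illustrative sketch:

```python
import numpy as np

# Linear functionals on R^3, represented by their coefficient vectors:
# f(u) = f . u, and similarly for g.
f = np.array([1.0, 2.0, 0.0])
g = np.array([0.0, 1.0, 3.0])

def wedge(u, v):
    """(f ^ g)(u, v) = f(u) g(v) - f(v) g(u)."""
    return (f @ u) * (g @ v) - (f @ v) * (g @ u)

u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, 1.0])

assert np.isclose(wedge(u, v), -wedge(v, u))   # anti-symmetry
assert np.isclose(wedge(u, u), 0.0)            # vanishes on repeated arguments
```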

When dealing with differentiable manifolds, we define an "n-form" to be a function assigning to each point of the manifold an element of the nth wedge of its cotangent space (a section of the nth wedge of the cotangent bundle). Such a form is said to be differentiable if, when applied to n differentiable vector fields, the result is a differentiable function.