
The Tensor Product, Demystified

Previously on the blog, we've discussed a recurring theme throughout mathematics: making new things from old things. Mathematicians do this all the time:

  • When you have two integers, you can find their greatest common divisor or least common multiple.
  • When you have some sets, you can form their Cartesian product or their union.
  • When you have two groups, you can construct their direct sum or their free product.
  • When you have a topological space, you can look for a subspace or a quotient space.
  • When you have some vector spaces, you can ask for their direct sum or their intersection.
  • The list goes on!

Today, I'd like to focus on a particular way to build a new vector space from old vector spaces: the tensor product . This construction often comes across as scary and mysterious, but I hope to help shine a little light and dispel some of the fear. In particular, we won't talk about axioms, universal properties, or commuting diagrams. Instead, we'll take an elementary, concrete look:

Given two vectors $\mathbf{v}$ and $\mathbf{w}$, we can build a new vector, called the tensor product $\mathbf{v}\otimes \mathbf{w}$. But what is that vector, really ? Likewise, given two vector spaces $V$ and $W$, we can build a new vector space, also called their  tensor product $V\otimes W$. But what is that vector space, really ? 

Making new vectors from old

In this discussion, we'll assume $V$ and $W$ are finite dimensional vector spaces. That means we can think of $V$ as $\mathbb{R}^n$ and $W$ as $\mathbb{R}^m$ for some positive integers $n$ and $m$. So a vector $\mathbf{v}$ in $\mathbb{R}^n$ is really just a list of $n$ numbers, while a vector $\mathbf{w}$ in $\mathbb{R}^m$ is just a list of $m$ numbers.


Let's try to make a new, third vector out of $\mathbf{v}$ and $\mathbf{w}$. But how? Here are two ideas: we can stack them on top of each other, or we can multiply every number in $\mathbf{v}$ by every number in $\mathbf{w}$ and list all of the products.


The first option gives a new list of $n+m$ numbers, while the second option gives a new list of $nm$ numbers. The first gives a way to build a new space where the dimensions add;  the second gives a way to build a new space where the dimensions multiply . The first is a vector $(\mathbf{v},\mathbf{w})$ in the direct sum  $V\oplus W$ (this is the same as their direct product $V\times W$); the second is a vector $\mathbf{v}\otimes \mathbf{w}$ in the tensor product $V\otimes W$.
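If it helps to see the two constructions side by side numerically, here is a minimal NumPy sketch (the particular vectors are made up for illustration): stacking uses `np.concatenate`, while multiplying every entry of $\mathbf{v}$ by every entry of $\mathbf{w}$ is exactly what `np.kron` does for 1-D arrays.

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])   # a vector in R^3  (n = 3)
w = np.array([4.0, 5.0])        # a vector in R^2  (m = 2)

# Direct sum (v, w): stack the two lists; dimensions add (3 + 2 = 5).
direct_sum = np.concatenate([v, w])

# Tensor product v (x) w: multiply every entry of v by every entry of w;
# dimensions multiply (3 * 2 = 6).
tensor_product = np.kron(v, w)

print(direct_sum)      # [1. 2. 3. 4. 5.]
print(tensor_product)  # [ 4.  5.  8. 10. 12. 15.]
```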

And that's it! 


Forming the tensor product $\mathbf{v}\otimes \mathbf{w}$ of two vectors is a lot like forming the Cartesian product of two sets $X\times Y$. In fact, that's exactly what we're doing if we think of $X$ as the set whose elements are the entries of $\mathbf{v}$ and similarly for $Y$. 


So a tensor product is like a grown-up version of multiplication. It's what happens when you systematically multiply a bunch of numbers together, then organize the results into a list. It's multi-multiplication, if you will.


There's a little more to the story.

Does every vector in $V\otimes W$ look like $\mathbf{v}\otimes\mathbf{w}$ for some $\mathbf{v}\in V$ and $\mathbf{w}\in W$? Not quite. Remember, a vector in a vector space can be written as a weighted sum of basis vectors , which are like the space's building blocks. This is another instance of making new things from existing ones: we get a new vector by taking a weighted sum of some special vectors!
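One way to convince yourself of the "Not quite" is to reshape candidate vectors into matrices (we'll say more about this reshaping below): a vector of the form $\mathbf{v}\otimes\mathbf{w}$ always reshapes into a rank-one matrix, so any vector whose reshaped matrix has rank two or more cannot be of that form. A quick NumPy check, with made-up vectors in $\mathbb{R}^2\otimes\mathbb{R}^2$:

```python
import numpy as np

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

pure  = np.kron(e1, e2)                    # e1 (x) e2: a single tensor product
mixed = np.kron(e1, e1) + np.kron(e2, e2)  # e1 (x) e1 + e2 (x) e2: a sum of two

# Reshape each length-4 vector in R^2 (x) R^2 into a 2x2 matrix.
print(np.linalg.matrix_rank(pure.reshape(2, 2)))   # 1 -> it is some v (x) w
print(np.linalg.matrix_rank(mixed.reshape(2, 2)))  # 2 -> it cannot be any v (x) w
```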


So a typical vector in $V\otimes W$ is a weighted sum of basis vectors. What are those basis vectors? Well, there must be exactly $nm$ of them, since the dimension of $V\otimes W$ is $nm$. Moreover, we'd expect them to be built up from the basis of $V$ and the basis of $W$. This brings us again to the "How can we construct new things from old things?" question. Asked explicitly: if we have $n$ basis vectors $\mathbf{v}_1,\ldots,\mathbf{v}_n$ for $V$ and $m$ basis vectors $\mathbf{w}_1,\ldots,\mathbf{w}_m$ for $W$, how can we combine them to get a new set of $nm$ vectors?

This is totally analogous to the construction we saw above: given a list of $n$ things and a list of $m$ things, we can obtain a list of $nm$ things by multiplying them all together. So we'll do the same thing here! We'll simply multiply the $\mathbf{v}_i$ together with the $\mathbf{w}_j$ in all possible combinations, except  "multiply $\mathbf{v}_i$ and $\mathbf{w}_j$ " now means "take the tensor product of $\mathbf{v}_i$ and $\mathbf{w}_j$." 

Concretely, a basis for $V\otimes W$ is the set of all vectors of the form $\mathbf{v}_i\otimes\mathbf{w}_j$ where $i$ ranges from $1$ to $n$ and $j$ ranges from $1$ to $m$. As an example, suppose $n=3$ and $m=2$ as before. Then we can find the six basis vectors for $V\otimes W$ by forming a 'multiplication chart.' (The sophisticated way to say this is: "$V\otimes W$ is the free vector space on $A\times B$, where $A$ is a set of generators for $V$ and $B$ is a set of generators for $W$.")


So $V\otimes W$ is the six-dimensional space with basis

$$\{\mathbf{v}_1\otimes\mathbf{w}_1,\;\mathbf{v}_1\otimes\mathbf{w}_2,\; \mathbf{v}_2\otimes\mathbf{w}_1,\;\mathbf{v}_2\otimes\mathbf{w}_2,\;\mathbf{v}_3\otimes\mathbf{w}_1,\;\mathbf{v}_3\otimes\mathbf{w}_2 \}$$

This might feel a little abstract with all the $\otimes$ symbols littered everywhere. But don't forget—we know exactly what each $\mathbf{v}_i\otimes\mathbf{w}_j$ looks like—it's just a list of numbers! Which list of numbers? If the $\mathbf{v}_i$ and $\mathbf{w}_j$ are the standard basis vectors, then every product of entries in $\mathbf{v}_i\otimes\mathbf{w}_j$ is zero except for a single $1$, so $\mathbf{v}_i\otimes\mathbf{w}_j$ is just one of the six standard basis vectors of $\mathbb{R}^6$.
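Here is a small sketch of that claim for $n=3$ and $m=2$ with the standard bases: stacking all six Kronecker products $\mathbf{v}_i\otimes\mathbf{w}_j$ as rows produces the $6\times 6$ identity matrix, so they are precisely the standard basis of $\mathbb{R}^6$.

```python
import numpy as np

e = np.eye(3)  # standard basis v_1, v_2, v_3 of R^3
f = np.eye(2)  # standard basis w_1, w_2 of R^2

# The six tensor products v_i (x) w_j, in the order listed above.
basis = [np.kron(e[i], f[j]) for i in range(3) for j in range(2)]

# Stacked as rows they give the 6x6 identity: they form a basis of R^6.
print(np.array_equal(np.vstack(basis), np.eye(6)))  # True
```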


So what is $V\otimes W$? It's the vector space whose vectors are linear combinations of the $\mathbf{v}_i\otimes\mathbf{w}_j$, such as $2\,\mathbf{v}_1\otimes\mathbf{w}_1 - \mathbf{v}_3\otimes\mathbf{w}_2$ or $\mathbf{v}_1\otimes\mathbf{w}_2 + \mathbf{v}_2\otimes\mathbf{w}_1$.


Well, technically ...

Technically, $\mathbf{v}\otimes\mathbf{w}$ is called the outer product of $\mathbf{v}$ and $\mathbf{w}$ and is defined by $$\mathbf{v}\otimes\mathbf{w}:=\mathbf{v}\mathbf{w}^\top$$ where $\mathbf{w}^\top$ is the same as $\mathbf{w}$ but written as a row vector. (And if the entries of $\mathbf{w}$ are complex numbers, then we also replace each entry by its complex conjugate.) So technically, the tensor product of two vectors is a matrix.


This may seem to be in conflict with what we did above, but it's not! The two go hand in hand. Any $m\times n$ matrix can be reshaped into an $nm\times 1$ column vector and vice versa. (So thus far, we've been exploiting the fact that $\mathbb{R}^3\otimes\mathbb{R}^2$ is isomorphic to $\mathbb{R}^6$.) You might refer to this as matrix-vector duality.
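In NumPy the duality really is just a reshape; a minimal sketch with made-up real vectors (so no complex conjugation is needed):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])  # in R^3
w = np.array([4.0, 5.0])       # in R^2

outer = np.outer(v, w)  # the 3x2 matrix v w^T
kron  = np.kron(v, w)   # the length-6 list of numbers from before

# Same numbers, two shapes: flattening the matrix row by row recovers
# the vector, and reshaping the vector recovers the matrix.
print(np.array_equal(outer.reshape(-1), kron))    # True
print(np.array_equal(kron.reshape(3, 2), outer))  # True
```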


It's a little like a process-state duality . On the one hand, a matrix $\mathbf{v}\otimes\mathbf{w}$ is a process —it's a concrete representation of a (linear) transformation. On the other hand, $\mathbf{v}\otimes\mathbf{w}$ is, abstractly speaking, a vector. And a vector is the mathematical gadget that physicists use to describe the state of a quantum system. So matrices encode processes; vectors encode states. The upshot is that a vector in a tensor product $V\otimes W$ can be viewed in either way simply by reshaping the numbers as a list or as a rectangle.


By the way, this idea of viewing a matrix as a process can easily be generalized to higher-dimensional arrays, too. These arrays are called tensors, and whenever you do a bunch of these processes together, the resulting mega-process gives rise to a tensor network . But manipulating high-dimensional arrays of numbers can get very messy very quickly: there are lots of numbers that all have to be multiplied together. This is like multi-multi-multi-multi...plication. Fortunately, tensor networks come with lovely pictures that make these computations very simple. (It goes back to Roger Penrose's graphical calculus .) This is a conversation I'd like to have here, but it'll have to wait for another day!

In quantum physics

One application of tensor products is related to the brief statement I made above: "A vector  is the mathematical gadget that physicists use to describe the  state  of a quantum system." To elaborate: if you have a little quantum particle, perhaps you’d like to know what it’s doing. Or what it’s capable of doing. Or the probability that it’ll be doing something. In essence, you're asking: What’s its status? What’s its  state ? The answer to this question— provided by a postulate of quantum mechanics—is given by a unit vector in a vector space. (Really, a Hilbert space, say $\mathbb{C}^n$.) That unit vector encodes information about that particle.


The dimension $n$ is, loosely speaking, the number of different things you could observe after making a measurement on the particle. But what if we have two little quantum particles? The state of that two-particle system can be described by something called a density matrix  $\rho$ on the tensor product of their respective spaces $\mathbb{C}^n\otimes\mathbb{C}^n$. A density matrix is a generalization of a unit vector—it accounts for interactions between the two particles. 


The same story holds for $N$ particles—the state of an $N$-particle system can be described by a density matrix on an $N$-fold tensor product.
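As a minimal illustration of the two-particle case (the single-particle state vectors below are invented for the example), the joint state of two non-interacting qubits is the Kronecker product of the individual unit vectors, and the density matrix of that pure product state is its outer product with its own conjugate:

```python
import numpy as np

# Single-particle states: unit vectors in C^2 (values chosen for illustration).
psi = np.array([1.0, 1.0j]) / np.sqrt(2)
phi = np.array([1.0, 0.0], dtype=complex)

# The two-particle state lives in C^2 (x) C^2, which reshapes to C^4.
joint = np.kron(psi, phi)

# Density matrix of this pure product state.
rho = np.outer(joint, joint.conj())

print(joint.shape, rho.shape)          # (4,) (4, 4)
print(np.isclose(np.trace(rho), 1.0))  # True: a density matrix has trace 1
```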


But why the tensor product?  Why is it that this construction—out of all things—describes the interactions within a quantum system so well, so naturally? I don’t know the answer, but perhaps the appropriateness of tensor products shouldn't be too surprising. The tensor product itself captures all ways that basic things can "interact" with each other!


Of course, there's lots more to be said about tensor products. I've only shared a snippet of basic arithmetic. For a deeper look into the mathematics, I recommend reading through Jeremy Kun's wonderfully lucid How to Conquer Tensorphobia  and  Tensorphobia and the Outer Product . Enjoy!

Tensor product

$\newcommand{\tensor}{\otimes}$ $\newcommand{\lieg}{\mathfrak{g}}$ $\newcommand{\iso}{\cong}$


Tensor product of two unitary modules

The tensor product of two unitary modules $V_1$ and $V_2$ over an associative commutative ring $A$ with a unit is the $A$-module $V_1 \tensor_A V_2$ together with an $A$-bilinear mapping

$$(x_1, x_2) \mapsto x_1 \tensor x_2 \in V_1 \tensor_A V_2$$ which is universal in the following sense: For any $A$-bilinear mapping $\beta: V_1 \times V_2 \to W$, where $W$ is an arbitrary $A$-module, there is a unique $A$-linear mapping $b : V_1 \tensor_A V_2 \to W$ such that

$$\beta(x_1, x_2) = b(x_1 \tensor x_2), \qquad x_1 \in V_1, \qquad x_2 \in V_2.$$ The tensor product is uniquely defined up to a natural isomorphism. It always exists and can be constructed as the quotient module of the free $A$-module $F$ generated by the set $V_1 \times V_2$ modulo the $A$-submodule $R$ generated by the elements of the form

$$(x_1 + y, x_2) - (x_1, x_2) - (y, x_2),$$

$$(x_1, x_2 + z) - (x_1, x_2) - (x_1, z),$$

$$(cx_1, x_2) - c(x_1, x_2),$$

$$(x_1, cx_2) - c(x_1, x_2),$$

$$x_1, y \in V_1, \qquad x_2, z \in V_2, \qquad c \in A;$$ then $x_1 \tensor x_2 = (x_1, x_2) + R$. If one gives up the requirement of commutativity of $A$, a construction close to the one described above allows one to form from a right $A$-module $V_1$ and a left $A$-module $V_2$ an Abelian group $V_1 \tensor_A V_2$, also called the tensor product of these modules [1] . In what follows $A$ will be assumed to be commutative.

The tensor product has the following properties:

$$A \tensor_A V \iso V,$$

$$V_1 \tensor_A V_2 \iso V_2 \tensor_A V_1,$$

$$(V_1 \tensor_A V_2) \tensor V_3 \iso V_1 \tensor_A (V_2 \tensor_A V_3),$$

$$\left( \bigoplus_{i \in I} V_i \right) \tensor_A W \iso \bigoplus_{i \in I} (V_i \tensor_A W)$$ for any $A$-modules $V$, $V_i$ and $W$.

If $(x_i)_{i \in I}$ and $(y_j)_{j \in J}$ are bases of the free $A$-modules $V_1$ and $V_2$, then $(x_i \tensor y_j)_{(i,j) \in I\times J}$ is a basis of the module $V_1 \tensor_A V_2$. In particular,

$$\dim(V_1 \tensor_A V_2) = \dim V_1 \cdot \dim V_2$$ if the $V_i$ are free finitely-generated modules (for instance, finite-dimensional vector spaces over a field $A$). The tensor product of cyclic $A$-modules is computed by the formula

$$(A/I) \tensor_A (A/J) \iso A/(I+J)$$ where $I$ and $J$ are ideals in $A$.

One also defines the tensor product of arbitrary (not necessarily finite) families of $A$-modules. The tensor product

$$\bigotimes^p V = V \tensor_A \cdots \tensor_A V \qquad (p \text{ factors})$$ is called the $p$-th tensor power of the $A$-module $V$; its elements are the contravariant tensors (cf. Tensor on a vector space ) of degree $p$ on $V$.

To any pair of homomorphisms of $A$-modules $\alpha_i : V_i \to W_i$, $i=1,2$, corresponds their tensor product $\alpha_1 \tensor \alpha_2$, which is a homomorphism of $A$-modules $V_1 \tensor_A V_2 \to W_1 \tensor_A W_2$ and is defined by the formula

$$(\alpha_1 \tensor \alpha_2) (x_1 \tensor x_2) = \alpha_1(x_1)\tensor \alpha_2(x_2), \qquad x_i \in V_i.$$ This operation can also be extended to arbitrary families of homomorphisms and has functorial properties (see Module ). It defines a homomorphism of $A$-modules $$\operatorname{Hom}_A(V_1, W_1) \tensor_A \operatorname{Hom}_A(V_2, W_2) \to \operatorname{Hom}_A(V_1 \tensor V_2, W_1 \tensor W_2),$$ which is an isomorphism if all the $V_i$ and $W_i$ are free and finitely generated.

An important interpretation of the tensor product in (theoretical) physics is as follows. Often the states of an object, say, a particle, are defined as the vector space $V$ over $\mathbb{C}$ of all complex linear combinations of a set of pure states $e_i$, $i \in I$. Let the pure states of a second similar object be $f_j$, $j \in J$, yielding a second vector space $W$. Then the pure states of the ordered pair of objects are all pairs $(e_i, f_j)$ and the space of states of this ordered pair is the tensor product $V\tensor_{\mathbb{C}} W$.

Tensor product of two algebras

The tensor product of two algebras $C_1$ and $C_2$ over an associative commutative ring $A$ with a unit is the algebra $C_1 \tensor_A C_2$ over $A$ which is obtained by introducing on the tensor product $C_1 \tensor_A C_2$ of $A$-modules a multiplication according to the formula

$$(x_1 \tensor x_2)(y_1 \tensor y_2) = (x_1 y_1) \tensor (x_2 y_2), \qquad x_i, y_i \in C_i.$$ This definition can be extended to the case of an arbitrary family of factors. The tensor product $C_1 \tensor_A C_2$ is associative and commutative and contains a unit if both algebras $C_i$ have a unit. If $C_1$ and $C_2$ are algebras with a unit over the field $A$, then $\widetilde C_1 = C_1 \tensor \mathbf{1}$ and $\widetilde C_2 = \mathbf{1} \tensor C_2$ are subalgebras of $C_1 \tensor_A C_2$ which are isomorphic to $C_1$ and $C_2$ and commute elementwise. Conversely, let $C$ be an algebra with a unit over the field $A$, and let $C_1$ and $C_2$ be subalgebras of it containing its unit and such that $x_1 x_2 = x_2 x_1$ for any $x_i \in C_i$. Then there is a homomorphism of $A$-algebras $\phi : C_1 \tensor_A C_2 \to C$ such that $\phi(x_1 \tensor x_2) = x_1 x_2$, $x_i \in C_i$. For $\phi$ to be an isomorphism it is necessary and sufficient that there is in $C_1$ a basis over $A$ which is also a basis of the right $C_2$-module $C$.

Tensor product of two matrices (by D.A. Suprunenko)

The tensor product, or Kronecker product (cf. Matrix multiplication ), of two matrices $A = [ \alpha_{ij} ]$ and $B$ is the matrix

$$A \tensor B = \begin{bmatrix} \alpha_{11} B & \cdots & \alpha_{1n} B \\ \vdots & \ddots & \vdots \\ \alpha_{m1} B & \cdots & \alpha_{mn} B \end{bmatrix}.$$ Here, $A$ is an $(m\times n)$-matrix, $B$ is a $(p \times q)$-matrix and $A \tensor B$ is an $(mp \times nq)$-matrix over an associative commutative ring $k$ with a unit.

Properties of the tensor product of matrices are:

$$(A_1 + A_2) \tensor B = A_1 \tensor B + A_2 \tensor B,$$

$$A \tensor (B_1 + B_2) = A \tensor B_1 + A\tensor B_2,$$

$$\alpha(A \tensor B) = \alpha A \tensor B = A \tensor \alpha B,$$ where $\alpha \in k$,

$$(A \tensor B)(C \tensor D) = AC \tensor BD.$$ If $m=n$ and $p=q$, then

$$\det(A \tensor B) = (\det A)^p (\det B)^n.$$ Let $k$ be a field, $m=n$ and $p=q$. Then $A\tensor B$ is similar to $B \tensor A$, and $\det(A \tensor E_p - E_n \tensor B)$, where $E_s$ is the unit matrix, coincides with the resultant of the characteristic polynomials of $A$ and $B$.

If $\alpha : V \to V'$ and $\beta : W \to W'$ are homomorphisms of unitary free finitely-generated $k$-modules and $A$ and $B$ are their matrices in certain bases, then $A \tensor B$ is the matrix of the homomorphism $\alpha \tensor \beta : V \tensor W \to V' \tensor W'$ in the basis consisting of the tensor products of the basis vectors.
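These identities are easy to spot-check numerically. The sketch below uses small random matrices (the sizes $n=3$ and $p=2$ are arbitrary) and NumPy's `kron`:

```python
import numpy as np

rng = np.random.default_rng(0)
A, C = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))  # n = 3
B, D = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))  # p = 2

# Mixed-product property: (A (x) B)(C (x) D) = (AC) (x) (BD).
print(np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D)))

# Determinant formula: det(A (x) B) = det(A)^p * det(B)^n  with n = 3, p = 2.
print(np.allclose(np.linalg.det(np.kron(A, B)),
                  np.linalg.det(A) ** 2 * np.linalg.det(B) ** 3))
```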

Tensor product of two representations (by A.I. Shtern)

The tensor product of two representations $\pi_1$ and $\pi_2$ of a group $G$ in vector spaces $E_1$ and $E_2$, respectively, is the representation $\pi_1 \tensor \pi_2$ of $G$ in $E_1 \tensor E_2$ uniquely defined by the condition

$$(\pi_1 \tensor \pi_2) (g) (\xi_1 \tensor \xi_2) = \pi_1(g) \xi_1 \tensor \pi_2(g) \xi_2 \tag{*}$$ for all $\xi_1 \in E_1$, $\xi_2 \in E_2$ and $g \in G$. If $\pi_1$ and $\pi_2$ are continuous unitary representations of a topological group $G$ in Hilbert spaces $E_1$ and $E_2$, respectively, then the operators $(\pi_1 \tensor \pi_2)(g)$, $g \in G$, in the vector space $E_1 \tensor E_2$ admit a unique extension by continuity to continuous linear operators $(\pi_1 \bar\tensor \pi_2)(g)$, $g\in G$, in the Hilbert space $E_1 \bar\tensor E_2$ (the completion of the space $E_1 \tensor E_2$ with respect to the scalar product defined by the formula $(\xi_1 \tensor \xi_2, \eta_1 \tensor \eta_2) = (\xi_1, \eta_1)(\xi_2, \eta_2)$), and the mapping $\pi_1 \bar\tensor \pi_2 : g \mapsto (\pi_1 \bar\tensor \pi_2)(g)$, $g \in G$, is a continuous unitary representation of the group $G$ in the Hilbert space $E_1 \bar\tensor E_2$, called the tensor product of the unitary representations $\pi_1$ and $\pi_2$. The representations $\pi_1 \tensor \pi_2$ and $\pi_2 \tensor \pi_1$ are equivalent (unitarily equivalent if $\pi_1$ and $\pi_2$ are unitary). The operation of tensor multiplication can be defined also for continuous representations of a topological group in topological vector spaces of a general form.
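Numerically, the defining condition (*) and the fact that $g \mapsto \pi_1(g)\otimes\pi_2(g)$ is again a homomorphism can be checked with Kronecker products. The sketch below invents two rotation representations of the cyclic group $\mathbb{Z}/6$ purely for illustration:

```python
import numpy as np

def rot(theta):
    """2x2 rotation matrix."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

n = 6
pi1 = lambda k: rot(2 * np.pi * k / n)   # a representation of Z/6 on R^2
pi2 = lambda k: rot(4 * np.pi * k / n)   # another one (doubled angle)
tp  = lambda k: np.kron(pi1(k), pi2(k))  # (pi1 (x) pi2)(k) acting on R^2 (x) R^2

# Defining condition (*): (pi1 (x) pi2)(g)(xi1 (x) xi2) = pi1(g)xi1 (x) pi2(g)xi2.
xi1, xi2 = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(np.allclose(tp(1) @ np.kron(xi1, xi2),
                  np.kron(pi1(1) @ xi1, pi2(1) @ xi2)))  # True

# And it is a homomorphism: (pi1 (x) pi2)(g + h) = (pi1 (x) pi2)(g)(pi1 (x) pi2)(h).
print(all(np.allclose(tp((g + h) % n), tp(g) @ tp(h))
          for g in range(n) for h in range(n)))          # True
```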

If $\pi_i$ is a representation of an algebra $A_i$ in a vector space $E_i$, $i=1,2$, one defines the tensor product $\pi_1 \tensor \pi_2$, which is a representation of $A_1\tensor A_2$ in $E_1\tensor E_2$, by

$$(\pi_1 \tensor \pi_2) (a_1 \tensor a_2) = \pi_1(a_1) \tensor \pi_2(a_2).$$ In case $A = A_1 = A_2$ is a bi-algebra (cf. Hopf algebra ), composition of this representation with the comultiplication $A \to A \tensor A$ (which is an algebra homomorphism) yields a new representation of $A$, (also) called the tensor product.

In case $G$ is a group, a representation of $G$ is the same as a representation of the group algebra $k[G]$ of $G$, which is a bi-algebra, so that the previous construction applies, giving the same definition as (*) above. (The comultiplication on $k[G]$ is given by $g\mapsto g \tensor g$.)

In case $\lieg$ is a Lie algebra, a representation of $\lieg$ is the same as a representation of its universal enveloping algebra , $U_\lieg$, which is also a bi-algebra (with comultiplication defined by $x\mapsto 1 \tensor x + x \tensor 1$, $x \in \lieg$). This permits one to define the tensor product of two representations of a Lie algebra:

$$(\pi_1 \tensor \pi_2)(x) = 1 \tensor \pi_2(x) + \pi_1(x) \tensor 1.$$
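A quick numerical sanity check of this formula, using the spin-1/2 representation of $\lieg = \mathfrak{su}(2)$ (with the common convention $\pi(e_k) = -\tfrac{i}{2}\sigma_k$, so that $[\pi(e_1),\pi(e_2)]=\pi(e_3)$ cyclically): the operators $\pi(x)\otimes 1 + 1\otimes\pi(x)$ satisfy the same bracket relations, so they again define a representation.

```python
import numpy as np

# Pauli matrices and the spin-1/2 representation pi(e_k) = -i * sigma_k / 2.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
pi = [-0.5j * s for s in (sx, sy, sz)]

I2 = np.eye(2)
tp = [np.kron(x, I2) + np.kron(I2, x) for x in pi]  # pi(x) (x) 1 + 1 (x) pi(x)

comm = lambda a, b: a @ b - b @ a

# Both families satisfy [X_1, X_2] = X_3 (and cyclic permutations).
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    assert np.allclose(comm(pi[i], pi[j]), pi[k])
    assert np.allclose(comm(tp[i], tp[j]), tp[k])
print("bracket relations preserved")
```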

Tensor product of two vector bundles

The tensor product of two vector bundles $E$ and $F$ over a topological space $X$ is the vector bundle $E\tensor F$ over $X$ whose fibre at a point $x \in X$ is the tensor product of the fibres $E_x \tensor F_x$. The tensor product can be defined as the bundle whose transfer function is the tensor product of the transfer functions of the bundles $E$ and $F$ in the same trivializing covering (see Tensor product of matrices, above).

For a vector bundle $E$ over a space $X$ and a vector bundle $F$ over a space $Y$ one defines the vector bundle $E \times F$ over $X \times Y$ (sometimes written $E \tensor F$) as the vector bundle over $X \times Y$ with fibre $E_x \tensor F_y$ over $(x, y)$. Pulling back this bundle by the diagonal mapping $x \mapsto (x, x)$ defines the tensor product defined above.

[1] N. Bourbaki, "Elements of mathematics. Algebra: Algebraic structures. Linear algebra", Addison-Wesley (1974) pp. Chapt. 1; 2 (Translated from French)
[2] F. Kasch, "Modules and rings", Acad. Press (1982) (Translated from German)
[3] A.I. Kostrikin, Yu.I. Manin, "Linear algebra and geometry", Gordon & Breach (1989) (Translated from Russian)
[4] P.R. Halmos, "Finite-dimensional vector spaces", Van Nostrand (1958)
[5] M.F. Atiyah, "$K$-theory: lectures", Benjamin (1967)


8.1 Definition and properties of tensor products


Tensor products in representation theory combine two representations to create a new, larger one. They're crucial for understanding how complex systems interact, allowing us to build up more intricate structures from simpler components.

Mastering tensor products opens doors to analyzing symmetries in physics and decomposing representations. The key properties— bilinearity , associativity , and distributivity —provide powerful tools for manipulating and simplifying these mathematical objects.

Understanding Tensor Products in Representation Theory

Tensor product of representations.

  • The tensor product $V \otimes W$ combines representations $V$ and $W$, creating a vector space of formal linear combinations of elements $v \otimes w$
  • Bilinearity allows distributive manipulation: $(av + bv') \otimes w = a(v \otimes w) + b(v' \otimes w)$ and $v \otimes (cw + dw') = c(v \otimes w) + d(v \otimes w')$
  • Scalar multiplication interacts consistently: $a(v \otimes w) = (av) \otimes w = v \otimes (aw)$
  • The group action extends naturally to the tensor product: $g(v \otimes w) = gv \otimes gw$ for a group element $g$


Construction of tensor products

  • Identify basis elements of each representation (e.g., $\{v_i\}$ for $V$, $\{w_j\}$ for $W$)
  • Form tensor products of basis elements $\{v_i \otimes w_j\}$
  • Define the group action on these tensor products
  • The resulting basis of $V \otimes W$ consists of all combinations $v_i \otimes w_j$
  • The dimension multiplies: $\dim(V \otimes W) = \dim(V) \cdot \dim(W)$ (e.g., a 3-dimensional $V$ and a 2-dimensional $W$ yield a 6-dimensional $V \otimes W$)

Properties of tensor products

  • Associativity $(U \otimes V) \otimes W \cong U \otimes (V \otimes W)$ enables flexible grouping of multiple tensor products
  • Distributivity over direct sums $(U \oplus V) \otimes W \cong (U \otimes W) \oplus (V \otimes W)$ allows splitting computations
  • Proofs involve constructing isomorphisms between spaces and showing bijections between basis elements

Dimension of tensor products

  • The formula $\dim(V \otimes W) = \dim(V) \cdot \dim(W)$ is derived by counting basis elements
  • Independence from the chosen bases is crucial for consistency
  • Applications extend to character theory: $\chi_{V \otimes W}(g) = \chi_V(g) \cdot \chi_W(g)$
  • Facilitates decomposition of tensor products into irreducible representations (e.g., analyzing symmetries in quantum mechanics); a numerical check of the dimension and character formulas appears after this list
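A minimal numerical check of the dimension and character formulas (the matrices below are random stand-ins for $\rho_V(g)$ and $\rho_W(g)$, not the representation matrices of any particular group), using the facts that the Kronecker product of a $3\times 3$ and a $2\times 2$ matrix is $6\times 6$ and that $\operatorname{tr}(A\otimes B)=\operatorname{tr}(A)\operatorname{tr}(B)$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))  # stand-in for rho_V(g) on a 3-dimensional V
B = rng.standard_normal((2, 2))  # stand-in for rho_W(g) on a 2-dimensional W

# dim(V (x) W) = dim(V) * dim(W)
print(np.kron(A, B).shape)  # (6, 6)

# chi_{V(x)W}(g) = chi_V(g) * chi_W(g), since trace(A (x) B) = trace(A) * trace(B).
print(np.isclose(np.trace(np.kron(A, B)), np.trace(A) * np.trace(B)))  # True
```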



Tensor Product Representation

Given an $n$-dimensional and an $m$-dimensional representation, we can construct an $nm$-dimensional tensor-product representation by letting it act on the tensor-product space. In particular, we can construct an $n^2$-dimensional representation from two copies of an $n$-dimensional representation.

Tensor-product representations are useful for constructing new representations from old ones. Take the product of two old representations and break the result up into irreducible representations.

In quantum mechanics, tensor-product representations are useful for describing multi-particle systems and for studying entanglement among the particles.

The diagram below shows the defining representation of $SU(2)$ in its upper branch. To construct the tensor-product representation from two copies of the defining representation, we let it act on the tensor-product space, $\mathbb{C}^2 \otimes \mathbb{C}^2$, as shown in the lower branch.

The resulting 4-dimensional representation is reducible, breaking up into a 1- and 3-dimensional irreducible representation. This is usually written as $\bf 2 \otimes 2 = 1 \oplus 3$.

This tensor-product representation is useful for describing a system of two spin-1/2 particles, in particular, to analyze the spin states of the combined system.

[Figure: su2_tensor_rep.jpg]

For a more detailed explanation of this diagram see Fun with Symmetry .
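The decomposition $\bf 2 \otimes 2 = 1 \oplus 3$ can also be read off numerically. A minimal sketch: build the total-spin operators $J_k = S_k\otimes 1 + 1\otimes S_k$ on $\mathbb{C}^2\otimes\mathbb{C}^2$ and diagonalize the Casimir $J^2$, whose eigenvalue $j(j+1)$ is $0$ on the one-dimensional singlet and $2$ on the three-dimensional triplet.

```python
import numpy as np

# Spin operators for one spin-1/2 particle: S_k = sigma_k / 2.
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
I2 = np.eye(2)

# Total spin on C^2 (x) C^2: J_k = S_k (x) 1 + 1 (x) S_k.
J = [np.kron(S, I2) + np.kron(I2, S) for S in (sx, sy, sz)]

# The Casimir J^2 acts as j(j+1) on each spin-j irreducible piece.
J2 = sum(Jk @ Jk for Jk in J)
print(np.round(np.linalg.eigvalsh(J2), 6))
# [0. 2. 2. 2.]  ->  one spin-0 block (dim 1) and one spin-1 block (dim 3),
#                    i.e. 2 (x) 2 = 1 (+) 3
```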


nLab tensor product


The term tensor product has many different but closely related meanings.

In its original sense a tensor product is a representing object for a suitable sort of bilinear map and multilinear map . The most classical versions are for vector spaces ( modules over a field ), more generally modules over a ring , and even more generally algebras over a commutative monad . In modern language this takes place in a multicategory .

Consequently, the functor $\otimes : C \times C \to C$ which is part of the data of any monoidal category $C$ is also often called a tensor product, since in many examples of monoidal categories it is induced from a tensor product in the above sense (and in fact, any monoidal category underlies a multicategory in a canonical way). In parts of the literature (certain) abelian monoidal categories are even addressed as tensor categories.

Given two objects in a monoidal category $(C,\otimes)$ with a right and left action, respectively, of some monoid $A$, their tensor product over $A$ is the quotient of their tensor product in $C$ by this action. If $A$ is commutative, then this is a special case of the tensor product in a multicategory.

This generalizes to modules over monads in a bicategory, which includes the notion of tensor product of functors .

Finally, tensor products in a multicategory and tensor products over monads in a bicategory are both special cases of tensor products in a virtual double category .

For $M$ a multicategory and $A$ and $B$ objects in $M$, the tensor product $A \otimes B$ is defined to be an object equipped with a universal multimorphism $A,B\to A \otimes B$, in that any multimorphism $A,B\to C$ factors uniquely through $A,B\to A \otimes B$ via a (1-ary) morphism $A \otimes B\to C$.

If $M$ is the category Ab of abelian groups, made into a multicategory using multilinear maps as the multimorphisms, then we get the usual tensor product of abelian groups . That is, $A \otimes B$ is equipped with a universal map from $A \times B$ (as a set) to $A \otimes B$ such that this map is linear (a group homomorphism) in each argument separately. This tensor product can also be constructed explicitly by

  • starting with the cartesian product $A\times B$ in sets,
  • generating a free abelian group from it, and then
  • quotienting by the relations $(a_1,b)+(a_2,b)\sim (a_1+a_2,b)$ and $(a,b_1)+(a,b_2)\sim (a,b_1+b_2)$. (The 0-ary relations $(0,b)\sim 0$ and $(a,0)\sim 0$ follow automatically; you need them explicitly if you generalise to abelian monoids.)

Note that in this case, $A\otimes B$ is not a subobject or a quotient of the cartesian product $A\times B$. However, in many other cases the tensor product in a multicategory can be obtained as a quotient of some other pre-existing product; see tensor product of modules below.

Other examples of tensor products in multicategories:

The Gray tensor product of strict 2-categories is a tensor product in the multicategory of 2-categories and cubical functors. Likewise for Sjoerd Crans' tensor product of Gray-categories.

In particular, any closed category (even if not monoidal) has an underlying multicategory. Tensor products in this multicategory are characterized by the adjointness relation $$\hom(A \otimes B, C) \cong \hom(A, \hom(B, C)).$$

This may be the oldest notion of tensor product, since the definition of the internal-hom of abelian groups and vector spaces, unlike that of their tensor product, is intuitively obvious.

While the universal property referred to above (every bilinear map $A,B\to C$ factors uniquely through $A,B\to A\otimes B$ via a map $A\otimes B \to C$) suffices to define the tensor product, it does not suffice to prove that it is associative and unital. For this we need the stronger property that any multilinear map $D_1,\dots,D_m,A,B,E_1,\dots,E_n \to C$ factors uniquely through $A,B\to A\otimes B$ via a multilinear map $D_1,\dots,D_m,A\otimes B,E_1,\dots,E_n \to C$.

An alternative approach is to define the tensor product via an inter-categorical universal property involving heteromorphisms . Tensor products do not always arise via an adjunction, but we can observe that $\hom(a \otimes b, c) \simeq \mathrm{het}(\langle a, b \rangle, c)$ in general. That is to say, any morphism from $a \otimes b$ to $c$ in some category $C$ corresponds to a heteromorphism from $\langle a, b \rangle$ in $C \times C$ to $c$ in $C$. In other words, the tensor product is a left representation of $\mathrm{het}(\langle a, b \rangle, c)$.

When tensor products exist, we have a canonical het $\eta_{\langle a, b \rangle} \colon \langle a, b \rangle \to a \otimes b$, obtained from $\mathrm{id}_{a \otimes b} \in \hom(a \otimes b, a \otimes b)$. Given another het $\phi \colon \langle a, b \rangle \to c$, we get a commutative triangle: $\phi$ factors through $\eta_{\langle a, b \rangle}$ via a unique morphism $a \otimes b \to c$.

This represents an example of a more general method for translating universal properties in multicategories into ones involving heteromorphisms.

Let $R$ be a commutative ring and consider the multicategory $R\,\mathrm{Mod}$ of $R$-modules and $R$-multilinear maps. In this case the tensor product of modules $A\otimes_R B$ of $R$-modules $A$ and $B$ can be constructed as the quotient of the tensor product of abelian groups $A\otimes B$ underlying them by the action of $R$; that is, $$A \otimes_R B = (A \otimes B)/(a r \otimes b - a \otimes r b).$$

More category-theoretically, this can be constructed as the coequalizer of the two maps

$$A \otimes R \otimes B \;\rightrightarrows\; A \otimes B$$

given by the action of $R$ on $A$ and on $B$.

If $R$ is a field, then $R$-modules are vector spaces; this gives probably the most familiar case of a tensor product of spaces, which is also probably the situation where the concept was first conceived.

This tensor product can be generalized to the case when $R$ is not commutative, as long as $A$ is a right $R$-module and $B$ is a left $R$-module. More generally yet, if $R$ is a monoid in any monoidal category (a ring being a monoid in Ab with its tensor product), we can define the tensor product of a left and a right $R$-module in an analogous way. If $R$ is a commutative monoid in a symmetric monoidal category , so that left and right $R$-modules coincide, then $A\otimes_R B$ is again an $R$-module, while if $R$ is not commutative then $A\otimes_R B$ will no longer be an $R$-module of any sort.

Not all tensor products in multicategories are instances of this construction. In particular, the tensor product in Ab is not the tensor product of modules over any monoid in the cartesian monoidal category Set. Abelian groups can be considered as "sets with an action by something," but that something is more complicated than a monoid: it is a special sort of monad called a commutative theory.

Conversely, if $R$ is a commutative monoid in a symmetric monoidal category, there is a multicategory of $R$-modules whose tensor product agrees with the coequalizer defined above, but if $R$ is not commutative this is impossible. However, see the section on tensor products in virtual double categories, below.

The tensor product of left and right modules over a noncommutative monoid in a monoidal category is a special case of the tensor product of modules for a monad in a bicategory . If $R: x\to x$ is a monad in a bicategory $B$, a right $R$-module is a 1-cell $A: y\to x$ with an action by $R$, a left $R$-module is a 1-cell $B: x\to z$ with an action by $R$, and their tensor product, if it exists, is a 1-cell $y\to z$ given by a similar coequalizer. Regarding a monoidal category as a 1-object bicategory, this recovers the above definition.

For example, consider the bicategory $V\text{-}\mathrm{Mat}$ of $V$-valued matrices for some monoidal category $V$. A monad in $V\text{-}\mathrm{Mat}$ is a $V$-enriched category $A$, an $(A,I)$-bimodule is a functor $A\to V$, an $(I,A)$-bimodule is a functor $A^{op}\to V$, and their tensor product in $V\text{-}\mathrm{Mat}$ is a classical construction called the tensor product of functors . It can also be defined as a coend .

A virtual double category is a common generalization of a multicategory and a bicategory (and actually of a double category ). Among other things, it has objects, 1-cells, and “multi-2-cells.” We leave it to the reader to define a notion of tensor product of 1-cells in such a context, analogous to the tensor product of objects in a multicategory. A multicategory can be regarded as a 1-object virtual double category, so this generalizes the notion of tensor product in a multicategory.

On the other hand, in any bicategory (in fact, any double category) there is a virtual double category whose objects are monads and whose 1-cells are bimodules, and the tensor product in this virtual double category is the tensor product of modules in a bicategory defined above. Thus, tensor products in a virtual double category include all notions of tensor product discussed above.

Related concepts

  • tensor product of abelian groups
  • tensor product of groups
  • tensor product of modules
  • tensor product of vector spaces
  • inductive tensor product
  • smash product of pointed sets
  • tensor product of representations
  • tensor product of vector bundles
  • tensor product of algebras
  • tensor product of algebras over a commutative monad
  • tensor product of Lie algebras
  • tensor product of chain complexes
  • tensor product of Banach spaces
  • tensor product of functors
  • tensor product of enriched categories
  • Deligne tensor product of abelian categories
  • tensor product of presentable (infinity,1)-categories
  • tensor calculus
  • cartesian product
  • external tensor product
  • internal hom
  • cotensor product
  • pushout product
  • multiplicative conjunction, quantum entanglement
  • composite system

In terms of linear type theory, the tensor product is the categorical semantics of the multiplicative conjunction.

Tensor products were introduced by Hassler Whitney in

  • Hassler Whitney , Tensor products of Abelian groups , Duke Mathematical Journal 4:3 (1938), 495-528. doi .

Lecture notes:

  • Keith Conrad , Tensor products [part I: pdf , pdf ; part II: pdf , pdf ]


For the category theoretic formalization of tensor products see the references at

  • monoidal category and tensor category

Formalization of tensor products in dependent linear type theory :

  • Mitchell Riley , A Bunched Homotopy Type Theory for Synthetic Stable Homotopy Theory , PhD Thesis (2022) [ doi:10.14418/wes01.3.139 , ir:3269 , pdf ]



Representation of tensor product

If we take $u\in X\otimes Y$ then we can find linearly independent $\{x_{i}\}_{i=1,\dots,n}$ and $\{y_{i}\}_{i=1,\dots,n}$ such that $u=\sum_{i=1}^{n}x_{i}\otimes y_{i}$. Is the same true for tensor products of three or more factors? For example, do there exist linearly independent $\{x_{i}\}_{i=1,\dots,n}$, $\{y_{i}\}_{i=1,\dots,n}$, $\{z_{i}\}_{i=1,\dots,n}$ such that $x\otimes y\otimes z=\sum_{i=1}^{n}x_{i}\otimes y_{i} \otimes z_{i}$?

  • tensor-products


  • $\begingroup$ Any remarks about this? $\endgroup$ –  Vanthuan Commented Mar 22, 2014 at 18:03
  • $\begingroup$ In the case you have written, it is immediate if you take $n=1$ and $x_1 = x, y_1 = y, z_1 = z$. Did you mean, for any $u \in X \otimes Y \otimes Z$? $\endgroup$ –  NerdOnTour Commented Jul 2, 2022 at 11:07


Induction of tensor product vs. tensor product of inductions

This is a pure curiosity question and may turn out completely devoid of substance.

Let $G$ be a finite group and $H$ a subgroup, and let $V$ and $W$ be two representations of $H$ (representations are finite-dimensional per definitionem, at least per my definitions). With $\otimes$ denoting inner tensor product, how are the two representations $\mathrm{Ind}^G_H\left(V\otimes W\right)$ and $\mathrm{Ind}^G_HV\otimes \mathrm{Ind}^G_HW$ related to each other? There is a fairly obvious map of representations from the latter to the former, but it is neither injective nor surjective in general. I am wondering whether we can say anything about the decompositions of the two representations into irreducibles.

  • rt.representation-theory
  • tensor-products


  • $\begingroup$ Be careful about the dimensions involved: induction multiplies the dimension of a given module by the index $[G:H]$ , so your second module is typically bigger than your first. $\endgroup$ –  Jim Humphreys Commented Mar 15, 2010 at 20:10
  • $\begingroup$ Oh! I knew about the different dimensions, but I confused smaller and larger. So we've got a map neither injective nor surjective (because it only maps to pure tensors), which makes the question way less attractive. $\endgroup$ –  darij grinberg Commented Mar 15, 2010 at 20:17
  • 1 $\begingroup$ In general, it's hard to say anything helpful about the decomposition of an induced module or of a tensor product of two modules for a finite group. For groups of Lie type, much work has been done in such directions for specific types of subgroups and modules, but results usually depend strongly on special methods such as Hecke algebras. Clifford theory and Mackey theory do provide some general guidelines for arbitrary finite groups, of course. $\endgroup$ –  Jim Humphreys Commented Mar 15, 2010 at 20:58
  • $\begingroup$ Does it help to generalise? You have an inclusion $H\times H\rightarrow G\times G$ and the diagonal subgroups $H\rightarrow G$. You take $V\otimes W$ as a representation of $H\times H$. Then induce to $G\times G$ followed by restriction to $G$. Alternatively restrict to $H$ and then induce to $G$. This suggests taking a commutative square of group homomorphisms and comparing induction followed by restriction with restriction followed by induction. $\endgroup$ –  Bruce Westbury Commented Mar 15, 2010 at 21:09

4 Answers

Surely you mean "former to latter"?

I think the natural map is injective. Let $V$ and $W$ have bases $v_1,\ldots,v_r$ and $w_1,\ldots,w_s$ respectively. Let $g_1,\ldots,g_t$ be coset representatives for $H$ in $G$. Then a basis for $\mathrm{Ind}_H^G V\otimes \mathrm{Ind}_H^G W$ consists of the $(v_i g_k)\otimes(w_j g_l)$. The image of the natural injection from $\mathrm{Ind}_H^G(V\otimes W)$ is spanned by the $(v_i g_k)\otimes(w_j g_l)$ with $k=l$. There are exactly the right number of these.


  • $\begingroup$ I think my argument is essentially the same as that in Bruce's comment, but phrased in a more low-brow way. $\endgroup$ –  Robin Chapman Commented Mar 16, 2010 at 21:05

Try using Frobenius reciprocity. Let $V$ and $W$ be two representations of $H$, and let $U$ be a representation of $G$. Consider first the space: $$Hom_G \left(U, Ind_H^G (V \otimes W) \right) \cong Hom_H \left( U, V \otimes W \right),$$ by Frobenius reciprocity.

On the other hand, one can consider the space: $$Hom_G(U, Ind_H^G V \otimes Ind_H^G W).$$ This is canonically isomorphic to $$Hom_G(U \otimes Ind_H^G V', Ind_H^G W),$$ where $V'$ denotes the dual representation of $V$. By Frobenius reciprocity again, this is isomorphic to: $$Hom_H(U \otimes Ind_H^G V', W).$$ This is canonically isomorphic to $$Hom_H(U, (Ind_H^G V) \otimes W).$$ Now, we are led to compare the two spaces: $$Hom_H(U, V \otimes W), \quad Hom_H \left( U, (Ind_H^G V) \otimes W \right).$$

There is a natural embedding of $V$ into $Res_H^G Ind_H^G V$. This gives a natural map: $$\iota: Hom_H(U, V \otimes W) \rightarrow Hom_H \left( U, (Ind_H^G V) \otimes W \right).$$

Using complete reducibility, let us (noncanonically) decompose $H$-representations: $$Res_H^G Ind_H^G V \cong V \oplus V^\perp.$$ It follows that $$Hom_H \left( U, (Ind_H^G V) \otimes W \right) \cong Hom_H \left(U, V \otimes W \right) \oplus Hom_H \left( U, V^\perp \otimes W \right).$$

It follows that $\iota$ is injective. This explains (via Yoneda, if you like) why $Ind_H^G(V \otimes W)$ is canonically a subrepresentation of $Ind_H^G V \otimes Ind_H^G W$. It also explains that computation of "the rest" of $Ind_H^G V \otimes Ind_H^G W$ -- the full decomposition into irreducibles -- requires Mackey theory: the decomposition of $Res_H^G Ind_H^G V$. There can be no neat answer, without performing this kind of Mackey theory.


  • $\begingroup$ Thanks, this looks interesting. I'll reply in more detail when I have read this more carefully, but can it be that at the end you forgot that the decomposition of Res Ind V into V (+) V^perp was noncanonical? $\endgroup$ –  darij grinberg Commented Mar 15, 2010 at 22:09
  • $\begingroup$ Anyway, your post taught me that induction commutes with taking duals, so you're already got +1. $\endgroup$ –  darij grinberg Commented Mar 15, 2010 at 22:17
  • $\begingroup$ Ah, I see. You say that the decomposition is noncanonical because it isn't canonical with respect to G and H. Of course it's canonical with respect to V (it's an application of Mackey's double coset theorem). $\endgroup$ –  darij grinberg Commented Mar 15, 2010 at 22:21
  • $\begingroup$ I guess I haven't thought too carefully about the canonicity (canonicalness) of the Mackey double coset theorem here. But in any case, the injectivity of the unit (or is it counit?) map from $V$ into $Res Ind V$ is the important point. I mentioned that it is a summand, because I didn't want to rely on exactness of any functors. I was trying to be extra careful, but I might have been confusing instead. $\endgroup$ –  Marty Commented Mar 15, 2010 at 22:43

Just for the case someone is interested, here is another answer I've found. Probably it's equivalent to Robin's and Marty's answers, with the only difference that it's more abstract-nonsense than Robin's (so if you don't consider this as a virtue in itself, you don't have to read on) and shorter than Marty's (in particular, I don't switch to dual representations).

We will always abbreviate $\mathrm{Ind}^G_H$ by $\mathrm{Ind}$ and $\mathrm{Res}^G_H$ by $\mathrm{Res}$. Then, the push-pull formula states that $\mathrm{Ind}\left(U\otimes\mathrm{Res} T\right)\cong \mathrm{Ind}U\otimes T$ for any representation $T$ of $G$. Applying this to $T=\mathrm{Ind}V$, we get $\mathrm{Ind}\left(U\otimes\mathrm{Res}\mathrm{Ind}V\right)\cong \mathrm{Ind}U\otimes\mathrm{Ind}V$. Now, we can see the representation $\mathrm{Res}\mathrm{Ind}V$ as the left $k\left[H\right]$-module $k\left[G\right]\otimes _{k\left[H\right]}V$. Then, there is a canonical $H$-equivariant injection $V\to \mathrm{Res}\mathrm{Ind}V$ given by $v\mapsto 1\otimes v$, and there is a canonical $H$-equivariant projection $\mathrm{Res}\mathrm{Ind}V\to V$ given by $g\otimes v\mapsto gv$ for $g\in H$ and $g\otimes v\mapsto 0$ for $g\not\in H$. This projection splits the injection, and therefore the representation $V$ is canonically a direct summand of the representation $\mathrm{Res}\mathrm{Ind}V$. Hence, $\mathrm{Ind}\left(U\otimes V\right)$ is canonically a direct summand of $\mathrm{Ind}\left(U\otimes\mathrm{Res}\mathrm{Ind}V\right)\cong \mathrm{Ind}U\otimes\mathrm{Ind}V$.

"Canonically" means "canonically with respect to $U$ and $V$ and kind-of canonically with respect to $G$ and $H$" here. "Kind-of canonically with respect to $G$ and $H$" means that it's functorial with respect to maps which preserve both "lying in $H$" and "not lying in $H$", and I think we can't do better. As opposed to Robin's and Marty's proof, we don't need to rely on some randomly chosen system of representatives of cosets or double cosets.


$Ind_H^G X = R[G] \otimes_{R[H]} X$ and hence there are two canonical maps:

$R[G]\otimes V \otimes W \to R[G] \otimes V\otimes R[G] \otimes W, x\otimes v\otimes w\mapsto x\otimes v\otimes 1 \otimes w$ and

$R[G]\otimes V \otimes R[G] \otimes W \to R[G] \otimes V \otimes W, x\otimes v\otimes y \otimes w\mapsto xy \otimes v \otimes w$.

Obviously the second is a right inverse to the first. Hence the first one is injective and the second is surjective. If $R$ is a field, then there cannot be injective (surjective) maps in the other direction because the dimensions don't agree.


  • $\begingroup$ Hmm. I am not convinced that the first map is a map of representations. All of these tensor signs are different kinds of tensor signs; the induction is a tensor product of "consistent" bimodules (i. e. a right R[H]-module (X) a right R[H]-module), while V (X) W is an inner tensor product of left modules. $\endgroup$ –  darij grinberg Commented Mar 15, 2010 at 20:25
  • $\begingroup$ All my tensorproducts are R[H]-tensors. What is an "inner" tensor product? $\endgroup$ –  Johannes Hahn Commented Mar 16, 2010 at 13:29
  • $\begingroup$ Johannes, not all your $\otimes$ are over $R[H]$: for example, you write $R[G]\otimes V\otimes W$ which is $R[G]\otimes_{R[H]}(V\otimes_R W)$. $\endgroup$ –  Mariano Suárez-Álvarez Commented Mar 16, 2010 at 13:36
  • $\begingroup$ damn it... Okay, I promise to think before I post again. $\endgroup$ –  Johannes Hahn Commented Mar 16, 2010 at 13:48
  • $\begingroup$ No problem. If I would think before posting my postcount would be much smaller here... $\endgroup$ –  darij grinberg Commented Mar 16, 2010 at 14:28


IMAGES

  1. The Tensor Product, Demystified

    representation of tensor product

  2. Tensor Product Representation [The Physics Travel Guide]

    representation of tensor product

  3. The Tensor Product, Demystified

    representation of tensor product

  4. The Tensor Product, Demystified

    representation of tensor product

  5. The Tensor Product, Demystified

    representation of tensor product

  6. The Tensor Product, Demystified

    representation of tensor product

VIDEO

  1. Lecture52:Tensor product of irreducible represent'n 1:Composite objects from fundamental particles

  2. Tensor product of irreducible representation 1_ Composite objects from fundamental particles

  3. 3. Tensors: second order tensors and their operations

  4. Quantum Circuit Diagram Examples for Quantum Fourier Transform, Tensor Product Representation

  5. The Derivative over Banach Algebra (Introduction lecture)

  6. Basics of Quantum Information

COMMENTS

  1. Tensor product of representations

    Tensor product of representations. In mathematics, the tensor product of representations is a tensor product of vector spaces underlying representations together with the factor-wise group action on the product. This construction, together with the Clebsch-Gordan procedure, can be used to generate additional irreducible representations if one ...

  2. PDF The tensor product of representations

    Suppose that (r, U) is a representation of G and (s, V) is a representation of H. We can form the tensor product, U@ V, of the two vector spaces U and Vt. Recall from the theory of tensor products that if AeHom (U, U) and BeHom(V, V), then there is a unique transformation A on U@ V such that Also tr(A = (tr B).

  3. Tensor product

    The tensor product of two vector spaces is a vector space that is defined up to an isomorphism.There are several equivalent ways to define it. Most consist of defining explicitly a vector space that is called a tensor product, and, generally, the equivalence proof results almost immediately from the basic properties of the vector spaces that are so defined.

  4. The Tensor Product, Demystified

    And that's it! Forming the tensor product v⊗w v ⊗ w of two vectors is a lot like forming the Cartesian product of two sets X×Y X × Y. In fact, that's exactly what we're doing if we think of X X as the set whose elements are the entries of v v and similarly for Y Y. So a tensor product is like a grown-up version of multiplication.

  5. Representation Tensor Product -- from Wolfram MathWorld

    The vector space tensor product V⊗W of two group representations of a group G is also a representation of G. An element g of G acts on a basis element v⊗w by g(v⊗w) = gv⊗gw. If G is a finite group and V is a faithful representation, then any representation is contained in ⊗^n V for some n. If V_1 is a representation of G_1 and V_2 is a representation of G_2, then ...

  6. Tensor product

    This permits one to define the tensor product of two representations of a Lie algebra: $$(\pi_1 \otimes \pi_2)(x) = 1 \otimes \pi_2(x) + \pi_1(x) \otimes 1.$$ Tensor product of two vector bundles.
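
As a quick illustration of this Lie-algebra formula (a sketch with an assumed choice of representation, not taken from the source): take both factors to be the spin-1/2 representation of su(2) spanned by X_a = -i·σ_a/2, and check that π(x) = π_1(x) ⊗ 1 + 1 ⊗ π_2(x) is again a Lie algebra homomorphism, i.e. [π(X_1), π(X_2)] = π(X_3).

```python
import numpy as np

# Pauli matrices; X_a = -i*sigma_a/2 is a standard basis of su(2)
# with [X_1, X_2] = X_3 (and cyclic permutations).
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]])
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
X = [-0.5j * s for s in (s1, s2, s3)]

I2 = np.eye(2)

def tensor_rep(x):
    """(pi1 ⊗ pi2)(x) = pi1(x) ⊗ 1 + 1 ⊗ pi2(x); here pi1 = pi2 = spin-1/2."""
    return np.kron(x, I2) + np.kron(I2, x)

def comm(a, b):
    return a @ b - b @ a

# The tensor-product action is again a Lie algebra homomorphism:
# [pi(X_1), pi(X_2)] equals pi([X_1, X_2]) = pi(X_3).
print(np.allclose(comm(tensor_rep(X[0]), tensor_rep(X[1])), tensor_rep(X[2])))  # True
```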

  7. Definition and properties of tensor products

    Algebraic Tensor Product: The algebraic tensor product is a way to combine two vector spaces over a field into a new vector space, which allows for the study of bilinear forms and multilinear mappings. This construction is essential in representation theory, as it provides a method to create new representations from existing ones, capturing the interactions between them.

  8. PDF Tensors: Geometry and Applications J.M. Landsberg

    Groups and representations; §2.3. Tensor products; §2.4. The rank and border rank of a tensor; §2.5. Examples of invariant tensors; §2.6. Symmetric and skew-symmetric tensors; ... Tensor product states arising in quantum information theory and algebraic statistics are then introduced as they are ...

  9. PDF 1 Introduction to the Tensor Product

    The tensor product V ⊗ W is thus defined to be the vector space whose elements are (complex) linear combinations of elements of the form v ⊗ w, with v ∈ V, w ∈ W, with the above rules for manipulation. The tensor product V ⊗ W is the complex vector space of states of the two-particle system!
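
A short numpy sketch of the point that a general linear combination of elements v ⊗ w need not itself be a pure tensor (the vectors below are illustrative choices): identifying C^2 ⊗ C^2 with 4-component vectors, a vector is a pure tensor exactly when its 2×2 reshaping has rank 1, and a Bell-type combination fails this test.

```python
import numpy as np

# Identify C^2 ⊗ C^2 with 4-component vectors; reshaping to a 2x2 matrix,
# pure tensors v ⊗ w correspond exactly to rank-1 matrices.
v = np.array([1.0, 2.0])
w = np.array([3.0, -1.0])
pure = np.kron(v, w)
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # |00> + |11>, normalized

print(np.linalg.matrix_rank(pure.reshape(2, 2)))  # 1 -> a pure tensor
print(np.linalg.matrix_rank(bell.reshape(2, 2)))  # 2 -> not expressible as v ⊗ w
```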

  10. PDF Tutorial 3: Tensor products of representations. Representations of S

    Tutorial 3: Tensor products of representations. Representations of S_n and Young diagrams. Exercise 1. Our aim is the following isomorphism of S_n-representations, n ≥ 3: V_{n−1,1} ≅ V_st. 1. For 1 ≤ i ≤ n, denote by T_i the Young tableau of shape (n − 1, 1) having i in the second row, and the ...

  11. Tensor Product Representation [The Physics Travel Guide]

    To construct the tensor-product representation from two copies of the defining representation, we let it act on the tensor-product space, $\mathbb{C}^2 \otimes \mathbb{C}^2$, as shown in the lower branch. The resulting 4-dimensional representation is reducible, breaking up into a 1- and 3-dimensional irreducible representation.
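
One way to see this 1 + 3 splitting numerically (a sketch using the standard basis e_i ⊗ e_j of C^2 ⊗ C^2; the swap-operator device is a standard trick, not necessarily the construction used on that page): the swap P(v ⊗ w) = w ⊗ v commutes with the factor-wise action, so its +1 and −1 eigenspaces, of dimensions 3 and 1, are the two subrepresentations.

```python
import numpy as np

# Swap operator on C^2 ⊗ C^2: P(v ⊗ w) = w ⊗ v, written in the standard basis
# e_i ⊗ e_j (basis index 2*i + j).
P = np.zeros((4, 4))
for i in range(2):
    for j in range(2):
        P[2 * j + i, 2 * i + j] = 1.0

eigvals = np.linalg.eigvalsh(P)
print(eigvals)                           # [-1.,  1.,  1.,  1.]
print(np.sum(np.isclose(eigvals, 1)))    # 3 -> symmetric (triplet) subspace
print(np.sum(np.isclose(eigvals, -1)))   # 1 -> antisymmetric (singlet) subspace
```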

  12. PDF Introduction to the Tensor Product

    Introduction to the Tensor Product. James C Hateley. In mathematics, a tensor refers to objects that have multiple indices. Roughly speaking this can be thought of as a multidimensional array. A good starting point for discussing the tensor product is the notion of direct sums. REMARK: The notation for each section carries on to the next. ...

  13. PDF Tensor Product Space

    S ⊆ (S⊥)⊥. (S⊥)⊥ is the smallest closed subspace containing S. We say that X is the direct sum of two subspaces M and N, denoted by X = M ⊕ N, if every x ∈ X has a unique representation of the form x = m + n, where m ∈ M and n ∈ N. The Classical Projection Theorem: Let M be a closed subspace of a Hilbert space H.

  14. PDF Topics in Representation Theory: SU(2) Representations and Their

    This decomposition of the tensor product goes under the name "Clebsch-Gordan" decomposition. The general issue of how the tensor product of two irreducibles decomposes is an important one we will study in general later. It is related to the representation theory of the symmetric group, by "Schur-Weyl duality".

  15. PDF Notes on Tensor Products and the Exterior Algebra

    The scalar product: V × F → V. The dot product: R^n × R^n → R. The cross product: R^3 × R^3 → R^3. Matrix products: M_{m×k} × M_{k×n} → M_{m×n}. Note that the three vector spaces involved aren't necessarily the same. What these examples have in common is that in each case, the product is a bilinear map. The tensor product is just another example of a product like this ...
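
The point that every bilinear map factors through a linear map on the tensor product can be checked concretely. The sketch below (with the dot product as the example and illustrative vectors, not taken from the notes) recovers dot(v, w) by applying a single linear functional to kron(v, w).

```python
import numpy as np

n = 3
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# The dot product is bilinear, so it is induced by a *linear* map on the
# tensor product: here that linear functional is the flattened identity matrix.
phi = np.eye(n).reshape(-1)                      # functional on R^n ⊗ R^n = R^(n*n)
print(np.isclose(phi @ np.kron(v, w), v @ w))    # True
```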

  16. Irreducible representations of a product of two groups

    For algebraically irreducible representations this is true and can be found in many sources like Curtis and Reiner. For any finite dimensional algebras the irreducible representations of the tensor product are the tensor products of irreducibles. You can replace the group algebra by the tensor product of the images of CG and ...

  17. tensor product in nLab

    Idea. The term tensor product has many different but closely related meanings.. In its original sense a tensor product is a representing object for a suitable sort of bilinear map and multilinear map.The most classical versions are for vector spaces (modules over a field), more generally modules over a ring, and even more generally algebras over a commutative monad.

  18. PDF Vector and Tensor Algebra

    The tensor product of two vectors represents a dyad, which is a linear vector transformation. A dyad is a special tensor (to be discussed later), which explains the name of this product. ... 1.1.8 Matrix representation of a vector. At every point of a three-dimensional space three independent vectors exist. Here we assume that these base ...

  19. group theory

    We write: $\rho_s = \rho^1_s \otimes \rho^2_s$. The $\rho_s$ above defines a linear representation of $G$ in $V_1 \otimes V_2$ which is called the tensor product of the given representations. Defining the above using matrix notation: let $\{e_{i_1}\}$ be a basis for $V_1$ and let $r_{i_1 j_1}(s)$ be the matrix of $\rho^1_s$ with respect to this basis; define $\{e_{i_2}\}$ and $r_{i_2 j_2}(s)$ in a similar manner.
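
In this matrix notation, the matrix of $\rho_s$ on $V_1 \otimes V_2$ is the Kronecker product of the two factor matrices, and the homomorphism property comes down to the mixed-product rule for Kronecker products. A minimal numpy check follows; the matrices are random stand-ins for $\rho^1_s, \rho^1_t, \rho^2_s, \rho^2_t$, not an actual group representation.

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-ins for rho1(s), rho1(t) on V1 and rho2(s), rho2(t) on V2.
R1_s, R1_t = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))
R2_s, R2_t = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))

# Mixed-product property of the Kronecker product:
# (rho1(s) ⊗ rho2(s)) (rho1(t) ⊗ rho2(t)) = rho1(s)rho1(t) ⊗ rho2(s)rho2(t),
# which is exactly why rho_s = rho1_s ⊗ rho2_s is again a homomorphism.
lhs = np.kron(R1_s, R2_s) @ np.kron(R1_t, R2_t)
rhs = np.kron(R1_s @ R1_t, R2_s @ R2_t)
print(np.allclose(lhs, rhs))  # True
```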

  20. PDF 13 Tensor Products of Representations and Characters

    13 Tensor Products of Representations and Characters. Tensor products of vector spaces and matrices are recalled/introduced in Appendix C. We are now going to use this construction to build products of characters. Proposition 13.1 Let G and H be finite groups, and let $\rho_V: G \to GL(V)$ and $\rho_W: H \to GL(W)$ be $\mathbb{C}$-representations with ...

  21. Matrix Representation of the Tensor Product of Linear Maps

    I'm trying to work out some examples of applying the tensor product in some concrete cases to get a better understanding of it. Within this context, let $f: \mathbb{R}^2 \to \mathbb{R}^2$ be a linear map with matrix $A$ and let $g: \mathbb{R}^2 \to \mathbb{R}^2$ be a linear map with matrix $B$. It follows almost immediately from the universal property of ...
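
Presumably the conclusion being set up is the standard one: in the ordered basis $e_i \otimes e_j$, the matrix of $f \otimes g$ is the Kronecker product $A \otimes B$. The numpy sketch below (with illustrative matrices $A$ and $B$, not taken from the question) prints its block structure and checks the defining property $(A \otimes B)(v \otimes w) = (Av) \otimes (Bw)$.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])   # matrix of f : R^2 -> R^2
B = np.array([[0.0, 1.0], [-1.0, 5.0]])  # matrix of g : R^2 -> R^2

# In the ordered basis e1⊗e1, e1⊗e2, e2⊗e1, e2⊗e2 the matrix of f ⊗ g is the
# Kronecker product: the 4x4 block matrix [[a11*B, a12*B], [a21*B, a22*B]].
M = np.kron(A, B)
print(M)

# Its defining property: (A ⊗ B)(v ⊗ w) = (Av) ⊗ (Bw).
v, w = np.array([1.0, -2.0]), np.array([2.0, 7.0])
print(np.allclose(M @ np.kron(v, w), np.kron(A @ v, B @ w)))  # True
```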

  22. Representation of tensor product

    Matrix Representation of the Tensor Product of Linear Maps. Tensor product of operators. Elementary problem about tensor product and Kronecker product defined by linear map. An example of tensor product. Tensor product and the pure elements. Tensor Product of Two Group Representations (Action Well-definedness).

  23. rt.representation theory

    I am not convinced that the first map is a map of representations. All of these tensor signs are different kinds of tensor signs; the induction is a tensor product of "consistent" bimodules (i.e. a right R[H]-module ⊗ a left R[H]-module), while V ⊗ W is an inner tensor product of left modules.