Setting up matrix objects

Simplest case

Matrices are tensors with a subset of indices marked as matrix indices. These indices have nonmetric types, which means that raising and lowering them is forbidden. Typical examples of matrix objects are Dirac $\gamma_{\mu}$-matrices, SU(N) matrices, spinors, etc.

For example, to define G_a as a $\gamma$-matrix in Redberry, one can proceed as follows:

defineMatrices 'G_a', Matrix1.matrix
Here the first argument is the matrix itself, while the second is the type of the matrix (see the next subsection). With this line, Redberry will treat all specified objects as matrices and will automatically insert the additional matrix indices. For example, to input the product $\gamma_{a} \gamma_{b}$ one can enter it as is:
println 'G_a*G_b'.t
   > G_a*G_b
To input the trace $\mbox{Tr}[\gamma_{a} \gamma_{b}]$ one can proceed in the same way:
println 'Tr[G_a*G_b]'.t
   > Tr[G_a*G_b]
Internally, Redberry inserts special matrix indices for matrix objects when parsing strings. These matrix indices are not printed by default; in order to print them one should explicitly specify the OutputFormat:
println 'G_a*G_b'.t.toString(OutputFormat.Redberry)
   > G_a^a'_b'*G_b^b'_c'
println 'Tr[G_a*G_b]'.t.toString(OutputFormat.Redberry)
   > G_a^a'_b'*G_b^b'_a'
With the default output format, Redberry prints matrices in the order in which they appear in the input string.


The rules of matrix multiplication can be formulated as follows. Suppose we have two tensors \[ A_{\mu}{}^{u_1\dots u_{n_u}}{}_{l_1\dots l_{n_l}} \quad\mbox{and}\quad B_{\nu}{}^{\tilde u_1\dots \tilde u_{k_u}}{}_{\tilde l_1\dots \tilde l_{k_l}}, \] where $u_i$ and $l_i$ denote the upper and lower matrix indices respectively, while $\mu$ and $\nu$ denote the whole subsets of other indices. Then the product $T_{\mu\nu} = A_\mu B_\nu$ has the following form: \begin{eqnarray*} & T_{\mu\nu}{}^{u_1\dots u_{n_u} \tilde u_{n_l+1}\dots \tilde u_{k_u}}% {}_{\tilde l_1\dots \tilde l_{k_l}} = A_{\mu}{}^{u_1\dots u_{n_u}}{}_{c_1\dots c_{n_l}} B_{\nu}{}^{c_1\dots c_{n_l}\tilde u_{n_l+1}\dots \tilde u_{k_u}}% {}_{\tilde l_1\dots \tilde l_{k_l}} & \mbox{if}\quad n_l \le k_u,\\ & T_{\mu\nu}{}^{u_1\dots u_{n_u}}% {}_{l_{k_u +1}\dots l_{n_l} \tilde l_1\dots \tilde l_{k_l}} = A_{\mu}{}^{u_1\dots u_{n_u}}{}_{c_1\dots c_{k_u} l_{k_u + 1}\dots l_{n_l}} B_{\nu}{}^{c_1\dots c_{k_u}}% {}_{\tilde l_1\dots \tilde l_{k_l}} & \mbox{if}\quad n_l > k_u. \end{eqnarray*}

These definitions become especially clear in particular cases. Consider, for example, three matrix objects: $v^{i}$ with one upper matrix index (a vector), $\gamma_{\mu}{}^i{}_j$ with one upper and one lower matrix index (a matrix), and $\bar v_{i}$ with one lower matrix index (a covector). Then, using the above rules, one obtains the well-known expressions: \begin{eqnarray*} & T_{\mu\nu} = \gamma_{\mu}\gamma_{\nu} \quad\to\quad T_{\mu\nu}{}^i{}_j = \gamma_{\mu}{}^i{}_c\gamma_{\nu}{}^c{}_j &\quad\mbox{(matrix)}\\ & T_{\mu} = \gamma_{\mu} v \quad\to\quad T_{\mu}{}^i= \gamma_{\mu}{}^i{}_c v^c &\quad\mbox{(vector)}\\ & T_{\mu} = \bar v \gamma_{\mu} \quad\to\quad T_{\mu}{}_i= \bar v_{c} \gamma_{\mu}{}^c{}_i &\quad\mbox{(covector)}\\ & T_{\mu\nu} = \bar v \, \gamma_{\mu}\gamma_{\nu} \, v \quad\to\quad T_{\mu\nu}= \bar v_{i} \gamma_{\mu}{}^i{}_c\gamma_{\nu}{}^c{}_j v^j &\quad\mbox{(scalar)} \end{eqnarray*}
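The index bookkeeping behind these rules can be captured in a few lines. The following is a minimal sketch in plain Python (not Redberry code; the function name multiply and the (upper, lower) pair representation are ours for illustration) that computes how many free upper and lower matrix indices a product carries:

```python
# Sketch of the matrix-index bookkeeping (not Redberry code).
# A matrix object is summarized by a pair (upper, lower): the number
# of its free upper and lower matrix indices of a given type.

def multiply(a, b):
    """Free (upper, lower) matrix-index counts of the product A*B.

    min(n_l, k_u) index pairs are contracted: the lower matrix
    indices of A against the upper matrix indices of B.
    """
    n_u, n_l = a
    k_u, k_l = b
    c = min(n_l, k_u)                   # contracted index pairs
    return (n_u + k_u - c, n_l + k_l - c)

gamma = (1, 1)   # one upper, one lower matrix index
v     = (1, 0)   # vector
cv    = (0, 1)   # covector

assert multiply(gamma, gamma) == (1, 1)   # gamma*gamma -> matrix
assert multiply(gamma, v)     == (1, 0)   # gamma*v     -> vector
assert multiply(cv, gamma)    == (0, 1)   # cv*gamma    -> covector
assert multiply(cv, v)        == (0, 0)   # cv*v        -> scalar
```

The same rule covers matrix objects of general shape: for instance, multiply((1, 1), (2, 3)) gives (2, 3), in agreement with the tensor(2, 3) example later in this section.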

To tell Redberry which tensors should be considered as matrices, so that it can automatically insert additional matrix indices according to the formulated rules, one can do:

defineMatrices 'G_a', 'G', Matrix1.matrix,
          'v', Matrix1.vector, 'cv', Matrix1.covector
The defineMatrices method takes groups of string representations of matrices, each group followed by the corresponding matrix signature (type and number of upper/lower matrix indices). By default, Redberry provides four predefined nonmetric types: Matrix1 (Latin lower case letters with strokes), Matrix2 (Latin upper case letters with strokes), Matrix3 (Greek lower case letters with strokes) and Matrix4 (Greek upper case letters with strokes). The number of upper and lower indices of a matrix type is specified using the corresponding property: vector means one upper index, covector means one lower index, and matrix means one upper and one lower index.

Let us consider several combinations of these matrices; in the outputs below the inserted matrix indices are shown explicitly:

println 'G_a*G_b'.t
   > G_a^a'_b'*G_b^b'_c'
println 'G_a*v'.t
   > G_a^a'_b'*v^b'
println 'cv*G_a'.t
   > cv_a'*G_a^a'_b'
println 'G*G_a = G*v*cv*G_a + f_a'.t
   > G^a'_c'*G_a^c'_b'= G^a'_c'*v^c'*cv_d'*G_a^d'_b'+f_a*d^a'_b'
println 'cv*G_a*G_b*v + g_ab'.t
   > cv_a'*G_a^a'_b'*G_b^b'_c'*v^c' + g_ab
println 'Tr[G_a*G_b + n_b*G_a] + n_a*n_b'.t
   > G_a^a'_b'*G_b^b'_a' + n_b*G_a^a'_a' + n_a*n_b
println 'Tr[G_a*G_b + n_b*G_a + n_a*n_b]'.t
   > G_a^a'_b'*G_b^b'_a' + n_b*G_a^a'_a' + n_a*n_b*d^a'_a'
As the above examples show, Redberry inserts identical free matrix indices on the l.h.s. and r.h.s. of an expression. If a sum contains both matrix terms (with nonempty free matrix indices) and non-matrix terms (without matrix indices), then the latter are multiplied by identity matrices, i.e. Kronecker deltas.

In order to define a matrix object of a more general form, i.e. with $p$ upper and $q$ lower matrix indices, one can do:

defineMatrices 'M_\\mu', Matrix1.tensor(2, 3)
println 'G*M_\\alpha'.t.toString(OutputFormat.Redberry)
   > G^{a'}_{f'}*M_{\alpha}^{f'b'}_{c'd'e'}

The syntax for traces of matrices was shown in the examples above. By default the trace is taken with respect to all matrix indices, but if an expression is a matrix with respect to several index types, then it is possible to specify the particular types of indices over which the trace should be taken:

defineMatrices 'A, B', Matrix1.matrix, Matrix2.matrix
println 'Tr[A*B]'.t.toString(OutputFormat.Redberry)
   > A^{a'}_{b'}^{A'}_{B'}*B^{b'}_{a'}^{B'}_{A'}
println 'Tr[A*B, Matrix1]'.t.toString(OutputFormat.Redberry)
   > A^{a'}_{b'}^{A'}_{C'}*B^{b'}_{a'}^{C'}_{B'}
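The per-type behaviour of Tr can be sketched in the same spirit as the multiplication rule above. Below is a small plain-Python illustration (not Redberry code; the function trace and the dict representation are ours for illustration): free matrix indices are tracked per nonmetric type, and tracing over a type closes that type's indices while leaving the other types free:

```python
# Sketch of per-type trace bookkeeping (not Redberry code).
# Free matrix indices are tracked per nonmetric type as
# (upper, lower) counts.

def trace(free, types=None):
    """Close the matrix indices of the given types (default: all)."""
    over = set(types) if types is not None else set(free)
    return {t: (0, 0) if t in over else uv for t, uv in free.items()}

# The product A*B above carries one upper/lower pair of each type.
ab = {"Matrix1": (1, 1), "Matrix2": (1, 1)}

# Tr[A*B]: all matrix indices are contracted.
assert trace(ab) == {"Matrix1": (0, 0), "Matrix2": (0, 0)}

# Tr[A*B, Matrix1]: only the Matrix1 indices are contracted;
# the Matrix2 indices remain free.
assert trace(ab, ["Matrix1"]) == {"Matrix1": (0, 0), "Matrix2": (1, 1)}
```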

See also