Easier proofs of three determinant identities

Shuling Gao1, Wenchang Chu2
1School of Mathematics and Statistics, Zhoukou Normal University, Zhoukou (Henan), China
2NITheCS, Via Dalmazio Birago 9/E, Lecce 73100, Italy

Abstract

Three remarkable determinant identities of skew-symmetric matrices are reviewed in a more transparent manner.

Keywords: determinant, Vandermonde determinant, skew-symmetric matrix

1. Introduction and Outline

There exist numerous determinant identities in the literature (cf. [4,9]). For example, the determinants of Vandermonde and Cauchy \[\Lambda_m = \det_{1 \leq i,j \leq m} \left[x_i^{m-j}\right] = \prod_{1 \leq i < j \leq m} (x_i - x_j),\] \[\det_{1 \leq i,j \leq m} \left[ \frac{1}{x_i + y_j} \right] = \frac{\prod_{1 \leq i < j \leq m} (x_i - x_j)(y_i - y_j)} {\prod_{1 \leq i,j \leq m} (x_i + y_j)}\] play an important role in symmetric functions and group characters (cf. [3,5]).
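Both evaluations are easy to spot-check numerically. The following sketch (ours, not part of the original text; the values of \(x\) and \(y\) are arbitrary choices and `det` is a naive cofactor expansion) verifies the two formulae for \(m = 4\) in exact rational arithmetic:

```python
from fractions import Fraction
from math import prod

def det(M):
    # Cofactor expansion along the first row; adequate for small matrices.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * a * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j, a in enumerate(M[0]))

m = 4
x = [Fraction(v) for v in (2, 3, 5, 7)]
y = [Fraction(v) for v in (1, 4, 6, 9)]

# Vandermonde: det[x_i^{m-j}] = prod_{i<j} (x_i - x_j)
vandermonde = [[x[i] ** (m - 1 - j) for j in range(m)] for i in range(m)]
Lambda_m = prod(x[i] - x[j] for i in range(m) for j in range(i + 1, m))
assert det(vandermonde) == Lambda_m

# Cauchy: det[1/(x_i + y_j)] = prod_{i<j}(x_i - x_j)(y_i - y_j) / prod_{i,j}(x_i + y_j)
cauchy = [[1 / (x[i] + y[j]) for j in range(m)] for i in range(m)]
num = prod((x[i] - x[j]) * (y[i] - y[j]) for i in range(m) for j in range(i + 1, m))
den = prod(x[i] + y[j] for i in range(m) for j in range(m))
assert det(cauchy) == num / den
```

Exact `Fraction` arithmetic avoids the rounding error a floating-point check would incur on the Cauchy matrix.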

A special class of determinants concerns skew-symmetric matrices. For a given skew-symmetric matrix \({M}\), it is well known that \(\det{M}=0\) when the order of \({M}\) is odd. When the order of \({M}\) is even, \(\det{M}\) is instead the square of a polynomial in the entries of \({M}\), called the Pfaffian of \({M}\).

In this short article, we shall review three important determinants of skew-symmetric matrices. Let \(T\) and \(\{x_k\}_{1\le k\le 2n}\) be indeterminates. Define three skew-symmetric matrices by

\[U_{2n} = \left[u_{i,j}\right]_{1 \leq i,j \leq 2n} : \quad u_{i,j} = \frac{(x_i^n - x_j^n)^2}{x_i - x_j},\label{e1} \tag{1}\] \[V_{2n} = \left[v_{i,j}\right]_{1 \leq i,j \leq 2n} : \quad v_{i,j} = \frac{x_i - x_j}{x_i + x_j}, \label{e2}\tag{2}\] \[W_{2n} = \left[w_{i,j}\right]_{1 \leq i,j \leq 2n} : \quad w_{i,j} = \frac{x_i - x_j}{1 - Tx_i x_j}.\label{e3} \tag{3}\]

Their determinants are explicitly given by the following remarkable theorems.

Theorem 1.1. (Ishikawa and Wakayama [1]). \[\det U_{2n} = \prod_{1 \leq i < j \leq 2n} (x_i - x_j)^2.\]

Theorem 1.2. (Schur [6]). \[\det V_{2n} = \prod_{1 \leq i < j \leq 2n} \left( \frac{x_i - x_j}{x_i + x_j} \right)^2.\]

Theorem 1.3. (Sundquist [8]). \[\det W_{2n} = T^{2n(n-1)} \prod_{1 \leq i < j \leq 2n} \left( \frac{x_i - x_j}{1 - T x_i x_j} \right)^2.\]

When \(T = 1\), the last theorem reduces to the following evaluation due to Lascoux–Thorup [2] (see also [7]): \[\det_{1 \leq i,j \leq 2n} \left[ \frac{x_i - x_j}{1 - x_i x_j} \right] = \prod_{1 \leq i < j \leq 2n} \left( \frac{x_i - x_j}{1 - x_i x_j} \right)^2.\]
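All three evaluations can also be spot-checked numerically before turning to the proofs. The sketch below (ours; the values \(x = (2,3,5,7)\) and \(T = 1/100\) are arbitrary choices) verifies the theorems for \(2n = 4\) in exact arithmetic, taking the diagonal entries of \(U_{2n}\) to be zero, as skew-symmetry requires:

```python
from fractions import Fraction
from math import prod

def det(M):
    # Cofactor expansion along the first row; adequate for small matrices.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * a * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j, a in enumerate(M[0]))

n = 2
N = 2 * n
x = [Fraction(v) for v in (2, 3, 5, 7)]
T = Fraction(1, 100)
pairs = [(i, j) for i in range(N) for j in range(i + 1, N)]

# Theorem 1.1: det U_{2n} = prod_{i<j} (x_i - x_j)^2
U = [[(x[i] ** n - x[j] ** n) ** 2 / (x[i] - x[j]) if i != j else Fraction(0)
      for j in range(N)] for i in range(N)]
assert det(U) == prod((x[i] - x[j]) ** 2 for i, j in pairs)

# Theorem 1.2: det V_{2n} = prod_{i<j} ((x_i - x_j)/(x_i + x_j))^2
V = [[(x[i] - x[j]) / (x[i] + x[j]) for j in range(N)] for i in range(N)]
assert det(V) == prod(((x[i] - x[j]) / (x[i] + x[j])) ** 2 for i, j in pairs)

# Theorem 1.3: det W_{2n} = T^{2n(n-1)} prod_{i<j} ((x_i - x_j)/(1 - T x_i x_j))^2
W = [[(x[i] - x[j]) / (1 - T * x[i] * x[j]) for j in range(N)] for i in range(N)]
assert det(W) == T ** (2 * n * (n - 1)) * prod(
    ((x[i] - x[j]) / (1 - T * x[i] * x[j])) ** 2 for i, j in pairs)
```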

We shall present new and easier proofs of these three formulae in the next three sections, which we believe to be more transparent and accessible to readers.

2. Proof of Theorem 1.1

We first rewrite the matrix entry \(u_{i,j}\) of \(U_{2n}\) as \[u_{i,j} = \frac{(x_i^n - x_j^n)^2}{x_i - x_j} = (x_i^n - x_j^n) \sum_{k=1}^{n} x_i^{n-k} x_j^{k-1}\] \[= \sum_{k=1}^{n} x_i^{2n-k} x_j^{k-1} - \sum_{k=n+1}^{2n} x_i^{2n-k} x_j^{k-1},\] where we have made the replacement \(k \to k - n\) in the rightmost sum.

Then we can express the matrix \(U_{2n}\) as a product of two matrices \[U_{2n} = U_{2n}' U_{2n}'',\] where \(U_{2n}'\) and \(U_{2n}''\) are explicitly given by \[U_{2n}' = [u_{i,j}'] : \quad u_{i,j}' = x_i^{2n-j},\] \[U_{2n}'' = [u_{i,j}''] : \quad u_{i,j}'' = \begin{cases} x_j^{i-1}, & 1 \leq i \leq n; \\ -x_j^{i-1}, & n < i \leq 2n. \end{cases}\] Indeed, the \((i,j)\)-entry of the product, \(\sum_{k=1}^{2n} u_{i,k}' u_{k,j}'' = \sum_{k=1}^{n} x_i^{2n-k} x_j^{k-1} - \sum_{k=n+1}^{2n} x_i^{2n-k} x_j^{k-1}\), is exactly \(u_{i,j}\).

Evaluating the two determinants by the Vandermonde determinant, we obtain \[\det U_{2n}' = \Lambda_{2n}, \quad \det U_{2n}'' = (-1)^n \prod_{1 \leq i < j \leq 2n} (x_j - x_i) = \Lambda_{2n},\] where the sign \((-1)^n\) accounts for the \(n\) negated rows of \(U_{2n}''\). Their product results in \(\Lambda_{2n}^2\), which confirms the formula in Theorem 1.1. ◻
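As a safeguard against index slips, the factorization and the two Vandermonde evaluations can be checked numerically for \(n = 2\). In the sketch below (ours, with 0-based indices), the second factor is entered as \(u_{i,j}'' = \pm x_j^{i-1}\) so that the matrix product reproduces \(u_{i,j}\):

```python
from fractions import Fraction
from math import prod

def det(M):
    # Cofactor expansion along the first row; adequate for small matrices.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * a * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j, a in enumerate(M[0]))

n = 2
N = 2 * n
x = [Fraction(v) for v in (2, 3, 5, 7)]

# u'_{i,j} = x_i^{2n-j}; u''_{i,j} = x_j^{i-1} with rows n+1..2n negated
Up = [[x[i] ** (N - 1 - j) for j in range(N)] for i in range(N)]
Upp = [[(1 if i < n else -1) * x[j] ** i for j in range(N)] for i in range(N)]

U = [[(x[i] ** n - x[j] ** n) ** 2 / (x[i] - x[j]) if i != j else Fraction(0)
      for j in range(N)] for i in range(N)]
product = [[sum(Up[i][k] * Upp[k][j] for k in range(N)) for j in range(N)]
           for i in range(N)]
assert product == U  # the factorization U_{2n} = U'_{2n} U''_{2n}

Lambda_N = prod(x[i] - x[j] for i in range(N) for j in range(i + 1, N))
assert det(Up) == Lambda_N and det(Upp) == Lambda_N
assert det(U) == Lambda_N ** 2
```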

3. Proof of Theorem 1.2

For the matrix \(V_{2n}\), by subtracting the last row from the other rows, we can check that the resulting matrix becomes \[V_{2n}^a = [v_{i,j}^a]_{1 \leq i,j \leq 2n} : \quad v_{i,j}^a = \begin{cases} \frac{2(x_i - x_{2n}) x_j}{(x_i + x_j)(x_{2n} + x_j)}, & 1 \leq i < 2n; \\ \frac{x_{2n} - x_j}{x_{2n} + x_j}, & i = 2n. \end{cases}\]

Then for the matrix \(V_{2n}^a\), making the same operations on the columns, i.e., subtracting the last column from the other columns yields the matrix \[V_{2n}^b = [v_{i,j}^b]_{1 \leq i,j \leq 2n} : \quad v_{i,j}^b = \begin{cases} \frac{(x_i - x_j)(x_i - x_{2n})(x_j - x_{2n})}{(x_i + x_j)(x_i + x_{2n})(x_j + x_{2n})}, & 1 \leq i,j < 2n; \\ \frac{x_{2n} - x_j}{x_{2n} + x_j}, & i = 2n; \\ \frac{x_i - x_{2n}}{x_i + x_{2n}}, & j = 2n. \end{cases}\]

By extracting the common row factor \(\frac{x_i - x_{2n}}{x_i + x_{2n}}\) and the common column factor \(\frac{x_j - x_{2n}}{x_j + x_{2n}}\) for \(1 \leq i, j < 2n\), we find the following determinant equality \[\label{e4} \det V_{2n} = \det V_{2n}^c \times \prod_{i=1}^{2n-1} \left( \frac{x_i - x_{2n}}{x_i + x_{2n}} \right)^2, \tag{4}\] where the matrix \(V_{2n}^c\) is given by the following simpler form \[V_{2n}^c = [v_{i,j}^c]_{1 \leq i,j \leq 2n} : \quad v_{i,j}^c = \begin{cases} \frac{x_i - x_j}{x_i + x_j}, & 1 \leq i, j < 2n; \\ \chi(j = 2n) - 1, & i = 2n; \\ 1 - \chi(i = 2n), & j = 2n. \end{cases}\]
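Identity (4) is easy to verify numerically for \(2n = 4\). The sketch below (ours; the values of \(x\) are arbitrary) builds \(V_{2n}\) and \(V_{2n}^c\) explicitly, with last row \((-1,\dots,-1,0)\) and last column \((1,\dots,1,0)\):

```python
from fractions import Fraction
from math import prod

def det(M):
    # Cofactor expansion along the first row; adequate for small matrices.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * a * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j, a in enumerate(M[0]))

N = 4  # 2n with n = 2
x = [Fraction(v) for v in (2, 3, 5, 7)]

V = [[(x[i] - x[j]) / (x[i] + x[j]) for j in range(N)] for i in range(N)]

# V^c keeps the (2n-1)x(2n-1) leading block of V; the border entries are 0, +-1
Vc = [[(x[i] - x[j]) / (x[i] + x[j]) for j in range(N - 1)] + [Fraction(1)]
      for i in range(N - 1)]
Vc.append([Fraction(-1)] * (N - 1) + [Fraction(0)])

factor = prod(((x[i] - x[N - 1]) / (x[i] + x[N - 1])) ** 2 for i in range(N - 1))
assert det(V) == det(Vc) * factor  # identity (4)
```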

The matrix \(V_{2n}^c\) can further be reduced analogously by repeating the preceding row and column operations. First subtracting the penultimate row from the other rows (except for the last one) gives rise to the following matrix \[V_{2n}^d = [v_{i,j}^d]_{1 \leq i,j \leq 2n} : \quad v_{i,j}^d = \begin{cases} \frac{2(x_i - x_{2n-1}) x_j}{(x_i + x_j)(x_{2n-1} + x_j)}, & 1 \leq i \leq 2n - 2 \text{ and } j < 2n; \\ 0, & 1 \leq i \leq 2n - 2 \text{ and } j = 2n; \\ \frac{x_{2n-1} - x_j}{x_{2n-1} + x_j}, & i = 2n - 1 \text{ and } j < 2n; \\ 1, & i = 2n - 1 \text{ and } j = 2n; \\ \chi(j = 2n) - 1, & i = 2n. \end{cases}\]

Then for the matrix \(V_{2n}^d\), subtracting the penultimate column from the other columns (except for the last one) leads us to the matrix \[V_{2n}^e = [v_{i,j}^e]_{1 \leq i,j \leq 2n},\]

where the matrix entries are explicitly displayed as below \[v_{i,j}^e = \begin{cases} \frac{(x_i - x_j)(x_i - x_{2n-1})(x_j - x_{2n-1})}{(x_i + x_j)(x_i + x_{2n-1})(x_j + x_{2n-1})}, & 1 \leq i,j \leq 2n - 2; \\ \frac{x_{2n-1} - x_j}{x_{2n-1} + x_j}, & i = 2n - 1 \text{ and } j < 2n; \\ \frac{x_i - x_{2n-1}}{x_i + x_{2n-1}}, & j = 2n - 1 \text{ and } i \leq 2n - 2; \\ -\chi(j = 2n - 1), & i = 2n; \\ \chi(i = 2n - 1), & j = 2n. \end{cases}\]

Now by extracting the common row factor \(\frac{x_i - x_{2n-1}}{x_i + x_{2n-1}}\) and the common column factor \(\frac{x_j - x_{2n-1}}{x_j + x_{2n-1}}\) for \(1 \leq i, j \leq 2n - 2\), we find the following determinant equality \[\label{e5} \det V_{2n}^e = \det V_{2n}^f \times \prod_{i=1}^{2n-2} \left( \frac{x_i - x_{2n-1}}{x_i + x_{2n-1}} \right)^2, \tag{5}\] where the matrix \(V_{2n}^f\) is given by the following simpler form \[V_{2n}^f = [v_{i,j}^f]_{1 \leq i,j \leq 2n} : \quad v_{i,j}^f = \begin{cases} \frac{x_i - x_j}{x_i + x_j}, & 1 \leq i, j \leq 2n - 2; \\ \chi(j = 2n - 1) - 1, & i = 2n - 1 \text{ and } j \neq 2n; \\ 1 - \chi(i = 2n - 1), & j = 2n - 1 \text{ and } i \neq 2n; \\ -\chi(j = 2n - 1), & i = 2n; \\ \chi(i = 2n - 1), & j = 2n. \end{cases}\]

Finally for the matrix \(V_{2n}^f\), by adding first the last row to the other rows and then the last column to the other columns, we reduce it to the following double-bordered matrix in blocks: \[\begin{bmatrix} V_{2n-2} & 0 & 0 \\ 0 & 0 & 1 \\ 0 & -1 & 0 \end{bmatrix}.\]

Expanding the determinant along the last two rows and columns leads us to the determinant equality \[\det V_{2n}^f = \det V_{2n-2}.\] By combining this with (4) and (5), we establish the following recurrence relation \[\det V_{2n} = \det V_{2n-2} \times \prod_{k=1}^{2} \prod_{i=1}^{2n-k} \left( \frac{x_i - x_{1+2n-k}}{x_i + x_{1+2n-k}} \right)^2.\] Iterating this relation \((n - 1)\) times and taking into account that \[\det V_2 = \left( \frac{x_1 - x_2}{x_1 + x_2} \right)^2\]

we derive the explicit expression \[\begin{aligned} \det V_{2n} &= \det V_2 \times \prod_{k=1}^{2n-2} \prod_{i=1}^{2n-k} \left( \frac{x_i - x_{1+2n-k}}{x_i + x_{1+2n-k}} \right)^2\\ &= \prod_{k=1}^{2n-1} \prod_{i=1}^{2n-k} \left( \frac{x_i - x_{1+2n-k}}{x_i + x_{1+2n-k}} \right)^2\\ &= \prod_{j=2}^{2n} \prod_{i=1}^{j-1} \left( \frac{x_i - x_j}{x_i + x_j} \right)^2\\ &= \prod_{1 \leq i < j \leq 2n} \left( \frac{x_i - x_j}{x_i + x_j} \right)^2. \end{aligned}\] This completes the proof of Theorem 1.2. ◻
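The recurrence \(\det V_{2n} = \det V_{2n-2} \times \prod_{k=1}^{2} \prod_{i=1}^{2n-k} \big( \frac{x_i - x_{1+2n-k}}{x_i + x_{1+2n-k}} \big)^2\) used above can be confirmed numerically for \(2n = 4\); the sketch below is ours and is not part of the proof:

```python
from fractions import Fraction
from math import prod

def det(M):
    # Cofactor expansion along the first row; adequate for small matrices.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * a * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j, a in enumerate(M[0]))

N = 4  # 2n with n = 2
x = [Fraction(v) for v in (2, 3, 5, 7)]

def Vmat(idx):
    # V restricted to the variables x_i with i in idx
    return [[(x[i] - x[j]) / (x[i] + x[j]) for j in idx] for i in idx]

lhs = det(Vmat(range(N)))       # det V_4
rhs = det(Vmat(range(2))) * prod(  # det V_2 times the k = 1, 2 factors
    ((x[i] - x[N - k]) / (x[i] + x[N - k])) ** 2
    for k in (1, 2) for i in range(N - k))
assert lhs == rhs
```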

4. Proof of Theorem 1.3

By following exactly the same procedure as done in the last section, we can explicitly evaluate the determinant for the matrix \(W_{2n}\). Subtracting the last row from the other rows transforms \(W_{2n}\) into the following one: \[W_{2n}^a = [w_{i,j}^a]_{1 \leq i,j \leq 2n} : \quad w_{i,j}^a = \begin{cases} \frac{(1 - T x_j^2)(x_i - x_{2n})}{(1 - T x_i x_j)(1 - T x_j x_{2n})}, & 1 \leq i < 2n; \\ \frac{x_{2n} - x_j}{1 - T x_j x_{2n}}, & i = 2n. \end{cases}\]

Then for the matrix \(W_{2n}^a\), making the same operations on the columns, i.e., subtracting the last column from the other columns yields the matrix \[W_{2n}^b = [w_{i,j}^b]_{1 \leq i,j \leq 2n},\] where the matrix entries read explicitly as \[w_{i,j}^b = \begin{cases} \frac{T (x_i - x_j)(x_i - x_{2n})(x_j - x_{2n})}{(1 - T x_i x_j)(1 - T x_i x_{2n})(1 - T x_j x_{2n})}, & 1 \leq i,j < 2n; \\ \frac{x_{2n} - x_j}{1 - T x_j x_{2n}}, & i = 2n; \\ \frac{x_i - x_{2n}}{1 - T x_i x_{2n}}, & j = 2n. \end{cases}\]

By extracting the common row factor \(\frac{x_i - x_{2n}}{1 - T x_i x_{2n}}\) and the common column factor \(\frac{x_j - x_{2n}}{1 - T x_j x_{2n}}\) for \(1 \leq i, j < 2n\), we find the following determinant equality \[\label{e6} \det W_{2n} = \det W_{2n}^c \times \prod_{i=1}^{2n-1} \left( \frac{x_i - x_{2n}}{1 - T x_i x_{2n}} \right)^2, \tag{6}\] where the matrix \(W_{2n}^c\) is given by the following simpler form \[W_{2n}^c = [w_{i,j}^c]_{1 \leq i,j \leq 2n} : \quad w_{i,j}^c = \begin{cases} \frac{T(x_i - x_j)}{1 - T x_i x_j}, & 1 \leq i, j < 2n; \\ \chi(j = 2n) - 1, & i = 2n; \\ 1 - \chi(i = 2n), & j = 2n. \end{cases}\]

The matrix \(W_{2n}^c\) can further be reduced analogously by repeating the preceding row and column operations. First subtracting the penultimate row from the other rows (except for the last one) gives rise to the following matrix \[W_{2n}^d = [w_{i,j}^d]_{1 \leq i,j \leq 2n} : \quad w_{i,j}^d = \begin{cases} \frac{T(1 - T x_j^2)(x_i - x_{2n-1})}{(1 - T x_i x_j)(1 - T x_j x_{2n-1})}, & 1 \leq i \leq 2n - 2 \text{ and } j < 2n; \\ 0, & 1 \leq i \leq 2n - 2 \text{ and } j = 2n; \\ \frac{T(x_{2n-1} - x_j)}{1 - T x_j x_{2n-1}}, & i = 2n - 1 \text{ and } j < 2n; \\ 1, & i = 2n - 1 \text{ and } j = 2n; \\ \chi(j = 2n) - 1, & i = 2n. \end{cases}\]

Then for the matrix \(W_{2n}^d\), subtracting the penultimate column from the other columns (except for the last one) leads us to the matrix \[W_{2n}^e = [w_{i,j}^e]_{1 \leq i,j \leq 2n},\] where the matrix entries are explicitly given by \[w_{i,j}^e = \begin{cases} \frac{T^2(x_i - x_j)(x_i - x_{2n-1})(x_j - x_{2n-1})}{(1 - T x_i x_j)(1 - T x_i x_{2n-1})(1 - T x_j x_{2n-1})}, & 1 \leq i, j \leq 2n - 2; \\ \frac{T(x_{2n-1} - x_j)}{1 - T x_j x_{2n-1}}, & i = 2n - 1 \text{ and } j < 2n; \\ \frac{T(x_i - x_{2n-1})}{1 - T x_i x_{2n-1}}, & j = 2n - 1 \text{ and } i \leq 2n - 2; \\ -\chi(j = 2n - 1), & i = 2n; \\ \chi(i = 2n - 1), & j = 2n. \end{cases}\]

Now by extracting the common row factor \(\frac{T(x_i - x_{2n-1})}{1 - T x_i x_{2n-1}}\) and the common column factor \(\frac{T(x_j - x_{2n-1})}{1 - T x_j x_{2n-1}}\) for \(1 \leq i, j \leq 2n - 2\), we find the following determinant equality \[\label{e7} \det W_{2n}^e = \det W_{2n}^f \times \prod_{i=1}^{2n-2} \left( \frac{T(x_i - x_{2n-1})}{1 - T x_i x_{2n-1}} \right)^2, \tag{7}\] where the matrix \(W_{2n}^f\) is given by the following simpler form \[W_{2n}^f = [w_{i,j}^f]_{1 \leq i,j \leq 2n} : \quad w_{i,j}^f = \begin{cases} \frac{x_i - x_j}{1 - T x_i x_j}, & 1 \leq i, j \leq 2n - 2; \\ \chi(j = 2n - 1) - 1, & i = 2n - 1 \text{ and } j \neq 2n; \\ 1 - \chi(i = 2n - 1), & j = 2n - 1 \text{ and } i \neq 2n; \\ -\chi(j = 2n - 1), & i = 2n; \\ \chi(i = 2n - 1), & j = 2n. \end{cases}\]

Finally for the matrix \(W_{2n}^f\), by adding first the last row to the other rows and then the last column to the other columns, we reduce it to the following double-bordered matrix in blocks:

\[\begin{bmatrix} W_{2n-2} & 0 & 0 \\ 0 & 0 & 1 \\ 0 & -1 & 0 \end{bmatrix}.\]

Expanding the determinant along the last two rows and columns leads us to the determinant equality \[\det W_{2n}^f = \det W_{2n-2}.\]

By combining this with (6) and (7), we establish the following recurrence relation \[\det W_{2n} = \det W_{2n-2} \times T^{4n-4} \prod_{k=1}^{2} \prod_{i=1}^{2n-k} \left( \frac{x_i - x_{1+2n-k}}{1 - T x_i x_{1+2n-k}} \right)^2.\]
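This recurrence can likewise be confirmed numerically for \(2n = 4\); the sketch below is ours (the values of \(x\) and \(T\) are arbitrary choices):

```python
from fractions import Fraction
from math import prod

def det(M):
    # Cofactor expansion along the first row; adequate for small matrices.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * a * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j, a in enumerate(M[0]))

n = 2
N = 2 * n
x = [Fraction(v) for v in (2, 3, 5, 7)]
T = Fraction(1, 100)

def Wmat(idx):
    # W restricted to the variables x_i with i in idx
    return [[(x[i] - x[j]) / (1 - T * x[i] * x[j]) for j in idx] for i in idx]

lhs = det(Wmat(range(N)))  # det W_4
rhs = det(Wmat(range(2))) * T ** (4 * n - 4) * prod(
    ((x[i] - x[N - k]) / (1 - T * x[i] * x[N - k])) ** 2
    for k in (1, 2) for i in range(N - k))
assert lhs == rhs
```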

Iterating this relation \((n - 1)\) times and taking into account that \[\det W_2 = \left( \frac{x_1 - x_2}{1 - T x_1 x_2} \right)^2,\] we derive the explicit expression \[\begin{aligned} \det W_{2n} &= \det W_2 \times T^{2n(n-1)} \prod_{k=1}^{2n-2} \prod_{i=1}^{2n-k} \left( \frac{x_i - x_{1+2n-k}}{1 - T x_i x_{1+2n-k}} \right)^2\\ &= T^{2n(n-1)} \prod_{k=1}^{2n-1} \prod_{i=1}^{2n-k} \left( \frac{x_i - x_{1+2n-k}}{1 - T x_i x_{1+2n-k}} \right)^2\\ &= T^{2n(n-1)} \prod_{j=2}^{2n} \prod_{i=1}^{j-1} \left( \frac{x_i - x_j}{1 - T x_i x_j} \right)^2\\ &= T^{2n(n-1)} \prod_{1 \leq i < j \leq 2n} \left( \frac{x_i - x_j}{1 - T x_i x_j} \right)^2. \end{aligned}\] This completes the proof of Theorem 1.3. ◻

References

  1. M. Ishikawa and M. Wakayama. Applications of minor summation formula III: Plücker relations, lattice paths and Pfaffian identities. Journal of Combinatorial Theory, Series A, 113(1):113–155, 2006. https://doi.org/10.1016/j.jcta.2005.05.008.
  2. D. Laksov, A. Lascoux, and A. Thorup. On Giambelli's theorem on complete correlations. Acta Mathematica, 162:143–199, 1989.
  3. I. G. Macdonald. Symmetric Functions and Hall Polynomials. Oxford University Press, 1998.
  4. T. Muir and W. H. Metzler. A Treatise on the Theory of Determinants. Courier Corporation, 2003.
  5. S. Okada. Pfaffian formulas and Schur Q-function identities. Advances in Mathematics, 353:446–470, 2019. https://doi.org/10.1016/j.aim.2019.07.006.
  6. J. Schur. Über die Darstellung der symmetrischen und der alternierenden Gruppe durch gebrochene lineare Substitutionen. Journal für die reine und angewandte Mathematik, 139:155–250, 1911.
  7. J. R. Stembridge. Nonintersecting paths, Pfaffians, and plane partitions. Advances in Mathematics, 83:96–131, 1990.
  8. T. S. Sundquist. Pfaffians, Involutions, and Schur Functions. PhD thesis, University of Minnesota, 1992.
  9. R. Vein and P. Dale. Determinants and their applications in mathematical physics, volume 134. Springer Science & Business Media, 2006.