The Triple Combo

Let \mathbf A \in \mathcal M_{2 \times 2}(\mathbb K) be a symmetric matrix with associated bilinear form [\mathbf u, \mathbf v ]_{\mathbf A} = \mathbf u^{\mathrm T} \mathbf A \mathbf v and quadrance Q_{\mathbf A}(\mathbf v) = [\mathbf v, \mathbf v]_{\mathbf A}. For \mathbf u, \mathbf v with nonzero quadrances, define the spread by

\displaystyle s_{\mathbf A}(\mathbf u, \mathbf v) := 1 - \frac{[\mathbf u, \mathbf v]_{\mathbf A}^2}{Q_{\mathbf A}(\mathbf u)Q_{\mathbf A}(\mathbf v)}.
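
For instance (with illustrative vectors of our own choosing), taking \mathbf A = \mathbf I to be the identity matrix, \mathbf u = \begin{bmatrix} 2 \\ 1 \end{bmatrix} and \mathbf v = \begin{bmatrix} 3 \\ 1 \end{bmatrix} gives [\mathbf u, \mathbf v]_{\mathbf I} = 7, Q_{\mathbf I}(\mathbf u) = 5 and Q_{\mathbf I}(\mathbf v) = 10, so

\displaystyle s_{\mathbf I}(\mathbf u, \mathbf v) = 1 - \frac{7^2}{5 \cdot 10} = \frac{1}{50},

which over \mathbb K = \mathbb R is the familiar \sin^2 of the angle between \mathbf u and \mathbf v.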

Problem 1. Define the Gram matrix by

\mathbf G_{\mathbf A}(\mathbf u, \mathbf v) = \begin{bmatrix} [\mathbf u, \mathbf u ]_{\mathbf A}  & [\mathbf u, \mathbf v ]_{\mathbf A} \\ [\mathbf v, \mathbf u ]_{\mathbf A} & [\mathbf v, \mathbf v ]_{\mathbf A}  \end{bmatrix}.

Using the Gram matrix, or otherwise, prove that

Q_{\mathbf A}(\mathbf u)Q_{\mathbf A}(\mathbf v) = [\mathbf u, \mathbf v]_{\mathbf A}^2 + \det(\mathbf A) \det(\mathbf u, \mathbf v)^2,

where \det(\mathbf u, \mathbf v) denotes the determinant of the 2 \times 2 matrix \begin{bmatrix} \mathbf u & \mathbf v \end{bmatrix} with columns \mathbf u and \mathbf v.

Deduce that

\displaystyle s_{\mathbf A}(\mathbf u, \mathbf v) = \frac{\det(\mathbf A) \det(\mathbf u, \mathbf v)^2}{Q_{\mathbf A}(\mathbf u) Q_{\mathbf A}(\mathbf v)}.


Solution. We first observe that the Gram matrix can be written as a product of matrices:

\displaystyle \begin{aligned} \mathbf G_{\mathbf A}(\mathbf u, \mathbf v) &= \begin{bmatrix} [\mathbf u, \mathbf u ]_{\mathbf A}  & [\mathbf u, \mathbf v ]_{\mathbf A} \\ [\mathbf v, \mathbf u ]_{\mathbf A} & [\mathbf v, \mathbf v ]_{\mathbf A}  \end{bmatrix} \\ &= \begin{bmatrix} \mathbf u^{\mathrm T} \mathbf A \mathbf u  & \mathbf u^{\mathrm T} \mathbf A \mathbf v \\ \mathbf v^{\mathrm T} \mathbf A \mathbf u & \mathbf v^{\mathrm T} \mathbf A \mathbf v  \end{bmatrix} \\ &= \begin{bmatrix} \mathbf u^{\mathrm T}  \\ \mathbf v^{\mathrm T} \end{bmatrix} \mathbf A \begin{bmatrix}\mathbf u & \mathbf v \end{bmatrix} \\ &= \begin{bmatrix} \mathbf u  & \mathbf v \end{bmatrix}^{\mathrm T} \mathbf A \begin{bmatrix}\mathbf u & \mathbf v \end{bmatrix} \end{aligned}

Taking determinants, using multiplicativity of the determinant together with \det(\mathbf M^{\mathrm T}) = \det(\mathbf M),

\begin{aligned} |\mathbf G_{\mathbf A}(\mathbf u,\mathbf v)| &= \left| \begin{bmatrix} \mathbf u  & \mathbf v \end{bmatrix}^{\mathrm T} \right| \det(\mathbf A) \left| \begin{bmatrix} \mathbf u  & \mathbf v \end{bmatrix} \right| \\ &=\left| \begin{bmatrix} \mathbf u  & \mathbf v \end{bmatrix}^{\mathrm T} \right| \det(\mathbf A) \left| \begin{bmatrix} \mathbf u  & \mathbf v \end{bmatrix}^{\mathrm T} \right| \\ &=\det(\mathbf A) \left| \begin{bmatrix} \mathbf u  & \mathbf v \end{bmatrix}^{\mathrm T} \right|^2 \\ &=\det(\mathbf A) \det(\mathbf u, \mathbf v)^2 \end{aligned}

On the other hand, by the symmetry of [\cdot, \cdot]_{\mathbf A}, directly computing |\mathbf G_{\mathbf A}(\mathbf u,\mathbf v)| yields

\begin{aligned}|\mathbf G_{\mathbf A}(\mathbf u,\mathbf v)| &= \left| \begin{bmatrix} [\mathbf u, \mathbf u ]_{\mathbf A}  & [\mathbf u, \mathbf v ]_{\mathbf A} \\ [\mathbf v, \mathbf u ]_{\mathbf A} & [\mathbf v, \mathbf v ]_{\mathbf A}  \end{bmatrix}  \right| \\ &= [\mathbf u, \mathbf u ]_{\mathbf A} [\mathbf v, \mathbf v ]_{\mathbf A} - [\mathbf u, \mathbf v ]_{\mathbf A} [\mathbf v, \mathbf u ]_{\mathbf A} \\ &= Q_{\mathbf A}(\mathbf u) Q_{\mathbf A}(\mathbf v) - [\mathbf u, \mathbf v]_{\mathbf A}^2. \end{aligned}

Therefore,

Q_{\mathbf A}(\mathbf u) Q_{\mathbf A}(\mathbf v) - [\mathbf u, \mathbf v]_{\mathbf A}^2 = \det(\mathbf A) \det(\mathbf u, \mathbf v)^2,

yielding the desired result. For the spread, we use its definition to conclude

\displaystyle \begin{aligned} s_{\mathbf A}(\mathbf u, \mathbf v) &= 1 - \frac{[\mathbf u, \mathbf v]_{\mathbf A}^2}{Q_{\mathbf A}(\mathbf u)Q_{\mathbf A}(\mathbf v)} \\  &= \frac{ Q_{\mathbf A}(\mathbf u)Q_{\mathbf A}(\mathbf v) - [\mathbf u, \mathbf v]_{\mathbf A}^2}{Q_{\mathbf A}(\mathbf u)Q_{\mathbf A}(\mathbf v)} \\ &= \frac{ \det(\mathbf A) \det(\mathbf u, \mathbf v)^2}{Q_{\mathbf A}(\mathbf u)Q_{\mathbf A}(\mathbf v)}. \end{aligned}
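
As a quick cross-check with the same illustrative vectors as above (\mathbf A = \mathbf I, \mathbf u = \begin{bmatrix} 2 \\ 1 \end{bmatrix}, \mathbf v = \begin{bmatrix} 3 \\ 1 \end{bmatrix}), we have \det(\mathbf u, \mathbf v) = 2 \cdot 1 - 1 \cdot 3 = -1, so the formula gives

\displaystyle s_{\mathbf I}(\mathbf u, \mathbf v) = \frac{1 \cdot (-1)^2}{5 \cdot 10} = \frac{1}{50},

agreeing with the direct computation from the definition.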

Problem 2. Define the symmetric matrices \mathbf I, \mathbf J, \mathbf K \in \mathcal M_{2 \times 2}(\mathbb K) by

\mathbf I := \begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix},\quad \mathbf J := \begin{bmatrix}1 & 0 \\ 0 & -1\end{bmatrix},\quad \mathbf K := \begin{bmatrix}0 & 1 \\ 1 & 0\end{bmatrix}.

Prove that for \mathbf u, \mathbf v with nonzero spreads (with respect to all three matrices),

\displaystyle \frac{1}{s_{\mathbf I}(\mathbf u, \mathbf v)} + \frac{1}{s_{\mathbf J}(\mathbf u, \mathbf v)} + \frac{1}{s_{\mathbf K}(\mathbf u, \mathbf v)} = 2.


Solution. We first observe that \det(\mathbf I) = 1 and \det(\mathbf J) = \det(\mathbf K) = -1, so by Problem 1,

\displaystyle \frac{1}{s_{\mathbf A}(\mathbf u, \mathbf v)} = \frac{Q_{\mathbf A}(\mathbf u) Q_{\mathbf A}(\mathbf v)}{\det(\mathbf A) \det(\mathbf u, \mathbf v)^2}

for each \mathbf A \in \{\mathbf I, \mathbf J, \mathbf K\}. Hence it suffices to prove that

Q_{\mathbf I}(\mathbf u)Q_{\mathbf I}(\mathbf v) - Q_{\mathbf J}(\mathbf u)Q_{\mathbf J}(\mathbf v) - Q_{\mathbf K}(\mathbf u)Q_{\mathbf K}(\mathbf v) = 2 \det(\mathbf u,\mathbf v)^2.

Writing \mathbf u = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} and \mathbf v = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix}, a direct computation yields

Q_{\mathbf I}(\mathbf u) = u_1^2 + u_2^2, \quad Q_{\mathbf J}(\mathbf u) = u_1^2 - u_2^2, \quad Q_{\mathbf K}(\mathbf u) = 2u_1u_2,

with the analogous expressions for \mathbf v.

On the other hand, \det(\mathbf u,\mathbf v) = u_1 v_2 - u_2 v_1. Expanding the left-hand side,

\displaystyle \begin{aligned} Q_{\mathbf I}(\mathbf u)Q_{\mathbf I}(\mathbf v) - Q_{\mathbf J}(\mathbf u)Q_{\mathbf J}(\mathbf v) - Q_{\mathbf K}(\mathbf u)Q_{\mathbf K}(\mathbf v) &= (u_1^2 + u_2^2)(v_1^2 + v_2^2) - (u_1^2 - u_2^2)(v_1^2 - v_2^2) - 4u_1 u_2 v_1 v_2 \\ &= 2u_1^2 v_2^2 + 2u_2^2 v_1^2 - 4u_1 u_2 v_1 v_2 \\ &= 2(u_1 v_2 - u_2 v_1)^2 \\ &= 2\det(\mathbf u, \mathbf v)^2, \end{aligned}

which is the desired identity.
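
As a final sanity check (again with our illustrative vectors \mathbf u = \begin{bmatrix} 2 \\ 1 \end{bmatrix} and \mathbf v = \begin{bmatrix} 3 \\ 1 \end{bmatrix}), Problem 1 gives

\displaystyle s_{\mathbf I}(\mathbf u, \mathbf v) = \frac{(-1)^2}{5 \cdot 10} = \frac{1}{50}, \quad s_{\mathbf J}(\mathbf u, \mathbf v) = \frac{-(-1)^2}{3 \cdot 8} = -\frac{1}{24}, \quad s_{\mathbf K}(\mathbf u, \mathbf v) = \frac{-(-1)^2}{4 \cdot 6} = -\frac{1}{24},

and indeed 50 - 24 - 24 = 2.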

—Joel Kindiak, 21 Mar 25, 1426H

