Skew-symmetric matrix
In mathematics, particularly in linear algebra, a skew-symmetric (or antisymmetric or antimetric[1]) matrix is a square matrix whose transpose equals its negative. That is, it satisfies the condition[2]: p. 38

$$A {\text{ skew-symmetric}} \quad \iff \quad A^{\textsf{T}} = -A.$$

In terms of the entries of the matrix, if $a_{ij}$ denotes the entry in the $i$-th row and $j$-th column, then the skew-symmetric condition is equivalent to

$$a_{ji} = -a_{ij}.$$
Example
The matrix

$$A = \begin{bmatrix} 0 & 2 & -45 \\ -2 & 0 & -4 \\ 45 & 4 & 0 \end{bmatrix}$$

is skew-symmetric because

$$-A = \begin{bmatrix} 0 & -2 & 45 \\ 2 & 0 & 4 \\ -45 & -4 & 0 \end{bmatrix} = A^{\textsf{T}}.$$
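This is easy to check programmatically; a minimal sketch in NumPy:

```python
import numpy as np

# The example matrix from above.
A = np.array([[ 0,  2, -45],
              [-2,  0,  -4],
              [45,  4,   0]])

# Skew-symmetry means the transpose equals the negative.
assert np.array_equal(A.T, -A)
```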
Properties
Throughout, we assume that all matrix entries belong to a field whose characteristic is not equal to 2. That is, we assume that $1 + 1 \neq 0$, where 1 denotes the multiplicative identity and 0 the additive identity of the given field. If the characteristic of the field is 2, then a skew-symmetric matrix is the same thing as a symmetric matrix.
- The sum of two skew-symmetric matrices is skew-symmetric.
- A scalar multiple of a skew-symmetric matrix is skew-symmetric.
- The elements on the diagonal of a skew-symmetric matrix are zero, and therefore its trace equals zero.
- If $A$ is a real skew-symmetric matrix and $\lambda$ is a real eigenvalue, then $\lambda = 0$, i.e. the nonzero eigenvalues of a skew-symmetric matrix are non-real.
- If $A$ is a real skew-symmetric matrix, then $I + A$ is invertible, where $I$ is the identity matrix.
- If $A$ is a skew-symmetric matrix then $A^2$ is a symmetric negative semi-definite matrix.
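These properties can be confirmed numerically; the following is a minimal NumPy sketch (the random test matrix and the tolerances are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# M - M^T is skew-symmetric for any square M, giving a random test matrix.
M = rng.standard_normal((4, 4))
A = M - M.T

# Zero diagonal, hence zero trace.
assert np.allclose(np.diag(A), 0)

# Eigenvalues are purely imaginary (real parts vanish).
assert np.allclose(np.linalg.eigvals(A).real, 0)

# I + A is invertible.
assert abs(np.linalg.det(np.eye(4) + A)) > 1e-12

# A^2 is symmetric negative semi-definite.
A2 = A @ A
assert np.allclose(A2, A2.T)
assert np.all(np.linalg.eigvalsh(A2) <= 1e-12)
```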
Vector space structure
As a result of the first two properties above, the set of all skew-symmetric matrices of a fixed size forms a vector space. The space of $n \times n$ skew-symmetric matrices has dimension $\frac{1}{2}n(n - 1)$.

Let $\text{Mat}_n$ denote the space of $n \times n$ matrices. A skew-symmetric matrix is determined by $\frac{1}{2}n(n - 1)$ scalars (the number of entries above the main diagonal); a symmetric matrix is determined by $\frac{1}{2}n(n + 1)$ scalars (the number of entries on or above the main diagonal). Let $\text{Skew}_n$ denote the space of $n \times n$ skew-symmetric matrices and $\text{Sym}_n$ denote the space of $n \times n$ symmetric matrices. If $A \in \text{Mat}_n$, then

$$A = \frac{1}{2}\left(A - A^{\textsf{T}}\right) + \frac{1}{2}\left(A + A^{\textsf{T}}\right).$$

Notice that $\frac{1}{2}\left(A - A^{\textsf{T}}\right) \in \text{Skew}_n$ and $\frac{1}{2}\left(A + A^{\textsf{T}}\right) \in \text{Sym}_n$. This is true for every square matrix $A$ with entries from any field whose characteristic is different from 2. Then, since $\text{Mat}_n = \text{Skew}_n + \text{Sym}_n$ and $\text{Skew}_n \cap \text{Sym}_n = \{0\}$,

$$\text{Mat}_n = \text{Skew}_n \oplus \text{Sym}_n,$$

where $\oplus$ denotes the direct sum.
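A short NumPy sketch of this decomposition (with a random matrix standing in for $A$):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))   # an arbitrary square matrix

skew = (A - A.T) / 2              # projection onto Skew_n
sym  = (A + A.T) / 2              # projection onto Sym_n

assert np.allclose(skew, -skew.T) # lies in Skew_n
assert np.allclose(sym, sym.T)    # lies in Sym_n
assert np.allclose(skew + sym, A) # the direct sum recovers A
```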
Denote by $\langle \cdot, \cdot \rangle$ the standard inner product on $\mathbb{R}^n$. The real $n \times n$ matrix $A$ is skew-symmetric if and only if

$$\langle Ax, y \rangle = -\langle x, Ay \rangle \quad \text{ for all } x, y \in \mathbb{R}^n.$$

This is also equivalent to $x^{\textsf{T}} A x = 0$ for all $x \in \mathbb{R}^n$ (one implication being obvious, the other a plain consequence of $(x + y)^{\textsf{T}} A (x + y) = 0$ for all $x$ and $y$). Since this definition is independent of the choice of basis, skew-symmetry is a property that depends only on the linear operator $A$ and a choice of inner product.
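A minimal numerical check of this characterization (illustrative random data):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M - M.T                                  # real skew-symmetric
x, y = rng.standard_normal(4), rng.standard_normal(4)

# <Ax, y> = -<x, Ay> for the standard inner product ...
assert np.isclose((A @ x) @ y, -(x @ (A @ y)))

# ... equivalently, <x, Ax> = 0 for every x.
assert np.isclose(x @ (A @ x), 0)
```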
$3 \times 3$ skew-symmetric matrices can be used to represent cross products as matrix multiplications. Furthermore, if $A$ is a skew-symmetric matrix, then $x^{\textsf{T}} A x = 0$ for all $x \in \mathbb{C}^n$; if $A$ is skew-Hermitian, then $x^{*} A x$ is purely imaginary for all $x \in \mathbb{C}^n$.
Determinant
Let $A$ be an $n \times n$ skew-symmetric matrix. The determinant of $A$ satisfies

$$\det\left(A^{\textsf{T}}\right) = \det(-A) = (-1)^n \det(A).$$

In particular, if $n$ is odd, and since the underlying field is not of characteristic 2, the determinant vanishes. Hence, all odd-dimensional skew-symmetric matrices are singular, as their determinants are always zero. This result is called Jacobi's theorem, after Carl Gustav Jacobi (Eves, 1980).
The even-dimensional case is more interesting. It turns out that the determinant of $A$ for $n$ even can be written as the square of a polynomial in the entries of $A$, which was first proved by Cayley:[3]

$$\det(A) = \operatorname{Pf}(A)^2.$$

This polynomial is called the Pfaffian of $A$ and is denoted $\operatorname{Pf}(A)$. Thus the determinant of a real skew-symmetric matrix is always non-negative. However, this last fact can be proved in an elementary way as follows: the eigenvalues of a real skew-symmetric matrix are purely imaginary (see below), and to every eigenvalue there corresponds the conjugate eigenvalue with the same multiplicity; therefore, as the determinant is the product of the eigenvalues, each one repeated according to its multiplicity, it follows at once that the determinant, if it is not 0, is a positive real number.
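Both facts are easy to check numerically; a sketch using the standard explicit $4 \times 4$ Pfaffian formula $\operatorname{Pf}(A) = a_{12}a_{34} - a_{13}a_{24} + a_{14}a_{23}$:

```python
import numpy as np

rng = np.random.default_rng(3)

# Odd order: the determinant vanishes (Jacobi's theorem).
M = rng.standard_normal((5, 5))
A_odd = M - M.T
assert np.isclose(np.linalg.det(A_odd), 0)

# Even order: det(A) = Pf(A)^2, with the explicit 4x4 Pfaffian
# Pf(A) = a12*a34 - a13*a24 + a14*a23.
M = rng.standard_normal((4, 4))
A = M - M.T
pf = A[0, 1] * A[2, 3] - A[0, 2] * A[1, 3] + A[0, 3] * A[1, 2]
assert np.isclose(np.linalg.det(A), pf ** 2)
```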
The number of distinct terms $s(n)$ in the expansion of the determinant of a skew-symmetric matrix of order $n$ was considered already by Cayley, Sylvester, and Pfaff. Due to cancellations, this number is quite small compared to the number of terms of a generic matrix of order $n$, which is $n!$. The sequence $s(n)$ (sequence A002370 in the OEIS) is

- 1, 0, 1, 0, 6, 0, 120, 0, 5250, 0, 395010, 0, …

and it is encoded in the exponential generating function

$$\sum_{n=0}^{\infty} \frac{s(n)}{n!} x^n = \left(1 - x^2\right)^{-\frac{1}{4}} \exp\left(\frac{x^2}{4}\right).$$
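The generating function can be checked directly against the listed terms; a short SymPy sketch (the expansion order 11 is just enough to reach $s(10)$):

```python
from sympy import Rational, exp, factorial, series, symbols

x = symbols('x')
egf = (1 - x**2) ** Rational(-1, 4) * exp(x**2 / 4)

# s(n) = n! * [x^n] EGF; this reproduces 1, 0, 1, 0, 6, 0, 120, ...
taylor = series(egf, x, 0, 11).removeO()
print([taylor.coeff(x, n) * factorial(n) for n in range(11)])
# [1, 0, 1, 0, 6, 0, 120, 0, 5250, 0, 395010]
```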
The latter yields the asymptotics (for $n$ even)

$$s(n) = \pi^{-\frac{1}{2}} 2^{\frac{3}{4}} \Gamma\left(\tfrac{3}{4}\right) \left(\frac{n}{e}\right)^{n - \frac{1}{4}} \left(1 + O\left(\frac{1}{n}\right)\right).$$
The numbers of positive and negative terms are approximately half of the total, although their difference takes larger and larger positive and negative values as $n$ increases (sequence A167029 in the OEIS).
Cross product
Three-by-three skew-symmetric matrices can be used to represent cross products as matrix multiplications. Consider vectors $\mathbf{a} = \left(a_1\ a_2\ a_3\right)^{\textsf{T}}$ and $\mathbf{b} = \left(b_1\ b_2\ b_3\right)^{\textsf{T}}$. Then, defining the matrix

$$[\mathbf{a}]_{\times} = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix},$$

the cross product can be written as

$$\mathbf{a} \times \mathbf{b} = [\mathbf{a}]_{\times} \mathbf{b}.$$

This can be immediately verified by computing both sides of the previous equation and comparing each corresponding element of the results.

One actually has

$$[\mathbf{a} \times \mathbf{b}]_{\times} = [\mathbf{a}]_{\times} [\mathbf{b}]_{\times} - [\mathbf{b}]_{\times} [\mathbf{a}]_{\times};$$
i.e., the commutator of skew-symmetric three-by-three matrices can be identified with the cross product of three-vectors. Since the skew-symmetric three-by-three matrices are the Lie algebra of the rotation group $SO(3)$, this elucidates the relation between three-space $\mathbb{R}^3$, the cross product and three-dimensional rotations. More on infinitesimal rotations can be found below.
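Both identities are easy to check numerically; a minimal NumPy sketch (the helper name `hat` is an illustrative choice, not standard notation):

```python
import numpy as np

def hat(a):
    """Return the skew-symmetric matrix [a]_x with hat(a) @ b == a x b."""
    return np.array([[    0, -a[2],  a[1]],
                     [ a[2],     0, -a[0]],
                     [-a[1],  a[0],     0]])

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# The cross product as a matrix-vector product.
assert np.allclose(hat(a) @ b, np.cross(a, b))

# The commutator identity [a x b]_x = [a]_x [b]_x - [b]_x [a]_x.
assert np.allclose(hat(np.cross(a, b)), hat(a) @ hat(b) - hat(b) @ hat(a))
```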
Spectral theory
Since a matrix is similar to its own transpose, the two must have the same eigenvalues. It follows that the eigenvalues of a skew-symmetric matrix always come in pairs $\pm\lambda$ (except in the odd-dimensional case where there is an additional unpaired 0 eigenvalue). From the spectral theorem, for a real skew-symmetric matrix the nonzero eigenvalues are all pure imaginary and thus are of the form $\lambda_1 i, -\lambda_1 i, \lambda_2 i, -\lambda_2 i, \ldots$ where each of the $\lambda_k$ are real.
Real skew-symmetric matrices are normal matrices (they commute with their adjoints) and are thus subject to the spectral theorem, which states that any real skew-symmetric matrix can be diagonalized by a unitary matrix. Since the eigenvalues of a real skew-symmetric matrix are imaginary, it is not possible to diagonalize one by a real matrix. However, it is possible to bring every skew-symmetric matrix to a block diagonal form by a special orthogonal transformation.[4][5]
Specifically, every $2n \times 2n$ real skew-symmetric matrix can be written in the form $A = Q \Sigma Q^{\textsf{T}}$, where $Q$ is orthogonal and

$$\Sigma = \begin{bmatrix} \begin{matrix} 0 & \lambda_1 \\ -\lambda_1 & 0 \end{matrix} & 0 & \cdots & 0 \\ 0 & \begin{matrix} 0 & \lambda_2 \\ -\lambda_2 & 0 \end{matrix} & & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \begin{matrix} 0 & \lambda_r \\ -\lambda_r & 0 \end{matrix} \\ & & & & \begin{matrix} 0 \\ & \ddots \\ & & 0 \end{matrix} \end{bmatrix}$$

for real positive $\lambda_k$. The nonzero eigenvalues of this matrix are $\pm\lambda_k i$. In the odd-dimensional case $\Sigma$ always has at least one row and column of zeros.
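One way to compute such a block form numerically is through the real Schur decomposition, which for a skew-symmetric input yields exactly a factorization of this type (a sketch using SciPy; this is one convenient route, not the only one):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4))
A = M - M.T                      # real skew-symmetric

# The real Schur form of a skew-symmetric matrix realizes A = Q Sigma Q^T
# with Q orthogonal and Sigma block diagonal with 2x2 blocks [[0, l], [-l, 0]].
Sigma, Q = schur(A, output='real')
assert np.allclose(Q @ Sigma @ Q.T, A)
assert np.allclose(Q @ Q.T, np.eye(4))   # Q is orthogonal
assert np.allclose(Sigma, -Sigma.T)      # Sigma is skew, hence block diagonal
```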
More generally, every complex skew-symmetric matrix can be written in the form $A = U \Sigma U^{\textsf{T}}$, where $U$ is unitary and $\Sigma$ has the block-diagonal form given above with $\lambda_k$ still real and positive. This is an example of the Youla decomposition of a complex square matrix.[6]
Skew-symmetric and alternating forms
A skew-symmetric form $\varphi$ on a vector space $V$ over a field $K$ of arbitrary characteristic is defined to be a bilinear form

$$\varphi : V \times V \to K$$

such that for all $v, w$ in $V$,

$$\varphi(v, w) = -\varphi(w, v).$$
This defines a form with desirable properties for vector spaces over fields of characteristic not equal to 2, but in a vector space over a field of characteristic 2, the definition is equivalent to that of a symmetric form, as every element is its own additive inverse.
Where the vector space $V$ is over a field of arbitrary characteristic, including characteristic 2, we may define an alternating form as a bilinear form $\varphi$ such that for all vectors $v$ in $V$,

$$\varphi(v, v) = 0.$$
This is equivalent to a skew-symmetric form when the field is not of characteristic 2, as seen from

$$0 = \varphi(v + w, v + w) = \varphi(v, v) + \varphi(v, w) + \varphi(w, v) + \varphi(w, w) = \varphi(v, w) + \varphi(w, v),$$

whence

$$\varphi(v, w) = -\varphi(w, v).$$
A bilinear form $\varphi$ will be represented by a matrix $A$ such that $\varphi(v, w) = v^{\textsf{T}} A w$, once a basis of $V$ is chosen, and conversely an $n \times n$ matrix $A$ on $K^n$ gives rise to a form sending $(v, w)$ to $v^{\textsf{T}} A w$. For each of symmetric, skew-symmetric and alternating forms, the representing matrices are symmetric, skew-symmetric and alternating respectively.
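A small NumPy sketch of this correspondence over $\mathbb{R}$ (the representing matrix is an illustrative random choice):

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.standard_normal((3, 3))
A = M - M.T                        # a skew-symmetric representing matrix

def phi(v, w):
    """The bilinear form represented by A: phi(v, w) = v^T A w."""
    return v @ A @ w

v, w = rng.standard_normal(3), rng.standard_normal(3)
assert np.isclose(phi(v, w), -phi(w, v))   # the form is skew-symmetric
assert np.isclose(phi(v, v), 0)            # hence alternating (char 0 here)
```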
Infinitesimal rotations
Skew-symmetric matrices over the field of real numbers form the tangent space to the real orthogonal group $O(n)$ at the identity matrix; formally, they constitute the special orthogonal Lie algebra. In this sense, then, skew-symmetric matrices can be thought of as infinitesimal rotations.
Another way of saying this is that the space of skew-symmetric matrices forms the Lie algebra $\mathfrak{o}(n)$ of the Lie group $O(n)$. The Lie bracket on this space is given by the commutator:

$$[A, B] = AB - BA.$$

It is easy to check that the commutator of two skew-symmetric matrices is again skew-symmetric:

$$\begin{aligned} [A, B]^{\textsf{T}} &= B^{\textsf{T}} A^{\textsf{T}} - A^{\textsf{T}} B^{\textsf{T}} \\ &= (-B)(-A) - (-A)(-B) = BA - AB = -[A, B]. \end{aligned}$$
The matrix exponential of a skew-symmetric matrix $A$ is then an orthogonal matrix $R$:

$$R = \exp(A) = \sum_{n=0}^{\infty} \frac{A^n}{n!}.$$
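This can be observed directly (a sketch using SciPy's matrix exponential and an illustrative random generator):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(6)
M = rng.standard_normal((3, 3))
A = M - M.T                      # skew-symmetric

R = expm(A)                      # matrix exponential

assert np.allclose(R @ R.T, np.eye(3))    # R is orthogonal
assert np.isclose(np.linalg.det(R), 1.0)  # with determinant +1
```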
The image of the exponential map of a Lie algebra always lies in the connected component of the Lie group that contains the identity element. In the case of the Lie group $O(n)$, this connected component is the special orthogonal group $SO(n)$, consisting of all orthogonal matrices with determinant 1. So $R = \exp(A)$ will have determinant +1. Moreover, since the exponential map of a connected compact Lie group is always surjective, it turns out that every orthogonal matrix with unit determinant can be written as the exponential of some skew-symmetric matrix. In the particular important case of dimension $n = 2$, the exponential representation for an orthogonal matrix reduces to the well-known polar form of a complex number of unit modulus. Indeed, if $n = 2$, a special orthogonal matrix has the form

$$\begin{bmatrix} a & -b \\ b & a \end{bmatrix},$$
with $a^2 + b^2 = 1$. Therefore, putting $a = \cos\theta$ and $b = \sin\theta$, it can be written

$$\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} = \exp\left(\theta \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}\right),$$

which corresponds exactly to the polar form $\cos\theta + i\sin\theta = e^{i\theta}$ of a complex number of unit modulus.
The exponential representation of an orthogonal matrix of order $n$ can also be obtained starting from the fact that in dimension $n$ any special orthogonal matrix $R$ can be written as $R = Q S Q^{\textsf{T}}$, where $Q$ is orthogonal and $S$ is a block diagonal matrix with $\lfloor n/2 \rfloor$ blocks of order 2, plus one of order 1 if $n$ is odd; since each single block of order 2 is also an orthogonal matrix, it admits an exponential form. Correspondingly, the matrix $S$ can be written as the exponential of a skew-symmetric block matrix $\Sigma$ of the form above, $S = \exp(\Sigma)$, so that $R = Q \exp(\Sigma) Q^{\textsf{T}} = \exp\left(Q \Sigma Q^{\textsf{T}}\right)$, the exponential of the skew-symmetric matrix $Q \Sigma Q^{\textsf{T}}$.
Conversely, the surjectivity of the exponential map, together with the above-mentioned block-diagonalization for skew-symmetric matrices, implies the block-diagonalization for orthogonal matrices.
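Conversely, the skew-symmetric generator of a given rotation can be recovered with a matrix logarithm; a sketch using SciPy (`logm` computes a principal logarithm, and the example rotation is an illustrative choice):

```python
import numpy as np
from scipy.linalg import expm, logm

# A rotation by 0.7 rad about the z-axis: a special orthogonal matrix.
t = 0.7
R = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])

# A principal matrix logarithm of R is skew-symmetric, and exp recovers R.
A = np.real(logm(R))             # discard any numerical imaginary residue
assert np.allclose(A, -A.T)
assert np.allclose(expm(A), R)
```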
Coordinate-free
More intrinsically (i.e., without using coordinates), skew-symmetric linear transformations on a vector space $V$ with an inner product may be defined as the bivectors on the space, which are sums of simple bivectors (2-blades) $v \wedge w$. The correspondence is given by the map $v \wedge w \mapsto v^{*} \otimes w - w^{*} \otimes v$, where $v^{*}$ is the covector dual to the vector $v$; in orthonormal coordinates these are exactly the elementary skew-symmetric matrices. This characterization is used in interpreting the curl of a vector field (naturally a 2-vector) as an infinitesimal rotation or "curl", hence the name.
Skew-symmetrizable matrix
An $n \times n$ matrix $A$ is said to be skew-symmetrizable if there exists an invertible diagonal matrix $D$ such that $DA$ is skew-symmetric. For real $n \times n$ matrices, sometimes the condition for $D$ to have positive entries is added.[7]
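A minimal example (the matrices are illustrative; note that $D$ here also has positive entries):

```python
import numpy as np

# A is not skew-symmetric, but D @ A is, for the invertible diagonal D:
# A is therefore skew-symmetrizable.
A = np.array([[ 0.0, 2.0],
              [-1.0, 0.0]])
D = np.diag([1.0, 2.0])

assert not np.allclose(A, -A.T)  # A itself is not skew-symmetric
DA = D @ A
assert np.allclose(DA, -DA.T)    # but D A is
```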
References
- ^ Richard A. Reyment; K. G. Joreskog; Leslie F. Marcus (1996). Applied Factor Analysis in the Natural Sciences. Cambridge University Press. p. 68. ISBN 0-521-57556-7.
- ^ Lipschutz, Seymour; Lipson, Marc (September 2005). Schaum's Outline of Theory and Problems of Linear Algebra. McGraw-Hill. ISBN 9780070605022.
- ^ Cayley, Arthur (1847). "Sur les determinants gauches" [On skew determinants]. Crelle's Journal. 38: 93–96. Reprinted in Cayley, A. (2009). "Sur les Determinants Gauches". The Collected Mathematical Papers. Vol. 1. pp. 410–413. doi:10.1017/CBO9780511703676.070. ISBN 978-0-511-70367-6.
- ^ Voronov, Theodore. Pfaffian, in: Concise Encyclopedia of Supersymmetry and Noncommutative Structures in Mathematics and Physics, Eds. S. Duplij, W. Siegel, J. Bagger (Berlin, New York: Springer 2005), p. 298.
- ^ Zumino, Bruno (1962). "Normal Forms of Complex Matrices". Journal of Mathematical Physics. 3 (5): 1055–1057. Bibcode:1962JMP.....3.1055Z. doi:10.1063/1.1724294.
- ^ Youla, D. C. (1961). "A normal form for a matrix under the unitary congruence group". Can. J. Math. 13: 694–704. doi:10.4153/CJM-1961-059-8.
- ^ Fomin, Sergey; Zelevinsky, Andrei (2001). "Cluster algebras I: Foundations". arXiv:math/0104151v1.