In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally,
A is symmetric ⟺ A = Aᵀ.
Because equal matrices have equal dimensions, only square matrices can be symmetric.
The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if aᵢⱼ denotes the entry in the i-th row and j-th column, then
A is symmetric ⟺ aⱼᵢ = aᵢⱼ for all indices i and j.
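The definition can be checked directly by comparing a matrix with its transpose. A minimal NumPy sketch (the matrix here is illustrative):

```python
import numpy as np

# A small symmetric matrix: entries mirror across the main diagonal,
# so a[j, i] == a[i, j] for all indices i, j.
A = np.array([[1, 7, 3],
              [7, 4, 5],
              [3, 5, 2]])

# A matrix is symmetric iff it equals its transpose.
print(np.array_equal(A, A.T))  # True
```

For floating-point matrices, `np.allclose(A, A.T)` is the more robust comparison, since exact equality can fail after rounding.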
Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative: aᵢᵢ = −aᵢᵢ implies 2aᵢᵢ = 0, hence aᵢᵢ = 0.
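The skew-symmetric case can be illustrated numerically; the matrix below is an arbitrary example chosen to satisfy M = −Mᵀ:

```python
import numpy as np

# A skew-symmetric matrix satisfies M = -M.T; as the text argues,
# its diagonal entries must all be zero (each is its own negative).
M = np.array([[ 0,  2, -1],
              [-2,  0,  4],
              [ 1, -4,  0]])

print(np.array_equal(M, -M.T))  # True: M is skew-symmetric
print(np.diag(M))               # [0 0 0]
```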
In linear algebra, a real symmetric matrix represents a self-adjoint operator expressed in an orthonormal basis over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, the term symmetric matrix is often assumed to refer to one with real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.
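The distinction between symmetric and Hermitian can be made concrete; a short NumPy sketch with an illustrative complex matrix:

```python
import numpy as np

# A Hermitian matrix equals its conjugate transpose (H = H.conj().T).
# Its diagonal entries are necessarily real.
H = np.array([[2 + 0j, 1 - 1j],
              [1 + 1j, 3 + 0j]])

print(np.array_equal(H, H.conj().T))  # True: H is Hermitian
print(np.array_equal(H, H.T))         # False: H is not symmetric
```

This is why, over the complex numbers, the Hermitian condition (rather than plain symmetry) is the one that corresponds to self-adjointness.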