# Orthogonal matrix

An orthogonal matrix is a real square matrix whose columns and rows are orthogonal unit vectors (i.e., orthonormal vectors); equivalently, a square matrix Q is orthogonal when Q^T Q = Q Q^T = I, where I is the identity matrix.
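As a quick numerical illustration (a minimal sketch using NumPy; the 2-D rotation is an illustrative choice, not part of the source text), the defining identity Q^T Q = Q Q^T = I can be checked directly:

```python
import numpy as np

# A plane rotation is a standard example of an orthogonal matrix.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Defining property: the transpose acts as the inverse.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
print(np.allclose(Q @ Q.T, np.eye(2)))  # True
```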

## Related articles

### Determinant

The determinant of any orthogonal matrix is either +1 or −1.

Special types of matrices have special determinants; for example, the determinant of an orthogonal matrix is always plus or minus one, and the determinant of a complex Hermitian matrix is always real.
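The ±1 restriction on the determinant can be verified numerically; in this NumPy sketch the particular rotation and reflection matrices are illustrative choices:

```python
import numpy as np

theta = 1.2
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])   # orthogonal, det +1
reflection = np.array([[1.0, 0.0],
                       [0.0, -1.0]])                     # orthogonal, det -1

print(round(np.linalg.det(rotation)))    # 1
print(round(np.linalg.det(reflection)))  # -1
```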

### Orthogonal group

The set of n × n orthogonal matrices forms a group under matrix multiplication, known as the orthogonal group O(n); an orthogonal matrix is a real matrix whose inverse equals its transpose.

### Transpose

This leads to the equivalent characterization: a matrix Q is orthogonal if its transpose is equal to its inverse, Q^T = Q^{-1}.

A square matrix whose transpose is equal to its inverse is called an orthogonal matrix; that is, A is orthogonal if A^T = A^{-1}.

### Unitary matrix

The real analogue of a unitary matrix is an orthogonal matrix.

### Point group

For example, the point group of a molecule is a subgroup of O(3).

Point groups can be realized as sets of orthogonal matrices M that transform a point x into a point y = Mx.

### QR decomposition

Because floating point versions of orthogonal matrices have advantageous properties, they are key to many algorithms in numerical linear algebra, such as QR decomposition.

In linear algebra, a QR decomposition, also known as a QR factorization or QU factorization, is a decomposition of a matrix A into a product A = QR of an orthogonal matrix Q and an upper triangular matrix R.
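A hedged sketch of computing a QR decomposition with NumPy's `numpy.linalg.qr` (the random test matrix is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# Reduced QR: Q is 4x3 with orthonormal columns, R is 3x3 upper triangular.
Q, R = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(3)))  # True: columns of Q are orthonormal
print(np.allclose(np.triu(R), R))       # True: R is upper triangular
print(np.allclose(Q @ R, A))            # True: the product reconstructs A
```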

### Normal matrix

Likewise, among real matrices, all orthogonal, symmetric, and skew-symmetric matrices are normal.

### Reflection (mathematics)

As a linear transformation, an orthogonal matrix preserves the dot product of vectors, and therefore acts as an isometry of Euclidean space, such as a rotation, reflection or rotoreflection.

The matrix for a reflection is orthogonal with determinant −1 and eigenvalues −1, 1, 1, ..., 1.
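These reflection properties can be checked on a Householder reflection, I − 2vvᵀ for a unit vector v (a NumPy sketch; the particular v is an illustrative choice):

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
v = v / np.linalg.norm(v)          # unit normal of the mirror plane
H = np.eye(3) - 2.0 * np.outer(v, v)  # Householder reflection across that plane

print(np.allclose(H.T @ H, np.eye(3)))  # True: H is orthogonal
print(round(np.linalg.det(H)))          # -1
print(np.sort(np.linalg.eigvalsh(H)))   # eigenvalues -1, 1, 1
```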

### Permutation matrix

A permutation matrix has a single 1 in each row and each column (and 0 elsewhere); for example, the matrix that exchanges the coordinates x and y is a permutation matrix.

As permutation matrices are orthogonal matrices (P^T P = I), the inverse matrix exists and can be written as P^{-1} = P^T.
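A small NumPy sketch of this fact (the particular 3-element permutation is an illustrative choice):

```python
import numpy as np

perm = [2, 0, 1]
P = np.eye(3)[perm]   # permutation matrix: rows of I reordered by perm

print(np.allclose(P @ P.T, np.eye(3)))        # True: P is orthogonal
print(np.allclose(np.linalg.inv(P), P.T))     # True: inverse equals transpose
```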

### Rotation (mathematics)

A rotation of Euclidean space can be represented by an orthogonal matrix with determinant +1 that is multiplied with column vectors.

### Eigenvalues and eigenvectors

Stronger than the determinant restriction is the fact that an orthogonal matrix can always be diagonalized over the complex numbers to exhibit a full set of eigenvalues, all of which must have (complex) modulus 1.

Around the same time, Brioschi proved that the eigenvalues of orthogonal matrices lie on the unit circle, and Clebsch found the corresponding result for skew-symmetric matrices.
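The unit-circle property of the eigenvalues can be observed numerically; in this NumPy sketch the rotation matrix is an illustrative choice (its eigenvalues are e^{±iθ}):

```python
import numpy as np

theta = 0.9
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

lam = np.linalg.eigvals(Q)            # complex eigenvalues e^{+i*theta}, e^{-i*theta}
print(np.allclose(np.abs(lam), 1.0))  # True: all moduli are 1
```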

### Involutory matrix

A reflection is its own inverse, which implies that a reflection matrix is symmetric (equal to its transpose) as well as orthogonal.

An involutory matrix which is also symmetric is an orthogonal matrix, and thus represents an isometry (a linear transformation which preserves Euclidean distance).

### Symmetric matrix

The finite-dimensional spectral theorem says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix.
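This diagonalization is what NumPy's `numpy.linalg.eigh` computes; a hedged sketch with a randomly generated symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
S = (A + A.T) / 2                 # symmetrize to get a real symmetric matrix

w, Q = np.linalg.eigh(S)          # S = Q diag(w) Q^T with Q orthogonal

print(np.allclose(Q.T @ Q, np.eye(4)))           # True: Q is orthogonal
print(np.allclose(Q @ np.diag(w) @ Q.T, S))      # True: reconstructs S
```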

### Orthonormal basis

A real square matrix is orthogonal if and only if its columns form an orthonormal basis of the Euclidean space R^n.

### Matrix exponential

Going the other direction, the matrix exponential of any skew-symmetric matrix is an orthogonal matrix (in fact, special orthogonal).
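A NumPy sketch of this fact, approximating the matrix exponential by a truncated Taylor series (the 2×2 skew-symmetric matrix and the 20-term cutoff are illustrative choices):

```python
import math
import numpy as np

theta = 0.5
A = np.array([[0.0, -theta],
              [theta, 0.0]])      # skew-symmetric: A^T = -A

# Truncated Taylor series exp(A) ~ sum A^k / k!
expA = sum(np.linalg.matrix_power(A, k) / math.factorial(k) for k in range(20))

print(np.allclose(expA.T @ expA, np.eye(2)))  # True: exp(A) is orthogonal
print(round(np.linalg.det(expA)))             # 1: in fact special orthogonal
```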

### Diagonalizable matrix

A normal matrix can be diagonalized as A = P D P^{-1} where P is a unitary matrix (for a real symmetric A, an orthogonal matrix) and P^{-1} equals the conjugate transpose of P (if real, then the transpose of P).

### Improper rotation

An indirect isometry is an affine transformation with an orthogonal matrix that has a determinant of −1.

### Singular value decomposition

Singular value decomposition: M = U Σ V^T, where U and V are unitary and Σ is a diagonal matrix with nonnegative entries (over R, unitary matrices are orthogonal matrices).
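A NumPy sketch of the real SVD, where both factors come out orthogonal (the random matrix is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))

U, s, Vt = np.linalg.svd(M)        # M = U diag(s) V^T

print(np.allclose(U.T @ U, np.eye(3)))       # True: U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(3)))     # True: V is orthogonal
print(np.allclose(U @ np.diag(s) @ Vt, M))   # True: reconstructs M
```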

### Semidirect product

The Euclidean group E(n) is a semidirect product of the translation group and the orthogonal group O(n).

### Matrix decomposition

A number of important matrix decompositions involve orthogonal matrices, including especially:

Similarly, the QR decomposition expresses A as QR with Q an orthogonal matrix and R an upper triangular matrix.

### Discrete cosine transform

As another example, with appropriate normalization the discrete cosine transform (used in MP3 compression) is represented by an orthogonal matrix.

This makes the DCT-I matrix orthogonal if one further multiplies by an overall scale factor, but it breaks the direct correspondence with a real-even DFT.
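A hedged NumPy sketch of the orthogonality of a DCT matrix, here using the orthonormal scaling of the DCT-II (the variant mentioned above for MP3 compression; the size N = 8 is an illustrative choice):

```python
import numpy as np

N = 8
n = np.arange(N)

# Orthonormal DCT-II matrix: C[k, n] = sqrt(2/N) * cos(pi * k * (n + 1/2) / N),
# with the k = 0 row scaled by a further 1/sqrt(2).
C = np.sqrt(2.0 / N) * np.cos(np.pi * np.outer(n, n + 0.5) / N)
C[0] /= np.sqrt(2.0)

print(np.allclose(C @ C.T, np.eye(N)))  # True: the DCT-II matrix is orthogonal
```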

### Eigendecomposition of a matrix

Eigendecomposition of a symmetric matrix (decomposition according to the spectral theorem): S = Q Λ Q^T, where Q is an orthogonal matrix whose columns are the eigenvectors of S and Λ is the diagonal matrix of its eigenvalues.

### Gram–Schmidt process

The Gram–Schmidt process could orthogonalize the columns, but it is not the most reliable, the most efficient, or the most invariant method.

The application of the Gram–Schmidt process to the column vectors of a full column rank matrix yields the QR decomposition (it is decomposed into an orthogonal and a triangular matrix).
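A minimal sketch of classical Gram–Schmidt in NumPy (an illustrative implementation, not a numerically robust one; in practice one would prefer modified Gram–Schmidt or a QR routine):

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt: orthonormalize the columns of A."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            # Subtract the projection of column j onto earlier basis vectors.
            v = v - (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(4)))  # True: the result is orthogonal
```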

### Orthogonal Procrustes problem

The problem of finding the orthogonal matrix Q nearest a given matrix M is related to the Orthogonal Procrustes problem.

In its classical form, one is given two matrices A and B and asked to find an orthogonal matrix R which most closely maps A to B.
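The classical solution is R = U V^T from the SVD of B A^T; a NumPy sketch (the random data and the exactly-rotated B are illustrative choices so the known rotation can be recovered):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 5))

# Construct B as an exactly rotated copy of A, using a random orthogonal R_true.
R_true, _ = np.linalg.qr(rng.standard_normal((3, 3)))
B = R_true @ A

# Orthogonal Procrustes: the R minimizing ||R A - B||_F is U V^T
# where B A^T = U S V^T is a singular value decomposition.
U, _, Vt = np.linalg.svd(B @ A.T)
R = U @ Vt

print(np.allclose(R.T @ R, np.eye(3)))  # True: R is orthogonal
print(np.allclose(R @ A, B))            # True: R maps A exactly onto B here
```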

### Skew-symmetric matrix

In Lie group terms, this means that the Lie algebra of an orthogonal matrix group consists of skew-symmetric matrices.

The matrix exponential of a skew-symmetric matrix A is then an orthogonal matrix R = exp(A).