# Matrix (mathematics)

**matrix, matrices, matrix theory, submatrix, matrix algebra, array, matrix operation, matrix equation, real matrix, submatrices**


## Related Articles

### Rotation matrix

**rotation matrices, infinitesimal rotation, rotations**

For example, the rotation of vectors in three-dimensional space is a linear transformation, which can be represented by a rotation matrix R: if v is a column vector (a matrix with only one column) describing the position of a point in space, the product Rv is a column vector describing the position of that point after a rotation.

In linear algebra, a rotation matrix is a matrix that is used to perform a rotation in Euclidean space.
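The 2-D case makes the vector-rotation idea concrete. A minimal Python sketch (plain tuples for column vectors; `rotate2d` is an illustrative name, not from the article):

```python
import math

def rotate2d(v, theta):
    """Apply the 2-D rotation matrix R(theta) to a column vector v = (x, y).

    R = [[cos t, -sin t],
         [sin t,  cos t]]
    """
    c, s = math.cos(theta), math.sin(theta)
    x, y = v
    return (c * x - s * y, s * x + c * y)
```

Rotating the point (1, 0) by 90° carries it to (0, 1), as the matrix-vector product Rv predicts.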

### Irregular matrix

**irregular matrices, ragged matrix**

In mathematics, a matrix (plural matrices) is a rectangular array (see irregular matrix) of numbers, symbols, or expressions, arranged in rows and columns.

An irregular matrix, or ragged matrix, is a matrix that has a different number of elements in each row.
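In a language with nested lists, a ragged matrix is simply a list of rows of unequal length; a small Python sketch (variable names are illustrative):

```python
# A ragged (irregular) "matrix": each row may have a different length.
ragged = [
    [1, 2, 3],
    [4, 5],
    [6, 7, 8, 9],
]
row_lengths = [len(row) for row in ragged]
```

Because the rows differ in length, such an array is not a matrix in the strict rectangular sense, and ordinary matrix operations are not defined on it.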

### Transformation matrix

**transformation matrices, matrix transformation, 3D vertex transformation**

The product of two transformation matrices is a matrix that represents the composition of two transformations.

In linear algebra, linear transformations can be represented by matrices.
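Composition-as-multiplication can be checked directly with a small Python sketch over lists of rows (`matmul` and the example transformations are illustrative, not from the article):

```python
def matmul(A, B):
    """Product of two matrices given as lists of rows."""
    assert len(A[0]) == len(B), "inner dimensions must agree"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

# Applying `scale` and then `reflect` equals applying the single
# matrix reflect · scale.
scale = [[2, 0], [0, 2]]           # scale the plane by 2
reflect = [[1, 0], [0, -1]]        # reflect across the x-axis
composed = matmul(reflect, scale)  # [[2, 0], [0, -2]]
```

Note the order: the matrix applied second appears on the left of the product.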

### Matrix addition

**direct sum, added, addition**

Provided that they have the same size (each matrix has the same number of rows and the same number of columns as the other), two matrices can be added or subtracted element by element (see Conformable matrix).

In mathematics, matrix addition is the operation of adding two matrices by adding the corresponding entries together.
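Entrywise addition is straightforward to sketch in Python over lists of rows (`mat_add` is an illustrative name):

```python
def mat_add(A, B):
    """Entrywise sum of two same-size matrices (lists of rows)."""
    assert len(A) == len(B) and all(len(a) == len(b) for a, b in zip(A, B)), \
        "matrices must have the same dimensions"
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]
```

The dimension check enforces the conformability requirement stated above.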

### Matrix calculus

**matrix derivative, derivative, derivative of a scalar with respect to a vector**

Matrix calculus generalizes classical analytical notions such as derivatives and exponentials to higher dimensions.

In mathematics, matrix calculus is a specialized notation for doing multivariable calculus, especially over spaces of matrices.

### Diagonal matrix

**diagonal, diagonal matrices, scalar matrices**

Algorithms that are tailored to particular matrix structures, such as sparse matrices and near-diagonal matrices, speed up computations in the finite element method and other applications.

In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices.
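Building a diagonal matrix from its diagonal entries is a one-liner in Python (`diag` is an illustrative name):

```python
def diag(entries):
    """Square diagonal matrix with the given entries on the main diagonal;
    every off-diagonal entry is zero."""
    n = len(entries)
    return [[entries[i] if i == j else 0 for j in range(n)] for i in range(n)]
```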

### Conformable matrix

**conformable, conformably**

Provided that they have the same size (each matrix has the same number of rows and the same number of columns as the other), two matrices can be added or subtracted element by element (see Conformable matrix).

In mathematics, a matrix is conformable if its dimensions are suitable for defining some operation (e.g. addition, multiplication, etc.).
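The conformability condition for multiplication reduces to one dimension check; a Python sketch (`conformable_for_product` is an illustrative name):

```python
def conformable_for_product(A, B):
    """True if the product A·B is defined: the number of columns of A
    must equal the number of rows of B."""
    return len(A[0]) == len(B)
```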

### Scalar (mathematics)

**scalar, scalars, base field**

Any matrix can be multiplied element-wise by a scalar from its associated field.

The term is also sometimes used informally to mean a vector, matrix, tensor, or other usually "compound" value that is actually reduced to a single component.
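Scalar multiplication scales every entry of the matrix; a minimal Python sketch (`scalar_mul` is an illustrative name):

```python
def scalar_mul(c, A):
    """Multiply every entry of matrix A (a list of rows) by the scalar c."""
    return [[c * x for x in row] for row in A]
```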

### Mathematics

**mathematical, math, mathematician**

In mathematics, a matrix (plural matrices) is a rectangular array (see irregular matrix) of numbers, symbols, or expressions, arranged in rows and columns.

Perhaps the foremost mathematician of the 19th century was the German mathematician Carl Friedrich Gauss, who made numerous contributions to fields such as algebra, analysis, differential geometry, matrix theory, number theory, and statistics.

### Square matrix

**square matrices, square, matrix**

A matrix with the same number of rows and columns is called a square matrix.

In mathematics, a square matrix is a matrix with the same number of rows and columns.

### Sparse matrix

**sparse, sparse matrices, sparsity**

Algorithms that are tailored to particular matrix structures, such as sparse matrices and near-diagonal matrices, speed up computations in the finite element method and other applications.

In numerical analysis and scientific computing, a sparse matrix or sparse array is a matrix in which most of the elements are zero.
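One common way to exploit sparsity is to store only the nonzero entries, for example in a dictionary keyed by position (a "dictionary of keys" layout; `to_sparse` is an illustrative name):

```python
def to_sparse(A):
    """Dictionary-of-keys representation: map (row, col) -> value
    for the nonzero entries only."""
    return {(i, j): x
            for i, row in enumerate(A)
            for j, x in enumerate(row) if x != 0}

dense = [[0, 0, 3],
         [0, 0, 0],
         [4, 0, 0]]
sparse = to_sparse(dense)  # only 2 of the 9 entries are stored
```

For matrices that are mostly zero, this representation can reduce storage from O(mn) to O(number of nonzeros).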

### Derivative

**differentiation, differentiable, first derivative**

Matrix calculus generalizes classical analytical notions such as derivatives and exponentials to higher dimensions.

The Jacobian matrix is the matrix that represents this linear transformation with respect to the basis given by the choice of independent and dependent variables.

### Row and column vectors

**column vector, row vector, vector**

For example, the rotation of vectors in three-dimensional space is a linear transformation, which can be represented by a rotation matrix R: if v is a column vector (a matrix with only one column) describing the position of a point in space, the product Rv is a column vector describing the position of that point after a rotation.

In linear algebra, a column vector or column matrix is an m × 1 matrix, that is, a matrix consisting of a single column of m elements.

### Transpose

**matrix transpose, transposition, matrix transposition**

Equivalently, a matrix A is orthogonal if its transpose is equal to its inverse: Aᵀ = A⁻¹, or equivalently AᵀA = AAᵀ = I.
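Both the transpose and the orthogonality test can be sketched in Python over lists of rows (`transpose` and `is_orthogonal` are illustrative names):

```python
def transpose(A):
    """Rows become columns: transpose(A)[j][i] == A[i][j]."""
    return [list(col) for col in zip(*A)]

def is_orthogonal(A, tol=1e-9):
    """A square matrix A is orthogonal when A^T · A is the identity matrix."""
    n = len(A)
    At = transpose(A)
    prod = [[sum(At[i][k] * A[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]
    return all(abs(prod[i][j] - (1 if i == j else 0)) < tol
               for i in range(n) for j in range(n))
```

A 90° rotation matrix passes the test, since rotations preserve lengths and angles.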

### Transformation (function)

**transformation, transformations, mathematical transformation**

The product of two transformation matrices is a matrix that represents the composition of two transformations.

They are also operations that can be performed using linear algebra, and described explicitly using matrices.

### Hadamard product (matrices)

**Hadamard product, Schur product, element-by-element multiplication**

Besides the ordinary matrix multiplication just described, there exist other less frequently used operations on matrices that can be considered forms of multiplication, such as the Hadamard product and the Kronecker product.

In mathematics, the Hadamard product (also known as the Schur product or the entrywise product) is a binary operation that takes two matrices of the same dimensions and produces another matrix of the same dimension as the operands, where each element i, j is the product of the elements i, j of the original two matrices.
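Unlike ordinary matrix multiplication, the Hadamard product is defined entry by entry; a Python sketch (`hadamard` is an illustrative name):

```python
def hadamard(A, B):
    """Entrywise (Hadamard/Schur) product of two same-size matrices."""
    assert len(A) == len(B) and all(len(a) == len(b) for a, b in zip(A, B)), \
        "matrices must have the same dimensions"
    return [[a * b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]
```

Note that, unlike the ordinary product, the Hadamard product is commutative.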

### Rank (linear algebra)

**rank, column rank, full rank**

The rank of a matrix A is the maximum number of linearly independent row vectors of the matrix, which is the same as the maximum number of linearly independent column vectors.

In linear algebra, the rank of a matrix A is the dimension of the vector space generated (or spanned) by its columns.

### Kronecker product

**Kronecker sum, Kronecker multiplication, Kronecker power**

Besides the ordinary matrix multiplication just described, there exist other less frequently used operations on matrices that can be considered forms of multiplication, such as the Hadamard product and the Kronecker product.

In mathematics, the Kronecker product, denoted by ⊗, is an operation on two matrices of arbitrary size resulting in a block matrix.
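The block structure of A ⊗ B — each entry aᵢⱼ of A scaling a full copy of B — can be written out directly in Python (`kron` is an illustrative name):

```python
def kron(A, B):
    """Kronecker product A ⊗ B: the (m·p) × (n·q) block matrix in which
    block (i, j) is A[i][j] times a copy of B."""
    m, n = len(A), len(A[0])
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q]
             for j in range(n * q)]
            for i in range(m * p)]
```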

### Determinant

**determinants, det, matrix determinant**

If the matrix is square, it is possible to deduce some of its properties by computing its determinant. The minors and cofactors of a matrix are found by computing the determinant of certain submatrices.

The following identity holds for the Schur complement of the block A in a square block matrix M = [[A, B], [C, D]] with A invertible: det(M) = det(A) · det(D − CA⁻¹B).
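The determinant can be computed by Laplace (cofactor) expansion along the first row, which builds it from determinants of submatrices. A minimal Python sketch over lists of rows (exponential-time, suitable only for small matrices; `det` is an illustrative name):

```python
def det(A):
    """Determinant by Laplace (cofactor) expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total
```

For a diagonal matrix the expansion collapses to the product of the diagonal entries.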

### Minor (linear algebra)

**minors, minor, principal minor**

The minors and cofactors of a matrix are found by computing the determinant of certain submatrices.

In linear algebra, a minor of a matrix A is the determinant of some smaller square matrix, cut down from A by removing one or more of its rows and columns.
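Cutting down a matrix by one row and one column is a small slicing exercise in Python; the determinant of the result is the corresponding minor (`minor_matrix` is an illustrative name):

```python
def minor_matrix(A, i, j):
    """Submatrix of A with row i and column j removed; its determinant
    is the (i, j) minor of A."""
    return [row[:j] + row[j + 1:] for r, row in enumerate(A) if r != i]
```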

### 2 × 2 real matrices

**2 × 2 real matrix, 2 × 2 real matrices with determinant 1**

The following table shows a number of 2-by-2 matrices with the associated linear maps of R².

In mathematics, the associative algebra of 2 × 2 real matrices is denoted by M(2, R).

### Index notation

**index, indices, indexed terms**

Each element of a matrix is often denoted by a variable with two subscripts.

Special (and more familiar) cases are vectors (1d arrays) and matrices (2d arrays).

### Sylvester equation

**Sylvester, Sylvester matrix equation**

They arise in solving matrix equations such as the Sylvester equation.

In mathematics, in the field of control theory, a Sylvester equation is a matrix equation of the form AX + XB = C, where A, B, and C are given matrices and X is the unknown.

### Conjugate transpose

**Hermitian transpose, Hermitian transposition, adjoint matrix**

In complex matrices, symmetry is often replaced by the concept of Hermitian matrices, which satisfy A∗ = A, where the star or asterisk denotes the conjugate transpose of the matrix, that is, the transpose of the complex conjugate of A.

In mathematics, the conjugate transpose or Hermitian transpose of an m-by-n matrix with complex entries is the n-by-m matrix obtained from by taking the transpose and then taking the complex conjugate of each entry.
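The two-step definition — transpose, then conjugate each entry — can be written in one comprehension in Python, using the built-in `.conjugate()` method of complex numbers (`conjugate_transpose` is an illustrative name):

```python
def conjugate_transpose(A):
    """Hermitian (conjugate) transpose of an m-by-n complex matrix:
    transpose, then take the complex conjugate of each entry."""
    return [[A[i][j].conjugate() for i in range(len(A))]
            for j in range(len(A[0]))]
```

A matrix equal to its own conjugate transpose is Hermitian, the complex analogue of a symmetric real matrix.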

### System of linear equations

**systems of linear equations, linear system of equations, simultaneous linear equations**

Another application of matrices is in the solution of systems of linear equations.

The vector equation is equivalent to a matrix equation of the form Ax = b, where A is the coefficient matrix, x the vector of unknowns, and b the vector of constants.
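A standard way to solve such a system is Gaussian elimination; a Python sketch assuming a square, invertible coefficient matrix (`solve` is an illustrative name):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.
    Assumes A is square and invertible."""
    n = len(A)
    # Augmented matrix [A | b] with float entries.
    M = [[float(x) for x in row] + [float(bi)] for row, bi in zip(A, b)]
    for col in range(n):
        # Partial pivoting: bring the largest entry in this column to the top.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

For example, the system 2x + y = 3, x + 3y = 5 has the unique solution x = 4/5, y = 7/5.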