Linear algebra

Linear algebra is the branch of mathematics concerning linear equations such as a_1 x_1 + ... + a_n x_n = b, linear maps such as (x_1, ..., x_n) -> a_1 x_1 + ... + a_n x_n, and their representations through matrices and vector spaces.

Vector space

Vector spaces are the subject of linear algebra and are well characterized by their dimension, which, roughly speaking, specifies the number of independent directions in the space.

Gaussian elimination

The procedure for solving simultaneous linear equations now called Gaussian elimination appears in the ancient Chinese mathematical text Chapter Eight: Rectangular Arrays of The Nine Chapters on the Mathematical Art.
Gaussian elimination, also known as row reduction, is an algorithm in linear algebra for solving a system of linear equations.
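As a concrete sketch of the procedure, here is a minimal plain-Python solver: forward elimination with partial pivoting followed by back substitution. The function name solve and the use of exact rational arithmetic (fractions.Fraction) are illustrative choices, not part of any particular library.

```python
from fractions import Fraction

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Work on an exact augmented matrix [A | b].
    M = [[Fraction(x) for x in row] + [Fraction(bi)] for row, bi in zip(A, b)]
    for col in range(n):
        # Partial pivoting: bring the row with the largest entry in this
        # column to the top of the remaining block.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        if M[piv][col] == 0:
            raise ValueError("matrix is singular")
        M[col], M[piv] = M[piv], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    # Back substitution on the resulting triangular system.
    x = [Fraction(0)] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

For example, solve([[2, 1], [1, 3]], [5, 10]) returns the exact solution x = 1, y = 3 of the system 2x + y = 5, x + 3y = 10.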


Determinant

The first systematic methods for solving linear systems used determinants, first considered by Leibniz in 1693.
In linear algebra, the determinant is a scalar value that can be computed from the elements of a square matrix and encodes certain properties of the linear transformation described by the matrix.
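The determinant can be computed directly from its definition by cofactor (Laplace) expansion along the first row, sketched below in plain Python. This recursion takes factorial time and is only meant to mirror the definition; in practice determinants are computed by row reduction.

```python
def det(M):
    """Determinant of a square matrix, by cofactor expansion along row 0."""
    if len(M) == 1:
        return M[0][0]
    # Sum of (-1)^j * M[0][j] * det(minor obtained by deleting row 0, column j).
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))
```

For example, det([[1, 2], [3, 4]]) gives 1*4 - 2*3 = -2.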

Cramer's rule

In 1750, Gabriel Cramer used them for giving explicit solutions of linear systems, now called Cramer's rule.
In linear algebra, Cramer's rule is an explicit formula for the solution of a system of linear equations with as many equations as unknowns, valid whenever the system has a unique solution.
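The rule states that x_i = det(A_i) / det(A), where A_i is A with column i replaced by the right-hand side b. A minimal self-contained sketch in plain Python (inefficient for large systems, but a direct transcription of the formula):

```python
from fractions import Fraction

def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cramer(A, b):
    """Solve A x = b via Cramer's rule: x_i = det(A_i) / det(A)."""
    d = Fraction(det(A))
    if d == 0:
        raise ValueError("system does not have a unique solution")
    # A_i is A with column i replaced by b.
    return [det([row[:i] + [bi] + row[i + 1:] for row, bi in zip(A, b)]) / d
            for i in range(len(A))]
```

For the system x + y = 3, x - y = 1, cramer([[1, 1], [1, -1]], [3, 1]) returns the unique solution x = 2, y = 1.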

Functional analysis

Also, functional analysis may be basically viewed as the application of linear algebra to spaces of functions.
In contrast, linear algebra deals mostly with finite-dimensional spaces, and does not use topology.

Matrix multiplication

Arthur Cayley introduced matrix multiplication and the inverse matrix in 1856, making possible the general linear group.
Matrix multiplication is thus a basic tool of linear algebra, and as such has numerous applications in many areas of mathematics, as well as in applied mathematics, statistics, physics, economics, and engineering.
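The product of an m x n matrix A and an n x p matrix B is the m x p matrix with entries C[i][j] = sum over k of A[i][k] * B[k][j]. A minimal triple-loop sketch in plain Python:

```python
def matmul(A, B):
    """Matrix product: C[i][j] = sum_k A[i][k] * B[k][j]."""
    assert len(A[0]) == len(B), "inner dimensions must agree"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
# Matrix multiplication is generally not commutative:
assert matmul(A, B) != matmul(B, A)
```

Note that, unlike multiplication of numbers, the matrix product is not commutative in general, as the assertion above illustrates.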

Differential geometry

The telegraph required an explanatory system, and the 1873 publication of A Treatise on Electricity and Magnetism instituted a field theory of forces and required differential geometry for expression.
Differential geometry is a mathematical discipline that uses the techniques of differential calculus, integral calculus, linear algebra and multilinear algebra to study problems in geometry.

Group representation

The mechanism of group representation became available for describing complex and hypercomplex numbers.
Representations of groups are important because they allow many group-theoretic problems to be reduced to problems in linear algebra, which is well understood.

Field (mathematics)

A vector space over a field F is a set equipped with two operations, vector addition and scalar multiplication, satisfying the vector-space axioms.
Most importantly for algebraic purposes, any field may be used as the scalars for a vector space, which is the standard general context for linear algebra.

Scalar multiplication

The second operation, scalar multiplication, takes any scalar a and any vector v and produces the scaled vector av.
In mathematics, scalar multiplication is one of the basic operations defining a vector space in linear algebra (or more generally, a module in abstract algebra ).
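On coordinate vectors, scalar multiplication simply multiplies every component by the scalar. The following plain-Python sketch (the function name scale is illustrative) also checks two of the axioms it must satisfy in a vector space:

```python
from fractions import Fraction

def scale(a, v):
    """Scalar multiplication: multiply every component of v by the scalar a."""
    return [a * x for x in v]

v = [Fraction(1), Fraction(-2), Fraction(3)]
# Two of the vector-space axioms, checked on this example:
assert scale(2, scale(3, v)) == scale(6, v)                              # a(bv) = (ab)v
assert scale(5, v) == [x + y for x, y in zip(scale(2, v), scale(3, v))]  # (a+b)v = av + bv
```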

Hermann Grassmann

In 1844 Hermann Grassmann published his "Theory of Extension" which included foundational new topics of what is today called linear algebra.
This essay, first published in the Collected Works of 1894–1911, contains the first known appearance of what is now called linear algebra and the notion of a vector space.

Abstract algebra

Linear algebra took its modern form in the first half of the twentieth century, when many ideas and methods of previous centuries were generalized as abstract algebra.

Linear subspace

These subsets are called linear subspaces.
In mathematics, and more specifically in linear algebra, a linear subspace, also known as a vector subspace, is a vector space that is a subset of some larger vector space.

Kernel (linear algebra)

An essential question in linear algebra is testing whether a linear map is an isomorphism or not, and, if it is not an isomorphism, finding its range (or image) and the set of elements that are mapped to the zero vector, called the kernel of the map.
In mathematics, and more specifically in linear algebra and functional analysis, the kernel (also known as null space or nullspace) of a linear map L : V → W between two vector spaces V and W, is the set of all elements v of V for which L(v) = 0, where 0 denotes the zero vector in W.
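For a linear map given by a matrix, a basis of the kernel can be read off from the reduced row echelon form: each free (non-pivot) column contributes one basis vector. The following is a minimal plain-Python sketch using exact rational arithmetic; the function name null_space is an illustrative choice.

```python
from fractions import Fraction

def null_space(A):
    """Return a basis (list of vectors) of the kernel of the matrix A."""
    m, n = len(A), len(A[0])
    R = [[Fraction(x) for x in row] for row in A]
    pivots, r = [], 0
    # Bring R into reduced row echelon form.
    for c in range(n):
        piv = next((i for i in range(r, m) if R[i][c] != 0), None)
        if piv is None:
            continue                      # no pivot in this column: it is free
        R[r], R[piv] = R[piv], R[r]
        pv = R[r][c]
        R[r] = [x / pv for x in R[r]]     # normalize the pivot row
        for i in range(m):
            if i != r and R[i][c] != 0:   # clear the column above and below
                f = R[i][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
        if r == m:
            break
    # One basis vector per free column: set that variable to 1, solve for pivots.
    basis = []
    for fcol in (c for c in range(n) if c not in pivots):
        v = [Fraction(0)] * n
        v[fcol] = Fraction(1)
        for i, p in enumerate(pivots):
            v[p] = -R[i][fcol]
        basis.append(v)
    return basis
```

For A = [[1, 2, 3], [2, 4, 6]], the second row is twice the first, so the kernel is two-dimensional; null_space returns the basis vectors (-2, 1, 0) and (-3, 0, 1), and one can check that A maps each of them to the zero vector.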

Linear span

The set of all linear combinations of vectors of S with coefficients in F forms a linear subspace called the span of S. The span of S is also the intersection of all linear subspaces containing S; in other words, it is the smallest (for the inclusion relation) linear subspace containing S.
In linear algebra, the linear span (also called the linear hull or just span) of a set S of vectors in a vector space is the smallest linear subspace that contains the set.

Function (mathematics)

Elements of a vector space may have various nature; for example, they can be sequences, functions, polynomials or matrices.
For example, in linear algebra and functional analysis, linear forms and the vectors they act upon are denoted using a dual pair to show the underlying duality.

Map (mathematics)

Linear maps are mappings between vector spaces that preserve the vector-space structure.
For instance, a "map" is a continuous function in topology, a linear transformation in linear algebra, etc.

Linear combination

Another important way of forming a subspace is to consider linear combinations of a set S of vectors: the set of all sums a_1 v_1 + a_2 v_2 + ... + a_k v_k, where v_1, ..., v_k are in S and a_1, ..., a_k are scalars.
The concept of linear combinations is central to linear algebra and related fields of mathematics.
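Componentwise, forming a linear combination is just scaling and adding; a minimal plain-Python sketch (the function name is illustrative):

```python
def linear_combination(coeffs, vectors):
    """Form a_1*v_1 + ... + a_k*v_k componentwise."""
    n = len(vectors[0])
    return [sum(a * v[i] for a, v in zip(coeffs, vectors)) for i in range(n)]

# (3, 1) expressed as a combination of (1, 1) and (1, -1):
assert linear_combination([2, 1], [[1, 1], [1, -1]]) == [3, 1]
```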

Charles Sanders Peirce

Benjamin Peirce published his Linear Associative Algebra (1872), and his son Charles Sanders Peirce extended the work later.
He also worked on linear algebra, matrices, various geometries, topology and Listing numbers, Bell numbers, graphs, the four-color problem, and the nature of continuity.

Mathematical model

Linear algebra is also used in most sciences and engineering areas, because it allows modeling many natural phenomena, and efficiently computing with such models.
For example, economists often apply linear algebra when using input-output models.

Coordinate vector

This isomorphism allows representing a vector by its inverse image under this isomorphism, that is, by the coordinate vector, or by the column matrix of its coordinates.
In linear algebra, a coordinate vector is a representation of a vector as an ordered list of numbers that describes the vector in terms of a particular ordered basis.
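Finding the coordinate vector amounts to solving a linear system whose columns are the basis vectors. For the two-dimensional case this can be done with the closed-form 2x2 inverse; a plain-Python sketch (the function name coords_2d is illustrative):

```python
from fractions import Fraction

def coords_2d(basis, v):
    """Coordinates of v in an ordered 2-D basis, via the 2x2 inverse formula."""
    (a, c), (b, d) = basis            # basis vectors are the columns of [[a, b], [c, d]]
    det = Fraction(a * d - b * c)
    if det == 0:
        raise ValueError("the given vectors do not form a basis")
    x, y = v
    # [[a, b], [c, d]]^-1 = (1/det) * [[d, -b], [-c, a]]
    return [(d * x - b * y) / det, (a * y - c * x) / det]
```

For the basis (1, 1), (1, -1), the vector (3, 1) has coordinate vector (2, 1), since 2*(1, 1) + 1*(1, -1) = (3, 1).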

Row and column vectors

This isomorphism allows representing a vector by its inverse image under this isomorphism, that is, by the coordinate vector, or by the column matrix of its coordinates.
In linear algebra, a column vector or column matrix is an m × 1 matrix, that is, a matrix consisting of a single column of m elements.


Manifold

Linear algebra is flat differential geometry and serves in tangent spaces to manifolds.
The study of manifolds combines many important areas of mathematics: it generalizes concepts such as curves and surfaces as well as ideas from linear algebra and topology.

Matrix similarity

Two matrices that encode the same linear transformation in different bases are called similar.
In linear algebra, two n-by-n matrices A and B are called similar if there exists an invertible n-by-n matrix P such that B = P^-1 A P.
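Since B = P^-1 A P is equivalent to P B = A P, similarity can be verified without computing an inverse. A plain-Python sketch with hand-picked illustrative matrices: here A is similar to the diagonal matrix B, with P as the change-of-basis matrix.

```python
def matmul(X, Y):
    """Matrix product of two conformable matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2, 1], [0, 3]]   # upper-triangular matrix with eigenvalues 2 and 3
P = [[1, 1], [0, 1]]   # invertible change-of-basis matrix (det = 1)
B = [[2, 0], [0, 3]]   # claimed similar (here: diagonalized) matrix
# B = P^-1 A P  is equivalent to  P B = A P, which avoids computing P^-1:
assert matmul(P, B) == matmul(A, P)
```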


Another example of an algebraic theory is linear algebra, which is the general study of vector spaces, whose elements, called vectors, have both quantity and direction, and can be used to model (relations between) points in space.