# Sparse dictionary learning

**dictionary learning · dictionary matrix · Mini-batch dictionary learning · overcomplete dictionary**

Sparse dictionary learning is a representation learning method which aims at finding a sparse representation of the input data (also known as sparse coding) in the form of a linear combination of basic elements as well as those basic elements themselves.

## 47 Related Articles

### Feature learning

**representation learning · efficient codings · efficient data codings**

Sparse dictionary learning is a representation learning method which aims at finding a sparse representation of the input data (also known as sparse coding) in the form of a linear combination of basic elements as well as those basic elements themselves.

### K-SVD

**K-SVD algorithm**

K-SVD is an algorithm that performs a singular value decomposition at its core to update the atoms of the dictionary one by one; it can be seen as a generalization of k-means.

In applied mathematics, K-SVD is a dictionary learning algorithm for creating a dictionary for sparse representations, via a singular value decomposition approach.
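As a sketch of that alternation (not the full algorithm from the literature), the two K-SVD phases can be written in a few lines of NumPy: sparse-code every sample against the current dictionary, then refit each atom and its coefficients from a rank-1 SVD of the residual with that atom's contribution removed. All names here are illustrative, and the inner coder is a crude orthogonal-matching-pursuit stand-in.

```python
import numpy as np

def omp(x, D, s):
    """Crude orthogonal matching pursuit: greedily pick up to s atoms,
    refitting coefficients by least squares after each pick."""
    idx = []
    residual = x.astype(float).copy()
    for _ in range(s):
        k = int(np.argmax(np.abs(D.T @ residual)))  # most correlated atom
        if k not in idx:
            idx.append(k)
        coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)
        residual = x - D[:, idx] @ coef
    r = np.zeros(D.shape[1])
    r[idx] = coef
    return r

def ksvd(X, n_atoms, s, n_iter=15, seed=0):
    """Toy K-SVD: alternate sparse coding with per-atom SVD updates."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)                  # unit-norm atoms
    R = np.zeros((n_atoms, X.shape[1]))
    for _ in range(n_iter):
        # Sparse coding stage: code each column of X independently.
        R = np.column_stack([omp(X[:, i], D, s) for i in range(X.shape[1])])
        # Dictionary update stage: revise one atom at a time.
        for k in range(n_atoms):
            users = np.nonzero(R[k])[0]             # samples using atom k
            if users.size == 0:
                continue
            # Error on those samples with atom k's contribution removed.
            E = X[:, users] - D @ R[:, users] + np.outer(D[:, k], R[k, users])
            U, Sv, Vt = np.linalg.svd(E, full_matrices=False)
            D[:, k] = U[:, 0]                       # best rank-1 atom
            R[k, users] = Sv[0] * Vt[0]             # matching coefficients
    return D, R
```

On exactly sparse, noiseless synthetic data this toy version typically drives the reconstruction error close to zero within a handful of iterations; a production implementation would also handle unused or duplicate atoms.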

### K-means clustering

**k-means · k-means clustering · k-means algorithm**

K-SVD is an algorithm that performs a singular value decomposition at its core to update the atoms of the dictionary one by one; it can be seen as a generalization of k-means.

k-means clustering has been used as a feature learning (or dictionary learning) step, in either (semi-)supervised learning or unsupervised learning.

### Matching pursuit

**Orthogonal matching pursuit**

A number of algorithms have been developed to solve it (such as matching pursuit and LASSO), and they are incorporated into the algorithms described below.

Matching pursuit (MP) is also used in dictionary learning.
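A minimal matching pursuit sketch, assuming the atoms are stacked as the columns of a NumPy array (the function name and defaults are illustrative):

```python
import numpy as np

def matching_pursuit(x, D, n_iter=10, tol=1e-6):
    """Greedy matching pursuit: approximate x as a sparse linear
    combination of the columns (atoms) of D."""
    D = D / np.linalg.norm(D, axis=0)   # unit-norm atoms
    coeffs = np.zeros(D.shape[1])
    residual = x.astype(float).copy()
    for _ in range(n_iter):
        # Pick the atom most correlated with the current residual.
        correlations = D.T @ residual
        k = int(np.argmax(np.abs(correlations)))
        coeffs[k] += correlations[k]
        residual = residual - correlations[k] * D[:, k]
        if np.linalg.norm(residual) < tol:
            break
    return coeffs, residual
```

With an orthonormal dictionary the greedy picks are exact, so a signal built from two atoms is recovered in two iterations; with a general overcomplete dictionary MP only approximates the best sparse coding.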

### Sparse approximation

**sparse representations · projected gradient descent · Sparse Coding**

The problem of finding an optimal sparse coding **R** for a given dictionary **D** is known as sparse approximation (or sometimes just the sparse coding problem).

These problems are typically accompanied by a dictionary learning mechanism that aims to fit **D** so that the model best matches the given data.

### Online machine learning

**online learning · on-line learning · online**

Such cases lie in the field of study of online learning, which essentially suggests iteratively updating the model as new data points **x** become available.
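One online update for a single new sample can be sketched as: sparse-code the sample against the current dictionary, then take one gradient step on the dictionary. This is a toy stand-in, not a published mini-batch algorithm; the function name, the greedy coder, and the step size are all illustrative.

```python
import numpy as np

def online_dictionary_step(D, x, s=2, lr=0.1):
    """One online update: sparse-code the new sample x against the
    current dictionary D, then take a gradient step on D."""
    # Greedy coding: select the s atoms most correlated with x,
    # then refit their coefficients by least squares.
    idx = np.argsort(-np.abs(D.T @ x))[:s]
    coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)
    r = np.zeros(D.shape[1])
    r[idx] = coef
    # Gradient of ||x - D r||^2 with respect to D is -2 (x - D r) r^T,
    # so a descent step adds a multiple of the outer product below.
    residual = x - D @ r
    D_new = D + lr * np.outer(residual, r)
    D_new /= np.linalg.norm(D_new, axis=0)  # keep atoms unit-norm
    return D_new, r
```

For real workloads, scikit-learn ships a mini-batch variant of this idea as `sklearn.decomposition.MiniBatchDictionaryLearning`.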

### Neural coding

**sparse coding · neural code · rate coding**

Other models are based on matching pursuit, a sparse approximation algorithm which finds the "best matching" projections of multidimensional data, and dictionary learning, a representation learning method which aims to find a sparse matrix representation of the input data in the form of a linear combination of basic elements as well as those basic elements themselves.

### Orthogonal basis

**orthogonal · orthogonal bases · orthogonal basis set**

Atoms in the dictionary are not required to be orthogonal, and they may be an over-complete spanning set.

### Sparse matrix

**sparse · sparse matrices · sparsity**

Sparse dictionary learning is a representation learning method which aims at finding a sparse representation of the input data (also known as sparse coding) in the form of a linear combination of basic elements as well as those basic elements themselves.

### Compressed sensing

**compressive sensing · compressed sensing techniques · Compressed-Sensing**

One of the most important applications of sparse dictionary learning is in the field of compressed sensing or signal recovery.

### Detection theory

**signal detection theory · signal detection · signal recovery**

One of the most important applications of sparse dictionary learning is in the field of compressed sensing or signal recovery.

### Wavelet transform

**wavelet compression · wavelet transforms · wavelet**

Since not all signals satisfy this sparsity condition, it is of great importance to find a sparse representation of that signal, such as the wavelet transform or the directional gradient of a rasterized matrix. Before this approach, the general practice was to use predefined dictionaries (such as Fourier or wavelet transforms).

### Basis pursuit

Once a matrix or a high-dimensional vector is transferred to a sparse space, different recovery algorithms such as basis pursuit, CoSaMP, or fast non-iterative algorithms can be used to recover the signal.
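Basis pursuit itself is a linear program: minimize the L₁-norm of **x** subject to A**x** = b. A standard reduction splits **x** into nonnegative parts and hands the result to an LP solver; this sketch uses SciPy's `linprog`, with illustrative names throughout.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Basis pursuit: min ||x||_1 subject to A x = b, solved as a
    linear program by splitting x = u - v with u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)            # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([A, -A])     # constraint: A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v
```

Under standard conditions on A (e.g., random Gaussian measurements with enough rows), the minimizer coincides with the sparsest feasible solution; in general the LP only guarantees feasibility and minimal L₁-norm.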

### Signal processing

**signal analysis · signal · signal processor**

The emergence of sparse dictionary learning methods was stimulated by the fact that in signal processing one typically wants to represent the input data using as few components as possible.

### Fourier transform

**continuous Fourier transform · Fourier · Fourier transforms**

Before this approach, the general practice was to use predefined dictionaries (such as Fourier or wavelet transforms).

### Noise reduction

**denoising · image denoising · audio noise reduction**

However, in certain cases a dictionary that is trained to fit the input data can significantly improve the sparsity, which has applications in data decomposition, compression, and analysis, and has been used in the fields of image denoising and classification, and video and audio processing.

### Computer vision

**vision · image classification · Image recognition**

However, in certain cases a dictionary that is trained to fit the input data can significantly improve the sparsity, which has applications in data decomposition, compression, and analysis, and has been used in the fields of image denoising and classification, and video and audio processing.

### Audio signal processing

**audio processor · audio processing · sound processing**

However, in certain cases a dictionary that is trained to fit the input data can significantly improve the sparsity, which has applications in data decomposition, compression, and analysis, and has been used in the fields of image denoising and classification, and video and audio processing.

### Optimization problem

**optimal solution · optimization · objective function**

This can be formulated as the following optimization problem:
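The formulation itself does not survive in this excerpt; the standard objective, with data matrix X, dictionary **D**, representation **R** with columns r_i, and sparsity level T₀ (symbols consistent with those used elsewhere on this page), is typically stated as:

```latex
\min_{\mathbf{D}, \mathbf{R}} \; \| X - \mathbf{D}\mathbf{R} \|_F^2
\quad \text{subject to} \quad \| r_i \|_0 \le T_0 \;\; \text{for all } i
```

The ℓ₀ constraint is what makes the problem non-convex, as discussed under *Lp space* below; relaxing it to an L₁ penalty yields the convex-in-each-variable form discussed under *Convex optimization*.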

### Lp space

**L^p space · L^p spaces**

The minimization problem above is not convex because of the ℓ₀-"norm", and solving this problem is NP-hard.

### Taxicab geometry

**Manhattan distance · L1 norm · taxicab metric**

In some cases the L₁-norm is known to ensure sparsity, so the above becomes a convex optimization problem with respect to each of the variables **D** and **R** when the other one is fixed, but it is not jointly convex in (**D**, **R**).

### Convex optimization

**convex minimization · convex · convex programming**

In some cases the L₁-norm is known to ensure sparsity, so the above becomes a convex optimization problem with respect to each of the variables **D** and **R** when the other one is fixed, but it is not jointly convex in (**D**, **R**).

### Dimensionality reduction

**dimension reduction · reduce the dimensionality · dimensional reduction**

This case is strongly related to dimensionality reduction and techniques like principal component analysis, which require the atoms d₁, …, dₙ to be orthogonal.

### Principal component analysis

**principal components analysis · PCA · principal components**

This case is strongly related to dimensionality reduction and techniques like principal component analysis, which require the atoms d₁, …, dₙ to be orthogonal.

### Basis (linear algebra)

**basis · basis vector · bases**

Overcomplete dictionaries, however, do not require the atoms to be orthogonal (they will never form a basis anyway), thus allowing for more flexible dictionaries and richer data representations.