Probability density function

probability density, density function, density, PDF, joint probability density function, probability densities, distribution, probability density function (PDF), densities, density distribution
In probability theory, a probability density function (PDF), or density of a continuous random variable, is a function whose value at any given sample (or point) in the sample space (the set of possible values taken by the random variable) can be interpreted as providing a relative likelihood that the value of the random variable would equal that sample.
Related Articles

Random variable

random variables, random variation, random
In a more precise sense, the PDF is used to specify the probability of the random variable falling within a particular range of values, as opposed to taking on any one value.
Random variables can be discrete, that is, taking any of a specified finite or countable list of values, endowed with a probability mass function characteristic of the random variable's probability distribution; or continuous, taking any numerical value in an interval or collection of intervals, via a probability density function that is characteristic of the random variable's probability distribution; or a mixture of both types.

Cumulative distribution function

distribution function, CDF, cumulative probability distribution function
In other sources, "probability distribution function" may be used when the probability distribution is defined as a function over general sets of values, or it may refer to the cumulative distribution function, or it may be a probability mass function (PMF) rather than the density. A distribution has a density function if and only if its cumulative distribution function F(x) is absolutely continuous.
In the case of a continuous distribution, the cumulative distribution function gives the area under the probability density function from minus infinity to x. Cumulative distribution functions are also used to specify the distribution of multivariate random variables.
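To make the relationship concrete, here is a minimal numerical sketch (not from the source article; the standard normal density and the integration grid are assumptions chosen for illustration) that approximates a CDF by accumulating the area under a PDF:

    import numpy as np

    # Standard normal density, used here only as a concrete example.
    def pdf(x):
        return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

    # Grid covering effectively all of the probability mass.
    x = np.linspace(-8.0, 8.0, 20001)
    dx = x[1] - x[0]

    # F(x) is approximated by the running sum of f(t) * dt (left Riemann sum).
    cdf = np.cumsum(pdf(x)) * dx

    # The area from minus infinity to 0 should be about 0.5.
    print(cdf[np.searchsorted(x, 0.0)])   # ~0.5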

Probability distribution

distribution, continuous probability distribution, discrete probability distribution
On the other hand, a continuous probability distribution (applicable to scenarios where the set of possible outcomes can take values in a continuous range, e.g. the real numbers, such as the temperature on a given day) is typically described by probability density functions (with the probability of any individual outcome actually being 0). The normal distribution is a commonly encountered continuous probability distribution.

Integral

integration, integral calculus, definite integral
This probability is given by the integral of this variable’s PDF over that range—that is, it is given by the area under the density function but above the horizontal axis and between the lowest and greatest values of the range.
Moreover, the integral under an entire probability density function must equal 1, which provides a test of whether a function with no negative values could be a density function or not.
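As an illustrative sketch of that test (the candidate functions and the finite integration range are assumptions made for the example), one can integrate a non-negative candidate numerically and check whether the total area equals 1:

    import numpy as np

    x = np.linspace(0.0, 50.0, 200001)
    dx = x[1] - x[0]

    f = np.exp(-x)          # candidate 1: exp(-x) on [0, infinity)
    g = np.exp(-x**2)       # candidate 2: exp(-x**2) on [0, infinity)

    print(np.sum(f) * dx)   # ~1.0: can serve as a probability density
    print(np.sum(g) * dx)   # ~0.886 (sqrt(pi)/2): not a density without rescaling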

Probability mass function

mass function, probability mass, mass
A probability mass function differs from a probability density function (pdf) in that the latter is associated with continuous rather than discrete random variables; the values of the probability density function are not probabilities as such: a pdf must be integrated over an interval to yield a probability.

Uniform distribution (continuous)

uniform distribution, uniform, uniformly distributed
Unlike a probability, a probability density function can take on values greater than one; for example, the uniform distribution on the interval [0, ½] has probability density f(x) = 2 for 0 ≤ x ≤ ½ and f(x) = 0 elsewhere.
The probability density function of the continuous uniform distribution on an interval [a, b] is f(x) = 1/(b − a) for a ≤ x ≤ b and f(x) = 0 elsewhere.
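A quick numerical check of the [0, ½] example above (illustrative code, not part of the article): the density takes the value 2, yet the total area under it is still 1, and probabilities come from areas rather than from the density values themselves.

    import numpy as np

    x = np.linspace(0.0, 0.5, 100001)
    dx = x[1] - x[0]
    f = np.full_like(x, 2.0)           # f(x) = 2 on [0, 1/2], 0 elsewhere

    print(np.max(f))                   # 2.0: density values may exceed 1
    print(np.sum(f) * dx)              # ~1.0: the total area is still 1
    print(np.sum(f[x <= 0.25]) * dx)   # ~0.5: P(0 <= X <= 1/4)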

Expected value

expectation, expected, mean
If a random variable X is given and its distribution admits a probability density function f, then the expected value of X (if the expected value exists) can be calculated as E[X] = ∫ x f(x) dx, with the integral taken over the whole real line.
The same principle applies to an absolutely continuous random variable, except that an integral of the variable with respect to its probability density replaces the sum.
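As a sketch of that calculation (the Exponential(1) density is an assumption chosen because its mean is known to be exactly 1), the expectation is obtained by integrating x weighted by the density:

    import numpy as np

    x = np.linspace(0.0, 60.0, 600001)
    dx = x[1] - x[0]
    f = np.exp(-x)                 # Exponential(1) density; true mean is 1

    expected_value = np.sum(x * f) * dx
    print(expected_value)          # ~1.0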

Cantor distribution

Cantor
Not every probability distribution has a density function: the distributions of discrete random variables do not; nor does the Cantor distribution, even though it has no discrete component, i.e., does not assign positive probability to any individual point.
This distribution has neither a probability density function nor a probability mass function, since although its cumulative distribution function is a continuous function, the distribution is not absolutely continuous with respect to Lebesgue measure, nor does it have any point-masses.

Radon–Nikodym theorem

Radon–Nikodym derivative, density
A random variable X with values in a measurable space (usually Rⁿ with the Borel sets as measurable subsets) has as probability distribution the pushforward measure X∗P on that space; the density of X with respect to a reference measure μ on that space is the Radon–Nikodym derivative f = d(X∗P)/dμ.
Specifically, the probability density function of a random variable is the Radon–Nikodym derivative of the induced measure with respect to some base measure (usually the Lebesgue measure for continuous random variables).

Variance

sample variance, population variance, variability
For instance, the above expression allows for determining statistical characteristics of such a discrete variable (such as its mean, its variance and its kurtosis), starting from the formulas given for a continuous distribution of the probability.
If the random variable X represents samples generated by a continuous distribution with probability density function f(x), and F(x) is the corresponding cumulative distribution function, then the population variance is given by Var(X) = ∫ (x − μ)² f(x) dx, where μ = ∫ x f(x) dx is the population mean and the integrals run over the whole real line.
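A numerical sketch of that formula (again using the Exponential(1) density as an assumed example, for which both the mean and the variance equal 1):

    import numpy as np

    x = np.linspace(0.0, 60.0, 600001)
    dx = x[1] - x[0]
    f = np.exp(-x)                        # Exponential(1): mean 1, variance 1

    mu = np.sum(x * f) * dx               # population mean from the density
    var = np.sum((x - mu) ** 2 * f) * dx  # population variance from the density
    print(mu, var)                        # ~1.0 ~1.0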

Kernel (statistics)

kernel, kernels, kernel estimation
From the perspective of a given distribution, the parameters are constants, and terms in a density function that contain only parameters, but not variables, are part of the normalization factor of a distribution (the multiplicative factor that ensures that the area under the density—the probability of something in the domain occurring— equals 1). This normalization factor is outside the kernel of the distribution.
In statistics, especially in Bayesian statistics, the kernel of a probability density function (pdf) or probability mass function (pmf) is the form of the pdf or pmf in which any factors that are not functions of any of the variables in the domain are omitted.

Normalizing constant

normalized, normalization constant, normalization factor
In probability theory, a normalizing constant is a constant by which an everywhere non-negative function must be multiplied so the area under its graph is 1, e.g., to make it a probability density function or a probability mass function.
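A brief sketch tying the kernel and the normalizing constant together (the Gaussian kernel and the parameter values are assumptions for the example): integrate the unnormalized kernel numerically, and the reciprocal of that area reproduces the closed-form constant 1/(σ√(2π)).

    import numpy as np

    mu, sigma = 1.0, 2.0
    x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 400001)
    dx = x[1] - x[0]

    # The kernel keeps only the factors that depend on x.
    kernel = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

    area = np.sum(kernel) * dx
    print(1.0 / area)                          # normalizing constant found numerically
    print(1.0 / (sigma * np.sqrt(2 * np.pi)))  # closed form: the two should agree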

Mean

mean value, population mean, average
For a continuous distribution, the mean is μ = ∫ x f(x) dx, where f(x) is the probability density function and the integral is taken over the whole real line.

Independence (probability theory)

independent, statistically independent, independence
Continuous random variables X_1, ..., X_n admitting a joint density are all independent from each other if and only if the joint density factorizes into the product of the individual densities: f_{X_1,...,X_n}(x_1, ..., x_n) = f_{X_1}(x_1) ··· f_{X_n}(x_n).
or equivalently, for two random variables X and Y, if the probability densities f_X(x) and f_Y(y) and the joint probability density f_{X,Y}(x, y) exist, then X and Y are independent if and only if f_{X,Y}(x, y) = f_X(x) f_Y(y).
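A numerical sketch of the factorization criterion (the two exponential marginals are illustrative assumptions): build the joint density as the product of the marginals and check that integrating out one variable recovers the other marginal.

    import numpy as np

    x = np.linspace(0.0, 40.0, 4001)
    y = np.linspace(0.0, 40.0, 4001)
    dy = y[1] - y[0]

    f_x = np.exp(-x)              # Exponential(1) marginal density for X
    f_y = 2.0 * np.exp(-2.0 * y)  # Exponential(2) marginal density for Y

    # Under independence the joint density is the product of the marginals.
    joint = np.outer(f_x, f_y)

    # Trapezoid-rule weights for integrating the joint density over y.
    w_y = np.full_like(y, dy)
    w_y[0] *= 0.5
    w_y[-1] *= 0.5

    marginal_x = joint @ w_y
    print(np.max(np.abs(marginal_x - f_x)))   # close to 0: the X marginal is recovered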

Absolute continuity

absolutely continuous, absolutely continuous measure, absolutely continuous function
A distribution has a density function if and only if its cumulative distribution function F(x) is absolutely continuous.
Thus, the absolutely continuous measures on R n are precisely those that have densities; as a special case, the absolutely continuous probability measures are precisely the ones that have probability density functions.

Normal distribution

normally distributed, normal, Gaussian
The standard normal distribution has probability density φ(x) = (1/√(2π)) e^(−x²/2).
More generally, the probability density of the normal distribution with mean μ and standard deviation σ is f(x) = (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)).
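A short check of the formula (illustrative; it assumes SciPy is available for the comparison): evaluate the density directly and compare it with scipy.stats.norm.pdf.

    import numpy as np
    from scipy.stats import norm

    mu, sigma = 1.5, 0.7
    x = np.array([-1.0, 0.0, 1.5, 3.0])

    # Density evaluated from the formula above.
    manual = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    library = norm.pdf(x, loc=mu, scale=sigma)

    print(np.allclose(manual, library))   # True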

Kernel density estimation

kernel, kernel density, kernel density estimate
In statistics, kernel density estimation (KDE) is a non-parametric way to estimate the probability density function of a random variable.
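A minimal from-scratch sketch of the idea (this is not SciPy's or scikit-learn's implementation; the Gaussian kernel, the bandwidth rule, and the simulated sample are all assumptions): each observation contributes a small normalized bump, and the average of the bumps estimates the density.

    import numpy as np

    rng = np.random.default_rng(0)
    sample = rng.normal(loc=0.0, scale=1.0, size=500)   # observed data

    # Silverman-style rule-of-thumb bandwidth (one common heuristic).
    h = 1.06 * sample.std(ddof=1) * sample.size ** (-1 / 5)

    def kde(x, data=sample, bandwidth=h):
        """Gaussian kernel density estimate evaluated at the points x."""
        u = (x[:, None] - data[None, :]) / bandwidth
        kernels = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
        return kernels.mean(axis=1) / bandwidth

    grid = np.linspace(-4, 4, 801)
    estimate = kde(grid)
    print(np.sum(estimate) * (grid[1] - grid[0]))   # ~1.0: the estimate integrates to 1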

Atomic orbital

orbital, orbitals, atomic orbitals
[Image caption fragment: for functions that show the probability density more directly, see the graphs of the d-orbitals in the atomic orbital article.]

Density estimation

estimating, density estimates, estimate
Density estimation is the construction of an estimate, based on observed data, of an unobservable underlying probability density function.

Law of the unconscious statistician

See Law of the unconscious statistician.
If X has a continuous distribution and one knows its probability density function f_X (but not the density of g(X)), then the expected value of g(X) is E[g(X)] = ∫ g(x) f_X(x) dx, with the integral taken over the whole real line.
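A numerical sketch of this rule (the choice g(x) = x² and the standard normal density are assumptions for illustration; for that pair the exact value is E[X²] = 1):

    import numpy as np

    def f_X(x):                      # standard normal density
        return np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

    def g(x):
        return x ** 2

    x = np.linspace(-10.0, 10.0, 200001)
    dx = x[1] - x[0]

    # E[g(X)] computed directly from the density of X,
    # without ever deriving the density of g(X).
    print(np.sum(g(x) * f_X(x)) * dx)   # ~1.0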

Kurtosis

excess kurtosis, leptokurtic, platykurtic
For instance, the above expression allows for determining statistical characteristics of such a discrete variable (such as its mean, its variance and its kurtosis), starting from the formulas given for a continuous distribution of the probability.
The probability density function is given by

Likelihood function

likelihood, log-likelihood, likelihood ratio
Let X be a random variable following an absolutely continuous probability distribution with density function f depending on a parameter \theta.
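A short sketch of how the density becomes a likelihood (the Normal(θ, 1) model, the simulated data, and the grid search are all assumptions for the example): the log-likelihood of a sample is the sum of log-densities evaluated at the observations, and maximizing it over θ recovers a value close to the sample mean.

    import numpy as np

    rng = np.random.default_rng(1)
    data = rng.normal(loc=2.0, scale=1.0, size=200)   # observations

    def log_likelihood(theta, x=data):
        """Log-likelihood of a Normal(theta, 1) model: sum of log f_theta(x_i)."""
        return np.sum(-0.5 * (x - theta) ** 2 - 0.5 * np.log(2 * np.pi))

    thetas = np.linspace(0.0, 4.0, 4001)
    values = np.array([log_likelihood(t) for t in thetas])

    print(thetas[np.argmax(values)])   # maximum-likelihood estimate, ~ data.mean()
    print(data.mean())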

Cauchy distribution

Lorentzian, Cauchy, Lorentzian profile
This is the density of a standard Cauchy distribution.
The Cauchy distribution has the probability density function (PDF) f(x; x_0, γ) = 1 / (πγ[1 + ((x − x_0)/γ)²]), where x_0 is the location parameter and γ is the scale parameter.

Dirac delta function

delta function, impulse, Dirac delta
It is possible to represent certain discrete random variables as well as random variables involving both a continuous and a discrete part with a generalized probability density function, by using the Dirac delta function.
It represents the probability density at time t = ε of the position of a particle starting at the origin following a standard Brownian motion.
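For example (an illustrative case, not taken from the article), a random variable that equals 0 with probability ½ and otherwise follows an Exponential(1) distribution has no ordinary density, but it can be described by the generalized density f(x) = ½ δ(x) + ½ e^(−x) for x ≥ 0: integrating f over a set contributes the point mass ½ whenever the set contains 0, while the continuous part supplies the remainder.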

Secondary measure

When ρ is a probability density function, a sufficient condition for a measure μ, admitting moments of every order, to be a secondary measure associated with ρ is that its Stieltjes transformation is given by an equality of a particular form.