Probability distribution

In probability theory and statistics, a probability distribution is a mathematical function that provides the probabilities of occurrence of different possible outcomes in an experiment.
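As a minimal sketch (not part of the source article), a discrete probability distribution can be represented as a mapping from outcomes to probabilities that are non-negative and sum to 1; the fair six-sided die used here is an illustrative assumption.

```python
# Minimal sketch: the probability distribution of a fair six-sided die,
# represented as a mapping from outcome to probability (values illustrative).
from fractions import Fraction

die_distribution = {face: Fraction(1, 6) for face in range(1, 7)}

# A valid distribution assigns non-negative probabilities that sum to 1.
assert all(p >= 0 for p in die_distribution.values())
assert sum(die_distribution.values()) == 1

# Probability of the event "roll is even" = sum of the probabilities of its outcomes.
p_even = sum(p for face, p in die_distribution.items() if face % 2 == 0)
print(p_even)  # 1/2
```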
Related Articles

Probability theory

Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes, which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion.

Randomness

In more technical terms, the probability distribution is a description of a random phenomenon in terms of the probabilities of events.
A random process is a sequence of random variables whose outcomes do not follow a deterministic pattern, but follow an evolution described by probability distributions.
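As an illustrative sketch of this idea (my own, not from the article), a simple symmetric random walk is a random process: each step is a random variable, and the walk's evolution is governed by the step distribution. The number of steps and the seed are arbitrary.

```python
# Sketch of a random process: a symmetric random walk whose steps are
# independent random variables taking +1 or -1 with probability 0.5 each.
import random

random.seed(0)  # arbitrary seed, for reproducibility only

position = 0
path = [position]
for _ in range(10):
    step = random.choice([-1, +1])  # every step follows the same distribution
    position += step
    path.append(position)

print(path)  # one realization of the process; another seed gives another path
```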

Random variable

For instance, if the random variable X is used to denote the outcome of a coin toss ("the experiment"), then the probability distribution of X would take the value 0.5 for X = heads and 0.5 for X = tails (assuming the coin is fair). A univariate distribution gives the probabilities of a single random variable taking on various alternative values; a multivariate distribution (a joint probability distribution) gives the probabilities of a random vector – a list of two or more random variables – taking on various combinations of values.
A random variable has a probability distribution, which specifies the probability of its values.

Statistics

Numerical descriptors include mean and standard deviation for continuous data types (like income), while frequency and percentage are more useful in terms of describing categorical data (like race).
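A small sketch of the two kinds of descriptors just mentioned, using only the Python standard library; all of the sample values below are made up for illustration.

```python
# Numerical descriptors for a numeric variable vs. frequency/percentage
# descriptors for a categorical variable (data below is illustrative only).
import statistics
from collections import Counter

incomes = [32_000, 41_500, 55_000, 47_250, 61_000]
print(statistics.mean(incomes))    # arithmetic mean
print(statistics.stdev(incomes))   # sample standard deviation

colors = ["red", "blue", "red", "green", "blue", "red"]
counts = Counter(colors)
for category, count in counts.items():
    print(category, count, f"{100 * count / len(colors):.1f}%")  # frequency and percentage
```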

Probability density function

A continuous probability distribution, applicable when the set of possible outcomes can take on values in a continuous range (e.g. real numbers), such as the temperature on a given day, is typically described by a probability density function (with the probability of any individual outcome actually being 0). The normal distribution is a commonly encountered continuous probability distribution.
In probability theory, a probability density function (PDF), or density of a continuous random variable, is a function whose value at any given sample (or point) in the sample space (the set of possible values taken by the random variable) can be interpreted as providing a relative likelihood that the value of the random variable would equal that sample.
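A brief sketch of the point made above: the density gives relative likelihood, any single point has probability 0, and probabilities come from integrating the density, here via the normal CDF available in the standard library. The parameters are illustrative.

```python
# Continuous distribution sketch: density vs. probability for a normal variable.
from statistics import NormalDist

temperature = NormalDist(mu=20.0, sigma=5.0)  # illustrative parameters

print(temperature.pdf(20.0))                           # a density (relative likelihood), not a probability
print(temperature.cdf(25.0) - temperature.cdf(25.0))   # P(X == 25.0) is 0: a degenerate interval has zero area
print(temperature.cdf(25.0) - temperature.cdf(15.0))   # P(15 <= X <= 25) ≈ 0.683
```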

Joint probability distribution

A probability distribution whose sample space is the set of real numbers is called univariate, while a distribution whose sample space is a vector space is called multivariate. A univariate distribution gives the probabilities of a single random variable taking on various alternative values; a multivariate distribution (a joint probability distribution) gives the probabilities of a random vector – a list of two or more random variables – taking on various combinations of values.
Given random variables X, Y, … that are defined on a probability space, the joint probability distribution for X, Y, … is a probability distribution that gives the probability that each of X, Y, … falls in any particular range or discrete set of values specified for that variable.
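A minimal sketch of a joint probability distribution for two discrete random variables; the table of probabilities is invented for illustration.

```python
# Joint probability distribution of two discrete random variables X and Y,
# stored as a mapping from (x, y) pairs to probabilities (numbers illustrative).
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Probability that X falls in {0} and, simultaneously, Y falls in {0, 1}.
p = sum(prob for (x, y), prob in joint.items() if x in {0} and y in {0, 1})
print(p)  # 0.30
```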

Probability mass function

A discrete probability distribution (applicable to the scenarios where the set of possible outcomes is discrete, such as a coin toss or a roll of dice) can be encoded by a discrete list of the probabilities of the outcomes, known as a probability mass function.
The probability mass function is often the primary means of defining a discrete probability distribution, and such functions exist for either scalar or multivariate random variables whose domain is discrete.
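As a sketch of a probability mass function defined by enumeration (the two-dice setup is an illustrative choice, not from the article), the pmf of the sum of two fair dice can be built from the 36 equally likely outcomes.

```python
# Probability mass function of a discrete random variable: the sum of two fair dice.
from fractions import Fraction
from itertools import product
from collections import defaultdict

pmf = defaultdict(Fraction)
for a, b in product(range(1, 7), repeat=2):
    pmf[a + b] += Fraction(1, 36)   # each ordered pair of faces is equally likely

print(pmf[7])                # 1/6, the most probable sum
print(sum(pmf.values()))     # 1, as required of a probability mass function
```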

Univariate distribution

A probability distribution whose sample space is the set of real numbers is called univariate, while a distribution whose sample space is a vector space is called multivariate.
In statistics, a univariate distribution is a probability distribution of only one random variable.

Binomial distribution

Important and commonly encountered univariate probability distributions include the binomial distribution, the hypergeometric distribution, and the normal distribution. Well-known discrete probability distributions used in statistical modeling include the Poisson distribution, the Bernoulli distribution, the binomial distribution, the geometric distribution, and the negative binomial distribution.
In probability theory and statistics, the binomial distribution with parameters n and p is the discrete probability distribution of the number of successes in a sequence of n independent experiments, each asking a yes–no question, and each with its own boolean-valued outcome: a random variable containing a single bit of information: success/yes/true/one (with probability p) or failure/no/false/zero (with probability q = 1 − p). A single success/failure experiment is also called a Bernoulli trial or Bernoulli experiment and a sequence of outcomes is called a Bernoulli process; for a single trial, i.e., n = 1, the binomial distribution is a Bernoulli distribution.
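A short sketch of the binomial pmf described above, P(k successes in n trials) = C(n, k) p^k (1 − p)^(n − k); the values of n and p are illustrative.

```python
# Binomial pmf: probability of k successes in n independent yes/no trials,
# each succeeding with probability p (n and p below are illustrative).
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_pmf(3, 10, 0.5))                           # P(exactly 3 heads in 10 fair tosses)
print(sum(binomial_pmf(k, 10, 0.5) for k in range(11)))   # ≈ 1.0 over all possible k
```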

Cumulative distribution function

The cumulative distribution function describes the probability that the random variable is no larger than a given value; the probability that the outcome lies in a given interval can be computed by taking the difference between the values of the cumulative distribution function at the endpoints of the interval.
In the case of a continuous distribution, it gives the area under the probability density function from minus infinity to x. Cumulative distribution functions are also used to specify the distribution of multivariate random variables.
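As sketched below, the interval probability F(b) − F(a) follows directly from the cumulative distribution function; the normal parameters and the interval endpoints are illustrative.

```python
# Cumulative distribution function: P(a < X <= b) = F(b) - F(a).
from statistics import NormalDist

X = NormalDist(mu=0.0, sigma=1.0)
a, b = -1.0, 2.0                  # illustrative interval endpoints
print(X.cdf(b) - X.cdf(a))        # probability that X lands in (a, b]
print(X.cdf(b))                   # area under the density from minus infinity to b
```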

Categorical distribution

Categorical distribution: for discrete random variables with a finite set of values.
In probability theory and statistics, a categorical distribution (also called a generalized Bernoulli distribution or multinoulli distribution) is a discrete probability distribution that describes the possible results of a random variable that can take on one of K possible categories, with the probability of each category separately specified.
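A small sketch of a categorical distribution over K = 3 categories, with each category's probability specified separately and sampling done with the standard library; the labels and probabilities are made up.

```python
# Categorical distribution over K = 3 categories with separately specified
# probabilities (labels and numbers are illustrative).
import random

categories = ["red", "green", "blue"]
probabilities = [0.5, 0.3, 0.2]
assert abs(sum(probabilities) - 1.0) < 1e-12

random.seed(1)  # arbitrary seed, for reproducibility only
sample = random.choices(categories, weights=probabilities, k=10)
print(sample)
```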

Median

Median: the value such that the set of values less than the median, and the set greater than the median, each have probabilities no greater than one-half.
The median is the value separating the higher half from the lower half of a data sample (a population or a probability distribution).
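A brief sketch contrasting the median of a data sample with the median of a distribution (for a symmetric normal distribution it coincides with the mean); the values are illustrative.

```python
# Median of a data sample vs. median of a probability distribution.
import statistics
from statistics import NormalDist

sample = [3, 1, 4, 1, 5, 9, 2]
print(statistics.median(sample))    # separates the lower and upper halves of the sample

X = NormalDist(mu=10.0, sigma=2.0)  # illustrative parameters
print(X.inv_cdf(0.5))               # the distribution's median; 10.0 for this symmetric case
```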

Heavy-tailed distribution

Head: the range of values where the pmf or pdf is relatively high.
In probability theory, heavy-tailed distributions are probability distributions whose tails are not exponentially bounded: that is, they have heavier tails than the exponential distribution.
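To make "heavier tails than the exponential distribution" concrete, the sketch below compares tail probabilities P(X > x) of a standard exponential with those of a Pareto distribution, a classic heavy-tailed example; the Pareto parameters are assumptions for illustration.

```python
# Tail probabilities P(X > x): exponential(rate 1) vs. Pareto(x_m = 1, alpha = 2).
# The Pareto tail x**(-alpha) decays polynomially, i.e. slower than exp(-x).
from math import exp

for x in [2, 5, 10, 20]:
    exponential_tail = exp(-x)
    pareto_tail = x ** -2.0
    print(x, exponential_tail, pareto_tail)  # the Pareto tail dominates for large x
```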

Statistical dispersion

Variance: the second moment of the pmf or pdf about the mean; an important measure of the dispersion of the distribution.
In statistics, dispersion (also called variability, scatter, or spread) is the extent to which a distribution is stretched or squeezed.
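A small sketch of the variance as the second moment of a pmf about its mean, computed directly from the distribution; the fair-die probabilities are the same illustrative assumption used earlier.

```python
# Variance as the second moment of a pmf about its mean (fair-die example).
die_pmf = {face: 1 / 6 for face in range(1, 7)}

mean = sum(x * p for x, p in die_pmf.items())
variance = sum(p * (x - mean) ** 2 for x, p in die_pmf.items())
std_dev = variance ** 0.5  # standard deviation, another measure of dispersion

print(mean, variance, std_dev)  # 3.5, ~2.917, ~1.708
```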

Symmetric probability distribution

Symmetry: a property of some distributions in which the portion of the distribution to the left of a specific value is a mirror image of the portion to its right.
In statistics, a symmetric probability distribution is a probability distribution—an assignment of probabilities to possible occurrences—which is unchanged when its probability density function or probability mass function is reflected around a vertical line at some value of the random variable represented by the distribution.

Skewness

Skewness: a measure of the extent to which a pmf or pdf "leans" to one side of its mean; it is the third standardized moment of the distribution.
In probability theory and statistics, skewness is a measure of the asymmetry of the probability distribution of a real-valued random variable about its mean.
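A short sketch computing the third standardized moment for a sample; the data are made up, and a sample with a long right tail gives a positive value.

```python
# Skewness as the third standardized moment E[(X - mu)^3] / sigma^3,
# computed for a made-up sample (population moments used throughout).
import statistics

data = [1, 2, 2, 3, 3, 3, 4, 10]   # illustrative sample with a long right tail
mu = statistics.mean(data)
sigma = statistics.pstdev(data)     # population standard deviation

skewness = sum((x - mu) ** 3 for x in data) / (len(data) * sigma ** 3)
print(skewness)  # positive, i.e. the long tail is on the right
```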

Event (probability theory)

In more technical terms, the probability distribution is a description of a random phenomenon in terms of the probabilities of events.
For many standard probability distributions, such as the normal distribution, the sample space is the set of real numbers or some subset of the real numbers.

Expected value

Expected value or mean: the weighted average of the possible values, using their probabilities as their weights; or the continuous analog thereof.
The expected value does not exist for random variables having some distributions with large "tails", such as the Cauchy distribution.
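A minimal sketch of the expected value as the probability-weighted average of the possible values, reusing the illustrative fair-die pmf from above.

```python
# Expected value: the weighted average of a discrete variable's possible values,
# with the probabilities as the weights (fair-die example).
die_pmf = {face: 1 / 6 for face in range(1, 7)}
expected_value = sum(x * p for x, p in die_pmf.items())
print(expected_value)  # 3.5
```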

Standard deviation

Standard deviation: the square root of the variance, and hence another measure of dispersion.
The standard deviation of a random variable, statistical population, data set, or probability distribution is the square root of its variance.

Kurtosis

Kurtosis: a measure of the "fatness" of the tails of a pmf or pdf; it is the fourth standardized moment of the distribution.
In probability theory and statistics, kurtosis (from κυρτός, kyrtos or kurtos, meaning "curved, arching") is a measure of the "tailedness" of the probability distribution of a real-valued random variable.
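A companion sketch to the skewness example above, computing the fourth standardized moment for the same made-up sample; subtracting 3 gives the excess kurtosis relative to the normal distribution.

```python
# Kurtosis as the fourth standardized moment E[(X - mu)^4] / sigma^4 (made-up data).
import statistics

data = [1, 2, 2, 3, 3, 3, 4, 10]
mu = statistics.mean(data)
sigma = statistics.pstdev(data)

kurtosis = sum((x - mu) ** 4 for x in data) / (len(data) * sigma ** 4)
print(kurtosis, kurtosis - 3)  # raw kurtosis and excess kurtosis (a normal has excess 0)
```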

Mode (statistics)

Mode: for a discrete random variable, the value with highest probability (the location at which the probability mass function has its peak); for a continuous random variable, a location at which the probability density function has a local peak.
The mode is not necessarily unique to a given discrete distribution, since the probability mass function may take the same maximum value at several points x1, x2, etc. The most extreme case occurs in uniform distributions, where all values occur equally frequently.
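A tiny sketch finding the mode of a discrete distribution as the value at which the pmf peaks, reusing the illustrative two-dice pmf from the earlier sketch.

```python
# Mode of a discrete distribution: the value at which the pmf is largest
# (two-dice pmf from the earlier sketch; a tie would return one of the peaks).
from fractions import Fraction
from itertools import product
from collections import defaultdict

pmf = defaultdict(Fraction)
for a, b in product(range(1, 7), repeat=2):
    pmf[a + b] += Fraction(1, 36)

mode = max(pmf, key=pmf.get)
print(mode, pmf[mode])  # 7 with probability 1/6
```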

Poisson distribution

Well-known discrete probability distributions used in statistical modeling include the Poisson distribution, the Bernoulli distribution, the binomial distribution, the geometric distribution, and the negative binomial distribution.
In probability theory and statistics, the Poisson distribution, named after French mathematician Siméon Denis Poisson, is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant rate and independently of the time since the last event.
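A short sketch of the Poisson pmf, P(k events) = λ^k e^(−λ) / k!; the rate λ used below is illustrative.

```python
# Poisson pmf: probability of k events in a fixed interval, given a constant
# average rate lam of independently occurring events (lam is illustrative).
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    return lam**k * exp(-lam) / factorial(k)

print(poisson_pmf(0, 2.0))                          # P(no events) at rate 2
print(sum(poisson_pmf(k, 2.0) for k in range(50)))  # ≈ 1.0 over all k
```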

Negative binomial distribution

Well-known discrete probability distributions used in statistical modeling include the Poisson distribution, the Bernoulli distribution, the binomial distribution, the geometric distribution, and the negative binomial distribution.
In probability theory and statistics, the negative binomial distribution is a discrete probability distribution of the number of successes in a sequence of independent and identically distributed Bernoulli trials before a specified (non-random) number of failures (denoted r) occurs.
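A sketch of the negative binomial pmf under the parameterization stated above (k successes observed before the r-th failure, with success probability p): P(X = k) = C(k + r − 1, k) p^k (1 − p)^r. The values of r and p are illustrative.

```python
# Negative binomial pmf: number of successes before the r-th failure occurs,
# with success probability p on each trial (r and p are illustrative).
from math import comb

def negative_binomial_pmf(k: int, r: int, p: float) -> float:
    return comb(k + r - 1, k) * p**k * (1 - p)**r

print(negative_binomial_pmf(4, 3, 0.5))                           # P(4 successes before the 3rd failure)
print(sum(negative_binomial_pmf(k, 3, 0.5) for k in range(200)))  # ≈ 1.0
```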

Geometric distribution

Well-known discrete probability distributions used in statistical modeling include the Poisson distribution, the Bernoulli distribution, the binomial distribution, the geometric distribution, and the negative binomial distribution.
In probability theory and statistics, the geometric distribution is either of two discrete probability distributions: the distribution of the number of Bernoulli trials needed to obtain the first success (supported on {1, 2, 3, …}), or the distribution of the number of failures before the first success (supported on {0, 1, 2, …}).
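A brief sketch showing both conventions just mentioned: the pmf of the trial number of the first success, and the shifted pmf of the number of failures before it; the success probability p is illustrative.

```python
# Geometric distribution, in both common parameterizations (p is illustrative).
def geometric_pmf_trials(k: int, p: float) -> float:
    """P(first success occurs on trial k), k = 1, 2, 3, ..."""
    return (1 - p) ** (k - 1) * p

def geometric_pmf_failures(k: int, p: float) -> float:
    """P(k failures before the first success), k = 0, 1, 2, ..."""
    return (1 - p) ** k * p

print(geometric_pmf_trials(3, 0.25))    # (0.75)^2 * 0.25
print(geometric_pmf_failures(2, 0.25))  # the same event, indexed by failures instead of trials
```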

Multivariate random variable

A univariate distribution gives the probabilities of a single random variable taking on various alternative values; a multivariate distribution (a joint probability distribution) gives the probabilities of a random vector – a list of two or more random variables – taking on various combinations of values.
The distributions of each of the component random variables X_i are called marginal distributions.
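A small sketch computing the marginal distributions of the components of a random vector from its joint pmf; the joint table reuses the illustrative numbers from the joint-distribution sketch above.

```python
# Marginal distributions of the components (X, Y) of a random vector,
# obtained by summing the joint pmf over the other variable (numbers illustrative).
from collections import defaultdict

joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

marginal_x = defaultdict(float)
marginal_y = defaultdict(float)
for (x, y), prob in joint.items():
    marginal_x[x] += prob
    marginal_y[y] += prob

print(dict(marginal_x))  # ≈ {0: 0.3, 1: 0.7}, up to floating-point rounding
print(dict(marginal_y))  # ≈ {0: 0.4, 1: 0.6}
```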