# Marginal distribution

**marginal probability, marginal, marginals, marginal probability distribution, marginalizing, marginal density, marginalization, marginalized out, marginal total, marginalize**

In probability theory and statistics, the marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset.

## Related Articles

### Conditional probability distribution

**conditional distribution, conditional density, conditional**

This contrasts with a conditional distribution, which gives the probabilities contingent upon the values of the other variables.

The conditional distribution contrasts with the marginal distribution of a random variable, which is its distribution without reference to the value of the other variable.

### Joint probability distribution

**joint distribution, joint probability, multivariate distribution**

Given two discrete random variables X and Y whose joint distribution is known, the marginal distribution of X is simply the probability distribution of X obtained by averaging over information about Y. Given two continuous random variables X and Y whose joint distribution is known, the marginal probability density function of X can be obtained by integrating the joint probability density function over Y, and vice versa.

These in turn can be used to find two other types of distributions: the marginal distribution giving the probabilities for any one of the variables with no reference to any specific ranges of values for the other variables, and the conditional probability distribution giving the probabilities for any subset of the variables conditional on particular values of the remaining variables.
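In the discrete case, this averaging is just a sum of the joint pmf over the other variable. A minimal sketch in Python, with a made-up joint table for illustration:

```python
# Marginal pmf of X from a joint pmf table p(x, y): sum over y.
# The joint table below is made up for illustration.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

def marginal_x(joint_pmf):
    """p_X(x) = sum over y of p(x, y)."""
    pX = {}
    for (x, _y), p in joint_pmf.items():
        pX[x] = pX.get(x, 0.0) + p
    return pX

pX = marginal_x(joint)  # p_X(0) = 0.4, p_X(1) = 0.6
```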

### Compound probability distribution

**compound distribution, mixture, compounding**

The compound distribution ("unconditional distribution") is the result of marginalizing (integrating) over the latent random variable(s) representing the parameter(s) of the parametrized distribution ("conditional distribution").
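This marginalization can be sketched by simulation: draw the latent parameter, then draw the observation from the conditional distribution; the observations then follow the compound (marginal) distribution. The choice of a Gamma prior over a Poisson rate below is an assumption for the example (the resulting compound is negative binomial):

```python
import math
import random

random.seed(42)

def sample_poisson(lam):
    """Poisson draw by CDF inversion (adequate for small rates)."""
    u, k = random.random(), 0
    p = math.exp(-lam)
    cdf = p
    while u > cdf:
        k += 1
        p *= lam / k
        cdf += p
    return k

def sample_compound():
    """Draw the latent rate, then the observation given the rate."""
    lam = random.gammavariate(2.0, 1.0)  # latent parameter (illustrative prior)
    return sample_poisson(lam)           # conditional distribution

# The samples follow the marginal ("unconditional") distribution,
# with the rate integrated out; E[X] = E[lambda] = 2 here.
samples = [sample_compound() for _ in range(20000)]
mean = sum(samples) / len(samples)
```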

### Marginal likelihood

**evidence, model evidence, Bayesian model evidence**

In statistics, a marginal likelihood function, or integrated likelihood, is a likelihood function in which some parameter variables have been marginalized.

### Probability density function

**probability density, density function, density**

Given two continuous random variables X and Y whose joint distribution is known, the marginal probability density function of X can be obtained by integrating the joint probability density function over Y, and vice versa.

Then, the joint density p(y, z) can be computed by a change of variables from U, V to Y, Z, and the density of Y can be derived by marginalizing out Z from the joint density.
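Integrating a joint density over one variable can be sketched numerically. The density f(x, y) = x + y on the unit square is an assumption chosen for the example (it integrates to 1 there), with exact marginal f_X(x) = x + 1/2:

```python
# Numerically marginalize a joint density f(x, y) over y.
# Illustrative joint density on the unit square (it integrates to 1 there).
def f_joint(x, y):
    return x + y

def marginal_density_x(x, n=10000):
    """f_X(x) = integral of f(x, y) dy over [0, 1], midpoint rule."""
    h = 1.0 / n
    return sum(f_joint(x, (i + 0.5) * h) for i in range(n)) * h

fx = marginal_density_x(0.3)  # exact marginal is x + 1/2 = 0.8
```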

### Wasserstein metric

**Kantorovich metric, Wasserstein distance, Lipschitz (first order Wasserstein; Kantorovich) metric**

where \Gamma(\mu, \nu) denotes the collection of all measures on M \times M with marginals \mu and \nu on the first and second factors respectively.

### Probability theory

**theory of probability, probability, probability theorist**

### Statistics

**statistical, statistical analysis, statistician**

### Subset

**superset, proper subset, subsets**

### Indexed family

**family, indices, index**

### Random variable

**random variables, random variation, random**

Given two discrete random variables X and Y whose joint distribution is known, the marginal distribution of X is simply the probability distribution of X obtained by averaging over information about Y. Given two continuous random variables X and Y whose joint distribution is known, the marginal probability density function of X can be obtained by integrating the joint probability density function over Y, and vice versa. If X_1, X_2, ..., X_n are discrete random variables, then the marginal probability mass function of X_i is obtained by summing the joint probability mass function over all values of the other variables.

### Probability distribution

**distribution, continuous probability distribution, discrete probability distribution**

Given two discrete random variables X and Y whose joint distribution is known, the marginal distribution of X is simply the probability distribution of X obtained by averaging over information about Y.

### Data analysis

**data analytics, analysis, data analyst**

The context here is that the theoretical studies being undertaken, or the data analysis being done, involves a wider set of random variables but that attention is being limited to a reduced number of those variables.

### Expected value

**expectation, expected, mean**

A marginal probability can always be written as an expected value: the marginal probability of X is the expectation, over Y, of the conditional probability of X given Y. This follows from the definition of expected value, after applying the law of the unconscious statistician.

### Law of the unconscious statistician

That a marginal probability can be written as an expected value follows from the definition of expected value, after applying the law of the unconscious statistician.
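Concretely, the marginal probability can be estimated by averaging the conditional probability over draws of the conditioning variable. A small Monte Carlo sketch, with made-up numbers:

```python
import random

random.seed(1)

# p_X(x) = E_Y[ p_{X|Y}(x | Y) ]: average the conditional probability
# over draws of Y. Model and numbers are made up for illustration:
# Y ~ Bernoulli(0.7), and P(X = 1 | Y = y) is 0.9 if y = 0, else 0.2.
def p_x1_given(y):
    return 0.9 if y == 0 else 0.2

n = 100000
est = sum(p_x1_given(1 if random.random() < 0.7 else 0) for _ in range(n)) / n
# exact marginal: 0.3 * 0.9 + 0.7 * 0.2 = 0.41
```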

### Cumulative distribution function

**distribution function, CDF, cumulative probability distribution function**

Finding the marginal cumulative distribution function from the joint cumulative distribution function is easy.
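Concretely, F_X(x) is obtained by sending the other argument of the joint CDF to infinity: F_X(x) = lim as y → ∞ of F_{X,Y}(x, y). A sketch, assuming for illustration the joint CDF of two independent Exp(1) variables:

```python
import math

# Marginal CDF from a joint CDF: F_X(x) = lim as y -> inf of F_{X,Y}(x, y).
# Illustrative joint CDF of two independent Exp(1) random variables.
def F_joint(x, y):
    if x <= 0 or y <= 0:
        return 0.0
    return (1 - math.exp(-x)) * (1 - math.exp(-y))

def F_marginal_x(x, big=1e3):
    """Send the other argument to (numerical) infinity."""
    return F_joint(x, big)

val = F_marginal_x(1.0)  # exact: 1 - exp(-1)
```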

### Probability mass function

**mass function, probability mass, mass**

If X_1, X_2, ..., X_n are discrete random variables, then the marginal probability mass function of X_i is obtained by summing the joint probability mass function over all values of the other variables.

### Copula (probability theory)

**copula, Gaussian copula, copulas**

In probability theory and statistics, a copula is a multivariate cumulative distribution function for which the marginal probability distribution of each variable is uniform.
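The uniform-marginal property rests on the probability integral transform: if X has a continuous CDF F, then F(X) is Uniform(0, 1). A quick empirical check (the choice of a standard normal X is arbitrary):

```python
import math
import random

random.seed(0)

def std_normal_cdf(x):
    """CDF of the standard normal via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Probability integral transform: U = F(X) should be Uniform(0, 1),
# which is exactly the marginal a copula requires of each variable.
u = [std_normal_cdf(random.gauss(0, 1)) for _ in range(50000)]
mean_u = sum(u) / len(u)                         # uniform mean is 0.5
frac_below = sum(v < 0.25 for v in u) / len(u)   # uniform gives 0.25
```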

### Gibbs sampling

**Gibbs sampler, collapsed Gibbs sampling, Gibbs**

This sequence can be used to approximate the joint distribution (e.g., to generate a histogram of the distribution); to approximate the marginal distribution of one of the variables, or some subset of the variables (for example, the unknown parameters or latent variables); or to compute an integral (such as the expected value of one of the variables).
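A minimal Gibbs-sampling sketch: for a bivariate normal with correlation rho, the full conditionals are themselves normal, and the retained draws of one coordinate approximate its marginal, here N(0, 1). The parameter values and chain lengths below are illustrative:

```python
import random

random.seed(0)

# Gibbs sampler for a bivariate normal with correlation rho (illustrative).
# Full conditionals: X | Y=y ~ N(rho*y, 1 - rho^2), and symmetrically for Y.
rho = 0.8
sd = (1 - rho * rho) ** 0.5
x, y = 0.0, 0.0
xs = []
for i in range(30000):
    x = random.gauss(rho * y, sd)  # draw X from its full conditional
    y = random.gauss(rho * x, sd)  # draw Y from its full conditional
    if i >= 1000:                  # discard burn-in
        xs.append(x)

# The retained draws approximate the marginal of X, which is N(0, 1).
mean_x = sum(xs) / len(xs)
var_x = sum(v * v for v in xs) / len(xs) - mean_x ** 2
```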

### Junction tree algorithm

**clique tree propagation**

The junction tree algorithm (also known as 'Clique Tree') is a method used in machine learning to perform marginalization in general graphs.

### Posterior predictive distribution

**prior predictive distribution, predictive uncertainty quantification**

The posterior predictive distribution of \tilde{x} given \mathbf{X} is calculated by marginalizing the distribution of \tilde{x} given \theta over the posterior distribution of \theta given \mathbf{X}.

### Law of total probability

**overall probability, total probability**

In probability theory, the law (or formula) of total probability is a fundamental rule relating marginal probabilities to conditional probabilities.
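A small worked instance of the rule, with made-up numbers:

```python
# Law of total probability: P(A) = sum over y of P(A | Y = y) * P(Y = y).
# Numbers are made up: a test is positive with probability 0.95 for
# carriers and 0.10 for non-carriers, and 2% of the population carries.
pY = {"carrier": 0.02, "non-carrier": 0.98}
pA_given_Y = {"carrier": 0.95, "non-carrier": 0.10}

pA = sum(pA_given_Y[y] * pY[y] for y in pY)  # marginal probability of A
# 0.95 * 0.02 + 0.10 * 0.98 = 0.117
```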

### Gaussian process

**Gaussian processes, Gaussian, Gaussian random process**

The prediction is not just an estimate for that point, but also has uncertainty information—it is a one-dimensional Gaussian distribution (which is the marginal distribution at that point).

### Mutual information

**Average Mutual Information, information, algorithmic mutual information**

where p_{(X,Y)} is the joint probability mass function of X and Y, and p_X and p_Y are the marginal probability mass functions of X and Y respectively.
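Mutual information can thus be computed directly from a joint pmf and the marginals derived from it. A sketch with an illustrative joint table:

```python
import math

# I(X; Y) = sum over (x, y) of p(x, y) * log( p(x, y) / (p_X(x) * p_Y(y)) ).
# The joint pmf below is made up for illustration.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

pX, pY = {}, {}
for (x, y), p in joint.items():
    pX[x] = pX.get(x, 0.0) + p  # marginal of X
    pY[y] = pY.get(y, 0.0) + p  # marginal of Y

mi = sum(p * math.log(p / (pX[x] * pY[y])) for (x, y), p in joint.items())
# independence would give mi == 0; here mi is about 0.193 nats
```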