A parameter of a distribution is a number or a vector of numbers describing some characteristic of that distribution.
Examples of distribution parameters are:
the expected value of a univariate probability distribution
its variance
one of its quantiles
one of its moments
All of the above are scalar parameters, that is, single numbers. Instead, the following are examples of vector parameters:
the expected value of a multivariate probability distribution
its covariance matrix (a matrix can be thought of as a vector whose entries have been written on multiple columns/rows)
a vector of cross-moments
You have probably read many times statements such as "Let us assume that the random variable $X$ has a normal distribution". What does such a statement mean?
It means that $X$ is a continuous random variable whose probability density function can be written as
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$
where $\mu$ can be any real number and $\sigma$ can be any positive real number. It can be proved that $\mu$ and $\sigma^2$ are equal to the mean and variance of $X$ respectively (see Normal distribution). $\mu$ and $\sigma^2$ are parameters (together they form a vector parameter). By changing them, we get different probability distributions of $X$. So, when we say "Let us assume that the random variable $X$ has a normal distribution" without specifying the mean and the variance of $X$, what we mean is "Let us assume that the distribution of $X$ belongs to the set of all normal distributions". This set can be obtained by varying the parameters $\mu$ and $\sigma^2$ in the formula above, and it is called a parametric family.
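The idea that each choice of parameters picks out one member of the family can be sketched in code. The following minimal example (in Python, using only the standard library) evaluates the normal density formula for two different parameter choices; the function name `normal_pdf` is our own, not from any library.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution with mean mu and
    standard deviation sigma, evaluated at the point x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Two different choices of (mu, sigma) select two different members
# of the normal parametric family, and hence two different densities:
print(normal_pdf(0.0, 0.0, 1.0))  # standard normal density at 0, ≈ 0.3989
print(normal_pdf(0.0, 2.0, 0.5))  # a different member of the family
```

Fixing `mu` and `sigma` fixes the distribution; varying them sweeps out the whole family.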
More generally, a parametric family is a set of probability distributions such that each member of the set is uniquely identified by a parameter (either a scalar or a vector). In our example, a member of the set of normal distributions is uniquely identified by its mean and variance.
Some examples of parametric families are reported in the next table.
| Parametric family | Distribution parameters |
|---|---|
| Bernoulli | Probability of success |
| Binomial | Probability of success and number of trials |
| Poisson | Expected value |
| Uniform | Upper and lower bounds of the support |
| Exponential | Rate parameter |
| Normal | Mean and variance |
| Chi-square | Degrees of freedom |
| Student's t | Mean, scale parameter and degrees of freedom |
| Multivariate normal | Expected value (vector) and covariance matrix |
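Several of the families in the table can be sampled with Python's standard `random` module; this is a sketch, and the parameter names shown in the labels are ours, chosen to match the table rather than any library convention.

```python
import random

random.seed(42)  # fix the seed so the draws are reproducible

# Each call below fixes the parameters, which picks out exactly one
# member of the corresponding parametric family, and draws from it.
samples = {
    "Uniform(lower=0, upper=1)": [random.uniform(0.0, 1.0) for _ in range(5)],
    "Exponential(rate=2)":       [random.expovariate(2.0) for _ in range(5)],
    "Normal(mean=0, sd=1)":      [random.gauss(0.0, 1.0) for _ in range(5)],
}
for family, draws in samples.items():
    print(family, [round(d, 3) for d in draws])
```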
In statistical inference, we observe a sample of data and we make inferences about the probability distribution that generated the sample. What we typically do is to set up a statistical model and carry out inferences (estimation, testing, etc.) about a model parameter.
What does it mean to set up a statistical model? It just means that we make some hypotheses about the probability distribution that generated the data, that is, we restrict our attention to a well-defined set of probability distributions (e.g., the set of all continuous distributions, the set of all multivariate normal distributions, the set of all distributions having finite mean and variance). After setting up the model, we exploit the assumptions we have made to learn something about the distribution that generated the data. So, for instance, if we have assumed that the data come from a normal distribution, we can use the observed data to estimate the distribution parameters (mean and variance) or to test the null hypothesis that one of them is equal to a specific value.
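The estimation step described above can be illustrated with a short simulation: assume the data come from a normal distribution, then estimate its two parameters from an observed sample. This is a minimal sketch using only the Python standard library; the "true" parameter values are hypothetical and chosen for the example.

```python
import random
import statistics

random.seed(0)

# Hypothetical parameters of the distribution that generated the data
# (unknown to the statistician in a real application):
true_mean, true_sd = 5.0, 2.0

# Observe a sample, then estimate the model parameters from it.
sample = [random.gauss(true_mean, true_sd) for _ in range(10_000)]
estimated_mean = statistics.fmean(sample)
estimated_var = statistics.variance(sample)  # unbiased sample variance

print(f"estimated mean ≈ {estimated_mean:.2f}, estimated variance ≈ {estimated_var:.2f}")
```

With a sample this large, the estimates land close to the true mean (5) and true variance (4), which is exactly what the modelling assumption lets us exploit.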
Note that the concept of a statistical model is broader than the concept of a parametric family. Both are sets of probability distributions, but the members of a model need not be uniquely identified by a parameter. For example, suppose that our model is the set of all distributions having finite mean, and that the parameter of interest, which we want to estimate, is the mean. Then there are several distributions in the set having the same mean: the distributions are not uniquely identified by the parameter of interest. In fact, no parameter (single number or finite-dimensional vector) uniquely identifies a member of this model.
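The point that a parameter may fail to identify a distribution can be made concrete with a small simulation, sketched here with Python's standard library: a uniform distribution on $[-1, 1]$ and a standard normal distribution both have mean 0, yet they are different distributions.

```python
import random
import statistics

random.seed(1)
n = 50_000

# Two different distributions, both with mean 0: knowing only the mean
# cannot tell us which one generated the data.
uniform_sample = [random.uniform(-1.0, 1.0) for _ in range(n)]
normal_sample = [random.gauss(0.0, 1.0) for _ in range(n)]

print(statistics.fmean(uniform_sample))    # both sample means are close to 0
print(statistics.fmean(normal_sample))

# Yet their variances differ (1/3 versus 1), so the mean alone does not
# uniquely identify a member of the model.
print(statistics.variance(uniform_sample))
print(statistics.variance(normal_sample))
```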
In the lecture entitled Statistical inference, we define parameters, parametric families and inference in a more formal manner.
You can also have a look at a related glossary entry: Parameter space.