In statistical inference, the set of observed data that is used to draw inferences is called the sample. The number of observations in the sample is called the sample size.

A more accurate definition follows.

Definition Suppose a sample is made of n realizations of random variables (or random vectors). Then we say that the sample has size n, or that the sample size is n.

When the sample size tends to infinity, the properties of the statistical inferences that are drawn by using the sample can be studied using asymptotic results such as the Law of Large Numbers and the Central Limit Theorem. A sample is called a large sample when the sample size is so large that the asymptotic properties (i.e., those that are valid for a sample size n that tends to infinity) are deemed a very good approximation of the actual properties enjoyed by the sample. Conversely, when the sample size is not sufficient to justify such an approximation, the sample is called a small sample.
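The quality of the asymptotic approximation can be checked by simulation. The following sketch (the distribution, sample sizes, and replication count are illustrative choices, not taken from the text) draws many samples of exponential variables with mean 1 and variance 1, and compares the empirical standard deviation of the sample mean with the Central Limit Theorem prediction, which is sigma divided by the square root of n:

```python
import numpy as np

# Illustrative sketch: how good is the CLT approximation at various sample sizes?
# We use exponential(1) variables, which have mean 1 and variance 1 (sigma = 1).
rng = np.random.default_rng(0)

def sample_mean_sd(n, reps=2000):
    """Empirical standard deviation of the sample mean, over many samples of size n."""
    means = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
    return means.std()

for n in (10, 100, 1000):
    predicted = 1.0 / np.sqrt(n)  # CLT: sd of the sample mean is sigma / sqrt(n)
    print(f"n={n:5d}  empirical sd={sample_mean_sd(n):.4f}  CLT prediction={predicted:.4f}")
```

As n grows, the empirical spread of the sample mean tracks the CLT prediction ever more closely, which is exactly the sense in which a large sample makes asymptotic properties a good approximation.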

You can go to the lecture entitled Statistical inference to read more details about samples and sample size.
