Section 5.1 Random Variables
Many things in our world involve some level of uncertainty. We want to use tools from probability to make decisions and see what we can confidently say about whatever it is we’re investigating.
A random variable is a variable whose possible values are the result of some random process, or experiment.
Examples of Random Variables:
Coin toss: The number of heads from tossing two coins
Dice: The outcome of rolling a die
Stock dividend: The expected dividend payment on a stock
Investment returns: The expected standard deviation of investment returns
Cars passing a road: The number of cars that drive through a certain intersection in a given time period
Definition 5.1.1.
Two Types of Random Variables:
discrete random variable: the outcomes are countable values, such as whole numbers, after performing an experiment
continuous random variable: the outcomes can be any numerical value in an interval (including decimals or fractions) after performing an experiment
Exercise 5.1.2.
Which of the following are discrete random variables?
The number of heads from tossing two coins
-
The outcome of rolling a die
-
The expected dividend payment on a stock
-
The expected standard deviation of investment returns
-
The number of cars that drive through a certain intersection in a given time period
-
The height of a random student at UCCS
-
Exercise 5.1.3.
Which of the following are continuous random variables?
The number of heads from tossing two coins
-
The outcome of rolling a die
-
The expected dividend payment on a stock
-
The expected standard deviation of investment returns
-
The number of cars that drive through a certain intersection in a given time period
-
The height of a random student at UCCS
-
Random variables can be “distributed” differently. For example, think about how rolls of a die would be distributed versus how the heights of UCCS students would be distributed.
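To see this difference concretely, here is a short simulation sketch (using Python's standard library; the sample size is an arbitrary choice) that tallies the relative frequencies of many die rolls. Each face comes up roughly equally often, in contrast to heights, which cluster around an average value.

```python
import random
from collections import Counter

random.seed(0)  # fix the seed so the tallies are reproducible

# Roll a fair six-sided die many times
rolls = [random.randint(1, 6) for _ in range(10_000)]

# Relative frequency of each face
counts = Counter(rolls)
rel_freq = {face: counts[face] / len(rolls) for face in range(1, 7)}

for face in sorted(rel_freq):
    print(face, round(rel_freq[face], 3))  # each value is near 1/6 ≈ 0.167
```

Every face has roughly the same relative frequency, so the die's distribution is flat (uniform), whereas a histogram of student heights would be mound-shaped.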
We’re going to start by discussing distributions of discrete random variables, and then in Chapter 7 we will discuss distributions of continuous random variables.
Definition 5.1.4.
A discrete probability distribution is a listing of all possible outcomes of an experiment for a discrete random variable, along with the relative frequency of each outcome. Such a distribution must satisfy the following:
The outcomes in the distribution must be mutually exclusive
\(0\leq P(X)\leq 1\) for all \(X\text{;}\) i.e., the probability of each outcome is between 0 and 1
\(\sum P(X)=1\text{;}\) the sum of all probabilities is 1
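As a small sketch of these two properties, consider the distribution of \(X\) = the number of heads from tossing two fair coins (the values 0.25, 0.50, 0.25 follow from the four equally likely outcomes HH, HT, TH, TT). The checks below verify that each probability lies between 0 and 1 and that the probabilities sum to 1.

```python
# Distribution of X = number of heads in two fair coin tosses
dist = {0: 0.25, 1: 0.50, 2: 0.25}

# Property: 0 <= P(X) <= 1 for every outcome
each_in_range = all(0 <= p <= 1 for p in dist.values())

# Property: the probabilities sum to 1
total = sum(dist.values())

print(each_in_range, total)  # True 1.0
```

Any listing that fails either check is not a valid discrete probability distribution.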