2.1: Data 101 and Descriptive Statistics - Class Notes

Contents

Tuesday, September 17, 2019

Overview

Today we pick up where we left off from causality. We begin with a review and overview of using data and descriptive statistics. We want to quantify characteristics of samples as statistics, which we will later use to infer things about populations (in which we will later identify causal relationships).

Next class will be on random variables and distributions. This full week is your crash course/review of basic statistics that we will need to start the “meat and potatoes” of this class: linear regression next Tuesday. As such, I’ll give you a brief homework to review these statistical concepts (with minimal use of R).

Slides

Problem Set

Problem Set 1 is due Thursday September 19.

Problem set 2 (on classes 2.1-2.2) will be posted shortly, and is due by Tuesday September 24.

Math Appendix

The Summation Operator

Definition

Many elementary propositions in econometrics (and statistics) involve sums of numbers. Mathematicians often use the summation operator (the Greek letter $\Sigma$, "sigma") as a shorthand, rather than writing everything out the long way. It will be worth your time to understand the summation operator, some of its properties, and how these can provide shortcuts to proving more advanced theorems in econometrics.

Let $X$ be a random variable from which a sample of $n$ observations is observed, so we have a sequence $\{x_1, x_2, ..., x_n\}$, i.e. $x_i$ for $i = 1, 2, ..., n$. Then the total sum of the observations $(x_1 + x_2 + ... + x_n)$ can be represented as:

$$\sum_{i=1}^n x_i = x_1 + x_2 + ... + x_n$$
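As a quick numerical illustration (in Python rather than R, with made-up numbers), the summation operator is just a running total over the sample:

```python
# A hypothetical sample of n = 5 observations (made-up numbers for illustration)
x = [2, 4, 6, 8, 10]

# sum() computes x_1 + x_2 + ... + x_n, exactly what the summation operator denotes
total = sum(x)
print(total)  # 30
```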

Useful Properties of Summation Operators

Rule 1: The summation of a constant $k$ times a random variable $X_i$ is equal to the constant times the summation of that random variable:

$$\sum_{i=1}^n kX_i = k \sum_{i=1}^n X_i$$

Proof:

$$\sum_{i=1}^n kX_i = kx_1 + kx_2 + ... + kx_n = k(x_1 + x_2 + ... + x_n) = k \sum_{i=1}^n X_i$$

Rule 2: The summation of a sum of two random variables is equal to the sum of their summations:

$$\sum_{i=1}^n (X_i + Y_i) = \sum_{i=1}^n X_i + \sum_{i=1}^n Y_i$$

Proof:

$$\sum_{i=1}^n (X_i + Y_i) = (X_1 + Y_1) + (X_2 + Y_2) + ... + (X_n + Y_n) = (X_1 + X_2 + ... + X_n) + (Y_1 + Y_2 + ... + Y_n) = \sum_{i=1}^n X_i + \sum_{i=1}^n Y_i$$

Rule 3: The summation of a constant over $n$ observations is the product of the constant and $n$:

$$\sum_{i=1}^n k = nk$$

Proof:

$$\sum_{i=1}^n k = \underbrace{k + k + ... + k}_{n \text{ times}} = nk$$

Combining these three rules, for the sum of a linear combination of a random variable $(a + bX_i)$:

$$\sum_{i=1}^n (a + bX_i) = na + b \sum_{i=1}^n X_i$$
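The combined rule is easy to check numerically. A minimal sketch in Python (made-up values of $a$, $b$, and the sample), comparing the direct sum against the shortcut:

```python
# Made-up constants and sample, purely for illustration
a, b = 3, 2
x = [1, 2, 3, 4]
n = len(x)

# Left-hand side: sum the linear combination term by term
lhs = sum(a + b * xi for xi in x)

# Right-hand side: use the combined rule na + b * sum(x)
rhs = n * a + b * sum(x)

print(lhs, rhs)  # 32 32
```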

Proof: left to you as an exercise!

Advanced: Useful Properties for Regression

There are some additional properties of summations that may not be immediately obvious, but will be quite essential in proving properties of linear regressions.

Using the properties above, we can describe the mean, variance, and covariance of random variables. For more beyond the mere definitions, see the appendix on Covariance and Correlation.

First, define the mean of the sequences $\{X_i: i = 1, ..., n\}$ and $\{Y_i: i = 1, ..., n\}$ as:

$$\bar{X} = \frac{1}{n} \sum_{i=1}^n X_i$$

Second, the variance of X is:

$$var(X) = \frac{1}{n} \sum_{i=1}^n (X_i - \bar{X})^2$$

Third, the covariance of X and Y is:

$$cov(X, Y) = \frac{1}{n} \sum_{i=1}^n (X_i - \bar{X})(Y_i - \bar{Y})$$
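These three definitions translate directly into code. A short Python sketch (made-up paired samples; note these are the population versions with $\frac{1}{n}$, matching the formulas above, not the $\frac{1}{n-1}$ sample versions):

```python
# Made-up paired samples for illustration
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)

# Means
x_bar = sum(x) / n
y_bar = sum(y) / n

# Variance of X: average squared deviation from the mean
var_x = sum((xi - x_bar) ** 2 for xi in x) / n

# Covariance of X and Y: average product of paired deviations
cov_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / n

print(x_bar, var_x, cov_xy)  # 3.0 2.0 1.2
```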

Rule 4: The sum of the deviations of observations of $X_i$ from its mean ($\bar{X}$) is 0:

$$\sum_{i=1}^n (X_i - \bar{X}) = 0$$

Proof:

$$\begin{aligned}
\sum_{i=1}^n (X_i - \bar{X}) &= \sum_{i=1}^n X_i - \sum_{i=1}^n \bar{X} \\
&= \sum_{i=1}^n X_i - n\bar{X} && \text{Since } \bar{X} \text{ is a constant} \\
&= \frac{n}{n} \sum_{i=1}^n X_i - n\bar{X} && \text{Multiply the first term by } \frac{n}{n} = 1 \\
&= n\bar{X} - n\bar{X} && \text{By the definition of the mean } \bar{X} \\
&= 0 \\
\end{aligned}$$
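Rule 4 is worth verifying numerically, since it gets used repeatedly in the proofs that follow. A minimal check in Python (made-up sample; note that with arbitrary data the result may differ from 0 by floating-point rounding):

```python
# Made-up sample for illustration
x = [3, 7, 1, 9]
x_bar = sum(x) / len(x)  # mean is 5.0

# Sum of deviations from the mean should be 0 (up to floating-point error)
deviations_sum = sum(xi - x_bar for xi in x)
print(deviations_sum)  # 0.0
```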

Rule 5: The sum of the squared deviations of $X$ is equal to the sum of the products of $X_i$ and its deviations:

$$\sum_{i=1}^n (X_i - \bar{X})^2 = \sum_{i=1}^n X_i(X_i - \bar{X})$$

Proof:

$$\begin{aligned}
\sum_{i=1}^n (X_i - \bar{X})^2 &= \sum_{i=1}^n (X_i - \bar{X})(X_i - \bar{X}) && \text{Expanding the square} \\
&= \sum_{i=1}^n X_i(X_i - \bar{X}) - \sum_{i=1}^n \bar{X}(X_i - \bar{X}) && \text{Breaking apart the first term} \\
&= \sum_{i=1}^n X_i(X_i - \bar{X}) - \bar{X} \sum_{i=1}^n (X_i - \bar{X}) && \text{Since } \bar{X} \text{ is a constant, not depending on } i \\
&= \sum_{i=1}^n X_i(X_i - \bar{X}) - \bar{X}(0) && \text{From Rule 4} \\
&= \sum_{i=1}^n X_i(X_i - \bar{X}) && \text{Remainder after multiplying by 0} \\
\end{aligned}$$
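A quick sanity check of Rule 5 in Python (made-up sample chosen so the arithmetic is exact):

```python
# Made-up sample for illustration; mean is 4.0
x = [1, 4, 7]
x_bar = sum(x) / len(x)

# Left-hand side: sum of squared deviations
lhs = sum((xi - x_bar) ** 2 for xi in x)

# Right-hand side: sum of X_i times its deviation
rhs = sum(xi * (xi - x_bar) for xi in x)

print(lhs, rhs)  # 18.0 18.0
```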

Rule 6: The following summations involving $X$ and $Y$ are equivalent:

$$\sum_{i=1}^n Y_i(X_i - \bar{X}) = \sum_{i=1}^n X_i(Y_i - \bar{Y}) = \sum_{i=1}^n (X_i - \bar{X})(Y_i - \bar{Y})$$

Proof:

$$\begin{aligned}
\sum_{i=1}^n (X_i - \bar{X})(Y_i - \bar{Y}) &= \sum_{i=1}^n Y_i(X_i - \bar{X}) - \sum_{i=1}^n \bar{Y}(X_i - \bar{X}) && \text{Breaking apart the second term} \\
&= \sum_{i=1}^n Y_i(X_i - \bar{X}) - \bar{Y}(0) && \text{From Rule 4} \\
&= \sum_{i=1}^n Y_i(X_i - \bar{X}) && \text{Remainder after multiplying by 0} \\
\end{aligned}$$

Equivalently:

$$\begin{aligned}
\sum_{i=1}^n (X_i - \bar{X})(Y_i - \bar{Y}) &= \sum_{i=1}^n X_i(Y_i - \bar{Y}) - \sum_{i=1}^n \bar{X}(Y_i - \bar{Y}) && \text{Breaking apart the first term} \\
&= \sum_{i=1}^n X_i(Y_i - \bar{Y}) - \bar{X}(0) && \text{From Rule 4} \\
&= \sum_{i=1}^n X_i(Y_i - \bar{Y}) && \text{Remainder after multiplying by 0} \\
\end{aligned}$$
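All three forms in Rule 6 can be checked numerically at once. A short Python sketch (made-up paired samples for illustration):

```python
# Made-up paired samples for illustration
x = [1, 2, 3, 4]
y = [10, 20, 20, 30]
x_bar = sum(x) / len(x)  # 2.5
y_bar = sum(y) / len(y)  # 20.0

# The three equivalent summations from Rule 6
s1 = sum(yi * (xi - x_bar) for xi, yi in zip(x, y))
s2 = sum(xi * (yi - y_bar) for xi, yi in zip(x, y))
s3 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))

print(s1, s2, s3)  # 30.0 30.0 30.0
```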