(a) The OLS estimator for $\beta$ minimizes the sum of squared residuals:
\[ \hat\beta = \arg\min_{\beta} \sum_{i=1}^{n} (y_i - \beta x_i)^2. \]
Take the first-order condition to show that
\[ \hat\beta = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}. \]
(b) Show that
\[ \hat\beta = \beta + \frac{\sum_{i=1}^{n} x_i \varepsilon_i}{\sum_{i=1}^{n} x_i^2}. \]
What is $E[\hat\beta \mid \beta]$ and $\mathrm{Var}(\hat\beta \mid \beta)$? Use this to show that, conditional on $\beta$, $\hat\beta$ has the following distribution:
\[ \hat\beta \mid \beta \sim N\!\left(\beta,\ \frac{\sigma_\varepsilon^2}{\sum_{i=1}^{n} x_i^2}\right). \]
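A small simulation can sanity-check the conditional distribution claimed in (b). Everything numeric below ($n$, $\beta$, $\sigma_\varepsilon$, the design $x_i$) is invented for the illustration and is not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative values (assumed, not from the problem): fixed design, true slope,
# and error standard deviation.
n, beta, sigma_eps = 50, 2.0, 1.5
x = rng.uniform(0.5, 2.0, size=n)      # nonrandom regressors, held fixed below
sxx = np.sum(x ** 2)

# Simulate beta_hat = sum(x_i * y_i) / sum(x_i^2) across many draws of the errors.
reps = 20_000
eps = rng.normal(0.0, sigma_eps, size=(reps, n))
y = beta * x + eps
beta_hat = (y @ x) / sxx

mean_hat = beta_hat.mean()             # should be close to beta
var_hat = beta_hat.var()               # should be close to sigma_eps^2 / sum(x_i^2)
theory_var = sigma_eps ** 2 / sxx
```

With the errors held normal, the simulated mean and variance line up with $\beta$ and $\sigma_\varepsilon^2 / \sum_i x_i^2$.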
(c) Suppose we believe that $\beta$ is distributed normally with mean 0 and variance $\sigma_\beta^2$; that is, $\beta \sim N(0, \sigma_\beta^2)$. Additionally assume that $\beta$ is independent of the $\varepsilon_i$. Compute the mean and variance of $\hat\beta$: that is, what are $E[\hat\beta]$ and $\mathrm{Var}(\hat\beta)$?
(Hint you might find useful: $E[w_1] = E[E[w_1 \mid w_2]]$ and $\mathrm{Var}(w_1) = E[\mathrm{Var}(w_1 \mid w_2)] + \mathrm{Var}(E[w_1 \mid w_2])$ for any random variables $w_1$ and $w_2$.)
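The hint in (c) can be illustrated numerically: draw $\beta$ from its prior, draw the errors independently, and compare the simulated moments of $\hat\beta$ with what the two identities deliver. All numeric values below, including the prior standard deviation `sigma_b`, are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative values (assumed): error sd sigma_eps, prior sd sigma_b.
n, sigma_eps, sigma_b = 50, 1.5, 0.8
x = rng.uniform(0.5, 2.0, size=n)
sxx = np.sum(x ** 2)

# Draw beta ~ N(0, sigma_b^2) independently of the errors, then form beta_hat.
reps = 40_000
beta = rng.normal(0.0, sigma_b, size=reps)
eps = rng.normal(0.0, sigma_eps, size=(reps, n))
y = beta[:, None] * x + eps
beta_hat = (y @ x) / sxx

uncond_mean = beta_hat.mean()
uncond_var = beta_hat.var()

# Law of total expectation/variance from the hint:
#   E[beta_hat]   = E[E[beta_hat | beta]]                                 = 0
#   Var(beta_hat) = E[Var(beta_hat | beta)] + Var(E[beta_hat | beta])
#                 = sigma_eps^2 / sum(x_i^2) + sigma_b^2
theory_mean = 0.0
theory_var = sigma_eps ** 2 / sxx + sigma_b ** 2
```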
Let us consider the linear regression model $y_i = \beta_0 + \beta_1 x_i + u_i$ ($i = 1, \ldots, n$), which satisfies Assumptions MLR.1 through MLR.5 (see Slide 7 in "Linear_regression_review" under "Modules" on Canvas)1. The $x_i$'s ($i = 1, \ldots, n$) and $\beta_0$ and $\beta_1$ are nonrandom; the randomness comes from the $u_i$'s ($i = 1, \ldots, n$), where $\mathrm{Var}(u_i) = \sigma^2$. Let $\hat\beta_0$ and $\hat\beta_1$ be the usual OLS estimators (which are unbiased for $\beta_0$ and $\beta_1$, respectively) obtained from running a regression of $y_{n \times 1}$ on $x_{n \times 1}$ with the intercept column included. Suppose you also run a regression of $y_{n \times 1}$ on $x_{n \times 1}$ excluding the intercept column to obtain another estimator $\tilde\beta_1$ of $\beta_1$.
a) Give the expression for $\tilde\beta_1$ as a function of the $y_i$'s and $x_i$'s ($i = 1, \ldots, n$).
b) Derive $E[\tilde\beta_1]$ in terms of $\beta_0$, $\beta_1$, and the $x_i$'s. Show that $\tilde\beta_1$ is unbiased for $\beta_1$ when $\beta_0 = 0$. If $\beta_0 \neq 0$, when will $\tilde\beta_1$ be unbiased for $\beta_1$?
c) Derive $\mathrm{Var}(\tilde\beta_1)$, the variance of $\tilde\beta_1$, in terms of $\sigma^2$ and the $x_i$'s ($i = 1, \ldots, n$).
1The model is a simple special case of the general multiple regression model in “Linear_regression_review”.
Solving this question does not require knowledge about matrix operations.
d) Show that $\mathrm{Var}(\tilde\beta_1)$ is no greater than $\mathrm{Var}(\hat\beta_1)$; that is, $\mathrm{Var}(\tilde\beta_1) \le \mathrm{Var}(\hat\beta_1)$. When do you have $\mathrm{Var}(\tilde\beta_1) = \mathrm{Var}(\hat\beta_1)$? (Hint you might find useful: use $\sum_{i=1}^{n} x_i^2 \ge \sum_{i=1}^{n} (x_i - \bar x)^2$, where $\bar x = \frac{1}{n} \sum_{i=1}^{n} x_i$.)
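The hint's inequality and the resulting variance ranking can be checked numerically. The sketch below uses the standard slope-variance formulas under MLR.1–MLR.5 ($\sigma^2/\sum_i x_i^2$ for the no-intercept slope, $\sigma^2/\sum_i (x_i - \bar x)^2$ for the usual OLS slope); the design and error variance are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative fixed design and error variance (assumed values).
sigma2 = 1.5 ** 2
x = rng.uniform(0.5, 2.0, size=30)

sxx_raw = np.sum(x ** 2)               # sum of x_i^2
sxx_dem = np.sum((x - x.mean()) ** 2)  # sum of (x_i - xbar)^2

var_tilde = sigma2 / sxx_raw           # variance of the no-intercept slope
var_hat = sigma2 / sxx_dem             # variance of the usual OLS slope

# Identity behind the hint:
#   sum x_i^2 = sum (x_i - xbar)^2 + n * xbar^2 >= sum (x_i - xbar)^2,
# with equality iff xbar = 0; hence var_tilde <= var_hat.
```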
e) Choosing between $\hat\beta_1$ and $\tilde\beta_1$ leads to a tradeoff between bias and variance. Comment on this tradeoff.
Let $\hat v$ be an estimator of the truth $v$. Show that $E[(\hat v - v)^2] = \mathrm{Var}(\hat v) + [\mathrm{Bias}(\hat v)]^2$, where $\mathrm{Bias}(\hat v) = E[\hat v] - v$. (Hint: the randomness comes from $\hat v$ only; $v$ is nonrandom.)
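One way to see the bias–variance decomposition is a quick Monte Carlo with a deliberately biased estimator. The truth $v$, the shrinkage factor, and the sample size below are all invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative setup (all values invented): v is a fixed truth, and
# v_hat = a * (sample mean) with a != 1 is deliberately biased, so both
# terms of the decomposition are nonzero.
v, sigma, n, a = 3.0, 2.0, 25, 0.8
reps = 200_000

# Sampling distribution of the sample mean of n iid N(v, sigma^2) draws.
xbar = rng.normal(v, sigma / np.sqrt(n), size=reps)
v_hat = a * xbar

emp_mse = np.mean((v_hat - v) ** 2)
emp_var = v_hat.var()        # ddof=0, so the identity holds exactly in-sample
emp_bias = v_hat.mean() - v  # should be close to (a - 1) * v

# E[(v_hat - v)^2] = Var(v_hat) + Bias(v_hat)^2
```

Note that with the population-style variance (ddof=0), `emp_mse == emp_var + emp_bias**2` is an algebraic identity in the sample, mirroring the expectation-level identity the question asks you to prove.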
Applied questions (with the use of R)
For this question, you will use R.