I will answer the first two questions, i.e. (a) and (b), because eNotes rules forbid answering multiple questions.

**(a)**

By linearity of expectation, `E[aX+bY]=aE[X]+bE[Y]`,

we have

`E[Y_i]=E[beta x_i+epsilon_i]=beta E[x_i]+E[epsilon_i]=beta E[x_i]`

where the last step uses the assumption `E[epsilon_i]=0`.
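As a quick sanity check of part (a), here is a Monte Carlo sketch (the values of `beta`, `E[x_i]`, and the distributions are illustrative assumptions, not given in the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
beta, n = 2.0, 1_000_000
x = rng.normal(loc=3.0, scale=1.5, size=n)    # assume E[x_i] = 3
eps = rng.normal(loc=0.0, scale=1.0, size=n)  # E[epsilon_i] = 0
y = beta * x + eps

# Sample mean of Y should be close to beta * E[x_i] = 2 * 3 = 6
print(y.mean())
```

With a million draws the sample mean lands within a few hundredths of 6, matching `beta E[x_i]`.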

**(b)**

By definition `Var(X)=E[X^2]-(E[X])^2` hence we have

`Var(Y_i)=E[Y_i^2]-(E[Y_i])^2=E[(beta x_i+epsilon_i)^2]-(E[Y_i])^2`

Expanding the square, substituting `E[Y_i]=beta E[x_i]` from part (a), and applying linearity of expectation:

`=E[beta^2 x_i^2+2beta x_i epsilon_i+epsilon_i^2]-(beta E[x_i])^2`

`=beta^2 E[x_i^2]+2beta E[x_i epsilon_i]+E[epsilon_i^2]-beta^2(E[x_i])^2`

Now, because `beta^2 E[x_i^2]-beta^2(E[x_i])^2=beta^2 Var(x_i)`

and `E[epsilon_i^2]=Var(epsilon_i)+(E[epsilon_i])^2=sigma^2+0^2=sigma^2`, we have

`Var(Y_i)=beta^2 Var(x_i)+2beta E[x_i epsilon_i]+sigma^2` **which would be your solution.**

If we additionally assume `E[x_i epsilon_i]=0`, which holds when the regressor `x_i` is uncorrelated with the error `epsilon_i` (the usual exogeneity assumption, automatic when `x_i` is fixed or independent of `epsilon_i`, since `E[epsilon_i]=0`), then you would have

`Var(Y_i)=beta^2 Var(x_i)+sigma^2` **<-- your solution if `x_i` and `epsilon_i` are uncorrelated.**
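The variance formula can be checked numerically the same way; again the parameter values are illustrative assumptions, and drawing `x_i` and `epsilon_i` independently makes `E[x_i epsilon_i]=0` hold:

```python
import numpy as np

rng = np.random.default_rng(1)
beta, sigma, n = 2.0, 0.5, 1_000_000
x = rng.normal(loc=3.0, scale=1.5, size=n)      # Var(x_i) = 1.5**2 = 2.25
eps = rng.normal(loc=0.0, scale=sigma, size=n)  # independent of x, so E[x_i eps_i] = 0
y = beta * x + eps

# Sample variance of Y should be close to beta^2 Var(x_i) + sigma^2
# = 4 * 2.25 + 0.25 = 9.25
print(y.var())
```

The simulated variance agrees with `beta^2 Var(x_i)+sigma^2` to within Monte Carlo error, and the cross term drops out because `x_i` and `epsilon_i` were drawn independently.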