# Show f(a)+f(a+1)>=0 for each real a,if f(x)=x^2+3x+2?


### 1 Answer

You need to prove the inequality `f(a)+f(a+1)>=0` , hence, you need to substitute `a` and `a+1` for `x` such that:

`f(a) = a^2 + 3a + 2`

`f(a+1) = (a + 1)^2 + 3(a + 1) + 2`

Replacing `a^2 + 3a + 2` for `f(a)` and `(a + 1)^2 + 3(a + 1) + 2` for `f(a+1)` yields:

`a^2 + 3a + 2 + (a + 1)^2 + 3(a + 1) + 2 >= 0`

`a^2 + 3a + 2 + a^2 + 2a + 1 + 3a + 3 + 2 >= 0`

`2a^2 + 8a + 8 >= 0`

Dividing both sides by 2 (which preserves the inequality) and factoring yields:

`2a^2 + 8a + 8 >= 0 => a^2 + 4a + 4 >= 0 => (a + 2)^2 >= 0`

The inequality `(a + 2)^2 >= 0` holds for all `a in R` , since the square of a real number is never negative; equality occurs only at `a = -2` .

**Hence, the inequality `f(a)+f(a+1)>=0` reduces to `(a + 2)^2 >= 0` , which holds for all `a in R` .**
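
As a quick sanity check (not a substitute for the proof above), a short Python snippet can evaluate `f(a) + f(a+1)` at a few sample values and confirm it matches the factored form `2(a + 2)^2` ; the sample values chosen here are arbitrary:

```python
# Numerical spot-check of the identity f(a) + f(a+1) = 2*(a + 2)**2
# for f(x) = x^2 + 3x + 2, at a handful of arbitrary sample points.

def f(x):
    return x**2 + 3*x + 2

for a in [-5, -2, -0.5, 0, 3]:
    total = f(a) + f(a + 1)
    # The sum should never be negative...
    assert total >= 0
    # ...and should equal the factored form 2*(a + 2)**2.
    assert abs(total - 2 * (a + 2) ** 2) < 1e-9
    print(f"a = {a}: f(a) + f(a+1) = {total}")
```

Note that at `a = -2` the sum is exactly `0` , matching the equality case found above.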