# Show that (x+2)f(x+1)=xf(-x-1), where f(x)=(x^2+1)/(x^2+x)

You need to evaluate the terms `f(x+1)` and `f(-x-1)`, starting from the definition `f(x) = (x^2 + 1)/(x^2 + x)`.

`f(x+1) = ((x+1)^2 + 1)/((x+ 1)^2 + x + 1)`

Expanding the binomials yields:

`f(x+1) = (x^2 + 2x+ 1 + 1)/(x^2 + 2x + 1+ x + 1)`

`f(x+1) = (x^2 + 2x+ 2)/(x^2 + 3x +2)`

`f(-x-1) = ((-(x+1))^2 + 1)/((-(x+1))^2 - x - 1)`

`f(-x-1) = (x^2 + 2x+ 1 + 1)/(x^2 + 2x+ 1- x- 1)`

`f(-x-1) = (x^2 + 2x+ 2)/(x^2 + x)`
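The two expanded forms above can be spot-checked numerically. A minimal sketch in Python using exact rational arithmetic (the function name `f` and the sample point `x = 3` are choices for illustration, not part of the original problem):

```python
from fractions import Fraction

# f(x) = (x^2 + 1) / (x^2 + x); undefined at x = 0 and x = -1.
def f(x):
    return (x * x + 1) / (x * x + x)

# Spot-check the expanded forms at an arbitrary sample point x = 3.
x = Fraction(3)
assert f(x + 1) == (x**2 + 2*x + 2) / (x**2 + 3*x + 2)   # f(x+1)
assert f(-x - 1) == (x**2 + 2*x + 2) / (x**2 + x)        # f(-x-1)
print("expanded forms agree at x =", x)
```

Using `Fraction` rather than floats keeps the comparison exact, so the assertions test true algebraic equality at that point.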

You need to check if the expression holds:

`(x+2)f(x+1)=xf(-x-1)`

`((x+2)(x^2 + 2x + 2))/(x^2 + 3x + 2) = (x(x^2 + 2x + 2))/(x^2 + x)`

You need to factor the polynomial `x^2 + 3x + 2`, i.e. find simpler polynomials whose product is `x^2 + 3x + 2`.

You need to look for the roots of `x^2 + 3x +2 = 0` .

`x^2 + 2x + x + 2 = 0 => x(x + 2) + (x + 2) = 0`

Factoring out (x+2) yields:

`(x+2)(x+1) = 0`

You need to write both sides in factored form (note that `x^2 + x = x(x + 1)`):

`((x+2)(x^2 + 2x+ 2))/((x+2)(x+1)) = (x(x^2 + 2x+ 2))/(x(x + 1))`

Cancelling the common factors on both sides yields:

`(x^2 + 2x+ 2)/(x+1) = (x^2 + 2x+ 2)/(x + 1)`

**Since both sides reduce to the same fraction, the identity `(x+2)f(x+1) = xf(-x-1)` is verified.**
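As a final sanity check, the identity can also be tested numerically at several points. A minimal sketch in Python with exact rationals (the helper name `f` and the sample values are assumptions for illustration):

```python
from fractions import Fraction

# f(x) = (x^2 + 1) / (x^2 + x); undefined at x = 0 and x = -1.
def f(x):
    return (x * x + 1) / (x * x + x)

# Verify (x + 2) f(x + 1) == x f(-x - 1) at sample points where
# both sides are defined (avoid x = -2, -1, 0, where f blows up).
for x in [Fraction(2), Fraction(-3), Fraction(5, 2), Fraction(-7, 3)]:
    assert (x + 2) * f(x + 1) == x * f(-x - 1)
print("identity verified at all sample points")
```

A numeric check like this does not replace the algebraic proof above, but it is a quick way to catch a slip in the expansion.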