# Function: If f(x) = (5/2)*x*(x+2)^(1/2), determine a, b, c if F(x) = (ax^2+bx+c)*(x+2)^(1/2) is an antiderivative of f(x).

### 1 Answer

Since the equation that relates a function and its antiderivative is `int f(x) dx = F(x) + c`, you may evaluate the indefinite integral `int f(x) dx`, such that:

`int (5/2)*x*(x+2)^(1/2) dx = (ax^2+bx+c)*(x+2)^(1/2) + c`

Use the following substitution:

`x + 2 = u => dx = du`

`x = u - 2`

Changing the variable yields:

`int (5/2)*x*(x+2)^(1/2) dx = (5/2)int (u - 2)*u^(1/2) du `

Expanding the brackets yields:

`(5/2)int (u - 2)*u^(1/2) du = (5/2)int (u^(3/2) - 2u^(1/2)) du`

Using the property of linearity of integral, yields:

`(5/2)int (u^(3/2) - 2u^(1/2)) du = (5/2)int (u^(3/2)) du - 5 int (u^(1/2)) du`

`(5/2)int (u^(3/2) - 2u^(1/2)) du = (5/2)*u^(3/2 + 1)/(3/2 + 1) - 5*(u^(1/2+1))/(1/2+1) + c`

`(5/2)int (u^(3/2) - 2u^(1/2)) du = (5/2)*u^(5/2)/(5/2) - 10/3*u^(3/2) + c`

`(5/2)int (u^(3/2) - 2u^(1/2)) du = u^(5/2) - 10/3*u^(3/2) + c`
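The u-antiderivative above can be sanity-checked numerically: differentiating `G(u) = u^(5/2) - (10/3)u^(3/2)` with a central difference should recover the integrand `(5/2)(u^(3/2) - 2u^(1/2))` for `u > 0`. This is an illustrative check, not part of the original answer:

```python
import math

# G(u) is the antiderivative obtained above; integrand(u) is what we integrated.
def G(u):
    return u**2.5 - (10/3) * u**1.5

def integrand(u):
    return 2.5 * (u**1.5 - 2 * u**0.5)

# Central-difference approximation of G'(u).
def dG(u, h=1e-6):
    return (G(u + h) - G(u - h)) / (2 * h)

# G'(u) should match the integrand at every sample point.
for u in (0.5, 1.0, 2.0, 4.0):
    assert abs(dG(u) - integrand(u)) < 1e-5
```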

Substituting back `x + 2` for u yields:

`int (5/2)*x*(x+2)^(1/2) dx = (x + 2)^(5/2) - 10/3*(x + 2)^(3/2) + c`

Factoring out `(x + 2)^(3/2)` yields:

`int (5/2)*x*(x+2)^(1/2) dx = (x + 2)^(3/2)(x + 2 - 10/3) + c`

`int (5/2)*x*(x+2)^(1/2) dx = (x + 2)^(3/2)(x - 4/3) + c`

`int (5/2)*x*(x+2)^(1/2) dx = (x + 2)(x - 4/3)sqrt(x + 2) + c`

`int (5/2)*x*(x+2)^(1/2) dx = (x^2 + 2x - (4/3)x - 8/3)sqrt(x + 2) + c`

`int (5/2)*x*(x+2)^(1/2) dx = (x^2 + (2/3)x - 8/3)(x+2)^(1/2) + c`

`{(int (5/2)*x*(x+2)^(1/2) dx = (ax^2+bx+c)*(x+2)^(1/2) + c),(int (5/2)*x*(x+2)^(1/2) dx = (x^2 + (2/3)x - 8/3)(x+2)^(1/2) + c):} => (ax^2+bx+c)*(x+2)^(1/2) = (x^2 + (2/3)x - 8/3)(x+2)^(1/2) => {(a=1),(b=2/3),(c=-8/3):}`

**Hence, evaluating a, b, c under the given conditions yields** `a = 1, b = 2/3, c = -8/3.`
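The final answer can be verified directly (an illustrative stdlib-only check, not part of the original answer): with `a = 1, b = 2/3, c = -8/3`, a numerical derivative of `F(x) = (x^2 + (2/3)x - 8/3)*sqrt(x+2)` matches `f(x) = (5/2)x*sqrt(x+2)` at sample points in the domain `x > -2`:

```python
import math

# Candidate antiderivative with a = 1, b = 2/3, c = -8/3.
def F(x):
    return (x**2 + (2/3)*x - 8/3) * math.sqrt(x + 2)

# The original integrand f(x) = (5/2) x sqrt(x + 2).
def f(x):
    return 2.5 * x * math.sqrt(x + 2)

# Central-difference approximation of g'(x).
def numeric_derivative(g, x, h=1e-6):
    return (g(x + h) - g(x - h)) / (2 * h)

# F'(x) should agree with f(x) at every sample point.
for x in (0.5, 1.0, 3.0, 7.0):
    assert abs(numeric_derivative(F, x) - f(x)) < 1e-5
```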