# What is the antiderivative of f(x)=(x^2+x^3)^1/2?


### 1 Answer

To determine the antiderivative of the function, we'll evaluate the indefinite integral of f(x).

Int f(x)dx = Int sqrt(x^2+x^3)dx

We'll factor x^2 out of the expression under the radical:

Int sqrt(x^2+x^3)dx = Int sqrt(x^2(1+x))dx

Int sqrt(x^2(1+x))dx = Int |x|*sqrt(1+x)dx

We'll consider x positive, so |x| = x:

Int |x|*sqrt(1+x)dx = Int x*sqrt(1+x)dx
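As a quick sanity check of the factoring step, the identity sqrt(x^2 + x^3) = x*sqrt(1+x) for positive x can be verified numerically (a minimal sketch using Python's standard `math` module; the sample points are arbitrary):

```python
import math

# Check that sqrt(x^2 + x^3) == x*sqrt(1 + x) at a few positive sample points.
for x in [0.5, 1.0, 2.0, 10.0]:
    lhs = math.sqrt(x**2 + x**3)  # original integrand
    rhs = x * math.sqrt(1 + x)    # factored form (valid since x > 0)
    assert abs(lhs - rhs) < 1e-9, (x, lhs, rhs)
```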

We'll solve the integral using substitution:

sqrt(1+x) = t

We'll differentiate both sides:

dx/(2sqrt(1+x)) = dt

dx = 2tdt

We'll express x in terms of t. For this, we'll square both sides:

[sqrt(1+x)]^2 = t^2

1 + x = t^2

x = t^2 - 1

We'll re-write the integral in t:

Int (t^2 - 1)*t*2tdt = 2Int t^2(t^2 - 1)dt

We'll remove the brackets:

2Int t^2(t^2 - 1)dt = 2Int (t^4 - t^2)dt

We'll apply additive property:

2Int (t^4 - t^2)dt = 2Int t^4dt - 2Int t^2dt

2Int (t^4 - t^2)dt = 2t^5/5 - 2t^3/3 + C

We'll substitute back t = sqrt(1+x) to express the result in x:

**Int f(x)dx = 2[sqrt(1+x)]^5/5 - 2[sqrt(1+x)]^3/3 + C**
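The result above can be checked numerically: the derivative of the antiderivative F(x) should reproduce f(x) = sqrt(x^2+x^3) for x > 0. A minimal sketch using a central finite difference (the step size h and the sample points are arbitrary choices):

```python
import math

def f(x):
    # Original integrand
    return math.sqrt(x**2 + x**3)

def F(x):
    # Antiderivative found above, valid for x >= 0
    t = math.sqrt(1 + x)
    return 2 * t**5 / 5 - 2 * t**3 / 3

# Verify F'(x) ~ f(x) via a central finite difference at several points.
h = 1e-6
for x in [0.5, 1.0, 2.0, 5.0]:
    deriv = (F(x + h) - F(x - h)) / (2 * h)
    assert abs(deriv - f(x)) < 1e-5, (x, deriv, f(x))
```

The check succeeds because F'(x) = (2t^4 - 2t^2)·(dt/dx) = (2t^4 - 2t^2)/(2t) = t^3 - t = x*sqrt(1+x), which equals f(x) for positive x.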