The integral of f(x)=(x+1)/(x^2+2x) can be found using substitution.
Let y = x^2 + 2x.
Then dy/dx = 2x + 2 = 2(x + 1), so (x + 1)dx = (1/2)dy.
The integral becomes Int[(1/2)*(1/y) dy]
= (1/2)*ln|y| + C.
Substituting back y = x^2 + 2x gives
(1/2)*ln|x^2 + 2x| + C.
The required integral of f(x) = (x+1)/(x^2+2x) is therefore (1/2)*ln|x^2 + 2x| + C.
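As a quick check (not part of the original solution), here is a minimal sketch, assuming SymPy is available, that differentiates the result and confirms the integrand is recovered.

```python
# A minimal verification sketch, assuming SymPy is installed
# (this check is not part of the original solution).
import sympy as sp

x = sp.symbols('x')
integrand = (x + 1) / (x**2 + 2*x)
antiderivative = sp.Rational(1, 2) * sp.log(x**2 + 2*x)

# Differentiating the antiderivative should give back the integrand,
# so the simplified difference should be 0.
print(sp.simplify(sp.diff(antiderivative, x) - integrand))  # expected: 0
```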
Alternatively, recall that the antiderivative of a function is the indefinite integral of that function:
Int f(x)dx = Int (x+1)/(x^2+2x) dx
We notice that if we differentiate the denominator, we get the numerator multiplied by 2.
We'll substitute t for the denominator.
x^2+2x = t
We'll differentiate both sides:
(2x + 2)dx = dt
We'll divide by 2:
(x + 1)dx = dt/2
We'll rewrite the integral in terms of t:
Int f(x)dx = Int dt/(2t) = (1/2)*ln|t| + C
Substituting back t = x^2 + 2x:
Int f(x)dx = (1/2)*ln|x^2+2x| + C
Since (1/2)*ln|x^2+2x| = ln sqrt(|x^2+2x|) by the power rule of logarithms, the antiderivative of f(x) can also be written as: ln sqrt(x^2+2x) + C.
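As a final, optional check (again assuming SymPy is available; this is not part of the original answer), the sketch below confirms that this alternative form also differentiates back to the integrand.

```python
# A minimal sketch, assuming SymPy is installed (not part of the original answer).
# It checks that ln(sqrt(x^2 + 2x)) also differentiates back to (x+1)/(x^2+2x),
# so it is the same antiderivative as (1/2)*ln(x^2 + 2x), up to a constant.
import sympy as sp

x = sp.symbols('x', positive=True)  # positive x keeps sqrt and log real-valued

integrand = (x + 1) / (x**2 + 2*x)
form_log_sqrt = sp.log(sp.sqrt(x**2 + 2*x))

# The difference between the derivative and the integrand should simplify to 0.
print(sp.simplify(sp.diff(form_log_sqrt, x) - integrand))  # expected: 0
```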