# Calculate the antiderivative of the function f(x) given by f(x) = 2x/((x+1)(x^2+1))?

justaguide | College Teacher | (Level 2) Distinguished Educator


We need to find the integral of f(x) = 2x/((x + 1)(x^2 + 1)).

First, we find the partial fraction decomposition:

=> f(x) = (Ax + B)/(x^2 + 1) + C/(x + 1)

=> f(x) = [Ax^2 + Ax + Bx + B + Cx^2 + C]/[(x^2 + 1)(x + 1)]

=> f(x) = [x^2(A + C) + x(A + B) + B + C]/[(x^2 + 1)(x + 1)]

Equating the numerator with 2x gives: A + C = 0, A + B = 2 and B + C = 0

A = -C

B - C = 2 and B + C = 0

=> 2B = 2

=> B = 1

C = -1

A = 1

This gives f(x) = x/(x^2 + 1) - 1/(x + 1) + 1/(x^2 + 1)
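As a quick sanity check, the decomposition with the values found above (A = 1, B = 1, C = -1) can be compared against the original function at a few arbitrary sample points (a minimal sketch in Python):

```python
import math

def f(x):
    # the original function: 2x / ((x + 1)(x^2 + 1))
    return 2 * x / ((x + 1) * (x ** 2 + 1))

def decomposed(x):
    # partial-fraction form with A = 1, B = 1, C = -1:
    # x/(x^2 + 1) - 1/(x + 1) + 1/(x^2 + 1)
    return x / (x ** 2 + 1) - 1 / (x + 1) + 1 / (x ** 2 + 1)

# the sample points are arbitrary (avoiding the pole at x = -1)
for x in (0.5, 1.0, 2.0, 3.7):
    assert math.isclose(f(x), decomposed(x), rel_tol=1e-12)
print("decomposition checks out")
```

If the coefficients were wrong, at least one of the assertions would fail.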

Int[f(x) dx]

=> Int[x/(x^2 + 1) - 1/(x + 1) + 1/(x^2 + 1) dx]

=> Int[x/(x^2 + 1) dx] - Int[1/(x + 1) dx] + Int[1/(x^2 + 1) dx]

=> Int[x/(x^2 + 1) dx] - ln|x + 1| + arctan x

Int[x/(x^2 + 1) dx]

let x^2 + 1 = y

dy/2 = x dx

=> (1/2)*Int [ 1/y dy]

=> (1/2)*ln y

=> (1/2)*ln ( x^2 + 1)

The required integral is (1/2)*ln(x^2 + 1) - ln|x + 1| + arctan x + C
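One way to verify this result is to differentiate the antiderivative numerically and compare it to f(x). A central finite difference is a standard approximation here (a sketch; the step size h and sample points are arbitrary choices):

```python
import math

def f(x):
    # the integrand: 2x / ((x + 1)(x^2 + 1))
    return 2 * x / ((x + 1) * (x ** 2 + 1))

def F(x):
    # the candidate antiderivative: (1/2) ln(x^2 + 1) - ln|x + 1| + arctan x
    return 0.5 * math.log(x ** 2 + 1) - math.log(abs(x + 1)) + math.atan(x)

def dF(x, h=1e-6):
    # central finite-difference approximation of F'(x), error O(h^2)
    return (F(x + h) - F(x - h)) / (2 * h)

for x in (0.5, 1.0, 2.0, 5.0):
    assert math.isclose(dF(x), f(x), rel_tol=1e-6)
print("F'(x) matches f(x)")
```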

giorgiana1976 | College Teacher | (Level 3) Valedictorian


We'll have to determine the indefinite integral of the given function.

To begin, we'll rewrite the function as an algebraic sum of elementary fractions:

2x/[(x + 1)(x^2 + 1)] = A/(x + 1) + (Bx + C)/(x^2 + 1)

2x = Ax^2 + A + Bx^2 + Bx + Cx + C

2x = x^2(A + B) + x(B+C) + A+C

Comparing, we'll get:

A + B = 0 => A = -B (1)

B + C = 2 (2)

A + C = 0 <=> -B + C = 0 (3)

(2) + (3) => 2C = 2 => C = 1 => A = -1 => B = 1

2x/[(x + 1)(x^2 + 1)] = -1/(x + 1) + (x + 1)/(x^2 + 1)

Int f(x)dx = Int-dx/(x+1) + Int xdx/(x^2 + 1) + Int dx/(x^2 + 1)

Int-dx/(x+1) = -ln|x+1| + C

For the middle integral, we'll substitute x^2 + 1 = t:

2x dx = dt

x dx = dt/2

Int x dx/(x^2 + 1) = Int dt/2t = ln|t|/2 + C = ln(x^2 + 1)/2 + C

Int dx/(x^2 + 1) = arctan x + C

Int f(x)dx = -ln|x + 1| + ln(x^2 + 1)/2 + arctan x + C

The antiderivative of the given function is: Int f(x)dx = -ln|x + 1| + ln(x^2 + 1)/2 + arctan x + C.
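With t = x^2 + 1 substituted back, the result can also be checked against a numerical quadrature: by the fundamental theorem of calculus, the definite integral of f over an interval must equal the difference of the antiderivative at the endpoints. A sketch using composite Simpson's rule over the arbitrary interval [1, 2]:

```python
import math

def f(x):
    # the integrand: 2x / ((x + 1)(x^2 + 1))
    return 2 * x / ((x + 1) * (x ** 2 + 1))

def F(x):
    # antiderivative found above, with t = x^2 + 1 substituted back
    return -math.log(abs(x + 1)) + math.log(x ** 2 + 1) / 2 + math.atan(x)

def simpson(g, a, b, n=1000):
    # composite Simpson's rule on [a, b] with n subintervals (n must be even)
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += g(a + i * h) * (4 if i % 2 else 2)
    return s * h / 3

# numerical integral over [1, 2] should match F(2) - F(1)
assert math.isclose(simpson(f, 1.0, 2.0), F(2.0) - F(1.0), rel_tol=1e-9)
print("definite integral matches F(b) - F(a)")
```

The interval [1, 2] and the subinterval count n = 1000 are arbitrary; any interval avoiding the pole at x = -1 would do.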