How would you integrate: 1/(x-1) dx + 1/(x-1)^2 dx - 2/(2x+3) dx?

Expert Answers

We have to find the integral of 1/(x-1) dx + 1/(x-1)^2 dx - 2/(2x+3) dx.

Int [ 1/(x-1) dx + 1/(x-1)^2 dx - 2/(2x+3) dx ]

=> Int [ 1/(x-1) dx] + Int [1/(x-1)^2 dx] - Int [2/(2x+3) dx]

  • Int [ 1/(x-1) dx]

let u = x - 1 , du = dx

=> Int [ 1/u du ]

=> ln |u|

=> ln |x - 1|
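The substitution result ln |x - 1| can be sanity-checked numerically: differentiating it should recover the integrand 1/(x - 1). A minimal sketch (the helper names `antiderivative` and `central_diff` are illustrative, not part of any library):

```python
import math

def antiderivative(x):
    # result of the substitution above: ln|x - 1|
    return math.log(abs(x - 1))

def central_diff(f, x, h=1e-6):
    # numerical derivative via central differences
    return (f(x + h) - f(x - h)) / (2 * h)

# the derivative should match the integrand 1/(x - 1)
x = 3.0
print(central_diff(antiderivative, x))  # close to 1/(3 - 1) = 0.5
```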

  • Int [1/(x-1)^2 dx]

let u = x - 1, du = dx

=> Int[ 1/u^2 du]

=> -1/u

=> -1/(x - 1)
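The power-rule step Int [ 1/u^2 du ] = -1/u can be spot-checked the same way; differentiating -1/(x - 1) should give back 1/(x - 1)^2. A small sketch (names are illustrative):

```python
def F(x):
    # candidate antiderivative of 1/(x - 1)^2 from the power rule
    return -1.0 / (x - 1)

def central_diff(f, x, h=1e-6):
    # numerical derivative via central differences
    return (f(x + h) - f(x - h)) / (2 * h)

x = 3.0
print(central_diff(F, x), 1 / (x - 1) ** 2)  # both close to 0.25
```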

  • Int [2/(2x+3) dx]

let u = 2x + 3 , du = 2 dx, so 2/(2x+3) dx = 1/u du

=> Int [ 1/u du ]

=> ln |u|

=> ln |2x + 3|

Therefore the required integral is ln |x - 1| - 1/(x - 1) - ln |2x + 3| + C
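As a final check, the derivative of the whole antiderivative should equal the original integrand at any point in the domain. A minimal numerical sketch (function names are illustrative):

```python
import math

def F(x):
    # full antiderivative found above: ln|x - 1| - 1/(x - 1) - ln|2x + 3|
    return math.log(abs(x - 1)) - 1.0 / (x - 1) - math.log(abs(2 * x + 3))

def integrand(x):
    # original integrand: 1/(x - 1) + 1/(x - 1)^2 - 2/(2x + 3)
    return 1 / (x - 1) + 1 / (x - 1) ** 2 - 2 / (2 * x + 3)

def central_diff(f, x, h=1e-6):
    # numerical derivative via central differences
    return (f(x + h) - f(x - h)) / (2 * h)

# F'(x) and integrand(x) should agree at sample points away from x = 1 and x = -3/2
for x in (2.0, 5.0, -4.0):
    print(abs(central_diff(F, x) - integrand(x)) < 1e-5)
```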

Approved by eNotes Editorial Team