What is x if 2/(2x-2) + 2/(2x+2) = 10

2 Answers

neela | High School Teacher | (Level 3) Valedictorian

What is x if 2/(2x-2) + 2/(2x+2) = 10?

We simplify both terms:

2/(2x-2) = 2/[2(x-1)] = 1/(x-1).

2/(2x+2) = 2/[2(x+1)] = 1/(x+1).

Therefore 1/(x-1) + 1/(x+1) = 10 .....(1)

We multiply both sides of (1) by (x-1)(x+1).

(x+1)+(x-1) = 10(x-1)(x+1) .

=> 2x = 10(x^2-1) , as (a-b)(a+b) = a^2-b^2.

=> 2x = 10x^2-10.

 => 0 = 10x^2-10-2x.

=> 10x^2-2x -10 = 0.

=> 2(5x^2-x-5) = 0

=> 5x^2-x-5 = 0.

=> 5x^2-x-5 = 0 .....(2), which is of the form ax^2+bx+c = 0, whose roots are x1 = {-b+sqrt(b^2-4ac)}/(2a) and x2 = {-b-sqrt(b^2-4ac)}/(2a).
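As a quick sanity check on this step, here is a minimal Python sketch of the quadratic formula applied to equation (2); the helper name solve_quadratic is purely illustrative and not part of the original answer:

```python
import math

def solve_quadratic(a, b, c):
    """Return both real roots of ax^2 + bx + c = 0 (assumes b^2 - 4ac >= 0)."""
    disc = math.sqrt(b * b - 4 * a * c)
    return (-b + disc) / (2 * a), (-b - disc) / (2 * a)

# Equation (2): 5x^2 - x - 5 = 0, so a = 5, b = -1, c = -5.
x1, x2 = solve_quadratic(5, -1, -5)
print(x1, x2)  # approximately 1.1050 and -0.9050, i.e. (1 ± sqrt(101))/10
```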

So x1 = {-(-1) + sqrt((-1)^2 - 4*5*(-5))}/(2*5) = (1+sqrt(101))/10,

x2 = (1-sqrt(101))/10.

So the required roots are x1 = (1+sqrt(101))/10 and x2 = (1-sqrt(101))/10.
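To confirm that these values really satisfy the original equation, you can substitute them back into 2/(2x-2) + 2/(2x+2); a short, purely illustrative Python check:

```python
import math

def lhs(x):
    """Left-hand side of the original equation: 2/(2x - 2) + 2/(2x + 2)."""
    return 2 / (2 * x - 2) + 2 / (2 * x + 2)

for root in [(1 + math.sqrt(101)) / 10, (1 - math.sqrt(101)) / 10]:
    print(root, lhs(root))  # each line should show a value equal to 10 (up to rounding)
```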

giorgiana1976 | College Teacher | (Level 3) Valedictorian

First, we'll factor out 2 from the denominators in the given expression:

2/[2(x-1)] + 2/[2(x+1)] = 10

We'll simplify and we'll get:

1/(x-1) + 1/(x+1) = 10

We'll move all terms to one side:

1/(x-1) + 1/(x+1) - 10 = 0

Now, we'll determine the least common denominator:

LCD = (x-1)(x+1)

We notice that the result of the product is the difference of squares:

(x-1)(x+1)  = x^2 - 1

We'll multiply the equation by the LCD and re-write it:

x + 1 + x - 1 - 10(x^2 - 1) = 0

We'll remove the brackets and combine like terms:

2x - 10x^2 + 10 = 0

We'll divide by -2 and we'll re-arrange the terms:

5x^2 - x - 5 = 0

We'll apply the quadratic formula:

x1 = [1 + sqrt(1 + 100)]/10

x1 = (1+sqrt101)/10

x2 = (1-sqrt101)/10

Since neither root is equal to 1 or -1 (the values that would make a denominator zero), both solutions are valid.
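For a symbolic double-check, a computer algebra system gives the same two roots; here is a small sketch using sympy (assuming it is available):

```python
from sympy import symbols, Eq, solve

x = symbols('x')
# Solve the original equation directly; sympy clears the denominators itself
# and would discard any candidate that makes a denominator zero (x = 1 or x = -1).
roots = solve(Eq(2 / (2 * x - 2) + 2 / (2 * x + 2), 10), x)
print(roots)  # the two roots (1 ± sqrt(101))/10
```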
