# If a, b are the roots of the equation x^2 + x + 1 = 0, calculate a^1980 + b^1980

### 2 Answers

Given that a and b are the roots of x^2 + x + 1 = 0, find a^1980 + b^1980.

Since x^2+x+1 = 0

x1 = (-1 + sqrt(1-4))/2 = (-1 + sqrt(-3))/2 = w

x2 = (-1 - sqrt(-3))/2 = w^2.

Also w + w^2 = -1, and (w^2)^2 = w^4 = w.

w^3 = 1 is the defining property of w, a cube root of unity. The cube roots of unity are the roots of x^3 = 1, i.e. of x^3 - 1 = 0, i.e. of (x-1)(x^2+x+1) = 0.

Therefore

a^1980 + b^1980 = w^1980 + (w^2)^1980 = (w^3)^660 + (w^3)^1320 = 1^660 + 1^1320 = 2.

Therefore a^1980 +b^1980 = 2.
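A quick numerical check of this result, using Python's `cmath` to form the two complex roots from the quadratic formula (the names `a` and `b` mirror the roots in the answer above):

```python
import cmath

# Roots of x^2 + x + 1 = 0 via the quadratic formula.
a = (-1 + cmath.sqrt(-3)) / 2   # w, a primitive cube root of unity
b = (-1 - cmath.sqrt(-3)) / 2   # w^2, its conjugate

total = a**1980 + b**1980
print(total)  # very close to 2 + 0j, up to floating-point rounding
```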

If x^2 + x + 1 = 0, then multiplying both sides by (x-1) gives:

(x-1)(x^2 + x + 1) = 0

But the product (x-1)(x^2 + x + 1) is the difference of cubes:

(x-1)(x^2 + x + 1) = x^3 - 1

So, if (x-1)(x^2 + x + 1) = 0, then x^3 - 1 = 0

x^3 = 1

Now, if a and b are the roots of the equation x^2 + x + 1 = 0, then they are roots of the equation x^3 - 1 = 0, too.

Since each root, substituted into an equation it satisfies, verifies that equation:

a^3 - 1 = 0

a^3 = 1

b^3 - 1 = 0

b^3 = 1

Now, we'll try to express the exponent 1980 as a multiple of 3:

1980 = 3*660

a^1980 + b^1980 = (a^3)^660 + (b^3)^660

But a^3 = 1 and b^3 = 1, so:

a^1980 + b^1980 = 1^660 + 1^660

a^1980 + b^1980 = 1 + 1

**a^1980 + b^1980 = 2**
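The reduction above generalizes: since a^3 = b^3 = 1, the value of a^n + b^n depends only on n mod 3. A minimal sketch of that rule (the helper name `power_sum` is just for illustration):

```python
def power_sum(n):
    """a^n + b^n for the roots of x^2 + x + 1 = 0, using a^3 = b^3 = 1.

    Only n mod 3 matters: when 3 divides n, both powers equal 1 (sum 2);
    otherwise the sum is w^r + w^(2r) = -1, because w + w^2 = -1.
    """
    return 2 if n % 3 == 0 else -1

print(power_sum(1980))  # 2
```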