Show that all roots of the polynomial (x+i)^10 + (x-i)^10 are real.

sciencesolve | Teacher | (Level 3) Educator Emeritus

You need to assume that `alpha` is a root of the polynomial, with `alpha in C`, hence `alpha = x + i*y` for real `x` and `y`.

Since `alpha` is a root of the polynomial, `f(alpha) = 0`, such that:

`{(f(alpha) = (alpha+i)^10 + (alpha-i)^10),(f(alpha)=0):}` `=> (alpha+i)^10 + (alpha-i)^10 = 0 => (alpha+i)^10 = -(alpha-i)^10 => |alpha + i|^10 = |alpha - i|^10 => |alpha + i| = |alpha - i|`

The last two implications follow by taking the modulus of both sides, using `|-w| = |w|` and `|w^10| = |w|^10`, and then taking non-negative tenth roots.

You need to substitute `x+iy`  for `alpha`  such that:

`|x+iy+i| = |x+iy-i| => |x+i(y+1)| = |x+i(y-1)|`

You need to evaluate the absolute values above such that:

`sqrt(x^2 + (y+1)^2) = sqrt(x^2 + (y-1)^2)`

Squaring both sides yields:

`x^2 + (y+1)^2 = x^2 + (y-1)^2`

Expanding the squares yields:

`x^2 + y^2 + 2y + 1 = x^2 + y^2 - 2y + 1`

Reducing like terms yields:

`2y = -2y => 4y = 0 => y = 0`
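As a quick sanity check of this algebra (a minimal sketch using sympy, which is not part of the original solution), you can solve the squared equation for `y` and confirm that `y = 0` is the only solution:

```python
import sympy as sp

# Solve x^2 + (y+1)^2 = x^2 + (y-1)^2 for y; the only solution should be y = 0.
x, y = sp.symbols('x y', real=True)
equation = sp.Eq(x**2 + (y + 1)**2, x**2 + (y - 1)**2)
print(sp.solve(equation, y))  # prints [0]
```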

Notice that y represents the imaginary part of the complex number `alpha = x + i*y` .

Since the imaginary part of any root `alpha` is zero, all the roots of the polynomial are real numbers: `alpha = x`.
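As an independent numerical check (a sketch assuming numpy is available; it is not part of the proof), you can expand the polynomial, compute its roots, and confirm that every root has a negligible imaginary part and satisfies the modulus identity derived above:

```python
import numpy as np

# Coefficients (lowest degree first) of p(x) = (x + i)^10 + (x - i)^10.
p_plus = np.polynomial.polynomial.polypow([1j, 1.0], 10)    # (x + i)^10
p_minus = np.polynomial.polynomial.polypow([-1j, 1.0], 10)  # (x - i)^10
coeffs = p_plus + p_minus

roots = np.polynomial.polynomial.polyroots(coeffs)
for r in roots:
    # Each root should be real up to floating-point error,
    # and should satisfy |r + i| = |r - i| as derived above.
    assert abs(r.imag) < 1e-8
    assert abs(abs(r + 1j) - abs(r - 1j)) < 1e-8

print(np.sort(roots.real))  # the ten real roots
```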
