a^2 + b^2 = (a + b)^2

Let us simplify the expression.

We know that:

(a + b)^2 = a^2 + 2ab + b^2

We substitute this expansion into the right-hand side of the equation above.

==> a^2 + b^2 = a^2 + 2ab + b^2

Now we will subtract a^2 and b^2 from both sides:

==> a^2 + b^2 - a^2 - b^2 = a^2 + 2ab + b^2 - a^2 - b^2

==> 0 = 2ab

==> 2ab = 0

==> ab = 0

Then, we conclude that in order for the equality to hold, at least one of a and b must equal zero.

Then: a = 0 OR b = 0

**Then there are infinitely many solutions to the equality; the pairs (a, b) are:**

**(a, b) ∈ {(a, 0) : a is any real number}**

**∪ {(0, b) : b is any real number}.**
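The conclusion above can be spot-checked numerically. Here is a minimal Python sketch (the helper name `equality_holds` and the sample pairs are just illustrative):

```python
# Numeric spot-check: a^2 + b^2 == (a + b)^2 exactly when a == 0 or b == 0.

def equality_holds(a, b):
    """Return True when a^2 + b^2 equals (a + b)^2."""
    return a**2 + b**2 == (a + b)**2

# Pairs with a zero component satisfy the equality...
print(equality_holds(5, 0))   # True
print(equality_holds(0, -3))  # True
print(equality_holds(0, 0))   # True

# ...while pairs with both components nonzero do not, since 2ab != 0.
print(equality_holds(2, 3))   # False
```

With integer inputs the comparison is exact, so the check is reliable; with floats, rounding could in principle interfere.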

We want to find all numbers a and b that satisfy a^2 + b^2 = (a + b)^2.

Suppose that a^2 + b^2 = (a + b)^2 is true.

Therefore a^2 + b^2 = a^2 + 2ab + b^2. We subtract a^2 + b^2 from both sides and we get:

0 = 2ab.

Therefore a^2 + b^2 = (a + b)^2 is true only if 2ab = 0.

Now 2ab = 0 is possible only when a = 0, or b = 0, or 2 = 0.

Since 2 = 0 is absurd, either a = 0 or b = 0 (or both a and b are zero).

Therefore a^2 + b^2 = (a + b)^2 holds in the following three cases:

(i) a = 0 and b = 0, or

(ii) a = 0 and b = n, where n is any number, real or complex, or

(iii) a = n and b = 0, where n is any number, real or complex.
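The three cases above can also be spot-checked with complex values, using Python's built-in complex type (a sketch; the sample values and the helper name `equality_holds` are arbitrary):

```python
# The identity a^2 + b^2 == (a + b)^2 also holds over the complex numbers
# whenever at least one of the two values is zero.

def equality_holds(a, b):
    """Return True when a^2 + b^2 equals (a + b)^2."""
    return a**2 + b**2 == (a + b)**2

# Case (i): both zero.
print(equality_holds(0, 0))       # True

# Case (ii): a = 0, b any number, real or complex.
print(equality_holds(0, 3 + 4j))  # True

# Case (iii): a any number, b = 0.
print(equality_holds(-2, 0))      # True

# Both components nonzero fails, since 2ab != 0:
# a = b = 1j gives a^2 + b^2 = -2 but (a + b)^2 = -4.
print(equality_holds(1j, 1j))     # False
```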
