Show that `6abc - b(a^2+c^2) <= (a^2+b^2)c + (b^2+c^2)a` if `a, b, c > 0`

### 1 Answer

I suggest moving the term `b(a^2+c^2)` to the right side, so the inequality becomes:

`6abc <= (a^2+b^2)c + (b^2+c^2)a + b(a^2+c^2)`

Now use the following inequalities: `a^2+b^2 >= 2ab`, `a^2+c^2 >= 2ac`, `b^2+c^2 >= 2bc`.
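Each of these is an instance of the AM-GM inequality and follows from the fact that a square is nonnegative; for the first one:

```latex
(a-b)^2 \ge 0
\;\Longrightarrow\; a^2 - 2ab + b^2 \ge 0
\;\Longrightarrow\; a^2 + b^2 \ge 2ab
```

The other two follow the same way from `(a-c)^2 >= 0` and `(b-c)^2 >= 0`.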

Multiply the inequality `a^2+b^2 >= 2ab` by the positive number `c` to get `(a^2+b^2)c >= 2abc`.

Multiply the inequality `a^2+c^2 >= 2ac` by the positive number `b` to get `(a^2+c^2)b >= 2abc`.

Multiply the inequality `b^2+c^2 >= 2bc` by the positive number `a` to get `(b^2+c^2)a >= 2abc`.

Adding the three inequalities yields:

`(a^2+b^2)c + (a^2+c^2)b + (b^2+c^2)a >= 2abc + 2abc + 2abc`

`(a^2+b^2)c + (a^2+c^2)b + (b^2+c^2)a >= 6abc`

**The last line is exactly the rearranged inequality `6abc <= (a^2+b^2)c + (b^2+c^2)a + b(a^2+c^2)`, so the original inequality is proved.**
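As a quick sanity check (not part of the proof), one can verify the inequality numerically on random positive triples, and confirm that equality holds when `a = b = c`; `lhs` and `rhs` below are names introduced here for illustration:

```python
import random

def lhs(a, b, c):
    # Left side of the original inequality: 6abc - b(a^2 + c^2)
    return 6*a*b*c - b*(a**2 + c**2)

def rhs(a, b, c):
    # Right side: (a^2 + b^2)c + (b^2 + c^2)a
    return (a**2 + b**2)*c + (b**2 + c**2)*a

# The inequality lhs <= rhs should hold for all positive a, b, c.
for _ in range(10000):
    a, b, c = (random.uniform(0.01, 100.0) for _ in range(3))
    assert lhs(a, b, c) <= rhs(a, b, c)

# Equality holds when a = b = c, e.g. a = b = c = 1 gives 4 on both sides.
print(lhs(1, 1, 1), rhs(1, 1, 1))
```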
