Proofs of AM-GM
This page lists some proofs of the weighted [[AM-GM]] Inequality. The inequality's statement is as follows: for all nonnegative reals <math>a_1, \dotsc, a_n</math> and nonnegative reals <math>\lambda_1, \dotsc, \lambda_n</math> such that <math>\sum_{i=1}^n \lambda_i = 1</math>, we have
<cmath> \sum_{i=1}^n \lambda_i a_i \ge \prod_{i=1}^n a_i^{\lambda_i}, </cmath>
with equality if and only if <math>a_i = a_j</math> for all <math>i,j</math> such that <math>\lambda_i, \lambda_j \neq 0</math>.
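Though not part of any proof, the statement can be spot-checked numerically. The following sketch is ours, not part of the article's sources; the helper names <code>weighted_am</code> and <code>weighted_gm</code> are hypothetical:

```python
import math
import random

def weighted_am(a, lam):
    """Weighted arithmetic mean: sum of lambda_i * a_i."""
    return sum(l * x for l, x in zip(lam, a))

def weighted_gm(a, lam):
    """Weighted geometric mean: product of a_i ** lambda_i."""
    out = 1.0
    for l, x in zip(lam, a):
        out *= x ** l
    return out

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 6)
    a = [random.uniform(0.01, 100.0) for _ in range(n)]
    w = [random.uniform(0.01, 1.0) for _ in range(n)]
    lam = [x / sum(w) for x in w]   # normalize so the weights sum to 1
    assert weighted_am(a, lam) >= weighted_gm(a, lam) - 1e-9
```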
We first note that we may disregard any <math>a_i</math> for which <math>\lambda_i = 0</math>, as they contribute to neither side of the desired inequality. We also note that if <math>a_i = 0</math> and <math>\lambda_i \neq 0</math> for some <math>i</math>, then the right-hand side of the inequality is zero and the left-hand side is greater than or equal to zero, with equality if and only if <math>a_j = 0 = a_i</math> whenever <math>\lambda_j \neq 0</math>. Thus we may henceforth assume that all <math>a_i</math> and <math>\lambda_i</math> are ''strictly positive.''

== Proofs of Unweighted AM-GM ==

These proofs use the assumption that <math>\lambda_i = 1/n</math> for all integers <math>1 \le i \le n</math>.

=== Proof by Cauchy Induction ===

We use [[Cauchy Induction]], a variant of induction in which one proves a result for <math>2</math> variables, then for all powers of <math>2</math>, and finally takes a backward step in which the case of <math>n</math> variables implies that of <math>n-1</math>.

− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
'''Base Case''': The smallest nontrivial case of AM-GM is in two variables. By the properties of perfect squares (or by the [[Trivial Inequality]]), <math>(x-y)^2 \geq 0,</math> with equality if and only if <math>x-y=0</math>, or <math>x=y</math>. Then because <math>x</math> and <math>y</math> are nonnegative, we can perform the following manipulations: <cmath>x^2 - 2xy + y^2 \geq 0</cmath> <cmath>x^2 + 2xy + y^2 \geq 4xy</cmath> <cmath>\frac{(x+y)^2}{4} \geq xy</cmath> <cmath>\frac{x+y}{2} \geq \sqrt{xy},</cmath> with equality if and only if <math>x=y</math>. This completes the proof of the base case.

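The two-variable base case is easy to check numerically; this is an informal sanity check of ours, not part of the proof:

```python
import math
import random

random.seed(1)
for _ in range(1000):
    x = random.uniform(0.0, 50.0)
    y = random.uniform(0.0, 50.0)
    # (x + y)/2 >= sqrt(xy), the two-variable AM-GM
    assert (x + y) / 2 >= math.sqrt(x * y) - 1e-12
# Equality exactly when x == y:
assert (7.0 + 7.0) / 2 == math.sqrt(7.0 * 7.0)
```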
'''Powers of Two''': We use induction. Suppose that AM-GM is true for <math>n</math> variables; we will then prove that the inequality is true for <math>2n</math>. Let <math>x_1, x_2, \ldots, x_{2n}</math> be any list of nonnegative reals. Then, because the two lists <math>x_1, x_2, \ldots, x_n</math> and <math>x_{n+1}, x_{n+2}, \ldots, x_{2n}</math> have <math>n</math> variables each, <cmath>\frac{x_1 + x_2 + \cdots + x_n}{n} \geq \sqrt[n]{x_1 x_2 \cdots x_n} \textrm{ and } \frac{x_{n+1} + x_{n+2} + \cdots + x_{2n}}{n} \geq \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}.</cmath> Adding these two inequalities together and dividing by two yields <cmath>\frac{x_1 + x_2 + \cdots + x_{2n}}{2n} \geq \frac{\sqrt[n]{x_1 x_2 \cdots x_n} + \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}}{2}.</cmath> Then by AM-GM in two variables, <cmath>\frac{\sqrt[n]{x_1 x_2 \cdots x_n} + \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}}{2} \geq \sqrt[2n]{x_1 x_2 \cdots x_{2n}}.</cmath> Combining this inequality with the previous one yields unweighted AM-GM in <math>2n</math> variables, with one exception: equality.

For equality, note that every inequality mentioned must be an equality as well; thus, equality holds if and only if all the numbers in <math>x_1, x_2, \ldots, x_n</math> are the same, all the numbers in <math>x_{n+1}, x_{n+2}, \ldots, x_{2n}</math> are the same, and <math>\sqrt[n]{x_1 x_2 \cdots x_n} = \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}</math>. From here, it is straightforward to show that this implies <math>x_1 = x_2 = \cdots = x_{2n}</math>, which is the equality condition for unweighted AM-GM in <math>2n</math> variables.

This completes the induction and implies that unweighted AM-GM holds for all powers of two.

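The doubling argument can be mirrored computationally: split a list of length <math>2^k</math> in half, average the halves' inequalities, and apply two-variable AM-GM to the halves' geometric means. This sketch and the helper names <code>am</code>, <code>gm</code>, and <code>check_power_of_two</code> are ours:

```python
import math

def am(xs):
    """Arithmetic mean."""
    return sum(xs) / len(xs)

def gm(xs):
    """Geometric mean."""
    return math.prod(xs) ** (1 / len(xs))

def check_power_of_two(xs):
    """Recursively verify AM-GM for a list whose length is a power of two,
    mirroring the doubling step of the proof."""
    if len(xs) == 2:
        return am(xs) >= gm(xs) - 1e-9
    half = len(xs) // 2
    left, right = xs[:half], xs[half:]
    # Sum of the two halves' inequalities, divided by two:
    step1 = am(xs) >= (gm(left) + gm(right)) / 2 - 1e-9
    # Two-variable AM-GM applied to the halves' geometric means:
    step2 = (gm(left) + gm(right)) / 2 >= gm(xs) - 1e-9
    return step1 and step2 and check_power_of_two(left) and check_power_of_two(right)

assert check_power_of_two([1.0, 4.0, 2.0, 8.0, 3.0, 5.0, 7.0, 6.0])
```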
'''Backward Step''': Assume that AM-GM holds for <math>n</math> variables; we will deduce it for <math>n-1</math>. Letting <math>x_n = \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}</math>, we have that <cmath>\frac{x_1 + x_2 + \cdots + x_{n-1} + \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}}{n} \geq \sqrt[n]{x_1 x_2 \cdots x_{n-1}\left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right)}.</cmath> Equality holds if and only if <math>x_1 = x_2 = \cdots = x_{n-1} = \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}</math>. However, note that the last equality is implied if all the numbers of <math>x_1, x_2, \ldots, x_{n-1}</math> are the same; thus, equality holds if and only if <math>x_1 = x_2 = \cdots = x_{n-1}</math>.

We then simplify the left-hand side. Multiplying the numerator and denominator of the fraction by <math>n-1</math> and combining like terms, we get that <cmath>\frac{x_1 + x_2 + \cdots + x_{n-1} + \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}}{n} = \frac{nx_1 + nx_2 + \cdots + nx_{n-1}}{n(n-1)} = \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}.</cmath> Thus, <cmath>\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1} \geq \sqrt[n]{x_1 x_2 \cdots x_{n-1}\left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right)}.</cmath> Raising both sides to the <math>n</math>th power yields <cmath>\left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right)^n \geq x_1 x_2 \cdots x_{n-1}\left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right).</cmath> From here, we divide both sides by <math>\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}</math> and take the <math>(n-1)</math>th root to get <cmath>\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1} \geq \sqrt[n-1]{x_1 x_2 \cdots x_{n-1}}.</cmath> Every step we took preserves our earlier equality condition, so the inequality holds for <math>n-1</math> variables. This completes the backward induction, which proves the unweighted AM-GM Inequality, as required. <math>\square</math>

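The backward step's key substitution can also be checked numerically. In this sketch of ours, the helper <code>backward_step_check</code> (a hypothetical name) appends the mean of <math>n-1</math> numbers and confirms the collapse to the <math>(n-1)</math>-variable inequality:

```python
import math

def backward_step_check(xs):
    """Append the mean of the given n-1 positive reals and confirm that
    n-variable AM-GM on the extended list collapses to the (n-1)-variable
    inequality, as in the backward step of the proof."""
    m = sum(xs) / len(xs)               # x_n := mean of the first n-1 numbers
    extended = xs + [m]
    am_ext = sum(extended) / len(extended)
    assert abs(am_ext - m) < 1e-9       # the left-hand side simplifies to m
    # n-variable AM-GM gives m**n >= prod(xs) * m; dividing by m and taking
    # the (n-1)th root leaves the (n-1)-variable inequality:
    assert m >= math.prod(xs) ** (1 / len(xs)) - 1e-9
    return True

assert backward_step_check([1.0, 3.0, 5.0, 9.0])
```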
− | |||
− | < | ||
− | |||
− | |||
− | |||
− | <cmath> \ | ||
− | |||
− | |||
− | |||
− | <cmath> \frac{\ | ||
− | |||
− | <cmath> \ | ||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
=== Proof by Rearrangement ===
Define the <math>n</math> sequences <math>\{ r_{i,j}\}_{i=1}^{n}</math>, for <math>1 \le j \le n</math>, as <math>r_{i,j} = \sqrt[n]{a_i}</math> for all integers <math>1 \le i \le n</math>. Evidently these sequences are similarly sorted. Then by the [[Rearrangement Inequality]],
<cmath> \sum_i a_i = \sum_i \prod_j r_{i,j} \ge \sum_i \prod_j r_{i+j,j} = n \prod_i \sqrt[n]{a_i} , </cmath>
where we take our indices modulo <math>n</math>, with equality exactly when all the <math>r_{i,j}</math>, and therefore all the <math>a_i</math>, are equal. Dividing both sides by <math>n</math> gives the desired inequality. <math>\blacksquare</math>

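The two products in this proof can be tabulated directly: the aligned products sum to <math>\sum_i a_i</math>, while the cyclically shifted products each equal the geometric mean. This sketch (the function name <code>rearrangement_identity</code> is ours) verifies the identity:

```python
import math

def rearrangement_identity(a):
    """Build the n sequences r[i][j] = a_i**(1/n) and compare the aligned
    sum (which equals sum a_i) with the cyclically shifted sum (which
    equals n times the geometric mean)."""
    n = len(a)
    r = [[a[i] ** (1 / n) for j in range(n)] for i in range(n)]
    aligned = sum(math.prod(r[i][j] for j in range(n)) for i in range(n))
    shifted = sum(math.prod(r[(i + j) % n][j] for j in range(n)) for i in range(n))
    return aligned, shifted

a = [2.0, 5.0, 11.0, 3.0]
aligned, shifted = rearrangement_identity(a)
assert abs(aligned - sum(a)) < 1e-9
assert abs(shifted - len(a) * math.prod(x ** (1 / len(a)) for x in a)) < 1e-9
assert aligned >= shifted - 1e-9   # the Rearrangement Inequality instance
```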
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
=== Proof by Calculus ===

We will start the proof by considering the function <math>f(x) = \ln x - x</math>, whose maximum we now find using calculus. To locate the critical points of <math>f</math>, we find <math>f'(x)</math> and set it equal to <math>0</math>. By the linearity of the derivative, <math>f'(x) = \frac{1}{x} - 1</math>. We need <math>\frac{1}{x} - 1 = 0</math>, so <math>x = 1</math>. Note that this is the only critical point of <math>f</math>. We can confirm it is a maximum by checking the sign of the second derivative: <math>f''(x) = -\frac{1}{x^2}</math>, and letting <math>x = 1</math> we get <math>f''(1) = -1 < 0</math>, so <math>x = 1</math> is a maximum, with <math>f(1) = \ln 1 - 1 = -1</math>. Now that we have that <math>-1</math> is the maximum of <math>f</math>, we can safely say that <math>\ln x - x \le -1</math> for all <math>x > 0</math>, or in other words <math>\ln x \le x - 1</math>.

We will now define a few more things and do some manipulations with them. Let <math>A = \frac{a_1 + a_2 + \cdots + a_n}{n}</math>; notice that <math>\frac{a_1}{A} + \frac{a_2}{A} + \cdots + \frac{a_n}{A} = n</math>. This fact will come into play later. Now let <math>x = \frac{a_i}{A}</math> and plug this into the bound above: <cmath>\ln \frac{a_i}{A} \le \frac{a_i}{A} - 1.</cmath> Adding all <math>n</math> of these results together, we get <cmath>\ln \frac{a_1 a_2 \cdots a_n}{A^n} \le n - n = 0.</cmath> Now exponentiating both sides, we get <math>\frac{a_1 a_2 \cdots a_n}{A^n} \le 1</math>, that is, <cmath>\sqrt[n]{a_1 a_2 \cdots a_n} \le \frac{a_1 + a_2 + \cdots + a_n}{n}.</cmath> This proves the AM-GM inequality. <math>\blacksquare</math>

=== Alternate Proof by Induction ===

Because <math>(a-b)^2</math> is a square, <math>(a-b)^2 \geq 0</math>. Now, through algebra: <cmath>a^2-2ab+b^2 \geq 0</cmath> <cmath>a^2+2ab+b^2 \geq 4ab</cmath> <cmath>\frac{(a+b)^2}4 \geq ab</cmath> <cmath>\frac{a+b}2 \geq \sqrt{ab}</cmath>
From here, we proceed with the induction in the proof by Cauchy Induction above (the ''Powers of Two'' and ''Backward Step'' arguments) to show the theorem is true. <math>\blacksquare</math>
+ | |||
+ | == Proof of Weighted AM-GM == | ||
+ | |||
+ | === Proof by Convexity === | ||
+ | |||
+ | We note that the [[function]] <math>x \mapsto \ln x</math> is strictly [[concave]]. Then by [[Jensen's Inequality]], | ||
+ | <cmath> \ln \sum_i \lambda_i a_i \ge \sum_i \lambda_i \ln a_i = \ln \prod_i a_i^{\lambda_i} , </cmath> | ||
+ | with equality if and only if all the <math>a_i</math> are equal. | ||
+ | Since <math>x \mapsto \ln x</math> is a strictly increasing function, it then follows that | ||
+ | <cmath> \sum_i \lambda_i a_i \ge \prod_i a_i^{\lambda_i}, </cmath> | ||
+ | with equality if and only if all the <math>a_i</math> are equal, as desired. <math>\blacksquare</math> | ||
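The Jensen step, that the logarithm of the weighted mean dominates the weighted mean of the logarithms, can be spot-checked on random inputs. This is an informal sketch of ours, not part of the proof:

```python
import math
import random

random.seed(2)
for _ in range(500):
    n = random.randint(2, 5)
    a = [random.uniform(0.1, 10.0) for _ in range(n)]
    w = [random.random() + 0.01 for _ in range(n)]
    lam = [x / sum(w) for x in w]          # normalized weights
    # Jensen for the concave logarithm:
    lhs = math.log(sum(l * x for l, x in zip(lam, a)))
    rhs = sum(l * math.log(x) for l, x in zip(lam, a))
    assert lhs >= rhs - 1e-9
```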
+ | |||
+ | === Alternate Proof by Convexity === | ||
+ | |||
+ | ''This proof is due to [[G. Pólya]].'' | ||
+ | |||
+ | Note that the function <math>f:x \mapsto e^x</math> is strictly convex. Let <math>g(x)</math> be the line tangent to <math>f</math> at <math>(0,1)</math>; then <math>g(x) = x+1</math>. Since <math>f</math> is also a [[continuous]], [[differentiable]] function, it follows that <math>f(x) \ge g(x)</math> for all <math>x</math>, with equality exactly when <math>x=0</math>, i.e., | ||
+ | <cmath> 1+x \le e^x, </cmath> | ||
+ | with equality exactly when <math>x=0</math>. | ||
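The tangent-line bound is easy to check at a handful of points; this quick sketch of ours is only an illustration:

```python
import math

# The tangent-line bound: 1 + x <= e^x for all real x, equality only at x = 0.
for x in [-5.0, -1.0, -0.5, 0.0, 0.5, 1.0, 5.0]:
    assert 1 + x <= math.exp(x) + 1e-12
assert 1 + 0.0 == math.exp(0.0)    # equality at the tangency point
```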
+ | |||
+ | Now, set | ||
+ | <cmath> r_i = a_i/\biggl( \sum_{j=1}^n \lambda_j a_j \biggr) - 1, </cmath> | ||
+ | for all integers <math>1\le i \le n</math>. Our earlier bound tells us that | ||
+ | <cmath> r_i +1 \le \exp(r_i), </cmath> | ||
+ | so | ||
+ | <cmath> (r_i +1)^{\lambda_i} \le \exp(\lambda_i r_i) .</cmath> | ||
Multiplying <math>n</math> such inequalities gives us

<cmath>\prod_{i=1}^n (r_i + 1)^{\lambda_{i}} \le \prod_{i=1}^n \exp (\lambda_i r_i) .</cmath>

Evaluating the left-hand side:

<cmath>\prod_{i=1}^n (r_i + 1)^{\lambda_{i}} = \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \bigl(\sum_{j=1}^n \lambda_j a_j\bigr)^{\sum_{i=1}^n \lambda_i} } = \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \sum_{j=1}^n \lambda_j a_j } ,</cmath>

since

<cmath> \sum_{i=1}^n \lambda_i = 1 .</cmath>
+ | |||
+ | Evaluating the right hand side: | ||
+ | |||
+ | <cmath> \prod_{i=1}^n \exp \lambda_i r_i = \prod_{i=1}^n \exp \left(\frac{a_i \lambda_i}{ \sum_{i=1}^n \lambda_j a_j} - \lambda_i\right) = \exp \left(\frac{\sum_{i=1}^n a_i \lambda_i}{ \sum_{i=1}^n \lambda_j a_j} - \sum_{i=1}^n \lambda_i\right) = \exp 0 = 1 .</cmath> | ||
+ | |||
+ | Substituting the results for the left and right sides: | ||
+ | |||
+ | <cmath> \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \sum_{j=1}^n \lambda_j a_j } \le 1 </cmath> | ||
+ | |||
+ | <cmath> \prod_{i=1}^n a_i^{\lambda_i} \le \sum_{j=1}^n \lambda_j a_j = \sum_{i=1}^n \lambda_i a_i , </cmath> | ||
+ | |||
+ | as desired. <math>\blacksquare</math> | ||
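Pólya's construction can be traced numerically: the right-hand side telescopes to <math>\exp 0 = 1</math>, and the left-hand side equals the ratio of the weighted geometric mean to the weighted arithmetic mean. This sketch of ours uses one concrete weight vector:

```python
import math

a = [3.0, 1.0, 4.0, 1.5]
lam = [0.1, 0.2, 0.3, 0.4]          # weights summing to 1

wam = sum(l * x for l, x in zip(lam, a))   # weighted arithmetic mean
r = [x / wam - 1 for x in a]               # the r_i of the proof

# Right-hand side: the exponents sum to 0, so the product is exp(0) = 1.
rhs = math.prod(math.exp(l * ri) for l, ri in zip(lam, r))
assert abs(rhs - 1.0) < 1e-9

# Left-hand side: the product equals (weighted GM) / (weighted AM).
lhs = math.prod((ri + 1) ** l for ri, l in zip(r, lam))
gm_w = math.prod(x ** l for x, l in zip(a, lam))
assert abs(lhs - gm_w / wam) < 1e-9

# Hence the weighted GM is at most the weighted AM.
assert lhs <= rhs + 1e-9
```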
[[Category:Inequality]]
Revision as of 14:56, 19 December 2021