Muirhead's Inequality


Muirhead's Inequality states that if a sequence $p$ majorizes a sequence $q$, then given a set of positive reals $x_1,x_2,\cdots,x_n$: \[\sum_{\text{sym}} {x_1}^{p_1}{x_2}^{p_2}\cdots {x_n}^{p_n}\geq \sum_{\text{sym}} {x_1}^{q_1}{x_2}^{q_2}\cdots {x_n}^{q_n}\]
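The statement can be sanity-checked numerically (this is not a proof). The sketch below, with helper names of our choosing (`sym_sum`, `majorizes`), evaluates the symmetric sums for a sample majorizing pair of exponent sequences at random positive points:

```python
# Numeric sanity check of Muirhead's inequality (not a proof).
# Helper names (sym_sum, majorizes) are illustrative, not standard.
from itertools import permutations
from math import prod
import random

def sym_sum(exponents, xs):
    """Sum of x_1^{e_1} ... x_n^{e_n} over all permutations of the exponents."""
    return sum(prod(x ** e for x, e in zip(xs, perm))
               for perm in permutations(exponents))

def majorizes(p, q):
    """True if p majorizes q: sorted partial sums of p dominate those of q,
    with equal total sums."""
    p, q = sorted(p, reverse=True), sorted(q, reverse=True)
    partial = 0.0
    for a, b in zip(p, q):
        partial += a - b
        if partial < -1e-12:
            return False
    return abs(partial) < 1e-9

p, q = (5, 1), (4, 2)
assert majorizes(p, q)
for _ in range(1000):
    xs = [random.uniform(0.1, 3.0) for _ in range(2)]
    assert sym_sum(p, xs) >= sym_sum(q, xs) - 1e-9
```

For $n=2$ the symmetric sum for $(5,1)$ is exactly $x^5y+y^5x$, matching the example below.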

Example

The inequality is easier to understand through an example. Since the sequence $(5,1)$ majorizes $(4,2)$ (as $5\geq 4$ and $5+1=4+2$), Muirhead's inequality states that for any positive $x,y$,

\[x^5y^1+y^5x^1\geq x^4y^2+y^4x^2\]

Usage on Olympiad Problems

A common brute-force technique with inequalities is to clear denominators, multiply everything out, and apply Muirhead's or Schur's inequality. However, it is worth noting that any inequality that can be proved directly with Muirhead can also be proved using the Arithmetic Mean-Geometric Mean inequality. In fact, IMO gold medalist Thomas Mildorf says it is unwise to use Muirhead in an Olympiad solution; one should use an application of AM-GM instead. Thus, it is suggested that Muirhead be used only to verify that an inequality can be proved with AM-GM before demonstrating the full AM-GM proof.

Example revisited

To understand the proof further below, let's also prove the inequality $x^5y^1+y^5x^1\geq x^4y^2+y^4x^2$ from the (weighted) AM-GM inequality.

\begin{align*} x^5y^1+y^5x^1&=\frac{3}{4}\left(x^5y^1+y^5x^1\right)+\frac{1}{4}\left(x^5y^1+y^5x^1\right)\\ &=\left(\frac{3}{4}x^5y^1+\frac{1}{4}x^1y^5\right)+\left(\frac{3}{4}y^5x^1+\frac{1}{4}y^1x^5\right)\\ &\geq x^{\frac{3}{4}\cdot5+\frac{1}{4}\cdot1}y^{\frac{3}{4}\cdot1+\frac{1}{4}\cdot5} + y^{\frac{3}{4}\cdot5+\frac{1}{4}\cdot1}x^{\frac{3}{4}\cdot1+\frac{1}{4}\cdot5}\\ &=x^4y^2+y^4x^2 \end{align*}

where the step with the inequality consists in applying the weighted AM-GM inequality to each expression in parentheses.

The coefficients $3/4$ and $1/4$, and the grouping of terms according to the permutations $(5,1)$ and $(1,5)$, come from the following arithmetic facts.

Note that $\begin{bmatrix}4\\2\end{bmatrix}=T\begin{bmatrix}5\\1\end{bmatrix}$, where the matrix $T=\begin{bmatrix}3/4&1/4\\1/4&3/4\end{bmatrix}$. This matrix has non-negative entries and all its rows and columns add up to $1$. Such matrices are called doubly stochastic matrices. We can write $T=\frac{3}{4}\begin{bmatrix}1&0\\0&1\end{bmatrix}+\frac{1}{4}\begin{bmatrix}0&1\\1&0\end{bmatrix}$. Matrices like $P_1=\begin{bmatrix}1&0\\0&1\end{bmatrix}$ and $P_2=\begin{bmatrix}0&1\\1&0\end{bmatrix}$, with exactly one $1$ in each row and each column and $0$ everywhere else, are called permutation matrices. Observe how $P_1\begin{bmatrix}5\\1\end{bmatrix}=\begin{bmatrix}5\\1\end{bmatrix}$ and $P_2\begin{bmatrix}5\\1\end{bmatrix}=\begin{bmatrix}1\\5\end{bmatrix}$ gives us all permutations of $\begin{bmatrix}5\\1\end{bmatrix}$.

So, we can write

\begin{align*} \begin{bmatrix}4\\2\end{bmatrix}&=T\begin{bmatrix}5\\1\end{bmatrix}\\ &=\left(\frac{3}{4}\begin{bmatrix}1&0\\0&1\end{bmatrix}+\frac{1}{4}\begin{bmatrix}0&1\\1&0\end{bmatrix}\right)\begin{bmatrix}5\\1\end{bmatrix}\\ &=\frac{3}{4}\begin{bmatrix}5\\1\end{bmatrix}+\frac{1}{4}\begin{bmatrix}1\\5\end{bmatrix} \end{align*}
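The matrix identities above are easy to verify numerically. The short check below confirms that $T$ is doubly stochastic, that $T=\frac{3}{4}P_1+\frac{1}{4}P_2$, and that $T$ maps $(5,1)$ to $(4,2)$:

```python
# Verify the matrix facts from the text: T is doubly stochastic,
# T = (3/4)P1 + (1/4)P2, and T maps (5, 1) to (4, 2).
T  = [[0.75, 0.25], [0.25, 0.75]]
P1 = [[1, 0], [0, 1]]   # identity permutation matrix
P2 = [[0, 1], [1, 0]]   # swap permutation matrix

# every row and every column sums to 1, entries are nonnegative
assert all(abs(sum(row) - 1) < 1e-12 for row in T)
assert all(abs(sum(T[i][j] for i in range(2)) - 1) < 1e-12 for j in range(2))
assert all(T[i][j] >= 0 for i in range(2) for j in range(2))

# T = 3/4 * P1 + 1/4 * P2, entrywise
assert all(abs(T[i][j] - (0.75 * P1[i][j] + 0.25 * P2[i][j])) < 1e-12
           for i in range(2) for j in range(2))

# T applied to the column vector (5, 1) gives (4, 2)
p = (5, 1)
q = tuple(sum(T[i][j] * p[j] for j in range(2)) for i in range(2))
assert q == (4.0, 2.0)
```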

A single inequality that follows from Muirhead's inequality could be proven from the AM-GM inequality in multiple ways. Another way is

\begin{align*} \frac{x^4+x^4+x^4+y^4}{4}&\geq\sqrt[4]{x^{12}y^4}=x^3y\\ \frac{x^4+y^4+y^4+y^4}{4}&\geq\sqrt[4]{x^4y^{12}}=xy^3 \end{align*} Adding these, \[x^4+y^4\geq x^3y+xy^3\] Multiplying both sides by $xy$ (as both $x$ and $y$ are positive), \[x^5y+xy^5\geq x^4y^2+x^2y^4\] as desired.

Proof

Given $p=(p_1,p_2,\ldots,p_n)\in\mathbb{R}^n$, let's write \[[p]=\frac{1}{n!}\sum_{\text{sym}}x_1^{p_1}x_2^{p_2}\dotsm x_n^{p_n}\] and let's write $p\succ q$ to denote that $p$ majorizes $q=(q_1,q_2,\ldots,q_n)\in\mathbb{R}^n$. This notation is useful in the context of Muirhead's inequality. In particular, the theorem can be stated as: $p\succ q$ implies $[p]\geq [q]$.

Even though $p$ and $q$ are written above as $n$-tuples, it will be convenient to think of them as column vectors.

The first goal of the proof is to show that there is a doubly stochastic matrix $D$ such that $q=Dp$. The Birkhoff-von Neumann theorem then tells us that there are $c_1,c_2,\ldots,c_{n!}\geq0$ such that $\sum_{i=1}^{n!}c_i=1$ and $D=\sum_{i=1}^{n!}c_iP_i$, where the $P_i$ are permutation matrices.

Note that once we have such an expression for $D$, then, since $[P_ip]=[p]$ for every permutation matrix $P_i$ (the symmetric sum is unchanged by permuting the exponents), \begin{align*} [p]&=\sum_{i=1}^{n!}c_i[P_ip]\\ &\geq\left[\sum_{i=1}^{n!}c_iP_ip\right]\\ &=[Dp]\\ &=[q] \end{align*} where the step with the inequality (like in the example above) consists of applying the weighted AM-GM inequality, grouping the terms from each element of the sum $\sum_{i=1}^{n!}$, corresponding to each permutation of the variables $x_1,x_2,\ldots,x_n$. Remember that the brackets $[\cdot]$ also represent a summation $\frac{1}{n!}\sum_{\text{sym}}$.

Before showing how to produce the matrix $D$, note that the computational aspect of the Birkhoff-von Neumann theorem is worth knowing, since it gives us an algorithm to compute $c_1,c_2,\ldots,c_{n!}$.
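One standard algorithm is greedy peeling: repeatedly find a permutation supported on the positive entries of $D$, subtract the largest possible multiple of the corresponding permutation matrix, and repeat until $D$ is exhausted. The sketch below uses brute force over all $n!$ permutations to find the support (fine for small $n$); the function name is ours:

```python
# A sketch of the computational side of the Birkhoff-von Neumann theorem:
# greedily peel permutation matrices off a doubly stochastic matrix D.
# Brute-force search over all n! permutations is used for the support,
# which is fine for small n.  Function name is illustrative.
from itertools import permutations

def birkhoff_decompose(D, tol=1e-12):
    n = len(D)
    D = [row[:] for row in D]   # work on a copy
    terms = []                  # list of (c_i, sigma_i) pairs
    while True:
        # find a permutation sigma with D[i][sigma[i]] > 0 for all i
        sigma = next((s for s in permutations(range(n))
                      if all(D[i][s[i]] > tol for i in range(n))), None)
        if sigma is None:
            break
        # subtract c * P_sigma, where c is the smallest entry used
        c = min(D[i][sigma[i]] for i in range(n))
        terms.append((c, sigma))
        for i in range(n):
            D[i][sigma[i]] -= c
    return terms

terms = birkhoff_decompose([[0.75, 0.25], [0.25, 0.75]])
# the coefficients form a convex combination: they sum to 1
assert abs(sum(c for c, _ in terms) - 1) < 1e-9
```

On the matrix $T$ from the example this recovers exactly the decomposition $\frac{3}{4}P_1+\frac{1}{4}P_2$.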

To produce the matrix $D$ we construct a sequence $p=r_0, r_1, r_2, \ldots, r_k=q$ of $n$-tuples (or rather column vectors) such that $p\succ r_1\succ r_2\succ\ldots\succ r_k=q$, each $r_i$ has at least one more component equal to a component of $q$ than its predecessor $r_{i-1}$, and $r_{i}=T_ir_{i-1}$ for some doubly stochastic matrix $T_i$. We can then take $D=T_k\dotsm T_2T_1$ to get $q=Dp$, and since a product of doubly stochastic matrices is again doubly stochastic, we get what we wanted.

In the case $p=q$ there is nothing to prove; Muirhead's inequality is an equality in this case. So, assume that $p\succ q$ and $p\neq q$.

Define

\begin{align*} j&=\min\{i:\ p_i>q_i\}\\ k&=\min\{i:\ p_i<q_i\}\\ b&=\frac{p_j+p_k}{2}\\ d&=\frac{p_j-p_k}{2}\\ c&=\max\{|q_j-b|,\ |q_k-b|\} \end{align*}

Define the matrix $T=T_1$ with entries

\begin{align*} T_{j,j}&=\frac{d+c}{2d}\\ T_{k,k}&=\frac{d+c}{2d}\\ T_{j,k}&=\frac{d-c}{2d}\\ T_{k,j}&=\frac{d-c}{2d} \end{align*} all other diagonal entries $T_{i,i}$, for $i\notin\{j,k\}$, are equal to $1$, and all remaining entries are equal to $0$.

It is straightforward to verify from the definition that this matrix is doubly stochastic, that it satisfies $r_1=Tp$, that all components of $r_1$ before position $j$ and after position $k$ are equal to those of $p$, and that at least one of the components at position $j$ or $k$ has become equal to the corresponding component of $q$. Repeating this construction with $r_1,r_2,\ldots$ yields the matrices $T_2,T_3,\ldots,T_k$.
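One step of this construction can be sketched in code. The function below (a name of our choosing) builds $T$ from $p$ and $q$ exactly as defined above, assuming both are sorted in decreasing order as in the proof, and returns $T$ together with $r_1=Tp$:

```python
# A sketch of one step of the construction in the proof: build the
# doubly stochastic matrix T from p and q and apply it to p.
# Assumes p and q are sorted in decreasing order, as in the proof.
# Function name is illustrative.
def transfer_step(p, q, tol=1e-12):
    n = len(p)
    j = next(i for i in range(n) if p[i] > q[i] + tol)   # first p_i > q_i
    k = next(i for i in range(n) if p[i] < q[i] - tol)   # first p_i < q_i
    b = (p[j] + p[k]) / 2
    d = (p[j] - p[k]) / 2
    c = max(abs(q[j] - b), abs(q[k] - b))
    # start from the identity, then overwrite the four (j, k) entries
    T = [[1.0 if r == s else 0.0 for s in range(n)] for r in range(n)]
    T[j][j] = T[k][k] = (d + c) / (2 * d)
    T[j][k] = T[k][j] = (d - c) / (2 * d)
    r = [sum(T[i][s] * p[s] for s in range(n)) for i in range(n)]
    return T, r

T, r1 = transfer_step([5, 1], [4, 2])
# in this 2-variable example a single step already reaches q
assert r1 == [4.0, 2.0]
```

Here $j=0$, $k=1$, $b=3$, $d=2$, $c=1$, so $T$ is exactly the matrix $\begin{bmatrix}3/4&1/4\\1/4&3/4\end{bmatrix}$ from the example.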