Mock AIME I 2012 Problems/Problem 12

Revision as of 20:25, 15 March 2015 by Viperstrike (Solution)

Problem

Let $P(x)$ be a polynomial of degree 10 satisfying $P(x^2) = P(x)P(x-1)$. Find the maximum possible sum of the coefficients of $P(x)$.

Solution

Notice that if $a$ is a root of $P$, then setting $x = a$ in the given equation yields $P(a^2) = P(a)P(a-1) = 0$, and setting $x = a + 1$ yields $P((a+1)^2) = P(a+1)P(a) = 0$, so $a^2$ and $(a+1)^2$ must also be roots of $P$. Iterating, $a^{2^n}$ and $(a+1)^{2^n}$ are roots of $P$ for every $n \ge 1$. Since a polynomial has only finitely many roots, each of these sequences takes only finitely many values, which forces each of $a$ and $a + 1$ to be either $0$ or a root of unity. The values $a = 0$ and $a = -1$ are impossible: either one makes $1$ a root of $P$, and then $(1+1)^2 = 4$, $(4+1)^2 = 25, \ldots$ gives infinitely many roots. Hence $|a| = |a+1| = 1$, so $a$ lies on both the unit circle centered at $0$ and the unit circle centered at $-1$; these circles meet only at $a = -\tfrac{1}{2} \pm \tfrac{\sqrt{3}}{2}i$. These two values are complex conjugates, and their minimal polynomial is $x^2 + x + 1$. Any power of this base polynomial satisfies the functional equation, so the degree-$10$ choice is $P(x) = (x^2 + x + 1)^5$, making the sum of the coefficients $P(1) = 3^5 = \boxed{243}$.
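The claim that every power of the base polynomial satisfies the functional equation can be verified directly:

```latex
\[
P(x)\,P(x-1) = (x^2+x+1)^5\,\big((x-1)^2+(x-1)+1\big)^5
             = (x^2+x+1)^5\,(x^2-x+1)^5,
\]
\[
(x^2+x+1)(x^2-x+1) = (x^2+1)^2 - x^2 = x^4+x^2+1,
\]
\[
\text{so } P(x)\,P(x-1) = (x^4+x^2+1)^5 = P(x^2),
\quad\text{and}\quad P(1) = (1+1+1)^5 = 3^5 = 243.
\]
```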

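As a quick sanity check, one can also expand both sides of the identity with coefficient arithmetic. The sketch below is plain Python; the helper names `poly_mul` and `poly_pow` are ad hoc, not from any library.

```python
from math import comb

def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists (lowest degree first)."""
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def poly_pow(p, n):
    """Raise a polynomial (coefficient list) to the n-th power."""
    r = [1]
    for _ in range(n):
        r = poly_mul(r, p)
    return r

# P(x) = (x^2 + x + 1)^5, coefficients lowest degree first
P = poly_pow([1, 1, 1], 5)

# P(x^2): the coefficient of x^k moves to x^(2k)
P_x2 = [0] * (2 * len(P) - 1)
for k, c in enumerate(P):
    P_x2[2 * k] = c

# P(x - 1): expand sum_k c_k (x - 1)^k via the binomial theorem
P_shift = [0] * len(P)
for k, c in enumerate(P):
    for j in range(k + 1):
        P_shift[j] += c * comb(k, j) * (-1) ** (k - j)

assert poly_mul(P, P_shift) == P_x2  # P(x^2) = P(x)P(x-1)
print(sum(P))                        # sum of the coefficients -> 243
```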
Remark: the uniqueness step requires that $a$ and $a + 1$ differ by exactly $1$ as complex numbers, not merely in real part. A pair of roots of unity such as $(i, 1)$ has real parts differing by $1$, but $1 \neq i + 1$, so it is not of the form $(a, a + 1)$; the condition $|a| = |a+1| = 1$ rules out all such pairs.