Mock AIME I 2012 Problems/Problem 12

Revision as of 00:56, 25 November 2016 by Djmathman (talk | contribs)

Problem

Let $P(x)$ be a polynomial of degree 10 satisfying $P(x^2) = P(x)P(x-1)$. Find the maximum possible sum of the coefficients of $P(x)$.

Solution

Notice that if $a$ is a root of $P$, then $P(a^2) = P(a)P(a-1) = 0$, so $a^2$ is also a root of $P$; similarly, setting $x = a + 1$ gives $P((a+1)^2) = P(a+1)P(a) = 0$, so $(a+1)^2$ is a root as well. Iterating, $a^{2^n}$ and $(a+1)^{2^n}$ must be roots of $P$ for all $n \ge 0$. Since a polynomial has only finitely many roots, each of these sequences takes finitely many values, which forces $|a| = |a + 1| = 1$, i.e. both $a$ and $a + 1$ are roots of unity. (The case $a = 0$ fails: then $1$ would be a root, and so would $2^2 = 4$, $5^2 = 25$, and so on without bound.) The only complex numbers at distance $1$ from both $0$ and $-1$ are $a = -1/2 \pm i\sqrt{3}/2$, the primitive cube roots of unity. The set $\{-1/2 + i\sqrt{3}/2,\ -1/2 - i\sqrt{3}/2\}$ is closed under both maps $a \mapsto a^2$ and $a \mapsto (a+1)^2$, and its minimal polynomial is $x^2 + x + 1$. One checks directly that $x^2 + x + 1$ satisfies the functional equation, and hence so does any power of it, so the degree-$10$ choice is $P(x) = (x^2 + x + 1)^5$, making the sum of the coefficients $P(1) = 3^5 = \boxed{243}$.
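
The claims above can be sanity-checked numerically. The sketch below (not part of the original solution; it assumes the `sympy` library) verifies that $P(x) = (x^2+x+1)^5$ satisfies $P(x^2) = P(x)P(x-1)$ identically and that its coefficient sum $P(1)$ is $243$.

```python
# Sanity check for the solution: verify the functional equation
# P(x^2) = P(x) * P(x - 1) for P(x) = (x^2 + x + 1)^5, and compute
# the sum of coefficients P(1).  Requires the sympy library.
import sympy as sp

x = sp.symbols('x')
P = (x**2 + x + 1)**5

# The difference P(x^2) - P(x) * P(x - 1) should expand to the zero polynomial.
difference = sp.expand(P.subs(x, x**2) - P * P.subs(x, x - 1))
assert difference == 0

# The sum of the coefficients of a polynomial is its value at x = 1.
coeff_sum = P.subs(x, 1)
assert coeff_sum == 243
print(coeff_sum)  # 243
```

The same check confirms that lower powers $(x^2+x+1)^k$ also satisfy the equation, which is why the degree condition is what pins down $k = 5$.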