# Formal Groups and Where to Find Them

In 1946, S. Bochner published the paper Formal Lie Groups, in which he noted that several classical theorems (due to Sophus Lie) concerning infinitesimal transformations on Lie groups continue to hold when the (convergent) power series locally representing the group law is replaced by a suitable formal analogue.  It was not long before this formalism found far-reaching uses in algebraic number theory and algebraic topology.

Unfortunately, few students see more than two or three explicit (i.e. closed form) group laws before stumbling into the deep end of abstract nonsense.  In this article, we’ll see in a rigorous sense why this must be the case, providing along the way a complete classification of polynomial and rational formal group laws (over any reduced ring).

— PART I —

Following Bochner, a formal group law over a commutative ring $R$ (with unity) is a bivariate power series $F(x,y) \in R[[x,y]]$ such that the following two properties hold:

1. $F(x,y)=x+y+O(x,y)^2$;
2. $F(F(x,y),z)=F(x,F(y,z))$,

in which we borrow the O-notation $O(x,y)^n$ to denote an element of the ideal $(x,y)^n \subset R[[x,y]]$.  On occasion, we’ll stress the fact that $F(x,y)$ is a formal group law by writing $F(x,y)=x+_F\!y$.  Then (2) clearly implies that $+_F$ is an associative binary operation, while (1) states that $+_F$ acts locally (i.e. to first order) like “normal” addition on $R[[x,y]]$.
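In practice, axioms (1) and (2) can be checked by machine.  A minimal sketch using sympy, with the polynomial law $F(x,y)=x+y+xy$ as test case (for a genuine power series one would instead truncate modulo $(x,y)^N$):

```python
# A quick machine check of axioms (1)-(2) for a candidate group law,
# here F(x, y) = x + y + x*y (a sketch; any candidate series would do).
from sympy import symbols, expand

x, y, z = symbols('x y z')

def F(u, v):
    return u + v + u*v

# (1): F(x, y) - (x + y) lies in (x, y)^2 -- here it is exactly x*y.
assert expand(F(x, y) - (x + y)) == x*y

# (2): associativity, an exact polynomial identity in this case.
assert expand(F(F(x, y), z) - F(x, F(y, z))) == 0
```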

The reader may note that (1-2) fall a few axioms short of the well-known group axioms.  Actually, though, no further axioms are needed: the existence of an identity and of inverses is a consequence of (1-2).  To be precise,

Proposition: Let $F(x,y)$ be a formal group law.  Then

1. $F(x,0)=x=F(0,x)$;
2. There exists $g \in R[[x]]$ such that $F(x,g)=F(g,x)=0$.

Proof: An exercise in the (formal) Implicit Function Theorem. $\square$
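The Proposition is effective: solving $F(x,g)=0$ one degree at a time constructs the inverse series.  A sketch using sympy for the law $F(x,y)=x+y+xy$, where the answer should agree with $-x/(1+x)$ (the truncation order $N=6$ is an arbitrary choice):

```python
# Solve F(x, g(x)) = 0 degree by degree; for F(x, y) = x + y + x*y the
# inverse series should agree with -x/(1 + x) = -x + x^2 - x^3 + ...
from sympy import symbols, expand, series

x = symbols('x')
N = 6  # truncation order (arbitrary)

def F(u, v):
    return u + v + u*v

g = -x  # the first-order term is forced by axiom (1)
for n in range(2, N):
    err = expand(F(x, g)).coeff(x, n)  # obstruction in degree n
    g -= err * x**n                    # kill it; lower degrees are unaffected

assert expand(g - series(-x/(1 + x), x, 0, N).removeO()) == 0
```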

Although we won’t need (or prove) this result, we note that all formal group laws over a ring with no non-zero torsion nilpotents (e.g. over any reduced ring, or any $\mathbb{Q}$-algebra) are commutative.  (This result is sometimes known as Lazard’s Theorem.)

By one measure, the simplest of all formal group laws are those lying inside $R[x,y]$, i.e. those given by polynomials (versus formal power series). Here, two common examples come to mind:

Example 1: The simplest of all formal group laws is the additive formal group law, given by $F(x,y)=x+y$.  Less obvious is the multiplicative formal group law, $F(x,y)=x+y+xy$, which gives – for example – the group law for multiplication on $R$ in shifted coordinates: $(1+x)(1+y)=1+F(x,y)$.

These two examples have something in common: each is a member of the one-dimensional family

$F_\alpha(x,y):= x+y+\alpha xy, \qquad \alpha \in R$

of formal group laws.  And, as our first Theorem shows, these often exhaust the polynomial formal group laws over $R$.
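That $F_\alpha$ is associative for every $\alpha$ at once can be seen by a single symbolic computation, treating $\alpha$ as an indeterminate (a sketch using sympy; the identity then holds over any commutative ring):

```python
from sympy import symbols, expand

x, y, z, a = symbols('x y z alpha')
F = lambda u, v: u + v + a*u*v

# The identity holds in Z[alpha, x, y, z], hence over any commutative ring.
assert expand(F(F(x, y), z) - F(x, F(y, z))) == 0
```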

Theorem: Suppose that $R$ is reduced (i.e. has trivial nilradical), and let $F(x,y) \in R[x,y]$ be a polynomial formal group law over $R$.  Then $F=F_\alpha$ for some $\alpha \in R$.

Proof: Suppose that $F(x,y)$ is given by the polynomial

$F(x,y)=\sum a_{ij} x^i y^j \in R[x,y]$,

and let $d=\mathrm{deg}_x F(x,y)$, the degree of $F$ in $x$.  Then $\mathrm{deg}_x F(F(x,y),z)$ is at most $d^2$, with the coefficient of $x^{d^2}$ equal to

$\displaystyle\left(\sum_{k} a_{dk} y^k\right)^d\left( \sum_{j} a_{dj} z^j\right)$.

Now, suppose that this coefficient is $0$.  Taking $z =y$, it follows that $\left(\sum_k a_{dk} y^k\right)^{d+1}=0$, so that $\sum_k a_{dk} y^k \in R[y]$ is nilpotent.  By the Exercises, it follows that $a_{dk} \in \mathrm{nil}(R)$ for all $k$.  The nilradical of $R$ is trivial by hypothesis, so that $a_{dk}=0$ for all $k$. This contradicts that $\mathrm{deg}_x F(x,y)=d$, so that $\mathrm{deg}_x F(F(x,y),z)$ is exactly $d^2$.  On the other hand, a quick calculation shows that $\mathrm{deg}_x F(x,F(y,z))$ is at most $d$.  Condition (2) forces $d \leq 1$, and condition (1) gives $d=1$ exactly.

Likewise, the identity $F(z,F(x,y))=F(F(z,x),y)$ shows that $\mathrm{deg}_y F(x,y)=1$.  Thus $F(x,y)$ fits the form

$F(x,y) = a_{00}+a_{10}x+a_{01}y+a_{11}xy \in R[x,y]$,

whereupon condition (1) gives our result. $\square$

Thus, we see that polynomial formal group laws are unavoidably plain in the case of reduced rings.  (The Exercises contain further examples of polynomial formal group laws, over non-reduced rings.)

— PART II (RATIONAL GROUP LAWS) —

Ever on the hunt for simple examples, we now turn our attention to formal group laws defined by rational functions.  I mentioned previously that few students see more than three formal group laws expressed in closed form.  The third example is often the following:

Example 2: Let $R$ be a commutative ring and define

$G_\beta(x,y)=\displaystyle\frac{x+y}{1+\beta x y}, \qquad \beta \in R$,

considered as formal power series in $R[[x,y]]$.  (If $R$ is a field, we may view $G_\beta(x,y)$ as an element of $R(x,y)$ without incident.  In general, though, we are studying the localization of $R[x,y]$ with respect to the multiplicatively closed set $R^\times + (x,y)R[x,y]$.)

It can be shown that $G_\beta(x,y)$ defines a formal group law over $R$.  Readers may recognize two special cases of $G_\beta$, in that

$G_{-1}(\tan x, \tan y)=\tan (x+y)$;

$G_{1}(\tanh x, \tanh y)=\tanh (x+y)$.

(These may be obtained from each other via Osborn’s rule.)  As a remark, $G_{1}(x,y)$ also gives the law for adding (collinear) velocities, measured in units of $c$ (the speed of light), in the framework of special relativity.
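These identities are easy to spot-check numerically (the sample values $0.3$ and $0.4$ are arbitrary):

```python
# Floating-point spot checks: G_{-1} reproduces the tangent addition law,
# G_{1} the hyperbolic one (equivalently, velocity addition with c = 1).
import math

def G(beta, x, y):
    return (x + y) / (1 + beta * x * y)

a, b = 0.3, 0.4
assert math.isclose(G(-1, math.tan(a), math.tan(b)), math.tan(a + b))
assert math.isclose(G(1, math.tanh(a), math.tanh(b)), math.tanh(a + b))
```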

Actually, the one-dimensional family of formal group laws $G_\beta(x,y)$ (as well as the family given after Example 1) belongs to a two-parameter family

$\displaystyle F_{\alpha,\beta}(x,y) = \frac{x+y+\alpha xy}{1+\beta x y}$,

defined over any commutative ring $R$.  As our next Theorem shows, these often exhaust the rational group laws over $R$:

Theorem: Let $R$ be a reduced (commutative) ring.  If $F(x,y)$ is a rational formal group law over $R$, then $F=F_{\alpha,\beta}$ for some constants $\alpha,\beta \in R$.
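Before turning to the proof, we can at least verify the membership claim: associativity of $F_{\alpha,\beta}$ is an identity of rational functions, checkable in one line (a sketch using sympy; cancel clears denominators):

```python
from sympy import symbols, cancel

x, y, z, a, b = symbols('x y z alpha beta')
F = lambda u, v: (u + v + a*u*v) / (1 + b*u*v)

# Both sides simplify to (x+y+z + a*e2 + (a**2 + b)*x*y*z) / (1 + b*e2 + a*b*x*y*z),
# where e2 = x*y + x*z + y*z; in particular the result is symmetric in x, y, z.
assert cancel(F(F(x, y), z) - F(x, F(y, z))) == 0
```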

Note: This Theorem dates from 1976, in Rational Formal Group Laws, the doctoral dissertation of Robert Bismuth [1].  Unfortunately, his proof clocks in at around 30 pages, and – in my opinion – fails to address the material in a conceptual way.

Here, we present instead (a significantly expanded version of) a later proof, due to R. Coleman and F. McGuinness [3].  This proof, published under the by-now-familiar title Rational Formal Group Laws, holds when $R$ is a field of characteristic $0$.  (The full proof may be thereafter obtained using techniques from [1].)

Proof: For the moment, let us assume that $R$ (a field of characteristic $0$) is algebraically closed.  Let $g(x) = F(x,x)$ (a rational function), and define

$\displaystyle F_2(x,y) = \frac{\partial}{\partial y} F(x,y)$.

With this, we define the $1$-form $\omega = dx/F_2(x,0)$, which, we claim, satisfies $g^*\omega = 2\omega$.  To see this, we recall the definition of the pullback:

$\displaystyle g^*\omega_t(v) := \omega_{g(t)} \left(\mathbf{d}g(v)\right)=\omega_{g(t)} \left(v g'(t)\right)=\frac{dx(v g'(t))}{F_2(g(t),0)}=\frac{v g'(t)}{F_2(g(t),0)}.$

To simplify this last expression, recall that $F(F(x,y),z)=F(x,F(y,z))$. Applying $\partial/\partial z$ at $z=0$ and setting $x=y$, it follows by the chain rule that $F_2(F(x,x),0) = F_2(x,F(x,0))F_2(x,0)$.  Then, as $F(x,0)=x$, we obtain $F_2(g(t),0)=F_2(t,t)F_2(t,0)$.  Since $F$ is commutative (recall that $R$ has characteristic $0$), the chain rule also gives $g'(t)=2F_2(t,t)$, so that

$\displaystyle g^*\omega_t(v) =\frac{v g'(t)}{F_2(t,0)F_2(t,t)}=\frac{2v}{F_2(t,0)}=2\omega_t(v)$,

as claimed.  Next, let $Y$ (resp. $Z$) denote the set of poles (resp. zeros) of $\omega$. With the equation $g^*\omega =2\omega$, it follows that $g^{-1}Y=Y$ and $g^{-1}Z=Z$. Moreover, viewing $g$ as a branched cover $\mathbb{P}^1(R) \to \mathbb{P}^1(R)$, we have

$\displaystyle\sum_{P \in g^{-1}(Q)} \left(1+\mathrm{ord}_P g^*\omega\right)=\deg g \left(1+\mathrm{ord}_Q \omega\right)$

$\Longrightarrow\displaystyle \sum_{P \in g^{-1}(Q)} \mathrm{ord}_P g^*\omega = \deg g \cdot\mathrm{ord}_Q \omega + \left(\deg g - \# g^{-1}(Q)\right)$.

The right-hand side is bounded above by $\mathrm{ord}_Q \omega$ (as $\mathrm{ord}_Q \omega \leq -1$ for each pole $Q$, while $\#g^{-1}(Q) \geq 1$); summing over $Q \in Y$ gives

$\displaystyle\sum_{Q \in Y} \sum_{P \in g^{-1}(Q)} \mathrm{ord}_P g^* \omega = \deg g \sum_{Q \in Y} \mathrm{ord}_Q \omega + \#Y(\deg g - 1) \leq \sum_{Q \in Y} \mathrm{ord}_Q \omega$,

in which we have used that $\sum_{Q \in Y} \#g^{-1}(Q)=\#g^{-1}(Y)=\#Y$ (as $g^{-1}Y=Y$).

The double sum at left is simply $\sum_{Q \in Y} \mathrm{ord}_Q \omega$: since $g^*\omega=2\omega$ and $g^{-1}Y=Y$, the form $g^*\omega$ has the same poles and zeros (with the same orders) as $\omega$.  It follows that the inequality in the previous line is an equality, i.e.

$\deg g\,\mathrm{ord}_Q \omega + \left( \deg g - \#g^{-1}(Q)\right)=\mathrm{ord}_Q \omega$

for all $Q \in Y$.  This equality can be written suggestively as

$\left( \deg g -1\right)\left(\mathrm{ord}_Q \omega+1\right)=\#g^{-1}(Q)-1$,                     (1)

in which we note that our left-hand side is non-positive and our right-hand side is non-negative (as $g$ surjects).  Thus each is $0$, and it follows that either $\deg g =1$ or $\mathrm{ord}_Q \omega =-1$ for all $Q \in Y$ (i.e. $\omega$ admits only simple poles).

In this first case, $g$ is a linear fractional transformation fixing $0$ (of infinite order, because $g^*\omega=2\omega$), so each power of $g$ has a unique non-zero fixed point.  As $\#Y$ is finite and $g^{-1}Y=Y$, there exist integers $n,k$ such that $g^n(Q)=g^{n+k}(Q)$, hence $Q$ is fixed by $g^k$.  On the other hand, any fixed point of $g$ is fixed by $g^k$, so that $g$ fixes $Q$ by uniqueness.  It follows that $Q$ is the unique pole of $\omega$. Likewise, we may show that $Z=\varnothing$, since $g^{-1}Z=Z$. That is, $\omega \in R(x)\, dx$ has one pole (necessarily of order $2$, as the divisor of a rational $1$-form on $\mathbb{P}^1$ has degree $-2$) and no zeros, so that $\omega=L^*(dx)$, in which $L$ is some linear fractional transformation of $R$ fixing $0$.

In the second case, fix $Q \in Y$.  Then $g^{-1}(Q)$ consists of a single point (by (1)), say $P$.  A local calculation gives

$\mathrm{Res}_P 2\omega=\mathrm{Res}_P g^*\omega = \deg g \cdot \mathrm{Res}_Q \omega$.                       (2)

As in the previous case, there exist an integer $k>0$ and some $Q \in Y$ such that $g^k(Q)=Q$. For this $Q$, induction on (2) gives $2^k\mathrm{Res}_Q \omega = \deg(g)^k \mathrm{Res}_Q \omega$, hence $\deg g =2$ (because $\deg g \in \mathbb{Z}^{\geq 0}$ and $\mathrm{Res}_Q \omega \neq 0$).  If $P \in Y$ is not a branch point, then $\#g^{-1}(P)=2$.  Then $\#g^{-1} Y > \# Y$, a contradiction. Thus $Y$ (and similarly, $Z$) is contained in the branch locus of $g$, so that in particular $\# Y + \# Z \leq 2$.  As $Y$ is non-empty (because $\omega \neq 0$), and as the Residue Theorem rules out a single simple pole, we obtain $\#Y=2$: that is, $\omega$ has two (distinct) simple poles and no zeros.  It follows that $\omega = L^*(c\, dx/(x+1))$, in which $L$ is a linear fractional transformation fixing $0$ and $c \in R$ (as $R$ is algebraically closed).

In either case, we may write $F(x,y)=L^{-1}\left(G(Lx,Ly)\right)$: in the first case with $G(x,y)=x+y$; in the second, with $G(x,y)=x+y+xy$.  (I.e. $F$ is isomorphic (see the Exercises) to either the additive or multiplicative formal group law on $R$.)  Regardless, simplification yields

$\displaystyle F(x,y)=\frac{x+y+\alpha xy}{1+\beta x y}$,                                 (3)

with $\alpha, \beta \in R$.  This concludes our proof when $R$ is algebraically closed.  In the general case, fix an embedding of $R$ into an algebraic closure, and consider which rational functions of the form (3) lie in $R(x)$. $\square$
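As a postscript, the two differential identities that powered the pullback computation, $F_2(g(t),0)=F_2(t,t)F_2(t,0)$ and $g'(t)=2F_2(t,t)$, can be spot-checked symbolically on the family $F_{\alpha,\beta}$ (a sketch using sympy):

```python
from sympy import symbols, diff, cancel

t, z, a, b = symbols('t z alpha beta')
F = lambda u, v: (u + v + a*u*v) / (1 + b*u*v)
F2 = lambda u, v: diff(F(u, z), z).subs(z, v)   # partial derivative in the 2nd slot

g = F(t, t)
assert cancel(F2(g, 0) - F2(t, t) * F2(t, 0)) == 0   # F_2(g(t),0) = F_2(t,t) F_2(t,0)
assert cancel(diff(g, t) - 2 * F2(t, t)) == 0        # g'(t) = 2 F_2(t,t)
```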

— PART III (ALGEBRAIC GROUP LAWS) —

At the rate we’re going, a better name for this post might be Formal Groups (And Where Not to Find Them).  And so, to find the examples we seek, we shift our attention one final time: from rational formal group laws to algebraic formal group laws, i.e. group laws $F(x,y) \in R[[x,y]]$ such that there exists a non-zero polynomial $P \in R[w,x,y]$ satisfying

$P(F(x,y),x,y)=0 \in R[[x,y]]$.

Example 3: Using the addition formula for sine, we see that the function

$\displaystyle F(x,y)=x \sqrt{1-y^2}+y\sqrt{1-x^2}=\sum_{k=0}^\infty \textstyle\binom{1/2}{k}(-1)^k\left(yx^{2k}+xy^{2k}\right)$,

defines a formal group law over the ring $R:=\mathbb{Z}[1/2]$.  Moreover, $F$ satisfies the following polynomial relation:

$F(x,y)^4-2(x^2+y^2-2x^2y^2)F(x,y)^2+(x^4-2x^2y^2+y^4)=0$.

Thus $F(x,y)$ is an algebraic formal group law over $R$.
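This quartic relation can be verified symbolically: since $\sqrt{1-x^2}$ squared auto-simplifies to $1-x^2$, plain expansion suffices (a sketch using sympy):

```python
from sympy import symbols, sqrt, expand

x, y = symbols('x y')
F = x*sqrt(1 - y**2) + y*sqrt(1 - x**2)

rel = F**4 - 2*(x**2 + y**2 - 2*x**2*y**2)*F**2 + (x**4 - 2*x**2*y**2 + y**4)
assert expand(rel) == 0  # the square-root terms cancel upon expansion
```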

Example 4: Fix a ring $R$, and consider the set

$S(R):=\{(x,y) \in R^2 \mid x^2+y^2=1\}$.

For $R =\mathbb{R}$, multiplication (in the complex sense) gives a group operation on $S(R)$, which makes $S(R)$ into a Lie group with identity $(1,0)$.  Near the identity, we obtain a chart for $S(R)$ of the form $(x,y)\mapsto y$.  Under these coordinates, the group law ($\oplus$) takes the form

$y_1 \oplus y_2 =y_1 \sqrt{1-y_2^2}+y_2 \sqrt{1-y_1^2}$,

i.e. $y_1 \oplus y_2 = F(y_1,y_2)$, where $F$ is as in Example 3.  That is, we have seen that $F$ arises as the group law of an algebraic group (with respect to local coordinates).  Consistent with the terminology of Bochner, a formal group law $F$ arising in this way is said to be a formal algebraic group (cf. formal Lie group).
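A quick numeric check of this correspondence (the sample points, taken near the identity $(1,0)$, are arbitrary):

```python
# Multiply two unit-circle points as complex numbers and compare the
# y-coordinate of the product with F applied to the y-coordinates.
import cmath, math

p = cmath.exp(0.3j)   # points on the unit circle near (1, 0)
q = cmath.exp(0.4j)
y1, y2 = p.imag, q.imag

F = y1 * math.sqrt(1 - y2**2) + y2 * math.sqrt(1 - y1**2)
assert math.isclose(F, (p * q).imag)
```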

These two Examples are indicative of a general rule, established by R. Coleman in 1986 [2]:

Theorem: Let $R$ be an algebraically closed field of characteristic zero.  If $F(x,y)$ is an algebraic formal group law over $R$, then $F$ is algebraically isomorphic to a formal algebraic group.

Proof: See Coleman [2]. $\square$

That is, not only do algebraic groups give rise to algebraic formal groups, all such formal groups (up to isomorphism) appear in this form.  (The caveat “up to isomorphism” is necessary, as a formal power series need not converge.)

And so, we finally have an answer to our question: if you’re looking for “simple” (in an algebraic sense) examples of formal group laws, look to the theory of algebraic groups.  Not only will you find some great examples (e.g. matrix groups and elliptic curves), you’d be hard-pressed to find anything quite as simple.

— EXERCISES —

Exercise: A homomorphism $f: F \to G$ of formal groups over $R$ is a power series $f(x) \in R[[x]]$ without constant term such that $f(F(x,y)) =G(f(x),f(y))$.  Show that a homomorphism $f:F_\alpha \to F_\beta$ of polynomial formal groups over $R$ exists if $\beta \mid \alpha$.  It follows that associate elements define isomorphic formal groups.  Show that the converse need not hold.  Hint: when is $e^x-1$ defined over $R[[x]]$?

Exercise: If $f : F \to G$ is a homomorphism of polynomial formal groups over $R$ and $f$ is a polynomial, we’ll say that $f$ is a p-homomorphism of polynomial formal groups.  Similarly, $f$ is a p-isomorphism if $f$ admits a polynomial inverse.  Show that the p-isomorphism classes of polynomial formal groups are classified by the equivalence classes of associate elements in $R$.

Exercise: Let $R$ be a commutative ring.  Show that the nilradical of $R[x]$ is $\mathrm{nil}(R)[x]$.  Hint: the inclusion “$\supset$” is easy; for the converse, proceed by induction on degree.  (Solution can be found on Project Crazy Project.)

Exercise: Fix a nonzero integer $n$, and let $R$ denote the ring $R=\mathbb{Z}/(n^2)$. Show that

$F(x,y)=x+y+nxy(x+y) \in R[x,y]$

defines a polynomial formal group law, which is not of the form $F_\alpha(x,y)$. Find a polynomial formal group law over the ring $\mathbb{Z}/(n^3)$, not of the form $F_\alpha(x,y)$.
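In both parts of this exercise, associativity of a candidate law can be tested symbolically: expand the associativity defect with $n$ as an indeterminate and check divisibility by the appropriate power of $n$.  A sketch using sympy for the cubic candidate $x+y+nxy(x+y)$:

```python
from sympy import symbols, expand

x, y, z, n = symbols('x y z n')
F = lambda u, v: u + v + n*u*v*(u + v)

d = expand(F(F(x, y), z) - F(x, F(y, z)))
# the associativity defect is divisible by n^2 ...
assert d.coeff(n, 0) == 0 and d.coeff(n, 1) == 0
# ... but non-zero, so the law is associative whenever n^2 = 0 in R
assert d != 0
```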

Exercise: Given the result of the main Theorem of Part II, find necessary and sufficient conditions for each of the two cases therein to occur.  Are rational formal group laws in general (rationally) isomorphic to the additive formal group law, or the multiplicative formal group law?

— REFERENCES —