## Lyapunov-Schmidt reduction and stability

I discussed the process of Lyapunov-Schmidt reduction in a previous post. Here I give an extension of that to treat the question of stability, again following the book of Golubitsky and Schaeffer. My interest in this question has the following source. Suppose we have a system of ODE $\dot y+F(y,\alpha)=0$ depending on a parameter $\alpha$. The equation for steady state solutions is then $F(y,\alpha)=0$. Sometimes we can eliminate all but one of the variables to obtain an equation $g(x,\alpha)=0$ for a real variable $x$ whose solutions are in one-to-one correspondence with those of the original equation for steady states. Clearly this situation is closely related to Lyapunov-Schmidt reduction in the case where the linearization has corank one. Often the reduced equation is much easier to treat than the original one, and this can be used to obtain information on the number of steady states of the ODE system, for instance in the study of multistationarity in systems of ODE arising as models in molecular biology. In that context we would like more refined information related to multistability. In other words, we would like to know something about the stability of the steady states produced by the reduction process. Stability is a dynamical property and so it is not a priori clear that it can be investigated by looking at the equation for steady states on its own; different ODE systems can have the same set of steady states. Note, however, that in the case of hyperbolic steady states the stability of a steady state is determined by the eigenvalues of the linearization of the function $F$ at that point. Golubitsky and Schaeffer prove the following remarkable result. (It seems clear that results of this kind were previously known in some form, but I have not yet found an earlier source with a clear statement of this result free from many auxiliary complications.)
Suppose that we have a bifurcation point $(y_0,\alpha_0)$ where the linearization of $F$ has a zero eigenvalue of multiplicity one and all other eigenvalues have positive real part. (With the sign convention $\dot y+F(y,\alpha)=0$, eigenvalues of the linearization of $F$ with positive real part correspond to decaying modes.) Let $x_0$ be the corresponding zero of $g$. The result is that if $g(x)=0$ and $g'(x)\ne 0$ then for $x$ close to $x_0$ the linearization of $F$ about the corresponding steady state has a unique eigenvalue close to zero and its sign is the same as that of $g'(x)$. Thus the stability of steady states arising at the bifurcation point is determined by the function $g$.
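To make this concrete, here is a small computational check on a toy example of my own invention (the system is not taken from the post or from Golubitsky and Schaeffer): the variable $z$ is eliminated symbolically to obtain the reduced function $g$, and then at each steady state the eigenvalue of $DF$ closest to zero is compared with $g_x$.

```python
# Toy example (hypothetical), with the convention
#   y' + F(y, a) = 0,   F(x, z, a) = (x**3 - a*x + z,  z + x**2).
# First eliminate z symbolically to get the reduced function g, then check
# numerically that at each steady state the eigenvalue of DF closest to
# zero has the same sign as g_x.
import numpy as np
import sympy as sp

xs, zs, as_ = sp.symbols('x z a', real=True)
F1 = xs**3 - as_*xs + zs
F2 = zs + xs**2

# Elimination: solve F2 = 0 for z and substitute into F1.
z_of_x = sp.solve(sp.Eq(F2, 0), zs)[0]          # z = -x**2
g = sp.expand(F1.subs(zs, z_of_x))              # g(x, a) = x**3 - x**2 - a*x
g_x = sp.lambdify((xs, as_), sp.diff(g, xs))

def DF(x, a):     # Jacobian of F at the steady state (x, -x**2)
    return np.array([[3*x**2 - a, 1.0],
                     [2*x,        1.0]])

# At (x, a) = (0, 0), DF has the simple eigenvalue 0; the other eigenvalue
# is 1 > 0, a decaying mode in this sign convention.
a = 0.05                                        # close to the bifurcation
roots = np.roots([1.0, -1.0, -a, 0.0])          # zeros of g(x, a)
for x in sorted(np.real(roots[np.isreal(roots)])):
    eigs = np.linalg.eigvals(DF(x, a))
    mu = np.real(eigs[np.argmin(np.abs(eigs))]) # eigenvalue closest to zero
    print(f"x = {x:+.4f}   mu = {mu:+.4f}   g_x = {g_x(x, a):+.4f}")
```

For this parameter value there are three steady states, and in each case the small eigenvalue of $DF$ has the sign of $g_x$, as the theorem predicts.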

I found the proof of this theorem hard to follow. I can understand the individual steps but I feel that I am still missing a global intuition for the strategy used. In this post I describe the proof and present the partial intuition I have for it. Close to the bifurcation point the unique eigenvalue close to zero, call it $\mu$, is a smooth function of $x$ and $\alpha$ because it has multiplicity one. The derivative $g'$ is also a smooth function. The aim is to show that $\mu$ and $g'$ have the same sign, which would be enough to prove the desired stability statement. Suppose that the gradient of $g'$ at $x_0$ is non-zero. Then the zero set of $g'$ is a submanifold in a neighbourhood of $x_0$. It turns out that $\mu$ vanishes on that manifold. If we could show that the gradient of $\mu$ is non-zero there then it would follow that the sign of $\mu$ off the manifold is determined by that of $g'$. With suitable sign conventions they are equal and this is the desired conclusion. The statement about the vanishing of $\mu$ is relatively easy to prove. Differentiating the basic equations arising in the Lyapunov-Schmidt reduction shows that, where $g'$ vanishes, the derivative of $F$ applied to the $x$-derivative of a function $\Omega$ arising in the reduction process is zero. Thus the derivative of $F$ has a zero eigenvalue, with eigenvector $\Omega_x$, and this eigenvalue can only be $\mu$, for by the continuous dependence of eigenvalues no other eigenvalue can come close to zero in a neighbourhood of the bifurcation point.
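The computation behind the vanishing statement can be sketched as follows. The notation is my reconstruction of the reduction data from the Golubitsky-Schaeffer setup ($E$ the projection onto the range of the linearization, $v_0$ a kernel vector, $v_0^*$ a functional annihilating the range), so the details should be checked against the book:

```latex
% Reconstructed reduction data: \Omega(x,\alpha) = x v_0 + W(x,\alpha) with
%   E\,F(\Omega(x,\alpha),\alpha) = 0   (defines W) and
%   g(x,\alpha) = \langle v_0^*, F(\Omega(x,\alpha),\alpha)\rangle.
% Differentiating both relations with respect to x gives
\begin{align*}
  E\,DF(\Omega,\alpha)\,\Omega_x &= 0,\\
  g_x(x,\alpha) &= \langle v_0^*,\; DF(\Omega,\alpha)\,\Omega_x\rangle.
\end{align*}
% Thus DF\,\Omega_x lies in the one-dimensional complement of the range of
% the linearization, with component g_x there.  Where g_x = 0 this forces
% DF\,\Omega_x = 0: zero is an eigenvalue of DF with eigenvector \Omega_x,
% and near the bifurcation point it can only be the eigenvalue \mu.
```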

After this the argument becomes more complicated since in general the gradients of $g'$ and $\mu$ could be zero. This is circumvented by introducing a deformation of the original problem depending on an additional parameter $\beta$ and letting $\beta$ tend to zero at the end to recover the original problem. The deformed problem is defined by the function $\tilde F(y,\alpha,\beta)=F(y,\alpha)+\beta y$. Lyapunov-Schmidt reduction is applied to $\tilde F$ to get a function $\tilde g$. Let $\tilde\mu$ be the eigenvalue of $D\tilde F$ which is analogous to the eigenvalue $\mu$ of $DF$. From what was said above it follows that, in a notation which is hopefully clear, $\tilde g_x=0$ implies $\tilde\mu=0$. We now want to show that the gradients of these two functions are non-zero. Lyapunov-Schmidt theory includes a formula expressing $\tilde g_{x\beta}$ in terms of $\tilde F$. This formula allows us to prove that $\tilde g_{x\beta}(0,0,0)=\langle v_0^*,v_0\rangle>0$. Next we turn to the gradient of $\tilde \mu$, more specifically to the derivative of $\tilde \mu$ with respect to $\beta$. First it is proved that $\tilde\Omega (0,0,\beta)=0$ for all $\beta$. I omit the proof, which is not hard. Differentiating $\tilde F$ and evaluating at the point $(0,0,\beta)$ shows that $v_0$ is an eigenvector of $D\tilde F$ there with eigenvalue $\beta$. Hence $\tilde\mu (0,0,\beta)=\beta$ for all $\beta$. Putting these facts together shows that $\tilde\mu (\tilde \Omega,0,\beta)=\beta$ and that the derivative of $\tilde\mu (\tilde \Omega,0,\beta)$ with respect to $\beta$ at the bifurcation point is equal to one.
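The eigenvalue claim at $(0,0,\beta)$ is a one-line computation, since the deformation shifts the linearization by a multiple of the identity (I take the bifurcation point to be at the origin, as the evaluation points above suggest):

```latex
D_y\tilde F(y,\alpha,\beta) = D_yF(y,\alpha) + \beta I
\quad\Longrightarrow\quad
D_y\tilde F(0,0,\beta)\,v_0
  = \underbrace{D_yF(0,0)\,v_0}_{=\,0} + \beta v_0
  = \beta v_0 .
```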

We now use the following general fact. If $f_1$ and $f_2$ are two smooth functions, $f_2$ vanishes whenever $f_1$ does and the gradients of both functions are non-zero, then $f_2/f_1$ extends smoothly to the zero set of $f_1$ and the value of the extension there is given by the ratio of the gradients (which are necessarily proportional to each other). In our example we get $\tilde\mu(\tilde\Omega,\alpha,\beta)=\tilde a(x,\alpha,\beta)\tilde g_x(x,\alpha,\beta)$ with $\tilde a(0,0,0)=\left[\frac{\partial}{\partial\beta}\tilde\mu(\tilde\Omega,0,\beta)\big|_{\beta=0}\right]\big/\tilde g_{x\beta}(0,0,0)>0$. Setting $\beta=0$ in the first equation then gives the desired conclusion.
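The general fact can be checked on a simple instance (again an example of my own, not from the book): take $f_1=y-x^2$ and $f_2=\sin(y-x^2)$, which vanishes exactly where $f_1$ does.

```python
# Concrete instance (hypothetical) of the division fact: f2 vanishes
# wherever f1 does and both gradients are non-zero on {f1 = 0}, so f2/f1
# extends smoothly to that zero set, with value the ratio of the gradients.
import sympy as sp

x, y, t = sp.symbols('x y t', real=True)
f1 = y - x**2
f2 = sp.sin(y - x**2)

# Approach the zero set {y = x**2} along y = x**2 + t and let t -> 0:
# f2/f1 becomes sin(t)/t, whose limit is 1.
extension = sp.limit((f2 / f1).subs(y, x**2 + t), t, 0)

# On the zero set the gradients are proportional, with the same factor:
# grad f2 = cos(y - x**2) * grad f1, and cos(0) = 1.
factor = sp.simplify((sp.diff(f2, y) / sp.diff(f1, y)).subs(y, x**2))
print(extension, factor)       # both equal 1
```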
