I would like to locate Hopf bifurcations in Mathematica by minimizing the distance of the Jacobian's eigenvalues to the imaginary axis. Since I always start from a stable fixed point, it suffices to NMaximize the angle that the leading complex eigenvalue makes with the imaginary axis (negative while the point is stable), stopping as soon as that axis is crossed. I have an ODE system that admits a Hopf bifurcation, specified by RHS, var, par, and initial parameter values with a stable fixed point, as may be checked with the script fpHopf below:
(* SIR model with n = 3 recovered compartments *)
var = {s, i, r1, r2, r3};
par = {\[Lambda], \[Mu], \[Beta], \[Beta]1, \[Beta]2, \[Beta]3, \[Gamma], \[Gamma]1, \[Gamma]2, \[Gamma]3};
RHS = {\[Lambda] - \[Mu] s - \[Beta] s i + \[Gamma]3 r3,
   i (\[Beta] s + \[Beta]1 r1 + \[Beta]2 r2 + \[Beta]3 r3 - \[Gamma] - \[Mu]),
   \[Gamma] i - r1 (\[Beta]1 i + \[Gamma]1 + \[Mu]),
   \[Gamma]1 r1 - r2 (\[Beta]2 i + \[Gamma]2 + \[Mu]),
   \[Gamma]2 r2 - r3 (\[Beta]3 i + \[Gamma]3 + \[Mu])};
(* Example parameter values with a stable fixed point *)
p0Val = {0.02, 0.01, 0.3, 0.1, 0.05, 0.02, 0.1, 0.05, 0.03, 0.01};
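Before optimizing, the model at p0Val can be sanity-checked by direct integration: trajectories from an interior state should settle onto the stable equilibrium. A minimal sketch (the initial condition {0.5, 0.01, 0.1, 0.1, 0.1} is an arbitrary interior point, not taken from the model):

```mathematica
(* Integrate the ODE at p0Val from an arbitrary interior state and plot *)
With[{rhs = RHS /. Thread[par -> p0Val] /. Thread[var -> Through[var[t]]]},
 sol = NDSolve[
   Join[Thread[D[Through[var[t]], t] == rhs],
    Thread[Through[var[0]] == {0.5, 0.01, 0.1, 0.1, 0.1}]],
   Through[var[t]], {t, 0, 2000}]];
Plot[Evaluate[Through[var[t]] /. First[sol]], {t, 0, 2000},
 PlotLegends -> var]
```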
(* Simplified fpHopf: returns {positive fixed points, complex eigenvalues, angle (degrees), all eigenvalues} *)
fpHopf[RHS_, var_, par_, p0Val_] :=
  Module[{substitutionRules, eqns, allSols, posSols, eq, jac, eigs,
    complexEigs, upperEig, angle},
   substitutionRules = Thread[par -> p0Val];
   (* Solve RHS == 0 at the given parameter values and keep positive solutions *)
   eqns = Thread[(RHS /. substitutionRules) == 0];
   allSols = Quiet[NSolveValues[eqns, var, Reals]];
   allSols = Union[allSols, SameTest -> (Norm[#1 - #2] < 10^-10 &)];
   posSols = Select[allSols, AllTrue[#, # > 0 &] &];
   If[posSols === {}, Return[{{}, {}, -90, {}}]];
   (* Use the first positive solution for the eigenvalue analysis *)
   eq = posSols[[1]];
   jac = D[RHS, {var}] /. substitutionRules /. Thread[var -> eq];
   eigs = Chop[N[Eigenvalues[jac]]];
   complexEigs = Select[eigs, Im[#] != 0 &];
   (* Angle of the upper complex eigenvalue to the imaginary axis, in degrees *)
   angle = If[complexEigs =!= {},
     upperEig = First[Select[complexEigs, Im[#] > 0 &]];
     ArcTan[Re[upperEig]/Im[upperEig]]*180/Pi,
     -90];
   {posSols, complexEigs, angle, eigs}];
{posSols, complexEigs, angle, eigs} = fpHopf[RHS, var, par, p0Val]
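For reference, stability of the starting point can be confirmed from the outputs of the call above: every eigenvalue should have a negative real part, and the angle of the leading complex pair should be strictly negative (a quick check, assuming fpHopf found a positive fixed point):

```mathematica
(* Stability check at p0Val: both quantities should be negative *)
Max[Re[eigs]]  (* largest real part of any eigenvalue *)
angle          (* angle of the upper complex eigenvalue to the imaginary axis *)
```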
Next, I define an objective function that varies only the parameters whose positions are listed in optInd:
optInd = {2, 3}; (* positions of the parameters varied in the search for Hopf *)
optVars = par[[optInd]];
angleFunc[optInd_] :=
  fpHopf[RHS, var, par, ReplacePart[p0Val, Thread[optInd -> #]]][[3]] &;
angleFunc[optInd][{0.012, 0.3}]
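A quick way to see the failure mode described below is to evaluate the objective at a point farther from the starting values; where no positive fixed point exists, fpHopf returns {} and the angle collapses to -90 (the second point here is just a nearby guess, not a known Hopf point):

```mathematica
(* Objective near the starting point vs. a more distant guess *)
angleFunc[optInd][{0.012, 0.3}]  (* well-defined angle *)
angleFunc[optInd][{0.05, 0.45}]  (* may be -90 if no positive fixed point exists there *)
```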
Finally, I turn to NMaximize, supplying a good starting region:
Timing[TimeConstrained[
  NMaximize[{angleFunc[optInd][optVars],
    0.005 <= optVars[[1]] <= 0.2 && 0.25 <= optVars[[2]] <= 0.5},
   {{optVars[[1]], 0.011, 0.013}, {optVars[[2]], 0.28, 0.31}}, (* starting ranges *)
   MaxIterations -> 2], 10]]
But this yields {1.4375, $Aborted} with TimeConstrained; without it, the search runs forever and ends with a worse value than the starting point, as soon as fpHopf fails to find a fixed point and returns {}. Any idea how to fix this? (Incidentally, I am aware that the EcoEvo package does similar things, but I want a standalone function, for easy integration.)

The probable problem is that it is easy to step outside the region where a positive fixed point exists; there fpHopf returns {}, the angle drops to -90, and NMaximize gets lost. Fixing this directly would essentially amount to rewriting NMaximize... Is there another available optimization method, besides NMaximize and FindMaximum, that could be used?