This comment from Ben reminded me that lots of people are running nonlinear regressions using least squares and other unstable methods of point estimation.
You can do better, people!
Try stan_nlmer (in the rstanarm R package), which fits nonlinear models and also allows parameters to vary by group.
I think people have the sense that maximum likelihood or least squares is this rigorous, well-defined thing, and that Bayesian inference is flaky. The (mistaken) idea is that when using Bayesian inference you’re making extra assumptions and you’re trading robustness for efficiency.
Actually, though, Bayesian inference can be more robust than classical point estimation. They call it “regularization” for a reason!
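To see the regularization point concretely, here's a minimal sketch (in Python rather than R, with made-up data) of what a weak prior buys you. Two predictors are nearly collinear, so plain least squares is unstable: the coefficient estimates blow up. Adding a normal(0, 1) prior on the coefficients is, at the point-estimate level, just ridge regression, and it keeps the estimates sane.

```python
import numpy as np

# Hypothetical ill-conditioned example: two nearly collinear predictors.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 1e-6 * rng.normal(size=n)  # almost an exact copy of x1
X = np.column_stack([x1, x2])
y = x1 + rng.normal(scale=0.1, size=n)  # true model uses only x1

# Plain least squares: the near-collinearity makes the solution unstable,
# and the fitted coefficients are huge with opposite signs.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# MAP estimate under a normal(0, 1) prior on the coefficients.
# Minimizing ||y - X b||^2 + lam ||b||^2 has this closed form (ridge):
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

print("least squares:", beta_ols, "norm:", np.linalg.norm(beta_ols))
print("regularized: ", beta_ridge, "norm:", np.linalg.norm(beta_ridge))
```

The least-squares coefficients are enormous and cancel each other out; the regularized ones stay near the sensible answer of splitting the true coefficient across the two near-duplicate predictors. Full Bayes does better still (you get uncertainty, not just a point), but even this crude MAP version shows the prior acting as a stabilizer, not a liability.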
P.S. I’m not kidding that this can make a difference. Consider this bit from an article cited in the above-linked post:
The point here is not that there’s anything wrong with the above steps, just that they represent a lot of effort to get something that’s kinda clunky and unstable. The problem is simply a lack of awareness of existing software.