We can generalize Gaussian belief propagation to general elliptical laws: elliptical densities depend on the data only through Mahalanobis distance, so the Gaussian machinery carries over without presuming the Gaussian distribution (Agarwal et al. 2013; Davison and Ortiz 2019), giving a kind of elliptical belief propagation.
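To make the dependence concrete, here is a minimal sketch of the squared Mahalanobis distance, the only statistic of the data that an elliptical density uses; the function name is mine, not from any cited paper.

```python
import numpy as np

def mahalanobis_sq(x, mu, cov):
    """Squared Mahalanobis distance (x - mu)^T cov^{-1} (x - mu)."""
    r = x - mu
    # Solve rather than invert, for numerical stability.
    return float(r @ np.linalg.solve(cov, r))

mu = np.zeros(2)
cov = np.array([[2.0, 0.5], [0.5, 1.0]])
x = np.array([1.0, -1.0])
d2 = mahalanobis_sq(x, mu, cov)
```

Any elliptical log-density is then some scalar function of `d2` plus a normalizing constant; swapping that scalar function is what changes Gaussian BP into a different elliptical variant.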

## Robust

If we use a robust Huber loss instead of a Gaussian log-likelihood, the resulting algorithm is usually referred to as a *robust factor* or as *dynamic covariance scaling* (Agarwal et al. 2013; Davison and Ortiz 2019).
A nice side effect: the transition from the quadratic to the linear regime of the loss gives us an estimate of which observations are outliers.
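As a sketch of that outlier-flagging idea (names are mine, not from either cited paper): in an IRLS view, the Huber loss amounts to re-weighting each residual, and any residual whose weight drops below 1 has entered the linear regime, i.e. the loss is treating it as a likely outlier.

```python
import numpy as np

def huber_weight(residual, delta=1.0):
    """IRLS weight that turns a quadratic loss into a Huber loss."""
    a = abs(residual)
    # Quadratic regime inside |r| <= delta, linear (downweighted) outside.
    return 1.0 if a <= delta else delta / a

residuals = np.array([0.3, -0.8, 4.0, -12.0])
weights = [huber_weight(r) for r in residuals]
suspected_outliers = [r for r, w in zip(residuals, weights) if w < 1.0]
```

Dynamic covariance scaling uses a smoother scaling rule, but the effect is the same: the per-observation weight doubles as an outlier score.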

## Student-\(t\)

Surely this is around somewhere?
There is certainly a special case in the \(t\)-process.
It is mentioned, I think, in Lan et al. (2006) and possibly Proudler et al. (2007), although the latter seems to be something more *ad hoc*.
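If one did build a Student-\(t\) variant, the natural ingredient is the standard EM/IRLS weight for a multivariate \(t\) likelihood (a textbook identity, not taken from the papers above): in the normal–gamma mixture view, the posterior mean of the latent scale downweights large residuals smoothly, unlike the hard Huber kink.

```python
def student_t_weight(mahalanobis_sq, dof, dim):
    """EM weight (dof + dim) / (dof + m^2) for a multivariate Student-t.

    m^2 is the squared Mahalanobis distance of the residual; as m^2 grows
    the weight decays toward zero, giving soft outlier rejection.
    """
    return (dof + dim) / (dof + mahalanobis_sq)

w_inlier = student_t_weight(0.0, dof=4.0, dim=2)     # > 1: small residuals upweighted
w_outlier = student_t_weight(100.0, dof=4.0, dim=2)  # near 0: heavy downweighting
```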

## Gaussian mixture

Surely? TBD.

## Generic

There seem to be generic update rules (Aste 2021; Bånkestad et al. 2020) which could be used to construct a generic elliptical belief propagation algorithm.
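For orientation, here is a sketch of the Gaussian update such generic rules would generalize: one factor-to-variable message in information (canonical) form. The partitioning and names are mine; the Schur-complement marginalization itself is standard (see e.g. Davison and Ortiz 2019 for the Gaussian case).

```python
import numpy as np

def factor_to_variable_message(L, eta, incoming_L, incoming_eta):
    """Marginalize a two-block Gaussian factor onto its first block.

    L, eta: joint precision (2d x 2d) and information vector of the factor,
    ordered [target; other]. incoming_*: the message from the other variable,
    absorbed before marginalizing out that block.
    """
    d = L.shape[0] // 2
    Laa, Lab = L[:d, :d], L[:d, d:]
    Lba, Lbb = L[d:, :d], L[d:, d:] + incoming_L
    ea, eb = eta[:d], eta[d:] + incoming_eta
    # Schur complement of the "other" block = marginal precision of the target.
    msg_L = Laa - Lab @ np.linalg.solve(Lbb, Lba)
    msg_eta = ea - Lab @ np.linalg.solve(Lbb, eb)
    return msg_L, msg_eta

L = np.array([[2.0, -1.0], [-1.0, 2.0]])
eta = np.zeros(2)
msg_L, msg_eta = factor_to_variable_message(L, eta, np.array([[1.0]]), np.zeros(1))
```

An elliptical scheme would wrap an update like this in per-factor re-weighting (Huber, Student-\(t\), or otherwise), which is roughly what the generic rules promise to systematize.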

## References

*2013 IEEE International Conference on Robotics and Automation*, 62–69.

*arXiv:1910.14139 [Cs]*, October.

*arXiv:1310.7320 [Cs, Math, Stat]*, October.

*Journal of Guidance, Control, and Dynamics* 34 (2): 388–402.

*Computer Vision – ECCV 2006*, edited by Aleš Leonardis, Horst Bischof, and Axel Pinz, 3952:269–82. Berlin, Heidelberg: Springer Berlin Heidelberg.

*arXiv:2107.02308 [Cs]*, July.

*2007 15th International Conference on Digital Signal Processing*, 355–58.
