Quantisation of a system's state space, considered in the abstract: the idea is implicit in coding theory, compression, and mixture models. There is a natural connection to classification, where a learned discretisation of the feature space doubles as a class assignment.

A placeholder.
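To make the idea concrete, here is a minimal sketch of state-space quantisation in the vector-quantisation sense: a continuous trajectory is mapped to a short alphabet of codebook cells via Lloyd's algorithm (k-means). The trajectory, the codebook size `k=4`, and the helper `quantise_states` are all illustrative assumptions, not anything from the references above.

```python
import numpy as np

def quantise_states(states, k, iters=50, seed=0):
    """Lloyd's algorithm: map continuous states to k codebook cells.

    Illustrative helper, not from any cited paper.
    """
    rng = np.random.default_rng(seed)
    # initialise the codebook with k randomly chosen states
    centres = states[rng.choice(len(states), size=k, replace=False)]
    for _ in range(iters):
        # assign each state to its nearest codebook centre
        d = np.linalg.norm(states[:, None, :] - centres[None, :, :], axis=-1)
        labels = d.argmin(axis=1)
        # move each centre to the mean of its assigned states
        for j in range(k):
            if (labels == j).any():
                centres[j] = states[labels == j].mean(axis=0)
    return centres, labels

# Continuous trajectory of a noisy oscillator, quantised to 4 symbols.
t = np.linspace(0, 8 * np.pi, 2000)
traj = np.column_stack([np.cos(t), np.sin(t)])
traj += 0.05 * np.random.default_rng(1).normal(size=traj.shape)
centres, symbols = quantise_states(traj, k=4)
# `symbols` is now a discrete sequence standing in for the continuous states.
```

The discrete sequence `symbols` is the compressed description of the dynamics; downstream, it can feed a Markov model, a classifier, or an entropy-regularised learner of the eSPA flavour.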
References
Gerber, S., L. Pospisil, M. Navandar, and I. Horenko. 2020. “Low-Cost Scalable Discretization, Prediction, and Feature Selection for Complex Systems.” Science Advances 6 (5): eaaw0961.
Horenko, Illia. 2020. “On a Scalable Entropic Breaching of the Overfitting Barrier in Machine Learning.”
Horenko, Illia, Edoardo Vecchi, Juraj Kardoš, Andreas Wächter, Olaf Schenk, Terence J. O’Kane, Patrick Gagliardini, and Susanne Gerber. 2023. “On Cheap Entropy-Sparsified Regression Learning.” Proceedings of the National Academy of Sciences 120 (1): e2214972120.
Peluffo-Ordóñez, Diego H., John A. Lee, and Michel Verleysen. 2014. “Short Review of Dimensionality Reduction Methods Based on Stochastic Neighbour Embedding.” In Advances in Self-Organizing Maps and Learning Vector Quantization, 65–74. Springer.
Smola, Alex J., Robert C. Williamson, Sebastian Mika, and Bernhard Schölkopf. 1999. “Regularized Principal Manifolds.” In Computational Learning Theory, edited by Paul Fischer and Hans Ulrich Simon, 214–29. Lecture Notes in Computer Science 1572. Springer Berlin Heidelberg.
Vecchi, Edoardo, Lukáš Pospíšil, Steffen Albrecht, Terence J. O’Kane, and Illia Horenko. 2022. “eSPA+: Scalable Entropy-Optimal Machine Learning Classification for Small Data Problems.” Neural Computation 34 (5): 1220–55.