I could write about how it works, but for now I mostly care about implementations that are available to me.
How it works
A long story, of which I understand only tiny fragments.
However, let us consider how it might be solved with neural nets (Lample and Charton 2019). The linked method looks like a heinous hack at first glance, but perhaps it is suggestive of the potential of differentiable search in the future? This kind of neural automaton attracts much hype but ends up being very tricky in practice. Something like it is making gains in natural language processing, though, so stay tuned.
Maxima is a system for the manipulation of symbolic and numerical expressions, including differentiation, integration, Taylor series, Laplace transforms, ordinary differential equations, systems of linear equations, polynomials, sets, lists, vectors, matrices and tensors. Maxima yields high precision numerical results by using exact fractions, arbitrary-precision integers and variable-precision floating-point numbers. Maxima can plot functions and data in two and three dimensions. […]
Maxima is a descendant of Macsyma, the legendary computer algebra system developed in the late 1960s at the Massachusetts Institute of Technology. It is the only system based on that effort still publicly available and with an active user community, thanks to its open source nature. Macsyma was revolutionary in its day, and many later systems, such as Maple and Mathematica, were inspired by it.
SymPy is a Python library for symbolic mathematics. It aims to become a full-featured computer algebra system (CAS) while keeping the code as simple as possible in order to be comprehensible and easily extensible. SymPy is written entirely in Python.
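To give a flavour of the capabilities listed above, here is a minimal SymPy sketch covering differentiation, integration, and Taylor series (the expression chosen is just an arbitrary example, not anything from the SymPy docs):

```python
import sympy as sp

x = sp.symbols('x')

# Symbolic differentiation of an example expression
expr = sp.sin(x) * sp.exp(x)
d = sp.diff(expr, x)

# Integrating the derivative recovers the original (up to a constant)
back = sp.integrate(d, x)
print(sp.simplify(back - expr))  # → 0

# Truncated Taylor series around x = 0
print(sp.series(sp.cos(x), x, 0, 6))  # → 1 - x**2/2 + x**4/24 + O(x**6)
```

Being pure Python, this composes directly with the rest of the scientific Python stack, which is a large part of SymPy's appeal relative to standalone systems.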
Blazek, Paul J., and Milo M. Lin. 2020. “A Neural Network Model of Perception and Reasoning,” February. http://arxiv.org/abs/2002.11319.
Clark, Peter, Oyvind Tafjord, and Kyle Richardson. 2020. “Transformers as Soft Reasoners over Language.” In IJCAI 2020. http://arxiv.org/abs/2002.05867.
Lamb, Luis C., Artur Garcez, Marco Gori, Marcelo Prates, Pedro Avelar, and Moshe Vardi. 2020. “Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective.” In IJCAI 2020. http://arxiv.org/abs/2003.00330.
Lample, Guillaume, and François Charton. 2019. “Deep Learning for Symbolic Mathematics,” December. http://arxiv.org/abs/1912.01412.
Lozano-Duran, A., and M. Bassenne. n.d. “Towards Model Discovery with Reinforcement Learning.”