Computational symbolic mathematics

cheating at calculus by computer

I could write about how it works, but for now I mostly care about implementations that are available to me.

How it works

A long story, of which I understand only tiny fragments.

However, let us consider how the problem might be attacked with neural nets (Lample and Charton 2019). The linked method looks like a heinous hack at first glance, but perhaps it is suggestive of differentiable search to come. This kind of neural approach to symbolic reasoning attracts much hype and turns out to be tricky in practice; something similar is making gains in natural language processing, though, so stick around.
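The core trick in Lample and Charton's paper is to serialise expression trees into token sequences (prefix notation) and then train a sequence-to-sequence translator on (problem, solution) pairs. A minimal sketch of that serialisation step, using SymPy's expression trees — my own simplification, not their exact tokenizer:

```python
import sympy as sp

def to_prefix(expr):
    """Serialise a SymPy expression tree into prefix-order tokens,
    emitting each operator before its operands."""
    if expr.is_Atom:
        return [str(expr)]
    tokens = [type(expr).__name__]
    for arg in expr.args:
        tokens += to_prefix(arg)
    return tokens

x = sp.Symbol('x')
tokens = to_prefix(sp.sin(x) + x**2)
# e.g. ['Add', 'Pow', 'x', '2', 'sin', 'x'] (SymPy's argument ordering)
```

Once expressions are flat token sequences like this, "integrate this expression" becomes a translation problem for an off-the-shelf transformer.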


Famously, Mathematica and Maple are the expensive options with the big marketing budgets.

Other options?



Maxima is a system for the manipulation of symbolic and numerical expressions, including differentiation, integration, Taylor series, Laplace transforms, ordinary differential equations, systems of linear equations, polynomials, sets, lists, vectors, matrices and tensors. Maxima yields high precision numerical results by using exact fractions, arbitrary-precision integers and variable-precision floating-point numbers. Maxima can plot functions and data in two and three dimensions. […]

Maxima is a descendant of Macsyma, the legendary computer algebra system developed in the late 1960s at the Massachusetts Institute of Technology. It is the only system based on that effort still publicly available and with an active user community, thanks to its open source nature. Macsyma was revolutionary in its day, and many later systems, such as Maple and Mathematica, were inspired by it.
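The numeric substrate the Maxima blurb mentions — exact fractions, arbitrary-precision integers, variable-precision floats — has close analogues in the Python standard library, which may make the idea concrete even if you never touch Maxima itself (this is an illustration of the concept, not Maxima syntax):

```python
from fractions import Fraction
from decimal import Decimal, getcontext

# Exact rational arithmetic: no floating-point rounding error.
third_plus_sixth = Fraction(1, 3) + Fraction(1, 6)  # exactly 1/2

# Variable-precision floating point: 50 significant digits.
getcontext().prec = 50
one_seventh = Decimal(1) / Decimal(7)
```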


Somewhat newer is SymPy (as seen in SageMath, which also bundles Maxima):

SymPy is a Python library for symbolic mathematics. It aims to become a full-featured computer algebra system (CAS) while keeping the code as simple as possible in order to be comprehensible and easily extensible. SymPy is written entirely in Python.
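A taste of the basics — differentiation, definite integration, and a Laplace transform, three of the staple CAS operations:

```python
import sympy as sp

x, t, s = sp.symbols('x t s', positive=True)

# Symbolic differentiation: d/dx [sin(x) * exp(x)]
d = sp.diff(sp.sin(x) * sp.exp(x), x)

# Definite integration: integral of exp(-x) from 0 to infinity
i = sp.integrate(sp.exp(-x), (x, 0, sp.oo))  # 1

# Laplace transform of exp(-t), which is 1/(s + 1)
L = sp.laplace_transform(sp.exp(-t), t, s, noconds=True)
```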

It has basic differential geometry support and interesting geometric algebra functionality via GAlgebra (see also the manual), which is a selling point, since Mathematica is fairly weak in that area.
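I will not reproduce GAlgebra's own API here (see its manual), but as a taste of adjacent functionality in plain SymPy, its Quaternion class covers a small corner of the territory — the quaternions are isomorphic to the even subalgebra of the 3D geometric algebra:

```python
from sympy.algebras.quaternion import Quaternion

# Quaternion basis units, Quaternion(a, b, c, d) = a + b*i + c*j + d*k
i = Quaternion(0, 1, 0, 0)
j = Quaternion(0, 0, 1, 0)
k = Quaternion(0, 0, 0, 1)

# The defining relation i*j = k holds symbolically.
ij = i * j
```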


PARI/GP is a GPL-licensed package favoured by number theorists. It has some fun stunts: a large library of functions for modular forms, plus the ability to convert some number-theoretic computations to native C code.
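To give a flavour of the number-theoretic bread and butter these systems handle — here sketched in SymPy's `ntheory` module rather than PARI/GP's own syntax — consider factoring the fifth Fermat number, Euler's famous counterexample to Fermat's conjecture that all such numbers are prime:

```python
from sympy import factorint, isprime

# F5 = 2**32 + 1 = 641 * 6700417 (Euler, 1732)
factors = factorint(2**32 + 1)  # {641: 1, 6700417: 1}

# both factors are prime
all_prime = all(isprime(p) for p in factors)
```

PARI/GP's `factor()` does the same job, typically much faster on large inputs.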


Weirdly, there is a JavaScript symbolic math library, nerdamer. It is bare-bones but, I suppose, useful for machine learning in JavaScript.


Blazek, Paul J., and Milo M. Lin. 2020. “A Neural Network Model of Perception and Reasoning.” February 26, 2020.
Clark, Peter, Oyvind Tafjord, and Kyle Richardson. 2020. “Transformers as Soft Reasoners over Language.” In IJCAI 2020.
Lamb, Luis C., Artur Garcez, Marco Gori, Marcelo Prates, Pedro Avelar, and Moshe Vardi. 2020. “Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective.” In IJCAI 2020.
Lample, Guillaume, and François Charton. 2019. “Deep Learning for Symbolic Mathematics.” December 2, 2019.
Lozano-Duran, A., and M. Bassenne. n.d. “Towards Model Discovery with Reinforcement Learning.”
