Boolformer: Symbolic Regression of Logic Functions with Transformers
Authors: Stéphane d’Ascoli*, Arthur Renard*†, Vassilis Papadopoulos†, Clément Hongler†, Josh Susskind, Samy Bengio, Emmanuel Abbé‡
This paper was accepted at the 2nd AI for Math Workshop at ICML 2025.
We introduce Boolformer, a Transformer-based model trained to perform end-to-end symbolic regression of Boolean functions. First, we show that it can predict compact formulas for complex functions not seen during training, given their full truth table. Then, we demonstrate that even with incomplete or noisy observations, Boolformer is still able to find good approximate expressions. We evaluate Boolformer on a broad set of real-world binary classification datasets, demonstrating its potential as an interpretable alternative to classic machine learning methods. Finally, we apply it to the widespread task of modeling the dynamics of gene regulatory networks and show through a benchmark that Boolformer is competitive with state-of-the-art genetic algorithms, with a speedup of several orders of magnitude. Our code and models are publicly available.
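To make the task concrete, here is a minimal sketch of the input/output contract of Boolean symbolic regression: the input is the full truth table of a target function, and the output is a compact formula that must reproduce it exactly. The majority function and its candidate formula below are illustrative choices, not taken from the paper.

```python
from itertools import product

# Target: 3-input majority (true iff at least two inputs are true).
def target(a, b, c):
    return (a + b + c) >= 2

# A compact symbolic candidate a regressor might output.
def candidate(a, b, c):
    return (a and b) or (a and c) or (b and c)

# Enumerate the full truth table (2^3 rows) and check the candidate fits exactly.
truth_table = [(bits, target(*bits)) for bits in product([False, True], repeat=3)]
assert all(candidate(*bits) == out for bits, out in truth_table)
print("candidate matches all", len(truth_table), "rows")
```

In the noisy or incomplete-observation setting described above, the exact-match check would be relaxed to an error rate over the observed rows only.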