This past Friday night and Saturday morning I put together a new utility which I've brought onto IRC and introduced as "thebits". The idea was to parse and canonicalize the most common ways of writing boolean equations and to offer utilities to check equivalency, evaluate an expression given values of any free variables, and eventually provide minimization.
Current implementation
Since this is a personal project, I've opted to once again roll up my sleeves and write a parser from scratch. It allows many ways to write the same operations, supporting everything I've seen an actual human use while writing equations in text. One can use + for logical disjunction and * for conjunction, or use implicit conjunction between two adjacent terms. Both prefix ! and ~ are usable for negation, as well as suffix '. Any lowercase letter may be used as a free variable.
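As a rough illustration of the accepted syntax (not the actual thebits parser, and omitting the true/false constants and the ^ operator that appear in the examples below), a recursive-descent parser for these forms could look something like this:

def parse(text):
    pos = 0

    def peek():
        # Skip whitespace and return the next character without consuming it.
        nonlocal pos
        while pos < len(text) and text[pos].isspace():
            pos += 1
        return text[pos] if pos < len(text) else None

    def take():
        nonlocal pos
        c = peek()
        pos += 1
        return c

    def disjunction():
        node = conjunction()
        while peek() == '+':
            take()
            node = ('or', node, conjunction())
        return node

    def conjunction():
        node = negation()
        while True:
            c = peek()
            if c == '*':
                take()
                node = ('and', node, negation())
            elif c is not None and (c.islower() or c in '!~('):
                # Implicit conjunction between adjacent terms: ab, a(b + c), a!b
                node = ('and', node, negation())
            else:
                return node

    def negation():
        if peek() in ('!', '~'):
            take()
            node = ('not', negation())
        else:
            node = atom()
        while peek() == "'":      # suffix negation
            take()
            node = ('not', node)
        return node

    def atom():
        c = take()
        if c == '(':
            node = disjunction()
            if take() != ')':
                raise SyntaxError("expected ')'")
            return node
        if c is None or not c.islower():
            raise SyntaxError(f"unexpected {c!r}")
        return ('var', c)

    node = disjunction()
    if peek() is not None:
        raise SyntaxError(f"trailing input at position {pos}")
    return node

With that sketch, parse("ab + !c") comes back as ('or', ('and', ('var', 'a'), ('var', 'b')), ('not', ('var', 'c'))).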
There are three main modes of operation depending on what is given as input. In the simplest case, a single boolean equation is given as input. This is checked for errors and parsed into an AST. The AST is dumped in "lisp" form, and a summary of the variables used is printed, followed by the "canonical" form of the input.
The second mode is invoked by giving two equations separated by an equals sign, in which case the equations are checked for equivalence. If they are not equivalent, a single set of values for the variables is given which produces different results for each equation.
The final mode is used to evaluate an equation given values for each variable.
Examples:
a + b
ab + c
!(a + b) + c(d ^ e)
a + true
a + b = b + a
abc + d = d(c + a)
a=1,b=0,c=1; a + b ^ c
a=0,b=1,c=1; ab + ~c
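The last two examples assign values to the variables and evaluate. Assuming an AST shape like the parser sketch above, plus hypothetical ('const', ...) and ('xor', ...) nodes for the true/false and ^ forms, a minimal evaluator is just a walk over the tree (the real tool's internals may differ):

def evaluate(node, env):
    # env maps variable names to truth values, e.g. {'a': 1, 'b': 0, 'c': 1}.
    kind = node[0]
    if kind == 'const':
        return bool(node[1])
    if kind == 'var':
        return bool(env[node[1]])
    if kind == 'not':
        return not evaluate(node[1], env)
    if kind == 'and':
        return evaluate(node[1], env) and evaluate(node[2], env)
    if kind == 'or':
        return evaluate(node[1], env) or evaluate(node[2], env)
    if kind == 'xor':
        return evaluate(node[1], env) != evaluate(node[2], env)
    raise ValueError(f"unknown node {kind!r}")

For instance, evaluating ab + ~c with a=0, b=1, c=1 as in the last example yields false.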
This post still needs more work before it is complete. Until then, a fuller description of all the available operators and forms is in the current README.
Future work
The current minimization is, well, minimal itself. Double negations are removed after canonicalization (to account for prefix and suffix negations, which may also cancel). Simple constant folding such as true + false or a * false is not handled. Slightly more complicated would be evaluating a subexpression to check whether it is a constant. Small cases like b + !b can be handled directly, but there is already an evaluation engine that could be leveraged to handle such cases dynamically.
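As a sketch of what such a simplification pass could look like (using the same hypothetical node shapes as above, not the actual thebits code), double negation removal and constant folding are a single bottom-up walk:

def simplify(node):
    kind = node[0]
    if kind in ('var', 'const'):
        return node
    if kind == 'not':
        inner = simplify(node[1])
        if inner[0] == 'not':                 # !!x -> x
            return inner[1]
        if inner[0] == 'const':               # !true -> false, !false -> true
            return ('const', not inner[1])
        return ('not', inner)
    left, right = simplify(node[1]), simplify(node[2])
    if kind == 'and':
        if ('const', False) in (left, right): # x * false -> false
            return ('const', False)
        if left == ('const', True):           # true * x -> x
            return right
        if right == ('const', True):
            return left
    if kind == 'or':
        if ('const', True) in (left, right):  # x + true -> true
            return ('const', True)
        if left == ('const', False):          # false + x -> x
            return right
        if right == ('const', False):
            return left
    return (kind, left, right)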
The evaluation and equivalence checking are currently implemented as rather dumb brute-force approaches. Even simple optimizations such as short-circuiting are not implemented. An idea I'd like to explore is threading the equivalence checking by setting some subset of variables and having individual threads check the remaining options. The AST could be transformed to replace the fixed variables with constants, and an additional minimization pass could be performed.
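The brute-force equivalence check amounts to enumerating every assignment over the union of free variables; something along these lines, reusing the hypothetical evaluate() from earlier plus a small variables() helper:

from itertools import product

def variables(node):
    # Collect the free variable names appearing in an AST.
    if node[0] == 'var':
        return {node[1]}
    if node[0] == 'const':
        return set()
    return set().union(*(variables(child) for child in node[1:]))

def equivalent(lhs, rhs):
    names = sorted(variables(lhs) | variables(rhs))
    for values in product((False, True), repeat=len(names)):
        env = dict(zip(names, values))
        if evaluate(lhs, env) != evaluate(rhs, env):
            return env      # counterexample assignment
    return None             # no counterexample: the equations are equivalent

The threading idea above would amount to the parent fixing the values of the first few names and handing each partial assignment to a worker that enumerates the rest.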
Canonicalization does not go far enough currently. All operations and groupings are transformed into one "canonical" form, but no sorting of terms happens. For a string of conjunctions of single variables, the variables could be sorted without affecting the result. For more complicated subexpressions, the trees could be compared and sorted by things such as depth, node count, or perhaps lexicographical ordering of an in-order traversal.
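A sketch of that sorting idea, again over the hypothetical node shapes used earlier: flatten chains of the same commutative operator and order the operands by depth, then node count, then the text of a rendered traversal:

def render(node):
    kind = node[0]
    if kind == 'var':
        return node[1]
    if kind == 'const':
        return 'true' if node[1] else 'false'
    if kind == 'not':
        return '!' + render(node[1])
    op = '+' if kind == 'or' else '*'
    return '(' + op.join(render(c) for c in node[1:]) + ')'

def depth(node):
    return 1 + max((depth(c) for c in node[1:] if isinstance(c, tuple)), default=0)

def count(node):
    return 1 + sum(count(c) for c in node[1:] if isinstance(c, tuple))

def sort_terms(node):
    kind = node[0]
    if kind in ('var', 'const'):
        return node
    if kind == 'not':
        return ('not', sort_terms(node[1]))
    # Flatten a chain of the same commutative operator into one operand list.
    operands, stack = [], [node[1], node[2]]
    while stack:
        child = stack.pop()
        if child[0] == kind:
            stack.extend(child[1:])
        else:
            operands.append(sort_terms(child))
    operands.sort(key=lambda c: (depth(c), count(c), render(c)))
    # Rebuild as a left-nested chain of binary nodes.
    result = operands[0]
    for child in operands[1:]:
        result = (kind, result, child)
    return result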
Part of what spawned this project was talking to a fellow on IRC about Karnaugh maps. It would be nice to generate graphical maps of the minimization as a teaching or example-generating tool.
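As a toy sketch of the idea, a 4-variable Karnaugh map is just the truth table laid out with Gray-code row and column order so adjacent cells differ in exactly one variable; this reuses the hypothetical evaluate() from earlier and is not part of thebits today:

def gray(n):
    # Gray-code sequence for n bits, e.g. gray(2) == [0, 1, 3, 2].
    return [i ^ (i >> 1) for i in range(1 << n)]

def kmap(node, names=('a', 'b', 'c', 'd')):
    rows, cols = gray(2), gray(2)   # rows index (a, b), columns index (c, d)
    grid = []
    for r in rows:
        line = []
        for c in cols:
            env = {names[0]: bool(r & 2), names[1]: bool(r & 1),
                   names[2]: bool(c & 2), names[3]: bool(c & 1)}
            line.append('1' if evaluate(node, env) else '0')
        grid.append(' '.join(line))
    return '\n'.join(grid)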
I'd also like to expose the utility through a webpage, probably this one, and eventually the page over in the work section.