
Reconsider absorption rules for boolean predicates #194

Open
robdockins opened this issue Mar 11, 2022 · 2 comments

Comments

@robdockins
Contributor

Right now, What4 does a check on every boolean `and` and `or` to see if the given terms can be reduced by one of the absorption laws of boolean algebras. However, this check is rather expensive, and we should re-evaluate whether it is worth doing. In particular, building a large sequence of conjunctions by repeatedly applying the binary logical operations ends up being quadratic in the number of terms involved, and the check may be of no real use if the terms involved are all known to be distinct.
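
For concreteness, here is a rough sketch of where the quadratic cost comes from (illustrative only: this is not What4's actual term representation, and only one direction of the absorption law is shown):

```haskell
import Data.List (foldl')

-- Illustrative only; not What4's actual term representation.
data Pred
  = Var String
  | And [Pred]          -- n-ary conjunction
  | Or  Pred Pred       -- binary disjunction
  deriving (Eq, Show)

-- | Does existing conjunct @p@ absorb the new term @t@?  That is,
-- is @t@ of the form @p \/ q@ (or @q \/ p@), so that @p /\ t = p@?
absorbs :: Pred -> Pred -> Bool
absorbs p (Or a b) = p == a || p == b
absorbs _ _        = False

-- | Conjoin one more term, scanning every existing conjunct for an
-- absorption opportunity.  This per-step scan is what makes repeated
-- use of the binary operation quadratic overall.
andPred :: Pred -> Pred -> Pred
andPred (And ps) t
  | any (`absorbs` t) ps = And ps          -- new term is redundant
  | otherwise            = And (ps ++ [t])
andPred p t = andPred (And [p]) t

-- | Building a conjunction of n distinct variables this way does
-- O(n^2) work even though no absorption ever fires.
conjoinAll :: [Pred] -> Pred
conjoinAll = foldl' andPred (And [])
```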

@langston-barrett
Contributor

The boolean-normalization executable in #277 gives at least the beginnings of a criterion for evaluating these rewrites: we could measure how much removing them affects running time vs. term size. Of course, uniformly random boolean expressions aren't necessarily representative of real-world data, so ideally we would also want to evaluate a large SAW proof.

@langston-barrett
Contributor

langston-barrett commented Feb 4, 2025

We could also consider a mechanism that avoids this check in some cases: Each WordMap could additionally store an array of N bits. The predicates would be hashed on insertion (as they already are), that hash would be taken mod N, and that bit would be set in the array. Before running the absorption checks, the Builder could first see if the bit-arrays are disjoint. If so, the conjunctions surely share no variables.

This check is necessarily conservative: hashes can collide, especially when they're winnowed down to a small domain, so the bit-array check may say the maps are not disjoint even when they really are. Increasing N would lead to fewer collisions but also additional space overhead.
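
A minimal sketch of that idea, assuming a fixed N = 64 bit signature and the `hashable` package; `Signature`, `sigInsert`, and `disjointSigs` are made-up names for illustration, not existing What4 API:

```haskell
import Data.Bits ((.&.), setBit)
import Data.Hashable (Hashable, hash)
import Data.Word (Word64)

-- | A Bloom-filter-style, 64-bit summary of the predicates stored in a
-- conjunction.
newtype Signature = Signature Word64
  deriving (Eq, Show)

emptySig :: Signature
emptySig = Signature 0

-- | On insertion, hash the predicate (as already happens), take the
-- hash mod 64, and set that bit.
sigInsert :: Hashable p => p -> Signature -> Signature
sigInsert p (Signature w) = Signature (setBit w (hash p `mod` 64))

-- | If the two signatures share no set bits, the underlying predicate
-- sets are certainly disjoint and the absorption scan can be skipped.
-- Overlapping bits may just be a hash collision, so in that case the
-- full check still runs; collisions cost time, never soundness.
disjointSigs :: Signature -> Signature -> Bool
disjointSigs (Signature a) (Signature b) = a .&. b == 0
```

With the signatures maintained incrementally on insertion, the disjointness test is a single word-level AND, so the absorption scan only runs when the signatures actually overlap.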
