Say we are given
and want to know: is ? How about ?
A human can of course look at these equations and say "yes" and "not provable from the given information", but our goal is to develop an algorithm that can do this for arbitrary sets of inequalities, and arbitrary queries. Our compiler will need this for some checks.
So, how can we do this? For starters, manipulating equations tends to be easiest if we place them into matrices. Second, equalities are much easier to work with than inequalities. Thus, we introduce slack variables to turn each inequality into an equality. Now, we have
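As a concrete sketch of the slack-variable step (using hypothetical constraints for illustration, not the system above), converting inequalities into equalities looks like:

```python
import numpy as np

# Hypothetical system (not the constraints above):
# three inequalities A @ x <= b over two variables.
A = np.array([[1, 0],
              [0, 1],
              [1, 1]])
b = np.array([3, 2, 4])

# One nonnegative slack variable per inequality turns A @ x <= b into
# the equality system [A | I] @ (x, s) = b with s >= 0.
m, _ = A.shape
Aeq = np.hstack([A, np.eye(m, dtype=int)])

x = np.array([1, 1])        # a feasible point
s = b - A @ x               # its slack values
assert np.all(s >= 0)
assert np.allclose(Aeq @ np.concatenate([x, s]), b)
```

The sign convention for the slacks depends on how the inequalities are oriented; flipping an inequality flips the sign on its slack column.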
Now, say we're given an arbitrary query with respect to our 5 variables (and possibly constants). E.g., we may want to know whether or are known
Let us use , or equivalently, as an example. We could represent this as the column vector .
So, to be able to answer queries with respect to and , the first thing we'd need to be able to do is isolate them in an expression. That means we'll need to take some linear combination of the constraints to get the vector .
But first, we should put our constraints into Hermite Normal Form and then drop all the zeroed constraints (or employ some other similar technique) to make sure that all remaining constraints are linearly independent, which may simplify work later on. In the example above, we already have full row rank (4), so we'll skip this step in this blog post.
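One way to realize this pruning step, substituting a greedy rank check for the Hermite Normal Form computation (a simpler numeric stand-in, not the HNF approach itself):

```python
import numpy as np

# Keep a maximal linearly independent subset of the constraint rows.
# A greedy rank check is less efficient than HNF, but easy to follow.
def independent_rows(M):
    kept = []
    for row in M:
        candidate = np.vstack(kept + [row])
        if np.linalg.matrix_rank(candidate) == len(kept) + 1:
            kept.append(row)
    return np.vstack(kept)

M = np.array([[1, 0, 2],
              [0, 1, 1],
              [1, 1, 3]])   # row 2 = row 0 + row 1
print(independent_rows(M))  # drops the redundant third row
```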
Now, to isolate our query, we have the problem
So we're trying to find a linear combination of constraints to isolate our query , but then what shall be?
corresponds to our four slack variables.
Now, to answer queries of the variety, we want all slack variables to be . That is, if we have , then .
We can focus only on , as the other queries of interest can be reframed as , e.g. by adjusting constant offsets or flipping signs.
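The reframing can be made mechanical. A hypothetical helper (the names and the exact canonical form here are my own, chosen for illustration) that rewrites other comparison types via offset and sign adjustments:

```python
import numpy as np

# Hypothetical helper: rewrite c @ x (op) d into one or two queries of an
# assumed canonical form q @ x + q0 <= 0, by adjusting the constant
# offset and/or flipping signs.
def canonicalize(op, c, d):
    c = np.asarray(c)
    if op == "<=":                 # c @ x <= d  <=>  c @ x - d <= 0
        return [(c, -d)]
    if op == ">=":                 # c @ x >= d  <=>  -c @ x + d <= 0
        return [(-c, d)]
    if op == "==":                 # equality: both directions must hold
        return [(c, -d), (-c, d)]
    raise ValueError(op)

print(canonicalize(">=", [1, 2], 3))
```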
So, our current idea for proving (or failing to prove) a query is to find the linear combination of existing constraints that produces our query, and then check if all slack variables are negative.
To assist us in this quest, we add augment columns and set :
Now, the linear combination of columns must produce a value of for each slack variable. We can divide this into both the part contributing to our query, and our augment. Letting be the total number of variables, and the total number of equations, using 0-based indexing, and referring to the above expression more succinctly as :
Note that is precisely the quantity we require to be for our query to be true; it is the value of the slack variable in the linear combination of constraints that produces our query. So, to check if this is satisfied, all we must do is check that .
Okay, but how do we solve for ? If has full column rank, this is trivial: we either have a unique solution, or no solution. If we have no solution, we obviously fail. If we have a solution, it is trivial to check whether all .
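A numeric sketch of the full-column-rank case (the "<= 0" sign test on the slack entries is an assumption here; adjust it to match the convention used when the slacks were introduced):

```python
import numpy as np

# With full column rank, A @ x = b has at most one solution: take the
# least-squares candidate and accept it only if it solves the system
# exactly. The sign test on the trailing slack entries assumes a
# "<= 0" convention for illustration.
def solve_and_check(A, b, n_slack):
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    if not np.allclose(A @ x, b):
        return False               # inconsistent: query not provable
    return bool(np.all(x[-n_slack:] <= 0))

A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])           # full column rank
print(solve_and_check(A, np.array([1., -2., -1.]), 1))  # True
print(solve_and_check(A, np.array([1., -2., 5.]), 1))   # False (no solution)
```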
But what if does not have full column rank? Again, we have the problem . Our first step will be to ensure that it has full row rank, which we can do by multiplying by some matrix if necessary, e.g., calculated from Hermite Normal Form, discarding zeroed rows. Letting , and , we now have .
Let and be the number of rows and columns of , respectively. Note that is also the rank of , as we have guaranteed that it has full row rank. Because the column rank is deficient, we know .
We can construct a non-singular (but not necessarily unimodular) integer matrix , where the first columns diagonalize , while the remaining columns are the nullspace of . Calling these two blocks and , we have .
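As a numeric stand-in for that construction (the SVD below is not the integer diagonalization described above, and "V2" is my own name for the nullspace block), the nullspace columns can be read off directly:

```python
import numpy as np

# Numeric stand-in (not the integer HNF-style construction above): the
# columns of the nullspace block, called V2 here, come from the SVD.
def nullspace(A2, tol=1e-10):
    _, s, Vt = np.linalg.svd(A2)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T             # columns span the nullspace of A2

A2 = np.array([[1., 0., 1.],
               [0., 1., 1.]])      # full row rank, deficient column rank
V2 = nullspace(A2)
assert V2.shape == (3, 1)          # one nullspace direction
assert np.allclose(A2 @ V2, 0)
```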
With this, we have
Where is diagonal.
is a valid solution for all values of . Now, solving for , as it is that we need to prove our queries, we have
As is the nullspace of , can move through the nullspace of .
Now, recognizing that we need the last entries of to be but don't care about the first elements, we left-multiply by
to discard the elements of we aren't interested in. For simplicity, let
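That left-multiplication is just a selector. As a sketch (the block sizes here are hypothetical placeholders):

```python
import numpy as np

# Selector sketch: left-multiplying by [0 | I] keeps only the last k
# entries of a length-n vector and discards the first n - k.
def selector(n, k):
    J = np.zeros((k, n), dtype=int)
    J[:, n - k:] = np.eye(k, dtype=int)
    return J

v = np.arange(5)           # [0, 1, 2, 3, 4]
print(selector(5, 2) @ v)  # [3 4]
```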
Now, as what we need is
So, this reduces the problem to one of simply checking if is satisfiable, where is known from the initial constraints, and is computed from the query (and most of the computation can be done once on initialization; for a particular query, all we need is a matrix-vector multiply). Checking feasibility is the first step to solving any linear program, but we can write specialized software for polyhedra that handles the unbounded nature of without needing to duplicate the elements of in a tableau.
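As a minimal sketch of such specialized software, Fourier-Motzkin elimination decides feasibility and handles free (unbounded) variables without duplicating them in a tableau. This is an illustration of the technique, not the compiler's actual implementation:

```python
import numpy as np

# Fourier-Motzkin elimination: is {y : G @ y <= h} nonempty, with y free?
def fm_feasible(G, h, tol=1e-9):
    G = np.asarray(G, float).copy()
    h = np.asarray(h, float).copy()
    while G.shape[1] > 0:
        col = G[:, 0]
        pos, neg = col > tol, col < -tol
        zer = ~(pos | neg)
        new_rows, new_rhs = [], []
        # rows not involving the leading variable carry over unchanged
        for i in np.where(zer)[0]:
            new_rows.append(G[i, 1:])
            new_rhs.append(h[i])
        # each (upper bound, lower bound) pair combines to eliminate it
        for i in np.where(pos)[0]:
            for k in np.where(neg)[0]:
                new_rows.append(G[i, 1:] / col[i] - G[k, 1:] / col[k])
                new_rhs.append(h[i] / col[i] - h[k] / col[k])
        if not new_rows:
            return True            # one-sided bounds only: feasible
        G = np.vstack(new_rows)
        h = np.array(new_rhs)
    return bool(np.all(h >= -tol))  # only constants remain: 0 <= h

print(fm_feasible([[1., 0.], [-1., 0.]], [1., -2.]))  # False: y0<=1, y0>=2
print(fm_feasible([[1., 0.], [-1., 0.]], [1., 0.]))   # True:  0<=y0<=1
```

Fourier-Motzkin grows quadratically in constraint count per eliminated variable, so it is only practical for the small systems these queries produce; a tableau-based method scales better but needs the variable-splitting this approach avoids.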