Author: Christophe Calvès (Université de Lorraine, Centre National de la Recherche Scientifique)
Conference: 24th International Conference on Rewriting Techniques and Applications (RTA 2013), 24-26 June, 2013, Eindhoven, The Netherlands.
Journal reference: Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, Femke van Raamsdonk (Ed.), pp 143-157 (2013).
Comments: Many algorithms in computer science involve some sort of pattern matching, which at its most abstract level means working out whether two different terms can be made equal by an appropriate choice of substitutions for their variables. This process is called unification and is best understood by simple examples - given the terms f(c,x) and f(y,d), where f is a binary function, x and y are variables, and c and d are constants that may not vary, we can unify them by mapping x to d and y to c, to get the new term f(c,d). On the other hand the terms f(x,x) and f(c,d) cannot be unified - there is nothing we can substitute for x that will make the terms equal.
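To make this concrete, here is a minimal sketch of syntactic first-order unification in Haskell. The datatype and function names (Term, Subst, unify and so on) are my own illustration rather than anything from the paper, and this naive recursive version is not the linear-time algorithm mentioned in the next paragraph.

import Control.Monad (foldM)
import qualified Data.Map as M

data Term = Var String | Con String | Fun String [Term]
  deriving (Eq, Show)

type Subst = M.Map String Term

-- Apply a substitution to a term, chasing chains of bindings.
apply :: Subst -> Term -> Term
apply s t@(Var x)  = maybe t (apply s) (M.lookup x s)
apply _ t@(Con _)  = t
apply s (Fun f ts) = Fun f (map (apply s) ts)

-- Does the variable x occur anywhere in the term?
occursIn :: String -> Term -> Bool
occursIn x (Var y)    = x == y
occursIn _ (Con _)    = False
occursIn x (Fun _ ts) = any (occursIn x) ts

-- Extend the substitution s so that the two terms become equal, if possible.
unify :: Term -> Term -> Subst -> Maybe Subst
unify t1 t2 s = go (apply s t1) (apply s t2)
  where
    go (Var x) u
      | u == Var x     = Just s
      | x `occursIn` u = Nothing              -- occurs check
      | otherwise      = Just (M.insert x u s)
    go u v@(Var _) = go v u                   -- variable on the right: swap
    go (Con a) (Con b) | a == b = Just s
    go (Fun f ts) (Fun g us)
      | f == g && length ts == length us =
          foldM (\acc (a, b) -> unify a b acc) s (zip ts us)
    go _ _ = Nothing

-- The examples from the text:
--   unify (Fun "f" [Con "c", Var "x"]) (Fun "f" [Var "y", Con "d"]) M.empty
--     succeeds, sending x to d and y to c.
--   unify (Fun "f" [Var "x", Var "x"]) (Fun "f" [Con "c", Con "d"]) M.empty
--     fails, since x would have to be both c and d at once.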
In its simplest form, as in the examples above, unification is frequently useful but basically trivial - unifiers (more precisely the best, or 'most general', unifiers) can be computed in linear time. One way to make the problem harder is to relax the definition of equal from 'literally equal' to 'equal with respect to some equational theory', which creates a unification problem that is in general undecidable - working out whether f(x,x) and f(c,d) are unifiable now requires determining whether there exists any term t such that f(t,t) and f(c,d) are provably equal, which involves a possibly infinite search through all possible terms t, not to mention the difficulty of proving that the equality holds once a candidate term t is found.
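As a small worked example of unification modulo a theory (mine, not the paper's): take the single axiom f(x,y) = f(y,x), so that f is commutative. The terms f(x,c) and f(c,d) have no syntactic unifier - the first arguments force x to be c, and then the second arguments c and d can never be made equal - but modulo commutativity f(c,d) is equal to f(d,c), and f(x,c) unifies with f(d,c) by mapping x to d. This particular theory keeps unification decidable; the point of the paragraph above is that an arbitrary equational theory need not.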
Another way that unification can become more difficult is to enrich the syntax used to define terms - here, we consider variable binding, a frequent concern of my research, and of this blog. Conceptually, this is not too different from the 'equational unification' of the paragraph above - terms can be considered equal despite not being literally equal, in this case because they differ only uniformly in their bound names. Fortunately the equational theory of binding is not hard enough to lead to undecidability: so-called nominal unification was proved decidable in the original 2003 paper of that name by Urban, Pitts and Gabbay, though the algorithm given there takes exponential time in the worst case. Exponential time algorithms are rather slow and so of limited practical use; the picture was improved in 2008 by Levy and Villaret, who gave a quadratic time algorithm for nominal unification based on a detour via 'higher-order' syntax. Calvès, this week's author, published a paper with Fernández in 2011 presenting a different quadratic time algorithm, based on a detour via first-order syntax. The point of this paper is to show that, despite the differences between the Levy-Villaret and Calvès-Fernández approaches, the algorithms are in a certain sense mathematically identical; both rely implicitly on a concept Calvès calls nominal modalities.
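To make 'equal up to bound names' concrete, here is a tiny example of my own (not taken from the paper), writing fn a.t for a term t in which the name a is bound. The terms fn a.a and fn b.b are already equal, since they differ only in the choice of bound name, and fn a.X and fn b.b can be unified by mapping the variable X to a. The distinctive feature of nominal unification is that solutions may also carry freshness constraints: fn a.X and fn b.X can only be made equal if whatever X stands for mentions neither a nor b, recorded as the constraints a # X and b # X.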