Authors: Sjaak Smetsers, Ken Madlener (Radboud University Nijmegen), and Marko van Eekelen (Radboud University Nijmegen and Open University of the Netherlands)
Conference: Symposium on Trends in Functional Programming (TFP 2013), 14-16 May 2013, Provo, Utah, USA.
Journal reference: Lecture Notes in Computer Science, to appear.
Comments: Last week I looked at the ways Google Scholar Updates has been useful to my research; I recently read a blog post that saw the tool put to a rather more surprising use: Updates recommended a possibly relevant published paper that turned out to be highly related, in the sense that it was partly plagiarised from the academic's own work! Click here to read the story.
Moving on to this week's paper, we'll look at bialgebraic semantics of programming languages. The (only) good thing about technical jargon is that you can fit a lot of content into a small space, so we'll try to unpack this over the course of the post.
First, what is meant by "semantics of programming languages"? To give a semantics is to give a precise, formal mathematical account of how a programming language should be understood. These semantics can then be used to (a) reason about properties of the language, (b) reason about programs written in that language, and (c) guide the implementation of the language, e.g. compilation. Semantics come in a variety of styles (see e.g. Wikipedia's overview of some options); the "bialgebraic" style of this paper is so called because it combines algebraic and coalgebraic approaches.
Algebraic semantics start with a specification of the syntax of the language, and then show how programs using that syntax can be interpreted within some mathematical setting. This mathematical translation is called the denotation of a program, and so algebraic semantics are a form of what is called denotational semantics. We discussed this approach here, but our mathematical setting in this paper is a lot simpler than that used there - the category of sets and functions rather than rich and complex "domains". The advantage of denotational semantics in general is that they are 'mathematical' in character and hence can be reasoned about with familiar mathematical tools such as induction; the disadvantage is that they are abstracted away from the original language enough that the actual behaviour of programs can be obscured.
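To make this a bit more concrete, here is a minimal Haskell sketch of my own (it is not taken from the paper, which works in PVS): the syntax of a toy expression language is captured by a functor, programs are elements of its initial algebra, and a denotational semantics is obtained by folding an interpreting algebra over the syntax tree.

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Fixpoint of a functor: terms built from the syntax functor below.
newtype Fix f = In { out :: f (Fix f) }

-- Syntax functor for a toy expression language.
data ExprF x = Lit Int | Add x x | Mul x x
  deriving Functor

type Expr = Fix ExprF

-- Folding an algebra over the syntax: this is where "denotation" happens.
cata :: Functor f => (f a -> a) -> Fix f -> a
cata alg = alg . fmap (cata alg) . out

-- The interpreting algebra: syntax is mapped into the (very simple)
-- mathematical world of integers.
evalAlg :: ExprF Int -> Int
evalAlg (Lit n)   = n
evalAlg (Add m n) = m + n
evalAlg (Mul m n) = m * n

-- The denotational semantics of a program.
denote :: Expr -> Int
denote = cata evalAlg

-- e.g. denote (In (Add (In (Lit 1)) (In (Lit 2)))) == 3
```

Notice that reasoning about denote is just reasoning about an ordinary recursive function: structural induction on Expr comes for free, which is exactly the 'mathematical' advantage mentioned above.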
Coalgebraic semantics are the category-theoretic dual of algebraic semantics; we discussed coalgebra here. Coalgebras are a good way of talking about systems with transitions; when used in semantics they view programs as transitions between states (say, the memory states of the program). This is a particular style of what is called operational semantics.
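Again a small Haskell sketch of my own devising, rather than anything from the paper: a behaviour functor describes what one step of a system can look like, a coalgebra equips a set of states with such steps, and the observable behaviour of a state is obtained by iterating the coalgebra.

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Behaviour functor: in one step a state either halts, or emits an Int
-- observation and moves to a next state.
data StepF x = Halt | Out Int x
  deriving Functor

-- A coalgebra on states s: the one-step transition structure.
type Coalg s = s -> StepF s

-- A concrete system: count down to zero, emitting each value on the way.
countdown :: Coalg Int
countdown 0 = Halt
countdown n = Out n (n - 1)

-- The observable behaviour of a state: iterate the coalgebra.
run :: Coalg s -> s -> [Int]
run step s = case step s of
  Halt     -> []
  Out n s' -> n : run step s'

-- e.g. run countdown 3 == [3,2,1]
```

The operational flavour is clear: nothing is said about what a state 'means', only about which transitions it can make and what can be observed along the way.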
The denotational and operational semantics of a language or program should, of course, have something to do with each other; the jargon for when they agree is the adequacy property. But we'd like to say more than that: we'd like the coalgebraically specified operational behaviour of the language to flow very directly from the algebraically specified syntax. In other words we'd like the algebra to provide 'structure' for the coalgebra, in the style called structural operational semantics. It turns out this sort of structure can be imposed at the level of abstract category theory, quite independently of the details of any particular language, by asking that the syntax and the behaviour be connected by certain mappings called 'distributive laws'. The seminal paper behind this idea is Turi and Plotkin's 1997 Towards a mathematical operational semantics, but I'd recommend starting with Klin's nice introduction to the area.
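To give a flavour of what a distributive law looks like, here is one more toy Haskell sketch (my own illustration; the paper's development is in PVS and far more general). The syntax functor offers a pointwise sum of two streams, the behaviour functor is a stream observation (current head plus next state), and the distributive law is precisely the structural operational rule "head (s :+: t) = head s + head t, tail (s :+: t) = tail s :+: tail t". Given that law, the behaviour of every composite term is determined structurally by the behaviour of its parts.

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Syntax functor: binary (pointwise) sum of streams.
data SynF x = x :+: x
  deriving Functor

-- Behaviour functor: observe the current head and the next state.
data BehF x = Obs Int x
  deriving Functor

-- The distributive law  lambda : SynF . BehF -> BehF . SynF,
-- read as an SOS rule for the sum of two streams.
lambda :: SynF (BehF x) -> BehF (SynF x)
lambda (Obs h1 t1 :+: Obs h2 t2) = Obs (h1 + h2) (t1 :+: t2)

-- Terms: basic states, or syntactic combinations of terms.
data Term s = Var s | Op (SynF (Term s))

-- The law extends a coalgebra on basic states to a coalgebra on all terms:
-- the operational model induced by the distributive law.
extend :: (s -> BehF s) -> Term s -> BehF (Term s)
extend step (Var s) = fmap Var (step s)
extend step (Op ts) = fmap Op (lambda (fmap (extend step) ts))

-- Basic states: the state n behaves as the stream n, n+1, n+2, ...
nats :: Int -> BehF Int
nats n = Obs n (n + 1)

-- Observe the first k outputs of a term.
observe :: (s -> BehF s) -> Int -> Term s -> [Int]
observe _    0 _ = []
observe step k t = let Obs h t' = extend step t
                   in  h : observe step (k - 1) t'

-- e.g. observe nats 3 (Op (Var 0 :+: Var 10)) == [10,12,14]
```

The payoff of the abstract theory is that once lambda is a natural transformation (which a parametric Haskell function automatically is), the induced semantics is guaranteed to behave well - for instance, bisimilarity is a congruence for the syntax - and none of this depends on the particular functors chosen here.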
So what does this week's paper contribute? One problem with formal semantics - the reason most languages lack them despite their obvious advantages - is that real languages are huge and complicated and formal proofs involving them are just too big to fit in any one human being's mind. Hence there is a need for formalisation, in the sense that machines, rather than people, should at the very least record and check proofs, and preferably do some of the grunt work themselves. This is a pretty common refrain on this blog - see here for example. This paper, then, is a formalisation of bialgebraic semantics (the general theory, not any particular language) in the specification tool and theorem prover PVS. The particularities of the coding are a bit beyond me, as a non-expert in mechanised theorem proving, but the appeal of PVS seems to be that it is designed and used as a working tool for real-world formal methods, and so provides a promising link between abstract category theory and application; the main impediment seems to be its weak support for certain forms of polymorphism.