07/04/2005
Michel Schellekens (CEOL, University of Cork)
Towards a Calculus for Modular Software Timing; ACETT, a programming language with Linearly-Compositional Average-Case Time Complexity

Modularity is a highly desirable property. We quote from "Mathematical Foundations of Software Engineering: a roadmap" (T. Maibaum): "The only effective way for dealing with the complexity of software based systems is decomposition. Modularity is a property of systems which reflects the extent to which it is decomposable in parts, from the properties of which we are able to predict the properties of the whole. Languages that do not have sufficiently strong modularity properties are doomed to failure, in so far as predictable design is concerned."

In this context we focus on modularity for Software Timing and on the related development of time-able software, which is crucial for Real-Time language applications. We give an overview of the modularity properties of the main complexity measures (exact time, worst-case time and average-case time) and motivate the study of average-case time in this context.

We argue that the main obstacle to producing "time-able" software is that current language constructs are not guaranteed to be modular w.r.t. timing and hence do not facilitate, and may even obstruct, timing analysis. This is illustrated via basic examples; a sketch in this spirit follows below. We focus in the remainder of the talk on the average-case time measure and on methods of ensuring modularity in this context.
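
As a small, hypothetical illustration of this point (plain Python, not ACETT, and not taken from the talk): composing two calls to insertion sort shows that worst-case times do not compose additively, because the second call never receives its own worst-case input.

    # Hypothetical sketch: worst-case times are not modular under composition.
    def insertion_sort(xs):
        xs = list(xs)
        steps = 0
        for i in range(1, len(xs)):
            j = i
            while j > 0 and xs[j - 1] > xs[j]:
                xs[j - 1], xs[j] = xs[j], xs[j - 1]
                j -= 1
                steps += 1
        return xs, steps

    data = list(range(100, 0, -1))     # reverse-sorted: worst case for the first call
    once, s1 = insertion_sort(data)    # roughly n^2 / 2 swaps
    twice, s2 = insertion_sort(once)   # 0 swaps: the second call only ever sees sorted input
    print(s1, s2)                      # worst-case(P; Q) is far below worst-case(P) + worst-case(Q)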

Average-case time analysis is one of the most thoroughly studied areas of Computer Science and is notoriously difficult. As illustrated by Knuth's volume on Sorting and Searching, the analysis involves a variety of techniques which typically do not allow for automation. Current analysis techniques and automated average-case analysis tools, such as LUO, face a fundamental bottleneck: standard algorithms such as Quicksort cannot be implemented in them, while others, such as Heapsort, simply defy analysis.
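
For concreteness, a standard textbook instance of the kind of analysis meant here (not specific to the talk): the average number of comparisons C_n made by Quicksort on a uniformly random permutation of n distinct keys satisfies

    C_n = (n - 1) + (2/n) * sum_{k=0}^{n-1} C_k,   with C_0 = 0,

which solves to C_n = 2(n + 1)H_n - 4n, roughly 2 n ln n, where H_n is the n-th harmonic number. Even this classical derivation relies on the fact that Quicksort's partition step leaves its sub-lists uniformly distributed; for algorithms lacking such a property, Heapsort being the standard example, no comparably clean recurrence is available.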

We present ACETT, the Average Case Execution Time Tool, as the first language which is compositional (= modular) with respect to a non-trivial time measure. This provides a key step towards the IFIP2000 research challenge to bridge (Denotational) Semantics and Complexity and removes the current bottleneck in automated average-case time analysis. We show that the average time of ACETT programs can be expressed in a compositional way as linear combinations of the average times of their basic components; a simplified sketch of what this means follows below.
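
To convey the flavour of this (notation and assumptions ours, not the talk's), linear compositionality means that the average time of a composed program is obtained by adding, or taking probability-weighted linear combinations of, the average times of its parts:

    T_avg(P ; Q)              = T_avg(P) + T_avg(Q)
    T_avg(if b then P else Q) = T_avg(b) + p * T_avg(P) + (1 - p) * T_avg(Q)

where p is the probability that the test b succeeds over the random input structure. Equations of this shape are only sound because each component is guaranteed to pass a known random structure on to the next (the randomness-preservation property discussed below), so that T_avg(Q) is taken over a distribution that is actually known.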

Thus far, algorithms have been analyzed on a case-by-case basis; ACETT allows for a uniform approach to average-time analysis. The new approach is directly inspired by the traditional semantic notion of compositionality, which in turn is guaranteed by the fact that ACETT programs are shown to be Random-Structure preserving (a toy illustration of this notion follows below). As an application of the language we show how to reprogram Heapsort in ACETT. This leads to the first Randomness-preserving variant of the algorithm, solving an open problem stated by Knuth and Edelkamp, with wider repercussions for Automated Average Case Analysis.
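
The following toy check (plain Python, not ACETT code, and not the construction used in the talk) illustrates what randomness preservation means: fed every permutation of {1, ..., n} once, i.e. the uniform distribution, Quicksort's partition step produces sub-lists whose orderings are again uniformly distributed. Heapsort's deletion step (removing the root and re-establishing the heap) is the classical example of an operation that destroys this kind of property, which is precisely what makes its average-case analysis so resistant.

    # Toy randomness-preservation check (illustrative sketch only).
    from itertools import permutations
    from collections import Counter

    def partition(xs):
        """Stable partition around the first element (the pivot)."""
        pivot, rest = xs[0], xs[1:]
        smaller = tuple(x for x in rest if x < pivot)
        larger = tuple(x for x in rest if x > pivot)
        return pivot, smaller, larger

    n = 4
    left_counts = {}   # pivot -> Counter over orderings of the smaller elements
    for perm in permutations(range(1, n + 1)):
        pivot, smaller, _ = partition(perm)
        left_counts.setdefault(pivot, Counter())[smaller] += 1

    for pivot, counter in sorted(left_counts.items()):
        # For every pivot, each ordering of the smaller elements occurs
        # equally often: the uniform input distribution is preserved.
        print(pivot, dict(counter))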

We show how this work goes well beyond the state of the art in mainstream areas of Computer Science, including Automated Average Case Analysis, Real-Time Languages and Random Structures. Given sufficient time, we will motivate how the ACETT principles have led to a new model of computation complementing the traditional von Neumann model.

The talk aims to present a general overview and will focus on historical motivation, the open problems which have been solved, and the relevance to industry. Given sufficient interest, follow-up talks can be organized to expand on the technical background.