Algorithms: Complexity and Engineering (ACE)

We work in a number of areas including: approximation and online algorithms, data structures, efficient algorithms, algorithms for algebraic and geometric problems, graph and network algorithms, scheduling algorithms, and decidability. 

We also look at practical aspects and applications of theory, in particular algorithm engineering, which is concerned with the implementation and empirical performance of algorithms.

Find more information about ACE on our research pages.

Suggested topics:

Algorithmic Aspects of Communication Networks

Supervisor: Thomas Erlebach

Numerous challenging optimisation problems arise in the design and operation of communication networks such as optical networks, cellular wireless networks, mobile wireless ad-hoc networks, and grid computing. Examples of such problems include network design, resource provisioning, resource allocation, scheduling, virtual backbone construction, and call admission control.

Students working on such topics typically identify suitable models and problem formulations, design and analyse algorithms (approximation algorithms, on-line algorithms, distributed algorithms, algorithmic game theory), and implement simulations of algorithms for the purpose of experimental evaluation and comparisons. The focus can be more on the theoretical analysis (proofs of worst-case performance guarantees, competitive analysis) or on the experimental and algorithm engineering side, but typically the research includes aspects of both.
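As a small, self-contained illustration of the kind of worst-case guarantee competitive analysis provides (a motivating sketch, not part of any specific project above): in the classic ski-rental problem, renting skis costs 1 per day, buying costs B, and the number of skiing days is unknown in advance. The deterministic strategy "rent for B-1 days, then buy" is (2 - 1/B)-competitive:

```python
def online_cost(days, B):
    """Cost of the rent-for-B-1-days-then-buy strategy over `days` ski days."""
    if days < B:
        return days              # rented every day, never bought
    return (B - 1) + B           # B-1 days of rent, then one purchase

def offline_cost(days, B):
    """Optimal cost with full knowledge of the future: rent throughout or buy at once."""
    return min(days, B)

B = 10
worst_ratio = max(online_cost(d, B) / offline_cost(d, B) for d in range(1, 200))
print(worst_ratio)  # 1.9, i.e. 2 - 1/B: no input sequence can force a worse ratio
```

Competitive analysis asks for exactly such guarantees that hold against every possible input sequence, however adversarially chosen.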

This is a broad topic that can accommodate a number of specific PhD research projects. Note, however, that I already have a number of PhD students and, for the moment, will accept additional students only if they are exceptionally well qualified.

Skills required:

  • Ability to understand the design and analysis of algorithms. 
  • Ability to understand and apply basic concepts of discrete mathematics (graphs, combinatorics, proofs). 
  • Ability to write about algorithms and proofs in a precise and rigorous way. 
  • Creativity in problem-solving. 
  • Ability to write programs in a suitable programming language such as Java or C++ (only if the PhD project includes a simulation or implementation component).

Relevant optional modules that can be attended at the University of Leicester:

Some suggested reading:

Introduction to Online Algorithms:
Susanne Albers: Competitive Online Algorithms. BRICS Lecture Series, LS-96-2, 1996.
Available at:

Survey of Approximation Algorithms for Disjoint Paths Problems:
Thomas Erlebach: Approximation Algorithms for Edge-Disjoint Paths and Unsplittable Flow. In E. Bampis et al. (Eds.): Efficient Approximation and Online Algorithms, LNCS 3484, Springer, 2006, pp. 97-134.
Available at:

Survey of Algorithmic Models of Ad-hoc/Sensor Networks:
S. Schmid and R. Wattenhofer: Algorithmic Models for Sensor Networks. In Proceedings of the 14th International Workshop on Parallel and Distributed Real-Time Systems (WPDRTS), April 2006.
Available at:

Example of an algorithmic result for virtual backbone construction in ad-hoc networks:
Christoph Ambühl, Thomas Erlebach, Matúš Mihaľák, Marc Nunkesser: Constant-factor approximation for minimum-weight (connected) dominating sets in unit disk graphs. In Proceedings of the 9th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems (APPROX 2006), LNCS 4110, Springer, 2006, pp. 3-14.
Extended version available at:

Algorithms with uncertainty

Supervisor: Michael Hoffmann

The area of update algorithms for problems with uncertainty is a relatively recent development.

Most classical algorithms assume that the input is precise. In this model, each input item may instead be given in a rough (uncertain) form, and the precise value of any item can be obtained for a certain cost. The algorithm has to decide for which input items the extra cost should be paid in order to solve the underlying problem. Algorithms with uncertainty are rated on how cost-effective they are.
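The query model can be made concrete with a toy problem (my own minimal formulation, not code from any of the papers): each value is known only as an interval, a query reveals the exact value at unit cost, and the algorithm must certify the minimum while paying for as few queries as possible. The greedy rule below always queries the item with the smallest known lower bound; it is not necessarily query-optimal, but it shows the flavour of the model:

```python
def find_min(intervals, reveal):
    """intervals: list of [lo, hi] bounds; reveal(i) returns the exact value
    of item i at one unit of cost. Returns (minimum value, queries paid)."""
    known = [list(iv) for iv in intervals]
    queries = 0
    while True:
        # Candidate: the item with the smallest known lower bound.
        i = min(range(len(known)), key=lambda j: known[j][0])
        lo, hi = known[i]
        if lo == hi:
            # The candidate is fully resolved and every other item is >= lo,
            # so lo is certifiably the minimum.
            return lo, queries
        value = reveal(i)            # pay one query to resolve the candidate
        known[i] = [value, value]
        queries += 1

values = [7, 3, 9]
intervals = [[5, 10], [2, 6], [8, 12]]
print(find_min(intervals, lambda i: values[i]))  # (3, 1): a single query suffices
```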


Skills required:

  • Programming in C++, Java, or a similar programming language
  • Knowledge in time and space complexity
  • Knowledge in statistical analysis

Related publication

  • Thomas Erlebach, Michael Hoffmann, Danny Krizanc, Matúš Mihaľák, Rajeev Raman: Computing Minimum Spanning Trees with Uncertainty. STACS 2008: 277-288.

View Michael Hoffmann's homepage and contact details.

Data Mining

Supervisor: Rajeev Raman

I am interested in data mining. Data mining inherently involves processing large amounts of data, and succinct data structures can help with exactly that.

In addition, I am working on two projects:

  • Mining Probabilistic Data. Much of the data to be mined is harvested and collected automatically, and is therefore uncertain due to sensor errors, inconsistencies in databases and on the web, and so on. Mining such data is a challenge. See a recent PhD thesis I supervised on this topic.

  • Tom Gransden, a student supervised jointly with Neil Walkinshaw, is investigating the use of Machine Learning algorithms to detect proof tactics used in Interactive Theorem Provers. 

Visit Rajeev's homepage and contact details.

Design and analysis of algorithms

Supervisor: Stanley Fung

I am interested in the design and analysis of algorithms, particularly in the areas of:

  • Online algorithms
  • Combinatorial optimisation
  • Computational biology and bioinformatics


Skills required:

  • Background in discrete mathematics and algorithms, as provided in most CS undergraduate degrees

Suggested MSc modules

  • Analysis and Design of Algorithms provides the above skills.
  • C++ Programming and Advanced Algorithm Design, and Game Theory in Computer Science, may also be useful.


See Stanley Fung's homepage and contact details.

Efficient Data Structures

Supervisor: Rajeev Raman

Niklaus Wirth wrote a famous book in 1976 called Algorithms + Data Structures = Programs. What is meant by this is that algorithms manipulate data, and in order to write a good program, one should identify the operations that one performs on the data and choose an appropriate data structure. However, we are living in a world of BIG data, and classical data structures (such as trees or lists) can be incredibly inefficient.

  • One problem is that standard data structures take up very large amounts of RAM compared to the data they store. For example, after reading a 1GB XML document into a DOM tree, memory usage can go up to 15GB of RAM. This blow-up makes it impossible to process BIG data efficiently. Succinct and compressed data structures solve this problem in novel and clever ways. For example, did you know that you can navigate in a binary tree in which each node is represented using at most two bits? I have many potential PhD topics in succinct data structures. See an example PhD thesis I have supervised in this area. There are many practical applications; see e.g. the SiXML project and the Bowtie package.

  • I am also interested in novel algorithms for large-scale data processing, using say MapReduce or similar frameworks, but my interests here are narrowly focussed: you will need to bring a concrete application area (what kind of data are you interested in processing? Why is it challenging to process this kind of data using MapReduce?).
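The two-bits-per-node navigation mentioned above can be sketched in a few lines (a simplified Jacobson-style level-order encoding; a real succinct implementation would answer rank in O(1) using o(n) extra bits rather than the prefix table used here). List the nodes in level order, pad absent children with 0-bits, giving 2n+1 bits for n nodes, and locate children with rank queries:

```python
class SuccinctBinaryTree:
    """Binary tree in a level-order bitmap: bit i (1-based) is 1 iff a node
    exists at that level-order position. Children of the node at position i
    sit at positions 2*rank1(i) and 2*rank1(i)+1."""

    def __init__(self, bits):
        self.bits = bits
        # Prefix-sum rank table; a real implementation uses o(n) extra bits.
        self.rank = [0]
        for b in bits:
            self.rank.append(self.rank[-1] + b)

    def rank1(self, i):                       # number of 1s in bits[1..i]
        return self.rank[i]

    def _child(self, j):
        return j if j <= len(self.bits) and self.bits[j - 1] else None

    def left_child(self, i):
        return self._child(2 * self.rank1(i))

    def right_child(self, i):
        return self._child(2 * self.rank1(i) + 1)

# A 4-node tree in 2*4+1 = 9 bits: root 1 has children 2 and 3;
# node 2 has only a left child, stored at position 4.
t = SuccinctBinaryTree([1, 1, 1, 1, 0, 0, 0, 0, 0])
print(t.left_child(1), t.right_child(1))   # 2 3
print(t.right_child(2))                    # None (node 2 has no right child)
```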

Selected References

(my PhD students' names in bold)
  • Muhammad Muzammal, Rajeev Raman. Mining sequential patterns from probabilistic databases. Knowledge and Information Systems, 2014. DOI: 10.1007/s10115-014-0766-7.
  • Thomas Gransden, Neil Walkinshaw, Rajeev Raman. Mining State-Based Models from Proof Corpora. CICM 2014: 282-297. Published version on arXiv.
  • Stelios Joannou, Rajeev Raman. Dynamizing Succinct Tree Representations. Experimental Algorithms, Lecture Notes in Computer Science, Volume 7276, 2012, pp. 224-235. Final submitted version.
  • Richard Geary, Venkatesh Raman, Rajeev Raman. Succinct ordinal trees with level-ancestor queries. ACM Transactions on Algorithms, 2006, 2 (4), pp.510-534. DOI: 10.1145/1198513.1198516. Final submitted version.

Visit Rajeev's homepage and contact details.

Online Algorithms

Supervisor: Michael Hoffmann

"In computer science, an online algorithm is one that can process its input piece-by-piece, without having the entire input available from the start. In contrast, an offline algorithm is given the whole problem data from the beginning and is required to output an answer which solves the problem at hand. (For example, selection sort requires that the entire list be given before it can sort it.)

As an example of the problem consider the problem of finding a shortest path in a finite connected graph when the graph is unknown and the algorithm receives the node neighbors only when it "enters" the node. It is clear that this problem can not be solved optimally without a simple exhaustive search. Thus, new performance measures have to be introduced, such as competitive analysis, which compares the performance of an online algorithm with that of a hypothetical offline algorithm that knows the entire input in advance." Source: Wikipedia

Within this general topic, some of my recent research has focused on network discovery, where the structure of a network is initially unknown and has to be discovered by probing certain nodes.
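A toy sketch of the probing idea (a deliberate simplification; the query models studied in the paper below are richer, revealing for instance shortest-path information from the probed node): assume a probe at a node reveals exactly its incident edges, and explore greedily outward from a single known node.

```python
from collections import deque

def discover(start, probe):
    """probe(v) -> set of neighbours of v. Greedily probes every node it
    discovers; returns (set of discovered edges, number of probes paid)."""
    discovered, probed, edges = {start}, set(), set()
    queue = deque([start])
    while queue:
        v = queue.popleft()
        if v in probed:
            continue
        probed.add(v)
        for u in probe(v):                      # one probe reveals v's edges
            edges.add(frozenset((v, u)))
            if u not in discovered:
                discovered.add(u)
                queue.append(u)
    return edges, len(probed)

# A small network: 2 - 1 - 3 - 4.
adj = {1: {2, 3}, 2: {1}, 3: {1, 4}, 4: {3}}
edges, probes = discover(1, lambda v: adj[v])
print(len(edges), probes)  # 3 4: all edges found, but every node was probed
```

Probing every node is clearly wasteful (here, probing only nodes 1 and 3 would already reveal all three edges); the algorithmic question is how few probes suffice, and how well an online strategy can do compared to an optimal probe set chosen with full knowledge of the network.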


Skills required:

  • Programming in C++, Java, or a similar programming language
  • Knowledge in time and space complexity
  • Knowledge in statistical analysis

Related publication

  • Zuzana Beerliova, Felix Eberhard, Thomas Erlebach, Alexander Hall, Michael Hoffmann, Matúš Mihaľák, L. Shankar Ram: Network Discovery and Verification. IEEE Journal on Selected Areas in Communications 24(12): 2168-2181 (2006)