
Department of Computer Science and Technology

What does designing new quantum algorithms have to do with predicting the outcome of a general election? The surprising answer is: more than you might think. It turns out that an exit poll on the night of a general election could be a good proxy for the way such new quantum algorithms will need to work – namely, by inferring the answer from a small, randomly selected sample.

So says Tom Gur, a Professor of Computer Science here. Tom’s research focuses on theoretical computer science and quantum computing and he is particularly interested in designing new quantum algorithms – a notoriously difficult challenge.

It’s one that the quantum research community has been grappling with for three decades, but to date progress has been slow and successes sparse. However, as announced today, Tom has just been awarded a €1.5m Starting Grant by the European Research Council (ERC) for a research project that he hopes could lay the foundations for changing this.

He is one of 500 early-career researchers across many disciplines to be awarded one of these prestigious grants. Highly sought-after and extremely competitive (this year, only 14 per cent of applications were successful), they are intended to help young scientists launch their own projects, form their teams and pursue their most promising ideas and cutting-edge research.

The real-world challenge of Big Data 
In Tom’s case, though his project is theoretical, "it is motivated by real-world challenges," he says. Such challenges include the sheer size of datasets that computers are now being asked to handle. Consider the size of a graph of the global Facebook community, or the set of data from space-borne sensors measuring changes in the Earth’s atmosphere, biomass and ocean life. These can be petabytes in size (a petabyte is a thousand times larger than a terabyte), and their massive scale makes efficient computation very challenging.


To change this, Tom argues, our notion of efficient computation has to evolve beyond polynomial time, the standard of the last 60 years. "We urgently need ultra-fast algorithms that run in sublinear time – i.e. algorithms that finish in less time than it would even take to read their input.

"In the quantum era," he adds, "this need is even more imperative since the size of a quantum system scales exponentially in the number of quantum bits."

He gives the example of a quantum system that has only 127 quantum bits: its description size (at more than 10^76 real numbers) is close to the number of atoms in the observable universe. "So, sublinear computation is essential in the quantum regime. Otherwise, working with these exponentially large objects becomes completely intractable."
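The arithmetic behind that figure can be checked directly. The short sketch below (plain Python, no quantum library assumed) counts the numbers needed to write down an n-qubit state: 2^n complex amplitudes for a pure state, and a 2^n × 2^n density matrix for a general mixed state, the latter being what yields the 10^76 figure for 127 qubits.

```python
def pure_state_numbers(n_qubits: int) -> int:
    """A pure state of n qubits takes 2^n complex amplitudes,
    i.e. 2 * 2^n real numbers."""
    return 2 * 2 ** n_qubits

def mixed_state_numbers(n_qubits: int) -> int:
    """A general (mixed) state takes a 2^n x 2^n density matrix,
    i.e. roughly 2 * 4^n real numbers."""
    return 2 * (2 ** n_qubits) ** 2

for n in (1, 10, 50, 127):
    print(f"{n:3d} qubits: pure ~ {pure_state_numbers(n):.3e} reals, "
          f"mixed ~ {mixed_state_numbers(n):.3e} reals")

# For 127 qubits the density-matrix description already exceeds
# 10^76 real numbers, matching the figure quoted in the text.
assert mixed_state_numbers(127) > 10 ** 76
```

Note that the counts double with every added qubit, which is why no amount of hardware progress makes reading the full description feasible.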

And that means designing algorithms that read just a tiny fraction of the data, yet produce an answer in less time than it would take merely to read the whole dataset.

Algorithms that produce an answer in less time than it takes to read the data
This sounds baffling. And scarcely possible. "Supposing we want to learn about the structural properties of massive graphs such as the Facebook network, which consists of roughly two billion nodes," Tom says. "How can we deduce meaningful information about this social network if its massive size doesn’t even allow us to read it?" But this is where the analogy of the general election opinion poll comes in.

"When polling companies run opinion polls during an election campaign, they don’t ask every single voter who they voted for, just a random sample. Yet, this gives us some statistical information about the predicted outcome of the election.

"So such polls are a good example of a sublinear type of approximation algorithm: you may only be looking at a small sample, yet you are able to deduce a global property – such as, who is going to be elected." (In the 2024 General Election, the BBC's exit poll forecast Labour would win 410 seats, just a fraction less than the 411 they actually won.)
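The polling analogy can be made concrete with a small simulation. The sketch below uses hypothetical numbers chosen purely for illustration: an electorate of 200,000 voters, 52 per cent of whom back one candidate, "polled" by inspecting only 2,000 of them at random, a sublinear fraction of the whole.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical electorate: 200,000 voters, 52% of whom back candidate A.
electorate = [1] * 104_000 + [0] * 96_000
random.shuffle(electorate)

# The "exit poll": inspect only 2,000 randomly chosen voters (1% of the data).
sample = random.sample(electorate, 2_000)
estimate = sum(sample) / len(sample)

true_share = sum(electorate) / len(electorate)
print(f"true share of A voters: {true_share:.3f}")
print(f"poll estimate from 2,000 voters: {estimate:.3f}")
```

Standard concentration bounds put the sampling error here on the order of 1/√2000 ≈ 0.02, and, crucially, that error depends on the sample size rather than the size of the electorate. That independence from the input size is exactly what makes the procedure sublinear.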

That, for him, is the starting point of finding a way to make the design of quantum algorithms much easier. And he’ll be bringing in some sophisticated tools from deep areas of mathematics, such as additive combinatorics and harmonic analysis, to help with the task.

"The first step is to develop mathematical tools and use them to create a general-purpose machinery for boosting the power of quantum algorithms," he says.

"We start by designing a very limited quantum algorithm that only manages to solve the problem on a tiny fraction of inputs. So it’s an algorithm that is mostly wrong and is only marginally better than just randomly guessing the answer. Once we have that, we can then amplify this weak algorithm to make it into an algorithm that is always correct."
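In the classical setting there is a standard analogue of this amplification step: run a weak randomised algorithm many times independently and take a majority vote, which drives the error down exponentially (a Chernoff bound). The sketch below is only that classical toy, with an invented weak_solver that is right 55 per cent of the time; the quantum machinery the project aims to build is far more sophisticated than this.

```python
import random

def weak_solver(x: int) -> bool:
    """Invented toy weak algorithm for 'is x even?': correct with
    probability only 0.55, i.e. barely better than a random guess."""
    truth = (x % 2 == 0)
    return truth if random.random() < 0.55 else not truth

def amplified_solver(x: int, repetitions: int = 601) -> bool:
    """Majority vote over independent runs: by a Chernoff bound the
    error probability decays exponentially in the repetition count."""
    votes = sum(weak_solver(x) for _ in range(repetitions))
    return 2 * votes > repetitions

random.seed(1)  # fixed seed so the sketch is reproducible
trials = 200
correct = sum(amplified_solver(x) == (x % 2 == 0) for x in range(trials))
print(f"amplified solver correct on {correct} of {trials} inputs")
```

With 601 repetitions per input, each majority vote is wrong with probability under about one per cent, so the amplified solver answers almost every input correctly even though each individual run is barely better than a coin flip.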

Designing fundamentally new sublinear quantum algorithms
Once that step is completed, he plans to use this machinery to design fundamentally new sublinear quantum algorithms "ranging from quantum machine learning algorithms to ones that could shed light on the very nature of quantum mechanics."

This is "a high-risk, high-gain research programme, which targets long-standing open problems in sublinear quantum computation and quantum complexity theory."

But its value, he adds, is that "it will pioneer new notions of quantum computation, motivated by real-world needs, and establish fundamental connections between quantum computing and deep areas of mathematics such as harmonic analysis and additive combinatorics."


Published by Rachel Gardner on Thursday 5th September 2024