Department of Computer Science and Technology

Fact-checking has become increasingly important because of the speed with which both information and misinformation can spread in the modern media ecosystem. But fact-checking is a time-consuming task. For example, to judge the validity of claims about how many jobs were gained under the Trump administration compared to the Obama administration, a journalist would need to search many sources and evaluate the reliability of each of them before making a comparison.

To help counter misinformation, researchers here are developing state-of-the-art software to automatically verify claims made in text – e.g. on Wikipedia pages or in politicians' speeches. This software could then be used to assist journalists, speeding up the fact-checking process. It could also be used by regular internet users to help find evidence for or against claims they encounter.

As the researchers will explain at a Cambridge Festival workshop, 'AI Truth-Tellers: Fact or Fiction?', here on Saturday 18 March, they're working to understand how artificial intelligence needs to reason in order to find the evidence to support, or refute, the claims that are made to us.

As part of that work, they've been surveying what researchers in the field are doing. Last year, they published a survey of the approaches being used – including machine learning and natural language processing methods, and the datasets available – and their limitations. Dr Zhijiang Guo, Dr Michael Schlichtkrull and Professor Andreas Vlachos also found there are some significant challenges yet to be addressed.

One of the key issues, they said, was the choice of labels that automated fact-checking systems use when they have assessed a claim.

Prof Vlachos says: "Separating them just into 'truth' and 'falsehood' can be too simplistic. This is the case when a speaker cherry-picks the facts. To say 'I have never lost a game of chess' may be true. But it can be presented in a way that is misleading – for example, when said by someone who has never actually played chess."
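
To make the point concrete, here is a minimal sketch of a verdict scheme richer than a plain true/false split. The label names and decision rule are purely illustrative assumptions for this example, not the scheme used by the researchers' systems or by any particular dataset.

```python
from enum import Enum
from typing import Optional


class Verdict(Enum):
    """Illustrative verdict labels, richer than a binary true/false.

    A hypothetical label set for the sake of example, not one taken
    from any specific fact-checking system.
    """
    SUPPORTED = "supported"                      # evidence backs the claim as stated
    REFUTED = "refuted"                          # evidence contradicts the claim
    NOT_ENOUGH_EVIDENCE = "not enough evidence"  # no reliable evidence either way
    MISLEADING = "misleading"                    # literally true, but cherry-picked


def label_claim(evidence_supports: Optional[bool], cherry_picked: bool) -> Verdict:
    """Toy decision rule mapping retrieved evidence to a verdict."""
    if evidence_supports is None:
        return Verdict.NOT_ENOUGH_EVIDENCE
    if evidence_supports and cherry_picked:
        # e.g. "I have never lost a game of chess", said by someone
        # who has never actually played a game
        return Verdict.MISLEADING
    return Verdict.SUPPORTED if evidence_supports else Verdict.REFUTED
```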

Another key challenge is how to develop systems for automatically checking claims that are presented in more than one medium – for example, when a piece of text is accompanied by a misleading video, or by an image that is wrongly captioned or used to unduly emphasise a point. So what's the answer to this? "We can do automated fact-checking," says Prof Vlachos, "and we will hopefully get better at it. But to really help people be aware of misinformation that may be directed at them – for example, about the safety of vaccines – it often helps to have a conversation with them about it and open their minds to it."

Here in the Department of Computer Science and Technology, Dr Youmna Farag and Prof Vlachos, together with colleagues from the Open University, the University of Sheffield and Toshiba Research Lab, are just completing a project called 'Opening Up Minds with Argumentative Dialogues'. For this, they took three contentious subjects – veganism, Brexit and COVID-19 vaccination – and developed a chatbot ('argubot') with which participants could discuss and argue about each of them.

The 'argubot' held conversations with participants on the three topics, and the researchers asked the participants whether they thought people who hold views different to theirs had good reasons for their opinions. The researchers showed that such dialogue systems had a positive effect in opening up people's minds to differing opinions on certain topics.
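
As a rough illustration of how a single turn of such an argumentative dialogue might work, the sketch below assumes a simple retrieval-based design: the bot offers an unused argument from the opposing stance. The article does not describe the project's actual implementation, so the argument bank, function names and example sentences here are placeholders.

```python
# Illustrative argument bank; the contents are placeholders, not material
# from the 'Opening Up Minds with Argumentative Dialogues' project.
ARGUMENT_BANK = {
    "vaccination": {
        "for": ["Large clinical trials found serious side effects to be rare."],
        "against": ["Some people feel the approval process moved too quickly."],
    },
}


def argubot_turn(topic: str, user_stance: str, used: set) -> str:
    """Offer an argument from the opposing stance that has not been used yet,
    so the participant hears the reasons behind views different from their own."""
    opposing = "against" if user_stance == "for" else "for"
    for argument in ARGUMENT_BANK[topic][opposing]:
        if argument not in used:
            used.add(argument)
            return argument
    return "Those are all the points I have. What do you make of them?"


# Example: a participant in favour of vaccination hears a concern raised
# by people on the other side of the debate.
print(argubot_turn("vaccination", "for", used=set()))
```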

"We think this sort of approach is more effective if we want to debunk false claims and misinformation, and have better public debate on important topics," says Prof Vlachos. "Yes, we can have disagreements, but we can have them constructively. Engaging with other people, learning what they think – and why they think as they do – is likely to be more effective in countering misinformation than simply arguing that their claims are wrong." 

  • The Cambridge Festival workshop 'AI Truth-Tellers: Fact or Fiction?' takes place here in the Department of Computer Science and Technology on Saturday 18 March.

Published by Rachel Gardner on Monday 13th February 2023