Department of Computer Science and Technology

Image: PhD student Michelle Lee

"I'd urge you to consider a career in AI and data science," Michelle Seng Ah Lee told a group of black state school students recently. "Improving diversity in AI would benefit our industries and our society – making sure diverse backgrounds and thoughts are represented in the algorithms we are releasing into the wild."

Michelle, who hails originally from South Korea, is a PhD candidate here. Her research focuses on fairness, bias and discrimination in machine learning algorithms, and on the trade-offs these involve at aggregate and individual levels.

"I think it's important for young students to see others like themselves in the field to be able to envision themselves following this career."

This work, she says, is becoming more important as companies and governments are increasingly using machine learning (ML) and artificial intelligence (AI) technologies to inform decisions that impact our daily lives.

Tackling the problem of poor AI management 
"AI can help decide whether you get a job, what price you pay for your insurance, whether you’re flagged in a security database, what your credit limit is, and whether your mortgage is approved," she explains. "But poor AI management is increasingly becoming headline news."

As examples, she cites a credit card company investigated by a regulator after its algorithm was accused of giving higher credit limits to men than to women, and AI-powered software piloted in California to predict areas of high crime, which turned out to be tracking areas with high minority populations, regardless of the crime rate.

This is exactly the kind of issue that she wants to address in her PhD. "My work aims to introduce an end-to-end context-conscious methodology to govern the risk of unfair bias in ML," she says.

Last month she was talking about her work to a group of black sixth form students from London. This was as part of the STEM@Cambridge programme run jointly by St John’s College (of which Michelle is a member) and the charity Generating Genius. The programme aims to guide able black students from non-selective secondary schools in London through applications and interviews, to give them a fair opportunity to attend the UK’s most prestigious higher education institutions.

Talking about careers in computer science, Michelle introduced the students to the problems she is studying and discussed the difficulties of addressing them.

Trying to make algorithms fair
"It's hard to make algorithms fair because while the fact that it’s mathematical might make it seem like it's objective, each fairness definition has its own ethical and philosophical values embedded in it," she said. "The real question is, what types of unequal outcomes are ethically acceptable, and which ones are not?"

She talked them through a widely cited case study of an algorithm used in the US criminal justice system to score people convicted of a crime on how likely they were to reoffend after release. Black defendants were found to be twice as likely as white defendants to be incorrectly labelled as being at higher risk of reoffending.

"One of the challenges is that when there's a lack of diversity, there’s a lack of diverse thought."

As she told the students, "the company that created the algorithm maintains that it is non-discriminatory because the rate of accuracy for its scores is identical for black and white defendants. While both perspectives sound fair, they are based on different perceptions of what fairness means, and it is mathematically impossible to meet both objectives at the same time."
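The tension she describes can be shown with a toy calculation. The numbers below are invented for illustration (they are not the figures from the actual case): two groups with different underlying reoffending rates are scored by a classifier that is equally "accurate" for both, in the sense of equal precision, yet the false positive rates diverge.

```python
# Toy illustration of the fairness trade-off (invented numbers, not
# the actual case data): equal precision across groups can coexist
# with very unequal false positive rates when base rates differ.

def rates(tp, fp, fn, tn):
    """Return (precision, false_positive_rate) from confusion counts."""
    precision = tp / (tp + fp)   # of those labelled high risk, how many reoffended
    fpr = fp / (fp + tn)         # of those who did NOT reoffend, how many were labelled high risk
    return precision, fpr

# Group A: 100 people, 50 reoffend (base rate 0.5)
prec_a, fpr_a = rates(tp=30, fp=10, fn=20, tn=40)

# Group B: 100 people, 20 reoffend (base rate 0.2)
prec_b, fpr_b = rates(tp=15, fp=5, fn=5, tn=75)

print(f"Group A: precision={prec_a:.2f}, FPR={fpr_a:.4f}")  # precision=0.75, FPR=0.2000
print(f"Group B: precision={prec_b:.2f}, FPR={fpr_b:.4f}")  # precision=0.75, FPR=0.0625
```

Both groups get precision 0.75 ("the scores are equally accurate"), yet non-reoffenders in Group A are more than three times as likely to be wrongly labelled high risk. With unequal base rates, the two notions of fairness cannot both hold at once.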

One of the issues, as she points out, is that developers may not spot their own biases in the algorithm development process. So, she argues, bringing more women and members of minorities into data science would benefit the field.

While demand for AI specialists and data scientists is high, the field is still very white and very male. "And one of the challenges is that when there's a lack of diversity, there’s a lack of diverse thought," she told students.

Consider a career in data science 
"If the population that is creating the technology is homogeneous, we’re going to get technology that works well for that specific population. They will naturally design toward what they are most familiar with.

"So, if this type of work is of interest to you, I'd urge you to consider a career in AI and data science," she said.

Michelle herself works part-time alongside her PhD studies. She is the AI ethics lead at a professional services firm. "It works very well," she explains. "I run round-table discussions with clients about new risks they are facing in their AI systems and about making their algorithms fairer. These discussions feed into my research – and in turn, my research feeds into some of the approaches and fairness toolkits that I propose to them."

She is thoroughly enjoying her work and is keen to encourage others into the field. "A lot of companies are looking for more diverse teams – especially since we've seen that diversity is better for companies in terms of their productivity, profit and creativity, as well as for building accessible technology and algorithms."

And she is happy to be an advocate for more diversity in recruiting. "There's a certain image and stereotype that people seem to have of computer science, and I think it's important for young students to see others like themselves in the field to be able to envision themselves following this career.

"Computer science is actually a really exciting field with incredible growth opportunities, and it would be a shame if students got dissuaded from pursuing it because they don't think they would fit in."

Published by Rachel Gardner on Sunday 7th March 2021