Moate PhD student Louise McCormack won University of Galway’s annual Threesis competition for her research on how to use Artificial Intelligence (AI) ethically in news distribution.
ETHICAL AI
She will now take her project, ‘Scoring AI for Ethical Trustworthiness’, to the national final in November, where she will compete against finalists from seven other Irish universities.
The Moate native is the co-founder and CEO of Lexe, a news start-up aiming to form an ethical, AI-enabled link between journalists and news consumers. Companies like TikTok and Facebook have already bridged the gap between reader and writer using AI and have “essentially become the news publishers”, according to Louise.
“It’s not done with any of the ethics that you would expect from a news publication platform,” she told Topic.
“Social media powerhouses are bridging this gap by mining your data using recommender algorithms. These algorithms analyse your in-app behaviour in real time and combine this with data you provide, including age, gender and nationality. The algorithm uses this information to manufacture the ‘For You Page’ that forms your news feed, which in turn feeds your world view.
“Companies have advanced profiles on people. People would be scared if they understood how advanced the profiles are,” said Louise.
RADICALISATION
“You are not the one weaving the tapestry of recommended content in each of your apps. The ‘For You Page’ is actually designed for you by the companies that own the apps you use. You simply provide data that is used to spit out content in a targeted manner,” said Louise. “TikTok decides who gets to see each specific piece of content.”
“It’s not an echo chamber. You’re not in control of your algorithm the way people think they are. People’s biometrics, their behavioural data, that’s being used to design exclusively what that person sees.”
This is leading to “huge radicalisation”, according to Louise, who said it has resulted in a rise in gender-based violence and extremist political views. Irish politics has been caught in a tug of war between the left and the right, which Louise suggests has been set in motion by recommender algorithms.
“The right are looking at the left, the left are looking at the right and nobody’s looking upwards at the algorithm,” she said.
QUANTIFYING TRUST
The question of how to create ethical recommender algorithms has presented a hurdle to news organisations. Social media decided the race is run more quickly when you ignore the obstacles, and now traditional media is playing catch-up. This is the barrier Louise is trying to vault through Lexe and her PhD research. She aims to “quantify trust” in the recommender algorithm to achieve ethical AI-enabled news distribution.
Louise is using the European Commission’s principles of AI trustworthiness to inform her PhD research. The principles state that AI should be safe and secure; respectful of privacy; robust and reliable; responsible and accountable; transparent, documented and explainable; and fair and impartial. Her study explores how to make these principles “tangible” and use them to score AI models.
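For readers curious what “scoring” an AI model against such principles might look like in practice, the sketch below shows one simple possibility: a weighted checklist. The principle names are taken from the list above, but the ratings, weights, 0-100 scale and all function names are illustrative assumptions, not Louise’s actual methodology.

```python
# Illustrative sketch only: a weighted checklist for rating an AI system
# against the trustworthiness principles named in the article.
# The weights, ratings and scoring scale are hypothetical assumptions.

from dataclasses import dataclass


@dataclass
class PrincipleScore:
    principle: str   # e.g. "transparent, documented and explainable"
    rating: float    # assessor's rating on a 0-1 scale (assumption)
    weight: float    # relative importance given to the principle (assumption)


def trustworthiness_score(scores: list[PrincipleScore]) -> float:
    """Weighted average of per-principle ratings, returned on a 0-100 scale."""
    total_weight = sum(s.weight for s in scores)
    if total_weight == 0:
        return 0.0
    return 100 * sum(s.rating * s.weight for s in scores) / total_weight


if __name__ == "__main__":
    assessment = [
        PrincipleScore("safe and secure", 0.8, 1.0),
        PrincipleScore("respectful of privacy", 0.6, 1.0),
        PrincipleScore("robust and reliable", 0.7, 1.0),
        PrincipleScore("responsible and accountable", 0.5, 1.0),
        PrincipleScore("transparent, documented and explainable", 0.4, 1.0),
        PrincipleScore("fair and impartial", 0.6, 1.0),
    ]
    print(f"Overall trustworthiness score: {trustworthiness_score(assessment):.1f}/100")
```

In a real assessment, each rating would need to rest on documented evidence gathered for that principle rather than a single number, which is part of what making the principles “tangible” involves.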
Louise, whose family have owned Central Garage in Moate for generations, attributed her dedication to ethics to her grandparents. This has spurred her on to find an ethical bridge between writer and reader.
“It’s very important to give somebody news that they will enjoy reading. But there are much bigger questions around what is the agenda, who gets to decide the algorithm, how is that algorithm policed. That’s why a lot of news companies haven’t used AI. There is no agreed-upon ethical way to use AI to decide what news stories people see,” said Louise.