UK Parliament | Social Media, Misinformation, and Harmful Algorithms: A Deep Dive

Between July 30 and August 7, 2024, the UK witnessed a wave of anti-immigration demonstrations and riots, some targeting mosques and hotels housing asylum seekers. These events were partly fueled by false claims circulated on social media about the tragic killing of three children in Southport.
Some parties blamed Sir Keir Starmer, the Labour Party, and the police for allowing the situation to "run out of control", arguing that basic information could have been released early without prejudicing the case when it finally came to trial.
Ofcom, the UK's communications regulator, revealed that illegal content and disinformation spread "widely and quickly" online following the incident. It highlighted how "algorithmic recommendations" on social media platforms amplified divisive narratives during the crisis, and it criticised social media companies' inconsistent responses to this harmful content.
So, are the social media companies culpable? They create an environment where it is all too easy to write and circulate a post. Some sensible advice: "Write once, think twice before pressing the send button."
Social media companies have been too quick to put profits before their users. Notably, YouTube has more rigorous moderation procedures, and a lower percentage of harmful content appears on its platform.
To investigate these issues further, the UK Science, Innovation and Technology Committee launched an inquiry chaired by Chinyelu Susan "Chi" Onwurah, the Labour MP for Newcastle upon Tyne Central and West since 2024, and previously for Newcastle upon Tyne Central from 2010 to 2024.
The inquiry examines the link between social media algorithms, generative AI, and the spread of harmful or false content online. It also evaluates the effectiveness of current regulations, including the Online Safety Act, and explores whether additional measures are needed. A key focus is the role of these technologies in driving social harm, particularly their influence on the summer 2024 riots.
Key witnesses included:
* Chris Yiu, Director of Public Policy for Northern Europe, Meta
* Alistair Law, Director of Public Policy and Government Affairs, UK and Ireland, TikTok
* Wifredo Fernandez, Senior Director for Government Affairs, X (formerly known as Twitter)
Notably, the UK government invited Elon Musk, owner of X, to participate, but he did not attend. His absence was particularly significant given the platform's role in spreading misinformation during the crisis.

In response to the growing impact of misinformation, the Online Safety Act 2023 tightened regulation of disinformation. The law imposes new duties on social media platforms to minimise the risk of illegal activity on their services and to swiftly remove unlawful content.

Stay tuned as we explore the complex relationship between social media algorithms, misinformation, and societal impact.
Like, comment, and subscribe for more in-depth discussions on tech and society.
