Centre for Data Ethics and Innovation Calls For Social Media Regulation
In its first report to Government reviewing online targeting, the Centre for Data Ethics and Innovation has warned that people are unaware of how tech platforms target information at users, and has cautioned that existing regulation is out of step with the public’s expectations.
The report highlighted tech platforms’ methods of targeting and engaging users. On internet addiction, it noted that “some online products incorporated ‘persuasive design’ features to encourage continuous use. And research has found that online targeting could exacerbate addictive behaviours.”
It also covered the ability of platforms to amplify “harmful” content and cautioned: “Content recommendation systems may serve increasingly extreme content to someone because they have viewed similar material. The proliferation of online content promoting self-harm, including eating disorders, is reasonably well documented.”
The CDEI highlighted the negative impact that online platforms have on the sustainability of the traditional news media industry. “Through their targeting systems, platforms have a significant influence over traffic to news publishers’ websites, and therefore the level of advertising revenue they can generate…
“In addition, online targeting may make it harder for the news media to play their traditional role of holding politicians to account. It is likely that the widespread personalisation of online experiences reduces the news media’s ability to identify and scrutinise targeted political messaging. Online targeting can also stop important news from spreading.”
It warned that online targeting could lead to polarisation and social fragmentation via echo chambers and, in turn, shape the content created online: “As content producers are incentivised to study what types of content is amplified through the online targeting system, and create similar content themselves, increasing people’s exposure to their content and maximising their advertising revenues.”
The report drew on an Ipsos MORI workshop of 90 participants and highlighted their main concerns. It also found that survey respondents favoured giving an independent regulator oversight of the use of online targeting systems (61 per cent) rather than relying on self-regulation (17 per cent).
A key concern of participants was the potential for online targeting to exploit vulnerable people. “They thought that vulnerable people have limited capacity to make informed judgments and are more likely to be unduly influenced by online targeting systems.”
Participants were also concerned that online targeting systems could reduce the range or variety of information and perspectives that people see. “With regard to news and political messaging, most thought that this represents a risk to the democratic process. Some thought that this could lead to wider social fragmentation.”
The other main concern was that online targeting systems could expose people to “problematic” content, especially where content was targeted to maximise user engagement. It summarised: “They thought that the cumulative and sustained impact of exposure to ‘problematic’ content increased risks of polarisation and radicalisation significantly.
“This was supported with real examples given by some dialogue participants about close family members developing extreme views towards anorexia and conspiracy theories, which they associated with the use of online targeting systems.”
The report concluded with recommendations to make online platforms more accountable, increase transparency, and empower users to take control of how they are targeted. Proposals included:
- New systemic regulation of the online targeting systems that promote and recommend content like posts, videos and adverts.
- Powers to require platforms to allow independent researchers secure access to their data to build an evidence base on issues of public concern – from the potential links between social media use and declining mental health, to its role in incentivising the spread of misinformation.
- Platforms to host publicly accessible online archives for ‘high-risk’ adverts, including politics, ‘opportunities’ (e.g. jobs, housing, credit) and age-restricted products.
- Steps to encourage long-term wholesale reform of online targeting to give individuals greater control over how their online experiences are personalised.
Roger Taylor, chair of the Centre for Data Ethics and Innovation, said: “Tech platforms’ ability to decide what information people see puts them in a position of real power. To build public trust over the long-term it is vital for the Government to ensure that the new online harms regulator looks at how platforms recommend content, establishing robust processes to protect vulnerable people.”