Spreading harmful content on social media should be an offence like dumping chemicals


Social media companies should be held liable for the spread of harmful disinformation on their platforms, according to a panel of experts, including the former chair of the Digital, Culture, Media & Sport select committee. They also warn that the next “fake news” battle will be over coronavirus vaccine take-up.

The former chair of a parliamentary committee charged with investigating social platforms has called for the deliberate spreading of harmful content on social media to be criminalised, with both users and social media companies legally liable.

Damian Collins, MP for Folkestone and Hythe, told PRWeek’s Crisis Communications virtual event that it is not enough for social media executives to use a “free speech” defence to absolve their platforms of responsibility – they should be held to account alongside the bad actors who deliberately spread harmful disinformation.

“People have free-speech rights, but they don’t have the right to say things that will lead to other people harming themselves or being harmed, imminent harm and danger being caused,” said Collins, who chaired the Digital, Culture, Media & Sport select committee between October 2016 and November 2019.

“If your [algorithm is] directing people towards extreme opinions and conspiracy theories because it’s a better way of holding an audience, and that behaviour is irresponsible, you bear some liability for the consequences of that activity.

“It should be seen in the same way that it’s an offence for a business to dump chemicals and hazardous waste into the water supply. With other industries, we’ve created rules and regulations to protect public health. We should create a liability for the social media companies so that they act against networks and known organisations that are putting out information that could lead to real-world harm.”

Social media companies have long argued that they are technology businesses, rather than media platforms that exercise editorial control. Therefore, they argue that they cannot be held liable for content posted by users, but will reactively remove content that breaches their community standards.

Recently, they have begun to take some steps towards stronger policing of hate speech. Twitter has hidden a tweet from US president Donald Trump that glorified violence; Reddit and Twitch have banned pro-Trump accounts over hate speech; and this week Facebook banned hundreds of accounts, groups and pages that it said were linked to the US far-right extremist Boogaloo movement.

However, critics argue that platforms have acted too slowly, passively and inadequately for years, with harmful content often spreading to large audiences before it is even flagged. A Facebook Live broadcast of the Christchurch mosque shootings in March 2019 spread on to other platforms and went viral before it was taken down.

Social media platforms have also been criticised over their algorithms, which help conspiracy theorists and hate groups connect with like-minded individuals to share extremist content.

Collins cited a Wall Street Journal report which revealed that, in 2018, Facebook executives were aware its recommendation engine stoked divisiveness and polarisation, with negative consequences for society.

The report said Facebook’s leadership ignored the findings, in part because the proposed changes might have disproportionately affected conservative users and hurt engagement, and tried to absolve the company of responsibility for the polarisation it had directly contributed to.

A safe haven for conspiracies

Social media platforms have long been a safe haven for hate groups, extremists and conspiracy theorists.

During the coronavirus pandemic, conspiracy theories and disinformation have thrived on social media, often spreading across the world faster than the virus itself.

This has fuelled false narratives. A conspiracy theory blaming the coronavirus on 5G masts led to the vandalism of mobile towers around the world. Studies have found that up to 30% of people believe Covid-19 was created in a laboratory in China, while antisemitic rumours have claimed that Jewish people created a vaccine for the virus before it was discovered in Wuhan.

Infamously, Trump touted the use of dangerous chemicals as miracle cures. Although these views have been challenged and corrected in the media, they have been amplified and spread across social media, sometimes with deadly consequences.

“At the beginning of the pandemic… in Iran around 800 people actually died drinking methanol,” Dr Rima Merhi, a consultant and media and communications lecturer in Lebanon, said.

“When you take into consideration that you also have Trump talking about disinfectants, it’s not only a case of countries in the Middle East where people are ignorant, it’s also world leaders sometimes sending out the wrong messages. The Brazilian president said things which they considered misinformation and would harm the public, and this video was [taken down] by Facebook, Twitter and YouTube.”

Merhi said that in authoritarian countries such as Iran, where the media is often a state apparatus, people rely more heavily on social media for information.

“They’re not sitting in New York checking Google to see the impact of methanol. You ended up with 800 people killed drinking methanol,” she said. “In some cases, it should be an offence [to deliberately spread harmful disinformation], but so long as it is really an offence. Somebody being malicious, not somebody being ignorant, and not the state using it to put somebody they don’t want to speak out in jail or to interrogate them.”

It is for this reason, argued Will Moy, chief executive of the fact-checking charity Full Fact, that any legislation against the malicious spreading of harmful content must be proportionate to the crime.

“We have to recognise that the UK, which may be one of the early movers in making laws in this area, could do great harm by setting precedents that other countries use and misuse, to abuse their own populations and others,” he said. “What we need to see from lawmakers in the UK is proportional responses to clearly identified harms.”

The Covid-19 vaccine battle

Experts warned that the next “big national battle on false information” will be the take-up of a coronavirus vaccine.

“When we finally have a vaccine for this situation, there will be a concerted effort to discredit it,” Moy explained. “They don’t need to persuade everybody this is dangerous, they just need to insert enough doubt into the conversation that people decide not to take it.”

For herd immunity to be achieved, vaccines usually require a coverage rate of at least 80-90% of a population, meaning a propaganda campaign needs to dissuade only one or two people in 10 to undermine it.
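A rough sketch of the arithmetic, using the standard epidemiological threshold rather than any figure cited by the panel: for a disease with basic reproduction number $R_0$, the share of a population that must be immune to halt sustained transmission is

\[
  p_c = 1 - \frac{1}{R_0}
\]

A disease with $R_0 = 5$ gives $p_c = 80\%$; $R_0 = 10$ gives $p_c = 90\%$. If 80-90% of a population must be vaccinated, a disinformation campaign succeeds by dissuading only the marginal 10-20%.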

Moy believes it will be interesting to see whether “old-fashioned ways of communicating”, such as advice from doctors, the government and public-health authorities, will be able to overcome the highly targeted propaganda campaigns that are likely to play out on social media.

Mark Borkowski, founder and head of Borkowski PR, said there should be an aggressive and visible effort to call out bad actors and social media companies.

“I think we should go much further in terms of naming and shaming tactics and [calling out] what people are doing. And we need to increase the pressure on the likes of Twitter, Facebook and YouTube to go [after] these people, who will take note,” he said.

“We talked about David Icke being pulled down, and Katie Hopkins as well, but it took a long time for that to take place. I’m involved with something at the moment and YouTube are really dragging their heels [to deal with] someone who’s falsifying a lot of information, making up statistics… [they are very] reluctant to intervene.”

Borkowski didn’t mince his words when describing the impact of disinformation spreading on social media.

“It is polluting the web – it is no different from throwing poisonous toxic effluent into the web,” he said. “We’ve got to start looking at those people.”

A version of this story first appeared in PRWeek.


