‘Real and present danger’ warning for Twitter after Dover far-right attack

Twitter is in “real and present danger” from extremism and hate – with the Dover bomber the latest to be linked to the platform, experts have warned.

A man said to be a prolific user of the social media site attacked an immigration centre in Dover, Kent, with crude incendiary devices before he was found dead.

The attack came days after billionaire Elon Musk took over the micro-blogging site and amid concern over cuts to the company’s content moderation team.

On Saturday, counter-terrorism police confirmed 66-year-old Andrew Leak, from High Wycombe, had been motivated by extreme right-wing terrorist ideology.

The UK’s leading specialist in this area, Professor Matthew Feldman, had predicted that perceived growing social and political crises in Britain this autumn and winter would likely lead to a rise in far-right political violence.

He told the PA news agency: “The disgusting Dover attack bears out what I was saying only last week: desperate times can make people do desperately horrible things.

“I fear the extreme right-wing terrorist attack last week will not be the only case of social media extremism escalating to real-world violence in coming months.

“Until we decide to take self-directed (or ‘lone wolf’) terrorism seriously, we are going to be continually shocked by these attacks. But the pipeline from poorly-moderated hatred on social media to unaffiliated individuals physically attacking society’s most vulnerable is frequent, rapidly increasing and seriously alarming.”

Commenting on Leak’s history of spreading harmful but legal content on Twitter, Dr Tim Squirrell, from the Institute for Strategic Dialogue, said: “That guy was posting hateful material since 2014, at the very least, but it’s all legal groups. There’s nothing proscribed in that – he’s not subscribed to neo-Nazi terrorist groups. There’s some far-right content.

“There’s a huge amount of hatred of migrants, gay people or trans people, Jewish people, Muslims. And all of that is harmful but legal, other than some of the things which come under hate crime legislation, potentially.

“Basically, if you can name a conspiracy that anyone would have posted in the last five years, he posted it.”

Even though Leak had few followers, he was prolific in his posts and retweeted an “overtly racist” group which Twitter has refused to suspend.

Dr Squirrell said: “They consistently post some of the most egregious content on Telegram and the fact that they are on Twitter is, frankly, obscene.”

Sunder Katwala, director of independent think tank British Future, raised concern that not only had Leak shared material posted by the group, but the group had since sought to divert blame for his terrorist act.

He said: “They were tweeting to say that they didn’t blame this disturbed man, they blamed the Government, ‘the left’ and the refugees for the fire bombing. That sounds to me like mitigation, condoning terrorism.”

He said the refusal to remove the account, which has some 14,000 followers, showed Twitter had failed to implement changes to the rules promised in Parliament over a year ago in the wake of racist abuse hurled at England footballers.

“They promised that there would be a rule change so that you couldn’t say somebody couldn’t be of a nationality because of their skin colour. And they haven’t implemented that rule yet.”

“They should implement the rule change that makes it against the rules to be racist against the England team, against the Prime Minister Rishi Sunak – or against a user like me,” he said.

Mr Katwala suggested a “no-holds-barred” approach to social media conflicted with “clear evidence” following the Christchurch massacre that online radicalisation can lead to violence and hatred in the real world.

By playing down these findings Twitter risked creating “protected spaces for extreme racism and hatred”, he suggested.

He told PA: “To have an ‘earplugs to the potential victims’ approach to extreme hatred and radicalisation is going to fail on every front. It’s what I didn’t want to be offered when I went to a football ground when I was 11, while people carried on racist chanting and booing.

“You need to change the social norm. It’s not fine to have safe protected spaces for hatred, and racism.”

The Dover attack came as extremists and people banned for spreading hate on the site attempted to rejoin Mr Musk’s Twitter, even though its moderation rules appeared unchanged.

A volunteer network told PA it reported 235 “respawned” accounts in four days from October 28, representing a 20% rise.

Since then, Twitter’s response time has also lengthened from same-day suspension to as long as three days, a British member of the anonymous network said.

Yoel Roth, head of safety and integrity at Twitter, sought to reassure users following staff cuts on Saturday by tweeting: “While we said goodbye to incredibly talented friends and colleagues yesterday, our core moderation capabilities remain in place.”

But Dr Squirrell said: “Since Elon Musk took charge, it looks like it’s chaos in Twitter right now.

“I think there is a real and present danger that the platform becomes effectively unusable for a very large proportion of the people for a variety of different reasons.”

The head of communications and editorial at the independent think tank has identified extremists across a whole spectrum attempting to return, including Islamic State fanatics.

He said: “We can’t allow our most important information ecosystems to become the vanity projects of the richest people in the world. The things that need to be done to right the ship are basically restoring regularity, not firing all your Trust and Safety team.”

Mr Katwala highlighted one group of “respawners” called The Shed, which attempted to return to Twitter after largely being driven out by vigilant volunteers.

Some 50 accounts linked to the hate group had sprung up within days of Mr Musk’s takeover.

Mr Katwala said: “Twitter found themselves in a lot of reputational risk around the level of the surge from these particular networks we’ve been tracking. There have been big surges of people whose litmus test is the use of the N-word.”

Dr Squirrell added that much of this content would fall foul of the “legal but harmful” provisions of the Online Safety Bill, which has been halted again in its journey through Parliament, leading to concerns the ground-breaking legislation may be diluted or even scrapped.
