Earlier this month, the Centre for Countering Digital Hate (CCDH) – a lobby group campaigning for more online censorship, run by Imran Ahmed, a former adviser to the ex-Labour ministers Hilary Benn and Angela Eagle – held an “emergency” meeting to discuss the role of social media in fuelling the civil disorder that followed the murder of three girls in Southport (GB News, Spectator, Times).
The meeting was attended by officials from the Department for Science, Innovation and Technology (DSIT), the Home Office, Ofcom and the Counter Terrorism Internet Referral Unit at the Met Police, as well as representatives of the Community Security Trust, Tell Mama, the Incorporated Society of British Advertisers and current and former MPs.
Cosily enough, this little get-together was held under the Chatham House rule, meaning that when the CCDH published the policy recommendations that emerged from the meeting it wasn’t required to attribute views to individual participants.
The most eye-catching of these is that the Online Safety Act should be amended to grant Ofcom – one of the bodies represented at CCDH’s conflab – additional “emergency response” powers to fight ‘misinformation’ that poses a ‘threat’ to ‘national security’ and ‘the health or safety of the public’.
As FSU General Secretary Toby Young points out in the Spectator, the stipulation that these new powers would only be used in an ‘emergency’ or a ‘crisis’ is not particularly reassuring, given that such terms “have a remarkable tendency to expand to suit the agenda of our would-be censors”.
The CCDH’s proposal involves amending the section 175 ‘special circumstances’ power created by the Online Safety Act so as to enable Peter Kyle MP, the Secretary of State for DSIT (also represented at CCDH’s chinwag, of course), to issue a ‘directive’ to Ofcom to ramp up its censorship powers whenever the Government feels there is a threat to national security or to the health and safety of the public. Not only that, but the ‘objective’ he would be ‘directing’ Ofcom to prioritise – which online content to remove, for instance – would be defined by him.
Mr Kyle is a close confidant of Sir Keir Starmer’s, and, judging from interviews he’s given about the shortcomings of the Online Safety Act, it’s a safe bet he’d be in favour of amending it to grant himself the power to order Ofcom to remove content that, in his view, poses a threat to public safety.
We’re frequently told by environmental lobbyists that we’re in the midst of a ‘climate emergency’ – indeed, as far back as 2018, Mr Kyle was boasting to his constituents of how he and his Labour Party colleagues had “managed to force the Government to declare a Climate Emergency”.
So what guarantee is there that, as Secretary of State for DSIT, he won’t persuade himself that ‘climate denial’ – which the CCDH, in a report published earlier this year, defined as “arguments used to undermine climate action” – is causing significant harm to the health or safety of the public and issue a directive to Ofcom to remove this dangerous ‘misinformation’?
You might think: “So what? Climate denialism is ridiculous, potentially harmful, and should be removed from social media.”
The difficulty, however, is that while it’s indisputable that average global temperatures have increased since the mid-nineteenth century, lots of people, including climate scientists, hold a range of views about the causes and effects of climate change, and that in turn influences their opinions about the best way to tackle it – or, indeed, about whether “climate action”, as the CCDH describes it, is possible or necessary. Different solutions to tackling climate change are informed by different values, and recommending one approach over another inevitably involves making a political choice. There is no such thing as an apolitical, ‘scientific’ solution, and it is therefore a dishonest sleight of hand to categorise dissent from one particular solution as ‘misinformation’.
Aside from the ongoing debates around climate change, there’s the question of what guarantee we have that the Government wouldn’t order the regulator to remove ‘misinformation’ about, say, another pandemic on the grounds that it posed a threat to public health.
We know that during the last Parliament, SNP and Labour MPs were keen to amend the Online Safety Bill to include the vague and capacious phrase “health-related misinformation and disinformation” as a form of legal but harmful speech.
We also know that during the Covid lockdowns a shadowy state agency with no statutory footing was used to monitor perfectly lawful yet dissenting social media posts by politicians, journalists and scientists who were “critical of the Government” on the grounds that they were disseminating ‘misinformation’. Yet the difference between ‘misinformation’ and ‘plausible hypothesis’ – the lab leak theory of Covid-19’s origins, for instance – is often little more than the passage of time.
It’s a point nicely demonstrated by Meta CEO Mark Zuckerberg finally admitting this week that Facebook was wrong to censor Covid-related posts of this kind, and that the company should have fought pressure from the Biden administration to do so.
Ofcom is of course free to meet under the Chatham House rule with as many think tanks as it likes – but under the proposals put forward by CCDH the regulator would become a weapon of censorship wielded by the Government.