Why one Lancashire academic feels tougher steps are needed to tackle ‘toxic' fake news

A powerful Government committee has become the latest body to criticise Facebook over the spreading of ‘fake news', as Lancashire academic William Dance explains.

By Mike Hill
Tuesday, 26th February 2019, 11:32 am
Updated Tuesday, 26th February 2019, 12:32 pm
More action is needed to tackle fake news

Taking the form of anything from attention-grabbing clickbait articles to fictional editorials in fake newspapers, disinformation has invaded our daily lives. This false information can be trivial but it can also be dangerous: the Royal Society for Public Health, for instance, has found that half of all parents of small children in the UK have been exposed to disinformation about vaccines on social media.

Disinformation, also known as ‘fake news', is news that is intentionally factually incorrect and is distributed with the goal of deceiving and misinforming its readers. A 111-page report criticising Facebook's role in spreading it has been published by the House of Commons Digital, Culture, Media and Sport (DCMS) Committee to redress the issue of deceptive content online and its “pernicious ability to distort, to disrupt, and to destabilise”.


Lancaster University PhD student William Dance

The DCMS is responsible for the UK's digital infrastructure, ranging from implementing broadband and 4G to helping protect adults and children online. Within its remit is protecting against what are known as ‘online harms': content on the internet which is malicious and potentially dangerous, such as revenge porn, hate speech, and disinformation. The department operates under the premise that “if it's unacceptable offline then it's unacceptable online”.

Spanning 18 months, the inquiry was carried out by a cross-party House of Commons select committee, joined by officials from nine countries representing a total of 447m people. The report comes at a time when social media is being blamed for polarising people's political and social views and helping sow discord across the world.

Following the Cambridge Analytica scandal and news of Russian interference in both UK and US elections, the report aims to protect our democratic system from hostile states which weaponise disinformation online.

According to the report, Facebook and other social media companies deliberately violated privacy and competition law in their treatment of smaller partner companies. The report continues that companies such as Facebook should not be allowed to operate like “digital gangsters”. The chair of the select committee, Damian Collins MP, went on to say: “Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world's biggest companies”.

The comments come after the Facebook CEO refused to appear before the committee, which led to the committee using parliamentary powers to seize a cache of internal Facebook documents from the CEO of Six4Three, a tech firm with which Facebook is currently embroiled in a legal battle.

The documents, which were released in full under parliamentary privilege, show that Facebook not only knew companies such as Cambridge Analytica were exploiting privacy loopholes, but intentionally made companies aware of loopholes they could exploit.

The report recommends a new category of social media company, falling outside the current classifications of “platform” and “publisher”. Sharon White, chief executive of regulator Ofcom, said this new category is needed because companies should be responsible for the content they host and advertise even if they did not produce that content themselves. This move would increase the companies' legal liability and, it is hoped, force them to act on online harms or face harsh financial penalties.

The report adds that a compulsory Code of Ethics should also be implemented and overseen by a regulatory board to help define what constitutes online harms. Perhaps the most shocking part of the report is the committee’s observation that British electoral law is not fit for purpose to deal with online disinformation and digital advertising techniques such as microtargeted political advertising. Microtargeting is an advertising strategy which uses people’s personal and social data to segment them into groups. These groups are then targeted with adverts online which appeal directly to their views and beliefs.

While this can be innocent, such as using internet cookies to show individuals adverts for their favourite websites, it also allows organisations to target social media users with hyper-partisan content that plays on people's fears and prejudices to try to influence their views and voting behaviour. According to the report, the rapid growth of internet advertising has not been accounted for in electoral law, leaving large grey areas which are being taken advantage of.

The report proposes absolute transparency of online advertising, such as displaying who has paid for and who is running an advert – rules which are already in place for political advertising in print and on TV. The report ends with a call for greater digital literacy, acknowledging that the current disinformation crisis is not solely attributable to malicious intent but also to the wider public's inability to judge the veracity of online content.

Digital and media literacy education and initiatives are needed to help improve people's judgement of potentially malicious and deceptive content.

The DCMS Committee, once called a “plucky little panel” by the Washington Post, has produced an unsparing denouncement of Facebook and Zuckerberg's operations, one which presents informed and pragmatic recommendations to the Government to help end the toxic consumption and reproduction of so-called ‘fake news'.

* William Dance is a PhD student and associate lecturer at Lancaster University. He is interested in investigating deception and manipulation in online spaces.