It is hard to believe that any SCL member is not aware of the
report from the Commons Select Committee which was officially published on 29
July, though widely leaked beforehand. But its contents may not be known to all
visitors to this site, particularly those from outside the UK, and the report
might not always have been accurately reflected in the mainstream media’s
reports.
The report warns that we are facing a democratic crisis
founded on the manipulation of personal data, and targeting pernicious views to
users, particularly during elections and referenda. The Committee outlines a
series of recommendations to tackle the problem of disinformation and fake news
facing the whole world.
An official summary of the report can be accessed here
and the full report can be accessed here.
The report’s conclusions and recommendations are set out in full below.
Damian Collins MP, Chair of the Committee, said:
‘We are facing nothing less than a crisis in our democracy –
based on the systematic manipulation of data to support the relentless targeting
of citizens, without their consent, by campaigns of disinformation and messages
of hate.
In this inquiry we have pulled back the curtain on the
secretive world of the tech giants, which have acted irresponsibly with the
vast quantities of data they collect from their users. Despite concerns being
raised, companies like Facebook made it easy for developers to scrape user data
and to deploy it in other campaigns without their knowledge or consent.
Throughout our inquiry these companies have tried to frustrate scrutiny and
obfuscated in their answers. The light of transparency must be allowed to shine
on their operations and they must be made responsible, and liable, for the way
in which harmful and misleading content is shared on their sites.
We heard evidence of coordinated campaigns by Russian
agencies to influence how people vote in elections around the world. This
includes running adverts through Facebook during elections in other countries
and in breach of their laws. Facebook failed to spot this at the time, and it
was only discovered after repeated requests were made for them to look for
evidence of this activity. Users were unaware that they were being targeted by
political adverts from Russia, because they were made to look like they came
from their own country, and there was no information available at all about the
true identity of the advertiser. I believe what we have discovered so far is
the tip of the iceberg. There needs to be far greater analysis done to expose
the way advertising and fake accounts are being used on social media to target
people with disinformation during election periods. The ever-increasing
sophistication of these campaigns, which will soon be helped by developments in
augmented reality technology, makes this an urgent necessity.
Data crimes are real crimes, with real victims. This is a
watershed moment in terms of people realising they themselves are the product,
not just the user of a free service. Their rights over their data must be
protected.
The first steps in tackling disinformation and fake news are
to identify the scale of the problem and the areas where immediate action is
required. In this interim report we have set out a series of recommendations,
including: making the tech companies take greater responsibility for misleading
and harmful content on their sites; providing greater transparency for users on
the origin of content that has been presented to them; raising funding from the
tech sector to provide more media literacy training in schools; and calling for
an international coalition to act against campaigns of disinformation from
Russian agencies and their networks, whose purpose is to disrupt our democracy.’
The report’s conclusions and recommendations are as follows.
Introduction and
background
1. The term ‘fake news’ is bandied around with no clear
idea of what it means, or agreed definition. The term has taken on a
variety of meanings, including a description of any statement that is not
liked or agreed with by the reader. We recommend that the Government
rejects the term ‘fake news’, and instead puts forward an agreed definition
of the words ‘misinformation’ and ‘disinformation’. With such a shared
definition, and clear guidelines for companies, organisations, and the
Government to follow, there will be a shared consistency of meaning
across the platforms, which can be used as the basis of regulation
and enforcement. (Paragraph 14)
2. We recommend that the Government uses the rules
given to Ofcom under the Communications Act 2003 to set and enforce
content standards for television and radio broadcasters, including rules
relating to accuracy and impartiality, as a basis for setting standards
for online content. We look forward to hearing Ofcom’s plans for greater
regulation of social media this autumn. We plan to comment on
these in our further Report. (Paragraph 15)
3. The Government should support research into the
methods by which misinformation and disinformation are created and spread
across the internet: a core part of this is fact-checking. We recommend
that the Government initiate a working group of experts to create a
credible annotation of standards, so that people can see, at
a glance, the level of verification of a site. This would help people
to decide on the level of importance that they put on those sites. (Paragraph
18)
4. During the course of this inquiry we have wrestled
with complex, global issues, which cannot easily be tackled by blunt, reactive
and outmoded legislative instruments. In this Report, we suggest
principle-based recommendations which are sufficiently adaptive to deal
with fast-moving technological developments. We look forward to hearing
the Government’s response to these recommendations. (Paragraph 19)
5. We also welcome submissions to the Committee from readers
of this interim Report, based on these recommendations, and on specific areas
where the recommendations can incorporate work already undertaken by others.
This inquiry has grown through collaboration with other countries,
organisations, parliamentarians, and individuals, in this country and abroad,
and we want this co-operation to continue. (Paragraph 20)
The definition, role
and legal responsibilities of tech companies
6. The Data Protection Act 2018 gives greater
protection to people’s data than did its predecessor, the 1998 Data Protection
Act, and follows the law set out in the GDPR. However, when the UK leaves
the EU, social media companies will be able to process personal data
of people in the UK from bases in the US, without any coverage of data
protection law. We urge the Government to address this loophole in a White
Paper this autumn. (Paragraph 30)
7. We welcome the increased powers that the Information
Commissioner has been given as a result of the Data Protection Act 2018,
and the ability to look behind the curtain of tech companies and to examine
the data for themselves. However, to be a sheriff in the wild west of the
internet, as the Information Commissioner has described her office, the ICO
needs at least the same technical expertise as the organisations under scrutiny.
The ICO needs to attract and employ more technically skilled engineers who
can not only analyse current technologies but also anticipate future ones.
We acknowledge that the Government has given the ICO pay flexibility to
recruit and retain more expert staff, but it is uncertain whether pay
flexibility alone will be enough to attract the expertise that the ICO
needs. We recommend that the White Paper explores the
possibility of major investment in the ICO and the way in which that money
should be raised. One possible route could be a levy on tech companies
operating in the UK, to help pay for the expanded work of the ICO, in
a similar vein to the way in which the banking sector pays for the upkeep
of the Financial Conduct Authority. (Paragraph 36)
8. The globalised nature of social media creates
challenges for regulators. In evidence Facebook did not accept their
responsibilities to identify or prevent illegal election campaign activity
from overseas jurisdictions. In the context of outside interference in
elections, this position is unsustainable and Facebook, and
other platforms, must begin to take responsibility for the way
in which their platforms are used. (Paragraph 44)
9. Electoral law in this country is not fit for purpose
for the digital age, and needs to be amended to reflect
new technologies. We support the Electoral Commission’s suggestion that
all electronic campaigning should have easily accessible digital imprint
requirements, including information on the publishing organisation and who
is legally responsible for the spending, so that it is obvious at
a glance who has sponsored that campaigning material, thereby bringing
all online advertisements and messages into line with physically published
leaflets, circulars and advertisements. We note that a similar recommendation
was made by the Committee on Standards in Public Life, and urge the
Government to study the practicalities of giving the Electoral Commission
this power in its White Paper. (Paragraph 45)
10. As well as having digital imprints, the Government
should consider the feasibility of clear, persistent banners on all paid-for
political adverts and videos, indicating the source and making it easy for
users to identify what is in the adverts, and who the advertiser is. (Paragraph
46)
11. The Electoral Commission’s current maximum fine
limit of £20,000 should be replaced with a larger fine based on a fixed
percentage of turnover, as was recently granted to the Information
Commissioner's Office in the Data Protection Act 2018.
Furthermore, the Electoral Commission should have the ability to refer
matters to the Crown Prosecution Service, before their investigations have
been completed. (Paragraph 47)
12. Electoral law needs to be updated to reflect changes
in campaigning techniques, and the move from physical leaflets and
billboards to online, micro-targeted political campaigning, as well as
the many digital subcategories covered by paid and organic campaigning.
The Government must carry out a comprehensive review of the current rules
and regulations surrounding political work during elections and referenda,
including: increasing the length of the regulated period; definitions
of what constitutes political campaigning; absolute transparency of online
political campaigning; a category introduced for digital spending on campaigns;
reducing the time for spending returns to be sent to the
Electoral Commission (the current time for large political organisations
is six months); and increasing the fine for not complying with the electoral
law. (Paragraph 48)
13. The Government should consider giving the Electoral
Commission the power to compel organisations that it does not specifically
regulate, including tech companies and individuals, to provide information
relevant to their inquiries, subject to due process. (Paragraph 49)
14. The Electoral Commission should also establish a
code for advertising through social media during election periods, giving
consideration to whether such activity should be restricted during the
regulated period, to political organisations or campaigns that have
registered with the Commission. Both the Electoral Commission and the ICO
should consider the ethics of Facebook or other relevant social
media companies selling lookalike political audiences to advertisers
during the regulated period, where they are using the data they hold
on their customers to guess whether their political interests are
similar to those profiles held in target audiences already collected
by a political campaign. In particular, we would ask them to consider
whether users of Facebook or other relevant social media companies should
have the right to opt out from being included in such lookalike audiences. (Paragraph
50)
15. Within social media, there is little or no regulation.
Hugely important and influential subjects that affect us—political opinions,
mental health, advertising, data privacy—are being raised, directly or
indirectly, in these tech spaces. People’s behaviour is being modified and
changed as a result of social media companies. There is currently no sign of
this stopping. (Paragraph 56)
16. Social media companies cannot hide behind the claim of
being merely a ‘platform’, claiming that they are tech companies and have no
role themselves in regulating the content of their sites. That is not the case;
they continually change what is and is not seen on their sites, based on
algorithms and human intervention. However, they are also significantly
different from the traditional model of a ‘publisher’, which commissions, pays
for, edits and takes responsibility for the content it disseminates. (Paragraph
57)
17. We recommend that a new category of tech company be
formulated, which tightens tech companies' liabilities, and which is not
necessarily either a ‘platform’ or a ‘publisher’. We anticipate that the
Government will put forward these proposals in its White Paper
later this year and hope that sufficient time will be built in for
our Committee to comment on new policies and possible legislation. (Paragraph
58)
18. We support the launch of the Government’s
Cairncross Review, which has been charged with studying the role of the
digital advertising supply chain, and whether its model incentivises the
proliferation of inaccurate or misleading news. We propose that this
Report is taken into account as a submission to the Cairncross Review.
We recommend that the possibility of the Advertising Standards Authority
regulating digital advertising be considered as part of the Review. We
ourselves plan to take evidence on this question this autumn, from the ASA
themselves, and as part of wider discussions with DCMS and Ofcom. (Paragraph
59)
19. It is our recommendation that this process should
establish clear legal liability for the tech companies to act against
harmful and illegal content on their platforms. This should include both
content that has been referred to them for takedown by their users, and
other content that should have been easy for the tech companies
to identify for themselves. In these cases, failure to act on the part
of the tech companies could leave them open to legal proceedings launched
either by a public regulator or by individuals or organisations who
have suffered as a result of this content being freely disseminated on a
social media platform. (Paragraph 60)
20. Tech companies are not passive platforms on which users
input content; they reward what is most engaging, because engagement is part of
their business model and their growth strategy. They have profited greatly by
using this model. This manipulation of the sites by tech companies must be made
more transparent. Facebook has all of the information. Those outside of the
company have none of it, unless Facebook chooses to release it. Facebook was
reluctant to share information with the Committee, which does not bode well for
future transparency. We ask, once more, for Mr Zuckerberg to come to
the Committee to answer the many outstanding questions to which Facebook
has not responded adequately, to date. (Paragraph 64)
21. Facebook and other social media companies
should not be in a position of ‘marking their own homework’. As part of
its White Paper this autumn, the Government needs to carry out proactive
work to find practical solutions to issues surrounding transparency that
will work for users, the Government, and the tech companies. (Paragraph
65)
22. Facebook and other social media companies have a
duty to publish and to follow transparent rules. The Defamation Act 2013
contains provisions stating that, if a user is defamed on
social media, and the offending individual cannot be identified, the
liability rests with the platform. We urge the Government to
examine the effectiveness of these provisions, and to monitor tech
companies to ensure they are complying with court orders in the UK
and to provide details of the source of disputed content—including advertisements—to
ensure that they are operating in accordance with the law, or any future
industry Codes of Ethics or Conduct. Tech companies also have a
responsibility to ensure full disclosure of the source of any political
advertising they carry. (Paragraph 66)
23. Just as the finances of companies are audited and
scrutinised, the same type of auditing and scrutinising should be carried
out on the non-financial aspects of technology companies, including their
security mechanisms and algorithms, to ensure they are operating
responsibly. The Government should provide the appropriate body with the
power to audit these companies, including algorithmic auditing, and we
reiterate the point that the ICO’s powers should be substantially strengthened
in these respects. (Paragraph 72)
24. If companies like Facebook and Twitter fail to act
against fake accounts, and properly account for the estimated total of
fake accounts on their sites at any one time, this could not only damage
the user experience, but potentially defraud advertisers who could be
buying target audiences on the basis that the user profiles are connected
to real people. We ask the Competition and Markets Authority to
consider conducting an audit of the operation of the advertising market on
social media. (Paragraph 73)
25. Social media companies have a legal duty to inform
users of their privacy rights. Companies give users the illusion of having
control over their data, but in practice make it extremely difficult
for them to protect it. Complicated
and lengthy terms and conditions, small buttons to protect our
data and large buttons to share our data mean that, although in
principle we have the ability to practise our rights over our data—through
for example the GDPR and the Data Protection Act—in practice it is made
hard for us. (Paragraph 75)
26. The UK Government should consider establishing a
digital Atlantic Charter as a new mechanism to reassure users that their
digital rights are guaranteed. This innovation would demonstrate the UK’s
commitment to protecting and supporting users, and establish a formal
basis for collaboration with the US on this issue. The Charter would be
voluntary, but would be underpinned by a framework setting out clearly the
respective legal obligations in signatory countries. This would help
ensure alignment, if not in law, then in what users can expect in
terms of liability and protections. (Paragraph 76)
27. The United Nations has named Facebook as being
responsible for inciting hatred against the Rohingya Muslim minority in Burma,
through its ‘Free Basics’ service. The service provides people with free mobile
phone access without data charges, but it has also been responsible for the
spread of disinformation and propaganda. The CTO of Facebook, Mike Schroepfer,
described the situation in Burma as “awful”, yet Facebook cannot show us that it has done anything to stop
the spread of disinformation against the Rohingya minority. (Paragraph 82)
28. The hate speech against the Rohingya—built up on
Facebook, much of which is disseminated through fake accounts—and
subsequent ethnic cleansing, has potentially greatly reduced the success of
DFID's aid programmes, judged against the criteria they set for success.
The activity of Facebook undermines international aid to
Burma, including the UK Government’s work. Facebook is releasing a product
that is dangerous to consumers and deeply unethical. We urge the
Government to demonstrate how seriously it takes Facebook’s apparent collusion
in spreading disinformation in Burma, at the earliest opportunity. This is
a further example of Facebook failing to take responsibility for the
misuse of its platform. (Paragraph 83)
29. A professional global Code of Ethics should be developed
by tech companies, in collaboration with this and other governments,
academics, and interested parties, including the World Summit
on the Information Society, to set down in writing what is and what is
not acceptable by users on social media, with possible liabilities for
companies and for individuals working for those companies, including those
technical engineers involved in creating the software for the companies.
New products should be tested to ensure that products are fit-for-purpose
and do not constitute dangers to the users, or to society. (Paragraph
89)
30. The Code of Ethics should be the backbone of tech
companies’ work, and should be continually referred to when developing new technologies
and algorithms. If companies fail to adhere to their own Code of Ethics,
the UK Government should introduce regulation to make such ethical rules
compulsory. (Paragraph 90)
31. The dominance of a handful of powerful tech
companies, such as Facebook, Twitter and Google, has resulted in their
behaving as if they were monopolies in their specific area. While
this portrayal of tech companies does not appreciate the benefits
of a shared service, where people can communicate freely, there
are considerations around the data on which those services are
based, and how these companies are using the vast amount of data they
hold on users. In its White Paper, the Government must set out why the
issue of monopolies is different in the tech world, and the measures
needed to protect users’ data. (Paragraph 91)
The issue of data
targeting, based around the Facebook, GSR and Cambridge Analytica allegations
32. Over the past month, Facebook has been investing in
adverts globally, proclaiming that “Fake accounts are not our
friends.” Yet the serious failings in the company's operations that resulted in
data manipulation, and in turn misinformation and disinformation, have
occurred again. Over four months after Facebook suspended Cambridge Analytica
for its alleged data harvesting, Facebook suspended another company, Crimson
Hexagon—which has direct contracts with the US government and Kremlin-connected
Russian organisations—for allegedly carrying out the same offence. (Paragraph
133)
33. We are concerned about the administrators’
proposals in connection with SCL Elections Ltd, as listed in Companies
House, and the fact that Emerdata Ltd is listed as the ultimate parent
company of SCL Elections Ltd, and is the major creditor and owed over
£6.3 million. The proposals also describe laptops from the SCL Elections
Ltd offices being stolen, and laptops returned by the ICO, following its
investigations, also being stolen. We recommend that the National Crime
Agency, if it is not already, should investigate the connections between
the company SCL Elections Ltd and Emerdata Ltd. (Paragraph 134)
34. The allegations of data harvesting revealed the extent
of data misuse: data was obtained by Cambridge University's Dr Kogan,
facilitated by Facebook and GSR, and exploited for micro-targeting by
Cambridge Analytica and its associated companies, through AIQ. The SCL Group and
associated companies have gone into administration, but other companies are
carrying out very similar work. Many of the individuals involved in SCL and
Cambridge Analytica appear to have moved on to new corporate vehicles. Cambridge
Analytica is currently being investigated by the Information Commissioner’s
Office (ICO) (and, as a leading academic institution, Cambridge University also
has questions to answer from this affair about the activities of Dr
Kogan). (Paragraph 135)
35. We invited Alexander Nix twice to give evidence; both
times he was evasive in his answers, and the standard of those answers fell well
below that expected of the CEO of an organisation. His initial evidence
concerning GSR was not the whole truth. There is a public interest in getting
to the heart of what happened, and Alexander Nix must take responsibility for
failing to provide the full picture of events, for whatever reason. With
respect to GSR, he misled us. We will give a final verdict on Mr Nix’s evidence
when we complete the inquiry. (Paragraph 136)
Political campaigning
36. We recommend that the Government look at ways in
which the UK law defines digital campaigning. This should include online
adverts that use political terminology but are not sponsored by a
specific political party. There should be a public register for
political advertising, requiring all political advertising work to be
listed for public display so that, even if the work does not
require regulation, it is accountable, clear, and transparent for all
to see. There should be a ban on micro-targeted political advertising
to lookalikes online, and a minimum limit for the number of voters sent
individual political messages should be agreed, at a national level. (Paragraph
142)
37. We reiterate our support for the Cairncross Review
and will engage with the consultation in the coming months. In particular,
we hope that Frances Cairncross will give due weight to the role of
digital advertising in elections, and will make concrete recommendations
about how clearer rules can be introduced to ensure fairness and
transparency. (Paragraph 143)
38. The Government should investigate ways in which to
enforce transparency requirements on tech companies, to ensure
that data on paid-for political advertising on social media platforms
is publicly accessible, clear and easily searchable, and identifies the
source, explaining who uploaded it, who sponsored it, and its country of
origin. This information
should be imprinted into the content, or included in a banner at the top
of the content. Such transparency would also enable members of the public
to understand the behaviour and intent of the content providers, and it
would also enable interested academics and organisations to conduct
analyses and to highlight trends. (Paragraph 144)
39. Tech companies must also address the issue of shell
corporations and other professional attempts to hide identity in advert
purchasing, especially around election advertising. There should be full
disclosure of targeting used as part of advert transparency. The
Government should explore ways of regulating the use of external
targeting on social media platforms, such as Facebook's Custom Audiences.
We expect to see the detail of how this will be achieved in its
White Paper later this year. (Paragraph 145)
40. Data sets allegedly enabled Leave.EU to push their
message to groups of people that they might not otherwise have had information
about. This evidence informed our inquiry, backing up concerns that data is
being harvested and utilised from many people unwittingly and used for purposes
of which they may not be aware. It is alleged that Leave.EU obtained data used
during the Referendum from insurance companies owned by Arron Banks.
The Information Commissioner’s Office is investigating both the alleged misuse
of customer data from Arron Banks’ Eldon Insurance Services Ltd and the misuse
of that data by the call centre staff, to make calls on behalf of Leave.EU.
These are extremely serious allegations. We look forward to hearing the final
results of the ICO’s investigations, when it reports in October 2018. (Paragraph
159)
Russian influence in
political campaigns
41. In November 2017, the Prime Minister accused Russia
of meddling in elections and planting ‘fake news’ in an attempt to
‘weaponise information’ and sow discord in the West. It is clear from
comments made by the then Secretary of State in evidence to us that
he shares her concerns. However, there is a disconnect between the
Government’s expressed concerns about foreign interference in elections,
and tech companies' intractability in recognising the issue. We would
anticipate that this issue will be addressed, with possible plans of
action, in the White Paper this autumn. (Paragraph 176)
42. Arron Banks is, reportedly, the largest individual
donor in UK political history. As far as we understand, he met with
the Russian Ambassador, for the first time, in the run-up to the EU
Referendum. Evidence discloses that he discussed business ventures within
Russia and beyond, and other financial ventures, in a series of meetings
with Russian Embassy staff. Arron Banks and Andy Wigmore have misled the
Committee on the number of meetings that took place with the Russian
Embassy and walked out of the Committee’s evidence session to avoid
scrutiny of the content of the discussions with the Russian Embassy. (Paragraph
185)
43. From the emails that we have seen, it is evident
that Arron Banks had many meetings with Russian officials, including the
Russian Ambassador, Alexander Yakovenko, between 2015 and 2017. The
meetings involved discussions about business deals involving Alrosa, the
Russian diamond monopoly; the purchase of gold mines, funded by Sberbank,
the Russian state bank; and the transferring of confidential documents to
Russian officials. Mr Banks seemed to want to hide the extent of
his contacts with Russia, while his spokesman Andy Wigmore's
statements have been unreliable—by his own admission—and cannot
be taken at face value. Mr Wigmore is a self-confessed liar and, as a
result, little significance can be attached to anything that he says. It
is unclear whether Mr Banks profited from business deals arising from meetings
arranged by Russian officials. We understand that the National Crime
Agency (NCA) is investigating these matters. We believe that they should
be given full access to any relevant information that will aid their
inquiry. (Paragraph 186)
44. Arron Banks is believed to have donated £8.4
million to the Leave campaign, the largest political donation in British
politics, but it is unclear from where he obtained that amount of money.
He failed to satisfy us that his own donations had, in fact,
come from sources within the UK. At the same time, we have evidence
of Mr Banks' discussions with Russian Embassy contacts, including the
Russian Ambassador, over potential gold and diamond deals, and the passing
of confidential information by Mr Banks. The Electoral Commission should pursue
investigations into donations that Arron Banks made to the Leave campaign,
to verify that the money was not sourced from abroad. Should there be any
doubt, the matter should be referred to the NCA. The
Electoral Commission should come forward with proposals for more stringent
requirements for major donors to demonstrate the source of their
donations. (Paragraph 191)
45. The Electoral Commission has recommended that there
should be a change in the rules covering political spending, so that
limits are put on the amount of money an individual can donate. We agree
with this recommendation, and urge the Government to take this proposal on
board. (Paragraph 192)
46. We heard evidence of alleged Russian interference in the
referendum on Catalan independence, held in Spain in October 2017. During the
Referendum campaign, Russia provoked conflict, through a mixture of
misleading information and disinformation, between people within Spain,
and between Spain and other member states in the EU, and in NATO. We
heard evidence that showed that Russia had a special interest
in discrediting the Spanish democratic system, through Russian state
affiliated TV organisations spreading propaganda that benefitted those
wanting independence in Catalonia. (Paragraph 197)
47. We recommend that the UK Government approaches
other governments and follows the recommendation agreed by US and
EU representatives, including representatives from this Committee, at the
recent inter-parliamentary meeting at the Atlantic Council. The Government
should share information on risks, vulnerabilities, and best practices to
counter Russian interference, and co-ordinate between parliamentarians
across the world. Only by sharing information, resources, and best
practice will this Government be able to combat Russian interference in
our elections. We look forward to a White Paper this autumn, and
the opportunity for the Government to set out the practical
steps that it will follow to ensure greater global co-operation to
combat Russian interference. (Paragraph 202)
48. Just as six Select Committees have joined forces in
an attempt to combat Russian influence in our political discourse, so the
Government should co-ordinate joint working with the different relevant
Departments. Those Departments should not be working in silos, but should
work together, sharing data, intelligence and expert knowledge, to counter the
emerging threat from Russia and other malign players. (Paragraph 203)
49. We note that the Mueller Inquiry into Russian
interference in the United States is ongoing. It would be wrong
for Robert Mueller’s investigation to take the lead on
related issues in the UK. We recommend that the Government makes a
statement about how many investigations are currently being carried out
into Russian interference in UK politics and ensures that a co-ordinated
structure exists, involving the Electoral Commission and the Information
Commissioner, as well as other relevant authorities. (Paragraph 204)
SCL influence in foreign elections
50. We received disturbing evidence, some of which we
have published, some of which we have not, of activities
undertaken by the SCL-linked companies in various political campaigns
dating from around 2010, including the use of hacking, of disinformation,
and of voter suppression, and the use of the services of Black Cube, an
Israeli private intelligence service, whose work allegedly included
illegal hacking. We also heard of the links between SCL and Christian
Kalin of Henley &amp; Partners, and their involvement in election campaigns in
countries where Mr Kalin already ran, or subsequently launched,
citizenship-by-investment programmes, involving the selling of countries’
passports to investors.
SCL’s alleged undermining of democracies in many countries, by the active
manipulation of the facts and events, was happening alongside work done by
the SCL Group on behalf of the UK Government, the US Government, and other
allied governments. We do not have the remit or the capacity to
investigate these claims ourselves, but we urge the Government to ensure
that the National Crime Agency thoroughly investigates these allegations. (Paragraph
231)
Digital literacy
51. We recommend that the Government put forward
proposals in its White Paper for an educational levy to be raised by
social media companies, to finance a comprehensive online educational
framework (developed by charities and non-governmental organisations).
Digital literacy should be the fourth pillar of education,
alongside reading, writing and maths. DCMS should co-ordinate
with the Department for Education in highlighting proposals to include
digital literacy as part of the Personal, Social, Health and Economic
(PSHE) curriculum. The social media educational levy should be used, in
part, by the Government, to finance this additional part of
the curriculum. (Paragraph 246)
52. There should be a unified public
awareness initiative, supported by DCMS, the Department of Health and the
Department for Education, with additional information and guidance from the Information
Commissioner’s Office and the Electoral Commission, and funded in part
by the tech company levy. Such an initiative would set the context of
social media content, explain to people what their rights over their data
are, within the context of current legislation, and set out ways in which
people can interact with political campaigning on social media. This
initiative should be a rolling programme, and not one that occurs only
before general elections or referenda. (Paragraph 247)
53. The public should be made more aware of their ability to
report digital campaigning that they think is misleading, or unlawful. We look
forward to the work that the Electoral Commission is planning, to bring this to
the fore. (Paragraph 248)