The European Commission has unveiled policy options for AI and data, as well as a communication on Europe's digital future. Over the next five years, the Commission will focus on three key digital objectives:
- Technology that works for people;
- A fair and competitive economy; and
- An open, democratic and sustainable society.
The Commission’s aim is that new policies and frameworks will enable the EU to deploy cutting-edge digital technologies and strengthen its cybersecurity capacities.
White paper on AI
The Commission has presented a white paper on AI, which is open for consultation until 19 May 2020. The framework envisages trustworthy artificial intelligence, built on excellence and trust. In partnership with the private and public sectors, the aim is to mobilise resources along the entire value chain and to create the right incentives to accelerate the deployment of AI, including by small and medium-sized enterprises. This includes working with member states and the research community to attract and keep talent. As AI systems can be complex and carry significant risks in certain contexts, building trust is essential. Clear rules need to address high-risk AI systems without placing too great a burden on less risky ones. Strict EU rules on consumer protection, unfair commercial practices and the protection of personal data and privacy continue to apply.
For high-risk cases, such as in health, policing or transport, AI systems should be transparent, traceable and guarantee human oversight. Authorities should be able to test and certify the data used by algorithms in the same way that they check cosmetics, cars or toys. Unbiased data is needed to train high-risk systems to perform properly and to ensure respect for fundamental rights, in particular non-discrimination. Although the use of facial recognition for remote biometric identification is generally prohibited and can only be used in exceptional, duly justified and proportionate cases, subject to safeguards and based on EU or national law, the Commission wants to launch a broad debate about which circumstances, if any, might justify such exceptions.
For lower-risk AI applications, the Commission envisages a voluntary labelling scheme for those that choose to apply higher standards.
All AI applications are welcome in the European market as long as they comply with EU rules.
The Commission’s data strategy
The Commission has also published a European data strategy. Its objective is to make sure the EU becomes a role model and a leader for a society empowered by data. To do this, it aims to set up a true European data space, a single market for data, to unlock unused data, allowing it to flow freely within the EU and across sectors for the benefit of businesses, researchers and public administrations. Citizens, businesses and organisations should be empowered to make better decisions based on insights gleaned from non-personal data. That data should be available to all, whether public or private, start-up or giant.
To achieve this, the Commission first intends to establish the right regulatory framework for data governance, access and reuse between businesses, between businesses and government, and within administrations. This involves creating incentives for data sharing and establishing practical, fair and clear rules on data access and use that comply with European values and rights, such as personal data protection, consumer protection and competition rules. It also intends to make public sector data more widely available by opening up high-value datasets across the EU and allowing their reuse for innovation.
Second, the Commission aims to support the development of the technological systems and the next generation of infrastructures that will enable the EU and other parties to grasp the opportunities of the data economy. It will contribute to investments in European high-impact projects on European data spaces and trustworthy, energy-efficient cloud infrastructures.
Finally, it will launch sector-specific actions to build European data spaces in, for example, industrial manufacturing, the Green Deal, mobility and health.
The Commission will also work to further narrow the digital skills gap among individuals, and explore how to give citizens better control over who can access their machine-generated data.
Next steps
The Commission intends to make further proposals later in 2020, including a Digital Services Act, a European Democracy Action Plan, a review of the eIDAS Regulation and a Joint Cyber Unit. The EU will also continue to build alliances with global partners, leveraging its regulatory power, capacity building, diplomacy and finance to promote the European digitalisation model.
Reaction
The European consumer organisation BEUC has responded to the European Commission’s communications. It calls for the EU to put in place safeguards to mitigate risks from AI and data sharing for the EU’s 450 million consumers and to protect their rights in this digital revolution. On the use of AI and algorithmic decision-making (ADM), it calls for:
- Legally binding and enforceable rules on fairness, transparency, accountability, control and safety to ensure AI/ADM is used in a fair and responsible way. Consumers must have a clear picture of how decisions are made and be able to oppose them. Companies must put in place appropriate measures to guarantee compliance and allow adequate regulatory oversight.
- Existing legislation on consumer protection, discrimination, product safety and product liability to be updated to protect consumers against risks arising from AI/ADM and to offer redress and enforcement mechanisms if they suffer harm.
- Systems to assess risks from AI/ADM, so that the higher the potential adverse impact of their use, the stronger the regulatory requirements that apply.
It says that EU rules on accessing and sharing data must:
- require companies to provide access to data only if necessary to correct market failures, eg if companies refuse to grant competitors access to data to prevent them from offering innovative products or services.
- guarantee that data collection and use comply with the GDPR. Special protections need to be put in place when companies access each other’s data.
- create tools for consumers to better control their personal information.
- ensure data security to prevent data leaks.