EU Digital Strategy 2030 – Setting goals for the digital age

Europe's path to the Digital Decade and the goals along the way

On 9 March 2021, the European Commission presented its objectives for the digital age up to 2030. The core points of the digital strategy are building digital skills, the digital transformation of companies, the creation of secure and sustainable digital infrastructures, and the digitalisation of public services.

The EU’s ambitions are to be broken down into milestones and accompanied by a traffic-light monitoring system. In addition, transnational projects are planned to close gaps in critical EU capacities. Notable is the focus on “digital citizenship”, which aims to strengthen the rights and principles of EU citizens: European values should continue to apply in the digital space.

The EU Digital Strategy 2030 should pursue the following principles:

  • Safe and trustworthy online environment
  • Digital education and know-how building
  • Environmentally friendly digital systems and devices
  • Human-centred services
  • Ethical principles for artificial intelligence
  • Protection of children in the online world
  • Access to digital health services
  • Transparency, privacy and freedom of expression

The EU is thus laying the foundation for the next steps: creating uniform norms and standards to regulate digital supply chains.

In the next stage, the objectives of EU citizens are to be gathered through a consultation procedure. A digital policy programme is then to be drawn up by the end of summer 2021. The inter-institutional declaration on digital principles is expected to yield initial progress by the end of 2021.

A Digital Single Market for Europe

The EU has recognised that data is immensely important in international competition and that Europe as a market carries great weight in the digital economy. How the data of EU citizens is handled can be decisive for the EU’s future prosperity. For the European economy to actually benefit, the European Data Strategy is intended to create a significant competitive advantage through a single market for data. The aim is to share data across industries within the EU and to reorganise data access. A further goal is to pool resources to build European cloud capacity. With EU-wide data management, sensor data can, for example, be collected and analysed to improve operational efficiency, real-time navigation and notifications can be enabled, and industrial and business data can be used to optimise business processes.

New law for gatekeepers

At the end of 2020, the EU published a proposal for a regulation on digital markets. It follows the logic of the EU single market and obliges “large platforms”, termed gatekeepers (those reaching more than 10% of the EU population), to support a fair business environment. The Digital Markets Act thus helps protect dependent business users and customers from unfair practices. For example, gatekeepers will not be allowed to systematically rank their own products above those of third parties, or to prevent customers from contacting third parties outside the platform. Likewise, it will no longer be permissible for pre-installed software on smartphones to be impossible to uninstall.

How does the EU intend to enforce this? With fines (up to 10% of the company’s annual worldwide turnover), periodic penalty payments (up to 5% of average daily turnover) and, in serious cases, even the divestiture of business units. Beyond consumer protection, the Digital Markets Act also aims to support smaller online platforms in the EU and to provide a uniform set of rules across the Union.
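To make these caps concrete, the following minimal sketch computes the upper bounds named above; the function names and the example turnover figure are illustrative assumptions, not part of the regulation, and this is of course not legal advice.

```python
# Illustrative sketch of the DMA penalty ceilings quoted in the text
# (10% of annual worldwide turnover; 5% of average daily turnover).
# Function names and example figures are the author's assumptions.

def dma_fine_cap(annual_worldwide_turnover: float) -> float:
    """Maximum one-off fine: up to 10% of annual worldwide turnover."""
    return 0.10 * annual_worldwide_turnover

def dma_daily_penalty_cap(annual_worldwide_turnover: float) -> float:
    """Maximum periodic penalty payment: up to 5% of average daily turnover."""
    average_daily_turnover = annual_worldwide_turnover / 365
    return 0.05 * average_daily_turnover

# Example: a hypothetical gatekeeper with EUR 50 bn annual turnover
turnover = 50_000_000_000
print(dma_fine_cap(turnover))           # cap of EUR 5 bn as a one-off fine
print(dma_daily_penalty_cap(turnover))  # roughly EUR 6.85 m per day
```

Even at these rough orders of magnitude, it is clear why the caps are tied to turnover rather than fixed amounts: a flat fine that deters a small platform would be negligible for a gatekeeper.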

Read more: The Digital Markets Act (DMA)

EU AI strategy

1. On the artificial intelligence (AI) front

The EU has been working for years on approaches to regulating AI development for manufacturers, operators and users. Trustworthy and robust AI is a prerequisite for sustainable AI use (White Paper on Artificial Intelligence). In the future, the EU even aims to invest up to €20 billion annually in this sector.

2. EU legal framework for AI

What happens if an AI malfunctions? Who should be liable if people are injured as a result? Where there are opportunities, there are often risks; in AI development in particular, there is a risk of privacy violations or discrimination. To reduce such disadvantages, the EU plans a legal framework covering these scenarios, for example civil liability. There are also plans to regulate the use of AI in the judiciary (Internal Affairs Committee, AI in Criminal Law). Developments here are worth watching, as they affect not only the manufacturers of an AI but also the companies that operate one. High-risk applications in particular, such as those in HR, education (e.g. performance assessment of pupils), medical procedures, criminal law and justice in general, autonomous driving, the energy industry, the holding of elections, military systems and the granting of loans, are considered important and are to be regulated more closely in the future.

3. Ethical aspects of AI and robotics

Since prejudices established in society carry over into software, the value system embedded in programming (software ethics) plays a major role. The European approach to excellence and trust in AI applications creates ethical standards through its own legislative initiative, which is intended to define the ethical principles governing the development and operation of AI systems. Values such as transparency, protection against discrimination, social responsibility, respect for privacy and data protection, accountability and security play a major role. The aim is to put people at the centre and to make “AI made in EU” a distinctive seal of approval for safety and sustainability.

Read more: Digital ethics in theory and practice

4. Corporate Digital Responsibility (CDR)

It is precisely developments such as those promoted by the EU that strengthen many companies’ motivation to take a closer look at their own digital corporate values. A sustainable digital economy allows users to engage with products and services with confidence. To achieve this, organisations are questioning their own value compass with regard to data use. Corporate Digital Responsibility (CDR) establishes processes and frameworks to build trust with customers and dismantle barriers to use. This strengthens reputation on the one hand and, on the other, creates the foundation for a product or service to be accepted by end customers in the first place. If there are doubts (lack of transparency) or mistrust (misuse of data, violations of ethical principles), an offer will simply not be taken up and, in the worst case, prior investments will be lost. With CDR, organisations across Europe are now working out principles and guidelines for the future of the digital economy: a people- and value-oriented design of digitalisation.

EU Digital Strategy – The Digital Decade has begun

The EU Digital Strategy 2030 presents many comprehensive concepts for the future use of data in the European economic area. Over the next few years, hardly any company will be able to leave digital sustainability or corporate digital responsibility off its agenda. To ensure that everyone benefits from progress, ethical principles will have to be taken into account in the design, development, implementation and operation of new technologies.

The sooner companies address the new regulations in the EU’s project pipeline, the sooner they will be able to offer added value in the market and give their customers a sustainable quality feature to rely on in the long term.

Mag. Karin Dietl is an independent management consultant and a specialist in data protection compliance. She began her training as a textile chemist, passed the bar exam, spent several years in international commercial law firms and has focused on the digital economy since 2010. She currently advises companies on information security, data protection, risk management, digital ethics and Corporate Digital Responsibility. She also conducts data protection audits and serves as an external data protection officer for companies. In addition, she speaks at industry events and is the author of numerous specialist publications.

