Artificial intelligence: how data protection works

Artificial intelligence (AI) has been on everyone's radar since ChatGPT's surge in popularity. But AI was being used in the professional world long before ChatGPT: its use has been growing steadily for several years and is gradually finding its way into many industries.

Many companies are already using AI software in a wide variety of areas - and the trend is rising. AI can take over work or make it easier. However, data protection should always be taken into account, as AI is usually trained with data. When it comes to personal data in particular, companies should take a close look at the legal framework. In this article, we give you tips on how to implement AI successfully and in a legally secure way.

Do you have questions about data protection law? With in-depth expertise in technical and legal matters, our team will provide you with comprehensive advice and support for your concerns. Arrange a non-binding initial consultation with our experts now.

What can AI already do in companies?

Artificial intelligence (AI) can already make many tasks easier or take them over completely (e.g. automatic invoicing, quotations, etc.). These are tasks that have normally required human intelligence - until now. AI works with algorithms that use data, recognise patterns and then find solutions to problems or even make decisions. With the help of machine learning, a branch of AI, an algorithm is able to learn independently.

There are many ways to use AI. Here are a few examples:

  • Composing texts and modules (e.g. in e-mails),

  • Summarising texts,

  • Programming software and applications,

  • Recording and analysing data (e.g. CVs of applicants),

  • Translating into other languages.

Even though AI is getting better and better at this, it also makes mistakes and cannot yet replace human thinking.

Comply with GDPR and data protection laws

The applicable law must be observed when using AI. In the professional environment, this often includes labour law, liability law and copyright law. Data protection law may also apply if personal data is processed by the AI.

No company should therefore deploy AI without sufficient preparation and without first familiarising itself with the applicable requirements. Companies must check whether the AI they use fulfils these requirements.

The following data protection requirements in particular must be observed when using AI:

  1. Data subjects must be informed that their data is being processed through the use of AI (transparency, e.g. information obligations).

  2. Data subjects have the right to object to this use or to the processing of their data as a whole at any time (right to object).

  3. Processing must always be minimised, even when using AI, so make sure that the AI tool implements this (data minimisation); a minimal sketch of such minimisation follows after this list.

  4. The processing of personal data must always serve a specific and legitimate purpose. AI may not store or process data without such a purpose (purpose limitation).

  5. The AI must adhere to the erasure deadlines that apply to the data.

  6. If data is transferred to third parties, either a data processing agreement (German: Auftragsverarbeitungsvertrag, AVV) must be in place or the requirements of Art. 6 GDPR must be met. If data is transferred abroad, there must be an adequate level of data protection in the third country. This is the case if the EU has adopted an adequacy decision (Art. 45 para. 3 GDPR). Otherwise, a transfer is lawful if appropriate safeguards (e.g. standard contractual clauses) are in place in accordance with Art. 46 GDPR. If neither applies, Art. 49 GDPR provides further exceptions (in particular the consent of the data subject).
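
To make point 3 more concrete, here is a minimal Python sketch of data minimisation before a prompt leaves the company: obvious identifiers such as e-mail addresses, phone numbers and known names are redacted from the text before it is passed to any external AI service. The patterns and the name list are illustrative assumptions, not a complete redaction solution.

    import re

    # Illustrative list of known names from internal records (an assumption for this example).
    KNOWN_NAMES = ["Anna Schmidt", "Jonas Weber"]

    EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
    PHONE_PATTERN = re.compile(r"\+?\d[\d\s/().-]{6,}\d")

    def minimise(text: str) -> str:
        """Remove obvious personal identifiers before the text is sent to an AI service."""
        text = EMAIL_PATTERN.sub("[email removed]", text)
        text = PHONE_PATTERN.sub("[phone removed]", text)
        for name in KNOWN_NAMES:
            text = text.replace(name, "[name removed]")
        return text

    prompt = "Summarise: Anna Schmidt (anna.schmidt@example.com, +49 30 1234567) complained about a late delivery."
    print(minimise(prompt))
    # Prints: Summarise: [name removed] ([email removed], [phone removed]) complained about a late delivery.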

Define data protection strategy

Almost every company is affected by data protection law. Therefore, every company should already have an effective and well thought-out data protection strategy in place, which can now also include the use of AI.

Such a strategy should establish internal mechanisms to implement the requirements of the GDPR. This means, for example, that erasure deadlines for data are adhered to automatically and that a record of processing activities (Art. 30 GDPR) is kept and constantly updated.
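
As a simple illustration of such a mechanism, the following Python sketch checks stored records against a retention period and lists those due for erasure. The record structure and the three-year retention period are assumptions for this example; the actual deadlines depend on the legal basis and your erasure concept.

    from dataclasses import dataclass
    from datetime import date, timedelta

    # Assumed retention period for this example; real deadlines depend on the legal basis.
    RETENTION = timedelta(days=3 * 365)

    @dataclass
    class Record:
        subject_id: str
        purpose: str
        collected_on: date

    def due_for_erasure(records: list[Record], today: date) -> list[Record]:
        """Return every record whose retention period has expired."""
        return [r for r in records if today - r.collected_on > RETENTION]

    records = [
        Record("applicant-17", "application procedure", date(2020, 5, 2)),
        Record("customer-204", "contract fulfilment", date(2024, 1, 15)),
    ]

    for record in due_for_erasure(records, date.today()):
        print(f"Erase {record.subject_id} (purpose: {record.purpose})")

In practice, such a check would run on a schedule and log the resulting deletions so that compliance can be demonstrated.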

Before using AI software, it must be checked whether the software is necessary, whether it fulfils the requirements for data minimisation, purpose limitation, etc., and what risks its use entails (risk analysis). Only then should it be deployed.

In addition, companies will have to take a closer look at the EU's AI Act, whose obligations are being phased in gradually, in order to avoid fines from the competent supervisory authorities.

Don't yet have a data protection strategy? Have unanswered questions on the topic? Feel free to contact us at any time for a consultation. Together, we will develop a comprehensive data protection strategy tailored to your company.

Clarify responsibilities

Responsibilities and roles should be clarified before AI is used in the company. There are various options here:

  • Commissioned processing (Auftragsverarbeitung): As a rule, the company acts as the client (controller) and the AI service provider as the processor. A data processing agreement must then be concluded in accordance with Art. 28 GDPR. In this constellation, it is not the processor but the client who is responsible under data protection law. The processor is bound by the client's instructions and may not use the data for its own purposes.

  • Joint controllership: Both the AI provider and the company are jointly responsible for the data processing. As a rule, there is no relationship of instruction or dependency between them.

  • Separate responsibility: The provider is solely responsible for its own processing of the data, while the company is responsible for what it enters into the AI. Each party must fulfil its own data protection obligations.

Depending on who is responsible, different measures must be taken to fulfil the requirements of the GDPR:

In the case of joint controllership, an agreement is required in accordance with Art. 26 GDPR, which defines the division of data protection obligations. Both parties must also ensure that data subjects are informed of the joint controllership in accordance with Art. 13 and 14 GDPR. Technical and organisational measures must also be taken here.

In the case of separate responsibility, each party must fulfil its own data protection obligations in accordance with Art. 5 GDPR. Agreements regarding data transfer and access are useful to ensure that the respective obligations are fulfilled correctly.

Finally, it should be emphasised that regular reviews and data protection impact assessments in accordance with Art. 35 GDPR are necessary, especially when AI is used. You should therefore check carefully which responsibilities lie with which party and plan accordingly at an early stage.

Think about the GDPR in HR as well

The GDPR does not only apply to customers, suppliers and business partners. It applies whenever personal data is processed, which may also be the case internally. This is particularly relevant for human resources.

If AI is used here, care should be taken to ensure that employees' sensitive data is protected. For example, data should only be entered into the AI in anonymised form. In any case, it must be ensured that employees have the same rights as other data subjects.
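
One practical approach is to pseudonymise employee identifiers before any data reaches the AI tool, as in the following Python sketch. Note that pseudonymised data still counts as personal data under the GDPR; full anonymisation also requires removing any means of re-identification. The key and the personnel number shown here are illustrative assumptions.

    import hashlib
    import hmac

    # Illustrative secret; in practice the key must be stored separately and access-controlled.
    PSEUDONYM_KEY = b"replace-with-a-secret-key"

    def pseudonymise(employee_id: str) -> str:
        """Derive a stable pseudonym so analyses remain possible without exposing the real ID."""
        digest = hmac.new(PSEUDONYM_KEY, employee_id.encode(), hashlib.sha256).hexdigest()
        return f"emp-{digest[:12]}"

    # The AI tool only ever sees the pseudonym, never the real personnel number.
    print(pseudonymise("personnel-no-4711"))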

Involve employees at an early stage

Every company stands and falls with its own employees. This is especially true when it comes to data protection, because it is the employees who collect, process, store and delete personal data. And they may also be the ones who "feed" AI tools with this data in order to train them.

This is precisely why it is important to provide employees with comprehensive training at an early stage and involve them in the selection and implementation of AI. This ensures that everyone in the company is up to date in terms of technology, software and data protection law.

Through regular training, employees can implement and apply the applicable laws and thus maintain the company's data protection standard. They also lose their fear of dealing with AI and can be confident that it will support them rather than take away their jobs. This also improves the working atmosphere.

How to make the use of AI in the company a success

The first step before implementing AI in a company should always be a team discussion and a data protection assessment, because not every piece of software fulfils data protection requirements and can be used in a legally compliant way. Check carefully whether the use of AI with personal data makes work processes better and more effective, and how this benefit compares with the data protection risk.

Do you need support? Our team consists of lawyers, data protection officers, auditors, IT security consultants and risk managers who will work for you throughout Germany and in Luxembourg. As specialised management consultants, we provide you with comprehensive support in the areas of data protection, IT law and cyber security. Contact us at any time for a non-binding initial consultation.