Atlas der Automatisierung

Automated Decision-Making and Participation in Germany

The Atlas of Automation is no longer being updated. Its data is therefore no longer up to date.


Regulation

The General Equal Treatment Act (Allgemeines Gleichbehandlungsgesetz – AGG) and the provisions on automated acts of public administration form the overarching legal framework for dealing with automated decision-making.

EQUAL TREATMENT

In Germany, the principle of equality is derived from Article 3 of the Basic Law (Grundgesetz): “All persons shall be equal before the law.” The General Equal Treatment Act (AGG), also known as the “Anti-Discrimination Law”, was enacted in 2006.

The AGG defines equal treatment as the prevention and elimination of “discrimination on the grounds of race or ethnic origin, sex, religion, disability, age or sexual orientation”. The Act covers a wide range of societal aspects (including access to employment, goods, services and housing) and equal treatment is mandatory. Thus, the legislator indirectly provides us with a definition of the state’s understanding of participation. AGG requirements are also relevant for ADM systems.

FULL AUTOMATION OF ADMINISTRATIVE DECISION-MAKING PROCEDURES

Provisions in the Administrative Procedure Act (Verwaltungsverfahrensgesetz – VwVfG) and the Social Code (Sozialgesetzbuch – SGB) regulate the use of automated administrative procedures in Germany.

In general, the authorities are only allowed to use automated procedures if they are legally permitted to do so. As a result, authorities that implement fully automated procedures are required to develop guidelines to ensure compliance with the principle of equal treatment. All automated procedures must be set up in such a way that they recognize when an applicant’s situation deviates from the scenarios provided for in the programming. In such a situation, the case must be assessed individually. In addition, citizens must have the opportunity to present their own point of view, for example if they want to claim special circumstances in their tax declaration.
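
To illustrate this requirement – a fully automated procedure that recognizes when a case deviates from its programmed scenarios and hands it over to individual human assessment – the following Python sketch shows one possible pattern. The case structure, rules and thresholds are invented for illustration and do not reflect any actual administrative system.

```python
from dataclasses import dataclass

@dataclass
class TaxCase:
    """Hypothetical, simplified tax case used only for illustration."""
    declared_income: int                 # in euros
    withheld_tax: int                    # in euros
    claims_special_circumstances: bool   # citizen asks for individual review

def assess_automatically(case: TaxCase) -> str:
    """Fully automated assessment with a fallback to individual human review."""
    # Deviation 1: the citizen presents their own point of view / special
    # circumstances, so the case must be assessed individually.
    if case.claims_special_circumstances:
        return "forward to a case worker for individual assessment"

    # Deviation 2: values outside the scenarios the rules were written for.
    if case.declared_income < 0 or case.declared_income > 1_000_000:
        return "forward to a case worker for individual assessment"

    # Standard scenario: apply the programmed decision rule
    # (a flat 20% rate, purely as a placeholder).
    balance = int(case.declared_income * 0.20) - case.withheld_tax
    return f"automated notice: balance of {balance} euros"

print(assess_automatically(TaxCase(42_000, 9_000, False)))  # automated notice
print(assess_automatically(TaxCase(42_000, 9_000, True)))   # routed to a human
```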

When fully automated systems are in place, the decision criteria used by the algorithms have to be transparent. Furthermore, whenever ADM systems are used by authorities in more than just a support role, their basic principles and decision criteria are subject to the publication requirement.

Risk management procedures, in which cases are forwarded for a more detailed review by a human, must not discriminate: any differentiation has to be substantiated and justified by a logical reason.

FURTHER EU AND NATIONAL REGULATION IN GERMANY

The GDPR

Among the new regulations that have recently come into effect in the EU and in Germany, the General Data Protection Regulation (GDPR) is one of the most important. This EU-wide data protection regulation is implemented, among other places, in the provisions on automated administrative acts described above. Regarding ADM, the GDPR stipulates that citizens have the right to appeal against ADM if three criteria are met: the decision was taken in a fully automated manner, it is based on personal data, and it has legal or similarly significant effects for the person concerned.

There is some disagreement as to whether the GDPR is sufficient to give people adequate protection against disadvantages caused by discrimination through ADM. One possible loophole in the regulation relates to credit. Bureaus such as SCHUFA that evaluate the creditworthiness of clients do not have to explain their procedures to those concerned, even though such scoring must be transparent according to the GDPR and the Federal Data Protection Act (Bundesdatenschutzgesetz – BDSG). Under GDPR Art. 22 and BDSG §31, this obligation would only apply if the credit bureaus themselves took decisions on the extension of credit or something similar. However, those decisions are taken by the financial institutions, which in turn do not disclose the details of their decisions because they do not calculate the score.

Labor Law Criteria at the EU Level

At the EU level, a legal framework regarding the principle of labor equality developed in parallel to labor law in Germany. Various directives and court decisions (especially on Article 157 of the Treaty on the Functioning of the European Union – TFEU) form the foundation for the rights of employees in relation to partially or fully automated decisions. These rights rest on the same principles as the principle of equality, namely the prohibition of discrimination on grounds such as sex, ethnic affiliation and racial origin.

In addition, Article 21 of the European Charter of Fundamental Rights addresses non-discrimination. However, this article is phrased in such a way that the criteria listed in it are not exclusive. This might be of relevance in the future, especially in regard to the application of Article 21 to the protection objectives connected to ADM.

Human Rights

As a sovereign state and an EU member, Germany is a signatory to various human rights conventions. Since ADM also touches on human rights (e.g. the right to personal freedom and security, equality before the law, freedom of religious expression, etc.), its influence on future – new or revised – laws and regulations must be considered.

 


Further industry- or sector-specific regulations, such as those on high-frequency trading (HFT) in stocks and on autonomous cars, are dealt with in the chapter “Education, Stock Trading, Cities & Traffic”. The regulation of medical devices and health technologies is the subject of the chapter “Health and Medicine”.


 

CONSUMER PROTECTION

In principle, the interests of consumers in Germany are well represented by consumer protection bodies and associations. Apart from credit assessments there are few areas in which the social participation of consumers is impacted directly and decisively by ADM systems.

In the retail business sector, ADM is mainly used in online trading. Consumers generally have alternatives if they do not want to order or book online, although these options could become far fewer in the future. In online trading, though not exclusively there, ADM can be used for customer segmentation through Dynamic Pricing or scoring procedures, which in turn can lead to preferential treatment or discrimination. Both practices are legal and legitimate in principle. Yet, they can lead to systematic discrimination or to the exclusion of specific consumer groups.

One well-known example of Dynamic Pricing is the ride-hailing service Uber, which sets its prices depending on demand and the time of day. In other applications of Dynamic Pricing, the price for an offer fluctuates according to the end device consumers use for their request. Insurers give discounts in connection with telematics-based car insurance; in these cases, the tariff level is determined by the driving style, which is established by the telematics system. In customer relationship management, ADM is used to calculate the so-called Customer Lifetime Value (CLV): how profitable customers are, who should receive preferential treatment, and who can be placed at the end of the telephone queue if necessary.
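
As a rough illustration of two of the mechanisms mentioned above, the following Python sketch combines a demand- and time-dependent price multiplier with a simple Customer Lifetime Value estimate. The formulas, inputs and thresholds are invented for illustration and are not the actual models used by Uber, insurers or any other company.

```python
def dynamic_price(base_fare: float, demand_ratio: float, hour: int) -> float:
    """Illustrative surge-style pricing: the price rises with demand and at night.

    demand_ratio = open requests / available drivers (hypothetical input).
    """
    surge = max(1.0, demand_ratio)                     # never drop below the base fare
    night_factor = 1.2 if hour >= 22 or hour < 6 else 1.0
    return round(base_fare * surge * night_factor, 2)

def customer_lifetime_value(avg_order_value: float, orders_per_year: float,
                            expected_years: float, margin: float = 0.1) -> float:
    """A textbook-style CLV approximation, not any specific vendor's model."""
    return avg_order_value * orders_per_year * expected_years * margin

# Example: a customer with a low CLV might be routed to the end of the phone queue.
clv = customer_lifetime_value(avg_order_value=60, orders_per_year=8, expected_years=3)
priority = "high" if clv > 200 else "low"
print(dynamic_price(base_fare=10.0, demand_ratio=1.8, hour=23), clv, priority)
```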

While Dynamic Pricing is not yet common in Germany, scoring procedures have been in use in the consumer sector for a while. Scoring in this context means a categorization of persons according to a number of selected criteria. The combination of specific values of these criteria results in a score that can influence, for example, which price customers pay for a product or whether a bank will extend credit to them. In Germany, the best-known example of scoring is the credit assessment provided by the private company Schufa (see box below: OpenSCHUFA).
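
The scoring mechanism described here – selected criteria whose values are combined into a single score that then drives a decision such as a credit offer – can be sketched as follows. The criteria, weights and cut-off are made up for illustration and have nothing to do with Schufa's actual, undisclosed procedure.

```python
# Hypothetical weighted scoring: each criterion contributes points to a total score.
WEIGHTS = {
    "years_at_current_address": 2.0,
    "existing_credit_lines": -5.0,
    "past_payment_defaults": -40.0,
    "years_of_employment": 3.0,
}
BASE_SCORE = 100
CREDIT_CUTOFF = 90   # below this, the illustrative lender declines

def score(person: dict) -> float:
    """Combine the criterion values into a single score (illustrative only)."""
    return BASE_SCORE + sum(weight * person.get(criterion, 0)
                            for criterion, weight in WEIGHTS.items())

def credit_decision(person: dict) -> str:
    s = score(person)
    return f"score {s:.0f}: " + ("offer credit" if s >= CREDIT_CUTOFF else "decline")

print(credit_decision({"years_at_current_address": 4, "existing_credit_lines": 2,
                       "past_payment_defaults": 0, "years_of_employment": 6}))
```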

Scoring helps companies decide which people they want to establish a customer relationship with. However, the freedom of contract on which such decisions rest is only restricted in essential areas such as tenancy and labor law, because these affect the principle of equal treatment. Whether rejecting customers on the basis of freedom of contract, or price discrimination via scoring or Dynamic Pricing, also runs contrary to the principle of equal treatment in other areas is still subject to debate.

Under current law, consumers have the right to be informed when they are subject to scoring. Yet, according to experts such as the Consumer Affairs Council (Sachverständigenrat für Verbraucherfragen – SVRV), this legal right is insufficiently specified by the law. The existing means available to enforce citizens’ rights are also often criticized as inadequate.

 


OpenSCHUFA

In the spring of 2018, the Open Knowledge Foundation Germany and AlgorithmWatch started the project OpenSCHUFA. The goal was to examine the scoring procedure of Germany’s best-known credit bureau Schufa for potential discrimination. The bureau holds the data of about 70 million citizens (out of Germany’s 83 million population). The company provides information to banks which may result in customers being denied credit. Following a successful crowdfunding drive, more than 3,000 people donated their Schufa reports using an online portal specifically developed for this campaign.

In the autumn of 2018, Spiegel Online and Bayerischer Rundfunk published an analysis of the donated data. The editors emphasized that the data available to them was by no means representative. Nevertheless, they were able to identify various anomalies. For instance, it was striking that a number of people were rated rather negatively even though SCHUFA had no negative information about them, e.g. on debt defaults. There also appeared to be noticeable differences between different versions of the SCHUFA score. https://algorithmwatch.org/en/schufa-a-black-box-openschufa-results-published/

