Security & Surveillance
Whether in the physical or the digital world, when it comes to migration, crime, terror and war, software is involved in decision-making.
Sorting, rejecting and predicting – those are the tasks that ADM systems are most commonly expected to perform in the field of security and surveillance. The most prevalent uses are associated with cameras, Internet traffic surveillance, predictive policing, automated border controls and autonomous weapons systems. However, when the police and other security agencies delegate parts of their work to machines and programs, the results can quickly lead to false suspicions. This infringes on freedom of movement and the presumption of innocence, which are core elements of participation.
FLIGHT, MIGRATION AND BORDER SECURITY
With its “Digitisation Agenda 2020”, the Federal Office for Migration and Refugees (Bundesamt für Migration und Flüchtlinge – BAMF) aims to tackle problems related to its procedures [LINK]. In 2016, an “integrated identity management” system was introduced. Today it contains several modules that support case managers in their decisions. The system is mainly aimed at finding out whether the details given by those seeking protection are plausible. For example, software is used to try to recognize a person’s language of origin from audio recordings. Initially, the error rate of this so-called speech biometrics was approximately 20 per cent; according to the BAMF, this figure has since been reduced to 15 per cent. By mid-November 2018, the procedure had been used about 6,000 times. The software in use has its origins in military forensics, the secret services and the police [LINK]. It is able to analyze telephone data, past connection data and saved telephone numbers. The BAMF claims that refugees voluntarily give it permission to access their telephones. In 2018, the analysis of thousands of refugees’ telephones produced usable results in fewer than 100 cases. Other software is employed by the BAMF to compare photographic portraits and the various possible transliterations of Arabic names into Roman letters. The BAMF considers the use of these automated procedures a success. Critics, however, think that the cost of the procedures and the number of errors are too high. They also complain about the lack of transparency in the way the software systems function and the lack of scientific monitoring to evaluate the procedures’ effectiveness.
Since 2013, the EU has been using the “Smart Border” control system at its borders. As part of the EU-wide “automated border control systems”, Germany already uses completely automated passport controls (EasyPASS) at some German airports. Meanwhile, an entry permit system for the Schengen Area (European Travel Information and Authorization System – ETIAS) is currently in development and is due to come into force in 2021. Alongside the Visa Information System, a structure for Entry/Exit procedures is currently being established. It is supposed to create a database by interacting with the Passenger Name Records already used in air traffic. The plan is for entries to and exits from the Schengen Area to be recorded and stored centrally from 2020. Biometric data, used for face recognition, will play an essential part. In addition, the EU Commission is financing an experiment in lie detection to the tune of 4.5 million euros [LINK]. The experiment, called “iBorderCtrl”, will run until August 2019 and is being tested at the Hungarian and Greek borders, among other locations. iBorderCtrl consists of a computer-animated border guard on a screen which asks questions of the person entering the EU country. The interviewee’s “micro gestures” are recorded by a camera and analyzed to decide whether or not they are lying. Critics say that iBorderCtrl is based on “pseudo-science” [LINK].
A collaboration between the Deutsche Bahn AG train company, the German Federal Police (Bundespolizei) and the Federal Criminal Police Office (Bundeskriminalamt – BKA), aimed at using camera surveillance to detect criminal suspects, ended in mid-2018. The tests took place at Berlin’s Südkreuz train station and ended with questionable results. In a section of the station that Deutsche Bahn AG treats as a test area for new technologies, different software systems were set the task of filtering out criminal suspects via face recognition. The systems were tested for almost a year. Officially, the results were seen as a success because the detection rate was 80 per cent and the false alarm rate (FAR) was under one per cent. The Chaos Computer Club (CCC), however, argues that the claimed detection rate of 80 per cent was inflated: it was only achieved when the results of all three software systems under test were combined. And at Südkreuz, which sees 90,000 passengers pass through per day, a FAR of circa 0.7 per cent would flag approximately 600 people as false suspects every day. The CCC also criticized the sample of test subjects, claiming that it was hardly representative of the population at large in regard to age, sex and origin [LINK]. In the United States, a study showed that some of the face recognition systems in use performed poorly at recognizing African-American women [LINK].
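The arithmetic behind the CCC’s objection is straightforward. A minimal back-of-the-envelope sketch, using only the figures cited above (the daily passenger count and the FAR; everything else is an assumption for illustration), shows why even a sub-one-per-cent false alarm rate overwhelms operators when almost nobody in the crowd is actually on a watch list:

```python
# Back-of-the-envelope check of the Suedkreuz figures cited above.
# passengers_per_day and far are taken from the text; the number of
# genuine watch-list matches per day is an illustrative assumption.
passengers_per_day = 90_000
far = 0.007  # false alarm rate of circa 0.7 per cent

false_alarms_per_day = passengers_per_day * far
print(f"Expected false alarms per day: {false_alarms_per_day:.0f}")  # ~630

# The base-rate problem: if, say, only 5 genuinely wanted persons
# pass through per day and 80 per cent of them are detected,
# almost every alarm an officer responds to is a false one.
assumed_true_matches = 5 * 0.80
share_false = false_alarms_per_day / (false_alarms_per_day + assumed_true_matches)
print(f"Share of alarms that are false: {share_false:.1%}")
```

Under these assumptions, well over 99 per cent of alarms would point at innocent passengers, which is the crux of the civil-rights concern.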
The number of false detections by recognition systems is increasingly becoming a civil rights issue, especially given the growing use of camera-equipped aerial drones (“quadrocopters”) by police at demonstrations. For example, during the investigation into the aftermath of the clashes around the G20 summit in Hamburg in 2017, the police searched through hours of image recordings using automated procedures.
The next test phase of the camera surveillance at Südkreuz station—in which ADM systems were going to be used to recognize objects such as suitcases and “unusual behavior” of people—was cancelled by Deutsche Bahn on financial grounds in early 2019. However, a similar test was started in Mannheim at the end of 2018. The Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (IOSB) is providing the technology for an “algorithm-based video surveillance (system) in public space for combatting street crime”. The five-year project is supposed to increase the ability, for example, to recognize physical kicks or punches and alert police officers in operation centers about potential incidents. In total, 76 cameras in the inner city will be connected to the system [LINK].
High false detection rates seem to be the rule when it comes to the automated tracing of license plates. In some federal states, for example Bavaria, Hessen and Saxony, such systems have been installed permanently; in other federal states they are employed only selectively or not at all (yet). A pilot project run by the government of Baden-Württemberg in 2017 found detection errors in about 90 per cent of cases. Other federal states saw similar error rates because they were not using the latest technology; the automated license plate recognition system in Baden-Württemberg, for instance, was first acquired in 2011 [LINK].
At present, predictive policing systems are employed in six federal states. Apart from systems developed by the authorities themselves, systems developed by various private manufacturers are also in use. The basic goal of predictive policing is to use statistical analysis to identify areas where burglaries of apartments and business premises, as well as car theft, are likely to occur. The criminal prognosis is based on models such as the near-repeat theory, which states that burglars tend to strike again near the location of a successful break-in. Using such forecasts, patrols can be deployed more efficiently. It is unclear, however, whether these location-based systems have positive effects. An accompanying study by the Max Planck Institute for Foreign and International Criminal Law in Freiburg was unable to find any clear evidence of effective prevention or a decrease in criminality during the test phase, which ran between 2015 and 2017 in Stuttgart and Karlsruhe [LINK]. Regarding participation and predictive policing, it would be worth examining whether predictive policing creates reinforcing effects that could lead to the stigmatization of specific parts of some towns, cities or other areas. [LINK]
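The near-repeat logic described above can be reduced to a simple rule: after a break-in, nearby locations are treated as elevated-risk for a limited period. The following sketch is purely illustrative – the radius, time window and function names are assumptions, not the logic of any deployed product:

```python
from datetime import date, timedelta
from math import hypot

# Illustrative near-repeat rule: a location is flagged if it lies
# within RADIUS_KM of a burglary that happened within WINDOW.
# Both parameters are invented for this sketch.
RADIUS_KM = 0.5
WINDOW = timedelta(days=7)

def is_elevated_risk(burglaries, location, when):
    """Return True if `location` lies near a recent burglary.

    `burglaries` is a list of ((x_km, y_km), date) tuples on a
    flat local grid; `location` is an (x_km, y_km) pair.
    """
    x, y = location
    for (bx, by), bdate in burglaries:
        close = hypot(x - bx, y - by) <= RADIUS_KM
        recent = timedelta(0) <= when - bdate <= WINDOW
        if close and recent:
            return True
    return False

# One break-in on 1 June: a spot 300 m away is flagged for the
# following week, but no longer once the window has passed.
incidents = [((0.0, 0.0), date(2019, 6, 1))]
print(is_elevated_risk(incidents, (0.3, 0.0), date(2019, 6, 4)))   # True
print(is_elevated_risk(incidents, (0.3, 0.0), date(2019, 6, 20)))  # False
```

Even this toy version makes the feedback-loop concern visible: areas flagged this way attract more patrols, which can produce more recorded incidents, which in turn keep the same areas flagged.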
“Hessen-Data” works as a person-related system instead of a place-based one. The software is provided by Palantir, a private software company from the USA. As far as is known, the system combines data from social media with entries in various police databases as well as connection data from telephone surveillance in order to identify potential offenders. Acquired in 2017, it is intended to help identify potential terrorists by means of “profiling”. Hessen’s government is planning to extend its deployment to help detect child abuse and abduction. The necessary legal foundation for “Hessen-Data” was provided by the Hessen Law on Police, which was revised in 2018. An investigative committee, reporting to the Hessen Parliament, is currently trying to clarify issues around the acquisition of the system and to look into questions relating to data protection. Apparently, the system is supervised by Palantir staff, who as a result might have access to private data on individual citizens. [LINK]
SURVEILLANCE AND DATA RETENTION
Since the 2013 revelations of former NSA contractor Edward Snowden, there has been mounting evidence that the secret services of Western countries monitor Internet traffic on a global scale, regardless of whether an individual is involved in suspicious activity or not. It is also known that German secret services collaborate in this activity. Which software systems and procedures are being used is not known, because the options for parliamentary control over these activities are fairly limited. Later in 2019, the Federal Constitutional Court (Bundesverfassungsgericht) will issue eight decisions related to this surveillance activity. Among other things, the decisions will focus on the extension of the rights of the Foreign Intelligence Service of Germany (Bundesnachrichtendienst), on data retention and on the surveillance of telecommunication and postal correspondence. [LINK]
AUTONOMOUS WEAPONS SYSTEMS
Prompted by the preliminary work of several NGOs, UN bodies have been debating a worldwide ban on autonomous weapons systems since 2017. These include drones in the air, in water and on land that, under specific circumstances, use lethal force without further intervention by a human operator. As far as is known, such completely autonomous weapons systems are not yet in operation. However, aerial drones have been equipped with weapons systems that can perform at least some tasks independently, and their arsenals also include systems for the recognition of persons and objects. An independent evaluation of their error rates does not seem to exist. Both the air force and the navy of the German armed forces use various types of aerial drones, at least one of which has the capacity for armament.
DYNAMIC RISK ANALYSIS SYSTEMS
Dynamic Risk Analysis Systems (DyRiAS) are instruments produced by the German company “Institut Psychologie & Bedrohungsmanagement” (Institute for Psychology and Threat Management – IPBm Projekt GmbH). The instruments assess the risk of potential violent acts by people in various social contexts (school, intimate partnership, workplace, Islamist terror, etc.). According to the manufacturer, the results of its products are psychologically, as well as empirically, well-founded. The basic assumption is that observable spirals of escalation precede violent acts. The analysis is supposed to find out “if a person is on a development path that might lead them to a potential attack”. DyRiAS offers an overview of the development of a threat over time while simultaneously creating case documentation. The risk assessment is based on the statistical evaluation of questionnaires filled in by case managers (e.g. police personnel).
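To make the mechanism concrete: a questionnaire-driven instrument of this kind ultimately maps case-manager answers to a risk band. The sketch below is entirely hypothetical – the items, weights and thresholds are invented for illustration, since the actual DyRiAS scoring model is not public:

```python
# Hypothetical illustration of questionnaire-based risk scoring.
# Items, weights and thresholds are invented; they do not reflect
# the real (non-public) DyRiAS model.
ITEM_WEIGHTS = {
    "leakage_of_violent_intent": 3,
    "recent_escalation": 2,
    "access_to_weapons": 2,
    "prior_threats": 1,
}

def risk_level(answers):
    """Map yes/no questionnaire answers to a coarse risk band."""
    score = sum(w for item, w in ITEM_WEIGHTS.items() if answers.get(item))
    if score >= 5:
        return "high"
    if score >= 2:
        return "elevated"
    return "low"

case = {"recent_escalation": True, "prior_threats": True}
print(risk_level(case))  # "elevated"
```

Where the thresholds sit is a design choice, not a scientific given: setting them low catches more genuine cases but inflates the false alarm rate, which is exactly the trade-off raised by the Swiss findings discussed next.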
DyRiAS systems are used by associations for the protection of women in Germany and Austria. In Switzerland, DyRiAS is used in combination with other risk management systems in preventive police work, for example to identify “potential attackers” (Gefährder*innen). According to research by the Swiss TV channel SRF, these instruments deliver high false alarm rates: the software appears to be configured with a tendency to overestimate risk [LINK].