Everybody Hates the Technopolice | Commentaries: Technology and Policing (Part 2 of 3)
by Félix Tréguer, Translation by Jessica Saxby
In the wake of the Second World War, writer Aldous Huxley noted in an essay that “in the past, personal and political liberty depended to a considerable extent upon governmental inefficiency.” It was 1946, and he already felt that this was no longer the case. “Progressive science and technology,” he writes, “have changed all this completely,” noting that “revolutionary improvements in the means of transport and communications have vastly strengthened the hands of the police.” [1]
Huxley was undoubtedly referring to progress in identifying and creating files on individuals, the use of punch cards, statistics, cartography, radio, telephones and even cars. For one of his contemporaries, August Vollmer, a man who would go down in history as a great reformer of the American police at the start of the 20th century, all of these innovations were at the very heart of the “scientific and modern” mode of police organisation that he was calling for. The former officer of Uncle Sam’s colonial wars made no secret of the origin of these technologies and their attendant ideologies, stressing that they were born directly of the “principles of military science.”[2]
When Huxley was writing, the computer had only just been invented. It was not until the 1960s that this technology, previously confined to military uses, became integral to policing practices. In the United States, while some forces were experimenting with proto forms of what is now called “predictive policing” based on cross-checking data, the main use of this technology consisted of the creation of computerised files in order to render police archives more easily accessible. Needless to say, a great deal has happened since then.
Four years ago (2017), comrades from “La Quadrature du Net” and I were still looking towards China with a somewhat distant concern.[3] We had heard about trials with social credit, permanent behavioural evaluations, all-out facial recognition, counter-insurrection technologies that were to enable the ethnic cleansing underway in Xinjiang, and so on. The Chinese regime showed how forms of surveillance designed to anticipate and repress any deviance from the norm could be rolled out with a hard sense of determination and performativity. At the same time, in the United States, various collectives were organising against experiments in predictive policing being rolled out in cities like Los Angeles and New Orleans. They documented the normalisation of facial recognition in police stations, or the use of private subcontractors to compile files on “potentially dangerous persons” established via data mined from the social media profiles of activists associated with the Black Lives Matter movement.
For us in Europe, all this seemed relatively far away. It was a time when, within the European Union, new laws concerning the protection of personal data had come into force, and almost everyone seemed to be in agreement that it was a historic victory for our right to privacy. Of course, we had already come across numerous instances of mass surveillance. In 2018, the use of video surveillance was already out of hand, with almost 100,000 cameras in operation in public spaces across France. The government continued to require telecoms operators and hosting providers to archive the Internet and telephone communication history of the entire population, despite a ruling by the Court of Justice of the European Union that this regime was in flagrant violation of fundamental rights. We also became aware that intelligence services had colossal means to scan whole swathes of Internet traffic, and when it came to police drones, their flights were becoming more and more oppressive with each demonstration. Yet in spite of all this, what we were observing in other corners of the world seemed a fraction more extreme.
And then one day, in early 2018, alerted by a press release from the city of Marseille regarding the launch of a project entitled “Smart City,” we sent the town hall a first freedom of information request concerning administrative documents. The documents we received a few weeks later would show that this project, going by the innocuously named “Observatory for Big Data on Public Order,” aimed to integrate various databases provided by the police, transport networks, hospitals and social media, among other sources. Its aim: “to analyse that which has taken place (yesterday), to assess the current situation (today), and anticipate likely future circumstances (tomorrow)” in order to “allow security and public order agents to provide new solutions that are adapted to their needs.” In short, predictive policing.
A while later, another document was communicated to us by a different source. It was an outline of a trial under negotiation between the city of Nice and fifteen companies and research organisations led by Thales, the defence giant. The document opened with the panicked assessment that a “rampant urbanization of the planet” was underway. In order to prepare for both “natural risks” and “human risks” (i.e. criminality, terrorism, and so on), the project aimed to collect “a maximum amount of existing data” in order to “evaluate each situation… to be able to anticipate incidents and crises,” to identify “weak signals,” to provide a “tool to help forward planning” and to deliver “predictions based on scenarios” using a “command centre” [centre d’hypervision].
It was mid-2018, and we had begun to understand that something had slipped by unnoticed. More research led us to unearth a number of other projects. In Valenciennes, Huawei, the Chinese telecoms giant, was offering local police services its “intelligent video surveillance” tools free of charge in order to automatically detect suspicious behaviour. IBM was doing more or less the same thing under a public contract in Toulouse. Idemia and Thales were also experimenting with “computer assisted vision” technologies in the Paris police force. In Saint-Étienne, the mayor was planning to deploy “intelligent microphones” in one working-class neighbourhood in order to automatically detect suspicious noises, which would then dispatch drones to film the scene before a decision was made on sending out police services. Even facial recognition was no longer taboo: in the Provence-Alpes-Côte d’Azur region, the president of the region, Renaud Muselier, and the mayor of Nice, Christian Estrosi, announced in late 2018 that they were hoping to try out a biometric identification system to monitor those entering and leaving high schools. We also realised that facial recognition had in fact been used on a huge scale in order to examine police files using photographs of suspects and to identify those already on record (with 400,000 requests of this kind in 2020 alone, inspecting 8 million photographs contained within the Criminal Records Processing File).[4]
Stepping back to look at the bigger picture, it was hard not to see in all these developments the hypertrophied digital version of the police project described by Michel Foucault in Discipline and Punish:
Police power must bear ‘over everything’: it is not however the totality of the state nor of the kingdom as visible and invisible body of the monarch; it is the dust of events, actions, behaviour, opinions - ‘everything that happens’ … And, in order to be exercised, this power had to be given the instrument of permanent, exhaustive, omnipresent surveillance, capable of making all visible, as long as it could itself remain invisible. It had to be like a faceless gaze that transformed the whole social body into a field of perception.[5]
In the hopes of preventing this, and in order to sound the alarm, along with a core group of half a dozen people, we launched the project Technopolice, a participative campaign initiated in September 2019, coordinated by La Quadrature. Our aim was to document the roll-out of these technologies and dedicate our time to combating them. Over the course of three years we have collected and analysed a great deal of documents with the help of journalists and researchers. We have carried out direct action in the street, and workshops in a number of major French cities, as well as initiating legal proceedings. By closely following these complex evolutions within the world of techno-security, we have begun to unpick certain root causes of the excessive technologization of the police, as well as some of their effects. It is these that we will outline in this text.
AT THE TECHNOPOLICE CASINO
Today, in France at least, the most significant driving force behind the technopolice is without a doubt to be found in industrial policy. Our political leaders no longer make any attempt to hide it either, for what is touted as “technological sovereignty” is above all a question of money. When he shared his inaugural statements on facial recognition in autumn 2019 with Le Monde, the Macronist secretary of state in charge of digital affairs, Cédric O, plainly stated that “experimenting with facial recognition is necessary” in order for “our industry to progress.” In parliament, the former La République En Marche deputy for the Loire, Jean-Michel Mis, more or less followed the same line, explaining to anyone who would listen that it is urgent to help French and European industry “position itself in relation to the Americans and the Chinese, notably when it comes to questions of identification and biometric recognition.” It is true that market studies have what it takes to turn heads: the global market for facial recognition has been increasing by 20% per year and is forecast to reach 11.6 billion dollars by 2026.[6] When it comes to security, growth in recent years has sat at about 7%, twice the rate of global economic growth.[7] This represents quite the deal for businesses previously attached to the military domain, who are now finding new markets for their products.
Political leaders and administrators are convinced that there is no hope for France or for Europe if they do not participate in the global race for the latest technologies of control. This belief is directly underpinned by the self-reinforcing effect of the back and forth between the public and private sectors: Cédric O, for example, is a former executive at Safran. It is also backed up more generally by the cross-fertilisation between political, administrative and economic elites. While the union between state motives and the financial interests of large corporations is by no means a new phenomenon, it is nevertheless worth noting that the economic policy of the past forty years has contributed considerably to disinhibiting the most zealous public servants. In recent years, as a result of mergers and acquisitions in the surveillance sector, two French multinationals have strengthened their positions: Idemia and Thales. The state, providing capital via the Caisse des dépôts et consignations (“Deposits and Consignments Fund,” a French public sector financial institution), holds almost a third of shareholder voting rights in each of them.
Surrounding ministers are a number of lobbying and parapublic structures which are sharpening the tools of the technopolice and contributing to the total dependency of public policy upon them. Even before the development of these technologies, the influence of these groups had long weighed heavily upon the orientation of research policy, which provides the sector with research and development funding. In the case of research programmes piloted by the European Commission, financing for security purposes has more than doubled, from 3.8 billion euros during the 2007-2013 budgetary period to 8 billion euros for the 2014-2020 period.[8] This equates to half of all public spending on security research in the European Union.
Once released from the research and development laboratories, techno-security products are immediately tested on the ground, most often within local municipalities. This is in fact a way of implementing these technologies discreetly, by making them part of the professional practice of individuals who never actually asked for them. Often presented as a win-win scenario, these experiments allow municipal police forces free or low-cost access to the latest cutting-edge technologies, while in return, the companies that produce them are able to adjust their products to “real-life conditions” and to begin familiarising the police-officer-cum-beta-tester with them. At this stage, public money also arrives as cold hard cash. In Nice, for example, the Safe City project, rolled out by a consortium led by Thales, benefited from 11 million euros in the form of subsidies and recoverable advances from Bpifrance, the French public sector investment bank, for a project that cost 25 million euros over a period of three years. The national agency for urban renovation (ANRU) also financed certain “Safe City” projects – a concept put forward by the industry to refer to the policing side of the Smart City – for instance the above-mentioned sound sensors in Saint-Étienne. In Marseille, the Observatoire Big Data de la Tranquillité Publique received a 600,000 euro subsidy from European funds intended to tackle regional inequalities.
Naturally, after research and experimentation comes the time for deployment. The white paper on interior security published by the government in November 2020 reviews a few of the lab-fresh products in order to present an unsurpassable horizon for the police of the future. It foresees remote biometric identification – “face, voice, scent” – AI and info centres for the “analysis of past data as a tool for feedback and decision-making assistance,” “virtual reality glasses or helmets… to provide agents with operational information or instructions,” as well as “facial recognition technologies,” the “automatic reading of numberplates” and the multiplication of drones.
In order to meet the hope of ensuring the “Interior Ministry is at the frontier of technological advancement,” the white paper suggests dedicating 1% of national GDP to interior security by 2030 – an expected increase of 30% in the ministry’s budget across the decade. We can only imagine that social security, national education and universities will not be able to count on such budgetary efforts over the same period.
COUNTER-INSURRECTIONARY MANAGEMENT OF CROWDS
In this period of out-of-control authoritarian neoliberalism and in the face of mounting resistance, the priority has clearly become to re-establish the “right hand of the state.”[9] The proliferation of surveillance technologies is inscribed in a decades-long logic of power which consists in protecting the State from close combat with the population. In recent years, the uprising of the Gilets Jaunes caught the attention of many. Under the direction of Pierre de Bousquet de Florian, at the time national coordinator for intelligence for the Elysée Palace, a document setting out the national strategy for intelligence agencies was published in summer 2019, suggesting that lessons be learned from what should be considered a patent failure in the maintenance of public order in France:
The increasing power of social movements and networks of a subversive nature constitute an increasingly worrying crisis, given their focus on directly trying to weaken, or even destroy the foundations of our democracy and republican institutions by way of insurrectional violence. This translates to violent actions against people or against property (black blocs, entering protected spaces, sabotage etc.) but it also consists of espousing traditional demands that these movements organise around, in order to infiltrate them and radicalise them.[10]
Based on these assessments, the document announced that “the anticipation, analysis and surveillance of social movements, and societal crises by intelligence services” was henceforth a priority for this sector. In November 2019, in a rare public address, Bousquet de Florian justified this more open disinhibition on a matter that is particularly sensitive for public freedoms. Fearing “a sort of upsurge in breakaway ideologies,” he pointed to “radicals on all sides” whose “rhetoric is increasingly violent,” citing anti-speciesists, libertarians, autonomists, transhumanists, identitarians and ecologists in no particular order. And to sum up the situation, he didn’t hesitate to use a term dear to the extreme right, speaking of a “general becoming savage (ensauvagement) of society.”[11]
The prioritisation of these aims shows in the statistics. While intelligence services have seen a 30% increase in staffing since 2015, notably in sectors dedicated to the development of technological capacities, in recent years the portion of their activity dedicated to following social movements has rapidly increased. Thanks to information published in official reports,[12] as well as the expansion of the PASP dossier (dossier for the prevention of attacks on public security), we are able to measure this trend: between 2017 and 2020, the number of people contained in this database went from around 40,000 to 60,000. The Interior minister justified this increase by evoking “grave problems in public order which have developed since 2015.”[13]
This headlong rush into political espionage follows in the footsteps of a long history of relations between counter-insurrectional doctrines and electronic records. Following a number of uncertain trials since the 1960s, Artificial Intelligence and Big Data are now being optimised for maintaining order in public space. In fact, within the frame of the Safe City programmes spreading across France, the anticipation of demonstrations and the rationalised management of crowds seem to have become one of the central dimensions of the project. As part of the research project S2UCRE (Safety and Security of Urban Crowded Environments), which was launched in 2019, the Paris police force teamed up with Idemia, Thales and the start-up Deveryware in order to integrate different applications “within one system,” including:
The surveillance of crowds in sprawling urban environments, the density of crowds and their dynamics; the short-term prediction of crowd behaviour in order to prepare a quick and efficient evacuation; the semi-automatic analysis of suspicious behaviour in order to enforce security; the detection and (geo)localisation of presumed suspects of crime.[14]
In another project, this time led by INRIA in collaboration with the corporation Onhys and the national police force, the emphasis was on the “organised movement of pedestrians in public spaces.” The aim is to couple “optimisation methods in order to plan the itineraries” of demonstrations with the deployment of localised police contingents, in order to “control these demonstrations by using fixed and mobile cameras, as well as simulation and visualisation methods,” placing an “accent on social behaviour in particular.”[15] Since then, these research projects have been folded into broader projects linked to the 2024 Olympic Games, the large sports gathering that we have to thank for the French techno-security complex expanding and testing out its know-how, benefitting all the while from huge public contracts. As the sociologist Myrtille Picaud notes, “the importance of security at major events as part of the development of digital tools for security must also be understood as a de-politicised zone for experimentation. These tools are then able to circulate afterwards to regulate political demonstrations.”[16]
These research projects aren’t generally very loquacious about the growing reliance on new technologies to arm the repression of demonstrations. We saw with the judicial repression of the Gilets Jaunes the ways in which images from security cameras, or images shared by protestors on social media, enabled the identification of participants, in particular when these images could be coupled with photographs and facial data already on file. In internal notes sent to police forces and judges in April 2021, the government called for an increased centralisation of photo and video resources, citing, notably, “videoprotection cameras located in the city, body cams, the use of helicopters with high resolution cameras (allowing for extremely precise, long distance observation, following individuals over considerable distances, both day and night, and when needs be providing the conditions for a fumigation).”[17] In this inventory reminiscent of Prévert, the government forgot to mention drones, the use of which was temporarily banned following an appeal against the Interior Minister.[18] Yet we know that they now play a key role in the surveillance of demonstrations.
THE PIOUS DESIRE FOR THE RATIONALISATION OF BUREAUCRACY
Beyond political or economic interests, the progress of technopolicing should also be read as the manifestation of a phenomenon proper to the police: a generalised bureaucratic rationalisation. Data governance now constitutes the dominant paradigm of large organisations. Confronted with the irreducible complexity of the real that they are charged with administering, and with the diverse constraints that weigh upon them, state bureaucracies rely on artificial intelligence and automation in order to scale up, to do more with less, often in response to challenges brought about by technology itself.
In March 2018, the director of military intelligence, general Jean-François Ferlet, justified the increasing use of artificial intelligence during a hearing at the National Assembly, explaining that his services were up against a “tsunami of data” given the volume of information produced by different sensors: satellites, social media, Internet traffic data. We find the same rationale behind automated video surveillance projects too: now, proponents claim, there are too many cameras in public space, and too few agents charged with viewing them in urban supervision centres; there is therefore an increasing need to automate the detection of suspicious events thanks to algorithms conceived to pass these “alerts” on to the police.
Such justifications operate in symbiosis with financial and managerial arguments. Automation often appears either as a guarantee of good budgetary management, or as a direct consequence of the backdrop of austerity economics. In a note to the CNIL (National Commission on Informatics and Liberty), the PACA region defended its experiments with facial recognition to monitor entries to high schools by claiming that the project constituted “a response to the increasing differential we have identified between the requirements for a securitization of entrances to these establishments, and the human means available in high schools due to the successive programmes of reduction in staff in the public sector.”
The history of technologies categorised as “predictive policing” overlaps considerably with the instruments of managerial organisation. In the United States, the software CompStat was used as early as the 1990s under the thumb of William Bratton, at the time the chief of the New York police; it was conceived in order to introduce police forces to the doctrines of New Public Management (NPM) and to “pilot” public action using statistical indicators. Initially designed to produce data on instances of delinquency in order to improve the reaction times of the police and their capacity to anticipate, it rapidly transformed into a tool for a politics of figures. The same goes for PredPol (since renamed Geolitica), a mapping software dedicated to orienting patrols towards “hotspots,” meaning zones where the probability of a criminal incident occurring is considered to be higher. For sociologist Bilel Benbouzid, PredPol above all allows superiors to rationalise the work of their agents, positioning them as auxiliaries to machines:
PredPol offers an infrastructure to police departments; it is not a service for prediction, but for management. It is a system which allows for the simple management of agents. Police already carry out patrols; now, their superiors have access to software that allows for the optimisation of their work, telling them exactly where to go and when. Police vehicles are even equipped with GPS, allowing them to check the time spent in a hotspot.[19]
This managerial surveillance is encountering resistance, however. At the Los Angeles Police Department (LAPD), a pioneer in technopolicing, staff unions have, for example, refused the activation of the GPS devices placed in their cars. Furthermore, an internal enquiry into technical bugs in the radio systems used by these same vehicles revealed that they had been sabotaged by police officers opposed to the direct sharing of their conversations with a command centre.[20] Exposed to this kind of surveillance by their superiors, might we consider cops workers like any other?
These internal oppositions are all the more pronounced given that the recourse to technology with the aim of bureaucratic optimisation reveals what Jacques Ellul called the “productivity bluff.”[21] On top of the constant risk of being submerged by a “tsunami of data,” the technopolice is in fact plagued by technical bugs, by the accumulation of disparate software and infrastructures, by layers of technical and organisational complexity which complicate its operational potential. Often, technologies like these also create unanticipated blind spots. For example, the establishment of police patrols by car in the 1930s and 1940s was supposed to allow for a greater number of patrols covering a greater area, while also reducing police response times in instances of crime. However, twenty years later, the rate of criminality had increased drastically, and research showed that police response times had little impact on the evolution of criminality.[22] Technical systems tend, therefore, to focus attention on unimportant indicators, even from the restricted perspective of police rationality.
From this perspective, the documents that we have been able to gather lift the veil on what is undoubtedly a real rupture in the history of policing, in which any pretension to elucidate the “causes of crime” is totally abandoned. While the police have always played the role assigned to them since the 18th century at least – that is, to produce knowledge about the population, to direct its behaviour by acting on the variables that determine it in order to guarantee its docility and productivity – the promoters of technopolicing now seem determined to abandon the decidedly too elusive horizon of public order. It is no longer relevant to question the social or psychological roots of behaviour judged deviant; we must be content with managing disorder. It is therefore unsurprising that the technologies produced in this context uphold institutional blindness, acting as a smokescreen that tends to “naturalise” political choices. As the researcher Ismaël Benslimane, author of a critical study of PredPol, explains, the software is “also a means by which social reality is hidden”: “Instead of saying a neighbourhood is poor, it will be called a criminal zone.”[23]
It therefore follows that with automatic video surveillance, and all these other systems developed to pick up the weak signals of “danger” in what is considered to be “unusual” behaviour, assimilating the subject to statistical categories, the technopolice has broken with a whole tradition of liberal criminal law which emphasises the explicability and intentionality of the actions of others. In this algorithmic penal schema, the delinquent, real or potential, and the police officer or the judge are no more than peers with comparable mental states: they are not animated by will, directed by beliefs, moved by desires. As the historian and philosopher of science Peter Galison writes, each is considered a “black box” with “inputs and outputs and without access to the interior life of the other.”[24] By refusing to interrogate the causes of a crime, technopolicing doesn’t simply participate in increasingly serious forms of bureaucratic dehumanisation; it also restricts our political imaginary by preventing us from envisaging collective responses to criminality and violence that would not rely on the police.
REFORMING THE POLICE WITH TECHNOLOGY?
Paradoxically, technology doesn’t need to be efficient to legitimise the actions of the police. Sarah Brayne, a sociologist who spent five years researching the new modalities of surveillance in the LAPD, explains:
The police adopted Big Data, not because there was empirical proof that it really improved the efficiency of the organisation, but because there was an immense institutional pressure to conform at a time when other institutions were beginning to make use of Big Data and algorithmic predictions in order to make decisions.[25]
In other words, alongside public financing and institutional pressure, the paradigm of “governance by data” contributed to making technology a source of symbolic capital, relentlessly renewed by police bureaucrats. As Brayne highlights through her case study, it is often in response to certain critiques addressed to the institution of the police, for example in the realm of discrimination or racism, that these technologies are deployed.
The LAPD is not an isolated case, however. In the history of the police, technology has often been touted as an instrument for reform, allegedly making policing more transparent, more objective in its treatment of illegality, more egalitarian in its relationship to populations. In France, a good example would be body-cams. The extensive roll-out of these video devices, placed at chest height on police officers in order to film their operations, was announced by the government after years of anti-racist lobbying against stop and search. In 2015, while a number of civil rights groups encouraged the handing out of paper slips to those subject to these search operations, the interior minister Manuel Valls rejected this idea and encouraged the use of body-cams, which he claimed allowed for greater transparency regarding police behaviour. Yet, six years later, these devices have not been shown to provide any such safeguards during stop and search operations, or with regard to police violence.[26] On the contrary, as in the United States, these body cams are above all a means to extend the network of video surveillance, and to transmit real-time images to command centres.[27] In the future, they will also enable the identification of individuals through the use of facial recognition. While strategically chosen extracts of videos are sometimes shared by the Interior minister, their role is above all to focus the spectator on the point of view of the police officer.
The history of the police is a little like hell: it’s paved with good intentions and marred with false promises. Since at least the beginning of the 20th century, at each moment of crisis for the policing institution, spokespeople pop up, putting forward ambitious reforms capable of remediating past errors. Taken together, these moments of reform sketch instead a history of the professionalisation of the police. Since the end of the 19th century, the story goes, the police have progressively freed themselves from the bosom of the military, all while being more and more formally subject to administrative and political regulation, resulting in a progressively more parsimonious use of force. This “liberal” police force, we are told, is now subject to the law and to the respect of constitutionally guaranteed freedoms. It has thus become part of a “process of civilisation” which seeks to reduce violence and advance democracy.
The uptake of these new technologies is an integral part of this dominant narrative. Within reformist discourse, technology and science must, at every step of the way, help render the police more objective, more transparent, more rigorous in applying the rules, more efficient, but also more “just,” “closer to the population it serves,” and more “respectful of rights.” Yet almost no one seems troubled by the fact that while these very reforms were at the outset justified by the project of making the police more “liberal,” they end up incorporating into policing practice tools that issue directly from the two central domains of state violence and lawlessness: the military and intelligence.
Today, the focus on surveillance at a distance and the increasing gamification of social control (through colourful software interfaces that seem to come straight out of a science fiction film) tend to corroborate the idea of a tendential decrease in physical violence.[28] So, often, does the discourse of privacy advocates who reduce police surveillance to a simple question of privacy or freedom of expression. Yet, as the American activist and researcher Sarah Hamid reminds us, “these technologies aren’t just about the feeling of being surveilled, or not being able to express oneself freely.”[29] They are also “violent, carceral technologies.” Once they enter policing practice and penal policy, they become complicit with a penitentiary system founded on the enclosure and brutalisation of bodies. For Hamid, the objective is not to limit such technologies so that they become “a little less invasive” but to abolish them.
SOCIAL (UN)ACCEPTABILITY
At the LAPD, PredPol and “Operation LASER” (a scoring system developed by the company Palantir that evaluates the dangerousness of people on probation or parole) were originally justified by a reformist discourse. United under the umbrella of the Stop LAPD Spying Coalition, neighbourhood collectives succeeded in blocking the road for these over-policing technologies and their long history of violence. Under increasing pressure, the LAPD launched an internal enquiry, which concluded that the danger scores attributed to those targeted by Operation LASER rested on arbitrary and racist criteria. The programme was suspended in April 2019.[30] One year later, much like other police forces in the United States, the LAPD announced that it was dropping PredPol, deemed too expensive and not efficient enough.[31]
Much like the developers of facial recognition, who seek to improve the precision of their systems by correcting “technical biases” (women and racialised individuals pay the price: their faces are poorly recognised by algorithms, leading to a much higher rate of false positives), the providers of predictive policing technologies are professing repentance. Now they too find the notion of predictive policing “terrifying” and plan to adapt their software to encourage the deployment of social and health services in “high risk” neighbourhoods.[32] To a certain degree, their proposal is reminiscent of the reforms pushed by August Vollmer almost a century ago.[33] Clearly, the partisans of police reform aren’t scared of farcical repetition.
As for the unrepentant, a new concept is already emerging: precision policing, a term borrowed directly from military propaganda and designed to evoke a policing practice of “surgical” precision. In a speech given at the Heritage Foundation, a highly influential conservative think tank in the US, William Bratton didn’t beat about the bush when explaining how the police of the future will resemble that of the past, only worse: precision policing, he explained simply, is in fact “the CompStat of the 1990s but on steroids.”[34] At heart, police reform has historically followed a path similar to that of carceral reform, to the point that we might borrow directly from Foucault and claim that “for 150 years, the claim that the police has failed is exactly what enables maintaining it.”[35] Scandal after scandal, reformers continue to present technology either as the lever that will reform the institution and stem abuse, or as the condition of efficiency in the struggle against crime. They then swear, hand on heart, that they will adopt safeguards to protect fundamental liberties.
In France, it is this second strategy that today prevails among the proponents of the technopolice. It allows them to calm public opinion, to stay one step ahead of the CNIL (National Commission on Informatics and Liberty) and to temper certain judges a little too concerned with human rights. In high places, people worry about the “social acceptability” of the technopolice. The authors of a white paper on internal security admit to considerable “resistance” from the population; resistance which, according to them, calls for “long-term pedagogy and progress compatible with the development of social compromises.”[36] Pedagogy and progress: the perpetual allegory of the boiling frog. There are, however, at least two holes in this plan. On the one hand, it is not certain that the agents of the technopolice know how to contain the rhythm of these roll-outs, shaped as they are by disruption, ground-breaking innovation and shock tactics. On the other hand, it just so happens that in 2002, Victor Hutchison, professor of zoology at the University of Oklahoma, debunked the theory of the boiling frog, showing that in reality the amphibian jumps right out of its container as soon as the temperature gets too hot.
References:
[1] Aldous Huxley, Science, Liberty And Peace (London: Chatto & Windus, 1950).
[2] August Vollmer, “Police Progress in the Past Twenty-Five Years,” Journal of Criminal Law and Criminology 24, no.1, 165.
[3] La Quadrature du Net (Squaring the Net in French) is a French advocacy group that promotes digital rights and freedoms for its citizens. See the collective’s website: https://www.laquadrature.net/.
[4] Beauvau monte en puissance sur l’analyse d’images. (November 4, 2020), La Lettre A, www.lalettrea.fr/action-publique_executif/2020/11/04/beauveau-monte-en-puissance-sur-l-analyse-d-images,109618851-art. Translations are my own unless otherwise stated.
[5] Michel Foucault and Alan Sheridan, Discipline and Punish, the Birth of the Prison (New York: Vintage, 1995), 213.
[6] “Global Facial Recognition Market Size, Trends | Industry Report 2021 to 2026 With Covid Impact”, Mordor Intelligence, April 2021, www.mordorintelligence.com/industry-reports/facial-recognition-market.
[7] C. Guiliano, “Milpol 2019 : Développer une approche globale de la sécurité en agissant sur tous les continuums”, AEF Info, 2019, www.aefinfo.fr/depeche/616430.
[8] C. Jones, “Market Forces: The development of the EU security-industrial complex,” Transnational Institute & Statewatch, www.statewatch.org/marketforces.
[9] L. Wacquant, “La fabrique de l'Etat néolibéral : ‘Workfare’, ‘Prisonfare’ et insécurité sociale,” Civilisations. Revue internationale d'anthropologie et de sciences humaines 59, no.1 (2010): 151–174.
[10] La Stratégie Nationale du Renseignement, Coordination nationale du renseignement et de la lutte contre le terrorisme, Présidence de la République, 2019, 13. http://www.sgdsn.gouv.fr/files/files/Publications/20190703-cnrlt-np-strategie-nationale-renseignement.pdf.
[11] R. Marchal, “Pierre de Bousquet de Florian constate une "forme d'ensauvagement général de la société" (salon Milipol),” AEF Info, 22 November 2019, www.aefinfo.fr/depeche/616920.
[12] 5ème rapport d'activité 2020. (2021). Commission nationale de contrôle des techniques de renseignement. https://data.guardint.org/en/entity/3gm1qigwhrf.
[13] AFP, “Sécurité intérieure : Le gouvernement élargit les possibilités de fichage,” Sud Ouest, August 12, 2020, www.sudouest.fr/justice/securite-interieure-le-gouvernement-elargit-les-possibilites-de-fichage-1641137.php.
[14] Agence Nationale de la Recherche, “Sureté et Sécurité en environnement urbain surpeuplé –S2UCRE”, 2019, https://data.technopolice.fr/fr/entity/dd33j6ttis, 2.
[15] See the project presentation on the Onhys website: https://www.archive.ph/wip/TnI6H.
[16] M. Picaud, “Peur sur la ville : La sécurité numérique pour l'espace urbain en France,” Working Paper No. 01/2021, Cities and Digital Technologies Chair, Sciences Po, 2021, www.sciencespo.fr/ecole-urbaine/sites/sciencespo.fr.ecole-urbaine/files/2021_01%20-%20Picaud.pdf.
[17] Technopolice.fr, “Maintien de l'ordre: le guide de bonne conduite de l'Intérieur et de la Justice pour mieux réprimer les manifestations”, July 7, 2021, www.technopolice.fr/blog/maintien-de-lordre-le-guide-de-bonne-conduite-de-linterieur-et-de-la-justice-pour-mieux-reprimer-les-manifestations.
[18] This practice resumed in April 2022.
[19] Quoted in J. Hourdeaux, “Police prédictive : Deux chercheurs démontent l'algorithme,” Mediapart, September 13, 2016, www.mediapart.fr/journal/international/130916/police-predictive-deux-chercheurs-demontent-l-algorithme.
[20] S. Brayne, Predict and Surveil: Data, Discretion, and the Future of Policing (Oxford: Oxford University Press, 2021), 82.
[21] J. Ellul, Le bluff technologique (Paris: Hachette, 2004), 575.
[22] G. L. Kelling, & M. H. Moore, The Evolving Strategy of Policing, U.S. Department of Justice, Office of Justice Programs, National Institute of Justice, 1989, 8.
[23] H. Guillaud, “Police prédictive : la prédiction des banalités,” InternetActu, June 23, 2015, www.internetactu.net/2015/06/23/predpol-la-prediction-des-banalites.
[24] P. Galison, “The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision,” Critical Inquiry 21, no.1 (1994): 228–266. See also: F. Bruno, M. Lissovsky, & I. F. V. Junior, “Abstraction, expropriation, anticipation : Note généalogique sur les visions machiniques de la gestualité,” Réseaux 211, no.5 (2018): 105–135.
[25] S. Brayne, Predict and Surveil, 23.
[26] A. Léchenet, “Dans la lutte contre les contrôles au faciès, le fiasco des caméras-piétons,” Mediapart, May 12, 2019, www.mediapart.fr/journal/france/120519/dans-la-lutte-contre-les-controles-au-facies-le-fiasco-des-cameras-pietons.
[27] S. Fussel, “Did Body Cameras Backfire?”, Route Fifty, November 1, 2019, www.route-fifty.com/public-safety/2019/11/did-body-cameras-backfire/161019.
[28] See G. Chamayou, Théorie du drone (Paris: La Fabrique, 2013).
[29] “Community Defense: Sarah T. Hamid on Abolishing Carceral Technologies”, Logic Magazine, No. 11, August 2020, www.logicmag.io/care/community-defense-sarah-t-hamid-on-abolishing-carceral-technologies.
[30] S. Brayne, Predict and Surveil, 137.
[31] A. Le Denin, “La police de Los Angeles abandonne PredPol, le logiciel qui prédit les crimes,” usine-digitale.fr, April 23, 2020, www.usine-digitale.fr/article/a-police-de-los-angeles-abandonne-predpol-le-logiciel-qui-predit-les-crimes.N956926.
[32] D. Uberti, “After Backlash, Predictive Policing Adapts to a Changed World,” Wall Street Journal, July 8, 2021, www.wsj.com/articles/after-backlash-predictive-policing-adapts-to-a-changed-world-11625752931.
[33] A. Newitz, “How the Father of Modern Policing ‘Abolished’ the Police,” The New York Times, June 3, 2021. www.nytimes.com/2021/06/03/opinion/august-vollmer-abolish-police.html.
[34] W.J. Bratton, “Cops Count, Police Matter: Preventing Crime and Disorder in the 21st Century,” The Heritage Foundation, 2018, www.heritage.org/sites/default/files/2018-03/HL1286.pdf.
[35] Foucault, Discipline and Punish.
[36] Ministère de l'intérieur, Livre blanc de la sécurité intérieure, 2020, p. 9, www.interieur.gouv.fr/Actualites/L-actu-du-Ministere/Livre-blanc-de-la-securite-interieure.