Lawsuit calls for a ban on the use of facial recognition technologies by the São Paulo subway system
By comunicainter
To prevent the four million daily users of the São Paulo subway system from continuing to have information about their faces collected, mapped, and monitored through facial recognition, civil society organizations and public defenders filed a Public Civil Action on Thursday. The following organizations are parties to the lawsuit: the Public Defender’s Office of the State of São Paulo, the Federal Public Defender’s Office, the Brazilian Institute for Consumer Defense (Instituto Brasileiro de Defesa do Consumidor – Idec), Intervozes – Coletivo Brasil de Comunicação Social, ARTIGO 19 Brazil and South America, and the Human Rights Advocacy Collective (Coletivo de Advocacia em Direitos Humanos – CADHu).
The organizations claim that the facial recognition system implemented by the São Paulo Metro company does not meet the legal requirements set forth in the General Data Protection Law (LGPD), the Consumer Defense Code, the Code of Users of Public Services, the Child and Adolescent Statute (ECA), the Federal Constitution, and international treaties.
“The facial recognition system implemented by the subway collects biometric and irreplaceable personal data: information about one’s face. The subway company has violated practically every law on this matter, from the LGPD to the ECA, as well as the Constitution and international treaties,” says Eloísa Machado, a professor at FGV and one of the plaintiffs’ lawyers.
The Public Civil Action is the result of an analysis of documents presented by the São Paulo Metro in a previous lawsuit demanding information about the implementation of this project, which cost the public budget more than R$ 50 million and which, among other activities, involves using facial recognition on anyone who uses this means of transportation.
“The ineffectiveness of this technology, which is aggressive and invasive by nature, besides producing discriminatory actions against passengers, can worsen the already precarious experience of public transportation users, whose long and tiring daily commutes may be interrupted by ‘false positives’, generating even more insecurity,” says Diogo Moyses, coordinator of Idec’s Digital Rights Program. “Beyond failing to provide precise information, the priority of spending millions on faulty monitoring instead of investing in the necessary improvement and expansion of the subway system is also questionable,” adds Estela Guerrini, public defender and coordinator of the Specialized Nucleus of Consumer Defense of the State Public Defender’s Office.
The lawsuit underlines that facial recognition technologies exponentially raise the risk of discrimination against black, non-binary, and transgender people, since this technology is known to be flawed in its accuracy and rooted in an environment of structural racism. Even the best algorithms are not accurate enough to recognize black and transgender people, who are more affected by false positives and false negatives and are more exposed to embarrassment and rights violations. “The discriminatory result of facial recognition technology is insoluble and reflects the bias present in the very databases that feed this technology, since it is designed and developed by a small group of cis white men at the multinationals that control its sale to the rest of the world,” says Isadora Brandão, public defender and coordinator of the Specialized Nucleus for the Defense of Diversity and Racial Equality of the State Public Defender’s Office. “Performing facial recognition on subway users means massively collecting biometric data without consent, a disproportionate measure that sets up a mass surveillance system. It opens the way for the normalization of a society under surveillance, increasingly vulnerable to the authoritarian inclinations of governments that gain unprecedented control over the lives of citizens,” says Pedro Ekman from Intervozes.
The lawsuit also questions the use of images and the collection and processing of children’s and teenagers’ sensitive personal data without the consent of their parents or guardians, in direct violation of the LGPD, the ECA, and the Constitution. “Furthermore, since children grow and their faces change rapidly, facial recognition systems are known to have little chance of success with them, and the argument that this system would make it possible to locate missing children proves unfounded,” adds Daniel Secco, public defender and coordinator of the Specialized Center for Children and Youth of the State Public Defender’s Office.
The organizations point out that this initiative runs counter to policies adopted in other countries, especially in Europe and the US, which restrict the massive use of this type of technology due to its invasive nature and its potential to create a scenario of surveillance and monitoring of people moving through public spaces. In recent years, companies such as Microsoft, IBM, and Amazon have also announced that they will stop selling facial recognition solutions for police use due to potential human rights violations.
“Even if there were a commitment to ‘improve’ the performance of these tools, this would not be enough to make their mass use safe and compatible with human rights – on the contrary, the more accurate facial recognition becomes, the more vulnerable the population passing through monitored spaces is to being tracked. In certain cases, this could even affect the exercise of the right to protest. Thus, it is necessary that these practices be stopped and banned,” points out Sheila de Carvalho, coordinator of the Legal Reference Center of ARTIGO 19 Brazil and South America. “This is the first lawsuit to question the use of facial recognition in publicly accessible places, a technology that has been massively and indiscriminately implemented throughout Brazil,” she adds.
The lawsuit also demands that the Court order São Paulo Metro to immediately stop performing facial recognition in its facilities and, furthermore, seeks compensation of at least R$ 42 million (the amount provided for the implementation of this technology in the contract) for collective moral damages, as a result of the harm caused to the rights of its passengers.