Cufa (Central Única das Favelas) has backtracked on its announced use of facial recognition in a project that aimed to register 2 million beneficiaries of basic food baskets. The NGO had begun registrations last week, but after questions were raised about the collection of biometric data, the measure was suspended.
In a statement, Cufa highlighted the work it has carried out for more than a year to combat the impacts of the pandemic. One of its initiatives is Mothers of the Favela (Mães da Favela), which provides food to thousands of women in several Brazilian cities. The NGO said it already documents its accountability with paper records, but adopted facial recognition to simplify the process.
“The choice to use the latest technology, which is being tested, is precisely to streamline our processes and reinforce transparency, which donors require of absolutely all organizations,” said Cufa. The NGO said it was aware of the requirements of the LGPD (Brazil's General Data Protection Law) and assured that the data collected would not be used for any other purpose.
Still, the association said it would stop the practice. “Cufa has just made a decision: it will only partner with donors who accept accountability in the form of photos and videos. No data will be requested from any beneficiary, and transparency will be guaranteed by our word.”
The president of Cufa, Preto Zezé, said that all the data collected in the test has been erased. “Everything’s canceled. In fact, it generated a good debate about data security, since many organizations are collecting data,” he posted on his Twitter profile.
Cufa has partnered with Unico
The biometric data collection was carried out in partnership with Unico (formerly known as Acesso Digital). The company issued a statement to explain its agreement with Cufa, pointing out that it follows the LGPD and does not share any personal data with other companies.
“In a partnership with Cufa, Unico donated facial recognition technology to authenticate beneficiaries of the Mãe da Favela program on its own platform, whose database is deleted at the end of the campaign. In this way, the solution assists the registration process, validating – with authorization – the real beneficiaries in this moment of crisis, quickly and safely, avoiding possible fraud.”
Unico explained that its system analyzes points on users’ faces and generates a biometric authentication score indicating the probability that the person is registered in the database. The company stated that it does not share any stored data and that it is committed to the privacy of Brazilians’ data.
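Systems of the kind described usually map facial landmarks to a numeric embedding and compare it against the embeddings of enrolled users, with the best match determining the authentication score. The sketch below is purely illustrative of that general approach, not Unico's actual implementation; all names, vectors, and the threshold are assumptions.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two face embeddings (lists of floats).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authentication_score(probe, database):
    # Best match score against the enrolled database: a higher score
    # means a higher likelihood that the person is registered.
    return max(cosine_similarity(probe, enrolled) for enrolled in database)

# Toy embeddings: real systems derive vectors with 128+ dimensions
# from a neural network, not hand-written values.
db = [[0.1, 0.9, 0.2], [0.8, 0.1, 0.5]]
probe = [0.12, 0.88, 0.21]
score = authentication_score(probe, db)
print(score > 0.95)  # prints True: above the (hypothetical) match threshold
```

In practice the decision threshold is tuned to balance false accepts against false rejects, which is exactly why questions about how such scores and embeddings are stored carry weight.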
Use of facial recognition is questioned
The use of facial recognition was halted by Cufa after researchers raised doubts about the practice. One of the main requests for explanation came from developer Nina da Hora, who questioned the purpose of collecting biometric data and asked how the information would be stored and processed.
What purpose? Where will the data be stored? Do people sign any terms authorizing this? Transparency about the data collected and the processor? What is the company behind this? BASIC QUESTIONS https://t.co/50MjoMeXdY
– Nina da Hora (@ninadhora) April 26, 2021
For political scientist Pablo Nunes, a member of the Center for Studies on Security and Citizenship, despite the positive decision, Cufa has not yet cleared up the doubts. “We were happy with the result, but we are left with questions about what the partnership was like, what type of contract it was, and how responsibilities were defined in relation to the collection, processing and storage of this data. We don’t know,” he told Tecnoblog.
He pointed to the concern with partnerships that provide facial recognition systems to NGOs and governments but involve transferring data to technology companies. “We are very concerned about these relationships that are being formed without transparency about what will be done with the data,” he said.
The researcher said that Cufa’s and Unico’s explanations are insufficient to evaluate the project. One remaining doubt concerns the photos and videos that Cufa said it will use for accountability from now on: even if the images are not ideal, they could still be used to identify people.