A majority of MEPs voted in favor of restricting law enforcement's use of blanket facial recognition. The resolution also calls for outlawing the use of private databases of facial prints.
377 MEPs backed the resolution, while 248 voted against it and 62 abstained. A statement by MEPs reads:
“To ensure that fundamental rights are upheld when using these technologies, algorithms should be transparent, traceable and sufficiently documented, MEPs ask. Where possible, public authorities should use open-source software in order to be more transparent.”
The members want to ensure that data collection and the use of biometrics more broadly comply with the GDPR, including its requirements on storage limitation, data security, and accountability.
Moreover, the resolution aims to restrict law enforcement’s surveillance of the general populace. It calls for the “permanent prohibition of the use of automated analysis and/or recognition” of any human features in public spaces. This refers to biometrics like fingerprints, DNA, voice, and behavioral patterns.
Striving for Transparency
Private facial recognition databases usually scrape photos without the subjects' consent, which is part of the reason the European Parliament wants to ban such activity.
Online advertisers may play fast and loose with data, but law enforcement must play by the book. To that end, MEPs want to ban police use of these databases.
In a conversation with The Register, a spokesperson noted that these technologies, often backed by AI, tend to be biased against minority groups. Other matters discussed in the resolution related to anti-racism more broadly.
Ultimately, MEPs want law enforcement bodies to be accountable when conducting background checks, and agencies must use legally obtained data. In doing so, governments can better ensure that the rights of those under investigation are protected.
It should be noted that the resolution is non-binding. While it won't lead to immediate legislative action, it is part of a broader push for stronger digital privacy, and it could help clamp down on indiscriminate surveillance.