Human rights concerns as facial recognition technology use expands
A research project led by Associate Professor Nessa Lynch has shown the potential impact that the use of facial recognition technology (FRT) can have on human rights.
The study highlights the current regulation gap in Aotearoa New Zealand, and the report makes 15 recommendations aimed at informing government on how best to manage the risks of FRT use.
The research was co-authored with Professor Liz Campbell of Monash University in Australia, Dr Joe Purshouse of the University of East Anglia in England and Dr Marcin Betkier of the Faculty of Law, Victoria University of Wellington. The research was funded by the New Zealand Law Foundation.
The research shows that the use of FRT is increasing in Aotearoa and comparable countries, across sectors including government departments, policing, banking, travel, security, and customer tracking.
“If this regulation gap isn’t plugged soon, the impacts on human rights—such as privacy, freedom of expression, the right to peaceful protest, and the right to be free from discrimination—are potentially extensive,” says Associate Professor Lynch.
The research involved a stocktake of the use of FRT here and in comparable countries, with a focus on use by the state. The spectrum of impact ranges from low-impact uses such as verification at the border, to high-impact activities like live automatic facial recognition from CCTV feeds and controversial apps such as Clearview AI, which is used in policing in other countries and has been trialled in this country.
“We call for an immediate moratorium on live automatic facial recognition by police, due to its impact on individual and societal rights. We also ask for additional oversight of police access to driving licence and passport databases, while a range of recommendations are taken into account,” says Associate Professor Lynch.
The comprehensive range of recommendations includes giving biometric information—such as DNA, fingerprints, iris scans, and facial data—special protection and implementing high-quality privacy impact assessments and algorithm impact assessments.
“We recommend the government establish independent oversight of the collection, retention, and comparison of facial images. We also want to see transparency around the sharing of facial images between state agencies, other jurisdictions, and the private sector,” says Associate Professor Lynch.
The full report is now available.
For enquiries contact Associate Professor Nessa Lynch.
A link to a public panel discussion held at Victoria University of Wellington's School of Law in October 2019 is also available.