An appropriate legal framework for automated facial recognition in South Africa


Abstract

Automated facial recognition (AFR) is a new technology that enables computers to individualise people without the input of a human operator. It has a myriad of applications in the private and public sectors and can be used by law enforcement to identify suspects in large groups. However, AFR also has the potential for enormous human rights abuses. People’s movements can be tracked, and if the technology is linked to pre-existing systems, such as a state’s driver’s licence database, it becomes possible to monitor the movements of entire populations. It is therefore of the utmost importance that the use of such systems remains within the legal sphere, that the legal principles governing them are clear and well defined, and that operators’ discretion is limited to prevent abuse of power.

At its core, AFR technology determines whether two photographs are of the same person. The system takes a picture of a known person and creates a simple triangulated map of the face by plotting salient facial features and measuring the lines that connect them. For example, the technology can determine how wide the eyes and nose are, where the lines of the mouth and eyebrows run, and the shape of the face and ears. These points are linked together, the distances between them are measured, and the resulting dimensions, together with the identity of the person, are stored as a simple list of numbers, in a fixed order, in a database. Linked to a larger system, these simple facial measurements become powerful enough to identify people. Such a system requires a second video source, for example a closed-circuit television (CCTV) camera that a group of people passes in front of. The system processes the facial biometrics of everyone in the video and compares them with the original database in which individuals’ identities are known. If the AFR system makes a positive match, the system operator learns the person’s identity.
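The comparison step described above can be illustrated with a minimal, purely hypothetical sketch. The function names, the landmark coordinates, the template layout and the matching threshold below are all assumptions made for illustration; they do not reflect any particular AFR product or the system considered in R-Bridges.

```python
import math
from itertools import combinations

def template_from_landmarks(landmarks):
    """Turn a list of (x, y) facial landmark points (eyes, nose, mouth
    corners, and so on) into a simple template: the pairwise distances
    between all points, normalised so the template does not depend on
    the absolute size of the face in the image."""
    distances = [math.dist(a, b) for a, b in combinations(landmarks, 2)]
    scale = max(distances) or 1.0
    return [d / scale for d in distances]

def similarity(template_a, template_b):
    """Euclidean distance between two templates; smaller means more alike."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(template_a, template_b)))

def match(probe_template, watchlist, threshold=0.05):
    """Compare a probe template (e.g. a face taken from CCTV footage)
    against a database of known identities and return the best match,
    provided it falls under the (hypothetical) threshold. The result is
    only a candidate for a human operator to verify, in line with the
    human-in-the-loop requirement discussed in the article."""
    best_name, best_score = None, float("inf")
    for name, stored_template in watchlist.items():
        score = similarity(probe_template, stored_template)
        if score < best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score <= threshold else (None, best_score)

# Hypothetical landmark coordinates for a known person and a CCTV probe.
known_landmarks = [(30, 40), (70, 40), (50, 60), (40, 80), (60, 80)]
probe_landmarks = [(31, 41), (69, 40), (50, 61), (41, 79), (60, 80)]

watchlist = {"person_a": template_from_landmarks(known_landmarks)}
candidate, score = match(template_from_landmarks(probe_landmarks), watchlist)
print(candidate, round(score, 4))  # candidate identity for human review
```

The sketch stores each known face as a fixed-order list of numbers and flags a probe as a possible match only when its template lies within a distance threshold of a stored template, which mirrors, in simplified form, the comparison process described in the preceding paragraph.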

The legality of AFR technology was recently decided in the United Kingdom (UK) in The Queen (on the application of Edward Bridges) v The Chief Constable of South Wales Police [2020] EWCA Civ 1058 (R-Bridges), the first case of its kind in the world. The court confirmed that the core issue in the matter is whether an appropriate legal framework exists within which AFR can lawfully be deployed. Several pieces of legislation were scrutinised to determine whether AFR functions within the law. The first of these is article 8 of the European Convention on Human Rights, which protects the right to privacy. In PG v United Kingdom (2008) 46 EHRR 51, 57 the court ruled that article 8 could be violated if new technology was used in such a way that it preserved a permanent record of public events, a statement the Court of Appeal confirmed in R-Bridges. Secondly, the Surveillance Camera Code of Practice, issued under section 29 of the Protection of Freedoms Act 2012, seeks to protect human rights by establishing guidelines that restrict camera operators’ discretion when working with CCTV footage; it also regulates the storage of and access to such data. Thirdly, various sections of the Data Protection Act 2018 regulate the collection and storage of data. Together these measures constitute the legal framework within which AFR must operate in the UK. In R-Bridges the Court of Appeal held that the respondent had exceeded these legislative provisions when deploying AFR, and it issued a declaratory order to resolve the matter between the parties.

As in the UK, several legislative instruments in South Africa constitute the enabling legislation for AFR. The first of these is the Protection of Personal Information Act 4 of 2013 (the POPI Act), which regulates the collection, storage and further processing of personal information. Several of its sections are applicable to AFR, and anyone wanting to implement an AFR system in South Africa will have to take note of all the relevant provisions of this act. The POPI Act also stipulates that the Regulator may issue codes of conduct applicable to specific situations. It is therefore possible to draft a code of conduct that specifically sets out the principles applicable to AFR. This would be akin to the UK Surveillance Camera Code of Practice, and such a code could make a meaningful contribution to the law, provided it is formulated correctly. Secondly, section 45 of the Cybercrimes Bill of 2017 regulates the possession of data by police officials. In the context of AFR, police officials can obtain CCTV footage from third-party service providers simply by getting their permission to view the material. If the service provider is willing to provide specific information, such as street-camera footage, to law enforcement officials, it can lawfully be obtained and viewed by them. This affords too much discretion to individual parties and could easily lead to human rights abuses.

The unreported judgment in Vumacam v Johannesburg Roads Agency 2020-08-20 case no. 14867/20 (HHSA) illustrates prevailing attitudes where sensitive biometric information is involved. In this case the Johannesburg Roads Agency (JRA) refused to issue wayleaves to Vumacam because it believed that Vumacam had abused its power by spying on innocent people and selling the “footage” to third parties. The JRA further felt that Vumacam’s “spy footage” was a tradable asset in its hands, and that this was the primary reason for installing the cameras. In essence the JRA accused Vumacam of spying on individuals’ movements and thereby infringing their right to privacy. It is notable that the JRA, like the Court of Appeal in R-Bridges, argued that a legal framework that respects individuals’ privacy rights must be in place before such sensitive biometric data may be collected and processed. Vumacam countered that the cameras had been installed for crime prevention purposes, and that the JRA was in no position to refuse wayleave applications, as Vumacam complied with the requirements contained in the legislation.

To reach its finding, the court examined the legislation applicable to the case and concluded that if the service provider has been approved to work around public roads and the necessary procedures are in place to protect the infrastructure, the authorising authority must grant the application. Nowhere does the legislation provide that the authorising authority (in this case the JRA) may refuse an application on grounds not contained in it. Consequently, the case was decided in favour of Vumacam. It should be stated clearly that Vumacam’s victory had nothing to do with the issue of potential privacy breaches. At the heart of the matter was the JRA’s refusal to consider Vumacam’s applications for wayleaves; in that regard the JRA erred by taking into account factors other than those set out in the empowering legislation. The importance of the case is that it illustrates the prevailing legal sentiment regarding AFR and the processing of sensitive biometric data. One of the critically important requirements for implementing AFR successfully, without committing large-scale human rights violations, is a comprehensive legal framework that restricts individual discretion and prohibits unnecessary processing of sensitive data. Currently Vumacam is at liberty to record and store CCTV footage at will, without any specific legislation regulating how this material should be handled and stored.

Several recommendations may be made. The first is that the exceptions in the POPI Act are too broad to provide meaningful human rights protection: the South African legal framework gives too much discretion to individual operators, which increases the possibility of human rights violations. Secondly, it appears that ordinary CCTV footage can be used as a data source for AFR systems. The UK Surveillance Camera Code of Practice contains comprehensive regulations for CCTV operators; no similar regulations are in place in South Africa, and it is therefore not surprising that the JRA is so concerned about Vumacam’s wide discretion regarding CCTV technology. The POPI Act allows for the creation of codes of conduct, and these provisions should be used to create codes regulating the use of CCTV and AFR. It is strongly recommended that such codes be issued in South Africa as a matter of urgency. Thirdly, a human should be the final decision maker in any AFR system. This is something the Bridges ruling emphasised, and fortunately it is also expressly addressed in the POPI Act. South Africa has the fragments of a legal framework within which AFR could be implemented successfully. As in the UK, the law should be developed and expanded to safeguard human rights in an era in which new technology has the potential to infringe on human domains like never before.

Keywords: automated facial recognition (AFR); automated facial recognition technology; biometric information; Bridges; facial biometrics; human rights; law enforcement; legal framework; Protection of Personal Information Act (POPI Act); regulation; Vumacam

 

Read the full article in Afrikaans:

’n Toepaslike regsraamwerk vir geoutomatiseerde gesigherkenningstegnologie in Suid-Afrika

 
