Watching spaces, surveilling faces… in Jamaica? Part 1
In July 2020 a news report appeared, to little fanfare, that high-tech closed-circuit television (CCTV) surveillance cameras “equipped with facial recognition software” were being utilised by business operators and law enforcement personnel in Montego Bay. The report indicated that those surveillance cameras had facilitated “a number of arrests” and were being lauded by local security experts for their capacity to identify even “mask-wearing robbers”.
Over the years, the increasingly widespread deployment of facial recognition software and other facial recognition technologies (FRTs) has become the subject of impassioned debate worldwide for various reasons. The term FRTs is an umbrella designation that refers to “a set of digital tools used to perform tasks on images or videos of human faces…” (Buolamwini, Ordóñez, Morgenstern and Learned-Miller 2020, 2).
FRTs work by utilising machine-learning algorithms, trained on data sets of facial images, to identify individuals by matching an image or video of a person’s face against a database of facial images. They do this by first analysing the individual’s facial geometry, which comprises unique biometric identifiers, then producing a “face print” that is systematically matched against the other face prints in the database until a positive identification is made (Buolamwini, Ordóñez, Morgenstern and Learned-Miller 2020, 8-12).
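The matching step described above can be sketched, very roughly, in code. In the sketch below, the “face prints” are simply short numeric vectors, and the names, values, and similarity threshold are all invented for illustration; a real FRT would instead derive high-dimensional embeddings from facial images using trained neural networks.

```python
import math

# Hypothetical, pre-computed "face prints": fixed-length numeric vectors
# standing in for the embeddings a real FRT derives from facial geometry.
DATABASE = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
}

def cosine_similarity(u, v):
    """Similarity between two face prints; 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def identify(probe, threshold=0.95):
    """Return the name of the best database match at or above the
    threshold, or None if no stored face print is similar enough."""
    best_name, best_score = None, threshold
    for name, face_print in DATABASE.items():
        score = cosine_similarity(probe, face_print)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name
```

The threshold is the crucial policy lever: set it too low and the system produces the false identifications discussed later in this piece; set it too high and legitimate matches are missed.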
FRTs are in common use as verification tools that facilitate access to electronic devices like smartphones. They are also deployed to facilitate the identification of criminals or suspected criminals; assist with the tracking of missing children and victims of human trafficking; support disaster response and recovery efforts; streamline the provision of health-care services, including patient screening; and control entry to airports.
General concerns
On the flip side, however, FRTs have considerable potential to be misused as mass surveillance tools, and they are highly invasive due to the sensitive nature of the personal data they collect, store, and process (ie biometric facial data). Additionally, digital and privacy activists have expressed concern about their largely unregulated deployment by public and private entities, especially law enforcement, as well as their propensity to misidentify black and brown people and women at comparatively higher rates than white men due to inherent “algorithmic biases”, which can exacerbate racial profiling issues.
This latter concern regarding inherent racialised and gendered algorithmic biases was first spotlighted by the pioneering research of Dr Joy Buolamwini, a Ghanaian American computer scientist, digital activist, and self-professed “poet of code” who “advises world leaders, policymakers, and executives on redressing algorithmic harms”. Dr Buolamwini’s research broke major ground by exposing the inability of several commercial FRTs to accurately identify darker-skinned female faces when presented with racially and gender-diverse data sets of facial images.
Due to the abovementioned concerns, about two years ago tech giants Amazon, IBM, and Microsoft temporarily halted the sale of FRTs to law enforcement, pending legal regulation by Congress. In the EU, Belgium and Luxembourg have officially opposed the unregulated deployment of FRTs, while in England and Wales the Court of Appeal determined that the utilisation of automated facial recognition by law enforcement breached “data protection laws, privacy laws and equality laws” (Edward Bridges v The Chief Constable of South Wales Police and others [2020] EWCA Civ 1058).
In the United States of America, the cities of San Francisco and Boston have banned the utilisation of FRTs by city departments, and the state of California has banned their utilisation in police body cameras. And recently the city of Portland, Oregon, banned the utilisation of FRTs by government agencies, law enforcement, and even private businesses in “places of public accommodation” except:
(1) to the extent necessary to comply with federal, state or local laws;
(2) where it is required to facilitate access to the user’s own personal or employer-issued communication and electronic devices; or
(3) where it is utilised in automatic face detection services for social media applications.
FRTs and the Data
Within the digital economy, personal data, as both a currency and a commodity, is arguably more valuable than gold. In keeping with global trends prioritising the protection of personal data, our Parliament passed the long-awaited Data Protection Act (DPA) on May 19, 2020. The DPA was largely modelled on the European Union’s General Data Protection Regulation (GDPR).

The protection of personal data has been acknowledged as being “of fundamental importance to a person’s enjoyment of his or her right to respect for private life…” (MK v France App no 19522/09 [ECtHR, 18 April 2013]). By providing for the protection of personal data, the DPA gives statutory expression to the right to informational privacy, which the court in Julian J Robinson v The Attorney General of Jamaica affirmed as an important dimension of the constitutionally guaranteed right to privacy ([2019] JMFC Full 04, page 131, para 174).

Thus, as leading data privacy expert Chukwuemeka Cameron carefully explained during a recent panel discussion, the duty of data controllers to protect the personal data they are processing does not “singularly arise from the [DPA]”, since “[i]t is the constitution that places the onus on [data controllers] to ensure that they are processing personal data in a safe, secure, transparent, and accountable way…”
Under the DPA, individuals (data subjects) enjoy various rights in relation to the handling (ie, collection, processing, storage, and disposal) by data controllers of their personal data, which they are considered to own. An especially important right in this connection is the right of data subjects to provide and withdraw consent to the processing of their personal data barring certain exceptions. The DPA also obligates all actors handling personal data whether directly themselves (ie data controllers) or indirectly on behalf of another (ie data processors) to do so in accordance with the data protection standards it establishes.
The formal appointment of Celia Barclay as information commissioner on December 1, 2021 marked the commencement of the two-year transitional period. During this time, businesses and other entities deemed data controllers by the DPA must bring themselves in line with it on pain of hefty financial penalties (and by necessary implication, possible reputational damage).
While the DPA is curiously silent on the question of FRT deployment and its legal ramifications, it is nevertheless the case that, since personal data, specifically biometric facial data, would be processed by FRTs, their deployment would engage the DPA — albeit indirectly. In addition, since biometric facial data would qualify as “sensitive personal data” within the meaning of the DPA it would attract more stringent conditions and obligations with respect to its handling, and also require explicit consent for processing unless there is some legitimate interest militating in favour of processing in the absence of explicit consent.
It is conceivable that the collection of facial data for processing by FRTs could be challenged on the ground that there is no lawful basis for the collection, as has been done under the comparative GDPR regime in the EU. However, while the DPA would apply, albeit indirectly, to the processing of biometric facial data by FRTs, it arguably does not offer any meaningful opportunity for regulating FRTs in a comprehensive and focused way. In several jurisdictions with sophisticated technological infrastructures, a dedicated set of laws geared towards specifically regulating FRTs is fast becoming accepted by key stakeholders as necessary for their ethical regulation. This is especially so due to the unique and highly sensitive nature of biometric data, which is regarded as a special category of data under the GDPR since it “makes it possible to uniquely identify people”.
Regulating the deployment of FRTs
Given the manifold concerns surrounding the largely unregulated deployment of FRTs, calls for their swift and comprehensive regulation have only been intensifying with the passage of time. However, digital activists, politicians, and corporate bigwigs remain divided on the way forward; unable to chart a definitive course towards regulation owing to divergent and oppositional interests that make uniform consensus building difficult. Notwithstanding that, emerging as an apparent ray of hope amidst the dissonance is the shared recognition among those stakeholders that the ethical regulation of FRTs is necessary to promote the public good.
While there are no jurisdictions which have enacted specific legislation to frontally regulate the deployment of FRTs across various contexts, repose has been sought in the imposition of moratoriums on the deployment of FRTs pending regulation, the indirect application of the GDPR, and the intervention of courts.
In 2019 a Chinese professor sued a local zoo after it replaced its fingerprint-based admission system with one that utilised facial recognition. The decision of China’s Supreme People’s Court, published last year, outlined a set of new rules that seemingly attempt to regulate the deployment of FRTs in commercial contexts. In substance, the rules require hotels, shopping malls, airports, and other commercial entities to seek and receive consent from customers before utilising facial recognition. They also appear to limit the utilisation of FRTs to only what is necessary and demand that entities utilising them implement adequate data protection measures.
Further afield, in Telangana, India, a social activist filed a lawsuit earlier this year after he was stopped on his way home by police officers who demanded that he remove his mask and allow them to take his picture. His enquiries about the purpose for which his picture was being taken and the use(s) to which it would be put went unanswered. In his lawsuit, he is requesting that the Government of Telangana discontinue its use of FRTs for law enforcement purposes.
Despite the possibility of quasi-regulation of FRTs through the DPA and its impending regulations in Jamaica, the absence of comprehensive and focused regulation of technologies that have been shown to pose particular danger to the very people who comprise the majority of the Jamaican population will render such quasi-regulation ineffective. This state of affairs exists against the backdrop of an inevitable uptick in the deployment of FRTs locally, driven by crime, the ever-evolving digital economy, and other factors. The dangerous implications become clear in a context in which a false identification by technologies that misidentify melanated people and women at higher rates, and that struggle greatly with identifying faces covered by masks, could actually lead to an individual’s arrest and detention. As such, an ethic of transparency and accountability on the part of public and private actors deploying FRTs is of crucial importance.
Among other things, this will put data subjects, whose facial information will be captured and ultimately processed by FRTs, in a better position to provide informed consent. In addition, members of the public whose faces are scanned upon entry into private or public establishments utilising cameras containing facial recognition software should be apprised of:
(1) the extent to which their facial data is being processed by FRTs;
(2) their identification accuracy rates;
(3) the mechanisms in place to adequately safeguard the data being processed by the technology;
(4) the measures implemented to mitigate the risks inherent in their deployment;
(5) whether the facial data captured has been deleted or stored in a database;
(6) who has access to the database outside of those deploying the FRTs; and
(7) the specific and general purposes for which their facial data are being used or are likely to be used in the future.
As Jamaica continues its slow yet certain march towards a fully digitised society, the reported deployment of FRTs by public and private actors, especially in the absence of comprehensive legal regulation, must be carefully scrutinised in the light of the well-documented issues associated with their deployment elsewhere in the world.
In Part 2 of this conversation I will look at the cross-jurisdictional approaches to regulating FRTs with specific consideration given to the United States of America, the European Union, the United Kingdom, and China.
Amanda Quest is an attorney-at-law. Send comments to the Jamaica Observer or amandajdquest@gmail.com.