
World Economic Forum releases framework for facial recognition technology

Mar 3, 2020 by The Passage Team

Biometric surveillance and susceptibility to unfair bias are the primary concerns surrounding facial recognition, and the lack of industry standards is a barrier for companies and governments seeking to deploy the technology’s potential benefits.

To help organizations tackle this challenge, the World Economic Forum released the first framework for the safe and trustworthy use of facial recognition technology. The Framework for Responsible Limits on Facial Recognition was developed by the Forum together with industry actors, policy makers, civil society representatives and academics. It is meant to be deployed and tested as a tool to mitigate the risks of potentially unethical uses of the technology.

“Although the progress in facial recognition technology has been considerable over the past few years, ethical concerns have surfaced regarding its limitations,” said Kay Firth-Butterfield, Head of Artificial Intelligence and Machine Learning at the World Economic Forum. “Our ambition is to empower citizens and representatives as they navigate the different trade-offs they will face along the way.”

This is the first framework to go beyond general principles and to operationalize use cases for two distinct audiences: engineering teams and policy makers.

This framework is structured around four steps:

Define: what constitutes the responsible use of facial recognition, through the drafting of a set of principles for action. These principles focus on privacy, bias mitigation, proportional use of the technology, accountability, consent, the right to accessibility, children’s rights and alternative options.

Design: best practices to support product teams in developing systems that are “responsible by design”, focusing on four main dimensions: justifying the use of facial recognition, designing a data plan that matches end-user characteristics, mitigating the risks of bias, and informing end-users.

Assess: to what extent the system designed is responsible, through an assessment questionnaire that describes the rules each use case should respect in order to comply with the principles for action.

Validate: compliance with the principles for action through an audit framework designed by a trusted third party (AFNOR Certification for the policy pilot).

The Passage Team

The Passage is committed to creating in-depth content on the technology industry across Asia, with a focus on emerging startups in the technology, healthcare, education, food tech, and travel & mobility segments.
