New report offers blueprint for facial recognition tech
Sydney - A new report from the University of Technology Sydney (UTS) outlines a model law for facial recognition technology to protect against harmful use.
Australian law was not drafted with widespread use of facial recognition in mind, and the report recommends reform to modernise the law, especially to address threats to privacy and other human rights.
Facial recognition and other remote biometric technologies have grown exponentially in recent years, raising concerns about privacy and mass surveillance, as well as the unfairness experienced, especially by people of colour and women, when the technology makes mistakes.
Reports have found some Australian retailers are using facial recognition to identify customers entering their stores, leading to considerable community alarm and calls for improved regulation.
There have also been widespread calls for reform of facial recognition law in Australia and internationally.
The UTS report says facial recognition tech leaves society particularly vulnerable to human rights restrictions when it is misused or overused.
When facial recognition applications are designed and regulated well, there can be real benefits, helping to identify people efficiently and at scale.
The technology is widely used by people who are blind or have a vision impairment, making the world more accessible for those groups.
The latest report proposes a risk-based model law for facial recognition. The starting point should be to ensure that facial recognition is developed and used in ways that uphold people's basic human rights.
The gaps in current Australian law have created a kind of regulatory market failure. Many respected companies have pulled back from offering facial recognition because consumers aren't properly protected.
Many civil society organisations, government and inter-governmental bodies and independent experts have sounded the alarm about dangers associated with current and predicted uses of facial recognition.
The model law sets out three levels of risk to human rights for individuals affected by the use of a particular facial recognition technology application, as well as risks to the broader community.
Under the model law, anyone who develops or deploys facial recognition technology must first assess the level of human rights risk that would apply to their application. That assessment can then be challenged by members of the public and the regulator.
Based on the risk assessment, the model law then sets out a cumulative set of legal requirements, restrictions and prohibitions.
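The report's tiering logic can be pictured as a cumulative checklist: each higher risk level inherits every obligation of the levels below it and adds its own. The short Python sketch below illustrates that structure only; the tier names and obligations are hypothetical placeholders, not the report's actual wording.

```python
# Hypothetical sketch of the model law's cumulative risk tiers.
# Tier names and obligations are illustrative, not taken from the report.

BASE_RISK = [
    "register the application with the regulator",
    "publish a human rights risk assessment",
]
ELEVATED_RISK = BASE_RISK + [
    "obtain consent from affected individuals",
    "allow independent audit for accuracy and bias",
]
HIGH_RISK = ELEVATED_RISK + [
    "prohibited unless a specific legal exemption applies",
]

TIERS = {"base": BASE_RISK, "elevated": ELEVATED_RISK, "high": HIGH_RISK}

def obligations(tier: str) -> list[str]:
    """Return the cumulative legal obligations for a given risk tier."""
    return TIERS[tier]

if __name__ == "__main__":
    for tier in ("base", "elevated", "high"):
        print(f"{tier}:")
        for duty in obligations(tier):
            print(f"  - {duty}")
```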