July 5, 2024
What the FAccT? First law, bad law
Hosted by
Alix Dunn

In part three of our FAccT deep dive, Alix joins Lara Groves and Jacob Metcalf to discuss their paper “Auditing Work: Exploring the New York City algorithmic bias audit regime”. When you apply for a job, it’s very likely that somewhere along the way your CV or cover letter will be ‘screened’ by some kind of algorithm. But how fair and transparent is that process? And does transparency even guarantee fairness? In their paper, Lara and Jacob examine the first ever law to require independent algorithmic bias audits, and the first to impose such a requirement in a commercial setting.

On Computer Says Maybe, host Alix Dunn interviews visionaries and cutting-edge researchers to help you wade through the wacky and worrying world of new technology.
Contact us with your feedback and suggestions, or if you’re keen to explore your area of work in conversation. We have rotating co-hosts and expert guests who help us deep dive into a particular topic. Write to us anytime at team@saysmaybe.com.
Show Notes

Lara Groves is a Senior Researcher at the Ada Lovelace Institute. Her most recent project explored the role of third-party auditing regimes in AI governance. Lara has previously led research on the role of public participation in commercial AI labs, and on algorithmic impact assessments. Her research interests include practical and participatory approaches to algorithmic accountability and innovative policy solutions to challenges of governance.

Before joining Ada, Lara worked as a tech and internet policy consultant, and has experience in research, public affairs and campaigns for think tanks, political parties and advocacy groups. Lara has an MSc in Democracy from UCL.

Jacob Metcalf, PhD, is a researcher at Data & Society, where he leads the AI on the Ground Initiative, and works on an NSF-funded multisite project, Pervasive Data Ethics for Computational Research (PERVADE). For this project, he studies how data ethics practices are emerging in environments that have not previously grappled with research ethics, such as industry, IRBs, and civil society organizations. His recent work has focused on the new organizational roles that have developed around AI ethics in tech companies.

Jake’s consulting firm, Ethical Resolve, provides a range of ethics services, helping clients to make well-informed, consistent, actionable, and timely business decisions that reflect their values. He also serves as the Ethics Subgroup Chair for the IEEE P7000 Standard.
