What if AI judged you?
Without knowing you?

The Biometric Mirror is a suite of provocative interactive systems that expose how AI sees, classifies, and judges us. By turning your data into instant labels, scores, or predictions, it sparks reflection on bias, fairness, and accountability in automated decision-making.

Why it matters
AI is already used to decide who gets a loan, a job, or even parole. These systems claim to be objective, yet they often replicate (or amplify) the biases of their designers and datasets. The Biometric Mirror challenges this myth of neutrality, confronting us with a simple but urgent question: Should we trust machines to define who we are?

The experience
Through playful yet confronting installations, participants watch themselves being assessed in real time by AI systems that claim to measure traits such as attractiveness, competence, trustworthiness and, yes, even their preferred music genre. This immediate feedback (part game, part warning) forces us to reflect on how it feels to be judged by an algorithm.

Public impact
By making algorithmic processes visible and tangible, the Biometric Mirror empowers audiences to take part in the debate about the role of AI in society. It sparks conversation, fuels public awareness, and highlights the stakes of letting automated systems influence critical human decisions.

The bigger picture
At its core, the Biometric Mirror is not just about technology. It’s about values, fairness, and accountability. It asks: if AI systems are shaping our futures, who gets to design them, and who bears the risks when they get it wrong?

Meet the Mirrors

The Biometric Mirror exists in multiple forms, each one exposing a different way algorithms attempt to know us. These works are not products or prototypes; they are provocations. Each installation takes a familiar promise of AI and pushes it into public space, forcing us to ask: what happens when machines judge, misread, or redefine who we are?

Biometric Mirror, University of Melbourne

The first Mirror turned facial recognition into a tool of self-confrontation. Participants watched as their portrait was instantly translated into personality traits: ambitious, neurotic, and extroverted. The bluntness of the judgment revealed both the seduction and the absurdity of reducing people to data.

Face Value, University of Technology Sydney

Here, the Mirror examined emotion and trust detection technologies, which are increasingly used in policing, hiring, and access control. By turning dubious science into an interactive encounter, it exposed how easily human complexity is reduced to signals that machines claim to read.

Beauty Temple, Science Gallery

Set within the aesthetics of a futuristic salon, this work, created with Lucy McRae, allowed visitors to submit to the algorithm’s gaze and receive a prescription for beauty. In doing so, it questioned how much we are willing to outsource our bodies and identities to systems that pretend to know what “better” looks like.

How They See Us, SparkOH!

This latest iteration asked not only how algorithms classify us, but how they might care for us. Visitors interacted with a virtual therapist, only to realise its empathy was as artificial as its categories. The installation raised an unsettling question: what happens when our most intimate needs are met by systems that do not, and cannot, understand us?

Publications

Pursuit, 2021. Niels Wouters and Jeannie Paterson: TikTok Captures Your Face.

Pursuit, 2020. Niels Wouters and Ryan Kelly: The Danger of Surveillance Tech Post COVID-19.

Proceedings of the Conference on Designing Interactive Systems (DIS), 2019. Niels Wouters, Ryan Kelly, Eduardo Velloso, Katrin Wolf, Hasan Shahid Ferdous, Joshua Newn, Zaher Joukhadar, and Frank Vetere: Biometric Mirror: Exploring Ethical Opinions towards Facial Analysis and Automated Decision-Making.

Pursuit, 2018. Niels Wouters and Frank Vetere: Holding a Black Mirror Up to Artificial Intelligence.

The Conversation, 2018. Niels Wouters and Frank Vetere: AI Scans Your Data to Assess Your Character, but Biometric Mirror Asks: What If It Gets It Wrong?