Part of the 4TU.Design United

DEI4 Embodied AI

What values, biases and stereotypes are we putting into robots, chatbots, smart assistants, and autonomous cars?

Currently, embodied AI (e.g., smart objects, robots, smart personal assistants) is an expression of power: it can be used to support humans through human-agent relationships, but it can also reinforce existing societal bias and amplify injustice. Recently, we have seen a growing number of calls to consider how issues like gender, race, and disability play a role in AI technology. The DEI4 Embodied AI initiative focused on broadening participation in the development of AI in computer science (i.e., who decides what to develop and how, whose perspective is taken, and which values are embedded) and on changing current practices through an open societal conversation.

“Designing for diversity, equity and inclusion is a complex challenge that requires an urgent revision of existing approaches and methods. We, as designers, must respond to this challenge, by being humble; by being ready to provide others with platforms to tell their story, and by collectively challenging the idea that there is one ‘normal’ way of being.”
Maria Luce Lupetti

We developed four transdisciplinary tools to conduct futuring and critical design workshops with academics, designers, and participants from society.

We tested the tools in four international workshops with more than 200 participants.

The tools are: 

  1. Reflect on implicit assumptions. This tool offers a two-step activity for tangible reflection on how we design embodied AI and how we imagine possible, probable, and desirable futures.
  2. Mapping privileges. This tool is designed to let people reflect on personal positions of privilege. It invites participants to position themselves along binary axes generally associated with privilege (e.g., skin colour).
  3. Punkbot collages against the status quo. This tool supports overturning the status quo of robot design through ornamental activities inspired by the punk techniques of the Letterist International.

  4. Exploring spaces between categories: a biased classifier. This tool aims to break with stereotypical expectations and thinking in binary categories. By surfacing our unconscious associations and our narrow ways of categorizing things, a classification algorithm (i.e., Teachable Machine) can help us challenge gender norms and stereotypes.
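The fourth tool turns a binary classifier against itself: any such classifier must force every input into one of two categories, even inputs that sit between them. A minimal sketch of that idea, using a toy nearest-centroid classifier in place of the workshop's actual tool (Teachable Machine); all feature values and category names below are hypothetical:

```python
# Sketch only: a nearest-centroid classifier shows how a hard binary
# label hides how close an "in-between" input is to the other category.
# The two clusters and their feature values are invented for illustration.

def centroid(points):
    """Mean of a list of 2-D feature vectors."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def classify(x, centroids):
    """Assign x to the nearest centroid; return (label, margin).

    A small margin means the input sits between the two categories,
    yet the classifier still emits a hard binary label.
    """
    dists = {label: ((x[0] - c[0]) ** 2 + (x[1] - c[1]) ** 2) ** 0.5
             for label, c in centroids.items()}
    ranked = sorted(dists.items(), key=lambda kv: kv[1])
    (label, d1), (_, d2) = ranked[0], ranked[1]
    return label, d2 - d1

# Two hypothetical stereotyped clusters of training examples.
centroids = {
    "category_A": centroid([(0.1, 0.2), (0.2, 0.1), (0.0, 0.0)]),
    "category_B": centroid([(0.9, 0.8), (1.0, 1.0), (0.8, 0.9)]),
}

# An input near the midpoint still receives a hard label,
# but its tiny margin reveals how arbitrary that label is.
label, margin = classify((0.55, 0.5), centroids)
print(label, round(margin, 3))
```

In a workshop setting, inspecting the margin alongside the label is what prompts the discussion: the binary output erases exactly the in-between cases the tool is meant to surface.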

“Design methods and toolkits are useful, but they do not provide an easy fix for our current practices. We need to reflect on lived experiences and critically assess the participatory principles we employ. As we have seen in all the workshops, participatory approaches often fall short. We must nurture practices to examine the power dynamics in community research and design projects.”
Cristina Zaga

The project focuses on the entanglements involved in reflecting on AI with and for society. It is relevant because designers and engineers are traditionally not trained to include reflection and practices that tackle social inequity, and thus, willingly or unwillingly, encode certain (negative) values into the systems they design.

We offer practical tools, insights, and a community to design for justice.

Contacts

  • Cristina Zaga
  • Nazli Cila
  • Maria Luce Lupetti
  • Minha Lee
  • Gijs Huisman
  • Eduard Fosch Villaronga
  • Anne Arzberger

Participating universities
