- June 22nd
- 9:00 – 11:00 AM
UNESCO’s 2019 report, entitled “I’d Blush If I Could,” argues that voice assistants propagate harmful gender biases, such as reinforcing the notion that women belong in subservient roles, while media coverage and research continue to press tech companies to do better. As a reflection of the brand, the agent’s personality is a critical component of conversational system design. But how do the personalities we design for our voice assistants propagate biases, and how do we avoid doing so? This 5-hour hands-on workshop explores the components that make up personality, the role that each component, including gender, plays, and ways to avoid unintended biases. We’ll share our work on Q, the non-binary voice, and our research on gender and personality in voice assistants. We’ll then break out into teams for design activities and rapid prototyping of a conversational assistant, thinking through the implications of our decisions.
This workshop is beneficial for designers, developers, product managers, and creatives who are responsible for creating conversational assistants or for bringing their company’s brand to life.
This workshop is split across two days; the second day is June 24, starting at 9:00 AM.
9:00 – 9:15 AM Day 1 welcome and agenda
9:15 – 9:30 AM (Interactive) Icebreaker
9:30 – 10:00 AM (Lecture 1) Presentation on personality design:
– What “persona” is in conversational interfaces
– How people project persona onto conversational agents
– What the components of personality are
– How persona needs to suit the use case
10:00 – 10:30 AM (Interactive) Group persona brainstorm:
– Each group is presented with a use case and scope (a description of basic features and functionality)
– Groups brainstorm a persona that suits the use case and discuss how it addresses the application area
– Members rank candidate traits and decide on a finalized list of traits
10:30 – 11:00 AM (Lecture 2) Presentation on bias in personality of AI entities:
– Research on people’s perception of gender in AI entities
– Potential unintended biases introduced by each component and how they are propagated in current voice assistants