Context

This study project is the result of an online course that began by examining the history of feminism, right up to today. I learned about the intersection of feminism and technology from influential figures like Josie Young, Alexander Fefegha, Caroline Sinders and Feminist Internet.

This project was researched and designed as part of a Futurelearn course by the UAL Creative Computing Institute called ‘Design a Feminist Chatbot’.

My role

As the conversation designer for this study project I followed these steps:

  1. Completed user interviews
  2. Created a storyboard to get more insight into the contexts of use
  3. Considered design and representation for the user, using Google’s Conversation Design Process
  4. Mapped conversation flows using Whimsical
  5. Coded the scripted bot using Glitch

Challenge

Create a conversational design flow for a scripted chatbot that meets the needs of an underrepresented group of people within society.


Process

Feminist Internet studies

As part of the course I learned:

  • About the different types of chatbots (scripted or AI) and the impact of bias in their design.
  • How many ‘helper’ bots (Alexa, Siri) are gendered as women – perpetuating the stereotype of women as carers and, in some ways, as subservient.
  • Ways that bias in the design process can amplify harmful effects of stereotypes and serve them back to us.
  • The value of research: discovering the actual, rather than perceived, needs of the people we are designing for.
  • The importance of diversity within a design team.

Users as stakeholders

The course questioned the term ‘users’. This wasn’t the first time I had encountered the idea; I tend to use ‘people’ and ‘users’ fairly interchangeably. It was suggested that we use the term ‘stakeholders’ instead, encouraging us to think of them as ‘having a significant input (stake) in the project’. Using the stakeholder generator tool, I chose to create a scripted bot that would help children reduce anxiety.


Stakeholder research

Simple (and limited) stakeholder research was completed with my son and a couple of his friends. Being at the younger end of the stakeholder group (ages 5 and 6), they gave short but helpful responses. They did get bored rather quickly!

The tool prompted me to carefully consider any barriers to use for my stakeholder group. The most obvious is reading ability: there’s a big difference between the reading ability of a six-year-old and that of a ten-year-old, but either way the language used by the bot should be clear and conversational.

Stakeholder research page, specifying the users (stakeholders) for this bot.

I felt it was important to consider the kinds of things children might worry about. What might cause them anxiety? My son and his peers live a mostly privileged life. Many children do not. They may worry about whether they will eat that day, or be affected by bullying or abuse. They could have caring obligations, be in fragile or abusive domestic situations, deal with financial hardship or any number of other concerns.

Stakeholder research page with brainstorm ideas.

A bot like this may not work for all children; it may even be inaccessible to some. But it has the potential to introduce them to the idea of taking a pause and to encourage self-reflection: stopping and noticing how they feel, rather than simply reacting to those feelings. If Cody achieved only that, it would be beneficial.


Context of use

At this stage, I created a simple storyboard. It helped me think about where a stakeholder might be, their frame of mind at each stage in the process, and what might trigger them to use the bot in the first place.

Storyboard setting the scene (Amy has a hard day at school, sensory overload, struggling to keep up with work and peers) followed by triggers for use of the app (Amy struggling with noise and work at school), which her mum downloads. Then the interaction phase: Amy finds the bot is friendly and colourful, and she learns ways to deal with the noise and peers in school. And finally, change: Amy is doing better at school, and coping better generally.

Design and representation

The next step was to consider the chatbot representation and its personality. I used Google’s Conversation Design Process, with prompts from the Feminist Design Tool.

A brainstorm of adjectives came next, focussed on the qualities I wanted the chatbot to embody. That list was then narrowed down to 5 core personality traits.

Stakeholders (users) could be in an escalated state, so it’s important that the bot’s personality is calm and non-threatening, and that it can understand and empathise in order to help.

The next brainstorm was to discover characters who might embody these qualities: not necessarily a person, it could be an animal, a robot or even an alien! We were prompted to be aware of perpetuating stereotypes through the use of gender. I chose a rabbit as my character: soft and gentle, the very definition of a non-threatening animal.

Brainstorm of adjectives to describe the bot, and characters who embody those qualities.
A description of the chatbot, and visual references: animated gifs of robots, and stylised rabbits.

Conversation design

Considering how to talk to children in that age group, particularly about things they’re reluctant to think or talk about, was really important.

Language points to consider:

  • Using words like ‘help’ and ‘worries’ might put kids off… try talking about what’s ‘on your mind’ or ‘bothering you’ instead.
  • Striking the right balance of information, conversation and fun.
  • Conveying that the bot understands how they feel (within the limits of what a bot can do).
  • Introducing children to techniques, activities and information, rather than pushing them.

I then mapped out the scripted conversation flow using Whimsical.

Screenshot of the conversation flow mapped out in Whimsical.
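
To make this concrete, here’s a minimal sketch (in TypeScript) of how a scripted flow like the Whimsical map can be represented in code. The node ids and most of the wording are illustrative assumptions; only the opening lines match Cody’s actual script. A scripted bot has no language understanding at all, just a graph of messages and predefined replies.

    // A minimal sketch of a scripted conversation flow, mirroring the kind
    // of map drawn in Whimsical. Node ids and most wording are illustrative
    // assumptions, not Cody's actual script.
    interface ReplyOption {
      label: string; // the tappable reply shown to the child
      next: string;  // id of the node this reply leads to
    }

    interface FlowNode {
      botSays: string[];       // one or more chat bubbles from Cody
      options?: ReplyOption[]; // omitted on end-of-script nodes
    }

    const flow: Record<string, FlowNode> = {
      start: {
        botSays: [
          "Hey! My name is Cody.",
          "I'd like to be your friend. What would you like me to call you?",
        ],
        options: [{ label: "Tell Cody your name", next: "checkIn" }],
      },
      checkIn: {
        botSays: ["Is anything on your mind today?"],
        options: [
          { label: "Yes, a bit", next: "breathing" },
          { label: "Not really", next: "goodbye" },
        ],
      },
      breathing: {
        botSays: ["Let's take a slow breath together. In... and out."],
        options: [{ label: "Done!", next: "goodbye" }],
      },
      goodbye: { botSays: ["Come back and chat whenever you like!"] },
    };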

Alpha version

Working through this process step-by-step ensured that, despite having a very limited research group, I was considering the stakeholders’ needs and limitations at every point.

Because of this, I decided that the bot should introduce kids to the organisation Childline and the resources available through its website and counselling services (chat online or call), and that a final version of the bot should have a read-aloud function to be more accessible to this age group.

I chose the name Cody for this particular chatbot. It’s a non-gendered name meaning ‘Helper’. It’s also a (rather obvious) pun on coding, which appealed to me.

Cody should:

  • Introduce kids to Childline and resources available to them through their website and counselling services (chat online or call).
  • Provide them with simple mindfulness techniques to help them to manage their worries.
  • Provide information and fun, alongside the techniques. It’s a fine balance!
  • Have read-aloud functionality (sketched below).
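
The read-aloud function isn’t in the alpha version, but as a hedged sketch, a browser-based bot could lean on the Web Speech API’s speechSynthesis interface, which most modern browsers support. The rate and pitch values below are illustrative guesses aimed at early readers, not tested settings.

    // Sketch of a possible read-aloud feature using the browser's
    // built-in Web Speech API. Rate and pitch are illustrative guesses
    // aimed at early readers, not tested values.
    function readAloud(text: string): void {
      const utterance = new SpeechSynthesisUtterance(text);
      utterance.rate = 0.9;  // a little slower than default speech
      utterance.pitch = 1.1; // slightly brighter, friendlier tone
      window.speechSynthesis.cancel(); // stop any bubble still being read
      window.speechSynthesis.speak(utterance);
    }

    // e.g. call readAloud(bubbleText) each time a new Cody bubble appears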
Introduction conversation where Cody asks to be your friend and asks how you would like to be addressed.

Once the conversation flow was complete, I built the bot in Glitch, remixing the F’xa bot by Feminist Internet.

Screenshot of the chatbot introduction, this time it's a split screen showing the code on the right hand side, and the chatbot on the left. The code shows what the bot says at each point, and the human response options. The introduction text reads 'Hey! My name is Cody. I'd like to be your friend. What would you like me to call you?'
Code and conversation shown in Glitch
View the alpha version on Glitch
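
Under the hood, a scripted bot is essentially a loop over a flow like the one sketched earlier: show Cody’s bubbles, wait for the child to tap a reply, follow that branch, repeat. The F’xa remix is structured differently in practice; this sketch (reusing the FlowNode structure from above, with a callback standing in for the UI) just illustrates the scripted, non-AI pattern.

    // Sketch of the scripted-bot loop, reusing the FlowNode structure
    // from the earlier sketch. In the Glitch app the bubbles and reply
    // buttons are rendered in the browser; here a callback stands in.
    async function run(
      flow: Record<string, FlowNode>,
      startId: string,
      choose: (labels: string[]) => Promise<number>, // index of tapped reply
    ): Promise<void> {
      let node: FlowNode | undefined = flow[startId];
      while (node) {
        for (const line of node.botSays) {
          console.log(`Cody: ${line}`); // stand-in for rendering a bubble
        }
        if (!node.options || node.options.length === 0) {
          return; // end of script reached
        }
        const picked = await choose(node.options.map((o) => o.label));
        node = flow[node.options[picked].next];
      }
    }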

Result

Considering this was a short project with a very limited research group, I’m proud of the end result as an alpha version. I believe it has potential as a way of introducing de-escalation and mindfulness techniques to children, and of offering support to those who need it.

Given the huge variation in each child’s lived experience, and the developmental changes that happen between the ages of 5 and 10, it could never be a product that works for all children. People have all kinds of different worries. The scale of these worries may vary depending on the gender, race, family situation and socio-economic circumstances of each child, and those with serious* or safety concerns should be properly signposted to contact Childline.

*‘Serious’, of course, through an adult lens; all worries are valid and serious to the child.


Reflections

I found the course, and the process of chatbot design, fascinating. To progress this project I would interview a far greater number of participants from a more diverse research population: a wider range of ages, more racial diversity, and a spread across socio-economic groups. Only comprehensive research would give a proper view of user (stakeholder) needs. At that point the bot would be ready for another iteration, or for user testing.

Photo by Mika Baumeister on Unsplash

Animations used for Cody by Jonathan Dahl, Nhut Nguyen, Formas Studio and Bender from Futurama by Matt Groening.