The default female voice perpetuates gender stereotypes, which lead to bias, prejudice, bigotry, and misogyny.
The biggest challenge here is the inequality that we must overcome in order to implement these new design suggestions.
A. Male-dominated field - currently, only 12% of AI researchers are female.
B. Old data sets - old, incomplete, or biased data creates biased AI.
This presents an opportunity to examine the current methods used in digital assistant design, identify the problem areas, and create strategies and best practices that steer us away from bias and toward equality.
Apple Ally is a phone application that uses AI to guide minors in a way that promotes equality and respect.
We need to reach people at a younger age! While exploring how to shift attitudes and change behaviors, I discovered an unmet market: a voice assistant designed specifically for minors receiving their first iPhone. The strategies and design of the app were guided by a set of best practices I created in order to solve the problem in a cohesive way.
WHAT : Digital voice guide for minors.
WHY : Empowers kids to choose a voice, promotes respectful interactions with the guide, and provides equal representation of genders via the “Alter Ally” feature.
WHEN : During the purchase and setup of a first iPhone.
HOW : Replace Siri with Ally for minors. Add Ally as an option for adults.
WHO THIS SERVES
Click below to see the combination family persona and user journey maps.
In order to create a service that solves the appropriate problems, I had to create a set of best practices. These are intended for designers at the beginning of the design process and should be applied and referenced throughout each phase of the project.
1. Create a team of diverse people.
2. Use a diverse and comprehensive gender-sensitive data set.
3. Create an onboarding system with an awareness of gender bias.
4. Employ corrective responses for manners in order to promote respect toward female voices.
5. Employ corrective responses for sexist comments.
6. Pay special attention to the children who are first time phone owners.
7. Be a leader in progress by standing up for equality, even if it contradicts user feedback.
8. Deter interaction with sexual references.
9. Replace the word assistant.
10. Ditch the default.
Click on the image to see five different flows with annotations.
INTERVIEWS WITH EXPERTS
Elizabeth Canfield, Ph.D.
Department of Gender, Sexuality and Women's Studies, Associate Professor and Associate Chair
I interviewed Liz at the very beginning of this journey. She helped direct me toward the topic of bias in technology and how to mitigate it.
Vojislav Kecman, Ph.D.
Professor - Computer scientist specializing in large-scale datasets and machine learning.
Vojislav helped me understand that algorithms aren't necessarily biased; the data sets are the real problem.
Bridget McInnes, Ph.D.
Assistant Professor - Natural Language Processing (NLP) with a particular interest in semantics.
Dr. McInnes and I discussed language and its role in AI bias.
As a passionate feminist, I really wanted to find an impactful solution for flattening the curve of bias. It took me a while to figure out which part of the problem I wanted to solve and where, exactly, I needed to start. The first expert I interviewed holds a Ph.D. in women's studies. After that interview, I knew I wanted to focus on bias in AI because of its relevance. It took a ton of research, including three books and about 57 articles, for me to arrive at that "ah-ha" moment. But I finally got there. I only wish I had more time to continue working on this. I hit my stride late, but at least I hit it. This is a viable solution based on sound best practices. I firmly believe that one day women will overcome inequality via bite-sized solutions that create monumental movements.