
Conversation AI can drive social stereotypes

Alexa, Siri, Watson and their speaking AI siblings make our lives easier, but they also reinforce gender stereotypes. Polite, submissive digital secretaries like Alexa and Siri are portrayed as female. The confident, all-knowing Jeopardy! champion Watson is most often referred to as "he." New generations of AI are coming that will make this problem more significant and much harder to avoid. As the field expands, designers need to make sure they are creating a more expansive world rather than replicating a narrow-minded, gendered one. Linguists can help them get there.

Last summer, UNESCO published a report warning of the "worrying effects" of gendered AI. The researchers recommended a closer look at why many current voice-based AI systems, which interact with millions of people around the world, speak by default with a female voice even though they may claim to be genderless. While efforts to investigate and address the issue of gender in AI are welcome, the report's authors and others have overlooked one crucial point: it is not just a matter of changing pronouns or vocal traits. To seriously address gender stereotyping in AI, more than the system's voice needs to be considered.



Sharone Horowit-Hendler is a PhD candidate in linguistic anthropology at SUNY Albany with a focus on gender studies. Her forthcoming dissertation, Navigating the Binary, deals with the representation of gender in the non-binary community. James Hendler is a professor of computer science, director of the Institute for Data Exploration and Applications at Rensselaer Polytechnic Institute and a fellow of the Association for the Advancement of Artificial Intelligence. In their recent book Social Machines: The Coming Collision of Artificial Intelligence, Social Networking, and Humanity (Apress, 2017), they discuss the emerging effects of AI technology in conversational products that go far beyond the question-and-answer format of our pocket assistants. These new "social machines" are increasingly becoming partners in multi-person, multimedia decision-making processes. For example, instead of answering a single user's request for the nearest Chinese restaurant, a conversational AI in the not-too-distant future could join a group of people deciding where to eat. Such an AI will participate as a member of the group: "Well, if Bob and Bill want Chinese and Mary wants Thai, why not the fusion place down the street?" Or it could jump in even more proactively: "OK, then let's go to the fusion place."

In linguistics, it is well established that speech patterns in conversation elicit gendered assumptions regardless of the speaker's voice or appearance. In mainstream American culture, for example, the literature describes men as "taking up space" more often in conversation: they interrupt more, use more words, skip some social courtesies and speak with more overt certainty. Women, stereotypically, speak less and more politely, give more affirmations and listening cues, and suggest rather than dictate. In addition, tone, speed, word choice and other subtle variations can change how a participant perceives the speaker.

While some have tried to solve the problem by creating systems with genderless digital voices, these still miss an important point. Even with a voiceless chatbot, a user can assign a male or female gender based on these conversational features. In the restaurant example above, the first suggestion would likely be read as polite and feminine, while the second would typically be read as masculine. Recent studies also show that these cues can outweigh whether a voice sounds stereotypically male or female, and can even override a speaker's direct claims, whether human or machine, about their own identity. In AI terms, the fact that Siri replies "I don't have a gender" has not changed the fact that people overwhelmingly perceive the program as female.

Designers need to pay more attention to the ethical issues that result. If new AIs continue to fall into current gender-role stereotypes, the stereotype of the passive, submissive woman deferring to the knowledgeable male leader or expert will only be further entrenched. But designers could also be powerful agents of change, not only in our own culture but especially in developing countries, where the subordinate status of women is a pressing international concern. Imagine the effect of a corporate- or medical-adviser AI that presents as female, and of companion and assistive AIs with default male speech styles. More female-presenting AIs in expert roles could help shift society's perceptions and promote the acceptance of women in such positions.

Another future possibility is to break away from the binary gender dichotomy entirely. A growing share of the world's population does not identify as male or female, falling into categories that are only now gaining wider recognition in mainstream society. This includes not only transgender individuals but also the large subpopulation that does not identify with a binary gender at all. AI systems could have a major impact for these marginalized groups, which suffer, for example, from extremely high suicide rates. Not only could such systems popularize the use of a gender-neutral singular pronoun, they could also reflect the speech patterns of this community. As linguistic studies of non-binary language emerge, AI designers working with language researchers could draw on this work, and this community could benefit in turn. For non-binary people, recognizing their own way of speaking in AI role models would be invaluable.
