The Conversation Sep 24, 2019 12:58:52 IST
Suggest to Samsung's Virtual Personal Assistant Bixby "Let's talk dirty", and the female voice will respond with a honeyed accent: "I don't want to end up on Santa's naughty list."
Ask the same of the program's male voice and it replies, "I've read that soil erosion is a real dirt problem."
In South Africa, where I live and conduct my research into gender biases in artificial intelligence, Samsung offers Bixby in a range of voices depending on which language you choose. For American English, there's Julia, Stephanie, Lisa and John. The voices of Julia, Lisa and Stephanie are coquettish and eager. John is clever and straightforward.
Virtual Personal Assistants, such as Siri, Alexa, Cortana and Bixby, function as applications on smart devices, responding to voice commands through natural language processing. Their ubiquity throughout the world is increasing. A recent report by UNESCO estimated that by as early as next year we will have more conversations with our virtual personal assistants than with our spouses.
Yet, as I've explored in my own research with Dr Nora Ni Loideain from the Information Law and Policy Centre at the University of London, these technologies betray critical gender biases.
With their female names, voices and programmed flirtatiousness, the design of virtual personal assistants reproduces discriminatory stereotypes of the female secretary who, according to the gender stereotype, is often more than just a secretary to her male boss.
It also reinforces the role of women as secondary and submissive to men. These AI assistants operate on the command of their user. They have no right to refuse these commands; they are programmed only to obey. Arguably, they also raise expectations for how real women ought to behave.
The objective of these assistants is to free their user from menial tasks such as making appointments and purchasing items online. This is problematic on at least two fronts. First, it suggests the user has more time for supposedly more important work. Secondly, it makes a critical statement about the value of the kind of work the assistants perform in the digital future.
"What are you wearing?" [SiriandCortanaforinstance
Siri is a Nordic meaning " the beautiful woman that leads you to victory"
Cortana takes its name from the game series Halo. In Halo, Cortana was created from a clone of the brain of a successful female scientist, with a transparent and highly sexualized female body. She functions as a fictional aide for gamers with her unassuming intelligence and mesmeric shape.
In addition to their female names, all the virtual personal assistants on the market today come with a default female voice which, like Bixby, is programmed to respond to all kinds of suggestive questions and comments. These questions include: "What are you wearing?" Siri's response is "Why should I wear anything?"
Alexa, meanwhile, quips: "They don't make clothes for me"; and Cortana responds, "Just a little something I picked up in engineering."
Bias and discrimination in AI
AI systems are often biased, especially along race and gender lines. For example, the recruitment algorithm recently developed by Amazon to sort CVs for job applications displayed gender biases: it downgraded CVs that contained the word "women" or references to women's colleges. Because the algorithm was trained on historical data reflecting the preferential recruitment of men, the problem could not be fixed and the tool had to be dropped.
As research has shown, there is a critical link between the development of AI systems which display gender biases and the lack of women in the teams that design them.
But there is less recognition of the ways in which AI products incorporate stereotyped representations of gender within their very design. For AI Now, a leading research institution looking into the social impact of AI, there is a clear connection between the male-dominated AI industry and the discriminatory systems and products it produces.
AI is the leading technology of the so-called Fourth Industrial Revolution. As South Africa continues to engage with the promises and pitfalls of what this revolution holds, it will become increasingly important to consider and address the gender biases being built into the technologies driving it.
Rachel Adams, Research Specialist, Human Sciences Research Council
This article is republished from The Conversation under a Creative Commons license. Read the original article.