
HyperVoix - Group items tagged: gender


Cécile Christodoulou

Experts warn AI could hardwire sexism into our future - 0 views

  • "In her talk, called "Memoirs of Geisha: Building AI without gender bias," [Laura Andina, a Product Manager at Telefonica Digital] explained AI's gender bias by taking a look at Apple's pioneering of skeuomorphic design - a design method that replicates what a product would look like in real life, as well as taking into account how the physical product would be used."
  • "Receptionists, customer service representatives, and assistants have traditionally been female-dominated careers. Women have had to be helpful, friendly, and patient because it's their job. The skeuomorphic design of an AI assistant therefore would be female. For Andina, it's essential to break these gender biases in design to be able to make real-world changes. If new technology would stop peddling old stereotypes, women would have an easier time moving up the ranks professionally without being cast as assistants or any other "helpful" stereotype."
  • "To avoid hardwiring sexism and gender bias into our future, one possible solution, according to Andina, would be providing a genderless voice for AI technology. But it won't be easy to make - most genderless voices sound too robotic. Human-sounding voices are more trustworthy, so this could deter users."
Cécile Christodoulou

Women Reclaiming AI workshop - 0 views

  • "Women Reclaiming AI (WRAI) is an expanding activist artwork, presented as a feminist AI voice assistant, programmed through participatory workshops by a growing community of self-identifying women. Through creating a platform for collective writing and editing, the project co-creates an AI that challenges prescribed gender roles. It is a response to the pervasive depiction of AI voice assistants gendered as women, subordinate and serving. Designed by development teams which lack diversity, these systems are embedded with unrepresentative world views and stereotypes in ways that reinforce traditional gender roles. WRAI aims to reclaim female voices in the development of future AI systems by empowering self-identifying women to harness conversational AI as a medium for protest. This project is created by artists-technologists Coral Manton and Birgitte Aga in collaboration with an ever-evolving community of self-identifying women."
Cécile Christodoulou

Meet Q: The First Genderless Voice - FULL SPEECH - YouTube - 1 views

  • https://www.wired.com/story/the-genderless-digital-voice-the-world-needs-right-now/
  • "[...] a group of linguists, technologists, and sound designers - led by Copenhagen Pride and Vice's creative agency Virtue - are on a quest to change that with a new, genderless digital voice, made from real voices, called Q. Q isn't going to show up in your smartphone tomorrow, but the idea is to pressure the tech industry into acknowledging that gender isn't necessarily binary, a matter of man or woman, masculine or feminine."
  • "[...] there's a sweet spot between 145 and 175 hertz, a range that research shows we perceive as more gender-neutral. Go higher and you'll perceive the voice as typically female; go lower and it becomes more masculine."
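The quoted 145-175 Hz "sweet spot" is a measurable property of a recording: it refers to the voice's average fundamental frequency (f0). Below is a minimal sketch, assuming Python with librosa and NumPy installed, of how one might estimate a sample's mean f0 and check it against that band. The file name, band constants, and pitch-tracking settings are illustrative assumptions and are not part of how Q itself was produced.

```python
# Minimal sketch (not the Q project's pipeline): estimate a recording's mean
# fundamental frequency and compare it with the 145-175 Hz range the article
# describes as being perceived as more gender-neutral.
import numpy as np
import librosa

GENDER_NEUTRAL_BAND = (145.0, 175.0)  # Hz, as quoted in the Wired article

def mean_f0(path: str) -> float:
    """Return the mean fundamental frequency (Hz) over the voiced frames of a file."""
    y, sr = librosa.load(path, sr=None)      # keep the file's native sample rate
    f0, voiced_flag, voiced_prob = librosa.pyin(
        y,
        fmin=librosa.note_to_hz("C2"),       # ~65 Hz, below typical speech pitch
        fmax=librosa.note_to_hz("C7"),       # ~2093 Hz, well above speech pitch
        sr=sr,
    )
    return float(np.nanmean(f0))             # pyin returns NaN on unvoiced frames

if __name__ == "__main__":
    pitch = mean_f0("voice_sample.wav")      # hypothetical input file
    low, high = GENDER_NEUTRAL_BAND
    status = "inside" if low <= pitch <= high else "outside"
    print(f"Mean f0: {pitch:.1f} Hz ({status} the {low:.0f}-{high:.0f} Hz band)")
```

Pitch alone is only a rough proxy: perceived gender also depends on timbre, prosody, and formants, which is part of why genderless voices are hard to make sound natural.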
Cécile Christodoulou

This feminist chatbot challenges AI bias in voice assistants - 0 views

  • "F'xa is built with feminist values in mind and every response given holds up to feminist beliefs that avoid reinforcing bias and stereotypes. F'xa was created by a diverse team using the Feminist Internet's Personal Intelligent Assistant Standards and Josie Young's Feminist Chatbot Design research. In preparation for building F'xa, Young explored contemporary feminist techniques for designing technology called the Feminist Chatbot Design Process - a series of reflective questions incorporating feminist design, ethical AI principles, and research on de-biasing data. Using a smartphone, the bot works to ensure designers don't perpetuate gender inequalities in their chatbots and educates users on how current voice assistants give gender equality a bleak future."
Cécile Christodoulou

Female Voice Assistants Reinforcing Stereotypes, says UN Report - 0 views

  • "The UNESCO report asks developers to design a neutral machine gender for voice assistants and to program them to discourage gender-based insults. Technology firms should also emphasise to the public that voice assistants are non-human."
  • https://unesdoc.unesco.org/ark:/48223/pf0000367416.page=1
Cécile Christodoulou

Smart speakers understand men better than women, according to study | TechRadar - 0 views

  • "Female owners of smart speakers are more likely than men to report that their device fails to understand their commands, according to a recent study of 1,000 British smart speaker owners by YouGov."
  • "The researchers also found that women tend to speak more politely to their smart speakers, with 45% saying they 'always' or 'often' say 'please' and 'thank you', compared to only 30% of male owners."
  • "This discrepancy between male and female users could be a result of bias at the point of training AI assistants like Alexa or Siri; if programmers train the AI to respond to mainly male voices, it may have trouble recognizing female voices in the future. Not everyone believes this to be the case, however. In its reporting of the study, the Evening Standard cites a blog post by Delip Rao, founder and CEO of R7 Speech Sciences, who believes that the discrepancy is down to technological issues rather than gender bias."