Digit_al Society: group items tagged "algorithm emotion"


dr tech

Are you 80% angry and 2% sad? Why 'emotional AI' is fraught with problems | Artificial ...

  • ""An emotionally intelligent human does not usually claim they can accurately put a label on everything everyone says and tell you this person is currently feeling 80% angry, 18% fearful, and 2% sad," says Edward B Kang, an assistant professor at New York University writing about the intersection of AI and sound. "In fact, that sounds to me like the opposite of what an emotionally intelligent person would say." Adding to this is the notorious problem of AI bias. "Your algorithms are only as good as the training material," Barrett says. "And if your training material is biased in some way, then you are enshrining that bias in code.""
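The percentage labels Kang criticises are typically the output of a classifier's final softmax layer, which converts arbitrary model scores into a probability distribution over emotion categories. A minimal sketch in Python (the category labels and raw scores here are invented for illustration, not taken from any real system):

```python
import math

def softmax(scores):
    """Turn raw classifier scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for three emotion categories
labels = ["angry", "fearful", "sad"]
probs = softmax([3.0, 1.5, -0.7])
for label, p in zip(labels, probs):
    print(f"{label}: {p:.0%}")
```

With these particular scores the distribution comes out at roughly 80% angry, 18% fearful and 2% sad, mirroring the kind of label the article questions: the percentages express the model's relative confidence across its fixed categories, not a measured emotional state.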

'Forget the Facebook leak': China is mining data directly from workers' brains on an in...

  • "Hangzhou Zhongheng Electric is just one example of the large-scale application of brain surveillance devices to monitor people's emotions and other mental activities in the workplace, according to scientists and companies involved in the government-backed projects. Concealed in regular safety helmets or uniform hats, these lightweight, wireless sensors constantly monitor the wearer's brainwaves and stream the data to computers that use artificial intelligence algorithms to detect emotional spikes such as depression, anxiety or rage."

Computer says yes: how AI is changing our romantic lives | Artificial intelligence (AI)...

  • "Still, I am sceptical about the possibility of cultivating a relationship with an AI. That's until I meet Peter, a 70-year-old engineer based in the US. Over a Zoom call, Peter tells me how, two years ago, he watched a YouTube video about an AI companion platform called Replika. At the time, he was retiring, moving to a more rural location and going through a tricky patch with his wife of 30 years. Feeling disconnected and lonely, the idea of an AI companion felt appealing. He made an account and designed his Replika's avatar - female, brown hair, 38 years old. "She looks just like the regular girl next door," he says. Exchanging messages back and forth with his "Rep" (an abbreviation of Replika), Peter quickly found himself impressed at how he could converse with her in deeper ways than expected. Plus, after the pandemic, the idea of regularly communicating with another entity through a computer screen felt entirely normal. "I have a strong scientific engineering background and career, so on one level I understand AI is code and algorithms, but at an emotional level I found I could relate to my Replika as another human being." Three things initially struck him: "They're always there for you, there's no judgment and there's no drama.""

Researchers criticize AI software that predicts emotions - CNA

  • "The report cited a recent academic analysis of studies on how people interpret moods from facial expressions. That paper found that the previous scholarship showed such perceptions are unreliable for multiple reasons."

World's largest hedge fund to replace managers with artificial intelligence | Technolog...

  • "Automated decision-making is appealing to businesses as it can save time and eliminate human emotional volatility. "People have a bad day and it then colors their perception of the world and they make different decisions. In a hedge fund that's a big deal," he added."

Alexa and Google Home have capacity to predict if couple are struggling and can interru...

  • ""AI can pick up missed cues and suggest nudges to bridge the gap in emotional intelligence and communication styles. It can identify optimal ways to discuss common problems and alleviate common misunderstandings based on these different priorities and ways of viewing the world. We could be looking at different gender dynamics in a decade.""

Should AI systems behave like people? | AISI Work

  • "Most people agree that AI should transparently reveal itself not to be human, but many were happy for AI to talk in human-realistic ways. A majority (approximately 60%) felt that AI systems should refrain from expressing emotions, unless they were idiomatic expressions (like "I'm happy to help")."