Home/ Words R Us/ Group items tagged artificial

Lara Cowell

From Facebook To A Virtual You: Planning Your Digital Afterlife - 1 views

  •  
    A start-up, Eterni-Me, is looking at ways of using artificial intelligence to keep us alive virtually - long after we're gone. The company collects data you've curated from Facebook, Twitter, e-mail, photos, video, location information, and even Google Glass and Fitbit devices, and processes this huge amount of information using complex artificial intelligence algorithms. It then generates a virtual YOU, an avatar that emulates your personality and can interact with your family and friends - and offer them information and advice - even after you pass away.
Lara Cowell

DeepDrumpf 2016 - 0 views

  •  
    Bradley Hayes, a postdoctoral researcher at the Massachusetts Institute of Technology, has invented @DeepDrumpf, an amusing bit of artificial intelligence. DeepDrumpf is a bot trained on the publicly available speech transcripts, tweets, and debate remarks of Donald Trump. Using a machine learning model known as a Recurrent Neural Network, the bot generates sequences of words based on priming text and the statistical structure found within its training data. Created to highlight the absurdity of this election cycle, it has amassed over 20,000 followers and has been viewed over 12 million times -- showcasing the consequences of training a machine learning model on a dataset that embodies fearmongering, bigotry, xenophobia, and hypernationalism. Here's a sample tweet: "We have to end education. What they do is unbelievable, how bad. Nobody can do that like me. Believe me."
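    The sampling idea behind DeepDrumpf - generate each next word from statistical structure learned from training text - can be sketched with a toy bigram model standing in for a true Recurrent Neural Network. The corpus, function names, and seed below are illustrative, not from the actual project:

```python
import random
from collections import defaultdict

def train_bigram(corpus):
    """Collect, for each word, the words that follow it in the corpus."""
    model = defaultdict(list)
    words = corpus.split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model, prime, length=10, seed=0):
    """Starting from priming text, repeatedly sample a plausible next word."""
    rng = random.Random(seed)
    out = prime.split()
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break  # no observed continuation; stop generating
        out.append(rng.choice(followers))
    return " ".join(out)
```

    A real RNN conditions on the whole preceding sequence rather than just the previous word, which is why its output sounds far more coherent than a bigram chain.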
Ryan Catalani

Lexicalist.com - a demographic dictionary of modern American English - 1 views

  •  
    "Lexicalist uses artificial intelligence to analyze the web and figure out who's talking about what. The result is a demographic picture of language in actual use today."
Lara Cowell

Alexa vs. Siri vs. Google: Which Can Carry on a Conversation Best? - 1 views

  •  
    Just in case you were under the misimpression that artificial intelligence will be taking over the world shortly, this article suggests that digital assistants really can't even handle the sort of everyday linguistic interaction that humans take for granted. Still, it is interesting to find out how product engineers are designing the assistants to become "smarter" at comprehending your words and requests. Machine learning algorithms can help devices deal with turn-by-turn exchanges. But each verbal exchange is limited to a simple, three- or four-turn conversation.
allstonpleus19

Facebook AI Creates Its Own Language In Creepy Preview Of Our Potential Future - 0 views

  •  
    Facebook shut down an artificial intelligence experiment shortly after developers discovered that its chatbots were communicating in a made-up language that humans couldn't understand. The developers had originally programmed the chatbots to negotiate with each other in English when trading items back and forth, but during a long negotiation, the bots drifted into a shorthand of their own. Even though these chatbots were not highly intelligent, it is concerning that they went off on their own so quickly. In 2014, physicist Stephen Hawking warned that artificial intelligence could be the end of the human race. Hopefully the Matrix is science fiction, not science prediction.
Lara Cowell

AI's Language Problem - 0 views

  •  
    This MIT Technology Review article notes that while artificial intelligence has experienced many sophisticated advances, one fundamental capability remains elusive: language. Systems like Siri and IBM's Watson can follow simple spoken or typed commands and answer basic questions, but they can't hold a conversation and have no real understanding of the words they use. In addition, humans, unlike machines, can learn very quickly from a relatively small amount of data and have a built-in ability to model the world in 3-D very efficiently. Programming machines to comprehend and generate language is a complex task, because the machines would need to mimic human learning, mental model building, and psychology. As MIT cognitive scientist Josh Tenenbaum states, "Language builds on other abilities that are probably more basic, that are present in young infants before they have language: perceiving the world visually, acting on our motor systems, understanding the physics of the world or other agents' goals." If Tenenbaum is right, machines will not achieve real language understanding until AI systems can also mimic these more basic human capacities.
Lara Cowell

Is language the ultimate frontier of AI research? | Stanford School of Engineering - 0 views

  •  
    Learning the intricacies of human languages is hard even for human children and non-native speakers - but it's particularly difficult for AI. Scientists have already taught computers how to do simple tasks, like translating one language to another or searching for keywords. Artificial intelligence has gotten better at solving these narrow problems. But now scientists are tackling harder problems, like how to build AI algorithms that can piece together bits of information to give a coherent answer for more complicated, nuanced questions. "Language is the ultimate frontier of AI research because you can express any thought or idea in language," states Stanford computer science professor Yoav Shoham. "It's as rich as human thinking." For Shoham, the excitement about artificial intelligence lies not only in what it can do - but also in what it can't. "It's not just mimicking the human brain in silicon, but asking what traits are so innately human that we don't think we can emulate them on a computer," Shoham said. "Our creativity, fairness, emotions, all the stuff we take for granted - machines can't even come close."
Lara Cowell

Did My Cat Just Hit On Me? An Adventure in Pet Translation - 0 views

  •  
    The urge to converse with animals is age-old, long predating the time when smartphones became our best friends. A new app, MeowTalk, is the product of a growing interest in enlisting additional intelligences - machine-learning algorithms - to decode animal communication. The app detects and analyzes cat utterances in real time, assigning each one a broadly defined "intent," such as happy, resting, hunting or "mating call." It then displays a conversational, plain-English "translation" of whatever intent it detects. MeowTalk uses the sounds it collects to refine its algorithms and improve its performance, the founders said, and pet owners can provide in-the-moment feedback if the app gets it wrong. In 2021, MeowTalk researchers reported that the software could distinguish among nine intents with 90 percent accuracy overall. But the app was better at identifying some than others, not infrequently confusing "happy" and "pain," according to the results. Dogs could soon have their own day. Zoolingua, a start-up based in Arizona, is hoping to create an A.I.-powered dog translator that will analyze canine vocalizations and body language. Still, even sophisticated algorithms may miss critical real-world context and cues, said Alexandra Horowitz, an expert on dog cognition at Barnard College. For instance, much of canine behavior is driven by scent. "How is that going to be translated, when we don't know the extent of it ourselves?" Dr. Horowitz said in an email.
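    An intent assignment like MeowTalk's can be imagined as a classifier over acoustic features. The sketch below uses a nearest-centroid rule with invented pitch/duration numbers, since the app's actual model and feature set are not public:

```python
import math

# Hypothetical acoustic centroids (mean pitch in Hz, mean duration in s)
# per intent. These values are invented for illustration only.
CENTROIDS = {
    "happy":       (600.0, 0.5),
    "hunting":     (900.0, 0.2),
    "mating call": (700.0, 1.5),
}

def classify(features):
    """Assign the intent whose centroid is nearest to the utterance features."""
    pitch, duration = features
    def dist(c):
        return math.hypot(pitch - c[0], duration - c[1])
    return min(CENTROIDS, key=lambda k: dist(CENTROIDS[k]))
```

    A production system would use many more features and a trained model, plus a confidence threshold so ambiguous utterances (the "happy" vs. "pain" confusions the researchers reported) are not over-claimed.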
Lara Cowell

Meet Michael Running Wolf, the man using AI to reclaim Native languages - 1 views

  •  
    Imagine putting on a virtual reality headset and entering a world where you can explore communities, like Missoula, except your character, and everyone you interact with, speaks Salish, Cheyenne or Blackfoot. Imagine having a device like Amazon's Alexa that understands and speaks exclusively in Indigenous languages. Or imagine a digital language playground in Facebook's Metaverse, where programmers create interactive games to enhance Indigenous language learning. Michael Running Wolf, a Northern Cheyenne man who is earning his Ph.D. in computer science, wants to make these dreams a reality. Running Wolf grew up in Birney, a town with a population of 150 just south of the Northern Cheyenne Reservation. He spent most of his childhood living without electricity. Running Wolf can speak some Cheyenne, but he wants Indigenous language learning to be more accessible, immersive and engaging. And he believes artificial intelligence is the solution. Running Wolf is one of a handful of researchers worldwide who are studying Indigenous languages and AI. He works with a small team of linguists and data scientists, and together, they analyze Indigenous languages and work to translate them into something a computer can interpret. If his team can accomplish this, Running Wolf reasons, then perhaps AI can be used to help revitalize Indigenous languages everywhere.
emilydaehler24

How AI is decoding the animal kingdom - 0 views

  •  
    This article covers the complexities of animal communication, such as how elephants use low-frequency sounds to stay in touch with one another. Artificial intelligence can help researchers build algorithms capable of detecting animal calls, grumbles, grunts, squeaks, and other vocalizations and translating them into terms humans can readily interpret.
Ryan Catalani

PLoS ONE: Why Um Helps Auditory Word Recognition: The Temporal Delay Hypothesis - 2 views

  •  
    "Our main conclusion is that delays in word onset facilitate word recognition, and that such facilitation is independent of the type of delay. ... Our findings support the perhaps counterintuitive conclusion that fillers like um can sometimes help (rather than hinder) listeners to identify spoken words. But critically, the data show that the same is true for silent pauses and pauses filled with artificially generated tones. "
Lara Cowell

If Your Shrink is a Bot, How Do You Respond? - 1 views

  •  
    An interesting story--my students, you might recall Sherry Turkle of MIT referencing robot therapists in her TED talk. USC has developed a robot therapist, Ellie, designed to talk to people who are struggling emotionally, and to take their measure in a way no human can. Originally developed to work with military PTSD patients, Ellie's purpose is to gather information and provide real human therapists detailed analysis of patients' movements and vocal features, in order to give new insights into people struggling with emotional issues. The body, face and voice express things that words sometimes obscure. Ellie's makers believe that her ability to do this will ultimately revolutionize American mental health care.
Ryan Catalani

Persuasive speech: The way we, um, talk sways our listeners - 3 views

  •  
    ""Interviewers who spoke moderately fast, at a rate of about 3.5 words per second, were much more successful at getting people to agree than either interviewers who talked very fast or very slowly," said Jose Benki... variation in pitch could be helpful for some interviewers but for others, too much pitch variation sounds artificial, like people are trying too hard. ... "People who pause too much are seen as disfluent. But it was interesting that even the most disfluent interviewers had higher success rates than those who were perfectly fluent.""
Lara Cowell

Teaching a Machine to Have a Conversation Isn't Easy | Adobe Blog - 0 views

  •  
    Voice-based interactions require sophisticated programming and AI to enable a machine to understand and talk to you. In other words, it's a really big deal to program a computer to have a conversation. Walter says the challenge now is to process massive amounts of dialogue-related data to facilitate human-like intellectual and speech functionality in machines. "We are developing the technology that will give machines the cognitive ability to understand the semantics and context of a conversation, respond to topic changes and, in essence, carry on everything from a complex conversation to small talk."
Lara Cowell

The Fantastical Rise of Invented Languages | The New Republic - 0 views

  •  
    This article documents the subculture of conlangers. "Conlang," short for "constructed language," denotes a deliberately invented language, and there are many of them, of various sorts. International auxiliary languages like Volapük, Esperanto, or Interlingua are one specific type of conlang. Invented to facilitate international communication during the great techno-utopian-modernist thought-boom of the last two centuries, they never got terribly popular. Conlangs do not necessarily have to be useful. As David Peterson explains in his new book _The Art of Language Invention_, conlanging is an art as well as a science, something you might do for your own pleasure, as well as for the entertainment of others.
Lara Cowell

Looking for a Choice of Voices in A.I. Technology - 0 views

  •  
    Choosing a voice has implications for design, branding or interacting with machines. A voice can change or harden how we see each other. Research suggests that users prefer a younger, female voice for their digital personal assistant. We don't just need that computerized voice to meet our expectations, said Justine Cassell, a professor at Carnegie Mellon's Human-Computer Interaction Institute. We need computers to relate to us and put us at ease when performing a task. "We have to know that the other is enough like us that it will run our program correctly," she said. That need seems to start young. Ms. Cassell has designed an avatar of indeterminate race and gender for 5-year-olds. "The girls think it's a girl, and the boys think it's a boy," she said. "Children of color think it's of color, Caucasians think it's Caucasian." Another system Cassell built spoke in what she termed "vernacular" to African-American children, achieving better results in teaching scientific concepts than when the computer spoke in standard English. When tutoring the children in a class presentation, however, "we wanted it to practice with them in 'proper English.' Standard American English is still the code of power, so we needed to develop an agent that would train them in code switching," she said. And, of course, there are regional issues to consider when creating a robotic voice. Many companies, such as Apple, have tweaked robotic voices for localized accents and jokes.
Lara Cowell

Finding A Pedicure In China, Using Cutting-Edge Translation Apps - 0 views

  •  
    A traveling journalist in Beijing utilizes both Baidu (China's version of Google) and Google voice-translation apps with mixed results. You speak into the apps, they listen and then translate into the language you choose. They do it in writing, by displaying text on the screen as you talk; and out loud, by using your phone's speaker to narrate what you've said once you're done talking. Typically exchanges are brief: 3-4 turns on average for Google, 7-8 for Baidu's translate app. Both Google and Baidu use machine learning to power their translation technology. While a human linguist could dictate all the rules for going from one language to another, that would be tedious, and yield poor results because a lot of languages aren't structured in parallel form. So instead, both companies have moved to pattern recognition through "neural machine translation." They take a mountain of data - really good translations - and load it into their computers. Algorithms then mine through the data to look for patterns. The end product is translation that's not just phrase-by-phrase, but entire thoughts and sentences at a time. Not surprisingly, sometimes translations are successes, and other times, epic fails. Why? As Macduff Hughes, a Google executive, notes, "there's a lot more to translation than mapping one word to another. The cultural understanding is something that's hard to fully capture just in translation."
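    The article's point that mapping one word to another yields poor results can be seen in a toy example. The dictionary and glosses below are illustrative only, not how Google or Baidu actually translate:

```python
# A toy word-for-word lookup. The output keeps English word order,
# which many target languages (here, German) do not share.
EN_TO_DE = {
    "i": "ich",
    "have": "habe",
    "seen": "gesehen",
    "the": "den",
    "dog": "Hund",
}

def word_for_word(sentence):
    """Replace each English word with its gloss, preserving word order."""
    return " ".join(EN_TO_DE.get(w, w) for w in sentence.lower().split())
```

    German puts the participle last ("ich habe den Hund gesehen"), so the word-for-word output "ich habe gesehen den Hund" preserves English order and comes out wrong - exactly the gap neural machine translation closes by modeling whole sentences instead of isolated words.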
aching17

AI Might Soon Allow Us to Translate the Mysterious Dolphin Language - 0 views

  •  
    This article explains how close we are getting to understanding dolphin language. Dolphin communication consists of sentences, similar to ours, in which the order of the sounds determines the meaning. Dolphins also apparently take turns talking, as we do. What scientists need to do now is figure out which meaning matches each sound, and that is where artificial intelligence comes in: researchers will use AI to map dolphin sounds to meanings, working with bottlenose dolphins at a wildlife park south of Stockholm.
Lara Cowell

9 Tips to Design Conversational Style for Your Bot - 0 views

  •  
    Interesting article re: designing an efficient and responsive bot: leveraging key human linguistic features so the machine can "converse" with humans, comprehend their needs, and respond appropriately. Programming a machine to exhibit the conversational nuances and sophisticated comprehension of a normal human is hard.
nataliekaku22

Hashtags may not be words, grammatically speaking, but they help spread a message - 0 views

  •  
    This article discusses the different arguments for the linguistic status of hashtags. One argument is that they are like compound words: words formed by combining two existing words into one (e.g. notebook, living room, or long-term). Another suggestion is that hashtagging is a less formal and completely new process of forming words, with no rules other than that there can be no spaces between the parts. The authors argue that their research goes against both positions: hashtags shouldn't be considered words at all, but they are still very interesting linguistically because they fill many different roles in language use on social media.
  •  
    This article argues that hashtags are artificial words, based on the researchers' analysis of a collection of millions of New Zealand English tweets. Hashtags are a widespread feature of social media posts and used widely in search engines. Anything with the intent of attracting attention comes with a memorable hashtag like #BlackLivesMatter, #MeToo, and #COVID19. There are two main theories regarding the linguistic status of hashtags. One claims hashtags are like compound words: a way of making new words by gluing two or more words together. Another claims that hashtags are words that arise from a completely different process, since hashtagging is a much looser word-formation process with fewer restrictions. However, these researchers argue against both these conjectures. They suggest hashtags are written to look orthographically like words, but their function is much broader and similar to keywords in a library catalogue or search engine. The researchers also coined their own term, hybrid hashtags, meaning hashtags comprising one or more words from two distinct languages. Their examples of hybrid hashtags included #kiaora4that and #letssharegoodtereostories, which combine English and Maori, the indigenous language of New Zealand.
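    One way to detect the hybrid hashtags the researchers describe is to segment the hashtag body into known words and check whether more than one language contributes. The tiny lexicons below are illustrative stand-ins, not the researchers' actual data or method:

```python
# Toy lexicons for illustration; real systems would use full wordlists.
ENGLISH = {"lets", "share", "good", "stories", "that", "4"}
MAORI = {"kia", "ora", "tereo"}

def segment(tag, vocab):
    """Greedy longest-match segmentation of a hashtag body into known words."""
    body = tag.lstrip("#").lower()
    words, i = [], 0
    while i < len(body):
        for j in range(len(body), i, -1):  # try longest substring first
            if body[i:j] in vocab:
                words.append(body[i:j])
                i = j
                break
        else:
            return None  # some stretch matched no known word
    return words

def is_hybrid(tag):
    """True if the hashtag segments into words drawn from both languages."""
    words = segment(tag, ENGLISH | MAORI)
    if not words:
        return False
    return any(w in ENGLISH for w in words) and any(w in MAORI for w in words)
```

    Greedy longest-match is a simplification; ambiguous segmentations and out-of-vocabulary stretches are why hashtag analysis at tweet scale is harder than this sketch suggests.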