
Home/ Words R Us/ Group items tagged machine


daralynwen19

This is your brain on communication - 1 views

  •  
    There are two neural mechanisms that scientists believe enable us to communicate. One is that the sound waves produced by a speaker affect how the listener's brain responds, mirroring the way the speaker's own brain is responding. The other is that human brains have developed a common neural behavior that makes our brains respond in the same pattern, allowing us to share information through it. In one experiment discussed in this article, subjects were scanned by fMRI machines that monitored the part of the brain that processes sound waves arriving from the ear. The subjects were monitored while at rest, while telling a story, and while listening to a story. The article then discusses those results alongside the results of similar experiments.
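    The "coupling" idea in the annotation above can be pictured as correlation between two brain-signal time series. A toy sketch, with entirely invented numbers (real studies correlate fMRI voxel time courses):

    ```python
    # Toy sketch: "speaker-listener neural coupling" as the Pearson correlation
    # between two signal time series. All data below is invented for illustration.

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    speaker  = [0.1, 0.5, 0.9, 0.4, 0.2, 0.8]   # speaker's auditory-region signal
    listener = [0.2, 0.6, 0.8, 0.5, 0.1, 0.7]   # listener's signal while hearing the story
    rest     = [0.9, 0.1, 0.4, 0.9, 0.3, 0.2]   # unrelated signal recorded at rest

    coupled   = pearson(speaker, listener)   # high: the two brains track each other
    uncoupled = pearson(speaker, rest)       # lower: no shared stimulus

    print(coupled, uncoupled)
    ```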
Lara Cowell

Alexa vs. Siri vs. Google: Which Can Carry on a Conversation Best? - 1 views

  •  
    Just in case you were under the misimpression that artificial intelligence will be taking over the world shortly, this article suggests that digital assistants really can't even handle the sort of everyday linguistic interaction that humans take for granted. Still, it is interesting to find out how product engineers are designing the assistants to become "smarter" at comprehending your words and requests. Machine learning algorithms can help devices deal with turn-by-turn exchanges. But each verbal exchange is limited to a simple, three- or four-turn conversation.
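    The three- or four-turn ceiling described above can be sketched as a tiny slot-filling loop that simply forgets its context once the window is exhausted. Everything here (rules, replies, the cap) is invented for illustration and models no real product's API:

    ```python
    # Hypothetical sketch of a turn-limited exchange: the assistant remembers a
    # few prior turns, then wipes its context, mirroring the short-conversation
    # ceiling the article describes.

    MAX_TURNS = 4

    class MiniAssistant:
        def __init__(self):
            self.context = []          # prior utterances the assistant can "remember"

        def reply(self, utterance):
            self.context.append(utterance)
            if len(self.context) > MAX_TURNS:
                self.context = []      # context window exhausted: start over
                return "Sorry, can you start again?"
            # canned "understanding" keyed on the latest utterance
            if "weather" in utterance:
                return "It's sunny today."
            if "tomorrow" in utterance and any("weather" in u for u in self.context[:-1]):
                return "Tomorrow looks rainy."   # follow-up resolved from context
            return "I didn't catch that."

    bot = MiniAssistant()
    print(bot.reply("what's the weather"))
    print(bot.reply("and tomorrow?"))   # only works because the prior turn is in context
    ```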
Lara Cowell

Looking for a Choice of Voices in A.I. Technology - 0 views

  •  
    Choosing a voice has implications for design, branding, and how we interact with machines. A voice can change or harden how we see each other. Research suggests that users prefer a younger, female voice for their digital personal assistant. "We don't just need that computerized voice to meet our expectations," said Justine Cassell, a professor at Carnegie Mellon's Human-Computer Interaction Institute. "We need computers to relate to us and put us at ease when performing a task." "We have to know that the other is enough like us that it will run our program correctly," she said. That need seems to start young. Ms. Cassell has designed an avatar of indeterminate race and gender for 5-year-olds. "The girls think it's a girl, and the boys think it's a boy," she said. "Children of color think it's of color, Caucasians think it's Caucasian." Another system Cassell built spoke in what she termed "vernacular" to African-American children, achieving better results in teaching scientific concepts than when the computer spoke in standard English. When tutoring the children in a class presentation, however, "we wanted it to practice with them in 'proper English.' Standard American English is still the code of power, so we needed to develop an agent that would train them in code switching," she said. And, of course, there are regional issues to consider when creating a robotic voice. Many companies, such as Apple, have tweaked robotic voices for localized accents and jokes.
Parker Tuttle

Medical scanner to reveal how brain processes languages - 3 views

  •  
    A state-of-the-art medical scanner will help scientists unveil the secrets of how the brain processes languages. The magneto-encephalography machine, which was unveiled yesterday at the inauguration of New York University Abu Dhabi's Neuroscience of Language Laboratory, will be able to analyse language processes in the brain faster and more efficiently than current neuroscience technology.
kianakomeiji22

How exactly does Google Translate produce results? - 0 views

  •  
    This article discusses how Google Translate functions. Google Translate is a relatively accurate and easy-to-use translator. At first, the system required millions of human-generated translations of texts to identify patterns and produce a reasonably accurate translation. During this early period, the translator also used English as an intermediary language: texts were translated into English and then from English into the target language. The translator was decent at translating short excerpts, but as texts got longer, the quality of the translations declined. In 2016, Google announced it was shifting to a neural network machine learning process, which attempts to look at the full context of a text to eliminate discrepancies in translations. This way, instead of relying on an intermediary language, the system can translate directly from one language to another.
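    The English-pivot strategy described above can be sketched in a few lines. The word lists are invented stand-ins; the real system worked over learned phrase statistics, not one-word dictionaries:

    ```python
    # Toy illustration of "pivot" translation through English, the early
    # Google Translate strategy. Errors compound at each hop, one reason
    # longer texts degraded before the direct neural approach.

    FR_TO_EN = {"chat": "cat", "noir": "black"}   # invented mini-lexicons
    EN_TO_ES = {"cat": "gato", "black": "negro"}

    def pivot_translate(words, src_to_en, en_to_tgt):
        english = [src_to_en.get(w, w) for w in words]   # source -> English
        return [en_to_tgt.get(w, w) for w in english]    # English -> target

    print(pivot_translate(["chat", "noir"], FR_TO_EN, EN_TO_ES))
    ```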
Lara Cowell

Computing for deaf people - The race to teach sign language to computers | Science &amp... - 3 views

  •  
    The World Health Organisation counts 430m people as deaf or hard of hearing. Many use sign languages to communicate. If they cannot also use those languages to talk to computers, they risk being excluded from the digitisation that is taking over everyday life. Sign language poses particular challenges for translation into either text or speech. These include improving the machine-learning algorithms that recognise signs and their meaning, and developing the best methods to interpret sign languages' distinctive grammars. Applications of such technology could improve the lives of the deaf, for example by allowing them to use their cell phones to search for directions or look up the meanings of unknown signs, without resorting to the written form of a spoken language.
Lara Cowell

Is language the ultimate frontier of AI research? | Stanford School of Engineering - 0 views

  •  
    Learning the intricacies of human languages is hard even for human children and non-native speakers - but it's particularly difficult for AI. Scientists have already taught computers how to do simple tasks, like translating one language to another or searching for keywords. Artificial intelligence has gotten better at solving these narrow problems. But now scientists are tackling harder problems, like how to build AI algorithms that can piece together bits of information to give a coherent answer for more complicated, nuanced questions. "Language is the ultimate frontier of AI research because you can express any thought or idea in language," states Stanford computer science professor Yoav Shoham. "It's as rich as human thinking." For Shoham, the excitement about artificial intelligence lies not only in what it can do - but also in what it can't. "It's not just mimicking the human brain in silicon, but asking what traits are so innately human that we don't think we can emulate them on a computer," Shoham said. "Our creativity, fairness, emotions, all the stuff we take for granted - machines can't even come close."
Lara Cowell

Natural Language Processing ft. Siri - 0 views

  •  
    Siri uses a variety of advanced machine learning technologies to be able to understand your command and return a response - primarily natural language processing (NLP) and speech recognition. NLP primarily focuses on allowing computers to understand and communicate in human language. In terms of programming, languages are split up into three categories - syntax, semantics, and pragmatics. Whereas syntax describes the structure and composition of the phrases, semantics provide meaning for the syntactic elements. Pragmatics, on the other hand, refers to the composition and context in which the phrase is used.
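    The three levels named above can be sketched as three toy stages applied to one command. The rules here are invented for illustration; the pipelines behind a real assistant are far richer:

    ```python
    # Minimal sketch of syntax / semantics / pragmatics as three stages.

    def syntax(utterance):
        # Syntax: structure and composition - here, just tokenization.
        return utterance.lower().rstrip("?.!").split()

    def semantics(tokens):
        # Semantics: map syntactic elements to meaning (a verb + argument frame).
        verbs = {"set", "play", "call"}
        verb = next((t for t in tokens if t in verbs), None)
        return {"action": verb, "args": [t for t in tokens if t != verb]}

    def pragmatics(frame, context):
        # Pragmatics: interpret the frame in context - here, context
        # supplies a duration the utterance itself left out.
        if frame["action"] == "set" and "timer" in frame["args"]:
            frame.setdefault("duration", context.get("usual_timer", "10 minutes"))
        return frame

    frame = semantics(syntax("Set a timer"))
    result = pragmatics(frame, {"usual_timer": "3 minutes"})
    print(result)
    ```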
keonsagara23

The Chinese Room Argument (Stanford Encyclopedia of Philosophy) - 0 views

  •  
    This post discusses the Chinese Room thought experiment, which examines whether or not computers are actually able to understand us and our words, or if they're just stringing together certain words by association.
Lara Cowell

Did My Cat Just Hit On Me? An Adventure in Pet Translation - 0 views

  •  
    The urge to converse with animals is age-old, long predating the time when smartphones became our best friends. A new app, MeowTalk, is the product of a growing interest in enlisting additional intelligences - machine-learning algorithms - to decode animal communication. The app detects and analyzes cat utterances in real time, assigning each one a broadly defined "intent," such as happy, resting, hunting, or "mating call." It then displays a conversational, plain-English "translation" of whatever intent it detects. MeowTalk uses the sounds it collects to refine its algorithms and improve its performance, the founders said, and pet owners can provide in-the-moment feedback if the app gets it wrong. In 2021, MeowTalk researchers reported that the software could distinguish among nine intents with 90 percent accuracy overall. But the app was better at identifying some than others, not infrequently confusing "happy" and "pain," according to the results. Dogs could soon have their own day. Zoolingua, a start-up based in Arizona, is hoping to create an A.I.-powered dog translator that will analyze canine vocalizations and body language. Still, even sophisticated algorithms may miss critical real-world context and cues, said Alexandra Horowitz, an expert on dog cognition at Barnard College. For instance, much of canine behavior is driven by scent. "How is that going to be translated, when we don't know the extent of it ourselves?" Dr. Horowitz said in an email.
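    The two-step flow the annotation describes - classify an utterance into a broad intent, then render a plain-English "translation" - can be sketched with toy rules. The feature thresholds and phrases below are invented; the app's real models are learned from recorded cat audio:

    ```python
    # Hypothetical sketch: intent classification followed by a canned
    # "translation." Toy if/else rules stand in for a learned classifier.

    INTENT_PHRASES = {
        "happy": "I'm content!",
        "hunting": "I see prey!",
        "mating_call": "Hey there...",
        "pain": "Something hurts.",
    }

    def classify(duration_s, pitch_hz):
        if pitch_hz > 600:
            # high pitch: the "happy"/"pain" pair the study found easily confused
            return "pain" if duration_s > 1.0 else "happy"
        return "hunting" if duration_s < 0.3 else "mating_call"

    def translate(duration_s, pitch_hz):
        return INTENT_PHRASES[classify(duration_s, pitch_hz)]

    print(translate(0.5, 700))   # short, high-pitched utterance
    print(translate(2.0, 700))   # long, high-pitched utterance
    ```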