
Words R Us: Group items tagged "Siri"


Lara Cowell

Natural Language Processing ft. Siri - 0 views

  •  
    Siri uses a variety of advanced machine learning technologies to understand your command and return a response - primarily natural language processing (NLP) and speech recognition. NLP focuses on allowing computers to understand and communicate in human language. For processing purposes, language is analyzed at three levels - syntax, semantics, and pragmatics. Syntax describes the structure and composition of phrases, semantics assigns meaning to those syntactic elements, and pragmatics concerns the context in which a phrase is used.
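The three levels described above can be sketched as stages in a toy command handler. This is a minimal illustration, not how Siri actually works; every function name, the regex pattern, and the context dictionary are invented for the example.

```python
import re

def parse_syntax(utterance):
    """Syntax: check that the phrase matches an expected structure."""
    # Hypothetical pattern for commands like "set a timer for 10 minutes"
    m = re.match(r"set a timer for (\d+) (seconds|minutes|hours)", utterance.lower())
    return m.groups() if m else None

def interpret_semantics(parsed):
    """Semantics: map the syntactic elements to a meaning (seconds)."""
    amount, unit = parsed
    factor = {"seconds": 1, "minutes": 60, "hours": 3600}[unit]
    return int(amount) * factor

def apply_pragmatics(seconds, context):
    """Pragmatics: use conversational context to refine the intent."""
    # e.g. the user previously said "make it twice as long"
    return seconds * 2 if context.get("modifier") == "double" else seconds

parsed = parse_syntax("Set a timer for 10 minutes")
seconds = interpret_semantics(parsed)                      # 600
print(apply_pragmatics(seconds, {"modifier": "double"}))   # 1200
```

The point of the sketch is that each stage answers a different question: is the phrase well-formed, what does it denote, and what does the speaker actually want here.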
Lara Cowell

Alexa vs. Siri vs. Google: Which Can Carry on a Conversation Best? - 1 views

  •  
    Just in case you were under the misimpression that artificial intelligence will be taking over the world shortly, this article suggests that digital assistants can't yet handle even the sort of everyday linguistic interaction that humans take for granted. Still, it is interesting to find out how product engineers are designing the assistants to become "smarter" at comprehending your words and requests. Machine learning algorithms can help devices deal with turn-by-turn exchanges, but each verbal exchange is limited to a simple, three- or four-turn conversation.
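The three-or-four-turn limit the article mentions can be pictured with a toy session object that simply drops its context once the turn budget is spent. The class and its behavior are hypothetical, for illustration only.

```python
class DialogueSession:
    """Toy dialogue manager that keeps context for only a few turns,
    like the assistants described in the article."""
    MAX_TURNS = 4

    def __init__(self):
        self.history = []

    def exchange(self, user_utterance):
        if len(self.history) >= self.MAX_TURNS:
            self.history.clear()  # context is gone; the user must start over
            return "Sorry, can you start again?"
        self.history.append(user_utterance)
        return f"(turn {len(self.history)}) I heard: {user_utterance}"

session = DialogueSession()
for utterance in ["play jazz", "louder", "skip this song",
                  "who is this", "and what album is it from?"]:
    print(session.exchange(utterance))
```

The fifth request fails precisely because nothing ties it back to the earlier turns - the everyday conversational memory humans take for granted.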
ellisalang17

How Machines Learned to Speak the Human Language - 0 views

  •  
    This article explains how machines such as "Siri" and "Echo" are able to speak the human language. "Language technologies teach themselves, via a form of pattern-matching. For speech recognition, computers are fed sound files on the one hand, and human-written transcriptions on the other. The system learns to predict which sounds should result in what transcriptions."
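The pattern-matching idea in the quote - sound on one side, human transcriptions on the other - can be caricatured as nearest-neighbor lookup over acoustic feature vectors. Real recognizers use learned statistical models over far richer features; the vectors and words below are invented for illustration.

```python
import math

# Hypothetical training pairs: (acoustic feature vector, transcription)
training = [
    ((0.9, 0.1, 0.2), "hello"),
    ((0.1, 0.8, 0.3), "goodbye"),
    ((0.2, 0.2, 0.9), "thanks"),
]

def transcribe(features):
    """Predict a transcription by finding the closest known sound pattern."""
    _, text = min((math.dist(features, f), t) for f, t in training)
    return text

print(transcribe((0.85, 0.15, 0.25)))  # closest to the "hello" pattern
```

A system trained on enough such pairs "learns to predict which sounds should result in what transcriptions," as the article puts it, without anyone hand-coding the rules.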
kianakomeiji22

How computers are learning to understand language​ | Welcome to Bio-X - 0 views

  •  
    This article summarizes an interview with Christopher Manning, a Stanford professor of computer science and linguistics. He focuses on computational linguistics, also known as natural language processing: creating algorithms that allow computers to understand written and spoken language and then respond intelligently. This involves systems such as Siri, Alexa, and Google Voice. These systems represent advanced technology; however, they are still far from perfect. Manning notes that people will probably still be working on natural language processing in twenty years.
Lara Cowell

AI's Language Problem - 0 views

  •  
    This MIT Technology Review article notes that while Artificial Intelligence has experienced many sophisticated advances, one fundamental capability remains elusive: language. Systems like Siri and IBM's Watson can follow simple spoken or typed commands and answer basic questions, but they can't hold a conversation and have no real understanding of the words they use. In addition, humans, unlike machines, can learn very quickly from a relatively small amount of data and have a built-in ability to model the world in 3-D very efficiently. Programming machines to comprehend and generate language is a complex task, because the machines would need to mimic human learning, mental model building, and psychology. As MIT cognitive scientist Josh Tenenbaum states, "Language builds on other abilities that are probably more basic, that are present in young infants before they have language: perceiving the world visually, acting on our motor systems, understanding the physics of the world or other agents' goals." If he is right, it will be difficult to re-create language understanding in machines and AI systems without mimicking those more basic human capacities.