Home/ Future of the Web/ Group items matching "Watson" in title, tags, annotations or url
Paul Merrell

AT&T ups the ante in speech recognition | Signal Strength - CNET News - 2 views

  • It's developed a core technology platform, known as Watson, which is a cloud-based system of services that not only identifies words but interprets meaning and context to deliver more accurate results. The system itself is built on servers that model and compare speech to recorded voices. Watson is an evolving platform that, with more data, adapts and learns, so that it continues to improve accuracy and to cross-reference data, using speech as input for all kinds of communication and data. "We are really on the cusp of a technology revolution in speech and language technology," said Mazin Gilbert, executive director of speech and language technology at AT&T Labs. "It's no longer about simply trying to get the words right. It's about adding intelligence to interpret what is being said and then using that to apply to other modes of communication, such as text or video."
  • The system is designed to get more accurate over time as it learns the speech patterns of large numbers of users.
Paul Merrell

High Court Rules UK's Surveillance Powers Violate Human Rights - 0 views

  • The UK's High Court found the rushed Data Retention and Investigatory Powers Act (DRIPA) to be illegal under the European Convention on Human Rights and the EU Charter of Fundamental Rights, both of which require respect for private and family life, with the latter also requiring protection of personal data. DRIPA was challenged by two members of Parliament (MPs), Labour's Tom Watson and the Conservative David Davis, who argued that the surveillance of communications wasn't limited to serious crimes, that individual notices for data collection were kept secret, and that no provision existed to protect those who need professional confidentiality, such as lawyers and journalists. DRIPA was pushed through in three days last year after the European Court of Justice ruled that the EU data retention powers were disproportionate, which invalidated the previous data retention law in the UK. The UK High Court also ruled that sections 1 and 2 of DRIPA were unlawful because they fail to provide precise policies to ensure that data is only accessed for the purpose of investigating serious crimes. Another major point against DRIPA was that it didn't require judicial approval, which would limit access to only the data strictly necessary for investigations.
  • DRIPA passed in only three days, but the Court allowed it to continue for another nine months to give the UK government enough time to draft new legislation. Although this almost doubles the time in which this law will exist, it might be better in the long term, as it gives the members of Parliament enough time to debate its successor without having to rush yet another law for fear that the government's surveillance powers will expire. This court ruling arrived at the right time, as the UK government is currently preparing the draft of the Investigatory Powers Bill (called the Snooper's Charter by many), which further expands the government's surveillance powers and may even require encryption backdoors. It also joins other recent reviews of the government's surveillance laws that called for much stricter oversight by judges rather than by the government's own members. "Campaigners, MPs across the political spectrum, the Government's own reviewer of terrorism legislation are all calling for judicial oversight and clearer safeguards," said James Welch, Legal Director for Liberty, a human rights organization.
  • The Dark State takes another hit.
Paul Merrell

Deep Fakes: A Looming Crisis for National Security, Democracy and Privacy? - Lawfare - 1 views

  • “We are truly fucked.” That was Motherboard’s spot-on reaction to deep fake sex videos (realistic-looking videos that swap a person’s face into sex scenes actually involving other people). And that sleazy application is just the tip of the iceberg. As Julian Sanchez tweeted, “The prospect of any Internet rando being able to swap anyone’s face into porn is incredibly creepy. But my first thought is that we have not even scratched the surface of how bad ‘fake news’ is going to get.” Indeed. Recent events amply demonstrate that false claims—even preposterous ones—can be peddled with unprecedented success today thanks to a combination of social media ubiquity and virality, cognitive biases, filter bubbles, and group polarization. The resulting harms are significant for individuals, businesses, and democracy. Belated recognition of the problem has spurred a variety of efforts to address this most recent illustration of truth decay, and at first blush there seems to be reason for optimism. Alas, the problem may soon take a significant turn for the worse thanks to deep fakes. Get used to hearing that phrase. It refers to digital manipulation of sound, images, or video to impersonate someone or make it appear that a person did something—and to do so in a manner that is increasingly realistic, to the point that the unaided observer cannot detect the fake. Think of it as a destructive variation of the Turing test: imitation designed to mislead and deceive rather than to emulate and iterate.
  • Fueled by artificial intelligence, digital impersonation is on the rise. Machine-learning algorithms (often neural networks) combined with facial-mapping software enable the cheap and easy fabrication of content that hijacks one’s identity—voice, face, body. Deep fake technology inserts individuals’ faces into videos without their permission. The result is “believable videos of people doing and saying things they never did.” Not surprisingly, this concept has been quickly leveraged to sleazy ends. The latest craze is fake sex videos featuring celebrities like Gal Gadot and Emma Watson. Although the sex scenes look realistic, they are not consensual cyber porn. Conscripting individuals (more often women) into fake porn undermines their agency, reduces them to sexual objects, engenders feelings of embarrassment and shame, and inflicts reputational harm that can devastate careers (especially for everyday people). Regrettably, cyber stalkers are sure to use fake sex videos to torment victims. What comes next? We can expect to see deep fakes used in other abusive, individually targeted ways, such as undermining a rival’s relationship with fake evidence of an affair, or an enemy’s career with fake evidence of a racist comment.