
KI-Network: Group items tagged "case"


Stephen Dale

Artificial intelligence is not as smart as you (or Elon Musk) think | TechCrunch

  • AI is (currently) very good at specialist, single tasks that require brute-force computational effort or trained algorithms, but we are a long way from developing generalised AI, which requires some form of unsupervised deep learning. There are many things that humans understand but that remain well beyond the reach of AI, and this will stay the case for many years to come - if it ever happens.
Stephen Dale

Blockchain beyond the hype: What is the strategic business value? | McKinsey & Company

  • "Companies can determine whether they should invest in blockchain by focusing on specific use cases and their market position."
Stephen Dale

You Will Lose Your Job to a Robot - and Sooner Than You Think - Mother Jones

  • I want to tell you straight off what this story is about: Sometime in the next 40 years, robots are going to take your job. I don't care what your job is. If you dig ditches, a robot will dig them better. If you're a magazine writer, a robot will write your articles better. If you're a doctor, IBM's Watson will no longer "assist" you in finding the right diagnosis from its database of millions of case studies and journal articles. It will just be a better doctor than you.
Stephen Dale

Data Bias Is Becoming A Massive Problem | Digital Tonto

  • Machines, even virtual ones, have biases. They are designed, necessarily, to favour some kinds of data over others. Unfortunately, we rarely question the judgments of mathematical models and, in many cases, their biases can pervade and distort operational reality, creating unintended consequences that are hard to undo.
Gary Colet

Why Facts Don't Change Our Minds - The New Yorker

  • In a study conducted at Yale, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cylinder locks. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. (Toilets, it turns out, are more complicated than they appear.) Sloman and Fernbach see this effect, which they call the “illusion of explanatory depth,” just about everywhere. People believe that they know way more than they actually do. What allows us to persist in this belief is other people. In the case of my toilet, someone else designed it so that I can operate it easily. This is something humans are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins. “One implication of the naturalness with which we divide cognitive labor,” they write, is that there’s “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.
kin wbs

Making the emotional case for change

  • "From an excerpt from a recent book by Chip Heath."