Group items tagged Human-Machine

Neil Movold

Data and the human-machine connection - O'Reilly Radar - 1 views

  • Our company is a science-oriented company, and the core belief is that behavior - human or otherwise - can be mathematically expressed. Yes, people make irrational value judgments, but they are driven by common motivation factors, and the math expresses that. I look at the so-called "big data phenomenon" as the instantiation of human experience. Previously, we could not quantitatively measure human experience, because the data wasn't being captured. But Twitter recently announced that they now serve 350 billion tweets a day. What we say and what we do has a physical manifestation now. Once there is a physical manifestation of a phenomenon, then it can be mathematically expressed. And if you can express it, then you can shape business ideas around it, whether that's in government or health care or business.
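To make the "behavior can be mathematically expressed" claim concrete, here is a minimal sketch of one standard formalism, a multinomial logit choice model, in which common motivation factors are weighted into utilities and utilities into choice probabilities. The factors, weights, and options are invented for illustration; the excerpt names no specific model.

```python
import numpy as np

# Minimal sketch of "behavior expressed mathematically": a multinomial
# logit choice model. The motivation factors, weights, and options are
# invented for illustration; the excerpt names no specific model.

def choice_probabilities(utilities):
    """Softmax: turn option utilities into choice probabilities."""
    exp_u = np.exp(utilities - utilities.max())  # subtract max for numerical stability
    return exp_u / exp_u.sum()

# Hypothetical motivation factors (price sensitivity, convenience, habit)
# scored for three options, and weights expressing how strongly each
# factor drives the decision.
factors = np.array([[0.2, 0.9, 0.5],   # option A
                    [0.8, 0.4, 0.1],   # option B
                    [0.5, 0.5, 0.9]])  # option C
weights = np.array([0.5, 0.3, 0.2])

utilities = factors @ weights
print(choice_probabilities(utilities))  # roughly [0.31, 0.34, 0.35]
```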
Neil Movold

The Internet of Things and the cloud - 0 views

  • We are in the early stages of the Internet of Things, the much-anticipated era when all manner of devices can talk to each other and to intermediary services. But for this era to achieve its full potential, operators must fundamentally change the way they build and run clouds. Why? Machine-to-machine (M2M) interactions are far less failure-tolerant than machine-to-human interactions.
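One practical consequence of that low failure tolerance: with no human in the loop to notice an error and click retry, M2M clients and the clouds behind them must recover automatically. A minimal sketch, assuming a hypothetical `read_sensor` call and endpoint, of the standard retry-with-exponential-backoff-and-jitter pattern:

```python
import random
import time

# Sketch only: because no human is present to notice a failure and retry,
# M2M clients must recover automatically. One standard pattern is retry
# with exponential backoff and jitter. `read_sensor` and its endpoint are
# hypothetical stand-ins for a real device call.

def read_sensor(endpoint):
    if random.random() < 0.5:
        raise ConnectionError("simulated transient failure")
    return {"endpoint": endpoint, "value": 42}

def call_with_backoff(endpoint, attempts=5, base_delay=0.1):
    for attempt in range(attempts):
        try:
            return read_sensor(endpoint)
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            # Randomized backoff prevents thousands of devices from
            # retrying in lockstep after a shared outage.
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))

try:
    print(call_with_backoff("https://example.invalid/device/42"))
except ConnectionError:
    print("gave up after retries")
```

The jitter matters at IoT scale: if thousands of devices retry on the same schedule after an outage, the synchronized wave of requests can knock the service over again.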
Neil Movold

Role and Use of Ontologies in the Open Semantic Framework - 0 views

  • Ontologies are to the Open Semantic Framework what the human was to the Mechanical Turk. The hidden human inside the Mechanical Turk orchestrated each and every chess move, yet to observers the automaton appeared to be exactly what it presented itself as: a new kind of intelligent machine. That was in 1770.
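The analogy is that the ontology, like the Turk's hidden human, quietly drives everything the system appears to do on its own. A minimal sketch of that pattern using rdflib and a tiny invented vocabulary (not an actual Open Semantic Framework ontology): the code hard-codes no categories and simply acts on whatever the ontology declares.

```python
from rdflib import Graph, RDFS

# Minimal sketch of "the ontology drives the application": the code below
# hard-codes no categories and simply walks whatever class tree the
# ontology declares. The tiny vocabulary is invented for illustration and
# is not an actual Open Semantic Framework ontology.
TTL = """
@prefix ex:   <http://example.org/onto#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
ex:Document rdfs:label "Document" .
ex:Report   rdfs:label "Report" ; rdfs:subClassOf ex:Document .
ex:Memo     rdfs:label "Memo" ;   rdfs:subClassOf ex:Document .
"""

g = Graph()
g.parse(data=TTL, format="turtle")

# The structure printed here comes entirely from the ontology, the way
# the Turk's moves came entirely from the hidden human.
for child, _, parent in g.triples((None, RDFS.subClassOf, None)):
    print(f"{g.value(child, RDFS.label)} is a kind of {g.value(parent, RDFS.label)}")
```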
Neil Movold

Computer Scientist leads the way to the next revolution in artificial intelligence - 0 views

  • AMHERST, Mass. - As computer scientists this year celebrate the 100th anniversary of the birth of the mathematical genius Alan Turing, who in the 1930s set out the basis for digital computing, anticipating the electronic age, they still quest after a machine as adaptable and intelligent as the human brain. Now, computer scientist Hava Siegelmann of the University of Massachusetts Amherst, an expert in neural networks, has taken Turing's work to its next logical step. She is translating her 1993 discovery of what she has dubbed "Super-Turing" computation into an adaptable computational system that learns and evolves, using input from the environment in a way much more like our brains do than classic Turing-type computers. She and her post-doctoral research colleague Jeremie Cabessa report on the advance in the current issue of Neural Computation. "This model is inspired by the brain," she says. "It is a mathematical formulation of the brain's neural networks with their adaptive abilities." The authors show that when the model is installed in an environment offering constant sensory stimuli, like the real world, and when all stimulus-response pairs are considered over the machine's lifetime, the Super-Turing model yields an exponentially greater repertoire of behaviors than the classical computer or Turing model. They demonstrate that the Super-Turing model is superior for human-like tasks and learning.
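The formal construction lives in the paper; as a loose intuition for "learns and evolves, using input from the environment" over a lifetime, here is a toy recurrent network with a Hebbian-style weight update. It illustrates plasticity only and is not the authors' Super-Turing model.

```python
import numpy as np

# Toy sketch of "learns and evolves using input from the environment": a
# small recurrent network whose weights keep changing over its lifetime
# via a Hebbian-style update with decay. This illustrates weight
# plasticity only; it is not the formal Super-Turing construction from
# the Neural Computation paper.
rng = np.random.default_rng(0)
n = 8
W = rng.normal(scale=0.3, size=(n, n))       # recurrent weights, free to evolve
state = np.zeros(n)

for step in range(1000):                      # the machine's "lifetime"
    stimulus = rng.normal(scale=0.5, size=n)  # constant sensory input
    state = np.tanh(W @ state + stimulus)
    # Weights are shaped by the stimulus-response pairs the machine
    # actually experiences, rather than fixed at programming time.
    W += 0.01 * np.outer(state, state) - 0.001 * W

print("weights after a lifetime of input:", np.round(W[0, :4], 3))
```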
Neil Movold

Cognitive Computing: When Computers Become Brains - 0 views

  • The human brain integrates memory and processing together, weighs less than 3 lbs, occupies about a two-liter volume, and uses less power than a light bulb.  It operates as a massively parallel distributed processor.  It is event driven, that is, it reacts to things in its environment, uses little power when active and even less while resting.  It is a reconfigurable, fault-tolerant learning system.  It is excellent at pattern recognition and teasing out relationships.
  • A computer, on the other hand, has separate memory and processing.  It does its work sequentially for the most part and is run by a clock.  The clock, like a drum majorette in a military band, drives every instruction and piece of data to its next location — musical chairs with enough chairs.  As clock rates increase to drive data faster, power consumption goes up dramatically, and even at rest these machines need a lot of electricity.  More importantly, computers have to be programmed.  They are hard-wired and fault-prone.  They are good at executing defined algorithms and performing analytics.
  • Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE)
  • Cognitive computing, as the new field is called, takes computing concepts to a whole new level.  Earlier this week, Dharmendra Modha, who works at IBM's Almaden Research Center, regaled a roomful of analysts with what cognitive computing can do and how IBM is going about making a machine that thinks the way we do.  His own blog on the subject is here.
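A rough sketch of the event-driven contrast drawn in the bullets above: a leaky integrate-and-fire neuron does work only when an input event arrives, instead of being marched along by a global clock. All parameters are illustrative and unrelated to the actual SyNAPSE hardware.

```python
# Sketch of the event-driven style attributed to the brain above: a leaky
# integrate-and-fire neuron does work only when an input event arrives,
# instead of being driven by a global clock. All parameters are
# illustrative and unrelated to the actual SyNAPSE hardware.

def simulate(events, threshold=1.0, leak=0.95):
    """events: list of (time_step, input_strength) pairs, in time order."""
    potential, spikes, t_prev = 0.0, [], 0
    for t, strength in events:
        potential *= leak ** (t - t_prev)  # passive decay between events
        potential += strength              # integrate the incoming event
        t_prev = t
        if potential >= threshold:         # fire and reset
            spikes.append(t)
            potential = 0.0
    return spikes

print(simulate([(1, 0.6), (2, 0.3), (10, 0.9), (11, 0.4)]))  # -> [10]
```

Between events the neuron consumes no cycles at all, which is the "uses little power when active and even less while resting" property in miniature.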
Neil Movold

7 Amazing Websites To See The Latest In Artificial Intelligence Programming - 0 views

  • Artificial Intelligence is not yet HAL from 2001: A Space Odyssey…but we are getting awfully close. One day it may well resemble the sci-fi potboilers churned out by Hollywood, and if that's your idea of what artificial intelligence is all about, you aren't far off the mark. In layman's terms, artificial intelligence is about creating intelligent machines through the use of intelligent computer programs. Most, if not all, artificial intelligence (AI) tries to mimic human behavior. The scale of ambition varies, but artificial intelligence programming is a full-fledged field at the cutting edge of science today. If you have some interest in the world of tomorrow, check out these websites to grasp what artificial intelligence is all about.
Neil Movold

Welcome to the Era of Cognitive Systems - 0 views

  • Notice, I don’t use the term “thinking machines.” That’s because I don’t want to suggest that cognitive systems will think like humans do. Rather, they will help us think and make better decisions.
  • "Today, we are at the dawn of another epochal shift in the evolution of technology. At IBM Research, we call it the era of cognitive systems. This is a big deal. The changes that are coming over the next 10 to 20 years, building on IBM's Watson technology, will transform the way we live, work and learn, just as programmable computing has transformed the human landscape over the past 60+ years. You could even call this the post-computing era."
Neil Movold

New Ways of Thinking - Beyond Machines - 0 views

  • "For more than half a century, computers have been little better than calculators with storage structures and programmable memory, a model that scientists have continually aimed to improve. Comparatively, the human brain, the world's most sophisticated computer, can perform complex tasks rapidly and accurately using the same amount of energy as a 20-watt light bulb in a space equivalent to a two-liter soda bottle.

    Cognitive computing: thought for the future

    Making sense of real-time input flowing in at a dizzying rate is a Herculean task for today's computers, but would be natural for a brain-inspired system. Using advanced algorithms and silicon circuitry, cognitive computers learn through experiences, find correlations, create hypotheses, and remember, and learn from, the outcomes. For example, a cognitive computing system monitoring the world's water supply could contain a network of sensors and actuators that constantly record and report metrics such as temperature, pressure, wave height, acoustics and ocean tide, and issue tsunami warnings based on its decision making."
Neil Movold

No real Artificial Intelligence in the next 40 years - 0 views

  • The real issue is that we don’t understand how human intelligence and “consciousness” work.
  • We don’t know the principles behind it; we can superficially imitate it but we cannot build something like it, or better, for now. What we need is a “cognitive computing” model (a theory) before we can build machines around it.
  • Can computing and science fiction collide to create a true Artificial Intelligence? A.I. has been part of our computing landscape for a long time: first as an idea, then taking baby steps as things started to move in the early days of computers. After that came a period of disillusion, but with the rise of cloud computing and massively parallel consumer-level chips, A.I. is more than ever on our lips and in our minds - but how far are we really from the awakening of a digital form of consciousness?