Google Glass May Be Hands-Free, But Not Brain-Free - NYTimes.com
-
The “eyes-free” goal addresses an obvious limitation of the human brain: we can’t look away from where we’re heading for more than a few seconds without losing our bearings. And time spent looking at a cellphone is time spent oblivious to the world, as shown in the viral videos of distracted phone users who stumble into shopping-mall fountains. Most people intuitively grasp the “two-second rule.”
-
Researchers at the Virginia Tech Transportation Institute outfitted cars and trucks with cameras and sensors to monitor real-world driving behavior. When drivers were texting, they tended to look away for as much as 4.6 seconds during a 6-second period. In effect, people lose track of time when texting, leading them to look at their phones far longer than they know they should.
-
Heads-up displays like Google Glass, and voice interfaces like Siri, seem like ideal solutions, letting you interact with your smartphone while staying alert to your surroundings.
-
The problem is that looking is not the same as seeing, and people make wrong assumptions about what will grab their attention.
-
About 70 percent of Americans believe that “people will notice when something unexpected enters their field of view, even when they’re paying attention to something else.”
-
Research on “inattentional blindness” shows that what we see depends not just on where we look but also on how we focus our attention.