All Can Be Lost: The Risk of Putting Our Knowledge in the Hands of Machines - Nicholas ...
-
We rely on computers to fly our planes, find our cancers, design our buildings, audit our businesses. That's all well and good. But what happens when the computer fails?
-
On the evening of February 12, 2009, a Continental Connection commuter flight made its way through blustery weather between Newark, New Jersey, and Buffalo, New York.
-
The Q400 was well into its approach to the Buffalo airport, its landing gear down, its wing flaps out, when the pilot’s control yoke began to shudder noisily, a signal that the plane was losing lift and risked going into an aerodynamic stall. The autopilot disconnected, and the captain took over the controls. He reacted quickly, but he did precisely the wrong thing: he jerked back on the yoke, lifting the plane’s nose and reducing its airspeed, instead of pushing the yoke forward to gain velocity.
-
The crash, which killed all 49 people on board as well as one person on the ground, should never have happened.
-
The captain’s response to the stall warning, the investigators reported, “should have been automatic, but his improper flight control inputs were inconsistent with his training” and instead revealed “startle and confusion.”
-
Automation has become so sophisticated that on a typical passenger flight, a human pilot holds the controls for a grand total of just three minutes.
-
We humans have been handing off chores, both physical and mental, to tools since the invention of the lever, the wheel, and the counting bead.
-
And that, many aviation and automation experts have concluded, is a problem. Overuse of automation erodes pilots’ expertise and dulls their reflexes.
-
No one doubts that autopilot has contributed to improvements in flight safety over the years. It reduces pilot fatigue and provides advance warnings of problems, and it can keep a plane airborne should the crew become disabled. But the steady overall decline in plane crashes masks the recent arrival of “a spectacularly new type of accident.”
-
The experience of airlines should give us pause. It reveals that automation, for all its benefits, can take a toll on the performance and talents of those who rely on it. The implications go well beyond safety. Because automation alters how we act, how we learn, and what we know, it has an ethical dimension. The choices we make, or fail to make, about which tasks we hand off to machines shape our lives and the place we make for ourselves in the world.
-
What pilots spend a lot of time doing is monitoring screens and keying in data. They’ve become, it’s not much of an exaggeration to say, computer operators.
-
Examples of complacency and bias have been well documented in high-risk situations—on flight decks and battlefields, in factory control rooms—but recent studies suggest that the problems can bedevil anyone working with a computer.
-
That may leave the person operating the computer to play the role of a high-tech clerk—entering data, monitoring outputs, and watching for failures. Rather than opening new frontiers of thought and action, software ends up narrowing our focus.
-
A labor-saving device doesn’t just provide a substitute for some isolated component of a job or other activity. It alters the character of the entire task, including the roles, attitudes, and skills of the people taking part.
-
when we work with computers, we often fall victim to two cognitive ailments—complacency and bias—that can undercut our performance and lead to mistakes. Automation complacency occurs when a computer lulls us into a false sense of security. Confident that the machine will work flawlessly and handle any problem that crops up, we allow our attention to drift.
-
Automation bias occurs when we place too much faith in the accuracy of the information coming through our monitors. Our trust in the software becomes so strong that we ignore or discount other information sources, including our own eyes and ears.
-
Automation is different now. Computers can be programmed to perform complex activities in which a succession of tightly coordinated tasks is carried out through an evaluation of many variables. Many software programs take on intellectual work—observing and sensing, analyzing and judging, even making decisions—that until recently was considered the preserve of humans.
-
Automation turns us from actors into observers. Instead of manipulating the yoke, we watch the screen. That shift may make our lives easier, but it can also inhibit the development of expertise.
-
Since the late 1970s, psychologists have been documenting a phenomenon called the “generation effect.” It was first observed in studies of vocabulary, which revealed that people remember words much better when they actively call them to mind—when they generate them—than when they simply read them.
-
When you engage actively in a task, you set off intricate mental processes that allow you to retain more knowledge. You learn more and remember more. When you repeat the same task over a long period, your brain constructs specialized neural circuits dedicated to the activity.
-
What looks like instinct is hard-won skill, skill that requires exactly the kind of struggle that modern software seeks to alleviate.
-
In many businesses, managers and other professionals have come to depend on decision-support systems to analyze information and suggest courses of action. Accountants, for example, use the systems in corporate audits. The applications speed the work, but some signs suggest that as the software becomes more capable, the accountants become less so.
-
You can put limits on the scope of automation, making sure that people working with computers perform challenging tasks rather than merely observing.
-
Experts used to assume that there were limits to the ability of programmers to automate complicated tasks, particularly those involving sensory perception, pattern recognition, and conceptual knowledge.
-
Who needs humans, anyway? That question, in one rhetorical form or another, comes up frequently in discussions of automation. If computers’ abilities are expanding so quickly and if people, by comparison, seem slow, clumsy, and error-prone, why not build immaculately self-contained systems that perform flawlessly without any human oversight or intervention? Why not take the human factor out of the equation?
-
That idea is seductive, but no machine is infallible. Sooner or later, even the most advanced technology will break down, misfire, or, in the case of a computerized system, encounter circumstances that its designers never anticipated. As automation technologies become more complex, relying on interdependencies among algorithms, databases, sensors, and mechanical parts, the potential sources of failure multiply. They also become harder to detect.
-
Because many system designers assume that human operators are “unreliable and inefficient,” at least when compared with a computer, they strive to give the operators as small a role as possible.
-
People end up functioning as mere monitors, passive watchers of screens. That’s a job that humans, with our notoriously wandering minds, are especially bad at.
-
people have trouble maintaining their attention on a stable display of information for more than half an hour. “This means,” Bainbridge observed, “that it is humanly impossible to carry out the basic function of monitoring for unlikely abnormalities.”
-
a person’s skills “deteriorate when they are not used,” even an experienced operator will eventually begin to act like an inexperienced one if restricted to just watching.
-
You can program software to shift control back to human operators at frequent but irregular intervals; knowing that they may need to take command at any moment keeps people engaged, promoting situational awareness and learning.
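The irregular-handoff idea can be sketched in code. The following is a minimal illustration, not anything from the article: the function names, the step-loop framing, and the choice of an exponential distribution for the intervals are all assumptions. The point of the exponential draw is that handoffs stay frequent on average yet unpredictable in timing, so the operator can never count on a long stretch of pure monitoring.

```python
import random

def schedule_handoffs(total_steps, mean_interval, rng=random):
    """Pick the step indices at which control shifts back to the human.

    Intervals between handoffs are drawn from an exponential distribution
    (a hypothetical design choice), so handoffs arrive frequently on
    average but at irregular, unpredictable moments.
    """
    handoffs = []
    t = rng.expovariate(1.0 / mean_interval)
    while t < total_steps:
        handoffs.append(int(t))
        t += rng.expovariate(1.0 / mean_interval)
    return handoffs

def run(total_steps, mean_interval, automated_step, manual_step, rng=random):
    """Drive the task loop, handing control to the human at scheduled steps."""
    handoff_at = set(schedule_handoffs(total_steps, mean_interval, rng))
    for step in range(total_steps):
        if step in handoff_at:
            manual_step(step)      # operator takes the controls
        else:
            automated_step(step)   # automation carries on
```

Here `automated_step` and `manual_step` are placeholders for whatever the automated and human-performed versions of the task would be; a real system would presumably also measure operator response during each handoff to gauge situational awareness.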
-
What’s most astonishing, and unsettling, about computer automation is that it’s still in its early stages.
-
most software applications don’t foster learning and engagement. In fact, they have the opposite effect. That’s because taking the steps necessary to promote the development and maintenance of expertise almost always entails a sacrifice of speed and productivity.
-
Learning requires inefficiency. Businesses, which seek to maximize productivity and profit, would rarely accept such a trade-off. Individuals, too, almost always seek efficiency and convenience.
-
Abstract concerns about the fate of human talent can’t compete with the allure of saving time and money.
-
The small island of Igloolik, off the coast of the Melville Peninsula in the Nunavut territory of northern Canada, is a bewildering place in the winter.
-
Inuit hunters have for some 4,000 years ventured out from their homes on the island and traveled across miles of ice and tundra to search for game. The hunters’ ability to navigate vast stretches of the barren Arctic terrain, where landmarks are few, snow formations are in constant flux, and trails disappear overnight, has amazed explorers and scientists for centuries. The Inuit’s extraordinary way-finding skills are born not of technological prowess—they long eschewed maps and compasses—but of a profound understanding of winds, snowdrift patterns, animal behavior, stars, and tides.
-
The Igloolik hunters have begun to rely on computer-generated maps to get around. Adoption of GPS technology has been particularly strong among younger Inuit, and it’s not hard to understand why.
-
But as GPS devices have proliferated on Igloolik, reports of serious accidents during hunts have spread. A hunter who hasn’t developed way-finding skills can easily become lost, particularly if his GPS receiver fails.
-
The routes so meticulously plotted on satellite maps can also give hunters tunnel vision, leading them onto thin ice or into other hazards a skilled navigator would avoid.
-
An Inuit on a GPS-equipped snowmobile is not so different from a suburban commuter in a GPS-equipped SUV: as he devotes his attention to the instructions coming from the computer, he loses sight of his surroundings. He travels “blindfolded,” as Aporta puts it.
-
Computer automation severs the ends from the means. It makes getting what we want easier, but it distances us from the work of knowing. As we transform ourselves into creatures of the screen, we face an existential question: Does our essence still lie in what we know, or are we now content to be defined by what we want?