In the last few years, it's become increasingly clear that food companies engineer hyperprocessed foods in ways precisely geared to most appeal to our tastes. This technologically advanced engineering is done, of course, with the goal of maximizing profits, regardless of the effects of the resulting foods on consumer health, natural resources, the environment or anything else.
But the issues go way beyond food, as the City University of New York professor Nicholas Freudenberg discusses in his new book, "Lethal but Legal: Corporations, Consumption, and Protecting Public Health." Freudenberg's case is that the food industry is but one example of the threat to public health posed by what he calls "the corporate consumption complex," an alliance of corporations, banks, marketers and others that essentially promote and benefit from unhealthy lifestyles.
"New research from Allen Downey, a computer scientist at Olin College of Engineering in Massachusetts, shows a startling correlation between the rise of the Internet and the decline of religious affiliation in the United States.
According to MIT Technology Review, back in 1990 only 8 percent of the U.S. population had no religious affiliation. Twenty years later, in 2010, that number was up to 18 percent — a jump of roughly 25 million people. Americans seem to be losing their religion, and Downey's research may offer an answer."
"We live in a hyper-rational, data-driven time; geeks are our kings and queens. When something inexplicable happens, we are in awe, suddenly, of concepts that the ancients took for granted: the suffering of innocents, supernatural causes, and twists of fate."
"It uses an "emotional engine" and a cloud-based artificial intelligence system that allows it to analyse gestures, expressions and voice tones.
The firm said people could communicate with it "just like they would with friends and family" and it could perform various tasks.
It will go on sale to the public next year for 198,000 yen ($1,930; £1,150).
"People describe others as being robots because they have no emotions, no heart," Masayoshi Son, chief executive of Softbank, said at a press conference.
"For the first time in human history, we're giving a robot a heart, emotions.""
Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.
If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias.
The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”
Here is the counter-argument, couched in evolutionary psychology, about confirmation bias's adaptive function: hypersociability.
Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.
This lopsidedness reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group.
“This is one of many cases in which the environment changed too quickly for natural selection to catch up.”
The environment changed too quickly for our evolutionary progress to keep up.
People believe that they know way more than they actually do. What allows us to persist in this belief is other people.
“One implication of the naturalness with which we divide cognitive labor,” they write, is that there’s “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.
When it comes to new technologies, incomplete understanding is empowering.
Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain.
If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless.
We’ve been relying on one another’s expertise ever since we figured out how to hunt together.
This is the flip side of doubting our group members: once we trust them, we can be somewhat blind in our trust.
This is how a community of knowledge can become dangerous.
If we—or our friends or the pundits on CNN—spent less time pontificating and more trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views.
In a well-run laboratory, there’s no room for myside bias.
This is where the second section begins, arguing that our evolutionary emphasis on social collaboration also operates to short-circuit or undermine the effectiveness of reason as a WOK (way of knowing).