TOK@ISPrague: Group items tagged "development"

Lawrence Hrubes

Why Are Some Cultures More Individualistic Than Others? - NYTimes.com - 0 views

  • AMERICANS and Europeans stand out from the rest of the world for our sense of ourselves as individuals. We like to think of ourselves as unique, autonomous, self-motivated, self-made. As the anthropologist Clifford Geertz observed, this is a peculiar idea. People in the rest of the world are more likely to understand themselves as interwoven with other people — as interdependent, not independent. In such social worlds, your goal is to fit in and adjust yourself to others, not to stand out. People imagine themselves as part of a larger whole — threads in a web, not lone horsemen on the frontier. In America, we say that the squeaky wheel gets the grease. In Japan, people say that the nail that stands up gets hammered down.
  • These are broad brush strokes, but the research demonstrating the differences is remarkably robust and it shows that they have far-reaching consequences. The social psychologist Richard E. Nisbett and his colleagues found that these different orientations toward independence and interdependence affected cognitive processing. For example, Americans are more likely to ignore the context, and Asians to attend to it. Show an image of a large fish swimming among other fish and seaweed fronds, and the Americans will remember the single central fish first. That’s what sticks in their minds. Japanese viewers will begin their recall with the background. They’ll also remember more about the seaweed and other objects in the scene. Another social psychologist, Hazel Rose Markus, asked people arriving at San Francisco International Airport to fill out a survey and offered them a handful of pens to use, for example four orange and one green; those of European descent more often chose the one pen that stood out, while the Asians chose the one more like the others.
  • In May, the journal Science published a study, led by a young University of Virginia psychologist, Thomas Talhelm, that ascribed these different orientations to the social worlds created by wheat farming and rice farming. Rice is a finicky crop. Because rice paddies need standing water, they require complex irrigation systems that have to be built and drained each year. One farmer’s water use affects his neighbor’s yield. A community of rice farmers needs to work together in tightly integrated ways. Not wheat farmers. Wheat needs only rainfall, not irrigation. To plant and harvest it takes half as much work as rice does, and substantially less coordination and cooperation. And historically, Europeans have been wheat farmers and Asians have grown rice. The authors of the study in Science argue that over thousands of years, rice- and wheat-growing societies developed distinctive cultures: “You do not need to farm rice yourself to inherit rice culture.”
Lawrence Hrubes

How Children Learn To Read - The New Yorker - 0 views

  • Why is it easy for some people to learn to read, and difficult for others? It’s a tough question with a long history. We know that it’s not just about raw intelligence, nor is it wholly about repetition and dogged persistence. We also know that there are some conditions that, effort aside, can hold a child back. Socioeconomic status, for instance, has been reliably linked to reading achievement. And, regardless of background, children with lower general verbal ability and those who have difficulty with phonetic processing seem to struggle. But what underlies those differences? How do we learn to translate abstract symbols into meaningful sounds in the first place, and why are some children better at it than others?
  • When Hoeft took into account all of the explanatory factors that had been linked to reading difficulty in the past—genetic risk, environmental factors, pre-literate language ability, and over-all cognitive capacity—she found that only one thing consistently predicted how well a child would learn to read. That was the growth of white matter in one specific area of the brain, the left temporoparietal region. The amount of white matter that a child arrived with in kindergarten didn’t make a difference. But the change in volume between kindergarten and third grade did.
  • She likens it to the Dr. Seuss story of Horton and the egg. Horton sits on an egg that isn’t his own, and, because of his dedication, the creature that eventually hatches looks half like his mother, and half like the elephant. In this particular case, Hoeft and her colleagues can’t yet separate cause and effect: Were certain children predisposed to develop strong white-matter pathways that then helped them to learn to read, or was superior instruction and a rich environment prompting the building of those pathways?
markfrankel18

BBC News - How random is random on your music player? - 0 views

  • "Our brain is an excellent pattern-matching device," said Babar Zafar, a lead developer at Spotify, in an interview for Tech Tent on the BBC World Service. "It will find patterns where there aren't any."
  • "The problem is that, to humans, truly random does not feel random," said Mattias Johansson, a Spotify software engineer, in a response on the question-and-answer site Quora.
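The gap the engineers describe between "truly random" and "feels random" is easy to show in code. Below is a minimal sketch (in Python, for illustration only; the excerpt does not describe Spotify's actual algorithm) contrasting a uniform Fisher-Yates shuffle with a simple heuristic that spreads each artist's tracks roughly evenly through the playlist, trading genuine randomness for the even spacing listeners tend to perceive as random.

```python
import random
from collections import defaultdict

def plain_shuffle(tracks):
    """Uniformly random shuffle (Fisher-Yates, via random.shuffle)."""
    result = list(tracks)
    random.shuffle(result)
    return result

def spread_shuffle(tracks, key=lambda t: t[0]):
    """Heuristic shuffle: space each artist's tracks roughly evenly across
    the playlist, then sort by those positions. This avoids the same-artist
    clusters that a truly random shuffle often produces."""
    groups = defaultdict(list)
    for track in tracks:
        groups[key(track)].append(track)
    positioned = []
    for artist_tracks in groups.values():
        random.shuffle(artist_tracks)
        n = len(artist_tracks)
        offset = random.random() / n  # random phase so artists interleave differently
        for i, track in enumerate(artist_tracks):
            jitter = random.uniform(-0.2, 0.2) / n  # small wobble so spacing isn't rigid
            positioned.append((offset + i / n + jitter, track))
    return [track for _, track in sorted(positioned)]

if __name__ == "__main__":
    playlist = [("A", i) for i in range(4)] + [("B", i) for i in range(4)] + [("C", i) for i in range(2)]
    print(plain_shuffle(playlist))   # may place several "A" tracks back to back
    print(spread_shuffle(playlist))  # "A" tracks land roughly evenly spaced
```

With the plain shuffle, runs of the same artist appear regularly, exactly the clusters that make people suspect the shuffle is broken; the spreading version avoids them by construction. The spacing offsets and jitter used here are assumptions chosen for readability, not values from any real player.
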
markfrankel18

Climate affects development of human speech -- ScienceDaily - 1 views

  • A correlation between climate and the evolution of language has been uncovered by researchers. To find a relationship between the climate and the evolution of language, one needs to discover an association between the environment and vocal sounds that is consistent throughout the world and present in different languages. And that is precisely what a group of researchers has done.
Lawrence Hrubes

Problems Too Disgusting to Solve - The New Yorker - 0 views

  • Last month, Bill Gates released a video of one of the latest ventures funded by the Bill and Melinda Gates Foundation: the Omniprocessor, a Seattle-based processing plant that burns sewage to make clean drinking water. In the video, Gates raises a glass of water to his lips. Just five minutes ago, the caption explains, that water was human waste. Gates takes a sip. “It’s water,” he says. “Having studied the engineering behind it,” he writes, on the foundation’s blog, “I would happily drink it every day. It’s that safe.”
  • In the first series of studies, the group asked adults in five cities about their backgrounds, their political and personal views, and, most important, their view on the concept of “recycled water.” On average, everyone was uncomfortable with the idea—even when they were told that treated, recycled water is actually safer to drink than unfiltered tap water. That discomfort, Rozin found, was all about disgust. Twenty-six per cent of participants were so disgusted by the idea of toilet-to-tap that they even agreed with the statement, “It is impossible for recycled water to be treated to a high enough quality that I would want to use it.” They didn’t care what the safety data said. Their guts told them that the water would never be drinkable. It’s a phenomenon known as contagion, or, as Rozin describes it, “once in contact, always in contact.” By touching something we find disgusting, a previously neutral or even well-liked item can acquire—permanently—its properties of grossness.
  • Feelings of disgust are often immune to rationality. And with good reason: evolutionarily, disgust is an incredibly adaptive, life-saving reaction. We find certain things instinctively gross because they really can harm us. Human secretions pass on disease. Noxious odors signal that your surroundings may be unsafe. If something feels slimy and sludgy, it’s likely a moisture-rich environment where pathogens may proliferate. Disgust is powerful, in short, because it often signals something important. It’s easy, though, to be disgusted by things that aren’t actually dangerous. In a prior study, Rozin found that people were unwilling to drink a favorite beverage into which a “fully sterilized” cockroach had been dipped. Intellectually, they knew that the drink was safe, but they couldn’t get over the hump of disgust. In another experiment, students wouldn’t eat chocolate that had been molded to look like poop: they knew that it was safe—tasty, even—but its appearance was too much to handle. Their response makes no logical sense. When it comes to recycled water, for instance, Rozin points out that, on some level, all water comes from sewage: “Rain is water that used to be in someone’s toilet, and nobody seems to mind.” The problem, he says, has to do with making the hidden visible. “If it’s obvious—take shit water, put it through a filter—then people are upset.”
  • Disgust has deep psychological roots, emerging early in a child’s development. Infants and young toddlers don’t feel grossed out by anything—diapers, Rozin observes, are there in part to stop a baby “from eating her shit.” In the young mind, curiosity and exploration often overpower any competing instincts. But, at around four years old, there seems to be a profound shift. Suddenly, children won’t touch things that they find appalling. Some substances, especially human excretions of any sort, are seen as gross and untouchable all over the world; others are culturally determined. But, whether universal or culturally-specific, the disgust reactions that we acquire as children stay with us throughout our lives. If anything, they grow stronger—and more consequential—with age.
markfrankel18

Creativity Creep - The New Yorker - 3 views

  • How did we come to care so much about creativity? The language surrounding it, of unleashing, unlocking, awakening, developing, flowing, and so on, makes it sound like an organic and primordial part of ourselves which we must set free—something with which it’s natural to be preoccupied. But it wasn’t always so; people didn’t always care so much about, or even think in terms of, creativity.
  • It was Romanticism, in the late eighteenth and early nineteenth centuries, which took the imagination and elevated it, giving us the “creative imagination.”
  • How did creativity transform from a way of being to a way of doing? The answer, essentially, is that it became a scientific subject, rather than a philosophical one.
  • All of this measuring and sorting has changed the way we think about creativity. For the Romantics, creativity’s center of gravity was in the mind. But for us, it’s in whatever the mind decides to share—that is, in the product. It’s not enough for a person to be “imaginative” or “creative” in her own consciousness. We want to know that the product she produces is, in some sense, “actually” creative; that the creative process has come to a workable conclusion. To today’s creativity researchers, the “self-styled creative person,” with his inner, unverifiable, possibly unproductive creativity, is a kind of bogeyman; a great deal of time is spent trampling on the scarf of the lone, Romantic genius. Instead, attention is paid to the systems of influence, partnership, power, funding, and reception that surround creativity—the social structures, in other words, that enable managers to reap the fruits of creative labor. Often, this is imagined to be some sort of victory over Romanticism and its fusty, pretentious, élitist ideas about creativity.
  • But this kind of thinking misses the point of the Romantic creative imagination. The Romantics weren’t obsessed with who created what, because they thought you could be creative without “creating” anything other than the liveliness in your own head.
  • It sounds bizarre, in some ways, to talk about creativity apart from the creation of a product. But that remoteness and strangeness is actually a measure of how much our sense of creativity has taken on the cast of our market-driven age.
  • Thus the rush, in my pile of creativity books, to reconceive every kind of life style as essentially creative—to argue that you can “unleash your creativity” as an investor, a writer, a chemist, a teacher, an athlete, or a coach.
  • Among the many things we lost when we abandoned the Romantic idea of creativity, the most valuable may have been the idea of creativity’s stillness. If you’re really creative, really imaginative, you don’t have to make things. You just have to live, observe, think, and feel.
markfrankel18

We are more rational than those who nudge us - Steven Poole - Aeon - 3 views

  • We are told that we are an irrational tangle of biases, to be nudged any which way. Does this claim stand to reason?
  • A culture that believes its citizens are not reliably competent thinkers will treat those citizens differently to one that respects their reflective autonomy. Which kind of culture do we want to be? And we do have a choice. Because it turns out that the modern vision of compromised rationality is more open to challenge than many of its followers accept.
  • Modern skepticism about rationality is largely motivated by years of experiments on cognitive bias.
  • The thorny question is whether these widespread departures from the economic definition of ‘rationality’ should be taken to show that we are irrational, or whether they merely show that the economic definition of rationality is defective.
  • During the development of game theory and decision theory in the mid-20th century, a ‘rational’ person in economic terms became defined as a lone individual whose decisions were calculated to maximise self-interest, and whose preferences were (logically or mathematically) consistent in combination and over time. It turns out that people are not in fact ‘rational’ in this homo economicus way,
  • There has been some controversy over the correct statistical interpretations of some studies, and several experiments that ostensibly demonstrate ‘priming’ effects, in particular, have notoriously proven difficult to replicate. But more fundamentally, the extent to which such findings can show that we are acting irrationally often depends on what we agree should count as ‘rational’ in the first place.
  • if we want to understand others, we can always ask what is making their behaviour ‘rational’ from their point of view. If, on the other hand, we just assume they are irrational, no further conversation can take place.
  • And so there is less reason than many think to doubt humans’ ability to be reasonable. The dissenting critiques of the cognitive-bias literature argue that people are not, in fact, as individually irrational as the present cultural climate assumes. And proponents of debiasing argue that we can each become more rational with practice. But even if we each acted as irrationally as often as the most pessimistic picture implies, that would be no cause to flatten democratic deliberation into the weighted engineering of consumer choices, as nudge politics seeks to do. On the contrary, public reason is our best hope for survival.
markfrankel18

What We Really Taste When We Drink Wine : The New Yorker - 2 views

  • Salzman first became interested in wine when he was a graduate student at Stanford University studying neuroscience (Ph.D.) and psychiatry (M.D.). “I was corrupted by some people who were very serious about wine,” he told me. Together, they would host wine tastings and travel to vineyards. Over time, as his interest in wine grew, he began to think about the connections between his tastings and the work he was doing on the ways in which emotion colors the way our brains process information. “We study how cognitive and emotional processes can affect perception,” he said. “And in the case of something like wine, you have the perfect example: even before you open a bottle to experience the wine itself, you already have an arbitrary visual stimulus—the bottle and the label—that comes with non-arbitrary emotional associations, good and bad.” And those emotional associations will, in turn, affect what we taste.
  • In one recent study, the Caltech neuroscientist Hilke Plassman found that people’s expectations of a wine’s price affected their enjoyment on a neural level: not only did they report greater subjective enjoyment but they showed increased activity in an area of the brain that has frequently been associated with the experience of pleasantness. The same goes for the color and shape of a wine’s label: some labels make us think that a wine is more valuable (and, hence, more tasty), while others don’t. Even your ability to pronounce a winery’s name can influence your appreciation of its product—the more difficult the name is to pronounce, the more you’ll like the wine.
  • For experts, though, the story is different. In 1990, Gregg Solomon, a Harvard psychologist who wrote “Great Expectorations: The Psychology of Expert Wine Talk,” found that amateurs can’t really distinguish different wines at all, but he also found that experts can indeed rank wines for sweetness, balance, and tannin at rates that far exceeded chance. Part of the reason isn’t just in the added experience. It’s in the ability to phrase and label that experience more precisely, a more developed sensory vocabulary that helps you to identify and remember what you experience. Indeed, when novices are trained, their discrimination ability improves.
markfrankel18

Fighting Whitewashed History With MIT's Diversity Hackers | Atlas Obscura - 0 views

  • Since Wikipedia is a compendium of information that already exists elsewhere, it reflects this long-standing bias. In addition, Wikipedia’s editorship–the tens of millions of volunteers who add, tinker with, and argue about articles–is not particularly diverse. “It’s largely younger, largely male, largely white,” Ayers says. “And people often write about their own interests, which is natural and makes sense. But what that means is that we have a lot of articles about software and famous military figures, and not a lot about, say, traditional women’s handicrafts or activists in the developing world.” For a site that aims to “really reflect the fullness of our collective human experience,” Ayers says, this is a big issue.
  • As the hackers dig in, roadblocks keep popping up–some common to all historical efforts, others unique to this one. One hacker, trying to separate a psychologist couple into two independent pages (“it’s like she’s glued to her husband’s side!”) has difficulty finding sources that credit the female half of the pair with anything. Another recalls making a prior effort, only to find her changes immediately reversed by the overzealous editors Hyland was talking about.
Lawrence Hrubes

How Measurement Fails Doctors and Teachers - The New York Times - 1 views

  • TWO of our most vital industries, health care and education, have become increasingly subjected to metrics and measurements. Of course, we need to hold professionals accountable. But the focus on numbers has gone too far. We’re hitting the targets, but missing the point.
  • We also need more research on quality measurement and comparing different patient populations. The only way to understand whether a high mortality rate, or dropout rate, represents poor performance is to adequately appreciate all of the factors that contribute to these outcomes — physical and mental, social and environmental — and adjust for them.
  • He developed what is known as Donabedian’s triad, which states that quality can be measured by looking at outcomes (how the subjects fared), processes (what was done) and structures (how the work was organized).
  • “The secret of quality is love,” he said.
Lawrence Hrubes

Don't Ban 'Bossy' : The New Yorker - 2 views

  • There are precedents for such reclaiming—pejorative words like “queer” and even “slut,” for instance, which their targets have taken over and brandished with pride. But maybe a more apt comparison would be the word “nerd.” “Nerd” used to be a put-down—and it used to cover boys more often than girls. Like “bossy,” it wasn’t really that harsh, but it wasn’t nice, either. It actually had a gender dimension, too, because it called out brainy boys who were not athletic or aggressive. It was a dis of boys who lived in their heads and wore pocket protectors and ate their lunch indoors, playing chess. Just as “bossy” might be said to undermine female leadership, “nerd” might be said to have undermined male intellectualism. But now “nerd,” and its close cousin “geek,” are words that lots of people are happy to identify with, humble-bragging about their obsessive expertise. Brainiac techies can get rich these days, and that has helped spiff up the image of nerdery. John Green, an author of young-adult novels, and his brother Hank have developed a thriving online and off-line community of “nerdfighters,” girls as well as boys, who like to say that they fight for good with their brains and hearts, calculators and trombones. They have heartthrobs like the actor Benedict Cumberbatch. They find each other on Tumblr.
Lawrence Hrubes

When Philosophy Lost Its Way - The New York Times - 0 views

  • Having adopted the same structural form as the sciences, it’s no wonder philosophy fell prey to physics envy and feelings of inadequacy. Philosophy adopted the scientific modus operandi of knowledge production, but failed to match the sciences in terms of making progress in describing the world. Much has been made of this inability of philosophy to match the cognitive success of the sciences. But what has passed unnoticed is philosophy’s all-too-successful aping of the institutional form of the sciences. We, too, produce research articles. We, too, are judged by the same coin of the realm: peer-reviewed products. We, too, develop sub-specializations far from the comprehension of the person on the street. In all of these ways we are so very “scientific.”
markfrankel18

Malcolm Gladwell got us wrong: Our research was key to the 10,000-hour rule, but here's... - 0 views

  • First, there is nothing special or magical about ten thousand hours. Gladwell could just as easily have mentioned the average amount of time the best violin students had practiced by the time they were eighteen — approximately seventy-four hundred hours — but he chose to refer to the total practice time they had accumulated by the time they were twenty, because it was a nice round number.
  • And the number varies from field to field.
  • Third, Gladwell didn’t distinguish between the type of practice that the musicians in our study did — a very specific sort of practice referred to as “deliberate practice” which involves constantly pushing oneself beyond one’s comfort zone, following training activities designed by an expert to develop specific abilities, and using feedback to identify weaknesses and work on them — and any sort of activity that might be labeled “practice.”
  • The final problem with the ten-thousand-hour rule is that, although Gladwell himself didn’t say this, many people have interpreted it as a promise that almost anyone can become an expert in a given field by putting in ten thousand hours of practice. But nothing in the study of the Berlin violinists implied this. To show a result like this, it would have been necessary to put a collection of randomly chosen people through ten thousand hours of deliberate practice on the violin and then see how they turned out. All that the Berlin study had shown was that among the students who had become good enough to be admitted to the Berlin music academy, the best students had put in, on average, significantly more hours of solitary practice than the better students, and the better and best students had put in more solitary practice than the music-education students.
Lawrence Hrubes

The Great A.I. Awakening - The New York Times - 1 views

  • Translation, however, is an example of a field where this approach fails horribly, because words cannot be reduced to their dictionary definitions, and because languages tend to have as many exceptions as they have rules. More often than not, a system like this is liable to translate “minister of agriculture” as “priest of farming.” Still, for math and chess it worked great, and the proponents of symbolic A.I. took it for granted that no activities signaled “general intelligence” better than math and chess.
  • A rarefied department within the company, Google Brain, was founded five years ago on this very principle: that artificial “neural networks” that acquaint themselves with the world via trial and error, as toddlers do, might in turn develop something like human flexibility. This notion is not new — a version of it dates to the earliest stages of modern computing, in the 1940s — but for much of its history most computer scientists saw it as vaguely disreputable, even mystical. Since 2011, though, Google Brain has demonstrated that this approach to artificial intelligence could solve many problems that confounded decades of conventional efforts. Speech recognition didn’t work very well until Brain undertook an effort to revamp it; the application of machine learning made its performance on Google’s mobile platform, Android, almost as good as human transcription. The same was true of image recognition. Less than a year ago, Brain for the first time commenced with the gut renovation of an entire consumer product, and its momentous results were being celebrated tonight.
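The "trial and error" learning described here can be illustrated with a toy example. The sketch below is a single artificial neuron trained by gradient descent in plain Python; it is not Google Brain's system, only a minimal illustration of the underlying idea: make a guess, measure the error, nudge the weights, and repeat until the guesses come out right.

```python
import math
import random

def train_neuron(samples, epochs=2000, lr=0.5):
    """Train one sigmoid neuron by trial and error (gradient descent):
    guess, measure the error, nudge the weights, repeat."""
    n_inputs = len(samples[0][0])
    weights = [random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            z = sum(w * x for w, x in zip(weights, inputs)) + bias
            guess = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            error = guess - target                # how wrong was the guess?
            weights = [w - lr * error * x for w, x in zip(weights, inputs)]
            bias -= lr * error
    return weights, bias

if __name__ == "__main__":
    # Learn the logical OR function purely from examples.
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    w, b = train_neuron(data)
    for inputs, target in data:
        z = sum(wi * x for wi, x in zip(w, inputs)) + b
        print(inputs, target, round(1.0 / (1.0 + math.exp(-z)), 2))
```

Large speech- and image-recognition networks differ enormously in scale and architecture, but the guess-and-correct loop above is the same basic mechanism the passage refers to.
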
Lawrence Hrubes

The Ethical Quandaries You Should Think About The Next Time You Look At Your Phone | Fa... - 3 views

  • To what extent can we and should we aspire to create machines that can outthink us? For example, Netflix has an algorithm that can predict what movies you will like based on the ones you've already seen and rated. Suppose a dating site were to develop a similar algorithm—maybe even a more sophisticated one—and predict with some accuracy which partner would be the best match for you. Whose advice would you trust more? The advice of the smart dating app or the advice of your parents or your friends? (A minimal sketch of this kind of similarity-based prediction appears after this list.)
  • The question, it seems to me, is should we use new genetic technologies only to cure disease and repair injury, or also to make ourselves better-than-well. Should we aspire to become the masters of our natures to protect our children and improve their life prospects? This goes back to the role of accident. Is the unpredictability of the child an important precondition of the unconditional love of parents for children? My worry is that if we go beyond health, we run the risk of turning parenthood into an extension of the consumer society.
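As a concrete (and purely illustrative) version of the rating-based prediction mentioned in the first annotation above, the sketch below implements simple user-based collaborative filtering: score how similarly two people have rated the items they share, then predict an unseen rating as a similarity-weighted average. The names, ratings, and weighting scheme are invented for the example; neither Netflix's nor any dating site's actual algorithm is described in the excerpt.

```python
import math

def cosine(u, v):
    """Cosine similarity between two users' rating vectors over shared items."""
    shared = [item for item in u if item in v]
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = math.sqrt(sum(u[i] ** 2 for i in shared))
    norm_v = math.sqrt(sum(v[i] ** 2 for i in shared))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def predict(ratings, user, item):
    """Predict `user`'s rating of `item` as a similarity-weighted average of
    the ratings given by other users who have rated that item."""
    num, den = 0.0, 0.0
    for other, other_ratings in ratings.items():
        if other == user or item not in other_ratings:
            continue
        sim = cosine(ratings[user], other_ratings)
        num += sim * other_ratings[item]
        den += abs(sim)
    return num / den if den else None

if __name__ == "__main__":
    # Toy ratings on a 1-5 scale; the names and numbers are made up.
    ratings = {
        "alice": {"Alien": 5, "Amelie": 1, "Arrival": 5},
        "bob":   {"Alien": 4, "Amelie": 2, "Arrival": 5, "Annie Hall": 2},
        "carol": {"Alien": 1, "Amelie": 5, "Annie Hall": 5},
    }
    # alice rates most like bob, so the prediction lands nearer bob's 2 than carol's 5.
    print(predict(ratings, "alice", "Annie Hall"))
```
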
markfrankel18

Why are shoppers being asked to buy ethically or not in the first place? - Quartz - 3 views

  • A series of studies suggests that, while a product’s ethics may influence purchasing decisions, many shoppers choose simply not to know whether something was ethically made. That includes shoppers who care about social responsibility. And shoppers who ignore ethical matters can even develop a negative opinion about people who do express ethical concerns—which makes them even less likely to pay attention to ethical issues in the future.
  • “You feel badly that you were not ethical when someone else was,” Rebecca Reczek, a professor of marketing at Ohio State University and one of the study’s authors, told NPR about the results. “It’s a threat to your sense of self, to your identity. So to recover from that, you put the other person down.”
  • International supply chains, she points out, are notoriously opaque, and the free market doesn’t have any good way to deal with the way this system stifles information. It might be best, she says, if these matters were regulated before the products even reached consumers, taking ethical dilemmas out of the shopping equation. Of the unethical choice, like a polluting car or a shirt made with exploited labor, she suggests: “Maybe we just shouldn’t have it available.”
markfrankel18

The Primitive Streak - Radiolab - 0 views

  • Last May, two research groups announced a breakthrough: they each grew human embryos, in the lab, longer than ever before. In doing so, they witnessed a period of human development no one had ever seen. But in the process, they crashed up against something called the '14-day rule,' a guideline set over 30 years ago that dictates what we do, and possibly how we feel, about human embryos in the lab. On this episode, join producer Molly Webster as she peers down at our very own origins, and wonders: what do we do now?
markfrankel18

Presidential debate: A philosopher explains why facts are irrelevant to Donald Trump an... - 0 views

  • The malleable nature of facts is a particular preoccupation in one field of philosophy. “Social constructivism” argues that there are simply no objective facts. Instead, every “fact” we believe is a reflection of our socially constructed values, and how we choose to perceive the world. This is not a new theory, and develops many of its ideas from Karl Marx and Friedrich Nietzsche, who examined shifting human values from a historical perspective in the 19th century. But the current political debate offers a vivid demonstration of these ideas. Jesse Prinz, a philosophy professor at City University of New York, explains that facts are always subjective. Even something as foundational as the periodic table. “When you look closely, you realize that it could have been organized very differently. It could be ordered by atomic weight, rather than atomic number, it could include isotopes, it could exclude elements that don’t exist in nature, and so on,” he says. “The way we classify things is always a function of both mind and world.”
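The periodic-table point can be made concrete: a few element pairs actually swap places depending on whether you order by atomic number or by atomic weight. A minimal sketch, using rounded standard atomic weights (the choice of pairs is mine, for illustration):

```python
# Ordering elements by atomic weight instead of atomic number changes the
# sequence in a few places. Atomic weights are rounded standard values.
elements = [
    ("Ar", 18, 39.95), ("K", 19, 39.10),    # argon outweighs potassium
    ("Co", 27, 58.93), ("Ni", 28, 58.69),   # cobalt outweighs nickel
    ("Te", 52, 127.60), ("I", 53, 126.90),  # tellurium outweighs iodine
]

by_number = [sym for sym, num, wt in sorted(elements, key=lambda e: e[1])]
by_weight = [sym for sym, num, wt in sorted(elements, key=lambda e: e[2])]

print("by atomic number:", by_number)  # ['Ar', 'K', 'Co', 'Ni', 'Te', 'I']
print("by atomic weight:", by_weight)  # ['K', 'Ar', 'Ni', 'Co', 'I', 'Te']
```

Either ordering is internally consistent; which one the table uses is a classificatory decision of the kind the quoted passage describes.
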
Lawrence Hrubes

Was E-mail a Mistake? | The New Yorker - 0 views

  • There’s nothing intrinsically bad about e-mail as a tool. In situations where asynchronous communication is clearly preferable—broadcasting an announcement, say, or delivering a document—e-mails are superior to messengered printouts. The difficulties start when we try to undertake collaborative projects—planning events, developing strategies—asynchronously. In those cases, communication becomes drawn out, even interminable.