TOK@ISPrague: Group items matching "time" in title, tags, annotations or url

Lawrence Hrubes

What's in a Brand Name? - The New Yorker - 1 views

  • There are various ways a corporate name can seem apposite. In the case of existing words, connotations are crucial: a Corvette is a light, speedy attack ship; Tesla was an inventor of genius. Made-up names often rely instead on resonances with other words: Lexus evokes luxurious; Viagra conjures virility and vitality. Bad names bring the wrong associations to consumers’ minds. In the nineteen-eighties, United Airlines tried to turn itself into a diversified travel company called Allegis. The move was a fiasco. No less an authority than Donald Trump (whose faith in brand-name power is total) said that the name sounded “like the next world-class disease.”
  • A philosopher called Hermogenes argues that the relationship between a word and its meaning is purely arbitrary; Cratylus, another philosopher, disagrees; and Socrates eventually concludes that there is sometimes a connection between meaning and sound. Linguistics has mostly taken Hermogenes’ side, but, in the past eighty years, a field of research called phonetic symbolism has shown that Cratylus was on to something.
  • Remarkably, some of these phonemic associations seem to be consistent across many languages. That’s good news for multinationals: research shows that if customers feel your name is a good fit they’ll remember it better and even like it more.
  • Over time, corporate naming has developed certain conventions: alliteration and vowel repetition are good. “X” and “z” are held to be memorable and redolent of speed and fluidity. The letter “x” occurs sixteen times as often in drug names as in other English words; “z” occurs eighteen times as often.
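A ratio like "sixteen times as often" comes from comparing letter frequencies across two word lists. The Python sketch below shows only the shape of that comparison; both word lists are tiny made-up samples for illustration, not the data behind the article's figures.

```python
# Compare how often a letter appears in two word lists, per 100 letters.
# Both lists below are hypothetical samples chosen for illustration.
def letter_rate(words, letter):
    """Occurrences of `letter` per 100 letters across a list of words."""
    text = "".join(words).lower()
    return 100 * text.count(letter) / len(text)

drug_names = ["xanax", "zoloft", "prozac", "celebrex", "zyrtec"]   # hypothetical sample
other_words = ["taxi", "zebra", "puzzle", "garden", "window"]      # hypothetical sample

for letter in "xz":
    in_drugs = letter_rate(drug_names, letter)
    in_other = letter_rate(other_words, letter)
    print(f"'{letter}': {in_drugs:.1f} vs {in_other:.1f} per 100 letters "
          f"({in_drugs / in_other:.1f}x)")
```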
markfrankel18

The Brain on Trial - Issue 5: Fame - Nautilus - 0 views

  • Now we are regularly bombarded with new insights into how the unconscious guides our behavior. At the same time, neuroscience has largely debunked the idea of an autonomous self that has the final say in decisions; few science-savvy folks still believe there is a “ghost in the machine,” a little homunculus in the brain who is watching our perceptions or thinking our thoughts. Some philosophers even question whether the conscious mind plays any role in our thoughts. In short, present-day neuroscience has pulled the rug out from under the concept of “the rational man.”
  • If you are asked why you chose the violin, your answer is unlikely to be an accurate reflection of the unconscious competition that led to your choice. In effect, the decision happened to you. Your brain developed a “violin neural circuit” in the same way that fame makes some actors, musicians, and novelists superstars while others, for reasons that are never entirely clear, are relegated to obscurity.
  • Imagine that you are a juror assigned to the sentencing phase of a person convicted of first-degree murder. The defendant is a 33-year-old woman who has confessed to shooting her boyfriend in the head, then stabbing him nearly 30 times before unsuccessfully trying to decapitate him with a butcher knife. Initially she tells police she hadn’t been present, that her boyfriend had been killed by “unknown intruders.” When she can offer no evidence to substantiate her alibi, she then confesses, arguing self-defense and that her boyfriend had subjected her to prior physical and mental abuse. On a national TV news show, she predicts that no jury will find her guilty, yet after a several-month trial, you find her guilty of first-degree murder. It is now sentencing time. Your assignment is to determine whether the crime warrants the death penalty or a life sentence without parole, or a lesser sentence with the possibility of parole.
markfrankel18

Chasing Coincidences - Issue 4: The Unlikely - Nautilus - 0 views

  • The simple question might be “why do such unlikely coincidences occur in our lives?” But the real question is how to define the unlikely. You know that a situation is uncommon just from experience. But even the concept of “uncommon” assumes that like events in the category are common. How do we identify the other events to which we can compare this coincidence? If you can identify other events as likely, then you can calculate the mathematical probability of this particular event as exceptional.
  • We are exposed to possible events all the time: some of them probable, but many of them highly improbable. Each rare event—by itself—is unlikely. But by the mere act of living, we constantly draw cards out of decks. Because something must happen when a card is drawn, so to speak, the highly improbable does appear from time to time.
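The point in that last excerpt is essentially arithmetic: an event that is very unlikely on any single draw becomes likely once the draws accumulate. A minimal sketch of that calculation, with the per-day probability, time span, and population size chosen arbitrarily for illustration rather than taken from the article:

```python
# How often does a "one-in-a-million" daily coincidence happen anyway?
# The per-day probability, time span, and population below are arbitrary
# illustrative numbers, not figures from the article.
p_per_day = 1e-6          # chance of the rare coincidence on any given day
days = 10 * 365           # roughly a decade of daily "draws from the deck"

p_at_least_once = 1 - (1 - p_per_day) ** days
print(f"P(at least once in {days} days) = {p_at_least_once:.5f}")

# Across a large population, someone is almost certain to experience it.
people = 1_000_000
p_someone = 1 - (1 - p_at_least_once) ** people
print(f"P(it happens to at least one of {people:,} people) = {p_someone:.5f}")
```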
markfrankel18

What Emotions Are (and Aren't) - The New York Times - 1 views

  • Analogously, emotion words like “anger,” “happiness” and “fear” each name a population of diverse biological states that vary depending on the context. When you’re angry with your co-worker, sometimes your heart rate will increase, other times it will decrease and still other times it will stay the same. You might scowl, or you might smile as you plot your revenge. You might shout or be silent. Variation is the norm.
  • The ease with which we experience emotions, and the effortlessness with which we see emotions in others, doesn’t mean that each emotion has a distinct pattern in the face, body or brain. Instead of asking where emotions are or what bodily patterns define them, we would do better to abandon such essentialism and ask the more revealing question, “How does the brain construct these incredible experiences?”
Lawrence Hrubes

Meet Dan Barber: America's next foodie-in-chief - Salon.com - 0 views

  • As the unofficial spokespeople for our organic-eating, Food Network-watching ways, when chefs talk, Americans tend to listen. And Dan Barber — the farm-to-table icon behind restaurants Blue Hill in New York City and Blue Hill Stone Barns in Tarrytown — isn’t wasting his platform. Barber has given a wildly popular TED talk, been counted among TIME’s 100 most influential people, been appointed to the President’s Council on Fitness, Sports & Nutrition and, as of last week, authored a nearly 500-page book laying out a radical new vision for the future of food. So when Barber came out last weekend with a New York Times op-ed detailing the shortcomings of the farm-to-table movement he had previously helped promote, people paid attention. While we like to pat ourselves on the back for eating seasonally and locally, Barber’s contentious argument went, the foodie fad that grew to become the face of sustainable eating has failed to bring about the promised revolution. Industrial agriculture still rules, Barber argued. And as it keeps growing, more and more small farms and native prairie disappear under Big Ag’s plow. If we really care about changing our food system (as anyone who hopes to feed our growing world should), Barber believes we’re going to need a true revolution. And “The Third Plate” is his thesis, over a decade in the works, for what that change must look like. Among other things, change means thinking holistically, embracing diversity in ingredient choice and cuisine, and shifting meat over from its vaulted place at the center of the dinner plate. But while the solutions Barber describes are frighteningly extensive, they also, in his telling, sound delicious. That, more than any warning about the consequences of continuing along with the status quo, could be the thing that ends up making a difference.
markfrankel18

Whole Foods is taking heat for selling rabbit - Quartz - 0 views

  • But worrying about data is probably just a distraction, because, ultimately, “pet” is a relative term—there are more fish in our home aquariums than there are pet dogs, and any category that lumps the two together feels inadequate.
  • Rabbits, as this passer-by is implying, are widely consumed in other countries. Western Europeans love rabbit sausage, slow-cooked rabbit stews, and braised bunny dishes, while the Chinese—who account for 30% of global rabbit consumption—consider rabbit’s head a delicacy. Rabbit was even a staple of the American diet at one time. It helped sustain the European transplants who migrated west across the frontier, and during World War II, eating rabbit was promoted as an act of patriotism akin to growing a victory garden. But as small farms gave way to large-scale operations, rabbit meat’s popularity melted away and other meats took over.
  • Herzog started thinking about this 20 years ago, when he was sitting in a hotel bar having a beer with the psychologist and animal rights activist, Ken Shapiro. Herzog knew Shapiro was a vegan; Shapiro knew Herzog ate meat. Both men had read all of the same psychology and animal-rights literature, and both spent a lot of time working through the same philosophical questions. But somehow, they came to different conclusions about how to live their lives. “Hal, I don’t get it: why aren’t you like us?” Shapiro suddenly asked. Herzog didn’t have an answer. He still doesn’t. “I’ve been struggling with this for a long time,” Herzog says. “I can handle moral ambiguity. I can deal with it. So I don’t have that need for moral consistency that animal activists do.” He laughs a little. “And I know that their logic is better than mine, so I don’t even try arguing with them. They win in these arguments.”
  • Outside of the Union Square store, the activists are talking to a small crowd. “They refuse to test products on the very animals they turn around and sell as meat,” says a man wearing fuzzy bunny ears and holding a big sign. This inconsistency presents a valid question: If I decide there is something ethically wrong with dripping chemicals into a rabbit’s eye to test its toxicity, is it hypocritical to eat that animal? Hal Herzog talks about the relative ability of an individual to live with moral inconsistency, but perhaps the rabbit debate is less about morality and instead has to do with the categorical boundaries we use to talk about the debate in the first place.
Lawrence Hrubes

The drone operator who said 'No' - 0 views

  • For almost five years, Brandon Bryant worked in America's secret drone programme bombing targets in Afghanistan and elsewhere. He was told that he helped to kill more than 1,600 people, but as time went by he felt uneasy with what he was doing. He found it hard to sleep and started dreaming in infra-red. Brandon Bryant told Witness about his doubts and the mission that convinced him it was time to stop. Witness is a World Service programme of the stories of our times told by the people who were there.
markfrankel18

Twitter histories of events are vanishing - Salon.com - 1 views

  • Nowadays, we’re very good at telling history in real time. Live-tweeting, livestreaming, Instagraming, link sharing, instant commenting — everyday lives and major events are recorded and narrated from every angle as they happen. A new study has found, however, that these minutes-old histories may not be built to last.
  • As the Technology Review reported: A significant proportion of the websites that this social media [around the Arab Spring] points to has disappeared. And the same pattern occurs for other culturally significant events, such as the H1N1 virus outbreak, Michael Jackson’s death and the Syrian uprising. In other words, our history, as recorded by social media, is slowly leaking away.
  • So it seems that social media sites like Twitter do not remain as fecund a resource over time as they do in real time. But no historian has ever worked on the assumption that all, or even most, information about an event is preserved, let alone even recorded. Not even Twitter has changed that.
Lawrence Hrubes

BBC News - France holds back the anti-smacking tide - 1 views

  • Turn on the radio in France in 1951 and you might have heard contributors extol the benefits of parents smacking their children. "I don't like slapping the face," one commentator says. "Slapping can harm the ears and the eyes, especially if it's violent. But everybody knows that smacking the bottom is excellent for the circulation of the blood." At the time, few would have seen that advice as abusive. It was another three decades before Sweden became the first European country to make smacking children illegal. More than 20 others have followed suit, but France has held out against the changing tide of parenting, with staunch resolve. In the wake of the European ruling this week, articles have appeared in the French media with titles such as "Smacking: A French Passion", and contributors have lined up on online forums to advocate the benefits of "la fessee", as it's known here. "We were really surprised by the response," says Christine Hernandez, a writer for France's most popular parenting magazine, Parents. "Many of our readers said that smacking is part of educating children. It's astonishing that parents still think that it's a good way to teach children how to behave. They think they have to impose their authority on children from time to time - it's part of French traditional upbringing."
Lawrence Hrubes

BBC News - Living with the J-word - 1 views

  • Thankfully, most of this Jew-targeted hatred takes the form of verbal aggression rather than physical violence. But because many critics of Israel make no distinction between citizens of the Jewish state and the worldwide Jewish community, the J-word has been the focus. You won't see "Kill Israelis" scrawled on London synagogue walls. What you see on walls is "Kill the Jews", and on banners "Hitler was Right". And this brings me back to the point about the complexity of anti-Semitism today. It is always around and in the end it is focused primarily on the J-word, in the same way that another form of racism is focused on the N-word. Those on the receiving end find their lives shaped by it. Certainly my life, my sense of myself, has been shaped by the casual anti-Semitism that I have encountered for more than half a century. The first time I was called a "Jew" with malicious intent was September 1958 in the playground of Belmont Hills Elementary School, in the suburbs of Philadelphia. It came as a surprise. I was eight years old and up until that time had been living in New York City where everyone I encountered was Jewish. Until that moment, the word "Jew" had simply been one of the words and phrases - like "Mike", "son" and "114 East 90th Street" - whose meanings were slowly building up into a sense of who I was.
  • Throughout the 19th Century, "Israelite" or "Hebrew" or "follower of Moses" supplanted "Jew" as the politically correct way to refer to the community. It was a process analogous to the way "black" and then "African-American" or "person of colour" replaced "Negro" in polite discourse after the Civil Rights era.
  • Thirty years later, a new word for this hatred was coined - "anti-Semitism". This was a time when race science was all the rage. Anti-Semitism avoided the connotation of pure hatred against individuals which is, after all, irrational. It focused scientifically on the supposed racial and social characteristics of a group, the Jews, without mentioning them by name. From there it was easy to start a political movement - based on scientific "facts" - to rein in a people who clearly were alien.
Lawrence Hrubes

Tim Cook Opposes Order for Apple to Unlock iPhone, Setting Up Showdown - The New York Times - 0 views

  • Apple said on Wednesday that it would oppose and challenge a federal court order to help the F.B.I. unlock an iPhone used by one of the two attackers who killed 14 people in San Bernardino, Calif., in December.
  • “We have made a solemn commitment to the victims and their families that we will leave no stone unturned as we gather as much information and evidence as possible. These victims and families deserve nothing less.”
  • Mr. Cook said the order would amount to creating a “back door” to bypass Apple’s strong encryption standards — “something we simply do not have, and something we consider too dangerous to create.”
  • “If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data.
Lawrence Hrubes

The Wrong Way to Teach Math - The New York Times - 0 views

  • HERE’S an apparent paradox: Most Americans have taken high school mathematics, including geometry and algebra, yet a national survey found that 82 percent of adults could not compute the cost of a carpet when told its dimensions and square-yard price.
  • In fact, what’s needed is a different kind of proficiency, one that is hardly taught at all. The Mathematical Association of America calls it “quantitative literacy.” I prefer the O.E.C.D.’s “numeracy,” suggesting an affinity with reading and writing.
  • Many students fall by the wayside. It’s not just the difficulty of the classes. They can’t see how such formulas connect with the lives they’ll be leading.
  • Finally, we talk about how math can help us think about reorganizing the world around us in ways that make more sense. For example, there’s probably nothing more cumbersome than how we measure time: How quickly can you compute 17 percent of a week, calibrated in hours (or minutes, or seconds)? So our class undertook to decimalize time.
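The closing example is straightforward to work out. A minimal sketch of the arithmetic; the ten-hour, hundred-minute "decimal day" at the end is an assumed scheme for illustration, since the excerpt does not say how the class decimalized time:

```python
# 17 percent of a week, in conventional units.
week_hours = 7 * 24
part_hours = 0.17 * week_hours
print(f"17% of a week = {part_hours:.2f} hours "
      f"= {part_hours * 60:.0f} minutes = {part_hours * 3600:.0f} seconds")

# The same question under a hypothetical decimalized clock:
# 10 "hours" of 100 "minutes" per day (an assumed scheme, not the article's).
decimal_minutes_per_day = 10 * 100
print(f"17% of a week = {0.17 * 7 * decimal_minutes_per_day:.0f} decimal minutes")
```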
Lawrence Hrubes

Unreliable research: Trouble at the lab | The Economist - 0 views

  • Academic scientists readily acknowledge that they often get things wrong. But they also hold fast to the idea that these errors get corrected over time as other scientists try to take the work further.
  • Evidence that many more dodgy results are published than are subsequently corrected or withdrawn calls that much-vaunted capacity for self-correction into question. There are errors in a lot more of the scientific papers being published, written about and acted on than anyone would normally suppose, or like to think.
  • Various factors contribute to the problem. Statistical mistakes are widespread. The peer reviewers who evaluate papers before journals commit to publishing them are much worse at spotting mistakes than they or others appreciate. Professional pressure, competition and ambition push scientists to publish more quickly than would be wise. A career structure which lays great stress on publishing copious papers exacerbates all these problems.
  • The idea that the same experiments always get the same results, no matter who performs them, is one of the cornerstones of science’s claim to objective truth. If a systematic campaign of replication does not lead to the same results, then either the original research is flawed (as the replicators claim) or the replications are (as many of the original researchers on priming contend). Either way, something is awry.
Lawrence Hrubes

There's a good reason Americans are horrible at science - Quartz - 0 views

  • There are a number of problems with teaching science as a collection of facts. First, facts change. Before oxygen was discovered, the theoretical existence of phlogiston made sense. For a brief, heady moment in 1989, it looked like cold fusion (paywall) was going to change the world. In the field of medical science, “facts” are even more wobbly. For example, it has been estimated that fewer than 10% of published high profile cancer studies are reproducible (the word “reproducible” here is a euphemism for “not total poppycock”).
  • It’s not possible for everyone—or anyone—to be sufficiently well trained in science to analyze data from multiple fields and come up with sound, independent interpretations. I spent decades in medical research, but I will never understand particle physics, and I’ve forgotten almost everything I ever learned about inorganic chemistry. It is possible, however, to learn enough about the powers and limitations of the scientific method to intelligently determine which claims made by scientists are likely to be true and which deserve skepticism. As a starting point, we could teach our children that the theories and technologies that have been tested the most times, by the largest number of independent observers, over the greatest number of years, are the most likely to be reliable.
markfrankel18

One of Us - Lapham's Quarterly - 0 views

  • These are stimulating times for anyone interested in questions of animal consciousness. On what seems like a monthly basis, scientific teams announce the results of new experiments, adding to a preponderance of evidence that we’ve been underestimating animal minds, even those of us who have rated them fairly highly. New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hideyholes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” It goes further to conclude that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”
Lawrence Hrubes

What Did You Think of 'What's Going On in This Picture'? - The New York Times - 1 views

  • Last October we introduced a new feature in which, each Monday morning this school year, we posted a New York Times photograph without a caption, then invited students to answer three deceptively simple questions about it: What’s going on in this picture? What do you see that makes you say that? What more can you find? As student answers poured in to the blog each week — this shows someone learning a back-flip; I think he’s in the military because of his camouflage shirt; The background makes it look like a movie set — our collaborators for the feature, Visual Thinking Strategies, acted as live moderators, linking thoughts and posting further questions intended to help them go deeper and see more. Most weeks that created a lively debate in our comments section, as students of all ages, backgrounds and places pushed each other to find detail and defend interpretations.
Lawrence Hrubes

Period. Full Stop. Point. Whatever It's Called, It's Going Out of Style - The New York Times - 0 views

  • The period — the full-stop signal we all learn as children, whose use stretches back at least to the Middle Ages — is gradually being felled in the barrage of instant messaging that has become synonymous with the digital age
  • Increasingly, says Professor Crystal, whose books include “Making a Point: The Persnickety Story of English Punctuation,” the period is being deployed as a weapon to show irony, syntactic snark, insincerity, even aggression
  • At the same time, he said he found that British teenagers were increasingly eschewing emoticons and abbreviations such as “LOL” (laughing out loud) or “ROTF” (rolling on the floor) in text messages because they had been adopted by their parents and were therefore considered “uncool”
  • Note: this article was written with an intentional lack of periods