TOK Friends: Group items matching "smartphones" in title, tags, annotations or url
Javier E

Resist the Internet - The New York Times - 0 views

  • Definitely if you’re young, increasingly if you’re old, your day-to-day, minute-to-minute existence is dominated by a compulsion to check email and Twitter and Facebook and Instagram with a frequency that bears no relationship to any communicative need.
  • it requires you to focus intensely, furiously, and constantly on the ephemera that fills a tiny little screen, and experience the traditional graces of existence — your spouse and friends and children, the natural world, good food and great art — in a state of perpetual distraction.
  • It certainly delivers some social benefits, some intellectual advantages, and contributes an important share to recent economic growth.
  • They are the masters; we are not. They are built to addict us, as the social psychologist Adam Alter’s new book “Irresistible” points out — and to madden us, distract us, arouse us and deceive us.
  • We primp and perform for them as for a lover; we surrender our privacy to their demands; we wait on tenterhooks for every “like.” The smartphone is in the saddle, and it rides mankind.
  • the internet, like alcohol, may be an example of a technology that should be sensibly restricted in custom and in law.
  • Used within reasonable limits, of course, these devices also offer us new graces. But we are not using them within reasonable limits.
  • there are also excellent reasons to think that online life breeds narcissism, alienation and depression, that it’s an opiate for the lower classes and an insanity-inducing influence on the politically-engaged, and that it takes more than it gives from creativity and deep thought. Meanwhile the age of the internet has been, thus far, an era of bubbles, stagnation and democratic decay — hardly a golden age whose customs must be left inviolate.
  • So a digital temperance movement would start by resisting the wiring of everything, and seek to create more spaces in which internet use is illegal, discouraged or taboo. Toughen laws against cellphone use in cars, keep computers out of college lecture halls, put special “phone boxes” in restaurants where patrons would be expected to deposit their devices, confiscate smartphones being used in museums and libraries and cathedrals, create corporate norms that strongly discourage checking email in a meeting.
  • Then there are the starker steps. Get computers — all of them — out of elementary schools, where there is no good evidence that they improve learning. Let kids learn from books for years before they’re asked to go online for research; let them play in the real before they’re enveloped by the virtual
  • The age of consent should be 16, not 13, for Facebook accounts. Kids under 16 shouldn’t be allowed on gaming networks. High school students shouldn’t bring smartphones to school. Kids under 13 shouldn’t have them at all.
  • I suspect that versions of these ideas will be embraced within my lifetime by a segment of the upper class and a certain kind of religious family. But the masses will still be addicted, and the technology itself will have evolved to hook and immerse — and alienate and sedate — more completely and efficiently.
Javier E

Revisiting the prophetic work of Neil Postman about the media » MercatorNet - 1 views

  • The NYU professor was surely prophetic. “Our own tribe is undergoing a vast and trembling shift from the magic of writing to the magic of electronics,” he cautioned.
  • “We face the rapid dissolution of the assumptions of an education organised around the slow-moving printed word, and the equally rapid emergence of a new education based on the speed-of-light electronic message.”
  • What Postman perceived in television has been dramatically intensified by smartphones and social media
  • Postman also recognised that technology was changing our mental processes and social habits.
  • Today corporations like Google and Amazon collect data on Internet users based on their browsing history, the things they purchase, and the apps they use
  • Yet all citizens are undergoing this same transformation. Our digital devices undermine social interactions by isolating us,
  • “Years from now, it will be noticed that the massive collection and speed-of-light retrieval of data have been of great value to large-scale organisations, but have solved very little of importance to most people, and have created at least as many problems for them as they may have solved.”
  • “Television has by its power to control the time, attention, and cognitive habits of our youth gained the power to control their education.”
  • As a student of Canadian philosopher Marshall McLuhan, Postman believed that the medium of information was critical to understanding its social and political effects. Every technology has its own agenda. Postman worried that the very nature of television undermined American democratic institutions.
  • Many Americans tuned in to the presidential debate looking for something substantial and meaty
  • It was simply another manifestation of the incoherence and vitriol of cable news
  • “When, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk; culture-death is a clear possibility,” warned Postman.
  • Technology Is Never Neutral
  • As for new problems, we have increased addictions (technological and pornographic); increased loneliness, anxiety, and distraction; and inhibited social and intellectual maturation.
  • The average length of a shot on network television is only 3.5 seconds, so that the eye never rests, always has something new to see. Moreover, television offers viewers a variety of subject matter, requires minimal skills to comprehend it, and is largely aimed at emotional gratification.
  • This is far truer of the Internet and social media, where more than a third of Americans, and almost half of young people, now get their news.
  • with smartphones now ubiquitous, the Internet has replaced television as the “background radiation of the social and intellectual universe.”
  • Is There Any Solution?
  • Reading news or commentary in print, in contrast, requires concentration, patience, and careful reflection, virtues that our digital age vitiates.
  • Politics as Entertainment
  • “How television stages the world becomes the model for how the world is properly to be staged,” observed Postman. In the case of politics, television fashions public discourse into yet another form of entertainment
  • In America, the fundamental metaphor for political discourse is the television commercial. The television commercial is not at all about the character of products to be consumed. … They tell everything about the fears, fancies, and dreams of those who might buy them.
  • The television commercial has oriented business away from making products of value and towards making consumers feel valuable, which means that the business of business has now become pseudo-therapy. The consumer is a patient assured by psycho-dramas.
  • Such is the case with the way politics is “advertised” to different subsets of the American electorate. The “consumer,” depending on his political leanings, may be manipulated by fears of either an impending white-nationalist, fascist dictatorship, or a radical, woke socialist takeover.
  • This paradigm is aggravated by the hypersiloing of media content, which explains why Americans who read left-leaning media view the Proud Boys as a legitimate, existential threat to national civil order, while those who read right-leaning media believe the real immediate enemies of our nation are Antifa
  • Regardless of whether either of these groups represents a real public menace, the loss of any national consensus over what constitutes objective news means that Americans effectively talk past one another: they use the Proud Boys or Antifa as rhetorical barbs to smear their ideological opponents as extremists.
  • Yet these technologies are far from neutral. They are, rather, “equipped with a program for social change.
  • Postman’s analysis of technology is prophetic and profound. He warned of the trivialising of our media, defined by “broken time and broken attention,” in which “facts push other facts into and then out of consciousness at speeds that neither permit nor require evaluation.” He warned of “a neighborhood of strangers and pointless quantity.”
  • does Postman offer any solutions to this seemingly uncontrollable technological juggernaut?
  • Postman’s suggestions regarding education are certainly relevant. He unequivocally condemned education that mimics entertainment, and urged a return to learning that is hierarchical, meaning that it first gives students a foundation of essential knowledge before teaching “critical thinking.”
  • Postman also argued that education must avoid a lowest-common-denominator approach in favor of complexity and the perplexing: the latter method elicits in the student a desire to make sense of what perplexes him.
  • Finally, Postman promoted education of vigorous exposition, logic, and rhetoric, all being necessary for citizenship
  • Another course of action is to understand what these media, by their very nature, do to us and to public discourse.
  • We must, as Postman exhorts us, “demystify the data” and dominate our technology, lest it dominate us. We must identify and resist how television, social media, and smartphones manipulate our emotions, infantilise us, and weaken our ability to rebuild what 2020 has ravaged.
katherineharron

Screen time: Mental health menace or scapegoat? - CNN - 0 views

  • (CNN) "Have smartphones destroyed a generation?" Jean Twenge, a professor of psychology at San Diego State University, asked in an adapted excerpt of her controversial book, "iGen." In the book, she argues that those born after 1995 are on the "brink of a mental-health crisis" -- and she believes it can be linked to growing up with their noses pressed against a screen.
  • For those who responded 10 to 19 hours per week, that number was about 18%. For those who spent 40 or more hours a week using social media, that number approached 24%.
  • By the twelfth grade, however, the negative correlations between screen time and teen psychology had somewhat dissipated. In addition, less is not always more: Teens with zero hours of screen time had higher rates of unhappiness than their peers who logged in a few hours a week.
  • Twenge recognizes that her study shows only a correlation between screen use and "psychological well-being," which is measured using survey questions about self-esteem, life satisfaction and happiness. The surveys can't say whether screen time directly changes teens' mental health, the research states.
  • "I spent my career in technology. I wasn't prepared for its effect on my kids," philanthropist Melinda Gates, whose three children were also born after 1995, wrote August in the Washington Post. "Phones and apps aren't good or bad by themselves, but for adolescents who don't yet have the emotional tools to navigate life's complications and confusions, they can exacerbate the difficulties of growing up."
  • At the same time, she said, kids are learning on their devices and connecting in novel ways. "Marginalized groups such as gay and lesbian students (are) finding support they never had before through social networks," said Gates.
  • Other studies have explored the connection between social media and isolation and how "likes" activate the brain's reward center. Some analyses have found that moderate use of these technologies is "not intrinsically harmful" and can even improve social skills and develop resilience.
sandrine_h

Does your smartphone make you less likely to trust others? - 0 views

  • When trust between people in a country goes up, for example, so does economic growth. At the individual level, people who trust others more also tend to have better health and higher well-being. Could our increasing reliance on information from devices, rather than from other people, be costing us opportunities to build social capital?
  • As information technology continues to make our lives easier, our findings highlight the possible social costs of constant information access: By turning to convenient electronic devices, people may be forgoing opportunities to foster trust – a finding that seems particularly poignant in the present political climate.
summertyler

Last words? Phone app bids to save dying aboriginal language - CNN.com - 0 views

  • A smartphone app has been launched to help save an Australian indigenous language that is in danger of disappearing.
  • aims to prevent the extinction of the Iwaidja language
  • "People have their phones with them most of the time, the app is incredibly easy to use, and this allows data collection to happen spontaneously, opportunistically,"
  • "We believe the tools we are developing will exponentially increase the involvement of the Indigenous people whose languages are threatened, without the need for difficult-to-attain levels of computer literacy,"
  • Until now endangered aboriginal languages were recorded in the presence of a linguist and selected native speakers with recording equipment
  • indigenous people whose languages are threatened can record and upload languages at their own pace and at times which suit them, he says, without requiring the presence of a specialist holding a microphone
  • "The ability provided by the tools we are developing to easily create, record and share language, images, and video, at the same time as building sustainable databases for future use, involves and empowers speakers of indigenous languages in a way which has not been possible before."
  • Language is a barrier, and people are trying to break down these barriers.
Javier E

The Cheapest Generation - The Atlantic - 0 views

  • today’s young people simply don’t drive like their predecessors did. In 2010, adults between the ages of 21 and 34 bought just 27 percent of all new vehicles sold in America, down from the peak of 38 percent in 1985. Miles driven are down, too. Even the proportion of teenagers with a license fell, by 28 percent, between 1998 and 2008.
  • What if Millennials’ aversion to car-buying isn’t a temporary side effect of the recession, but part of a permanent generational shift in tastes and spending habits? It’s a question that applies not only to cars, but to several other traditional categories of big spending—most notably, housing. And its answer has large implications for the future shape of the economy—and for the speed of recovery.
  • Half of a typical family’s spending today goes to transportation and housing
  • technology is allowing these practices to go mainstream, and that represents a big new step for consumers. For decades, inventory management was largely the province of companies, not individuals,
  • The Great Recession is responsible for some of the decline. But it’s highly possible that a perfect storm of economic and demographic factors—from high gas prices, to re-urbanization, to stagnating wages, to new technologies enabling a different kind of consumption—has fundamentally changed the game for Millennials
  • The emergence of the “sharing economy”—services that use the Web to let companies and families share otherwise idle goods—is headlined by Zipcar, but it also involves companies such as Airbnb, a shared marketplace for bedrooms and other accommodations for travelers; and thredUP, a site where parents can buy and sell kids’ used clothing.
  • Millennials have turned against both cars and houses in dramatic and historic fashion. Just as car sales have plummeted among their age cohort, the share of young people getting their first mortgage between 2009 and 2011 is half what it was just 10 years ago
  • today, peer-to-peer software and mobile technology allow us all to have access, just when we need it, to the things we used to have to buy and hold. And the most powerful application is for cars.
  • Car ownership, meanwhile, has slipped down the hierarchy of status goods for many young adults. “Zipcar conducted a survey of Millennials,
  • “And this generation said, ‘We don’t care about owning a car.’ Cars used to be what people aspired to own. Now it’s the smartphone.”
  • Smartphones compete against cars for young people’s big-ticket dollars, since the cost of a good phone and data plan can exceed $1,000 a year. But they also provide some of the same psychic benefits—opening new vistas and carrying us far from the physical space in which we reside. “You no longer need to feel connected to your friends with a car
  • mobile technology has empowered more than just car-sharing. It has empowered friendships that can be maintained from a distance. The upshot could be a continuing shift from automobiles to mobile technology, and a big reduction in spending.
Javier E

The Fall of Facebook - The Atlantic - 0 views

  • Alexis C. Madrigal (Nov. 17, 2014): Social networking is not, it turns out, winner take all. In the past, one might have imagined that switching between Facebook and “some other network” would be difficult, but the smartphone interface makes it easy to be on a dozen networks. All messages come to the same place—the phone’s notifications screen—so what matters is what your friends are doing, not which apps they’re using.
  • if I were to put money on an area in which Facebook might be unable to dominate in the future, it would be apps that take advantage of physical proximity. Something radically new could arise on that front, whether it’s an evolution of Yik Yak
  • The Social Machine, predicts that text will be a less and less important part of our asynchronous communications mix. Instead, she foresees a “very fluid interface” that would mix text with voice, video, sensor outputs (location, say, or vital signs), and who knows what else
  • the forthcoming Apple Watch seems like a step toward the future Donath envisions. Users will be able to send animated smiley faces, drawings, voice snippets, and even their live heartbeats, which will be tapped out on the receiver’s wrist.
  • A simple but rich messaging platform—perhaps with specialized hardware—could replace the omnibus social network for most purposes. “I think we’re shifting in a weird way to one-on-one conversations on social networks and in messaging apps,” says Shani Hilton, the executive editor for news at BuzzFeed, the viral-media site. “People don’t want to perform their lives publicly in the same way that they wanted to five years ago.”
  • Facebook is built around a trade-off that it has asked users to make: Give us all your personal information, post all your pictures, tag all your friends, and so on, forever. In return, we’ll optimize your social life. But this output is only as good as the input. And it turns out that, when scaled up, creating this input—making yourself legible enough to the Facebook machine that your posts are deemed “relevant” and worthy of being displayed to your mom and your friends—is exhausting labor.
  • These new apps, then, are arguments that we can still have an Internet that is weird, and private. That we can still have social networks without the social network. And that we can still have friends on the Internet without “friending” them.
  • A Brief History of Information Gatekeepers
    1871: Western Union controls 90 percent of U.S. telegraph traffic.
    1947: 97 percent of the country’s radio stations are affiliated with one of four national networks.
    1969: Viewership for the three nightly network newscasts hits an all-time high, with 50 percent of all American homes tuning in.
    1997: About half of all American homes with Internet access get it through America Online.
    2002: Microsoft Internet Explorer captures 97 percent of the worldwide browser market.
    2014: Amazon sells 63 percent of all books bought online—and 40 percent of books overall.
anonymous

Walmart Prepares to Enter Mobile Payments Business - The New York Times - 0 views

  • “Soon, customers can leave only with their keys and smartphone to shop at their local Walmart,” said Neil Ashe, Walmart’s e-commerce chief. “It’s fast, easy and secure.”
  • “When Apple released Apple Pay, the idea was that we’re now moving forward with N.F.C., and that’s the way all mobile payments will be transacted,
  • “But now, we’re beginning to see that no, that’s probably not the way it’s going to develop, and it’s not something we’re just going to give over to Apple or Google,”
  • mobile payment system for its stores
  • And in a change of strategy for Walmart, its new payment app does not seek to bypass credit card companies.
  • “When you look at these things, the customer has constraints put upon them, and that creates frictions and seams in the shopping experience,” Mr. Eckert said. “But Walmart Pay works with any smartphone and almost any payment type.”
  • “And once that happens, literally, the customer is done,”
  • “They can put their phone away. Once the transaction’s complete, we total up the register, and the customer can leave.”
  • There are also concerns over security.
Javier E

After the Fact - The New Yorker - 1 views

  • newish is the rhetoric of unreality, the insistence, chiefly by Democrats, that some politicians are incapable of perceiving the truth because they have an epistemological deficit: they no longer believe in evidence, or even in objective reality.
  • the past of proof is strange and, on its uncertain future, much in public life turns. In the end, it comes down to this: the history of truth is cockamamie, and lately it’s been getting cockamamier.
  • Michael P. Lynch is a philosopher of truth. His fascinating new book, “The Internet of Us: Knowing More and Understanding Less in the Age of Big Data,” begins with a thought experiment: “Imagine a society where smartphones are miniaturized and hooked directly into a person’s brain.” As thought experiments go, this one isn’t much of a stretch. (“Eventually, you’ll have an implant,” Google’s Larry Page has promised, “where if you think about a fact it will just tell you the answer.”) Now imagine that, after living with these implants for generations, people grow to rely on them, to know what they know and forget how people used to learn—by observation, inquiry, and reason. Then picture this: overnight, an environmental disaster destroys so much of the planet’s electronic-communications grid that everyone’s implant crashes. It would be, Lynch says, as if the whole world had suddenly gone blind. There would be no immediate basis on which to establish the truth of a fact. No one would really know anything anymore, because no one would know how to know. I Google, therefore I am not.
  • In England, the abolition of trial by ordeal led to the adoption of trial by jury for criminal cases. This required a new doctrine of evidence and a new method of inquiry, and led to what the historian Barbara Shapiro has called “the culture of fact”: the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth and the only kind of evidence that’s admissible not only in court but also in other realms where truth is arbitrated. Between the thirteenth century and the nineteenth, the fact spread from law outward to science, history, and journalism.
  • Lynch isn’t terribly interested in how we got here. He begins at the arrival gate. But altering the flight plan would seem to require going back to the gate of departure.
  • Lynch thinks we are frighteningly close to this point: blind to proof, no longer able to know. After all, we’re already no longer able to agree about how to know. (See: climate change, above.)
  • Empiricists believed they had deduced a method by which they could discover a universe of truth: impartial, verifiable knowledge. But the movement of judgment from God to man wreaked epistemological havoc.
  • For the length of the eighteenth century and much of the nineteenth, truth seemed more knowable, but after that it got murkier. Somewhere in the middle of the twentieth century, fundamentalism and postmodernism, the religious right and the academic left, met up: either the only truth is the truth of the divine or there is no truth; for both, empiricism is an error.
  • That epistemological havoc has never ended: much of contemporary discourse and pretty much all of American politics is a dispute over evidence. An American Presidential debate has a lot more in common with trial by combat than with trial by jury,
  • came the Internet. The era of the fact is coming to an end: the place once held by “facts” is being taken over by “data.” This is making for more epistemological mayhem, not least because the collection and weighing of facts require investigation, discernment, and judgment, while the collection and analysis of data are outsourced to machines
  • “Most knowing now is Google-knowing—knowledge acquired online,”
  • We now only rarely discover facts, Lynch observes; instead, we download them.
  • “The Internet didn’t create this problem, but it is exaggerating it,”
  • nothing could be less well settled in the twenty-first century than whether people know what they know from faith or from facts, or whether anything, in the end, can really be said to be fully proved.
  • In his 2012 book, “In Praise of Reason,” Lynch identified three sources of skepticism about reason: the suspicion that all reasoning is rationalization, the idea that science is just another faith, and the notion that objectivity is an illusion. These ideas have a specific intellectual history, and none of them are on the wane.
  • Their consequences, he believes, are dire: “Without a common background of standards against which we measure what counts as a reliable source of information, or a reliable method of inquiry, and what doesn’t, we won’t be able to agree on the facts, let alone values.
  • When we Google-know, Lynch argues, we no longer take responsibility for our own beliefs, and we lack the capacity to see how bits of facts fit into a larger whole
  • Essentially, we forfeit our reason and, in a republic, our citizenship. You can see how this works every time you try to get to the bottom of a story by reading the news on your smartphone.
  • what you see when you Google “Polish workers” is a function of, among other things, your language, your location, and your personal Web history. Reason can’t defend itself. Neither can Google.
  • Trump doesn’t reason. He’s a lot like that kid who stole my bat. He wants combat. Cruz’s appeal is to the judgment of God. “Father God, please . . . awaken the body of Christ, that we might pull back from the abyss,” he preached on the campaign trail. Rubio’s appeal is to Google.
  • Is there another appeal? People who care about civil society have two choices: find some epistemic principles other than empiricism on which everyone can agree or else find some method other than reason with which to defend empiricism
  • Lynch suspects that doing the first of these things is not possible, but that the second might be. He thinks the best defense of reason is a common practical and ethical commitment.
  • That, anyway, is what Alexander Hamilton meant in the Federalist Papers, when he explained that the United States is an act of empirical inquiry: “It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.”
Javier E

How to Get Your Mind to Read - The New York Times - 1 views

  • Americans’ trouble with reading predates digital technologies. The problem is not bad reading habits engendered by smartphones, but bad education habits engendered by a misunderstanding of how the mind reads.
  • Just how bad is our reading problem? The last National Assessment of Adult Literacy from 2003 is a bit dated, but it offers a picture of Americans’ ability to read in everyday situations: using an almanac to find a particular fact, for example, or explaining the meaning of a metaphor used in a story. Of those who finished high school but did not continue their education, 13 percent could not perform simple tasks like these.
  • When things got more complex — in comparing two newspaper editorials with different interpretations of scientific evidence or examining a table to evaluate credit card offers — 95 percent failed.
  • poor readers can sound out words from print, so in that sense, they can read. Yet they are functionally illiterate — they comprehend very little of what they can sound out. So what does comprehension require? Broad vocabulary, obviously. Equally important, but more subtle, is the role played by factual knowledge.
  • All prose has factual gaps that must be filled by the reader.
  • Knowledge also provides context.
  • You might think, then, that authors should include all the information needed to understand what they write.
  • Current education practices show that reading comprehension is misunderstood. It’s treated like a general skill that can be applied with equal success to all texts. Rather, comprehension is intimately intertwined with knowledge.
  • students who score well on reading tests are those with broad knowledge; they usually know at least a little about the topics of the passages on the test.
  • One experiment tested 11th graders’ general knowledge with questions from science (“pneumonia affects which part of the body?”), history (“which American president resigned because of the Watergate scandal?”), as well as the arts, civics, geography, athletics and literature. Scores on this general knowledge test were highly associated with reading test scores.
  • But those details would make prose long and tedious for readers who already know the information. “Write for your audience” means, in part, gambling on what they know.
  • That suggests three significant changes in schooling.
  • First, it points to decreasing the time spent on literacy instruction in early grades.
  • Third-graders spend 56 percent of their time on literacy activities but 6 percent each on science and social studies. This disproportionate emphasis on literacy backfires in later grades, when children’s lack of subject matter knowledge impedes comprehension.
  • Another positive step would be to use high-information texts in early elementary grades. Historically, they have been light in content.
  • Second, understanding the importance of knowledge to reading ought to make us think differently about year-end standardized tests. If a child has studied New Zealand, she ought to be good at reading and thinking about passages on New Zealand. Why test her reading with a passage about spiders, or the Titanic?
  • Third, the systematic building of knowledge must be a priority in curriculum design.
  • The Common Core Standards for reading specify nearly nothing by way of content that children are supposed to know — the document valorizes reading skills. State officials should go beyond the Common Core Standards by writing content-rich grade-level standards
  • Don’t blame the internet, or smartphones, or fake news for Americans’ poor reading. Blame ignorance. Turning the tide will require profound changes in how reading is taught, in standardized testing and in school curriculums. Underlying all these changes must be a better understanding of how the mind comprehends what it reads.
  • Daniel T. Willingham (@DTWillingham) is a professor of psychology at the University of Virginia and the author, most recently, of “The Reading Mind: A Cognitive Approach to Understanding How the Mind Reads.”
Javier E

You Think With the World, Not Just Your Brain - The Atlantic - 2 views

  • embodied or extended cognition: broadly, the theory that what we think of as brain processes can take place outside of the brain.
  • The octopus, for instance, has a bizarre and miraculous mind, sometimes inside its brain, sometimes extending beyond it in sucker-tipped trails. Neurons are spread throughout its body; the creature has more of them in its arms than in its brain itself. It’s possible that each arm might be, to some extent, an independently thinking creature, all of which are collapsed into an octopean superconsciousness in times of danger
  • Embodied cognition, though, tells us that we’re all more octopus-like than we realize. Our minds are not like the floating conceptual “I” imagined by Descartes. We’re always thinking with, and inseparable from, our bodies.
  • The body codes how the brain works, more than the brain controls the body. When we walk—whether taking a pleasant afternoon stroll, or storming off in tears, or trying to sneak into a stranger’s house late at night, with intentions that seem to have exploded into our minds from some distant elsewhere—the brain might be choosing where each foot lands, but the way in which it does so is always constrained by the shape of our legs
  • The way in which the brain approaches the task of walking is already coded by the physical layout of the body—and as such, wouldn’t it make sense to think of the body as being part of our decision-making apparatus? The mind is not simply the brain, as a generation of biological reductionists, clearing out the old wreckage of what had once been the soul, once insisted. It’s not a kind of software being run on the logical-processing unit of the brain. It’s bigger, and richer, and grosser, in every sense. It has joints and sinews. The rarefied rational mind sweats and shits; this body, this mound of eventually rotting flesh, is really you.
  • That’s embodied cognition.
  • Extended cognition is stranger.
  • The mind, they argue, has no reason to stop at the edges of the body, hemmed in by skin, flapping open and closed with mouths and anuses.
  • When we jot something down—a shopping list, maybe—on a piece of paper, aren’t we in effect remembering it outside our heads? Most of all, isn’t language itself something that’s always external to the individual mind?
  • Language sits hazy in the world, a symbolic and intersubjective ether, but at the same time it forms the substance of our thought and the structure of our understanding. Isn’t language thinking for us?
  • Writing, for Plato, is a pharmakon, a “remedy” for forgetfulness, but if taken in too strong a dose it becomes a poison: A person no longer remembers things for themselves; it’s the text that remembers, with an unholy autonomy. The same criticisms are now commonly made of smartphones. Not much changes.
ilanaprincilus06

Why the modern world is bad for your brain | Science | The Guardian - 0 views

  • Our brains are busier than ever before. We’re assaulted with facts, pseudo facts, jibber-jabber, and rumour, all posing as information. Trying to figure out what you need to know and what you can ignore is exhausting.
  • Our smartphones have become Swiss army knife–like appliances that include a dictionary, calculator, web browser, email, Game Boy, appointment calendar, voice recorder, guitar tuner, weather forecaster, GPS, texter, tweeter, Facebook updater, and flashlight.
  • But there’s a fly in the ointment. Although we think we’re doing several things at once, multitasking, this is a powerful and diabolical illusion.
  • When people think they’re multitasking, they’re actually just switching from one task to another very rapidly. And every time they do, there’s a cognitive cost in doing so.”
  • Even though we think we’re getting a lot done, ironically, multitasking makes us demonstrably less efficient.
  • Multitasking creates a dopamine-addiction feedback loop, effectively rewarding the brain for losing focus and for constantly searching for external stimulation.
  • The irony here for those of us who are trying to focus amid competing activities is clear: the very brain region we need to rely on for staying on task is easily distracted.
  • Instead of reaping the big rewards that come from sustained, focused effort, we instead reap empty rewards from completing a thousand little sugar-coated tasks.
  • His research found that being in a situation where you are trying to concentrate on a task, and an email is sitting unread in your inbox, can reduce your effective IQ by 10 points.
  • Wilson showed that the cognitive losses from multitasking are even greater than the cognitive losses from pot‑smoking.
  • If students study and watch TV at the same time, for example, the information from their schoolwork goes into the striatum, a region specialised for storing new procedures and skills, not facts and ideas. Without the distraction of TV, the information goes into the hippocampus, where it is organised and categorised in a variety of ways, making it easier to retrieve.
  • All this activity gives us a sense that we’re getting things done – and in some cases we are. But we are sacrificing efficiency and deep concentration when we interrupt our priority activities with email.
  • This uncertainty wreaks havoc with our rapid perceptual categorisation system, causes stress, and leads to decision overload. Every email requires a decision! Do I respond to it? If so, now or later? How important is it? What will be the social, economic, or job-related consequences if I don’t answer, or if I don’t answer right now?
  • A lever in the cage allowed the rats to send a small electrical signal directly to their nucleus accumbens. Do you think they liked it? Boy how they did! They liked it so much that they did nothing else. They forgot all about eating and sleeping. Long after they were hungry, they ignored tasty food if they had a chance to press that little chrome bar;
  • But remember, it is the dumb, novelty-seeking portion of the brain driving the limbic system that induces this feeling of pleasure, not the planning, scheduling, higher-level thought centres in the prefrontal cortex. Make no mistake: email-, Facebook- and Twitter-checking constitute a neural addiction.
knudsenlu

You Are Already Living Inside a Computer - The Atlantic - 1 views

  • Nobody really needs smartphone-operated bike locks or propane tanks. And they certainly don’t need gadgets that are less trustworthy than the “dumb” ones they replace, a sin many smart devices commit. But people do seem to want them—and in increasing numbers.
  • Why? One answer is that consumers buy what is on offer, and manufacturers are eager to turn their dumb devices smart. Doing so allows them more revenue, more control, and more opportunity for planned obsolescence. It also creates a secondary market for data collected by means of these devices. Roomba, for example, hopes to deduce floor plans from the movement of its robotic home vacuums so that it can sell them as business intelligence.
  • And the more people love using computers for everything, the more life feels incomplete unless it takes place inside them.
  • Computers already are predominant, human life already takes place mostly within them, and people are satisfied with the results.
  • These devices pose numerous problems. Cost is one. Like a cheap propane gauge, a traditional bike lock is a commodity. It can be had for $10 to $15, a tenth of the price of Nokē’s connected version. Security and privacy are others. The CIA was rumored to have a back door into Samsung TVs for spying. Disturbed people have been caught speaking to children over hacked baby monitors. A botnet commandeered thousands of poorly secured internet-of-things devices to launch a massive distributed denial-of-service attack against the domain-name system.
  • Reliability plagues internet-connected gadgets, too. When the network is down, or the app’s service isn’t reachable, or some other software behavior gets in the way, the products often cease to function properly—or at all.
  • Turing guessed that machines would become most compelling when they became convincing companions, which is essentially what today’s smartphones (and smart toasters) do.
  • But Turing never claimed that machines could think, let alone that they might equal the human mind. Rather, he surmised that machines might be able to exhibit convincing behavior.
  • People choose computers as intermediaries for the sensual delight of using computers
  • One such affection is the pleasure of connectivity. You don’t want to be offline. Why would you want your toaster or doorbell to suffer the same fate? Today, computational absorption is an ideal. The ultimate dream is to be online all the time, or at least connected to a computational machine of some kind.
  • Doorbells and cars and taxis hardly vanish in the process. Instead, they just get moved inside of computers.
  • “Being a computer” means something different today than in 1950, when Turing proposed the imitation game. Contra the technical prerequisites of artificial intelligence, acting like a computer often involves little more than moving bits of data around, or acting as a controller or actuator. Grill as computer, bike lock as computer, television as computer. An intermediary
  • Or consider doorbells once more. Forget Ring, the doorbell has already retired in favor of the computer. When my kids’ friends visit, they just text a request to come open the door. The doorbell has become computerized without even being connected to an app or to the internet. Call it “disruption” if you must, but doorbells and cars and taxis hardly vanish in the process. Instead, they just get moved inside of computers, where they can produce new affections.
  • The present status of intelligent machines is more powerful than any future robot apocalypse.
  • Why would anyone ever choose a solution that doesn’t involve computers, when computers are available? Propane tanks and bike locks are still edge cases, but ordinary digital services work similarly: The services people seek out are the ones that allow them to use computers to do things—from finding information to hailing a cab to ordering takeout. This is a feat of aesthetics as much as it is one of business. People choose computers as intermediaries for the sensual delight of using computers, not just as practical, efficient means for solving problems.
  • This is not where anyone thought computing would end up. Early dystopic scenarios cautioned that the computer could become a bureaucrat or a fascist, reducing human behavior to the predetermined capacities of a dumb machine. Or else, that obsessive computer use would be deadening, sucking humans into narcotic detachment. Those fears persist to some extent, partly because they have been somewhat realized. But they have also been inverted. Being away from them now feels deadening, rather than being attached to them without end. And thus, the actions computers take become self-referential: to turn more and more things into computers to prolong that connection.
  • But the real present status of intelligent machines is both humdrum and more powerful than any future robot apocalypse. Turing is often called the father of AI, but he only implied that machines might become compelling enough to inspire interaction. That hardly counts as intelligence, artificial or real. It’s also far easier to achieve. Computers already have persuaded people to move their lives inside of them. The machines didn’t need to make people immortal, or promise to serve their every whim, or to threaten to destroy them absent assent. They just needed to become a sufficient part of everything human beings do such that they can’t—or won’t—imagine doing those things without them.
  • The real threat of computers isn’t that they might overtake and destroy humanity with their future power and intelligence. It’s that they might remain just as ordinary and impotent as they are today, and yet overtake us anyway.
Javier E

Opinion | Lower fertility rates are the new cultural norm - The Washington Post - 0 views

  • The percentage who say that having children is very important to them has dropped from 43 percent to 30 percent since 2019. This fits with data showing that, since 2007, the total fertility rate in the United States has fallen from 2.1 lifetime births per woman, the “replacement rate” necessary to sustain population levels, to just 1.64 in 2020.
  • The U.S. economy is losing an edge that robust population dynamics gave it relative to low-birth-rate peer nations in Japan and Western Europe; this country, too, faces chronic labor-supply constraints as well as an even less favorable “dependency ratio” between workers and retirees than it already expected.
  • the timing and the magnitude of such a demographic sea-change cry out for explanation. What happened in 2007?
  • New financial constraints on family formation are a potential cause, as implied by another striking finding in the Journal poll — 78 percent of adults lack confidence this generation of children will enjoy a better life than they do.
  • Yet a recent analysis for the Aspen Economic Strategy Group by Melissa S. Kearney and Phillip B. Levine, economics professors at the University of Maryland and Wellesley College, respectively, determined that “beyond the temporary effects of the Great Recession, no recent economic or policy change is responsible for a meaningful share of the decline in the US fertility rate since 2007.”
  • Their study took account of such factors as the high cost of child care, student debt service and housing as well as Medicaid coverage and the wider availability of long-acting reversible contraception. Yet they had “no success finding evidence” that any of these were decisive.
  • Kearney and Levine speculated instead that the answers lie in the cultural zeitgeist — “shifting priorities across cohorts of young adults,”
  • A possibility worth considering, they suggested, is that young adults who experienced “intensive parenting” as children now balk at the heavy investment of time and resources needed to raise their own kids that way: It would clash with their career and leisure goals.
  • another event that year: Apple released the first iPhone, a revolutionary cultural moment if there ever was one. The ensuing smartphone-enabled social media boom — Facebook had opened membership to anyone older than 13 in 2006 — forever changed how human beings relate with one another.
  • We are just beginning to understand this development’s effect on mental health, education, religious observance, community cohesion — everything. Why wouldn’t it also affect people’s willingness to have children?
  • one indirect way new media affect childbearing rates is through “time competition effects” — essentially, hours spent watching the tube cannot be spent forming romantic partnerships.
  • a 2021 review of survey data on young adults and adolescents in the United States and other countries, the years between 2009 and 2018 saw a marked decline in reported sexual activity.
  • the authors hypothesized that people are distracted from the search for partners by “increasing use of computer games and social media.
  • during the late 20th century, Brazil’s fertility rates fell after women who watched soap operas depicting smaller families sought to emulate them by having fewer children themselves.
  • This may be an area where incentives do not influence behavior, at least not enough. Whether the cultural shift to lower birthrates occurs on an accelerated basis, as in the United States after 2007, or gradually, as it did in Japan, it appears permanent — “sticky,” as policy wonks say.
Javier E

Computer Algorithms Rely Increasingly on Human Helpers - NYTimes.com - 0 views

  • Although algorithms are growing ever more powerful, fast and precise, the computers themselves are literal-minded, and context and nuance often elude them. Capable as these machines are, they are not always up to deciphering the ambiguity of human language and the mystery of reasoning.
  • And so, while programming experts still write the step-by-step instructions of computer code, additional people are needed to make more subtle contributions as the work the computers do has become more involved. People evaluate, edit or correct an algorithm’s work. Or they assemble online databases of knowledge and check and verify them — creating, essentially, a crib sheet the computer can call on for a quick answer. Humans can interpret and tweak information in ways that are understandable to both computers and other humans.
  • Even at Google, where algorithms and engineers reign supreme in the company’s business and culture, the human contribution to search results is increasing. Google uses human helpers in two ways. Several months ago, it began presenting summaries of information on the right side of a search page when a user typed in the name of a well-known person or place, like “Barack Obama” or “New York City.” These summaries draw from databases of knowledge like Wikipedia, the C.I.A. World Factbook and Freebase, whose parent company, Metaweb, Google acquired in 2010. These databases are edited by humans.
  • When Google’s algorithm detects a search term for which this distilled information is available, the search engine is trained to go fetch it rather than merely present links to Web pages. “There has been a shift in our thinking,” said Scott Huffman, an engineering director in charge of search quality at Google. “A part of our resources are now more human curated.”
  • “Our engineers evolve the algorithm, and humans help us see if a suggested change is really an improvement,” Mr. Huffman said.
  • Ben Taylor, 25, is a product manager at FindTheBest, a fast-growing start-up in Santa Barbara, Calif. The company calls itself a “comparison engine” for finding and comparing more than 100 topics and products, from universities to nursing homes, smartphones to dog breeds. Its Web site went up in 2010, and the company now has 60 full-time employees. Mr. Taylor helps design and edit the site’s education pages. He is not an engineer, but an English major who has become a self-taught expert in the arcane data found in Education Department studies and elsewhere. His research methods include talking to and e-mailing educators. He is an information sleuth.
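The item above describes Google's two-part approach: when a query matches an entry in a human-curated knowledge base (Wikipedia, the C.I.A. World Factbook, Freebase), the engine fetches a distilled summary; otherwise it presents ordinary ranked links. Below is a minimal Python sketch of that lookup-then-fallback pattern; the data, function names, and link-search stand-in are illustrative assumptions, not Google's actual systems.

    # Sketch of the "curated summary first, web links as fallback" pattern.
    # CURATED_SUMMARIES stands in for the human-edited knowledge bases the
    # article mentions; web_link_results stands in for ordinary ranked search.
    CURATED_SUMMARIES = {
        "barack obama": "44th president of the United States (2009-2017).",
        "new york city": "Most populous city in the United States.",
    }

    def web_link_results(query):
        # Placeholder for an algorithmic ranked-links search.
        return ["https://example.com/search?q=" + query.replace(" ", "+")]

    def search(query):
        key = query.strip().lower()
        summary = CURATED_SUMMARIES.get(key)   # human-curated "crib sheet" lookup
        links = web_link_results(query)        # algorithmic results are still shown
        if summary is not None:
            return {"summary": summary, "links": links}
        return {"links": links}

    print(search("Barack Obama"))   # summary panel plus links
    print(search("obscure query"))  # links only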
Javier E

Google Glass May Be Hands-Free, But Not Brain-Free - NYTimes.com - 0 views

  • The “eyes-free” goal addresses an obvious limitation of the human brain: we can’t look away from where we’re heading for more than a few seconds without losing our bearings. And time spent looking at a cellphone is time spent oblivious to the world, as shown in the viral videos of distracted phone users who stumble into shopping-mall fountains. Most people intuitively grasp the “two-second rule.”
  • Researchers at the Virginia Tech Transportation Institute outfitted cars and trucks with cameras and sensors to monitor real-world driving behavior. When drivers were communicating, they tended to look away for as much as 4.6 seconds during a 6-second period. In effect, people lose track of time when texting, leading them to look at their phones far longer than they know they should
  • Heads-up displays like Google Glass, and voice interfaces like Siri, seem like ideal solutions, letting you simultaneously interact with your smartphone while staying alert to your surroundings
  • The problem is that looking is not the same as seeing, and people make wrong assumptions about what will grab their attention.
  • about 70 percent of Americans believe that “people will notice when something unexpected enters their field of view, even when they’re paying attention to something else.”
  • “inattentional blindness” shows that what we see depends not just on where we look but also on how we focus our attention.
  • Perception requires both your eyes and your mind, and if your mind is engaged, you can fail to see something right in front of you.
Javier E

Messages Galore, but No Time to Think - NYTimes.com - 0 views

  • “Nobody can think anymore because they’re constantly interrupted,” said Leslie Perlow, a Harvard Business School professor and author of “Sleeping With Your Smartphone.” “Technology has enabled this expectation that we always be on.”
  • To lessen the disruptive nature of e-mail and other messages, teams need to discuss how to alter their work process to allow blocks of time where they can disconnect entirely,
Javier E

Disruptions: Medicine That Monitors You - NYTimes.com - 0 views

  • researchers and some start-ups are already preparing the next, even more intrusive wave of computing: ingestible computers and minuscule sensors stuffed inside pills.
  • some people on the cutting edge are already swallowing them to monitor a range of health data and wirelessly share this information with a doctor
  • does not need a battery. Instead, the body is the power source. Just as a potato can power a light bulb, Proteus has added magnesium and copper on each side of its tiny sensor, which generates just enough electricity from stomach acids.
  • People with heart failure-related difficulties could monitor blood flow and body temperature; those with central nervous system issues, including schizophrenia and Alzheimer’s disease, could take the pills to monitor vital signs in real time.
  • Future generations of these pills could even be convenience tools.
  • Once that pill is in your body, you could pick up your smartphone and not have to type in a password. Instead, you are the password. Sit in the car and it will start. Touch the handle to your home door and it will automatically unlock. “Essentially, your entire body becomes your authentication token,
  • “The wonderful is that there are a great number of things you want to know about yourself on a continual basis, especially if you’re diabetic or suffer from another disease. The terrible is that health insurance companies could know about the inner workings of your body.”
  • And the implications of a tiny computer inside your body being hacked? Let’s say they are troubling.
  • After it has done its job, flowing down around the stomach and through the intestinal tract, what happens next? “It passes naturally through the body in about 24 hours,” Ms. Carbonelli said, but since each pill costs $46, “some people choose to recover and recycle it.”