TOK Friends: Group items tagged “attended memories”

How Sleeping Memories Come Back to Life | Time - 0 views

  • It’s almost a good thing that we’ve never been entirely able to figure out how human memory works, because if we did, we’d probably just forget.
    • ilanaprincilus06
       
      Shows just how unreliable our brains truly are
  • Working memories, it seems, are preserved in a latent or hidden state, existing without any evident activation at all until the moment they’re needed.
    • ilanaprincilus06
       
      Reminds me of learning about a new topic for a class and then somehow remembering the concept during an assessment.
  • Instead, however, while there was indeed detectable neural activity for the so-called attended memory item (AMI)—the one that the subjects knew they would need right away—there was none at all for the unattended memory items (UMI), which the subjects might also need, but not until later.
  • All the same, when subjects were asked about a UMI, a peak appeared for it just as it did for an AMI. In both cases, working memory worked just fine, but in one case it did so without the benefit of any visible storage system.
  • unattended memories are maintained in what the researchers called “a privileged state” only as long as they had to be.
  • Whatever the explanation, the work has implications for understanding not just memory but other cognitive functions like perception, attention and goal maintenance.
  • if noninvasive brain stimulation techniques can be used to reactivate and potentially strengthen latent memories”—in other words, recovering information that had seemingly been lost forever.
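The “latent or hidden state” described above has a standard computational reading: information can persist in synaptic weights even when no neurons are actively firing. Below is a minimal Hopfield-style sketch in Python of that idea, in which an item stored in a Hebbian weight matrix survives with zero ongoing activity and is recovered by a noisy probe, loosely analogous to the stimulation “ping” used in such studies. This is an illustrative toy with invented parameters, not the researchers’ model.

```python
# Toy "activity-silent" working memory (illustrative sketch, not the study's model).
import numpy as np

rng = np.random.default_rng(0)
N = 256
item = rng.choice([-1.0, 1.0], size=N)   # the unattended memory pattern (UMI)

W = np.outer(item, item) / N             # Hebbian storage: the trace lives in the weights
activity = np.zeros(N)                   # no persistent firing: the store is "silent"

# Stand-in for the reactivating pulse: a corrupted, partial version of the item.
probe = item.copy()
flip = rng.choice(N, size=N // 4, replace=False)
probe[flip] *= -1                        # corrupt 25% of the entries

recalled = np.sign(W @ probe)            # one recurrent pass reactivates the pattern
print("recovered:", np.array_equal(recalled, item))   # True for mild corruption
```

The point of the sketch is only that a memory can be fully recoverable while producing no detectable activity, which is consistent with a UMI peak reappearing the moment it is probed.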

Are search engines and the Internet hurting human memory? - Slate Magazine - 2 views

  • are we losing the power to retain knowledge? The short answer is: No. Machines aren’t ruining our memory. The longer answer: It’s much, much weirder than that!
  • we’ve begun to fit the machines into an age-old technique we evolved thousands of years ago—“transactive memory.” That’s the art of storing information in the people around us.
  • frankly, our brains have always been terrible at remembering details. We’re good at retaining the gist of the information we encounter. But the niggly, specific facts? Not so much.
  • subjects read several sentences. When he tested them 40 minutes later, they could generally remember the sentences word for word. Four days later, though, they were useless at recalling the specific phrasing of the sentences—but still very good at describing the meaning of them.
  • When you’re an expert in a subject, you can retain new factoids on your favorite topic easily. This only works for the subjects you’re truly passionate about, though
  • The groups that scored highest on a test of their transactive memory—in other words, the groups where members most relied on each other to recall information—performed better than those who didn’t use transactive memory. Transactive groups don’t just remember better: They also analyze problems more deeply, developing a better grasp of underlying principles.
  • Wegner noticed that spouses often divide up memory tasks. The husband knows the in-laws' birthdays and where the spare light bulbs are kept; the wife knows the bank account numbers and how to program the TiVo.
  • Together, they know a lot. Separately, less so.
  • Wegner suspected this division of labor takes place because we have pretty good "metamemory." We're aware of our mental strengths and limits, and we're good at intuiting the memory abilities of others.
  • We share the work of remembering, Wegner argued, because it makes us collectively smarter
  • They were, in a sense, Googling each other.
  • Transactive memory works best when you have a sense of how your partners' minds work—where they're strong, where they're weak, where their biases lie. I can judge that for people close to me. But it's harder with digital tools, particularly search engines
  • So humanity has always relied on coping devices to handle the details for us. We’ve long stored knowledge in books, paper, Post-it notes
  • And as it turns out, this is what we’re doing with Google and Evernote and our other digital tools. We’re treating them like crazily memorious friends who are usually ready at hand. Our “intimate dyad” now includes a silicon brain.
  • When Sparrow tested the students, the people who knew the computer had saved the information were less likely to personally recall the info than the ones who were told the trivia wouldn't be saved. In other words, if we know a digital tool is going to remember a fact, we're slightly less likely to remember it ourselves
  • believing that one won't have access to the information in the future enhances memory for the information itself, whereas believing the information was saved externally enhances memory for the fact that the information could be accessed.
  • Just as we learn through transactive memory who knows what in our families and offices, we are learning what the computer 'knows' and when we should attend to where we have stored information in our computer-based memories,
  • We’ve stored a huge chunk of what we “know” in people around us for eons. But we rarely recognize this because, well, we prefer our false self-image as isolated, Cartesian brains
  • We’re dumber and less cognitively nimble if we're not around other people—and, now, other machines.
  • When humans spew information at us unbidden, it's boorish. When machines do it, it’s enticing.
  • Though you might assume search engines are mostly used to answer questions, some research has found that up to 40 percent of all queries are acts of remembering. We're trying to refresh the details of something we've previously encountered.
  • "the thinking processes of the intimate dyad."
  • We need to develop literacy in these tools the way we teach kids how to spell and write; we need to be skeptical about search firms’ claims of being “impartial” referees of information
  • And on an individual level, it’s still important to slowly study and deeply retain things, not least because creative thought—those breakthrough ahas—come from deep and often unconscious rumination, your brain mulling over the stuff it has onboard.
  • you can stop worrying about your iPhone moving your memory outside your head. It moved out a long time ago—yet it’s still all around you.
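Wegner’s “transactive memory,” as excerpted above, behaves like a directory data structure: what each person keeps is not the facts themselves but metamemory, pointers to who knows what and how reliable they are. Here is a toy Python sketch of that directory, with invented names and confidence scores; the “search engine” fallback stands in for the article’s silicon brain.

```python
# A toy model of transactive memory: knowledge lives in other agents, and what
# each of us keeps is metamemory, a directory of who knows what.
# All names and scores here are illustrative, not from the article.
from collections import defaultdict

class TransactiveMemory:
    def __init__(self):
        # topic -> {partner: confidence that this partner knows the topic}
        self.directory = defaultdict(dict)

    def note_expertise(self, partner, topic, confidence):
        """Update metamemory after seeing a partner handle a topic."""
        self.directory[topic][partner] = confidence

    def who_to_ask(self, topic):
        """Route a question to the most trusted source, else fall back."""
        experts = self.directory.get(topic)
        if not experts:
            return "search engine"      # the silicon member of the dyad
        return max(experts, key=experts.get)

memory = TransactiveMemory()
memory.note_expertise("spouse", "in-laws' birthdays", 0.9)
memory.note_expertise("spouse", "spare light bulbs", 0.8)
memory.note_expertise("self", "bank account numbers", 0.95)
print(memory.who_to_ask("in-laws' birthdays"))   # spouse
print(memory.who_to_ask("TiVo programming"))     # search engine (no known expert)
```

The design choice worth noticing is that the dyad degrades gracefully: when no human expert is known, the query routes to the machine, which is exactly the substitution Sparrow’s experiments describe.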

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience.To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists.
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way.
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
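The auction detail excerpted above is what turned prediction into the product: under pay-per-click billing an ad earns nothing unless clicked, so its expected revenue per impression is its bid times the predicted click-through rate (pCTR). Below is a simplified Python sketch of that incentive, with invented numbers; real ad auctions are considerably more elaborate than this.

```python
# Why pay-per-click made prediction the product: expected revenue per impression
# is bid * pCTR, so better behavioral predictions directly increase income.
# Simplified sketch with invented numbers; not Google's actual auction.
def rank_ads(ads):
    """Order ads by expected revenue per impression (bid * predicted CTR)."""
    return sorted(ads, key=lambda ad: ad["bid"] * ad["pctr"], reverse=True)

ads = [
    {"name": "A", "bid": 2.00, "pctr": 0.010},  # high bid, poorly targeted
    {"name": "B", "bid": 0.50, "pctr": 0.060},  # low bid, well targeted
]
for ad in rank_ads(ads):
    print(ad["name"], "expected revenue:", round(ad["bid"] * ad["pctr"], 4))
# B outranks A (0.030 vs 0.020): sharper predictions about user behavior beat
# bigger bids, which is the "extraction imperative" in miniature.
```

Since ranking depends on pCTR, any improvement in behavioral prediction is worth money, and more behavioral data means better pCTR; that is the economic engine behind the surveillance Zuboff describes.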

Germans Protect Memorials to Soviet Troops Who Defeated Nazis - The New York Times - 0 views

  • In interviews across three German states, historians, activists, officials and ordinary citizens explained their support for monuments glorifying a former enemy and occupier as a mixture of bureaucratic drift, aversion to change and a rock-solid commitment to honoring the victims of Nazi aggression that trumps any shifts in global affairs.
  • “We were taught to learn from pain,” said Teresa Schneidewind, 33, the head of Lützen’s museum. “We care for our memorials, because they allow us to learn from the mistakes of past generations.”
  • Germany’s top court ruled just last year against the removal of a medieval, antisemitic sculpture in the very church where Martin Luther had preached. Despite debates, some swastikas from the Third Reich have been left on church bells.
  • Red Army memorials are just some of the divisive symbols that persist in Germany long after the political systems and social mores that sustained them have vanished, a reckoning with parallels in the United States and elsewhere.
  • Officials say their duty to care for such memorials dates to the so-called Good Neighbor agreement between Germany and the Soviet Union in 1990. Under that measure, each nation committed itself to the upkeep of the other’s war graves on its territory.
  • This propensity for what Ms. Schneidewind calls “historical hoarding” means that many Soviet memorials in East Germany contain Stalin’s name nearly 70 years after the dictator was largely purged from public spaces in Russia itself.
  • Most of the Red Army monuments in Germany are believed to have been built above the graves of Soviet soldiers or prisoners of war. The Russian Embassy has used the pact to draw the German government’s attention to Soviet monuments, including the one in Lützen, that have been damaged or neglected.
  • “Instead of tearing them down, you should redefine these memorials,” Mr. Nagel said. “You need to explain why they are here, and why you have a different view of them now.”
  • In Lützen, local residents say they want to keep their Red Army memorial as it is, a tribute to the central place occupied by the pyramid in the town’s public life during Communist rule. Some remember playing around it while attending the nearby kindergarten, and they say they will fight plans to move it to accommodate a proposed new supermarket.
  • “This is our history, no matter what is going on in world politics,” said the town’s mayor, Uwe Weiss. “We have to take care of it, because it is part of us.”

How Knowledge Changes Us - The New York Times - 1 views

  • They become the center of every room they enter, with all the attendant narcissism. They also have inside information, and often leap to the conclusion that people who don’t have this information are simply not worth listening to.
    • ilanaprincilus06
       
      Logical fallacy of appeal to authority. Their narcissism biases them toward assuming that others will always see them as correct simply because they hold more information and power.
  • “First, you’ll be exhilarated by some of this new information, and by having it all — so much! incredible! — suddenly available to you.
    • ilanaprincilus06
       
      Relates to our memory recollection: shortly after experiencing something new or exciting, we tend to encode it with happiness and other positive feelings.
  • Still, there are psychic effects that come from having this information. I have never seen them so perfectly expressed as by Daniel Ellsberg in a speech he supposedly gave to Henry Kissinger in 1968 as Kissinger was entering the government.
    • ilanaprincilus06
       
      One effect could be self-deception. Over time, the secret is likely to be altered by the experiences and actions of the secret holder, distorting the true nature of the secret.
  • you will forget there ever was a time when you didn’t have it, and you’ll be aware only of the fact that you have it now and most others don’t
    • ilanaprincilus06
       
      Relates to theory of mind: the secret constantly reminds the secret holder of the gap between their own knowledge of it and others' ignorance of it.
  • it’s often inaccurate
    • ilanaprincilus06
       
      Shows the effects the brain has on long-term memory: the premise of the secret may still be true, but the conclusions and supporting details are most likely skewed.

Sports' path to return to normal may not be clear - The Washington Post - 0 views

  • “It was the mental part. You want to get it right; you want to be fair; you want to be smart. It just felt like there was broken glass all around. And it wasn’t a matter of if you’re going to step on it but how deep is the cut going to be.”
    • colemorris
       
      It is very interesting to hear that even though he is not the one playing the sports, the broadcaster/analyst is still so heavily affected by this.
  • will sports be fun again — will they serve as the harmless distraction and social balm that once sustained so many?
  • In the past, sports have served as a welcome distraction through pain or trauma. But no game could blot out what was unfolding in 2020. It all blended together, the planet’s problems disrupting the sports world, along with everything else.
  • Even when the games returned, television ratings were down almost across the board. Sports hadn’t been able to offer a respite for everyone.
    • colemorris
       
      COVID really put the entire economy in shambles, no matter the business.
  • “Think about it: We took all the fundamental building blocks out of sports this year,” he said. “Anticipation, social gathering, water cooler talk, the ability to play — we took everything out.
  • Utah Jazz player Rudy Gobert tested positive in March and the NBA swiftly halted its season,
    • colemorris
       
      I remember this; in the press conference right before we found out he had it, he was seen mocking the disease and purposely touching everything around him.
  • A majority of fans, 56 percent, said people shouldn’t play indoor sports, according to a Marist poll this month. And 49 percent said fans shouldn’t be allowed to attend the Super Bowl.
  • McManus points out that nearly one in five sports fans, 18 percent, said they should be allowed to attend games right now and another 36 percent said they should be allowed to attend with restrictions in place.
  • “Americans, in particular, have very, very short memories,”
    • colemorris
       
      true

Retired Green Beret: Please think of Memorial Day as more than just a day off | Fox News - 0 views

  • Many Americans may be tired of armed conflict, but less than one percent of them will ever find themselves in remote proximity to a combat zone.
  • Extremism is not going away. We have already been fighting radical Islamic terrorism for decades. Our children and grandchildren will be fighting radical Islamic terrorism for decades to come.
  • Yet so few Americans understand the immense sacrifice undertaken by one of the most elite, rigorously trained, and patriotic groups of men in the world.
  • If you enjoy being able to speak your mind publicly and practice (or abstain from practicing) the religion of your choice, please think of this Memorial Day as more than a beachside, grill-side day off from work. Attend a local ceremony; thank a veteran; reach out to the family of a deployed or wounded service member.

How Did Consciousness Evolve? - The Atlantic - 0 views

  • Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?
  • The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions.
  • The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence
  • Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition.
  • At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing.
  • Selective enhancement therefore probably evolved sometime between hydras and arthropods—between about 700 and 600 million years ago, close to the beginning of complex, multicellular life
  • The next evolutionary advance was a centralized controller for attention that could coordinate among all senses. In many animals, that central controller is a brain area called the tectum
  • It coordinates something called overt attention – aiming the satellite dishes of the eyes, ears, and nose toward anything important.
  • All vertebrates—fish, reptiles, birds, and mammals—have a tectum. Even lampreys have one, and they appeared so early in evolution that they don’t even have a lower jaw. But as far as anyone knows, the tectum is absent from all invertebrates
  • According to fossil and genetic evidence, vertebrates evolved around 520 million years ago. The tectum and the central control of attention probably evolved around then, during the so-called Cambrian Explosion when vertebrates were tiny wriggling creatures competing with a vast range of invertebrates in the sea.
  • The tectum is a beautiful piece of engineering. To control the head and the eyes efficiently, it constructs something called an internal model, a feature well known to engineers. An internal model is a simulation that keeps track of whatever is being controlled and allows for predictions and planning.
  • The tectum’s internal model is a set of information encoded in the complex pattern of activity of the neurons. That information simulates the current state of the eyes, head, and other major body parts, making predictions about how these body parts will move next and about the consequences of their movement
  • In fish and amphibians, the tectum is the pinnacle of sophistication and the largest part of the brain. A frog has a pretty good simulation of itself.
  • With the evolution of reptiles around 350 to 300 million years ago, a new brain structure began to emerge – the wulst. Birds inherited a wulst from their reptile ancestors. Mammals did too, but our version is usually called the cerebral cortex and has expanded enormously
  • The cortex also takes in sensory signals and coordinates movement, but it has a more flexible repertoire. Depending on context, you might look toward, look away, make a sound, do a dance, or simply store the sensory event in memory in case the information is useful for the future.
  • The most important difference between the cortex and the tectum may be the kind of attention they control. The tectum is the master of overt attention—pointing the sensory apparatus toward anything important. The cortex ups the ante with something called covert attention. You don’t need to look directly at something to covertly attend to it. Even if you’ve turned your back on an object, your cortex can still focus its processing resources on it
  • The cortex needs to control that virtual movement, and therefore like any efficient controller it needs an internal model. Unlike the tectum, which models concrete objects like the eyes and the head, the cortex must model something much more abstract. According to the AST, it does so by constructing an attention schema—a constantly updated set of information that describes what covert attention is doing moment-by-moment and what its consequences are
  • Covert attention isn’t intangible. It has a physical basis, but that physical basis lies in the microscopic details of neurons, synapses, and signals. The brain has no need to know those details. The attention schema is therefore strategically vague. It depicts covert attention in a physically incoherent way, as a non-physical essence
  • this, according to the theory, is the origin of consciousness. We say we have consciousness because deep in the brain, something quite primitive is computing that semi-magical self-description.
  • I’m reminded of Teddy Roosevelt’s famous quote, “Do what you can with what you have where you are.” Evolution is the master of that kind of opportunism. Fins become feet. Gill arches become jaws. And self-models become models of others. In the AST, the attention schema first evolved as a model of one’s own covert attention. But once the basic mechanism was in place, according to the theory, it was further adapted to model the attentional states of others, to allow for social prediction. Not only could the brain attribute consciousness to itself, it began to attribute consciousness to others.
  • In the AST’s evolutionary story, social cognition begins to ramp up shortly after the reptilian wulst evolved. Crocodiles may not be the most socially complex creatures on earth, but they live in large communities, care for their young, and can make loyal if somewhat dangerous pets.
  • If AST is correct, 300 million years of reptilian, avian, and mammalian evolution have allowed the self-model and the social model to evolve in tandem, each influencing the other. We understand other people by projecting ourselves onto them. But we also understand ourselves by considering the way other people might see us.
  • The cortical networks in the human brain that allow us to attribute consciousness to others overlap extensively with the networks that construct our own sense of consciousness.
  • Language is perhaps the most recent big leap in the evolution of consciousness. Nobody knows when human language first evolved. Certainly we had it by 70 thousand years ago when people began to disperse around the world, since all dispersed groups have a sophisticated language. The relationship between language and consciousness is often debated, but we can be sure of at least this much: once we developed language, we could talk about consciousness and compare notes
  • Maybe partly because of language and culture, humans have a hair-trigger tendency to attribute consciousness to everything around us. We attribute consciousness to characters in a story, puppets and dolls, storms, rivers, empty spaces, ghosts and gods. Justin Barrett called it the Hyperactive Agency Detection Device, or HADD
  • the HADD goes way beyond detecting predators. It’s a consequence of our hyper-social nature. Evolution turned up the amplitude on our tendency to model others and now we’re supremely attuned to each other’s mind states. It gives us our adaptive edge. The inevitable side effect is the detection of false positives, or ghosts.
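The “selective signal enhancement” excerpted above is, computationally, a winner-take-all competition: each signal suppresses its rivals until only the strongest keeps driving behavior. Here is a toy Python sketch of that dynamic, with invented parameters; it is an illustration of the computational idea, not a biophysical model from the article.

```python
# Winner-take-all competition: a toy version of selective signal enhancement.
import numpy as np

def compete(signals, inhibition=0.045, steps=200):
    """Each signal is suppressed in proportion to its rivals' total activity."""
    x = np.asarray(signals, dtype=float)
    for _ in range(steps):
        rivals = x.sum() - x                           # competing activity seen by each signal
        x = np.maximum(x - inhibition * rivals, 0.0)   # mutual inhibition, floored at zero
    return x

inputs = [0.9, 0.85, 0.3, 0.2]     # four stimuli vying for attention
print(compete(inputs).round(3))    # only the strongest input stays above zero
```

In AST’s terms, the attention schema would then be the brain’s simplified, slightly “magical” self-description of where this competition currently stands.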

How the Internet Gets Inside Us : The New Yorker - 0 views

  • It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with
  • The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare.
  • Robert K. Logan’s “The Sixth Language” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness.
  • In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.
  • “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”
  • when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix.
  • The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965.
  • If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.
  • Blair argues that the sense of “information overload” was not the consequence of Gutenberg but was already in place before printing began.
  • Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.
  • The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing, the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points.
  • at any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.
  • What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own.
  • A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them.
  • And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.
  • Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user.
  • the Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.

How to Navigate a 'Quarterlife' Crisis - The New York Times - 0 views

  • Satya Doyle Byock, a 39-year-old therapist, noticed a shift in tone over the past few years in the young people who streamed into her office: frenetic, frazzled clients in their late teens, 20s and 30s. They were unnerved and unmoored, constantly feeling like something was wrong with them.
  • “Crippling anxiety, depression, anguish, and disorientation are effectively the norm,”
  • her new book, “Quarterlife: The Search for Self in Early Adulthood.” The book uses anecdotes from Ms. Byock’s practice to outline obstacles faced by today’s young adults — roughly between the ages of 16 and 36 — and how to deal with them.
  • Just like midlife, quarterlife can bring its own crisis — trying to separate from your parents or caregivers and forge a sense of self is a struggle. But the generation entering adulthood now faces novel, sometimes debilitating, challenges.
  • Many find themselves so mired in day-to-day monetary concerns, from the relentless crush of student debt to the swelling costs of everything, that they feel unable to consider what they want for themselves long term
  • “We’ve been constrained by this myth that you graduate from college and you start your life,” she said. Without the social script previous generations followed — graduate college, marry, raise a family — Ms. Byock said her young clients often flailed around in a state of extended adolescence.
  • nearly one-third of Gen Z adults are living with their parents or other relatives and plan to stay there.
  • Many young people today struggle to afford college or decide not to attend, and the “existential crisis” that used to hit after graduation descends earlier and earlier
  • Ms. Byock said to pay attention to what you’re naturally curious about, and not to dismiss your interests as stupid or futile.
  • Experts said those entering adulthood need clear guidance for how to make it out of the muddle. Here are their top pieces of advice on how to navigate a quarterlife crisis today.
  • She recommends scheduling reminders to check in with yourself, roughly every three months, to examine where you are in your life and whether you feel stuck or dissatisfied
  • From there, she said, you can start to identify aspects of your life that you want to change.
  • “Start to give your own inner life the respect that it’s due,”
  • But quarterlife is about becoming a whole person, Ms. Byock said, and both groups need to absorb each other’s characteristics to balance themselves out
  • However, there is a difference between self-interest and self-indulgence, Ms. Byock said. Investigating and interrogating who you are takes work. “It’s not just about choosing your labels and being done,” she said.
  • Be patient.
  • Quarterlifers may feel pressure to race through each step of their lives, Ms. Byock said, craving the sense of achievement that comes with completing a task.
  • But learning to listen to oneself is a lifelong process.
  • Instead of searching for quick fixes, she said, young adults should think about longer-term goals: starting therapy that stretches beyond a handful of sessions, building healthy nutrition and exercise habits, working toward self-reliance.
  • “I know that seems sort of absurdly large and huge in scope,” she said. “But it’s allowing ourselves to meander and move through life, versus just ‘Check the boxes and get it right.’”
  • take stock of your day-to-day life and notice where things are missing. She groups quarterlifers into two categories: “stability types” and “meaning types.”
  • “Stability types” are seen by others as solid and stable. They prioritize a sense of security, succeed in their careers and may pursue building a family.
  • “But there’s a sense of emptiness and a sense of faking it,” she said. “They think this couldn’t possibly be all that life is about.”
  • On the other end of the spectrum, there are “meaning types” who are typically artists; they have intense creative passions but have a hard time dealing with day-to-day tasks
  • “These are folks for whom doing what society expects of you is so overwhelming and so discordant with their own sense of self that they seem to constantly be floundering,” she said. “They can’t quite figure it out.”
  • That paralysis is often exacerbated by mounting climate anxiety and the slog of a multiyear pandemic that has left many young people mourning family and friends, or smaller losses like a conventional college experience or the traditions of starting a first job.
  • Stability types need to think about how to give their lives a sense of passion and purpose. And meaning types need to find security, perhaps by starting with a consistent routine that can both anchor and unlock creativity.
  • perhaps the prototypical inspiration for staying calm in chaos: Yoda. The Jedi master is “one of the few images we have of what feeling quiet amid extreme pain and apocalypse can look like.”
  • Even when there seems to be little stability externally, she said, quarterlifers can try to create their own steadiness.
  • establishing habits that help you ground yourself as a young adult is critical because transitional periods make us more susceptible to burnout
  • He suggests building a practical tool kit of self-care practices, like regularly taking stock of what you’re grateful for, taking controlled breaths and maintaining healthy nutrition and exercise routines. “These are techniques that can help you find clarity,”
  • Don’t be afraid to make a big change.
  • It’s important to identify what aspects of your life you have the power to alter, Dr. Brown said. “You can’t change an annoying boss,” he said, “but you might be able to plan a career change.”
  • That’s easier said than done, he acknowledged, and young adults should weigh the risks of continuing to live in their status quo — staying in their hometown, or lingering in a career that doesn’t excite them — with the potential benefits of trying something new.
  • quarterlife is typically “the freest stage of the whole life span.”
  • Young adults may have an easier time moving to a new city or starting a new job than their older counterparts would.
  • Know when to call your parents — and when to call on yourself.
  • Quarterlife is about the journey from dependence to independence, Ms. Byock said — learning to rely on ourselves, after, for some, growing up in a culture of helicopter parenting and hands-on family dynamics.
  • there are ways your relationship with your parents can evolve, helping you carve out more independence
  • That can involve talking about family history and past memories or asking questions about your parents’ upbringing
  • “You’re transitioning the relationship from one of hierarchy to one of friendship,” she said. “It isn’t just about moving away or getting physical distance.”
  • Every quarterlifer typically has a moment when they know they need to step away from their parents and to face obstacles on their own
  • That doesn’t mean you can’t, or shouldn’t, still depend on your parents in moments of crisis, she said. “I don’t think it’s just about never needing one’s parents again,” she said. “But it’s about doing the subtle work within oneself to know: This is a time I need to stand on my own.”