Javier E

FaceApp helped a middle-aged man become a popular younger woman. His fan base has never... - 0 views

  • Soya’s fame illustrated a simple truth: that social media is less a reflection of who we are, and more a performance of who we want to be.
  • It also seemed to herald a darker future where our fundamental senses of reality are under siege: The AI that allows anyone to fabricate a face can also be used to harass women with “deepfake” pornography, invent fraudulent LinkedIn personas and digitally impersonate political enemies.
  • As the photos began receiving hundreds of likes, Soya’s personality and style began to come through. She was relentlessly upbeat. She never sneered or bickered or trolled. She explored small towns, savored scenic vistas, celebrated roadside restaurants’ simple meals.
  • She took pride in the basic things, like cleaning engine parts. And she only hinted at the truth: When one fan told her in October, “It’s great to be young,” Soya replied, “Youth does not mean a certain period of life, but how to hold your heart.”
  • She seemed, well, happy, and FaceApp had made her that way. Creating the lifelike impostor had taken only a few taps: He changed the “Gender” setting to “Female,” the “Age” setting to “Teen,” and the “Impression” setting — a mix of makeup filters — to a glamorous look the app calls “Hollywood.”
  • Users in the Internet’s early days rarely had any presumptions of authenticity, said Melanie C. Green, a University at Buffalo professor who studies technology and social trust. Most people assumed everyone else was playing a character clearly distinguished from their real life.
  • Nakajima grew his shimmering hair below his shoulders and raided his local convenience store for beauty supplies he thought would make the FaceApp images more convincing: blushes, eyeliners, concealers, shampoos.
  • “When I compare how I feel when I started to tweet as a woman and now, I do feel that I’m gradually gravitating toward this persona … this fantasy world that I created,” Nakajima said. “When I see photos of what I tweeted, I feel like, ‘Oh. That’s me.’ ”
  • The sensation Nakajima was feeling is so common that there’s a term for it: the Proteus effect, named for the shape-shifting Greek god. Stanford University researchers first coined it in 2007 to describe how people inhabiting the body of a digital avatar began to act the part
  • People made to appear taller in virtual-reality simulations acted more assertively, even after the experience ended. Prettier characters began to flirt.
  • What is it about online disguises? Why are they so good at bending people’s sense of self-perception?
  • they tap into this “very human impulse to play with identity and pretend to be someone you’re not.”
  • Soya pouted and scowled on rare occasions when Nakajima himself felt frustrated. But her baseline expression was an extra-wide smile, activated with a single tap.
  • “This identity play was considered one of the huge advantages of being online,” Green said. “You could switch your gender and try on all of these different personas. It was a playground for people to explore.”
  • But wasn’t this all just a big con? Nakajima had tricked people with a “cool girl” stereotype to boost his Twitter numbers. He hadn’t elevated the role of women in motorcycling; if anything, he’d supplanted them. And the character he’d created was paper thin: Soya had no internal complexity outside of what Nakajima had projected, just that eternally superimposed smile.
  • The Web’s big shift from text to visuals — the rise of photo-sharing apps, live streams and video calls — seemed at first to make that unspoken rule of real identities concrete. It seemed too difficult to fake one’s appearance when everyone’s face was on constant display.
  • Now, researchers argue, advances in image-editing artificial intelligence have done for the modern Internet what online pseudonyms did for the world’s first chat rooms. Facial filters have allowed anyone to mold themselves into the character they want to play.
  • researchers fear these augmented reality tools could end up distorting the beauty standards and expectations of actual reality.
  • Some political and tech theorists worry this new world of synthetic media threatens to detonate our concept of truth, eroding our shared experiences and infusing every online relationship with suspicion and self-doubt.
  • Deceptive political memes, conspiracy theories, anti-vaccine hoaxes and other scams have torn the fabric of our democracy, culture and public health.
  • But she also thinks about her kids, who assume “that everything online is fabricated,” and wonders whether the rules of online identity require a bit more nuance — and whether that generational shift is already underway.
  • “Bots pretending to be people, automated representations of humanity — that, they perceive as exploitative,” she said. “But if it’s just someone engaging in identity experimentation, they’re like: ‘Yeah, that’s what we’re all doing.’”
  • To their generation, “authenticity is not about: ‘Does your profile picture match your real face?’ Authenticity is: ‘Is your voice your voice?’”
  • “Their feeling is: ‘The ideas are mine. The voice is mine. The content is mine. I’m just looking for you to receive it without all the assumptions and baggage that comes with it.’ That’s the essence of a person’s identity. That’s who they really are.”
  • It wasn’t until the rise of giant social networks like Facebook — which used real identities to, among other things, supercharge targeted advertising — that this big game of pretend gained an air of duplicity. Spaces for playful performance shrank, and the biggest Internet watering holes began demanding proof of authenticity as a way to block out malicious intent.
  • Perhaps he should have accepted his irrelevance and faded into the digital sunset, sharing his life for few to see. But some of Soya’s followers have said they never felt deceived: It was Nakajima — his enthusiasm, his attitude about life — they’d been charmed by all along. “His personality,” as one Twitter follower said, “shined through.”
  • In Nakajima’s mind, he’d used the tools of a superficial medium to craft genuine connections. He had not felt real until he had become noticed for being fake.
  • Nakajima said he doesn’t know how long he’ll keep Soya alive. But he said he’s grateful for the way she helped him feel: carefree, adventurous, seen.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 0 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived.
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
Javier E

Fight the Future - The Triad - 0 views

  • In large part because our major tech platforms reduced the coefficient of friction (μ for my mechanics nerd posse) to basically zero. QAnons crept out of the dark corners of the web—obscure boards like 4chan and 8kun—and got into the mainstream platforms YouTube, Facebook, Instagram, and Twitter.
  • Why did QAnon spread like wildfire in America?
  • These platforms not only made it easy for conspiracy nuts to share their crazy, but they used algorithms that actually boosted the spread of crazy, acting as a force multiplier.
  • So it sounds like a simple fix: Impose more friction at the major platform level and you’ll clean up the public square.
  • But it’s not actually that simple because friction runs counter to the very idea of the internet.
  • The fundamental precept of the internet is that it reduces marginal costs to zero. And this fact is why the design paradigm of the internet is to continually reduce friction experienced by users to zero, too. Because if the second unit of everything is free, then the internet has a vested interest in pushing that unit in front of your eyeballs as smoothly as possible.
  • It’s not that the internet is “broken,” but rather that it’s been functioning exactly as it was designed to:
  • Perhaps more than any other job in the world, you do not want the President of the United States to live in a frictionless state of posting. The Presidency is not meant to be a frictionless position, and the United States government is not a frictionless entity, much to the chagrin of many who have tried to change it. Prior to this administration, decisions were closely scrutinized for, at the very least, legality, along with the impact on diplomacy, general norms, and basic grammar. This kind of legal scrutiny and due diligence is also a kind of friction--one that we now see has a lot of benefits. 
  • The deep lesson here isn’t about Donald Trump. It’s about the collision between the digital world and the real world.
  • In the real world, marginal costs are not zero. And so friction is a desirable element in helping to get to the optimal state. You want people to pause before making decisions.
  • described friction this summer as: “anything that inhibits user action within a digital interface, particularly anything that requires an additional click or screen.” For much of my time in the technology sector, friction was almost always seen as the enemy, a force to be vanquished. A “frictionless” experience was generally held up as the ideal state, the optimal product state.
  • Trump was riding the ultimate frictionless optimized engagement Twitter experience: he rode it all the way to the presidency, and then he crashed the presidency into the ground.
  • From a metrics and user point of view, the abstract notion of the President himself tweeting was exactly what Twitter wanted in its original platonic ideal. Twitter has been built to incentivize someone like Trump to engage and post
  • The other day we talked a little bit about how fighting disinformation, extremism, and online cults is like fighting a virus: There is no “cure.” Instead, what you have to do is create enough friction that the rate of spread becomes slow.
  • Our challenge is that when human and digital design comes into conflict, the artificial constraints we impose should be on the digital world to become more in service to us. Instead, we’ve let the digital world do as it will and tried to reconcile ourselves to the havoc it wreaks.
  • And one of the lessons of the last four years is that when you prize the digital design imperatives—lack of friction—over the human design imperatives—a need for friction—then bad things can happen.
  • We have an ongoing conflict between the design precepts of humans and the design precepts of computers.
  • Anyone who works with computers learns to fear their capacity to forget. Like so many things with computers, memory is strictly binary. There is either perfect recall or total oblivion, with nothing in between. It doesn't matter how important or trivial the information is. The computer can forget anything in an instant. If it remembers, it remembers for keeps.
  • This doesn't map well onto human experience of memory, which is fuzzy. We don't remember anything with perfect fidelity, but we're also not at risk of waking up having forgotten our own name. Memories tend to fade with time, and we remember only the more salient events.
  • And because we live in a time when storage grows ever cheaper, we learn to save everything, log everything, and keep it forever. You never know what will come in useful. Deleting is dangerous.
  • Our lives have become split between two worlds with two very different norms around memory.
  • [A] lot of what's wrong with the Internet has to do with memory. The Internet somehow contrives to remember too much and too little at the same time, and it maps poorly on our concepts of how memory should work.
  • The digital world is designed to never forget anything. It has perfect memory. Forever. So that one time you made a crude joke 20 years ago? It can now ruin your life.
  • Memory in the carbon-based world is imperfect. People forget things. That can be annoying if you’re looking for your keys but helpful if you’re trying to broker peace between two cultures. Or simply become a better person than you were 20 years ago.
  • The digital and carbon-based worlds have different design parameters. Marginal cost is one of them. Memory is another.
Javier E

Opinion | What Facebook Fed the Baby Boomers - The New York Times - 1 views

  • In mid-October I asked two people I’d never met to give me their Facebook account passwords for three weeks leading up to and after Election Day. I wanted to immerse myself in the feeds of a type of person who has become a trope of sorts in our national discussion about politics and disinformation: baby boomers with an attachment to polarizing social media.
  • Despite Facebook’s reputation as a leading source for conspiracy theories and misinformation, what goes on in most average Americans’ news feeds is nearly impossible for outsiders to observe.
Javier E

Opinion | Why Covid's Airborne Transmission Was Acknowledged So Late - The New York Times - 0 views

  • A week ago, more than a year after the World Health Organization declared that we face a pandemic, a page on its website titled “Coronavirus Disease (Covid-19): How Is It Transmitted?” got a seemingly small update.
  • The revised response still emphasizes transmission in close contact but now says it may be via aerosols — smaller respiratory particles that can float — as well as droplets. It also adds a reason the virus can also be transmitted “in poorly ventilated and/or crowded indoor settings,” saying this is because “aerosols remain suspended in the air or travel farther than 1 meter.”
  • on Friday, the Centers for Disease Control and Prevention also updated its guidance on Covid-19, clearly saying that inhalation of these smaller particles is a key way the virus is transmitted, even at close range, and put it on top of its list of how the disease spreads.
  • ...38 more annotations...
  • But these latest shifts challenge key infection control assumptions that go back a century, putting a lot of what went wrong last year in context
  • They may also signal one of the most important advancements in public health during this pandemic.
  • If the importance of aerosol transmission had been accepted early, we would have been told from the beginning that it was much safer outdoors, where these small particles disperse more easily, as long as you avoid close, prolonged contact with others.
  • We would have tried to make sure indoor spaces were well ventilated, with air filtered as necessary.
  • Instead of blanket rules on gatherings, we would have targeted conditions that can produce superspreading events: people in poorly ventilated indoor spaces, especially if engaged over time in activities that increase aerosol production, like shouting and singing
  • We would have started using masks more quickly, and we would have paid more attention to their fit, too. And we would have been less obsessed with cleaning surfaces.
  • The implications of this were illustrated when I visited New York City in late April — my first trip there in more than a year.
  • A giant digital billboard greeted me at Times Square, with the message “Protecting yourself and others from Covid-19. Guidance from the World Health Organization.”
  • That billboard neglected the clearest epidemiological pattern of this pandemic: The vast majority of transmission has been indoors, sometimes beyond a range of three or even six feet. The superspreading events that play a major role in driving the pandemic occur overwhelmingly, if not exclusively, indoors.
  • The billboard had not a word about ventilation, nothing about opening windows or moving activities outdoors, where transmission has been rare and usually only during prolonged and close contact. (Ireland recently reported 0.1 percent of Covid-19 cases were traced to outdoor transmission.)
  • Mary-Louise McLaws, an epidemiologist at the University of New South Wales in Sydney, Australia, and a member of the W.H.O. committees that craft infection prevention and control guidance, wanted all this examined but knew the stakes made it harder to overcome the resistance. She told The Times last year, “If we started revisiting airflow, we would have to be prepared to change a lot of what we do.” She said it was a very good idea, but she added, “It will cause an enormous shudder through the infection control society.”
  • In contrast, if the aerosols had been considered a major form of transmission, in addition to distancing and masks, advice would have centered on ventilation and airflow, as well as time spent indoors. Small particles can accumulate in enclosed spaces, since they can remain suspended in the air and travel along air currents. This means that indoors, three or even six feet, while helpful, is not completely protective, especially over time.
  • To see this misunderstanding in action, look at what’s still happening throughout the world. In India, where hospitals have run out of supplemental oxygen and people are dying in the streets, money is being spent on fleets of drones to spray anti-coronavirus disinfectant in outdoor spaces. Parks, beaches and outdoor areas keep getting closed around the world. This year and last, organizers canceled outdoor events for the National Cherry Blossom Festival in Washington, D.C. Cambodian customs officials advised spraying disinfectant outside vehicles imported from India. The examples are many.
  • Meanwhile, many countries allowed their indoor workplaces to open but with inadequate aerosol protections. There was no attention to ventilation, to installing air filters as necessary, or even to opening windows when possible; attention went instead to having people distance three or six feet, sometimes not requiring masks beyond that distance, or to spending money on hard plastic barriers, which may be useless at best
  • clear evidence doesn’t easily overturn tradition or overcome entrenched feelings and egos. John Snow, often credited as the first scientific epidemiologist, showed that a contaminated well was responsible for an 1854 London cholera epidemic by removing the suspected pump’s handle and documenting how the cases plummeted afterward. Many other scientists and officials wouldn’t believe him for 12 years, when the link to a water source showed up again and became harder to deny.
  • Along the way to modern public health shaped largely by the fight over germs, a theory of transmission promoted by the influential public health figure Charles Chapin took hold
  • Dr. Chapin asserted in the early 1900s that respiratory diseases were most likely spread at close range by people touching bodily fluids or ejecting respiratory droplets, and did not allow for the possibility that such close-range infection could occur by inhaling small floating particles others emitted
  • He was also concerned that belief in airborne transmission, which he associated with miasma theories, would make people feel helpless and drop their guard against contact transmission. This was a mistake that would haunt infection control for the next century and more.
  • It was in this context in early 2020 that the W.H.O. and the C.D.C. asserted that SARS-CoV-2 was transmitted primarily via these heavier, short-range droplets, and provided guidance accordingly
  • Amid the growing evidence, in July, hundreds of scientists signed an open letter urging the public health agencies, especially the W.H.O., to address airborne transmission of the coronavirus.
  • Last October, the C.D.C. published updated guidance acknowledging airborne transmission, but as a secondary route under some circumstances, until it acknowledged airborne transmission as crucial on Friday. And the W.H.O. kept inching forward in its public statements, most recently a week ago.
  • Linsey Marr, a professor of engineering at Virginia Tech who made important contributions to our understanding of airborne virus transmission before the pandemic, pointed to two key scientific errors — rooted in a lot of history — that explain the resistance, and also opened a fascinating sociological window into how science can get it wrong and why.
  • Dr. Marr said that if you inhale a particle from the air, it’s an aerosol.
  • biomechanically, she said, nasal transmission faces obstacles, since nostrils point downward and the physics of particles that large makes it difficult for them to move up the nose. And in lab measurements, people emit far more of the easier-to-inhale aerosols than the droplets, she said, and even the smallest particles can be virus laden, sometimes more so than the larger ones, seemingly because of how and where they are produced in the respiratory tract.
  • Second, she said, proximity is conducive to transmission of aerosols as well because aerosols are more concentrated near the person emitting them. In a twist of history, modern scientists have been acting like those who equated stinky air with disease, by equating close contact, a measure of distance, only with the larger droplets, a mechanism of transmission, without examination.
  • Since aerosols also infect at close range, measures to prevent droplet transmission — masks and distancing — can help dampen transmission for airborne diseases as well. However, this oversight led medical people to circularly assume that if such measures worked at all, droplets must have played a big role in their transmission.
  • Another dynamic we’ve seen is something that is not unheard-of in the history of science: setting a higher standard of proof for theories that challenge conventional wisdom than for those that support it.
  • Another key problem is that, understandably, we find it harder to walk things back. It is easier to keep adding exceptions and justifications to a belief than to admit that a challenger has a better explanation.
  • The ancients believed that all celestial objects revolved around the earth in circular orbits. When it became clear that the observed behavior of the celestial objects did not fit this assumption, those astronomers produced ever-more-complex charts by adding epicycles — intersecting arcs and circles — to fit the heavens to their beliefs.
  • In a contemporary example of this attitude, the initial public health report on the Mount Vernon choir case said that it may have been caused by people “sitting close to one another, sharing snacks and stacking chairs at the end of the practice,” even though almost 90 percent of the people there developed symptoms of Covid-19
  • So much of what we have done throughout the pandemic — the excessive hygiene theater and the failure to integrate ventilation and filters into our basic advice — has greatly hampered our response.
  • Some of it, like the way we underused or even shut down outdoor space, isn’t that different from the 19th-century Londoners who flushed the source of their foul air into the Thames and made the cholera epidemic worse.
  • Righting this ship cannot be a quiet process — updating a web page here, saying the right thing there. The proclamations that we now know are wrong were so persistent and so loud for so long.
  • the progress we’ve made might lead to an overhaul in our understanding of many other transmissible respiratory diseases that take a terrible toll around the world each year and could easily cause other pandemics.
  • So big proclamations require probably even bigger proclamations to correct, or the information void, unnecessary fears and misinformation will persist, damaging the W.H.O. now and in the future.
  • I’ve seen our paper used in India to try to reason through aerosol transmission and the necessary mitigations. I’ve heard of people in India closing their windows after hearing that the virus is airborne, likely because they were not being told how to respond
  • The W.H.O. needs to address these fears and concerns, treating it as a matter of profound change, so other public health agencies and governments, as well as ordinary people, can better adjust.
  • It needs to begin a campaign proportional to the importance of all this, announcing, “We’ve learned more, and here’s what’s changed, and here’s how we can make sure everyone understands how important this is.” That’s what credible leadership looks like. Otherwise, if a web page is updated in the forest without the requisite fanfare, how will it matter?
Javier E

Opinion | The 1619 Chronicles - The New York Times - 0 views

  • The 1619 Project introduced a date, previously obscure to most Americans, that ought always to have been thought of as seminal — and probably now will. It offered fresh reminders of the extent to which Black freedom was a victory gained by courageous Black Americans, and not just a gift obtained from benevolent whites.
  • in a point missed by many of the 1619 Project’s critics, it does not reject American values. As Nikole Hannah-Jones, its creator and leading voice, concluded in her essay for the project, “I wish, now, that I could go back to the younger me and tell her that her people’s ancestry started here, on these lands, and to boldly, proudly, draw the stars and those stripes of the American flag.” It’s an unabashedly patriotic thought.
  • ambition can be double-edged. Journalists are, most often, in the business of writing the first rough draft of history, not trying to have the last word on it. We are best when we try to tell truths with a lowercase t, following evidence in directions unseen, not the capital-T truth of a pre-established narrative in which inconvenient facts get discarded
  • ...25 more annotations...
  • on these points — and for all of its virtues, buzz, spinoffs and a Pulitzer Prize — the 1619 Project has failed.
  • That doesn’t mean that the project seeks to erase the Declaration of Independence from history. But it does mean that it seeks to dethrone the Fourth of July by treating American history as a story of Black struggle against white supremacy — of which the Declaration is, for all of its high-flown rhetoric, supposed to be merely a part.
  • The deleted assertions went to the core of the project’s most controversial goal, “to reframe American history by considering what it would mean to regard 1619 as our nation’s birth year.”
  • She then challenged me to find any instance in which the project stated that “using 1776 as our country’s birth date is wrong,” that it “should not be taught to schoolchildren,” and that the only one “that should be taught” was 1619. “Good luck unearthing any of us arguing that,” she added.
  • I emailed her to ask if she could point to any instances before this controversy in which she had acknowledged that her claims about 1619 as “our true founding” had been merely metaphorical. Her answer was that the idea of treating the 1619 date metaphorically should have been so obvious that it went without saying.
  • “1619. It is not a year that most Americans know as a notable date in our country’s history. Those who do are at most a tiny fraction of those who can tell you that 1776 is the year of our nation’s birth. What if, however, we were to tell you that this fact, which is taught in our schools and unanimously celebrated every Fourth of July, is wrong, and that the country’s true birth date, the moment that its defining contradictions first came into the world, was in late August of 1619?”
  • Here is an excerpt from the introductory essay to the project by The New York Times Magazine’s editor, Jake Silverstein, as it appeared in print in August 2019 (italics added):
  • In his introduction, Silverstein argues that America’s “defining contradictions” were born in August 1619, when a ship carrying 20 to 30 enslaved Africans from what is present-day Angola arrived in Point Comfort, in the English colony of Virginia. And the title page of Hannah-Jones’s essay for the project insists that “our founding ideals of liberty and equality were false when they were written.”
  • What was surprising was that in 1776 a politically formidable “defining contradiction” — “that all men are created equal” — came into existence through the Declaration of Independence. As Abraham Lincoln wrote in 1859, that foundational document would forever serve as a “rebuke and stumbling block to the very harbingers of reappearing tyranny and oppression.”
  • As for the notion that the Declaration’s principles were “false” in 1776, ideals aren’t false merely because they are unrealized, much less because many of the men who championed them, and the nation they created, hypocritically failed to live up to them.
  • These two flaws led to a third, conceptual, error. “Out of slavery — and the anti-Black racism it required — grew nearly everything that has truly made America exceptional,” writes Silverstein.
  • Nearly everything? What about, say, the ideas contained by the First Amendment? Or the spirit of openness that brought millions of immigrants through places like Ellis Island? Or the enlightened worldview of the Marshall Plan and the Berlin airlift? Or the spirit of scientific genius and discovery exemplified by the polio vaccine and the moon landing?
  • On the opposite side of the moral ledger, to what extent does anti-Black racism figure in American disgraces such as the brutalization of Native Americans, the Chinese Exclusion Act or the internment of Japanese-Americans in World War II?
  • The world is complex. So are people and their motives. The job of journalism is to take account of that complexity, not simplify it out of existence through the adoption of some ideological orthodoxy.
  • This mistake goes far to explain the 1619 Project’s subsequent scholarly and journalistic entanglements. It should have been enough to make strong yet nuanced claims about the role of slavery and racism in American history. Instead, it issued categorical and totalizing assertions that are difficult to defend on close examination.
  • It should have been enough for the project to serve as curator for a range of erudite and interesting voices, with ample room for contrary takes. Instead, virtually every writer in the project seems to sing from the same song sheet, alienating other potential supporters of the project and polarizing national debate.
  • James McPherson, the Pulitzer Prize-winning author of “Battle Cry of Freedom” and a past president of the American Historical Association. He was withering: “Almost from the outset,” McPherson told the World Socialist Web Site, “I was disturbed by what seemed like a very unbalanced, one-sided account, which lacked context and perspective.”
  • In particular, McPherson objected to Hannah-Jones’s suggestion that the struggle against slavery and racism and for civil rights and democracy was, if not exclusively then mostly, a Black one. As she wrote in her essay: “The truth is that as much democracy as this nation has today, it has been borne on the backs of Black resistance.”
  • McPherson demurs: “From the Quakers in the 18th century, on through the abolitionists in the antebellum, to the Radical Republicans in the Civil War and Reconstruction, to the N.A.A.C.P., which was an interracial organization founded in 1909, down through the civil rights movements of the 1950s and 1960s, there have been a lot of whites who have fought against slavery and racial discrimination, and against racism,” he said. “And that’s what’s missing from this perspective.”
  • Wilentz’s catalog of the project’s mistakes is extensive. Hannah-Jones’s essay claimed that by 1776 Britain was “deeply conflicted” over its role in slavery. But despite the landmark Somerset v. Stewart court ruling in 1772, which held that slavery was not supported by English common law, it remained deeply embedded in the practices of the British Empire. The essay claimed that, among Londoners, “there were growing calls to abolish the slave trade” by 1776. But the movement to abolish the British slave trade only began about a decade later — inspired, in part, Wilentz notes, by American antislavery agitation that had started in the 1760s and 1770s.
  • Leslie M. Harris, an expert on pre-Civil War African-American life and slavery. “On Aug. 19 of last year,” Harris wrote, “I listened in stunned silence as Nikole Hannah-Jones … repeated an idea that I had vigorously argued against with her fact checker: that the patriots fought the American Revolution in large part to preserve slavery in North America.”
  • The larger problem is that The Times’s editors, however much background reading they might have done, are not in a position to adjudicate historical disputes. That should have been an additional reason for the 1619 Project to seek input from, and include contributions by, an intellectually diverse range of scholarly voices. Yet not only does the project choose a side, it also brooks no doubt.
  • “It is finally time to tell our story truthfully,” the magazine declares on its 1619 cover page. Finally? Truthfully? Is The Times suggesting that distinguished historians, like the ones who have seriously disputed aspects of the project, had previously been telling half-truths or falsehoods?
  • unlike other dates, 1776 uniquely marries letter and spirit, politics and principle: The declaration that something new is born, combined with the expression of an ideal that — because we continue to believe in it even as we struggle to live up to it — binds us to the date.
  • On the other, the 1619 Project has become, partly by its design and partly because of avoidable mistakes, a focal point of the kind of intense national debate that columnists are supposed to cover, and that is being widely written about outside The Times. To avoid writing about it on account of the first scruple is to be derelict in our responsibility toward the second.
Javier E

Michio Kaku Says the Universe Is Simpler Than We Think - The New York Times - 0 views

  • As the title suggests, Kaku’s latest concern is with what he calls the “holy grail” of all science, the metaphorical “umbilical cord” of our infant universe, whenever it was (or wasn’t) born out of the alleged multiverse. He wanted to write a balanced account of the physics community’s quest to prove string theory — and thus to resolve the messy, imperfect Standard Model of subatomic particles into one elegant theory of everything
  • This book is like a State of the Union where the union is all of existence.
  • Right now the known laws of the universe — “the theory of almost everything,” he calls it — can be written on a single sheet of paper. There’s Einstein’s general relativity on one line, and then a couple more for the Standard Model. “The problem is that the two theories hate each other,” he said. “They’re based on different math, different principles. Every time you put them together it blows up in your face. Why should nature be so clumsy?”
  • ...2 more annotations...
  • Where in English departments, “hundreds of Ph.D. theses are created every year because we want to know what Hemingway really meant,” to him, “physics is the exact opposite.” The equations get “simpler and simpler, but more fundamental and more powerful, every year.”
  • When we “find the rules that govern the chess game,” Kaku said, “we then become grand masters. That’s our destiny, I think, as a species.”
caelengrubb

History Is About Stories. Here's Why We Get Them Wrong | Time - 0 views

  • Science comes hard to most of us because it can’t really take that form. Instead it’s equations, models, theories and the data that support them. But ironically, science offers an explanation of why we love stories.
  • It starts with a challenge posed in human evolution — but the more we come to understand about that subject, the more we see that our storytelling instinct can lead us astray, especially when it comes to how most of us understand history.
  • Many animals have a highly developed mind-reading instinct, a sort of tracking-device technique shared with creatures that have no language, not even a language of thought.
  • ...14 more annotations...
  • It’s what they use to track prey and avoid predation.
  • The theory of mind is so obvious it’s nearly invisible: it tells us that behavior is the result of the joint operation of pairs of beliefs and desires.
  • The desires are about the ways we want things to turn out in the future. The beliefs are about the way things are now.
  • The theory of mind turns what people do into a story with a plot by pairing up the content of beliefs and desires, what they are about.
  • Psycholinguistics has shown that the theory of mind is necessary for learning language and almost anything else our parents teach us.
  • Imitating others requires using the theory to figure out what they want us to do and in what order. Without it, you can’t learn much beyond what other higher primates can.
  • The theory of mind makes us construct stories obsessively, and thus encourages us to see the past as a set of them.
  • When popular historians seek to know why Hitler declared war on the U.S. (when he didn’t have to), they put the theory of mind to work: What did he believe and what was it that he wanted that made him do such a foolish thing?
  • The trouble is that the theory of mind is completely wrong about the way the mind, i.e. the brain, actually works. We can’t help but use it to guess what is going on in other people’s minds, and historians rely on it, but the evidence from neuroscience shows that in fact what’s “going on” in anyone’s mind is not a decision about what to do in the light of beliefs and desires, but rather a series of neural circuitry firings.
  • The wrongness of the theory of mind is so profound it makes false all the stories we know and love, in narrative history (and in historical novels).
  • Neuroscience reveals that the brain is not organized even remotely to work the way the theory of mind says it does. The fact that narrative histories give persistently different answers to questions historians have been asking for centuries should be evidence that storytelling is not where the real answers can be found.
  • Crucially, they discovered that while different parts of the brain control different things, the neurons’ electrical signals don’t differ in “content”; they are not about different subjects. They are not about anything at all. Each neuron is just in a different part of the mid-brain, doing its job in exactly the same way all other neurons do, sending the same electrochemical oscillations.
  • There is nothing in our brains to vindicate the theory’s description of how anyone ever makes up his or her mind. And that explains a lot about how bad the theory of mind is at predicting anything much about the future, or explaining anything much about the past.
  • If we really want historical knowledge we’ll need to use the same tools scientists use — models and theories we can quantify and test. Guessing what was going through Hitler’s mind, and weaving it into a story is no substitute for empirical science.
caelengrubb

How History Gets Things Wrong: The Neuroscience of Our Addiction to Stories | Reviews |... - 0 views

  • In this book, Rosenberg elaborates further such arguments to take issue with historical narratives and, more generally, with the way in which we most pervasively make sense of people's actions and motivations through narratives.
  • His main contention is that such narratives are always wrong or, to put it differently, that they can't possibly be right.
  • Rosenberg argues that neuroscience itself, our only reliable method to study psychological capacities, shows that theory of mind's posits do not exist
  • ...12 more annotations...
  • The reason is that the best available evidence on how the brain works shows that the brain does not deal with the kind of things that beliefs and desires are supposed to trade with: contents.
  • When we believe or desire, something is believed or desired: that I have bread, that it rains, that the Parliament passes the bill, etc. Beliefs and desires are about something.
  • After being presented with the assassin and the crime, the book moves on to explain why, even if always wrong, narratives in general, and historical narratives in particular, are so compelling for us. Even if we have no claim to truth or correctness for our narratives, narratives seem to be highly convincing in moving us to act
  • Furthermore, we cannot but think in terms of them. Rosenberg's explanation for this 'addiction to stories' is that it has been entrenched in us by evolutionary processes that took place over the last million years of natural history
  • Narrative explanations emerged out of Darwinian processes of natural selection -- or "environmental filtration", in the less purposive parlance Rosenberg prefers -- that allowed our ancestors to coordinate efforts, collaborate and flourish, moving from the bottom to the top of the Pleistocene's food chain. Rosenberg argues that while the basic mindreading mechanisms pervasive in the animal kingdom, based on mutual tracking and monitoring of animals' behavior, are a sound method for getting agents to coordinate behavior, these mechanisms' more recent successor, the theory of mind, crafted by the use of co-evolved languages, turned those mindreading abilities into a theory with empirical hypotheses about agents' beliefs and desires but no facts to match them.
  • The error historians allegedly make lies in mistaking stories for real explanations, surmising that behind our behavior there are purposes, rational motivations.
  • Historians -- in particular narrative historians -- make a pervasive use of folk psychological explanations, i.e., explanations that describe events in terms of the beliefs and desires of historical agents, including individuals and groups.
  • In order for folk psychological and historical narratives to be right there have to be facts of the matter about what sentences in such explanation refer to that make them true.
  • Folk psychological explanations of actions in terms of platitudes about beliefs and desires pairings evolved in natural history closely related to mind-reading mechanisms that allowed our ancestors to deal with cooperation and coordination problems.
  • There are no interpretative mechanisms in the brain (at any level of description) that can vindicate the attribution of contents to beliefs and desires.
  • There are no facts of the matter that allow us to select belief/desire pairings as those actually operating 'behind' an agent's behavior.
  • Folk psychological explanations do not track any facts and thus can't be correct.
caelengrubb

The future's in the past | Culture | The Guardian - 0 views

  • Whenever the importance of history is discussed, epigrams and homilies come tripping easily off our tongues: How can we understand our present or glimpse our future if we cannot understand our past? How can we know who we are if we don't know who we were?
  • While history may be condemned to repeat itself, historians are condemned to repeat themselves. History is bunk or possibly bunkum.
  • Historians, more than any other class, spend a great deal of time justifying their trade, defining it and aphorising it, seeming to lavish more attention on historiography than history.
  • ...15 more annotations...
  • Historians are no longer grandees at the centre of a fixed civilisation; they are simply journalists writing about celebrities who haven't got the grace to be alive any more
  • There are those who wonder if the whole of history is now valuable only as a politically correct lesson in the stupidity and cruelty of monarchs, aristocrats, industrialists and generals
  • You don't even have to dignify it with ideological abstractions any more; history is really the story of a series of subjugations, oppressions, exploitations and abuses.
  • The biggest challenge facing the great teachers and communicators of history is not to teach history itself, nor even the lessons of history, but why history matters.
  • A history in which historians have to stand on one side of an argument or another, for, in between, they are nothing but dry-as-dust statisticians
  • we measure the exponential growth in the public appetite for history
  • Certainly, history is popular in grand traditional forms, but new subgenres of history have, for the last 20 years, exploded in popularity, too.
  • After all, isn't that what poetry and novels show, that humanity is best comprehended by understanding humans rather than ideas? But for some, this leads to the worry that history can now only mean witness
  • History, then, as one long, grovelling apology or act of self-abasement and self-laceration.
  • We haven't arrived at our own moral and ethical imperatives by each of us working them out from first principles; we have inherited them and they were born out of blood and suffering, as all human things and human beings are.
  • This does not stop us from admiring and praising the progressive heroes who got there early and risked their lives to advance causes that we now take for granted.
  • In the end, I suppose history is all about imagination rather than facts
  • If you cannot feel what our ancestors felt when they cried: 'Wilkes and Liberty!' or, indeed, cried: 'Death to Wilkes!', if you cannot feel with them, then all you can do is judge them and condemn them, or praise them and over-adulate them.
  • History is not the story of strangers, aliens from another realm; it is the story of us had we been born a little earlier
  • History is memory
caelengrubb

Why It's Important That We Study History - 0 views

  • 1. History helps us develop a better understanding of the world.
  • You can’t build a framework on which to base your life without understanding how things work in the world. History paints us a detailed picture of how society, technology, and government worked way back when so that we can better understand how it works now.
  • 2. History helps us understand ourselves.
  • ...12 more annotations...
  • It’s also a valuable tool when it comes to understanding those who are different from us. Global, national, and regional history books help us understand how other cultures affect our own.
  • 3. History helps us learn to understand other people.
  • A large part of that is learning where you fit into the story of your country or the global community in the grand scheme of things. History tells you the story of how your nation, city, or community came to be everything that it is. It tells you where your ancestors came from and tells you who they were.
  • 4. History teaches a working understanding of change.
  • Each of us has a different experience with the rest of the world – an experience shaped by societal norms, cultural differences, personal experiences, and more. We know when we as individuals crave change and why. History helps us better understand how, when, and why change occurs (or should be sought) on a larger scale.
  • 5. History gives us the tools we need to be decent citizens.
  • Good citizens are always informed citizens, and no one can consider himself to be an informed citizen without a working knowledge of history
  • 6. History makes us better decision makers.
  • History gives us the opportunity to learn from past mistakes. It helps us understand the many reasons why people may behave the way they do.
  • Our judicial system is a perfect example of this concept at work.
  • 7. History helps us develop a new level of appreciation for just about everything.  
  • History is more than just the living record of nations, leaders, and wars. It’s also the story of us. It’s packed with tales of how someone stood up for what they believed in, or died for love, or worked hard to make their dreams come true.
caelengrubb

Opinion | History is repeating itself - right before our eyes - 0 views

  • History has a tendency to repeat itself. As memory fades, events from the past can become events of the present.
  • this is due to the cyclical nature of history — history repeats itself and flows based on the generations
  • According to them, four generations are needed to cycle through before similar events begin to occur, which would put the coming of age of the millennial generation in parallel to the events of the early 20th century.
  • ...9 more annotations...
  • Hate crime reports increased 17 percent in the United States in 2017 according to the FBI, increasing for the third consecutive year.
  • It is not just LGBTQ+ hate crime that is on the rise. 2018 saw a 99 percent increase in anti-Semitic incidents versus 2015, according to the Anti-Defamation League. When it strictly came to race/ethnicity/ancestry motivated crimes, the increase was 18.4 percent between 2016 and 2017. It is a dangerous time if you are not cisgender, white and Christian in America, but that is not new.
  • A hundred years ago, in 1920, the National Socialist German Workers’ (Nazi) Party was founded in Germany. It started a generation of Germans that came of age around World War II, meaning they were young adults in 1939.
  • This is not really surprising. History repeats itself. And people forget about history.
  • The Anti-Defamation League says it like it is: Anti-Semitism in the U.S. is as bad as it was in the 1930s
  • The Nazis held a rally in New York City, where they were protected from protesters by the NYPD. This occurred a full six years after the concentration camps started in Germany. American history sometimes casually likes to omit those events in its recounting of World War II. Americans were undoubtedly the good guys of World War II, saving many countries and millions of people worldwide from fascism, but the country has also done a poor job at ensuring these fascist ideas stay out in recent years.
  • How can we protect history and avoid making the same mistakes we made in the past when we forget what happened?
  • In the same survey, 93 percent of respondents said that students should learn about the Holocaust in school. Americans understand the importance of passing down the knowledge of this dark past, but we have a government that still refuses to condemn groups promoting the same ideas that tore the world apart 80 years ago.
  • Those events took so many lives, led to a collective awakening to the plight of the Jewish people and now, 80 years later, we are falling back into old patterns.