
TOK Friends: Group items tagged “creator”

Javier E

God, Darwin and My College Biology Class - NYTimes.com - 0 views

  • There are a few ways to talk about evolution and religion, I begin. The least controversial is to suggest that they are in fact compatible. Stephen Jay Gould called them “nonoverlapping magisteria,” noma for short, with religion concerned with values and science with facts.
  • Noma is the received wisdom in the scientific establishment, including institutions like the National Center for Science Education, which has done much heavy lifting when it comes to promoting public understanding and acceptance of evolution. According to this expansive view, God might well have used evolution by natural selection to produce his creation.
  • This is undeniable. If God exists, then he could have employed anything under the sun — or beyond it — to work his will. Hence, there is nothing in evolutionary biology that necessarily precludes religion, save for most religious fundamentalisms
  • ...4 more annotations...
  • here’s the turn: These magisteria are not nearly as nonoverlapping as some of them might wish.
  • As evolutionary science has progressed, the available space for religious faith has narrowed: It has demolished two previously potent pillars of religious faith and undermined belief in an omnipotent and omni-benevolent God.
  • The more we know of evolution, the more unavoidable is the conclusion that living things, including human beings, are produced by a natural, totally amoral process, with no indication of a benevolent, controlling creator.
  • I conclude The Talk by saying that, although they don’t have to discard their religion in order to inform themselves about biology (or even to pass my course), if they insist on retaining and respecting both, they will have to undertake some challenging mental gymnastic routines.
Javier E

E.D. Hirsch Jr.'s 'Cultural Literacy' in the 21st Century - The Atlantic - 0 views

  • much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
  • What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been.
  • The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion.
  • ...38 more annotations...
  • Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals eagerly and breathlessly attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist.
  • Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
  • A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols.
  • So, first of all, Americans do need a list. But second, it should not be Hirsch’s list. And third, it should not be made the way he made his. In the balance of this essay, I want to unpack and explain each of those three statements.
  • If you take the time to read the book attached to Hirsch’s appendix, you’ll find a rather effective argument about the nature of background knowledge and public culture. Literacy is not just a matter of decoding the strings of letters that make up words or the meaning of each word in sequence. It is a matter of decoding context: the surrounding matrix of things referred to in the text and things implied by it
  • That means understanding what’s being said in public, in the media, in colloquial conversation. It means understanding what’s not being said. Literacy in the culture confers power, or at least access to power. Illiteracy, whether willful or unwitting, creates isolation from power.
  • his point about background knowledge and the content of shared public culture extends well beyond schoolbooks. It applies to the “texts” of everyday life, in commercial culture, in sports talk, in religious language, in politics. In all cases, people become literate in patterns—“schema” is the academic word Hirsch uses. They come to recognize bundles of concept and connotation like “Party of Lincoln.” They perceive those patterns of meaning the same way a chess master reads an in-game chessboard or the way a great baseball manager reads an at-bat. And in all cases, pattern recognition requires literacy in particulars.
  • Lots and lots of particulars. This isn’t, or at least shouldn’t be, an ideologically controversial point. After all, parents on both left and right have come to accept recent research that shows that the more spoken words an infant or toddler hears, the more rapidly she will learn and advance in school. Volume and variety matter. And what is true about the vocabulary of spoken or written English is also true, one fractal scale up, about the vocabulary of American culture.
  • those who demonized Hirsch as a right-winger missed the point. Just because an endeavor requires fluency in the past does not make it worshipful of tradition or hostile to change.
  • radicalism is made more powerful when garbed in traditionalism. As Hirsch put it: “To be conservative in the means of communication is the road to effectiveness in modern life, in whatever direction one wishes to be effective.”
  • Hence, he argued, an education that in the name of progressivism disdains past forms, schema, concepts, figures, and symbols is an education that is in fact anti-progressive and “helps preserve the political and economic status quo.” This is true. And it is made more urgently true by the changes in American demography since Hirsch gave us his list in 1987.
  • If you are an immigrant to the United States—or, if you were born here but are the first in your family to go to college, and thus a socioeconomic new arrival; or, say, a black citizen in Ferguson, Missouri deciding for the first time to participate in a municipal election, and thus a civic neophyte—you have a single overriding objective shared by all immigrants at the moment of arrival: figure out how stuff really gets done here.
  • So, for instance, a statement like “One hundred and fifty years after Appomattox, our house remains deeply divided” assumes that the reader knows that Appomattox is both a place and an event; that the event signified the end of a war; that the war was the Civil War and had begun during the presidency of a man, Abraham Lincoln, who earlier had famously declared that “a house divided against itself cannot stand”; that the divisions then were in large part about slavery; and that the divisions today are over the political, social, and economic legacies of slavery and how or whether we are to respond to those legacies.
  • But why a list, one might ask? Aren’t lists just the very worst form of rote learning and standardized, mechanized education? Well, yes and no.
  • it’s not just newcomers who need greater command of common knowledge. People whose families have been here ten generations are often as ignorant about American traditions, mores, history, and idioms as someone “fresh off the boat.”
  • The more serious challenge, for Americans new and old, is to make a common culture that’s greater than the sum of our increasingly diverse parts. It’s not enough for the United States to be a neutral zone where a million little niches of identity might flourish; in order to make our diversity a true asset, Americans need those niches to be able to share a vocabulary. Americans need a broad base of common knowledge so that diversity can be most fully activated.
  • as the pool of potential culture-makers has widened, the modes of culture creation have similarly shifted away from hierarchies and institutions to webs and networks. Wikipedia is the prime embodiment of this reality, both in how the online encyclopedia is crowd-created and how every crowd-created entry contains links to other entries.
  • so any endeavor that makes it easier for those who do not know the memes and themes of American civic life to attain them closes the opportunity gap. It is inherently progressive.
  • since I started writing this essay, dipping into the list has become a game my high-school-age daughter and I play together.
  • I’ll name each of those entries, and she’ll describe what she thinks to be its meaning. If she doesn’t know, I’ll explain it and give some back story. If I don’t know, we’ll look it up together. This of course is not a good way for her teachers to teach the main content of American history or English. But it is definitely a good way for us both to supplement what school should be giving her.
  • And however long we end up playing this game, it is already teaching her a meta-lesson about the importance of cultural literacy. Now anytime a reference we’ve discussed comes up in the news or on TV or in dinner conversation, she can claim ownership. Sometimes she does so proudly, sometimes with a knowing look. My bet is that the satisfaction of that ownership, and the value of it, will compound as the years and her education progress.
  • The trouble is, there are also many items on Hirsch’s list that don’t seem particularly necessary for entry into today’s civic and economic mainstream.
  • Which brings us back to why diversity matters. The same diversity that makes it necessary to have and to sustain a unifying cultural core demands that Americans make the core less monochromatic, more inclusive, and continuously relevant for contemporary life
  • it’s worth unpacking the baseline assumption of both Hirsch’s original argument and the battles that erupted around it. The assumption was that multiculturalism sits in polar opposition to a traditional common culture, that the fight between multiculturalism and the common culture was zero-sum.
  • As scholars like Ronald Takaki made clear in books like A Different Mirror, the dichotomy made sense only to the extent that one imagined that nonwhite people had had no part in shaping America until they started speaking up in the second half of the twentieth century.
  • The truth, of course, is that since well before the formation of the United States, the United States has been shaped by nonwhites in its mores, political structures, aesthetics, slang, economic practices, cuisine, dress, song, and sensibility.
  • In its serious forms, multiculturalism never asserted that every racial group should have its own sealed and separate history or that each group’s history was equally salient to the formation of the American experience. It simply claimed that the omni-American story—of diversity and hybridity—was the legitimate American story.
  • as Nathan Glazer has put it (somewhat ruefully), “We are all multiculturalists now.” Americans have come to see—have chosen to see—that multiculturalism is not at odds with a single common culture; it is a single common culture.
  • it is true that in a finite school year, say, with finite class time and books of finite heft, not everything about everyone can be taught. There are necessary trade-offs. But in practice, recognizing the true and longstanding diversity of American identity is not an either-or. Learning about the internment of Japanese Americans does not block out knowledge of D-Day or Midway. It is additive.
  • As more diverse voices attain ever more forms of reach and power, we need to re-integrate and reimagine Hirsch’s list of what literate Americans ought to know.
  • To be clear: A 21st-century omni-American approach to cultural literacy is not about crowding out “real” history with the perishable stuff of contemporary life. It’s about drawing lines of descent from the old forms of cultural expression, however formal, to their progeny, however colloquial.
  • Nor is Omni-American cultural literacy about raising the “self-esteem” of the poor, nonwhite, and marginalized. It’s about raising the collective knowledge of all—and recognizing that the wealthy, white, and powerful also have blind spots and swaths of ignorance
  • What, then, would be on your list? It’s not an idle question. It turns out to be the key to rethinking how a list should even get made.
  • the Internet has transformed who makes culture and how. As barriers to culture creation have fallen, orders of magnitude more citizens—amateurs—are able to shape the culture in which we must all be literate. Cat videos and Star Trek fan fiction may not hold up long beside Toni Morrison. But the entry of new creators leads to new claims of right: The right to be recognized. The right to be counted. The right to make the means of recognition and accounting.
  • It is true that lists alone, with no teaching to bring them to life and no expectation that they be connected to a broader education, are somewhere between useless and harmful.
  • This will be a list of nodes and nested networks. It will be a fractal of associations, which reflects far more than a linear list how our brains work and how we learn and create. Hirsch himself nodded to this reality in Cultural Literacy when he described the process he and his colleagues used for collecting items for their list, though he raised it by way of pointing out the danger of infinite regress.
  • His conclusion, appropriate to his times, was that you had to draw boundaries somewhere with the help of experts. My take, appropriate to our times, is that Americans can draw not boundaries so much as circles and linkages, concept sets and pathways among them.
  • Because 5,000 or even 500 items is too daunting a place to start, I ask here only for your top ten. What are ten things every American—newcomer or native born, affluent or indigent—should know? What ten things do you feel are both required knowledge and illuminating gateways to those unenlightened about American life? Here are my entries: Whiteness; The Federalist Papers; The Almighty Dollar; Organized labor; Reconstruction; Nativism; The American Dream; The Reagan Revolution; DARPA; A sucker born every minute.
Javier E

Life on Mars? You Read It Here First. - The New York Times - 1 views

  • There was no question in the mind of Percival Lowell, the astronomer, after he finished his observations of Mars during the 1907 opposition, when it was that much closer to Earth.
  • “It is a direct sequitur from this that the planet is at present the abode of intelligent constructive life,” he continued. “I may say in this connection that the theory of such life upon Mars was in no way an a priori hypothesis on my part, but deduced from the outcome of observation, and that my observations since have fully confirmed it. No other supposition is consonant with all the facts here.”
  • By the time of Lowell’s death in 1916, his theories about the Martian canals and their creators had been discredited. As The Times noted gently in a valedictory editorial, “The present judgment of the scientific world is that he was too largely governed in his researches by a vivid imagination.”
Javier E

Journeys in Alterity: Living According to a Story: A Reflection on Faith - 0 views

  • While I’ve not given up on religion in general or Catholicism in particular, I have said farewell to a specific conception of God, namely God as explanation, and in so doing have joined hands with the atheists and agnostics, if not for the whole of life’s journey, at least for a section of the walk. To clarify, I continue to call God creator and savior, but for me God is not the solution to a riddle or a formula. God’s not an answer to scientific inquiry or the end result of metaphysical speculation. God is wholly other than all these lines of human reasoning, all these constructions fashioned to explain the world. My need for God is not the need of a student seeking to explain a mathematical theorem, or the need of an ethicist looking for a basis for good behavior, or that of someone searching for the last piece of a grand puzzle. The divine isn’t the intellectual rope that ties the whole system together.
  • I find it unwise to hold on to God as an explanation, for sooner or later, what I use God to explain will likely be revealed to have a different basis. If I believe in God because God explains this, that, and the other thing, then I can be almost sure to have a belief that’s not long for this world.
  • What is left of my faith when I have forsaken this idea of God? Having fled from the crumbling ruins of the unmoved mover and the uncaused cause, where do I go in search of the sacred? What conception of the divine lies ahead of me, having kicked the dust from my feet and departed the cities of certainty and supernatural explanation? In short, why do I still believe?
  • ...3 more annotations...
  • I continue to believe, to walk the paths of faith, because I believe a story and continue to choose to believe that story. More precisely, I believe in a grand sacred history that has been given embodiment in a plurality of diverse narratives, epistles, and other sacred writings. I interpret these writings in ways literal and figurative and in ways between. While I don’t look to the books of the New Testament for a historical transcript of the life of Christ, I cling to the hope that they reveal a Divine Person and give flesh and blood anew to impossible events, namely the Incarnation, the Crucifixion, the Resurrection, and the Ascension. On the one hand, my choice to believe the truth of these writings—writings that don’t perfectly add up, to be sure—is a decision to believe that an underlying thematic truth speaks through incredible, fantastical tales told to me by mostly unknown strangers, and passed down to me by figures holy and insidious, self-giving and power-hungry, saintly and vicious. On the other hand, I find some of those who have told and retold these stories, particularly the early Christian martyrs, to be credible witnesses. Those who have given their lives for Christ did so not merely in defiance of their murderers, but as an act of witness embraced in the hope that their enemies would become their brothers and sisters. That kind of love strikes me as the height of love. And it’s been known to work wonders.
  • What does my faith give me? It gives me a love story. Not a story that explains love, but a story that gives birth to—and directs my heart, mind, and very being to—the fullest expression and fulfillment of love. It is a story that means everything if it means anything at all. It is a story about what it means to be human and what it means to be divine, both of which tell of what it means to love. My religion tells a love story about a humble God who reveals and who gives humanity, through the sacraments and other gifts, the grace to respond in faith, hope, and most importantly love. In this sacred romance, faith and hope are not ends in themselves, or even eternal things, but the temporal means to an eternal end. That end is love. According to this story, there is no need for faith or hope in heaven, and so you will not find them there. What you will find, if there is anything after death to find or a paradise to find it, is love.
  • My faith doesn’t free me from these unsettling possibilities. It doesn’t whisk me away from the battlefield like a protective Aphrodite. Instead, it fills me with fear and trembling and places me in the hopeless situation of not knowing what I love when I love my God. Yet I would not choose to be anywhere else. I’ve no interest in certainty, gnosis, or other false comforts. Nor do I wish to close the book of faith and place it on the bookshelf, unread, ignored and unlived. I intend to live according to a story I love, to share it with those I love, and to allow it to guide my steps and convert my soul, even though I journey to who knows where. And I intend as well to incline an ear to the voice of alterity, to reasons and rhymes that might expose my faith to its undoing.
catbclark

Is Most of Our DNA Garbage? - NYTimes.com - 0 views

  • Gregory believes that while some noncoding DNA is essential, most probably does nothing for us at all, and until recently, most biologists agreed with him.
  • Recent studies have revealed a wealth of new pieces of noncoding DNA that do seem to be as important to our survival as our more familiar genes.
  • ...6 more annotations...
  • Large-scale surveys of the genome have led a number of researchers to expect that the human genome will turn out to be even more full of activity than previously thought.
  • “It was pretty much a case of hubris to imagine that we could dispense with any part of the genome — as if we knew enough to say it wasn’t functional.”
  • If every piece of the genome were essential, then many of the new mutations that arise in each generation would lead to significant birth defects, with the defects only multiplying over the course of generations; in less than a century, the species would become extinct. (A rough numerical sketch of this argument appears after this list.)
  • “Much of what has been called ‘junk DNA’ in the human genome is actually a massive control panel with millions of switches regulating the activity of our genes.”
  • It’s no coincidence, researchers like Gregory argue, that bona fide creationists have used recent changes in the thinking about junk DNA to try to turn back the clock to the days before Darwin. (The recent studies on noncoding DNA “clearly demonstrate we are ‘fearfully and wonderfully made’ by our Creator God,” declared the Institute for Creation Research.)
  • Over millions of years, the human genome has spontaneously gotten bigger, swelling with useless copies of genes and new transposable elements.
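
  A rough back-of-the-envelope sketch of the mutational-load argument highlighted above. The numbers are illustrative ballpark assumptions rather than figures taken from the article (roughly 70 new mutations per child per generation is a commonly cited estimate), and the function and variable names are invented for the example.

      # Mutational-load sketch: expected new mutations per child that land in
      # functional DNA, under different assumptions about how much of the
      # genome is actually functional. Illustrative numbers only.
      NEW_MUTATIONS_PER_CHILD = 70  # assumed ballpark de novo mutation count

      def mutations_hitting_functional_dna(fraction_functional):
          """Assume mutations fall uniformly across the genome; the share that
          hits functional sequence scales with the functional fraction."""
          return NEW_MUTATIONS_PER_CHILD * fraction_functional

      for fraction in (0.08, 0.5, 1.0):
          hits = mutations_hitting_functional_dna(fraction)
          print(f"{fraction:.0%} functional genome -> ~{hits:.0f} new mutations in functional DNA per child")

  If the whole genome were functional (the 1.0 case), every child would carry dozens of new mutations in sequence that matters, which is the unsustainable load the quoted argument points to; if only a small fraction is functional, most new mutations land in inconsequential DNA.
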
Javier E

Is Huckleberry Finn's ending really lacking? Not if you're talking psychology. | Litera... - 0 views

  • What is it exactly that critics of the novel’s final chapters object to?
  • As Leo Marx put it in a 1953 essay, when Tom enters the picture, Huck falls “almost completely under his sway once more, and we are asked to believe that the boy who felt pity for the rogues is now capable of making Jim’s capture the occasion for a game. He becomes Tom’s helpless accomplice, submissive and gullible.” And to Marx, this regressive transformation is as unforgiveable as it is unbelievable.
  • psychologically, the reversion is as sound as it gets, despite the fury that it inspires. Before we rush to judge Huck—and to criticize Twain for veering so seemingly off course—we’d do well to consider a few key elements of the situations.
  • ...10 more annotations...
  • Huck is a thirteen (or thereabouts)-year-old boy. He is, in other words, a teenager. What’s more, he is a teenager from the antebellum South. Add to that the disparity between his social standing and education and Tom Sawyer’s, and you get a picture of someone who is quite different from a righteous fifty-something (or even thirty-something) literary critic who is writing in the twentieth century for a literary audience. And that someone has to be judged appropriately for his age, background, and social context—and his creator, evaluated accordingly.
  • There are a few important issues at play. Huck is not an adult. Tom Sawyer is not a stranger. The South is not a psychology lab. And slavery is not a bunch of lines projected on a screen. Each one of these factors on its own is enough to complicate the situation immensely—and together, they create one big complicated mess that makes it increasingly likely that Huck will act just as he does, by conforming to Tom’s wishes and reverting to their old group dynamic.
  • Tom is a part of Huck’s past, and there is nothing like context to cue us back to past habitual behavior in a matter of minutes. (That’s one of the reasons, incidentally, that drug addicts often revert back to old habits when back in old environments.)
  • Jim is an adult—and an adult who has become a whole lot like a parent to Huck throughout their adventures, protecting him and taking care of him (and later, of Tom as well) much as a parent would. And the behavior that he wants from Huck, when he wants anything at all, is prosocial in the extreme (an apology, to take the most famous example, for playing a trick on him in the fog; not much of an ask, it seems, unless you stop to consider that it’s a slave asking a white boy to acknowledge that he was in the wrong). Tom, on the other hand, is a peer. And his demands are far closer to the anti-social side of the scale. Is it so surprising, then, that Huck sides with his old mate?
  • Another crucial caveat to Huck’s apparent metamorphosis: we tend to behave differently in private versus public spheres.
  • behavior is highly contextual—especially when it comes to behaviors that may not be as socially acceptable as one might hope. Huck and Jim’s raft is akin to a private sphere. It is just them, alone on the river, social context flowing away. And when does Huck’s behavior start to shift? The moment that he returns to a social environment, when he joins the Grangerfords in their family feud.
  • When the researchers looked at conformity to parents, they found a steady decrease in conforming behavior. Indeed, for the majority of measures, peer and parental conformity were negatively correlated. And what’s more, the sharpest decline was in conformity to pro-social behaviors.
  • On the raft, Jim was in a new environment, where old rules need not apply—especially given its private nature. But how quickly old ways kick back in, irrespective of whether you were a Huck or a Jim in that prior context.
  • there is a chasm, she points out, between Huck’s stated affection for Jim and his willingness to then act on it, especially in these final episodes. She blames the divide on Twain’s racism. But wouldn’t it be more correct to blame Huck’s only too real humanity?
  • Twain doesn’t make Huck a hero. He makes him real. Can we blame the book for telling it like it is?
Javier E

The Foolish, Historically Illiterate, Incredible Response to Obama's Prayer Breakfast S... - 0 views

  • Inveighing against the barbarism of ISIS, the president pointed out that it would be foolish to blame Islam, at large, for its atrocities. To make this point he noted that using religion to brutalize other people is neither a Muslim invention nor, in America, a foreign one: Lest we get on our high horse and think this is unique to some other place, remember that during the Crusades and the Inquisition, people committed terrible deeds in the name of Christ. In our home country, slavery and Jim Crow all too often was justified in the name of Christ.
  • The "all too often" could just as well be "almost always." There were a fair number of pretexts given for slavery and Jim Crow, but Christianity provided the moral justification
  • Christianity did not "cause" slavery, any more than Christianity "caused" the civil-rights movement. The interest in power is almost always accompanied by the need to sanctify that power. That is what the Muslim terrorists in ISIS are seeking to do today, and that is what Christian enslavers and Christian terrorists did for the lion's share of American history.
  • ...3 more annotations...
  • Stephens went on to argue that the "Christianization of the barbarous tribes of Africa" could only be accomplished through enslavement. And enslavement was not made possible through Robert's Rules of Order, but through a 250-year reign of mass torture, industrialized murder, and normalized rape—tactics which ISIS would find familiar. Its moral justification was not "because I said so," it was "Providence," "the curse against Canaan," "the Creator," "and Christianization." In just five years, 750,000 Americans died because of this peculiar mission of "Christianization." Many more died before, and many more died after. In his "Segregation Now" speech, George Wallace invokes God 27 times and calls the federal government opposing him "a system that is the very opposite of Christ."
  • That this relatively mild, and correct, point cannot be made without the comments being dubbed, "the most offensive I’ve ever heard a president make in my lifetime,” by a former Virginia governor gives you some sense of the limited tolerance for any honest conversation around racism in our politics.
  • related to that is the need to infantilize and deify our history. Pointing out that Americans have done, on their own soil, in the name of their own God, something similar to what ISIS is doing now does not make ISIS any less barbaric, or any more correct.
Javier E

Heady Stakes for 'Black-ish' on ABC - NYTimes.com - 0 views

  • hovering above all that is a more subtle — and quietly clever — narrative arc, involving the gap between parents and children and how each generation has a different awareness of what it means to be black in 2014.
  • I want it to succeed because the show arrives when black characters on mainstream broadcast networks who directly deal with issues like race are incredibly rare.
  • so far, his approach seems to be a hit. The premiere resonated with critics and attracted a robust 11 million viewers, besides generating a lot of positive reactions and discussions on social media. In a vote of confidence, ABC has given the show a full-season order.
  • ...14 more annotations...
  • it seems as if networks think that post-racial story lines are the only acceptable ways of showcasing black characters on television.
  • TV is resplendent with ethnically diverse casts, from procedurals like “Law & Order: SVU” and “NCIS: Los Angeles” to hits like “Scandal” and “Elementary” to sitcoms like “New Girl” and “Brooklyn Nine-Nine.”  But the characters on those series don’t often deal directly with racial issues in everyday life and, by not doing so, perpetuate another kind of colorblindness, one that homogenizes characters and treats race as inconsequential, when it is anything but.
  • “The PC way of handling culture has been to not talk about it,” Kenya Barris, the show’s creator, said in an interview. “But we should be talking about it.”
  • What black viewers are left with instead, said Dayna Chatman, a media researcher at the Annenberg School for Communication and Journalism at the University of Southern California, is a dynamic that “makes whiteness the norm.”
  • reality television often showcases African-Americans, but since that genre is often about over-the-top performances, she said, it isn’t “particularly representative or flattering.”
  • there’s no middle ground: Either race is largely absent or exaggerated to the point of caricature.
  • The lack of texture and diversity on television is harder to ignore amid the rise of streaming and online series (say, Netflix’s “Orange Is the New Black” or Issa Rae’s “The Misadventures of Awkward Black Girl”) as well as social media like Instagram and Vine (see King Bach’s account). They offer a welcome and much more nuanced window into black humor and culture.
  • “The business explanation is always that this isn’t what the marketplace is asking for,”
  • He said ABC and cable networks pursued him and the “Black-ish” pilot “very aggressively.” Contrary to popular belief, networks “are looking for something that deals with diversity,” he said. “The problem has been timing and having the right package behind it.”
  • “This is the first time in American history where the most famous people in America are black,” he said, naming the Obama family and the musicians Kanye West and Beyoncé. “But there’s still a really obvious invisibility on television.”
  • Mr. Barris said he was determined to do more than create a successor to “The Cosby Show,” although “Black-ish” draws from its legacy. But while the popularity of the Huxtable family centered on its warmth and relatability, it was, Mr. Barris said, “about a family that happened to be black.” He added that he wanted his show to be much more cognizant of modern racial identity, and to reflect the class and racial dynamics of being black in America.
  • “We are hyperaware of how people and the media perceive us,” she said. “And who gets it and who doesn’t get it.”
  • In 2005, Mr. Chappelle walked away from his lucrative show on Comedy Central after expressing discomfort that the line between his social commentary and racial satire had grown too thin.
  • Today, there are a few other shows operating in the space left behind by Mr. Chappelle, including “Key & Peele” on Comedy Central and “Black Jesus” on Adult Swim. But those are cable outlets with smaller audiences, whereas “Black-ish” is on a mainstream network.
Javier E

The Obama Boom - The New York Times - 1 views

  • What did Mr. Obama do that was supposed to kill jobs? Quite a lot, actually. He signed the 2010 Dodd-Frank financial reform, which critics claimed would crush employment by starving businesses of capital.
  • He raised taxes on high incomes, especially at the very top, where average tax rates rose by about six and a half percentage points after 2012, a step that critics claimed would destroy incentives.
  • Yet none of the dire predicted consequences of these policies have materialized.
  • ...6 more annotations...
  • And he enacted a health reform that went into full effect in 2014, amid claims that it would have catastrophic effects on employment.
  • what do we learn from this impressive failure to fail? That the conservative economic orthodoxy dominating the Republican Party is very, very wrong.
  • conservative orthodoxy has a curiously inconsistent view of the abilities and motivations of corporations and wealthy individuals — I mean, job creators.
  • On one side, this elite is presumed to be a bunch of economic superheroes, able to deliver universal prosperity by summoning the magic of the marketplace. On the other side, they’re depicted as incredibly sensitive flowers who wilt in the face of adversity — raise their taxes a bit, subject them to a few regulations, or for that matter hurt their feelings in a speech or two, and they’ll stop creating jobs and go sulk in their tents, or more likely their mansions.
  • It’s a doctrine that doesn’t make much sense, but it conveys a clear message that, whaddya know, turns out to be very convenient for the elite: namely, that injustice is a law of nature, that we’d better not do anything to make our society less unequal or protect ordinary families from financial risks. Because if we do, the usual suspects insist, we’ll be severely punished by the invisible hand, which will collapse the economy.
  • From a conservative point of view, Mr. Obama did everything wrong, afflicting the comfortable (slightly) and comforting the afflicted (a lot), and nothing bad happened. We can, it turns out, make our society better after all.
kushnerha

The Words That Killed Medieval Jews - The New York Times - 0 views

  • DO harsh words lead to violent acts? At a moment when hate speech seems to be proliferating, it’s a question worth asking.
  • worry that heated anti-Muslim political rhetoric would spark an increase in attacks against Muslims.
  • Some claim that last month’s mass shooting in Colorado Springs was provoked by Carly Fiorina’s assertion that Planned Parenthood was “harvesting baby parts”; Mrs. Fiorina countered that language could not be held responsible for the deeds of a “deranged” man.
  • ...12 more annotations...
  • beating of a homeless Hispanic man in Boston, allegedly inspired by Donald J. Trump’s anti-immigration rhetoric, and by the shooting deaths of police officers in California, Texas and Illinois, which some have attributed to anti-police sentiment expressed at Black Lives Matter protests.
  • history does show that a heightening of rhetoric against a certain group can incite violence against that group, even when no violence is called for. When a group is labeled hostile and brutal, its members are more likely to be treated with hostility and brutality. Visual images are particularly powerful, spurring actions that may well be unintended by the images’ creators.
  • Official Christian theology and policy toward Jews remained largely unchanged in the Middle Ages. Over roughly 1,000 years, Christianity condemned the major tenets of Judaism and held “the Jews” responsible for the death of Jesus. But the terms in which these ideas were expressed changed radically.
  • Before about 1100, Christian devotions focused on Christ’s divine nature and triumph over death. Images of the crucifixion showed Jesus alive and healthy on the cross. For this reason, his killers were not major focuses in Christian thought. No anti-Jewish polemics were composed during these centuries
  • In an effort to spur compassion among Christian worshipers, preachers and artists began to dwell in vivid detail on Christ’s pain. Christ morphed from triumphant divine judge to suffering human savior. A parallel tactic, designed to foster a sense of Christian unity, was to emphasize the cruelty of his supposed tormentors, the Jews.
  • The “Goad of Love,” a retelling of the crucifixion that is considered the first anti-Jewish Passion treatise, was written around 1155-80. It describes Jews as consumed with sadism and blood lust. They were seen as enemies not only of Christ, but also of living Christians; it was at this time that Jews began to be accused of ritually sacrificing Christian children.
  • Ferocious anti-Jewish rhetoric began to permeate sermons, plays and polemical texts. Jews were labeled demonic and greedy. In one diatribe, the head of the most influential monastery in Christendom thundered at the Jews: “Why are you not called brute animals? Why not beasts?” Images began to portray Jews as hooknosed caricatures of evil.
  • the First Crusade had called only for an “armed pilgrimage” to retake Jerusalem from Muslims, the first victims of the Crusade were not the Turkish rulers of Jerusalem but Jewish residents of the German Rhineland. Contemporary accounts record the crusaders asking why, if they were traveling to a distant land to “kill and to subjugate all those kingdoms that do not believe in the Crucified,” they should not also attack “the Jews, who killed and crucified him?”
  • At no point did Christian authorities promote or consent to the violence. Christian theology, which applied the Psalm verse “Slay them not” to Jews, and insisted that Jews were not to be killed for their religion, had not changed. Clerics were at a loss to explain the attacks. A churchman from a nearby town attributed the massacres to “some error of mind.”
  • But not all the Rhineland killers were crazy. The crusaders set out in the Easter season. Both crusade and Easter preaching stirred up rage about the crucifixion and fear of hostile and threatening enemies.
  • Sometimes the perpetrators were zealous holy warriors, sometimes they were opportunistic business rivals, sometimes they were parents grieving for children lost to accident or crime, or fearful of the ravages of a new disease.
  • Some may well have been insane. But sane or deranged, they did not pick their victims in a vacuum. It was repeated and dehumanizing excoriation that led those medieval Christians to attack people who had long been their neighbors.
kushnerha

Viewpoint: Why do fictional universes matter? - BBC News - 0 views

  • The myths that make up Western culture have changed. Extended fictional universes, from Harry Potter to Game of Thrones, have taken over from Shakespeare and the Bible
  • We might like to think that the Western world is based on the noble myths of the Greeks and Romans, the tales of our greatest novelists, and a common understanding of the Bible and Shakespeare. But in truth, the stories that really bind us together are completely different.
  • whole extended fictional universes, entirely self-consistent, with deep histories, hundreds of characters, and even a form of theological scholarship.
  • ...5 more annotations...
  • modern civilisation, like all civilisations before it, has settled around a set of myths and legends as the basis of its culture. They are more complex, more interesting, more sophisticated, and with a much richer interaction between creators and fans than you might think
  • shared mythos, the cultural touchpoints we can use as a framework to tell each other stories, is no longer the Bible or the Odyssey. It's Star Wars and Star Trek, Gotham City and Westeros
  • For fans of all of these extended worlds, there is scholarship, debates, and a need to get the stories straight that rivals contemporary mainstream theology.
  • employs a whole team dedicated to keeping the mythos consistent. Recently, there has been a schism in the Star Wars canon, and stories known as the backdrop of this insanely popular world are being reassessed, with no sense of oddity, as to their actual historical accuracy.
  • write new archetypes. Fan fiction is our new folklore - a continuation of the traditions that gave us the tales of Robin Hood
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform – after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.”
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

Uber, Arizona, and the Limits of Self-Driving Cars - The Atlantic - 0 views

  • it’s a good time for a critical review of the technical literature of self-driving cars. This literature reveals that autonomous vehicles don’t work as well as their creators might like the public to believe.
  • The world is a 3-D grid with x, y, and z coordinates. The car moves through the grid from point A to point B, using highly precise GPS measurements gathered from nearby satellites. Several other systems operate at the same time. The car’s sensors bounce out laser radar waves and measure the response time to build a “picture” of what is outside.
  • It is a masterfully designed, intricate computational system. However, there are dangers.
  • ...11 more annotations...
  • Self-driving cars navigate by GPS. What happens if a self-driving school bus is speeding down the highway and loses its navigation system at 75 mph because of a jammer in the next lane?
  • Because they are not calculating the trajectory for the stationary fire truck, only for objects in motion (like pedestrians or bicyclists), they can’t react quickly to register a previously stationary object as an object in motion.
  • If the car was programmed to save the car’s occupants at the expense of pedestrians, the autonomous-car industry is facing its first public moment of moral reckoning.
  • This kind of blind optimism about technology, the assumption that tech is always the right answer, is a kind of bias that I call technochauvinism.
  • an overwhelming number of tech people (and investors) seem to want self-driving cars so badly that they are willing to ignore evidence suggesting that self-driving cars could cause as much harm as good
  • By this point, many people know about the trolley problem as an example of an ethical decision that has to be programmed into a self-driving car.
  • With driving, the stakes are much higher. In a self-driving car, death is an unavoidable feature, not a bug.
  • But imagine the opposite scenario: The car is programmed to sacrifice the driver and the occupants to preserve the lives of bystanders. Would you get into that car with your child? Would you let anyone in your family ride in it? Do you want to be on the road, or on the sidewalk, or on a bicycle, next to cars that have no drivers and have unreliable software that is designed to kill you or the driver?
  • Plenty of people want self-driving cars to make their lives easier, but self-driving cars aren’t the only way to fix America’s traffic problems. One straightforward solution would be to invest more in public transportation.
  • Public-transportation funding is a complex issue that requires massive, collaborative effort over a period of years. It involves government bureaucracy. This is exactly the kind of project that tech people often avoid attacking, because it takes a really long time and the fixes are complicated.
  • Plenty of people, including technologists, are sounding warnings about self-driving cars and how they attempt to tackle very hard problems that haven’t yet been solved. People are warning of a likely future for self-driving cars that is neither safe nor ethical nor toward the greater good. Still,  the idea that self-driving cars are nifty and coming soon is often the accepted wisdom, and there’s a tendency to forget that technologists have been saying “coming soon” for decades now.
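The “bounce out laser radar waves and measure the response time” step described in the excerpts above comes down to a time-of-flight calculation. The sketch below is a minimal illustration of that arithmetic; the pulse times are invented values, and nothing here resembles any vendor's actual perception code.

```python
# Minimal time-of-flight sketch: turn a lidar pulse's round-trip time into a
# distance estimate. Illustrative only -- a real perception stack fuses
# thousands of such returns per second with GPS, radar, and camera data.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(seconds: float) -> float:
    """Distance to the reflecting object, in meters.

    The pulse travels out and back, so the one-way distance is half the
    total path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * seconds / 2

# Hypothetical return times for three pulses.
for t in (66.7e-9, 133.4e-9, 333.5e-9):
    print(f"round trip {t * 1e9:6.1f} ns  ->  object at {distance_from_round_trip(t):5.1f} m")
```

Measuring a distance is the easy part; deciding what a return belongs to (a pedestrian, a cyclist, a parked fire truck) is where the failures described in the excerpts come from.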
Emilio Ergueta

What is Art? and/or What is Beauty? | Issue 108 | Philosophy Now - 1 views

  • Art is something we do, a verb. Art is an expression of our thoughts, emotions, intuitions, and desires, but it is even more personal than that: it’s about sharing the way we experience the world, which for many is an extension of personality. It is the communication of intimate concepts that cannot be faithfully portrayed by words alone.
  • Beauty is much more than cosmetic: it is not about prettiness. There are plenty of pretty pictures available at the neighborhood home furnishing store; but these we might not refer to as beautiful; and it is not difficult to find works of artistic expression that we might agree are beautiful that are not necessarily pretty.
  • Works of art may elicit a sense of wonder or cynicism, hope or despair, adoration or spite; the work of art may be direct or complex, subtle or explicit, intelligible or obscure; and the subjects and approaches to the creation of art are bounded only by the imagination of the artist.
  • ...4 more annotations...
  • The game changers – the square pegs, so to speak – are those who saw traditional standards of beauty and decided specifically to go against them, perhaps just to prove a point. Take Picasso, Munch, Schoenberg, to name just three. They have made a stand against these norms in their art. Otherwise their art is like all other art: its only function is to be experienced, appraised, and understood (or not).
  • art is not necessarily positive: it can be deliberately hurtful or displeasing: it can make you think about or consider things that you would rather not. But if it evokes an emotion in you, then it is art.
  • art cannot be simply defined on the basis of concrete tests like ‘fidelity of representation’ or vague abstract concepts like ‘beauty’. So how can we define art in terms applying to both cave-dwellers and modern city sophisticates? To do this we need to ask: What does art do? And the answer is surely that it provokes an emotional, rather than a simply cognitive response. One way of approaching the problem of defining art, then, could be to say: Art consists of shareable ideas that have a shareable emotional impact
  • A work of art is that which asks a question which a non-art object such as a wall does not: What am I? What am I communicating? The responses, both of the creator artist and of the recipient audience, vary, but they invariably involve a judgement, a response to the invitation to answer. The answer, too, goes towards deciphering that deeper question – the ‘Who am I?’ which goes towards defining humanity.
sissij

Nimuno Loops Invents LEGO Sticky Tape So You Can Build Vertical or Even Defy Gravity | ... - 1 views

  • The creators of the Nimuno Loops tape have done some genius inventing, bringing us a product that makes you wonder why no one else has come up with it before.
  • They have created the world's first toy block compatible tape — simple, versatile, cheap, and promising unlimited creative possibilities.
  •  
    LEGO was one of my favorite toys when I was little. The creativity in those building blocks inspired my mind. Different series have different building blocks, but they are all very creative and interesting. And now, they have come up with a new idea of LEGO-compatible tape. I think this is a genius idea. This invention is a combination of tape and LEGO building boards, and I am amazed at people’s ability to draw useful connections between totally different objects. I think it shows how the human mind’s knack for making connections benefits our lives and mindset. --Sissi (3/31/2017)
sissij

Prejudice AI? Machine Learning Can Pick up Society's Biases | Big Think - 1 views

  • We think of computers as emotionless automatons and artificial intelligence as stoic, zen-like programs, mirroring Mr. Spock, devoid of prejudice and unable to be swayed by emotion.
  • They say that AI picks up our innate biases about sex and race, even when we ourselves may be unaware of them. The results of this study were published in the journal Science. (A toy sketch of how such associations can be measured in word vectors follows these notes.)
  • After interacting with certain users, she began spouting racist remarks.
  • ...2 more annotations...
  • It just learns everything from us and, as our echo, picks up the prejudices we’ve become deaf to.
  • AI will have to be programmed to embrace equality.
  •  
    I just feel like this is so ironic. As the parents of AI, humans themselves can’t even achieve equality, so how can we expect the robots we make to perform perfect humanity and embrace flawless equality? I think equality itself is flawed. How can we define equality? Just as we cannot define fairness, we cannot define equality. I think this robot picking up racist remarks shows how children become racist. It also reflects how powerful cultural context and social norms are. They can shape us subconsciously. --Sissi (4/20/2017)
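The Science study referenced in the notes above measured bias as statistical associations between word vectors learned from ordinary text. The sketch below is a toy version of that idea using hand-made three-dimensional vectors, not the paper's data or procedure; the words, the coordinates, and the "association gap" score are all invented for illustration.

```python
# Toy illustration of how association bias can surface in word vectors.
# Real studies use embeddings trained on billions of words; these tiny
# hand-made vectors exist only to show the arithmetic of the comparison.

import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Invented 3-d "embeddings". In a trained model these coordinates would come
# from co-occurrence statistics in human-written text.
vectors = {
    "engineer": (0.9, 0.2, 0.1),
    "nurse":    (0.2, 0.9, 0.1),
    "he":       (0.8, 0.3, 0.2),
    "she":      (0.3, 0.8, 0.2),
}

for occupation in ("engineer", "nurse"):
    gap = cosine(vectors[occupation], vectors["he"]) - cosine(vectors[occupation], vectors["she"])
    leaning = "closer to 'he'" if gap > 0 else "closer to 'she'"
    print(f"{occupation}: association gap {gap:+.2f} ({leaning})")
```

The study's point is that embeddings trained on real text show gaps like these without anyone programming them in; the model, as the excerpt puts it, learns everything from us and echoes it back.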
Javier E

The Boston Rally and the Left's Intolerance of Free Speech - 2 views

  • I wanted to see the ways in which my internet-mediated intellectual life was dominated by assumptions that did not recognize themselves as assumptions, to understand how the perspective that did not understand itself to be a perspective had distorted my vision of the world. I wanted to better see the water in which my school of fish swims.” So he tried to find a new perspective, but still failed. He realized what I once saw. You cannot edit this stream. It edits you in the end. This is self-knowledge: “[T]he fact that so many people like me write the professional internet, the fact that the creators of the idioms and attitudes of our newsmedia and cultural industry almost universally come from a very thin slice of the American populace, is genuinely dangerous.”
  • It is — and getting more so. I just want to say this to my friend: You have checked yourself in not because you are insane, but because you tried to retain your sanity. It is America that is going nuts; and the internet is one reason why.
priyankaghosh

How to Become a Great Investor like Warren Buffet - 0 views

  •  
    Warren Buffet is no less than a god to investors around the world. His investment journey dates back to when he started trading stocks at the very young age of 11. By the time he turned 16, he had already built a net worth of $6,000 (the equivalent of $53,000 today). With a net worth of over $70 billion, this 87-year-old wealth creator is currently the second richest person in the world, behind his friend Bill Gates. So, what did Warren Buffet do right to amass such huge wealth? Here are a few of his key investment strategies from which you can pick up useful tips. Click the link above to read the full article.
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • ...52 more annotations...
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code.
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated top of his class at the California Institute of Technology for electrical engineering,
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spend 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it. (A minimal illustration of what one flipped bit does to a stored value appears after these excerpts.)
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop. (A hand-coded version of this little state machine appears after these excerpts.)
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • This practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.” (At a million requests per second, a one-in-a-billion coincidence turns up roughly every seventeen minutes.)
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy. (A toy exhaustive state check, written in ordinary Python rather than TLA+, appears after these excerpts.)
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
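To make the single-bit-flip point in the Toyota excerpt concrete, here is a minimal sketch of what one inverted bit does to a stored integer. The "throttle percent" framing is hypothetical; this is not Toyota's code or memory layout, just the arithmetic of a bit flip.

```python
# One flipped bit in a stored integer. The "throttle percent" variable is
# purely illustrative -- the point is only that a single corrupted bit can
# turn a small value into a very different one.

def flip_bit(value: int, bit: int) -> int:
    """Return value with the given bit inverted (0 -> 1 or 1 -> 0)."""
    return value ^ (1 << bit)

throttle_percent = 4                          # hypothetical commanded value
corrupted = flip_bit(throttle_percent, 6)     # bit 6 flips: 4 becomes 68

print(f"intended value:     {throttle_percent:3d}  (binary {throttle_percent:08b})")
print(f"after one bit flip: {corrupted:3d}  (binary {corrupted:08b})")
```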
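The elevator diagram in the model-based-design excerpt can be written down as a small explicit state machine. The sketch below hand-codes those transitions in Python only to show what the boxes and lines amount to; a model-based tool would generate code like this from the diagram rather than have anyone type it.

```python
# Hand-written version of the elevator rules described in the excerpt: the
# only way to start moving is to close the door, and the only way to open
# the door is to stop. The states are the article's example; this Python
# encoding is just an illustration.

TRANSITIONS = {
    ("door_open",   "close_door"): "door_closed",
    ("door_closed", "open_door"):  "door_open",
    ("door_closed", "start"):      "moving",
    ("moving",      "stop"):       "door_closed",
}

def step(state: str, event: str) -> str:
    """Apply an event; combinations not in the table leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "door_open"
for event in ("start", "close_door", "start", "open_door", "stop", "open_door"):
    new_state = step(state, event)
    print(f"{state:11s} --{event:>10s}--> {new_state}")
    state = new_state
```

Trying to "start" with the door open, or to open the door while moving, simply does nothing; the rule is as visible in the transition table as it is in the diagram.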
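TLA+ has its own mathematical notation, and nothing below is TLA+ syntax. The sketch only conveys the underlying idea in the excerpts: describe states and allowed transitions, then let a machine explore every reachable combination and check an invariant. The two-writer counter model and the invariant are invented for illustration.

```python
# Not TLA+ -- a brute-force exploration of every reachable state of a tiny
# invented model: two processes each read a shared counter, then write back
# "what they read + 1". Exhaustive search finds the interleaving that loses
# an update, the kind of "rare" scenario human intuition tends to miss.

from collections import deque

# State: (pc_a, local_a, pc_b, local_b, counter); pc 0 = read, 1 = write, 2 = done.
INITIAL = (0, 0, 0, 0, 0)

def next_states(state):
    pc_a, loc_a, pc_b, loc_b, counter = state
    if pc_a == 0:
        yield (1, counter, pc_b, loc_b, counter)      # A reads the counter
    elif pc_a == 1:
        yield (2, loc_a, pc_b, loc_b, loc_a + 1)      # A writes its read value + 1
    if pc_b == 0:
        yield (pc_a, loc_a, 1, counter, counter)      # B reads the counter
    elif pc_b == 1:
        yield (pc_a, loc_a, 2, loc_b, loc_b + 1)      # B writes its read value + 1

seen, queue, violations = {INITIAL}, deque([INITIAL]), []
while queue:
    state = queue.popleft()
    pc_a, _, pc_b, _, counter = state
    if pc_a == 2 and pc_b == 2 and counter != 2:      # invariant: both done => counter == 2
        violations.append(state)
    for nxt in next_states(state):
        if nxt not in seen:
            seen.add(nxt)
            queue.append(nxt)

print(f"explored {len(seen)} states; {len(violations)} end states break the invariant")
```

A real TLA+ specification plus its model checker does this over vastly larger state spaces, against a precise specification rather than ad hoc Python, but the shape of the guarantee, that every reachable state has been checked, is the same.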
katedriscoll

The Quest to Tell Science from Pseudoscience | Boston Review - 0 views

  • Of the answers that have been proposed, Popper’s own criterion—falsifiability—remains the most commonly invoked, despite serious criticism from both philosophers and scientists. These attacks fatally weakened Popper’s proposal, yet its persistence over a century of debates helps to illustrate the challenge of demarcation—a problem no less central today than it was when Popper broached it
  • Popper’s answer emerged. Popper was born just after the turn of the twentieth century in Vienna—the birthplace of psychoanalysis—and received his doctorate in psychology in 1928. In the early 1920s Popper volunteered in the clinics of Alfred Adler, who had split with his former mentor, the creator of psychoanalysis: Sigmund Freud. Precocious interest in psychoanalysis, and his subsequent rejection of it, were crucial in Popper’s later formulation of his philosophical views on science.
  • At first, Popper was quite taken with logical empiricism, but he would diverge from the mainstream of the movement and develop his own framework for understanding scientific thought in his two influential books The Logic of Scientific Discovery (1934, revised and translated to English in 1959) and Conjectures and Refutations (1962). Popper claimed to have formulated his initial ideas about demarcation in 1919, when he was seventeen years old. He had, he writes, “wished to distinguish between science and pseudo-science; knowing very well that science often errs, and that pseudoscience may happen to stumble on the truth.”