TOK Friends / Group items tagged Adam

Javier E

A Modest Proposal for More Back-Stabbing in Preschool - NYTimes.com - 0 views

  • I am a deluded throwback to carefree days, and in my attempt to raise a conscious, creative and socially and environmentally responsible child while lacking the means to also finance her conscious, creative and environmentally and socially responsible lifestyle forever, I’d accidentally gone and raised a hothouse serf. Oops.
  • Reich’s thesis is that some inequality is inevitable, even necessary, in a free-market system. But what makes an economy stable and prosperous is a strong, vibrant, growing middle class. In the three decades after World War II, a period that Reich calls “the great prosperity,” the G.I. Bill, the expansion of public universities and the rise of labor unions helped create the biggest, best-educated middle class in the world. Reich describes this as an example of a “virtuous circle” in which productivity grows, wages increase, workers buy more, companies hire more, tax revenues increase, government invests more, workers are better educated. On the flip side, when the middle class doesn’t share in the economic gains, it results over time in a downward vicious cycle: Wages stagnate, workers buy less, companies downsize, tax revenues decrease, government cuts programs, workers are less educated, unemployment rises, deficits grow. Since the crash that followed the deregulation of the financial markets, we have struggled to emerge from such a cycle.
  • What if the kid got it in her head that it was a good idea to go into public service, the helping professions, craftsmanship, scholarship or — God help her — the arts? Wouldn’t a greedier, more back-stabby style of early education be more valuable to the children of the shrinking middle class ­ — one suited to the world they are actually living in?
  • ...3 more annotations...
  • Are we feeding our children a bunch of dangerous illusions about fairness and hard work and level playing fields? Are ideals a luxury only the rich can afford?
  • I’m reminded of the quote by John Adams: “I must study politics and war, that my sons may have the liberty to study mathematics and philosophy. My sons ought to study mathematics and philosophy, geography, natural history [and] naval architecture . . . in order to give their children a right to study painting, poetry, music, architecture, tapestry and porcelain.” For all intents and purposes, I guess I studied porcelain. The funny thing is that my parents came from a country (Peru) with a middle class so small that parents had to study business so that their children could study business. If I didn’t follow suit, it’s at least in part because I spent my childhood in the 1970s absorbing the nurturing message of a progressive pop culture that told me I could be anything I wanted, because this is America.
  • “When we see the contrast between the values we share and the realities we live in, that is the fundamental foundation for social change.”
Javier E

Adam Kirsch: Art Over Biology | The New Republic - 1 views

  • Nietzsche, who wrote in Human, All Too Human, under the rubric “Art dangerous for the artist,” about the particular ill-suitedness of the artist to flourishing in a modern scientific age: When art seizes an individual powerfully, it draws him back to the views of those times when art flowered most vigorously.... The artist comes more and more to revere sudden excitements, believes in gods and demons, imbues nature with a soul, hates science, becomes unchangeable in his moods like the men of antiquity, and desires the overthrow of all conditions that are not favorable to art.... Thus between him and the other men of his period who are the same age a vehement antagonism is finally generated, and a sad end
  • What is modern is the sense of the superiority of the artist’s inferiority, which is only possible when the artist and the intellectual come to see the values of ordinary life—prosperity, family, worldly success, and happiness—as inherently contemptible.
  • Art, according to a modern understanding that has not wholly vanished today, is meant to be a criticism of life, especially of life in a materialist, positivist civilization such as our own. If this means the artist does not share in civilization’s boons, then his suffering will be a badge of honor.
  • ...18 more annotations...
  • The iron law of Darwinian evolution is that everything that exists strives with all its power to reproduce, to extend life into the future, and that every feature of every creature can be explained as an adaptation toward this end. For the artist to deny any connection with the enterprise of life, then, is to assert his freedom from this universal imperative; to reclaim negatively the autonomy that evolution seems to deny to human beings. It is only because we can freely choose our own ends that we can decide not to live for life, but for some other value that we posit. The artist’s decision to produce spiritual offspring rather than physical ones is thus allied to the monk’s celibacy and the warrior’s death for his country, as gestures that deny the empire of mere life.
  • Animals produce beauty on their bodies; humans can also produce it in their artifacts. The natural inference, then, would be that art is a human form of sexual display, a way for males to impress females with spectacularly redundant creations.
  • For Darwin, the human sense of beauty was not different in kind from the bird’s.
  • Still, Darwin recognized that the human sense of beauty was mediated by “complex ideas and trains of thought,” which make it impossible to explain in terms as straightforward as a bird’s:
  • Put more positively, one might say that any given work of art can be discussed critically and historically, but not deduced from the laws of evolution.
  • with the rise of evolutionary psychology, it was only a matter of time before the attempt was made to explain art in Darwinian terms. After all, if ethics and politics can be explained by game theory and reciprocal altruism, there is no reason why aesthetics should be different: in each case, what appears to be a realm of human autonomy can be reduced to the covert expression of biological imperatives
  • Still, there is an unmistakable sense in discussions of Darwinian aesthetics that by linking art to fitness, we can secure it against charges of irrelevance or frivolousness—that mattering to reproduction is what makes art, or anything, really matter.
  • The first popular effort in this direction was the late Denis Dutton’s much-discussed book The Art Instinct, which appeared in 2009.
  • Dutton’s Darwinism was aesthetically conservative: “Darwinian aesthetics,” he wrote, “can restore the vital place of beauty, skill, and pleasure as high artistic values.” Dutton’s argument has recently been reiterated and refined by a number of new books,
  • “The universality of art and artistic behaviors, their spontaneous appearance everywhere across the globe ... and the fact that in most cases they can be easily recognized as artistic across cultures suggest that they derive from a natural, innate source: a universal human psychology.”
  • Again like language, art is universal in the sense that any local expression of it can be “learned” by anyone.
  • Yet earlier theorists of evolution were reluctant to say that art was an evolutionary adaptation like language, for the simple reason that it does not appear to be evolutionarily adaptive.
  • Stephen Jay Gould suggested that art was not an evolutionary adaptation but what he called a “spandrel”—that is, a showy but accidental by-product of other adaptations that were truly functional.
  • the very words “success” and “failure,” despite themselves, bring an emotive and ethical dimension into the discussion, so impossible is it for human beings to inhabit a valueless world. In the nineteenth century, the idea that fitness for survival was a positive good motivated social Darwinism and eugenics. Proponents of these ideas thought that in some way they were serving progress by promoting the flourishing of the human race, when the basic premise of Darwinism is that there is no such thing as progress or regress, only differential rates of reproduction
  • In particular, Darwin suggests that it is impossible to explain the history or the conventions of any art by the general imperatives of evolution
  • Boyd begins with the premise that human beings are pattern-seeking animals: both our physical perceptions and our social interactions are determined by our brain’s innate need to find and to make coherent patterns.
  • Art, then, can be defined as the calisthenics of pattern-finding. “Just as animal physical play refines performance, flexibility, and efficiency in key behaviors,” Boyd writes, “so human art refines our performance in our key perceptual and cognitive modes, in sight (the visual arts), sound (music), and social cognition (story). These three modes of art, I propose, are adaptations ... they show evidence of special design in humans, design that offers survival and especially reproductive advantages.”
Javier E

What Gamergate should have taught us about the 'alt-right' | Technology | The Guardian - 0 views

  • Gamergate
  • The 2014 hashtag campaign, ostensibly founded to protest about perceived ethical failures in games journalism, clearly thrived on hate – even though many of those who aligned themselves with the movement either denied there was a problem with harassment, or wrote it off as an unfortunate side effect
  • Sure, women, minorities and progressive voices within the industry were suddenly living in fear. Sure, those who spoke out in their defence were quickly silenced through exhausting bursts of online abuse. But that wasn’t why people supported it, right? They were disenfranchised, felt ignored, and wanted to see a systematic change.
  • ...23 more annotations...
  • Is this all sounding rather familiar now? Does it remind you of something?
  • it quickly became clear that the GamerGate movement was a mess – an undefined mission to Make Video Games Great Again via undecided means.
  • After all, the culture war that began in games now has a senior representative in the White House. As a founder member and former executive chair of Breitbart News, Steve Bannon had a hand in creating media monster Milo Yiannopoulos, who built his fame and Twitter following by supporting and cheerleading Gamergate. This hashtag was the canary in the coalmine, and we ignored it.
  • Gamergate was an online movement that effectively began because a man wanted to punish his ex-girlfriend. Its most notable achievement was harassing a large number of progressive figures – mostly women – to the point where they felt unsafe or considered leaving the industry.
  • The similarities between Gamergate and the far-right online movement, the “alt-right”, are huge, startling and in no way a coincidence
  • These figures gave Gamergate a new sense of direction – generalising the rhetoric: this was now a wider war between “Social Justice Warriors” (SJWs) and everyday, normal, decent people. Games were simply the tip of the iceberg – progressive values, went the argument, were destroying everything
  • In 2016, new wave conservative media outlets like Breitbart have gained trust with their audience by painting traditional news sources as snooty and aloof. In 2014, video game YouTube stars, seeking to appear in touch with online gaming communities, unscrupulously proclaimed that traditional old-media sources were corrupt. Everything we’re seeing now, had its precedent two years ago.
  • With 2014’s Gamergate, Breitbart seized the opportunity to harness the pre-existing ignorance and anger among disaffected young white dudes. With Trump’s movement in 2016, the outlet was effectively running his campaign: Steve Bannon took leave of his role at the company in August 2016 when he was hired as chief executive of Trump’s presidential campaign
  • young men converted via 2014’s Gamergate, are being more widely courted now. By leveraging distrust and resentment towards women, minorities and progressives, many of Gamergate’s most prominent voices – characters like Mike Cernovich, Adam Baldwin, and Milo Yiannopoulos – drew power and influence from its chaos
  • no one in the movement was willing to be associated with the abuse being carried out in its name. Prominent supporters on Twitter, in subreddits and on forums like 8Chan, developed a range of pernicious rhetorical devices and defences to distance themselves from threats to women and minorities in the industry: the targets were lying or exaggerating, they were too precious; a language of dismissal and belittlement was formed against them. Safe spaces, snowflakes, unicorns, cry bullies. Even when abuse was proven, the usual response was that people on their side were being abused too. These techniques, forged in Gamergate, have become the standard toolset of far-right voices online
  • The majority of people who voted for Trump will never take responsibility for his racist, totalitarian policies, but they’ll provide useful cover and legitimacy for those who demand the very worst from the President Elect. Trump himself may have disavowed the “alt-right”, but his rhetoric has led to them feeling legitimised. As with Gamergate, the press risks being manipulated into a position where it has to tread a respectful middle ground that doesn’t really exist.
  • Using 4chan (and then the more sympathetic offshoot 8Chan) to plan their subversions and attacks made Gamergate a terribly sloppy operation, leaving a trail of evidence that made it quite clear the whole thing was purposefully, plainly nasty. But the video game industry didn’t have the spine to react, and allowed the movement to coagulate – forming a mass of spiteful disappointment that Breitbart was only too happy to coddle.
  • Historically, that seems to be Breitbart’s trick - strongly represent a single issue in order to earn trust, and then gradually indoctrinate to suit wider purposes. With Gamergate, they purposefully went fishing for anti-feminists. 2016’s batch of fresh converts – the white extremists – came from enticing conspiracy theories about the global neoliberal elite secretly controlling the world.
  • The greatest strength of Gamergate, though, was that it actually appeared to represent many left-leaning ideals: stamping out corruption in the press, pushing for better ethical practices, battling for openness.
  • There are similarities here with many who support Trump because of his promises to put an end to broken neo-liberalism, to “drain the swamp” of establishment corruption. Many left-leaning supporters of Gamergate sought to intellectualise their alignment with the hashtag, adopting familiar and acceptable labels of dissent – identifying as libertarian, egalitarian, humanist.
  • At best they unknowingly facilitated abuse, defending their own freedom of expression while those who actually needed support were threatened and attacked.
  • Genuine discussions over criticism, identity and censorship were paralysed and waylaid by Twitter voices obsessed with rhetorical fallacies and pedantic debating practices. While the core of these movements make people’s lives hell, the outer shell – knowingly or otherwise – protect abusers by insisting that the real problem is that you don’t want to talk, or won’t provide the ever-shifting evidence they politely require.
  • In 2017, the tactics used to discredit progressive game critics and developers will be used to discredit Trump and Bannon’s critics. There will be gaslighting, there will be attempts to make victims look as though they are losing their grip on reality, to the point that they gradually even start to believe it. The “post-truth” reality is not simply an accident – it is a concerted assault on the rational psyche.
  • The strangest aspect of Gamergate is that it consistently didn’t make any sense: people chose to align with it, and yet refused responsibility. It was constantly demanded that we debate the issues, but explanations and facts were treated with scorn. Attempts to find common ground saw the specifics of the demands being shifted: we want you to listen to us; we want you to change your ways; we want you to close your publication down. This movement that ostensibly wanted to protect free speech from cry bully SJWs simultaneously did what it could to endanger sites it disagreed with, encouraging advertisers to abandon support for media outlets that published stories critical of the hashtag. The petulance of that movement is disturbingly echoed in Trump’s own Twitter feed.
  • Looking back, Gamergate really only made sense in one way: as an exemplar of what Umberto Eco called “eternal fascism”, a form of extremism he believed could flourish at any point in time, in any place – a fascism that would extol traditional values, rally against diversity and cultural critics, believe in the value of action above thought and encourage a distrust of intellectuals or experts – a fascism built on frustration and machismo. The requirement of this formless fascism would – above all else – be to remain in an endless state of conflict, a fight against a foe who must always be portrayed as impossibly strong and laughably weak.
  • 2016 has presented us with a world in which our reality is being wilfully manipulated. Fake news, divisive algorithms, misleading social media campaigns.
  • The same voices moved into other geek communities, especially comics, where Marvel and DC were criticised for progressive storylines and decisions. They moved into science fiction with the controversy over the Hugo awards. They moved into cinema with the revolting kickback against the all-female Ghostbusters reboot.
  • Perhaps the true lesson of Gamergate was that the media is culturally unequipped to deal with the forces actively driving these online movements. The situation was horrifying enough two years ago, it is many times more dangerous now.
Javier E

Trump Fires Adviser's Son From Transition for Spreading Fake News - The New York Times - 0 views

  • At the Defense Intelligence Agency, his staff members even coined their own name for his sometimes dubious assertions: “Flynn facts.”
  • “He has regularly engaged in the reckless public promotion of conspiracy theories that have no basis in fact, with disregard for the risks that giving credence to those theories could pose to the public,” Representative Adam Smith of Washington, the ranking Democrat on the House Armed Services Committee, said on Tuesday.
  • “Someone who is so oblivious to the facts, or intentionally ignorant of them, should not be entrusted with policy decisions that affect the safety of the American people,” Mr. Smith added.
  • ...2 more annotations...
  • His son, in contrast, showed no such restraint in the weeks before he was fired, regularly posting on Twitter about conspiracy theories involving Mrs. Clinton and her campaign staff well after the election. He continued to push his support for the fake news about Comet Ping Pong after his messages on Twitter about Sunday’s episode began attracting widespread attention. It was not until shortly before 3:30 p.m. Monday that he went silent on Twitter.
  • In one of the last messages he posted, he shared a post from another Twitter user who sought to spread a conspiracy theory that sprang up on the right-wing fringes after the shooting: that the suspect arrested at Comet Ping Pong, Edgar M. Welch, 28, of Salisbury, N.C., was actually an actor, and that the episode was a hoax cooked up to discredit the claim of a sex trafficking ring at the restaurant.
Javier E

The Selfish Gene turns 40 | Science | The Guardian - 0 views

  • The idea was this: genes strive for immortality, and individuals, families, and species are merely vehicles in that quest. The behaviour of all living things is in service of their genes; hence, metaphorically, they are selfish.
  • Before this, it had been proposed that natural selection was honing the behaviour of living things to promote the continuance through time of the individual creature, or family, or group or species. But in fact, Dawkins said, it was the gene itself that was trying to survive, and it just so happened that the best way for it to survive was in concert with other genes in the impermanent husk of an individual
  • This gene-centric view of evolution also began to explain one of the oddities of life on Earth – the behaviour of social insects. What is the point of a drone bee, doomed to remain childless and in the service of a totalitarian queen? Suddenly it made sense that, with the gene itself steering evolution, the fact that the drone shared its DNA with the queen meant that its servitude guarantees not the individual’s survival, but the endurance of the genes they share.
  • ...9 more annotations...
  • the subject is taught bafflingly minimally and late in the curriculum even today; evolution by natural selection is crucial to every aspect of the living world. In the words of the Russian scientist Theodosius Dobzhansky: “Nothing in biology makes sense except in the light of evolution.”
  • his true legacy is The Selfish Gene and its profound effect on multiple generations of scientists and lay readers. In a sense, The Selfish Gene and Dawkins himself are bridges, both intellectually and chronologically, between the titans of mid-century biology – Ronald Fisher, Trivers, Hamilton, Maynard Smith and Williams – and our era of the genome, in which the interrogation of DNA dominates the study of evolution.
  • Genes aren’t what they used to be either. In 1976 they were simply stretches of DNA that encoded proteins. We now know about genes made of DNA’s cousin, RNA; we’ve discovered genes that hop from genome to genome
  • Since 1976, our understanding of why life is the way it is has blossomed and changed. Once the gene became the dominant idea in biology in the 1990s there followed a technological goldrush – the Human Genome Project – to find them all.
  • None of the complications of modern genomes erodes the central premise of the selfish gene.
  • Much of the enmity stems from people misunderstanding that selfishness is being used as a metaphor. The irony of these attacks is that the selfish gene metaphor actually explains altruism. We help others who are not directly related to us because we share similar versions of genes with them.
  • In the scientific community, the chief objection maintains that natural selection can operate at the level of a group of animals, not solely on genes or even individuals
  • To my mind, and that of the majority of evolutionary biologists, the gene-centric view of evolution always emerges intact.
  • the premise remains exciting that a gene’s only desire is to reproduce itself, and that the complexity of genomes makes that reproduction more efficient.
oliviaodon

The Cult of Coincidence | The Huffington Post - 0 views

  • Most people readily believe that they themselves are essentially fully independent thinkers, and that closed-mindedness, intellectual inflexibility and an irrational commitment to pre-conceived thinking dwells only in the feeble minds of others. Think about it: When was the last time in the course of discussion that someone admitted to you something like, “You’re right, I have just blindly swallowed all of the positions and cultural mores of my milieu” or, “Yes, I agree that no amount of oppositional information will ever dissuade me from the beliefs I hold?” No one is immune from this state of affairs, and it requires courage and perpetual vigilance to even venture outside of the intellectual echo chamber that most of us inhabit.
  • There are those who believe that the scientific community is uniquely positioned to avoid these pitfalls. They suggest that the system of peer review is inherently self-critical, and as such is structurally quarantined from bias. Some scientists think otherwise and note that science, in as much as it is conducted by human beings, is subject to the same partiality as every other endeavor.
  • like the communist party under Lenin, science is [in its own eyes] infallible because its judgments are collective. Critics are unneeded, and since they are unneeded, they are not welcome.
  • ...2 more annotations...
  • A classic example of this endemic bias at work is illustrated through Einstein. He was disturbed by the implications of an expanding universe. For thousands of years it was assumed — outside of some theological circles — that matter was eternal. The notion that it came into being at a discrete point in time naturally implied that something had caused it and quite possibly that that something had done it on purpose. Not willing to accept this new information, Einstein added a now famous “fudge factor” to his equations to maintain the static universe that he was comfortable with — something he would later describe as “the greatest blunder of my career.”
  • If there is great resistance to notions of design and causality in science, it is exponentially greater when it comes to theology.
Javier E

A Multitasking Video Game Makes Old Brains Act Younger - NYTimes.com - 0 views

  • The research “shows you can take older people who aren’t functioning well and make them cognitively younger through this training,” said Earl K. Miller, a neuroscientist at the Massachusetts Institute of Technology, who was not affiliated with the research. “It’s a very big deal.”
  • Neuroscientists there, led by Dr. Adam Gazzaley, worked with developers to create NeuroRacer, a relatively simple video game in which players drive and try to identify specific road signs that pop up on the screen, while ignoring other signs deemed irrelevant.
  • One of the main early findings of the study reinforced just how challenging it is to multitask successfully, particularly as people age.
  • ...3 more annotations...
  • People in their 20s experienced a 26 percent drop in performance when they were asked to try to drive and identify signs at the same time (rather than just identify the signs without driving). For people in their 60s to 80s, the performance drop was 64 percent.
  • But after the older adults trained at the game, they became more proficient than untrained people in their 20s. The performance levels were sustained for six months, even without additional training. Also, the older adults performed better at memory and attention tests outside the game.
  • The researchers created a second layer of proof by monitoring the brain waves of participants using electroencephalography. What they found was that in older participants, in their 60s to 80s, there were increases in a brain wave called theta, a low-level frequency associated with attention.
Javier E

New Statesman - The Joy of Secularism: 11 Essays for How We Live Now - 0 views

  • The Joy of Secularism: 11 Essays for How We Live Now, by George Levine. Reviewed by Terry Eagleton, 22 June 2011. Misunderstanding what it means to be secular.
  • Societies become truly secular not when they dispense with religion but when they are no longer greatly agitated by it. It is when religious faith ceases to be a vital part of the public sphere
  • Christianity is certainly other-worldly, and so is any reasonably sensitive soul who has been reading the newspapers. The Christian gospel looks to a future transformation of the appalling mess we see around us into a community of justice and friendship, a change so deep-seated and indescribable as to make Lenin look like a Lib Dem. “This [world] is our home,” Levine comments. If he really feels at home in this crucifying set-up, one might humbly suggest that he shouldn't. Christians and political radicals certainly don't.
  • ...9 more annotations...
  • he suspects that Christian faith is other-worldly in the sense of despising material things. Material reality, in his view, is what art celebrates but religion does not. This is to forget that Gerard Manley Hopkins was a Jesuit. It is also to misunderstand the doctrine of Creation
  • Adam Phillips writes suggestively of human helplessness as opposed to the sense of protectedness that religious faith supposedly brings us, without noticing that the signifier of God for the New Testament is the tortured and executed corpse of a suspected political criminal.
  • None of these writers points out that if Christianity is true, then it is all up with us. We would then have to face the deeply disagreeable truth that the only authentic life is one that springs from a self-dispossession so extreme that it is probably beyond our power.
  • Secularisation is a lot harder than people tend to imagine. The history of modernity is, among other things, the history of substitutes for God. Art, culture, nation, Geist, humanity, society: all these, along with a clutch of other hopeful aspirants, have been tried from time to time. The most successful candidate currently on offer is sport, which, short of providing funeral rites for its spectators, fulfils almost every religious function in the book.
  • The Christian paradigm of love, by contrast, is the love of strangers and enemies, not of those we find agreeable. Civilised notions such as mutual sympathy, more's the pity, won't deliver us the world we need.
  • "What exactly," he enquires, "does the invocation of some supernatural being add?" A Christian might reply that it adds the obligations to give up everything one has, including one's life, if necessary, for the sake of others. And this, to say the least, is highly inconvenient.
  • If Friedrich Nietzsche was the first sincere atheist, it is because he saw that the Almighty is exceedingly good at disguising Himself as something else, and that much so-called secularisation is accordingly bogus.
  • Postmodernism is perhaps best seen as Nietzsche shorn of the metaphysical baggage. Whereas modernism is still haunted by a God-shaped absence, postmodern culture is too young to remember a time when men and women were anguished by the fading spectres of truth, reality, nature, value, meaning, foundations and the like. For postmodern theory, there never was any truth or meaning in the first place
  • Postmodernism is properly secular, but it pays an immense price for this coming of age - if coming of age it is. It means shelving all the other big questions, too, as hopelessly passé. It also involves the grave error of imagining that all faith or passionate conviction is incipiently dogmatic. It is not only religious belief to which postmodernism is allergic, but belief as such. Advanced capitalism sees no need for the stuff. It is both politically divisive and commercially unnecessary.
Javier E

Scholarship and Politics - The Case of Noam Chomsky - NYTimes.com - 0 views

  • (1) The academy is a world of its own, complete with rules, protocols, systems of evaluation, recognized achievements, agreed-on goals, a roster of heroes and a list of tasks yet to be done.
  • (2) Academic work proceeds within the confines of that world, within, that is, a professional, not a public, space, although its performance may be, and often is, public.
  • (3) Academic work is only tangentially, not essentially, political; politics may attend the formation of academic units and the selection of academic personnel, but political concerns and pressures have no place in the unfolding of academic argument, except as objects of its distinctive forms of attention
  • ...16 more annotations...
  • (4) The academic views of a professor are independent of his or her real-world political views; academic disputes don’t track partisan disputes or vice versa; you can’t reason from an academic’s disciplinary views to the positions he or she would take in the public sphere; they are independent variables.
  • Chomsky gave three lectures under the general title “What Kind of Creatures are We?”
  • The answer given in the first lecture — “What is Language?” — is that we are creatures with language, and that language as a uniquely human biological capacity appeared suddenly and quite late in the evolutionary story, perhaps 75,000 years ago.
  • Language, then, does not arise from the social/cultural environment, although the environment provides the stuff or input it works on. That input is “impoverished”; it can’t account for the creativity of language performance, which has its source not in the empirical world, but in an innate ability that is more powerful than the stimuli it utilizes and plays with. It follows that if you want to understand language, you shouldn’t look to linguistic behavior but to the internal mechanism — the Universal Grammar — of which particular linguistic behaviors are a non-exhaustive expression. (The capacity exceeds the empirical resources it might deploy.)
  • In his second lecture (“What Can We Understand?”), Chomsky took up the question of what humans are capable of understanding and his answer, generally, was that we can understand what we can understand, and that means that we can’t understand what is beyond our innate mental capacities
  • This does not mean, he said, that what we can’t understand is not real: “What is mysterious to me is not an argument that it does not exist.” It’s just that while language is powerful and creative, its power and creativity have limits; and since language is thought rather than an addition to or clothing of thought, the limits of language are the limits of what we can fruitfully think about
  • This is as good as it gets. There is “no evolution in our capacity for language.”
  • These assertions are offered as a counter to what Chomsky sees as the over-optimistic Enlightenment belief — common to many empiricist philosophies — that ours is a “limitless explanatory power” and that “we can do anything.”
  • In the third lecture (“What is the Common Good?”) Chomsky turned from the philosophy of mind and language to political philosophy and the question of what constitutes a truly democratic society
  • He likened dogmatic intellectual structures that interfere with free inquiry to coercive political structures that stifle the individual’s creative independence and fail to encourage humanity’s “richest diversity.”
  • He asserted that any institution marked by domination and hierarchy must rise to the challenge of justifying itself, and if it cannot meet the challenge, it should be dismantled.
  • He contrasted two accounts of democracy: one — associated by him with James Madison — distrusts the “unwashed” populace and puts its faith in representative government where those doing the representing (and the voting and the distributing of goods) constitute a moneyed and propertied elite
  • the other — associated by him with Adam Smith (in one of his moods), J. S. Mill, the 1960s and a tradition of anarchist writing — seeks to expand the franchise and multiply choices in the realms of thought, politics and economics. The impulse of this second, libertarian, strain of democracy, is “to free society from economic or theological guardianship,” and by “theological” Chomsky meant not formal religion as such but any assumed and frozen ideology that blocked inquiry and limited participation. There can’t, in short, be “too much democracy.”
  • It was thought of the highest order performed by a thinker, now 85 years old, who by and large eschewed rhetorical flourishes (he has called his own speaking style “boring” and says he likes it that way) and just did it, where “it” was the patient exploration of deep issues that had been explored before him by a succession of predecessors, fully acknowledged, in a conversation that is forever being continued and forever being replenished.
  • Yes, I said to myself, this is what we — those of us who bought a ticket on this particular train — do; we think about problems and puzzles and try to advance the understanding of them; and we do that kind of thinking because its pleasures are, in a strong sense, athletic and provide for us, at least on occasion, the experience of fully realizing whatever capabilities we might have. And we do it in order to have that experience, and to share it with colleagues and students of like mind, and not to make a moral or political point.
  • The term “master class” is a bit overused, but I feel no hesitation in using it here. It was a master class taught by a master, and if someone were to ask me what exactly is it that academics do, I would point to these lectures and say, simply, here it is, the thing itself.
Javier E

Why Are Hundreds of Harvard Students Studying Ancient Chinese Philosophy? - Christine G... - 0 views

  • Puett's course Classical Chinese Ethical and Political Theory has become the third most popular course at the university. The only classes with higher enrollment are Intro to Economics and Intro to Computer Science.
  • the class fulfills one of Harvard's more challenging core requirements, Ethical Reasoning. It's clear, though, that students are also lured in by Puett's bold promise: “This course will change your life.”
  • Puett uses Chinese philosophy as a way to give undergraduates concrete, counter-intuitive, and even revolutionary ideas, which teach them how to live a better life. 
  • ...18 more annotations...
  • Puett puts a fresh spin on the questions that Chinese scholars grappled with centuries ago. He requires his students to closely read original texts (in translation) such as Confucius’s Analects, the Mencius, and the Daodejing and then actively put the teachings into practice in their daily lives. His lectures use Chinese thought in the context of contemporary American life to help 18- and 19-year-olds who are struggling to find their place in the world figure out how to be good human beings; how to create a good society; how to have a flourishing life. 
  • Puett began offering his course to introduce his students not just to a completely different cultural worldview but also to a different set of tools. He told me he is seeing more students who are “feeling pushed onto a very specific path towards very concrete career goals”
  • Puett tells his students that being calculating and rationally deciding on plans is precisely the wrong way to make any sort of important life decision. The Chinese philosophers they are reading would say that this strategy makes it harder to remain open to other possibilities that don’t fit into that plan.
  • Students who do this “are not paying enough attention to the daily things that actually invigorate and inspire them, out of which could come a really fulfilling, exciting life,” he explains. If what excites a student is not the same as what he has decided is best for him, he becomes trapped on a misguided path, slated to begin an unfulfilling career.
  • He teaches them that: The smallest actions have the most profound ramifications.
  • From a Chinese philosophical point of view, these small daily experiences provide us endless opportunities to understand ourselves. When we notice and understand what makes us tick, react, feel joyful or angry, we develop a better sense of who we are that helps us when approaching new situations. Mencius, a late Confucian thinker (4th century B.C.E.), taught that if you cultivate your better nature in these small ways, you can become an extraordinary person with an incredible influence
  • Decisions are made from the heart. Americans tend to believe that humans are rational creatures who make decisions logically, using our brains. But in Chinese, the words for “mind” and “heart” are the same.
  • If the body leads, the mind will follow. Behaving kindly (even when you are not feeling kindly), or smiling at someone (even if you aren’t feeling particularly friendly at the moment) can cause actual differences in how you end up feeling and behaving, even ultimately changing the outcome of a situation.
  • In the same way that one deliberately practices the piano in order to eventually play it effortlessly, through our everyday activities we train ourselves to become more open to experiences and phenomena so that eventually the right responses and decisions come spontaneously, without angst, from the heart-mind.
  • Whenever we make decisions, from the prosaic to the profound (what to make for dinner; which courses to take next semester; what career path to follow; whom to marry), we will make better ones when we intuit how to integrate heart and mind and let our rational and emotional sides blend into one. 
  • Aristotle said, “We are what we repeatedly do,” a view shared by thinkers such as Confucius, who taught that the importance of rituals lies in how they inculcate a certain sensibility in a person.
  • “The Chinese philosophers we read taught that the way to really change lives for the better is from a very mundane level, changing the way people experience and respond to the world, so what I try to do is to hit them at that level. I’m not trying to give my students really big advice about what to do with their lives. I just want to give them a sense of what they can do daily to transform how they live.”
  • Their assignments are small ones: to first observe how they feel when they smile at a stranger, hold open a door for someone, engage in a hobby. He asks them to take note of what happens next: how every action, gesture, or word dramatically affects how others respond to them. Then Puett asks them to pursue more of the activities that they notice arouse positive, excited feelings.
  • Once they’ve understood themselves better and discovered what they love to do they can then work to become adept at those activities through ample practice and self-cultivation. Self-cultivation is related to another classical Chinese concept: that effort is what counts the most, more than talent or aptitude. We aren’t limited to our innate talents; we all have enormous potential to expand our abilities if we cultivate them
  • Being interconnected, focusing on mundane, everyday practices, and understanding that great things begin with the very smallest of acts are radical ideas for young people living in a society that pressures them to think big and achieve individual excellence.
  • One of Puett’s former students, Adam Mitchell, was a math and science whiz who went to Harvard intending to major in economics. At Harvard specifically and in society in general, he told me, “we’re expected to think of our future in this rational way: to add up the pros and cons and then make a decision. That leads you down the road of ‘Stick with what you’re good at’”—a road with little risk but little reward.
  • after his introduction to Chinese philosophy during his sophomore year, he realized this wasn’t the only way to think about the future. Instead, he tried courses he was drawn to but wasn’t naturally adroit at because he had learned how much value lies in working hard to become better at what you love. He became more aware of the way he was affected by those around him, and how they were affected by his own actions in turn. Mitchell threw himself into foreign language learning, feels his relationships have deepened, and is today working towards a master’s degree in regional studies.
  • “I can happily say that Professor Puett lived up to his promise, that the course did in fact change my life.”
Javier E

Face It, Your Brain Is a Computer - The New York Times - 0 views

  • all the standard arguments about why the brain might not be a computer are pretty weak.
  • Take the argument that “brains are parallel, but computers are serial.” Critics are right to note that virtually every time a human does anything, many different parts of the brain are engaged; that’s parallel, not serial.
  • the trend over time in the hardware business has been to make computers more and more parallel, using new approaches like multicore processors and graphics processing units.
  • ...6 more annotations...
  • The real payoff in subscribing to the idea of a brain as a computer would come from using that idea to profitably guide research. In an article last fall in the journal Science, two of my colleagues (Adam Marblestone of M.I.T. and Thomas Dean of Google) and I endeavored to do just that, suggesting that a particular kind of computer, known as the field programmable gate array, might offer a preliminary starting point for thinking about how the brain works.
  • Field-programmable gate arrays consist of a large number of “logic block” programs that can be configured, and reconfigured, individually, to do a wide range of tasks. One logic block might do arithmetic, another signal processing, and yet another look things up in a table. The computation of the whole is a function of how the individual parts are configured. Much of the logic can be executed in parallel, much like what happens in a brain.
  • our suggestion is that the brain might similarly consist of highly orchestrated sets of fundamental building blocks, such as “computational primitives” for constructing sequences, retrieving information from memory, and routing information between different locations in the brain. Identifying those building blocks, we believe, could be the Rosetta stone that unlocks the brain.
  • it is unlikely that we will ever be able to directly connect the language of neurons and synapses to the diversity of human behavior, as many neuroscientists seem to hope. The chasm between brains and behavior is just too vast.
  • Our best shot may come instead from dividing and conquering. Fundamentally, that may involve two steps: finding some way to connect the scientific language of neurons and the scientific language of computational primitives (which would be comparable in computer science to connecting the physics of electrons and the workings of microprocessors); and finding some way to connect the scientific language of computational primitives and that of human behavior (which would be comparable to understanding how computer programs are built out of more basic microprocessor instructions).
  • If neurons are akin to computer hardware, and behaviors are akin to the actions that a computer performs, computation is likely to be the glue that binds the two.
Emilio Ergueta

Who cares what colour philosophers are? | Education | spiked - 0 views

  • Who’d be a philosopher? Once accused of interpreting the world rather than changing it, philosophers today cause embarrassment simply for existing.
  • Philosophy’s apparent problem with race refuses to go away. Most recently, Nathaniel Adam Tobias Coleman, a leading light in the Why Is My Curriculum White? campaign, has been informed he will not be offered a permanent position at University College London when his fixed-term contract expires at the end of September.
  • Coleman told Times Higher Education that his colleagues did not want him ‘turning the spotlight on to the ivory tower, putting the fear of God into many of its scholars – predominantly racialised as white – who had contented themselves hitherto to research and teach in an “aracial” – aka white-dominated – way’.
  • ...2 more annotations...
  • Arguments about what is most important for students to know prevent academic disciplines from ossifying. There may well be books or scholars that tradition has overlooked and are deserving of a place on the curriculum. Such debates throw open the question of how and why academic judgements are made; in so doing, they also expose the tyranny of identity politics within today’s universities.
  • Knowledge itself is increasingly viewed not as objective, but as ideologically loaded and representative of the perspectives of a dominant ruling elite. The problem, academic activists argue, is that not only are philosophers such as Mill, Nietzsche and Kant white men, but their work also reflects a worldview that is exclusive to white men and has little to offer anyone else. Within a couple of centuries, criticism of philosophy has moved from a focus on what philosophers think to who they are.
Javier E

Scott Adams on the Benefits of Boredom - WSJ.com - 0 views

  • My period of greatest creative output was during my corporate years, when every meeting felt like a play date with coma patients.
  • Lately I've started worrying that I'm not getting enough boredom in my life. If I'm watching TV, I can fast-forward through commercials. If I'm standing in line at the store, I can check email or play "Angry Birds." When I run on the treadmill, I listen to my iPod while reading the closed captions on the TV. I've eliminated boredom from my life. Now let's suppose that the people who are leaders and innovators around the world are experiencing a similar lack of boredom. I think it's fair to say they are. What change would you expect to see in a world that has declining levels of boredom and therefore declining creativity? Allow me to describe that world. See if you recognize it.
  • For starters, you might see people acting more dogmatic than usual. If you don't have the option of thinking creatively, the easiest path is to adopt the default position of your political party, religion or culture. Yup, we see that. You might see more movies that seem derivative or are sequels. Check. You might see more reality shows and fewer scripted shows. Right. You might see the best-seller lists dominated by fiction "factories" in which ghostwriters churn out familiar-feeling work under the brands of famous authors. Got it. You might see the economy flat-line for lack of industry-changing innovation. Uh-oh. You might see the headlines start to repeat, like the movie "Groundhog Day," with nothing but the names changed. We're there. You might find that bloggers are spending most of their energy writing about other bloggers. OK, maybe I do that. Shut up. You might find that people seem almost incapable of even understanding new ideas. Yes.
Javier E

We Are Just Not Digging The Whole Anymore : NPR - 1 views

  • We just don't do whole things anymore. We don't read complete books — just excerpts. We don't listen to whole CDs — just samplings. We don't sit through whole baseball games — just a few innings. Don't even write whole sentences. Or read whole stories like this one. Long-form reading, listening and viewing habits are giving way to browse-and-choose consumption. "With the increase in the number of media options — or distractions, depending on how you look at them — something has to give, and that something is our attention span," says Adam Thierer, senior research fellow at George Mason University. We care more about the parts and less about the entire. We are into snippets and smidgens and clips and tweets. We are not only a fragmented society, but a fragment society.
  • One Duke University student was famously quoted in a 2006 Time magazine essay telling his history professor, "We don't read whole books anymore."
  • Now there are lots of websites that present whole books and concepts in nano form
  • ...5 more annotations...
  • nearly half of all adults — 47 percent — get some of their local news and information on mobile computing devices. We are receiving our news in kibbles and bits, sacrificing context and quality for quickness and quantity.
  • Here is the ultra-condensation of Pride and Prejudice by Jane Austen: Mr. Darcy: Nothing is good enough for me. Ms. Elizabeth Bennet: I could never marry that proud man. (They change their minds.) THE END
  • Fewer and fewer gamers are following gaming storylines all the way to completion, according to a recent blog post on the IGN Entertainment video game website.
  • "With the increase in the number of media options — or distractions, depending on how you look at them — something has to give, and that something is our attention span." He ticks off a long list of bandied-about terms. Here's a shortened version: cognitive overload; information paralysis; techno stress; and data asphyxiation.
  • Rockmore believes that the way many people learn — or try to learn — these days is via this transporter technique. "The truth is," he says, "that modern pedagogy probably needs to address this in the sense that there is so much information out there, for free, so that obtaining it — even in bits and pieces — is not the challenge, rather integrating it into a coherent whole is. That's a new paradigm."
johnsonma23

Sheryl Sandberg and Adam Grant on Why Women Stay Quiet at Work - NYTimes.com - 0 views

  • Almost every time they started to speak, they were interrupted or shot down before finishing their pitch. When one had a good idea, a male writer would jump in and run with it before she could complete her thought.
  • When a woman speaks in a professional setting, she walks a tightrope. Either she’s barely heard or she’s judged as too aggressive.
  • male senators with more power (as measured by tenure, leadership positions and track record of legislation passed) spoke more on the Senate floor than their junior colleagues. But for female senators, power was not linked to significantly more speaking time.
  • ...8 more annotations...
  • Suspecting that powerful women stayed quiet because they feared a backlash, Professor Brescoll looked deeper.
  • Male executives who spoke more often than their peers were rewarded with 10 percent higher ratings of competence.
  • female executives spoke more than their peers, both men and women punished them with 14 percent lower ratings.
  • But female employees who spoke up with equally valuable ideas did not improve their managers’ perception of their performance.
  • But when women spoke up more, there was no increase in their perceived helpfulness.
  • businesses need to find ways to interrupt this gender bias.
  • increase women’s contributions by adopting practices that focus less on the speaker and more on the idea
  • SINCE most work cannot be done anonymously, leaders must also take steps to encourage women to speak and be heard
Javier E

Economic history: When did globalisation start? | The Economist - 0 views

  • economic historians reckon the question of whether the benefits of globalisation outweigh the downsides is more complicated than this. For them, the answer depends on when you say the process of globalisation started.
  • it is impossible to say how much of a “good thing” a process is in history without first defining for how long it has been going on.
  • Although Adam Smith himself never used the word, globalisation is a key theme in the Wealth of Nations. His description of economic development has as its underlying principle the integration of markets over time. As the division of labour enables output to expand, the search for specialisation expands trade, and gradually, brings communities from disparate parts of the world together
  • ...6 more annotations...
  • Smith had a particular example in mind when he talked about market integration between continents: Europe and America.
  • Kevin O’Rourke and Jeffrey Williamson argued in a 2002 paper that globalisation only really began in the nineteenth century when a sudden drop in transport costs allowed the prices of commodities in Europe and Asia to converge
  • But there is one important market that Messrs O’Rourke and Williamson ignore in their analysis: that for silver. As European currencies were generally based on the value of silver, any change in its value would have had big effects on the European price level.
  • The impact of what historians have called the resulting “price revolution” dramatically changed the face of Europe. Historians attribute everything from the dominance of the Spanish Empire in Europe to the sudden increase in witch hunts around the sixteenth century to the destabilising effects of inflation on European society. And if it were not for the sudden increase of silver imports from Europe to China and India during this period, European inflation would have been much worse than it was. Price rises only stopped in about 1650 when the price of silver coinage in Europe fell to such a low level that it was no longer profitable to import it from the Americas.
  • The German historical economist, Andre Gunder Frank, has argued that the start of globalisation can be traced back to the growth of trade and market integration between the Sumer and Indus civilisations of the third millennium BC. Trade links between China and Europe first grew during the Hellenistic Age, with further increases in global market convergence occurring when transport costs dropped in the sixteenth century and more rapidly in the modern era of globalisation, which Messrs O’Rourke and Williamson describe as after 1750.
  • it is clear that globalisation is not simply a process that started in the last two decades or even the last two centuries. It has a history that stretches thousands of years, starting with Smith’s primitive hunter-gatherers trading with the next village, and eventually developing into the globally interconnected societies of today. Whether you think globalisation is a “good thing” or not, it appears to be an essential element of the economic history of mankind.
Javier E

The Death of Adulthood in American Culture - NYTimes.com - 0 views

  • It seems that, in doing away with patriarchal authority, we have also, perhaps unwittingly, killed off all the grown-ups.
  • The journalist and critic Ruth Graham published a polemical essay in Slate lamenting the popularity of young-adult fiction among fully adult readers. Noting that nearly a third of Y.A. books were purchased by readers ages 30 to 44 (most of them presumably without teenage children of their own), Graham insisted that such grown-ups “should feel embarrassed about reading literature for children.”
  • In my main line of work as a film critic, I have watched over the past 15 years as the studios committed their vast financial and imaginative resources to the cultivation of franchises (some of them based on those same Y.A. novels) that advance an essentially juvenile vision of the world. Comic-book movies, family-friendly animated adventures, tales of adolescent heroism and comedies of arrested development do not only make up the commercial center of 21st-century Hollywood. They are its artistic heart.
  • ...13 more annotations...
  • At sea or in the wilderness, these friends managed to escape both from the institutions of patriarchy and from the intimate authority of women, the mothers and wives who represent a check on male freedom.
  • What all of these shows grasp at, in one way or another, is that nobody knows how to be a grown-up anymore. Adulthood as we have known it has become conceptually untenable.
  • From the start, American culture was notably resistant to the claims of parental authority and the imperatives of adulthood. Surveying the canon of American literature in his magisterial “Love and Death in the American Novel,” Leslie A. Fiedler suggested, more than half a century before Ruth Graham, that “the great works of American fiction are notoriously at home in the children’s section of the library.”
  • “The typical male protagonist of our fiction has been a man on the run, harried into the forest and out to sea, down the river or into combat — anywhere to avoid ‘civilization,’ which is to say the confrontation of a man and woman which leads to the fall to sex, marriage and responsibility. One of the factors that determine theme and form in our great books is this strategy of evasion, this retreat to nature and childhood which makes our literature (and life!) so charmingly and infuriatingly ‘boyish.’ ”
  • What Fiedler notes, and what most readers of “Huckleberry Finn” will recognize, is Twain’s continual juxtaposition of Huck’s innocence and instinctual decency with the corruption and hypocrisy of the adult world.
  • we’ve also witnessed the erosion of traditional adulthood in any form, at least as it used to be portrayed in the formerly tried-and-true genres of the urban cop show, the living-room or workplace sitcom and the prime-time soap opera. Instead, we are now in the age of “Girls,” “Broad City,” “Masters of Sex” (a prehistory of the end of patriarchy), “Bob’s Burgers” (a loopy post-“Simpsons” family cartoon) and a flood of goofy, sweet, self-indulgent and obnoxious improv-based web videos.
  • we have a literature of boys’ adventures and female sentimentality. Or, to put it another way, all American fiction is young-adult fiction.
  • The bad boys of rock ‘n’ roll and the pouting screen rebels played by James Dean and Marlon Brando proved Fiedler’s point even as he was making it. So did Holden Caulfield, Dean Moriarty, Augie March and Rabbit Angstrom — a new crop of semi-antiheroes
  • We devolve from Lenny Bruce to Adam Sandler, from “Catch-22” to “The Hangover,” from “Goodbye, Columbus” to “The Forty-Year-Old Virgin.”
  • Unlike the antiheroes of eras past, whose rebellion still accepted the fact of adulthood as its premise, the man-boys simply refused to grow up, and did so proudly. Their importation of adolescent and preadolescent attitudes into the fields of adult endeavor (see “Billy Madison,” “Knocked Up,” “Step Brothers,” “Dodgeball”) delivered a bracing jolt of subversion, at least on first viewing. Why should they listen to uptight bosses, stuck-up rich guys and other readily available symbols of settled male authority?
  • That was only half the story, though. As before, the rebellious animus of the disaffected man-child was directed not just against male authority but also against women.
  • their refusal of maturity also invites some critical reflection about just what adulthood is supposed to mean. In the old, classic comedies of the studio era — the screwbally roller coasters of marriage and remarriage, with their dizzying verbiage and sly innuendo — adulthood was a fact. It was incontrovertible and burdensome but also full of opportunity. You could drink, smoke, flirt and spend money.
  • The desire of the modern comic protagonist, meanwhile, is to wallow in his own immaturity, plumbing its depths and reveling in its pleasures.
kushnerha

Philosophy's True Home - The New York Times - 0 views

  • We’ve all heard the argument that philosophy is isolated, an “ivory tower” discipline cut off from virtually every other progress-making pursuit of knowledge, including math and the sciences, as well as from the actual concerns of daily life. The reasons given for this are many. In a widely read essay in this series, “When Philosophy Lost Its Way,” Robert Frodeman and Adam Briggle claim that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences.
  • This institutionalization, the authors claim, led it to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives. I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.
  • identified philosophy with informal linguistic analysis. Fortunately, this narrow view didn’t stop them from contributing to the science of language and the study of law. Now long gone, neither movement defined the philosophy of its day and neither arose from locating it in universities.
  • ...13 more annotations...
  • The authors claim that philosophy abandoned its relationship to other disciplines by creating its own purified domain, accessible only to credentialed professionals. It is true that from roughly 1930 to 1950, some philosophers — logical empiricists, in particular — did speak of philosophy having its own exclusive subject matter. But since that subject matter was logical analysis aimed at unifying all of science, interdisciplinarity was front and center.
  • Philosophy also played a role in 20th-century physics, influencing the great physicists Albert Einstein, Niels Bohr and Werner Heisenberg. The philosophers Moritz Schlick and Hans Reichenbach reciprocated that interest by assimilating the new physics into their philosophies.
  • developed ideas relating logic to linguistic meaning that provided a framework for studying meaning in all human languages. Others, including Paul Grice and J.L. Austin, explained how linguistic meaning mixes with contextual information to enrich communicative contents and how certain linguistic performances change social facts. Today a new philosophical conception of the relationship between meaning and cognition adds a further dimension to linguistic science.
  • Decision theory — the science of rational norms governing action, belief and decision under uncertainty — was developed by the 20th-century philosophers Frank Ramsey, Rudolf Carnap, Richard Jeffrey and others. It plays a foundational role in political science and economics by telling us what rationality requires, given our evidence, priorities and the strength of our beliefs. Today, no area of philosophy is more successful in attracting top young minds.
  • Philosophy also assisted psychology in its long march away from narrow behaviorism and speculative Freudianism. The mid-20th-century functionalist perspective pioneered by Hilary Putnam was particularly important. According to it, pain, pleasure and belief are neither behavioral dispositions nor bare neurological states. They are interacting internal causes, capable of very different physical realizations, that serve the goals of individuals in specific ways. This view is now embedded in cognitive psychology and neuroscience.
  • philosopher-mathematicians Gottlob Frege, Bertrand Russell, Kurt Gödel, Alonzo Church and Alan Turing invented symbolic logic, helped establish the set-theoretic foundations of mathematics, and gave us the formal theory of computation that ushered in the digital age
  • Philosophy of biology is following a similar path. Today’s philosophy of science is less accessible than Aristotle’s natural philosophy chiefly because it systematizes a larger, more technically sophisticated body of knowledge.
  • Philosophy’s interaction with mathematics, linguistics, economics, political science, psychology and physics requires specialization. Far from fostering isolation, this specialization makes communication and cooperation among disciplines possible. This has always been so.
  • Nor did scientific progress rob philosophy of its former scientific subject matter, leaving it to concentrate on the broadly moral. In fact, philosophy thrives when enough is known to make progress conceivable, but it remains unachieved because of methodological confusion. Philosophy helps break the impasse by articulating new questions, posing possible solutions and forging new conceptual tools.
  • Our knowledge of the universe and ourselves expands like a ripple surrounding a pebble dropped in a pool. As we move away from the center of the spreading circle, its area, representing our secure knowledge, grows. But so does its circumference, representing the border where knowledge blurs into uncertainty and speculation, and methodological confusion returns. Philosophy patrols the border, trying to understand how we got there and to conceptualize our next move.  Its job is unending.
  • Although progress in ethics, political philosophy and the illumination of life’s meaning has been less impressive than advances in some other areas, it is accelerating.
  • the advances in our understanding because of careful formulation and critical evaluation of theories of goodness, rightness, justice and human flourishing by philosophers since 1970 compare well to the advances made by philosophers from Aristotle to 1970
  • The knowledge required to maintain philosophy’s continuing task, including its vital connection to other disciplines, is too vast to be held in one mind. Despite the often-repeated idea that philosophy’s true calling can only be fulfilled in the public square, philosophers actually function best in universities, where they acquire and share knowledge with their colleagues in other disciplines. It is also vital for philosophers to engage students — both those who major in the subject, and those who do not. Although philosophy has never had a mass audience, it remains remarkably accessible to the average student; unlike the natural sciences, its frontiers can be reached in a few undergraduate courses.