
TOK Friends: Group items tagged artists


What is Art? and/or What is Beauty? | Issue 108 | Philosophy Now - 1 views

  • Art is something we do, a verb. Art is an expression of our thoughts, emotions, intuitions, and desires, but it is even more personal than that: it’s about sharing the way we experience the world, which for many is an extension of personality. It is the communication of intimate concepts that cannot be faithfully portrayed by words alone.
  • Beauty is much more than cosmetic: it is not about prettiness. There are plenty of pretty pictures available at the neighborhood home furnishing store; but these we might not refer to as beautiful; and it is not difficult to find works of artistic expression that we might agree are beautiful that are not necessarily pretty.
  • Works of art may elicit a sense of wonder or cynicism, hope or despair, adoration or spite; the work of art may be direct or complex, subtle or explicit, intelligible or obscure; and the subjects and approaches to the creation of art are bounded only by the imagination of the artist.
  • ...4 more annotations...
  • The game changers – the square pegs, so to speak – are those who saw traditional standards of beauty and decided specifically to go against them, perhaps just to prove a point. Take Picasso, Munch, Schoenberg, to name just three. They have made a stand against these norms in their art. Otherwise their art is like all other art: its only function is to be experienced, appraised, and understood (or not).
  • art is not necessarily positive: it can be deliberately hurtful or displeasing: it can make you think about or consider things that you would rather not. But if it evokes an emotion in you, then it is art.
  • art cannot be simply defined on the basis of concrete tests like ‘fidelity of representation’ or vague abstract concepts like ‘beauty’. So how can we define art in terms applying to both cave-dwellers and modern city sophisticates? To do this we need to ask: What does art do? And the answer is surely that it provokes an emotional, rather than a simply cognitive response. One way of approaching the problem of defining art, then, could be to say: Art consists of shareable ideas that have a shareable emotional impact.
  • A work of art is that which asks a question which a non-art object such as a wall does not: What am I? What am I communicating? The responses, both of the creator artist and of the recipient audience, vary, but they invariably involve a judgement, a response to the invitation to answer. The answer, too, goes towards deciphering that deeper question – the ‘Who am I?’ which goes towards defining humanity.

Opinion | Grifters Gone Wild - The New York Times - 0 views

  • Silicon Valley has always had “a flimflam element” and a “fake it ’til you make it” ethos, from the early ’80s, when it was selling vaporware (hardware or software that was more of a concept or work in progress than a workable reality).
  • “We’ve been lionizing and revering these young tech entrepreneurs, treating them not just like princes and princesses but like heroes and icons,” Carreyrou says. “Now that there’s a backlash to Silicon Valley, it will be interesting to see if we reconsider this view that just because you made a lot of money doesn’t necessarily mean that you’re a role model for boys and girls.”
  • Jaron Lanier, the scientist and musician known as the father of virtual reality, has a new book out, “Ten Arguments for Deleting Your Social Media Accounts Right Now.” He says that the business plans of Facebook and Google have served to “elevate the role of the con artist to be central in society.”
  • ...5 more annotations...
  • “Anytime people want to contact each other or have an awareness of each other, it can only be when it’s financed by a third party who wants to manipulate us, to change us in some way or affect how we vote or what we buy,” he says. “In the old days, to be in that unusual situation, you had to be in a cult or a volunteer in an experiment in a psychology building or be in an abusive relationship or at a bogus real estate seminar.
  • “We don’t believe in government,” he says. “A lot of people are pissed at media. They don’t like education. People who used to think the F.B.I. was good now think it’s terrible. With all of these institutions the subject of ridicule, there’s nothing — except Skinner boxes and con artists.”
  • “But now you just need to sign onto Facebook to find yourself in a behavior modification loop, which is the con. And this may destroy our civilization and even our species.”
  • As Maria Konnikova wrote in her book, “The Confidence Game,” “The whirlwind advance of technology heralds a new golden age of the grift. Cons thrive in times of transition and fast change” when we are losing the old ways and open to the unexpected.
  • now narcissistic con artists are dominating the main stage, soaring to great heights and spectacularly exploding

What all the critics of "Unorthodox" are forgetting - The Forward - 0 views

  • The series has garnered glowing reviews
  • It also has its critical critics
  • both those who have celebrated the series and those who lambasted it are missing something. Something important.
  • ...5 more annotations...
  • intelligent assessors of artistic offerings never forget that truth and beauty are not necessarily one and the same. At times they can even diverge profoundly. There is a reason, after all, why the words “artifice” and “artificial” are based on the word “art.”
  • something obvious but all the same easily overlooked. Namely, that art and fact are entirely unrelated
  • Not only is outright fiction not fact, neither are depictions of actual lives and artistic documentaries, whether forged in words, celluloid or electrons
  • A brilliant artistic endeavor that has been a mainstay of college film studies courses is a good example. The 1935 film has been described as powerful, even overwhelming, and is cited as a pioneering archetype of the use of striking visuals and compelling narrative. It won a gold medal at the 1935 Venice Biennale and the Grand Prix at the 1937 World Exhibition in Paris. The New York Times’ J. Hoberman not long ago called it “supremely artful.”
  • And it was. As well as supremely evil, as Mr. Hoberman also explains. The film was Leni Riefenstahl’s “Triumph of the Will”

These Wearables Are All About Neuroscience | Big Think - 0 views

  • Artist, writer, and experimental philosopher Jonathon Keats, fresh from his recent Reciprocal Biomimicry project, is back, and this time it’s wearable.
  • It’s clothing designed to alter one’s self-perception.
  • Wearing clothes that make you feel good isn’t new, of course, but Keats’ press release claims to be “applying cutting-edge neuroscience to millennia of costume history.”
  • ...3 more annotations...
  • The bracelets can encourage the wearer to assume a “power pose,” boosting self-assurance through the release of testosterone.
  • Superego shades have irises that open and close in sync with the wearer’s breathing, raising his or her consciousness of his or her respiration.
  • Superego shoes offer heels whose height can be adjusted to ensure the wearer is always taller than anyone with whom he or she is speaking.
  •  
    I think it is very interesting that even wearable designs can be related to neuroscience. The two subjects seem very far apart to me, and these designs are interesting because they combine ideas from science with artistic design. As we learned in English during a speech project, a power pose is a standing position that can strengthen our confidence and persuasiveness. Specially designed clothing like this can guide us into such a position. I think this is a fantastic idea. I really like the height-adjustable high heel: as a short person, I know how it feels to have to raise your head to talk to people. --Sissi (3/12/2017)

How Engaging With Art Affects the Human Brain | American Association for the Advancemen... - 0 views

  • Today, the neurological mechanisms underlying these responses are the subject of fascination to artists, curators and scientists alike.
  • "Once you circle these little things and come to the end of this little project, you'll be invited to compare where you came out against what the results of this experiment were and are," Vikan said. "What you'll find in this show is that there is an amazing convergence. The people that came to the museum liked and disliked the same categories of shapes as the people in the lab and the people in the fMRIs."
  • "Art accesses some of the most advanced processes of human intuitive analysis and expressivity and a key form of aesthetic appreciation is through embodied cognition, the ability to project oneself as an agent in the depicted scene,
  • ...13 more annotations...
  • Embodied cognition is "the sense of drawing you in and making you really feel the quality of the paintings,"
  • Viewers feel drawn into Botticelli's "The Birth of Venus" because it makes them feel as though they are floating in with Venus on the seashell. Similarly, viewers can feel the flinging of the paint on the canvas when appreciating a drip painting by Jackson Pollock.
  • Mirror neurons, cells in the brain that respond similarly when observing and performing an action, are responsible for embodied cognition
  • Most research on the effects of music education has been done on populations that are privileged enough to afford private music instruction, so Kraus is studying music instruction in group settings.
  • "But observing the action requires the information to flow inward from the image you're seeing into the control centers. So that bidirectional flow is what's captured in this concept of mirror neurons and it gives the extra vividness to this aesthetics of art appreciation."
  • Performing an action requires the information to flow out from the control centers to the limbs,
  • While congenitally blind people usually don't have activation in the visual area of the brain, brain scans done after the subjects were taught to draw from memory showed activation in those visual areas.
  • Hearing speech in noise is one area in which musicians are uniquely skilled. In standardized tests, musicians across the lifespan were much better than the general public at listening to sentences and repeating them back as the level of background noise increased, Kraus said.
  • Artists are known to be better observers and exhibit better memory than non-artists. In an effort to see what happens in the brain when an individual is drawing and whether drawing can increase the brain's plasticity
  • Musicians are also known for their ability to keep rhythm, a skill that is correlated with reading ability and how precisely the brain responds to sound. After one year, students who participated in the group music instruction were faster and more accurate at keeping a beat than students in the control group, Kraus said.
  • "To sum things up, we are what we do and our past shapes our present," Kraus said. "Auditory biology is not frozen in time. It's a moving target. And music education really does seem to enhance communication by strengthening language skills."
  • "When you're doing art, your brain is running full speed,"
  • "It's hitting on all eight cylinders. So if you can figure out what's happening to the brain on art,

Ad About Women's Self-Image Creates a Sensation - NYTimes.com - 0 views

  • An online video, presented in three- and six-minute versions, shows a forensic sketch artist who is asked to draw a series of women based only on their descriptions. Seated at a drafting table with his back to his subject, the artist, Gil Zamora, asks the women a series of questions about their features. “Tell me about your chin,” he says in the soft voice reminiscent of a therapist’s. Crow’s feet, big jaws, protruding chins and dark circles are just some of the many physical features that women criticized about themselves. After he finishes a drawing of a woman, he then draws another sketch of the same woman, only this time it is based on how someone else describes her. The sketches are then hung side by side and the women are asked to compare them. In every instance, the second sketch is more flattering than the first.
  • The video, shot in a loft in San Francisco, has become a sensation online. The three-minute version has been viewed more than 7.5 million times on the Dove YouTube channel, and the version that is twice as long has been viewed more than 936,000 times.
  • Dove executives said the campaign resulted from company research that showed only 4 percent of women consider themselves beautiful.
  • ...3 more annotations...
  • “As women we are so hard on ourselves physically and emotionally,” Ms. Olive said. “It gets you to stop and think about how we think of ourselves.”
  • Ms. Brice took issue with the tag line for the ad, “You’re More Beautiful Than You Think.” “I think it makes people much more susceptible to absorbing the subconscious messages,” Ms. Brice said, “that at the heart of it all is that beauty is still what defines women. It is a little hypocritical.”
  • “What if I did look like that woman on the left?” she said, referring to the less flattering sketches of the women. “There are people that look like that.”

'Les Misérables' and Irony - NYTimes.com - 1 views

  • The artist who deploys irony tests the sophistication of his audience and divides it into two parts, those in the know and those who live in a fool’s paradise. Irony creates a privileged vantage point from which you can frame and stand aloof from a world you are too savvy to take at face value. Irony is the essence of the critical attitude, of the observer’s cool gaze; every reviewer who is not just a bourgeois cheerleader (and no reviewer will admit to being that) is an ironist.
  • “Les Misérables” defeats irony by not allowing the distance it requires. If you’re looking right down the throats of the characters, there is no space between them and you; their perspective is your perspective; their emotions are your emotions; you can’t frame what you are literally inside of.
  • Endless high passion and basic human emotions indulged in without respite are what “Les Misérables” offers in its refusal to afford the distance that enables irony.
  • ...2 more annotations...
  • Irony — postmodern or any other — is a brief against affirmation, against the unsophisticated embrace of positive (unqualified) values.
  • No one has seen this more clearly than David Foster Wallace, who complains that irony “serves an exclusively negative function,” but is “singularly unuseful when it comes to replace the hypocrisies it debunks” (“E Unibus Pluram,” Review of Contemporary Fiction, 1993). Irony, he adds, is “unmeaty”; that is, it has nothing solid inside it and is committed to having nothing inside it. Few artists, Wallace says, “dare to try to talk about ways of redeeming what’s wrong, because they’ll look sentimental and naïve to all the weary ironists.” But perhaps there is hope. “The next real … ‘rebels’ … might well emerge as some weird bunch of ‘antirebels,’ born oglers who dare to back away from ironic watching, who have the childish gall actually to endorse single-entendre values. Who treat old untrendy human troubles and emotions with reverence and conviction” (“E Unibus Pluram”).

Nate Silver, Artist of Uncertainty - 0 views

  • In 2008, Nate Silver correctly predicted the results of all 35 Senate races and the presidential results in 49 out of 50 states. Since then, his website, fivethirtyeight.com (now central to The New York Times’s political coverage), has become an essential source of rigorous, objective analysis of voter surveys to predict the Electoral College outcome of presidential campaigns. 
  • Political junkies, activists, strategists, and journalists will gain a deeper and more sobering sense of Silver’s methods in The Signal and the Noise: Why So Many Predictions Fail—But Some Don’t (Penguin Press). A brilliant analysis of forecasting in finance, geology, politics, sports, weather, and other domains, Silver’s book is also an original fusion of cognitive psychology and modern statistical theory.
  • Its most important message is that the first step toward improving our predictions is learning how to live with uncertainty.
  • ...7 more annotations...
  • he blends the best of modern statistical analysis with research on cognition biases pioneered by Princeton psychologist and Nobel laureate in economics  Daniel Kahneman and the late Stanford psychologist Amos Tversky. 
  • Silver’s background in sports and poker turns out to be invaluable. Successful analysts in gambling and sports are different from fans and partisans—far more aware that “sure things” are likely to be illusions,
  • The second step is starting to understand why it is that big data, super computers, and mathematical sophistication haven’t made us better at separating signals (information with true predictive value) from noise (misleading information). 
  • One of the biggest problems we have in separating signal from noise is that when we look too hard for certainty that isn’t there, we often end up attracted to noise, either because it is more prominent or because it confirms what we would like to believe.
  • In discipline after discipline, Silver shows in his book that when you look at even the best single forecast, the average of all independent forecasts is 15 to 20 percent more accurate. 
  • Silver has taken the next major step: constantly incorporating both state polls and national polls into Bayesian models that also incorporate economic data.
  • Silver explains why we will be misled if we only consider significance tests—i.e., statements that the margin of error for the results is, for example, plus or minus four points, meaning there is one chance in 20 that the percentages reported are off by more than four. Calculations like these assume the only source of error is sampling error—the irreducible error—while ignoring errors attributable to house effects, like the proportion of cell-phone users, one of the complex set of assumptions every pollster must make about who will actually vote. In other words, such an approach ignores context in order to avoid having to justify and defend judgments. 
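The “plus or minus four points, one chance in 20” arithmetic in the last excerpt is the standard sampling-error formula for a proportion. A minimal sketch of that calculation (the sample size of roughly 600 is my own assumption, chosen because it reproduces a ±4-point margin; house effects, as the excerpt notes, lie outside this formula entirely):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% sampling margin of error for a proportion p with sample size n.

    z = 1.96 is the normal quantile giving "one chance in 20" of a larger
    miss; p = 0.5 is the worst case, typical for a close race.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A poll of ~600 respondents yields roughly the +/-4-point margin
# cited in the excerpt:
print(round(margin_of_error(600), 3))  # 0.04
```

Note that quadrupling the sample only halves the margin, which is part of why averaging many independent polls, as Silver does, beats relying on any single one.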

What Mitt Lost While He Won - NYTimes.com - 0 views

  • In the end, Mitt Romney didn’t lose the Michigan primary, and he didn’t lose his near-lock on the Republican nomination. Rick Santorum isn’t going away, but a solid victory in Michigan and an easy win in Arizona leave the Romney campaign’s basic math more or less intact. If their candidate can keep winning contests in the West and Northeast and holding serve across the Midwest, Romney’s rivals won’t be able to stop him from grinding out a victory.
  • Thanks to his own smooth evasiveness and the blunders of his rivals, meanwhile, he had managed to sidestep the obvious resemblances between his Massachusetts health care bill and the White House’s Affordable Care Act. And by selling himself as a turnaround artist rather than an ideologue, a champion of the middle class rather than a defender of his fellow 1 percenters, he seemed well-positioned to campaign on competence, experience and sound economic stewardship in the general election.
  • From the very first debates onward, Romney has spent the primary campaign walking a fine line — trying to assuage widespread right-wing doubts about his ideological reliability, while crafting a persona and a policy portfolio that will appeal to moderates as well as conservatives come November.
  • ...6 more annotations...
  • Romney had sketched out an economic plan that avoided the supply-side gimmicks and outright crankery embraced by many of his rivals. He had backed the smartest conservative thinking on entitlements and was rewarded with bipartisan cover when Oregon Democratic Senator Ron Wyden endorsed a similar model for Medicare reform.
  • But the frontrunner did lose something in the days leading up to the Michigan vote. He lost his general election narrative.
  • But then came the South Carolina primary, and Romney’s fumbling, tone-deaf responses to Newt Gingrich’s attacks on his career at Bain Capital. His awkwardness didn’t have direct policy implications, but it revealed a surprising inability to defend his own chosen electoral narrative against fairly obvious attacks. And with the businessman-turnaround artist narrative compromised, it became much easier for Romney’s rivals to turn the focus to his moderate past and long list of flip-flops.
  • It was to change this dynamic, presumably, that Romney’s campaign decided to have him come out for the first time with a big tax reform plan of his own, which he unveiled last week in a speech at Ford Field. In its broadest strokes, the plan isn’t terrible: It promises lower rates and a broader base, which is the goal of just about every sensible tax reform proposal, and it cuts rates for most taxpayers, not just businesses and the rich. But the Romney campaign has declined to explain exactly how the cuts will be paid for, offering vague promises of loophole closing and spending cuts that suggest a return to supply-side irresponsibility.
  • If left unrevised and unaddressed, this irresponsibility threatens to demolish the pillars of Romney’s general-election argument. First, it will make it considerably harder for him to attack the White House’s record on deficits, which would otherwise be a central part of the case against the president. Second, it will make Romney’s own vision for entitlement reform easy to demagogue and dismiss, since President Obama will have grounds to argue that his opponent only wants to cut Medicare and Social Security in order to cut taxes on the rich.
  • Both of these problems, needless to say, will be exacerbated if Romney continues to be unable to talk about his wealth in anything save the most clueless and flatfooted fashion. The White House might prefer to face Rick Santorum in the general election, but an out-of-touch rich guy running on Medicare cuts and an ill-considered tax plan will make for a pretty inviting target in his own right.
  •  
    What does this bode for the future?

Chinese censorship is nothing new, but artists shouldn't trade freedom for money - The ... - 0 views

  • publishing industry to increase sales in China.
  • “whether certain topics were off limits for writers and if his publishing house adhered to government guidelines
  • ‘The censors felt that it did not portray Shanghai in a positive light, so that scene was removed from the movie,’ ”
  • ...2 more annotations...
  • As I’ve noted elsewhere, what was fascinating about the “Mission: Impossible III” anecdote is that the Chinese censors weren’t always worried about heterodox ideas — a plea for freedom of speech or gay rights, say — but simply looking bad, losing face. The treatment of art as nothing more than a means of transforming the perception of China has a long and storied history under the communist regime.
  • “Likewise, [Hu] opposed the party’s dictum that ‘bright things’ be emphasized and elements of backwardness and darkness be de-emphasized,”

Kung Fu for Philosophers - NYTimes.com - 0 views

  • any ability resulting from practice and cultivation could accurately be said to embody kung fu.
  • the predominant orientation of traditional Chinese philosophy is the concern about how to live one’s life, rather than finding out the truth about reality.
  • Confucius’s call for “rectification of names” — one must use words appropriately — is more a kung fu method for securing sociopolitical order than for capturing the essence of things, as “names,” or words, are placeholders for expectations of how the bearer of the names should behave and be treated. This points to a realization of what J. L. Austin calls the “performative” function of language.
  • ...12 more annotations...
  • Instead of leading to a search for certainty, as Descartes’s dream did, Zhuangzi came to the realization that he had perceived “the transformation of things,” indicating that one should go along with this transformation rather than trying in vain to search for what is real.
  • the views of Mencius and his later opponent Xunzi about human nature are more recommendations of how one should view oneself in order to become a better person than metaphysical assertions about whether humans are by nature good or bad. Though the two men’s assertions about human nature are mutually incompatible, they may still function inside the Confucian tradition as alternative ways of cultivation.
  • The Buddhist doctrine of no-self surely looks metaphysical, but its real aim is to free one from suffering, since according to Buddhism suffering comes ultimately from attachment to the self. Buddhist meditations are kung fu practices to shake off one’s attachment, and not just intellectual inquiries for getting propositional truth.
  • The essence of kung fu — various arts and instructions about how to cultivate the person and conduct one’s life — is often hard to digest for those who are used to the flavor and texture of mainstream Western philosophy. It is understandable that, even after sincere willingness to try, one is often still turned away by the lack of clear definitions of key terms and the absence of linear arguments in classic Chinese texts. This, however, is not a weakness, but rather a requirement of the kung fu orientation — not unlike the way that learning how to swim requires one to focus on practice and not on conceptual understanding.
  • It even expands epistemology into the non-conceptual realm in which the accessibility of knowledge is dependent on the cultivation of cognitive abilities, and not simply on whatever is “publicly observable” to everyone. It also shows that cultivation of the person is not confined to “knowing how.” An exemplary person may well have the great charisma to affect others but does not necessarily know how to affect others.
  • Western philosophy at its origin is similar to classic Chinese philosophy. The significance of this point is not merely in revealing historical facts. It calls our attention to a dimension that has been eclipsed by the obsession with the search for eternal, universal truth and the way it is practiced, namely through rational arguments.
  • One might well consider the Chinese kung fu perspective a form of pragmatism.  The proximity between the two is probably why the latter was well received in China early last century when John Dewey toured the country. What the kung fu perspective adds to the pragmatic approach, however, is its clear emphasis on the cultivation and transformation of the person, a dimension that is already in Dewey and William James but that often gets neglected
  • A kung fu master does not simply make good choices and use effective instruments to satisfy whatever preferences a person happens to have. In fact the subject is never simply accepted as a given. While an efficacious action may be the result of a sound rational decision, a good action that demonstrates kung fu has to be rooted in the entire person, including one’s bodily dispositions and sentiments, and its goodness is displayed not only through its consequences but also in the artistic style one does it. It also brings forward what Charles Taylor calls the “background” — elements such as tradition and community — in our understanding of the formation of a person’s beliefs and attitudes. Through the kung fu approach, classic Chinese philosophy displays a holistic vision that brings together these marginalized dimensions and thereby forces one to pay close attention to the ways they affect each other.
  • This kung fu approach shares a lot of insights with the Aristotelian virtue ethics, which focuses on the cultivation of the agent instead of on the formulation of rules of conduct. Yet unlike Aristotelian ethics, the kung fu approach to ethics does not rely on any metaphysics for justification.
  • This approach opens up the possibility of allowing multiple competing visions of excellence, including the metaphysics or religious beliefs by which they are understood and guided, and justification of these beliefs is then left to the concrete human experiences.
  • it is more appropriate to consider kung fu as a form of art. Art is not ultimately measured by its dominance of the market. In addition, the function of art is not accurate reflection of the real world; its expression is not constrained to the form of universal principles and logical reasoning, and it requires cultivation of the artist, embodiment of virtues/virtuosities, and imagination and creativity.
  • If philosophy is “a way of life,” as Pierre Hadot puts it, the kung fu approach suggests that we take philosophy as the pursuit of the art of living well, and not just as a narrowly defined rational way of life.

Films For Action - 0 views

  •  
    "Brandalism is a revolt against corporate control of the visual realm. It is the biggest anti-advertising campaign in world history and it's getting bigger. Starting in July 2012 with a small team in a van, Brandalism has grown tenfold to include teams in 10 UK cities skilled up in taking back space. This film covers the Brandalism takeover in May 2014 which saw the reclamation of over 360 corporate advertising spaces with hand made original art works submitted by 40 international artists."

Art Is Vital - James Hamblin - The Atlantic - 1 views

  • If you ask Americans if liberal arts are important, Gardner continued, they say yes. But in terms of budgets, what gets cut first is not “core subjects” or even athletics.
  • “came about in a frame of increased emphasis on test scores and utility—the market economy becoming a marketing society. Everything is about what you’re going to get,” in readily quantifiable terms.
  • Woetzel's vision is “to give kids the tools to become adults who are creative, adaptable, and collaborative, expressive—capable of having their eyes and ears and senses alive.”
  • ...3 more annotations...
  • “We’re not talking about making sure that everybody has private music lessons,” Woetzel said. “We’re talking about a way of educating that involves artistic sensibilities—artistic habits of mind. The ability to re-assess and to imagine. To be in a science class and not think it’s about memorization entirely,” but to imagine its applications.
  • “People still don’t get it,” Woodard said. “They think it’s play time. They think it’s touchy-feely. But it’s undeniable what music, painting, [and] movement do to the brain. It becomes more receptive to scientific ideas.”
  • “You cannot be an innovator in any category,” Woodard said, “unless that creative instinct is exercised.”

The World Is Yours, the World Is Mine - NYTimes.com - 0 views

  • History is often held hostage by the highest bidder — whoever gets to tell the story ends up defining what happened. What happened in 2014? What mattered in 2014? It depends whom you ask.
  • Historical narratives recount political, economic or social events, but rarely tell stories of the everyday. The mundane nuances of life are often ignored precisely because they are so personal. But private stories are usually the ones that we connect with most
  • Modes of storytelling like painting and rap allow us to engage with those personal stories, becoming the vehicles through which history passes.
  • ...5 more annotations...
  • Much in the way that hip-hop’s place in popular culture was diminishing by the time Nas took it up in the early 1990s, Indo-Persian miniature painting fell from relevance in Pakistani culture. The practice shifted so dramatically after the fall of the Mughal Empire and the rise of colonial rule in South Asia during the 19th century that when I began engaging with the miniature in my work in the late 1980s and early ’90s at the National College of Arts in Lahore, it was regarded as tourist kitsch and derided as a craft technique.
  • For years, the form had been ignored by many Pakistani artists. I found it ripe with potential — to change its status and its narrative and to deconstruct its stereotypes. What others saw as enslavement to tradition, I recognized as a path to expanding the medium from within, embracing the complexities of craft and rigor in order to open up possibilities for dialogue.
  • My interest in juxtaposing hip-hop and Indo-Persian miniature painting, the primary medium through which I have told stories, is in taking these two disparate narrative forms and letting the dissonance find a detour.
  • My work over the past 20 years has both borrowed and departed from traditional modes of miniature painting. One of these elements, the hair of Gopis — the female consorts of the Hindu god Krishna — appears in this painting, circling around the central axis. Over the past 15 years, I have been experimenting by divorcing their signature hairstyles from the rest of their bodies as a means of identifying them. The Gopi hair, in its many transformed and recontextualized iterations, takes on the appearance of bats, particles or elements of a moving mass. In this painting, the Gopis swirl around Africa and move outward. In their clusters around the central glowing orb of Africa, the Gopis coalesce and overlap, suggesting a symbol that became ubiquitous in 2014: the biohazard sign.
  • My process is driven by my interest in exploring and rediscovering cultural and political boundaries, and using that space to create new frameworks for dialogue and visual narrative. Contemporaneity is about remaining relevant by challenging the status quo, not by clinging to past successes. This is at odds with the standards set up in the worlds of commercial art and music, which are more interested in branding and often hold an artist hostage to one idea or form.

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. — By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. — They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — — and therefore I’m responsible for everything I do, a dizzying fact which causes — an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. — Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. — Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating. —
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • Nothing stood in the way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.

In Defense of Facts - The Atlantic - 1 views

  • over 13 years, he has published a series of anthologies—of the contemporary American essay, of the world essay, and now of the historical American essay—that misrepresents what the essay is and does, that falsifies its history, and that contains, among its numerous selections, very little one would reasonably classify within the genre. And all of this to wide attention and substantial acclaim
  • D’Agata’s rationale for his “new history,” to the extent that one can piece it together from the headnotes that preface each selection, goes something like this. The conventional essay, nonfiction as it is, is nothing more than a delivery system for facts. The genre, as a consequence, has suffered from a chronic lack of critical esteem, and thus of popular attention. The true essay, however, deals not in knowing but in “unknowing”: in uncertainty, imagination, rumination; in wandering and wondering; in openness and inconclusion
  • Every piece of this is false in one way or another.
  • There are genres whose principal business is fact—journalism, history, popular science—but the essay has never been one of them. If the form possesses a defining characteristic, it is that the essay makes an argument
  • That argument can rest on fact, but it can also rest on anecdote, or introspection, or cultural interpretation, or some combination of all these and more
  • what makes a personal essay an essay and not just an autobiographical narrative is precisely that it uses personal material to develop, however speculatively or intuitively, a larger conclusion.
  • Nonfiction is the source of the narcissistic injury that seems to drive him. “Nonfiction,” he suggests, is like saying “not art,” and if D’Agata, who has himself published several volumes of what he refers to as essays, desires a single thing above all, it is to be known as a maker of art.
  • D’Agata tells us that the term has been in use since about 1950. In fact, it was coined in 1867 by the staff of the Boston Public Library and entered widespread circulation after the turn of the 20th century. The concept’s birth and growth, in other words, did coincide with the rise of the novel to literary preeminence, and nonfiction did long carry an odor of disesteem. But that began to change at least as long ago as the 1960s, with the New Journalism and the “nonfiction novel.”
  • What we really seem to get in D’Agata’s trilogy, in other words, is a compendium of writing that the man himself just happens to like, or that he wants to appropriate as a lineage for his own work.
  • What it’s like is abysmal: partial to trivial formal experimentation, hackneyed artistic rebellion, opaque expressions of private meaning, and modish political posturing
  • If I bought a bag of chickpeas and opened it to find that it contained some chickpeas, some green peas, some pebbles, and some bits of goat poop, I would take it back to the store. And if the shopkeeper said, “Well, they’re ‘lyric’ chickpeas,” I would be entitled to say, “You should’ve told me that before I bought them.”
  • when he isn’t cooking quotes or otherwise fudging the record, he is simply indifferent to issues of factual accuracy, content to rely on a mixture of guesswork, hearsay, and his own rather faulty memory.
  • More commonly, his rejoinders are a lot more hostile — not to mention juvenile (“Wow, Jim, your penis must be so much bigger than mine”), defensive, and in their overarching logic, deeply specious. He’s not a journalist, he insists; he’s an essayist. He isn’t dealing in anything as mundane as the facts; he’s dealing in “art, dickhead,” in “poetry,” and there are no rules in art.
  • D’Agata replies that there is something between history and fiction. “We all believe in emotional truths that could never hold water, but we still cling to them and insist on their relevance.” The “emotional truths” here, of course, are D’Agata’s, not Presley’s. If it feels right to say that tae kwon do was invented in ancient India (not modern Korea, as Fingal discovers it was), then that is when it was invented. The term for this is truthiness.
  • D’Agata clearly wants to have it both ways. He wants the imaginative freedom of fiction without relinquishing the credibility (and for some readers, the significance) of nonfiction. He has his fingers crossed, and he’s holding them behind his back. “John’s a different kind of writer,” an editor explains to Fingal early in the book. Indeed he is. But the word for such a writer isn’t essayist. It’s liar.
  • The point of all this nonsense, and a great deal more just like it, is to advance an argument about the essay and its history. The form, D’Agata’s story seems to go, was neglected during the long ages that worshiped “information” but slowly emerged during the 19th and 20th centuries as artists learned to defy convention and untrammel their imaginations, coming fully into its own over the past several decades with the dawning recognition of the illusory nature of knowledge.
  • Most delectable is when he speaks about “the essay’s traditional ‘five-paragraph’ form.” I almost fell off my chair when I got to that one. The five-paragraph essay—introduction, three body paragraphs, conclusion; stultifying, formulaic, repetitive—is the province of high-school English teachers. I have never met one outside of a classroom, and like any decent college writing instructor, I never failed to try to wean my students away from them. The five-paragraph essay isn’t an essay; it’s a paper.
  • What he fails to understand is that facts and the essay are not antagonists but siblings, offspring of the same historical moment
  • —by ignoring the actual contexts of his selections, and thus their actual intentions—D’Agata makes the familiar contemporary move of imposing his own conceits and concerns upon the past. That is how ethnography turns into “song,” Socrates into an essayist, and the whole of literary history into a single man’s “emotional truth.”
  • The history of the essay is indeed intertwined with “facts,” but in a very different way than D’Agata imagines. D’Agata’s mind is Manichaean. Facts bad, imagination good
  • When he refers to his selections as essays, he does more than falsify the essay as a genre. He also effaces all the genres that they do belong to: not only poetry, fiction, journalism, and travel, but, among his older choices, history, parable, satire, the sermon, and more — genres that possess their own particular traditions, conventions, and expectations.
  • one needs to recognize that facts themselves have a history.
  • Facts are not just any sort of knowledge, such as also existed in the ancient and medieval worlds. A fact is a unit of information that has been established through uniquely modern methods
  • Fact, etymologically, means “something done”—that is, an act or deed
  • It was only in the 16th century—an age that saw the dawning of a new empirical spirit, one that would issue not only in modern science, but also in modern historiography, journalism, and scholarship—that the word began to signify our current sense of “real state of things.”
  • It was at this exact time, and in this exact spirit, that the essay was born. What distinguished Montaigne’s new form—his “essays” or attempts to discover and publish the truth about himself—was not that it was personal (precursors like Seneca also wrote personally), but that it was scrupulously investigative. Montaigne was conducting research into his soul, and he was determined to get it right.
  • His famous motto, Que sais-je?—“What do I know?”—was an expression not of radical doubt but of the kind of skepticism that fueled the modern revolution in knowledge.
  • It is no coincidence that the first English essayist, Galileo’s contemporary Francis Bacon, was also the first great theorist of science.
  • That knowledge is problematic—difficult to establish, labile once created, often imprecise and always subject to the limitations of the human mind—is not the discovery of postmodernism. It is a foundational insight of the age of science, of fact and information, itself.
  • The point is not that facts do not exist, but that they are unstable (and are becoming more so as the pace of science quickens). Knowledge is always an attempt. Every fact was established by an argument—by observation and interpretation—and is susceptible to being overturned by a different one
  • A fact, you might say, is nothing more than a frozen argument, the place where a given line of investigation has come temporarily to rest.
  • Sometimes those arguments are scientific papers. Sometimes they are news reports, which are arguments with everything except the conclusions left out (the legwork, the notes, the triangulation of sources—the research and the reasoning).
  • When it comes to essays, though, we don’t refer to those conclusions as facts. We refer to them as wisdom, or ideas
  • the essay draws its strength not from separating reason and imagination but from putting them in conversation. A good essay moves fluidly between thought and feeling. It subjects the personal to the rigors of the intellect and the discipline of external reality. The truths it finds are more than just emotional.

You're more biased than you think - even when you know you're biased | News | The Guardian - 0 views

  • there’s plenty of evidence to suggest that we’re all at least somewhat subject to bias
  • Tell Republicans that some imaginary policy is a Republican one, as the psychologist Geoffrey Cohen did in 2003, and they’re much more likely to support it, even if it runs counter to Republican values. But ask them why they support it, and they’ll deny that party affiliation played a role. (Cohen found something similar for Democrats.
  • those who saw the names were biased in favour of famous artists. But even though they acknowledged the risk of bias, when asked to assess their own objectivity, they didn’t view their judgments as any more biased as a result.
  • Even when the risk of bias was explicitly pointed out to them, people remained confident that they weren’t susceptible to it
  • “Even when people acknowledge that what they are about to do is biased,” the researchers write, “they still are inclined to see their resulting decisions as objective.”
  • why it’s often better for companies to hire people, or colleges to admit students, using objective checklists, rather than interviews that rely on gut feelings.
  • It turns out the bias also applies to bias. In other words, we’re convinced that we’re better than most at not falling victim to bias.
  • “used a strategy that they thought was biased,” the researchers note, “and thus they probably expected to feel some bias when using it. The absence of that feeling may have made them more confident in their objectivity.”
  • we have a cognitive bias to the effect that we’re uniquely immune to cognitive biases.
  • Bias spares nobody.

Living the life authentic: Bernard Williams on Paul Gauguin | Aeon Essays - 0 views

  • Williams invites us to see Gauguin’s meaning in life as deeply intertwined with his artistic ambition. His art is, to use Williams’s term for such meaning-giving enterprises, his ground project
  • This is what a ground project does, according to Williams: it gives a reason, not just given that you are alive, but a reason to be alive in the first place.
  • The desires and goals at the heart of what Williams calls a ground project form a fundamental part of one’s identity, and in that sense being true to one’s deepest desires is being true to who one is most deeply.
  • We see here the enormously influential cultural ideal mentioned at the outset: the purpose of life is to be authentic, where that means finding out who you are and living accordingly. Gauguin, in other words, was a cultural prototype for a conception of life’s meaning that today has widespread appeal around the world.
  • Williams, however, thinks that Gauguin’s eventual success as a painter constitutes a form of moral luck, in that his artistic achievement justifies what he did. It provides a justification that not everyone will accept, but one that can make sense to Gauguin himself, and perhaps to others
  • In his essay ‘Moral Luck’ (1976), Williams discusses Paul Gauguin’s decision to leave Paris in order to move to Tahiti where he hoped he could become a great painter. Gauguin left behind – basically abandoned – his wife and children
  • If there’s one theme in all my work it’s about authenticity and self-expression,’ said the philosopher Bernard Williams in an interview with The Guardian in 2002

(1) Mimetic Collapse, Our Destiny - Freddie deBoer - 0 views

  • this recent piece from The New York Times Magazine, which argues that the artistic obsession with novelty and experimentation, the primary obsession of modernism and so something like the default goal of artists for more than a century, has recently run aground. This turn from the primacy of the new does not stem from a choice to reject it, but because culture is truly spent, and can produce nothing original
  • the condition that Farago describes is ultimately the same condition that leads Rolling Stone to publish an anti-Infinite Jest piece in twenty goddamn twenty-three - discursive exhaustion, the inevitable dark side of meme culture, the sputtering firehose of human expression that is the internet running dry. CT Jones wrote that piece because it’s a thing people write, Rolling Stone published it because it’s a thing publications publish, and people read it because it’s a thing people are known to think. These are not ideas so much as they are the impressions of where ideas once were, like the lines you find on your face the morning after you sleep on the wrong pillow.
  • The litbro, in other words, is a simulacra, a symbol that has eaten what it was meant to symbolize, a representation of something that has never existed. The idea is Jean Baudrillard’s, expressed in several texts but most famously in Simulacra and Simulation, published in 1981
  • In it, Baudrillard argues that there are four phases of the image - a faithful depiction of that which really is, an unfaithful depiction of that which really is, a depiction that covers up for the fact that there is nothing which is actually being depicted, and the simulacra, which exists in a human culture of such universal equivalency that no one has the grounding necessary to know what “reality” might even be outside of equivalencies, outside of depiction
  • at this stage, the litbro really exists precisely within Baudrillard’s concepts - it is a representation of someone else’s representation, a second-hand depiction, an archetype that is now developed fully through reference to itself rather than to some underlying reality. Baudrillard said that the development of simulacra is “a question of substituting the signs of the real for the real.”
  • Another example you often hear is the 1950s diner, the joint that has the neon signs and the art deco styling and the mini jukeboxes at the tables. This classic bit of Americana is not, in fact, based on what diners were like in the 1950s; it’s someone’s idea of what 1950s diners were like, which then spread mimetically from the actual physical 1950s diners that had been built to films and television, which then acted as “proof” that the imaginary diners were real, creating a social expectation of what a diner looks like that diner owners then felt pressure to fulfill…. Eventually most people came to believe that this is what diners were like in the 1950s. The point, though, is not that this is an act of deception. The point is that the consumerist reality in which these restaurants exist obliterates any belief in a true or false depiction. (No one cares whether the classic 1950s diner actually depicts a historical truth, really

Football and racist language: Reclaiming the Y-word | The Economist - 0 views

  • Nov 9th 2012, 16:28 by B.R.

ENGLISH football grounds in the 1980s were not pleasant places. Fans were squeezed into caged terraces which were often left open to the elements. Hooliganism was rife, and the country was in a state of moral panic as lurid images of fighting youths became a fixture on news bulletins. Margaret Thatcher, the prime minister, convened a "war cabinet". Ken Bates, the chairman of Chelsea football club, suggested electrifying the fences in the stadiums to keep the warring factions apart. By the end of the decade English football had reached its nadir. In 1985, 39 Italian football fans had been killed at Heysel, in Belgium, after a riot by Liverpool supporters. In 1989, Liverpool supporters themselves were the victims, as 96 lost their lives at Hillsborough as a result of incompetent policing.

Some time toward the beginning of that decade, aged around ten, your correspondent was taken to his first away game by his father, a fanatical supporter of Tottenham Hotspur. The game was a derby with Chelsea, a bitter London rival. Chelsea's fans were among the game’s most notorious. Many were skinheads; foot soldiers of extreme right-wing parties such as the National Front and the British Movement. Tottenham, because of the area in North London in which it is situated, had a large and visible Jewish following. It did not make for a pleasant combination.

At one point during the first half the hostile Chelsea crowd fell suddenly silent. Quietly at first came a hissing sound, like someone letting out gas from a canister. Before long the hissing reached a crescendo. It was a terrifying sound for a small boy, but I was too young to grasp the significance. Only later was I filled in: the Chelsea fans were mimicking the sound of cyanide being released at a Nazi concentration camp.

As the years wore on, the abuse towards Spurs fans became less subtle. When clubs with a large right-wing following came to Tottenham’s White Hart Lane stadium, such as Chelsea, West Ham, Leeds and Manchester United, the anti-semitism was relentless. One common song ran:

Spurs are on their way to Belsen
Hitler's going to gas ‘em again
The Yids from Tottenham
The Yids from White Hart Lane

The Y-word. It was the most relentless chant of all. Thousands of opposition fans, faces snarled, would come together in a spiteful mantra: “Yiddo! Yiddo!” It was directed towards Tottenham fans and players alike. It would go on for minutes at a time, many times in a game. After a while it was so commonplace that one became immune to it.

At some point during that time, something odd began to happen. Tottenham fans began to appropriate the Y-word. Gradually they began to refer to themselves as Yids. The club’s supporters started to describe themselves as the “Yid Army”. Soon the word was being chanted solely by Tottenham fans referring to themselves, in a spirit of celebration and of togetherness. It had been reclaimed in much the same way that the word “nigger” was taken back by black hip-hop artists and “queer” was by gays. As a result, the word died as an insult, at least within football grounds.