TOK Friends: Group items matching "Theory" in title, tags, annotations or URL

From Sports Illustrated, the Latest Body Part for Women to Fix - NYTimes.com - 0 views

  • At 44, I am old enough to remember when reconstruction was something you read about in history class, when a muffin top was something delicious you ate at the bakery, a six-pack was how you bought your beer, camel toe was something one might glimpse at the zoo, a Brazilian was someone from the largest country in South America and terms like thigh gap and bikini bridge would be met with blank looks.
  • Now, each year brings a new term for an unruly bit of body that women are expected to subdue through diet and exercise.
  • Girls’ and women’s lives matter. Their safety and health and their rights matter. Whether every inch of them looks like a magazine cover? That, my sisters, does not matter at all.
  • there’s no profit in leaving things as they are. Show me a body part, I’ll show you someone who’s making money by telling women that theirs looks wrong and they need to fix it. Tone it, work it out, tan it, bleach it, tattoo it, lipo it, remove all the hair, lose every bit of jiggle.
  • As a graphic designer and Photoshop teacher, I also have to note that Photoshop is used HEAVILY in these kinds of publications. Even on women with incredibly beautiful (by pop culture standards) bodies. It's quite sad because the imagery we're expected to live up to (or approximate) by cultural standards, is illustration. It's not even real. My boyfriend and I had a big laugh over a Playboy cover a few months ago where the Photoshopping was so extreme (thigh gap and butt cheek) it was anatomically impossible and looked ridiculous. I work in the industry.. I know what the Liquify filter and the Spot Healing Brush can do!
  • We may harp on gender inequality while pursuing stupid fetishes. Well into our middle age, we still try to forcefully wriggle into size 2 jeans. We foolishly spend tonnes of money on fake (these guys should be sued for false advertising) age-defying, anti-wrinkle creams. Why do we have to have our fuzz and bush disappear while the men have forests on their chests, abdomens, butts, arms and legs? For that we have only ourselves to blame. We just cannot get out of this mindset of being objectified. And we pass on this foolishness to our daughters and grand-daughters. They get trapped, never satisfied with what they see in the mirror. Don't expect the men to change anytime soon. They will always maintain the status quo. It is for us, women, to get out of this rut. We have to 'snatch' gender equality. It will never be handed to us. PERIOD
  • I spent years dieting and exercising to look good--or really to not look bad. I knew the calories (and probably still do) in thousands of foods. How I regret the time I spent on that and the boyfriends who cared about that. And how much more I had to give to the world. With unprecedented economic injustice, ecosystems collapsing, war breaking out everywhere, nations going under water, people starving in refugee camps, the keys to life, behavior, and disease being unlocked in the biological sciences . . . this is what we think women should spend their time worrying about? Talk about a poverty of ambition. No more. Won't even look at these demeaning magazines when I get my hair cut. If that's what a woman cares about, I try to tell her to stop wasting her time. If that's what a man cares about, he is a waste of my time. What a depressing way to distract women from achieving more in this world. Really wish I'd known this at 12.
  • we believe we're all competing against one another to procreate and participate in evolution. So women (and men) compete ferociously, and body image is a subset of all that. Then there's Lamarckian evolutionary theory and epigenetics... (http://en.wikipedia.org/wiki/Lamarckism, http://en.wikipedia.org/wiki/Epigenetics) Bottom line is that we can't stop this train any more easily than we can stop the anthropocene's Climate Change. Human beings are tempted. Sometimes we win the battle, other times we give in to vanity, hedonism, and ego. This is all a subset of much larger forces at play. Men and women make choices and act within that environment. Deal with it.

Searching For Santa | Issue 70 | Philosophy Now - 0 views

  • I brace myself against the freezing air and remind myself that I’m here on a mission – to try and find an answer to a question which causes massive conflict to this day. Debate about it has reached fever pitch in recent years, with schoolteachers even being fired for teaching belief in him.
  • Certainly not! In fact, science disproves the existence of Santa. We know he couldn’t possibly visit all those children in a single evening, because his sleigh would explode at those speeds! We also know that he couldn’t fit down the chimney…
  • Not at all. A lot of people assume that because you don’t believe in Santa you must not get any presents, but that just isn’t the case. I get lots of presents, and I enjoy buying presents for my friends.
  • Yes, I’ve come here looking for Father Christmas.
  • Elder Kringle and his community are self-described ‘Santa Fundamentalists’. They believe the Santa legend exactly the way it’s told. Now I’m going to be the first person ever to be granted an interview by this strange and reclusive community.
  • Well Sam, there are a lot of misunderstandings out there. You see, not all Santa believers reject the theories of parents placing the gifts, or even claims that the toys are made by people in factories and bought in shops.
  • Now I was more than a little apprehensive. It seemed that he wanted to take me out of the country that very night, that very moment even, to meet a community of True Believers. Normally when bearded strangers decked out in red and green with bells make this kind of offer, the alarm bells start jingling in my mind. But I was enthralled. I couldn’t resist the opportunity to get this new angle on my story, and so I consented…
  • And so my first interview ended. I confess to finding the anti-Santa position somewhat unnerving, but it certainly addresses some very poignant questions. Next I decided to interview Reverend William Ronald, a believer and Santa apologist, to see if I could get the other side of the story.
  • If other people won’t lead their children in the ways of Santa then we’ll need to do it for them. Also, we would close all the toy stores; people shouldn’t be allowed to choose what toys they have. It isn’t the place of mortals to ‘Play Santa’ with the universe.
  • If we don’t need Santa in order to receive presents, then why believe in him at all? Wasn’t it Voltaire who said: “As long as people believe in absurdities they will continue to commit atrocities”? Does belief in Santa open up unnecessary doors for extremists? Can’t we just accept that sometimes we get crappy presents and just be grateful for getting any presents at all?
  • Maybe people only believe in Santa because it boosts their ego to think that their actions and lives are worthy of 24-hour observation. I don’t know, and I can’t claim to have all the answers. But my search for Santa has certainly given me some food for thought.

Minds and Computers: An Introduction to AI by Matt Carter | Issue 68 | Philosophy Now - 0 views

  • his main concern is to outline and defend the possibility of a computational theory of mind.
  • there can be systems which display (and so have) mentality simply in virtue of instantiating certain computer programs – but that on the other hand, our best available programs are ‘woefully inadequate’ to that task.
  • For students of artificial intelligence (AI), the book explains very clearly why the whole artificial intelligence project presupposes substantive and controversial answers to some traditional philosophical questions.
  • One central problem for artificial intelligence is how to get aboutness into computer programs – how to get semantics out of syntactics.
  • Visual experience is beyond merely having certain physical inputs in the forms of light waves, undergoing certain transformations in the brain and producing physical outputs such as speaking the sentence “There is something red.”
  • He needs to explain how he thinks a computational account can be provided of qualia; or he needs to abandon a qualia-based account of experience, in favour of some computational account; or he needs to abandon his conclusion that there is no objection in principle to a purely computational account of the mind.

Notes Towards a Philosophy of Sleep | Issue 91 | Philosophy Now - 0 views

  • Meeting Christopher after a long interval reminded me of his excellent book Living Philosophy: Reflections on Life, Meaning and Morality (2001). The volume includes a fascinating essay entitled ‘The Need to Sleep’, where he notes that philosophers have not paid sufficient attention to this extraordinary phenomenon. Well, a decade on, this is the beginning of a response to Christopher’s wake-up call.
  • If I told you that I had a neurological disease which meant that for eight or more hours a day I lost control of my faculties, bade farewell to the outside world, and was subject to complex hallucinations and delusions – such as being chased by a grizzly bear at Stockport Railway Station – you would think I was in a pretty bad way.
  • Of course, sleep is not a disease at all, but the condition of daily (nightly) life for the vast majority of us. The fact that we accept without surprise the need for a prolonged black-out as part of our daily life highlights our tendency to take for granted anything about our condition that is universal.
  • Honest philosophers know they cannot complain about casting their philosophical pearls before drowsy swine, because they, too, have fallen asleep over the works of philosophers greater than themselves.
  • Not only is sleep a reminder of our ultimate helplessness, or even of how circumscribed a place thought sometimes plays in our lives, there is also the fear of contagion, as if talking about sleep might induce it – just as this reference to yawning will get at least 50% of you yawning in the next 15 minutes. (It’s a fact, honest!)
  • Since all animals sleep, we assume it has a biological purpose. The trouble is, we don’t know what that purpose is. There are many theories – energy conservation, growth promotion, immobilisation during hours of darkness when it might be dangerous to be out and about, consolidation of memories – but they are all open to serious objections.
  • Dreams, of course, have figured more significantly in philosophy. Being a mode of consciousness – prompting Aristotle to say that “the soul makes assertions in sleep” (On Dreams 458b) – dreams seem one step up from the mere putting out of zzzs.
  • they place a philosophically interesting question mark against our confidence in the nature of the world we appear to share with others.
  • Naturally, dreams preoccupied him as much as the daily resurrection of the self. He suggested that dreams might be an attempt to make sense of the body’s passage from sleep to wakefulness.
  • nothing is more sleep-inducing than the egocentric tales of someone else’s solipsistic dreams. We long to hear that magic phrase “And then I woke up.”

An Ancient Civics Lesson - NYTimes.com - 0 views

  • ANCIENT Greek and Roman politics rested on a conundrum. Lest they undermine social peace, the poor could not routinely threaten the lives or property of the rich. But unless the laws were fair enough to the poor, why should the plebs respect them?
  • Greeks and Romans addressed this challenge — one that we continue to face — with three distinct models. Athenian democracy empowered the poor, while employing the rich to serve; Roman republicanism empowered the rich, while building in special protections for the poor; and the political theory of Aristotle imagined a new politics of what he called the “middling” class.
  • This range of ancient options suggests that it is pointless to imagine a politics in which no class is dominant or one in which the interests of different classes don’t sometimes conflict. History and philosophy alike counsel that the most practical course is to moderate class conflict, not by pretending it away, but through the self-assertion of the weaker classes and institutionalized recognition of their interests.

Great Scientists Don't Need Math - WSJ - 0 views

  • Without advanced math, how can you do serious work in the sciences? Well, I have a professional secret to share: Many of the most successful scientists in the world today are mathematically no more than semiliterate.
  • I was reassured by the discovery that superior mathematical ability is similar to fluency in foreign languages. I might have become fluent with more effort and sessions talking with the natives, but being swept up with field and laboratory research, I advanced only by a small amount.
  • Far more important throughout the rest of science is the ability to form concepts, during which the researcher conjures images and processes by intuition.
  • exceptional mathematical fluency is required in only a few disciplines, such as particle physics, astrophysics and information theory
  • When something new is encountered, the follow-up steps usually require mathematical and statistical methods to move the analysis forward. If that step proves too technically difficult for the person who made the discovery, a mathematician or statistician can be added as a collaborator
  • Ideas in science emerge most readily when some part of the world is studied for its own sake. They follow from thorough, well-organized knowledge of all that is known or can be imagined of real entities and processes within that fragment of existence
  • Ramped up and disciplined, fantasies are the fountainhead of all creative thinking. Newton dreamed, Darwin dreamed, you dream. The images evoked are at first vague. They may shift in form and fade in and out. They grow a bit firmer when sketched as diagrams on pads of paper, and they take on life as real examples are sought and found.
  • Over the years, I have co-written many papers with mathematicians and statisticians, so I can offer the following principle with confidence. Call it Wilson's Principle No. 1: It is far easier for scientists to acquire needed collaboration from mathematicians and statisticians than it is for mathematicians and statisticians to find scientists able to make use of their equations.
  • If your level of mathematical competence is low, plan to raise it, but meanwhile, know that you can do outstanding scientific work with what you have. Think twice, though, about specializing in fields that require a close alternation of experiment and quantitative analysis. These include most of physics and chemistry, as well as a few specialties in molecular biology.
  • Newton invented calculus in order to give substance to his imagination
  • Darwin had little or no mathematical ability, but with the masses of information he had accumulated, he was able to conceive a process to which mathematics was later applied.
  • For aspiring scientists, a key first step is to find a subject that interests them deeply and focus on it. In doing so, they should keep in mind Wilson's Principle No. 2: For every scientist, there exists a discipline for which his or her level of mathematical competence is enough to achieve excellence.

Digital Dog Collar - NYTimes.com - 0 views

  • I hate the new Apple Watch. Hate what it will do to conversation, to the pace of the day, to my friends, to myself. I hate that it will enable the things that already make life so incremental, now-based and hyper-connected. That, and make things far worse.
  • People check their phones about 150 times a day. Now, imagine how many glances they’ll take with all the information in the world on their wrists.
  • To the complaints that our smartphone addiction has produced a world where nobody talks much anymore, nobody listens and nobody reads, you can add a new one with the smartwatch: nobody makes eye contact.
  • “The Apple Watch is the most personal device we have ever created,” he said. “It’s not just with you, it’s on you.”
  • From here on out, there is no down time, and no excuses for reality escapes. You are connected, 24/7.
  • There is some evidence that heavy smartphone use makes you dumber. The theory is that having the world at the other end of a mobile search makes for lazy minds, while people who depend less on their devices develop more analytical skills.
  • Add to this concerns about privacy: that the watch is a tracking device, which sends all your personal information to a central database — a corporate control center that already knows far too much about the preferences and habits of smartphone users.

Among the Disrupted - NYTimes.com - 0 views

  • Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind.
  • Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability.
  • the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms:
  • Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology
  • The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past.
  • even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science.
  • The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university
  • The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy.
  • So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
  • are we becoming posthumanists?
  • In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism.
  • The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate.
  • In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.”

The Wisdom Deficit in Schools - The Atlantic - 0 views

  • When I was in high school, I chose to major in English in college because I wanted to be wiser. That’s the word I used. If I ended up making lots of money or writing a book, great; but really, I liked the prospect of being exposed to great thoughts and deep advice, and the opportunity to apply them to my own life in my own clumsy way. I wanted to live more thoughtfully and purposefully
  • Now I’m a veteran English teacher, reflecting on what’s slowly changed at the typical American public high school—and the word wisdom keeps haunting me. I don’t teach it as much anymore, and I wonder who is.
  • how teachers are now being informed by the Common Core State Standards—the controversial math and English benchmarks that have been adopted in most states—and the writers and thought leaders who shape the assessments matched to those standards. It all amounts to an alphabet soup of bureaucratic expectations and what can feel like soul-less instruction. The Smarter Balanced Assessment Consortium—referred to in education circles simply as "SBAC"—is the association that writes a Common Core-aligned assessment used in 25 states
  • The Common Core promotes 10 so-called "College and Career Readiness Anchor Standards" for reading that emphasize technical skills like analyzing, integrating, and delineating a text. But these expectations deal very little with ensuring students are actually appreciating the literature at hand—and say nothing about the personal engagement and life lessons to which my principal was referring
  • Kate Kinsella, an influential author who consults school districts across the country and is considered "a guiding force on the National Advisory Board for the Consortium on Reading Excellence," recently told me to "ditch literature" since "literary fiction is not critical to college success." Kinsella continued, "What’s represented by the standards is the need to analyze texts rather than respond to literature."
  • As a teacher working within this regimented environment, my classroom objectives have had to shift. I used to feel deeply satisfied facilitating a rich classroom discussion on a Shakespearean play; now, I feel proud when my students explicitly acknowledge the aforementioned "anchor standards" and take the initiative to learn these technical skills.
  • But as a man who used to be a high school student interested in pursuing wisdom, I’m almost startled to find myself up late at night, literally studying these anchor standards instead of Hamlet itself.
  • It just feels like a very slow, gradual cultural shift that I don’t even notice except for sudden moments of nostalgia, like remembering a dream out of nowhere
  • I get it: My job is to teach communication, not values, and maybe that’s reasonable. After all, I’m not sure I would want my daughter gaining her wisdom from a randomly selected high-school teacher just because he passed a few writing and literature courses at a state university (which is what I did). My job description has evolved, and I’m fine with that
  • This arrangement, in theory, allows students to read the literature on their own, when they get their own time—and I’m fine with that. But then, where are they getting the time and space to appreciate the deeper lessons of classic literature, to evaluate its truth and appropriately apply it to their own lives?
  • research suggests that a significant majority of teens do not attend church, and youth church attendance has been decreasing over the past few decades. This is fine with me. But then again, where are they getting their wisdom?
  • I’m not talking about my child, or your child. I’m absolutely positive that my daughter will know the difference between Darcy and Wickham before she’s in eighth grade; and it's likely that people who would gravitate toward this story would appreciate this kind of thinking
  • I’m talking about American children in general—kids whose parents work all day, whose fathers left them or whose mothers died
  • even for the parents who do prioritize the humanities in their households, I’m not sure that one generation is actually sharing culturally relevant wisdom with the next one—not if the general community doesn’t even talk about what that wisdom specifically means. Each family can be responsible for teaching wisdom in their own way, and I’m fine with that. But then, does the idea of cultural wisdom get surrendered in the process?
  • Secular wisdom in the public schools seems like it should inherently spring from the literature that’s shaped American culture. And while the students focus on how Whitman’s "purpose shapes the content and style of his text," they’re obviously exposed to the words that describe his leaves of grass.
  • But there is a noticeable deprioritization of literature, and a crumbling consensus regarding the nation’s idea of classic literature. The Common Core requires only Shakespeare, which is puzzling if only for its singularity
  • The country’s disregard for the institutional transfer of cultural wisdom is evident with this single observation: None of the state assessments has a single question about the content of any classic literature. They only test on reading skills
  • But where are the students getting their wisdom?
  • Admittedly, nothing about the Common Core or any modern shifts in teaching philosophies is forbidding me from sharing deeper lessons found in Plato’s cave or Orwell’s Airstrip One. The fine print of the Common-Core guidelines even mentions a few possible titles. But this comes with constant and pervasive language that favors objective analysis over personal engagement.
  • Later, a kid who reminds me of the teenager I was in high school—a boy who is at different times depressed, excited, naive, and curious—asked me why I became an English teacher. I smiled in self-defense, but I was silent again, not knowing what to say anymore.

The Obama Boom - The New York Times - 1 views

  • What did Mr. Obama do that was supposed to kill jobs? Quite a lot, actually. He signed the 2010 Dodd-Frank financial reform, which critics claimed would crush employment by starving businesses of capital.
  • He raised taxes on high incomes, especially at the very top, where average tax rates rose by about six and a half percentage points after 2012, a step that critics claimed would destroy incentives.
  • Yet none of the dire predicted consequences of these policies have materialized.
  • And he enacted a health reform that went into full effect in 2014, amid claims that it would have catastrophic effects on employment.
  • what do we learn from this impressive failure to fail? That the conservative economic orthodoxy dominating the Republican Party is very, very wrong.
  • conservative orthodoxy has a curiously inconsistent view of the abilities and motivations of corporations and wealthy individuals — I mean, job creators.
  • On one side, this elite is presumed to be a bunch of economic superheroes, able to deliver universal prosperity by summoning the magic of the marketplace. On the other side, they’re depicted as incredibly sensitive flowers who wilt in the face of adversity — raise their taxes a bit, subject them to a few regulations, or for that matter hurt their feelings in a speech or two, and they’ll stop creating jobs and go sulk in their tents, or more likely their mansions.
  • It’s a doctrine that doesn’t make much sense, but it conveys a clear message that, whaddya know, turns out to be very convenient for the elite: namely, that injustice is a law of nature, that we’d better not do anything to make our society less unequal or protect ordinary families from financial risks. Because if we do, the usual suspects insist, we’ll be severely punished by the invisible hand, which will collapse the economy.
  • From a conservative point of view, Mr. Obama did everything wrong, afflicting the comfortable (slightly) and comforting the afflicted (a lot), and nothing bad happened. We can, it turns out, make our society better after all.

The Virtue of Contradicting Ourselves - The New York Times - 0 views

  • We don’t just loathe inconsistencies in others; we hate them in ourselves, too. But why? What makes contradictions so revolting — and should they be?
  • Leon Festinger, one of the great social psychologists in history, coined the term cognitive dissonance to describe the discomfort you feel if you say or do something that is inconsistent with one of your beliefs
  • there was a catch: Sometimes people weren’t bothered at all by holding inconsistent beliefs
  • it appeared that people felt dissonance only when their choices had negative consequences, but people still felt dissonance when they wrote something inconsistent with their prior beliefs and then threw it in the trash, never to be seen again
  • For years, it remained a mystery why people would feel dissonance even when there were no negative consequences. But recently, it was solved
  • Using neuroscience to track the activation of different brain regions, Professor Harmon-Jones and colleagues found that inconsistent beliefs really bother us only when they have conflicting implications for action
  • If I’m socially liberal and fiscally conservative, and I want to vote for a candidate with a decent shot at winning, my beliefs are contradictory. One way to reconcile them is to change my opinion on abortion or tax policies. Goodbye, dissonance.
  • This helps to explain why many people’s political beliefs fall on a simple left-right continuum, rather than in more complex combinations. Once, we might have held more nuanced opinions, but in pursuit of consistency, we’ve long since whitewashed the shades of gray
  • It also explains why we can’t stand to vote for flip-floppers. We worry that they don’t have clear principles; we think they lack integrity
  • Consistency is especially appealing to political conservatives, who report a stronger preference for certainty, structure, order and closure than liberals. If you favor predictability over ambiguity and stability over change, a candidate who holds fast to his ideology has a lot of curb appeal.
  • When historians and political scientists rate the presidents throughout history, the most effective ones turn out to be the most open-minded
  • This is true of both conservative and liberal presidents. Abraham Lincoln was a flip-flopper: He started out pro-slavery before abolishing it. Franklin Delano Roosevelt was a flip-flopper, too: Elected on a platform of balancing the budget, he substantially increased spending with his New Deal.
  • we should be wary of electing anyone who fails to evolve. “Progress is impossible without change,” George Bernard Shaw observed, “and those who cannot change their minds cannot change anything.”
  • when it comes to facing our own contradictions, perhaps we should be more open as well. As the artist Marcel Duchamp observed, “I have forced myself to contradict myself, in order to avoid conforming to my own taste.”

In Defense of Naïve Reading - NYTimes.com - 1 views

  • Clearly, poems and novels and paintings were not produced as objects for future academic study; there is no a priori reason to think that they could be suitable objects of  “research.” By and large they were produced for the pleasure and enlightenment of those who enjoyed them.
  • But just as clearly, the teaching of literature in universities ─ especially after the 19th-century research model of Humboldt University of Berlin was widely copied ─ needed a justification consistent with the aims of that academic setting
  • The main aim was research: the creation and accumulation and transmission of knowledge. And the main model was the natural science model of collaborative research: define problems, break them down into manageable parts, create sub-disciplines and sub-sub-disciplines for the study of these, train students for such research specialties and share everything. With that model, what literature and all the arts needed was something like a general “science of meaning” that could eventually fit that sort of aspiration. Texts or art works could be analyzed as exemplifying and so helping establish such a science. Results could be published in scholarly journals, disputed by others, consensus would eventually emerge and so on.
  • literature study in a university education requires some method of evaluation of whether the student has done well or poorly. Students’ papers must be graded and no faculty member wants to face the inevitable “that’s just your opinion” unarmed, as it were. Learning how to use a research methodology, providing evidence that one has understood and can apply such a method, is understandably an appealing pedagogy
  • Literature and the arts have a dimension unique in the academy, not shared by the objects studied, or “researched” by our scientific brethren. They invite or invoke, at a kind of “first level,” an aesthetic experience that is by its nature resistant to restatement in more formalized, theoretical or generalizing language. This response can certainly be enriched by knowledge of context and history, but the objects express a first-person or subjective view of human concerns that is falsified if wholly transposed to a more “sideways on” or third person view.
  • such works also can directly deliver a kind of practical knowledge and self-understanding not available from a third person or more general formulation of such knowledge. There is no reason to think that such knowledge — exemplified in what Aristotle said about the practically wise man (the phronimos) or in what Pascal meant by the difference between l’esprit géométrique and l’esprit de finesse — is any less knowledge because it cannot be so formalized or even taught as such.

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 1 views

  • Skinner's approach stressed the historical associations between a stimulus and the animal's response -- an approach easily framed as a kind of empirical statistical analysis, predicting the future as a function of the past.
  • Chomsky's conception of language, on the other hand, stressed the complexity of internal representations, encoded in the genome, and their maturation in light of the right data into a sophisticated computational system, one that cannot be usefully broken down into a set of associations.
  • Chomsky acknowledged that the statistical approach might have practical value, just as in the example of a useful search engine, and is enabled by the advent of fast computers capable of processing massive data. But as far as a science goes, Chomsky would argue it is inadequate, or more harshly, kind of shallow
  • David Marr, a neuroscientist colleague of Chomsky's at MIT, defined a general framework for studying complex biological systems (like the brain) in his influential book Vision,
  • a complex biological system can be understood at three distinct levels. The first level ("computational level") describes the input and output to the system, which define the task the system is performing. In the case of the visual system, the input might be the image projected on our retina and the output might be our brain's identification of the objects present in the image we had observed. The second level ("algorithmic level") describes the procedure by which an input is converted to an output, i.e. how the image on our retina can be processed to achieve the task described by the computational level. Finally, the third level ("implementation level") describes how our own biological hardware of cells implements the procedure described by the algorithmic level. (A minimal code sketch of these three levels follows this list.)
  • The emphasis here is on the internal structure of the system that enables it to perform a task, rather than on external association between past behavior of the system and the environment. The goal is to dig into the "black box" that drives the system and describe its inner workings, much like how a computer scientist would explain how a cleverly designed piece of software works and how it can be executed on a desktop computer.
  • As written today, the history of cognitive science is a story of the unequivocal triumph of an essentially Chomskyian approach over Skinner's behaviorist paradigm -- an achievement commonly referred to as the "cognitive revolution,"
  • While this may be a relatively accurate depiction in cognitive science and psychology, behaviorist thinking is far from dead in related disciplines. Behaviorist experimental paradigms and associationist explanations for animal behavior are used routinely by neuroscientists
  • Chomsky critiqued the field of AI for adopting an approach reminiscent of behaviorism, except in more modern, computationally sophisticated form. Chomsky argued that the field's heavy use of statistical techniques to pick regularities in masses of data is unlikely to yield the explanatory insight that science ought to offer. For Chomsky, the "new AI" -- focused on using statistical learning techniques to better mine and predict data -- is unlikely to yield general principles about the nature of intelligent beings or about cognition.
  • Behaviorist principles of associations could not explain the richness of linguistic knowledge, our endlessly creative use of it, or how quickly children acquire it with only minimal and imperfect exposure to language presented by their environment.
  • it has been argued -- in my view rather plausibly, though neuroscientists don't like it -- that neuroscience for the last couple hundred years has been on the wrong track.
  • Implicit in this endeavor is the assumption that with enough sophisticated statistical tools and a large enough collection of data, signals of interest can be weeded out from the noise in large and poorly understood biological systems.
  • Brenner, a contemporary of Chomsky who also participated in the same symposium on AI, was equally skeptical about new systems approaches to understanding the brain. When describing an up-and-coming systems approach to mapping brain circuits called Connectomics, which seeks to map the wiring of all neurons in the brain (i.e. diagramming which nerve cells are connected to others), Brenner called it a "form of insanity."
  • These debates raise an old and general question in the philosophy of science: What makes a satisfying scientific theory or explanation, and how ought success be defined for science?
  • Ever since Isaiah Berlin's famous essay, it has become a favorite pastime of academics to place various thinkers and scientists on the "Hedgehog-Fox" continuum: the Hedgehog, a meticulous and specialized worker, driven by incremental progress in a clearly defined field versus the Fox, a flashier, ideas-driven thinker who jumps from question to question, ignoring field boundaries and applying his or her skills where they seem applicable.
  • Chomsky's work has had tremendous influence on a variety of fields outside his own, including computer science and philosophy, and he has not shied away from discussing and critiquing the influence of these ideas, making him a particularly interesting person to interview.
  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • An unlikely pair, systems biology and artificial intelligence both face the same fundamental task of reverse-engineering a highly complex system whose inner workings are largely a mystery
  • neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
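A minimal sketch of Marr's three levels as quoted above, using a deliberately tiny "vision" task. This is an illustration only, not anything from Marr's Vision or the article; the task, function names, and threshold are invented (Python, standard library only):

    # 1. Computational level -- WHAT problem is solved (the input/output contract):
    #    input: pixel intensities in [0, 1]; output: True iff a bright object is present.
    #    The contract says nothing about how the answer is computed.

    # 2. Algorithmic level -- HOW input becomes output: one procedure among many
    #    that could satisfy the same contract (here: smooth, then threshold).
    def detect_bright_object(pixels, threshold=0.8):
        smoothed = []
        for i in range(len(pixels)):
            window = pixels[max(0, i - 1): i + 2]  # 3-pixel moving average
            smoothed.append(sum(window) / len(window))
        return any(v > threshold for v in smoothed)

    # 3. Implementation level -- WHAT substrate executes the procedure:
    #    here, the Python interpreter on silicon; in the visual system, neurons.
    if __name__ == "__main__":
        retina = [0.1, 0.2, 0.9, 0.95, 0.88, 0.3]  # toy 1-D "retina"
        print(detect_bright_object(retina))        # -> True

The same contract (level 1) could be met by a different algorithm (level 2) running on different hardware (level 3), which is exactly why the levels are kept distinct.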

Physicists in Europe Find Tantalizing Hints of a Mysterious New Particle - The New York... - 0 views

  • Two teams of physicists working independently at the Large Hadron Collider at CERN, the European Organization for Nuclear Research, reported on Tuesday that they had seen traces of what could be a new fundamental particle of nature.
  • One possibility, out of a gaggle of wild and not-so-wild ideas springing to life as the day went on, is that the particle — assuming it is real — is a heavier version of the Higgs boson, a particle that explains why other particles have mass. Another is that it is a graviton, the supposed quantum carrier of gravity, whose discovery could imply the existence of extra dimensions of space-time.
  • At the end of a long chain of “ifs” could be a revolution, the first clues to a theory of nature that goes beyond the so-called Standard Model, which has ruled physics for the last quarter-century.
  • The Higgs boson was the last missing piece of the Standard Model, which explains all we know about subatomic particles and forces. But there are questions this model does not answer, such as what happens at the bottom of a black hole, the identity of the dark matter and dark energy that rule the cosmos, or why the universe is matter and not antimatter.
  • When physicists announced in 2012 that they had indeed discovered the Higgs boson, it was not the end of physics. It was not even, to paraphrase Winston Churchill, the beginning of the end.
  • A coincidence is the most probable explanation for the surprising bumps in data from the collider, physicists from the experiments cautioned, saying that a lot more data was needed and would in fact soon be available
  • The Large Hadron Collider was built at a cost of some $10 billion, to speed protons around a 17-mile underground track at more than 99 percent of the speed of light and smash them together in search of new particles and forces of nature. By virtue of Einstein’s equivalence of mass and energy, the more energy poured into these collisions, the more massive particles can come out of them. And by the logic of quantum microscopy, the more energy they have to spend, the smaller and more intimate details of nature physicists can see. (The arithmetic is worked out after this list.)
  • Since June, after a two-year shutdown, CERN physicists have been running their collider at nearly twice the energy with which they discovered the Higgs, firing twin beams of protons with 6.5 trillion electron volts of energy at each other in search of new particles to help point them to deeper laws.
  • The most intriguing result so far, reported on Tuesday, is an excess of pairs of gamma rays corresponding to an energy of about 750 billion electron volts. The gamma rays, the physicists said, could be produced by the radioactive decay of a new particle, in this case perhaps a cousin of the Higgs boson, which itself was first noticed because it decayed into an abundance of gamma rays.
  • Or it could be a more massive particle that has decayed in steps down to a pair of photons. Nobody knows. No model predicted this, which is how some scientists like it.
  • “We are barely coming to terms with the power and the glory” of the CERN collider’s ability to operate at 13 trillion electron volts, Dr. Spiropulu said in a text message. “We are now entering the era of taking a shot in the dark!”
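To make the mass-energy bookkeeping above concrete, here is the arithmetic, using only figures quoted in these annotations plus the textbook proton mass m_p ≈ 0.938 GeV/c² (an added assumption, not from the article):

    E_{\text{cm}} = 6.5\,\text{TeV} + 6.5\,\text{TeV} = 13\,\text{TeV} \quad \text{(two colliding proton beams)}

    m = \frac{E}{c^{2}} \quad\Longrightarrow\quad m_X = \frac{750\,\text{GeV}}{c^{2}} \approx \frac{750}{0.938}\,m_p \approx 800\,m_p

On those numbers, the hinted-at particle, if real, would weigh roughly 800 protons, which is why raising the collision energy directly buys access to heavier candidates.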

Is Science Kind of a Scam? - The New Yorker - 1 views

  • No well-tested scientific concept is more astonishing than the one that gives its name to a new book by the Scientific American contributing editor George Musser, “Spooky Action at a Distance.”
  • The ostensible subject is the mechanics of quantum entanglement; the actual subject is the entanglement of its observers.
  • his question isn’t so much how this weird thing can be true as why, given that this weird thing had been known about for so long, so many scientists were so reluctant to confront it. What keeps a scientific truth from spreading?
  • it is as if two magic coins, flipped at different corners of the cosmos, always came up heads or tails together. (The spooky action takes place only in the context of simultaneous measurement. The particles share states, but they don’t send signals.) A toy simulation of this coin analogy appears after this list.
  • fashion, temperament, zeitgeist, and sheer tenacity affected the debate, along with evidence and argument.
  • The certainty that spooky action at a distance takes place, Musser says, challenges the very notion of “locality,” our intuitive sense that some stuff happens only here, and some stuff over there. What’s happening isn’t really spooky action at a distance; it’s spooky distance, revealed through an action.
  • Why, then, did Einstein’s question get excluded for so long from reputable theoretical physics? The reasons, unfolding through generations of physicists, have several notable social aspects,
  • What started out as a reductio ad absurdum became proof that the cosmos is in certain ways absurd. What began as a bug became a feature and is now a fact.
  • “If poetry is emotion recollected in tranquility, then science is tranquility recollected in emotion.” The seemingly neutral order of the natural world becomes the sounding board for every passionate feeling the physicist possesses.
  • Musser explains that the big issue was settled mainly by being pushed aside. Generational imperatives trumped evidentiary ones. The things that made Einstein the lovable genius of popular imagination were also the things that made him an easy object of condescension. The hot younger theorists patronized him,
  • There was never a decisive debate, never a hallowed crucial experiment, never even a winning argument to settle the case, with one physicist admitting, “Most physicists (including me) accept that Bohr won the debate, although like most physicists I am hard pressed to put into words just how it was done.”
  • Arguing about non-locality went out of fashion, in this account, almost the way “Rock Around the Clock” displaced Sinatra from the top of the charts.
  • The same pattern of avoidance and talking-past and taking on the temper of the times turns up in the contemporary science that has returned to the possibility of non-locality.
  • the revival of “non-locality” as a topic in physics may be due to our finding the metaphor of non-locality ever more palatable: “Modern communications technology may not technically be non-local but it sure feels that it is.”
  • Living among distant connections, where what happens in Bangalore happens in Boston, we are more receptive to the idea of such a strange order in the universe.
  • The “indeterminacy” of the atom was, for younger European physicists, “a lesson of modernity, an antidote to a misplaced Enlightenment trust in reason, which German intellectuals in the 1920’s widely held responsible for their country’s defeat in the First World War.” The tonal and temperamental difference between the scientists was as great as the evidence they called on.
  • Science isn’t a slot machine, where you drop in facts and get out truths. But it is a special kind of social activity, one where lots of different human traits—obstinacy, curiosity, resentment of authority, sheer cussedness, and a grudging readiness to submit pet notions to popular scrutiny—end by producing reliable knowledge
  • What was magic became mathematical and then mundane. “Magical” explanations, like spooky action, are constantly being revived and rebuffed, until, at last, they are reinterpreted and accepted. Instead of a neat line between science and magic, then, we see a jumpy, shifting boundary that keeps getting redrawn
  • Real-world demarcations between science and magic, Musser’s story suggests, are like Bugs’s: made on the move and as much a trap as a teaching aid.
  • In the past several decades, certainly, the old lines between the history of astrology and astronomy, and between alchemy and chemistry, have been blurred; historians of the scientific revolution no longer insist on a clean break between science and earlier forms of magic.
  • Where once logical criteria between science and non-science (or pseudo-science) were sought and taken seriously—Karl Popper’s criterion of “falsifiability” was perhaps the most famous, insisting that a sound theory could, in principle, be proved wrong by one test or another—many historians and philosophers of science have come to think that this is a naïve view of how the scientific enterprise actually works.
  • They see a muddle of coercion, old magical ideas, occasional experiment, hushed-up failures—all coming together in a social practice that gets results but rarely follows a definable logic.
  • Yet the old notion of a scientific revolution that was really a revolution is regaining some credibility.
  • David Wootton, in his new, encyclopedic history, “The Invention of Science” (Harper), recognizes the blurred lines between magic and science but insists that the revolution lay in the public nature of the new approach.
  • What killed alchemy was the insistence that experiments must be openly reported in publications which presented a clear account of what had happened, and they must then be replicated, preferably before independent witnesses.
  • Wootton, while making little of Popper’s criterion of falsifiability, makes it up to him by borrowing a criterion from his political philosophy. Scientific societies are open societies. One day the lunar tides are occult, the next day they are science, and what changes is the way in which we choose to talk about them.
  • Wootton also insists, against the grain of contemporary academia, that single observed facts, what he calls “killer facts,” really did polish off antique authorities
  • once we agree that the facts are facts, they can do amazing work. Traditional Ptolemaic astronomy, in place for more than a millennium, was destroyed by what Galileo discovered about the phases of Venus. That killer fact “serves as a single, solid, and strong argument to establish its revolution around the Sun, such that no room whatsoever remains for doubt,” Galileo wrote, and Wootton adds, “No one was so foolish as to dispute these claims.”
  • Several things flow from Wootton’s view. One is that “group think” in the sciences is often true think. Science has always been made in a cloud of social networks.
  • There has been much talk in the pop-sci world of “memes”—ideas that somehow manage to replicate themselves in our heads. But perhaps the real memes are not ideas or tunes or artifacts but ways of making them—habits of mind rather than products of mind
  • science, then, a club like any other, with fetishes and fashions, with schemers, dreamers, and blackballed applicants? Is there a real demarcation to be made between science and every other kind of social activity
  • The claim that basic research is valuable because it leads to applied technology may be true but perhaps is not at the heart of the social use of the enterprise. The way scientists do think makes us aware of how we can think
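The "magic coins" analogy lends itself to a toy simulation. A sketch (Python; the setup and names are invented for illustration): a state shared at creation time reproduces the perfect agreement Musser describes, with nothing transmitted at measurement time. One deliberate limitation: this classical shared-state model mimics the perfect correlation, but real entanglement also violates Bell inequalities, which no such classical model can do.

    import random

    def flip_entangled_pair():
        """Prepare one pair of 'magic coins' whose state is fixed at creation.

        Nothing passes between the two observers when the coins are later
        examined; the correlation was baked in when the pair was prepared.
        """
        shared = random.choice(["heads", "tails"])
        return shared, shared  # one coin for each distant observer

    trials = 10_000
    agree = sum(a == b for a, b in (flip_entangled_pair() for _ in range(trials)))
    print(f"agreement rate: {agree / trials:.3f}")  # always 1.000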

How Humans Ended Up With Freakishly Huge Brains | WIRED - 0 views

  • paleontologists documented one of the most dramatic transitions in human evolution. We might call it the Brain Boom. Humans, chimps and bonobos split from their last common ancestor between 6 and 8 million years ago.
  • Starting around 3 million years ago, however, the hominin brain began a massive expansion. By the time our species, Homo sapiens, emerged about 200,000 years ago, the human brain had swelled from about 350 grams to more than 1,300 grams.
  • In that 3-million-year sprint, the human brain almost quadrupled the size its predecessors had attained over the previous 60 million years of primate evolution.
  • There are plenty of theories, of course, especially regarding why: increasingly complex social networks, a culture built around tool use and collaboration, the challenge of adapting to a mercurial and often harsh climate
  • Although these possibilities are fascinating, they are extremely difficult to test.
  • Although it makes up only 2 percent of body weight, the human brain consumes a whopping 20 percent of the body’s total energy at rest. In contrast, the chimpanzee brain needs only half that. (These percentages are put into watts after this list.)
  • contrary to long-standing assumptions, larger mammalian brains do not always have more neurons, and the ones they do have are not always distributed in the same way.
  • The human brain has 86 billion neurons in all: 69 billion in the cerebellum, a dense lump at the back of the brain that helps orchestrate basic bodily functions and movement; 16 billion in the cerebral cortex, the brain’s thick corona and the seat of our most sophisticated mental talents, such as self-awareness, language, problem solving and abstract thought; and 1 billion in the brain stem and its extensions into the core of the brain
  • In contrast, the elephant brain, which is three times the size of our own, has 251 billion neurons in its cerebellum, which helps manage a giant, versatile trunk, and only 5.6 billion in its cortex
  • primates evolved a way to pack far more neurons into the cerebral cortex than other mammals did
  • The great apes are tiny compared to elephants and whales, yet their cortices are far denser: Orangutans and gorillas have 9 billion cortical neurons, and chimps have 6 billion. Of all the great apes, we have the largest brains, so we come out on top with our 16 billion neurons in the cortex.
  • “What kinds of mutations occurred, and what did they do? We’re starting to get answers and a deeper appreciation for just how complicated this process was.”
  • there was a strong evolutionary pressure to modify the human regulatory regions in a way that sapped energy from muscle and channeled it to the brain.
  • Accounting for body size and weight, the chimps and macaques were twice as strong as the humans. It’s not entirely clear why, but it is possible that our primate cousins get more power out of their muscles than we get out of ours because they feed their muscles more energy. “Compared to other primates, we lost muscle power in favor of sparing energy for our brains,” Bozek said. “It doesn’t mean that our muscles are inherently weaker. We might just have a different metabolism.”
  • a pioneering experiment. Not only were they going to identify relevant genetic mutations from our brain’s evolutionary past, they were also going to weave those mutations into the genomes of lab mice and observe the consequences.
  • Silver and Wray introduced the chimpanzee copy of HARE5 into one group of mice and the human edition into a separate group. They then observed how the embryonic mice brains grew.
  • After nine days of development, mice embryos begin to form a cortex, the outer wrinkly layer of the brain associated with the most sophisticated mental talents. On day 10, the human version of HARE5 was much more active in the budding mice brains than the chimp copy, ultimately producing a brain that was 12 percent larger
  • “It wasn’t just a couple mutations and—bam!—you get a bigger brain. As we learn more about the changes between human and chimp brains, we realize there will be lots and lots of genes involved, each contributing a piece to that. The door is now open to get in there and really start understanding. The brain is modified in so many subtle and nonobvious ways.”
  • As recent research on whale and elephant brains makes clear, size is not everything, but it certainly counts for something. The reason we have so many more cortical neurons than our great-ape cousins is not that we have denser brains, but rather that we evolved ways to support brains that are large enough to accommodate all those extra cells.
  • There’s a danger, though, in becoming too enamored with our own big heads. Yes, a large brain packed with neurons is essential to what we consider high intelligence. But it’s not sufficient
  • No matter how large the human brain grew, or how much energy we lavished upon it, it would have been useless without the right body. Three particularly crucial adaptations worked in tandem with our burgeoning brain to dramatically increase our overall intelligence: bipedalism, which freed up our hands for tool making, fire building and hunting; manual dexterity surpassing that of any other animal; and a vocal tract that allowed us to speak and sing.
  • Human intelligence, then, cannot be traced to a single organ, no matter how large; it emerged from a serendipitous confluence of adaptations throughout the body. Despite our ongoing obsession with the size of our noggins, the fact is that our intelligence has always been so much bigger than our brain.
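To put the 2-percent/20-percent figures above into concrete units, a back-of-the-envelope estimate assuming a typical adult resting metabolic rate of about 90 W and a 70 kg body with a 1.4 kg brain (textbook figures, not from the article):

    \frac{1.4\,\text{kg}}{70\,\text{kg}} = 2\%\ \text{of body mass}, \qquad P_{\text{brain}} \approx 0.20 \times 90\,\text{W} \approx 18\,\text{W}

On those assumptions the human brain runs on roughly 18 watts, less than a reading lamp, even though it is the most expensive organ per gram in the body.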