
TOK Friends: Group items tagged "paradigms"


johnsonle1

Scientists Find First Observed Evidence That Our Universe May Be a Hologram | Big Think - 1 views

  • all the information in our 3-dimensional reality may actually be included in the 2-dimensional surface of its boundaries. It's like watching a 3D show on a 2D television.
  • the team found that the observational data they found was largely predictable by the math of holographic theory. 
  • After this phase comes to a close, the Universe goes into a geometric phase, which can be described by Einstein's equations.
  • ...1 more annotation...
  • It's a new paradigm for a physical reality.
  •  
    As we saw in the video "Spooky Science" in TOK, the 2D and 3D worlds are very distinct, but in this article the author discusses another theory: that our 3D reality may actually be contained in the 2D surface of its boundaries. This theory is a rival to the theory of cosmic inflation. The holographic theory not only explains the anomalies, it is also a simpler theory of the early universe. Now scientists find that the math of holographic theory largely predicts the observational data, so it has the potential to become a new paradigm for physical reality. --Sissi (2/6/2017)
  •  
    What is the holographic universe idea? It's not exactly that we are living in some kind of Star Trekky computer simulation. Rather the idea, first proposed in the 1990s by Leonard Susskind and Gerard 't Hooft, says that all the information in our 3-dimensional reality may actually be included in the 2-dimensional surface of its boundaries. It's like watching a 3D show on a 2D television.
Javier E

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 1 views

  • Skinner's approach stressed the historical associations between a stimulus and the animal's response -- an approach easily framed as a kind of empirical statistical analysis, predicting the future as a function of the past.
  • Chomsky's conception of language, on the other hand, stressed the complexity of internal representations, encoded in the genome, and their maturation in light of the right data into a sophisticated computational system, one that cannot be usefully broken down into a set of associations.
  • Behaviorist principles of associations could not explain the richness of linguistic knowledge, our endlessly creative use of it, or how quickly children acquire it with only minimal and imperfect exposure to language presented by their environment.
  • ...17 more annotations...
  • David Marr, a neuroscientist colleague of Chomsky's at MIT, defined a general framework for studying complex biological systems (like the brain) in his influential book Vision,
  • a complex biological system can be understood at three distinct levels. The first level ("computational level") describes the input and output to the system, which define the task the system is performing. In the case of the visual system, the input might be the image projected on our retina and the output might be our brain's identification of the objects present in the image we had observed. The second level ("algorithmic level") describes the procedure by which an input is converted to an output, i.e. how the image on our retina can be processed to achieve the task described by the computational level. Finally, the third level ("implementation level") describes how our own biological hardware of cells implements the procedure described by the algorithmic level.
  • The emphasis here is on the internal structure of the system that enables it to perform a task, rather than on external association between past behavior of the system and the environment. The goal is to dig into the "black box" that drives the system and describe its inner workings, much like how a computer scientist would explain how a cleverly designed piece of software works and how it can be executed on a desktop computer.
  • As written today, the history of cognitive science is a story of the unequivocal triumph of an essentially Chomskyian approach over Skinner's behaviorist paradigm -- an achievement commonly referred to as the "cognitive revolution,"
  • While this may be a relatively accurate depiction in cognitive science and psychology, behaviorist thinking is far from dead in related disciplines. Behaviorist experimental paradigms and associationist explanations for animal behavior are used routinely by neuroscientists
  • Chomsky critiqued the field of AI for adopting an approach reminiscent of behaviorism, except in more modern, computationally sophisticated form. Chomsky argued that the field's heavy use of statistical techniques to pick regularities in masses of data is unlikely to yield the explanatory insight that science ought to offer. For Chomsky, the "new AI" -- focused on using statistical learning techniques to better mine and predict data -- is unlikely to yield general principles about the nature of intelligent beings or about cognition.
  • Chomsky acknowledged that the statistical approach might have practical value, just as in the example of a useful search engine, and is enabled by the advent of fast computers capable of processing massive data. But as far as a science goes, Chomsky would argue it is inadequate, or more harshly, kind of shallow
  • An unlikely pair, systems biology and artificial intelligence both face the same fundamental task of reverse-engineering a highly complex system whose inner workings are largely a mystery
  • Implicit in this endeavor is the assumption that with enough sophisticated statistical tools and a large enough collection of data, signals of interest can be weeded out from the noise in large and poorly understood biological systems.
  • Brenner, a contemporary of Chomsky who also participated in the same symposium on AI, was equally skeptical about new systems approaches to understanding the brain. When describing an up-and-coming systems approach to mapping brain circuits called Connectomics, which seeks to map the wiring of all neurons in the brain (i.e. diagramming which nerve cells are connected to others), Brenner called it a "form of insanity."
  • These debates raise an old and general question in the philosophy of science: What makes a satisfying scientific theory or explanation, and how ought success be defined for science?
  • Ever since Isaiah Berlin's famous essay, it has become a favorite pastime of academics to place various thinkers and scientists on the "Hedgehog-Fox" continuum: the Hedgehog, a meticulous and specialized worker, driven by incremental progress in a clearly defined field versus the Fox, a flashier, ideas-driven thinker who jumps from question to question, ignoring field boundaries and applying his or her skills where they seem applicable.
  • Chomsky's work has had tremendous influence on a variety of fields outside his own, including computer science and philosophy, and he has not shied away from discussing and critiquing the influence of these ideas, making him a particularly interesting person to interview.
  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • it has been argued in my view rather plausibly, though neuroscientists don't like it -- that neuroscience for the last couple hundred years has been on the wrong track.
  • neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
katedriscoll

Metacontrol and body ownership: divergent thinking increases the virtual hand illusion ... - 0 views

  • The virtual hand illusion (VHI) paradigm demonstrates that people tend to perceive agency and bodily ownership for a virtual hand that moves in synchrony with their own movements. Given that this kind of effect can be taken to reflect self–other integration (i.e., the integration of some external, novel event into the representation of oneself), and given that self–other integration has been previously shown to be affected by metacontrol states (biases of information processing towards persistence/selectivity or flexibility/integration), we tested whether the VHI varies in size depending on the metacontrol bias. Persistence and flexibility biases were induced by having participants carry out a convergent thinking (Remote Associates) task or divergent-thinking (Alternate Uses) task, respectively, while experiencing a virtual hand moving synchronously or asynchronously with their real hand. Synchrony-induced agency and ownership effects were more pronounced in the context of divergent thinking than in the context of convergent thinking, suggesting that a metacontrol bias towards flexibility promotes self–other integration.
  • As in previous studies, participants were more likely to experience subjective agency and ownership for a virtual hand if it moved in synchrony with their own, real hand. As predicted, the size of this effect was significantly moderated by the type of creativity task in the context of which the illusion was induced.
  • It is important to keep in mind the fact that our present findings were obtained in a paradigm that strongly interleaved what we considered the task prime (i.e., the particular creativity task) and the induction of the VHI—the process we aimed to prime. The practical reason to do so was to increase the probability that the metacontrol state that the creativity tasks were hypothesized to induce or establish would be sufficiently close in time to the synchrony manipulation to have an impact on the thereby induced changes in self-perception. However, this implies that we are unable to disentangle the effects of the task prime proper and the effects of possible interactions between this task prime and the synchrony manipulation. There are indeed reasons to assume that such interactions are not unlikely to have occurred
  • ...2 more annotations...
  • and that they would make perfect theoretical sense. The observation that the VHI was affected by the type of creativity task and that performance in the creativity tasks was affected by the synchrony manipulation suggests some degree of overlap between the ways that engaging in particular creativity tasks and experiencing particular degrees of synchrony are able to bias perceived ownership and agency. In terms of our theoretical framework, this implies that engaging in divergent thinking biases metacontrol towards flexibility in similar ways as experiencing synchrony between one’s own movements and those of a virtual effector does, while engaging in convergent thinking biases metacontrol towards persistence as experiencing asynchrony does. What the present findings demonstrate is that both kinds of manipulation together bias the VHI in the predicted direction, but they do not allow us to statistically or numerically separate and estimate the contribution that each of the two confounded manipulations might have made. Accordingly, the present findings should not be taken to provide conclusive evidence that priming tasks alone are able to change self-perception without being supported (and perhaps even enabled) by the experience of synchrony
  • between proprioceptive and visual action feedback.
  •  
    This article relates to the ownership module. It describes a very interesting experiment using the VHI paradigm.
proudsa

Thomas Kuhn: the man who changed the way the world looked at science | Science | The Gu... - 0 views

  • how it ought to develop ("the scientific method")
    • proudsa
       
      Could the switch between theories of how science works be defined as a paradigm in itself?
  • Kuhn's version of how science develops differed dramatically from the Whig version.
aliciathompson1

Science needs the freedom to constantly change its m... - 0 views

  • science is one of humanity’s most noble and successful endeavours, and our best way to learn how the world works.
  • Today science faces a crisis of legitimacy which is entirely centred on rampant public distrust and disavowal.
  • The capacity for self-correction is the source of science’s immense strength, but the public is unnerved by the fact that scientific wisdom isn’t immutable.
  • ...3 more annotations...
  • Many scientific findings run counter to common sense and challenge our deepest assumptions about reality
  • Nor are paradigm shifts confined to the distant scientific past.
  • But that’s the thing. Holding still is exactly what science won’t do.
lenaurick

Being sleep-deprived makes people much more likely to give false confessions - Vox - 0 views

  • According to the Innocence Project, one in four people who have been exonerated for crimes they didn't commit confessed to that crime.
  • Psychologists have documented several reasons this might occur. The big one is that interrogating police officers can impose their suggestions on suspects: "We have evidence proving you were there!" "Your fingerprints were found!"
  • Only about 18 percent of the well-rested participants signed the form (such is the baseline power of an authority figure demanding guilt). But the results were more dramatic in the sleep-deprived condition. "That 18 percent now has risen to 50 percent," Loftus says.
  • ...5 more annotations...
  • According to Loftus's study, the majority of false confessions occur when interrogations last more than 12 hours.
  • Law enforcement "really needs to be super careful when a person is being interrogated after they have been up a long time," says Elizabeth Loftus, a co-author on a new study on sleep deprivation and false confessions in the Proceedings of the National Academy of Sciences.
  • When they were told a second time to sign the form and admit their guilt, 68 percent of sleep-deprived participants gave in. (On the second request, 38 percent of the rested participants signed.)
  • "It would probably be scientifically prudent to go out and demonstrate it again with a more serious paradigm," Loftus admits. But there are also ethical limits to how far researchers can manipulate participants into thinking they've done something horrible.
  • She's also found that through subtle suggestions, people can be made to recall childhood memories that never happened.
proudsa

No, honey, you can't be anything you want to be. And that's okay. - The Washington Post - 0 views

  • studies show that pursuing overly-ambitious goals can be harmful.
    • proudsa
       
      Society's normal way of thinking might not always be right - a possible paradigm shift in social thinking?
Javier E

Seeing What Cannot Be Spoken - The Dish | By Andrew Sullivan - The Daily Beast - 0 views

  • not everything we can see and therefore not everything we can mentally grasp can be put into words.
  • he believed those things about which we have to be silent to be the most important.
  • Philosophical confusion, he maintained, had its roots not in the relatively superficial thinking expressed by words but in that deeper territory studied by Freud, the pictorial thinking that lies in our unconscious and is expressed only involuntarily in, for example, our dreams, our doodles and in our “Freudian slips”
  • ...1 more annotation...
  • it is, he thinks, his job as a philosopher not to argue for or against the truth of this or that proposition but rather to delve deeper and substitute one picture for another. In other words, he conceived it as his task to make us, or at least to enable us, to see things differently.
Javier E

Worldly Philosophers Wanted - NYTimes.com - 0 views

  • Keynes himself was driven by a powerful vision of capitalism. He believed it was the only system that could create prosperity, but it was also inherently unstable and so in need of constant reform. This vision caught the imagination of a generation that had experienced the Great Depression and World War II and helped drive policy for nearly half a century.
  • Friedrich Hayek and Milton Friedman, who envisioned an ideal economy involving isolated individuals bargaining with one another in free markets. Government, they contended, usually messes things up. Overtaking a Keynesianism that many found inadequate to the task of tackling the stagflation of the 1970s, this vision fueled neoliberal and free-market conservative agendas of governments around the world.
  • It took extensive government action to prevent another Great Depression, while the enormous rewards received by bankers at the heart of the meltdown have led many to ask whether unfettered capitalism produced an equitable distribution of wealth. We clearly need a new, alternative vision of capitalism. But thanks to decades of academic training in the “dentistry” approach to economics, today’s Keynes or Friedman is nowhere to be found.
  • ...2 more annotations...
  • To refuse to discuss ideas such as types of capitalism deprives us of language with which to think about these problems. It makes it easier to stop thinking about what the economic system is for and in whose interests it is working.
  • Perhaps the protesters occupying Wall Street are not so misguided after all. The questions they raise — how do we deal with the local costs of global downturns? Is it fair that those who suffer the most from such downturns have their safety net cut, while those who generate the volatility are bailed out by the government? — are the same ones that a big-picture economic vision should address. If economists want to help create a better world, they first have to ask, and try to answer, the hard questions that can shape a new vision of capitalism’s potential.
Sophia C

BBC News - Viewpoint: Human evolution, from tree to braid - 0 views

  • What was, in my view, a logical conclusion reached by the authors was too much for some researchers to take.
  • The conclusion of the Dmanisi study was that the variation in skull shape and morphology observed in this small sample, derived from a single population of Homo erectus, matched the entire variation observed among African fossils ascribed to three species - H. erectus, H. habilis and H. rudolfensis.
  • a single population of H. erectus,
  • ...13 more annotations...
  • They all had to be the same species.
  • It was not surprising to find that Neanderthals and modern humans interbred, a clear expectation of the biological species concept.
  • I wonder when the penny will drop: when we have five pieces of a 5,000-piece jigsaw puzzle, every new bit that we add is likely to change the picture.
  • The identity of the fourth player remains unknown but it was an ancient lineage that had been separate for probably over a million years. H. erectus seems a likely candidate. Whatever the name we choose to give this mystery lineage, what these results show is that gene flow was possible not just among contemporaries but also between ancient and more modern lineages.
  • Scientists succeeded in extracting the most ancient mitochondrial DNA so far, from the Sima de los Huesos site in Atapuerca, Spain.
  • We have built a picture of our evolution based on the morphology of fossils and it was wrong.
    • Sophia C
       
      Kuhn
  • when we know how plastic - or easily changeable - skull shape is in humans. And our paradigms must also change.
  • We must abandon, once and for all, views of modern human superiority over archaic (ancient) humans. The terms "archaic" and "modern" lose all meaning as do concepts of modern human replacement of all other lineages.
  • The deep-rooted shackles that have sought to link human evolution with stone tool-making technological stages - the Stone Ages - even when we have known that these have overlapped with each other for half-a-million years in some instances.
  • The world of our biological and cultural evolution was far too fluid for us to constrain it into a few stages linked by transitions.
  • We have to flesh out the genetic information and this is where archaeology comes into the picture.
  • Rather than focus on differences between modern humans and Neanderthals, what the examples show is the range of possibilities open to humans (Neanderthals included) in different circumstances.
  • research using new technology on old archaeological sites, as at La Chapelle; and
Javier E

New Statesman - The Joy of Secularism: 11 Essays for How We Live Now - 0 views

  • The Joy of Secularism: 11 Essays for How We Live Now, by George Levine. Reviewed by Terry Eagleton, 22 June 2011. Misunderstanding what it means to be secular.
  • Societies become truly secular not when they dispense with religion but when they are no longer greatly agitated by it. It is when religious faith ceases to be a vital part of the public sphere
  • Christianity is certainly other-worldly, and so is any reasonably sensitive soul who has been reading the newspapers. The Christian gospel looks to a future transformation of the appalling mess we see around us into a community of justice and friendship, a change so deep-seated and indescribable as to make Lenin look like a Lib Dem. "This [world] is our home," Levine comments. If he really feels at home in this crucifying set-up, one might humbly suggest that he shouldn't. Christians and political radicals certainly don't.
  • ...9 more annotations...
  • he suspects that Christian faith is other-worldly in the sense of despising material things. Material reality, in his view, is what art celebrates but religion does not. This is to forget that Gerard Manley Hopkins was a Jesuit. It is also to misunderstand the doctrine of Creation
  • Adam Phillips writes suggestively of human helplessness as opposed to the sense of protectedness that religious faith supposedly brings us, without noticing that the signifier of God for the New Testament is the tortured and executed corpse of a suspected political criminal.
  • None of these writers points out that if Christianity is true, then it is all up with us. We would then have to face the deeply disagreeable truth that the only authentic life is one that springs from a self-dispossession so extreme that it is probably beyond our power.
  • Secularisation is a lot harder than people tend to imagine. The history of modernity is, among other things, the history of substitutes for God. Art, culture, nation, Geist, humanity, society: all these, along with a clutch of other hopeful aspirants, have been tried from time to time. The most successful candidate currently on offer is sport, which, short of providing funeral rites for its spectators, fulfils almost every religious function in the book.
  • The Christian paradigm of love, by contrast, is the love of strangers and enemies, not of those we find agreeable. Civilised notions such as mutual sympathy, more's the pity, won't deliver us the world we need.
  • What exactly," he enquires, "does the invocation of some supernatural being add?" A Christian might reply that it adds the obligations to give up everything one has, including one's life, if necessary, for the sake of others. And this, to say the least, is highly inconvenient.
  • If Friedrich Nietzsche was the first sincere atheist, it is because he saw that the Almighty is exceedingly good at disguising Himself as something else, and that much so-called secularisation is accordingly bogus.
  • Postmodernism is perhaps best seen as Nietzsche shorn of the metaphysical baggage. Whereas modernism is still haunted by a God-shaped absence, postmodern culture is too young to remember a time when men and women were anguished by the fading spectres of truth, reality, nature, value, meaning, foundations and the like. For postmodern theory, there never was any truth or meaning in the first place
  • Postmodernism is properly secular, but it pays an immense price for this coming of age - if coming of age it is. It means shelving all the other big questions, too, as hopelessly passé. It also involves the grave error of imagining that all faith or passionate conviction is inci­piently dogmatic. It is not only religious belief to which postmodernism is allergic, but belief as such. Advanced capitalism sees no need for the stuff. It is both politically divisive and commercially unnecessary.
charlottedonoho

How can we best assess the neuropsychological effects of violent video game play? | Pet... - 0 views

  • Every time a research paper about violent video games makes it into the news, it feels like we’re in a time loop. Any claims that the study makes about the potential positive or (usually) negative effects of playing games tend to get over-egged to the point of ridiculousness.
  • At best, the measures of aggression that are used in such work are unstandardised; at worst, the field has been shown to be riddled with basic methodological and analytical flaws. These problems are further compounded by entrenched ideologies and a reluctance from some researchers to even talk to their ‘adversaries’, let alone discuss the potential for adversarial collaborations
  • All of this means that we’re stuck at an impasse with violent video games research; it feels like we’re no more clued up on what the actual behavioural effects are now than, say, five or ten years ago.
  • ...4 more annotations...
  • In stage 1, they submit the introduction, methods, proposed analysis, and if necessary, pilot data. This manuscript then goes through the usual peer review process, and is assessed on criteria such as the soundness of the methods and analysis, and overall plausibility of the stated hypotheses.
  • Once researchers have passed through stage 1, they can then move on to data collection. In stage 2, they then submit the full manuscript – the introduction and agreed methods from stage 1, plus results and discussion sections. The results must include the outcome of the analyses agreed in stage 1, but the researchers are allowed to include additional analyses in a separate, ‘exploratory’ section (as long as they are justified).
  • Pre-registering scientific articles in this way helps to protect against a number of undesirable practices (such as p-hacking and HARKing) that can exaggerate statistical findings and make non-existent effects seem real. While this is a problem across psychology generally, it is a particularly extreme problem for violent video game research.
  • By outlining the intended methods and analysis protocols beforehand, Registered Reports protect against these problems, as the review process concentrates on the robustness of the proposed methods. And Registered Reports offer an additional advantage: because manuscripts are never accepted based on the outcome of the data analysis, the process is immune to researcher party lines. It doesn’t matter which research ‘camp’ you are in; your data – and just as importantly, your methods - will speak for themselves.
charlottedonoho

How have changes to publishing affected scientists? | Julie McDougall-Waters | Science ... - 0 views

  • That was the purpose of a recent oral history event at the Royal Society, involving four senior scientists who began their careers in the 1960s and 1970s. Rather than simply reminiscing, they were asked to recall their publishing experiences in scientific periodicals over the last fifty years. How have things changed since they published their first paper?
  • It became clear that the hierarchy of journals has changed over the last fifty years, and the pressure to publish in those considered to have the highest impact has increased considerably, partly a result of the increased volume of data being produced and the need for readers to filter relevant information from the copious amounts of less pertinent stuff available.
  • What have also changed are the technologies available to write a paper. Frith related the process she went through in writing her first paper: “I wrote my papers by long hand and then typed them myself.” Writing a biological paper before computers is one thing, but Ashmore remembered the problems of producing mathematical formulae in a typed manuscript, explaining that “you wrote the paper and probably took it along to somebody to be typed… And then it came back with spaces where you had to write in the equations.”
  • ...2 more annotations...
  • Another change that interested the panellists was the increased number of collaborative and multiple authored papers now submitted to journals, which led them to think about the ethics of acknowledgement. In Meurig Thomas’s view the author is, simply, “the person that primarily thinks about the experiment, plans it, and writes it. I can sleep more comfortably at night this way. If I claim to be a senior author, I have to write it and I have to concoct what the experiment was, and defend it.” Chaloner suggested that authorship has grown “because of the pressure for people to have publications in their names”, with an “agreement to let you come onto this paper and I’ll get on yours next time”. Frith referred to this as “gaming”.
  • Despite all of the technological developments in the last fifty years, there has been no quick or easy response to questions over refereeing, and the event ended with the feeling that although there is no doubt technology has transformed the way science is communicated, its effect has not invariably simplified the process.
Javier E

The Widening World of Hand-Picked Truths - The New York Times - 0 views

  • it’s not just organized religions that are insisting on their own alternate truths. On one front after another, the hard-won consensus of science is also expected to accommodate personal beliefs, religious or otherwise, about the safety of vaccines, G.M.O. crops, fluoridation or cellphone radio waves, along with the validity of global climate change.
  • But presenting people with the best available science doesn’t seem to change many minds. In a kind of psychological immune response, they reject ideas they consider harmful.
  • Viewed from afar, the world seems almost on the brink of conceding that there are no truths, only competing ideologies — narratives fighting narratives. In this epistemological warfare, those with the most power are accused of imposing their version of reality — the “dominant paradigm” — on the rest, leaving the weaker to fight back with formulations of their own. Everything becomes a version.
  • ...3 more annotations...
  • I heard from young anthropologists, speaking the language of postmodernism, who consider science to be just another tool with which Western colonialism further extends its “cultural hegemony” by marginalizing the dispossessed and privileging its own worldview.
  • Science, through this lens, doesn’t discover knowledge, it “manufactures” it, along with other marketable goods.
  • The widening gyre of beliefs is accelerated by the otherwise liberating Internet. At the same time it expands the reach of every mind, it channels debate into clashing memes, often no longer than 140 characters, that force people to extremes and trap them in self-reinforcing bubbles of thought.
Javier E

History News Network | Are You a Genius? - 0 views

  • the real question is not so much ‘What is genius?’ or even ‘Who is a genius?’ but rather, ‘What stake do we have in the whole idea of genius?’ or even, ‘Who’s asking and what’s behind their question?’
  • These are the issues I address in my new book by looking at the different views and theories of genius over the course of three centuries, from the start of the eighteenth century to the present day.
  • I concentrated on France, partly because French literature and intellectual history happen to be my area of expertise and personal interest; partly because the French contribution to the literature on genius hasn’t received its due; but mostly because the variety and the inventiveness of the views and theories of genius in France was a story worth telling for itself
  • ...4 more annotations...
  • For me it’s this literature, more than the phenomenon itself, which makes genius a topic worth paying attention to. And the more you read, the less likely you are to be able to come up with any definition of what genius might be.
  • For eighteenth-century commentators, genius was self-evident: you knew it when you saw it, but for the nineteenth-century Romantics, genius was essentially misunderstood, and only genius itself was capable of recognizing its own kind
  • After the French Revolution, the question of national genius (another sense of the word, deriving from the genius loci) was subject to particularly anxious or over-assertive scrutiny. A number of nineteenth-century novels allowed for a rare feminine role in genius, but almost always doomed genius to failure. The medical profession turned the genius into a madman, while the experimental psychologists at the end of the century devised the IQ test which made genius nothing more than a high point on a continuous scale of intelligence. Child prodigies were the stuff of children’s literature but real examples in the twentieth century generated skepticism about the whole notion of genius, until Julia Kristeva came along and rehabilitated genius as essentially feminine, and Jacques Derrida embraced imposture as its essential quality
  • What all this indicates is that the idea of genius is curiously labile, that it changes shape, definition and value according to the way it’s talked about, but also that there’s something about the idea that, as Claude Lévi-Strauss said about animals, makes it ‘good to think with.’        
Javier E

We Are Just Not Digging The Whole Anymore : NPR - 1 views

  • We just don't do whole things anymore. We don't read complete books — just excerpts. We don't listen to whole CDs — just samplings. We don't sit through whole baseball games — just a few innings. Don't even write whole sentences. Or read whole stories like this one. Long-form reading, listening and viewing habits are giving way to browse-and-choose consumption. "With the increase in the number of media options — or distractions, depending on how you look at them — something has to give, and that something is our attention span," says Adam Thierer, senior research fellow at George Mason University. We care more about the parts and less about the entire. We are into snippets and smidgens and clips and tweets. We are not only a fragmented society, but a fragment society.
  • One Duke University student was famously quoted in a 2006 Time magazine essay telling his history professor, "We don't read whole books anymore."
  • Now there are lots of websites that present whole books and concepts in nano form
  • ...5 more annotations...
  • nearly half of all adults — 47 percent — get some of their local news and information on mobile computing devices. We are receiving our news in kibbles and bits, sacrificing context and quality for quickness and quantity.
  • Here is the ultra-condensation of Pride and Prejudice by Jane Austen: Mr. Darcy: Nothing is good enough for me. Ms. Elizabeth Bennet: I could never marry that proud man. (They change their minds.) THE END
  • Fewer and fewer gamers are following gaming storylines all the way to completion, according to a recent blog post on the IGN Entertainment video game website.
  • "With the increase in the number of media options — or distractions, depending on how you look at them — something has to give, and that something is our attention span." He ticks off a long list of bandied-about terms. Here's a shortened version: cognitive overload; information paralysis; techno stress; and data asphyxiation.
  • Rockmore believes that the way many people learn — or try to learn — these days is via this transporter technique. "The truth is," he says, "that modern pedagogy probably needs to address this in the sense that there is so much information out there, for free, so that obtaining it — even in bits and pieces — is not the challenge, rather integrating it into a coherent whole is. That's a new paradigm."
Javier E

E. O. Wilson's Theory of Everything - Magazine - The Atlantic - 0 views

  • Wilson told me the new proposed evolutionary model pulls the field “out of the fever swamp of kin selection,” and he confidently predicted a coming paradigm shift that would promote genetic research to identify the “trigger” genes that have enabled a tiny number of cases, such as the ant family, to achieve complex forms of cooperation.
  • In the book, he proposes a theory to answer what he calls “the great unsolved problem of biology,” namely how roughly two dozen known examples in the history of life—humans, wasps, termites, platypodid ambrosia beetles, bathyergid mole rats, gall-making aphids, one type of snapping shrimp, and others—made the breakthrough to life in highly social, complex societies. Eusocial species, Wilson noted, are by far “the most successful species in the history of life.”
  • Summarizing parts of it for me, Wilson was particularly unsparing of organized religion, likening the Book of Revelation, for example, to the ranting of “a paranoid schizophrenic who was allowed to write down everything that came to him.” Toward philosophy, he was only slightly kinder. Generation after generation of students have suffered trying to “puzzle out” what great thinkers like Socrates, Plato, and Descartes had to say on the great questions of man’s nature, Wilson said, but this was of little use, because philosophy has been based on “failed models of the brain.”
  • ...6 more annotations...
  • His theory draws upon many of the most prominent views of how humans emerged. These range from our evolution of the ability to run long distances to our development of the earliest weapons, which involved the improvement of hand-eye coordination. Dramatic climate change in Africa over the course of a few tens of thousands of years also may have forced Australopithecus and Homo to adapt rapidly. And over roughly the same span, humans became cooperative hunters and serious meat eaters, vastly enriching our diet and favoring the development of more-robust brains. By themselves, Wilson says, none of these theories is satisfying. Taken together, though, all of these factors pushed our immediate prehuman ancestors toward what he called a huge pre-adaptive step: the formation of the earliest communities around fixed camps.
  • “When humans started having a camp—and we know that Homo erectus had campsites—then we know they were heading somewhere,” he told me. “They were a group progressively provisioned, sending out some individuals to hunt and some individuals to stay back and guard the valuable campsite. They were no longer just wandering through territory, emitting calls. They were on long-term campsites, maybe changing from time to time, but they had come together. They began to read intentions in each other’s behavior, what each other are doing. They started to learn social connections more solidly.”
  • “The humans become consistent with all the others,” he said, and the evolutionary steps were likely similar—beginning with the formation of groups within a freely mixing population, followed by the accumulation of pre-adaptations that make eusociality more likely, such as the invention of campsites. Finally comes the rise to prevalence of eusocial alleles—one of two or more alternative forms of a gene that arise by mutation, and are found at the same place on a chromosome—which promote novel behaviors (like communal child care) or suppress old, asocial traits. Now it is up to geneticists, he adds, to “determine how many genes are involved in crossing the eusociality threshold, and to go find those genes.”
  • Wilson posits that two rival forces drive human behavior: group selection and what he calls “individual selection”—competition at the level of the individual to pass along one’s genes—with both operating simultaneously. “Group selection,” he said, “brings about virtue, and—this is an oversimplification, but—individual selection, which is competing with it, creates sin. That, in a nutshell, is an explanation of the human condition.
  • “Within groups, the selfish are more likely to succeed,” Wilson told me in a telephone conversation. “But in competition between groups, groups of altruists are more likely to succeed. In addition, it is clear that groups of humans proselytize other groups and accept them as allies, and that that tendency is much favored by group selection.” Taking in newcomers and forming alliances had become a fundamental human trait, he added, because “it is a good way to win.”
  • If Wilson is right, the human impulse toward racism and tribalism could come to be seen as a reflection of our genetic nature as much as anything else—but so could the human capacity for altruism, and for coalition- and alliance-building. These latter possibilities may help explain Wilson’s abiding optimism—about the environment and many other matters. If these traits are indeed deeply written into our genetic codes, we might hope that we can find ways to emphasize and reinforce them, to build problem-solving coalitions that can endure, and to identify with progressively larger and more-inclusive groups over time.
Javier E

Logical punctuation: Should we start placing commas outside quotation marks? - 1 views

  • For at least two centuries, it has been standard practice in the United States to place commas and periods inside of quotation marks. This rule still holds for professionally edited prose: what you'll find in Slate, the New York Times, the Washington Post— a
  • in copy-editor-free zones—the Web and emails, student papers, business memos—with increasing frequency, commas and periods find themselves on the outside of quotation marks, looking in. A punctuation paradigm is shifting.
  • you can find copious examples of the "outside" technique—which readers of Virginia Woolf and The Guardian will recognize as the British style—no further away than your Twitter or Facebook feed.
  • ...3 more annotations...
  • the main reason is that the British way simply makes more sense. Indeed, since at least the 1960s a common designation for that style has been "logical punctuation."
  • American style is inconsistent, moreover, because when it comes to other punctuation marks—semicolons, colons, exclamation points, question marks, dashes—we follow British/logical protocol.
  • If it seems hard or even impossible to defend the American way on the merits, that's probably because it emerged from aesthetic, not logical, considerations
Javier E

The Mental Virtues - NYTimes.com - 0 views

  • Even if you are alone in your office, you are thinking. Thinking well under a barrage of information may be a different sort of moral challenge than fighting well under a hail of bullets, but it’s a character challenge nonetheless.
  • some of the cerebral virtues. We can all grade ourselves on how good we are at each of them.
  • love of learning. Some people are just more ardently curious than others, either by cultivation or by nature.
  • ...12 more annotations...
  • courage. The obvious form of intellectual courage is the willingness to hold unpopular views. But the subtler form is knowing how much risk to take in jumping to conclusions.
  • Intellectual courage is self-regulation, Roberts and Wood argue, knowing when to be daring and when to be cautious. The philosopher Thomas Kuhn pointed out that scientists often simply ignore facts that don’t fit with their existing paradigms, but an intellectually courageous person is willing to look at things that are surprisingly hard to look at.
  • The median point between flaccidity and rigidity is the virtue of firmness. The firm believer can build a steady worldview on solid timbers but still delight in new information. She can gracefully adjust the strength of her conviction to the strength of the evidence. Firmness is a quality of mental agility.
  • humility, which is not letting your own desire for status get in the way of accuracy. The humble person fights against vanity and self-importance.
  • wisdom isn’t a body of information. It’s the moral quality of knowing how to handle your own limitations.
  • autonomy
  • Autonomy is the median of knowing when to bow to authority and when not to, when to follow a role model and when not to, when to adhere to tradition and when not to.
  • generosity. This virtue starts with the willingness to share knowledge and give others credit. But it also means hearing others as they would like to be heard, looking for what each person has to teach and not looking to triumphantly pounce upon their errors.
  • thinking well means pushing against the grain of our nature — against vanity, against laziness, against the desire for certainty, against the desire to avoid painful truths. Good thinking isn’t just adopting the right technique. It’s a moral enterprise and requires good character, the ability to go against our lesser impulses for the sake of our higher ones.
  • The humble researcher doesn’t become arrogant toward his subject, assuming he has mastered it. Such a person is open to learning from anyone at any stage in life.
  • Warren Buffett made a similar point in his own sphere, “Investing is not a game where the guy with the 160 I.Q. beats the guy with the 130 I.Q. Once you have ordinary intelligence, what you need is the temperament to control the urges that get other people into trouble.”
  • Good piece. I only wish David had written more about all the forces that work _against_ the virtues he describes. The innumerable examples of corporate suppression/spin of "inconvenient" truths (i.e, GM, Toyota, et al); the virtual acceptance that lying is a legitimate tactic in political campaigns; our preoccupation with celebrity, appearances, and "looking good" in every imaginable transaction; make the quiet virtues that DB describes even more heroic than he suggests.