TOK@ISPrague / Group items tagged aeon

markfrankel18

Why keeping a pet is fundamentally unethical | Aeon Essays - 0 views

  • To say that an animal has a right not to be used as property is simply to say that we have a moral obligation to not use animals as things, even if it would benefit us to do so. With respect to domesticated animals, that means that we stop bringing them into existence altogether. We have a moral obligation to care for those right-holders we have here presently. But we have an obligation not to bring any more into existence. And this includes dogs, cats and other non-humans who serve as our ‘companions’.
  • If animals matter morally, we must recalibrate all aspects of our relationship with them. The issue we must confront is not whether our exploitation of them is ‘humane’ – with all of the concomitant tinkering with the practices of animal-use industries – but rather whether we can justify using them at all.
Philip Drobis

Imitation is what makes us human and creative - Kat McGowan - Aeon - 3 views

  • Throughout human history, innovation – including the technological progress we cherish – has been fuelled and sustained by imitation. Copying is the mighty force that has allowed the human race to move from stone knives to remote-guided drones, from digging sticks to crops that manufacture their own pesticides.
    • Philip Drobis
       
      Imitation is the source of technological advance: by copying others' inventions we can add our own potential improvements, each a small contribution to their advancement.
  • advances happen largely through tinkering, when somebody recreates a good thing with a minor upgrade that makes it slightly better.
  • When Isaac Newton talked about standing on the shoulders of giants, he should have said that we are dwarves, standing atop a vast heap of dwarves.
    • Philip Drobis
       
      This ties to our ability to observe and remember what we see, so that we can then build on it and improve it.
  • Lots of copying means that many minds get their chance at the problem; imitation ‘makes the contents of brains available to everyone’, writes the developmental psychologist Michael Tomasello in the Cultural Origins of Human Cognition (1999). Tomasello, who is co-director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, calls the combination of imitation and innovation the ‘cultural ratchet’. It is like a mechanical ratchet that permits motion in only one direction – such as winding a watch, or walking through a turnstile. Good ideas push the ratchet forward one notch. Faithful imitation keeps the ratchet from slipping backward, protecting ideas from being forgotten or lost and keeping knowledge alive for the next round of improvement.
    • Philip Drobis
       
      Multiple minds are key: when each individual gets a chance at the problem, the cumulative attempts make it far likelier that someone will find a fruitful solution.
  • In the 1930s, a pair of psychologists raised an infant chimp alongside their own baby in an attempt to understand both species better. The chimp raised in this family (and others in other such experiments later in the century) never behaved much like a human. The human child, on the other hand, soon began knuckle-walking, biting, grunting and hooting – just like his new sibling.
    • Philip Drobis
       
      We copy to survive. As the chimp example above suggests, only humans have the drive to imitate so readily. This points to a biological and/or physiological basis for the behaviour.
  • We are imitators from birth.
markfrankel18

Are scientific theories really better when they are simpler? | Aeon Essays - 0 views

  • If all art should be simple or if all art should be complex, the choice is clear. However, both of these norms seem absurd. Isn’t it obvious that some estimable art is simple and some is complex? True, there might be extremes that are beyond the pale; we are alienated by art that is far too complex and bored by art that is far too simple. However, between these two extremes there is a vast space of possibilities.
  • Science is different, at least according to many scientists. Albert Einstein spoke for many when he said that ‘it can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience’. The search for simple theories, then, is a requirement of the scientific enterprise.
eviemcconkie

Dehumanisation is a human universal - David Livingstone Smith - Aeon - 3 views

  • phenomenon of dehumanisation.
  • We dehumanise other people when we conceive of them as subhuman creatures
  • ‘psychological essentialism’ to denote our pervasive and seemingly irrepressible tendency to essentialise categories of things.
  • People the world over segment the animal kingdom into species.
  • When we dehumanise others, we do not simply regard them as non-human. We regard them as less than human. Where does that come from?
  • Attributions of intrinsic value are intimately bound up with beliefs about moral obligation
  • we have developed methods to circumvent and neutralise our own horror at the prospect of spilling human blood.
  • You don’t have to be a monster or a madman to dehumanise others. You just have to be an ordinary human being.
  • A Midsummer Night’s Dream had no trouble understanding that though Bottom’s head looked like that of a donkey, he was really a human being ‘on the inside’, the donkeyish appearance concealing the human essence.
markfrankel18

What the oral histories of Russian Jews reveal about memory | Aeon Essays - 0 views

  • Jewish émigrés from the former Soviet Union tell inconsistent stories. What does this say about the nature of memory?
markfrankel18

Why don't our brains explode at movie cuts? - Jeff Zacks - Aeon - 1 views

  • Throughout evolutionary history, we never saw anything like a montage. So why do we hardly notice the cuts in movies?
  • Simply put, visual perception is much jerkier than we realise. First, we blink. Blinks happen every couple of seconds, and when they do we are blind for a couple of tenths of a second. Second, we move our eyes. Want to have a little fun? Take a close-up selfie video of your eyeball while you watch a minute’s worth of a movie on your computer or TV.
  • Between blinks and saccades, we are functionally blind about a third of our waking life.
  • Worse yet, even when your eyes are open, they are recording a lot less of the world than you realise.
  • There is, however, one situation in which stitching a new view in with the previous one is a bad idea: when the new view represents a transition from one event to another.
  • That makes good evolutionary sense, doesn’t it? If your memory conflicts with what is in front of your eyeballs, the chances are it is your memory that is at fault. So, most of the time your brain is stitching together a succession of views into a coherent event model, and it can handle cuts the same way it handles disruptions such as blinks and saccades in the real world.
  • Our brains do a lot of work to fill in the gaps, which can produce some pretty striking – and entertaining – errors of perception and memory.
  • So now I think we have a story about why our heads don’t explode when we watch movies. It’s not that we have learned how to deal with cuts. It’s certainly not that our brains have evolved biologically to deal with film – the timescale is way too short. Instead, film cuts work because they exploit the ways in which our visual systems evolved to work in the real world.
markfrankel18

There is no language instinct - Vyvyan Evans - Aeon - 0 views

  • Chomsky’s idea dominated the science of language for four decades. And yet it turns out to be a myth. A welter of new evidence has emerged over the past few years, demonstrating that Chomsky is plain wrong.
  • How much sense does it make to call whatever inborn basis for language we might have an ‘instinct’? On reflection, not much.
  • If our knowledge of the rudiments of all the world’s 7,000 or so languages is innate, then at some level they must all be the same. There should be a set of absolute grammatical ‘universals’ common to every one of them. This is not what we have discovered.
  • In a 2002 version, Chomsky and colleagues at Harvard proposed that perhaps all that is unique to the human language capability is a general-purpose computational capacity known as ‘recursion’.
  • While the human brain does exhibit specialisation for processing different genres of information, such as vision, there appears not to be a dedicated spot specialised just for language.
  • And indeed, we now believe that several of Chomsky’s evolutionary assumptions were incorrect.
  • we don’t have to assume a special language instinct; we just need to look at the sorts of changes that made us who we are, the changes that paved the way for speech.
markfrankel18

Morality is the key to personal identity - Nina Strohminger - Aeon - 4 views

  • We tend to think that our memories determine our identity, but it’s moral character that really makes us who we are
  • Nor can you have formal moral systems without identity. The 18th-century philosopher Thomas Reid observed that the fundaments of justice – rights, duty, responsibility – would be impossible without the ability to ascribe stable identity to persons.
  • Why does our identity detector place so much emphasis on moral capacities? These aren’t our most distinctive features.
markfrankel18

We are more rational than those who nudge us - Steven Poole - Aeon - 3 views

  • We are told that we are an irrational tangle of biases, to be nudged any which way. Does this claim stand to reason?
  • A culture that believes its citizens are not reliably competent thinkers will treat those citizens differently to one that respects their reflective autonomy. Which kind of culture do we want to be? And we do have a choice. Because it turns out that the modern vision of compromised rationality is more open to challenge than many of its followers accept.
  • Modern skepticism about rationality is largely motivated by years of experiments on cognitive bias.
  • The thorny question is whether these widespread departures from the economic definition of ‘rationality’ should be taken to show that we are irrational, or whether they merely show that the economic definition of rationality is defective.
  • During the development of game theory and decision theory in the mid-20th century, a ‘rational’ person in economic terms became defined as a lone individual whose decisions were calculated to maximise self-interest, and whose preferences were (logically or mathematically) consistent in combination and over time. It turns out that people are not in fact ‘rational’ in this homo economicus way,
  • There has been some controversy over the correct statistical interpretations of some studies, and several experiments that ostensibly demonstrate ‘priming’ effects, in particular, have notoriously proven difficult to replicate. But more fundamentally, the extent to which such findings can show that we are acting irrationally often depends on what we agree should count as ‘rational’ in the first place.
  • if we want to understand others, we can always ask what is making their behaviour ‘rational’ from their point of view. If, on the other hand, we just assume they are irrational, no further conversation can take place.
  • And so there is less reason than many think to doubt humans’ ability to be reasonable. The dissenting critiques of the cognitive-bias literature argue that people are not, in fact, as individually irrational as the present cultural climate assumes. And proponents of debiasing argue that we can each become more rational with practice. But even if we each acted as irrationally as often as the most pessimistic picture implies, that would be no cause to flatten democratic deliberation into the weighted engineering of consumer choices, as nudge politics seeks to do. On the contrary, public reason is our best hope for survival.
markfrankel18

How can Duchamp's 'Fountain' be both art and not art? | Aeon Essays - 1 views

  • Marcel Duchamp’s ‘Fountain’ is not just a radical kind of art. It’s a philosophical dialetheia: a contradiction that is true
markfrankel18

What if historians started taking the 'what if' seri... - 1 views

  • ‘“What if?” is a waste of time’ went the headline to the Cambridge historian Richard Evans’ piece in The Guardian last year. Surveying the many instances of public counterfactual discourse in the anniversary commemorations of the First World War, Evans wrote: ‘This kind of fantasising is now all the rage, and threatens to overwhelm our perceptions of what really happened in the past, pushing aside our attempts to explain it in favour of a futile and misguided attempt to decide whether the decisions taken in August 1914 were right or wrong.’
  • But hold on a minute.
  • If well-done counterfactuals can help us think them through, shouldn’t we allow what-ifs some space at the history table?
  • What is worse, counterfactual speculations spring naturally from deeply conservative assumptions about what makes history tick. Like bestselling popular histories, counterfactuals usually take as their subjects war, biography or an old-school history of technology that emphasises the importance of the inventor.
  • Women – as individuals, or as a group – almost never appear, and social, cultural, and environmental history are likewise absent. Evans, for his part, thinks this is because complex cultural topics are not easy to understand through the simplifying lens of the ‘what if’.
  • Counterfactuals, if done well, can force a super-meticulous look at the way historians use evidence. And counterfactuals can encourage readers to think about the contingent nature of history – an exercise that can help build empathy and diminish feelings of national, cultural, and racial exceptionalism.
  • Historians who refuse to engage with counterfactuals miss an opportunity to talk about history in a way that makes intuitive sense to non-historians, while introducing theories about evidence, causality and contingency into the mix. The best characteristic of well-done counterfactuals might, in fact, be the way that they make the artfulness inherent in writing history more evident. After all, even the most careful scholar or author employs some kind of selective process in coming up with a narrative, a set of questions or an argument.