
TOK Friends: Group items tagged "stages"


Javier E

A 5-Step Technique for Producing Ideas circa 1939 | Brain Pickings - 0 views

  • In learning any art the important things to learn are, first, Principles, and second, Method. This is true of the art of producing ideas. Particular bits of knowledge are nothing, because they are made up [of] so called rapidly aging facts. Principles and method are everything.
  • So with the art of producing ideas. What is most valuable to know is not where to look for a particular idea, but how to train the mind in the method by which all ideas are produced and how to grasp the principles which are at the source of all ideas.
  • Every really good creative person…whom I have ever known has always had two noticeable characteristics. First, there was no subject under the sun in which he could not easily get interested — from, say, Egyptian burial customs to modern art. Every facet of life had fascination for him. Second, he was an extensive browser in all sorts of fields of information.
  • What you do is to take the different bits of material which you have gathered and feel them all over, as it were, with the tentacles of the mind. You take one fact, turn it this way and that, look at it in different lights, and feel for the meaning of it. You bring two facts together and see how they fit. What you are seeking now is the relationship, a synthesis where everything will come together in a neat combination, like a jig-saw puzzle.
  • Then and only then, Young promises, everything will click in the fourth stage of the seemingly serendipitous a-ha! moment: Out of nowhere the Idea will appear. It will come to you when you are least expecting it
  • In his third stage of the creative process, Young stresses the importance of making absolutely “no effort of a direct nature”: It is important to realize that this is just as definite and just as necessary a stage in the process as the two preceding ones. What you have to do at this time, apparently, is to turn the problem over to your unconscious mind and let it work while you sleep.
  • [W]hen you reach this third stage in the production of an idea, drop the problem completely and turn to whatever stimulates your imagination and emotions. Listen to music, go to the theater or movies, read poetry or a detective story.
  • articulates the increasing importance of quality information filters in our modern information diet. This notion of gathering raw material is the first step in his outline of the creative process:
  • Young calls the last stage “the cold, gray dawn of the morning after,” when your newborn idea has to face reality: It requires a deal of patient working over to make most ideas fit the exact conditions, or the practical exigencies, under which they must work. And here is where many good ideas are lost. The idea man, like the inventor, is often not patient enough or practical enough to go through with this adapting part of the process. But it has to be done if you are to put ideas to work in a work-a-day world. Do not make the mistake of holding your idea close to your chest at this stage. Submit it to the criticism of the judicious. When you do, a surprising thing will happen. You will find that a good idea has, as it were, self-expanding qualities. It stimulates those who see it to add to it. Thus possibilities in it which you have overlooked will come to light.
  • what’s perhaps most interesting is the following note he made to the postscript of a reprint: From my own further experience in advertising, government, and public affairs I find no essential points which I would modify in the idea-producing process. There is one, however, on which I would put greater emphasis. This is as to the store of general materials in the idea-producer’s reservoir. […] I am convinced, however, that you gather this vicarious experience best, not when you are boning up on it for an immediate purpose, but when you are pursuing it as an end in itself.
charlottedonoho

How can we best assess the neuropsychological effects of violent video game play? | Pet... - 0 views

  • Every time a research paper about violent video games makes it into the news, it feels like we’re in a time loop. Any claims that the study makes about the potential positive or (usually) negative effects of playing games tend to get over-egged to the point of ridiculousness.
  • At best, the measures of aggression that are used in such work are unstandardised; at worst, the field has been shown to be riddled with basic methodological and analytical flaws. These problems are further compounded by entrenched ideologies and a reluctance from some researchers to even talk to their ‘adversaries’, let alone discuss the potential for adversarial collaborations
  • All of this means that we’re stuck at an impasse with violent video games research; it feels like we’re no more clued up on what the actual behavioural effects are now than, say, five or ten years ago.
  • In stage 1, they submit the introduction, methods, proposed analysis, and if necessary, pilot data. This manuscript then goes through the usual peer review process, and is assessed on criteria such as the soundness of the methods and analysis, and overall plausibility of the stated hypotheses.
  • Once researchers have passed through stage 1, they can then move on to data collection. In stage 2, they then submit the full manuscript – the introduction and agreed methods from stage 1, plus results and discussion sections. The results must include the outcome of the analyses agreed in stage 1, but the researchers are allowed to include additional analyses in a separate, ‘exploratory’ section (as long as they are justified).
  • Pre-registering scientific articles in this way helps to protect against a number of undesirable practices (such as p-hacking and HARKing) that can exaggerate statistical findings and make non-existent effects seem real. While this is a problem across psychology generally, it is a particularly extreme problem for violent video game research.
  • By outlining the intended methods and analysis protocols beforehand, Registered Reports protect against these problems, as the review process concentrates on the robustness of the proposed methods. And Registered Reports offer an additional advantage: because manuscripts are never accepted based on the outcome of the data analysis, the process is immune to researcher party lines. It doesn’t matter which research ‘camp’ you are in; your data – and just as importantly, your methods - will speak for themselves.
Javier E

Wine-tasting: it's junk science | Life and style | The Observer - 0 views

  • Hodgson approached the organisers of the California State Fair wine competition, the oldest contest of its kind in North America, and proposed an experiment for their annual June tasting sessions. Each panel of four judges would be presented with their usual "flight" of samples to sniff, sip and slurp. But some wines would be presented to the panel three times, poured from the same bottle each time. The results would be compiled and analysed to see whether wine tasting really is scientific.
  • Results from the first four years of the experiment, published in the Journal of Wine Economics, showed a typical judge's scores varied by plus or minus four points over the three blind tastings. A wine deemed to be a good 90 would be rated as an acceptable 86 by the same judge minutes later and then an excellent 94.
  • Hodgson's findings have stunned the wine industry. Over the years he has shown again and again that even trained, professional palates are terrible at judging wine. "The results are disturbing," says Hodgson from the Fieldbrook Winery in Humboldt County, described by its owner as a rural paradise. "Only about 10% of judges are consistent and those judges who were consistent one year were ordinary the next year." "Chance has a great deal to do with the awards that wines win."
  • French academic Frédéric Brochet tested the effect of labels in 2001. He presented the same Bordeaux superior wine to 57 volunteers a week apart and in two different bottles – one for a table wine, the other for a grand cru. The tasters were fooled. When tasting a supposedly superior wine, their language was more positive – describing it as complex, balanced, long and woody. When the same wine was presented as plonk, the critics were more likely to use negatives such as weak, light and flat.
  • In 2011 Professor Richard Wiseman, a psychologist (and former professional magician) at Hertfordshire University, invited 578 people to comment on a range of red and white wines, varying from £3.49 for a claret to £30 for champagne, and tasted blind. People could tell the difference between wines under £5 and those above £10 only 53% of the time for whites and only 47% of the time for reds. Overall they would have been just as successful flipping a coin to guess. (A quick back-of-the-envelope check of this claim appears after this list.)
  • why are ordinary drinkers and the experts so poor at tasting blind? Part of the answer lies in the sheer complexity of wine. For a drink made by fermenting fruit juice, wine is a remarkably sophisticated chemical cocktail. Dr Bryce Rankine, an Australian wine scientist, identified 27 distinct organic acids in wine, 23 varieties of alcohol in addition to the common ethanol, more than 80 esters and aldehydes, 16 sugars, plus a long list of assorted vitamins and minerals that wouldn't look out of place on the ingredients list of a cereal pack. There are even harmless traces of lead and arsenic that come from the soil.
  • "People underestimate how clever the olfactory system is at detecting aromas and our brain is at interpreting them," says Hutchinson."The olfactory system has the complexity in terms of its protein receptors to detect all the different aromas, but the brain response isn't always up to it. But I'm a believer that everyone has the same equipment and it comes down to learning how to interpret it." Within eight tastings, most people can learn to detect and name a reasonable range of aromas in wine
  • People struggle with assessing wine because the brain's interpretation of aroma and bouquet is based on far more than the chemicals found in the drink. Temperature plays a big part. Volatiles in wine are more active when wine is warmer. Serve a New World chardonnay too cold and you'll only taste the overpowering oak. Serve a red too warm and the heady boozy qualities will be overpowering.
  • Colour affects our perceptions too. In 2001 Frédéric Brochet of the University of Bordeaux asked 54 wine experts to test two glasses of wine – one red, one white. Using the typical language of tasters, the panel described the red as "jammy" and commented on its crushed red fruit. The critics failed to spot that both wines were from the same bottle. The only difference was that one had been coloured red with a flavourless dye
  • Other environmental factors play a role. A judge's palate is affected by what she or he had earlier, the time of day, their tiredness, their health – even the weather.
  • Robert Hodgson is determined to improve the quality of judging. He has developed a test that will determine whether a judge's assessment of a blind-tasted glass in a medal competition is better than chance. The research will be presented at a conference in Cape Town this year. But the early findings are not promising. "So far I've yet to find someone who passes," he says.
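
A minimal back-of-the-envelope check (not from the article) of the coin-flip claim above, sketched in Python. It assumes, hypothetically, 578 independent judgements per wine colour (the excerpt gives 578 participants but not the exact number of tastings) and simply asks how far 53% and 47% sit from pure chance.

    import math

    def z_vs_coin(successes: int, trials: int) -> float:
        """Z-score of an observed success count against a fair coin (p = 0.5)."""
        p_hat = successes / trials
        se = math.sqrt(0.25 / trials)  # standard error of a proportion when p = 0.5
        return (p_hat - 0.5) / se

    n = 578  # hypothetical number of judgements per colour
    for label, rate in [("whites", 0.53), ("reds", 0.47)]:
        z = z_vs_coin(round(rate * n), n)
        print(f"{label}: observed {rate:.0%}, z = {z:+.2f}")
    # Both z-scores come out near +/-1.4, inside the conventional +/-1.96 cutoff,
    # i.e. statistically indistinguishable from guessing.
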
oliviaodon

What Your Brain Looks Like When It Solves a Math Problem - The New York Times - 0 views

  • The imaging analysis found four stages in all: encoding (downloading), planning (strategizing), solving (performing the math), and responding (typing out an answer).
  • The analysis found four separate stages that, depending on the problem, varied in length by a second or more. For instance, planning took up more time than the other stages when a clever workaround was required. The same stages are likely applicable to solving many creative problems, not just in math. But knowing how they play out in the brain should help in designing curriculums, especially in mathematics, the paper suggests.
Javier E

Opinion | Your Kid's Existential Dread Is Normal - The New York Times - 0 views

  • my daughter said: “When the pandemic started, I was only 7, and I wasn’t scared. Now I’m 9 and I really understand.”
  • I called Sally Beville Hunter, a clinical associate professor of child and family studies at the University of Tennessee, to see if this kind of philosophical musing was typical for a young tween. “There’s a huge cognitive transition happening” around this age, Hunter told me.
  • It’s the stage when children develop the capacity for abstract thought, she said. The pioneering developmental psychologist Jean Piaget called this transition the “formal operational stage,” and in his research he found it began around age 11, but Hunter said subsequent research has found that it may begin earlier. “It’s the first time children can consider multiple possibilities and test them against each other,” she said. Which helps explain why my daughter has begun thinking about whether Covid will linger into her college years, a decade from now.
  • Another aspect of development that may be happening for her is a stage that the psychologist Erik Erikson called “identity versus role diffusion” (also referred to as “role confusion”), which is shorthand for children figuring out their position in the world. “This is the first time when kids have questions about their own existence, questions about self-identity, the meaning of life and the changing role of authority,” Hunter said.
maxwellokolo

Brain Implant Eases Communication by Late-Stage A.L.S. Patient - 0 views

  •  
    Researchers have designed a system that lets a patient with late-stage Lou Gehrig's disease type words using brain signals alone. The patient, Hanneke De Bruijne, a doctor of internal medicine from the Netherlands, received a diagnosis of amyotrophic lateral sclerosis, also known as A.L.S. or Lou Gehrig's disease, in 2008.
sissij

The Stem-Cell Revolution Is Coming - Slowly - The New York Times - 3 views

  • In 2001, President George W. Bush issued an executive order banning federal funding for new sources of stem cells developed from preimplantation human embryos. The action stalled research and discouraged scientists.
  • re-energized the field by devising a technique to “reprogram” any adult cell, such as a skin cell, and coax it back to its earliest “pluripotent” stage. From there it can become any type of cell, from a heart muscle cell to a neuron.
  • But it’s a double-edged sword. After multiple cell cycles, the chances of mutations increase.
  •  
    In Biology, we learned that the study of stem cells had been halted because of the ethical issue of whether embryos should be counted as human life. Now there is a new technique that can induce a skin cell back to its earliest "pluripotent" stage. With this technique, the study of stem cells can continue and flourish to benefit patients who need new cells that aren't mutated. It's surprising to see how fast science is progressing. The science we learn at school might not be the most up-to-date science. --Sissi (1/17/2017)
sissij

Sleeping Wipes Out Certain Memories - And That's a Good Thing, Reveal Studies | Big Think - 0 views

  • But what is its evolutionary purpose – what kind of changes do our brains undergo when we sleep?
  • suggest our brains undergo a pruning cycle while we rest.
  • It's important to note these studies are still in their early stages. The tests were done on mice.
  • letting us forget the less relevant information while strengthening memories that may be important.
  • However, modern humans don't abide by a natural sleep cycle anymore – we look at our phones before bed and expose ourselves to things that cause our brains to think sleep is not on the menu.
  • they might not require a chemical crutch to get some rest.
  •  
    This article shows how unreliable our memory is. Every night when we go to sleep, our memory is edited and our brain deletes some irrelevant things. So our memory is not a primary source, and I think witness testimony in court can only be a reference, not direct evidence. Also, in this article, the author states the uncertainties and limits of the experiment, showing that the result at this stage can only serve as a suggestion, not direct evidence. --Sissi (2/7/2017)
Emilio Ergueta

Performance Is The Thing | Issue 57 | Philosophy Now - 1 views

  • The definition of philosophy is pretty much set – love of wisdom, the rational investigation of questions about existence and knowledge and ethics, the application of reason towards a more enlightened way of life. How can we understand performance in equally clear terms? Also, what are the responsibilities of the performer?
  • How can my performances enrich my life and the lives of my audiences? What is my own personal philosophy of performance?
  • Performance can fundamentally be said to be a transformation of ideas and dreams and all those other little understood human impulses into outward action. In this very basic sense performance happens with every word and gesture. It also presupposes a process of evaluation by a spectator.
  • There’s one very basic thing I learnt that day: never take your writing on stage on a flimsy bit of paper. Always use something hardbacked. Even if your hands start shaking, it won’t be so noticeable.
  • I determined that I was going to break through whatever it was that made me so anxious to stand up in front of a room full of people, and simply be. But I also learnt some other more complex philosophical lessons too, which became clearer over time. In fact, whatever philosophy I have developed about performance has stemmed from that moment.
  • When I am performing, there’s a desire I can taste to bridge the gap in understanding between me and my audience. I want to find new ways, new language, verbal and non-verbal to express universal truths. I want to push the challenge of understanding deeper, for me and my audience.
  • Romantic aestheticians would have it that art, and by extension, performance, is a heightening of the common human activity of expressing emotions to the point where they are experienced and rendered lucid to the performer and audience in a way that is rarely seen in everyday life.
  • I say ‘I am a writer, poet, performer,’ but this self-definition is in itself a kind of performance.
  • One philosophical problem is that performance on the stage of life doesn’t have a beginning, middle and end in the way it does in the theatre. So after a stage performance, when we discard the magnifying glass for the clutter of the quotidian, it’s all too easy to forget to zero in – to forget that we are still and are always performing, and that life is constantly a performance: that we are, in fact, only performing from moment to moment.
  • understanding process can add to our understanding of how we can live our lives for the better and, dare I say, the greater good.
edencottone

What to Know About the World's Top Covid-19 Vaccines - The New York Times - 0 views

  • Novavax
  • Johnson & Johnson
  • Russia’s Sputnik V
  • Oxford-AstraZeneca
  • Pfizer-BioNTech
  • Chinese Vaccines
  • Moderna
  • Scientists have developed dozens of Covid-19 vaccines at record speed.
  • Efficacy: Unknown
  • pledged to be the primary vaccine provider for the developing world.
  • claims that its vaccine has a 79 percent efficacy rate, though it has not provided data.
  • expected to release results from its late-stage clinical trial this month
  • is estimated to have an efficacy rate between 63 and 78 percent
  • made waves in August when early data showed that its Covid vaccine prompted a surprisingly robust immune response in people and monkeys.
  • first Covid-19 vaccine to get emergency authorization in the United States
  • offers terrific protection against Covid-19 and sometimes comes with mild side effects.
  • in Britain, India and several other countries
  • Sputnik V vaccine has an efficacy rate of 91.4 percent.
  • could have a big impact on the pace of vaccinations in the U.S. because it is given in one dose instead of two.
  • cheap and easily stored
  • though data from late-stage trials has not yet been shared publicly.
  • Many scientists were puzzled, however, by data showing that its efficacy may depend on the strength of the initial dose or the gap between doses.
  • Moderna’s does not need to be stored at ultracold temperatures, making it better suited for smaller clinics and remote areas.
  • No serious health problems have been linked to the shot, though some people get fatigue, fever and muscle aches.
  • run into big delays
  • being distributed in Argentina, Belarus and other countries.
  • final stage of testing in the U.S. in late December.
Javier E

How to Remember Everything You Want From Non-Fiction Books | by Eva Keiffenheim, MSc | ... - 0 views

  • A Bachelor’s degree taught me how to learn to ace exams. But it didn’t teach me how to learn to remember.
  • 65% to 80% of students answered “no” to the question “Do you study the way you do because somebody taught you to study that way?”
  • the most-popular Coursera course of all time: Dr. Barbara Oakley’s free course on “Learning How to Learn.” So did I. And while this course taught me about chunking, recalling, and interleaving
  • I learned something more useful: the existence of non-fiction literature that can teach you anything.
  • something felt odd. Whenever a conversation revolved around a serious non-fiction book I read, such as ‘Sapiens’ or ‘Thinking Fast and Slow,’ I could never remember much. Turns out, I hadn’t absorbed as much information as I’d believed. Since I couldn’t remember much, I felt as though reading wasn’t an investment in knowledge but mere entertainment.
  • When I opened up about my struggles, many others confessed they also can’t remember most of what they read, as if forgetting is a character flaw. But it isn’t.
  • It’s the way we work with books that’s flawed.
  • there’s a better way to read. Most people rely on techniques like highlighting, rereading, or, worst of all, completely passive reading, which are highly ineffective.
  • Since I started applying evidence-based learning strategies to reading non-fiction books, many things have changed. I can explain complex ideas during dinner conversations. I can recall interesting concepts and link them in my writing or podcasts. As a result, people come to me for all kinds of advice.
  • What’s the Architecture of Human Learning and Memory?
  • Human brains don’t work like recording devices. We don’t absorb information and knowledge by reading sentences.
  • we store new information in terms of its meaning to our existing memory
  • we give new information meaning by actively participating in the learning process — we interpret, connect, interrelate, or elaborate
  • To remember new information, we not only need to know it but also to know how it relates to what we already know.
  • Learning is dependent on memory processes because previously-stored knowledge functions as a framework in which newly learned information can be linked.”
  • Human memory works in three stages: acquisition, retention, and retrieval. In the acquisition phase, we link new information to existing knowledge; in the retention phase, we store it, and in the retrieval phase, we get information out of our memory.
  • Retrieval, the third stage, is cue dependent. This means the more mental links you’re generating during stage one, the acquisition phase, the easier you can access and use your knowledge.
  • we need to understand that the three phases interrelate
  • creating durable and flexible access to to-be-learned information is partly a matter of achieving a meaningful encoding of that information and partly a matter of exercising the retrieval process.”
  • Next, we’ll look at the learning strategies that work best for our brains (elaboration, retrieval, spaced repetition, interleaving, self-testing) and see how we can apply those insights to reading non-fiction books.
  • The strategies that follow are rooted in research from professors of Psychological & Brain Science around Henry Roediger and Mark McDaniel. Both scientists spent ten years bridging the gap between cognitive psychology and education fields. Harvard University Press published their findings in the book ‘Make It Stick.
  • #1 Elaboration
  • “Elaboration is the process of giving new material meaning by expressing it in your own words and connecting it with what you already know.”
  • Why elaboration works: Elaborative rehearsal encodes information into your long-term memory more effectively. The more details and the stronger you connect new knowledge to what you already know, the better because you’ll be generating more cues. And the more cues they have, the easier you can retrieve your knowledge.
  • How I apply elaboration: Whenever I read an interesting section, I pause and ask myself about the real-life connection and potential application. The process is invisible, and my inner monologues sound like: “This idea reminds me of…, This insight conflicts with…, I don’t really understand how…, ” etc.
  • For example, when I learned about A/B testing in ‘The Lean Startup,’ I thought about applying this method to my startup. I added a note on the site stating we should try it in user testing next Wednesday. Thereby the book had an immediate application benefit to my life, and I will always remember how the methodology works.
  • How you can apply elaboration: Elaborate while you read by asking yourself meta-learning questions like “How does this relate to my life? In which situation will I make use of this knowledge? How does it relate to other insights I have on the topic?”
  • While pausing and asking yourself these questions, you’re generating important memory cues. If you take some notes, don’t transcribe the author’s words but try to summarize, synthesize, and analyze.
  • #2 Retrieval
  • With retrieval, you try to recall something you’ve learned in the past from your memory. While retrieval practice can take many forms — take a test, write an essay, do a multiple-choice test, practice with flashcards
  • the authors of ‘Make It Stick’ state: “While any kind of retrieval practice generally benefits learning, the implication seems to be that where more cognitive effort is required for retrieval, greater retention results.”
  • Whatever you settle for, be careful not to copy/paste the words from the author. If you don’t do the brain work yourself, you’ll skip the learning benefits of retrieval
  • Retrieval strengthens your memory and interrupts forgetting and, as other researchers replicate, as a learning event, the act of retrieving information is considerably more potent than is an additional study opportunity, particularly in terms of facilitating long-term recall.
  • How I apply retrieval: I retrieve a book’s content from my memory by writing a book summary for every book I want to remember. I ask myself questions like: “How would you summarize the book in three sentences? Which concepts do you want to keep in mind or apply? How does the book relate to what you already know?”
  • I then publish my summaries on Goodreads or write an article about my favorite insights
  • How you can apply retrieval: You can come up with your own questions or use mine. If you don’t want to publish your summaries in public, you can write a summary into your journal, start a book club, create a private blog, or initiate a WhatsApp group for sharing book summaries.
  • a few days after we learn something, forgetting sets in
  • #3 Spaced Repetition
  • With spaced repetition, you repeat the same piece of information across increasing intervals. (A minimal sketch of such an interval schedule appears after this list.)
  • The harder it feels to recall the information, the stronger the learning effect. “Spaced practice, which allows some forgetting to occur between sessions, strengthens both the learning and the cues and routes for fast retrieval,”
  • Why it works: It might sound counterintuitive, but forgetting is essential for learning. Spacing out practice might feel less productive than rereading a text because you’ll realize what you forgot. Your brain has to work harder to retrieve your knowledge, which is a good indicator of effective learning.
  • How I apply spaced repetition: After some weeks, I revisit a book and look at the summary questions (see #2). I try to come up with my answer before I look up my actual summary. I can often only remember a fraction of what I wrote and have to look at the rest.
  • “Knowledge trapped in books neatly stacked is meaningless and powerless until applied for the betterment of life.”
  • How you can apply spaced repetition: You can revisit your book summary medium of choice and test yourself on what you remember. What were your action points from the book? Have you applied them? If not, what hindered you?
  • By testing yourself in varying intervals on your book summaries, you’ll strengthen both learning and cues for fast retrieval.
  • Why interleaving works: Alternating between different problems feels more difficult as it, again, facilitates forgetting.
  • How I apply interleaving: I read different books at the same time.
  • 1) Highlight everything you want to remember
  • #5 Self-Testing
  • While reading often falsely tricks us into perceived mastery, testing shows us whether we truly mastered the subject at hand. Self-testing helps you identify knowledge gaps and brings weak areas to the light
  • “It’s better to solve a problem than to memorize a solution.”
  • Why it works: Self-testing helps you overcome the illusion of knowledge. “One of the best habits a learner can instill in herself is regular self-quizzing to recalibrate her understanding of what she does and does not know.”
  • How I apply self-testing: I explain the key lessons from non-fiction books I want to remember to others. Thereby, I test whether I really got the concept. Often, I didn’t
  • instead of feeling frustrated, cognitive science made me realize that identifying knowledge gaps is a desirable and necessary effect for long-term remembering.
  • How you can apply self-testing: Teaching your lessons learned from a non-fiction book is a great way to test yourself. Before you explain a topic to somebody, you have to combine several mental tasks: filter relevant information, organize this information, and articulate it using your own vocabulary.
  • Now that I discovered how to use my Kindle as a learning device, I wouldn’t trade it for a paper book anymore. Here are the four steps it takes to enrich your e-reading experience
  • How you can apply interleaving: Your brain can handle reading different books simultaneously, and it’s effective to do so. You can start a new book before you finish the one you’re reading. Starting again into a topic you partly forgot feels difficult first, but as you know by now, that’s the effect you want to achieve.
  • it won’t surprise you that researchers proved highlighting to be ineffective. It’s passive and doesn’t create memory cues.
  • 2) Cut down your highlights in your browser
  • After you finished reading the book, you want to reduce your highlights to the essential part. Visit your Kindle Notes page to find a list of all your highlights. Using your desktop browser is faster and more convenient than editing your highlights on your e-reading device.
  • Now, browse through your highlights, delete what you no longer need, and add notes to the ones you really like. By adding notes to the highlights, you’ll connect the new information to your existing knowledge
  • 3) Use software to practice spaced repetition. This part is the main reason for e-books beating printed books. While you can do all of the above with a little extra time on your physical books, there’s no way to systemize your repetition praxis.
  • Readwise is the best software to combine spaced repetition with your e-books. It’s an online service that connects to your Kindle account and imports all your Kindle highlights. Then, it creates flashcards of your highlights and allows you to export your highlights to your favorite note-taking app.
  • Common Learning Myths Debunked. While reading and studying evidence-based learning techniques I also came across some things I wrongly believed to be true.
  • #2 Effective learning should feel easy. We think learning works best when it feels productive. That’s why we continue to use ineffective techniques like rereading or highlighting. But learning works best when it feels hard, or as the authors of ‘Make It Stick’ write: “Learning that’s easy is like writing in sand, here today and gone tomorrow.”
  • In Conclusion
  • I developed and adjusted these strategies over two years, and they’re still a work in progress.
  • Try all of them but don’t force yourself through anything that doesn’t feel right for you. I encourage you to do your own research, add further techniques, and skip what doesn’t serve you
  • “In the case of good books, the point is not to see how many of them you can get through, but rather how many can get through to you.”— Mortimer J. Adler
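
A minimal sketch (not from the article or from 'Make It Stick') of the "increasing intervals" idea behind spaced repetition, written in Python. The starting gap and growth factor are arbitrary illustrative values, not the schedule of any particular flashcard app.

    from datetime import date, timedelta

    def review_dates(start: date, n_reviews: int = 5,
                     first_gap: float = 1.0, growth: float = 2.5) -> list[date]:
        """Review dates whose gaps grow geometrically (roughly 1, 2, 6, 16, 39 days)."""
        dates, gap, day = [], first_gap, start
        for _ in range(n_reviews):
            day += timedelta(days=round(gap))  # schedule the next review further out
            dates.append(day)
            gap *= growth  # allow some forgetting before the following review
        return dates

    print([d.isoformat() for d in review_dates(date(2024, 1, 1))])
    # ['2024-01-02', '2024-01-04', '2024-01-10', '2024-01-26', '2024-03-05']

Each later review lands after a longer gap, so recall feels harder, which is the kind of desirable difficulty the excerpts above describe.
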
Javier E

Opinion | Trump's Tweeting Isn't Crazy. It's Strategic, Typos and All. - The New York T... - 0 views

  • And Mr. Trump’s typo? It was surely not accidental. That extra “i” circumvented Twitter’s efforts to hide the hashtag in search results. Called #typosquatting, this tactic is often used by trolls and media manipulators to get around the rules of social media platforms. (A simplified illustration of how such a one-character variant slips past an exact-match filter appears after this list.)
  • Constant repetition makes the charge sound true, and blunts accusations of unethical behavior against Mr. Trump’s own children. With the hashtag, Mr. Trump found a way to tell supporters: Here is all you need to know about the Democratic nominee.
  • The hashtag took a complicated issue with legitimate questions about Hunter Biden’s business dealings with Ukraine and China — and reduced it to a slogan that could also be used to spread falsehoods about Joe Biden
  • #BidenCrimeFamily, and the typo, is a crash course in how to rally supporters around a conspiracy theory — while neutering the attempts of social media companies to stop it. Mr. Trump has used this same tactic to sow doubt about mail-in ballots and the integrity of the election.
  • On Wednesday afternoon, with the presidential race unresolved, a protester in Nevada interrupted an election official’s news conference by yelling, “The Biden crime family is stealing the election!”
  • #BidenCrimeFamily is part of a yearlong, effective disinformation campaign against Joe Biden. In the final days of the presidential race, the hashtag was used on Twitter and Facebook, as well as the darker parts of the web, including 4chan and Parler. It was repeated in the right-wing media ecosystem, like Steve Bannon’s podcast and The Gateway Pundit.
  • In the last month, on Facebook alone, it reached at least 277,000 people, according to CrowdTangle — and that’s only on non-private pages
  • Stage 2: Seeding on Social Media. In any manipulation campaign, the second stage involves campaign operators strategically spreading the hashtag across the media ecosystem.
  • Stage 1: Origins. For more than a year, right-wing media and partisans have pushed “Biden crime family” as a viral slogan. Media manipulation campaigns are usually conjured in small, hidden spaces by a few operators with an agenda. But in this case, it was influential media and political personalities who got the ball rolling.
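
A simplified illustration (not Twitter's actual system) of why a one-character "typosquat" slips past an exact-match hashtag block, as described above; the blocklist and matching rule here are hypothetical.

    BLOCKED_HASHTAGS = {"#bidencrimefamily"}  # hypothetical exact-match blocklist

    def is_hidden(hashtag: str) -> bool:
        """Hide a tag only if it matches the blocklist exactly (case-insensitive)."""
        return hashtag.lower() in BLOCKED_HASHTAGS

    print(is_hidden("#BidenCrimeFamily"))   # True  -> the original tag is suppressed
    print(is_hidden("#BidenCriimeFamily"))  # False -> the extra "i" evades the filter
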
Javier E

Opinion | You Are the Object of Facebook's Secret Extraction Operation - The New York T... - 0 views

  • Facebook is not just any corporation. It reached trillion-dollar status in a single decade by applying the logic of what I call surveillance capitalism — an economic system built on the secret extraction and manipulation of human data
  • Facebook and other leading surveillance capitalist corporations now control information flows and communication infrastructures across the world.
  • These infrastructures are critical to the possibility of a democratic society, yet our democracies have allowed these companies to own, operate and mediate our information spaces unconstrained by public law.
  • The result has been a hidden revolution in how information is produced, circulated and acted upon
  • The world’s liberal democracies now confront a tragedy of the “un-commons.” Information spaces that people assume to be public are strictly ruled by private commercial interests for maximum profit.
  • The internet as a self-regulating market has been revealed as a failed experiment. Surveillance capitalism leaves a trail of social wreckage in its wake: the wholesale destruction of privacy, the intensification of social inequality, the poisoning of social discourse with defactualized information, the demolition of social norms and the weakening of democratic institutions.
  • These social harms are not random. They are tightly coupled effects of evolving economic operations. Each harm paves the way for the next and is dependent on what went before.
  • There is no way to escape the machine systems that surveil us
  • All roads to economic and social participation now lead through surveillance capitalism’s profit-maximizing institutional terrain, a condition that has intensified during nearly two years of global plague.
  • Will Facebook’s digital violence finally trigger our commitment to take back the “un-commons”?
  • Will we confront the fundamental but long ignored questions of an information civilization: How should we organize and govern the information and communication spaces of the digital century in ways that sustain and advance democratic values and principles?
  • Mark Zuckerberg’s start-up did not invent surveillance capitalism. Google did that. In 2000, when only 25 percent of the world’s information was stored digitally, Google was a tiny start-up with a great search product but little revenue.
  • By 2001, in the teeth of the dot-com bust, Google’s leaders found their breakthrough in a series of inventions that would transform advertising. Their team learned how to combine massive data flows of personal information with advanced computational analyses to predict where an ad should be placed for maximum “click through.”
  • Google’s scientists learned how to extract predictive metadata from this “data exhaust” and use it to analyze likely patterns of future behavior.
  • Prediction was the first imperative that determined the second imperative: extraction.
  • Lucrative predictions required flows of human data at unimaginable scale. Users did not suspect that their data was secretly hunted and captured from every corner of the internet and, later, from apps, smartphones, devices, cameras and sensors
  • User ignorance was understood as crucial to success. Each new product was a means to more “engagement,” a euphemism used to conceal illicit extraction operations.
  • When asked “What is Google?” the co-founder Larry Page laid it out in 2001,
  • “Storage is cheap. Cameras are cheap. People will generate enormous amounts of data,” Mr. Page said. “Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable.”
  • Instead of selling search to users, Google survived by turning its search engine into a sophisticated surveillance medium for seizing human data
  • Company executives worked to keep these economic operations secret, hidden from users, lawmakers, and competitors. Mr. Page opposed anything that might “stir the privacy pot and endanger our ability to gather data,” Mr. Edwards wrote.
  • As recently as 2017, Eric Schmidt, the executive chairman of Google’s parent company, Alphabet, acknowledged the role of Google’s algorithmic ranking operations in spreading corrupt information. “There is a line that we can’t really get across,” he said. “It is very difficult for us to understand truth.” A company with a mission to organize and make accessible all the world’s information using the most sophisticated machine systems cannot discern corrupt information.
  • This is the economic context in which disinformation wins
  • In March 2008, Mr. Zuckerberg hired Google’s head of global online advertising, Sheryl Sandberg, as his second in command. Ms. Sandberg had joined Google in 2001 and was a key player in the surveillance capitalism revolution. She led the build-out of Google’s advertising engine, AdWords, and its AdSense program, which together accounted for most of the company’s $16.6 billion in revenue in 2007.
  • A Google multimillionaire by the time she met Mr. Zuckerberg, Ms. Sandberg had a canny appreciation of Facebook’s immense opportunities for extraction of rich predictive data. “We have better information than anyone else. We know gender, age, location, and it’s real data as opposed to the stuff other people infer,” Ms. Sandberg explained
  • The company had “better data” and “real data” because it had a front-row seat to what Mr. Page had called “your whole life.”
  • Facebook paved the way for surveillance economics with new privacy policies in late 2009. The Electronic Frontier Foundation warned that new “Everyone” settings eliminated options to restrict the visibility of personal data, instead treating it as publicly available information.
  • Mr. Zuckerberg “just went for it” because there were no laws to stop him from joining Google in the wholesale destruction of privacy. If lawmakers wanted to sanction him as a ruthless profit-maximizer willing to use his social network against society, then 2009 to 2010 would have been a good opportunity.
  • Facebook was the first follower, but not the last. Google, Facebook, Amazon, Microsoft and Apple are private surveillance empires, each with distinct business models.
  • In 2021 these five U.S. tech giants represent five of the six largest publicly traded companies by market capitalization in the world.
  • As we move into the third decade of the 21st century, surveillance capitalism is the dominant economic institution of our time. In the absence of countervailing law, this system successfully mediates nearly every aspect of human engagement with digital information
  • Today all apps and software, no matter how benign they appear, are designed to maximize data collection.
  • Historically, great concentrations of corporate power were associated with economic harms. But when human data are the raw material and predictions of human behavior are the product, then the harms are social rather than economic
  • The difficulty is that these novel harms are typically understood as separate, even unrelated, problems, which makes them impossible to solve. Instead, each new stage of harm creates the conditions for the next stage.
  • Fifty years ago the conservative economist Milton Friedman exhorted American executives, “There is one and only one social responsibility of business — to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game.” Even this radical doctrine did not reckon with the possibility of no rules.
  • With privacy out of the way, ill-gotten human data are concentrated within private corporations, where they are claimed as corporate assets to be deployed at will.
  • The sheer size of this knowledge gap is conveyed in a leaked 2018 Facebook document, which described its artificial intelligence hub, ingesting trillions of behavioral data points every day and producing six million behavioral predictions each second.
  • Next, these human data are weaponized as targeting algorithms, engineered to maximize extraction and aimed back at their unsuspecting human sources to increase engagement
  • Targeting mechanisms change real life, sometimes with grave consequences. For example, the Facebook Files depict Mr. Zuckerberg using his algorithms to reinforce or disrupt the behavior of billions of people. Anger is rewarded or ignored. News stories become more trustworthy or unhinged. Publishers prosper or wither. Political discourse turns uglier or more moderate. People live or die.
  • Occasionally the fog clears to reveal the ultimate harm: the growing power of tech giants willing to use their control over critical information infrastructure to compete with democratically elected lawmakers for societal dominance.
  • when it comes to the triumph of surveillance capitalism’s revolution, it is the lawmakers of every liberal democracy, especially in the United States, who bear the greatest burden of responsibility. They allowed private capital to rule our information spaces during two decades of spectacular growth, with no laws to stop it.
  • All of it begins with extraction. An economic order founded on the secret massive-scale extraction of human data assumes the destruction of privacy as a nonnegotiable condition of its business operations.
  • We can’t fix all our problems at once, but we won’t fix any of them, ever, unless we reclaim the sanctity of information integrity and trustworthy communications
  • The abdication of our information and communication spaces to surveillance capitalism has become the meta-crisis of every republic, because it obstructs solutions to all other crises.
  • Neither Google, nor Facebook, nor any other corporate actor in this new economic order set out to destroy society, any more than the fossil fuel industry set out to destroy the earth.
  • like global warming, the tech giants and their fellow travelers have been willing to treat their destructive effects on people and society as collateral damage — the unfortunate but unavoidable byproduct of perfectly legal economic operations that have produced some of the wealthiest and most powerful corporations in the history of capitalism.
  • Where does that leave us?
  • Democracy is the only countervailing institutional order with the legitimate authority and power to change our course. If the ideal of human self-governance is to survive the digital century, then all solutions point to one solution: a democratic counterrevolution.
  • instead of the usual laundry lists of remedies, lawmakers need to proceed with a clear grasp of the adversary: a single hierarchy of economic causes and their social harms.
  • We can’t rid ourselves of later-stage social harms unless we outlaw their foundational economic causes
  • This means we move beyond the current focus on downstream issues such as content moderation and policing illegal content. Such “remedies” only treat the symptoms without challenging the illegitimacy of the human data extraction that funds private control over society’s information spaces
  • Similarly, structural solutions like “breaking up” the tech giants may be valuable in some cases, but they will not affect the underlying economic operations of surveillance capitalism.
  • Instead, discussions about regulating big tech should focus on the bedrock of surveillance economics: the secret extraction of human data from realms of life once called “private.
  • No secret extraction means no illegitimate concentrations of knowledge about people. No concentrations of knowledge means no targeting algorithms. No targeting means that corporations can no longer control and curate information flows and social speech or shape human behavior to favor their interests
  • the sober truth is that we need lawmakers ready to engage in a once-a-century exploration of far more basic questions:
  • How should we structure and govern information, connection and communication in a democratic digital century?
  • What new charters of rights, legislative frameworks and institutions are required to ensure that data collection and use serve the genuine needs of individuals and society?
  • What measures will protect citizens from unaccountable power over information, whether it is wielded by private companies or governments?
  • The corporation that is Facebook may change its name or its leaders, but it will not voluntarily change its economics.
sissij

Pregnancy Changes the Brain in Ways That May Help Mothering - The New York Times - 0 views

  • Pregnancy changes a woman’s brain, altering the size and structure of areas involved in perceiving the feelings and perspectives of others, according to a first-of-its-kind study published Monday.
  • The results were remarkable: loss of gray matter in several brain areas involved in a process called social cognition or “theory of mind,” the ability to register and consider how other people perceive things.
  • A third possibility is that the loss is “part of the brain’s program for dealing with the future,” he said. Hormone surges in pregnancy might cause “pruning or cellular adaptation that is helpful,” he said, streamlining certain brain areas to be more efficient at mothering skills “from nurturing to extra vigilance to teaching.”
  • Pregnancy, she explained, may help a woman’s brain specialize in “a mother’s ability to recognize the needs of her infant, to recognize social threats or to promote mother-infant bonding.”
  • Researchers wanted to see if the women’s brain changes affected anything related to mothering. They found that relevant brain regions in mothers showed more activity when women looked at photos of their own babies than with photos of other children.
  • During another period of roiling hormonal change — adolescence — gray matter decreases in several brain regions that are believed to provide fine-tuning for the social, emotional and cognitive territory of being a teenager.
  • evidence against the common myth of ‘mommy brain.’
  •  
    Our brain changes during our lifetime to better fit our needs. The decrease in gray matter in the brain during pregnancy enables mothers to learn mothering skills faster and be more focused on their own child. This aligns with the logic of evolution because newborns need a lot of attention and care from their mother. I am also very surprised to see that a similar thing also happens to teenagers. The decrease in gray matter gives plasticity for teenagers to absorb new knowledge. It's so amazing that our brain is actually adjusting itself in different stages of life. --Sissi (12/20/2016)
kushnerha

Is That Even a Thing? - The New York Times - 3 views

  • Speakers and writers of American English have recently taken to identifying a staggering and constantly changing array of trends, events, memes, products, lifestyle choices and phenomena of nearly every kind with a single label — a thing.
  • It would be easy to call this a curiosity of the language and leave it at that. Linguistic trends come and go.
  • One could, on the other hand, consider the use of “a thing” a symptom of an entire generation’s linguistic sloth, general inarticulateness and penchant for cutesy, empty, half-ironic formulations that create a self-satisfied barrier preventing any form of genuine engagement with the world around them.
  • My assumption is that language and experience mutually influence each other. Language not only captures experience, it conditions it. It sets expectations for experience and gives shape to it as it happens. What might register as inarticulateness can reflect a different way of understanding and experiencing the world.
  • The word “thing” has of course long played a versatile and generic role in our language, referring both to physical objects and abstract matters. “The thing is …” “Here’s the thing.” “The play’s the thing.” In these examples, “thing” denotes the matter at hand and functions as stage setting to emphasize an important point. One new thing about “a thing,” then, is the typical use of the indefinite article “a” to precede it. We talk about a thing because we are engaged in cataloging. The question is whether something counts as a thing. “A thing” is not just stage setting. Information is conveyed.
  • What information? One definition of “a thing” that suggests itself right away is “cultural phenomenon.” A new app, an item of celebrity gossip, the practices of a subculture. It seems likely that “a thing” comes from the phrase the coolest/newest/latest thing. But now, in a society where everything, even the past, is new — “new thing” verges on the redundant. If they weren’t new they wouldn’t be things.
  • Clearly, cultural phenomena have long existed and been called “fads,” “trends,” “rages” or have been designated by the category they belong to — “product,” “fashion,” “lifestyle,” etc. So why the application of this homogenizing general term to all of them? I think there are four main reasons.
  • First, the flood of content into the cultural sphere. That we are inundated is well known. Information besieges us in waves that thrash us against the shore until we retreat to the solid ground of work or sleep or exercise or actual human interaction, only to wade cautiously back into our smartphones. As we spend more and more time online, it becomes the content of our experience, and in this sense “things” have earned their name. “A thing” has become the basic unit of cultural ontology.
  • Second, the fragmentation of this sphere. The daily barrage of culture requires that we choose a sliver of the whole in order to keep up. Netflix genres like “Understated Romantic Road Trip Movies” make it clear that the individual is becoming his or her own niche market — the converse of the celebrity as brand. We are increasingly a society of brands attuning themselves to markets, and markets evaluating brands. The specificity of the market requires a wider range of content — of things — to satisfy it
  • Third, the closing gap between satire and the real thing. The absurd excess of things has reached a point where the ironic detachment needed to cope with them is increasingly built into the things themselves, their marketing and the language we use to talk about them. The designator “a thing” is thus almost always tinged with ironic detachment. It puts the thing at arm’s length. You can hardly say “a thing” without a wary glint in your eye.
  • Finally, the growing sense that these phenomena are all the same. As we step back from “things,” they recede into the distance and begin to blur together. We call them all by the same name because they are the same at bottom: All are pieces of the Internet. A thing is for the most part experienced through this medium and generated by it. Even if they arise outside it, things owe their existence as things to the Internet. Google is thus always the arbiter of the question, “Is that a real thing?”
  • “A thing,” then, corresponds to a real need we have, to catalog and group together the items of cultural experience, while keeping them at a sufficient distance so that we can at least feign unified consciousness in the face of a world gone to pieces.
Sophia C

BBC News - Viewpoint: Human evolution, from tree to braid - 0 views

  • What was, in my view, a logical conclusion reached by the authors was too much for some researchers to take.
  • The conclusion of the Dmanisi study was that the variation in skull shape and morphology observed in this small sample, derived from a single population of Homo erectus, matched the entire variation observed among African fossils ascribed to three species - H. erectus, H. habilis and H. rudolfensis.
  • a single population of H. erectus,
  • They all had to be the same species.
  • It was not surprising to find that Neanderthals and modern humans interbred, a clear expectation of the biological species concept.
  • I wonder when the penny will drop: when we have five pieces of a 5,000-piece jigsaw puzzle, every new bit that we add is likely to change the picture.
  • The identity of the fourth player remains unknown but it was an ancient lineage that had been separate for probably over a million years. H. erectus seems a likely candidate. Whatever the name we choose to give this mystery lineage, what these results show is that gene flow was possible not just among contemporaries but also between ancient and more modern lineages.
  • Scientists succeeded in extracting the most ancient mitochondrial DNA so far, from the Sima de los Huesos site in Atapuerca, Spain.
  • We have built a picture of our evolution based on the morphology of fossils and it was wrong.
    • Sophia C
       
      Kuhn
  • when we know how plastic - or easily changeable - skull shape is in humans. And our paradigms must also change.
  • We must abandon, once and for all, views of modern human superiority over archaic (ancient) humans. The terms "archaic" and "modern" lose all meaning as do concepts of modern human replacement of all other lineages.
  • The deep-rooted shackles that have sought to link human evolution with stone tool-making technological stages - the Stone Ages - even when we have known that these have overlapped with each other for half-a-million years in some instances.
  • The world of our biological and cultural evolution was far too fluid for us to constrain it into a few stages linked by transitions.
  • We have to flesh out the genetic information and this is where archaeology comes into the picture.
  • Rather than focus on differences between modern humans and Neanderthals, what the examples show is the range of possibilities open to humans (Neanderthals included) in different circumstances.
  • research using new technology on old archaeological sites, as at La Chapelle; and
grayton downing

Opinion: I Want My Kidney | The Scientist Magazine® - 0 views

  • Indeed, 10 years ago, one of Atala's young patients, Luke Massella, received an engineered bladder made using similar technology.
  • A human kidney, albeit non-functional, takes us a step closer to utilizing the enormous potential of stem cells from different sources.
  • disorders can contribute to end-stage renal disease, which is the main indicator for kidney transplantation. Diabetes alone, which is on the rise the world over, is the major cause, responsible for more than 40 percent of all end-stage renal disease cases in need of kidney transplant
  • ...1 more annotation...
  • Research on differentiated kidney cells, particularly podocytes that can be cultured in the lab, may hold some answers. Although there are a few human- and mouse-derived cell lines that potentially differentiate into functional podocytes, there is a need for better cell lines and systems that mimic the in vivo situation more closely
Javier E

Humans, Version 3.0 § SEEDMAGAZINE.COM - 0 views

  • Where are we humans going, as a species? If science fiction is any guide, we will genetically evolve like in X-Men, become genetically engineered as in Gattaca, or become cybernetically enhanced like General Grievous in Star Wars.
  • There is, however, another avenue for human evolution, one mostly unappreciated in both science and fiction. It is this unheralded mechanism that will usher in the next stage of human, giving future people exquisite powers we do not currently possess, powers worthy of natural selection itself. And, importantly, it doesn’t require us to transform into cyborgs or bio-engineered lab rats. It merely relies on our natural bodies and brains functioning as they have for millions of years. This mystery mechanism of human transformation is neuronal recycling, coined by neuroscientist Stanislas Dehaene, wherein the brain’s innate capabilities are harnessed for altogether novel functions.
  • The root of these misconceptions is the radical underappreciation of the design engineered by natural selection into the powers implemented by our bodies and brains, something central to my 2009 book, The Vision Revolution. For example, optical illusions (such as the Hering) are not examples of the brain’s poor hardware design, but, rather, consequences of intricate evolutionary software for generating perceptions that correct for neural latencies in normal circumstances.
  • ...4 more annotations...
  • Like all animal brains, human brains are not general-purpose universal learning machines, but, instead, are intricately structured suites of instincts optimized for the environments in which they evolved. To harness our brains, we want to let the brain’s brilliant mechanisms run as intended—i.e., not to be twisted. Rather, the strategy is to twist Y [the novel task] into a shape that the brain does know how to process.
  • there is a very good reason to be optimistic that the next stage of human will come via the form of adaptive harnessing, rather than direct technological enhancement: It has already happened. We have already been transformed via harnessing beyond what we once were. We’re already Human 2.0, not the Human 1.0, or Homo sapiens, that natural selection made us. We Human 2.0’s have, among many powers, three that are central to who we take ourselves to be today: writing, speech, and music (the latter perhaps being the pinnacle of the arts). Yet these three capabilities, despite having all the hallmarks of design, were not a result of natural selection, nor were they the result of genetic engineering or cybernetic enhancement to our brains. Instead, and as I argue in both The Vision Revolution and my forthcoming Harnessed, these are powers we acquired by virtue of harnessing, or neuronal recycling.
  • Although the step from Human 1.0 to 2.0 was via cultural selection, not via explicit human designers, does the transformation to Human 3.0 need to be entirely due to a process like cultural evolution, or might we have any hope of purposely guiding our transformation? When considering our future, that’s probably the most relevant question we should be asking ourselves.
  • One of my reasons for optimism is that nature-harnessing technologies (like writing, speech, and music) must mimic fundamental ecological features in nature, and that is a much easier task for scientists to tackle than emulating the exorbitantly complex mechanisms of the brain.
Javier E

Newspaper in Israel Scrubs Women From a Photo of Paris Unity Rally - NYTimes.com - 0 views

  • “It is rather embarrassing when, at a time that the Western world is rallying against manifestations of religious extremism, our extremists manage to take the stage,” Allison Kaplan Sommer commented on a blog for Israel’s left-leaning newspaper Haaretz. She berated HaMevaser for “denying the fact that in the wider world, beyond the ultra-Orthodox Jewish community, women do stand on the world stage and shape events.”
  • Apparently deleted along with Ms. Merkel were: the mayor of Paris, Anne Hidalgo; a European Union official; and Simonetta Sommaruga, the president of Switzerland.
  • “It’s very, very, very, very, very hard for a nonreligious person to understand the purity of eyes,” Ms. Burshtein said. “By us, men don’t look at women’s photos, period. As long as you don’t know that, then it sounds ridiculous, or changing history or events. But we’re not here to get the events the way they are. We are here to keep the eyes.”