
TOK Friends: Group items tagged problem-solving


Javier E

Young Minds in Critical Condition - NYTimes.com - 1 views

  • Our best college students are very good at being critical. In fact being smart, for many, means being critical. Having strong critical skills shows that you will not be easily fooled. It is a sign of sophistication, especially when coupled with an acknowledgment of one’s own “privilege.”
  • The combination of resistance to influence and deflection of responsibility by confessing to one’s advantages is a sure sign of one’s ability to negotiate the politics of learning on campus.
  • Taking things apart, or taking people down, can provide the satisfactions of cynicism. But this is thin gruel.
  • In overdeveloping the capacity to show how texts, institutions or people fail to accomplish what they set out to do, we may be depriving students of the chance to learn as much as possible from what they study.
  • As debunkers, they contribute to a cultural climate that has little tolerance for finding or making meaning — a culture whose intellectuals and cultural commentators get “liked” by showing that somebody else just can’t be believed.
  • Liberal education in America has long been characterized by the intertwining of two traditions: of critical inquiry in pursuit of truth and exuberant performance in pursuit of excellence. In the last half-century, though, emphasis on inquiry has become dominant, and it has often been reduced to the ability to expose error and undermine belief.
  • fetishizing disbelief as a sign of intelligence has contributed to depleting our cultural resources. Creative work, in whatever field, depends upon commitment, the energy of participation and the ability to become absorbed in works of literature, art and science. That type of absorption is becoming an endangered species of cultural life, as our nonstop, increasingly fractured technological existence wears down our receptive capacities.
  • Liberal learning depends on absorption in compelling work. It is a way to open ourselves to the various forms of life in which we might actively participate. When we learn to read or look or listen intensively, we are, at least temporarily, overcoming our own blindness by trying to understand an experience from another’s point of view.
  • we are learning to activate potential, and often to instigate new possibilities.
  • Liberal education must not limit itself to critical thinking and problem solving; it must also foster openness, participation and opportunity. It should be designed to take us beyond the campus to a life of ongoing, pragmatic learning that finds inspiration in unexpected sources, and increases our capacity to understand and contribute to the world
Javier E

In 'Misbehaving,' an Economics Professor Isn't Afraid to Attack His Own - NYTimes.com - 0 views

  • the book is part memoir, part attack on a breed of economist who dominated the academy – particularly, the Chicago School that dominated economic theory at the University of Chicago – for much of the latter part of the 20th century.
  • economists have increasingly become the go-to experts on every manner of business and public policy issue facing society.
  • rather than being a disgruntled former employee or otherwise easily marginalized whistle-blower, Mr. Thaler recently took the reins as president of the American Economic Association (and still teaches at Chicago’s graduate business program).
  • The economics profession that Mr. Thaler entered in the early 1970s was deeply invested in proving that it was more than a mere social science.
  • But economic outcomes are the result of human decision-making. To achieve the same mathematical precision of hard sciences, economists made a radically simplifying assumption that people are “optimizers” whose behavior is as predictable as the speed of a physical body falling through space.
  • After so-called behavioral economics began to go mainstream, Professor Thaler turned his attention to helping solve a variety of business and, increasingly, public policy issues. As these tools have been applied to practical problems, Professor Thaler has noted that there has been “very little actual economics involved.” Instead, the resulting insights have “come primarily from psychology and the other social sciences.”
  • it is actually “a slur on those other social sciences if people insist on calling any policy-related research some kind of economics.”
  • Professor Thaler’s narrative ultimately demonstrates that by trying to set itself as somehow above other social sciences, the “rationalist” school of economics actually ended up contributing far less than it could have. The group’s intellectual denial led to not just sloppy social science, but sloppy philosophy.
  • Economists would do well to embrace both their philosophical and social science roots. No amount of number-crunching can replace the need to confront the complexity of human existence.
  • It is not only in academics that the most difficult questions are avoided behind a mathematical smoke screen. When businesses use cost-benefit analysis, for instance, they are applying a moral philosophy known as utilitarianism, popularized by John Stuart Mill in the 19th century.
  • Compared against alternative moral philosophies, like those of Kant or Aristotle, Mill has relatively few contemporary adherents in professional philosophical circles. But utilitarianism does have the virtue of lending itself to mathematical calculation. By giving the contentious philosophy a benign bureaucratic name like “cost-benefit analysis,” corporations hope to circumvent the need to confront the profound ethical issues implicated.
  • The “misbehaving” of Professor Thaler’s title is supposed to refer to how human actions are inconsistent with rationalist economic theory
Ellie McGinnis

Role of Humanities, in School and Life - NYTimes.com - 0 views

  • the major value of a college curriculum, and the reason an undergraduate degree is still preferable to a random menu of massive online open courses, is the opportunity it offers students through a variety of disciplines and the different skills specific to each
  • most colleges do not view humanities and sciences as in competition with each other. Today’s students need to develop the capacity for open-ended inquiry cultivated by the liberal arts, and also the problem-solving skills associated with science and technology.
  • a major factor that’s reshaped humanities education since 1970, when the decline began: postmodernism.
  • I fled my passion, literature, for a practical and rational-minded career in medicine.
  • More important, studying the humanities helps us make sense of our lives and our world, whether the times are good or bad.
  • But the humanities are not on life support. They are alive and well, and remain vitally important in preparing graduates to lead meaningful, considered lives, to flourish in multiple careers and to be informed, engaged citizens of our democracy and our rapidly evolving world
  • While the professors justifiably cite inadequate funding and marketplace demand for scientists and engineers as causes of the marginalization of the humanities, they also ought to look inward at their profession’s rejection of the rational ideals that make the educated world go round.
  • The narrow focus on STEM education can produce a well-trained work force. What the country and the world need are well-educated citizens.
Javier E

Listening to Ta-Nehisi Coates While White - The New York Times - 0 views

  • Your new book, “Between the World and Me,” is a great and searing contribution to this public education. It is a mind-altering account of the black male experience. Every conscientious American should read it.
  • Written as a letter to your son, you talk about the effects of pervasive fear. “When I was your age the only people I knew were black and all of them were powerfully, adamantly, dangerously afraid.”
  • the disturbing challenge of your book is your rejection of the American dream. My ancestors chose to come here. For them, America was the antidote to the crushing restrictiveness of European life, to the pogroms. For them, the American dream was an uplifting spiritual creed that offered dignity, the chance to rise.
  • The innocent world of the dream is actually built on the broken bodies of those kept down below.
  • Your ancestors came in chains. In your book the dream of the comfortable suburban life is a “fairy tale.” For you, slavery is the original American sin, from which there is no redemption. America is Egypt without the possibility of the Exodus. African-American men are caught in a crushing logic, determined by the past, from which there is no escape.
  • I think you distort American history. This country, like each person in it, is a mixture of glory and shame. There’s a Lincoln for every Jefferson Davis and a Harlem Children’s Zone for every K.K.K. — and usually vastly more than one. Violence is embedded in America, but it is not close to the totality of America.
  • In your anger at the tone of innocence some people adopt to describe the American dream, you reject the dream itself as flimflam. But a dream sullied is not a lie. The American dream of equal opportunity, social mobility and ever more perfect democracy cherishes the future more than the past. It abandons old wrongs and transcends old sins for the sake of a better tomorrow.
  • This dream is a secular faith that has unified people across every known divide. It has unleashed ennobling energies and mobilized heroic social reform movements. By dissolving the dream under the acid of an excessive realism, you trap generations in the past and destroy the guiding star that points to a better future.
  • Ben.Lynch (Augusta, Georgia): I've followed Mr. Coates for some time now and tend to agree with his viewpoints. As a white southern male who works in our segregated education systems, it took years for me to actually apprehend my day-to-day interactions with black children in poverty. For the longest time, I simply couldn't understand the behaviors I saw, the reactions, and at first I reacted with contempt, but then I began to understand little by little. I don't think I can express it to anyone here, let alone Mr. Brooks, but it is something that one only appears to get by letting it sink into your bones day after day, hour after hour. The conditions are terrible and promise to remain so for the rest of my career. My students know, even if they lack the rhetorical chops to express it, that the game is rigged, that what we sell, meaning America at large, to make ourselves feel better, to feel just, is demonstrably a false bill of goods that does nothing to solve the actual problems of their life. These students are so used to being crushed that they can't even begin to articulate that they are being crushed because it's simply how it is. Some suppose Mr. Coates is overly dour, overly negative, but he provides a necessary counterpoint to those who assume that the arc of the universe inevitably bends toward justice. Struggle can and often does end in defeat. The wicked often sleep as well as the just and sometimes better. The sins of the fathers remain fixed around the necks of the sons and daughters.
Javier E

Are College Lectures Unfair? - The New York Times - 1 views

  • a growing body of evidence suggests that the lecture is not generic or neutral, but a specific cultural form that favors some people while discriminating against others
  • research has demonstrated that we learn new material by anchoring it to knowledge we already possess. The same lecture, given by the same professor in the same lecture hall, is actually not the same for each student listening; students with more background knowledge will be better able to absorb and retain what they hear.
  • Active-learning courses deliberately structure in-class and out-of-class assignments to ensure that students repeatedly engage with the material. The instructors may pose questions about the week’s reading, for example, and require students to answer the questions online, for a grade, before coming to class.
  • In the structured course, all demographic groups reported completing the readings more frequently and spending more time studying; all groups also achieved higher final grades than did students in the lecture course.
  • Other active-learning courses administer frequent quizzes that oblige students to retrieve knowledge from memory rather than passively read it over in a textbook. Such quizzes have been shown to improve retention of factual material among all kinds of students.
  • The act of putting one’s own thoughts into words and communicating them to others, research has shown, is a powerful contributor to learning. Active-learning courses regularly provide opportunities for students to talk and debate with one another in a collaborative, low-pressure environment.
  • researchers from the University of Massachusetts Amherst and Yale University compare a course in physical chemistry taught in traditional lecture style to the same course taught in a “flipped” format, in which lectures were moved online and more time was devoted to in-class problem-solving activities. Exam performance over all was nearly 12 percent higher in the flipped class
Sophia C

Thomas Kuhn: Revolution Against Scientific Realism* - 1 views

  • The Ptolemaic system was such a complex system that nobody believed that it corresponded to the physical reality of the universe. Although the Ptolemaic system accounted for observations (“saved the appearances”), its epicycles and deferents were never intended to be anything more than a mathematical model to use in predicting the position of heavenly bodies. [3]
  • Galileo was told that he was free to continue his work with Copernican theory if he agreed that the theory did not describe physical reality but was merely one of the many potential mathematical models. [10] Galileo continued to work, and while he “formally claimed to prove nothing,” [11] he passed his mathematical advances and his observational data to Newton, who would not only invent a new mathematics but would solve the remaining problems posed by Copernicus. [12]
  • Thus without pretending that his method could find the underlying causes of things such as gravity, Newton believed that his method produced theory, based upon empirical evidence, that was a close approximation of physical reality.
  • Medieval science was guided by "logical consistency."
  • The logical empiricist's conception of scientific progress was thus a continuous one; more comprehensive theory replaced compatible, older theory
  • Hempel also believed that science evolved in a continuous manner. New theory did not contradict past theory: "theory does not simply refute the earlier empirical generalizations in its field; rather, it shows that within a certain limited range defined by qualifying conditions, the generalizations hold true in fairly close approximation." [21]
  • New theory is more comprehensive; the old theory can be derived from the newer one and is “one special manifestation” [22] of the more comprehensive new theory.
  • The logical empiricist movement combined induction, based on empiricism, and deduction in the form of logic
  • It was the truth, and the prediction and control that came with it, that was the goal of logical-empirical science.
  • Each successive theory's explanation was closer to the truth than the theory before.
  • The notion of scientific realism held by Newton led to the evolutionary view of the progress of science
  • The entities and processes of theory were believed to exist in nature, and science should discover those entities and processes
  • Particularly disturbing discoveries were made in the area of atomic physics. For instance, Heisenberg’s indeterminacy principle, according to historian of science Cecil Schneer, yielded the conclusion that “the world of nature is indeterminate.”
  • “even the fundamental principle of causality fail[ed].”
  • It was not until the second half of the twentieth century that the preservers of the evolutionary idea of scientific progress, the logical empiricists, were seriously challenged
  • Kuhn proposed a revolutionary model of scientific change and examined the role of the scientific community in preventing and then accepting change. Kuhn’s conception of scientific change occurring through revolutions undermined the traditional scientific goal, finding “truth” in nature
  • Textbooks inform scientists-to-be about this common body of knowledge and understanding.
  • for the world is too huge and complex to be explored randomly.
  • a scientist knows what facts are relevant and can build on past research
  • Normal science, as defined by Kuhn, is cumulative. New knowledge fills a gap of ignorance
  • One standard product of the scientific enterprise is missing: “Normal science does not aim at novelties of fact or theory and, when successful, finds none.”
  • Normal science does, however, contain a mechanism that uncovers anomaly: inconsistencies within the paradigm.
  • Eventually, details arise that are inconsistent with the current paradigm
  • These inconsistencies are eventually resolved or are ignored.
  • When they concern a topic of central importance, a crisis occurs and normal science comes to a halt
  • A crisis demands that the scientists re-examine the foundations of their science that they had been taking for granted
  • A new paradigm is accepted because it resolves the crisis better than the others, it offers promise for future research, and it is more aesthetic than its competitors. The reasons for converting to a new paradigm are never completely rational.
  • Unlike evolutionary science, in which new knowledge fills a gap of ignorance, in Kuhn's model new knowledge replaces incompatible knowledge.
  • Thus science is not a continuous or cumulative endeavor: when a paradigm shift occurs there is a revolution similar to a political revolution, with fundamental and pervasive changes in method and understanding. Each successive vision about the nature of the universe makes the past vision obsolete; predictions, though more precise, remain similar to the predictions of the past paradigm in their general orientation, but the new explanations do not accommodate the old
  • In a sense, we have circled back to the ancient and medieval practice of separating scientific theory from physical reality; both medieval scientists and Kuhn would agree that no theory corresponds to reality and therefore any number of theories might equally well explain a natural phenomenon. [36] Neither twentieth-century atomic theorists nor medieval astronomers are able to claim that their theories accurately describe physical phenomena. The inability to return to scientific realism suggests a tripartite division of the history of science, with a period of scientific realism fitting between two periods in which there is no insistence that theory correspond to reality. Although both scientific realism and the evolutionary idea of scientific progress appeal to common sense, both existed for only a few hundred years.
Javier E

Getting It Right - NYTimes.com - 1 views

  • What is it to truly know something?
  • In the complacent 1950s, it was received wisdom that we know a given proposition to be true if, and only if, it is true, we believe it to be true, and we are justified in so believing.
  • This consensus was exploded in a brief 1963 note by Edmund Gettier in the journal Analysis.
  • Suppose you have every reason to believe that you own a Bentley, since you have had it in your possession for many years, and you parked it that morning at its usual spot. However, it has just been destroyed by a bomb, so that you own no Bentley, despite your well justified belief that you do. As you sit in a cafe having your morning latte, you muse that someone in that cafe owns a Bentley (since after all you do). And it turns out you are right, but only because the other person in the cafe, the barista, owns a Bentley, which you have no reason to suspect. So you here have a well justified true belief that is not knowledge.
  • After many failed attempts to fix the justified-true-belief account with minor modifications, philosophers tried more radical departures. One promising approach suggests that knowledge is a form of action, comparable to an archer’s success when he consciously aims to hit a target.
  • An archer’s shot can be assessed in several ways. It can be accurate (successful in hitting the target). It can also be adroit (skillful or competent). An archery shot is adroit only if, as the arrow leaves the bow, it is oriented well and powerfully enough.
  • A shot’s aptness requires that its success be attained not just by luck (such as the luck of that second gust). The success must rather be a result of competence.
  • we can generalize from this example, to give an account of a fully successful attempt of any sort. Any attempt will have a distinctive aim and will thus be fully successful only if it succeeds not only adroitly but also aptly.
  • We need people to be willing to affirm things publicly. And we need them to be sincere (by and large) in doing so, by aligning public affirmation with private judgment. Finally, we need people whose assertions express what they actually know.
  • Aristotle in his “Nicomachean Ethics” developed an AAA account of attempts to lead a flourishing life in accord with fundamental human virtues (for example, justice or courage). Such an approach is called virtue ethics.
  • a fully successful attempt is good overall only if the agent’s goal is good enough. An attempt to murder an innocent person is not good even if it fully succeeds.
  • Virtue epistemology begins by recognizing assertions or affirmations.
  • A particularly important sort of affirmation is one aimed at attaining truth, at getting it right
  • All it takes for an affirmation to be alethic is that one of its aims be: getting it right.
  • Humans perform acts of public affirmation in the endeavor to speak the truth, acts with crucial importance to a linguistic species. We need such affirmations for activities of the greatest import for life in society: for collective deliberation and coordination, and for the sharing of information.
  • Since there is much truth that must be grasped if one is to flourish, some philosophers have begun to treat truth’s apt attainment as virtuous in the Aristotelian sense, and have developed a virtue epistemology
  • Virtue epistemology gives an AAA account of knowledge: to know affirmatively is to make an affirmation that is accurate (true) and adroit (which requires taking proper account of the evidence). But in addition, the affirmation must be apt; that is, its accuracy must be attributable to competence rather than luck.
  • Requiring knowledge to be apt (in addition to accurate and adroit) reconfigures epistemology as the ethics of belief.
  • as a bonus, it allows contemporary virtue epistemology to solve our Gettier problem. We now have an explanation for why you fail to know that someone in the cafe owns a Bentley, when your own Bentley has been destroyed by a bomb, but the barista happens to own one. Your belief in that case falls short of knowledge for the reason that it fails to be apt. You are right that someone in the cafe owns a Bentley, but the correctness of your belief does not manifest your cognitive or epistemic competence. You are right only because by epistemic luck the barista happens to own one.
  • When in your musings you affirm to yourself that someone in the cafe owns a Bentley, therefore, your affirmation is not an apt alethic affirmation, and hence falls short of knowledge.
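It may help to see the essay's two definitions of knowledge side by side in symbols. The notation below is my own shorthand, not the essay's: K for knowledge, B for belief, J for justification, and a for the alethic affirmation being assessed.

    % Pre-Gettier consensus: knowledge as justified true belief.
    \[ K(S,p) \iff p \,\wedge\, B(S,p) \,\wedge\, J(S,p) \]
    % Virtue epistemology: knowledge as apt affirmation, where aptness
    % requires that accuracy be attributable to competence, not luck.
    \[ K(S,p) \iff \mathrm{Accurate}(a) \wedge \mathrm{Adroit}(a) \wedge \mathrm{Apt}(a) \]

The Bentley case satisfies every conjunct of the first definition but fails the Apt conjunct of the second, which is exactly how the essay resolves the Gettier problem.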
Javier E

Feeling Sad Makes Us More Creative | Wired Science | Wired.com - 0 views

  • For thousands of years, people have speculated that there’s some correlation between sadness and creativity, so that people who are a little bit miserable (think Van Gogh, or Dylan in 1965, or Virginia Woolf) are also the most innovative.
  • People who received negative feedback created better collages, at least when compared to those who received positive feedback or no feedback at all. Furthermore, those with low baselines of DHEAS proved particularly vulnerable to the external effects of frowns, so that they proved to be the most creative of all.
  • It turns out that states of sadness make us more attentive and detail oriented, more focused
  • angst and sadness promote “information-processing strategies best suited to dealing with more-demanding situations.” This helps explain why test subjects who are melancholy — Forgas induces the mood with a short film about death and cancer — are better at judging the accuracy of rumors and recalling past events; they’re also much less likely to stereotype strangers and make fewer arithmetic mistakes.
  • shoppers in the “low mood” condition remembered nearly four times as many of the trinkets. The wet weather made them sad, and their sadness made them more aware and attentive.
  • There are two important lessons of this research. The first is that our fleeting feelings can change the way we think. While sadness makes us more focused and diligent — the spotlight of attention is sharpened — happiness seems to have the opposite effect, so that good moods make us 20 percent more likely to have a moment of insight. The second takeaway is that many of our creative challenges involve tasks that require diligence, persistence and focus. It’s not easy making a collage or writing a poem or solving a hard technical problem, which is why sometimes being a little miserable can improve our creative performance.
  • Why is mental illness so closely associated with creativity? Andreasen argues that depression is intertwined with a “cognitive style” that makes people more likely to produce successful works of art. In the creative process, Andreasen says, “one of the most important qualities is persistence.”
  • While Andreasen acknowledges the burden of mental illness — she quotes Robert Lowell on depression not being a “gift of the Muse” and describes his reliance on lithium to escape the pain — she argues that many forms of creativity benefit from the relentless focus it makes possible. “Unfortunately, this type of thinking is often inseparable from the suffering,” she says. “If you’re at the cutting edge, then you’re going to bleed.”
Javier E

Lies, Damned Lies, and Medical Science - Magazine - The Atlantic - 0 views

  • How should we choose among these dueling, high-profile nutritional findings? Ioannidis suggests a simple approach: ignore them all.
  • even if a study managed to highlight a genuine health connection to some nutrient, you’re unlikely to benefit much from taking more of it, because we consume thousands of nutrients that act together as a sort of network, and changing intake of just one of them is bound to cause ripples throughout the network that are far too complex for these studies to detect, and that may be as likely to harm you as help you
  • studies report average results that typically represent a vast range of individual outcomes.
  • studies usually detect only modest effects that merely tend to whittle your chances of succumbing to a particular disease from small to somewhat smaller
  • “The odds that anything useful will survive from any of these studies are poor,” says Ioannidis—dismissing in a breath a good chunk of the research into which we sink about $100 billion a year in the United States alone.
  • nutritional studies aren’t the worst. Drug studies have the added corruptive force of financial conflict of interest.
  • “Even when the evidence shows that a particular research idea is wrong, if you have thousands of scientists who have invested their careers in it, they’ll continue to publish papers on it,” he says. “It’s like an epidemic, in the sense that they’re infected with these wrong ideas, and they’re spreading it to other researchers through journals.”
  • Nature, the grande dame of science journals, stated in a 2006 editorial, “Scientists understand that peer review per se provides only a minimal assurance of quality, and that the public conception of peer review as a stamp of authentication is far from the truth.”
  • The ultimate protection against research error and bias is supposed to come from the way scientists constantly retest each other’s results—except they don’t. Only the most prominent findings are likely to be put to the test, because there’s likely to be publication payoff in firming up the proof, or contradicting it.
  • even for medicine’s most influential studies, the evidence sometimes remains surprisingly narrow. Of those 45 super-cited studies that Ioannidis focused on, 11 had never been retested
  • even when a research error is outed, it typically persists for years or even decades.
  • much, perhaps even most, of what doctors do has never been formally put to the test in credible studies, given that the need to do so became obvious to the field only in the 1990s
  • Other meta-research experts have confirmed that similar issues distort research in all fields of science, from physics to economics (where the highly regarded economists J. Bradford DeLong and Kevin Lang once showed how a remarkably consistent paucity of strong evidence in published economics studies made it unlikely that any of them were right).
  • His PLoS Medicine paper is the most downloaded in the journal’s history, and it’s not even Ioannidis’s most-cited work
  • while his fellow researchers seem to be getting the message, he hasn’t necessarily forced anyone to do a better job. He fears he won’t in the end have done much to improve anyone’s health. “There may not be fierce objections to what I’m saying,” he explains. “But it’s difficult to change the way that everyday doctors, patients, and healthy people think and behave.”
  • “Usually what happens is that the doctor will ask for a suite of biochemical tests—liver fat, pancreas function, and so on,” she tells me. “The tests could turn up something, but they’re probably irrelevant. Just having a good talk with the patient and getting a close history is much more likely to tell me what’s wrong.” Of course, the doctors have all been trained to order these tests, she notes, and doing so is a lot quicker than a long bedside chat. They’re also trained to ply the patient with whatever drugs might help whack any errant test numbers back into line.
  • What they’re not trained to do is to go back and look at the research papers that helped make these drugs the standard of care. “When you look the papers up, you often find the drugs didn’t even work better than a placebo. And no one tested how they worked in combination with the other drugs,” she says. “Just taking the patient off everything can improve their health right away.” But not only is checking out the research another time-consuming task, patients often don’t even like it when they’re taken off their drugs, she explains; they find their prescriptions reassuring.
  • Already feeling that they’re fighting to keep patients from turning to alternative medical treatments such as homeopathy, or misdiagnosing themselves on the Internet, or simply neglecting medical treatment altogether, many researchers and physicians aren’t eager to provide even more reason to be skeptical of what doctors do—not to mention how public disenchantment with medicine could affect research funding.
  • We could solve much of the wrongness problem, Ioannidis says, if the world simply stopped expecting scientists to be right. That’s because being wrong in science is fine, and even necessary—as long as scientists recognize that they blew it, report their mistake openly instead of disguising it as a success, and then move on to the next thing, until they come up with the very occasional genuine breakthrough
  • “Science is a noble endeavor, but it’s also a low-yield endeavor,” he says. “I’m not sure that more than a very small percentage of medical research is ever likely to lead to major improvements in clinical outcomes and quality of life. We should be very comfortable with that fact.”
Javier E

Review: Vernor Vinge's 'Fast Times' | KurzweilAI - 0 views

  • Vernor Vinge’s Hugo-award-winning short science fiction story “Fast Times at Fairmont High” takes place in a near future in which everyone lives in a ubiquitous, wireless, networked world using wearable computers and contacts or glasses on which computer graphics are projected to create an augmented reality.
  • So what is life like in Vinge’s 2020? The biggest technological change involves ubiquitous computing, wearables, and augmented reality (although none of those terms are used). Everyone wears contacts or glasses which mediate their view of the world. This allows computer graphics to be superimposed on what they see. The computers themselves are actually built into the clothing (apparently because that is the cheapest way to do it) and everything communicates wirelessly.
  • If you want a computer display, it can appear in thin air, or be attached to a wall or any other surface. If people want to watch TV together they can agree on where the screen should appear and what show they watch. When doing your work, you can have screens on all your walls, menus attached here and there, however you want to organize things. But none of it is "really" there.
  • Does your house need a new coat of paint? Don’t bother, just enter it into your public database and you have a nice new mint green paint job that everyone will see. Want to redecorate? Do it with computer graphics. You can have a birdbath in the front yard inhabited by Disneyesque animals who frolic and play. Even indoors, don’t buy artwork, just download it from the net and have it appear where you want.
  • Got a zit? No need to cover up with Clearsil, just erase it from your public face and people will see the improved version. You can dress up your clothes and hairstyle as well.
  • Of course, anyone can turn off their enhancements and see the plain old reality, but most people don’t bother most of the time because things are ugly that way.
  • Some of the kids attending Fairmont Junior High do so remotely. They appear as "ghosts", indistinguishable from the other kids except that you can walk through them. They go to classes and raise their hands to ask questions just like everyone else. They see the school and everyone at the school sees them. Instead of visiting friends, the kids can all instantly appear at one another’s locations.
  • The computer synthesizing visual imagery is able to call on the localizer network for views beyond what the person is seeing. In this way you can have 360 degree vision, or even see through walls. This is a transparent society with a vengeance!
  • The cumulative effect of all this technology was absolutely amazing and completely believable
  • One thing that was believable is that it seemed that a lot of the kids cheated, and it was almost impossible for the adults to catch them. With universal network connectivity it would be hard to make sure kids are doing their work on their own. I got the impression the school sort of looked the other way, the idea being that as long as the kids solved their problems, even if they got help via the net, that was itself a useful skill that they would be relying on all their lives.
Javier E

The Washington Monthly - The Magazine - The Information Sage - 0 views

  • After the publication of Envisioning Information, Tufte decided, he told me, “to be indifferent to culture or history or time.” He became increasingly consumed with what he calls “forever knowledge,” or the idea that design is meant to guide fundamental cognitive tasks and therefore is rooted in principles that apply regardless of the material being displayed and the technology used to produce it. As Tufte explains it, basic human cognitive questions are universal, which means that design questions should be universal too.
  • As Tufte sees it, graphic design has become a tragic field, a rich and storied craft knowledge that has been taken out of the realm of “nonfiction,” as he calls it, and into that of “fiction,” or marketing and propaganda. He told me several times of his contempt for “commercial art,” the graphic design that is “part of a fashion and a style and will be different someday.” Most designers, he said, want to do something new each time. “But I’m interested in the solved problem,” he said. “I’m interested in high art and real science.”
Javier E

Does Google Make Us Stupid? - Pew Research Center - 0 views

  • Carr argued that the ease of online searching and distractions of browsing through the web were possibly limiting his capacity to concentrate. "I'm not thinking the way I used to," he wrote, in part because he is becoming a skimming, browsing reader, rather than a deep and engaged reader. "The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author's words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas.... If we lose those quiet spaces, or fill them up with ‘content,' we will sacrifice something important not only in our selves but in our culture."
  • Writer Jamais Cascio countered that the same pressures will force us to get smarter if we are to survive. “Most people don’t realize that this process is already under way,” he wrote. “In fact, it’s happening all around us, across the full spectrum of how we understand intelligence. It’s visible in the hive mind of the Internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines, and in the development of drugs that some people (myself included) have discovered let them study harder, focus better, and stay awake longer with full clarity.” He argued that while the proliferation of technology and media can challenge humans’ capacity to concentrate there were signs that we are developing “fluid intelligence—the ability to find meaning in confusion and solve new problems, independent of acquired knowledge.” He also expressed hope that techies will develop tools to help people find and assess information smartly.
  • 76% of the experts agreed with the statement, "By 2020, people's use of the internet has enhanced human intelligence; as people are allowed unprecedented access to more information they become smarter and make better choices. Nicholas Carr was wrong: Google does not make us stupid."
Javier E

A Right To Die? Ctd - The Dish | By Andrew Sullivan - The Daily Beast - 0 views

  • based on our ever-growing knowledge of brain physiology and habit formation. He can fix himself; it is absolutely within the realm of the possible. But he won’t do it by thinking about himself; he needs to externalize. Contra Freud, insight alone rarely solves much, and a constant focus on oneself and one’s problems, especially for people who are depressed, tends to make things worse in the absence of concomitant specific cognitive and/or behavioral strategies for change
  • focus on doing something for someone or something outside of himself, sounds counter-intuitive and Pollyanna-ish, if not outright cruel.  And yet...  his neuronal pathways tending towards depressing, defeatist self-references have obviously been over-enriched at the expense of, well, everything else.  So he's got to change that. These things are plastic, and literally grow or shrink depending on usage.  
  • He needs physical activity directed towards an external goal; not doing something for himself (although he will be), but for other people, animals, the planet, a political cause, neighborhood clean-up - whatever.  Once he finds that cause and starts working, setting goals (however small) to accomplish in that cause, and accomplishing them, the energy itself will build and grow, just like his non-depressive cognitive patterns.  And every time he finds himself thinking negative, defeatist thoughts, he should imagine one of those giant red stop signs and STOP!  It's another habit to develop, and gets easier and more effective every time he tries it.  
  • let him develop the habit of smiling any time he starts feeling rotten. Believe it or not, it works.
Javier E

Why I Am a Naturalist - NYTimes.com - 1 views

  • Naturalism is the philosophical theory that treats science as our most reliable source of knowledge and scientific method as the most effective route to knowledge.
  • it is now a dominant approach in several areas of philosophy — ethics, epistemology, the philosophy of mind, philosophy of science and, most of in all, metaphysics, the study of the basic constituents of reality.
  • Naturalists have applied this insight to reveal the biological nature of human emotion, perception and cognition, language, moral value, social bonds and political institutions. Naturalistic philosophy has returned the favor, helping psychology, evolutionary anthropology and biology solve their problems by greater conceptual clarity about function, adaptation, Darwinian fitness and individual-versus-group selection.
  • 400 years of scientific success in prediction, control and technology shows that physics has made a good start. We should be confident that it will do better than any other approach at getting things right.
  • The second law of thermodynamics, the periodic table, and the principles of natural selection are unlikely to be threatened by future science. Philosophy can therefore rely on them to answer many of its questions without fear of being overtaken by events.
  • “Why can’t there be things only discoverable by non-scientific means, or not discoverable at all?” Professor Williamson asked in his essay. His question may be rhetorical, but the naturalist has an answer to it: nothing that revelation, inspiration or other non-scientific means ever claimed to discover has yet withstood the tests of knowledge that scientific findings attain. What are those tests of knowledge? They are the experimental/observational methods all the natural sciences share, the social sciences increasingly adopt, and that naturalists devote themselves to making explicit.
Javier E

Clive Thompson on Memory Engineering | Magazine - 0 views

  • a new trend I call memory engineering — the process of fashioning our inchoate digital pasts into useful memories.
  • Many of us generate massive amounts of personal data every day — phonecam pictures, text messages, status updates, and so on. By default, all of us are becoming lifeloggers. But we almost never go back and look at this stuff, because it’s too hard to parse.
  • Memory engineers are solving that problem by creating services that reformat that data in witty, often artistic ways.
  • Lifeloggers have long touted the “total recall” that’s achievable if you obsessively store and organize personal records: Never forget a thing! But Wegener has found that less can be more. When you show someone their year-old check-ins and nothing else, it’s a very crude signal — just a bunch of points on a map. But our brains seize these cues and fill in the details
  • these techniques can also work with “semantic” memories of facts and info. Last winter, Amazon released a clever app called Daily Review, which takes your Kindle clippings and redisplays them for you weeks or months later — timed on a schedule that’s designed to help you absorb your reading more deeply into your brain
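The scheduling idea behind a tool like Daily Review is straightforward to sketch: resurface each clipping at expanding intervals. The Python below is a minimal illustration; the interval values and function names are my own assumptions, not Amazon's actual algorithm.

    from datetime import date, timedelta

    # Illustrative expanding review intervals, in days. A real app would
    # tune these; the point is only that the gaps grow with each review.
    REVIEW_INTERVALS = [1, 3, 7, 21, 60, 180]

    def next_review(first_seen: date, reviews_done: int) -> date:
        """Date when a clipping should resurface, given reviews completed."""
        # Clamp so mature items keep cycling at the longest interval.
        idx = min(reviews_done, len(REVIEW_INTERVALS) - 1)
        days_out = sum(REVIEW_INTERVALS[: idx + 1])
        return first_seen + timedelta(days=days_out)

    # A clipping saved on Jan 1 resurfaces on Jan 2, then Jan 5, Jan 12, ...
    print(next_review(date(2024, 1, 1), 0))  # 2024-01-02

The expanding gaps exploit the spacing effect the article alludes to: each retrieval happens just as the memory is fading, which strengthens retention more than rereading everything at once.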
Javier E

Older Really Can Mean Wiser - NYTimes.com - 0 views

  • mental faculties that improve with age.
  • Knowledge is a large part of the equation, of course. People who are middle-aged and older tend to know more than young adults, by virtue of having been around longer, and score higher on vocabulary tests, crossword puzzles and other measures of so-called crystallized intelligence.
  • the older brain offers something more, according to a new paper in the journal Psychological Science. Elements of social judgment and short-term memory, important pieces of the cognitive puzzle, may peak later in life than previously thought.
  • The researchers found that the broad split in age-related cognition — fluid in the young, crystallized in the old — masked several important nuances.
  • A year ago, German scientists argued that cognitive “deficits” in aging were caused largely by the accumulation of knowledge — that is, the brain slows down because it has to search a larger mental library of facts
  • Experts said the new analysis raised a different question: Are there distinct, independent elements of memory and cognition that peak at varying times of life?
  • The strength of the new analysis is partly in its data. The study evaluated historic scores from the popular Wechsler intelligence test, and compared them with more recent results from tens of thousands of people who took short cognitive tests on the authors’ websites, testmybrain.org and gameswithwords.org
  • The one drawback of this approach is that, because it didn’t follow the same people over a lifetime, it might have missed the effect of different cultural experiences
  • most previous studies have not been nearly as large, or had such a range of ages. Participants on the websites were 10 to 89 years old, and they took a large battery of tests, measuring skills like memory for abstract symbols and strings of digits, problem solving, and facility reading emotions from strangers’ eyes.
  • “We found different abilities really maturing or ripening at different ages,” Dr. Germine said. “It’s a much richer picture of the life span than just calling it aging.”
  • At least as important, the researchers looked at the effect of age on each type of test.
  • Processing speed — the quickness with which someone can manipulate digits, words or images, as if on a mental sketch board — generally peaks in the late teens
  • memory for some things, like names, does so in the early 20s
  • But the capacity of that sketch board, called working memory, peaks at least a decade later and is slow to decline. In particular, the ability to recall faces and do some mental manipulation of numbers peaked about age 30,
  • The researchers also analyzed results from the Reading the Mind in the Eyes test. The test involves looking at snapshots of strangers’ eyes on a computer screen and determining their moods from a menu of options like “tentative,” “uncertain” and “skeptical.”
  • people in their 40s or 50s consistently did the best, the study found, and the skill declined very slowly later in life
  • The picture that emerges from these findings is of an older brain that moves more slowly than its younger self, but is just as accurate in many areas and more adept at reading others’ moods — on top of being more knowledgeable. That’s a handy combination, given that so many important decisions people make intimately affects others.
  • for now, the new research at least gives some meaning to the empty adjective “wily.”
Javier E

The Creative Climate - NYTimes.com - 0 views

  • Sometimes creativity happens in pairs, duos like Lennon and McCartney who bring clashing worldviews but similar tastes. But sometimes it happens in one person, in someone who contains contradictions and who works furiously to resolve the tensions within.
  • When you see creative people like that, you see that they don’t flee from the contradictions; they embrace dialectics and dualism. They cultivate what Roger Martin called the opposable mind — the ability to hold two opposing ideas at the same time.
  • If they are religious, they seek to live among the secular. If they are intellectual, they go off into the hurly-burly of business and politics. Creative people often want to be strangers in a strange land. They want to live in dissimilar environments to maximize the creative tensions between different parts of themselves.
  • as Albert Einstein put it, “You can never solve a problem on the level on which it was created.”
Javier E

Americans Think We Have the World's Best Colleges. We Don't. - NYTimes.com - 1 views

  • When President Obama has said, “We have the best universities,” he has not meant: “Our universities are, on average, the best” — even though that’s what many people hear. He means, “Of the best universities, most are ours.” The distinction is important.
  • We see K-12 schools and colleges differently because we’re looking at two different yardsticks: the academic performance of the whole population of students in one case, the research performance of a small number of institutions in the other.
  • The fair way to compare the two systems, to each other and to systems in other countries, would be to conduct something like a PISA for higher education. That had never been done until late 2013, when the O.E.C.D. published exactly such a study.
  • The project is called the Program for the International Assessment of Adult Competencies (known as Piaac, sometimes called “pee-ack”). In 2011 and 2012, 166,000 adults ages 16 to 65 were tested in the O.E.C.D. countries
  • Like PISA, Piaac tests people’s literacy and math skills. Because the test takers were adults, they were asked to use those skills in real-world contexts.
  • As with the measures of K-12 education, the United States battles it out for last place, this time with Italy and Spain.
  • Only 18 percent of American adults with bachelor’s degrees score at the top two levels of numeracy, compared with the international average of 24 percent. Over one-third of American bachelor’s degree holders failed to reach Level 3 on the five-level Piaac scale, which means that they cannot perform math-related tasks that “require several steps and may involve the choice of problem-solving strategies.”
  • American results on the literacy and technology tests were somewhat better, in the sense that they were only mediocre. American adults were eighth from the bottom in literacy, for instance. And recent college graduates look no better than older ones. Among people ages 16 to 29 with a bachelor’s degree or better, America ranks 16th out of 24 in numeracy.
  • There is no reason to believe that American colleges are, on average, the best in the world.
  • Instead, Piaac suggests that the wide disparities of knowledge and skill present among American schoolchildren are not ameliorated by higher education. If anything, they are magnified. In 2000, American 15-year-olds scored slightly above the international average. Twelve years later, Americans who were about 12 years older scored below the international average.
Javier E

Op-Ed Contributor - Rich Man's Burden - Op-Ed - NYTimes.com - 0 views

  • what’s different from Weber’s era is that it is now the rich who are the most stressed out and the most likely to be working the most. Perhaps for the first time since we’ve kept track of such things, higher-income folks work more hours than lower-wage earners do.
  • This is a stunning moment in economic history: At one time we worked hard so that someday we (or our children) wouldn’t have to. Today, the more we earn, the more we work, since the opportunity cost of not working is all the greater (and since the higher we go, the more relatively deprived we feel).
  • when we get a raise, instead of using that hard-won money to buy “the good life,” we feel even more pressure to work since the shadow costs of not working are all the greater.
  • One of these forces is America’s income inequality, which has steadily increased since 1969. We typically think of this process as one in which the rich get richer and the poor get poorer. Surely, that should, if anything, make upper income earners able to relax.
  • technology both creates and reflects economic realities. Instead, less visible forces have given birth to this state of affairs.
  • even with the same work hours and household duties, women with higher incomes report feeling more stressed than women with lower incomes, according to a recent study by the economists Daniel Hamermesh and Jungmin Lee. In other words, not only does more money not solve our problems at home, it may even make things worse.
  • But it turns out that the growing disparity is really between the middle and the top. If we divided the American population in half, we would find that those in the lower half have been pretty stable over the last few decades in terms of their incomes relative to one another. However, the top half has been stretching out like taffy. In fact, as we move up the ladder the rungs get spaced farther and farther apart.
  • The result of this high and rising inequality is what I call an “economic red shift.” Like the shift in the light spectrum caused by the galaxies rushing away, those Americans who are in the top half of the income distribution experience a sensation that, while they may be pulling away from the bottom half, they are also being left further and further behind by those just above them.
  • since inequality rises exponentially the higher you climb the economic ladder, the better off you are in absolute terms, the more relatively deprived you may feel. In fact, a poll of New Yorkers found that those who earned more than $200,000 a year were the most likely of any income group to agree that “seeing other people with money” makes them feel poor.
  • Because these forces drive each other, they trap us in a vicious cycle: Rising inequality causes us to work more to keep up in an economy increasingly dominated by status goods. That further widens income differences.
  • if you are someone who is pretty well off but couldn’t stop working yesterday nonetheless, don’t blame your iPhone or laptop. Blame a new wrinkle in something much more antiquated: inequality.
Javier E

How Humans Ended Up With Freakishly Huge Brains | WIRED - 0 views

  • paleontologists documented one of the most dramatic transitions in human evolution. We might call it the Brain Boom. Humans, chimps and bonobos split from their last common ancestor between 6 and 8 million years ago.
  • Starting around 3 million years ago, however, the hominin brain began a massive expansion. By the time our species, Homo sapiens, emerged about 200,000 years ago, the human brain had swelled from about 350 grams to more than 1,300 grams.
  • In that 3-million-year sprint, the human brain almost quadrupled the size its predecessors had attained over the previous 60 million years of primate evolution.
  • There are plenty of theories, of course, especially regarding why: increasingly complex social networks, a culture built around tool use and collaboration, the challenge of adapting to a mercurial and often harsh climate
  • Although these possibilities are fascinating, they are extremely difficult to test.
  • Although it makes up only 2 percent of body weight, the human brain consumes a whopping 20 percent of the body’s total energy at rest. In contrast, the chimpanzee brain needs only half that.
  • contrary to long-standing assumptions, larger mammalian brains do not always have more neurons, and the ones they do have are not always distributed in the same way.
  • The human brain has 86 billion neurons in all: 69 billion in the cerebellum, a dense lump at the back of the brain that helps orchestrate basic bodily functions and movement; 16 billion in the cerebral cortex, the brain’s thick corona and the seat of our most sophisticated mental talents, such as self-awareness, language, problem solving and abstract thought; and 1 billion in the brain stem and its extensions into the core of the brain
  • In contrast, the elephant brain, which is three times the size of our own, has 251 billion neurons in its cerebellum, which helps manage a giant, versatile trunk, and only 5.6 billion in its cortex
  • primates evolved a way to pack far more neurons into the cerebral cortex than other mammals did
  • The great apes are tiny compared to elephants and whales, yet their cortices are far denser: Orangutans and gorillas have 9 billion cortical neurons, and chimps have 6 billion. Of all the great apes, we have the largest brains, so we come out on top with our 16 billion neurons in the cortex.
  • “What kinds of mutations occurred, and what did they do? We’re starting to get answers and a deeper appreciation for just how complicated this process was.”
  • there was a strong evolutionary pressure to modify the human regulatory regions in a way that sapped energy from muscle and channeled it to the brain.
  • Accounting for body size and weight, the chimps and macaques were twice as strong as the humans. It’s not entirely clear why, but it is possible that our primate cousins get more power out of their muscles than we get out of ours because they feed their muscles more energy. “Compared to other primates, we lost muscle power in favor of sparing energy for our brains,” Bozek said. “It doesn’t mean that our muscles are inherently weaker. We might just have a different metabolism.”
  • It was a pioneering experiment: not only were they going to identify relevant genetic mutations from our brain’s evolutionary past, they were also going to weave those mutations into the genomes of lab mice and observe the consequences.
  • Silver and Wray introduced the chimpanzee copy of HARE5 into one group of mice and the human edition into a separate group. They then observed how the embryonic mice brains grew.
  • After nine days of development, mice embryos begin to form a cortex, the outer wrinkly layer of the brain associated with the most sophisticated mental talents. On day 10, the human version of HARE5 was much more active in the budding mice brains than the chimp copy, ultimately producing a brain that was 12 percent larger
  • “It wasn’t just a couple mutations and—bam!—you get a bigger brain. As we learn more about the changes between human and chimp brains, we realize there will be lots and lots of genes involved, each contributing a piece to that. The door is now open to get in there and really start understanding. The brain is modified in so many subtle and nonobvious ways.”
  • As recent research on whale and elephant brains makes clear, size is not everything, but it certainly counts for something. The reason we have so many more cortical neurons than our great-ape cousins is not that we have denser brains, but rather that we evolved ways to support brains that are large enough to accommodate all those extra cells.
  • There’s a danger, though, in becoming too enamored with our own big heads. Yes, a large brain packed with neurons is essential to what we consider high intelligence. But it’s not sufficient
  • No matter how large the human brain grew, or how much energy we lavished upon it, it would have been useless without the right body. Three particularly crucial adaptations worked in tandem with our burgeoning brain to dramatically increase our overall intelligence: bipedalism, which freed up our hands for tool making, fire building and hunting; manual dexterity surpassing that of any other animal; and a vocal tract that allowed us to speak and sing.
  • Human intelligence, then, cannot be traced to a single organ, no matter how large; it emerged from a serendipitous confluence of adaptations throughout the body. Despite our ongoing obsession with the size of our noggins, the fact is that our intelligence has always been so much bigger than our brain.