
TOK Friends: Group items tagged engineering


grayton downing

Humans' living creations put on display | Science News - 1 views

  • Others would say she was a modern wonder. She was genetically engineered by a Canadian company to produce milk that could be spun into spider silk.
  • As visitors pass through a curtain to enter the darkened exhibition space, they see a spectacularly fluffy white ornamental chicken and an aquarium full of glowing fish
  • But Pell does not intend the museum to be merely a cabinet of curiosities or a freak show. Visitors will not find rants that drum up fears of “Frankenfoods.”
  • Language used throughout the center is artfully neutral, and each specimen is accompanied by only a few basic facts and a brief story highlighting a social issue. One display shows dried leaves from a transgenic American chestnut, engineered with a wheat gene to resist the fungal blight that nearly eradicated wild populations of the tree. An audio guide recounts how researchers decided to use the wheat gene instead of one from frogs, for fear of controversy.
  • The museum’s approach could frustrate science enthusiasts when social and ethical questions push interesting scientific details into the background.
jlessner

Why Facebook's News Experiment Matters to Readers - NYTimes.com - 0 views

  • Facebook’s new plan to host news publications’ stories directly is not only about page views, advertising revenue or the number of seconds it takes for an article to load. It is about who owns the relationship with readers.
  • It’s why Google, a search engine, started a social network and why Facebook, a social network, started a search engine. It’s why Amazon, a shopping site, made a phone and why Apple, a phone maker, got into shopping.
  • Facebook’s experiment, called instant articles, is small to start — just a few articles from nine media companies, including The New York Times. But it signals a major shift in the relationship between publications and their readers. If you want to read the news, Facebook is saying, come to Facebook, not to NBC News or The Atlantic or The Times — and when you come, don’t leave. (For now, these articles can be viewed on an iPhone running the Facebook app.)
  • The front page of a newspaper and the cover of a magazine lost their dominance long ago.
  • But news reports, like albums before them, have not been created that way. One of the services that editors bring to readers has been to use their news judgment, considering a huge range of factors, when they decide how articles fit together and where they show up. The news judgment of The New York Times is distinct from that of The New York Post, and for generations readers appreciated that distinction.
  • “In digital, every story becomes unbundled from each other, so if you’re not thinking of each story as living on its own, it’s tying yourself back to an analog era,” Mr. Kim said.
  • Facebook executives have insisted that they intend to exert no editorial control because they leave the makeup of the news feed to the algorithm. But an algorithm is not autonomous. It is written by humans and tweaked all the time.
  • That raises some journalistic questions. The news feed algorithm works, in part, by showing people more of what they have liked in the past. Some studies have suggested that means they might not see as wide a variety of news or points of view, though others, including one by Facebook researchers, have found they still do. [A toy sketch of this kind of like-based ranking follows this list.]
  • Tech companies, Facebook included, are notoriously fickle with their algorithms. Publications became so dependent on Facebook in the first place because of a change in its algorithm that sent more traffic their way. Later, another change demoted articles from sites that Facebook deemed to run click-bait headlines. Then last month, Facebook decided to prioritize some posts from friends over those from publications.
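
The annotation above describes like-based ranking only in outline. The short Python sketch below makes the idea concrete; the topic labels, scoring rule, and function name are invented for illustration and are not Facebook's actual news feed algorithm.

    from collections import Counter

    def rank_feed(posts, liked_topics):
        """Rank posts by how often the user has liked their topic before.

        A toy stand-in for engagement-based ranking: items resembling past
        likes float to the top, so the feed narrows toward what the reader
        already favors.
        """
        history = Counter(liked_topics)
        # More past likes for a post's topic means a higher position in the feed.
        return sorted(posts, key=lambda post: history[post["topic"]], reverse=True)

    posts = [
        {"id": 1, "topic": "politics"},
        {"id": 2, "topic": "sports"},
        {"id": 3, "topic": "cooking"},
    ]
    print(rank_feed(posts, liked_topics=["politics", "politics", "cooking"]))
    # [{'id': 1, 'topic': 'politics'}, {'id': 3, 'topic': 'cooking'}, {'id': 2, 'topic': 'sports'}]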
Javier E

Is Algebra Necessary? - NYTimes.com - 1 views

  • My aim is not to spare students from a difficult subject, but to call attention to the real problems we are causing by misdirecting precious resources.
  • one in four ninth graders fail to finish high school. In South Carolina, 34 percent fell away in 2008-9, according to national data released last year; for Nevada, it was 45 percent. Most of the educators I’ve talked with cite algebra as the major academic reason.
  • Algebra is an onerous stumbling block for all kinds of students: disadvantaged and affluent, black and white. In New Mexico, 43 percent of white students fell below “proficient,” along with 39 percent in Tennessee
  • The depressing conclusion of a faculty report: “failing math at all levels affects retention more than any other academic factor.” A national sample of transcripts found mathematics had twice as many F’s and D’s as other subjects.
  • Of all who embark on higher education, only 58 percent end up with bachelor’s degrees. The main impediment to graduation: freshman math.
  • California’s two university systems, for instance, consider applications only from students who have taken three years of mathematics and in that way exclude many applicants who might excel in fields like art or history. Community college students face an equally prohibitive mathematics wall. A study of two-year schools found that fewer than a quarter of their entrants passed the algebra classes they were required to take.
  • a definitive analysis by the Georgetown Center on Education and the Workforce forecasts that in the decade ahead a mere 5 percent of entry-level workers will need to be proficient in algebra or above.
  • “mathematical reasoning in workplaces differs markedly from the algorithms taught in school.” Even in jobs that rely on so-called STEM credentials — science, technology, engineering, math — considerable training occurs after hiring, including the kinds of computations that will be required.
  • I fully concur that high-tech knowledge is needed to sustain an advanced industrial economy. But we’re deluding ourselves if we believe the solution is largely academic.
  • Nor will just passing grades suffice. Many colleges seek to raise their status by setting a high mathematics bar. Hence, they look for 700 on the math section of the SAT, a height attained in 2009 by only 9 percent of men and 4 percent of women. And it’s not just Ivy League colleges that do this: at schools like Vanderbilt, Rice and Washington University in St. Louis, applicants had best be legacies or athletes if they have scored less than 700 on their math SATs.
  • A January 2012 analysis from the Georgetown center found 7.5 percent unemployment for engineering graduates and 8.2 percent among computer scientists.
  • “Our civilization would collapse without mathematics.” He’s absolutely right.
  • Quantitative literacy clearly is useful in weighing all manner of public policies
  • Mathematics is used as a hoop, a badge, a totem to impress outsiders and elevate a profession’s status.
  • Instead of investing so much of our academic energy in a subject that blocks further attainment for much of our population, I propose that we start thinking about alternatives. Thus mathematics teachers at every level could create exciting courses in what I call “citizen statistics.” This would not be a backdoor version of algebra, as in the Advanced Placement syllabus. Nor would it focus on equations used by scholars when they write for one another. Instead, it would familiarize students with the kinds of numbers that describe and delineate our personal and public lives.
  • This need not involve dumbing down. Researching the reliability of numbers can be as demanding as geometry.
  • I hope that mathematics departments can also create courses in the history and philosophy of their discipline, as well as its applications in early cultures. Why not mathematics in art and music — even poetry — along with its role in assorted sciences? The aim would be to treat mathematics as a liberal art, making it as accessible and welcoming as sculpture or ballet
  • Yes, young people should learn to read and write and do long division, whether they want to or not. But there is no reason to force them to grasp vectorial angles and discontinuous functions. Think of math as a huge boulder we make everyone pull, without assessing what all this pain achieves. So why require it, without alternatives or exceptions? Thus far I haven’t found a compelling answer.
Javier E

As Interest Fades in the Humanities, Colleges Worry - NYTimes.com - 0 views

  • “Both inside the humanities and outside, people feel that the intellectual firepower in the universities is in the sciences, that the important issues that people of all sorts care about, like inequality and climate change, are being addressed not in the English departments,”
  • nationally, the percentage of humanities majors hovers around 7 percent — half the 14 percent share in 1970. As others quickly pointed out, that decline occurred between 1970, the high point, and 1985, not in recent years.
  • “In the scholarly world, cognitive sciences has everybody’s ear right now, and everybody is thinking about how to relate to it,” said Louis Menand, a Harvard history professor. “How many people do you know who’ve read a book by an English professor in the past year? But everybody’s reading science books.”
  • while it is easy to spot the winners at science fairs and robotics competitions, students who excel in humanities get less acclaim and are harder to identify.
  • “I got the sense from them that it’s not cool to be a nerd in high school, unless you’re a STEM nerd,” he said, using the term for science, technology, engineering and mathematics.
  • “I live in Seattle, surrounded by Amazon and Google and Microsoft,” said Ms. Roberts, a history buff. “One of the best things about the program, that made us all breathe a sigh of relief, was being in an environment where no one said: ‘Oh, you’re interested in humanities? You’ll never get a job.’”
  • since the recession — probably because of the recession — there has been a profound shift toward viewing college education as a vocational training ground. “College is increasingly being defined narrowly as job preparation, not as something designed to educate the whole person,”
  • Many do not understand that the study of humanities offers skills that will help them sort out values, conflicting issues and fundamental philosophical questions, said Leon Botstein, the president of Bard College. “We have failed to make the case that those skills are as essential to engineers and scientists and businessmen as to philosophy professors,” he said.
Javier E

Teachers - Will We Ever Learn? - NYTimes.com - 0 views

  • America’s overall performance in K-12 education remains stubbornly mediocre.
  • The debate over school reform has become a false polarization
  • teaching is a complex activity that is hard to direct and improve from afar. The factory model is appropriate to simple work that is easy to standardize; it is ill suited to disciplines like teaching that require considerable skill and discretion.
  • In the nations that lead the international rankings — Singapore, Japan, South Korea, Finland, Canada — teachers are drawn from the top third of college graduates, rather than the bottom 60 percent as is the case in the United States. Training in these countries is more rigorous, more tied to classroom practice and more often financed by the government than in America. There are also many fewer teacher-training institutions, with much higher standards.
  • By these criteria, American education is a failed profession. There is no widely agreed-upon knowledge base, training is brief or nonexistent, the criteria for passing licensing exams are much lower than in other fields, and there is little continuous professional guidance. It is not surprising, then, that researchers find wide variation in teaching skills across classrooms; in the absence of a system devoted to developing consistent expertise, we have teachers essentially winging it as they go along, with predictably uneven results.
  • Teaching requires a professional model, like we have in medicine, law, engineering, accounting, architecture and many other fields. In these professions, consistency of quality is created less by holding individual practitioners accountable and more by building a body of knowledge, carefully training people in that knowledge, requiring them to show expertise before they become licensed, and then using their professions’ standards to guide their work.
  • Teachers in leading nations’ schools also teach much less than ours do. High school teachers provide 1,080 hours per year of instruction in America, compared with fewer than 600 in South Korea and Japan, where the balance of teachers’ time is spent collaboratively on developing and refining lesson plans
  • These countries also have much stronger welfare states; by providing more support for students’ social, psychological and physical needs, they make it easier for teachers to focus on their academic needs.
  • These elements create a virtuous cycle: strong academic performance leads to schools with greater autonomy and more public financing, which in turn makes education an attractive profession for talented people.
  • In America, both major teachers’ unions and the organization representing state education officials have, in the past year, called for raising the bar for entering teachers; one of the unions, the American Federation of Teachers, advocates a “bar exam.”
  • Ideally the exam should not be a one-time paper-and-pencil test, like legal bar exams, but a phased set of milestones to be attained over the first few years of teaching. Akin to medical boards, they would require prospective teachers to demonstrate subject and pedagogical knowledge — as well as actual teaching skill.
  • We let doctors operate, pilots fly, and engineers build because their fields have developed effective ways of certifying that they can do these things. Teaching, on the whole, lacks this specialized knowledge base; teachers teach based mostly on what they have picked up from experience and from their colleagues.
  • other fields spend 5 percent to 15 percent of their budgets on research and development, while in education, it is around 0.25 percent
  • Education-school researchers publish for fellow academics; teachers develop practical knowledge but do not evaluate or share it; commercial curriculum designers make what districts and states will buy, with little regard for quality.
  • Early- to mid-career teachers need time to collaborate and explore new directions — having mastered the basics, this is the stage when they can refine their skills. The system should reward master teachers with salaries commensurate with leading professionals in other fields.
  • research suggests that the labels don’t matter — there are good and bad programs of all types, including university-based ones. The best programs draw people who majored as undergraduates in the subjects they wanted to teach; focus on extensive clinical practice rather than on classroom theory; are selective in choosing their applicants rather than treating students as a revenue stream; and use data about how their students fare as teachers to assess and revise their practice.
Javier E

How the Internet Gets Inside Us : The New Yorker - 0 views

  • It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with
  • The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare.
  • Robert K. Logan’s “The Sixth Language” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness.
  • In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.
  • “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”
  • when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix.
  • The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965.
  • If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.
  • Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began.
  • Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.
  • The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points.
  • at any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.
  • What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own.
  • A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them.
  • And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.
  • Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user.
  • the Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.
Javier E

The End of the Future - Peter Thiel - National Review Online - 0 views

  • The state can successfully push science; there is no sense denying it. The Manhattan Project and the Apollo program remind us of this possibility. Free markets may not fund as much basic research as needed.
  • But in practice, we all sense that such gloating belongs to a very different time. Most of our political leaders are not engineers or scientists and do not listen to engineers or scientists.
  • Today’s aged hippies no longer understand that there is a difference between the election of a black president and the creation of cheap solar energy; in their minds, the movement towards greater civil rights parallels general progress everywhere. Because of these ideological conflations and commitments, the 1960s Progressive Left cannot ask whether things actually might be getting worse.
  • after 40 years of wandering, it is not easy to find a path back to the future. If there is to be a future, we would do well to reflect about it more. The first and the hardest step is to see that we now find ourselves in a desert, and not in an enchanted forest.
Javier E

The Scoreboards Where You Can't See Your Score - NYTimes.com - 0 views

  • The characters in Gary Shteyngart’s novel “Super Sad True Love Story” inhabit a continuously surveilled and scored society.
  • Consider the protagonist, Lenny Abramov, age 39. A digital dossier about him accumulates his every health condition (high cholesterol, depression), liability (mortgage: $560,330), purchase (“bound, printed, nonstreaming media artifact”), tendency (“heterosexual, nonathletic, nonautomotive, nonreligious”) and probability (“life span estimated at 83”). And that profile is available for perusal by employers, friends and even strangers in bars.
  • Even before the appearance of these books, a report called “The Scoring of America” by the World Privacy Forum showed how analytics companies now offer categorization services like “churn scores,” which aim to predict which customers are likely to forsake their mobile phone carrier or cable TV provider for another company; “job security scores,” which factor a person’s risk of unemployment into calculations of his or her ability to pay back a loan; “charitable donor scores,” which foundations use to identify the households likeliest to make large donations; and “frailty scores,” which are typically used to predict the risk of medical complications and death in elderly patients who have surgery.
  • In two nonfiction books, scheduled to be published in January, technology experts examine similar consumer-ranking techniques already in widespread use.
  • While a federal law called the Fair Credit Reporting Act requires consumer reporting agencies to provide individuals with copies of their credit reports on request, many other companies are free to keep their proprietary consumer scores to themselves.
  • Befitting the founder of a firm that markets reputation management, Mr. Fertik contends that individuals have some power to influence commercial scoring systems.
  • “This will happen whether or not you want to participate, and these scores will be used by others to make major decisions about your life, such as whether to hire, insure, or even date you,”
  • “Important corporate actors have unprecedented knowledge of the minutiae of our daily lives,” he writes in “The Black Box Society: The Secret Algorithms That Control Money and Information” (Harvard University Press), “while we know little to nothing about how they use this knowledge to influence important decisions that we — and they — make.”
  • Data brokers amass dossiers with thousands of details about individual consumers, like age, religion, ethnicity, profession, mortgage size, social networks, estimated income and health concerns such as impotence and irritable bowel syndrome. Then analytics engines can compare patterns in those variables against computer forecasting models. Algorithms are used to assign consumers scores — and to recommend offering, or withholding, particular products, services or fees — based on predictions about their behavior. [A minimal sketch of such a scoring model follows this list.]
  • It’s a fictional forecast of a data-deterministic culture in which computer algorithms constantly analyze consumers’ profiles, issuing individuals numeric rankings that may benefit or hinder them.
  • Think of this technique as reputation engine optimization. If an algorithm incorrectly pegs you as physically unfit, for instance, the book suggests that you can try to mitigate the wrong. You can buy a Fitbit fitness tracker, for instance, and upload the exercise data to a public profile — or even “snap that Fitbit to your dog” and “you’ll quickly be the fittest person in your town.”
  • Professor Pasquale offers a more downbeat reading. Companies, he says, are using such a wide variety of numerical rating systems that it would be impossible for average people to significantly influence their scores.
  • “Corporations depend on automated judgments that may be wrong, biased or destructive,” Professor Pasquale writes. “Faulty data, invalid assumptions and defective models can’t be corrected when they are hidden.”
  • Moreover, trying to influence scoring systems could backfire. If a person attached a fitness device to a dog and tried to claim the resulting exercise log, he suggests, an algorithm might be able to tell the difference and issue that person a high score for propensity toward fraudulent activity.
  • “People shouldn’t think they can outwit corporations with hundreds of millions of dollars,” Professor Pasquale said in a phone interview. Consumers would have more control, he argues, if Congress extended the right to see and correct credit reports to other kinds of rankings.
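
The scoring mechanics described in this list (consumer attributes fed into a forecasting model that emits a number) can be sketched in a few lines. Everything below is hypothetical: the feature names, weights, and bias are invented for illustration and do not come from any real data broker's model.

    import math

    # Hypothetical coefficients for a toy "churn score"; real brokers' models and
    # features are proprietary, so these numbers are purely illustrative.
    WEIGHTS = {"months_as_customer": -0.05, "support_calls": 0.4, "late_payments": 0.6}
    BIAS = -1.0

    def churn_score(profile):
        """Turn a consumer profile into a 0-100 propensity score via a logistic model."""
        z = BIAS + sum(weight * profile.get(feature, 0) for feature, weight in WEIGHTS.items())
        probability = 1 / (1 + math.exp(-z))  # logistic link maps z into (0, 1)
        return round(100 * probability)

    print(churn_score({"months_as_customer": 6, "support_calls": 3, "late_payments": 2}))  # 75

The opacity Professor Pasquale describes follows directly: unless the company discloses the weights and the model form, the person being scored has no way to inspect or correct the calculation.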
Javier E

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 1 views

  • Skinner's approach stressed the historical associations between a stimulus and the animal's response -- an approach easily framed as a kind of empirical statistical analysis, predicting the future as a function of the past.
  • Chomsky's conception of language, on the other hand, stressed the complexity of internal representations, encoded in the genome, and their maturation in light of the right data into a sophisticated computational system, one that cannot be usefully broken down into a set of associations.
  • Chomsky acknowledged that the statistical approach might have practical value, just as in the example of a useful search engine, and is enabled by the advent of fast computers capable of processing massive data. But as far as a science goes, Chomsky would argue it is inadequate, or more harshly, kind of shallow
  • David Marr, a neuroscientist colleague of Chomsky's at MIT, defined a general framework for studying complex biological systems (like the brain) in his influential book Vision,
  • a complex biological system can be understood at three distinct levels. The first level ("computational level") describes the input and output to the system, which define the task the system is performing. In the case of the visual system, the input might be the image projected on our retina and the output might be our brain's identification of the objects present in the image we had observed. The second level ("algorithmic level") describes the procedure by which an input is converted to an output, i.e. how the image on our retina can be processed to achieve the task described by the computational level. Finally, the third level ("implementation level") describes how our own biological hardware of cells implements the procedure described by the algorithmic level. [A toy illustration of the three levels follows this list.]
  • The emphasis here is on the internal structure of the system that enables it to perform a task, rather than on external association between past behavior of the system and the environment. The goal is to dig into the "black box" that drives the system and describe its inner workings, much like how a computer scientist would explain how a cleverly designed piece of software works and how it can be executed on a desktop computer.
  • As written today, the history of cognitive science is a story of the unequivocal triumph of an essentially Chomskyian approach over Skinner's behaviorist paradigm -- an achievement commonly referred to as the "cognitive revolution,"
  • While this may be a relatively accurate depiction in cognitive science and psychology, behaviorist thinking is far from dead in related disciplines. Behaviorist experimental paradigms and associationist explanations for animal behavior are used routinely by neuroscientists
  • Chomsky critiqued the field of AI for adopting an approach reminiscent of behaviorism, except in more modern, computationally sophisticated form. Chomsky argued that the field's heavy use of statistical techniques to pick regularities in masses of data is unlikely to yield the explanatory insight that science ought to offer. For Chomsky, the "new AI" -- focused on using statistical learning techniques to better mine and predict data -- is unlikely to yield general principles about the nature of intelligent beings or about cognition.
  • Behaviorist principles of associations could not explain the richness of linguistic knowledge, our endlessly creative use of it, or how quickly children acquire it with only minimal and imperfect exposure to language presented by their environment.
  • it has been argued -- in my view rather plausibly, though neuroscientists don't like it -- that neuroscience for the last couple hundred years has been on the wrong track.
  • Implicit in this endeavor is the assumption that with enough sophisticated statistical tools and a large enough collection of data, signals of interest can be weeded out from the noise in large and poorly understood biological systems.
  • Brenner, a contemporary of Chomsky who also participated in the same symposium on AI, was equally skeptical about new systems approaches to understanding the brain. When describing an up-and-coming systems approach to mapping brain circuits called Connectomics, which seeks to map the wiring of all neurons in the brain (i.e. diagramming which nerve cells are connected to others), Brenner called it a "form of insanity."
  • These debates raise an old and general question in the philosophy of science: What makes a satisfying scientific theory or explanation, and how ought success be defined for science?
  • Ever since Isaiah Berlin's famous essay, it has become a favorite pastime of academics to place various thinkers and scientists on the "Hedgehog-Fox" continuum: the Hedgehog, a meticulous and specialized worker, driven by incremental progress in a clearly defined field versus the Fox, a flashier, ideas-driven thinker who jumps from question to question, ignoring field boundaries and applying his or her skills where they seem applicable.
  • Chomsky's work has had tremendous influence on a variety of fields outside his own, including computer science and philosophy, and he has not shied away from discussing and critiquing the influence of these ideas, making him a particularly interesting person to interview.
  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • An unlikely pair, systems biology and artificial intelligence both face the same fundamental task of reverse-engineering a highly complex system whose inner workings are largely a mystery
  • neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
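
Marr's three levels, summarized above, can be illustrated with a task far simpler than vision. In the toy Python sketch below, sorting stands in for the visual task; the example is ours, not Marr's.

    # Computational level: WHAT the system does -- map an unordered sequence to
    # the same elements in ascending order (a specification, not a procedure).
    def satisfies_spec(inputs, outputs):
        return sorted(inputs) == list(outputs)

    # Algorithmic level: HOW input becomes output -- one procedure (insertion sort)
    # among many that meet the same specification.
    def insertion_sort(xs):
        result = []
        for x in xs:
            i = len(result)
            while i > 0 and result[i - 1] > x:
                i -= 1
            result.insert(i, x)
        return result

    # Implementation level: the physical substrate that runs the procedure --
    # here the Python interpreter and CPU; in Marr's case, neurons and synapses.
    data = [3, 1, 2]
    assert satisfies_spec(data, insertion_sort(data))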
anonymous

VW Says Emissions Cheating Was Not a One-Time Error - The New York Times - 0 views

  • Volkswagen said on Thursday that its emissions cheating scandal began in 2005 with a decision to heavily promote diesel engines in the United States and a realization that those engines could not meet clean air standards.
  • Some employees, the company found, chose to cheat on emissions tests rather than curtail Volkswagen’s American campaign.
  • “There was a tolerance for breaking the rules,” Hans-Dieter Pötsch, the chairman of Volkswagen’s supervisory board, said
  • “That is the hardest thing to accept,”
  • “It proves not to have been a one-time error, but rather a chain of errors that were allowed to happen,” Mr. Pötsch said.
  • “a mind-set in some areas of the company that tolerated breaches of the rules.”
  • “We have to understand how this came about,” Mr. Pötsch said. “That is the only way we can prevent it from happening again.”
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

Counting Calories to Stay Fit? There's a Trillion Little Problems With That. - Mother J... - 0 views

  • The scientists during Atwater’s era saw the human digestive system as a single engine producing a predictable quantity of energy from a given amount of fuel. [A worked example of this single-engine calculation follows this list.]
  • Yet the human gut contains a multitude of engines, and they interact with each other in ways science is just beginning to unravel. Over the past 15 years, a fast-growing body of literature suggests that the gut microbiome—the trillions of microbes that live inside us—shapes the way we metabolize food and may play an important role in how we gain weight.
  • Antibiotics, it turns out, reconfigure your gut’s balance in favor of microbes that help us store food as body fat.
  • As a result, our microbiomes are better at helping us store fat than those of our ancestors.
  • Antibiotics aren’t the only force shifting our internal ecology. Modern diets are full of processed foods and low in fiber, the kind of hard-to-break-down carbohydrates found especially in vegetables, legumes, and whole grains that are crucial for a healthy microbiome.
  • The vast majority of our internal microbes live in the far reaches of our digestive tract, the colon, explains Justin Sonnenburg, an associate professor of microbiology and immunology at Stanford. Because of their location, these microscopic critters “really only get access to the dregs of what we eat”—the dietary fiber that our organs can’t digest. The microbes have evolved to process that fiber by fermenting it with enzymes.
  • feeding this fermentation process appears to be crucial for averting weight gain and diseases like obesity and Type 2 diabetes
  • fiber supplements might also trigger liver cancer.
  • “Right now, the only useful advice I could give somebody would be to eat foods naturally rich in fiber,” he says, like bran cereal and every kind of bean you can think of. Other winners included pears, avocados, apples, seeds, and nuts.
  • The Institute of Medicine recommends that women eat 25 grams and men 38 grams of fiber every day, but Americans only get about 15 grams on average.
  • The choice of whether to lunch on a cup of black beans or five chicken nuggets—which both contain about 220 calories—just got a whole lot easier.
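
The Atwater model mentioned at the top of this list boils down to a fixed energy yield per gram of each macronutrient: 4 kcal for protein, 4 for carbohydrate, 9 for fat. The worked example below uses rough, illustrative macronutrient figures (not lab measurements) to show how that single-engine arithmetic puts the beans and the nuggets at roughly the same 220 calories, even though the article argues the gut treats them very differently.

    # Atwater general factors: kilocalories per gram of each macronutrient.
    ATWATER = {"protein": 4, "carbohydrate": 4, "fat": 9}

    def atwater_calories(grams):
        """Energy estimate under the single-engine model the article questions."""
        return sum(ATWATER[nutrient] * amount for nutrient, amount in grams.items())

    # Approximate macronutrient breakdowns (illustrative placeholder numbers).
    black_beans = {"protein": 15, "carbohydrate": 40, "fat": 1}       # about one cup, cooked
    chicken_nuggets = {"protein": 10, "carbohydrate": 13, "fat": 14}  # about five nuggets

    print(atwater_calories(black_beans))      # 229 kcal
    print(atwater_calories(chicken_nuggets))  # 218 kcal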
Javier E

Economic Statistics Miss the Benefits of Technology - NYTimes.com - 2 views

  • Value added by the information technology and communications industries — mostly hardware and software — has remained stuck at around 4 percent of the nation’s economic output for the last quarter century.
  • But these statistics do not tell the whole story. Because they miss much of what technology does for people’s well-being. News organizations that take advantage of computers to let go of journalists, secretaries and research assistants will show up in the economic statistics as more productive, making more with less. But statisticians have no way to value more thorough, useful, fact-dense articles. What’s more, gross domestic product only values the goods and services people pay for. It does not capture the value to consumers of economic improvements that are given away free. And until recently this is what news media organizations like The New York Times were doing online.
  • “G.D.P. is not a measure of how much value is produced for consumers,” said Erik Brynjolfsson of the Massachusetts Institute of Technology. “Everybody should recognize that G.D.P. is not a welfare metric.”
  • how to measure the Internet’s contribution to our lives? A few years ago, Austan Goolsbee of the University of Chicago and Peter J. Klenow of Stanford gave it a shot. They estimated that the value consumers gained from the Internet amounted to about 2 percent of their income — an order of magnitude larger than what they spent to go online. Their trick was to measure not only how much money users spent on access but also how much of their leisure time they spent online.
  • people who had access to a search engine took 15 minutes less to answer a question than those without online access.
  • Gross domestic product has always failed to capture many things — from the costs of pollution and traffic jams to the gains of unpaid household work. A
  • Varian estimated that a search engine might be worth about $500 annually to the average worker. Across the working population, this would add up to $65 billion a year. [The arithmetic is sketched after this list.]
  • the consumer surplus from free online services — the value derived by consumers from the experience above what they paid for it — has been growing by $34 billion a year, on average, since 2002. If it were tacked on as “economic output,” it would add about 0.26 of a percentage point to annual G.D.P. growth.
  • The Internet is hardly the first technology to offer consumers valuable free goods. The consumer surplus from television is about five times as large as that delivered by free stuff online, according to Mr. Brynjolfsson’s calculations.
  • Measured in money — what it contributes to G.D.P. — the recording industry is shrinking. Yet never before have Americans had access to so much music.
  • The missed consumer surplus from the Internet may be no bigger than the unmeasured gains in the production, for example, of electric light.
  • The amount of time Americans devote to the Internet has doubled in the last five years.
  • “We know less about the sources of value in the economy than we did 25 years ago,”
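
The $65 billion figure quoted above is a back-of-the-envelope multiplication. The sketch below re-derives it assuming a U.S. workforce of roughly 130 million people; the workforce number is our assumption, not a figure from the article.

    # Re-deriving the aggregate figure from Varian's per-worker estimate.
    value_per_worker = 500        # dollars of annual value per worker (quoted estimate)
    workforce = 130_000_000       # assumed number of U.S. workers
    total = value_per_worker * workforce
    print(f"about ${total / 1e9:.0f} billion per year")  # about $65 billion per year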
anonymous

Gene Therapy Creates Replacement Skin to Save a Dying Boy - The New York Times - 1 views

  • The boy in the Nature article had suffered since birth from blisters all over his body, and in 2015 contracted bacterial infections that caused him to lose two-thirds of his skin. His doctors did not know how to treat him, other than keeping him on morphine for the pain.
  • Doctors in the burn unit tried everything: antibiotics, bandages, special nutritional measures, a skin transplant from the boy’s father. Nothing worked.
  • The doctors removed a sample of the boy’s skin — slightly more than half a square inch — and took it to Modena, where they genetically engineered his cells, using a virus to insert the normal form of his mutated gene into his DNA. Then they grew the engineered cells in the laboratory into sheets of skin and transported them back to Germany, where surgeons grafted them onto the boy’s body. In October 2015, they covered his arms and legs with the new skin, and in November, his back. Ultimately, they replaced 80 percent of the child’s skin.
  • A major concern with any type of gene therapy is that the inserted genetic material could have dangerous side effects, like turning off an essential gene or turning on one that could lead to cancer.
Javier E

Silicon Valley Is Not Your Friend - The New York Times - 0 views

  • By all accounts, these programmers turned entrepreneurs believed their lofty words and were at first indifferent to getting rich from their ideas. A 1998 paper by Sergey Brin and Larry Page, then computer-science graduate students at Stanford, stressed the social benefits of their new search engine, Google, which would be open to the scrutiny of other researchers and wouldn’t be advertising-driven.
  • The Google prototype was still ad-free, but what about the others, which took ads? Mr. Brin and Mr. Page had their doubts: “We expect that advertising-funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.”
  • He was concerned about them as young students lacking perspective about life and was worried that these troubled souls could be our new leaders. Neither Mr. Weizenbaum nor Mr. McCarthy mentioned, though it was hard to miss, that this ascendant generation were nearly all white men with a strong preference for people just like themselves. In a word, they were incorrigible, accustomed to total control of what appeared on their screens. “No playwright, no stage director, no emperor, however powerful,” Mr. Weizenbaum wrote, “has ever exercised such absolute authority to arrange a stage or a field of battle and to command such unswervingly dutiful actors or troops.”
  • ...7 more annotations...
  • In his epic anti-A.I. work from the mid-1970s, “Computer Power and Human Reason,” Mr. Weizenbaum described the scene at computer labs. “Bright young men of disheveled appearance, often with sunken glowing eyes, can be seen sitting at computer consoles, their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler’s on the rolling dice,” he wrote. “They exist, at least when so engaged, only through and for the computers. These are computer bums, compulsive programmers.”
  • Welcome to Silicon Valley, 2017.
  • As Mr. Weizenbaum feared, the current tech leaders have discovered that people trust computers and have licked their lips at the possibilities. The examples of Silicon Valley manipulation are too legion to list: push notifications, surge pricing, recommended friends, suggested films, people who bought this also bought that.
  • Growth becomes the overriding motivation — something treasured for its own sake, not for anything it brings to the world
  • Facebook and Google can point to a greater utility that comes from being the central repository of all people, all information, but such market dominance has obvious drawbacks, and not just the lack of competition. As we’ve seen, the extreme concentration of wealth and power is a threat to our democracy by making some people and companies unaccountable.
  • As is becoming obvious, these companies do not deserve the benefit of the doubt. We need greater regulation, even if it impedes the introduction of new services.
  • We need to break up these online monopolies because if a few people make the decisions about how we communicate, shop, learn the news, again, do we control our own society?
kushnerha

Facebook's Bias Is Built-In, and Bears Watching - The New York Times - 2 views

  • Facebook is the world’s most influential source of news. That’s true according to every available measure of size — the billion-plus people who devour its News Feed every day, the cargo ships of profit it keeps raking in, and the tsunami of online traffic it sends to other news sites.
  • But Facebook has also acquired a more subtle power to shape the wider news business. Across the industry, reporters, editors and media executives now look to Facebook the same way nesting baby chicks look to their engorged mother — as the source of all knowledge and nourishment, the model for how to behave in this scary new-media world. Case in point: The New York Times, among others, recently began an initiative to broadcast live video. Why do you suppose that might be? Yup, the F word. The deal includes payments from Facebook to news outlets, including The Times.
  • Yet few Americans think of Facebook as a powerful media organization, one that can alter events in the real world. When blowhards rant about the mainstream media, they do not usually mean Facebook, the mainstreamiest of all social networks. That’s because Facebook operates under a veneer of empiricism. Many people believe that what you see on Facebook represents some kind of data-mined objective truth unmolested by the subjective attitudes of fair-and-balanced human beings.
  • ...11 more annotations...
  • None of that is true. This week, Facebook rushed to deny a report in Gizmodo that said the team in charge of its “trending” news list routinely suppressed conservative points of view. Last month, Gizmodo also reported that Facebook employees asked Mark Zuckerberg, the social network’s chief executive, if the company had a responsibility to “help prevent President Trump in 2017.” Facebook denied it would ever try to manipulate elections.
  • Even if you believe that Facebook isn’t monkeying with the trending list or actively trying to swing the vote, the reports serve as timely reminders of the ever-increasing potential dangers of Facebook’s hold on the news.
  • The question isn’t whether Facebook has outsize power to shape the world — of course it does, and of course you should worry about that power. If it wanted to, Facebook could try to sway elections, favor certain policies, or just make you feel a certain way about the world, as it once proved it could do in an experiment devised to measure how emotions spread online.
  • There is no evidence Facebook is doing anything so alarming now. The danger is nevertheless real. The biggest worry is that Facebook doesn’t seem to recognize its own power, and doesn’t think of itself as a news organization with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.
  • That myth should die. It’s true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.
  • “With Facebook, humans are never not involved. Humans are in every step of the process — in terms of what we’re clicking on, who’s shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans.” Everything you see on Facebook is therefore the product of these people’s expertise and considered judgment, as well as their conscious and unconscious biases, apart from possible malfeasance or potential corruption. It’s often hard to know which, because Facebook’s editorial sensibilities are secret. So are its personalities: Most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.
  • Facebook also has an unmistakable corporate ethos and point of view. The company is staffed mostly by wealthy coastal Americans who tend to support Democrats, and it is wholly controlled by a young billionaire who has expressed policy preferences that many people find objectionable.
  • You could argue that none of this is unusual. Many large media outlets are powerful, somewhat opaque, operated for profit, and controlled by wealthy people who aren’t shy about their policy agendas — Bloomberg News, The Washington Post, Fox News and The New York Times, to name a few. But there are some reasons to be even more wary of Facebook’s bias. One is institutional. Many mainstream outlets have a rigorous set of rules and norms about what’s acceptable and what’s not in the news business.
  • Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmically selected news is that it might reinforce people’s previously held points of view. If News Feed shows news that we’re each likely to Like, it could trap us into echo chambers and contribute to rising political polarization. In a study last year, Facebook’s scientists asserted the echo chamber effect was muted.
  • are Facebook’s engineering decisions subject to ethical review? Nobody knows.
  • The other reason to be wary of Facebook’s bias has to do with sheer size. Ms. Caplan notes that when studying bias in traditional media, scholars try to make comparisons across different news outlets. To determine if The Times is ignoring a certain story unfairly, look at competitors like The Washington Post and The Wall Street Journal. If those outlets are covering a story and The Times isn’t, there could be something amiss about the Times’s news judgment. Such comparative studies are nearly impossible for Facebook. Facebook is personalized, in that what you see on your News Feed is different from what I see on mine, so the only entity in a position to look for systemic bias across all of Facebook is Facebook itself. Even if you could determine the spread of stories across all of Facebook’s readers, what would you compare it to?
caelengrubb

How Galileo Galilei's discoveries helped create modern science - 0 views

  • Few people in history can claim as large a contribution to how we conduct and think about science as Galileo. His work revolutionized our entire outlook on what it means to study nature (and got him in some very hot water with the Roman Inquisition)
  • He is perhaps best known for his championing of Copernicus’ heliocentric model (the one that says the Earth and other planets orbit the Sun), but that is by no means the full extent of his legacy. Far, far from it.
  • Galileo earned himself a place among the stars as Europe’s global navigation satellite system bears his name
  • ...10 more annotations...
  • Galileo is certainly among the titans of science — in many ways, he’s one of its ‘founders’. His legacy includes contributions to the fields of physics, astronomy, math, engineering, and the application of the scientific method
  • He was an accomplished mathematician and inventor, designing (among others) several military compasses and the thermoscope. He was also the one to pick up the torch of modern astronomy from Copernicus, cementing the foundations of this field of study by proving his theories right.
  • Showing others what science can do, and how one should go about it, is Galileo’s most important achievement. Its effects still ripple through the lives of every researcher to this day
  • Since the days of Aristotle, scholars in Europe believed that heavier objects fall faster than lighter ones. Galileo showed that this wasn’t the case, using balls of the same materials but different weights and sizes. In one of his infamous experiments, he dropped two such balls from the top of the leaning tower of Pisa to show that objects of different weights accelerate just as fast towards the ground (air resistance notwithstanding).
  • The truth is Galileo’s experiments in this area used a more reliable but less flashy bunch of inclined planes that he rolled balls down on.
  • His interest in motion and the falling of objects was tightly linked to his interest in planets, stars, and the solar system.
  • Apart from his theoretical pursuits, Galileo was also an accomplished engineer — meaning he could also turn his knowledge to the solving of practical problems. Most of these, historical accounts tell us, were attempts by Galileo to earn a little bit of extra cash in order to support his extended family after his father passed away.
  • Among his creations are a set of military compasses (sectors) that were simple enough for artillery crews and surveyors to use.
  • He was also an early builder and user of telescopes and microscopes. Galileo, alongside a select few others, was among the first to use a refracting telescope as an instrument to observe heavenly bodies, in 1609
  • His fascination with celestial bodies and defense of the heliocentric model is what eventually led to the Inquisition cracking down on him and his works.
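The equal-acceleration claim in the falling-bodies bullets above has a one-line modern justification (a Newtonian sketch added here for clarity; Newton's laws came decades after Galileo, who established the result empirically):

\[
a = \frac{F}{m} = \frac{mg}{m} = g \approx 9.8\ \mathrm{m/s^2},
\]

so, once air resistance is set aside, the acceleration of a dropped object does not depend on its mass or size.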
kaylynfreeman

NASA's SLS Rocket to the Moon Faces Setback After Test - The New York Times - 0 views

  • After billions of dollars and a decade of work, NASA’s plans to send astronauts back to the moon had a new setback on Saturday. A planned eight-minute test firing of the four engines of a new mega rocket needed for the moon missions came to an abrupt end after only about a minute.
  • NASA officials, however, said that it was too early to predict delays, if any. “I don’t think at this point that we have enough information to know,” Jim Bridenstine, the NASA administrator, said during a news conference after the test. “It depends what the anomaly was and how challenging it’s going to be to fix it.”
  • The rocket, known as the Space Launch System, has yet to travel to space, and Saturday’s test was intended to be a key milestone. For the first time, the four engines on the booster stage were set to be fired for about eight minutes, simulating what they would do during an actual launch.
Javier E

We should know by now that progress isn't guaranteed - and often backfires - The Washin... - 1 views

  • We assume that progress is the natural order of things. Problems are meant to be solved. History is an upward curve of well-being. But what if all this is a fantasy?
  • our most powerful disruptions shared one characteristic: They were not widely foreseen
  • This was true of the terrorism of 9/11; the financial crisis of 2008-2009 and the parallel Great Recession; and now the coronavirus pandemic
  • ...13 more annotations...
  • In each case, there was a failure of imagination, as Tom Friedman has noted. Warnings found little receptiveness among the public or government officials. We didn’t think what happened could happen. The presumption of progress bred complacency.
  • We fooled ourselves into thinking we had engineered permanent improvements in our social and economic systems.
  • To be fair, progress as it’s commonly understood — higher living standards — has not been at a standstill. Many advances have made life better
  • Similar inconsistencies and ambiguities attach to economic growth. It raises some up and pushes others down.
  • What we should have learned by now is that progress is often grudging, incomplete or contradictory.
  • Still, the setbacks loom ever larger. Our governmental debt is high, and economic stability is low. Many of the claims of progress turn out to be exaggerated, superficial, delusional or unattainable,
  • Sure, the Internet enables marvelous things. But it also imposes huge costs on society
  • Global warming is another example. It is largely a result of the burning of fossil fuels, which has been the engine of our progress. Now, it is anti-progress.
  • the lesson of both economic growth and technologies is that they are double-edged swords and must be judged as such.
  • What connects these various problems is the belief that the future can be orchestrated.
  • The reality is that our control over the future is modest at best, nonexistent at worst. We react more to events than lead them.
  • We worship at the altar of progress without adequately acknowledging its limits.
  • it does mean that we should be more candid about what is possible. If not, we might yet again wander over the “border between reality and impossibility.”
lucieperloff

A Theory About Conspiracy Theories - The New York Times - 0 views

  • More than 1 in 3 Americans believe that the Chinese government engineered the coronavirus as a weapon, and another third are convinced that the Centers for Disease Control and Prevention has exaggerated the threat of Covid-19 to undermine President Trump.
  • Another is less so: a more solitary, anxious figure, moody and detached, perhaps including many who are older and living alone.
  • all conspiring to use Covid-19 for their own dark purposes.
    • lucieperloff
       
      Taking advantage of a very realistic fear and making it much more dramatic
  • ...12 more annotations...
  • Still, psychologists do not have a good handle on the types of people who are prone to buy into Big Lie theories, especially the horror-film versions.
    • lucieperloff
       
      To an extent, most people are prone to buy into these theories
  • The theories afford some psychological ballast, a sense of control, an internal narrative to make sense of a world that seems senseless.
    • lucieperloff
       
      People don't like thinking things are random!! They like having some semblance of control!!
  • “With all changes happening in politics, the polarization and lack of respect, conspiracy theories are playing a bigger role in people’s thinking and behavior possibly than ever,”
  • Conspiracy theories are as old as human society, of course, and in the days when communities were small and vulnerable, being on guard for hidden plots was likely a matter of personal survival,
    • lucieperloff
       
      Believing conspiracy theories could have been important to survival at one point in time
  • More than 1 in 3 Americans believe that the Chinese government engineered the coronavirus as a weapon, and another third are convinced that the Centers for Disease Control and Prevention has exaggerated the threat of Covid-19 to undermine President Trump.
    • lucieperloff
       
      Conspiracy theories affect everyone around us. Constantly.
  • “You really have a perfect storm, in that the theories are directed at those who have fears of getting sick and dying or infecting someone else,”
    • lucieperloff
       
      Theories take advantage of those who already are extremely vulnerable and stressed
  • About 60 percent scored low on the scales, meaning they were resistant to such theories; the other 40 percent ranged above average or higher.
  • For example, qualities like conscientiousness, modesty and altruism were very weakly related to a person’s susceptibility. Levels of anger or sincerity bore no apparent relation; nor did self-esteem.
    • lucieperloff
       
      Are these the results scientists were expecting?
  • The personality features that were solidly linked to conspiracy beliefs included some usual suspects: entitlement, self-centered impulsivity, cold-heartedness (the confident injustice collector), elevated levels of depressive moods and anxiousness (the moody figure, confined by age or circumstance).
  • It’s a pattern of magical thinking that goes well beyond garden variety superstition and usually comes across socially as disjointed, uncanny or “off.”
  • when distracted, people are far more likely to forward headlines and stories without vetting their sources much, if at all.
    • lucieperloff
       
      Distraction can easily undermine people's vetting and make them more likely to believe conspiracy theories
  • They have a core constituency, and in the digital era its members are going to quickly find one another.
    • lucieperloff
       
      Spread through the internet and social media