Home/ TOK Friends/ Group items tagged million


johnsonel7

67.3 Million in the United States Spoke a Foreign Language at Home in 2018 | Center for... - 0 views

  • Based on analysis of newly released Census Bureau data for 2018, the Center for Immigration Studies finds that 67.3 million residents in the United States now speak a language other than English at home, a number equal to the entire population of France.
  • In America's five largest cities, just under half (48 percent) of residents now speak a language other than English at home.
  • As a share of the population, 21.9 percent of U.S. residents speak a foreign language at home — more than double the 11 percent in 1980.
  • Languages spoken at home by more than a million people in 2018 were Spanish (41.5 million), Chinese (3.5 million), Tagalog (1.8 million), Vietnamese (1.5 million), Arabic (1.3 million), French (1.2 million), and Korean (1.1 million).
Javier E

The Coming Software Apocalypse - The Atlantic - 1 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code.
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • What made programming so difficult was that it required you to think like a computer.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated at the top of his class in electrical engineering at the California Institute of Technology.
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spent 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop.
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • This practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months were spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
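The single-bit-flip failure mode Barr’s team described can be sketched in a few lines. The throttle encoding below is purely illustrative, not Toyota’s actual representation:

```python
# Illustrative sketch of a single bit flip corrupting a stored value.
# The "throttle percentage" encoding here is hypothetical.

def flip_bit(value: int, bit: int) -> int:
    """Return value with one bit inverted, as a memory fault might."""
    return value ^ (1 << bit)

throttle_pct = 5                       # near idle
corrupted = flip_bit(throttle_pct, 6)  # bit 6 has weight 64
print(throttle_pct, "->", corrupted)   # 5 -> 69: near-idle becomes heavy throttle
```

Note that a fail-safe checking only for out-of-range values (say, over 100 percent) would not catch this corruption, consistent with the finding that Toyota’s fail-safe code “wasn’t enough to stop it.”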
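The elevator model described above can be sketched as a plain transition table, with every reachable state enumerated exhaustively in the spirit of model checking. The state and action names follow the article; the code itself is a hypothetical illustration, not output from any real model-based design tool:

```python
# Hypothetical sketch of the elevator model: states, allowed transitions,
# and an exhaustive walk over every reachable state.

TRANSITIONS = {
    "door_open":   {"close": "door_closed"},
    "door_closed": {"open": "door_open", "move": "moving"},
    "moving":      {"stop": "door_closed"},
}

def reachable(start):
    """Enumerate every state reachable from `start` via exhaustive search."""
    seen, frontier = set(), [start]
    while frontier:
        state = frontier.pop()
        if state not in seen:
            seen.add(state)
            frontier.extend(TRANSITIONS[state].values())
    return seen

# The rules are visible at a glance: the only way to get the elevator
# moving is to close the door first, because "door_open" offers no "move".
print(sorted(reachable("door_open")))
```

Because the model is finite, a checker can visit every state and verify a safety property (the car never moves with the door open) outright, rather than hoping tests happen to hit the bad case.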
colemorris

Los Angeles becomes first county to hit 1 million Covid-19 cases - 0 views

  • Los Angeles on Saturday became the first county in the nation to record 1 million coronavirus cases since the start of the pandemic.
  • “The presence of the U.K. variant in Los Angeles County is troubling, as our healthcare system is already severely strained with more than 7,500 people currently hospitalized,
  • On Saturday, Los Angeles reported 1,003,923 confirmed Covid-19 infections and 13,741 deaths.
  • The state of California is also reporting staggering numbers with more than 2.9 million confirmed cases, according to NBC News counts. Texas, with 2 million cases, and Florida, with 1.5 million, are the next two states with the most infections. New York, which was one of the country’s first and biggest hot spots, has recorded 1.2 million cases to date.
  • But its faster spread will lead to more cases overall, the study authors wrote, "exacerbating the burden on an already strained health care system, and resulting in more deaths."
    • colemorris: sad to think that things can get worse than this
  • Since then, rates have increased by 1,000 percent and have disproportionately affected Latinos, who comprise roughly half of the total population.
  • It will take a number of months to reach the level of vaccination needed in the population to curb ongoing transmission of the virus."
katherineharron

For many, the first line of defense against Covid-19 is out of reach - CNN - 0 views

  • Washing your hands frequently, with plenty of water and soap, is one of the simplest and most effective measures to stop the spread of the coronavirus. Yet due to a lack of water supply and indoor plumbing, three-quarters of households in the developing world won't be able to follow this advice, Tim Wainwright of the nonprofit WaterAid told The Guardian, because they lack some place to wash with soap and water. How will they cope when the pandemic escalates and there is no clean water to help stop contagion?
  • The pandemic may be raging in Europe and the US, but it is spreading across Asia, Africa and Latin America, from where it may yet return to strike the Northern Hemisphere again. This pandemic is a global threat, and it will not be defeated until our most vulnerable communities are safe. One of the crucial ways of keeping them safe is to ensure they have access to safe water and sanitation; never has the sixth UN Sustainable Development Goal, which aims to ensure just this by 2030, been more vital for saving and protecting lives.
  • Two years ago, Cape Town in South Africa came perilously close to "Day Zero" -- the day its 3.7 million residents would run out of water. Strict water rationing has been the order of the day ever since. Water scarcity increases the burden on the poorest of the poor -- women who must walk for miles to find water and carry it back to their homes. And, of course, water scarcity makes the challenge of delivering clean water and sanitation much more complex.
  • Every year, 1.5 million young children die of preventable infectious diseases such as diarrhea because of poor sanitation, according to UNICEF. One out of every three humans on our planet -- some 2.2 billion people -- lack access to safe drinking water, and six out of 10 lack access to proper sanitation, meaning toilets or safely managed sewage systems. Residents of the Kibera slum in Nairobi, Kenya, can share one pit latrine with over 100 people. In Dharavi, Asia's largest urban slum located in Mumbai, India, 80% of its seven million residents have no running water, the National Observer has noted. What hope do they have of washing their hands frequently?
  • More than two million Americans live without running water, indoor plumbing or wastewater treatment, according to the nonprofits Dig Deep and the US Water Alliance. A report by Food & Water Watch found that, in 2016, 15 million Americans had their water shut off due to an inability to pay water bills -- one out of every 20 households across the country. The US neither provides a constitutional right to water nor recognizes the UN Human Right to Water and Sanitation.
  • The truth is that in every country, water infrastructure -- where it is present -- is deteriorating.
  • But investing in clean water infrastructure saves lives, and we must seize this moment to ramp up investment in safe water and sanitation. Spending on water is not just the right thing to do; it's also the smart thing to do.
  • According to the UN, every $1 invested in safe drinking water in urban areas yields more than $3 in saved medical costs and added productivity, on average. For every $1 invested in basic sanitation, society makes $2.50 back. The return is higher in rural areas, with $7 gained or saved for every $1 invested in clean drinking water.
  • Without clean water and proper sanitation, we will not be able to combat the spread of the new coronavirus. Like Covid-19, water scarcity is a global problem that needs collective action. Never has there been a more urgent time to address the world's water crisis.
proudsa

UN: World's Refugees And Displaced To Exceed 60 Million This Year - 0 views

  • The number of people forcibly displaced worldwide is likely to have "far surpassed" a record 60 million this year, mainly driven by the Syrian war and other protracted conflicts, the United Nations said on Friday.
  • Nearly 2.5 million asylum seekers have requests pending, with Germany, Russia and the United States receiving the highest numbers of the nearly one million new claims lodged in the first half of the year, it said.
  • Developing countries bordering conflict zones still host the lion's share of the refugees, the report said, warning about growing "resentment" and "politicization of refugees."
  • Syria's civil war that began in 2011 has been the main driver of mass displacement, with more than 4.2 million Syrian refugees having fled abroad and 7.6 million uprooted within their shattered homeland as of mid-year, UNHCR said.
  • Many refugees will live in exile for years to come, it said. "In effect, if you become a refugee today your chances of going home are lower than at any time in more than 30 years."
Javier E

Huffington Post in Limbo at Verizon - NYTimes.com - 0 views

  • The Huffington Post sits at the center of a phenomenon that some describe as the birth of a new media establishment: Several digital start-ups, including BuzzFeed and Vice, are trying to upend news presentation the way cable channels encroached on broadcast television in the 1980s. By that measure, some in the industry say, $1 billion is a reasonable valuation for a site with more than 200 million unique visitors a month, and acquiring it is a smart play for Verizon as it follows other communications companies, like Comcast, in owning its own content.
  • Others see, instead, a frothy market that has led to overly high valuations for media companies, based largely on branding and a relentless focus on audience development techniques.
  • According to a document published in 2013 by the website The Smoking Gun, The Huffington Post was expected to generate $60 million in revenue in 2011, when AOL bought it, with $10 million in Ebitda (earnings before interest, tax, depreciation and amortization) growing to $165 million in revenue and $58 million in Ebitda by 2013. People with knowledge of its current finances said that its annual revenue is now in the hundreds of millions, and that its profitability depends on how generously its recent investments in a global expansion and video are assessed.
sanderk

Coronavirus deaths in US: 200,000 could die, researchers predict - Business Insider - 1 views

  • Last week, the country saw its cases spike more than 40% in just 24 hours. This week, the number of daily cases continues to rise — even as Americans practice social distancing by working from home, limiting outdoor excursions, and staying 6 feet away from one another.
  • They estimated only 12% of coronavirus cases (including asymptomatic ones) had been reported in the US as of March 15, which would mean about 29,000 infections had gone undiagnosed by that time. The US has reported more than 69,000 cases and over 1,000 deaths as of Thursday.
  • The most extreme model predicted that up to 1.2 million people could die. By comparison, a typical flu season in the US kills between 11,000 and 95,000 people, according to the Centers for Disease Control and Prevention. 
  • Some estimated that the CDC had reported more than 20% of COVID-19 cases as of March 15, but others predicted that the agency had identified just 5% of cases. Some predicted that the US could see 1 million deaths by the end of 2020, while others predicted that the death toll would be in the thousands.
  • The New York Times recently used CDC data to model how the virus could spread if no actions were taken to stop transmission in the US. The models show that between 160 million and 214 million people could be infected and as many as 200,000 to 1.7 million people could die.
  • Even if all patients were able to receive treatment at hospitals, however, the researchers predicted that about 1.2 million people in the US could die.
  • But since this particular coronavirus hasn't been seen before in humans, scientists aren't certain whether it will behave the same way. Plus, it's spreading in places with high temperatures, like Australia.
  • A second outbreak could also arise after people resume normal activity. The US asked citizens to avoid international travel starting March 19, but opening its borders again could fuel the virus' spread. The same goes for allowing citizens to return to work or use mass transit.
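The projections cited above come from compartmental epidemic models. A minimal SIR-style (susceptible–infected–recovered) sketch, with made-up parameters rather than the researchers’ actual ones, shows the basic mechanics:

```python
# Minimal SIR sketch with illustrative parameters; the models cited in
# the article are far more detailed and use different inputs.

def sir_step(s, i, r, beta=0.3, gamma=0.1):
    """One day: beta controls transmission, gamma the recovery rate."""
    n = s + i + r
    new_infections = beta * s * i / n
    new_recoveries = gamma * i
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

s, i, r = 327_000_000, 1_000, 0   # hypothetical US-scale starting point
for _ in range(180):
    s, i, r = sir_step(s, i, r)
print(f"ever infected after 180 days: {i + r:,.0f}")
```

Lowering beta (what social distancing does) slows the epidemic, which is why the same model yields such different totals under different assumptions about behavior.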
anniina03

When Did Ancient Humans Start to Speak? - The Atlantic - 0 views

  • The larynx, also called the voice box, is where the trouble begins: Its location is, or was, supposed to be the key to language.
  • Scientists have agreed for a while that the organ is lower down the throat in humans than it is in any other primate, or was in our ancestors. And for decades, they thought that low-down larynx was a sort of secret ingredient to speech because it enabled its bearers to produce a variety of distinctive vowels, like the ones that make beet, bat, and boot sound like different words. That would mean that speech—and, therefore, language—couldn’t have evolved until the arrival of anatomically modern Homo sapiens about 200,000 years ago
  • In fact, they propose that the necessary equipment—specifically, the throat shape and motor control that produce distinguishable vowels—has been around as long as 27 million years, when humans and Old World monkeys (baboons, mandrills, and the like) last shared a common ancestor.
  • Those speech abilities could include distinct vowels and consonants, syllables, or even syntax—all of which, according to LDT, should be impossible for any animal without a human vocal tract.
  • As John Locke, a linguistics professor at Lehman College, put it, “Motor control rots when you die.” Soft tissues like tongues and nerves and brains generally don’t fossilize; DNA sequencing is impossible past a few hundred thousand years; no one has yet found a diary or rap track recorded by a teenage Australopithecus.
  • One of the quantitative models the new study relies on, he says, doesn’t properly represent the shape of the larynx, tongue, and other parts we use to talk: “It would convert a mailing tube into a human vocal tract.” And according to Lieberman, laryngeal descent theory “never claimed language was not possible” prior to the critical changes in our ancestors’ throat anatomy. “They’re trying to set up a straw man,” he said.
  • Rather than 27 million years, Hickok proposes that the earliest bound on any sort of speech ability would be nearer to human ancestors’ split with the Pan genus, which includes chimpanzees and bonobos, our closest living relatives. That split happened about 5 million to 7 million years ago—certainly longer than 200,000 years, but a far cry from 27 million. Lieberman argues that the precursors of speech might have emerged a little more than 3 million years ago, when artifacts like jewelry appear in the archaeological record. The idea is that both language and jewelry are intimately related to the evolution of symbolic thinking.
anniina03

The Human Brain Evolved When Carbon Dioxide Was Lower - The Atlantic - 0 views

  • Kris Karnauskas, a professor of ocean sciences at the University of Colorado, has started walking around campus with a pocket-size carbon-dioxide detector. He’s not doing it to measure the amount of carbon pollution in the atmosphere. He’s interested in the amount of CO₂ in each room.
  • The indoor concentration of carbon dioxide concerns him—and not only for the usual reason. Karnauskas is worried that indoor CO₂ levels are getting so high that they are starting to impair human cognition.
  • Carbon dioxide, the same odorless and invisible gas that causes global warming, may be making us dumber.
  • ...11 more annotations...
  • “This is a hidden impact of climate change … that could actually impact our ability to solve the problem itself,” he said.
  • The science is, at first glance, surprisingly fundamental. Researchers have long believed that carbon dioxide harms the brain at very high concentrations. Anyone who’s seen the film Apollo 13 (or knows the real-life story behind it) may remember a moment when the mission’s three astronauts watch a gauge monitoring their cabin start to report dangerous levels of a gas. That gauge was measuring carbon dioxide. As one of the film’s NASA engineers remarks, if CO₂ levels rise too high, “you get impaired judgement, blackouts, the beginning of brain asphyxia.”
  • The same general principle, he argues, could soon affect people here on Earth. Two centuries of rampant fossil-fuel use have already spiked the amount of CO₂ in the atmosphere from about 280 parts per million before the Industrial Revolution to about 410 parts per million today. For Earth as a whole, that pollution traps heat in the atmosphere and causes climate change. But more locally, it also sets a baseline for indoor levels of carbon dioxide: You cannot ventilate a room’s carbon-dioxide levels below the global average.
  • In fact, many rooms have a much higher CO₂ level than the atmosphere, since ventilation systems don’t work perfectly.
  • On top of that, some rooms—in places such as offices, hospitals, and schools—are filled with many breathing people, that is, many people who are themselves exhaling carbon dioxide.
  • As the amount of atmospheric CO₂ keeps rising, indoor CO₂ will climb as well.
  • in one 2016 study Danish scientists cranked up indoor carbon-dioxide levels to 3,000 parts per million—more than seven times outdoor levels today—and found that their 25 subjects suffered no cognitive impairment or health issues. Only when scientists infused that same air with other trace chemicals and organic compounds emitted by the human body did the subjects begin to struggle, reporting “headache, fatigue, sleepiness, and difficulty in thinking clearly.” The subjects also took longer to solve basic math problems. The same lab, in another study, found that indoor concentrations of pure CO₂ could get to 5,000 parts per million and still cause little difficulty, at least for college students.
  • But other research is not as optimistic. When scientists at NASA’s Johnson Space Center tested the effects of CO₂ on about two dozen “astronaut-like subjects,” they found that their advanced decision-making skills declined with CO₂ at 1,200 parts per million. But cognitive skills did not seem to worsen as CO₂ climbed past that mark, and the intensity of the effect seemed to vary from person to person.
  • There’s evidence that carbon-dioxide levels may impair only the most complex and challenging human cognitive tasks. And we still don’t know why.
  • No one has looked at the effects of indoor CO₂ on children, the elderly, or people with health problems. Likewise, studies have so far exposed people to very high carbon levels for only a few hours, leaving open the question of what days-long exposure could do.
  • Modern humans, as a species, are only about 300,000 years old, and the ambient CO₂ that we encountered for most of our evolutionary life—from the first breath of infants to the last rattle of a dying elder—was much lower than the ambient CO₂ today. I asked Gall: Has anyone looked to see if human cognition improves under lower carbon-dioxide levels? If you tested someone in a room that had only 250 parts per million of carbon dioxide—a level much closer to that of Earth’s atmosphere three centuries or three millennia ago—would their performance on tests improve? In other words, is it possible that human cognitive ability has already declined?
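The claim above that you cannot ventilate a room’s CO₂ below the global average follows from a simple well-mixed mass balance; a minimal sketch, where the per-person CO₂ generation rate and the classroom numbers are illustrative assumptions, not figures from the article:

```python
def steady_state_co2(outdoor_ppm, occupants, co2_per_person_ls, ventilation_ls):
    """Single-zone, well-mixed steady state:
    indoor ppm = outdoor ppm + generation / ventilation (both in L/s)."""
    generation_ls = occupants * co2_per_person_ls
    return outdoor_ppm + (generation_ls / ventilation_ls) * 1e6

# Illustrative classroom: 25 occupants, ~0.005 L/s of exhaled CO2 per
# sedentary person (assumed typical value), 100 L/s of outdoor air.
indoor = steady_state_co2(outdoor_ppm=410, occupants=25,
                          co2_per_person_ls=0.005, ventilation_ls=100)

assert indoor > 410  # indoor can never fall below the outdoor baseline
# Here: 410 + (0.125 / 100) * 1e6 = 1,660 ppm
```

As the outdoor concentration rises, the whole curve shifts upward, which is the article's point: the atmospheric level is a floor that no amount of ventilation can get under.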
kushnerha

Our Natural History, Endangered - The New York Times - 0 views

  • Worse, this rumored dustiness reinforces the widespread notion that natural history museums are about the past — just a place to display bugs and brontosaurs. Visitors may go there to be entertained, or even awe-struck, but they are often completely unaware that curators behind the scenes are conducting research into climate change, species extinction and other pressing concerns of our day. That lack of awareness is one reason these museums are now routinely being pushed to the brink. Even the National Science Foundation, long a stalwart of federal support for these museums, announced this month that it was suspending funding for natural history collections as it conducts a yearlong budget review.
  • It gets worse: A new Republican governor last year shut down the renowned Illinois State Museum, ostensibly to save the state $4.8 million a year. The museum pointed out that this would actually cost $33 million a year in lost tourism revenue and an untold amount in grants. But the closing went through, endangering a trove of 10 million artifacts, from mastodon bones to Native American tools, collected over 138 years, and now just languishing in the shuttered building. Eric Grimm, the museum’s director of science, characterized it as an act of “political corruption and malevolent anti-intellectualism.”
  • Other museums have survived by shifting their focus from research to something like entertainment.
  • ...9 more annotations...
  • The pandering can be insidious, too. The Perot Museum of Nature and Science in Dallas, which treats visitors to a virtual ride down a hydraulic fracturing well, recently made headlines for avoiding explicit references to climate change. Other museums omit scientific information on evolution. “We don’t need people to come in here and reject us,”
  • Even the best natural history museums have been obliged to reduce their scientific staff in the face of government cutbacks and the decline in donations following the 2008 economic crash. They still have their collections, and their public still comes through the door. But they no longer employ enough scientists to interpret those collections adequately for visitors or the world at large. Hence the journal Nature last year characterized natural history collections as “the endangered dead.”
  • these collections are less about the past than about our world and how it is changing. Sediment cores like the ones at the Illinois State Museum, for instance, may not sound terribly important, but the pollen in them reveals how past climates changed, what species lived and died as a result, and thus how our own future may be rapidly unfolding.
  • Natural history museums are so focused on the future that they have for centuries routinely preserved such specimens to answer questions they didn’t yet know how to ask, requiring methodologies that had not yet been invented, to make discoveries that would have been, for the original collectors, inconceivable.
  • The people who first put gigantic mammoth and mastodon specimens in museums, for instance, did so mainly out of dumb wonderment. But those specimens soon led to the stunning 18th-century recognition that parts of God’s creation could become extinct. The heretical idea of extinction then became an essential preamble to Darwin, whose understanding of evolution by natural selection depended in turn on the detailed study of barnacle specimens collected and preserved over long periods and for no particular reason. Today, those same specimens continue to answer new questions with the help of genome sequencing, CT scans, stable isotope analysis and other technologies.
  • These museums also play a critical role in protecting what’s left of the natural world, in part because they often combine biological and botanical knowledge with broad anthropological experience.
  • “You have no nationality. You are scientists. You speak for nature.” Just since 1999, according to the Field Museum, inventories by its curators and their collaborators have been a key factor in the protection of 26.6 million acres of wilderness, mainly in the headwaters of the Amazon.
  • It may be optimistic to say that natural history museums have saved the world. It may even be too late for that. But they provide one other critical service that can save us, and our sense of wonder: Almost everybody in this country — even children in Denver who have never been to the Rocky Mountains, or people in San Francisco who have never walked on a Pacific Ocean beach — goes to a natural history museum at some point in his life, and these visits influence us in deep and unpredictable ways.
  • we dimly begin to understand the passage of time and cultures, and how our own species fits amid millions of others. We start to understand the strangeness and splendor of the only planet where we will ever have the great pleasure of living.
oliviaodon

White House Pushes 'Alternative Facts.' Here Are the Real Ones. - The New York Times - 0 views

  • Kellyanne Conway, counselor to President Trump, said on NBC’s “Meet the Press” on Sunday that the White House had put forth “alternative facts” to ones reported by the news media about the size of Mr. Trump’s inauguration crowd.
  • In leveling this attack, the president and Mr. Spicer made a series of false statements. Here are the facts. In a speech at the C.I.A. on Saturday, Mr. Trump said the news media had constructed a feud between him and the intelligence community. “They sort of made it sound like I had a ‘feud’ with the intelligence community,” he said. “It is exactly the opposite, and they understand that, too.” In fact, Mr. Trump repeatedly criticized the intelligence agencies during his transition to office and has questioned their conclusion that Russia meddled in the election to aid his candidacy. He called their assessment “ridiculous” and suggested that it had been politically motivated.
  • Mr. Trump said of his inauguration crowd, “It looked honestly like a million and a half people, whatever it was, it was, but it went all the way back to the Washington Monument.” Aerial photographs clearly show that the crowd did not stretch to the Washington Monument. An analysis by The New York Times, comparing photographs from Friday to ones taken of Barack Obama’s 2009 inauguration, showed that Mr. Trump’s crowd was significantly smaller and less than the 1.5 million people he claimed. An expert hired by The Times found that Mr. Trump’s crowd on the National Mall was about a third of the size of Mr. Obama’s in 2009.
  • ...2 more annotations...
  • Speaking later on Saturday in the White House briefing room, Mr. Spicer amplified Mr. Trump’s false claims. “This was the largest audience to ever witness an inauguration — period — both in person and around the globe,” he said. There is no evidence to support this claim. Not only was Mr. Trump’s inauguration crowd far smaller than Mr. Obama’s in 2009, but he also drew fewer television viewers in the United States (30.6 million) than Mr. Obama did in 2009 (38 million) and Ronald Reagan did in 1981 (42 million), Nielsen reported. Figures for online viewership were not available.
  • Mr. Spicer said that Washington’s Metro system had greater ridership on Friday than it did for Mr. Obama’s 2013 inauguration. “We know that 420,000 people used the D.C. Metro public transit yesterday, which actually compares to 317,000 that used it for President Obama’s last inaugural,” Mr. Spicer said. Neither number is correct, according to the transit system, which reported 570,557 entries into the rail system on Friday, compared with 782,000 on Inauguration Day in 2013.
  •  
    This article provides examples of alternative facts and "real" facts.
Javier E

Was There a Civilization On Earth Before Humans? - The Atlantic - 0 views

  • When it comes to direct evidence of an industrial civilization—things like cities, factories, and roads—the geologic record doesn’t go back past what’s called the Quaternary period 2.6 million years ago
  • if we’re going back this far, we’re not talking about human civilizations anymore. Homo sapiens didn’t make their appearance on the planet until just 300,000 years or so ago. That means the question shifts to other species, which is why Gavin called the idea the Silurian hypothesis
  • could researchers find clear evidence that an ancient species built a relatively short-lived industrial civilization long before our own? Perhaps, for example, some early mammal rose briefly to civilization building during the Paleocene epoch about 60 million years ago. There are fossils, of course. But the fraction of life that gets fossilized is always minuscule and varies a lot depending on time and habitat. It would be easy, therefore, to miss an industrial civilization that only lasted 100,000 years—which would be 500 times longer than our industrial civilization has made it so far.
  • ...11 more annotations...
  • Given that all direct evidence would be long gone after many millions of years, what kinds of evidence might then still exist? The best way to answer this question is to figure out what evidence we’d leave behind if human civilization collapsed at its current stage of development.
  • Now that our industrial civilization has truly gone global, humanity’s collective activity is laying down a variety of traces that will be detectable by scientists 100 million years in the future. The extensive use of fertilizer, for example
  • And then there’s all that plastic. Studies have shown increasing amounts of plastic “marine litter” are being deposited on the seafloor everywhere from coastal areas to deep basins and even in the Arctic. Wind, sun, and waves grind down large-scale plastic artifacts, leaving the seas full of microscopic plastic particles that will eventually rain down on the ocean floor, creating a layer that could persist for geological timescales.
  • Likewise our relentless hunger for the rare-Earth elements used in electronic gizmos. Far more of these atoms are now wandering around the planet’s surface because of us than would otherwise be the case. They might also show up in future sediments, too.
  • Once you realize, through climate change, the need to find lower-impact energy sources, the less impact you will leave. So the more sustainable your civilization becomes, the smaller the signal you’ll leave for future generations.
  • The more fossil fuels we burn, the more the balance of these carbon isotopes shifts. Atmospheric scientists call this shift the Suess effect, and the change in isotopic ratios of carbon due to fossil-fuel use is easy to see over the last century. Increases in temperature also leave isotopic signals. These shifts should be apparent to any future scientist who chemically analyzes exposed layers of rock from our era. Along with these spikes, this Anthropocene layer might also hold brief peaks in nitrogen, plastic nanoparticles, and even synthetic steroids
  • Fifty-six million years ago, Earth passed through the Paleocene-Eocene Thermal Maximum (PETM). During the PETM, the planet’s average temperature climbed as high as 15 degrees Fahrenheit above what we experience today. It was a world almost without ice, as typical summer temperatures at the poles reached close to a balmy 70 degrees Fahrenheit.
  • While there is evidence that the PETM may have been driven by a massive release of buried fossil carbon into the air, it’s the timescale of these changes that matter. The PETM’s isotope spikes rise and fall over a few hundred thousand years. But what makes the Anthropocene so remarkable in terms of Earth’s history is the speed at which we’re dumping fossil carbon into the atmosphere. There have been geological periods where Earth’s CO2 has been as high or higher than today, but never before in the planet’s multibillion-year history has so much buried carbon been dumped back into the atmosphere so quickly
  • So the isotopic spikes we do see in the geologic record may not be spiky enough to fit the Silurian hypothesis’s bill.
  • Ironically, however, the most promising marker of humanity’s presence as an advanced civilization is a by-product of one activity that may threaten it most.
  • “How do you know we’re the only time there’s been a civilization on our own planet?”
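The carbon-isotope shift the excerpts describe (the Suess effect) is conventionally reported in delta notation relative to a reference ratio; a minimal sketch of that arithmetic, with illustrative (not measured) sample ratios:

```python
# 13C/12C ratio of the VPDB reference standard (a commonly cited value).
VPDB_RATIO = 0.011180

def delta_13c(sample_ratio, standard_ratio=VPDB_RATIO):
    """delta-13C in per mil: how far a sample's 13C/12C ratio
    deviates from the standard ratio."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# Fossil fuels are depleted in 13C, so burning them dilutes the
# atmosphere's 13C/12C ratio. Both sample ratios below are illustrative.
preindustrial = delta_13c(0.0111073)  # roughly -6.5 per mil
modern = delta_13c(0.0110850)         # roughly -8.5 per mil

assert modern < preindustrial  # the Suess effect: delta-13C drifts downward
```

A geologist reading our strata millions of years from now would see exactly this kind of downward spike in delta-13C, which is why the article treats it as a candidate signature of an industrial civilization.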
aprossi

Fauci says 100 million vaccine doses in Biden's first 100 days is doable - 1 views

  • The latest on the coronavirus pandemic and vaccines
  • Fauci says 100 million vaccine doses in Biden's first 100 days is doable
  • Dr. Anthony Fauci said on Friday morning that "it's quite feasible" the United States can achieve President-elect Joe Biden's goal to distribute 100 million doses of Covid-19 vaccine in his first 100 days of office. Fauci is set to serve as Biden's chief medical adviser.
  • ...4 more annotations...
  • Right now, even now, we've gone from half a million a day to 750,000 a day.
  • "If we get about 70% to 85% of the people in the country vaccinated, we likely will get to that umbrella of herd immunity,
  • His remarks come a day after he outlined a $1.9 trillion emergency legislative package to fund a nationwide vaccination effort and provide direct economic relief to Americans amid the coronavirus pandemic, telling Americans that "the health of our nation is at stake."
  • 100 million vaccine shots in his initial 100 days in office.
jmfinizio

Man who accidentally threw out a bitcoin fortune offers $70 million for permission to d... - 0 views

  • A British man who accidentally threw a hard drive loaded with bitcoin into the trash has offered the local authority where he lives more than $70 million if it allows him to excavate a landfill site.
  • held a digital store of 7,500 bitcoins,
  • he discovered that he had mistakenly thrown the hard drive out with the trash.
  • ...5 more annotations...
  • They can then be used as payment, with every transaction being recorded in a public list known as blockchain.
  • "The plan would be to dig a specific area of the landfill based on a grid reference system and recover the hard drive whilst adhering to all safety and environmental standards,
  • they refused the offer and won't even have a face to face discussion with me on the matter."
  • Howells first discovered that the hard drive was missing when his bitcoin was worth around $9 million. Based on the current rates, he estimates it would be worth around $273 million.
  • "The cost of digging up the landfill, storing and treating the waste could run into millions of pounds -- without any guarantee of either finding it or it still being in working order."
lucieperloff

What NASA's OSIRIS-REx Mission Could Teach Us | Time - 0 views

  • and currently 322 million km (200 million mi.) from Earth
    • lucieperloff
       
      Is that near or far? (Compared to other asteroids)
  • collect a sample.
  • ...3 more annotations...
  • TAGSAM was in contact with the surface of Bennu for six seconds, and collected material for five—the greatest share within the first three seconds.
    • lucieperloff
       
      Can they really get a good sample in only 3 seconds?
  • Studying their elemental composition can yield clues to planetary formation, cosmic chemistry and even the emergence of life on Earth.
  • Only then will the little bit of rock and dirt from the seven-year, $800 million mission be in the hands of the scientists. And only then will we begin to reveal the secrets that Bennu may hold.
    • lucieperloff
       
      A huge project but not much support/publicity
Javier E

When a Shitposter Runs a Social Media Platform - The Bulwark - 0 views

  • This is an unfortunate and pernicious pattern. Musk often refers to himself as moderate or independent, but he routinely treats far-right fringe figures as people worth taking seriously—and, more troublingly, as reliable sources of information.
  • By doing so, he boosts their messages: A message retweeted by or receiving a reply from Musk will potentially be seen by millions of people.
  • Also, people who pay for Musk’s Twitter Blue badges get a lift in the algorithm when they tweet or reply; because of the way Twitter Blue became a culture war front, its subscribers tend to skew to the right.
  • ...19 more annotations...
  • The important thing to remember amid all this, and the thing that has changed the game when it comes to the free speech/content moderation conversation, is that Elon Musk himself loves conspiracy theories.
  • The media isn’t just unduly critical—a perennial sore spot for Musk—but “all news is to some degree propaganda,” meaning he won’t label actual state-affiliated propaganda outlets on his platform to distinguish their stories from those of the New York Times.
  • In his mind, they’re engaged in the same activity, so he strikes the faux-populist note that the people can decide for themselves what is true, regardless of objectively very different track records from different sources.
  • Musk’s “just asking questions” maneuver is a classic Trump tactic that enables him to advertise conspiracy theories while maintaining a sort of deniability.
  • At what point should we infer that he’s taking the concerns of someone like Loomer seriously not despite but because of her unhinged beliefs?
  • Musk’s skepticism seems largely to extend to criticism of the far-right, while his credulity for right-wing sources is boundless.
  • This is part of the argument for content moderation that limits the dispersal of bullshit: People simply don’t have the time, energy, or inclination to seek out the boring truth when stimulated by some online outrage.
  • Refuting bullshit requires some technological literacy, perhaps some policy knowledge, but most of all it requires time and a willingness to challenge your own prior beliefs, two things that are in precious short supply online.
  • Brandolini’s Law holds that the amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it.
  • Here we can return to the example of Loomer’s tweet. People did fact-check her, but it hardly matters: Following Musk’s reply, she ended up receiving over 5 million views, an exponentially larger online readership than is normal for her. In the attention economy, this counts as a major win. “Thank you so much for posting about this, @elonmusk!” she gushed in response to his reply. “I truly appreciate it.”
  • the problem isn’t limited to elevating Loomer. Musk had his own stock of misinformation to add to the pile. After interacting with her account, Musk followed up last Tuesday by tweeting out a 2021 Federalist article claiming that Facebook founder Mark Zuckerberg had “bought” the 2020 election, an allegation previously raised by Trump and others, and which Musk had also brought up during his recent interview with Tucker Carlson.
  • If Zuckerberg wanted to use his vast fortune to tip the election, it would have been vastly more efficient to create a super PAC with targeted get-out-the-vote operations and advertising. Notwithstanding legitimate criticisms one can make about Facebook’s effect on democracy, and whatever Zuckerberg’s motivations, you have to squint hard to see this as something other than a positive act addressing a real problem.
  • It’s worth mentioning that the refutations I’ve just sketched of the conspiratorial claims made by Loomer and Musk come out to around 1,200 words. The tweets they wrote, read by millions, consisted of fewer than a hundred words in total. That’s Brandolini’s Law in action—an illustration of why Musk’s cynical free-speech-over-all approach amounts to a policy in favor of disinformation and against democracy.
  • Moderation is a subject where Zuckerberg’s actions provide a valuable point of contrast with Musk. Through Facebook’s independent oversight board, which has the power to overturn the company’s own moderation decisions, Zuckerberg has at least made an effort to have credible outside actors inform how Facebook deals with moderation issues
  • Meanwhile, we are still waiting on the content moderation council that Elon Musk promised last October:
  • The problem is about to get bigger than unhinged conspiracy theorists occasionally receiving a profile-elevating reply from Musk. Twitter is the venue that Tucker Carlson, whom advertisers fled and Fox News fired after it agreed to pay $787 million to settle a lawsuit over its election lies, has chosen to make his comeback. Carlson and Musk are natural allies: They share an obsessive anti-wokeness, a conspiratorial mindset, and an unaccountable sense of grievance peculiar to rich, famous, and powerful men who have taken it upon themselves to rail against the “elites,” however idiosyncratically construed
  • If the rumors are true that Trump is planning to return to Twitter after an exclusivity agreement with Truth Social expires in June, Musk’s social platform might be on the verge of becoming a gigantic rec room for the populist right.
  • These days, Twitter increasingly feels like a neighborhood where the amiable guy-next-door is gone and you suspect his replacement has a meth lab in the basement.
  • even if Twitter’s increasingly broken information environment doesn’t sway the results, it is profoundly damaging to our democracy that so many people have lost faith in our electoral system. The sort of claims that Musk is toying with in his feed these days do not help. It is one thing for the owner of a major source of information to be indifferent to the content that gets posted to that platform. It is vastly worse for an owner to actively fan the flames of disinformation and doubt.
kirkpatrickry

Charles Koch's Disturbing High School Economics Project Teaches 'Sacrificing Lives for ... - 0 views

  • Charles Koch is known for being CEO of industrial giant Koch Industries and a chief financier of the massive conservative political operation he runs with his brother David. In recent years, student activists and investigative journalists have exposed another of Koch’s hats: mega-donor to hundreds of colleges and universities, often funding free-market-focused academic centers housed at public and private schools alike. One Koch-funded program is advocating cutthroat economics to grade school students, even sacrificing lives for profits.
  • From 2005 to 2014, the Charles Koch Foundation doled out nearly $108 million to colleges and universities. The school that has accepted the second highest total from the Charles Koch Foundation from 2005 to 2014 is Florida State University, whose economics department entered into a 2008 agreement that gave the foundation a say in its curriculum and hiring decisions, as Dave Levinthal of the Center for Public Integrity reported. One part of the 2008 agreement, which proposed a $6.6 million budget to be funded by the Charles Koch Foundation and unnamed “Donor Partners,” established a “Program for Excellence in Economic Education” within the Gus A. Stavros Center for the Advancement of Free Enterprise and Economic Education, part of the economics department. Annual reports confirm these funding arrangements
Javier E

How Did Consciousness Evolve? - The Atlantic - 0 views

  • Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?
  • The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions.
  • The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence
  • ...23 more annotations...
  • Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition.
  • It coordinates something called overt attention – aiming the satellite dishes of the eyes, ears, and nose toward anything important.
  • Selective enhancement therefore probably evolved sometime between hydras and arthropods—between about 700 and 600 million years ago, close to the beginning of complex, multicellular life
  • The next evolutionary advance was a centralized controller for attention that could coordinate among all senses. In many animals, that central controller is a brain area called the tectum
  • At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing.
  • All vertebrates—fish, reptiles, birds, and mammals—have a tectum. Even lampreys have one, and they appeared so early in evolution that they don’t even have a lower jaw. But as far as anyone knows, the tectum is absent from all invertebrates
  • According to fossil and genetic evidence, vertebrates evolved around 520 million years ago. The tectum and the central control of attention probably evolved around then, during the so-called Cambrian Explosion when vertebrates were tiny wriggling creatures competing with a vast range of invertebrates in the sea.
  • The tectum is a beautiful piece of engineering. To control the head and the eyes efficiently, it constructs something called an internal model, a feature well known to engineers. An internal model is a simulation that keeps track of whatever is being controlled and allows for predictions and planning.
  • The tectum’s internal model is a set of information encoded in the complex pattern of activity of the neurons. That information simulates the current state of the eyes, head, and other major body parts, making predictions about how these body parts will move next and about the consequences of their movement
  • In fish and amphibians, the tectum is the pinnacle of sophistication and the largest part of the brain. A frog has a pretty good simulation of itself.
  • With the evolution of reptiles around 350 to 300 million years ago, a new brain structure began to emerge – the wulst. Birds inherited a wulst from their reptile ancestors. Mammals did too, but our version is usually called the cerebral cortex and has expanded enormously
  • The cortex also takes in sensory signals and coordinates movement, but it has a more flexible repertoire. Depending on context, you might look toward, look away, make a sound, do a dance, or simply store the sensory event in memory in case the information is useful for the future.
  • The most important difference between the cortex and the tectum may be the kind of attention they control. The tectum is the master of overt attention—pointing the sensory apparatus toward anything important. The cortex ups the ante with something called covert attention. You don’t need to look directly at something to covertly attend to it. Even if you’ve turned your back on an object, your cortex can still focus its processing resources on it
  • The cortex needs to control that virtual movement, and therefore like any efficient controller it needs an internal model. Unlike the tectum, which models concrete objects like the eyes and the head, the cortex must model something much more abstract. According to the AST, it does so by constructing an attention schema—a constantly updated set of information that describes what covert attention is doing moment-by-moment and what its consequences are
  • Covert attention isn’t intangible. It has a physical basis, but that physical basis lies in the microscopic details of neurons, synapses, and signals. The brain has no need to know those details. The attention schema is therefore strategically vague. It depicts covert attention in a physically incoherent way, as a non-physical essence
  • This, according to the theory, is the origin of consciousness. We say we have consciousness because deep in the brain, something quite primitive is computing that semi-magical self-description.
  • I’m reminded of Teddy Roosevelt’s famous quote, “Do what you can with what you have where you are.” Evolution is the master of that kind of opportunism. Fins become feet. Gill arches become jaws. And self-models become models of others. In the AST, the attention schema first evolved as a model of one’s own covert attention. But once the basic mechanism was in place, according to the theory, it was further adapted to model the attentional states of others, to allow for social prediction. Not only could the brain attribute consciousness to itself, it began to attribute consciousness to others.
  • In the AST’s evolutionary story, social cognition begins to ramp up shortly after the reptilian wulst evolved. Crocodiles may not be the most socially complex creatures on earth, but they live in large communities, care for their young, and can make loyal if somewhat dangerous pets.
  • If AST is correct, 300 million years of reptilian, avian, and mammalian evolution have allowed the self-model and the social model to evolve in tandem, each influencing the other. We understand other people by projecting ourselves onto them. But we also understand ourselves by considering the way other people might see us.
  • The cortical networks in the human brain that allow us to attribute consciousness to others overlap extensively with the networks that construct our own sense of consciousness.
  • Language is perhaps the most recent big leap in the evolution of consciousness. Nobody knows when human language first evolved. Certainly we had it by 70 thousand years ago when people began to disperse around the world, since all dispersed groups have a sophisticated language. The relationship between language and consciousness is often debated, but we can be sure of at least this much: once we developed language, we could talk about consciousness and compare notes
  • Maybe partly because of language and culture, humans have a hair-trigger tendency to attribute consciousness to everything around us. We attribute consciousness to characters in a story, puppets and dolls, storms, rivers, empty spaces, ghosts and gods. Justin Barrett called it the Hyperactive Agency Detection Device, or HADD
  • the HADD goes way beyond detecting predators. It’s a consequence of our hyper-social nature. Evolution turned up the amplitude on our tendency to model others and now we’re supremely attuned to each other’s mind states. It gives us our adaptive edge. The inevitable side effect is the detection of false positives, or ghosts.
Javier E

Mark Zuckerberg, Let Me Pay for Facebook - NYTimes.com - 0 views

  • 93 percent of the public believes that “being in control of who can get information about them is important,” and yet the amount of information we generate online has exploded and we seldom know where it all goes.
  • the pop-up and the ad-financed business model. The former is annoying but it’s the latter that is helping destroy the fabric of a rich, pluralistic Internet.
  • Facebook makes about 20 cents per user per month in profit. This is a pitiful sum, especially since the average user spends an impressive 20 hours on Facebook every month, according to the company. This paltry profit margin drives the business model: Internet ads are basically worthless unless they are hyper-targeted based on tracking and extensive profiling of users. This is a bad bargain, especially since two-thirds of American adults don’t want ads that target them based on that tracking and analysis of personal behavior.
  • This way of doing business rewards huge Internet platforms, since ads that are worth so little can support only companies with hundreds of millions of users.
  • Ad-based businesses distort our online interactions. People flock to Internet platforms because they help us connect with one another or the world’s bounty of information — a crucial, valuable function. Yet ad-based financing means that the companies have an interest in manipulating our attention on behalf of advertisers, instead of letting us connect as we wish.
  • Many users think their feed shows everything that their friends post. It doesn’t. Facebook runs its billion-plus users’ newsfeed by a proprietary, ever-changing algorithm that decides what we see. If Facebook didn’t have to control the feed to keep us on the site longer and to inject ads into our stream, it could instead offer us control over this algorithm.
  • we’re not starting from scratch. Micropayment systems that would allow users to spend a few cents here and there, not be so easily tracked by all the Big Brothers, and even allow personalization were developed in the early days of the Internet. Big banks and large Internet platforms didn’t show much interest in this micropayment path, which would limit their surveillance abilities. We can revive it.
  • What to do? It’s simple: Internet sites should allow their users to be the customers. I would, as I bet many others would, happily pay more than 20 cents per month for a Facebook or a Google that did not track me, upgraded its encryption and treated me as a customer whose preferences and privacy matter.
  • Many people say that no significant number of users will ever pay directly for Internet services. But that is because we are misled by the mantra that these services are free. With growing awareness of the privacy cost of ads, this may well change. Millions of people pay for Netflix despite the fact that pirated copies of many movies are available free. We eventually pay for ads, anyway, as that cost is baked into products we purchase
  • A seamless, secure micropayment system that spreads a few pennies at a time as we browse a social network, up to a preset monthly limit, would alter the whole landscape for the better.
  • Many nonprofits and civic groups that were initially thrilled about their success in using Facebook to reach people are now despondent as their entries are less and less likely to reach people who “liked” their posts unless they pay Facebook to help boost their updates.
  • If even a quarter of Facebook’s 1.5 billion users were willing to pay $1 per month in return for not being tracked or targeted based on their data, that would yield more than $4 billion per year — surely a number worth considering.
  • Mr. Zuckerberg has reportedly spent more than $30 million to buy the homes around his in Palo Alto, Calif., and more than $100 million for a secluded parcel of land in Hawaii. He knows privacy is worth paying for. So he should let us pay a few dollars to protect ours.
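The op-ed's revenue estimate is simple arithmetic, and checking it with the figures as quoted (a quarter of 1.5 billion users paying $1 per month) is a quick sketch — the numbers below come straight from the excerpt, not from any Facebook disclosure:

```python
# Back-of-envelope check on the op-ed's subscription estimate.
users = 1_500_000_000      # Facebook's user base as quoted
paying_share = 0.25        # "a quarter of Facebook's 1.5 billion users"
monthly_fee = 1.00         # "$1 per month"

annual_revenue = users * paying_share * monthly_fee * 12
print(f"${annual_revenue / 1e9:.1f} billion per year")  # prints "$4.5 billion per year"
```

That works out to $4.5 billion annually, consistent with the article's "more than $4 billion per year."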
jlessner

Super Bowl Ads: Incredibly Cheap or an Incredible Waste of Money? - The Atlantic - 0 views

  • For the second straight year, advertisers are willing to pay about $4 million for a 30-second Super Bowl spot, and for the umpteenth straight year, there are questions about whether Sunday represents a sensational steal or an insane rip-off.
  • Compared to another primetime TV commercial, there's no question: Super Bowl ads are cheap.
  • But compared to, say, any other sensible way of spending money, many Super Bowl ads are something like a ritual financial sacrifice, a pyre of money set on fire to please the Buzz Gods for no particular reason.
  • Super Bowl's audience exists on a different planet from the rest of everything we call "pop culture."
  • So if you add the year's biggest movie and the year's biggest TV show and the year's biggest album (while pretending that there is no overlap), you sum to an audience of 69 million, total. This year's Super Bowl is projected to have 120 million viewers watching—all at once. There is pop culture, and then there is the Super Bowl.
  • To understand why the Super Bowl is such a good deal by TV advertising standards, you have to understand the first thing about TV advertising. It's not about the price you pay for the advertisement. It's about the price you pay for the eyeballs
  • The argument "$4 million for 30 seconds is absurd" is sort of like saying "$1,000 for dinner is absurd." Yes, $1,000 is an expensive dinner-for-one. But what about a fancy dinner for 10 friends? Or 20 friends? Or 100? The more people at the table, the more that $1,000 starts to look like a bargain.
  • The Super Bowl's rate this year is about $35 to reach 1,000 people. Is that expensive? Not at all.
  • People in living rooms across the country say, in unison, "Everybody shut up, I want to experience this corporate messaging so that I can engage with the brand."
  • For four hours a year, a Super Bowl viewer transforms from an ordinary human, constantly rejecting the bombardment of advertising, into a marketing professor's platonic ideal of consumer, diligently seeking out great brand messaging. Surely, that remarkable metamorphosis is worth something.
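The "price per eyeballs" argument above is a cost-per-mille (CPM) calculation. Using the round figures quoted in the excerpts ($4 million per spot, 120 million projected viewers), a quick sketch shows how the per-thousand rate falls out — the exact $35 figure cited presumably reflects slightly different rate or audience numbers:

```python
def cpm(ad_cost, audience):
    """Cost per mille: dollars paid to reach 1,000 viewers."""
    return ad_cost / audience * 1000

# Round figures as quoted in the article
rate = cpm(4_000_000, 120_000_000)
print(f"Super Bowl CPM: ${rate:.2f} per 1,000 viewers")  # roughly $33
```

At around $33 per thousand viewers — in the same ballpark as the article's "$35 to reach 1,000 people" — the spot looks far less extravagant than the $4 million headline number suggests.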