
TOK Friends: Group items tagged "models"


Emily Freilich

Higgs Boson Gets Nobel Prize, But Physicists Still Don't Know What It's Telling Them - ... - 2 views

  • This morning, two physicists who 50 years ago theorized the existence of this particle, which is responsible for conferring mass to all other known particles in the universe, got the Nobel, the highest prize in science.
  • left physicists without a clear roadmap of where to go next
  • No one is sure which of these models, if any, will eventually describe reality
  • ...6 more annotations...
  • Some of them look at the data and say that we need to throw out speculative ideas such as supersymmetry and the multiverse, models that look elegant mathematically but are unprovable from an experimental perspective. Others look at the exact same data and come to the opposite conclusion.
  • we’ve entered a very deep crisis.
  • Though happy to know the Higgs was there, many scientists had hoped it would turn out to be strange, to defy their predictions in some way and give a hint as to which models beyond the Standard Model were correct.
  • One possibility has been brought up that even physicists don’t like to think about. Maybe the universe is even stranger than they think. Like, so strange that even post-Standard Model models can’t account for it. Some physicists are starting to question whether or not our universe is natural.
  • The multiverse idea has two strikes against it, though. First, physicists would refer to it as an unnatural explanation because it simply happened by chance. And second, no real evidence for it exists and we have no experiment that could currently test for it.
  • physicists are still in the dark. We can see vague outlines ahead of us but no one knows what form they will take when we reach them.
Javier E

Reasons for COVID-19 Optimism on T-Cells and Herd Immunity - 0 views

  • It may well be the case that some amount of community protection kicks in below 60 percent exposure, and possibly quite a bit below that threshold, and that those who exhibit a cross-reactive T-cell immune response, while still susceptible to infection, may also have some meaningful amount of protection against severe disease.
  • early returns suggest that while the maximalist interpretation of each hypothesis is not very credible — herd immunity has probably not been reached in many places, and cross-reactive T-cell response almost certainly does not functionally immunize those who have it — more modest interpretations appear quite plausible.
  • Friston suggested that the truly susceptible portion of the population was certainly not 100 percent, as most modelers and conventional wisdom had it, but a much smaller share — surely below 50 percent, he said, and likely closer to about 20 percent. The analysis was ongoing, he said, but, “I suspect, once this has been done, it will look like the effective non-susceptible portion of the population will be about 80 percent. I think that’s what’s going to happen.”
  • ...31 more annotations...
  • one of the leading modelers, Gabriela Gomes, suggested the entire area of research was being effectively blackballed out of fear it might encourage a relaxation of pandemic vigilance. “This is the very sad reason for the absence of more optimistic projections on the development of this pandemic in the scientific literature,” she wrote on Twitter. “Our analysis suggests that herd-immunity thresholds are being achieved despite strict social-distancing measures.”
  • Gomes suggested, herd immunity could happen with as little as one quarter of the population of a community exposed — or perhaps just 20 percent. “We just keep running the models, and it keeps coming back at less than 20 percent,” she told Hamblin. “It’s very striking.” Such findings, if they held up, would be very instructive, as Hamblin writes: “It would mean, for instance, that at 25 percent antibody prevalence, New York City could continue its careful reopening without fear of another major surge in cases.”
  • But for those hoping that 25 percent represents a true ceiling for pandemic spread in a given community, well, it almost certainly does not, considering that recent serological surveys have shown that perhaps 93 percent of the population of Iquitos, Peru, has contracted the disease; as have more than half of those living in Indian slums; and as many as 68 percent in particular neighborhoods of New York City
  • overshoot of that scale would seem unlikely if the “true” threshold were as low as 20 or 25 percent.
  • But, of course, that threshold may not be the same in all places, across all populations, and is surely affected, to some degree, by the social behavior taken to protect against the spread of the disease.
  • we probably err when we conceive of group immunity in simplistically binary terms. While herd immunity is a technical term referring to a particular threshold at which point the disease can no longer spread, some amount of community protection against that spread begins almost as soon as the first people are exposed, with each case reducing the number of unexposed and vulnerable potential cases in the community by one
  • you would not expect a disease to spread in a purely exponential way until the point of herd immunity, at which time the spread would suddenly stop. Instead, you would expect that growth to slow as more people in the community were exposed to the disease, with most of them emerging relatively quickly with some immune response. Add to that the effects of even modest, commonplace protections — intuitive social distancing, some amount of mask-wearing — and you could expect to get an infection curve that tapers off well shy of 60 percent exposure. [A short worked example of these threshold figures follows these notes.]
  • Looking at the data, we see that transmissions in many severely impacted states began to slow down in July, despite limited interventions. This is especially notable in states like Arizona, Florida, and Texas. While we believe that changes in human behavior and changes in policy (such as mask mandates and closing of bars/nightclubs) certainly contributed to the decrease in transmission, it seems unlikely that these were the primary drivers behind the decrease. We believe that many regions obtained a certain degree of temporary herd immunity after reaching 10-35 percent prevalence under the current conditions. We call this 10-35 percent threshold the effective herd immunity threshold.
  • Indeed, that is more or less what was recently found by Youyang Gu, to date the best modeler of pandemic spread in the U.S.
  • he cautioned again that he did not mean to imply that the natural herd-immunity level was as low as 10 percent, or even 35 percent. Instead, he suggested it was a plateau determined in part by better collective understanding of the disease and what precautions to take
  • Given that Gu estimates national prevalence as just below 20 percent (i.e., right in the middle of his range of effective herd immunity), it still counts, I think, as encouraging — even if people in hard-hit communities won’t truly breathe a sigh of relief until vaccines arrive.
  • If you can get real protection starting at 35 percent, it means that even a mediocre vaccine, administered much more haphazardly to a population with some meaningful share of vaccination skeptics, could still achieve community protection pretty quickly. And that is really significant — making both the total lack of national coordination on rollout and the likely “vaccine wars” much less consequential.
  • At least 20 percent of the public, and perhaps 50 percent, had some preexisting, cross-protective T-cell response to SARS-CoV-2, according to one much-discussed recent paper. An earlier paper had put the figure at between 40 and 60 percent. And a third had found an even higher prevalence: 81 percent.
  • The T-cell story is similarly encouraging in its big-picture implications without being necessarily paradigm-changing
  • These numbers suggest their own heterogeneity — that different populations, with different demographics, would likely exhibit different levels of cross-reactive T-cell immune response
  • The most optimistic interpretation of the data was given to me by Francois Balloux, a somewhat contrarian disease geneticist and the director of University College London’s Genetics Institute
  • According to him, a cross-reactive T-cell response wouldn’t prevent infection, but would probably mean a faster immune response, a shorter period of infection, and a “massively” reduced risk of severe illness — meaning, he guessed, that somewhere between a third and three-quarters of the population carried into the epidemic significant protection against its scariest outcomes
  • the distribution of this T-cell response could explain at least some, and perhaps quite a lot, of COVID-19’s age skew when it comes to disease severity and mortality, since the young are the most exposed to other coronaviruses, and the protection tapers as you get older and spend less time in environments, like schools, where these viruses spread so promiscuously.
  • Balloux told me he believed it was also possible that the heterogeneous distribution of T-cell protection also explains some amount of the apparent decline in disease severity over time within countries on different pandemic timelines — a phenomenon that is more conventionally attributed to infection spreading more among the young, better treatment, and more effective protection of the most vulnerable (especially the old).
  • Going back to Youyang Gu’s analysis, what he calls the “implied infection fatality rate” — essentially an estimated ratio based on his modeling of untested cases — has fallen for the country as a whole from about one percent in March to about 0.8 percent in mid-April, 0.6 percent in May, and down to about 0.25 percent today.
  • even as we have seemed to reach a second peak of coronavirus deaths, the rate of death from COVID-19 infection has continued to decline — total deaths have gone up, but much less than the number of cases
  • In other words, at the population level, the lethality of the disease in America has fallen by about three-quarters since its peak. This is, despite everything that is genuinely horrible about the pandemic and the American response to it, rather fantastic.
  • there may be some possible “mortality displacement,” whereby the most severe cases show up first, in the most susceptible people, leaving behind a relatively protected population whose experience overall would be more mild, and that T-cell response may play a significant role in determining that susceptibility.
  • That, again, is Balloux’s interpretation — the most expansive assessment of the T-cell data offered to me
  • The most conservative assessment came from Sarah Fortune, the chair of Harvard’s Department of Immunology
  • Fortune cautioned not to assume that cross-protection was playing a significant role in determining severity of illness in a given patient. Those with such a T-cell response, she told me, would likely see a faster onset of robust response, yes, but that may or may not yield a shorter period of infection and viral shedding
  • Most of the scientists, doctors, epidemiologists, and immunologists I spoke to fell between those two poles, suggesting the T-cell cross-immunity findings were significant without necessarily being determinative — that they may help explain some of the shape of pandemic spread through particular populations, but only some of the dynamics of that spread.
  • he told me he believed, in the absence of that data, that T-cell cross-immunity from exposure to previous coronaviruses “might explain different disease severity in different people,” and “could certainly be part of the explanation for the age skew, especially for why the very young fare so well.”
  • the headline finding was quite clear and explicitly stated: that preexisting T-cell response came primarily via the variety of T-cells called CD4 T-cells, and that this dynamic was consistent with the hypothesis that the mechanism was inherited from previous exposure to a few different “common cold” coronaviruses
  • “This potential preexisting cross-reactive T-cell immunity to SARS-CoV-2 has broad implications,” the authors wrote, “as it could explain aspects of differential COVID-19 clinical outcomes, influence epidemiological models of herd immunity, or affect the performance of COVID-19 candidate vaccines.”
  • “This is at present highly speculative,” they cautioned.
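The 60 percent benchmark and the lower "effective" thresholds quoted in these notes come from a simple relationship between a pathogen's reproduction number and the share of the population that must be immune before spread recedes. The sketch below shows the textbook homogeneous-mixing formula, threshold = 1 - 1/R0; the R0 values in it are illustrative assumptions rather than figures from the article, and the heterogeneity and behavioral effects described by Gomes and Gu would push the real effective threshold below what this simple formula gives.

```python
# Minimal sketch of the classic herd-immunity-threshold formula referenced above.
# The R0 values are illustrative assumptions, not estimates taken from the article.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of a homogeneously mixing population that must be immune
    before each infection causes, on average, fewer than one new infection."""
    if r0 <= 1.0:
        return 0.0  # an outbreak with R0 <= 1 recedes on its own
    return 1.0 - 1.0 / r0

if __name__ == "__main__":
    for r0 in (3.0, 2.5, 1.3):
        print(f"R0 = {r0}: threshold = {herd_immunity_threshold(r0):.0%}")
    # R0 of 2.5-3.0 gives the familiar 60-67 percent figure; if distancing,
    # masking, and uneven exposure push the effective reproduction number
    # toward 1.3, the threshold drops to roughly 23 percent, in the range the
    # annotations above call an "effective herd immunity threshold."
```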
Javier E

A Brief History of Media and Audiences and Twitter and The Bulwark - 0 views

  • In the old days—and here I mean even as recently as 2000 or 2004—audiences were built around media institutions. The New York Times had an audience. The New Yorker had an audience. The Weekly Standard had an audience.
  • If you were a writer, you got access to these audiences by contributing to the institutions. No one cared if you, John Smith, wrote a piece about Al Gore. But if your piece about Al Gore appeared in Washington Monthly, then suddenly you had an audience.
  • There were a handful of star writers for whom this wasn’t true: Maureen Dowd, Tom Wolfe, Joan Didion. Readers would follow these stars wherever they appeared. But they were the exceptions to the rule. And the only way to ascend to such exalted status was by writing a lot of great pieces for established institutions and slowly assembling your audience from theirs.
  • ...16 more annotations...
  • The internet stripped institutions of their gatekeeping powers, thus making it possible for anyone to publish—and making it inevitable that many writers would create audiences independent of media institutions.
  • The internet destroyed the apprenticeship system that had dominated American journalism for generations. Under the old system, an aspiring writer took a low-level job at a media institution and worked her way up the ladder until she was trusted enough to write.
  • Under the new system, people started their careers writing outside of institutions—on personal blogs—and then were hired by institutions on the strength of their work.
  • In practice, these outsiders were primarily hired not on the merits of their work, but because of the size of their audience.
  • what it really did was transform the nature of audiences. Once the internet existed it became inevitable that institutions would see their power to hold audiences wane while individual writers would have their power to build personal audiences explode.
  • this meant that institutions would begin to hire based on the size of a writer’s audience. Which meant that writers’ overriding professional imperative was to build an audience, since that was the key to advancement.
  • Twitter killed the blog and lowered the barrier to entry for new writers from “Must have a laptop, the ability to navigate WordPress, and the capacity to write paragraphs” to “Do you have an iPhone and the ability to string 20 words together? With or without punctuation?”
  • If you were able to build a big enough audience on Twitter, then media institutions fell all over themselves trying to hire you—because they believed that you would then bring your audience to them.
  • If you were a writer for the Washington Post, or Wired, or the Saginaw Express, you had to build your own audience not to advance, but to avoid being replaced.
  • For journalists, audience wasn’t just status—it was professional capital. In fact, it was the most valuable professional capital.
  • Everything we just talked about was driven by the advertising model of media, which prized pageviews and unique users above all else. About a decade ago, that model started to fray around the edges, which caused a shift to the subscription model.
  • Today, if you’re a subscription publication, what Twitter gives you is growth opportunity. Twitter’s not the only channel for growth—there are lots of others, from TikTok to LinkedIn to YouTube to podcasts to search. But it’s an important one.
  • Twitter’s attack on Substack was an attack on the subscription model of journalism itself.
  • since media has already seen the ad-based model fall apart, it’s not clear what the alternative will be if the subscription model dies, too.
  • All of which is why having a major social media platform run by a capricious bad actor is suboptimal.
  • And why I think anyone else who’s concerned about the future of media ought to start hedging against Twitter. None of the direct hedges—Post, Mastodon, etc.—are viable yet. But tech history shows that these shifts can happen fairly quickly.
grayton downing

Lab-Grown Model Brains | The Scientist Magazine® - 0 views

  • In an Austrian laboratory, a team of scientists has grown three-dimensional models of embryonic human brain
  • “Even the most complex organ—the human brain—can start to form without any micro-manipulation.”
  • Knoblich cautioned that the organoids are not “brains-in-a-jar.” “We’re talking about the very first steps of embryonic brain development, like in the first nine weeks of pregnancy,” he said. “They’re nowhere near an adult human brain and they don’t form anything that resembles a neuronal network.”
  • ...6 more annotations...
  • It took a huge amount of work to fine-tune the conditions, but once the team did, the organoids grew successfully within just 20 to 30 days.
  • scientists have developed organoids that mimic several human organs, including eyes, kidneys, intestines, and even brains.
  • They really highlight the ability to just nudge these human embryonic cells and allow them to self-assemble
  • “The mouse brain isn’t good enough for studying microcephaly,” said Huttner. “You need to put those genes into an adequate model like this one. It is, after all, human. It definitely enriches the field. There’s no doubt about that.”
  • organoids are unlikely to replace animal experiments entirely. “We can’t duplicate the elegance with which one can do genetics in animal models,” he said, “but we might be able to reduce the number of animal experiments, especially when it comes to toxicology or drug testing.”
  • In the future, he hopes to develop larger organoids.
Javier E

How to Raise a University's Profile: Pricing and Packaging - NYTimes.com - 0 views

  • I talked to a half-dozen of Hugh Moren’s fellow students. A highly indebted senior who was terrified of the weak job market described George Washington, where he had invested considerable time getting and doing internships, as “the world’s most expensive trade school.” Another mentioned the abundance of rich students whose parents were giving them a fancy-sounding diploma the way they might a new car. There are serious students here, he acknowledged, but: “You can go to G.W. and essentially buy a degree.”
  • A recent study from the Organization for Economic Cooperation and Development found that, on average, American college graduates score well below college graduates from most other industrialized countries in mathematics. In literacy (“understanding, evaluating, using and engaging with written text”), scores are just average. This comes on the heels of Richard Arum and Josipa Roksa’s “Academically Adrift,” a study that found “limited or no learning” among many college students.
  • Instead of focusing on undergraduate learning, colleges have been engaged in the kind of building spree I saw at George Washington. Recreation centers with world-class workout facilities and lazy rivers rise out of construction pits even as students and parents are handed staggeringly large tuition bills. Colleges compete to hire famous professors even as undergraduates wander through academic programs that often lack rigor or coherence. Campuses vie to become the next Harvard — or at least the next George Washington — while ignoring the growing cost and suspect quality of undergraduate education.
  • ...58 more annotations...
  • Mr. Trachtenberg understood the centrality of the university as a physical place. New structures were a visceral sign of progress. They told visitors, donors and civic leaders that the institution was, like beams and scaffolding rising from the earth, ascending. He added new programs, recruited more students, and followed the dictate of constant expansion.
  • the American research university had evolved into a complicated and somewhat peculiar organization. It was built to be all things to all people: to teach undergraduates, produce knowledge, socialize young men and women, train workers for jobs, anchor local economies, even put on weekend sports events. And excellence was defined by similarity to old, elite institutions. Universities were judged by the quality of their scholars, the size of their endowments, the beauty of their buildings and the test scores of their incoming students.
  • John Silber embarked on a huge building campaign while bringing luminaries like Saul Bellow and Elie Wiesel on board to teach and lend their prestige to the B.U. name, creating a bigger, more famous and much more costly institution. He had helped write a game plan for the aspiring college president.
  • GWU is, for all intents and purposes, a for-profit organization. Best example: study abroad. Their top program, a partnership with Sciences Po, costs each student (30 of them, on a program with 'prestige' status?) a full semester's tuition. It costs GW, according to the Sciences Po website, €1000. A neat $20,000 profit per student (who is digging her/himself deeper and deeper in debt). Moreover, the school takes a $500 admin fee for the study abroad application! With no guarantee that all credits transfer. Students often lose a partial semester, GW profits again. Nor does GW offer help with an antiquated, one-shot/no transfers, tricky registration process. It's tough luck in gay Paris. Just one of many examples. Dorms with extreme mold, off-campus housing impossible for freshmen and sophomores. Required meal plan: Chick-o-Filet etc. Classes with over 300 students (required). This is not Harvard, but costs the same. Emotional problems? Counselors too few. Suicides continue and are not appropriately addressed. Caring environment? Extension so and so, please hold. It's an impressive campus, I'm an alum. If you apply, make sure the DC experience is worth the price: good are internships, a few colleges like the Elliott School, post-grad. GWU uses undergrad $$ directly for building projects, like the medical center to which students have NO access. (Student health facility is underfunded, outsourced.) Outstanding professors still make a difference. But is that enough?
  • Mr. Trachtenberg, however, understood something crucial about the modern university. It had come to inhabit a market for luxury goods. People don’t buy Gucci bags merely for their beauty and functionality. They buy them because other people will know they can afford the price of purchase. The great virtue of a luxury good, from the manufacturer’s standpoint, isn’t just that people will pay extra money for the feeling associated with a name brand. It’s that the high price is, in and of itself, a crucial part of what people are buying.
  • Mr. Trachtenberg convinced people that George Washington was worth a lot more money by charging a lot more money. Unlike most college presidents, he was surprisingly candid about his strategy. College is like vodka, he liked to explain.
  • The Absolut Rolex plan worked. The number of applicants surged from some 6,000 to 20,000, the average SAT score of students rose by nearly 200 points, and the endowment jumped from $200 million to almost $1 billion.
  • The university became a magnet for the children of new money who didn’t quite have the SATs or family connections required for admission to Stanford or Yale. It also aggressively recruited international students, rich families from Asia and the Middle East who believed, as nearly everyone did, that American universities were the best in the world.
  • U.S. News & World Report now ranks the university at No. 54 nationwide, just outside the “first tier.”
  • The watch and vodka analogies are correct. Personally, I used car analogies when discussing college choices with my kids. We were in the fortunate position of being able to comfortably send our kids to any college in the country and have them leave debt free. Notwithstanding, I told them that they would be going to a state school unless they were able to get into one of about 40 schools that I felt, in whatever arbitrary manner I decided, were worth the extra cost. They both ended up going to state schools. College is by and large a commodity and you get out of it what you put into it. Both of my kids worked hard in college and were involved in school life. They both left the schools better people and the schools better schools for them being there. They are both now successful adults. I believe too many people look for the prestige of a named school and that is not what college should be primarily about.
  • In 2013, only 14 percent of the university’s 10,000 undergraduates received a grant — a figure on a par with elite schools but far below the national average. The average undergraduate borrower leaves with about $30,800 in debt.
  • When I talk to the best high school students in my state I always stress the benefits of the honors college experience at an affordable public university. For students who won't qualify for a public honors college, the regular public university experience is far preferable to the huge debt of places like GW.
  • Carey would do well to look beyond high-ticket private universities (which after all are still private enterprises) and what he describes as the Olympian heights of higher education (which for some reason seems also to embitter him) and look at the system overall. The withdrawal of public support was never a policy choice; it was a political choice, "packaged and branded" as some tax-cutting palaver all wrapped up in the argument that a free market should decide how much college should cost and how many seats we need. In such an environment, trustees at private universities are no more solely responsible for turning their degrees into commodities than the administrations of state universities are for raising the number of out-of-state students in order to offset the loss of support from their legislatures. No doubt, we will hear more about market-based solutions and technology from Mr. Carey.
  • I went to GW back in the 60s. It was affordable and it got me away from home in New York. While I was there, Newsweek famously published an article about the DC Universities - GW, Georgetown, American and Catholic - dubbing them the Pony league, the schools for the children of wealthy middle class New Yorkers who couldn't get into the Ivy League. Nobody really complained. But that wasn't me. I went because I wanted to be where the action was in the 60s, and as we used to say - "GW was literally a stone's throw from the White House. And we could prove it." Back then, the two biggest alumni names were Jackie Kennedy, who'd taken some classes there, and J. Edgar Hoover. Now, according to the glossy magazine they send me each month, it's the actress Kerry Washington. There's some sort of progress there, but I'm a GW alum and not properly trained to understand it.
  • This explains a lot of the modern, emerging mentality. It encompasses the culture of enforced grade inflation, cheating and anti-intellectualism in much of higher education. It is consistent with our culture of misleading statistics and information, cronyism and fake quality, the "best and the brightest" being only schemers and glad handers. The wisdom and creativity engendered by an honest, rigorous academic education are replaced by the disingenuous quick fix, the winner-take-all mentality that neglects the common good.
  • I attended nearby Georgetown University and graduated in 1985. Relative to state schools and elite schools, it was expensive then. I took out loans. I had Pell grants. I had work-study and GSL. I paid my debt of $15,000 off in ten years. Would I have done it differently? Yes: I would have continued on to graduate school and not worried about paying off those big loans right after college. My career worked out and I am grateful for the education I received and paid for. But I would not recommend to my nieces and nephews debts north of $100,000 for a BA in liberal arts. Go community. Then go state. Then punch your ticket to Harvard, Yale or Stanford — if you are good enough.
  • American universities appear to have more and more drifted away from educating individuals and citizens to becoming high-priced trade schools and purveyors of occupational licenses. Lost in the process is the concept of expanding a student’s ability to appreciate broadly and deeply, as well as the belief that a republican democracy needs an educated citizenry, not a trained citizenry, to function well. Both the Heisman Trophy winner and the producer of a successful tech I.P.O. likely have much in common, a college education whose rewards are limited to the financial. I don't know if I find this more sad on the individual level or more worrisome for the future of America.
  • This is now a consumer world for everything, including institutions once thought to float above the Shakespearean briars of the work-a-day world such as higher education, law and medicine. Students get this. Parents get this. Everything is negotiable: financial aid, a spot in the nicest dorm, tix to the big game. But through all this, there are faculty - lots of 'em - who work away from the fluff to link the ambitions of the students with the reality and rigor of the 21st century. The job of the student is to get beyond the visible hype of the surroundings and find those faculty members. They will make sure your investment is worth it
  • My experience in managing or working with GW alumni in their 20's or 30's has not been good. Virtually all have been mentally lazy and/or had a stunning sense of entitlement. Basically they've been all talk and no results. That's been quite a contrast to the graduates from VA/MD state universities.
  • More and more, I notice what my debt-financed contributions to the revenue streams of my vendors earn them, not me. My banks earned enough to pay ridiculous bonuses to employees for reckless risk-taking. My satellite tv operator earned enough to overpay ESPN for sports programming that I never watch--and that, in turn, overpays these idiotic pro athletes and college sports administrators. My health insurer earned enough to defeat one-payor insurance; to enable the opaque, inefficient billing practices of hospitals and other providers; and to feed the behemoth pharmaceutical industry. My church earned enough to buy the silence of sex abuse victims and oppose progressive political candidates. And my govt earned enough to continue ag subsidies, inefficient defense spending, and obsolete transportation and energy policies.
  • as the parent of a GWU freshman I am grateful for every opportunity afforded her. She has a generous merit scholarship, is in the honors program with some small classes, and has access to internships that can be done while at school. GWU also gave her AP credits to advance her to sophomore status. Had she attended the state flagship school (where she was accepted into that exclusive honors program) she would have a great education but little else. It's not possible to do a foreign affairs-related internship far from D.C. or Manhattan. She went to a very competitive high school where for the one or two Ivy League schools in which she was interested, she didn't have the same level of connections or wealth as many of her peers. Whether because of the Common Application or other factors, getting into a good school with financial help is difficult for a middle class student like my daughter who had a 4.0 GPA and 2300 on the SAT. She also worked after school. The bottom line - GWU offered more money than perceived "higher tier" universities, and brought tuition to almost that of our state school system. And by the way, I think she is also getting a very good education.
  • This article reinforces something I have learned during my daughter's college application process. Most students choose a school based on emotion (reputation) and not value. This luxury good analogy holds up.
  • The entire education problem can be solved by MOOCs, lots and lots of them, plus a few closely monitored tests and personal interviews with people. Of course many, many people make MONEY off of our entirely inefficient way of "educating" -- are we even really doing that -- getting a degree does NOT mean one is actually educated
  • As a first-generation college graduate I entered GW ambitious but left saddled with debt, and crestfallen at the hard-hitting realization that my four undergraduate years were an aberration from what life is actually like post-college: not as simple as getting an [unpaid] internship with a fancy-titled institution, as most Colonials do. I knew how to get into college, but what do you do after the recess of life ends? I learned more about networking, resume plumping (designated responses to constituents...errr....replied to emails), and elevator pitches than actual theory, economic principles, strong writing skills, critical thinking, analysis, and philosophy. While relatively easy to get a job after graduating (for many with a GW degree this is sadly not the case), sustaining one and excelling in it is much harder. It's never enough just to be able to open a new door, you also need to be prepared to navigate your way through that next opportunity.
  • this is a very telling article. Aimless and directionless high school graduates are matched only by aimless and directionless institutes of higher learning. Each child and each parent should start with a goal - before handing over their hard earned tuition dollars, and/or leaving a trail of broken debt in the aftermath of a substandard, unfocused education.
  • it is no longer the most expensive university in America. It is the 46th. Others have been implementing the Absolut Rolex Plan. John Sexton turned New York University into a global higher-education player by selling the dream of downtown living to students raised on “Sex and the City.” Northeastern followed Boston University up the ladder. Under Steven B. Sample, the University of Southern California became a U.S. News top-25 university. Washington University in St. Louis did the same.
  • I currently attend GW, and I have to say, this article completely misrepresents the situation. I have yet to meet a single person who is paying the full $60k tuition - I myself am paying $30k, because the school gave me $30k in grants. As for the quality of education, Foreign Policy rated GW the #8 best school in the world for undergraduate education in international affairs, Princeton Review ranks it as one of the best schools for political science, and U.S. News ranks the law school #20. The author also ignores the role that an expanding research profile plays in growing a university's prestige and educational power.
  • And in hundreds of regional universities and community colleges, presidents and deans and department chairmen have watched this spectacle of ascension and said to themselves, “That could be me.” Agricultural schools and technical institutes are lobbying state legislatures for tuition increases and Ph.D. programs, fitness centers and arenas for sport. Presidents and boards are drawing up plans to raise tuition, recruit “better” students and add academic programs. They all want to go in one direction — up! — and they are all moving with a single vision of what they want to be.
  • this is the same playbook used by hospitals the past 30 years or so. It is how Hackensack Hospital became Hackensack Medical Center and McComb Hospital became Southwest Mississippi Regional Medical Center. No wonder the results have been the same in healthcare and higher education; both have priced themselves out of reach for average Americans.
  • a world where a college is rated not by the quality of its output, but instead, by the quality of its inputs. A world where there is practically no work to be done by the administration because the college's reputation is made before the first class even begins! This is insanity! But this is the swill that the mammoth college marketing departments nationwide have shoved down America's throat. Colleges are ranked not by the quality of their graduates, but rather, by the test scores of their incoming students!
  • The Pew Foundation has been doing surveys on what students learn, how much homework they do, how much time they spend with professors etc. All good stuff to know before a student chooses a school. It is called the National Survey of Student Engagement (NSSE - called Nessy). It turns out that the higher-ranked schools do NOT allow their information to be released to the public. It is SECRET. Why do you think that is?
  • The article blames the fact that "the standard university organizational model left teaching responsibilities to autonomous academic departments and individual faculty members, each of which taught and tested in its own way." This is the view of someone who has never taught at a university, nor thought much about how education there actually happens. Once undergraduates get beyond the general requirements, their educations _have_ to depend on "autonomous departments" because it's only those departments that know what the requirements for a given degree can be, and can grant the necessary accreditation of a given student. The idea that some administrator could know what's necessary for degrees in everything from engineering to fiction writing is nonsense, except that's what the people who only know the theory of education (but not its practice) actually seem to think. In the classroom itself, you have tremendously talented people, who nevertheless have their own particular strengths and approaches. Don't you think it's a good idea to let them do what they do best rather than trying to make everyone teach the same way? Don't you think supervision of young teachers by older colleagues, who actually know their field and its pedagogy, rather than some administrator, who knows nothing of the subject, is a good idea?
  • it makes me very sad to see how expensive some public schools have become. Used to be you could work your way through a public school without loans, but not any more. Like you, I had the advantage of a largely-scholarship paid undergraduate education at a top private college. However, I was also offered a virtually free spot in my state university's (then new) honors college
  • My daughter attended a good community college for a couple of classes during her senior year of high school and I could immediately see how such places are laboratories for failure. They seem like high schools in atmosphere and appearance. Students rush in by car and rush out again when the class is over. The four-year residency college creates a completely different feel. On arrival, you get the sense that you are engaging in something important, something apart and one that will require your full attention. I don't say this is for everyone or that the model is not flawed in some ways (students actually only spend 2 1/2 yrs. on campus to get the four yr. degree). College is supposed to be a 60 hour per week job. Anything less than that and the student is seeking himself or herself
  • This. Is. STUNNING. I have always wondered, especially as my kids have approached college age, why American colleges have felt justified in raising tuition at a rate that has well exceeded inflation, year after year after year. (Nobody needs a dorm with luxury suites and a lazy river pool at college!) And as it turns out, they did it to become luxury brands. Just that simple. Incredible. I don't even blame this guy at GWU for doing what he did. He wasn't made responsible for all of American higher ed. But I do think we all need to realize what happened, and why. This is front page stuff.
  • I agree with you, but, unfortunately, given the choice between low tuition, primitive dorms, and no athletic center VS expensive & luxurious, the customers (and their parents) are choosing the latter. As long as this is the case, there is little incentive to provide bare-bones and cheap education.
  • Wesleyan University in CT is one school that is moving down the rankings. Syracuse University is another. Reed College is a third. Why? Because these schools try hard to stay out of the marketing game. (With its new president, Syracuse has jumped back into the game.) Bryn Mawr College, outside Philadelphia hasn't fared well over the past few decades in the rankings, which is true of practically every women's college. Wellesley is by far the highest ranked women's college, but even there the acceptance rate is significantly higher than one finds at comparable coed liberal arts colleges like Amherst & Williams. University of Chicago is another fascinating case for Mr. Carey to study (I'm sure he does in his forthcoming book, which I look forward to reading). Although it has always enjoyed an illustrious academic reputation, until recently Chicago's undergraduate reputation paled in comparison to peer institutions on the two coasts. A few years ago, Chicago changed its game plan to more closely resemble Harvard and Stanford in undergraduate amenities, and lo and behold, its rankings shot up. It was a very cynical move on the president's part to reassemble the football team, but it was a shrewd move because athletics draw more money than academics ever can (except at engineering schools like Cal Tech & MIT), and more money draws richer students from fancier secondary schools with higher test scores, which lead to higher rankings - and the beat goes on.
  • College INDUSTRY is out of control. Sorry, NYU, GW, BU are not worth the price. Are state schools any better? We have the University of Michigan, which is really not a state school, but a university that gives a discount to people who live in Michigan. Why? When you have an undergraduate body 40+% out-of-state that pays tuition of over $50K/year, you tell me? Perhaps the solution is two years of community college followed by two at places like U of M or Michigan State - get the same diploma at the end for much less and beat the system.
  • In one recent yr., the majority of undergrad professors at Harvard, according to Boston.com, were adjuncts. That means low pay, no benefits, no office, temp workers. Harvard. Easily available student loans fueled this arms race of amenities and frills in which colleges now engage. They moved the cost of education onto the backs of people, kids, who don't understand what they are doing. Students in colleges these days are customers and the customers must be able to get through. If it requires dumbing things down, so be it. On top of tuition, G.W.U. is known by its students as the land of added fees on top of added fees. The joke around campus was that they would soon be installing pay toilets in the student union. No one was laughing.
  • You could have written the same story about my alma mater, American University. The place reeked of ambition and upward mobility decades ago and still does. Whoever's running it now must look at its measly half-billion-dollar endowment and compare it to GWU's $1.5 billion and seethe with envy, while GWU's president sets his sights on an Ivy League-size endowment. And both get back to their real jobs: 24/7 fundraising, which is what university presidents are all about these days. Money - including million-dollar salaries for themselves (GWU's president made more than Harvard's in 2011) - pride, cachet, power, a mansion, first-class all the way. They should just be honest about it and change their university's motto to Ostende mihi pecuniam! (please excuse my questionable Latin). Whether the students are actually learning anything is up to them, I guess - if they do, it's thanks to the professors, adjuncts and the administrative staff, who do the actual work of educating and keep the school running.
  • When I was in HS (70s), many of my richer friends went to GW and I was then of the impression that GW was a 'good' school. As I age, I have come to realize that this place is just another façade to the emptiness that has become America. All too often are we faced with a dilemma: damned if we do, damned if we don't. Yep, 'education' has become a trap for all too many of our citizens.
  • I transferred to GWU from a state school. I am forever grateful that I did. I wanted to get a good, rigorous education and go to one of the best International Affairs schools in the world. Even though the state school I went to was dirt-cheap, the education and the faculty were awful. I transferred to GW and was amazed at the professors at that university. An ambassador or a prominent IA scholar taught every class. GW is an expensive school, but that is the free market. If you want a good education you need to be willing to pay for it or join the military. I did the latter and my school was completely free with no debt and I received an amazing education. If young people aren't willing to make some sort of sacrifice to get ahead or just expect everything to be given to them, then our country is in a sad state. We need to stop blaming universities like GWU that strive to attract better students, better professors, and better infrastructure. They are doing what is expected in America, to better oneself.
  • "Whether the students are actually learning anything is up to them, I guess." How could it possibly be otherwise??? I am glad that you are willing to give credit to teachers and administrators, but it is not they who "do the actual work of educating." From this fallacy comes its corollary, that we should blame teachers first for "under-performing schools". This long-running show of scapegoating may suit the wallets and vanity of American parents, but it is utterly senseless. When, if ever, American culture stops reeking of arrogance, greed and anti-intellectualism, things may improve, and we may resume the habit of bothering to learn. Until then, nothing doing.
  • Universities sell knowledge and grade students on how much they have learned. Fundamentally, there is a conflict of interest in this setup. Moreover, students who are poorly educated, even if they know this, will not criticize their school, because doing so would make it harder for them to have a career. As such, many problems with higher education remain unexposed to the public.
  • I've lectured and taught in at least five different countries on three continents, and the shortest perusal of what goes on abroad would totally undermine most of these speculations. For one thing, American universities are unique in their dedication to a broad-based, liberal-arts-type education. In France, Italy or Germany, for example, you select a major like mathematics or physics and then in your four years you will not take even one course in another subject. The amount of work that you do that is critically evaluated by an instructor is a tiny fraction of what is done in an American university. While half-educated critics, based on profoundly incomplete research, write criticism like this, universities in Germany, Italy, the Netherlands, South Korea and Japan, as well as France, have appointed committees and made studies to explain why the American system of higher education so drastically outperforms their own system. Elsewhere students do get a rather nice dose of general education, but it ends in secondary school and it has the narrowness and formulaic quality that we would just normally associate with that. The character who wrote this article probably never set foot on a "campus" of the University of Paris or Rome
  • The university is part of a complex economic system and it is responding to the demands of that system. For example, students and parents choose universities that have beautiful campuses and buildings. So universities build beautiful campuses. State support of universities has greatly declined, and this decline in funding is the greatest cause of increased tuition. Therefore universities must compete for dollars and must build to attract students and parents. Also, universities are not ranked based on how they educate students -- that's difficult to measure so it is not measured. Instead universities are ranked on research publications. So while universities certainly put much effort into teaching, research has to have a priority in order for the university to survive. Also universities do not force students and parents to attend high-priced institutions. Reasonably priced state institutions and community colleges are available to every student. Community colleges have an advantage because they are funded by property taxes. Finally, learning requires good teaching, but it also requires students that come to the university funded, prepared, and engaged. This often does not happen. Conclusion: universities have to participate in profile-raising actions in order to survive. The day that funding is provided for college, ranking is based on education, and students choose campuses with simple buildings, then things will change at the university.
  • This is the inevitable result of privatizing higher education. In the not-so-distant past, we paid for great state universities through our taxes, not tuition. Then, the states shifted funding to prisons and the Federal government radically cut research support and the GI bill. Instead, today we expect universities to support themselves through tuition, and to the extent that we offer students support, it is through non-dischargeable loans. To make matters worse, the interest rates on those loans are far above the government's cost of funds -- so in effect the loans are an excise tax on education (most of which is used to support a handful of for-profit institutions that account for the most student defaults). This "consumer sovereignty" privatized model of funding education works no better than privatizing California's electrical system did in the era of Enron, or our privatized funding of medical service, or our increasingly privatized prison system: it drives up costs at the same time that it replaces quality with marketing.
  • There are data in some instances on student learning, but the deeper problem, as I suspect the author already knows, is that there is nothing like a consensus on how to measure that learning, or even on when is the proper end point to emphasize (a lot of what I teach -- I know this from what students have told me -- tends to come into sharp focus years after graduation).
  • Michael (Baltimore) has hit the nail on the head. Universities are increasingly corporatized institutions in the credentialing business. Knowledge, for those few who care about it (often not those paying for the credentials) is available freely because there's no profit in it. Like many corporate entities, it is increasingly run by increasingly highly paid administrators, not faculty.
  • GWU has not defined itself in any unique way, it has merely embraced the bland, but very expensive, accoutrements of American private education: luxury dorms, food courts, spa-like gyms, endless extracurricular activities, etc. But the real culprit for this bloat that students have to bear financially is the college ranking system by US News, Princeton Review, etc. An ultimately meaningless exercise in competition that has nevertheless pushed colleges and universities to be more like one another. A sad state of affairs, and an extremely expensive one for students
  • It is long past time to realize the failure of the Reaganomics-neoliberal private-profits-over-public-good program. In education, we need to return to public institutions publicly funded. Just as we need to recognize that Medicare, Social Security, the post office, public utilities, fire departments, the interstate highway system, Veterans Administration hospitals and the GI bill are models to be improved and expanded, not destroyed.
  • George Washington is actually not a Rolex watch, it is a counterfeit Rolex. The real Rolexes of higher education -- places like Hopkins, Georgetown, Duke, the Ivies etc. -- have real endowments and real financial aid. No middle class kid is required to borrow $100,000 to get a degree from those schools, because they offer generous need-based financial aid in the form of grants, not loans. The tuition at the real Rolexes is really a sticker price that only the wealthy pay -- everybody else on a sliding scale. For middle class kids who are fortunate enough to get in, Penn actually ends up costing considerably less than a state university. The fake Rolexes -- BU, NYU, Drexel in Philadelphia -- don't have the sliding scale. They bury middle class students in debt. And really, though it is foolish to borrow $100,000 or $120,000 for an undergraduate degree, I don't find the transaction morally wrong. What is morally wrong is our federal government making that loan non-dischargeable in bankruptcy, so many of these kids will be having their wages garnished for the REST OF THEIR LIVES. There is a very simple solution to this, by the way. Cap the amount of non-dischargeable student loan debt at, say, $50,000
  • The slant of this article is critical of the growth of research universities. Couldn't disagree more. Modern research universities are incredible engines of economic opportunity not only for the students (who pay the bills) but also for the community via the creation of blue- and white-collar jobs. Large research universities employ tens of thousands of locals from custodial and food service workers right up to high-level administrators and specialists in finance, computer services, buildings and facilities management, etc. Johns Hopkins University and the University of Maryland system employ more people than any other industry in Maryland -- including the government. Research universities typically have hospitals providing cutting-edge medical care to the community. Local businesses (from cafes to property rental companies) benefit from a built-in, long-term client base as well as an educated workforce. And of course they are the foundry of new knowledge which is critical for the future growth of our country. Check out the work of famed economist Dr. Julia Lane on modeling the economic value of the research university. In a nutshell, there are few better investments America can make in herself than research universities. We are the envy of the world in that regard -- and with good reason. How many *industries* (let alone jobs) have Stanford University alone catalyzed?
  • What universities have the monopoly on is the credential. Anyone can learn, from books, from free lectures on the internet, from this newspaper, etc. But only universities can endow you with the cherished degree. For some reason, people are willing to pay more for one of these pieces of paper with a certain name on it -- Ivy League, Stanford, even GW -- than another -- Generic State U -- though there is no evidence one is actually worth more in the marketplace of reality than the other. But, by the laws of economics, these places are actually underpriced: after all, something like 20 times more people are trying to buy a Harvard education than are allowed to purchase one. Usually that means you raise your price.
  • Overall a good article, except for - "This comes on the heels of Richard Arum and Josipa Roksa's "Academically Adrift," a study that found "limited or no learning" among many college students." The measure of learning you report was a general thinking skills exam. That's not a good measure of college gains. Most psychologists and cognitive scientists worth their salt would tell you that improvement in critical thinking skills is going to be limited to specific areas. In other words, learning critical thinking skills in math will make little change in critical thinking about political science or biology. Thus we should not expect huge improvements in general critical thinking skills, but rather improvements in a student's major and other areas of focus, such as a minor. Although who has time for a minor when it is universally acknowledged that the purpose of a university is to please and profit an employer or, if one is lucky, an investor. Finally, improved critical thinking skills are not the end-all and be-all of a college education even given this profit-centered perspective. Learning and mastering the cumulative knowledge of past generations is arguably the most important thing to be gained, and most universities still tend to excel at that even with the increasing mandate to run education like a business and cultivate and cull the college "consumer".
  • As for community colleges, there was an article in the Times several years ago that said it much better than I could have said it myself: community colleges are places where dreams are put on hold. Without making the full commitment to study, without leaving the home environment, many, if not most, community college students are caught betwixt and between, trying to balance work responsibilities, caring for a young child or baby and attending classes. For males, the classic "end of the road" in community college is to get a car, a job and a girlfriend, one who is not in college, and that is the end of the dream. Some can make it, but most cannot.
  • As a scientist, I disagree with the claim that undergrad tuition subsidizes basic research. Nearly all lab equipment and research personnel (grad students, technicians, anyone with the title "research scientist" or similar) on campus are paid for through federal grants. Professors often spend all their time outside teaching and administration writing grant proposals, as the limited federal grant funds mean ~85% of proposals must be rejected. What is more, out of each successful grant the university levies a "tax," called "overhead," of 30-40%, nominally to pay for basic operations (utilities, office space, administrators). So in fact one might say research helps fund the university rather than the other way around. (A small worked example of the overhead arithmetic follows these comments.)
  • It's certainly overrated as a research and graduate-level university. Whether it is good for getting an undergraduate education is unclear, but a big part of the appeal is getting to live in D.C. while attending college instead of living in some small college town in the corn fields.
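A small worked example of the overhead arithmetic mentioned in the comment above, in Python; the grant size and the 40% rate are illustrative assumptions, not any particular university's actual policy:

# Hypothetical grant: direct costs pay for the research itself; the
# university adds "overhead" (indirect costs) on top at an assumed rate.
direct_costs = 500_000      # salaries, equipment, supplies (assumed figure)
overhead_rate = 0.40        # assumed 40% indirect-cost rate

overhead = direct_costs * overhead_rate
total_award = direct_costs + overhead
print(f"direct ${direct_costs:,}, overhead ${overhead:,.0f}, total ${total_award:,.0f}")
# Under these assumptions the university keeps $200,000 of a $700,000 award
# (roughly 29% of the total) for utilities, space, and administration.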
Javier E

The Tech Industry's Psychological War on Kids - Member Feature Stories - Medium - 0 views

  • she cried, “They took my f***ing phone!” Attempting to engage Kelly in conversation, I asked her what she liked about her phone and social media. “They make me happy,” she replied.
  • Even though they were loving and involved parents, Kelly’s mom couldn’t help feeling that they’d failed their daughter and must have done something terribly wrong that led to her problems.
  • My practice as a child and adolescent psychologist is filled with families like Kelly’s. These parents say their kids’ extreme overuse of phones, video games, and social media is the most difficult parenting issue they face — and, in many cases, is tearing the family apart.
  • ...88 more annotations...
  • What none of these parents understand is that their children’s and teens’ destructive obsession with technology is the predictable consequence of a virtually unrecognized merger between the tech industry and psychology.
  • Dr. B.J. Fogg is a psychologist and the father of persuasive technology, a discipline in which digital machines and apps — including smartphones, social media, and video games — are configured to alter human thoughts and behaviors. As the lab's website boldly proclaims: "Machines designed to change humans."
  • These parents have no idea that lurking behind their kids’ screens and phones are a multitude of psychologists, neuroscientists, and social science experts who use their knowledge of psychological vulnerabilities to devise products that capture kids’ attention for the sake of industry profit.
  • psychology — a discipline that we associate with healing — is now being used as a weapon against children.
  • This alliance pairs the consumer tech industry’s immense wealth with the most sophisticated psychological research, making it possible to develop social media, video games, and phones with drug-like power to seduce young users.
  • Likewise, social media companies use persuasive design to prey on the age-appropriate desire for preteen and teen kids, especially girls, to be socially successful. This drive is built into our DNA, since real-world relational skills have fostered human evolution.
  • Called “the millionaire maker,” Fogg has groomed former students who have used his methods to develop technologies that now consume kids’ lives. As he recently touted on his personal website, “My students often do groundbreaking projects, and they continue having impact in the real world after they leave Stanford… For example, Instagram has influenced the behavior of over 800 million people. The co-founder was a student of mine.”
  • Persuasive technology (also called persuasive design) works by deliberately creating digital environments that users feel fulfill their basic human drives — to be social or obtain goals — better than real-world alternatives.
  • Kids spend countless hours in social media and video game environments in pursuit of likes, “friends,” game points, and levels — because it’s stimulating, they believe that this makes them happy and successful, and they find it easier than doing the difficult but developmentally important activities of childhood.
  • While persuasion techniques work well on adults, they are particularly effective at influencing the still-maturing child and teen brain.
  • “Video games, better than anything else in our culture, deliver rewards to people, especially teenage boys,” says Fogg. “Teenage boys are wired to seek competency. To master our world and get better at stuff. Video games, in dishing out rewards, can convey to people that their competency is growing, you can get better at something second by second.”
  • it’s persuasive design that’s helped convince this generation of boys they are gaining “competency” by spending countless hours on game sites, when the sad reality is they are locked away in their rooms gaming, ignoring school, and not developing the real-world competencies that colleges and employers demand.
  • Persuasive technologies work because of their apparent triggering of the release of dopamine, a powerful neurotransmitter involved in reward, attention, and addiction.
  • As she says, “If you don’t get 100 ‘likes,’ you make other people share it so you get 100…. Or else you just get upset. Everyone wants to get the most ‘likes.’ It’s like a popularity contest.”
  • there are costs to Casey’s phone obsession, noting that the “girl’s phone, be it Facebook, Instagram or iMessage, is constantly pulling her away from her homework, sleep, or conversations with her family.
  • Casey says she wishes she could put her phone down. But she can’t. “I’ll wake up in the morning and go on Facebook just… because,” she says. “It’s not like I want to or I don’t. I just go on it. I’m, like, forced to. I don’t know why. I need to. Facebook takes up my whole life.”
  • B.J. Fogg may not be a household name, but Fortune Magazine calls him a “New Guru You Should Know,” and his research is driving a worldwide legion of user experience (UX) designers who utilize and expand upon his models of persuasive design.
  • “No one has perhaps been as influential on the current generation of user experience (UX) designers as Stanford researcher B.J. Fogg.”
  • the core of UX research is about using psychology to take advantage of our human vulnerabilities.
  • As Fogg is quoted in Kosner’s Forbes article, “Facebook, Twitter, Google, you name it, these companies have been using computers to influence our behavior.” However, the driving force behind behavior change isn’t computers. “The missing link isn’t the technology, it’s psychology,” says Fogg.
  • UX researchers not only follow Fogg’s design model, but also his apparent tendency to overlook the broader implications of persuasive design. They focus on the task at hand, building digital machines and apps that better demand users’ attention, compel users to return again and again, and grow businesses’ bottom line.
  • the “Fogg Behavior Model” is a well-tested method to change behavior and, in its simplified form, involves three primary factors: motivation, ability, and triggers. (A minimal sketch of this three-factor idea appears after these notes.)
  • “We can now create machines that can change what people think and what people do, and the machines can do that autonomously.”
  • Regarding ability, Fogg suggests that digital products should be made so that users don’t have to “think hard.” Hence, social networks are designed for ease of use
  • Finally, Fogg says that potential users need to be triggered to use a site. This is accomplished by a myriad of digital tricks, including the sending of incessant notifications
  • moral questions about the impact of turning persuasive techniques on children and teens are not being asked. For example, should the fear of social rejection be used to compel kids to compulsively use social media? Is it okay to lure kids away from school tasks that demand a strong mental effort so they can spend their lives on social networks or playing video games that don’t make them think much at all?
  • Describing how his formula is effective at getting people to use a social network, the psychologist says in an academic paper that a key motivator is users’ desire for “social acceptance,” although he says an even more powerful motivator is the desire “to avoid being socially rejected.”
  • the startup Dopamine Labs boasts about its use of persuasive techniques to increase profits: “Connect your app to our Persuasive AI [Artificial Intelligence] and lift your engagement and revenue up to 30% by giving your users our perfect bursts of dopamine,” and “A burst of Dopamine doesn’t just feel good: it’s proven to re-wire user behavior and habits.”
  • Ramsay Brown, the founder of Dopamine Labs, says in a KQED Science article, “We have now developed a rigorous technology of the human mind, and that is both exciting and terrifying. We have the ability to twiddle some knobs in a machine learning dashboard we build, and around the world hundreds of thousands of people are going to quietly change their behavior in ways that, unbeknownst to them, feel second-nature but are really by design.”
  • Programmers call this “brain hacking,” as it compels users to spend more time on sites even though they mistakenly believe it’s strictly due to their own conscious choices.
  • Banks of computers employ AI to “learn” which of a countless number of persuasive design elements will keep users hooked
  • A persuasion profile of a particular user's unique vulnerabilities is developed in real time and exploited to keep users on the site and make them return again and again for longer periods of time. This drives up profits for consumer internet companies whose revenue is based on how much their products are used. (A toy sketch of this kind of adaptive targeting also appears after these notes.)
  • “The leaders of Internet companies face an interesting, if also morally questionable, imperative: either they hijack neuroscience to gain market share and make large profits, or they let competitors do that and run away with the market.”
  • Social media and video game companies believe they are compelled to use persuasive technology in the arms race for attention, profits, and survival.
  • Children’s well-being is not part of the decision calculus.
  • one breakthrough occurred in 2017 when Facebook documents were leaked to The Australian. The internal report crafted by Facebook executives showed the social network boasting to advertisers that by monitoring posts, interactions, and photos in real time, the network is able to track when teens feel “insecure,” “worthless,” “stressed,” “useless” and a “failure.”
  • The report also bragged about Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.”
  • These design techniques provide tech corporations a window into kids’ hearts and minds to measure their particular vulnerabilities, which can then be used to control their behavior as consumers. This isn’t some strange future… this is now.
  • The official tech industry line is that persuasive technologies are used to make products more engaging and enjoyable. But the revelations of industry insiders can reveal darker motives.
  • Revealing the hard science behind persuasive technology, Hopson says, “This is not to say that players are the same as rats, but that there are general rules of learning which apply equally to both.”
  • After penning the paper, Hopson was hired by Microsoft, where he helped lead the development of the Xbox Live, Microsoft’s online gaming system
  • “If game designers are going to pull a person away from every other voluntary social activity or hobby or pastime, they’re going to have to engage that person at a very deep level in every possible way they can.”
  • This is the dominant effect of persuasive design today: building video games and social media products so compelling that they pull users away from the real world to spend their lives in for-profit domains.
  • Persuasive technologies are reshaping childhood, luring kids away from family and schoolwork to spend more and more of their lives sitting before screens and phones.
  • “Since we’ve figured to some extent how these pieces of the brain that handle addiction are working, people have figured out how to juice them further and how to bake that information into apps.”
  • Today, persuasive design is likely distracting adults from driving safely, productive work, and engaging with their own children — all matters which need urgent attention
  • Still, because the child and adolescent brain is more easily controlled than the adult mind, the use of persuasive design is having a much more hurtful impact on kids.
  • But to engage in a pursuit at the expense of important real-world activities is a core element of addiction.
  • younger U.S. children now spend 5 ½ hours each day with entertainment technologies, including video games, social media, and online videos.
  • Even more, the average teen now spends an incredible 8 hours each day playing with screens and phones
  • U.S. kids only spend 16 minutes each day using the computer at home for school.
  • Quietly, using screens and phones for entertainment has become the dominant activity of childhood.
  • Younger kids spend more time engaging with entertainment screens than they do in school
  • teens spend even more time playing with screens and phones than they do sleeping
  • kids are so taken with their phones and other devices that they have turned their backs to the world around them.
  • many children are missing out on real-life engagement with family and school — the two cornerstones of childhood that lead them to grow up happy and successful
  • persuasive technologies are pulling kids into often toxic digital environments
  • A too frequent experience for many is being cyberbullied, which increases their risk of skipping school and considering suicide.
  • And there is growing recognition of the negative impact of FOMO, or the fear of missing out, as kids spend their social media lives watching a parade of peers who look to be having a great time without them, feeding their feelings of loneliness and being less than.
  • The combined effects of the displacement of vital childhood activities and exposure to unhealthy online environments is wrecking a generation.
  • as the typical age when kids get their first smartphone has fallen to 10, it’s no surprise to see serious psychiatric problems — once the domain of teens — now enveloping young kids
  • Self-inflicted injuries, such as cutting, that are serious enough to require treatment in an emergency room, have increased dramatically in 10- to 14-year-old girls, up 19% per year since 2009.
  • While girls are pulled onto smartphones and social media, boys are more likely to be seduced into the world of video gaming, often at the expense of a focus on school
  • it’s no surprise to see this generation of boys struggling to make it to college: a full 57% of college admissions are granted to young women compared with only 43% to young men.
  • Economists working with the National Bureau of Economic Research recently demonstrated how many young U.S. men are choosing to play video games rather than join the workforce.
  • The destructive forces of psychology deployed by the tech industry are making a greater impact on kids than the positive uses of psychology by mental health providers and child advocates. Put plainly, the science of psychology is hurting kids more than helping them.
  • Hope for this wired generation has seemed dim until recently, when a surprising group has come forward to criticize the tech industry’s use of psychological manipulation: tech executives
  • Tristan Harris, formerly a design ethicist at Google, has led the way by unmasking the industry’s use of persuasive design. Interviewed in The Economist’s 1843 magazine, he says, “The job of these companies is to hook people, and they do that by hijacking our psychological vulnerabilities.”
  • Marc Benioff, CEO of the cloud computing company Salesforce, is one of the voices calling for the regulation of social media companies because of their potential to addict children. He says that just as the cigarette industry has been regulated, so too should social media companies. “I think that, for sure, technology has addictive qualities that we have to address, and that product designers are working to make those products more addictive, and we need to rein that back as much as possible,”
  • “If there’s an unfair advantage or things that are out there that are not understood by parents, then the government’s got to come forward and illuminate that.”
  • Since millions of parents, for example the parents of my patient Kelly, have absolutely no idea that devices are used to hijack their children’s minds and lives, regulation of such practices is the right thing to do.
  • Another improbable group to speak out on behalf of children is tech investors.
  • How has the consumer tech industry responded to these calls for change? By going even lower.
  • Facebook recently launched Messenger Kids, a social media app that will reach kids as young as five years old. Suggestive that harmful persuasive design is now honing in on very young children is the declaration of Messenger Kids Art Director, Shiu Pei Luu, “We want to help foster communication [on Facebook] and make that the most exciting thing you want to be doing.”
  • the American Psychological Association (APA) — which is tasked with protecting children and families from harmful psychological practices — has been essentially silent on the matter
  • APA Ethical Standards require the profession to make efforts to correct the “misuse” of the work of psychologists, which would include the application of B.J. Fogg’s persuasive technologies to influence children against their best interests
  • Manipulating children for profit without their own or parents’ consent, and driving kids to spend more time on devices that contribute to emotional and academic problems is the embodiment of unethical psychological practice.
  • “Never before in history have basically 50 mostly men, mostly 20–35, mostly white engineer designer types within 50 miles of where we are right now [Silicon Valley], had control of what a billion people think and do.”
  • Some may argue that it’s the parents’ responsibility to protect their children from tech industry deception. However, parents have no idea of the powerful forces aligned against them, nor do they know how technologies are developed with drug-like effects to capture kids’ minds
  • Others will claim that nothing should be done because the intention behind persuasive design is to build better products, not manipulate kids
  • similar circumstances exist in the cigarette industry, as tobacco companies have as their intention profiting from the sale of their product, not hurting children. Nonetheless, because cigarettes and persuasive design predictably harm children, actions should be taken to protect kids from their effects.
  • in a 1998 academic paper, Fogg describes what should happen if things go wrong, saying, if persuasive technologies are “deemed harmful or questionable in some regard, a researcher should then either take social action or advocate that others do so.”
  • I suggest turning to President John F. Kennedy’s prescient guidance: He said that technology “has no conscience of its own. Whether it will become a force for good or ill depends on man.”
  • The APA should begin by demanding that the tech industry’s behavioral manipulation techniques be brought out of the shadows and exposed to the light of public awareness
  • Changes should be made in the APA’s Ethics Code to specifically prevent psychologists from manipulating children using digital machines, especially if such influence is known to pose risks to their well-being.
  • Moreover, the APA should follow its Ethical Standards by making strong efforts to correct the misuse of psychological persuasion by the tech industry and by user experience designers outside the field of psychology.
  • It should join with tech executives who are demanding that persuasive design in kids’ tech products be regulated
  • The APA also should make its powerful voice heard amongst the growing chorus calling out tech companies that intentionally exploit children’s vulnerabilities.
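A minimal sketch, in Python, of the three-factor behavior model summarized in the notes above (motivation, ability, triggers); the scoring and threshold here are illustrative assumptions, not Fogg's published formulation:

# Toy reading of the model: a behavior fires only when a trigger arrives
# while the combination of motivation and ability clears some threshold.
def behavior_occurs(motivation: float, ability: float, trigger: bool,
                    threshold: float = 0.5) -> bool:
    # Scores are assumed to lie in [0, 1]; the product stands in for the idea
    # that high ability (an easy action) can compensate for low motivation.
    return trigger and (motivation * ability) >= threshold

# An easy, one-tap action succeeds once a notification (the trigger) arrives;
# without the trigger, or when the action is hard, nothing happens.
print(behavior_occurs(motivation=0.6, ability=0.9, trigger=True))   # True
print(behavior_occurs(motivation=0.6, ability=0.9, trigger=False))  # False: no trigger
print(behavior_occurs(motivation=0.6, ability=0.3, trigger=True))   # False: too hard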
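And a toy sketch of the adaptive targeting described in the notes on persuasion profiles: a generic epsilon-greedy bandit choosing among made-up notification variants. This is a textbook algorithm used here only for illustration, not any company's actual system.

import random

variants = ["friend tagged you", "you have 3 new likes", "don't miss out"]  # invented copy
clicks = {v: 0 for v in variants}   # observed "wins" per variant
shown = {v: 0 for v in variants}    # times each variant was tried

def pick_variant(epsilon: float = 0.1) -> str:
    # Explore until every variant has been tried (and occasionally thereafter);
    # otherwise exploit the variant with the best observed click rate.
    if min(shown.values()) == 0 or random.random() < epsilon:
        return random.choice(variants)
    return max(variants, key=lambda v: clicks[v] / shown[v])

def record(variant: str, clicked: bool) -> None:
    shown[variant] += 1
    clicks[variant] += int(clicked)

# A simulated user who, by assumption, responds most to social-proof wording.
true_rate = {"friend tagged you": 0.30, "you have 3 new likes": 0.20, "don't miss out": 0.05}
for _ in range(2000):
    v = pick_variant()
    record(v, random.random() < true_rate[v])

best = max(variants, key=lambda v: clicks[v] / shown[v])
print(best, shown)   # usually settles on whichever hook this user clicks most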
tonycheng6

Accurate machine learning in materials science facilitated by using diverse data sources - 0 views

  • Computational modelling is also used to estimate the properties of materials. However, there is usually a trade-off between the cost of the experiments (or simulations) and the accuracy of the measurements (or estimates), which has limited the number of materials that can be tested rigorously.
  • Materials scientists commonly supplement their own ‘chemical intuition’ with predictions from machine-learning models, to decide which experiments to conduct next
  • More importantly, almost all of these studies use models built on data obtained from a single, consistent source. Such models are referred to as single-fidelity models.
  • ...4 more annotations...
  • However, for most real-world applications, measurements of materials’ properties have varying levels of fidelity, depending on the resources available.
  • A comparison of prediction errors clearly demonstrates the benefit of the multi-fidelity approach
  • The authors’ system is not restricted to materials science, but is generalizable to any problem that can be described using graph structures, such as social networks and knowledge graphs (digital frameworks that represent knowledge as concepts connected by relationships)
  • More research is needed to understand the scenarios for which multi-fidelity learning is most beneficial, balancing prediction accuracy with the cost of acquiring data
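A minimal sketch of the multi-fidelity idea summarized in these notes, written as a toy delta-learning scheme in plain NumPy; the data are invented, and the authors' actual models are graph networks, not polynomials:

import numpy as np

rng = np.random.default_rng(0)

# Invented example: x is a one-dimensional material descriptor and "truth"
# is the property we want to predict (arbitrary units).
x = np.linspace(0.0, 1.0, 200)
truth = np.sin(2 * np.pi * x) + 0.5 * x

# Low fidelity: cheap, systematically biased estimates for every material.
y_low = truth + 0.3 + 0.05 * rng.normal(size=x.size)
# High fidelity: accurate but expensive, measured for only 10 materials.
hi = rng.choice(x.size, size=10, replace=False)
y_high = truth[hi] + 0.01 * rng.normal(size=hi.size)

# Single-fidelity baseline: a simple model fit only to the 10 expensive points.
single = np.polynomial.Polynomial.fit(x[hi], y_high, deg=2)

# Multi-fidelity: learn the cheap-to-expensive correction from those same
# 10 points, then apply it to the plentiful low-fidelity data everywhere.
delta = np.polynomial.Polynomial.fit(x[hi], y_high - y_low[hi], deg=2)
multi = y_low + delta(x)

rmse = lambda pred: float(np.sqrt(np.mean((pred - truth) ** 2)))
print("single-fidelity RMSE:", round(rmse(single(x)), 3))
print("multi-fidelity  RMSE:", round(rmse(multi), 3))  # typically far lower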
Javier E

Understanding What's Wrong With Facebook | Talking Points Memo - 0 views

  • to really understand the problem with Facebook we need to understand the structural roots of that problem, how much of it is baked into the core architecture of the site and its very business model
  • much of it is inherent in the core strategies of the post-2000, second wave Internet tech companies that now dominate our information space and economy.
  • Facebook is an ingenious engine for information and ideational manipulation.
  • ...17 more annotations...
  • Good old fashioned advertising does that to a degree. But Facebook is much more powerful, adaptive and efficient.
  • Facebook is designed to do specific things. It’s an engine to understand people’s minds and then manipulate their thinking.
  • Those tools are refined for revenue making but can be used for many other purposes. That makes it ripe for misuse and bad acting.
  • The core of all second wave Internet commerce operations was finding network models where costs grow mathematically and revenues grow exponentially.
  • The network and its dominance is the product and once it takes hold the cost inputs remained constrained while the revenues grow almost without limit.
  • Facebook is best understood as a fantastically profitable nuclear energy company whose profitability is based on dumping the waste on the side of the road and accepting frequent accidents and explosions as inherent to the enterprise.
  • That’s why these companies employ so few people relative to scale and profitability.
  • the problem of managing or distinguishing between legitimate and bad-acting uses of the powerful Facebook engine is one that would require huge, huge investments of money and armies of workers to manage
  • The core economic model requires doing all of it on the cheap. Indeed, what Zuckerberg et al. have created with Facebook is so vast that the money required not to do it on the cheap almost defies imagination.
  • Facebook’s core model and concept requires not taking responsibility for what others do with the engine created to drive revenue.
  • It all amounts to a grand exercise in socializing the externalities and keeping all the revenues for the owners.
  • Here’s a way to think about it. Nuclear power is actually incredibly cheap. The fuel is fairly plentiful and easy to pull out of the ground. You set up a little engine and it generates energy almost without limit. What makes it ruinously expensive is managing the externalities – all the risks and dangers, the radiation, accidents, the constant production of radioactive waste.
  • That’s why there’s no phone support for Google or Facebook or Twitter. If half the people on the planet are ‘customers’ or users that’s not remotely possible.
  • But back to Facebook. The point is that they’ve created a hugely powerful and potentially very dangerous machine
  • The core business model is based on harvesting the profits from the commercial uses of the machine and using algorithms and very, very limited personnel (relative to scale) to try to get a handle on the most outrageous and shocking abuses which the engine makes possible.
  • Zuckerberg may be a jerk and there really is a culture of bad acting within the organization. But it’s not about him being a jerk. Replace him and his team with non-jerks and you’d still have a similar core problem.
  • To manage the potential negative externalities, to take some responsibility for all the dangerous uses the engine makes possible would require money the owners are totally unwilling and in some ways are unable to spend.
Javier E

The 'E-Pimps' of OnlyFans - The New York Times - 0 views

  • Over the course of two dozen interviews spanning six countries, I’ve discovered a thriving warren of companies employing a similar business model, using ghostwriters on OnlyFans to provide digital intimacy at scale. These agencies operate, out of necessity, a little below the radar. They collectively represent hundreds of models, and some claim to bring in profits that can range into the seven figures annually.
  • OnlyFans started in 2016, and has since emerged as the top platform worldwide for creators to sell monthly subscriptions for self-produced erotic content. The platform has become synonymous with this sort of business, though some use it for other purposes.
  • The real product is relationships. Money from subscriptions can be trivial compared with the profits earned by selling custom videos, sexting sessions and other forms of fan interaction that require more concerted engagement than simply posting to a feed.
  • ...7 more annotations...
  • This can be extremely time-consuming: In an interview with this magazine last year, an OnlyFans creator said she spends six hours a day just sexting with subscribers. But these relationships are important to cultivate. In a blog post on its website, OnlyFans encourages creators to cater to their “superfans,” who pay for custom content and will “give more if they feel they’re getting something special.”
  • “Every page needs to have an established back story to make the person seem more believable,” it stated. OnlyFans works because people pay for a connection that feels deeper than porn. The document encouraged Ekko’s employees, called page managers, to identify “big spenders” who would part ways with more than $200 in short order, and cultivate a deep rapport by asking about their life and what they do for a living.
  • Above all, the manual emphasized efficiency. Managers were told to answer DMs in less than five minutes, since users were coming to OnlyFans for immediate gratification and would go elsewhere if ignored. It encouraged the creation of keyboard shortcuts, so that managers could deploy an arsenal of rote sexual phrases with a few keystrokes, steering conversations toward the hard sell. It also outlined a series of strategies to boost engagement on the pages, including a gambit in which models would offer to rate a picture of a subscriber’s penis for a fee.
  • But all of them take advantage of the same raw materials: the endless reproducibility of digital images; the widespread global availability of cheap English-speaking labor; and the world’s unquenchable desire for companionship.
  • The key to this business model is the ready availability of cheap English-speaking labor around the globe. Job postings for OnlyFans chatters are widespread on freelance sites like Upwork, many offering as little as $3 an hour. Agency heads told me they’ve hired workers from Eastern Europe, Africa and all across Southeast Asia. “At the end of the day, it is a geo-arbitrage business,”
  • This phenomenon is part of a broader boom in homespun online businesses that connect cheap developing-world labor with American consumers, allowing the proprietor to step back and reap the profits
  • During his stint as a chatter, Andre has become intimately familiar with the quirks and desires of the subscribers. Over time, he’s learned something of a sex-work cliché: More than sexual gratification, he said, many of the guys just want someone to talk to
Javier E

Predicting the Future Is Easier Than It Looks - By Michael D. Ward and Nils Metternich ... - 0 views

  • The same statistical revolution that changed baseball has now entered American politics, and no one has been more successful in popularizing a statistical approach to political analysis than New York Times blogger Nate Silver, who of course cut his teeth as a young sabermetrician. And on Nov. 6, after having faced a torrent of criticism from old-school political pundits -- Washington's rough equivalent of statistically illiterate tobacco chewing baseball scouts -- the results of the presidential election vindicated Silver's approach, which correctly predicted the electoral outcome in all 50 states.
  • Today, there are several dozen ongoing, public projects that aim to in one way or another forecast the kinds of things foreign policymakers desperately want to be able to predict: various forms of state failure, famines, mass atrocities, coups d'état, interstate and civil war, and ethnic and religious conflict. So while U.S. elections might occupy the front page of the New York Times, the ability to predict instances of extreme violence and upheaval represent the holy grail of statistical forecasting -- and researchers are now getting close to doing just that.
  • In 2010 scholars from the Political Instability Task Force published a report that demonstrated the ability to correctly predict onsets of instability two years in advance in 18 of 21 instances (about 85%)
  • ...5 more annotations...
  • Let's consider a case in which Ulfelder argues there is insufficient data to render a prediction -- North Korea. There is no official data on North Korean GDP, so what can we do? It turns out that the same data science approaches that were used to aggregate polls have other uses as well. One is the imputation of missing data. Yes, even when it is all missing. The basic idea is to use the general correlations among data that you do have to provide an aggregate way of estimating information that we don't have. (A small imputation sketch follows these notes.)
  • As it turned out, in this month's election public opinion polls were considerably more precise than the fundamentals. The fundamentals were not always providing bad predictions, but better is better.
  • In 2012 there were two types of models: one type based on fundamentals such as economic growth and unemployment and another based on public opinion surveys
  • There is a tradition in world politics to go either back until the Congress of Vienna (when there were fewer than two dozen independent countries) or to the early 1950s after the end of the Second World War. But in reality, there is no need to do this for most studies.
  • Ulfelder tells us that "when it comes to predicting major political crises like wars, coups, and popular uprisings, there are many plausible predictors for which we don't have any data at all, and much of what we do have is too sparse or too noisy to incorporate into carefully designed forecasting models." But this is true only for the old style of models based on annual data for countries. If we are willing to face data that are collected in rhythm with the phenomena we are studying, this is not the case
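A minimal sketch of the regression-style imputation idea mentioned in these notes; all numbers are invented, and real projects use far richer models and many more indicators:

import numpy as np

rng = np.random.default_rng(1)
n = 50

# Invented indicators that are observed for every country.
electricity = rng.normal(5.0, 1.0, n)
trade = 0.8 * electricity + rng.normal(0.0, 0.5, n)
# The quantity we care about, generated so it correlates with the indicators.
gdp = 2.0 * electricity + 1.0 * trade + rng.normal(0.0, 0.3, n)

gdp_obs = gdp.copy()
gdp_obs[:5] = np.nan            # pretend GDP is unreported for 5 countries

# Learn the relationship from the complete cases, then fill the gaps with it.
X = np.column_stack([np.ones(n), electricity, trade])
observed = ~np.isnan(gdp_obs)
beta, *_ = np.linalg.lstsq(X[observed], gdp_obs[observed], rcond=None)
imputed = X[~observed] @ beta

print("true   :", np.round(gdp[:5], 2))
print("imputed:", np.round(imputed, 2))   # close, because the correlations carry the signal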
Emily Freilich

The Man Who Would Teach Machines to Think - James Somers - The Atlantic - 1 views

  • Douglas Hofstadter, the Pulitzer Prize–winning author of Gödel, Escher, Bach, thinks we've lost sight of what artificial intelligence really means. His stubborn quest to replicate the human mind.
  • “If somebody meant by artificial intelligence the attempt to understand the mind, or to create something human-like, they might say—maybe they wouldn’t go this far—but they might say this is some of the only good work that’s ever been done
  • Their operating premise is simple: the mind is a very unusual piece of software, and the best way to understand how a piece of software works is to write it yourself.
  • ...43 more annotations...
  • “It depends on what you mean by artificial intelligence.”
  • Computers are flexible enough to model the strange evolved convolutions of our thought, and yet responsive only to precise instructions. So if the endeavor succeeds, it will be a double victory: we will finally come to know the exact mechanics of our selves—and we’ll have made intelligent machines.
  • Ever since he was about 14, when he found out that his youngest sister, Molly, couldn’t understand language, because she “had something deeply wrong with her brain” (her neurological condition probably dated from birth, and was never diagnosed), he had been quietly obsessed by the relation of mind to matter.
  • How could consciousness be physical? How could a few pounds of gray gelatin give rise to our very thoughts and selves?
  • Consciousness, Hofstadter wanted to say, emerged via just the same kind of “level-crossing feedback loop.”
  • In 1931, the Austrian-born logician Kurt Gödel had famously shown how a mathematical system could make statements not just about numbers but about the system itself.
  • But then AI changed, and Hofstadter didn’t change with it, and for that he all but disappeared.
  • By the early 1980s, the pressure was great enough that AI, which had begun as an endeavor to answer yes to Alan Turing’s famous question, “Can machines think?,” started to mature—or mutate, depending on your point of view—into a subfield of software engineering, driven by applications.
  • Take Deep Blue, the IBM supercomputer that bested the chess grandmaster Garry Kasparov. Deep Blue won by brute force.
  • Hofstadter wanted to ask: Why conquer a task if there’s no insight to be had from the victory? “Okay,” he says, “Deep Blue plays very good chess—so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?”
  • AI started working when it ditched humans as a model, because it ditched them. That’s the thrust of the analogy: Airplanes don’t flap their wings; why should computers think?
  • It’s a compelling point. But it loses some bite when you consider what we want: a Google that knows, in the way a human would know, what you really mean when you search for something
  • “Cognition is recognition,” he likes to say. He describes “seeing as” as the essential cognitive act: you see some lines as “an A,” you see a hunk of wood as “a table,” you see a meeting as “an emperor-has-no-clothes situation” and a friend’s pouting as “sour grapes”
  • How do you make a search engine that understands if you don’t know how you understand?
  • That’s what it means to understand. But how does understanding work?
  • analogy is “the fuel and fire of thinking,” the bread and butter of our daily mental lives.
  • there’s an analogy, a mental leap so stunningly complex that it’s a computational miracle: somehow your brain is able to strip any remark of the irrelevant surface details and extract its gist, its “skeletal essence,” and retrieve, from your own repertoire of ideas and experiences, the story or remark that best relates.
  • in Hofstadter’s telling, the story goes like this: when everybody else in AI started building products, he and his team, as his friend, the philosopher Daniel Dennett, wrote, “patiently, systematically, brilliantly,” way out of the light of day, chipped away at the real problem. “Very few people are interested in how human intelligence works,”
  • For more than 30 years, Hofstadter has worked as a professor at Indiana University at Bloomington
  • The quick unconscious chaos of a mind can be slowed down on the computer, or rewound, paused, even edited
  • project out of IBM called Candide. The idea behind Candide, a machine-translation system, was to start by admitting that the rules-based approach requires too deep an understanding of how language is produced; how semantics, syntax, and morphology work; and how words commingle in sentences and combine into paragraphs—to say nothing of understanding the ideas for which those words are merely conduits.
  • Hofstadter directs the Fluid Analogies Research Group, affectionately known as FARG.
  • Parts of a program can be selectively isolated to see how it functions without them; parameters can be changed to see how performance improves or degrades. When the computer surprises you—whether by being especially creative or especially dim-witted—you can see exactly why.
  • When you read Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, which describes in detail this architecture and the logic and mechanics of the programs that use it, you wonder whether maybe Hofstadter got famous for the wrong book.
  • But very few people, even admirers of GEB, know about the book or the programs it describes. And maybe that’s because FARG’s programs are almost ostentatiously impractical. Because they operate in tiny, seemingly childish “microdomains.” Because there is no task they perform better than a human.
  • “The entire effort of artificial intelligence is essentially a fight against computers’ rigidity.”
  • “Nobody is a very reliable guide concerning activities in their mind that are, by definition, subconscious,” he once wrote. “This is what makes vast collections of errors so important. In an isolated error, the mechanisms involved yield only slight traces of themselves; however, in a large collection, vast numbers of such slight traces exist, collectively adding up to strong evidence for (and against) particular mechanisms.
  • So IBM threw that approach out the window. What the developers did instead was brilliant, but so straightforward,
  • The technique is called “machine learning.” The goal is to make a device that takes an English sentence as input and spits out a French sentence
  • What you do is feed the machine English sentences whose French translations you already know. (Candide, for example, used 2.2 million pairs of sentences, mostly from the bilingual proceedings of Canadian parliamentary debates.)
  • By repeating this process with millions of pairs of sentences, you will gradually calibrate your machine, to the point where you’ll be able to enter a sentence whose translation you don’t know and get a reasonable result. (A toy version of this training loop appears after these notes.)
  • Google Translate team can be made up of people who don’t speak most of the languages their application translates. “It’s a bang-for-your-buck argument,” Estelle says. “You probably want to hire more engineers instead” of native speakers.
  • But the need to serve 1 billion customers has a way of forcing the company to trade understanding for expediency. You don’t have to push Google Translate very far to see the compromises its developers have made for coverage, and speed, and ease of engineering. Although Google Translate captures, in its way, the products of human intelligence, it isn’t intelligent itself.
  • “Did we sit down when we built Watson and try to model human cognition?” Dave Ferrucci, who led the Watson team at IBM, pauses for emphasis. “Absolutely not. We just tried to create a machine that could win at Jeopardy.”
  • For Ferrucci, the definition of intelligence is simple: it’s what a program can do. Deep Blue was intelligent because it could beat Garry Kasparov at chess. Watson was intelligent because it could beat Ken Jennings at Jeopardy.
  • “There’s a limited number of things you can do as an individual, and I think when you dedicate your life to something, you’ve got to ask yourself the question: To what end? And I think at some point I asked myself that question, and what it came out to was, I’m fascinated by how the human mind works, it would be fantastic to understand cognition, I love to read books on it, I love to get a grip on it”—he called Hofstadter’s work inspiring—“but where am I going to go with it? Really what I want to do is build computer systems that do something.
  • Peter Norvig, one of Google’s directors of research, echoes Ferrucci almost exactly. “I thought he was tackling a really hard problem,” he told me about Hofstadter’s work. “And I guess I wanted to do an easier problem.”
  • Of course, the folly of being above the fray is that you’re also not a part of it
  • As our machines get faster and ingest more data, we allow ourselves to be dumber. Instead of wrestling with our hardest problems in earnest, we can just plug in billions of examples of them.
  • Hofstadter hasn’t been to an artificial-intelligence conference in 30 years. “There’s no communication between me and these people,” he says of his AI peers. “None. Zero. I don’t want to talk to colleagues that I find very, very intransigent and hard to convince of anything
  • Everything from plate tectonics to evolution—all those ideas, someone had to fight for them, because people didn’t agree with those ideas.
  • Academia is not an environment where you just sit in your bath and have ideas and expect everyone to run around getting excited. It’s possible that in 50 years’ time we’ll say, ‘We really should have listened more to Doug Hofstadter.’ But it’s incumbent on every scientist to at least think about what is needed to get people to understand the ideas.”
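A toy version of the statistical training loop described in the notes on Candide: a miniature IBM-Model-1-style word-translation estimator run on an invented three-sentence corpus (not IBM's or Google's actual systems):

from collections import defaultdict

# Invented toy corpus of (English, French) sentence pairs.
pairs = [
    ("the house".split(), "la maison".split()),
    ("the book".split(), "le livre".split()),
    ("a book".split(), "un livre".split()),
]

t = defaultdict(lambda: 1.0)   # t[(f, e)]: word-translation weights, uniform start
for _ in range(20):            # expectation-maximization iterations
    count = defaultdict(float)
    total = defaultdict(float)
    for eng, fra in pairs:
        for f in fra:
            norm = sum(t[(f, e)] for e in eng)
            for e in eng:
                share = t[(f, e)] / norm    # expected alignment of f to e
                count[(f, e)] += share
                total[e] += share
    for (f, e), c in count.items():
        t[(f, e)] = c / total[e]            # re-estimate P(f | e)

# "livre" should emerge as the most probable translation of "book".
for f in ("livre", "le", "un"):
    print(f, round(t[(f, "book")], 2))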
Javier E

New Thinking and Old Books Revisited - NYTimes.com - 0 views

  • Mark Thoma’s classic crack — “I’ve learned that new economic thinking means reading old books” — has a serious point to it. We’ve had a couple of centuries of economic thought at this point, and quite a few smart people doing the thinking. It’s possible to come up with truly new concepts and approaches, but it takes a lot more than good intentions and casual observation to get there.
  • There is definitely a faction within economics that considers it taboo to introduce anything into its analysis that isn’t grounded in rational behavior and market equilibrium
  • what I do, and what everyone I’ve just named plus many others does, is a more modest, more eclectic form of analysis. You use maximization and equilibrium where it seems reasonably consistent with reality, because of its clarifying power, but you introduce ad hoc deviations where experience seems to demand them — downward rigidity of wages, balance-sheet constraints, bubbles (which are hard to predict, but you can say a lot about their consequences).
  • ...4 more annotations...
  • You may say that what we need is reconstruction from the ground up — an economics with no vestige of equilibrium analysis. Well, show me some results. As it happens, the hybrid, eclectic approach I’ve just described has done pretty well in this crisis, so you had better show me some really superior results before it gets thrown out the window.
  • if you think you’ve found a fundamental logical flaw in one of our workhorse economic models, the odds are very strong that you’ve just made a mistake.
  • it’s quite clear that the teaching of macroeconomics has gone seriously astray. As Saraceno says, the simple models that have proved so useful since 2008 are by and large taught only at the undergrad level — they’re treated as too simple, too ad hoc, whatever, to make it into the grad courses even at places that aren’t very ideological.
  • to temper your modeling with a sense of realism you need to know something about reality — and not just the statistical properties of U.S. time series since 1947. Economic history — global economic history — should be a core part of the curriculum. Nobody should be making pronouncements on macro without knowing a fair bit about the collapse of the gold standard in the 1930s, what actually happened in the stagflation of the 1970s, the Asian financial crisis of the 90s, and, looking forward, the euro crisis.
grayton downing

The Hospital Is No Place for the Elderly - Jonathan Rauch - The Atlantic - 0 views

  • The patient is feeble and near death, his bone marrow eviscerated by cancer. The supervising oncologist has ordered a course of chemotherapy using a very toxic investigational drug. Stuart knows enough to feel certain that the treatment will kill the patient, and he does not believe the patient understands this.
  • “On average, Medicare spends $20,870 per beneficiary who dies while in the hospital.”
  • advocating home-based primary care, which represents a fundamental change in the way we care for people who are chronically very ill. The idea is simple: rather than wait until people get sick and need hospitalization, you build a multidisciplinary team that visits them at home,
  • ...10 more annotations...
  • late-life care for the chronically sick is not only expensive but also, much too often, ineffective and inhumane. For years, the system seemed impervious to change.
  • Thanks to modern treatment, people commonly live into their 70s and 80s and even 90s, many of them with multiple chronic ailments.
  • five or more chronic conditions account for less than a fourth of Medicare’s beneficiaries but more than two-thirds of its spending—and they are the fastest-growing segment of the Medicare population. What to do with this burgeoning population of the frail elderly?
  • “I walked out of that room and said, ‘There has got to be a better way than this,’ ” he told me recently. “I was appalled by how we care for—or, more accurately, fail to care about—people who are near the end of life. We literally treat them to death.”
  • Home-based primary care comes in many varieties, but they share a treatment model and a business model. The treatment model begins from the counterintuitive premise that health care should not always be medical care.
  • by keeping patients out of the hospital whenever possible, saves Medicare upwards of $2,000 a month on each patient, maybe more
  • program collects whatever payment it can from Medicare and private insurance, it operates at a loss, and is run as a community service and a form of R&D.
  • Under the new health-care law, Medicare has begun using its financial clout to penalize hospitals that frequently readmit patients. Suddenly, hospitals are not so eager to see Grandma return for the third, fourth, or fifth time.
  • home-based model of primary care will be a challenge.
  • That would be like spiritual suicide right now,” he told me, “because there is so much going on. I’m more hopeful all the time. We’ve rolled the rock all the way to the top of the hill, and now we have to run to keep up as it rolls down the other side.”
Sophia C

Thomas Kuhn: Revolution Against Scientific Realism* - 1 views

  • as such a complex system that nobody believed that it corresponded to the physical reality of the universe. Although the Ptolemaic system accounted for observations-"saved the appearances"-its epicycles and deferents were never intended to be anything more than a mathematical model to use in predicting the position of heavenly bodies. [3]
  • Galileo that he was free to continue his work with Copernican theory if he agreed that the theory did not describe physical reality but was merely one of the many potential mathematical models. [10] Galileo continued to work, and while he "formally claimed to prove nothing," [11] he passed his mathematical advances and his observational data to Newton, who would not only invent a new mathematics but would solve the remaining problems posed by Copernicus. [12]
  • Thus without pretending that his method could find the underlying causes of things such as gravity, Newton believed that his method produced theory, based upon empirical evidence, that was a close approximation of physical reality.
  • ...27 more annotations...
  • Medieval science was guided by "logical consistency."
  • The logical empiricist's conception of scientific progress was thus a continuous one; more comprehensive theory replaced compatible, older theory
  • Hempel also believed that science evolved in a continuous manner. New theory did not contradict past theory: "theory does not simply refute the earlier empirical generalizations in its field; rather, it shows that within a certain limited range defined by qualifying conditions, the generalizations hold true in fairly close approximation." [21]
  • New theory is more comprehensive; the old theory can be derived from the newer one and is one special manifestation" [22] of the more comprehensive new theory.
  • movement combined induction, based on empiricism, and deduction in the form of logic
  • It was the truth, and the prediction and control that came with it, that was the goal of logical-empirical science.
  • Each successive theory's explanation was closer to the truth than the theory before.
  • The notion of scientific realism held by Newton led to the evolutionary view of the progress of science
  • The entities and processes of theory were believed to exist in nature, and science should discover those entities and processes
  • Particularly disturbing discoveries were made in the area of atomic physics. For instance, Heisenberg's indeterminacy principle, according to historian of science Cecil Schneer, yielded the conclusion that "the world of nature is indeterminate."
  • "even the fundamental principle of causality fail[ed]."
  • It was not until the second half of the twentieth century that the preservers of the evolutionary idea of scientific progress, the logical empiricists, were seriously challenged
  • revolutionary model of scientific change and examined the role of the scientific community in preventing and then accepting change. Kuhn's conception of scientific change occurring through revolutions undermined the traditional scientific goal, finding "truth" in nature
  • Textbooks inform scientists-to-be about this common body of knowledge and understanding.
  • for the world is too huge and complex to be explored randomly.
  • a scientist knows what facts are relevant and can build on past research
  • Normal science, as defined by Kuhn, is cumulative. New knowledge fills a gap of ignorance
  • ne standard product of the scientific enterprise is missing. Normal science does not aim at novelties of fact or theory and, when successful, finds none."
  • contain a mechanism that uncovers anomaly, inconsistencies within the paradigm.
  • eventually, details arise that are inconsistent with the current paradigm
  • These inconsistencies are eventually resolved or are ignored.
  • If they concern a topic of central importance, a crisis occurs and normal science comes to a halt
  • that the scientists re-examine the foundations of their science that they had been taking for granted
  • it resolves the crisis better than the others, it offers promise for future research, and it is more aesthetic than its competitors. The reasons for converting to a new paradigm are never completely rational.
  • Unlike evolutionary science, in which new knowledge fills a gap of ignorance, in Kuhn's model new knowledge replaces incompatible knowledge.
  • Thus science is not a continuous or cumulative endeavor: when a paradigm shift occurs there is a revolution similar to a political revolution, with fundamental and pervasive changes in method and understanding. Each successive vision about the nature of the universe makes the past vision obsolete; predictions, though more precise, remain similar to the predictions of the past paradigm in their general orientation, but the new explanations do not accommodate the old
  • In a sense, we have circled back to the ancient and medieval practice of separating scientific theory from physical reality; both medieval scientists and Kuhn would agree that no theory corresponds to reality and therefore any number of theories might equally well explain a natural phenomenon. [36] Neither twentieth-century atomic theorists nor medieval astronomers are able to claim that their theories accurately describe physical phenomena. The inability to return to scientific realism suggests a tripartite division of the history of science, with a period of scientific realism fitting between two periods in which there is no insistence that theory correspond to reality. Although both scientific realism and the evolutionary idea of scientific progress appeal to common sense, both existed for only a few hundred years.
dpittenger

Departing Leader of CERN Ponders Uncertainties That Lie Ahead - 0 views

  • Dr. Heuer, born in Bad Boll in southern Germany in 1948, has spent his career in the trenches of particle physics, in which scientists emulate 3-year-olds by smashing bits of matter together to see what comes out.
  • He had an opportunity to put that philosophy to the test early in his term at CERN, when physicists reported in a seminar there that they had measured subatomic particles known as neutrinos streaming from Geneva to their detector in Italy faster than the speed of light, contrary to the laws of physics then known.
  • The neutrino controversy helped set a sort of dubious stage for the main event in particle physics so far this century: the Higgs boson.
  • ...2 more annotations...
  • The Higgs boson completed the Standard Model, a suite of equations that agrees with all the experiments that have been done on earth. But that model is not the end of physics. It does not explain dark matter or dark energy, the two major constituents of the cosmos, for example, or why the universe is made of matter instead of antimatter.
  • For decades, theorists have flirted with a concept called supersymmetry that would address some of these issues and produce a bounty of new particles for CERN’s collider.
Javier E

Uber's Business Model Could Change Your Work - NYTimes.com - 0 views

  • Just as Uber is doing for taxis, new technologies have the potential to chop up a broad array of traditional jobs into discrete tasks that can be assigned to people just when they’re needed, with wages set by a dynamic measurement of supply and demand, and every worker’s performance constantly tracked, reviewed and subject to the sometimes harsh light of customer satisfaction.
  • Uber and its ride-sharing competitors, including Lyft and Sidecar, are the boldest examples of this breed, which many in the tech industry see as a new kind of start-up — one whose primary mission is to efficiently allocate human beings and their possessions, rather than information.
  • “I do think we are defining a new category of work that isn’t full-time employment but is not running your own business either,”
  • ...11 more annotations...
  • Various companies are now trying to emulate Uber’s business model in other fields, from daily chores like grocery shopping and laundry to more upmarket products like legal services and even medicine.
  • Proponents of on-demand work point out that many of the tech giants that sprang up over the last decade minted billions in profits without hiring very many people; Facebook, for instance, serves more than a billion users, but employs only a few thousand highly skilled workers, most of them in California.
  • But the rise of such work could also make your income less predictable and your long-term employment less secure. And it may relegate the idea of establishing a lifelong career to a distant memory.
  • “This on-demand economy means a work life that is unpredictable, doesn’t pay very well and is terribly insecure.” After interviewing many workers in the on-demand world, Dr. Reich said he has concluded that “most would much rather have good, well-paying, regular jobs.”
  • “We may end up with a future in which a fraction of the work force would do a portfolio of things to generate an income — you could be an Uber driver, an Instacart shopper, an Airbnb host and a Taskrabbit,”
  • at the end of 2014, Uber had 160,000 drivers regularly working for it in the United States. About 40,000 new drivers signed up in December alone, and the number of sign-ups was doubling every six months.
  • The report found that on average, Uber’s drivers worked fewer hours and earned more per hour than traditional taxi drivers, even when you account for their expenses. That conclusion, though, has raised fierce debate among economists, because it’s not clear how much Uber drivers really are paying in expenses. Drivers on the service use their own cars and pay for their gas; taxi drivers generally do not.
  • A survey of Uber drivers contained in the report found that most were already employed full or part time when they found Uber, and that earning an additional income on the side was a primary benefit of driving for Uber.
  • The larger worry about on-demand jobs is not about benefits, but about a lack of agency — a future in which computers, rather than humans, determine what you do, when and for how much. The rise of Uber-like jobs is the logical culmination of an economic and tech system that holds efficiency as its paramount virtue.
  • “These services are successful because they are tapping into people’s available time more efficiently,” Dr. Sundararajan said. “You could say that people are monetizing their own downtime.” Think about that for a second; isn’t “monetizing downtime” a hellish vision of the future of work?
  • “I’m glad if people like working for Uber, but those subjective feelings have got to be understood in the context of there being very few alternatives,” Dr. Reich said. “Can you imagine if this turns into a Mechanical Turk economy, where everyone is doing piecework at all odd hours, and no one knows when the next job will come, and how much it will pay? What kind of private lives can we possibly have, what kind of relationships, what kind of families?”
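To make the supply-and-demand mechanics in the excerpts above concrete, here is a minimal sketch of how a surge multiplier and a driver's net hourly pay might be computed. It is not Uber's actual algorithm, which is proprietary; every number below (the price cap, the platform's cut, the per-hour fuel-and-wear cost) is a hypothetical placeholder.

```python
# Minimal sketch of dynamic supply-and-demand pricing and net driver pay.
# Hypothetical parameters throughout; Uber's real pricing is proprietary.

def surge_multiplier(ride_requests: int, available_drivers: int,
                     cap: float = 3.0) -> float:
    """Raise the fare multiplier when demand outstrips driver supply, up to a cap."""
    if available_drivers == 0:
        return cap
    return max(1.0, min(cap, ride_requests / available_drivers))

def driver_hourly_net(gross_fares: float, platform_cut: float = 0.25,
                      fuel_and_wear: float = 4.50) -> float:
    """Rough net hourly pay: fares minus the platform's cut and car expenses."""
    return gross_fares * (1.0 - platform_cut) - fuel_and_wear

if __name__ == "__main__":
    m = surge_multiplier(ride_requests=120, available_drivers=40)  # demand is 3x supply
    gross = 22.0 * m                     # hypothetical gross fares per hour
    print(f"surge multiplier:    {m:.2f}")
    print(f"driver net per hour: ${driver_hourly_net(gross):.2f}")
```

The debate the report raised turns almost entirely on the expense line: whichever value is plugged in for fuel_and_wear (and for depreciation, which this sketch omits) largely decides whether Uber drivers come out ahead of traditional cabbies.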
Emilio Ergueta

Lessons from Gaming #2: Random Universe | Talking Philosophy - 0 views

  • My experiences as a tabletop and video gamer have taught me numerous lessons that are applicable to the real world (assuming there is such a thing). One key skill in getting about in reality is the ability to model reality.
  • Many games, such as Call of Cthulhu, D&D, Pathfinder and Star Fleet Battles, make extensive use of dice to model the vagaries of reality; a brief dice-roll sketch follows this list.
  • even if things could have been different, it does not follow that chance is real. After all, chance is not the only thing that could make a difference.
  • ...6 more annotations...
  • I do not know if the universe is random (contains elements of chance). After all, we tend to attribute chance to the unpredictable, but this unpredictability might be a matter of ignorance rather than chance.
  • Being a gamer, I find it natural to look at reality as also being random—after all, if a random model (gaming system) nicely fits aspects of reality, that suggests the model has things right. As such, I tend to think of this as a random universe in which God (or whatever) plays dice with us.
  • Obviously, there is no way to prove that choice occurs—as with chance versus determinism, without simply knowing the brute fact about choice there is no way to know whether the universe allows for choice or not.
  • because of chance, the results of any choice cannot be known with certainty
  • if things can fail or go wrong because of chance, then it makes sense to be more forgiving and understanding of failure—at least when the failure can be attributed in part to chance.
  • the role of chance in success and failure should be considered when planning and creating policies.
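A few lines of code are enough to show the dice mechanic the post is describing. This is a generic d20-style skill check, not the rules of any particular game named above; the difficulty number and the modifiers are made up.

```python
# How tabletop games model chance: the player's choice sets the odds,
# but the outcome of any single attempt remains uncertain.
import random

def skill_check(modifier: int, difficulty: int, sides: int = 20) -> bool:
    """Roll one die, add the character's modifier, succeed on meeting the difficulty."""
    return random.randint(1, sides) + modifier >= difficulty

def success_rate(modifier: int, difficulty: int, trials: int = 100_000) -> float:
    """Estimate the probability of success by repeated simulation."""
    return sum(skill_check(modifier, difficulty) for _ in range(trials)) / trials

if __name__ == "__main__":
    # A trained character (+5) versus an untrained one (+0) against difficulty 15.
    print(f"trained:   {success_rate(5, 15):.1%}")   # about 55%
    print(f"untrained: {success_rate(0, 15):.1%}")   # about 30%
```

The excerpts map directly onto the two functions: the choice of modifier is up to the player, the individual roll is not, and only across many trials does the underlying probability show itself, which is why planning and policy should account for variance rather than assume the typical outcome.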
Javier E

In Defense of Naïve Reading - NYTimes.com - 1 views

  • Clearly, poems and novels and paintings were not produced as objects for future academic study; there is no a priori reason to think that they could be suitable objects of “research.” By and large they were produced for the pleasure and enlightenment of those who enjoyed them.
  • But just as clearly, the teaching of literature in universities, especially after the 19th-century research model of Humboldt University of Berlin was widely copied, needed a justification consistent with the aims of that academic setting.
  • The main aim was research: the creation, accumulation and transmission of knowledge. And the main model was the natural-science model of collaborative research: define problems, break them down into manageable parts, create sub-disciplines and sub-sub-disciplines for the study of these, train students for such research specialties, and share everything. With that model, what literature and all the arts needed was something like a general “science of meaning” that could eventually fit that sort of aspiration. Texts or artworks could be analyzed as exemplifying and so helping establish such a science. Results could be published in scholarly journals and disputed by others; consensus would eventually emerge; and so on.
  • ...3 more annotations...
  • literature study in a university education requires some method of evaluation of whether the student has done well or poorly. Students’ papers must be graded and no faculty member wants to face the inevitable “that’s just your opinion” unarmed, as it were. Learning how to use a research methodology, providing evidence that one has understood and can apply such a method, is understandably an appealing pedagogy
  • Literature and the arts have a dimension unique in the academy, not shared by the objects studied, or “researched” by our scientific brethren. They invite or invoke, at a kind of “first level,” an aesthetic experience that is by its nature resistant to restatement in more formalized, theoretical or generalizing language. This response can certainly be enriched by knowledge of context and history, but the objects express a first-person or subjective view of human concerns that is falsified if wholly transposed to a more “sideways on” or third person view.
  • such works also can directly deliver a kind of practical knowledge and self-understanding not available from a third-person or more general formulation of such knowledge. There is no reason to think that such knowledge — exemplified in what Aristotle said about the practically wise man (the phronimos) or in what Pascal meant by the difference between l’esprit géométrique and l’esprit de finesse — is any less knowledge because it cannot be so formalized or even taught as such.
Javier E

Next Stop: 100,000 Dead? - 0 views

  • A model is not a report sent back from the future. It's an exercise in taking what we know, what we think we know, and what we have no idea about, making some educated guesses about how those three pieces will interact, and coming up with a probabilistic set of possible future outcomes.
  • Models change as new data comes in (adding to the "stuff we know" inputs) and as the universe of the other two inputs ("stuff we think we know" and "stuff we have no idea about") changes.
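A toy version of that description: feed the model uncertain inputs, sample them many times, and report a range of outcomes rather than a single number; when new data narrows the input range, the output range narrows with it. Every figure below (the starting count, the growth-rate bounds, the horizon) is invented for illustration and is not any group's actual projection.

```python
# Toy probabilistic projection: uncertain inputs in, a distribution of outcomes out.
# All numbers are invented; this is not an epidemiological model.
import random

def project(current: float, growth_range: tuple, weeks: int,
            samples: int = 10_000) -> list:
    """Sample an uncertain weekly growth rate and project the total forward."""
    outcomes = []
    for _ in range(samples):
        g = random.uniform(*growth_range)   # the "stuff we think we know" input
        outcomes.append(current * (1 + g) ** weeks)
    return sorted(outcomes)

def summarize(outcomes: list) -> str:
    lo, mid, hi = (outcomes[int(len(outcomes) * q)] for q in (0.05, 0.50, 0.95))
    return f"5th pct {lo:,.0f} | median {mid:,.0f} | 95th pct {hi:,.0f}"

if __name__ == "__main__":
    # A wide prior before new observations arrive...
    print("before:", summarize(project(60_000, (0.02, 0.15), weeks=8)))
    # ...and a narrower one after new data constrains the growth rate.
    print("after: ", summarize(project(60_000, (0.04, 0.08), weeks=8)))
```

The "after" run is the same model; only the inputs have tightened, which is the sense in which a model changes as data comes in.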
Javier E

The Real Trouble With Economics - NYTimes.com - 1 views

  • far from acting as a free-spirited improviser, Bernanke has been largely implementing recipes developed in the academic literature years before.
  • They also misunderstand the nature of economists’ predictive failures. It’s true that few economists predicted the onset of crisis. Once crisis struck, however, basic macroeconomic models did a very good job in key respects — in particular, they did much better than people who relied on their intuitive feelings.
  • wonks who relied on suitably interpreted IS-LM confidently declared that all this intuition, based on experiences in a different environment, would prove wrong — and they were right. From my point of view, these past 5 years have been a triumph for and a vindication of economic modeling (a bare-bones IS-LM sketch appears after this list).
  • ...5 more annotations...
  • Yet obviously something is deeply wrong with economics. While economists using textbook macro models got things mostly and impressively right, many famous economists refused to use those models — in fact, they made it clear in discussion that they didn’t understand points that had been worked out generations ago.
  • Moreover, it’s hard to find any economists who changed their minds when their predictions, say of sharply higher inflation, turned out wrong.
  • let’s grant that economics as practiced doesn’t look like a science. But that’s not because the subject is inherently unsuited to the scientific method. Sure, it’s highly imperfect — it’s a complex area, and our understanding is in its early stages.
  • And sure, the economy itself changes over time, so that what was true 75 years ago may not be true today — although what really impresses you if you study macro, in particular, is the continuity, so that Bagehot and Wicksell and Irving Fisher and, of course, Keynes remain quite relevant today.
  • No, the problem lies not so much in the inherent unsuitability of economics for scientific thinking as in the sociology of the economics profession — a profession that somehow, at least in macro, has ceased rewarding research that produces successful predictions and instead rewards research that fits preconceptions.
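For readers who have never seen the textbook machinery Krugman is invoking, IS-LM in its simplest linear form is just two simultaneous equations, a goods-market (IS) curve and a money-market (LM) curve, solved for output Y and the interest rate r. The sketch below uses made-up, textbook-exercise-style coefficients purely to show the mechanics; it is not a calibrated model anyone used for forecasting.

```python
# Bare-bones linear IS-LM solver. Coefficients are illustrative, not estimated.
#   IS (goods market):  Y = a + b*(Y - T) + e - d*r + G
#   LM (money market):  M/P = k*Y - h*r
import numpy as np

def solve_is_lm(a=200, b=0.75, T=100, e=200, d=25, G=100,
                M=1000, P=1.0, k=1.0, h=100):
    # Rearranged into A @ [Y, r] = rhs:
    #   (1 - b)*Y + d*r = a - b*T + e + G
    #        k*Y - h*r = M / P
    A = np.array([[1 - b,  d],
                  [k,     -h]])
    rhs = np.array([a - b * T + e + G, M / P])
    Y, r = np.linalg.solve(A, rhs)
    return Y, r

if __name__ == "__main__":
    Y0, r0 = solve_is_lm()
    Y1, r1 = solve_is_lm(G=200)   # fiscal expansion shifts the IS curve outward
    print(f"baseline:        Y = {Y0:.0f}, r = {r0:.1f}")   # Y = 1350, r = 3.5
    print(f"higher G (+100): Y = {Y1:.0f}, r = {r1:.1f}")   # Y = 1550, r = 5.5
```

The forecasts the excerpt alludes to came from adding features such as a binding zero lower bound to this framework, which the sketch omits; the point here is only to show mechanically what a "textbook macro model" is.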