
New Media Ethics 2009 course: Group items tagged Play


Weiye Loh

Breakthrough Europe: Towards a Social Theory of Climate Change - 0 views

  • Lever-Tracy confronted sociologists head on about their worrisome silence on the issue. Why have sociologists failed to address the greatest and most overwhelming challenge facing modern society? Why have the figureheads of the discipline, such as Anthony Giddens and Ulrich Beck, so far refused to apply their seminal notions of structuration and the risk society to the issue?
  • Earlier, we re-published an important contribution by Ulrich Beck, the world-renowned German sociologist and a Breakthrough Senior Fellow. More recently, Current Sociology published a powerful response by Reiner Grundmann of Aston University and Nico Stehr of Zeppelin University.
  • sociologists should not rush into the discursive arena without asking some critical questions in advance, questions such as: What exactly could sociology contribute to the debate? And, is there something we urgently need that is not addressed by other disciplines or by political proposals?
  • ...12 more annotations...
  • The authors disagree with Lever-Tracy's observation that the lack of interest in climate change among sociologists is driven by a widespread suspicion of naturalistic explanations, teleological arguments and environmental determinism.
  • While conceding that Lever-Tracy's observation may be partially true, the authors argue that more important processes are at play, including cautiousness on the part of sociologists to step into a heavily politicized debate; methodological differences with the natural sciences; and sensitivity about locating climate change in the longue durée.
  • Secondly, while Lever-Tracy argues that "natural and social change are now in lockstep with each other, operating on the same scales," and that therefore a multidisciplinary approach is needed, Grundmann and Stehr suggest that the true challenge is interdisciplinarity, as opposed to multidisciplinarity.
  • Thirdly, and this is possibly the most striking observation of the article, Grundmann and Stehr challenge Lever-Tracy's argument that natural scientists have successfully made the case for anthropogenic climate change, and that therefore social scientists should cease to endlessly question this scientific consensus on the basis of a skeptical postmodern 'deconstructionism'.
  • As opposed to both Lever-Tracy's positivist view and the radical postmodern deconstructionist view, Grundmann and Stehr take the social constructivist view, which argues that every idea is socially constructed and therefore the product of human interpretation and communication. This raises the 'intractable' specters of discourse and framing, to which we will return in a second.
  • Finally, Lever-Tracy holds that climate change needs to be posited "firmly at the heart of the discipline." Grundmann and Stehr, however, emphasize that "if this is going to [be] more than wishful thinking, we need to carefully consider the prospects of such an enterprise."
  • The importance of framing climate change in a way that allows it to resonate with the concerns of the average citizen is an issue that the Breakthrough Institute has long emphasized. The apocalyptic politics of fear often associated with climate change, in particular, tends to have a counterproductive effect on public opinion. Realizing this, Grundmann and Stehr issue an important warning to sociologists: "the inherent alarmism in many social science contributions on climate change merely repeats the central message provided by mainstream media." In other words, such alarmism fails to provide the kind of distantiated observation needed to approach the issue with at least a mild degree of objectivity or impartiality.
  • While this tension is symptomatic of many social scientific attempts to get involved, we propose to study these very underlying assumptions. For example, we should ask: Does the dramatization of events lead to effective political responses? Do we need a politics of fear? Is scientific consensus instrumental for sound policies? And more generally, what are the relations between a changing technological infrastructure, social shifts and belief systems? What contribution can bottom-up initiatives have in fighting climate change? What roles are there for markets, hierarchies and voluntary action? How was it possible that the 'fight against climate change' rose from a marginal discourse to a hegemonic one (from heresy to dogma)? And will the discourse remain hegemonic or will too much public debate about climate change lead to 'climate change fatigue'?
  • In this respect, Grundmann and Stehr make another crucial observation: "the severity of a problem does not mean that we as sociologists should forget about our analytical apparatus." Bringing the analytical apparatus of sociology back in, the hunting season for positivist approaches to knowledge and nature is opened. Grundmann and Stehr consequently criticize not only Lever-Tracy's unspoken adherence to a positivist nature-society duality, taking instead a more dialectical Marxian approach to the relationship between man and his environment, but they also criticize her idea that incremental increases in our scientific knowledge of climate change and its impacts will automatically coalesce into successful and meaningful policy responses.
  • Political decisions about climate change are made on the basis of scientific research and a host of other (economic, political, cultural) considerations. Regarding the scientific dimension, it is a common perception (one that Lever-Tracy seems to share) that the more knowledge we have, the better the political response will be. This is the assumption of the linear model of policy-making that has been dominant in the past but debunked time and again (Godin, 2006). What we increasingly realize is that knowledge creation leads to an excess of information and 'objectivity' (Sarewitz, 2000). Even the consensual mechanisms of the IPCC lead to an increase in options because knowledge about climate change increases.
  • Instead, Grundmann and Stehr propose to look carefully at how we frame climate change socially and whether the hegemonic climate discourse is actually contributing to successful political action or hampering it. Defending this social constructivist approach from the unfounded allegation that it would play into the hands of the climate skeptics, the authors note that defining climate change as a social construction ... is not to diminish its importance, relevance, or reality. It simply means that sociologists study the process whereby something (like anthropogenic climate change) is transformed from a conjecture into an accepted fact. With regard to policy, we observe a near exclusive focus on carbon dioxide emissions. This framing has proven counterproductive, as the Hartwell paper and other sources demonstrate (see Eastin et al., 2010; Prins et al., 2010). Reducing carbon emissions in the short term is among the most difficult tasks. More progress could be made by a re-framing of the issue, not as an issue of human sinfulness, but of human dignity. [emphasis added]
  • These observations allow the authors to come full circle, arriving right back at their first observation about the real reasons why sociologists have so far kept silent on climate change. Somehow, "there seems to be the curious conviction that lest you want to be accused of helping the fossil fuel lobbies and the climate skeptics, you better keep quiet."
  •  
    Towards a Social Theory of Climate Change
Weiye Loh

The Greening of the American Brain - TIME - 0 views

  • The past few years have seen a marked decline in the percentage of Americans who believe what scientists say about climate, with belief among conservatives falling especially fast. It's true that the science community has hit some bumps — the IPCC was revealed to have made a few dumb errors in its recent assessment, and the "Climategate" hacked emails showed scientists behaving badly. But nothing changed the essential truth that more man-made CO2 means more warming; in fact, the basic scientific case has only gotten stronger. Yet still, much of the American public remains unconvinced — and importantly, last November that public returned control of the House of Representatives to a Republican party that is absolutely hostile to the basic truths of climate science.
  • facts and authority alone may not shift people's opinions on climate science or many other topics. That was the conclusion I took from the Climate, Mind and Behavior conference, a meeting of environmentalists, neuroscientists, psychologists and sociologists that I attended last week at the Garrison Institute in New York's Hudson Valley. We like to think of ourselves as rational creatures who select from the choices presented to us for maximum individual utility — indeed, that's the essential principle behind most modern economics. But when you do assume rationality, the politics of climate change get confusing. Why would so many supposedly rational human beings choose to ignore overwhelming scientific authority?
  • Maybe because we're not actually so rational after all, as research is increasingly showing. Emotions and values — not always fully conscious — play an enormous role in how we process information and make choices. We are beset by cognitive biases that throw what would be sound decision-making off-balance. Take loss aversion: psychologists have found that human beings tend to be more concerned about avoiding losses than achieving gains, holding onto what they have even when this is not in their best interests. That has a simple parallel to climate politics: environmentalists argue that the shift to a low-carbon economy will create abundant new green jobs, but for many people, that prospect of future gain — even if it comes with a safer planet — may not be worth the risk of losing the jobs and economy they have.
  • ...4 more annotations...
  • Group identification also plays a major role in how we make decisions — and that's another way facts can get filtered. Declining belief in climate science has been, for the most part in America, a conservative phenomenon. On the surface, that's curious: you could expect Republicans to be skeptical of economic solutions to climate change like a carbon tax, since higher taxes tend to be a Democratic policy, but scientific information ought to be non-partisan. Politicians never debate the physics of space travel after all, even if they argue fiercely over the costs and priorities associated with it. That, however, is the power of group thinking; for most conservative Americans, the very idea of climate science has been poisoned by ideologues who seek to advance their economic arguments by denying scientific fact. No additional data — new findings about CO2 feedback loops or better modeling of ice sheet loss — is likely to change their mind.
  • What's the answer for environmentalists? Change the message and frame the issue in a way that doesn't trigger unconscious opposition among so many Americans. That can be as simple as using the right labels: a recent study by researchers at the University of Michigan found that Republicans are less skeptical of "climate change" than "global warming," possibly because climate change sounds less specific. Possibly, too, because so broad a term includes the severe snowfalls of the past winter that can be a paradoxical result of a generally warmer world. Greens should also pin their message on subjects that are less controversial, like public health or national security. Instead of issuing dire warnings about an apocalyptic future — which seems to make many Americans stop listening — better to talk about the present generation's responsibility to the future, to bequeath their children and grandchildren a safer and healthier planet.
  • The bright side of all this irrationality is that it means human beings can act in ways that sometimes go against their immediate utility, sacrificing their own interests for the benefit of the group.
  • Our brains develop socially, not just selfishly, which means sustainable behavior — and salvation for the planet — may not be as difficult as it sometimes seems. We can motivate people to help stop climate change — it may just not be climate science that convinces them to act.
Weiye Loh

gladwell dot com - something borrowed - 0 views

  • Intellectual-property doctrine isn't a straightforward application of the ethical principle "Thou shalt not steal." At its core is the notion that there are certain situations where you can steal. The protections of copyright, for instance, are time-limited; once something passes into the public domain, anyone can copy it without restriction. Or suppose that you invented a cure for breast cancer in your basement lab. Any patent you received would protect your intellectual property for twenty years, but after that anyone could take your invention.
  • You get an initial monopoly on your creation because we want to provide economic incentives for people to invent things like cancer drugs. But everyone gets to steal your breast-cancer cure—after a decent interval—because it is also in society's interest to let as many people as possible copy your invention; only then can others learn from it, and build on it, and come up with better and cheaper alternatives. This balance between the protecting and the limiting of intellectual property
  • Stanford law professor Lawrence Lessig argues in his new book "Free Culture": In ordinary language, to call a copyright a "property" right is a bit misleading, for the property of copyright is an odd kind of property. . . . I understand what I am taking when I take the picnic table you put in your backyard. I am taking a thing, the picnic table, and after I take it, you don't have it. But what am I taking when I take the good idea you had to put a picnic table in the backyard—by, for example, going to Sears, buying a table, and putting it in my backyard? What is the thing that I am taking then? The point is not just about the thingness of picnic tables versus ideas, though that is an important difference. The point instead is that in the ordinary case—indeed, in practically every case except for a narrow range of exceptions—ideas released to the world are free. I don't take anything from you when I copy the way you dress—though I might seem weird if I do it every day. . . . Instead, as Thomas Jefferson said (and this is especially true when I copy the way someone dresses), "He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me."
  • ...5 more annotations...
  • Lessig argues that, when it comes to drawing this line between private interests and public interests in intellectual property, the courts and Congress have, in recent years, swung much too far in the direction of private interests.
  • We could have sat in his living room playing at musical genealogy for hours. Did the examples upset him? Of course not, because he knew enough about music to know that these patterns of influence—cribbing, tweaking, transforming—were at the very heart of the creative process.
  • True, copying could go too far. There were times when one artist was simply replicating the work of another, and to let that pass inhibited true creativity. But it was equally dangerous to be overly vigilant in policing creative expression, because if Led Zeppelin hadn't been free to mine the blues for inspiration we wouldn't have got "Whole Lotta Love," and if Kurt Cobain couldn't listen to "More Than a Feeling" and pick out and transform the part he really liked we wouldn't have "Smells Like Teen Spirit"—and, in the evolution of rock, "Smells Like Teen Spirit" was a real step forward from "More Than a Feeling." A successful music executive has to understand the distinction between borrowing that is transformative and borrowing that is merely derivative, and that distinction, I realized, was what was missing from the discussion of Bryony Lavery's borrowings. Yes, she had copied my work. But no one was asking why she had copied it, or what she had copied, or whether her copying served some larger purpose.
  • It also matters how Lavery chose to use my words. Borrowing crosses the line when it is used for a derivative work. It's one thing if you're writing a history of the Kennedys, like Doris Kearns Goodwin, and borrow, without attribution, from another history of the Kennedys. But Lavery wasn't writing another profile of Dorothy Lewis. She was writing a play about something entirely new—about what would happen if a mother met the man who killed her daughter. And she used my descriptions of Lewis's work and the outline of Lewis's life as a building block in making that confrontation plausible.
  • this is the second problem with plagiarism. It is not merely extremist. It has also become disconnected from the broader question of what does and does not inhibit creativity. We accept the right of one writer to engage in a full-scale knockoff of another—think how many serial-killer novels have been cloned from "The Silence of the Lambs." Yet, when Kathy Acker incorporated parts of a Harold Robbins sex scene verbatim in a satiric novel, she was denounced as a plagiarist (and threatened with a lawsuit)
  •  
    Under copyright law, what matters is not that you copied someone else's work. What matters is what you copied, and how much you copied.
Weiye Loh

Smithsonian's Crowdsourced "The Art Of Video Games" Exhibition Comes Under Fire | The C... - 0 views

  • My initial concern about the current show was its apparent lack of perspective. The strength of a curated show comes from the choice and arrangement of the works, and I worried that with a crowdsourced show like this, it would be hard to form a central thesis. What makes each of these games influential and how will those qualities come together to paint a moving picture of games as an art medium? I wasn’t sure this list particularly answered those questions.
  • They’ve avoided directly addressing the question of why video games are art, and instead danced around it, showing a number of wonderful games and explaining why each is great. Despite this success, though, I feel that the show was still damaged by the crowdsourced curation approach. While I agree that the player is a major component of games (as Abe Stein recently posted to his blog, “A game not played is no game at all”), the argument that because games are played by the public they should be publicly curated doesn’t necessarily follow for me, especially when the resultant list is so muddled.
  • Despite Chris’ apparent love for the games, the show doesn’t feel as strongly curated as it could have been, overly heavy in some places, and completely missing in others, and I think that is a result of the crowdsourcing. Although I’m sure Chris has a fantastic perspective that will tie this all together beautifully and the resulting show will be enjoyable and successful, I wish that he had just selected a strong list of games on his own and been confident with his picks.
  • ...1 more annotation...
  • perhaps it would have been nice not to side-step the question of why these games, as a whole, are important as art. Considering this is the first major American art institution to put on a video game show, I would have liked to see a more powerful statement about the medium.
Weiye Loh

TODAYonline | Commentary | Trust us, we're academics ... or should you? - 0 views

  • the 2011 Edelman Trust Barometer, published by research firm StrategyOne, which surveyed 5,075 "informed publics" in 23 countries on their trust in business, government, institutions and individuals. One of the questions asked of respondents was: "If you heard information about a company from one of these people, how credible would that information be?". Of the eight groups of individuals - academic/expert, technical expert in company, financial/industry analyst, CEO, non-governmental organisation representative, government official, person like myself, and regular employee - academic/expert came out tops with a score of 70 per cent, followed by technical expert at 64 per cent.
  • the film on the global financial crisis, Inside Job, which won the 2011 Academy Award for best documentary. One of the documentary's themes is the role a number of renowned academics, particularly academic economists, played in the global crisis. It highlighted potentially serious conflicts of interest related to significant compensation derived by these academics serving on boards of financial services firms and advising such firms.
  • Often, these academics also played key roles in shaping government policies relating to deregulation - most appear allergic to regulation of the financial services industry. The documentary argued that these academics from Ivy League universities had basically become advocates for financial services firms, which blinded them to firms' excesses. It noted that few academic economists saw the financial crisis coming, and suggested this might be because they were too busy making money from the industry.
  • ...12 more annotations...
  • It is difficult to say if the "failure" of the academics was due to an unstinting belief in free markets or conflicts of interest. Parts of the movie did appear to be trying too hard to prove the point. However, the threat posed by academics earning consulting fees that dwarf their academic compensation, and which might therefore impair their independence, is a real one.
  • One of the worst was the Ivy League university economics professor engaged by the Icelandic Chamber of Commerce to co-author a report on the Icelandic financial system. He concluded that the system was sound even though there were numerous warning signs. When he was asked how he arrived at his conclusions, he said he had talked to people and was misled by them. One wonders how much of his conclusions were actually based on rigorous analysis.
  • it is troubling if academics merely become mouthpieces for vested interests. The impression one gets from watching the movie certainly does not fit with the high level of trust in academics shown by the Edelman Trust Barometer.
  • As an academic, I have often been told that I can be independent and objective - that I should have no axe to grind and no wheels to grease. However, I worry about an erosion of trust in academics. This may be especially true in certain disciplines like business (which is mine, incidentally).
  • too many business school professors are serving on US corporate boards and have lost their willingness to be critical about unethical business practices. In corporate scandals such as Enron and Satyam, academics from top business schools have not particularly covered themselves in glory.
  • It is more and more common for universities - in the US and here - to invite business people to serve on their boards.
  • universities and academics may lose their independence and objectivity in commenting on business issues critically, for fear of offending those who ultimately have an oversight role over the varsity's senior management.
  • Universities might also have business leaders serving on boards as potential donors, which would also confuse the role of board members and lead to conflicts of interest. In the Satyam scandal in India, the founder of Satyam sat on the board of the Indian School of Business, while the Dean of the Indian School of Business sat on Satyam's board. Satyam also made a significant donation to the Indian School of Business.
  • Universities are increasingly dependent on funding from industry and wealthy individuals as well as other sources, sometimes even dubious ones. The recent scandal at the London School of Economics involving its affiliation with Libya is an example.
  • It is important for universities to have robust gift policies as part of the risk management to protect their reputation, which can be easily tainted if a donation comes from a questionable source. It is especially important that donations do not cause universities to be captured by vested interests.
  • From time to time, people in industry ask me if I have been pressured by the university to tone down on my outspokenness on corporate governance issues. Thankfully, while there have been instances where varsity colleagues and friends in industry have conveyed messages from others to "tone down", I have felt relatively free to express my views. Of course, were I trying to earn more money from external consulting, I guess I would be less vocal.
  • I do worry about the loss of independence and, therefore, trust in academics and academic institutions if we are not careful about it.
Weiye Loh

BrainGate gives paralysed the power of mind control | Science | The Observer - 0 views

  • brain-computer interface, or BCI
  • is a branch of science exploring how computers and the human brain can be meshed together. It sounds like science fiction (and can look like it too), but it is motivated by a desire to help chronically injured people. They include those who have lost limbs, people with Lou Gehrig's disease, or those who have been paralysed by severe spinal-cord injuries. But the group of people it might help the most are those whom medicine assumed were beyond all hope: sufferers of "locked-in syndrome".
  • These are often stroke victims whose perfectly healthy minds end up trapped inside bodies that can no longer move. The most famous example was French magazine editor Jean-Dominique Bauby who managed to dictate a memoir, The Diving Bell and the Butterfly, by blinking one eye. In the book, Bauby, who died in 1997 shortly after the book was published, described the prison his body had become for a mind that still worked normally.
  • ...9 more annotations...
  • Now the project is involved with a second set of human trials, pushing the technology to see how far it goes and trying to miniaturise it and make it wireless for a better fit in the brain. BrainGate's concept is simple. It posits that the problem for most patients does not lie in the parts of the brain that control movement, but with the fact that the pathways connecting the brain to the rest of the body, such as the spinal cord, have been broken. BrainGate plugs into the brain, picks up the right neural signals and beams them into a computer where they are translated into moving a cursor or controlling a computer keyboard. By this means, paralysed people can move a robot arm or drive their own wheelchair, just by thinking about it.
  • he and his team are decoding the language of the human brain. This language is made up of electronic signals fired by billions of neurons and it controls everything from our ability to move, to think, to remember and even our consciousness itself. Donoghue's genius was to develop a deceptively small device that can tap directly into the brain and pick up those signals for a computer to translate them. Gold wires are implanted into the brain's tissue at the motor cortex, which controls movement. Those wires feed back to a tiny array – an information storage device – attached to a "pedestal" in the skull. Another wire feeds from the array into a computer. A test subject with BrainGate looks like they have a large plug coming out the top of their heads. Or, as Donoghue's son once described it, they resemble the "human batteries" in The Matrix.
  • BrainGate's highly advanced computer programs are able to decode the neuron signals picked up by the wires and translate them into the subject's desired movement. In crude terms, it is a form of mind-reading based on the idea that thinking about moving a cursor to the right will generate detectably different brain signals than thinking about moving it to the left. (A toy sketch of this decoding idea appears after this list.)
  • The technology has developed rapidly, and last month BrainGate passed a vital milestone when one paralysed patient went past 1,000 days with the implant still in her brain and allowing her to move a computer cursor with her thoughts. The achievement, reported in the prestigious Journal of Neural Engineering, showed that the technology can continue to work inside the human body for unprecedented amounts of time.
  • Donoghue talks enthusiastically of one day hooking up BrainGate to a system of electronic stimulators plugged into the muscles of the arm or legs. That would open up the prospect of patients moving not just a cursor or their wheelchair, but their own bodies.
  • If Nagle's motor cortex was no longer working healthily, the entire BrainGate project could have been rendered pointless. But when Nagle was plugged in and asked to imagine moving his limbs, the signals beamed out with a healthy crackle. "We asked him to imagine moving his arm to the left and to the right and we could hear the activity," Donoghue says. When Nagle first moved a cursor on a screen using only his thoughts, he exclaimed: "Holy shit!"
  • BrainGate and other BCI projects have also piqued the interest of the government and the military. BCI is melding man and machine like no other sector of medicine or science and there are concerns about some of the implications. First, beyond detecting and translating simple movement commands, BrainGate may one day pave the way for mind-reading. A device to probe the innermost thoughts of captured prisoners or dissidents would prove very attractive to some future military or intelligence service. Second, there is the idea that BrainGate or other BCI technologies could pave the way for robot warriors controlled by distant humans using only their minds. At a conference in 2002, a senior American defence official, Anthony Tether, enthused over BCI. "Imagine a warrior with the intellect of a human and the immortality of a machine." Anyone who has seen Terminator might worry about that.
  • Donoghue acknowledges the concerns but has little time for them. When it comes to mind-reading, current BrainGate technology has enough trouble with translating commands for making a fist, let alone probing anyone's mental secrets
  • As for robot warriors, Donoghue was slightly more circumspect. At the moment most BCI research, including BrainGate projects, that touch on the military is focused on working with prosthetic limbs for veterans who have lost arms and legs. But Donoghue thinks it is healthy for scientists to be aware of future issues. "As long as there is a rational dialogue and scientists think about where this is going and what is the reasonable use of the technology, then we are on a good path," he says.
  •  
    The robotic arm clutched a glass and swung it over a series of coloured dots that resembled a Twister gameboard. Behind it, a woman sat entirely immobile in a wheelchair. Slowly, the arm put the glass down, narrowly missing one of the dots. "She's doing that!" exclaims Professor John Donoghue, watching a video of the scene on his office computer - though the woman onscreen had not moved at all. "She actually has the arm under her control," he says, beaming with pride. "We told her to put the glass down on that dot." The woman, who is almost completely paralysed, was using Donoghue's groundbreaking technology to control the robot arm using only her thoughts. Called BrainGate, the device is implanted into her brain and hooked up to a computer to which she sends mental commands. The video played on, giving Donoghue, a silver-haired and neatly bearded man of 62, even more reason to feel pleased. The patient was not satisfied with her near miss and the robot arm lifted the glass again. After a brief hover, the arm positioned the glass on the dot.
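
One of the annotations above describes the decoding step only in principle: imagined "left" and "right" movements produce distinguishably different firing patterns, and software maps a new recording onto a cursor direction. The Python sketch below is a minimal, purely hypothetical illustration of that idea, not BrainGate's actual software; the simulated firing rates, the number of "neurons", and the nearest-centroid decoder are all assumptions invented for the example.

import numpy as np

# Toy illustration only (not BrainGate's algorithm): decode an imagined cursor
# direction from simulated firing rates of a small set of hypothetical neurons.
rng = np.random.default_rng(0)
N_NEURONS = 16

# Assumed mean firing rates (spikes/sec) for each imagined direction.
mean_left = rng.uniform(5, 30, N_NEURONS)
mean_right = mean_left + rng.normal(0, 8, N_NEURONS)  # direction-tuned offsets

def record_trial(mean_rates, noise=4.0):
    # Simulate one noisy recording of firing rates from the implanted array.
    return mean_rates + rng.normal(0, noise, N_NEURONS)

# "Calibration" trials: the subject imagines each movement while we record.
train_left = np.array([record_trial(mean_left) for _ in range(50)])
train_right = np.array([record_trial(mean_right) for _ in range(50)])

# Nearest-centroid decoder: store the average activity pattern per direction.
centroids = {"left": train_left.mean(axis=0), "right": train_right.mean(axis=0)}

def decode(trial):
    # Map a new firing-rate vector to whichever stored pattern it is closest to.
    return min(centroids, key=lambda d: np.linalg.norm(trial - centroids[d]))

# A fresh trial in which the subject imagines moving the cursor to the right
# is usually decoded as "right".
print(decode(record_trial(mean_right)))

The real system works with far richer signals and models, but the calibrate-then-classify loop is the same basic shape the article describes.
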
Weiye Loh

Google's Next Mission: Fighting Violent Extremism | Fast Company - 0 views

  • Technology, of course, is playing a role both in recruiting members to extremist groups and in fueling pro-democracy and other movements--and that’s where Google’s interest lies. "Technology is a part of every challenge in the world, and a part of every solution,” Cohen tells Fast Company. "To the extent that we can bring that technology expertise, and mesh it with the Council on Foreign Relations’ academic expertise--and mesh all of that with the expertise of those who have had these experiences--that's a valuable network to explore these questions."
  • Cohen is the former State Department staffer who is best known for his efforts to bring technology into the country’s diplomatic efforts. But he was originally hired by Condoleezza Rice back in 2006 for a different--though related--purpose: to help Foggy Bottom better understand Middle Eastern youths (many of whom were big technology adopters) and how they could best be "deradicalized." Last fall, Cohen joined Google as head of its nascent Google Ideas, which the company is labeling a "think/do tank."
  • This summer’s conference, "Summit Against Violent Extremism," takes place June 26-29 and will bring together about 50 former members of extremist groups--including former neo-Nazis, Muslim fundamentalists, and U.S. gang members--along with another 200 representatives from civil society organizations, academia, private corporations, and victims groups. The hope is to identify some common factors that cause young people to join violent organizations, and to form a network of people working on the issue who can collaborate going forward.
  • ...1 more annotation...
  • One of the technologies where extremism is playing out these days is in Google’s own backyard. While citizen empowerment movements have made use of YouTube to broadcast their messages, so have terrorist and other groups. Just this week, anti-Hamas extremists kidnapped an Italian peace activist and posted their hostage video to YouTube first before eventually murdering him. YouTube has been criticized in the past for not removing violent videos quickly enough. But Cohen says the conference is looking at the root causes that prompt a young person to join one of the groups in the first place. "There are a lot of different dimensions to this challenge," he says. "It’s important not to conflate everything."
  •  
    Neo-Nazi groups and al Qaeda might not seem to have much in common, but they do in one key respect: their recruits tend to be very young. The head of Google's new think tank, Jared Cohen, believes there might be some common reasons why young people are drawn to violent extremist groups, no matter their ideological or philosophical bent. So this summer, Cohen is spearheading a conference, in Dublin, Ireland, to explore what it is that draws young people to these groups and what can be done to redirect them.
Weiye Loh

Edge: HOW DOES OUR LANGUAGE SHAPE THE WAY WE THINK? By Lera Boroditsky - 0 views

  • Do the languages we speak shape the way we see the world, the way we think, and the way we live our lives? Do people who speak different languages think differently simply because they speak different languages? Does learning new languages change the way you think? Do polyglots think differently when speaking different languages?
  • For a long time, the idea that language might shape thought was considered at best untestable and more often simply wrong. Research in my labs at Stanford University and at MIT has helped reopen this question. We have collected data around the world: from China, Greece, Chile, Indonesia, Russia, and Aboriginal Australia.
  • What we have learned is that people who speak different languages do indeed think differently and that even flukes of grammar can profoundly affect how we see the world.
  • ...15 more annotations...
  • Suppose you want to say, "Bush read Chomsky's latest book." Let's focus on just the verb, "read." To say this sentence in English, we have to mark the verb for tense; in this case, we have to pronounce it like "red" and not like "reed." In Indonesian you need not (in fact, you can't) alter the verb to mark tense. In Russian you would have to alter the verb to indicate tense and gender. So if it was Laura Bush who did the reading, you'd use a different form of the verb than if it was George. In Russian you'd also have to include in the verb information about completion. If George read only part of the book, you'd use a different form of the verb than if he'd diligently plowed through the whole thing. In Turkish you'd have to include in the verb how you acquired this information: if you had witnessed this unlikely event with your own two eyes, you'd use one verb form, but if you had simply read or heard about it, or inferred it from something Bush said, you'd use a different verb form.
  • Clearly, languages require different things of their speakers. Does this mean that the speakers think differently about the world? Do English, Indonesian, Russian, and Turkish speakers end up attending to, partitioning, and remembering their experiences differently just because they speak different languages?
  • For some scholars, the answer to these questions has been an obvious yes. Just look at the way people talk, they might say. Certainly, speakers of different languages must attend to and encode strikingly different aspects of the world just so they can use their language properly. Scholars on the other side of the debate don't find the differences in how people talk convincing. All our linguistic utterances are sparse, encoding only a small part of the information we have available. Just because English speakers don't include the same information in their verbs that Russian and Turkish speakers do doesn't mean that English speakers aren't paying attention to the same things; all it means is that they're not talking about them. It's possible that everyone thinks the same way, notices the same things, but just talks differently.
  • Believers in cross-linguistic differences counter that everyone does not pay attention to the same things: if everyone did, one might think it would be easy to learn to speak other languages. Unfortunately, learning a new language (especially one not closely related to those you know) is never easy; it seems to require paying attention to a new set of distinctions. Whether it's distinguishing modes of being in Spanish, evidentiality in Turkish, or aspect in Russian, learning to speak these languages requires something more than just learning vocabulary: it requires paying attention to the right things in the world so that you have the correct information to include in what you say.
  • Follow me to Pormpuraaw, a small Aboriginal community on the western edge of Cape York, in northern Australia. I came here because of the way the locals, the Kuuk Thaayorre, talk about space. Instead of words like "right," "left," "forward," and "back," which, as commonly used in English, define space relative to an observer, the Kuuk Thaayorre, like many other Aboriginal groups, use cardinal-direction terms — north, south, east, and west — to define space.1 This is done at all scales, which means you have to say things like "There's an ant on your southeast leg" or "Move the cup to the north northwest a little bit." One obvious consequence of speaking such a language is that you have to stay oriented at all times, or else you cannot speak properly. The normal greeting in Kuuk Thaayorre is "Where are you going?" and the answer should be something like "Southsoutheast, in the middle distance." If you don't know which way you're facing, you can't even get past "Hello."
  • The result is a profound difference in navigational ability and spatial knowledge between speakers of languages that rely primarily on absolute reference frames (like Kuuk Thaayorre) and languages that rely on relative reference frames (like English).2 Simply put, speakers of languages like Kuuk Thaayorre are much better than English speakers at staying oriented and keeping track of where they are, even in unfamiliar landscapes or inside unfamiliar buildings. What enables them — in fact, forces them — to do this is their language. Having their attention trained in this way equips them to perform navigational feats once thought beyond human capabilities. Because space is such a fundamental domain of thought, differences in how people think about space don't end there. People rely on their spatial knowledge to build other, more complex, more abstract representations. Representations of such things as time, number, musical pitch, kinship relations, morality, and emotions have been shown to depend on how we think about space. So if the Kuuk Thaayorre think differently about space, do they also think differently about other things, like time? This is what my collaborator Alice Gaby and I came to Pormpuraaw to find out.
  • To test this idea, we gave people sets of pictures that showed some kind of temporal progression (e.g., pictures of a man aging, or a crocodile growing, or a banana being eaten). Their job was to arrange the shuffled photos on the ground to show the correct temporal order. We tested each person in two separate sittings, each time facing in a different cardinal direction. If you ask English speakers to do this, they'll arrange the cards so that time proceeds from left to right. Hebrew speakers will tend to lay out the cards from right to left, showing that writing direction in a language plays a role.3 So what about folks like the Kuuk Thaayorre, who don't use words like "left" and "right"? What will they do? The Kuuk Thaayorre did not arrange the cards more often from left to right than from right to left, nor more toward or away from the body. But their arrangements were not random: there was a pattern, just a different one from that of English speakers. Instead of arranging time from left to right, they arranged it from east to west. That is, when they were seated facing south, the cards went left to right. When they faced north, the cards went from right to left. When they faced east, the cards came toward the body and so on. This was true even though we never told any of our subjects which direction they faced. The Kuuk Thaayorre not only knew that already (usually much better than I did), but they also spontaneously used this spatial orientation to construct their representations of time.
  • I have described how languages shape the way we think about space, time, colors, and objects. Other studies have found effects of language on how people construe events, reason about causality, keep track of number, understand material substance, perceive and experience emotion, reason about other people's minds, choose to take risks, and even in the way they choose professions and spouses.8 Taken together, these results show that linguistic processes are pervasive in most fundamental domains of thought, unconsciously shaping us from the nuts and bolts of cognition and perception to our loftiest abstract notions and major life decisions. Language is central to our experience of being human, and the languages we speak profoundly shape the way we think, the way we see the world, the way we live our lives.
  • The fact that even quirks of grammar, such as grammatical gender, can affect our thinking is profound. Such quirks are pervasive in language; gender, for example, applies to all nouns, which means that it is affecting how people think about anything that can be designated by a noun.
  • How does an artist decide whether death, say, or time should be painted as a man or a woman? It turns out that in 85 percent of such personifications, whether a male or female figure is chosen is predicted by the grammatical gender of the word in the artist's native language. So, for example, German painters are more likely to paint death as a man, whereas Russian painters are more likely to paint death as a woman.
  • Does treating chairs as masculine and beds as feminine in the grammar make Russian speakers think of chairs as being more like men and beds as more like women in some way? It turns out that it does. In one study, we asked German and Spanish speakers to describe objects having opposite gender assignment in those two languages. The descriptions they gave differed in a way predicted by grammatical gender. For example, when asked to describe a "key" — a word that is masculine in German and feminine in Spanish — the German speakers were more likely to use words like "hard," "heavy," "jagged," "metal," "serrated," and "useful," whereas Spanish speakers were more likely to say "golden," "intricate," "little," "lovely," "shiny," and "tiny." To describe a "bridge," which is feminine in German and masculine in Spanish, the German speakers said "beautiful," "elegant," "fragile," "peaceful," "pretty," and "slender," and the Spanish speakers said "big," "dangerous," "long," "strong," "sturdy," and "towering." This was true even though all testing was done in English, a language without grammatical gender. The same pattern of results also emerged in entirely nonlinguistic tasks (e.g., rating similarity between pictures). And we can also show that it is aspects of language per se that shape how people think: teaching English speakers new grammatical gender systems influences mental representations of objects in the same way it does with German and Spanish speakers. Apparently even small flukes of grammar, like the seemingly arbitrary assignment of gender to a noun, can have an effect on people's ideas of concrete objects in the world.
  • Even basic aspects of time perception can be affected by language. For example, English speakers prefer to talk about duration in terms of length (e.g., "That was a short talk," "The meeting didn't take long"), while Spanish and Greek speakers prefer to talk about time in terms of amount, relying more on words like "much," "big," and "little" rather than "short" and "long." Our research into such basic cognitive abilities as estimating duration shows that speakers of different languages differ in ways predicted by the patterns of metaphors in their language. (For example, when asked to estimate duration, English speakers are more likely to be confused by distance information, estimating that a line of greater length remains on the test screen for a longer period of time, whereas Greek speakers are more likely to be confused by amount, estimating that a container that is fuller remains longer on the screen.)
  • An important question at this point is: Are these differences caused by language per se or by some other aspect of culture? Of course, the lives of English, Mandarin, Greek, Spanish, and Kuuk Thaayorre speakers differ in a myriad of ways. How do we know that it is language itself that creates these differences in thought and not some other aspect of their respective cultures? One way to answer this question is to teach people new ways of talking and see if that changes the way they think. In our lab, we've taught English speakers different ways of talking about time. In one such study, English speakers were taught to use size metaphors (as in Greek) to describe duration (e.g., a movie is larger than a sneeze), or vertical metaphors (as in Mandarin) to describe event order. Once the English speakers had learned to talk about time in these new ways, their cognitive performance began to resemble that of Greek or Mandarin speakers. This suggests that patterns in a language can indeed play a causal role in constructing how we think.6 In practical terms, it means that when you're learning a new language, you're not simply learning a new way of talking, you are also inadvertently learning a new way of thinking. Beyond abstract or complex domains of thought like space and time, languages also meddle in basic aspects of visual perception — our ability to distinguish colors, for example. Different languages divide up the color continuum differently: some make many more distinctions between colors than others, and the boundaries often don't line up across languages.
  • To test whether differences in color language lead to differences in color perception, we compared Russian and English speakers' ability to discriminate shades of blue. In Russian there is no single word that covers all the colors that English speakers call "blue." Russian makes an obligatory distinction between light blue (goluboy) and dark blue (siniy). Does this distinction mean that siniy blues look more different from goluboy blues to Russian speakers? Indeed, the data say yes. Russian speakers are quicker to distinguish two shades of blue that are called by the different names in Russian (i.e., one being siniy and the other being goluboy) than if the two fall into the same category. For English speakers, all these shades are still designated by the same word, "blue," and there are no comparable differences in reaction time. Further, the Russian advantage disappears when subjects are asked to perform a verbal interference task (reciting a string of digits) while making color judgments but not when they're asked to perform an equally difficult spatial interference task (keeping a novel visual pattern in memory). The disappearance of the advantage when performing a verbal task shows that language is normally involved in even surprisingly basic perceptual judgments — and that it is language per se that creates this difference in perception between Russian and English speakers.
  • What it means for a language to have grammatical gender is that words belonging to different genders get treated differently grammatically and words belonging to the same grammatical gender get treated the same grammatically. Languages can require speakers to change pronouns, adjective and verb endings, possessives, numerals, and so on, depending on the noun's gender. For example, to say something like "my chair was old" in Russian (moy stul bil' stariy), you'd need to make every word in the sentence agree in gender with "chair" (stul), which is masculine in Russian. So you'd use the masculine form of "my," "was," and "old." These are the same forms you'd use in speaking of a biological male, as in "my grandfather was old." If, instead of speaking of a chair, you were speaking of a bed (krovat'), which is feminine in Russian, or about your grandmother, you would use the feminine form of "my," "was," and "old."
  •  
    For a long time, the idea that language might shape thought was considered at best untestable and more often simply wrong. Research in my labs at Stanford University and at MIT has helped reopen this question. We have collected data around the world: from China, Greece, Chile, Indonesia, Russia, and Aboriginal Australia. What we have learned is that people who speak different languages do indeed think differently and that even flukes of grammar can profoundly affect how we see the world. Language is a uniquely human gift, central to our experience of being human. Appreciating its role in constructing our mental lives brings us one step closer to understanding the very nature of humanity.
Weiye Loh

Data Without Borders - 0 views

  •  
    Data is everywhere, but use of data is not. So many of our efforts are centered around making money or getting people to buy more things, and this is understandable; however, there are neglected areas that could actually have a huge impact on the way we live. Jake Porway, a data scientist at The New York Times, has a proposition for you, tentatively called Data Without Borders. [T]here are lots of NGOs and non-profits out there doing wonderful things for the world, from rehabilitating criminals, to battling hunger, to providing clean drinking water. However, they're increasingly finding themselves with more and more data about their practices, their clients, and their missions that they don't have the resources or budgets to analyze. At the same time, the data/dev communities love hacking together weekend projects where we play with new datasets or build helpful scripts, but they usually just culminate in a blog post or some Twitter buzz. Wouldn't it be rad if we could get these two sides together?
Weiye Loh

World Bank Institute: We're also the data bank - video | Media | guardian.co.uk - 0 views

  •  
    Aleem Walji, practice manager for innovation at the World Bank Institute, which assists and advises policy makers and NGOs, tells the Guardian's Activate summit in London about the organisation's commitment to open data
Weiye Loh

Harvard professor spots Web search bias - Business - The Boston Globe - 0 views

  • Sweeney said she has no idea why Google searches seem to single out black-sounding names. There could be myriad issues at play, some associated with the software, some with the people searching Google. For example, the more often searchers click on a particular ad, the more frequently it is displayed subsequently. “Since we don’t know the reason for it,” she said, “it’s hard to say what you need to do.” (A toy simulation of this click-feedback effect appears after this list.)
  • But Danny Sullivan, editor of SearchEngineLand.com, an online trade publication that tracks the Internet search and advertising business, said Sweeney’s research has stirred a tempest in a teapot. “It looks like this fairly isolated thing that involves one advertiser.” He also said that the results could be caused by black Google users clicking on those ads as much as white users. “It could be that black people themselves could be causing the stuff that causes the negative copy to be selected more,” said Sullivan. “If most of the searches for black names are done by black people . . . is that racially biased?”
  • On the other hand, Sullivan said Sweeney has uncovered a problem with online searching — the casual display of information that might put someone in a bad light. Rather than focusing on potential instances of racism, he said, search services such as Google might want to put more restrictions on displaying negative information about anyone, black or white.
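
The click-reinforcement mechanism mentioned above (ads that get clicked more are shown more often) can be made concrete with a small simulation. This is not Google's actual ad-ranking system: the two hypothetical ads, their click probabilities, and the shown-in-proportion-to-past-clicks rule are assumptions chosen only to show how a modest difference in user clicking behaviour can snowball into a large difference in how often each ad is displayed.

import random

# Toy simulation only (not Google's ad system): if an ad is shown in proportion
# to its accumulated clicks, a small difference in how often users click each
# ad tends to be amplified into a large difference in display share.
random.seed(1)
clicks = {"neutral_ad": 1, "negative_ad": 1}              # tiny starting prior
click_prob = {"neutral_ad": 0.05, "negative_ad": 0.06}    # assumed user behaviour

def pick_ad():
    # Choose an ad with probability proportional to its accumulated clicks.
    ads = list(clicks)
    return random.choices(ads, weights=[clicks[a] for a in ads], k=1)[0]

impressions = {"neutral_ad": 0, "negative_ad": 0}
for _ in range(100_000):                  # simulate a stream of searches
    ad = pick_ad()
    impressions[ad] += 1
    if random.random() < click_prob[ad]:  # the user may click the ad shown
        clicks[ad] += 1

# Typically the ad with the slightly higher click rate ends up with the large
# majority of impressions, even though user behaviour differs only slightly.
print(impressions)

Whichever side of the loop the initial difference comes from, advertiser copy or searcher behaviour as Sullivan suggests, the feedback structure itself is what makes the resulting pattern hard to attribute to a single cause.
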
Weiye Loh

The Death of Postmodernism And Beyond | Philosophy Now - 0 views

  • Most of the undergraduates who will take ‘Postmodern Fictions’ this year will have been born in 1985 or after, and all but one of the module’s primary texts were written before their lifetime. Far from being ‘contemporary’, these texts were published in another world, before the students were born: The French Lieutenant’s Woman, Nights at the Circus, If on a Winter’s Night a Traveller, Do Androids Dream of Electric Sheep? (and Blade Runner), White Noise: this is Mum and Dad’s culture. Some of the texts (‘The Library of Babel’) were written even before their parents were born. Replace this cache with other postmodern stalwarts – Beloved, Flaubert’s Parrot, Waterland, The Crying of Lot 49, Pale Fire, Slaughterhouse 5, Lanark, Neuromancer, anything by B.S. Johnson – and the same applies. It’s all about as contemporary as The Smiths, as hip as shoulder pads, as happening as Betamax video recorders. These are texts which are just coming to grips with the existence of rock music and television; they mostly do not dream even of the possibility of the technology and communications media – mobile phones, email, the internet, computers in every house powerful enough to put a man on the moon – which today’s undergraduates take for granted.
  • somewhere in the late 1990s or early 2000s, the emergence of new technologies re-structured, violently and forever, the nature of the author, the reader and the text, and the relationships between them.
  • Postmodernism, like modernism and romanticism before it, fetishised [ie placed supreme importance on] the author, even when the author chose to indict or pretended to abolish him or herself. But the culture we have now fetishises the recipient of the text to the degree that they become a partial or whole author of it. Optimists may see this as the democratisation of culture; pessimists will point to the excruciating banality and vacuity of the cultural products thereby generated (at least so far).
  • ...17 more annotations...
  • Pseudo-modernism also encompasses contemporary news programmes, whose content increasingly consists of emails or text messages sent in commenting on the news items. The terminology of ‘interactivity’ is equally inappropriate here, since there is no exchange: instead, the viewer or listener enters – writes a segment of the programme – then departs, returning to a passive role. Pseudo-modernism also includes computer games, which similarly place the individual in a context where they invent the cultural content, within pre-delineated limits. The content of each individual act of playing the game varies according to the particular player.
  • The pseudo-modern cultural phenomenon par excellence is the internet. Its central act is that of the individual clicking on his/her mouse to move through pages in a way which cannot be duplicated, inventing a pathway through cultural products which has never existed before and never will again. This is a far more intense engagement with the cultural process than anything literature can offer, and gives the undeniable sense (or illusion) of the individual controlling, managing, running, making up his/her involvement with the cultural product. Internet pages are not ‘authored’ in the sense that anyone knows who wrote them, or cares. The majority either require the individual to make them work, like Streetmap or Route Planner, or permit him/her to add to them, like Wikipedia, or through feedback on, for instance, media websites. In all cases, it is intrinsic to the internet that you can easily make up pages yourself (eg blogs).
  • Where once special effects were supposed to make the impossible appear credible, CGI frequently [inadvertently] works to make the possible look artificial, as in much of Lord of the Rings or Gladiator. Battles involving thousands of individuals have really happened; pseudo-modern cinema makes them look as if they have only ever happened in cyberspace.
  • Similarly, television in the pseudo-modern age favours not only reality TV (yet another unapt term), but also shopping channels, and quizzes in which the viewer calls to guess the answer to riddles in the hope of winning money.
  • The purely ‘spectacular’ function of television, as with all the arts, has become a marginal one: what is central now is the busy, active, forging work of the individual who would once have been called its recipient. In all of this, the ‘viewer’ feels powerful and is indeed necessary; the ‘author’ as traditionally understood is either relegated to the status of the one who sets the parameters within which others operate, or becomes simply irrelevant, unknown, sidelined; and the ‘text’ is characterised both by its hyper-ephemerality and by its instability. It is made up by the ‘viewer’, if not in its content then in its sequence – you wouldn’t read Middlemarch by going from page 118 to 316 to 401 to 501, but you might well, and justifiably, read Ceefax that way.
  • A pseudo-modern text lasts an exceptionally brief time. Unlike, say, Fawlty Towers, reality TV programmes cannot be repeated in their original form, since the phone-ins cannot be reproduced, and without the possibility of phoning-in they become a different and far less attractive entity.
  • If scholars give the date they referenced an internet page, it is because the pages disappear or get radically re-cast so quickly. Text messages and emails are extremely difficult to keep in their original form; printing out emails does convert them into something more stable, like a letter, but only by destroying their essential, electronic state.
  • The cultural products of pseudo-modernism are also exceptionally banal
  • Much text messaging and emailing is vapid in comparison with what people of all educational levels used to put into letters.
  • A triteness, a shallowness dominates all.
  • In music, the pseudo-modern superseding of the artist-dominated album as monolithic text by the downloading and mix-and-matching of individual tracks on to an iPod, selected by the listener, was certainly prefigured by the music fan’s creation of compilation tapes a generation ago. But a shift has occurred, in that what was a marginal pastime of the fan has become the dominant and definitive way of consuming music, rendering the idea of the album as a coherent work of art, a body of integrated meaning, obsolete.
  • To a degree, pseudo-modernism is no more than a technologically motivated shift to the cultural centre of something which has always existed (similarly, metafiction has always existed, but was never so fetishised as it was by postmodernism). Television has always used audience participation, just as theatre and other performing arts did before it; but as an option, not as a necessity: pseudo-modern TV programmes have participation built into them.
  • Whereas postmodernism called ‘reality’ into question, pseudo-modernism defines the real implicitly as myself, now, ‘interacting’ with its texts. Thus, pseudo-modernism suggests that whatever it does or makes is what is reality, and a pseudo-modern text may flourish the apparently real in an uncomplicated form: the docu-soap with its hand-held cameras (which, by displaying individuals aware of being regarded, give the viewer the illusion of participation); The Office and The Blair Witch Project, interactive pornography and reality TV; the essayistic cinema of Michael Moore or Morgan Spurlock.
  • whereas postmodernism favoured the ironic, the knowing and the playful, with their allusions to knowledge, history and ambivalence, pseudo-modernism’s typical intellectual states are ignorance, fanaticism and anxiety
  • pseudo-modernism lashes fantastically sophisticated technology to the pursuit of medieval barbarism – as in the uploading of videos of beheadings onto the internet, or the use of mobile phones to film torture in prisons. Beyond this, the destiny of everyone else is to suffer the anxiety of getting hit in the cross-fire. But this fatalistic anxiety extends far beyond geopolitics, into every aspect of contemporary life; from a general fear of social breakdown and identity loss, to a deep unease about diet and health; from anguish about the destructiveness of climate change, to the effects of a new personal ineptitude and helplessness, which yield TV programmes about how to clean your house, bring up your children or remain solvent.
  • Pseudo-modernism belongs to a world pervaded by the encounter between a religiously fanatical segment of the United States, a largely secular but definitionally hyper-religious Israel, and a fanatical sub-section of Muslims scattered across the planet: pseudo-modernism was not born on 11 September 2001, but postmodernism was interred in its rubble.
  • The pseudo-modernist communicates constantly with the other side of the planet, yet needs to be told to eat vegetables to be healthy, a fact self-evident in the Bronze Age. He or she can direct the course of national television programmes, but does not know how to make him or herself something to eat – a characteristic fusion of the childish and the advanced, the powerful and the helpless. For varying reasons, these are people incapable of the “disbelief of Grand Narratives” which Lyotard argued typified postmodernists.
  • Postmodern philosophy emphasises the elusiveness of meaning and knowledge. This is often expressed in postmodern art as a concern with representation and an ironic self-awareness. And the argument that postmodernism is over has already been made philosophically. There are people who have essentially asserted that for a while we believed in postmodern ideas, but not any more, and from now on we're going to believe in critical realism. The weakness in this analysis is that it centres on the academy, on the practices and suppositions of philosophers who may or may not be shifting ground or about to shift – and many academics will simply decide that, finally, they prefer to stay with Foucault [arch postmodernist] rather than go over to anything else. However, a far more compelling case can be made that postmodernism is dead by looking outside the academy at current cultural production.
Weiye Loh

Paris Review - The Grandmaster Hoax, Lincoln Michel - 0 views

  • The Turk was indeed a hoax, although those who suspected as much were incorrect about the workings of the trick. There was no man hidden inside the wooden body; rather, the seemingly exposed innards of the cabinet did not extend all the way back, and a hidden grandmaster slid around inside as the cabinet doors were opened and closed. The concealed grandmaster controlled The Turk’s movements and followed the game’s action through a clever arrangement of magnets and strings.
Weiye Loh

Your Brain on Computers - Attached to Technology and Paying a Price - NYTimes.com - 0 views

  • The message had slipped by him amid an electronic flood: two computer screens alive with e-mail, instant messages, online chats, a Web browser and the computer code he was writing.
  • Even after he unplugs, he craves the stimulation he gets from his electronic gadgets. He forgets things like dinner plans, and he has trouble focusing on his family.
  • “It seems like he can no longer be fully in the moment.”
  • ...4 more annotations...
  • juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.
  • These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive.
  • While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.
  • even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.
  • YOUR BRAIN ON COMPUTERS: Hooked on Gadgets, and Paying a Mental Price
Weiye Loh

Digital Domain - Computers at Home - Educational Hope vs. Teenage Reality - NYTimes.com - 0 views

  • MIDDLE SCHOOL students are champion time-wasters. And the personal computer may be the ultimate time-wasting appliance.
  • there is an automatic inclination to think of the machine in its most idealized form, as the Great Equalizer. In developing countries, computers are outfitted with grand educational hopes, like those that animate the One Laptop Per Child initiative, which was examined in this space in April.
  • Economists are trying to measure a home computer’s educational impact on schoolchildren in low-income households. Taking widely varying routes, they are arriving at similar conclusions: little or no educational benefit is found. Worse, computers seem to have further separated children in low-income households, whose test scores often decline after the machine arrives, from their more privileged counterparts.
  • ...5 more annotations...
  • Professor Malamud and his collaborator, Cristian Pop-Eleches, an assistant professor of economics at Columbia University, did their field work in Romania in 2009, where the government invited low-income families to apply for vouchers worth 200 euros (then about $300) that could be used for buying a home computer. The program provided a control group: the families who applied but did not receive a voucher.
  • the professors report finding “strong evidence that children in households who won a voucher received significantly lower school grades in math, English and Romanian.” The principal positive effect on the students was improved computer skills.
  • few children whose families obtained computers said they used the machines for homework. What they were used for — daily — was playing games.
  • negative effect on test scores was not universal, but was largely confined to lower-income households, in which, the authors hypothesized, parental supervision might be spottier, giving students greater opportunity to use the computer for entertainment unrelated to homework and reducing the amount of time spent studying.
  • The North Carolina study suggests the disconcerting possibility that home computers and Internet access have such a negative effect only on some groups and end up widening achievement gaps between socioeconomic groups. The expansion of broadband service was associated with a pronounced drop in test scores for black students in both reading and math, but no effect on the math scores and little on the reading scores of other students.
  • Computers at Home: Educational Hope vs. Teenage Reality, by Randall Stross, published July 9, 2010
Weiye Loh

Balderdash - 0 views

  • Addendum: People have notified me that after almost 2 1/2 years, many of the pictures are now missing. I have created galleries with the pictures and hosted them on my homepage:
  • I have no problem at all with people who have plastic surgery. Unlike those who believe that while it is great if you are born pretty, having a surgically constructed or enhanced face is a big no-no (i.e. a version of the naturalistic fallacy), I have no problems with people getting tummy tucks, chin lifts, boob jobs or any other form of physical sculpting or enhancement. After all, she seems to have gotten quite a reception on Hottest Blogger.
  • Denying that you have gone under the knife and feigning, with a note of irritation, tired resignation about the accusations, however, is a very different matter. Considering that many sources know the truth about her plastic surgery, this is a most perilous assertion to make and I was riled enough to come up with this blog post. [Addendum: She also goes around online squashing accusations and allegations of surgery.]
  • Two wrongs and two rights.
  • Not exactly the most recent case, but still worth revisiting the ethical concerns behind it. It is easy to find more than one ethical question and problem in this case, and it involves more than one technology. The dichotomies of lies versus truths, nature versus man-made, wrongs versus rights, beautiful versus ugly, and so on... So who is right and who is wrong in this case? Whose and what rights are invoked and/or violated? Can a right be wrong? Can a wrong be right? Do two wrongs make one right? What parts do the technologies play in this case?
  • On a side note, given the internet's capability to dig up past issues and rehash them, is it ethical for us to open up old wounds in the name of academic freedom? Beyond research, with IRB and such, what about daily academic discourses and processes? What are the ethical concerns?
Paul Melissa

http://news.bbc.co.uk/2/hi/uk_news/5142702.stm - 0 views

  • To me, the article demonstrates how online media activism can democratize the state and illustrates social responsibility. With greater online activism, more alternative views are heard. This translates to a more utilitarian society in which citizens have the right to voice their views and to choose which opinions on matters of concern they are more inclined towards, which is ethical for society as a whole. Nor does this promote individualism, since no individualistic or dominant idea is forced onto the whole community. Moreover, if more media activism is actually allowed, it may promote social responsibility, as citizens now have a part to play in society. Online participation will shift to areas that matter more, like the country's politics and economy, instead of trivial matters. Singapore should follow in the footsteps of London in that sense, so that citizen journalism here can be more credible.
Weiye Loh

Libertarianism Is Marxism of the Right - 4 views

http://www.commongroundcommonsense.org/forums/lofiversion/index.php/t21933.html "Because 95 percent of the libertarianism one encounters at cocktail parties, on editorial pages, and on Capitol Hil...

Libertarianism Marxism

started by Weiye Loh on 28 Aug 09 no follow-up yet
Inosha Wickrama

Pirate Bay Victory - 11 views

http://www.telegraph.co.uk/technology/news/4686584/Pirate-Bay-victory-after-illegal-file-sharing-charges-dropped.html Summary: The Pirate Bay, the biggest file-sharing internet site which was accu...

Paul Melissa

Police raid 13 shops in Lucky Plaza - 13 views

http://www.tnp.sg/printfriendly/0,4139,209251,00.html 1) Officers from the Criminal Investigation Department (CID) raided 13 shops in Lucky Plaza and arrested 27 men and one woman, aged...

Pirated games Illegal modification

started by Paul Melissa on 24 Aug 09 no follow-up yet