Group items tagged Parallel

Javier E

Face It, Your Brain Is a Computer - The New York Times - 0 views

  • all the standard arguments about why the brain might not be a computer are pretty weak.
  • Take the argument that “brains are parallel, but computers are serial.” Critics are right to note that virtually every time a human does anything, many different parts of the brain are engaged; that’s parallel, not serial.
  • the trend over time in the hardware business has been to make computers more and more parallel, using new approaches like multicore processors and graphics processing units.
  • The real payoff in subscribing to the idea of a brain as a computer would come from using that idea to profitably guide research. In an article last fall in the journal Science, two of my colleagues (Adam Marblestone of M.I.T. and Thomas Dean of Google) and I endeavored to do just that, suggesting that a particular kind of computer, known as the field programmable gate array, might offer a preliminary starting point for thinking about how the brain works.
  • Field programmable gate arrays consist of a large number of “logic block” programs that can be configured, and reconfigured, individually, to do a wide range of tasks. One logic block might do arithmetic, another signal processing, and yet another look things up in a table. The computation of the whole is a function of how the individual parts are configured. Much of the logic can be executed in parallel, much like what happens in a brain. [a toy sketch of this idea follows this list]
  • our suggestion is that the brain might similarly consist of highly orchestrated sets of fundamental building blocks, such as “computational primitives” for constructing sequences, retrieving information from memory, and routing information between different locations in the brain. Identifying those building blocks, we believe, could be the Rosetta stone that unlocks the brain.
  • it is unlikely that we will ever be able to directly connect the language of neurons and synapses to the diversity of human behavior, as many neuroscientists seem to hope. The chasm between brains and behavior is just too vast.
  • Our best shot may come instead from dividing and conquering. Fundamentally, that may involve two steps: finding some way to connect the scientific language of neurons and the scientific language of computational primitives (which would be comparable in computer science to connecting the physics of electrons and the workings of microprocessors); and finding some way to connect the scientific language of computational primitives and that of human behavior (which would be comparable to understanding how computer programs are built out of more basic microprocessor instructions).
  • If neurons are akin to computer hardware, and behaviors are akin to the actions that a computer performs, computation is likely to be the glue that binds the two.
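
A toy illustration of the FPGA analogy above, added here as a reader's sketch rather than anything from the article: each “logic block” is just a Python function, “configuring” the array means choosing which block handles which input, and a thread pool stands in (loosely) for the parallel fabric. All names (arithmetic_block, configuration, and so on) are invented for this sketch.

```python
# Toy illustration of the FPGA analogy: independent "logic blocks" that can be
# configured for different primitive tasks and run concurrently.
# All names here are invented for this sketch; they do not come from the article.
from concurrent.futures import ThreadPoolExecutor

LOOKUP_TABLE = {"a": 1, "b": 2, "c": 3}

def arithmetic_block(x):
    """A block configured to do arithmetic."""
    return x * x + 1

def lookup_block(key):
    """A block configured as a lookup table (like an FPGA LUT)."""
    return LOOKUP_TABLE.get(key, 0)

def sequence_block(items):
    """A block configured to construct an ordered sequence."""
    return sorted(items)

# "Configuring" the array = deciding which block handles which input.
configuration = [
    (arithmetic_block, 7),
    (lookup_block, "b"),
    (sequence_block, [3, 1, 2]),
]

# Much of the work can be dispatched in parallel, as on an FPGA fabric.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda job: job[0](job[1]), configuration))

print(results)  # [50, 2, [1, 2, 3]]
```
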
kirkpatrickry

Face It, Your Brain Is a Computer - The New York Times - 0 views

  • This approach is misguided. Too many scientists have given up on the computer analogy, and far too little has been offered in its place. In my view, the analogy is due for a rethink. To begin with, all the standard arguments about why the brain might not be a computer are pretty weak. Take the argument that “brains are parallel, but computers are serial.” Critics are right to note that virtually every time a human does anything, many different parts of the brain are engaged; that’s parallel, not serial.
  • But the idea that computers are strictly serial is woefully out of date. Ever since desktop computers became popular, there has always been some degree of parallelism in computers, with several different computations being performed simultaneously, by different components, such as the hard-drive controller and the central processor. And the trend over time in the hardware business has been to make computers more and more parallel, using new approaches like multicore processors and graphics processing units. Skeptics of the computer metaphor also like to argue that “brains are analog, while computers are digital.” The idea here is that things that are digital operate only with discrete divisions, as with a digital watch; things that are analog, like an old-fashioned watch, work on a smooth continuum.
Keiko E

Brian Greene: A Physicist Explains 'The Hidden Reality' Of Parallel Universes : NPR - 0 views

  • There are only so many ways matter can arrange itself within that infinite universe. Eventually, matter has to repeat itself and arrange itself in similar ways. So if the universe is infinitely large, it is also home to infinite parallel universes.
  • Greene thinks the key to understanding these multiverses comes from string theory, the area of physics he has studied for the past 25 years. In a nutshell, string theory attempts to reconcile a mathematical conflict between two already accepted ideas in physics: quantum mechanics and the theory of relativity.
manhefnawi

Sojourns in the Parallel World: America Ferrera Reads Denise Levertov's Ode to Our Ambi... - 0 views

  • We could lament that the price we have paid for our so-called progress in the century and a half since Muir has been a loss of perspective blinding us to this essential kinship with the rest of nature.
  • Still, something deep inside us — something elemental, beyond the ego and its conscious reasonings — vibrates with an irrepressible sense of our belonging to and with nature.
Javier E

Opinion | America's Irrational Macroeconomic Freak Out - The New York Times - 0 views

  • The same inflationary forces that pushed these prices higher have also pushed wages to be 22 percent higher than on the eve of the pandemic. Official statistics show that the stuff that a typical American buys now costs 20 percent more over the same period. Some prices rose a little more, some a little less, but they all roughly rose in parallel.
  • It follows that the typical worker can now afford two percent more stuff [a quick arithmetic check follows this list]. That doesn’t sound like a lot, but it’s a faster rate of improvement than the average rate of real wage growth over the past few decades.
  • many folks feel that they’re falling behind, even when a careful analysis of the numbers suggests they’re not.
  • That’s because real people — and yes, even professional economists — tend to process the parallel rise of prices and wages in quite different ways.
  • In brief, researchers have found that we tend to internalize the gains due to inflation and externalize the losses. These different processes yield different emotional responses.
  • Let’s start with higher prices. Sticker shock hurts. Even as someone who closely studies the inflation statistics, I’m still often surprised by higher prices. They feel unfair. They undermine my spending power, and my sense of control and order.
  • in reality, higher prices are only the first act of the inflationary play. It’s a play that economists have seen before. In episode after episode, surges in prices have led to — or been preceded by — a proportional surge in wages.
  • Even though wages tend to rise hand-in-hand with prices, we tell ourselves a different story, in which the wage rises we get have nothing to do with the price rises that cause them.
  • But then my economist brain took over, and slowly it sunk in that my raise wasn’t a reward for hard work, but rather a cost-of-living adjustment
  • Internalizing the gain and externalizing the cost of inflation protects you from this deflating realization. But it also distorts your sense of reality.
  • The reason so many Americans feel that inflation is stealing their purchasing power is that they give themselves unearned credit for the offsetting wage rises that actually restore it.
  • younger folks — anyone under 60 — had never experienced sustained inflation rates greater than 5 percent in their adult lives. And I think this explains why they’re so angry about today’s inflation.
  • While older Americans understood that the pain of inflation is transitory, younger folks aren’t so sure. Inflation is a lot scarier when you fear that today’s price rises will permanently undermine your ability to make ends meet.
  • Perhaps this explains why the recent moderate burst of inflation has created seemingly more anxiety than previous inflationary episodes.
  • More generally, being an economist makes me an optimist. Social media is awash with (false) claims that we’re in a “silent depression,” and those who want to make America great again are certain it was once so much better.
  • in reality, our economy this year is larger, more productive and will yield higher average incomes than in any prior year on record in American history
  • And because the United States is the world’s richest major economy, we can now say that we are almost certainly part of the richest large society in its richest year in the history of humanity.
  • The income of the average American will double approximately every 39 years. And so when my kids are my age, average income will be roughly double what it is today. Far from being fearful for my kids, I’m envious of the extraordinary riches their generation will enjoy.
  • Psychologists describe anxiety disorders as occurring when the panic you feel is out of proportion to the danger you face. By this definition, we’re in the midst of a macroeconomic anxiety attack.
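
A quick check of the arithmetic behind the “two percent” claim above, using only the two figures quoted in the excerpts (22 percent wage growth, 20 percent price growth):

\[
\frac{1 + 0.22}{1 + 0.20} - 1 \approx 0.017
\]

That is roughly 1.7 percent more purchasing power, which the excerpt rounds to “two percent.”
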
Javier E

Survival Of The Highest « The Dish - 0 views

  • the Savanna-IQ Interaction Hypothesis
  • this hypothesis predicts that individuals of higher intelligence are more likely to engage in novel behavior that goes against cultural traditions or social norms.
  • a forty-year-long study funded by the British government paralleled this hypothesis, and found that “very bright” individuals with IQs above 125 were about twice as likely to have tried psychoactive drugs as “very dull” individuals with IQs below 75. As Kanazawa explains, “Intelligent people don’t always do the ‘right’ thing, only the evolutionarily novel thing.”
Javier E

For Scientists, an Exploding World of Pseudo-Academia - NYTimes.com - 0 views

  • a parallel world of pseudo-academia, complete with prestigiously titled conferences and journals that sponsor them. Many of the journals and meetings have names that are nearly identical to those of established, well-known publications and events.
  • the dark side of open access,” the movement to make scholarly publications freely available.
  • The number of these journals and conferences has exploded in recent years as scientific publishing has shifted from a traditional business model for professional societies and organizations built almost entirely on subscription revenues to open access, which relies on authors or their backers to pay for the publication of papers online, where anyone can read them.
  • Open access got its start about a decade ago and quickly won widespread acclaim with the advent of well-regarded, peer-reviewed journals like those published by the Public Library of Science, known as PLoS. Such articles were listed in databases like PubMed, which is maintained by the National Library of Medicine, and selected for their quality.
  • Jeffrey Beall, a research librarian at the University of Colorado in Denver, has developed his own blacklist of what he calls “predatory open-access journals.” There were 20 publishers on his list in 2010, and now there are more than 300. He estimates that there are as many as 4,000 predatory journals today, at least 25 percent of the total number of open-access journals.
Javier E

Virtually Emotional « The Dish - 0 views

  • I don’t see online interaction as easily separable from real-life human interaction any more. We spend more and more time communicating with one another virtually rather than physically. But these communications are still between human beings, with all our foibles and needs and crushes and hatreds and, if we’re lucky, wit and humor. We do not cease being human online; but we do wear a kind of mask, concealing some things, revealing others – whether on a blog or a hook-up app or a list-serv or a Facebook wall. And if you spend more hours a day communicating that way, you haven’t stopped living. You’re actually slowly becoming another person on top of your regular self.
  • This is the current reality for a lot of us. We meet many more people virtually than on the street or in our physical daily lives. We also get to know them more. The anonymity of the web can allow people not just to trash talk in a way they wouldn’t in real life but to sext and love-talk with strangers they’ve only seen pictures of. Some of this may actually be more authentic an expression of ourselves than anything we have the courage to say to someone’s face.
  • the more time we live virtually, the more we will reproduce aspects of our pre-virtual life online. Including love. And this strange, amazing story was about love, not sex. It was about a panicked, conflicted young gay man knowing he would be rebuffed by his straight crush and setting up a fantasy where he could become a virtual woman to have a relationship with him.
  • Increasingly, we seem to live parallel lives – as a person with a body and as an online avatar. Comedy and tragedy will doubtless ensue. That’s what masks can do.
sissij

The Wave (2008 film) - Wikipedia, the free encyclopedia - 1 views

  • Ron Jones's "Third Wave" experiment, which took place at a Californian school in 1967. Because his students did not understand how something like National Socialism could even happen, he founded a totalitarian, strictly organized "movement" with harsh punishments that was led by him autocratically. The intricate sense of community led to a wave of enthusiasm not only from his own students, but also from students from other classes who joined the program later. Jones later admitted to having enjoyed having his students as followers. To eliminate the growing momentum, Jones aborted the project on the fifth day and showed the students the parallels to the Nazi youth movements.[3][4]
  • “Therein lies the great danger. It is an interesting fact that we always believe that what happens to others would never happen to us. We blame others, for example the less educated or the East Germans etc. However, in the Third Reich the house caretaker was just as fascinated by the movement as was the intellectual.”[10]
  •  
    I think this experiment is very interesting because it shows a flaw in human thinking: the assumption that we are always progressing. However, just like the quote says: "Barbarism is not the inheritance of our pre-history. It is the companion that dogs our every step." The film "Die Welle", based on this experiment, is also very interesting and worth watching. --Sissi (Sept 17, 2016)
sissij

Lessons from Playing Golf with Trump - The New Yorker - 1 views

  • “I will buy one only if it has the potential to be the best. I’m not interested in having a nine.”
  • A friend asked me later whether Trump wasn’t “in on the joke” of his public persona, and I said that, as far as I could tell, the Trump we were used to seeing on television was the honest-to-god authentic Trump: a ten-year-old boy who, for unknown reasons, had been given a real airplane and a billion dollars. In other words, a fun guy to hang around with.
  • He was upset that I hadn’t written that he’d shot 71—a very good golf score, one stroke under par.
  • He complained to me that golf publications never rank his courses high enough, because the people who do the rating hold a grudge against him, but he also said that he never allows raters to play his courses, because they would just get in the way of the members.
  • He wanted the number, and the fact that I hadn’t published the number proved that I was just like all the other biased reporters, who, because we’re all part of the anti-Trump media conspiracy, never give him as much credit as he deserves for being awesome.
  • In Trump’s own mind, I suspect, he really did shoot 71 that day, if not (by now) 69. Trump’s world is a parallel universe in which truth takes many forms, none of them necessarily based on reality.
  •  
    I think this article has a very interesting interpretation of Trump's personality and behavior. Something we think is absurd might be totally normal from another person's perspective. For example, in this article, the author states that Trump values social status and potential profit more than the real person or the real thing. It shows how people see this world differently, and how that affects the moves and decisions they make. I think the overwhelming criticism of Trump is partly because we don't understand Trump and don't even try to understand and accept him. He is an anomaly. Also, I think everybody observes the universe through their own senses and perception, so we cannot tell whose reality is truer than another's. Condemning others' reality won't bring us a good negotiation. --Sissi (1/14/2017)
Javier E

Should we even go there? Historians on comparing fascism to Trumpism | US news | The Guardian - 0 views

  • “What are the necessary social and psychological conditions that allow populists of Hitler’s ilk to gain a mass following and attain power?”
  • “There are certain traits you can recognize that Hitler and Trump have in common,” Ullrich says. “I would say the egomania, the total egocentricity of both men, and the inclination to mix lies and truth – that was very characteristic of Hitler.”
  • Like Trump, “Hitler exploited peoples’ feelings of resentment towards the ruling elite.” He also said he would make Germany great again. Ullrich also notes both men’s talent at playing the media, making use of new technology and their propensity for stage effects.
  • “I think the differences are still greater than the similarities,” he says. “Hitler was not only more intelligent, but craftier. He was not just a powerful orator, but a talented actor who succeeded in winning over various social milieus. So not just the economically threatened lower middle classes which Trump targeted, but also the upper middle classes. Hitler had many supporters in the German aristocracy.”
  • Trump was also democratically elected, while Hitler never had a majority vote. “He was appointed by the president of the German Reich.” Then there’s the fact that Trump does not lead a party “which is unconditionally committed to him”.
  • “A further obvious difference is that Trump doesn’t have a private militia, as Hitler did with the SA, which he used in his first months after coming to power to settle scores with his opponents, like the Communists and Social Democrats. You can’t possibly imagine something similar with Trump – that he’ll be locking Democrats up into concentration camps
  • “Finally, the American constitution is based on a system of checks and balances. It remains to be seen how far Congress will really limit Trump or if, as is feared, he can override it. It was different with Hitler, who, as we know, managed to eliminate all resistance in the shortest space of time and effectively establish himself as an all-powerful dictator. Within a few months, there was effectively no longer any opposition.”
  • “Hitler profited from the fact that his opponents always underestimated him,” Ullrich explains. “His conservative allies in government assumed they could tame or ‘civilise’ him – that once he became chancellor he’d become vernünftig (meaning sensible, reasonable). Very quickly it became clear that was an illusion.”
  • “There were many situations where he could have been stopped. For example in 1923 after the failed Munich putsch – if he’d served his full prison sentence of several years, he wouldn’t have made a political comeback. Instead, he only spent a few months behind bars, [having been released after political pressure] and could rebuild his movement.”
  • The western powers made the same mistake with their appeasement politics, indecision and indulgence. “In the 1930s Hitler strengthened, rather than weakened, his aggressive intentions,” Ullrich says. “So you could learn from this that you have to react faster and much more vigorously than was the case at the time.”
  • Ullrich also contends that if Hindenburg, the president of the Reich, had allowed Chancellor Brüning, of the Centre party, to remain chancellor to the end of 1934, rather than responding to pressure from conservatives to dismiss him in 1932, “then the peak of the economic crisis would have passed and it would have been very questionable whether Hitler could still have come to power”.
  • At the same time, Hitler’s ascent was no mere fluke. “There were powerful forces in the big industries, but also in the landowning class and the armed forces, which approved of a fascist solution to the crisis.”
  • If fascism “now just means aggressive nationalism, racism, patriarchy and authoritarianism, then maybe it is back on the agenda,” Bosworth continues. But today’s context is fundamentally different
  • Today’s “alt-right” agitators “live in a neoliberal global order where the slogan, ‘all for the market, nothing outside the market, no one against the market’ is far more unquestionably accepted than the old fascist slogan of ‘all for the state, nothing outside the state, no one against the state’”.
  • “What is that if it’s not racially authoritarian?” asks Schama. “If you want to call it fascist, fine. I don’t really care if it’s called that or not. It’s authoritarian, you know, ferociously authoritarian.”
  • Schama also points to deeply worrying messaging, such as “the parallel universe of lies which are habitual, massive, cumulative”; the criminalization of political opponents; the threat to change the libel laws against the press and the demonization of different racial and ethnic groups, going as far as proposing a Muslim registry.
  • Schama is clear: Trump is obviously not Hitler. “But, you know, if you like, he’s an entertainment fascist, which may be less sinister but is actually in the end more dangerous. If you’re not looking for jackboots and swastikas – although swastikas are indeed appearing – there’s a kind of laundry list of things which are truly sinister and authoritarian and not business as usual.”
  • Don’t ignore what people vote fo
  • f you’re of German heritage, it’s hard to understand how so many people could have bought Mein Kampf and gone on to vote for Hitler. Maybe no one really read it, or got beyond the first few pages of bluster, or took antisemitism seriously, you tell yourself. “Or they liked what he said,
  • “I think one of the mistakes this time around would be not to think that the people who voted for Trump were serious. They may have been serious for different reasons, but it would be a big mistake not to try and figure out what their reasons were.
  • Hitler presented himself as a “messiah” offering the public “salvation”, Ullrich points out. With austerity and hostility to the EU and to immigrants riding high, there is fertile ground for European populists next year to seduce with equally simplistic, sweeping “solutions”.
  • The problem, in Mazower’s view, is that establishment politicians currently have no response
  • “The Gestapo was piddling compared with the size and reach of surveillance equipment and operations today,
  • “Very belatedly, everyone is waking up to the fact that there was a general assumption that no government in the west would fall into the wrong hands, that it was safe to acquiesce in this huge expansion of surveillance capabilities, and the debate wasn’t as vigorous as it could have been.”
  • “Now, there is a lot of discussion about allowing this kind of surveillance apparatus in the wrong hands,” he adds. “And we’ve woken up to this a bit late in the day.”
  • Ullrich calls crises, “the elixir of rightwing populists”, and urges that politicians “do everything they can to correct the inequalities and social injustice which have arisen in the course of extreme financial capitalism in western countries”
  • Jane Caplan, a history professor at Oxford University who has written about Trump and fascism, highlights the want of “dissenting voices against marketisation and neoliberalism
  • The failure to resist the incursion of the market as the only criterion for political utility, or economic utility, has been pretty comprehensive.
  • Paranoia, bullying and intimidation are a hallmark of authoritarian regimes. They are also alive and well in our culture today, where online trolls, violent thugs at rallies, threats of expensive libel action and of course terrorist acts are equally effective in getting individuals and the press to self-censor.
Javier E

untitled - 0 views

  • Scientists at Stanford University and the J. Craig Venter Institute have developed the first software simulation of an entire organism, a humble single-cell bacterium that lives in the human genital and respiratory tracts.
  • the work was a giant step toward developing computerized laboratories that could carry out many thousands of experiments much faster than is possible now, helping scientists penetrate the mysteries of diseases like cancer and Alzheimer’s.
  • cancer is not a one-gene problem; it’s a many-thousands-of-factors problem.”
  • This kind of modeling is already in use to study individual cellular processes like metabolism. But Dr. Covert said: “Where I think our work is different is that we explicitly include all of the genes and every known gene function. There’s no one else out there who has been able to include more than a handful of functions or more than, say, one-third of the genes.”
  • The simulation, which runs on a cluster of 128 computers, models the complete life span of the cell at the molecular level, charting the interactions of 28 categories of molecules — including DNA, RNA, proteins and small molecules known as metabolites, which are generated by cell processes.
  • They called the simulation an important advance in the new field of computational biology, which has recently yielded such achievements as the creation of a synthetic life form — an entire bacterial genome created by a team led by the genome pioneer J. Craig Venter. The scientists used it to take over an existing cell.
  • A decade ago, scientists developed simulations of metabolism that are now being used to study a wide array of cells, including bacteria, yeast and photosynthetic organisms. Other models exist for processes like protein synthesis.
  • “Right now, running a simulation for a single cell to divide only one time takes around 10 hours and generates half a gigabyte of data,” Dr. Covert wrote. “I find this fact completely fascinating, because I don’t know that anyone has ever asked how much data a living thing truly holds. We often think of the DNA as the storage medium, but clearly there is more to it than that.”
  • scientists chose an approach called object-oriented programming, which parallels the design of modern software systems. Software designers organize their programs in modules, which communicate with one another by passing data and instructions back and forth.
  • “The major modeling insight we had a few years ago was to break up the functionality of the cell into subgroups, which we could model individually, each with its own mathematics, and then to integrate these submodels together into a whole,”
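
The modular “submodels plus integrator” idea described in the last two excerpts is easy to sketch. The code below is a minimal toy, not the published whole-cell model: every class name, state variable, and update rule is invented for illustration; only the overall structure (independent process modules stepping a shared cell state) follows the description above.

```python
# Minimal sketch of the "submodels + integrator" design described above.
# Names and update rules are invented for illustration; this is not the
# published whole-cell simulation.

class Submodel:
    """One cellular process with its own (toy) mathematics."""
    def step(self, state, dt):
        raise NotImplementedError

class Metabolism(Submodel):
    def step(self, state, dt):
        state["metabolites"] += 5.0 * dt          # toy production rate

class Transcription(Submodel):
    def step(self, state, dt):
        made = min(state["metabolites"], 1.0 * dt)
        state["metabolites"] -= made              # consumes metabolites
        state["rna"] += made                      # to make RNA

class Translation(Submodel):
    def step(self, state, dt):
        state["protein"] += 0.5 * state["rna"] * dt   # RNA drives protein synthesis

def simulate(submodels, state, dt=1.0, steps=10):
    """Integrate the submodels against a shared cell state, one step at a time."""
    for _ in range(steps):
        for sm in submodels:                      # each module updates in turn
            sm.step(state, dt)
    return state

cell = {"metabolites": 0.0, "rna": 0.0, "protein": 0.0}
print(simulate([Metabolism(), Transcription(), Translation()], cell))
```
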
Javier E

Facebook and Its Users, Mutually Dependent - NYTimes.com - 0 views

  • Even though we may occasionally feel that we can’t live with Facebook, we also haven’t been able to figure out how to live without it. The degree of this codependency may have no parallel. “I can’t think of another piece of passive software that has gotten so embedded in the cultural conversation to this extent before,” says Sherry Turkle, a professor at the Massachusetts Institute of Technology and author of “Alone Together.” “This company is reshaping how we think about ourselves and define ourselves and our digital selves.”
  • “It crystallized a set of issues that we will be defining for the next decade — the notion of self, privacy, how we connect and the price we’re willing to pay for it,” she said. “We have to decide what boundaries we’re going to establish between ourselves, advertisers and our personal information.”
  • “It’s a dynamic that is bred by the very nature of social media because users are the sources of the content,” said S. Shyam Sundar, co-director of the Media Effects Research Laboratory at Pennsylvania State University, who studies how people interact with social media. “Users feel like they have a sense of agency, like they are shareholders.”
  • as Facebook evolves into a sustainable business, the trick will be making sure that users don’t cool on its tactics. That could be devastating to the company’s main source of revenue — showing advertisements to its members based on what it knows about them.
  • Facebook might not be impervious to rivals, or at least to more divided attention from people who shift their time to other parts of the Web where intent is easier to understand and the interactions feel less public.
Javier E

Do you want to help build a happier city? BBC - 0 views

  • With colleagues at the University of Cambridge, I worked on a web game called urbangems.org. In it, you are shown 10 pairs of urban scenes of London, and for each pair you need to choose which one you consider to be more beautiful, quiet and happy. Based on user votes, one is able to rank all urban scenes by beauty, quiet and happiness [a minimal sketch of one such ranking approach follows this list]. Those scenes have been studied at Yahoo Labs with image-processing tools that extract colour histograms. The amount of greenery is associated with all three qualities: green is often found in scenes considered to be beautiful, quiet and happy. We then ran more sophisticated image analysis tools that extracted patches from our urban scenes and found that red-brick houses and public gardens also make people happy.
  • On the other hand, cars were the visual elements most strongly associated with sadness. In rich countries, car ownership is becoming unfashionable, and car-sharing and short-term hiring are becoming more popular. Self-driving cars such as those being prototyped by Google will become more common and will likely be ordered via mobile apps similar to the ones we use for ordering taxis nowadays. This will result in optimised traffic flows, fewer cars, and more space for alternative modes of transportation and for people on foot.
  • Cities will experience transformations similar to those New York has experienced since 2007. During these few years, new pedestrian plazas and hundreds of miles of bike lanes were created in the five boroughs, creating spaces for public art installations and recreation. And it’s proved popular with local businesses too, boosting the local economy in areas where cyclists are freer to travel.
  • it is not clear whether the rise of post-war tower dwelling is a definite improvement on the modern city sprawl. Tall buildings (with the exception of glassed-office buildings and landmarks) are often found in sad scenes.
  • In recent years, the new mayor of the Colombian capital Bogota, Enrique Penalosa, has cancelled highways projects and poured the money instead into cycle lanes, parks and open spaces for locals – undoing decades of car-centric planning that had made the streets a no-go area for the capital’s children. On the day in February 2000 when Penalosa banned cars from the street for 24 hours, hospital admissions fell by a third, air pollution levels dropped and residents said it made them feel more optimistic about living in the city.
  • are the technologies we are designing really helping their users to be happy? Take the simple example of a web map. It usually gives us the shortest walking directions to a destination. But what if it gave us the small street, full of trees, parallel to the shortest path, which would make us happier? As more and more of us share these city streets, what will keep us happy as they become more crowded?
  • the share of the world’s population living in cities has surpassed 50%. By 2025, we will see another 1.2 billion city residents. With more and more of us moving to urban centres, quality of life becomes ever-more important.
Javier E

When No One Is Just a Face in the Crowd - NYTimes.com - 0 views

  • Facial recognition technology, already employed by some retail stores to spot and thwart shoplifters, may soon be used to identify and track the freest spenders in the aisles.
  • And companies like FaceFirst, in Camarillo, Calif., hope to soon complement their shoplifter-identification services with parallel programs to help retailers recognize customers eligible for special treatment.
  • “Instantly, when a person in your FaceFirst database steps into one of your stores, you are sent an email, text or SMS alert that includes their picture and all biographical information of the known individual so you can take immediate and appropriate action.”
  • Because facial recognition can be used covertly to identify and track people by name at a distance, some civil liberties experts call it unequivocally intrusive. In view of intelligence documents made public by Edward J. Snowden, they also warn that once companies get access to such data, the government could, too. “This is you as an individual being monitored over time and your movements and habits being recorded,”
  • facial recognition may soon let companies link a person’s online persona with his or her actual offline self at a specific public location. That could seriously threaten our ability to remain anonymous in public.
  • industry and consumer advocates will have to contend with nascent facial-recognition apps like NameTag; it is designed to allow a user to scan photographs of strangers, then see information about them — like their occupations or social-network profiles.
Javier E

The End of the Future - Peter Thiel - National Review Online - 0 views

  • The state can successfully push science; there is no sense denying it. The Manhattan Project and the Apollo program remind us of this possibility. Free markets may not fund as much basic research as needed.
  • But in practice, we all sense that such gloating belongs to a very different time. Most of our political leaders are not engineers or scientists and do not listen to engineers or scientists.
  • Today’s aged hippies no longer understand that there is a difference between the election of a black president and the creation of cheap solar energy; in their minds, the movement towards greater civil rights parallels general progress everywhere. Because of these ideological conflations and commitments, the 1960s Progressive Left cannot ask whether things actually might be getting worse.
  • after 40 years of wandering, it is not easy to find a path back to the future. If there is to be a future, we would do well to reflect about it more. The first and the hardest step is to see that we now find ourselves in a desert, and not in an enchanted forest.
Javier E

Raymond Tallis Takes Out the 'Neurotrash' - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • Tallis informs 60 people gathered in a Kent lecture hall that his talk will demolish two "pillars of unwisdom." The first, "neuromania," is the notion that to understand people you must peer into the "intracranial darkness" of their skulls with brain-scanning technology. The second, "Darwinitis," is the idea that Charles Darwin's evolutionary theory can explain not just the origin of the human species—a claim Tallis enthusiastically accepts—but also the nature of human behavior and institutions.
  • Aping Mankind argues that neuroscientific approaches to things like love, wisdom, and beauty are flawed because you can't reduce the mind to brain activity alone.
  • Stephen Cave, a Berlin-based philosopher and writer who has called Aping Mankind "an important work," points out that most philosophers and scientists do in fact believe "that mind is just the product of certain brain activity, even if we do not currently know quite how." Tallis "does both the reader and these thinkers an injustice" by declaring that view "obviously" wrong,
  • Geraint Rees, director of University College London's Institute of Cognitive Neuroscience, complains that reading Tallis is "a bit like trying to nail jelly to the wall." He "rubbishes every current theory of the relationship between mind and brain, whether philosophical or neuroscientific," while offering "little or no alternative,"
  • cultural memes. The Darwinesque concept originates in Dawkins's 1976 book, The Selfish Gene. Memes are analogous to genes, Dennett has said, "replicating units of culture" that spread from mind to mind like a virus. Religion, chess, songs, clothing, tolerance for free speech—all have been described as memes. Tallis considers it absurd to talk of a noun-phrase like "tolerance for free speech" as a discrete entity. But Dennett argues that Tallis's objections are based on "a simplistic idea of what one might mean by a unit." Memes aren't units? Well, in that spirit, says Dennett, organisms aren't units of biology, nor are species—they're too complex, with too much variation. "He's got to allow theory to talk about entities which are not simple building blocks," Dennett says.
  • How is it that he perceives the glass of water on the table? How is it that he feels a sense of self over time? How is it that he can remember a patient he saw in 1973, and then cast his mind forward to his impending visit to the zoo? There are serious problems with trying to reduce such things to impulses in the brain, he argues. We can explain "how the light gets in," he says, but not "how the gaze looks out." And isn't it astonishing, he adds, that much neural activity seems to have no link to consciousness? Instead, it's associated with things like controlling automatic movements and regulating blood pressure. Sure, we need the brain for consciousness: "Chop my head off, and my IQ descends." But it's not the whole story. There is more to perceptions, memories, and beliefs than neural impulses can explain. The human sphere encompasses a "community of minds," Tallis has written, "woven out of a trillion cognitive handshakes of shared attention, within which our freedom operates and our narrated lives are led." Those views on perception and memory anchor his attack on "neurobollocks." Because if you can't get the basics right, he says, then it's premature to look to neuroscience for clues to complex things like love.
  • Yes, many unanswered questions persist. But these are early days, and neuroscience remains immature, says Churchland, a professor emerita of philosophy at University of California at San Diego and author of the subfield-spawning 1986 book Neurophilosophy. In the 19th century, she points out, people thought we'd never understand light. "Well, by gosh," she says, "by the time the 20th century rolls around, it turns out that light is electromagnetic radiation. ... So the fact that at a certain point in time something seems like a smooth-walled mystery that we can't get a grip on, doesn't tell us anything about whether some real smart graduate student is going to sort it out in the next 10 years or not."
  • Dennett claims he's got much of it sorted out already. He wrote a landmark book on the topic in 1991, Consciousness Explained. (The title "should have landed him in court, charged with breach of the Trade Descriptions Act," writes Tallis.) Dennett uses the vocabulary of computer science to explain how consciousness emerges from the huge volume of things happening in the brain all at once. We're not aware of everything, he tells me, only a "limited window." He describes that stream of consciousness as "the activities of a virtual machine which is running on the parallel hardware of the brain." "You—the fruits of all your experience, not just your genetic background, but everything you've learned and done and all your memories—what ties those all together? What makes a self?" Dennett asks. "The answer is, and has to be, the self is like a software program that organizes the activities of the brain."