
Home/ TOK Friends/ Group items tagged google


qkirkpatrick

Google takes a page from Apple with Android Pay | New York Post - 0 views

  • That’s the approach Google appears to be taking with the launch of Android Pay, a new mobile-payments system that closely mimics the Apple Pay platform launched last fall.
  • Scrambling to play catch-up, Google crucially has cut partnerships with a slew of big retailers including Macy’s, Best Buy, Walgreen, Whole Foods and McDonald’s
  • That list of retailers looks a lot like Apple Pay’s, and cutting those partnerships is something that the search giant’s flopped Google Wallet system failed to do when it was launched in 2011.
  • Google’s Near Field Communication technology will be at odds with Samsung’s upcoming “Samsung Pay” platform, which operates using the magnetic strips of traditional card readers.
  • In particular, analysts say mobile payments could help Google learn more about the effectiveness of its search ads.
  • Can becoming more technologically reliant lead to a lot more problems in the future?
Javier E

I Downloaded the Information That Facebook Has on Me. Yikes. - The New York Times - 0 views

  • When I downloaded a copy of my Facebook data last week, I didn’t expect to see much. My profile is sparse, I rarely post anything on the site, and I seldom click on ads
  • With a few clicks, I learned that about 500 advertisers — many that I had never heard of, like Bad Dad, a motorcycle parts store, and Space Jesus, an electronica band — had my contact information
  • Facebook also had my entire phone book, including the number to ring my apartment buzzer. The social network had even kept a permanent record of the roughly 100 people I had deleted from my friends list over the last 14 years, including my exes.
  • During his testimony, Mr. Zuckerberg repeatedly said Facebook has a tool for downloading your data that “allows people to see and take out all the information they’ve put into Facebook.”
  • Most basic information, like my birthday, could not be deleted. More important, the pieces of data that I found objectionable, like the record of people I had unfriended, could not be removed from Facebook, either.
  • what bothered me was the data that I had explicitly deleted but that lingered in plain sight. On my friends list, Facebook had a record of “Removed Friends,” a dossier of the 112 people I had removed along with the date I clicked the “Unfriend” button. Why should Facebook remember the people I’ve cut off from my life?
  • When you download a copy of your Facebook data, you will see a folder containing multiple subfolders and files. The most important one is the “index” file, which is essentially a raw data set of your Facebook account, where you can click through your profile, friends list, timeline and messages, among other features.
  • Upon closer inspection, it turned out that Facebook had stored my entire phone book because I had uploaded it when setting up Facebook’s messaging app, Messenger.
  • Facebook also kept a history of each time I opened Facebook over the last two years, including which device and web browser I used. On some days, it even logged my locations, like when I was at a hospital two years ago or when I visited Tokyo last year.
  • “They don’t delete anything, and that’s a general policy,” said Gabriel Weinberg, the founder of DuckDuckGo, which offers internet privacy tools. He added that data was kept around to eventually help brands serve targeted ads.
  • Facebook said unfamiliar advertisers might appear on the list because they might have obtained my contact information from elsewhere, compiled it into a list of people they wanted to target and uploaded that list into Facebook
  • Brands can obtain your information in many different ways. Those include:
  • ■ Buying information from a data provider like Acxiom, which has amassed one of the world’s largest commercial databases on consumers. Brands can buy different types of customer data sets from a provider, like contact information for people who belong to a certain demographic, and take that information to Facebook to serve targeted ads
  • ■ Using tracking technologies like web cookies and invisible pixels that load in your web browser to collect information about your browsing activities. There are many different trackers on the web, and Facebook offers 10 different trackers to help brands harvest your information, according to Ghostery, which offers privacy tools that block ads and trackers.
  • ■ Getting your information in simpler ways, too. Someone you shared information with could share it with another entity. Your credit card loyalty program, for example
  • I also downloaded copies of my Google data with a tool called Google Takeout. The data sets were exponentially larger than my Facebook data.
  • For my personal email account alone, Google’s archive of my data measured eight gigabytes, enough to hold about 2,000 hours of music. By comparison, my Facebook data was about 650 megabytes, the equivalent of about 160 hours of music.
  • In a folder labeled Ads, Google kept a history of many news articles I had read, like a Newsweek story about Apple employees walking into glass walls and a New York Times story about the editor of our Modern Love column. I didn’t click on ads for either of these stories, but the search giant logged them because the sites had loaded ads served by Google.
  • In another folder, labeled Android, Google had a record of apps I had opened on an Android phone since 2015, along with the date and time. This felt like an extraordinary level of detail.
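The downloads described above (Facebook's archive, Google Takeout) unpack into ordinary directory trees, so their contents can be audited with a few lines of code. A minimal sketch in Python, assuming only that the export unpacks into per-category subfolders; the folder name in the comment is illustrative, not the platform's actual export layout:

```python
import os

def summarize_archive(root):
    """Walk a downloaded data archive and report, for each top-level
    subfolder, its file count and total size in megabytes."""
    summary = {}
    for entry in sorted(os.listdir(root)):
        path = os.path.join(root, entry)
        if not os.path.isdir(path):
            continue
        count, size = 0, 0
        for dirpath, _, filenames in os.walk(path):
            for name in filenames:
                count += 1
                size += os.path.getsize(os.path.join(dirpath, name))
        summary[entry] = (count, size / 1e6)
    return summary

# e.g. summarize_archive("facebook-export") might map a folder such as
# "ads" to (number of files, size in MB), making it easy to see which
# categories of retained data dominate the archive.
```

Sorting the resulting dictionary by size is a quick way to reproduce the kind of comparison the author makes between his 8-gigabyte Google archive and his 650-megabyte Facebook one.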
Javier E

George Soros: Facebook and Google a menace to society | Business | The Guardian - 0 views

  • Facebook and Google have become “obstacles to innovation” and are a “menace” to society whose “days are numbered”
  • “Mining and oil companies exploit the physical environment; social media companies exploit the social environment,” said the Hungarian-American businessman, according to a transcript of his speech.
  • “This is particularly nefarious because social media companies influence how people think and behave without them even being aware of it. This has far-reaching adverse consequences on the functioning of democracy, particularly on the integrity of elections.”
  • In addition to skewing democracy, social media companies “deceive their users by manipulating their attention and directing it towards their own commercial purposes” and “deliberately engineer addiction to the services they provide”. The latter, he said, “can be very harmful, particularly for adolescents”
  • There is a possibility that once lost, people who grow up in the digital age will have difficulty in regaining it. This may have far-reaching political consequences.”
  • Soros warned of an “even more alarming prospect” on the horizon if data-rich internet companies such as Facebook and Google paired their corporate surveillance systems with state-sponsored surveillance – a trend that’s already emerging in places such as the Philippines.
  • “This may well result in a web of totalitarian control the likes of which not even Aldous Huxley or George Orwell could have imagined,”
  • “The internet monopolies have neither the will nor the inclination to protect society against the consequences of their actions. That turns them into a menace and it falls to the regulatory authorities to protect society against them,
  • He also echoed the words of world wide web inventor Sir Tim Berners-Lee when he said the tech giants had become “obstacles to innovation” that need to be regulated as public utilities “aimed at preserving competition, innovation and fair and open universal access”.
  • Earlier this week, Salesforce’s chief executive, Marc Benioff, said that Facebook should be regulated like a cigarette company because it’s addictive and harmful.
  • In November, Roger McNamee, who was an early investor in Facebook, described Facebook and Google as threats to public health.
Javier E

[Six Questions] | Astra Taylor on The People's Platform: Taking Back Power and Culture ... - 1 views

  • Astra Taylor, a cultural critic and the director of the documentaries Zizek! and Examined Life, challenges the notion that the Internet has brought us into an age of cultural democracy. While some have hailed the medium as a platform for diverse voices and the free exchange of information and ideas, Taylor shows that these assumptions are suspect at best. Instead, she argues, the new cultural order looks much like the old: big voices overshadow small ones, content is sensationalist and powered by advertisements, quality work is underfunded, and corporate giants like Google and Facebook rule. The Internet does offer promising tools, Taylor writes, but a cultural democracy will be born only if we work collaboratively to develop the potential of this powerful resource
  • Most people don’t realize how little information can be conveyed in a feature film. The transcripts of both of my movies are probably equivalent in length to a Harper’s cover story.
  • why should Amazon, Apple, Facebook, and Google get a free pass? Why should we expect them to behave any differently over the long term? The tradition of progressive media criticism that came out of the Frankfurt School, not to mention the basic concept of political economy (looking at the way business interests shape the cultural landscape), was nowhere to be seen, and that worried me. It’s not like political economy became irrelevant the second the Internet was invented.
  • How do we reconcile our enjoyment of social media even as we understand that the corporations who control them aren’t always acting in our best interests?
  • That was because the underlying economic conditions hadn’t been changed or “disrupted,” to use a favorite Silicon Valley phrase. Google has to serve its shareholders, just like NBCUniversal does. As a result, many of the unappealing aspects of the legacy-media model have simply carried over into a digital age — namely, commercialism, consolidation, and centralization. In fact, the new system is even more dependent on advertising dollars than the one that preceded it, and digital advertising is far more invasive and ubiquitous
  • the popular narrative — new communications technologies would topple the establishment and empower regular people — didn’t accurately capture reality. Something more complex and predictable was happening. The old-media dinosaurs weren’t dying out, but were adapting to the online environment; meanwhile the new tech titans were coming increasingly to resemble their predecessors
  • I use lots of products that are created by companies whose business practices I object to and that don’t act in my best interests, or the best interests of workers or the environment — we all do, since that’s part of living under capitalism. That said, I refuse to invest so much in any platform that I can’t quit without remorse
  • these services aren’t free even if we don’t pay money for them; we pay with our personal data, with our privacy. This feeds into the larger surveillance debate, since government snooping piggybacks on corporate data collection. As I argue in the book, there are also negative cultural consequences (e.g., when advertisers are paying the tab we get more of the kind of culture marketers like to associate themselves with and less of the stuff they don’t) and worrying social costs. For example, the White House and the Federal Trade Commission have both recently warned that the era of “big data” opens new avenues of discrimination and may erode hard-won consumer protections.
  • I’m resistant to the tendency to place this responsibility solely on the shoulders of users. Gadgets and platforms are designed to be addictive, with every element from color schemes to headlines carefully tested to maximize clickability and engagement. The recent news that Facebook tweaked its algorithms for a week in 2012, showing hundreds of thousands of users only “happy” or “sad” posts in order to study emotional contagion — in other words, to manipulate people’s mental states — is further evidence that these platforms are not neutral. In the end, Facebook wants us to feel the emotion of wanting to visit Facebook frequently
  • social inequalities that exist in the real world remain meaningful online. What are the particular dangers of discrimination on the Internet?
  • That it’s invisible or at least harder to track and prove. We haven’t figured out how to deal with the unique ways prejudice plays out over digital channels, and that’s partly because some folks can’t accept the fact that discrimination persists online. (After all, there is no sign on the door that reads Minorities Not Allowed.)
  • just because the Internet is open doesn’t mean it’s equal; offline hierarchies carry over to the online world and are even amplified there. For the past year or so, there has been a lively discussion taking place about the disproportionate and often outrageous sexual harassment women face simply for entering virtual space and asserting themselves there — research verifies that female Internet users are dramatically more likely to be threatened or stalked than their male counterparts — and yet there is very little agreement about what, if anything, can be done to address the problem.
  • What steps can we take to encourage better representation of independent and non-commercial media? We need to fund it, first and foremost. As individuals this means paying for the stuff we believe in and want to see thrive. But I don’t think enlightened consumption can get us where we need to go on its own. I’m skeptical of the idea that we can shop our way to a better world. The dominance of commercial media is a social and political problem that demands a collective solution, so I make an argument for state funding and propose a reconceptualization of public media. More generally, I’m struck by the fact that we use these civic-minded metaphors, calling Google Books a “library” or Twitter a “town square” — or even calling social media “social” — but real public options are off the table, at least in the United States. We hand the digital commons over to private corporations at our peril.
  • 6. You advocate for greater government regulation of the Internet. Why is this important?
  • I’m for regulating specific things, like Internet access, which is what the fight for net neutrality is ultimately about. We also need stronger privacy protections and restrictions on data gathering, retention, and use, which won’t happen without a fight.
  • I challenge the techno-libertarian insistence that the government has no productive role to play and that it needs to keep its hands off the Internet for fear that it will be “broken.” The Internet and personal computing as we know them wouldn’t exist without state investment and innovation, so let’s be real.
  • there’s a pervasive and ill-advised faith that technology will promote competition if left to its own devices (“competition is a click away,” tech executives like to say), but that’s not true for a variety of reasons. The paradox of our current media landscape is this: our devices and consumption patterns are ever more personalized, yet we’re simultaneously connected to this immense, opaque, centralized infrastructure. We’re all dependent on a handful of firms that are effectively monopolies — from Time Warner and Comcast on up to Google and Facebook — and we’re seeing increased vertical integration, with companies acting as both distributors and creators of content. Amazon aspires to be the bookstore, the bookshelf, and the book. Google isn’t just a search engine, a popular browser, and an operating system; it also invests in original content
  • So it’s not that the Internet needs to be regulated but that these big tech corporations need to be subject to governmental oversight. After all, they are reaching farther and farther into our intimate lives. They’re watching us. Someone should be watching them.
Javier E

Googling Is Believing: Trumping the Informed Citizen - The New York Times - 1 views

  • Rubio’s Google gambit, and Trump’s (non)reaction to it, reveals an interesting, and troubling, new change in attitude about a philosophical foundation of democracy: the ideal of an informed citizenry.
  • The idea is obvious: If citizens are going to make even indirect decisions about policy, we need to know the facts about the problem the policy is meant to rectify, and to be able to gain some understanding about how effective that policy would be.
  • Noam Chomsky argued in the 1980s that consent was being “manufactured” by Big Media — large consolidated content-delivery companies (like this newspaper) that could cause opinions to sway one way or the other at their whim.
  • searching the Internet can get you to information that would back up almost any claim of fact, no matter how unfounded. It is both the world’s best fact-checker and the world’s best bias confirmer — often at the same time.
  • Nor is it a coincidence that people are increasingly following the election on social media, using it both as the source of their information and as the way to get their view out. Consent is still being manufactured, but the manufacturing is being done willingly by us, usually intended for consumption by other people with whom we already agree, facts or no facts.
  • It really isn’t a surprise that Rubio would ask us to Google for certain facts; that’s how you and I know almost everything we know nowadays — it is a way of knowing that is so embedded into the very fabric of our lives that we don’t even notice it
  • The problem of course is that having more information available, even more accurate information, isn’t what is required by the ideal.
  • What is required is that people actually know and understand that information, and there are reasons to think we are no closer to an informed citizenry understood in that way than we ever have been. Indeed, we might be further away.
  • The worry is no longer about who controls content. It is about who controls the flow of that content.
  • the flow of digital information is just as prone to manipulation as its content
  • No wonder Trump and his followers on Twitter immediately shrugged off Rubio’s inconvenient truths; there is nothing to fear from information when counterinformation is just as plentiful.
  • The real worry concerns our faith in the ideal of an informed citizenry itself. That worry, as I see it, has two faces.
  • First, as Jason Stanley and others have emphasized recently, appeals to ideals can be used to undermine those very ideals.
  • The very availability of information can make us think that the ideal of the informed citizen is more realized than it is — and that, in turn, can actually undermine the ideal, making us less informed, simply because we think we know all we need to know already.
  • Second, the danger is that increasing recognition of the fact that Googling can get you wherever you want to go can make us deeply cynical about the ideal of an informed citizenry — for the simple reason that what counts as an “informed” citizen is a matter of dispute. We no longer disagree just over values. Nor do we disagree just over the facts. We disagree over whose source — whose fountain of facts — is the right one.
  • And once disagreement reaches that far down, the daylight of reason seems very far away indeed.
Javier E

Google's new media apocalypse: How the search giant wants to accelerate the end of the ... - 0 views

  • Google is announcing that it wants to cut out the middleman—that is to say, other websites—and serve you content within its own lovely little walled garden. That sound you just heard was a bunch of media publishers rushing to book an extra appointment with their shrink.
  • Back when search, and not social media, ruled the internet, Google was the sun around which the news industry orbited. Getting to the top of Google’s results was the key that unlocked buckets of page views. Outlet after outlet spent countless hours trying to figure out how to game Google’s prized, secretive algorithm. Whole swaths of the industry were killed instantly if Google tweaked the algorithm.
  • Facebook is now the sun. Facebook is the company keeping everyone up at night. Facebook is the place shaping how stories get chosen, how they get written, how they are packaged and how they show up on its site. And Facebook does all of this with just as much secrecy and just as little accountability as Google did.
  • Facebook just opened up its Instant Articles feature to all publishers. The feature allows external outlets to publish their content directly onto Facebook’s platform, eliminating that pesky journey to their actual website. They can either place their own ads on the content or join a revenue-sharing program with Facebook. Facebook has touted this plan as one which provides a better user experience and has noted the ability for publishers to create ads on the platform as well.
  • The benefit to Facebook is obvious: It gets to keep people inside its house. They don’t have to leave for even a second. The publisher essentially has to accept this reality, sigh about the gradual death of websites and hope that everything works out on the financial side.
  • It’s all part of a much bigger story: that of how the internet, that supposed smasher of gates and leveler of playing fields, has coalesced around a mere handful of mega-giants in the space of just a couple of decades. The gates didn’t really come down. The identities of the gatekeepers just changed. Google, Facebook, Apple, Amazon
Javier E

Lead Gen Sites Pose Challenge to Google - the Haggler - NYTimes.com - 0 views

  • Mr. Strom, it turns out, has so little chance of outranking lead gen sites that he’s having a hard time finding a Web consultant to help him fight back. “I told him that it would just be a waste of his money,” says Craig Baerwaldt of Local Inbound Marketing, a search engine expert whom Mr. Strom tried to hire recently. “There are hundreds of these lead gen sites and they spend a ton of money gaming Google.”
  • because few people search beyond the first page online, snookering Google might be far more effective, especially because many people assume that the company’s algorithm does a bit of consumer-friendly vetting.
  • Yet if the example of locksmiths is any indication, the horde has the upper hand in certain service sectors, and it all but owns Google Places.
  • “‘A young man came yesterday, quoted me $49 to open my door, then he drilled my lock, charged me $400 and left — and now I need a new lock.’ I hear something like that almost every week.”
Javier E

Why It Matters That Google's Android Is Coming to All the Screens - Alexis C. Madrigal ... - 1 views

  • If Android is to be the core of your own personal swarm of screens and robots, it means that all of the data Google knows about you will come to the interactions you have with the stuff around you. The profile you build up at your computer and on your tablet will now apply to your television watching and your commuting, seamlessly
  • Sitting underneath all the interactions you have across these devices, Google's algorithms will be churning. The way Google thinks—its habits of data collection, analysis, and optimization—will become part of these experiences that have previously remained outside the company's reach.
  • Google's announcement that they will be determining "the most important" notifications to show you. Like Facebook's News Feed or Gmail's Priority Inbox, you'll see a selection of possible notifications, rather than all of them, when you glance at the phone. (A tap lets you see all of them.) That is to say, Google is asserting algorithmic control over the new interface frontier. And the software that determines what you see will be opaque to you. 
  • let's be clear: Google is almost always willing to trade user control for small increases in efficiency. 
anonymous

Controversial Quantum Machine Tested by NASA and Google Shows Promise | MIT Technology ... - 0 views

  • artificial-intelligence software.
  • Google says it has proof that a controversial machine it bought in 2013 really can use quantum physics to work through a type of math that’s crucial to artificial intelligence much faster than a conventional computer.
  • “It is a truly disruptive technology that could change how we do everything,” said Rupak Biswas, director of exploration technology at NASA’s Ames Research Center in Mountain View, California.
  • An alternative algorithm is known that could have let the conventional computer be more competitive, or even win, by exploiting what Neven called a “bug” in D-Wave’s design. Neven said the test his group staged is still important because that shortcut won’t be available to regular computers when they compete with future quantum annealers capable of working on larger amounts of data.
  • “For a specific, carefully crafted proof-of-concept problem we achieve a 100-million-fold speed-up,” said Neven.
  • “the world’s first commercial quantum computer.” The computer is installed at NASA’s Ames Research Center in Mountain View, California, and operates on data using a superconducting chip called a quantum annealer.
  • Google is competing with D-Wave to make a quantum annealer that could do useful work.
  • Martinis is also working on quantum hardware that would not be limited to optimization problems, as annealers are.
  • Government and university labs, Microsoft (see “Microsoft’s Quantum Mechanics”), and IBM (see “IBM Shows Off a Quantum Computing Chip”) are also working on that technology.
  • “it may be several years before this research makes a difference to Google products.”
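For context on what a quantum annealer is measured against: the conventional-computer baseline in benchmarks like this one is typically simulated annealing, which explores candidate solutions while gradually lowering a "temperature" that controls how often cost-increasing moves are accepted. A toy sketch follows; the cost function and parameters are illustrative, not D-Wave's actual benchmark problem:

```python
import math
import random

def simulated_annealing(energy, state, steps=10000, t0=2.0):
    """Minimize `energy` over bit strings by flipping one bit at a
    time, always accepting downhill moves and accepting uphill moves
    with a probability that shrinks as the temperature cools."""
    best = state[:]
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9   # linear cooling schedule
        candidate = state[:]
        candidate[random.randrange(len(state))] ^= 1  # flip one bit
        delta = energy(candidate) - energy(state)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            state = candidate
        if energy(state) < energy(best):
            best = state[:]
    return best

# Toy Ising-style cost: count disagreements between neighboring bits,
# so the minima are the all-zeros and all-ones strings.
cost = lambda s: sum(s[i] != s[i + 1] for i in range(len(s) - 1))
random.seed(0)
start = [random.randint(0, 1) for _ in range(12)]
solution = simulated_annealing(cost, start)
```

A quantum annealer attacks the same class of optimization problems in hardware; Neven's "100-million-fold speed-up" claim concerns a problem instance deliberately crafted so that this kind of classical bit-flip search gets stuck.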
Javier E

After the Fact - The New Yorker - 1 views

  • newish is the rhetoric of unreality, the insistence, chiefly by Democrats, that some politicians are incapable of perceiving the truth because they have an epistemological deficit: they no longer believe in evidence, or even in objective reality.
  • the past of proof is strange and, on its uncertain future, much in public life turns. In the end, it comes down to this: the history of truth is cockamamie, and lately it’s been getting cockamamier.
  • Michael P. Lynch is a philosopher of truth. His fascinating new book, “The Internet of Us: Knowing More and Understanding Less in the Age of Big Data,” begins with a thought experiment: “Imagine a society where smartphones are miniaturized and hooked directly into a person’s brain.” As thought experiments go, this one isn’t much of a stretch. (“Eventually, you’ll have an implant,” Google’s Larry Page has promised, “where if you think about a fact it will just tell you the answer.”) Now imagine that, after living with these implants for generations, people grow to rely on them, to know what they know and forget how people used to learn—by observation, inquiry, and reason. Then picture this: overnight, an environmental disaster destroys so much of the planet’s electronic-communications grid that everyone’s implant crashes. It would be, Lynch says, as if the whole world had suddenly gone blind. There would be no immediate basis on which to establish the truth of a fact. No one would really know anything anymore, because no one would know how to know. I Google, therefore I am not.
  • In England, the abolition of trial by ordeal led to the adoption of trial by jury for criminal cases. This required a new doctrine of evidence and a new method of inquiry, and led to what the historian Barbara Shapiro has called “the culture of fact”: the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth and the only kind of evidence that’s admissible not only in court but also in other realms where truth is arbitrated. Between the thirteenth century and the nineteenth, the fact spread from law outward to science, history, and journalism.
  • Lynch isn’t terribly interested in how we got here. He begins at the arrival gate. But altering the flight plan would seem to require going back to the gate of departure.
  • Lynch thinks we are frighteningly close to this point: blind to proof, no longer able to know. After all, we’re already no longer able to agree about how to know. (See: climate change, above.)
  • We now only rarely discover facts, Lynch observes; instead, we download them.
  • For the length of the eighteenth century and much of the nineteenth, truth seemed more knowable, but after that it got murkier. Somewhere in the middle of the twentieth century, fundamentalism and postmodernism, the religious right and the academic left, met up: either the only truth is the truth of the divine or there is no truth; for both, empiricism is an error.
  • That epistemological havoc has never ended: much of contemporary discourse and pretty much all of American politics is a dispute over evidence. An American Presidential debate has a lot more in common with trial by combat than with trial by jury,
  • came the Internet. The era of the fact is coming to an end: the place once held by “facts” is being taken over by “data.” This is making for more epistemological mayhem, not least because the collection and weighing of facts require investigation, discernment, and judgment, while the collection and analysis of data are outsourced to machines
  • “Most knowing now is Google-knowing—knowledge acquired online,”
  • Empiricists believed they had deduced a method by which they could discover a universe of truth: impartial, verifiable knowledge. But the movement of judgment from God to man wreaked epistemological havoc.
  • “The Internet didn’t create this problem, but it is exaggerating it,”
  • nothing could be less well settled in the twenty-first century than whether people know what they know from faith or from facts, or whether anything, in the end, can really be said to be fully proved.
  • In his 2012 book, “In Praise of Reason,” Lynch identified three sources of skepticism about reason: the suspicion that all reasoning is rationalization, the idea that science is just another faith, and the notion that objectivity is an illusion. These ideas have a specific intellectual history, and none of them are on the wane.
  • Their consequences, he believes, are dire: “Without a common background of standards against which we measure what counts as a reliable source of information, or a reliable method of inquiry, and what doesn’t, we won’t be able to agree on the facts, let alone values.
  • When we Google-know, Lynch argues, we no longer take responsibility for our own beliefs, and we lack the capacity to see how bits of facts fit into a larger whole
  • Essentially, we forfeit our reason and, in a republic, our citizenship. You can see how this works every time you try to get to the bottom of a story by reading the news on your smartphone.
  • what you see when you Google “Polish workers” is a function of, among other things, your language, your location, and your personal Web history. Reason can’t defend itself. Neither can Google.
  • Trump doesn’t reason. He’s a lot like that kid who stole my bat. He wants combat. Cruz’s appeal is to the judgment of God. “Father God, please . . . awaken the body of Christ, that we might pull back from the abyss,” he preached on the campaign trail. Rubio’s appeal is to Google.
  • Is there another appeal? People who care about civil society have two choices: find some epistemic principles other than empiricism on which everyone can agree or else find some method other than reason with which to defend empiricism
  • Lynch suspects that doing the first of these things is not possible, but that the second might be. He thinks the best defense of reason is a common practical and ethical commitment.
  • That, anyway, is what Alexander Hamilton meant in the Federalist Papers, when he explained that the United States is an act of empirical inquiry: “It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.”
Javier E

New Foils for the Right: Google and Facebook - The New York Times - 0 views

  • In a sign of escalation, Peter Schweizer, a right-wing journalist known for his investigations into Hillary Clinton, plans to release a new film focusing on technology companies and their role in filtering the news.
  • The documentary, which has not been previously reported, dovetails with concerns raised in recent weeks by right-wing groups about censorship on digital media — a new front in a rapidly evolving culture war.
  • The critique from conservatives, in contrast, casts the big tech companies as censorious and oppressive, all too eager to stifle right-wing content in an effort to mollify liberal critics.
  • Big Tech is easily associated with West Coast liberalism and Democratic politics, making it a fertile target for the right. And operational opacity at Facebook, Google and Twitter, which are reluctant to reveal details about their algorithms and internal policies, can leave them vulnerable, too.
  • “There’s not even a real basis to establish objective research about what’s happening on Facebook, because it’s closed.”
  • And former President Barack Obama said at an off-the-record conference at the Massachusetts Institute of Technology last month that he worried Americans were living in “entirely different realities” and that large tech companies like Facebook were “not just an invisible platform, they’re shaping our culture in powerful ways.” The contents of the speech were published by Reason magazine.
  • “There are political activists in all of these companies that want to actively push a liberal agenda,” he said. “Why does it matter? Because these companies are so ubiquitous and powerful that they are controlling all the means of mass communication.”
  • He is also the president of the Government Accountability Institute, a conservative nonprofit organization. He and Mr. Bannon founded it with funding from the family of Robert Mercer, the billionaire hedge fund manager and donor to Donald J. Trump’s presidential campaign.
  • Jeffrey A. Zucker, the president of CNN, derided Google and Facebook as “monopolies” and called for regulators to step in during a speech in Spain last month, saying the tech hegemony is “the biggest issue facing the growth of journalism in the years ahead.”
  • The panelists accused social media platforms of delisting their videos or stripping them of advertising. Such charges have long been staples of far-right online discourse, especially among YouTubers, but Mr. Schweizer’s project is poised to bring such arguments to a new — and potentially larger — audience.
  • The Facebook adjustment has affected virtually every media organization that is partly dependent on the platform for audiences, but it appears to have hit some harder than others. They include right-wing sites like Gateway Pundit and the millennial-focused Independent Journal Review, which was forced to lay off staff members last month.
  • The social news giant BuzzFeed recently bought ads on Facebook with the message, “Facebook is taking the news out of your News Feed, but we’ve got you covered,” directing users to download its app. Away from the political scrum, the viral lifestyle site LittleThings, once a top publisher on the platform, announced last week that it would cease operations, blaming “a full-on catastrophic update” to Facebook’s revised algorithms.
Javier E

How Calls for Privacy May Upend Business for Facebook and Google - The New York Times - 0 views

  • People detailed their interests and obsessions on Facebook and Google, generating a river of data that could be collected and harnessed for advertising. The companies became very rich. Users seemed happy. Privacy was deemed obsolete, like bloodletting and milkmen
  • It has been many months of allegations and arguments that the internet in general and social media in particular are pulling society down instead of lifting it up.
  • That has inspired a good deal of debate about more restrictive futures for Facebook and Google. At the furthest extreme, some dream of the companies becoming public utilities.
  • There are other avenues still, said Jascha Kaykas-Wolff, the chief marketing officer of Mozilla, the nonprofit organization behind the popular Firefox browser, including advertisers and large tech platforms collecting vastly less user data and still effectively customizing ads to consumers.
  • The greatest likelihood is that the internet companies, frightened by the tumult, will accept a few more rules and work a little harder for transparency.
  • The Cambridge Analytica case, said Vera Jourova, the European Union commissioner for justice, consumers and gender equality, was not just a breach of private data. “This is much more serious, because here we witness the threat to democracy, to democratic plurality,” she said.
  • Although many people had a general understanding that free online services used their personal details to customize the ads they saw, the latest controversy starkly exposed the machinery.
  • Consumers’ seemingly benign activities — their likes — could be used to covertly categorize and influence their behavior. And not just by unknown third parties. Facebook itself has worked directly with presidential campaigns on ad targeting, describing its services in a company case study as “influencing voters.”
  • “If your personal information can help sway elections, which affects everyone’s life and societal well-being, maybe privacy does matter after all.”
  • some trade group executives also warned that any attempt to curb the use of consumer data would put the business model of the ad-supported internet at risk.
  • “You’re undermining a fundamental concept in advertising: reaching consumers who are interested in a particular product,”
  • If suspicion of Facebook and Google is a relatively new feeling in the United States, it has been embedded in Europe for historical and cultural reasons that date back to the Nazi Gestapo, the Soviet occupation of Eastern Europe and the Cold War.
  • “We’re at an inflection point, when the great wave of optimism about tech is giving way to growing alarm,” said Heather Grabbe, director of the Open Society European Policy Institute. “This is the moment when Europeans turn to the state for protection and answers, and are less likely than Americans to rely on the market to sort out imbalances.”
  • In May, the European Union is instituting a comprehensive new privacy law, called the General Data Protection Regulation. The new rules treat personal data as proprietary, owned by an individual, and any use of that data must be accompanied by permission — opting in rather than opting out — after receiving a request written in clear language, not legalese.
  • the protection rules will have more teeth than the current 1995 directive. For example, a company experiencing a data breach involving individuals must notify the data protection authority within 72 hours and would be subject to fines of up to 20 million euros or 4 percent of its annual revenue.
  • “With the new European law, regulators for the first time have real enforcement tools,” said Jeffrey Chester, the executive director of the Center for Digital Democracy, a nonprofit group in Washington. “We now have a way to hold these companies accountable.”
  • Privacy advocates and even some United States regulators have long been concerned about the ability of online services to track consumers and make inferences about their financial status, health concerns and other intimate details to show them behavior-based ads. They warned that such microtargeting could unfairly categorize or exclude certain people.
  • the Do Not Track effort and the privacy bill were both stymied. Industry groups successfully argued that collecting personal details posed no harm to consumers and that efforts to hinder data collection would chill innovation.
  • “If it can be shown that the current situation is actually a market failure and not an individual-company failure, then there’s a case to be made for federal regulation” under certain circumstances
  • The business practices of Facebook and Google were reinforced by the fact that no privacy flap lasted longer than a news cycle or two. Nor did people flee for other services. That convinced the companies that digital privacy was a dead issue.
  • If the current furor dies down without meaningful change, critics worry that the problems might become even more entrenched. When the tech industry follows its natural impulses, it becomes even less transparent.
  • “To know the real interaction between populism and Facebook, you need to give much more access to researchers, not less,” said Paul-Jasper Dittrich, a German research fellow
  • There’s another reason Silicon Valley tends to be reluctant to share information about what it is doing. It believes so deeply in itself that it does not even think there is a need for discussion. The technology world’s remedy for any problem is always more technology
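The fine ceiling in the new European rules quoted above — up to 20 million euros or 4 percent of annual revenue — takes whichever figure is greater. A minimal sketch in Python (the function name and sample revenue figures are illustrative, not from the article):

```python
def max_gdpr_fine(annual_revenue_eur: float) -> float:
    """Upper bound on a GDPR administrative fine: the greater of
    EUR 20 million or 4% of worldwide annual revenue."""
    return max(20_000_000.0, 0.04 * annual_revenue_eur)

# A mid-sized firm is capped by the flat EUR 20M tier...
print(max_gdpr_fine(100_000_000))     # 20000000.0
# ...while a large platform is capped by the 4% tier.
print(max_gdpr_fine(40_000_000_000))  # 1600000000.0
```

The point of the dual tier is that the cap scales with company size rather than becoming a fixed, ignorable cost of doing business.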
Javier E

'Filter Bubble': Pariser on Web Personalization, Privacy - TIME - 0 views

  • the World Wide Web came along and blew the gatekeepers away. Suddenly anyone with a computer and an Internet connection could take part in the conversation. Countless viewpoints bloomed. There was no longer a mainstream; instead, there was an ocean of information, one in which Web users were free to swim.
  • Where once Google delivered search results based on an algorithm that was identical for everyone, now what we see when we enter a term in the big box depends on who we are, where we are and what we are. Facebook has long since done the same thing for its all-important News Feed: you'll see different status updates and stories float to the top based on the data Mark Zuckerberg and company have on you. The universal Web is a thing of the past. Instead, as Pariser writes, we've been left "isolated in a web of one" — and, given that we increasingly view the world through the lens of the Internet, that change has frightening consequences for the media, community and even democracy.
  • Google has begun personalizing search results — something it does even if you're not signed into your Google account. (A Google engineer told Pariser that the company uses 57 different signals to shape individual search results, including what kind of browser you're using and where you are.)
  • Yahoo! News — the biggest news site on the Web — is personalized, and even mainstream sites like those of the New York Times and the Washington Post are giving more space to personalized recommendations. As Google executive chairman Eric Schmidt has said, "It will be very hard for people to watch or consume something that is not tailored for them."
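The personalization described above — identical queries producing different rankings depending on who is asking — can be illustrated with a toy re-ranking sketch. This is not Google's actual algorithm; the signals, topics, and weights below are invented for illustration:

```python
def personalized_rank(results, user_signals):
    """results: list of (title, base_relevance, topic) tuples.
    user_signals: topic -> affinity weight inferred from signals
    such as browser, location, and search history."""
    def score(item):
        title, base_relevance, topic = item
        # Boost results whose topic matches the user's inferred interests.
        return base_relevance + user_signals.get(topic, 0.0)
    return sorted(results, key=score, reverse=True)

results = [("BP oil spill coverage", 0.9, "news"),
           ("BP investor relations", 0.8, "finance")]

# The same query, seen through two different "webs of one":
print(personalized_rank(results, {"finance": 0.5})[0][0])  # BP investor relations
print(personalized_rank(results, {"news": 0.5})[0][0])     # BP oil spill coverage
```

Even two signals are enough to give two users disjoint top results for the same words typed into the same box.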
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

YouTube to Curb Its Referrals to Conspiracy Theories and Other False Claims - WSJ - 0 views

  • Videos that could “misinform users in harmful ways,” such as ones that claim the Earth isn’t round or question the actors behind the Sept. 11 terrorist attacks, will no longer be recommended with as much prominence, the Alphabet Inc. unit said in a blog post Friday.
  • Though the factors underpinning YouTube’s recommendation system are largely unknown, its influence is apparent in the numbers. YouTube has said its recommendations drive more than 70% of users’ viewing time, and that it recommends more than 200 million videos daily on its home page alone.
Javier E

Imagine a World Without Apps - The New York Times - 0 views

  • Allow me to ask a wild question: What if we played games, shopped, watched Netflix and read news on our smartphones — without using apps?
  • the downsides of our app system — principally the control that Apple and Google, the dominant app store owners in much of the world, exert over our digital lives — are onerous enough to contemplate another path.
  • in recent months, Microsoft’s Xbox video gaming console, the popular game Fortnite and other game companies have moved ahead with technology that makes it possible to play video games on smartphone web browsers.
  • if apps weren’t dominant, would we have a richer variety of digital services from a broader array of companies?
  • In the early smartphone era, there was a tug of war between technologies that were more like websites and the apps we know today. Apps won, mostly because they were technically superior.
  • control. Apple and Google dictate much of what is allowed on the world’s phones. There are good outcomes from this, including those companies weeding out bad or dangerous apps and giving us one place to find them.
  • With unhappy side effects. Apple and Google charge a significant fee on many in-app purchases, and they’ve forced app makers into awkward workarounds.
  • You know what’s free from Apple and Google’s iron grip? The web. Smartphones could lean on the web instead.
  • This is about imagining an alternate reality where companies don’t need to devote money to creating apps that are tailored to iPhones and Android phones, can’t work on any other devices and obligate app makers to hand over a cut of each sale.
  • Maybe more smaller digital companies could thrive. Maybe our digital services would be cheaper and better. Maybe we’d have more than two dominant smartphone systems
Emily Freilich

The Man Who Would Teach Machines to Think - James Somers - The Atlantic - 1 views

  • Douglas Hofstadter, the Pulitzer Prize–winning author of Gödel, Escher, Bach, thinks we've lost sight of what artificial intelligence really means. His stubborn quest to replicate the human mind.
  • “If somebody meant by artificial intelligence the attempt to understand the mind, or to create something human-like, they might say—maybe they wouldn’t go this far—but they might say this is some of the only good work that’s ever been done
  • Their operating premise is simple: the mind is a very unusual piece of software, and the best way to understand how a piece of software works is to write it yourself.
  • “It depends on what you mean by artificial intelligence.”
  • Computers are flexible enough to model the strange evolved convolutions of our thought, and yet responsive only to precise instructions. So if the endeavor succeeds, it will be a double victory: we will finally come to know the exact mechanics of our selves—and we’ll have made intelligent machines.
  • Ever since he was about 14, when he found out that his youngest sister, Molly, couldn’t understand language, because she “had something deeply wrong with her brain” (her neurological condition probably dated from birth, and was never diagnosed), he had been quietly obsessed by the relation of mind to matter.
  • How could consciousness be physical? How could a few pounds of gray gelatin give rise to our very thoughts and selves?
  • Consciousness, Hofstadter wanted to say, emerged via just the same kind of “level-crossing feedback loop.”
  • In 1931, the Austrian-born logician Kurt Gödel had famously shown how a mathematical system could make statements not just about numbers but about the system itself.
  • But then AI changed, and Hofstadter didn’t change with it, and for that he all but disappeared.
  • By the early 1980s, the pressure was great enough that AI, which had begun as an endeavor to answer yes to Alan Turing’s famous question, “Can machines think?,” started to mature—or mutate, depending on your point of view—into a subfield of software engineering, driven by applications.
  • Take Deep Blue, the IBM supercomputer that bested the chess grandmaster Garry Kasparov. Deep Blue won by brute force.
  • Hofstadter wanted to ask: Why conquer a task if there’s no insight to be had from the victory? “Okay,” he says, “Deep Blue plays very good chess—so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?”
  • AI started working when it ditched humans as a model, because it ditched them. That’s the thrust of the analogy: Airplanes don’t flap their wings; why should computers think?
  • It’s a compelling point. But it loses some bite when you consider what we want: a Google that knows, in the way a human would know, what you really mean when you search for something
  • How do you make a search engine that understands if you don’t know how you understand?
  • “Cognition is recognition,” he likes to say. He describes “seeing as” as the essential cognitive act: you see some lines as “an A,” you see a hunk of wood as “a table,” you see a meeting as “an emperor-has-no-clothes situation” and a friend’s pouting as “sour grapes”
  • That’s what it means to understand. But how does understanding work?
  • analogy is “the fuel and fire of thinking,” the bread and butter of our daily mental lives.
  • there’s an analogy, a mental leap so stunningly complex that it’s a computational miracle: somehow your brain is able to strip any remark of the irrelevant surface details and extract its gist, its “skeletal essence,” and retrieve, from your own repertoire of ideas and experiences, the story or remark that best relates.
  • in Hofstadter’s telling, the story goes like this: when everybody else in AI started building products, he and his team, as his friend, the philosopher Daniel Dennett, wrote, “patiently, systematically, brilliantly,” way out of the light of day, chipped away at the real problem. “Very few people are interested in how human intelligence works,”
  • For more than 30 years, Hofstadter has worked as a professor at Indiana University at Bloomington
  • The quick unconscious chaos of a mind can be slowed down on the computer, or rewound, paused, even edited
  • A project out of IBM called Candide. The idea behind Candide, a machine-translation system, was to start by admitting that the rules-based approach requires too deep an understanding of how language is produced; how semantics, syntax, and morphology work; and how words commingle in sentences and combine into paragraphs—to say nothing of understanding the ideas for which those words are merely conduits.
  • Hofstadter directs the Fluid Analogies Research Group, affectionately known as FARG.
  • Parts of a program can be selectively isolated to see how it functions without them; parameters can be changed to see how performance improves or degrades. When the computer surprises you—whether by being especially creative or especially dim-witted—you can see exactly why.
  • When you read Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, which describes in detail this architecture and the logic and mechanics of the programs that use it, you wonder whether maybe Hofstadter got famous for the wrong book.
  • But very few people, even admirers of GEB, know about the book or the programs it describes. And maybe that's because FARG's programs are almost ostentatiously impractical. Because they operate in tiny, seemingly childish “microdomains.” Because there is no task they perform better than a human.
  • “The entire effort of artificial intelligence is essentially a fight against computers’ rigidity.”
  • “Nobody is a very reliable guide concerning activities in their mind that are, by definition, subconscious,” he once wrote. “This is what makes vast collections of errors so important. In an isolated error, the mechanisms involved yield only slight traces of themselves; however, in a large collection, vast numbers of such slight traces exist, collectively adding up to strong evidence for (and against) particular mechanisms.
  • So IBM threw that approach out the window. What the developers did instead was brilliant, but so straightforward,
  • The technique is called “machine learning.” The goal is to make a device that takes an English sentence as input and spits out a French sentence
  • What you do is feed the machine English sentences whose French translations you already know. (Candide, for example, used 2.2 million pairs of sentences, mostly from the bilingual proceedings of Canadian parliamentary debates.)
  • By repeating this process with millions of pairs of sentences, you will gradually calibrate your machine, to the point where you'll be able to enter a sentence whose translation you don't know and get a reasonable result
  • Google Translate team can be made up of people who don’t speak most of the languages their application translates. “It’s a bang-for-your-buck argument,” Estelle says. “You probably want to hire more engineers instead” of native speakers.
  • But the need to serve 1 billion customers has a way of forcing the company to trade understanding for expediency. You don’t have to push Google Translate very far to see the compromises its developers have made for coverage, and speed, and ease of engineering. Although Google Translate captures, in its way, the products of human intelligence, it isn’t intelligent itself.
  • “Did we sit down when we built Watson and try to model human cognition?” Dave Ferrucci, who led the Watson team at IBM, pauses for emphasis. “Absolutely not. We just tried to create a machine that could win at Jeopardy.”
  • For Ferrucci, the definition of intelligence is simple: it’s what a program can do. Deep Blue was intelligent because it could beat Garry Kasparov at chess. Watson was intelligent because it could beat Ken Jennings at Jeopardy.
  • “There’s a limited number of things you can do as an individual, and I think when you dedicate your life to something, you’ve got to ask yourself the question: To what end? And I think at some point I asked myself that question, and what it came out to was, I’m fascinated by how the human mind works, it would be fantastic to understand cognition, I love to read books on it, I love to get a grip on it”—he called Hofstadter’s work inspiring—“but where am I going to go with it? Really what I want to do is build computer systems that do something.
  • Peter Norvig, one of Google’s directors of research, echoes Ferrucci almost exactly. “I thought he was tackling a really hard problem,” he told me about Hofstadter’s work. “And I guess I wanted to do an easier problem.”
  • Of course, the folly of being above the fray is that you’re also not a part of it
  • As our machines get faster and ingest more data, we allow ourselves to be dumber. Instead of wrestling with our hardest problems in earnest, we can just plug in billions of examples of them.
  • Hofstadter hasn’t been to an artificial-intelligence conference in 30 years. “There’s no communication between me and these people,” he says of his AI peers. “None. Zero. I don’t want to talk to colleagues that I find very, very intransigent and hard to convince of anything
  • Everything from plate tectonics to evolution—all those ideas, someone had to fight for them, because people didn’t agree with those ideas.
  • Academia is not an environment where you just sit in your bath and have ideas and expect everyone to run around getting excited. It’s possible that in 50 years’ time we’ll say, ‘We really should have listened more to Doug Hofstadter.’ But it’s incumbent on every scientist to at least think about what is needed to get people to understand the ideas.”
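The statistical recipe described in the machine-learning annotations above (feed in paired sentences, accumulate co-occurrence statistics, then translate sentences you have never seen) can be sketched in a few lines of Python. This is a toy, word-for-word model over an invented four-sentence corpus, not Candide's actual alignment model, which was trained on millions of sentence pairs:

```python
from collections import Counter, defaultdict

# A toy corpus of aligned (English, French) sentence pairs: the same kind
# of data Candide mined from Canadian parliamentary proceedings, just tiny.
PAIRS = [
    ("the house", "la maison"),
    ("the blue house", "la maison bleue"),
    ("the flower", "la fleur"),
    ("the blue flower", "la fleur bleue"),
]

def train(pairs):
    """Count word co-occurrences across the aligned sentence pairs."""
    cooc = defaultdict(Counter)
    en_count, fr_count = Counter(), Counter()
    for en, fr in pairs:
        for e in en.split():
            en_count[e] += 1
            for f in fr.split():
                cooc[e][f] += 1
        for f in fr.split():
            fr_count[f] += 1
    return cooc, en_count, fr_count

def translate(sentence, cooc, en_count, fr_count):
    """For each English word, pick the French word whose co-occurrence is
    most surprising given how common both words are (a Dice-like score).
    Word-by-word only: real systems also model reordering and phrases."""
    def best(e):
        return max(cooc[e],
                   key=lambda f: cooc[e][f] ** 2 / (en_count[e] * fr_count[f]))
    return " ".join(best(e) for e in sentence.split())

model = train(PAIRS)
print(translate("the flower", *model))  # -> la fleur
```

Even this toy version illustrates the trade-off Hofstadter objects to: it produces "la fleur" without any notion of what a flower is, and because it translates word by word it gets French adjective order wrong ("la bleue maison" for "the blue house").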
sissij

Quantum Gravity Loops Back To Ancient Atomic Logic, and The Big Bang Becomes A Big Boun... - 0 views

  • Greeks had the “first true alphabet”: a “universal” writing system that used a few letters to encode the infinite variety of all possible utterances. Similarly, all matter is written in a "language… of atoms."
  • Mysterious “meanings” still surround 100-year-old quantum mechanics equations
  • Their meaning/function/grammar is relational and sequential and word-like. The information encoded in matching sequential text-like compositions matters (DNA—>RNA, letters—>“social cartesian” lexicon).
  • ...3 more annotations...
  • Beyond the grammars of geometry and algebra lies a domain of not math-like but text-like compositions and meanings (of semantics beyond mathematics).
  • 17. Word and world both have grammars that don’t fit our available mathematical rules.
  • 18. Reality is relational, and not entirely objective. Subject and object aren’t separable, they’re entangled, inescapably. “Objective” is always relative to some other system/observer. 
  •  
    I find it very interesting that the author is trying to look at the world from a perspective other than mathematics. He treats atoms as a language with grammar and meanings, and argues that mathematical rules cannot fully explain our world because they are too objective. He invokes the idea of language to describe how the world is entangled and relational. As we learned in TOK, language is an important AOK that reflects human civilization in a very complicated way. Language is flexible, emotional, and relational. It gives things meaning, as humans like to assign meaning and pattern to the things around them. The world around us is not just cold fact; we as observers give it meaning. In that sense, the concept of language can better help us depict the world. --Sissi (2/27/2017)
Javier E

Using Google to Tell Real Science from Fads | Mind Matters | Big Think - 0 views

  • a database of words from millions of books digitized by Google—4 percent of all books ever printed—could be one of the big ones. It's a fabulous source of ideas and evidence for theories about society, and it's fabulously democratic. Google offers a handy analyzer, the Ngram Viewer, which anyone can use to test an idea. A case in point: Yesterday, the social psychologist Rob Kurzban argued that the tool can distinguish between genuine scientific theories and intellectual fads.
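Kurzban's fad-versus-science test can be sketched in a few lines of Python. The frequency series below are invented for illustration (a real test would use the Google Books Ngram data itself): the idea is simply that a fad's relative frequency spikes and then collapses, while a durable scientific concept keeps growing.

```python
# Occurrences per million words by decade (made-up numbers, for illustration).
YEARS = [1960, 1970, 1980, 1990, 2000]
FREQ = {
    "cognitive dissonance": [1, 4, 9, 14, 20],    # steady growth
    "subliminal advertising": [1, 12, 18, 6, 2],  # spike, then collapse
}

def looks_like_fad(series, drop=0.5):
    """Flag a term whose frequency has fallen below `drop` of its peak."""
    return series[-1] < drop * max(series)

for term, series in FREQ.items():
    print(term, "->", "fad?" if looks_like_fad(series) else "durable?")
```

The 50% threshold is an arbitrary choice here; the point is only that a simple trajectory test over word frequencies is the kind of hypothesis the Ngram Viewer lets anyone check.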
Javier E

Does Google Make Us Stupid? - Pew Research Center - 0 views

  • Carr argued that the ease of online searching and distractions of browsing through the web were possibly limiting his capacity to concentrate. "I'm not thinking the way I used to," he wrote, in part because he is becoming a skimming, browsing reader, rather than a deep and engaged reader. "The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author's words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas.... If we lose those quiet spaces, or fill them up with ‘content,' we will sacrifice something important not only in our selves but in our culture."
  • force us to get smarter if we are to survive. "Most people don't realize that this process is already under way," he wrote. "In fact, it's happening all around us, across the full spectrum of how we understand intelligence. It's visible in the hive mind of the Internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines, and in the development of drugs that some people (myself included) have discovered let them study harder, focus better, and stay awake longer with full clarity." He argued that while the proliferation of technology and media can challenge humans' capacity to concentrate there were signs that we are developing "fluid intelligence-the ability to find meaning in confusion and solve new problems, independent of acquired knowledge." He also expressed hope that techies will develop tools to help people find and assess information smartly.
  • 76% of the experts agreed with the statement, "By 2020, people's use of the internet has enhanced human intelligence; as people are allowed unprecedented access to more information they become smarter and make better choices. Nicholas Carr was wrong: Google does not make us stupid."