
Home / TOK Friends / Group items matching "google" in title, tags, annotations or url

Javier E

How Google Search works - 0 views

  • If you want to understand how Google search works, go through this diagram in detail.
Javier E

Dealing With an Identity Hijacked on the Online Highway - NYTimes.com - 0 views

  • his predicament stands as a chilling example of what it means to be at the mercy of the Google algorithm.
  • The question is best directed at the search engines. And Google’s defense — that the behavior of its ever-improving algorithm should be considered independent of the results it produces in a particular controversial case — has a particularly patronizing air, especially when it comes to hurting living, breathing people.
  • it was the algorithm that took the hit, and washed away accountability.
  • ...1 more annotation...
  • “When a company is filled with engineers, it turns to engineering to solve problems,” he wrote candidly. “Reduce each decision to a simple logic problem. Remove all subjectivity and just look at the data.”
julia rhodes

Opinion: Is Google redefining 'don't be evil'? - CNN.com - 0 views

  • Well, some of Google's recent forays are waking people up to the fact that evil is in the eyes of the beholder. The company just acquired military robot maker Boston Dynamics, leading to great consternation in the Twitterverse. As @BrentButt put it this week in a tweet that caught fire:
  • What we have to ask, and keep asking at every turn, is: To what end? What real purpose are we serving?
  • Not doing evil is actually a pretty low bar to begin with. Is this really a high aspiration? To avoid embodying Satan in silicon?
  • ...5 more annotations...
  • We can't employ an entirely programmatic approach to human affairs. However well we think we might be embedding our technologies with the values we hope to express, more often than not we also get unexpected consequences.
  • Still, we can't help but do a bit of evil when we build technology upon technology, without taking a pause to ask what it's all for. New technologies give us the opportunity to reevaluate the systems we have been using up until now, and consider doing things differently.
  • our best Stanford computer science graduates end up writing algorithms that better extract money from the stock market, rather than exploring whether capital is even serving its original purpose of getting funds to new businesses.
  • When we develop technology in a vacuum, disconnected from the reality in which people really live, we are too likely to spend our energy designing some abstract vision of a future life rather than addressing the pains and injustices around us right now. Technology becomes a way of escaping the world's problems, whether through virtual reality or massive Silicon Valley stock options packages, rather than engaging with them.
  • It's not enough to computerize and digitize the society we have, and exacerbate its problems by new means. We must transcend the mere avoidance of the patently evil and instead seek to do good. That may involve actually overturning and remaking some institutions and processes from the ground up. That's the real potential of digital technology. To retrieve the values and ideas that may have seemed impossible before and see whether we can realize them today in this very new world.
maddieireland334

Google Fiber heading to Salt Lake City - 0 views

  • Google made the announcement in a blog post Tuesday. The city joins several others where Google is laying down cable to provide incredibly fast Internet access. Atlanta, Georgia; Nashville, Tennessee; and both Charlotte and Raleigh-Durham in North Carolina are getting Fiber soon. Google Fiber delivers Internet speeds of up to 1 gigabit per second.
Javier E

How YouTube Drives People to the Internet's Darkest Corners - WSJ - 0 views

  • YouTube is the new television, with more than 1.5 billion users, and videos the site recommends have the power to influence viewpoints around the world.
  • Those recommendations often present divisive, misleading or false content despite changes the site has recently made to highlight more-neutral fare, a Wall Street Journal investigation found.
  • Behind that growth is an algorithm that creates personalized playlists. YouTube says these recommendations drive more than 70% of its viewing time, making the algorithm among the single biggest deciders of what people watch.
  • ...25 more annotations...
  • People cumulatively watch more than a billion YouTube hours daily world-wide, a 10-fold increase from 2012
  • After the Journal this week provided examples of how the site still promotes deceptive and divisive videos, YouTube executives said the recommendations were a problem.
  • When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints.
  • Such recommendations play into concerns about how social-media sites can amplify extremist voices, sow misinformation and isolate users in “filter bubbles”
  • Unlike Facebook Inc. and Twitter Inc. sites, where users see content from accounts they choose to follow, YouTube takes an active role in pushing information to users they likely wouldn’t have otherwise seen.
  • “The editorial policy of these new platforms is to essentially not have one,”
  • “That sounded great when it was all about free speech and ‘in the marketplace of ideas, only the best ones win.’ But we’re seeing again and again that that’s not what happens. What’s happening instead is the systems are being gamed and people are being gamed.”
  • YouTube has been tweaking its algorithm since last autumn to surface what its executives call “more authoritative” news sources.
  • YouTube last week said it is considering a design change to promote relevant information from credible news sources alongside videos that push conspiracy theories.
  • The Journal investigation found YouTube’s recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven’t shown interest in such content.
  • YouTube engineered its algorithm several years ago to make the site “sticky”—to recommend videos that keep users staying to watch still more, said current and former YouTube engineers who helped build it. The site earns money selling ads that run before and during videos.
  • YouTube’s algorithm tweaks don’t appear to have changed how YouTube recommends videos on its home page. On the home page, the algorithm provides a personalized feed for each logged-in user largely based on what the user has watched.
  • There is another way to calculate recommendations, demonstrated by YouTube’s parent, Alphabet Inc.’s Google. It has designed its search-engine algorithms to recommend sources that are authoritative, not just popular.
  • Google spokeswoman Crystal Dahlen said that Google improved its algorithm last year “to surface more authoritative content, to help prevent the spread of blatantly misleading, low-quality, offensive or downright false information,” adding that it is “working with the YouTube team to help share learnings.”
  • In recent weeks, it has expanded that change to other news-related queries. Since then, the Journal’s tests show, news searches in YouTube return fewer videos from highly partisan channels.
  • YouTube’s recommendations became even more effective at keeping people on the site in 2016, when the company began employing an artificial-intelligence technique called a deep neural network that makes connections between videos that humans wouldn’t. The algorithm uses hundreds of signals, YouTube says, but the most important remains what a given user has watched.
  • Using a deep neural network makes the recommendations more of a black box to engineers than previous techniques,
  • “We don’t have to think as much,” he said. “We’ll just give it some raw data and let it figure it out.”
  • To better understand the algorithm, the Journal enlisted former YouTube engineer Guillaume Chaslot, who worked on its recommendation engine, to analyze thousands of YouTube’s recommendations on the most popular news-related queries
  • Mr. Chaslot created a computer program that simulates the “rabbit hole” users often descend into when surfing the site. In the Journal study, the program collected the top five results to a given search. Next, it gathered the top three recommendations that YouTube promoted once the program clicked on each of those results. Then it gathered the top three recommendations for each of those promoted videos, continuing four clicks from the original search.
  • The first analysis, of November’s top search terms, showed YouTube frequently led users to divisive and misleading videos. On the 21 news-related searches left after eliminating queries about entertainment, sports and gaming—such as “Trump,” “North Korea” and “bitcoin”—YouTube most frequently recommended these videos:
  • The algorithm doesn’t seek out extreme videos, they said, but looks for clips that data show are already drawing high traffic and keeping people on the site. Those videos often tend to be sensationalist and on the extreme fringe, the engineers said.
  • Repeated tests by the Journal as recently as this week showed the home page often fed far-right or far-left videos to users who watched relatively mainstream news sources, such as Fox News and MSNBC.
  • Searching some topics and then returning to the home page without doing a new search can produce recommendations that push users toward conspiracy theories even if they seek out just mainstream sources.
  • After searching for “9/11” last month, then clicking on a single CNN clip about the attacks, and then returning to the home page, the fifth and sixth recommended videos were about claims the U.S. government carried out the attacks. One, titled “Footage Shows Military Plane hitting WTC Tower on 9/11—13 Witnesses React,” had 5.3 million views.
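Mr. Chaslot's crawling method, as the annotations above describe it (the top five search results, then the top three recommendations per video, continuing four clicks from the original search), can be sketched roughly as follows. The `search` and `recommendations_for` callables are hypothetical stand-ins for the actual YouTube data collection, which the article does not detail:

```python
# Rough sketch of the "rabbit hole" walk described in the WSJ study:
# take the top n_results search hits, then repeatedly expand each video
# into its top n_recs recommendations, for `depth` clicks in total.
# search() and recommendations_for() are hypothetical stand-ins.

def rabbit_hole(query, search, recommendations_for,
                n_results=5, n_recs=3, depth=4):
    frontier = search(query)[:n_results]   # top 5 results for the query
    collected = list(frontier)
    for _ in range(depth):                 # 4 clicks from the search
        next_frontier = []
        for video in frontier:
            recs = recommendations_for(video)[:n_recs]  # top 3 recs
            collected.extend(recs)
            next_frontier.extend(recs)
        frontier = next_frontier
    return collected
```

With those branching factors, a single query yields 5 + 15 + 45 + 135 + 405 = 605 collected videos, which suggests how a study over 21 news-related queries could accumulate thousands of recommendations.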
Javier E

Early Facebook and Google Employees Form Coalition to Fight What They Built - The New York Times - 0 views

  • A group of Silicon Valley technologists who were early employees at Facebook and Google, alarmed over the ill effects of social networks and smartphones, are banding together to challenge the companies they helped build.
  • The campaign, titled The Truth About Tech, will be funded with $7 million from Common Sense and capital raised by the Center for Humane Technology. Common Sense also has $50 million in donated media and airtime
  • . It will be aimed at educating students, parents and teachers about the dangers of technology, including the depression that can come from heavy use of social media.
  • ...9 more annotations...
  • Chamath Palihapitiya, a venture capitalist who was an early employee at Facebook, said in November that the social network was “ripping apart the social fabric of how society works.”
  • The new Center for Humane Technology includes an unprecedented alliance of former employees of some of today’s biggest tech companies. Apart from Mr. Harris, the center includes Sandy Parakilas, a former Facebook operations manager; Lynn Fox, a former Apple and Google communications executive; Dave Morin, a former Facebook executive; Justin Rosenstein, who created Facebook’s Like button and is a co-founder of Asana; Roger McNamee, an early investor in Facebook; and Renée DiResta, a technologist who studies bots.
  • Its first project to reform the industry will be to introduce a Ledger of Harms — a website aimed at guiding rank-and-file engineers who are concerned about what they are being asked to build. The site will include data on the health effects of different technologies and ways to make products that are healthier
  • “Facebook appeals to your lizard brain — primarily fear and anger,” he said. “And with smartphones, they’ve got you for every waking moment.”
  • Apple’s chief executive, Timothy D. Cook, told The Guardian last month that he would not let his nephew on social media, while the Facebook investor Sean Parker also recently said of the social network that “God only knows what it’s doing to our children’s brains.” Mr. Steyer said, “You see a degree of hypocrisy with all these guys in Silicon Valley.”
  • The new group also plans to begin lobbying for laws to curtail the power of big tech companies. It will initially focus on two pieces of legislation: a bill being introduced by Senator Edward J. Markey, Democrat of Massachusetts, that would commission research on technology’s impact on children’s health, and a bill in California by State Senator Bob Hertzberg, a Democrat, which would prohibit the use of digital bots without identification.
  • Mr. McNamee said he had joined the Center for Humane Technology because he was horrified by what he had helped enable as an early Facebook investor.
  • Truth About Tech campaign was modeled on antismoking drives and focused on children because of their vulnerability.
  • He said the people who made these products could stop them before they did more harm.
Javier E

Silicon Valley Is Not Your Friend - The New York Times - 0 views

  • By all accounts, these programmers turned entrepreneurs believed their lofty words and were at first indifferent to getting rich from their ideas. A 1998 paper by Sergey Brin and Larry Page, then computer-science graduate students at Stanford, stressed the social benefits of their new search engine, Google, which would be open to the scrutiny of other researchers and wouldn’t be advertising-driven.
  • The Google prototype was still ad-free, but what about the others, which took ads? Mr. Brin and Mr. Page had their doubts: “We expect that advertising-funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.”
  • He was concerned about them as young students lacking perspective about life and was worried that these troubled souls could be our new leaders. Neither Mr. Weizenbaum nor Mr. McCarthy mentioned, though it was hard to miss, that this ascendant generation were nearly all white men with a strong preference for people just like themselves. In a word, they were incorrigible, accustomed to total control of what appeared on their screens. “No playwright, no stage director, no emperor, however powerful,” Mr. Weizenbaum wrote, “has ever exercised such absolute authority to arrange a stage or a field of battle and to command such unswervingly dutiful actors or troops.”
  • ...7 more annotations...
  • In his epic anti-A.I. work from the mid-1970s, “Computer Power and Human Reason,” Mr. Weizenbaum described the scene at computer labs. “Bright young men of disheveled appearance, often with sunken glowing eyes, can be seen sitting at computer consoles, their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler’s on the rolling dice,” he wrote. “They exist, at least when so engaged, only through and for the computers. These are computer bums, compulsive programmers.”
  • As Mr. Weizenbaum feared, the current tech leaders have discovered that people trust computers and have licked their lips at the possibilities. The examples of Silicon Valley manipulation are too legion to list: push notifications, surge pricing, recommended friends, suggested films, people who bought this also bought that.
  • Welcome to Silicon Valley, 2017.
  • Growth becomes the overriding motivation — something treasured for its own sake, not for anything it brings to the world
  • Facebook and Google can point to a greater utility that comes from being the central repository of all people, all information, but such market dominance has obvious drawbacks, and not just the lack of competition. As we’ve seen, the extreme concentration of wealth and power is a threat to our democracy by making some people and companies unaccountable.
  • As is becoming obvious, these companies do not deserve the benefit of the doubt. We need greater regulation, even if it impedes the introduction of new services.
  • We need to break up these online monopolies because if a few people make the decisions about how we communicate, shop, learn the news, again, do we control our own society?
tongoscar

iPhone 11 Pro has just been defeated by Huawei's Google-less Mate 30 Pro | Express.co.uk - 0 views

  • “Overall, the iPhone is among the very best for exposure; it’s only in very low light when it can’t keep up with devices with larger image sensors, such as the Huawei Mate 30 Pro.
  • Even though DxOMark claims the Mate 30 Pro has a better camera system overall, the iPhone 11 Pro remains the only phone of the two you can buy right now.
  • Although Huawei revealed the Mate 30 Pro at a glitzy hardware event in September, its European release remains elusive. There’s currently no word on when it’ll finally arrive.
  • ...1 more annotation...
  • The US’s trade ban on Huawei is most likely to blame for the delayed release. This prevents Google from granting the Mate 30 Pro an Android licence, so the device can’t come pre-installed with Google apps and services like the Play Store, Gmail, Chrome and Google Maps.
Javier E

Even for a company that specialises in PR disasters, Facebook has excelled with its Australian blackout | Facebook | The Guardian - 0 views

  • Facebook released a statement on Wednesday stating that regrettably it was abandoning its plans to “significantly increase our investments with local publishers” and instead pulled the plug. Google meanwhile has managed to sidestep the proposition of a “link tax” by delivering the government’s objective of lucrative deals with Australian media companies from News Corp down to the smallest publishers. By flexing a little Google has for now avoided mandatory payment arbitration.
  • Governments have arguably not paid nearly enough attention to producing alternative digital solutions to giant centralised advertising companies that provide an increasing number of communication services for their citizens. Facebook’s petulance has inadvertently made a case in Australia for more regulation rather than less.
  • News organisations need to develop alternative platforms, and governments need to provide more regulated certainty. Highly digital newsrooms that have resources and strong relationships with their audiences started moving away from Facebook a long time ago, and are less affected by its volatility.
  • ...1 more annotation...
  • Smaller publishers, and those with communities with low resources themselves, are much more dependent. A withdrawal from Facebook could be a galvanising moment for Australia, and beyond.
Javier E

Google Glass May Be Hands-Free, But Not Brain-Free - NYTimes.com - 0 views

  • The “eyes-free” goal addresses an obvious limitation of the human brain: we can’t look away from where we’re heading for more than a few seconds without losing our bearings. And time spent looking at a cellphone is time spent oblivious to the world, as shown in the viral videos of distracted phone users who stumble into shopping-mall fountains. Most people intuitively grasp the “two-second rule.”
  • Researchers at the Virginia Tech Transportation Institute outfitted cars and trucks with cameras and sensors to monitor real-world driving behavior. When drivers were communicating, they tended to look away for as much as 4.6 seconds during a 6-second period. In effect, people lose track of time when texting, leading them to look at their phones far longer than they know they should
  • Heads-up displays like Google Glass, and voice interfaces like Siri, seem like ideal solutions, letting you simultaneously interact with your smartphone while staying alert to your surroundings
  • ...4 more annotations...
  • The problem is that looking is not the same as seeing, and people make wrong assumptions about what will grab their attention.
  • about 70 percent of Americans believe that “people will notice when something unexpected enters their field of view, even when they’re paying attention to something else.”
  • “inattentional blindness” shows that what we see depends not just on where we look but also on how we focus our attention.
  • Perception requires both your eyes and your mind, and if your mind is engaged, you can f
markfrankel18

Erasing History in the Internet Era - NYTimes.com - 1 views

  • Lorraine Martin, a nurse in Greenwich, was arrested in 2010 with her two grown sons when police raided her home and found a small stash of marijuana, scales and plastic bags. The case against her was tossed out when she agreed to take some drug classes, and the official record was automatically purged. It was, the law seemed to assure her, as if it had never happened.
  • Defamation is the publication of information that is both damaging and false. The arrest story was obviously true when it was first published. But Connecticut’s erasure law has already established that truth can be fungible. Martin, her suit says, was “deemed never to have been arrested.” And therefore the news story had metamorphosed into a falsehood.
  • They debate the difference between “historical fact” and “legal fact.” They dispute whether something that was true when it happened can become not just private but actually untrue, so untrue you can swear an oath that it never happened and, in the eyes of the law, you’ll be telling the truth.
  • ...7 more annotations...
  • Google’s latest transparency report shows a sharp rise in requests from governments and courts to take down potentially damaging material.
  • In Europe, where press freedoms are less sacred and the right to privacy is more ensconced, the idea has taken hold that individuals have a “right to be forgotten,” and those who want their online particulars expunged tend to have the government on their side. In Germany or Spain, Lorraine Martin might have a winning case.
  • The Connecticut case is just one manifestation of an anxious backlash against the invasive power of the Internet, a world of Big Data and ever more powerful search engines, in which it seems almost everything is permanently recorded and accessible to almost anyone — potential employers, landlords, dates, predators
  • The Times’s policy is not to censor history, because it’s history. The paper will update an arrest story if presented with evidence of an acquittal or dismissal, completing the story but not deleting the story.
  • Owen Tripp, a co-founder of Reputation.com, which has made a business out of helping clients manage their digital profile, advocated a “right to be forgotten” in a YouTube video. Tripp said everyone is entitled to a bit of space to grow up, to experiment, to make mistakes.
  • “This is not just a privacy problem,” said Viktor Mayer-Schönberger, a professor at the Oxford Internet Institute, and author of “Delete: The Virtue of Forgetting in the Digital Age.” “If we are continually reminded about people’s mistakes, we are not able to judge them for who they are in the present. We need some way to put a speed-brake on the omnipresence of the past.”
  • would like to see search engine companies — the parties that benefit the most financially from amassing our information — offer the kind of reputation-protecting tools that are now available only to those who can afford paid services like those of Reputation.com. Google, he points out, already takes down five million items a week because of claims that they violate copyrights. Why shouldn’t we expect Google to give users an option — and a simple process — to have news stories about them down-ranked or omitted from future search results? Good question. What’s so sacred about a search algorithm, anyway?
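The down-ranking the column asks about is mechanically simple, whatever one thinks of its merits. A minimal sketch, assuming a hypothetical ranked result list and a set of items a person has successfully petitioned to suppress (nothing here reflects how Google actually handles removal requests):

```python
# Toy sketch of "down-ranked or omitted from future search results":
# petitioned items are either dropped entirely or pushed to the end
# of the ranking. The results and petitions here are hypothetical.

def rerank(results, petitioned, omit=False):
    kept = [r for r in results if r not in petitioned]
    if omit:
        return kept
    demoted = [r for r in results if r in petitioned]
    return kept + demoted  # demoted items stay reachable, just last
```

For example, `rerank(["arrest-story", "profile", "award"], {"arrest-story"})` would return the arrest story last rather than deleting it, a midpoint between the Times's keep-but-update policy and full European-style erasure.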
Javier E

The searchers | ROUGH TYPE - 0 views

  • When we talk about “searching” these days, we’re almost always talking about using Google to find something online.
  • That’s quite a twist for a word that has long carried existential connotations, that has been bound up in our sense of what it means to be conscious and alive. We don’t just search for car keys or missing socks. We search for truth and meaning, for love, for transcendence, for peace, for ourselves. To be human is to be a searcher.
  • in its original conception, the Google search engine did transport us into a messy and confusing world—the world of the web—with the intent of helping us make some sense of it. It pushed us outward, away from ourselves. It was a means of exploration
  • ...3 more annotations...
  • In its highest form, a search has no well-defined object. It’s open-ended, an act of exploration that takes us out into the world, beyond the self, in order to know the world, and the self, more fully
  • Google’s goal is no longer to read the web. It’s to read us. 
  • In its new design, Google’s search engine doesn’t push us outward; it turns us inward. It gives us information that fits the behavior and needs and biases we have displayed in the past, as meticulously interpreted by Google’s algorithms. Because it reinforces the existing state of the self rather than challenging it, it subverts the act of searching. We find out little about anything, least of all ourselves, through self-absorption.
Javier E

The right has its own version of political correctness. It's just as stifling. - The Washington Post - 0 views

  • Political correctness has become a major bugaboo of the right in the past decade, a rallying cry against all that has gone wrong with liberalism and America. Conservative writers fill volumes complaining how political correctness stifles free expression and promotes bunk social theories about “power structures” based on patriarchy, race and mass victimhood. Forbes charged that it “stifles freedom of speech.” The Daily Caller has gone so far as to claim that political correctness “kills Americans.”
  • But conservatives have their own, nationalist version of PC, their own set of rules regulating speech, behavior and acceptable opinions. I call it “patriotic correctness.” It’s a full-throated, un-nuanced, uncompromising defense of American nationalism, history and cherry-picked ideals. Central to its thesis is the belief that nothing in America can’t be fixed by more patriotism enforced by public shaming, boycotts and policies to cut out foreign and non-American influences.
  • Blaming the liberal or mainstream media and “media bias” is the patriotically correct version of blaming the corporations or capitalism. The patriotically correct notion that they “would rather be governed by the first 2,000 people in the Boston telephone directory than by the 2,000 people on the faculty of Harvard University” because the former have “common sense” and the “intellectual elites” don’t know anything, despite all the evidence to the contrary, can be sustained only in a total bubble.
  • ...10 more annotations...
  • Complaining about political correctness is patriotically correct. The patriotically correct must use the non-word “illegals,” or “illegal immigrant” or “illegal alien” to describe foreigners who broke our immigration laws. Dissenters support “open borders” or “shamnesty” for 30 million illegal alien invaders. The punishment is deportation because “we’re a nation of laws” and they didn’t “get in line,” even though no such line actually exists. Just remember that they are never anti-immigration, only anti-illegal immigration, even when they want to cut legal immigration.
  • Black Lives Matter is racist because it implies that black lives are more important than other lives, but Blue Lives Matter doesn’t imply that cops’ lives are more important than the rest of ours. Banning Islam or Muslim immigration is a necessary security measure, but homosexuals should not be allowed to get married because it infringes on religious liberty. Transgender people could access women’s restrooms for perverted purposes, but Donald Trump walking in on nude underage girls in dressing rooms before a beauty pageant is just “media bias.”
  • Terrorism is an “existential threat,” even though the chance of being killed in a terrorist attack is about 1 in 3.2 million a year. Saying the words “radical Islam” when describing terrorism is an important incantation necessary to defeat that threat. When Chobani yogurt founder Hamdi Ulukaya decides to employ refugees in his factories, it’s because of his ties to “globalist corporate figures.” Waving a Mexican flag on U.S. soil means you hate America, but waving a Confederate flag just means you’re proud of your heritage.
  • Insufficient displays of patriotism among the patriotically correct can result in exclusion from public life and ruined careers. It also restricts honest criticism of failed public policies, diverting blame for things like the war in Iraq to those Americans who didn’t support the war effort enough.
  • Poor white Americans are the victims of economic dislocation and globalization beyond their control, while poor blacks and Hispanics are poor because of their failed cultures. The patriotically correct are triggered when they hear strangers speaking in a language other than English. Does that remind you of the PC duty to publicly shame those who use unacceptable language to describe race, gender or whatever other identity is the victim du jour?
  • The patriotically correct rightly ridicule PC “safe spaces” but promptly retreat to Breitbart or talk radio, where they can have mutually reinforcing homogeneous temper tantrums while complaining about the lack of intellectual diversity on the left.
  • There is no such thing as too much national security, but it’s liberals who want to coddle Americans with a “nanny state.”
  • Those who disagree with the patriotically correct are animated by anti-Americanism, are post-American, or deserve any other of a long list of clunky and vague labels that signal virtue to other members of the patriotic in-group.
  • Every group has implicit rules against certain opinions, actions and language as well as enforcement mechanisms — and the patriotically correct are no exception. But they are different because they are near-uniformly unaware of how they are hewing to a code of speech and conduct similar to the PC lefties they claim to oppose.
  • The modern form of political correctness on college campuses and the media is social tyranny with manners, while patriotic correctness is tyranny without the manners, and its adherents do not hesitate to use the law to advance their goals.
sissij

What Facebook Owes to Journalism - The New York Times - 0 views

  • declared that “a strong news industry is also critical to building an informed community.”
  • Unfortunately, his memo ignored two major points — the role that Facebook and other technology platforms are playing in inadvertently damaging local news media, and the one way they could actually save journalism: with a massive philanthropic commitment.
  • As advertising spending shifted from print, TV and radio to the internet, the money didn’t mostly go to digital news organizations. Increasingly, it goes to Facebook and Google.
  • ...2 more annotations...
  • But just because the result is unintentional doesn’t mean it is fantasy: Newsrooms have been decimated, with basic accountability reporting slashed as a result.
  • I’m not saying that the good stuff — the mobile revolution, blocking intrusive ads, better marketing options for small businesses — doesn’t outweigh the bad. And local news organizations absolutely contributed to the problem with their sluggish and often uncreative reaction to the digital revolution.
  • This article discusses the impact of the internet on local news organizations. I agree with the author that the internet takes a lot of ad money and leaves local news organizations with less funding. Although there are donations, they are still very little compared to what local news organizations used to have. This might be part of the reason why local news organizations don't do well at providing great information. But as time moves forward, Facebook and Google should take some of the responsibility, since they get more funding and resources. This article is very persuasive, as it has much data and evidence in support. I really like that the author acknowledges the counterargument in his article to make it more reliable. --Sissi (2/22/2017)
sissij

Google Training Ad Placement Computers to Be Offended - The New York Times - 0 views

  • But after seeing ads from Coca-Cola, Procter & Gamble and Wal-Mart appear next to racist, anti-Semitic or terrorist videos, its engineers realized their computer models had a blind spot: They did not understand context.
  • Now teaching computers to understand what humans can readily grasp may be the key to calming fears among big-spending advertisers that their ads have been appearing alongside videos from extremist groups and other offensive messages.
  • But the recent problems opened Google to criticism that it was not doing enough to look out for advertisers. It is a significant problem for a multibillion-dollar company that still gets most of its revenue through advertising.
  • The idea is for machines to eventually make the tough calls.
  •  
    I had never thought about the context in which ads appear. I thought ads just popped up randomly, and now I know there is actually code behind where each ad appears. Why is putting ads beside extremist videos a bad idea? I think it is probably because people would mistakenly assume the company sponsors the video. Actually, I am not very sure why it is a bad thing. However, ads can definitely be more efficient in the right context. Different people watch different kinds of videos, so targeting potential customers would benefit both the viewer and the company. --Sissi (4/3/2017)
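The "blind spot" the article describes can be illustrated with a toy brand-safety filter: before placing an ad, check a video's metadata against a blocklist of unsafe terms. The term list and function name below are illustrative assumptions, not Google's actual system, which the article says relies on learned models rather than keyword lists.

```python
# Toy brand-safety check: a keyword blocklist stands in for the learned
# models the article describes. A list like this cannot understand
# context, which is exactly the blind spot Google's engineers found.

UNSAFE_TERMS = {"extremist", "terror", "hate"}

def safe_to_place_ad(video_title: str, video_tags: list[str]) -> bool:
    """Return False if the video's metadata matches any unsafe term."""
    text = " ".join([video_title.lower()] + [t.lower() for t in video_tags])
    return not any(term in text for term in UNSAFE_TERMS)
```

Note the limitation the article points at: a news report *about* extremism would be blocked by this filter, while coded or euphemistic hate content would pass, which is why teaching models to grasp context matters.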
Javier E

Anti-vaccine activists, 9/11 deniers, and Google's social search. - Slate Magazine - 1 views

  • democratization of information-gathering—when accompanied by smart institutional and technological arrangements—has been tremendously useful, giving us Wikipedia and Twitter. But it has also spawned thousands of sites that undermine scientific consensus, overturn well-established facts, and promote conspiracy theories
  • Meanwhile, the move toward social search may further insulate regular visitors to such sites; discovering even more links found by their equally paranoid friends will hardly enlighten them.
  • Initially, the Internet helped them find and recruit like-minded individuals and promote events and petitions favorable to their causes. However, as so much of our public life has shifted online, they have branched out into manipulating search engines, editing Wikipedia entries, harassing scientists who oppose whatever pet theory they happen to believe in, and amassing digitized scraps of "evidence" that they proudly present to potential recruits.
  • The Vaccine article contains a number of important insights. First, the anti-vaccination cohort likes to move the goal posts: As scientists debunked the link between autism and mercury (once present in some childhood inoculations but now found mainly in certain flu vaccines), most activists dropped their mercury theory and point instead to aluminum or said that kids received “too many too soon.”
  • Second, it isn't clear whether scientists can "discredit" the movement's false claims at all: Its members are skeptical of what scientists have to say—not least because they suspect hidden connections between academia and pharmaceutical companies that manufacture the vaccines.
  • mere exposure to the current state of the scientific consensus will not sway hard-core opponents of vaccination. They are too vested in upholding their contrarian theories; some have consulting and speaking gigs to lose while others simply enjoy a sense of belonging to a community, no matter how kooky
  • attempts to influence communities that embrace pseudoscience or conspiracy theories by having independent experts or, worse, government workers join them—the much-debated antidote of “cognitive infiltration” proposed by Cass Sunstein (who now heads the Office of Information and Regulatory Affairs in the White House)—w
  • perhaps, it's time to accept that many of these communities aren't going to lose core members regardless of how much science or evidence is poured on them. Instead, resources should go into thwarting their growth by targeting their potential—rather than existent—members.
  • Given that censorship of search engines is not an appealing or even particularly viable option, what can be done to ensure that users are made aware that all the pseudoscientific advice they are likely to encounter may not be backed by science?
  • One is to train our browsers to flag information that may be suspicious or disputed. Thus, every time a claim like "vaccination leads to autism" appears in our browser, that sentence would be flagged.
  • The second—and not necessarily mutually exclusive—option is to nudge search engines to take more responsibility for their index and exercise a heavier curatorial control in presenting search results for issues like "global warming" or "vaccination." Google already has a list of search queries that send most traffic to sites that trade in pseudoscience and conspiracy theories; why not treat them differently than normal queries? Thus, whenever users are presented with search results that are likely to send them to sites run by pseudoscientists or conspiracy theorists, Google may simply display a huge red banner asking users to exercise caution and check a previously generated list of authoritative resources before making up their minds.
  • In more than a dozen countries Google already does something similar for users who are searching for terms like "ways to die" or "suicidal thoughts" by placing a prominent red note urging them to call the National Suicide Prevention Hotline.
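The "red banner" proposal in the highlights above can be sketched as a simple front-end check: match each query against a curated watchlist of topics known to surface pseudoscience, and prepend a caution notice pointing to authoritative resources. The watchlist entries and resource links here are illustrative placeholders, not any list Google actually uses.

```python
# Toy sketch of the proposed caution banner: queries matching a
# hand-curated watchlist get a warning prepended to their results,
# mirroring what Google already does for suicide-related searches.

WATCHLIST = {
    "vaccination": "https://www.who.int/health-topics/vaccines-and-immunization",
    "global warming": "https://climate.nasa.gov/",
}

def annotate_results(query: str, results: list[str]) -> list[str]:
    """Prepend a caution banner when the query touches a watched topic."""
    q = query.lower()
    for topic, resource in WATCHLIST.items():
        if topic in q:
            banner = (f"CAUTION: results for '{topic}' often include "
                      f"disputed claims. Consider checking {resource} first.")
            return [banner] + results
    return results
```

The design mirrors the suicide-hotline precedent the article cites: the search index itself is untouched; only the presentation layer adds the warning.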
catbclark

When Google Met Wikileaks: A New Philosophy For Our Times | The Cryptosphere - 0 views

  • Socrates went around Athens talking with people and making them think about their life, their values and their actions.
  • Even though the talk sometimes gets technical, it never gets so much so that you can’t follow; in any case, the Google people are more interested in Assange’s motivation than his technical abilities.