
Instructional & Media Services at Dickinson College / Group items tagged: google


Ed Webb

Google and Meta moved cautiously on AI. Then came OpenAI's ChatGPT. - The Washington Post - 0 views

  • The surge of attention around ChatGPT is prompting pressure inside tech giants including Meta and Google to move faster, potentially sweeping safety concerns aside
  • Tech giants have been skittish since public debacles like Microsoft’s Tay, which it took down in less than a day in 2016 after trolls prompted the bot to call for a race war, suggest Hitler was right and tweet “Jews did 9/11.”
  • Some AI ethicists fear that Big Tech’s rush to market could expose billions of people to potential harms — such as sharing inaccurate information, generating fake photos or giving students the ability to cheat on school tests — before trust and safety experts have been able to study the risks. Others in the field share OpenAI’s philosophy that releasing the tools to the public, often nominally in a “beta” phase after mitigating some predictable risks, is the only way to assess real world harms.
  • Silicon Valley’s sudden willingness to consider taking more reputational risk arrives as tech stocks are tumbling
  • A chatbot that pointed to one answer directly from Google could increase its liability if the response was found to be harmful or plagiarized.
  • AI has been through several hype cycles over the past decade, but the furor over DALL-E and ChatGPT has reached new heights.
  • Soon after OpenAI released ChatGPT, tech influencers on Twitter began to predict that generative AI would spell the demise of Google search. ChatGPT delivered simple answers in an accessible way and didn’t ask users to rifle through blue links. Besides, after a quarter of a century, Google’s search interface had grown bloated with ads and marketers trying to game the system.
  • Inside big tech companies, the system of checks and balances for vetting the ethical implications of cutting-edge AI isn’t as established as privacy or data security. Typically teams of AI researchers and engineers publish papers on their findings, incorporate their technology into the company’s existing infrastructure or develop new products, a process that can sometimes clash with other teams working on responsible AI over pressure to see innovation reach the public sooner.
  • Chatbots like OpenAI's ChatGPT routinely make factual errors and often switch their answers depending on how a question is asked
  • To Timnit Gebru, executive director of the nonprofit Distributed AI Research Institute, the prospect of Google sidelining its responsible AI team doesn’t necessarily signal a shift in power or safety concerns, because those warning of the potential harms were never empowered to begin with. “If we were lucky, we’d get invited to a meeting,” said Gebru, who helped lead Google’s Ethical AI team until she was fired for a paper criticizing large language models.
  • Rumman Chowdhury, who led Twitter’s machine-learning ethics team until Elon Musk disbanded it in November, said she expects companies like Google to increasingly sideline internal critics and ethicists as they scramble to catch up with OpenAI. “We thought it was going to be China pushing the U.S., but looks like it’s start-ups,” she said.
Ryan Burke

The Ocean Floor in Google Earth - 0 views

  • The newest version of Google Earth allows you to explore the ocean
Ed Webb

Google pushes journalists to create G+ profiles · kegill · Storify - 0 views

  • linking search results with Google+ was like Microsoft bundling Internet Explorer with Windows
  • Market strength in one place being used to leverage suboptimal products in another.
  • It's time to tell both Google and Bing that we want to decide for ourselves, thank you very much, if content is credible, instead of their making those decisions for us, decisions made behind hidden -- and suspicious -- algorithms.
Ed Webb

Official Google Blog: More books in more places: public domain EPUB downloads on Google... - 0 views

  • Starting today, you'll be able to download these and over one million public domain books from Google Books in an additional format. We're excited to now offer downloads in EPUB format, a free and open industry standard for electronic books. It's supported by a wide variety of applications, so once you download a book, you'll be able to read it on any device or through any reading application that supports the format. That means that people will be able to access public domain works that we've digitized from libraries around the world in more ways, including some that haven't even been built or imagined yet.
Ed Webb

Professors Find Ways to Keep Heads Above 'Exaflood' of Data - Wired Campus - The Chroni... - 0 views

  • Google, a major source of information overload, can also help manage it, according to Google's chief economist. Hal Varian, who was a professor at the University of California at Berkeley before going to work for the search-engine giant, showed off an analytic tool called Google Insights for Search.
  • accurately tagging data and archiving it
Ed Webb

Oxford University Press launches the Anti-Google - 0 views

  • The Anti-Google: Oxford Bibliographies Online (OBO)
  • essentially a straightforward, hyperlinked collection of professionally-produced, peer-reviewed bibliographies in different subject areas—sort of a giant, interactive syllabus put together by OUP and teams of scholars in different disciplines
  • "You can't come up with a search filter that solves the problem of information overload," Zucca told Ars. OUP is betting that the solution to the problem lies in content, which is its area of expertise, and not in technology, which is Google's and Microsoft's.
  • at least users can see exactly how the sausage is made. Contrast this to Google or Bing, where the search algorithm that produces results is a closely guarded secret.
  • The word that Zucca used a number of times in our chat was "authority," and OUP is betting that individual and institutional users will value the authority enough that they'll be willing to pay for access to the service
  • This paywall is the only feature of OBO that seems truly unfortunate, given that the competition (search and Wikipedia) is free. High school kids and motivated amateurs will be left slumming it with whatever they can get from the public Internet, and OBO's potential reach and impact will be severely limited
Ed Webb

Google's App Inventor now for mere mortals, available to all with Google account | BGR ... - 0 views

  • roll-your-own droid apps
Ed Webb

Official Google Blog: Introducing Google Building Maker - 1 views

  • Coolness.
Ed Webb

Lacktribution: Be Like Everyone Else - CogDogBlog - 0 views

  • What exactly are the issues about attributing? Why is it good to not have to attribute? Is it a severe challenge to attribute? Does it hurt? Does it call for technical or academic skills beyond reach? Does it consume great amounts of time, resources? Why, among professional designers and technologists is it such a good thing to be free of this odious chore? I can translate this typical reason to use public domain content, “I prefer to be lazy.”
  • There is a larger implication when you reuse content and choose not to attribute. Out in the flow of all other information, it more or less says to readers, “all images are free to pilfer. Just google and take them all. Be like me.”
  • It’s not about the rules of the license, it’s about maybe, maybe, operating in this mechanized place as a human, rather than a copy cat.
  • Google search results give more weight to pxhere.com, where the image has a mighty 4 views (some of which are me), over the original image, with almost 5000 views.
  • What kind of algorithm is that? It’s one that does not favor the individual. Image search results will favor sites like Needpix, Pixsels, Pixnio, Peakpx, Nicepic, and they still favor the really slimy maxpixel, which is a direct rip-off of pixabay.
  • did you know that the liberating world of “use any photo you want w/o the hassle of attribution” is such a bucket of questionable slime? And that Google, with all of their algorithmic prowess, gives more favorable results to sites that lift photos than to the ones where the originals exist?
  • So yes, just reuse photos without taking all of the severe effort to give credit to the source, because “you don’t have to.” Be a copycat. Show your flag of Lacktribution. Like everyone else. I will not. I adhere to Thanktribution.
Ed Webb

ChatGPT Is Nothing Like a Human, Says Linguist Emily Bender - 0 views

  • Please do not conflate word form and meaning. Mind your own credulity.
  • We’ve learned to make “machines that can mindlessly generate text,” Bender told me when we met this winter. “But we haven’t learned how to stop imagining the mind behind it.”
  • A handful of companies control what PricewaterhouseCoopers called a “$15.7 trillion game changer of an industry.” Those companies employ or finance the work of a huge chunk of the academics who understand how to make LLMs. This leaves few people with the expertise and authority to say, “Wait, why are these companies blurring the distinction between what is human and what’s a language model? Is this what we want?”
  • “We call on the field to recognize that applications that aim to believably mimic humans bring risk of extreme harms,” she co-wrote in 2021. “Work on synthetic human behavior is a bright line in ethical Al development, where downstream effects need to be understood and modeled in order to block foreseeable harm to society and different social groups.”
  • chatbots that we easily confuse with humans are not just cute or unnerving. They sit on a bright line. Obscuring that line and blurring — bullshitting — what’s human and what’s not has the power to unravel society
  • She began learning from, then amplifying, Black women’s voices critiquing AI, including those of Joy Buolamwini (she founded the Algorithmic Justice League while at MIT) and Meredith Broussard (the author of Artificial Unintelligence: How Computers Misunderstand the World). She also started publicly challenging the term artificial intelligence, a sure way, as a middle-aged woman in a male field, to get yourself branded as a scold. The idea of intelligence has a white-supremacist history. And besides, “intelligent” according to what definition? The three-stratum definition? Howard Gardner’s theory of multiple intelligences? The Stanford-Binet Intelligence Scale? Bender remains particularly fond of an alternative name for AI proposed by a former member of the Italian Parliament: “Systematic Approaches to Learning Algorithms and Machine Inferences.” Then people would be out here asking, “Is this SALAMI intelligent? Can this SALAMI write a novel? Does this SALAMI deserve human rights?”
  • Tech-makers assuming their reality accurately represents the world create many different kinds of problems. The training data for ChatGPT is believed to include most or all of Wikipedia, pages linked from Reddit, a billion words grabbed off the internet. (It can’t include, say, e-book copies of everything in the Stanford library, as books are protected by copyright law.) The humans who wrote all those words online overrepresent white people. They overrepresent men. They overrepresent wealth. What’s more, we all know what’s out there on the internet: vast swamps of racism, sexism, homophobia, Islamophobia, neo-Nazism.
  • One fired Google employee told me succeeding in tech depends on “keeping your mouth shut to everything that’s disturbing.” Otherwise, you’re a problem. “Almost every senior woman in computer science has that rep. Now when I hear, ‘Oh, she’s a problem,’ I’m like, Oh, so you’re saying she’s a senior woman?”
  • “We haven’t learned to stop imagining the mind behind it.”
  • In March 2021, Bender published “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” with three co-authors. After the paper came out, two of the co-authors, both women, lost their jobs as co-leads of Google’s Ethical AI team.
  • “On the Dangers of Stochastic Parrots” is not a write-up of original research. It’s a synthesis of LLM critiques that Bender and others have made: of the biases encoded in the models; the near impossibility of studying what’s in the training data, given the fact they can contain billions of words; the costs to the climate; the problems with building technology that freezes language in time and thus locks in the problems of the past. Google initially approved the paper, a requirement for publications by staff. Then it rescinded approval and told the Google co-authors to take their names off it. Several did, but Google AI ethicist Timnit Gebru refused. Her colleague (and Bender’s former student) Margaret Mitchell changed her name on the paper to Shmargaret Shmitchell, a move intended, she said, to “index an event and a group of authors who got erased.” Gebru lost her job in December 2020, Mitchell in February 2021. Both women believe this was retaliation and brought their stories to the press. The stochastic-parrot paper went viral, at least by academic standards. The phrase stochastic parrot entered the tech lexicon.
  • Tech execs loved it. Programmers related to it. OpenAI CEO Sam Altman was in many ways the perfect audience: a self-identified hyperrationalist so acculturated to the tech bubble that he seemed to have lost perspective on the world beyond. “I think the nuclear mutually assured destruction rollout was bad for a bunch of reasons,” he said on AngelList Confidential in November. He’s also a believer in the so-called singularity, the tech fantasy that, at some point soon, the distinction between human and machine will collapse. “We are a few years in,” Altman wrote of the cyborg merge in 2017. “It’s probably going to happen sooner than most people think. Hardware is improving at an exponential rate … and the number of smart people working on AI is increasing exponentially as well. Double exponential functions get away from you fast.” On December 4, four days after ChatGPT was released, Altman tweeted, “i am a stochastic parrot, and so r u.”
  • “This is one of the moves that turn up ridiculously frequently. People saying, ‘Well, people are just stochastic parrots,’” she said. “People want to believe so badly that these language models are actually intelligent that they’re willing to take themselves as a point of reference and devalue that to match what the language model can do.”
  • The membrane between academia and industry is permeable almost everywhere; the membrane is practically nonexistent at Stanford, a school so entangled with tech that it can be hard to tell where the university ends and the businesses begin.
  • “No wonder that men who live day in and day out with machines to which they believe themselves to have become slaves begin to believe that men are machines.”
  • what’s tenure for, after all?
  • LLMs are tools made by specific people — people who stand to accumulate huge amounts of money and power, people enamored with the idea of the singularity. The project threatens to blow up what is human in a species sense. But it’s not about humility. It’s not about all of us. It’s not about becoming a humble creation among the world’s others. It’s about some of us — let’s be honest — becoming a superspecies. This is the darkness that awaits when we lose a firm boundary around the idea that humans, all of us, are equally worthy as is.
  • The AI dream is “governed by the perfectibility thesis, and that’s where we see a fascist form of the human.”
  • “Why are you trying to trick people into thinking that it really feels sad that you lost your phone?”
Ed Webb

The End of Cyberspace: Google's cloudy Web clipboard - 1 views

  • shifts in metaphors matter
  • Today I noticed that Google Docs doesn't have a clipboard; instead, it has a "Web clipboard."
  • Notice that the Web clipboard isn't a conventional clipboard icon, but a clipboard with a cloud in front of it.
  • Google's icon designers are assuming that people are familiar enough with the cloud = Web equation to make its use uncontroversial. Another step away from cyberspace as place.