
Home/ Instructional & Media Services at Dickinson College/ Group items tagged privacy


Ed Webb

Clear backpacks, monitored emails: life for US students under constant surveillance | E... - 0 views

  • This level of surveillance is “not too over-the-top”, Ingrid said, and she feels her classmates are generally “accepting” of it.
  • One leading student privacy expert estimated that as many as a third of America’s roughly 15,000 school districts may already be using technology that monitors students’ emails and documents for phrases that might flag suicidal thoughts, plans for a school shooting, or a range of other offenses.
  • Some parents said they were alarmed and frightened by schools’ new monitoring technologies. Others said they were conflicted, seeing some benefits to schools watching over what kids are doing online, but uncertain if their schools were striking the right balance with privacy concerns. Many said they were not even sure what kind of surveillance technology their schools might be using, and that the permission slips they had signed when their kids brought home school devices had told them almost nothing.
  • When Dapier talks with other teen librarians about the issue of school surveillance, “we’re very alarmed,” he said. “It sort of trains the next generation that [surveillance] is normal, that it’s not an issue. What is the next generation’s Mark Zuckerberg going to think is normal?”
  • “It’s the school as panopticon, and the sweeping searchlight beams into homes, now, and to me, that’s just disastrous to intellectual risk-taking and creativity.”
  • “They’re so unclear that I’ve just decided to cut off the research completely, to not do any of it.”
  • “They are all mandatory, and the accounts have been created before we’ve even been consulted,” he said. Parents are given almost no information about how their children’s data is being used, or the business models of the companies involved. Any time his kids complete school work through a digital platform, they are generating huge amounts of very personal, and potentially very valuable, data. The platforms know what time his kids do their homework, and whether it’s done early or at the last minute. They know what kinds of mistakes his kids make on math problems.
  • Felix, now 12, said he is frustrated that the school “doesn’t really [educate] students on what is OK and what is not OK. They don’t make it clear when they are tracking you, or not, or what platforms they track you on. “They don’t really give you a list of things not to do,” he said. “Once you’re in trouble, they act like you knew.”
  • As of 2018, at least 60 American school districts had also spent more than $1m on separate monitoring technology to track what their students were saying on public social media accounts, an amount that spiked sharply in the wake of the 2018 Parkland school shooting, according to the Brennan Center for Justice, a progressive advocacy group that compiled and analyzed school contracts with a subset of surveillance companies.
  • Many parents also said that they wanted more transparency and more parental control over surveillance. A few years ago, Ben, a tech professional from Maryland, got a call from his son’s principal to set up an urgent meeting. His son, then about nine or 10 years old, had opened up a school Google document and typed “I want to kill myself.” It was not until he and his son were in a serious meeting with school officials that Ben found out what happened: his son had typed the words on purpose, curious about what would happen. “The smile on his face gave away that he was testing boundaries, and not considering harming himself,” Ben said. (He asked that his last name and his son’s school district not be published, to preserve his son’s privacy.) The incident was resolved easily, he said, in part because Ben’s family already had close relationships with the school administrators.
  • there is still no independent evaluation of whether this kind of surveillance technology actually works to reduce violence and suicide.
  • Certain groups of students could easily be targeted by the monitoring more intensely than others, she said. Would Muslim students face additional surveillance? What about black students? Her daughter, who is 11, loves hip-hop music. “Maybe some of that language could be misconstrued, by the wrong ears or the wrong eyes, as potentially violent or threatening,” she said.
  • The Parent Coalition for Student Privacy was founded in 2014, in the wake of parental outrage over the attempt to create a standardized national database that would track hundreds of data points about public school students, from their names and social security numbers to their attendance, academic performance, and disciplinary and behavior records, and share the data with education tech companies. The effort, which had been funded by the Gates Foundation, collapsed in 2014 after fierce opposition from parents and privacy activists.
  • “More and more parents are organizing against the onslaught of ed tech and the loss of privacy that it entails. But at the same time, there’s so much money and power and political influence behind these groups,”
  • some privacy experts – and students – said they are concerned that surveillance at school might actually be undermining students’ wellbeing
  • “I do think the constant screen surveillance has affected our anxiety levels and our levels of depression.” “It’s over-guarding kids,” she said. “You need to let them make mistakes, you know? That’s kind of how we learn.”
Ed Webb

Social Media is Killing the LMS Star - A Bootleg of Bryan Alexander's Lost Presentation... - 0 views

  • Note that this isn’t just a technological alternate history. It also describes a different set of social and cultural practices.
  • CMSes lumber along like radio, still playing into the air as they continue to gradually shift ever farther away on the margins. In comparison, Web 2.0 is like movies and TV combined, plus printed books and magazines. That’s where the sheer scale, creative ferment, and wide-ranging influence reside. This is the necessary background for discussing how to integrate learning and the digital world.
  • These virtual classes are like musical practice rooms, small chambers where one may try out the instrument in silent isolation. It is not connectivism but disconnectivism.
  • CMSes shift from being merely retrograde to being actively regressive if we consider the broader, subtler changes in the digital teaching landscape. Web 2.0 has rapidly grown an enormous amount of content through what Yochai Benkler calls “peer-based commons production.” One effect of this has been to grow a large area for informal learning, which students (and staff) access without our benign interference. Students (and staff) also contribute to this peering world; more on this later. For now, we can observe that as teachers we grapple with this mechanism of change through many means, but the CMS in its silo’d isolation is not a useful tool.
  • those curious about teaching with social media have easy access to a growing, accessible community of experienced staff by means of those very media. A meta-community of Web 2.0 academic practitioners is now too vast to catalogue. Academics in every discipline blog about their work. Wikis record their efforts and thoughts, as do podcasts. The reverse is true of the CMS, the very architecture of which forbids such peer-to-peer information sharing. For example, the Resource Center for Cyberculture Studies (RCCS) has for many years maintained a descriptive listing of courses about digital culture across the disciplines. During the 1990s that number grew with each semester. But after the explosive growth of CMSes that number dwindled. Not the number of classes taught, but the number of classes which could even be described. According to the RCCS’ founder, David Silver (University of San Francisco), this is due to the isolation of class content in CMS containers.
  • unless we consider the CMS environment to be a sort of corporate intranet simulation, the CMS set of community skills is unusual, rarely applicable to post-graduation examples. In other words, while a CMS might help privacy concerns, it is at best a partial, not sufficient solution, and can even be inappropriate for already online students.
  • That experiential, teachable moment of selecting one’s copyright stance is eliminated by the CMS.
  • Another argument in favor of CMSes over Web 2.0 concerns the latter’s open nature. It is too open, goes the thought, constituting a “Wild West” experience of unfettered information flow and unpleasant forms of access. Campuses should run CMSes to create shielded environments, iPhone-style walled gardens that protect the learning process from the Lovecraftian chaos without.
  • social sifting, information literacy, using the wisdom of crowds, and others. Such strategies are widely discussed, easily accessed, and continually revised and honed.
  • at present, radio CMS is the Clear Channel of online learning.
  • For now, the CMS landscape is a multi-institutional dark Web, an invisible, unsearchable, un-mash-up-able archipelago of hidden learning content.
  • Can the practice of using a CMS prepare either teacher or student to think critically about this new shape for information literacy? Moreover, can we use the traditional CMS to share thoughts and practices about this topic?
  • The internet of things refers to a vastly more challenging concept, the association of digital information with the physical world. It covers such diverse instances as RFID chips attached to books or shipping pallets, connecting a product’s scanned UPC code to a Web-based database, assigning unique digital identifiers to physical locations, and the broader enterprise of augmented reality. It includes problems as varied as building search that covers both the World Wide Web and one’s mobile device, revising copyright to include digital content associated with private locations, and trying to salvage what’s left of privacy. How does this connect with our topic? Consider a recent article by Tim O’Reilly and John Battle, where they argue that the internet of things is actually growing knowledge about itself. The combination of people, networks, and objects is building descriptions about objects, largely in folksonomic form. That is, people are tagging the world, and sharing those tags. It’s worth quoting a passage in full: “It’s also possible to give structure to what appears to be unstructured data by teaching an application how to recognize the connection between the two. For example, You R Here, an iPhone app, neatly combines these two approaches. You use your iPhone camera to take a photo of a map that contains details not found on generic mapping applications such as Google maps – say a trailhead map in a park, or another hiking map. Use the phone’s GPS to set your current location on the map. Walk a distance away, and set a second point. Now your iPhone can track your position on that custom map image as easily as it can on Google maps.” (http://www.web2summit.com/web2009/public/schedule/detail/10194) What world is better placed to connect academia productively with such projects, the open social Web or the CMS?
  • imagine the CMS function of every class much like class email, a necessary feature, but not by any means the broadest technological element. Similarly the e-reserves function is of immense practical value. There may be no better way to share copyrighted academic materials with a class, at this point. These logistical functions could well play on.
Ed Webb

It's just not working out the way we thought it would « Lisa's (Online) Teach... - 0 views

  • Gradually, closed spaces (Facebook, Ning, even Google if you understand what they’re up to) have become the norm, as have monetized sites. The spaces that were free are no longer free, although many of us freely contributed our own work to these sites, providing the basis of their popularity in the first place. Crowdsourcing, celebrated in story and song, has become the exploitation of the work of others in order to make money or provide cheap customer service. The use of personal information for marketing purposes is widespread, and creative people are leaving the platforms that brought everyone into the agora in the first place. Scholars at first enthusiastic about the future now see it as a lonely place. And I see conversations where people who care deeply about the web, education for the 21st century, and learning theories are beginning to back away from proselytizing about academic openness.
  • it’s about users becoming the products in the marketplace and the amusements in the panopticon
  • Where before it might have made sense to say we should make sure everyone is web literate, now such literacy extends beyond critical thinking about websites into a deeper understanding of what the using the web means for individual privacy and independence. This time, the enemies of openness and freedom won’t need to argue their philosophical reasons – they’ll argue that they’re protecting people. And the trouble is, they may be right.
  • We need to be the antidote for blind adoption
Ed Webb

Google - 0 views

Ed Webb

Social media in the public sector - common sense | A dragon's best friend - 2 views

  • Sound advice for us all, really (minus the specifics about the Dept of Justice).
Ed Webb

Open-Xchange Tries To Liberate Your Contact List - Bits Blog - NYTimes.com - 0 views

  • the idea that separating more personal services like Facebook from business-oriented services like LinkedIn makes little sense in the Internet age.
  • All you have to do is enter your LinkedIn login information
  • “The revolution is that, all of a sudden, the Internet can be a network of intelligent agents, doing work for their users, rather than a place where big commercial interests aim to gather as many users on their platform as possible,”
Ed Webb

Please do a bad job of putting your courses online - Rebecca Barrett-Fox - 0 views

  • Please do a bad job of putting your courses online
  • For my colleagues who are now being instructed to put some or all of the remainder of their semester online, now is a time to do a poor job of it. You are NOT building an online class. You are NOT teaching students who can be expected to be ready to learn online. And, most importantly, your class is NOT the highest priority of their OR your life right now. Release yourself from high expectations right now, because that’s the best way to help your students learn.
  • Remember the following as you move online: Your students know less about technology than you think. Many of them know less than you. Yes, even if they are digital natives and younger than you. They will be accessing the internet on their phones. They have limited data. They need to reserve it for things more important than online lectures. Students who did not sign up for an online course have no obligation to have a computer, high speed wifi, a printer/scanner, or a camera. Do not even survey them to ask if they have it. Even if they do, they are not required to tell you this. And if they do now, that doesn’t mean that they will when something breaks and they can’t afford to fix it because they just lost their job at the ski resort or off-campus bookstore. Students will be sharing their technology with other household members. They may have LESS time to do their schoolwork, not more.
  • Social isolation contributes to mental health problems. Social isolation contributes to domestic violence.
  • Do not require synchronous work. Students should not need to show up at a specific time for anything. REFUSE to do any synchronous work.
  • Do not record lectures unless you need to. (This is fundamentally different from designing an online course, where recorded information is, I think, really important.) They will be a low priority for students, and they take up a lot of resources on your end and on theirs. You have already built a rapport with them, and they don’t need to hear your voice to remember that.
  • Do record lectures if you need to. When information cannot be learned otherwise, include a lecture. Your university already has some kind of tech to record lectures. DO NOT simply record in PowerPoint as the audio quality is low. While many people recommend lectures of only 5 minutes, I find that my students really do listen to longer lectures. Still, remember that your students will be frequently interrupted in their listening, so a good rule is 1 concept per lecture. So, rather than a lecture on ALL of, say, gender inequality in your Intro to Soc course, deliver 5 minutes on pay inequity (or 15 minutes or 20 minutes, if that’s what you need) and then a separate lecture on #MeToo and yet another on domestic violence. Closed caption them using the video recording software your university provides. Note that YouTube also generates closed captions [edited to add: they are not ADA compliant, though]. If you don’t have to include images, skip the video recording and do a podcast instead.
  • Editing is a waste of your time right now.
  • Listen for them asking for help. They may be anxious. They may be tired. Many students are returning to their parents’ home where they may not be welcome. Others will be at home with partners who are violent. School has been a safe place for them, and now it’s not available to them. Your class may matter to them a lot when they are able to focus on it, but it may not matter much now, in contrast to all the other things they have to deal with. Don’t let that hurt your feelings, and don’t hold it against them in future semesters or when they come back to ask for a letter of recommendation.
  • Allow every exam or quiz to be taken at least twice, and tell students that this means that if there is a tech problem on the first attempt, the second attempt is their chance to correct it. This will save you from the work of resetting tests or quizzes when the internet fails or some other tech problem happens. And since it can be very hard to discern when such failures are really failures or students trying to win a second attempt at a quiz or test, you avoid having to deal with cheaters.
  • Do NOT require students to use online proctoring or force them to have themselves recorded during exams or quizzes. This is a fundamental violation of their privacy, and they did NOT sign up for that when they enrolled in your course.
  • Circumvent the need for proctoring by making every exam open-notes, open-book, and open-internet. The best way to avoid them taking tests together or sharing answers is to use a large test bank.
  • Remind them of due dates. It might feel like handholding, but be honest: Don’t you appreciate the text reminder from your dentist that you have an appointment tomorrow? Your LMS has an announcement system that allows you to write an announcement now and post it later.
  • Make everything self-grading if you can (yes, multiple choice and T/F on quizzes and tests) or low-stakes (completed/not completed).
  • Don’t do too much. Right now, your students don’t need it. They need time to do the other things they need to do.
  • Make all work due on the same day and time for the rest of the semester. I recommend Sunday night at 11:59 pm.
  • This advice is very different from that which I would share if you were designing an online course. I hope it’s helpful, and for those of you moving your courses online, I hope it helps you understand the labor that is required in building an online course a bit better.
Ed Webb

Google and Meta moved cautiously on AI. Then came OpenAI's ChatGPT. - The Washington Post - 0 views

  • The surge of attention around ChatGPT is prompting pressure inside tech giants including Meta and Google to move faster, potentially sweeping safety concerns aside
  • Tech giants have been skittish since public debacles like Microsoft’s Tay, which it took down in less than a day in 2016 after trolls prompted the bot to call for a race war, suggest Hitler was right and tweet “Jews did 9/11.”
  • Some AI ethicists fear that Big Tech’s rush to market could expose billions of people to potential harms — such as sharing inaccurate information, generating fake photos or giving students the ability to cheat on school tests — before trust and safety experts have been able to study the risks. Others in the field share OpenAI’s philosophy that releasing the tools to the public, often nominally in a “beta” phase after mitigating some predictable risks, is the only way to assess real world harms.
  • Silicon Valley’s sudden willingness to consider taking more reputational risk arrives as tech stocks are tumbling
  • A chatbot that pointed to one answer directly from Google could increase its liability if the response was found to be harmful or plagiarized.
  • AI has been through several hype cycles over the past decade, but the furor over DALL-E and ChatGPT has reached new heights.
  • Soon after OpenAI released ChatGPT, tech influencers on Twitter began to predict that generative AI would spell the demise of Google search. ChatGPT delivered simple answers in an accessible way and didn’t ask users to rifle through blue links. Besides, after a quarter of a century, Google’s search interface had grown bloated with ads and marketers trying to game the system.
  • Inside big tech companies, the system of checks and balances for vetting the ethical implications of cutting-edge AI isn’t as established as privacy or data security. Typically teams of AI researchers and engineers publish papers on their findings, incorporate their technology into the company’s existing infrastructure or develop new products, a process that can sometimes clash with other teams working on responsible AI over pressure to see innovation reach the public sooner.
  • Chatbots like OpenAI’s ChatGPT routinely make factual errors and often switch their answers depending on how a question is asked
  • To Timnit Gebru, executive director of the nonprofit Distributed AI Research Institute, the prospect of Google sidelining its responsible AI team doesn’t necessarily signal a shift in power or safety concerns, because those warning of the potential harms were never empowered to begin with. “If we were lucky, we’d get invited to a meeting,” said Gebru, who helped lead Google’s Ethical AI team until she was fired for a paper criticizing large language models.
  • Rumman Chowdhury, who led Twitter’s machine-learning ethics team until Elon Musk disbanded it in November, said she expects companies like Google to increasingly sideline internal critics and ethicists as they scramble to catch up with OpenAI. “We thought it was going to be China pushing the U.S., but looks like it’s start-ups,” she said.
Ed Webb

Google Researchers' Attack Prompts ChatGPT to Reveal Its Training Data - 0 views

  • researchers showed that there are large amounts of personally identifiable information (PII) in OpenAI’s large language models. They also showed that, on a public version of ChatGPT, the chatbot spit out large passages of text scraped verbatim from other places on the internet
  • ChatGPT’s “alignment techniques do not eliminate memorization,” meaning that it sometimes spits out training data verbatim. This included PII, entire poems, “cryptographically-random identifiers” like Bitcoin addresses, passages from copyrighted scientific research papers, website addresses, and much more.
  • The researchers wrote that they spent $200 to create “over 10,000 unique examples” of training data, which they say is a total of “several megabytes” of training data. The researchers suggest that using this attack, with enough money, they could have extracted gigabytes of training data. The entirety of OpenAI’s training data is unknown, but GPT-3 was trained on anywhere from many hundreds of GB to a few dozen terabytes of text data.
  • the world’s most important and most valuable AI company has been built on the backs of the collective work of humanity, often without permission, and without compensation to those who created it