
Home/ Instructional & Media Services at Dickinson College/ Group items tagged cheating


Ed Webb

The Wired Campus - Do Students Cheat More in Online Classes? Maybe not. - The Chronicle...

  • You can’t make any sweeping generalizations based on the results
  • older students tend to cheat less frequently than younger students
  • If you are interested in this topic, look for the edited book Student Plagiarism in an Online World (http://www.igi-global.com/reference/details.asp?ID=7031&v=tableOfContents). I wrote a chapter called "Expect Originality! Using Taxonomies to Structure Assignments that Support Original Work." In it I discuss the complexities of plagiarism in the context of a digital culture of sharing and suggest that it is rarely black and white. I propose a continuum with intentional academic dishonesty on one end and original work on the other, with gradations in between. Based on my own research and teaching experience, I believe the instructional design and style of teaching can make it either easy or very difficult to cheat.
Ed Webb

High-Tech Cheating on Homework Abounds, and Professors Are Partly to Blame - Technology...

  • "I call it 'technological detachment phenomenon,'" he told me recently. "As long as there's some technology between me and the action, then I'm not culpable for the action." By that logic, if someone else posted homework solutions online, what's wrong with downloading them?
  • "The feeling about homework is that it's really just busywork,"
  • professors didn't put much effort into teaching, so students don't put real effort into learning
  • "The current system places too great a burden on individual faculty who would, under the circumstances, appear to have perverse incentives: Pursuing these matters lowers course evaluations, takes their severely limited time away from research for promotion, and unfortunately personalizes the issue when it is not personal at all, but a violation against the university."
  • In the humanities, professors have found technological tools to check for blatant copying on essays, and have caught so many culprits that the practice of running papers through plagiarism-detection services has become routine at many colleges. But that software is not suited to science-class assignments.
  • a "studio" model of teaching
  • The parents paid tuition in cash
  • The idea that students should be working in a shell is so interesting. It never even occurred to me as a student that I shouldn't work with someone else on my homework. How else do you figure it out? I guess that is peer-to-peer teaching. Copying someone else's work and presenting it as your own is clearly wrong (and, as demonstrated above, doesn't do the student any good), but learning from the resources at hand ought to be encouraged. After all, struggling through homework problems in intro physics is how you learn in the first place.
Ed Webb

9 Ways Online Teaching Should be Different from Face-to-Face | Cult of Pedagogy

  • Resist the temptation to dive right into curriculum at the start of the school year. Things will go more smoothly if you devote the early weeks to building community so students feel connected. Social emotional skills can be woven in during this time. On top of that, students need practice with whatever digital tools you’ll be using. So focus your lessons on those things, intertwining the two when possible. 
  • Online instruction is made up largely of asynchronous instruction, which students can access at any time. This is ideal, because requiring attendance for synchronous instruction puts some students at an immediate disadvantage if they don’t have the same access to technology, reliable internet, or a flexible home schedule. 
  • you’re likely to offer “face-to-face” or synchronous opportunities at some point, and one way to make them happen more easily is to have students meet in small groups. While it’s nearly impossible to arrange for 30 students to attend a meeting at once, assigning four students to meet is much more manageable.
  • What works best, Kitchen says, is to keep direct instruction—things like brief video lectures and readings—in asynchronous form, using checks for understanding like embedded questions or exit slips.  You can then use synchronous meetings for more interactive, engaging work. “If we want students showing up, if we want them to know that this is worth their time,” Kitchen explains, “it really needs to be something active and engaging for them. Any time they can work with the material, categorize it, organize it, share further thoughts on it, have a discussion, all of those are great things to do in small groups.” 
  • Strategies include: the Jigsaw method, where students form expert groups on a particular chunk of content, then teach that content to other students; discussion strategies adapted for virtual settings; best practices for cooperative learning; Visible Thinking routines; and Gamestorming and other business-related protocols adapted for education, where students take on the role of customers/stakeholders.
  • What really holds leverage for the students? What has endurance? What knowledge is essential? What knowledge and skills do students need to have before they move to the next grade level or the next class? What practices can be emphasized that transfer across many content areas? Skills like analyzing, constructing arguments, building a strong knowledge base through texts, and speaking can all be taught through many different subjects. What tools can serve multiple purposes? Teaching students to use something like Padlet gives them opportunities to use audio, drawing, writing, and video. Non-digital tools can also work: students can use things they find around the house, like toilet paper rolls, to fulfill other assignments, and then submit their work with a photo.
  • Online instruction is not conducive to covering large amounts of content, so you have to choose wisely, teaching the most important things at a slower pace.
  • Provide instructions in a consistent location and at a consistent time. This advice was already given for parents, but it's worth repeating here through the lens of instructional design: set up lessons so that students know where to find instructions every time. Make instructions explicit: read and re-read to make sure these are as clear as possible, and make dogfooding your lessons a regular practice to root out problem areas. Offer multimodal instructions: if possible, provide both written and video instructions for assignments, so students can choose the format that works best for them. You might also offer a synchronous weekly or daily meeting; what's great about doing these online is that even if you teach several sections of the same class per day, students are no longer restricted to class times and can attend whatever meeting works best for them.
  • put the emphasis on formative feedback as students work through assignments and tasks, rather than simply grading them at the end. 
  • In online learning, Kitchen says, “There are so many ways that students can cheat, so if we’re giving them just the traditional quiz or test, it’s really easy for them to be able to just look up that information.” A great solution to this problem is to have students create things.
  • For assessment, use a detailed rubric that highlights the learning goals the end product will demonstrate; a single-point rubric works well for this. To help students discover tools to work with, this list of tools is organized by the type of product each one creates. Another great source of ideas is the Teacher's Guide to Tech. When developing the assignment, rather than focusing on the end product, start by getting clear on what you want students to DO with that product.
  • Clear and consistent communication; creating explicit and consistent rituals and routines; using research-based instructional strategies; determining whether to use digital or non-digital tools for an assignment; and a focus on authentic learning, where authentic products are created and students have voice and choice in assignments.
Ed Webb

The Ed-Tech Imaginary

  • We can say "Black lives matter," but we must also demonstrate through our actions that Black lives matter, and that means we must radically alter many of our institutions and practices, recognizing their inhumanity and carcerality. And that includes, no doubt, ed-tech. How much of ed-tech is, to use Ruha Benjamin's phrase, "the new Jim Code"? How much of ed-tech is designed by those who imagine students as cheats or criminals, as deficient or negligent?
  • "Reimagining" is a verb that education reformers are quite fond of. And "reimagining" seems too often to mean simply defunding, privatizing, union-busting, dismantling, outsourcing.
  • if Betsy DeVos is out there "reimagining," then we best be resisting
  • think we can view the promotion of ed-tech as a similar sort of process — the stories designed to convince us that the future of teaching and learning will be a technological wonder. The "jobs of the future that don't exist yet." The push for everyone to "learn to code."
  • The Matrix is, after all, a dystopia. So why would Matrix-style learning be desirable? Maybe that's the wrong question. Perhaps it's not so much that it's desirable, but it's just how our imaginations have been constructed, constricted even. We can't imagine any other ideal but speed and efficiency.
  • The first science fiction novel, published over 200 years ago, was in fact an ed-tech story: Mary Shelley's Frankenstein. While the book is commonly interpreted as a tale of bad science, it is also the story of bad education, something we tend to forget if we only know the story through the 1931 film version.
  • Teaching machines and robot teachers were part of the Sixties' cultural imaginary — perhaps that's the problem with so many Boomer ed-reform leaders today. But that imaginary — certainly in the case of The Jetsons — was, upon close inspection, not always particularly radical or transformative. The students at Little Dipper Elementary still sat in desks in rows. The teacher still stood at the front of the class, punishing students who weren't paying attention.
  • we must also decolonize the ed-tech imaginary
  • Zuckerberg gave everyone at Facebook a copy of the Ernest Cline novel Ready Player One, for example, to get them excited about building technology for the future — a book that is really just a string of nostalgic references to Eighties white boy culture. And I always think about that New York Times interview with Sal Khan, where he said that "The science fiction books I like tend to relate to what we're doing at Khan Academy, like Orson Scott Card's 'Ender's Game' series." You mean, online math lectures are like a novel that justifies imperialism and genocide?! Wow.
  • This ed-tech imaginary is segregated. There are no Black students at the push-button school. There are no Black people in The Jetsons — no Black people living the American dream of the mid-twenty-first century
  • Part of the argument I make in my book is that much of education technology has been profoundly shaped by Skinner, even though I'd say that most practitioners today would say that they reject his theories; that cognitive science has supplanted behaviorism; and that after Ayn Rand and Noam Chomsky trashed Beyond Freedom and Dignity, no one paid attention to Skinner any more — which is odd considering there are whole academic programs devoted to "behavioral design," bestselling books devoted to the "nudge," and so on.
  • so much of the ed-tech imaginary is wrapped up in narratives about the Hero, the Weapon, the Machine, the Behavior, the Action, the Disruption. And it's so striking because education should be a practice of care, not conquest
Ed Webb

Google and Meta moved cautiously on AI. Then came OpenAI's ChatGPT. - The Washington Post

  • The surge of attention around ChatGPT is prompting pressure inside tech giants including Meta and Google to move faster, potentially sweeping safety concerns aside
  • Tech giants have been skittish since public debacles like Microsoft’s Tay, which it took down in less than a day in 2016 after trolls prompted the bot to call for a race war, suggest Hitler was right and tweet “Jews did 9/11.”
  • Some AI ethicists fear that Big Tech’s rush to market could expose billions of people to potential harms — such as sharing inaccurate information, generating fake photos or giving students the ability to cheat on school tests — before trust and safety experts have been able to study the risks. Others in the field share OpenAI’s philosophy that releasing the tools to the public, often nominally in a “beta” phase after mitigating some predictable risks, is the only way to assess real world harms.
  • Silicon Valley’s sudden willingness to consider taking more reputational risk arrives as tech stocks are tumbling
  • A chatbot that pointed to one answer directly from Google could increase its liability if the response was found to be harmful or plagiarized.
  • AI has been through several hype cycles over the past decade, but the furor over DALL-E and ChatGPT has reached new heights.
  • Soon after OpenAI released ChatGPT, tech influencers on Twitter began to predict that generative AI would spell the demise of Google search. ChatGPT delivered simple answers in an accessible way and didn’t ask users to rifle through blue links. Besides, after a quarter of a century, Google’s search interface had grown bloated with ads and marketers trying to game the system.
  • Inside big tech companies, the system of checks and balances for vetting the ethical implications of cutting-edge AI isn’t as established as privacy or data security. Typically teams of AI researchers and engineers publish papers on their findings, incorporate their technology into the company’s existing infrastructure or develop new products, a process that can sometimes clash with other teams working on responsible AI over pressure to see innovation reach the public sooner.
  • Chatbots like OpenAI's routinely make factual errors and often switch their answers depending on how a question is asked
  • To Timnit Gebru, executive director of the nonprofit Distributed AI Research Institute, the prospect of Google sidelining its responsible AI team doesn’t necessarily signal a shift in power or safety concerns, because those warning of the potential harms were never empowered to begin with. “If we were lucky, we’d get invited to a meeting,” said Gebru, who helped lead Google’s Ethical AI team until she was fired for a paper criticizing large language models.
  • Rumman Chowdhury, who led Twitter's machine-learning ethics team until Elon Musk disbanded it in November, said she expects companies like Google to increasingly sideline internal critics and ethicists as they scramble to catch up with OpenAI. "We thought it was going to be China pushing the U.S., but looks like it's start-ups," she said.