Bucknell Digital Pedagogy & Scholarship: Group items tagged digital-pedagogy, data


Todd Suomela

Beyond buttonology: Digital humanities, digital pedagogy, and the ACRL Framework | Russ...

  • Here are a few specific examples you can apply to your instructional design process to help learners with metacognition:
    - Model the metacognitive process during instruction (or in one-on-one consultations) by asking and reflecting on big-picture questions such as: “What questions can you answer with this tool?” “What can you not do with this tool?” Keep in mind some answers may be simple (e.g., this tool can only work with data in this way, so it is excluded automatically). Also, “Did I get the results I expected? What could I have done differently?”
    - Start with inquiry and build conversations based on the learner’s answers: “Is it the data that does not work? Or is the research question fundamentally wrong to begin with?”
    - Collaborate with faculty to teach together, modelling your practices while demonstrating a specific tool. This could include thinking aloud as you make decisions so learners can self-correct assumptions. Also, be aware of your own expert bias so you can demonstrate how to clear obstacles.
    - Ask learners to define specifically what is difficult for them during instruction. Digital humanities tools are complex, and they rest on complex methodologies and research questions. By constructing opportunities for learners to self-question as they move from one task to another, they learn to self-assess their progress and adjust accordingly.
    Several instructional design activities promote metacognition: think-pair-share, the one-minute paper (“share a key concept learned” or “what comes next?”), and case studies.
  • There are specific strategies we can implement to help learners escape the recursive spiral of the liminal state they experience while managing complex digital projects:
    - One of the most challenging aspects of teaching digital tools is forgetting what it is like to be a novice learner. Sometimes being a near-novice yourself helps you better prepare for the basic problems and frustrations learners face. Recognizing liminality reminds you as a teacher that the learning process is not smooth; it requires anticipating common difficulties and regularly checking in with learners to make sure you are not leaving them behind.
    - When meeting with learners one-on-one, use your in-depth reference interview skills to engage in methods discussions. A learner in the liminal state is not always able to “see the forest for the trees.” Your directed questions will illuminate the problems they are having and the solutions they had not seen.
    - Pay close attention to the digital humanities work and discussions happening on your own campus, as well as across the academic community. Working through the liminal space may require helping learners make connections to others facing similar problems. Also follow online discussions so you can point your learners to a wide variety of group learning opportunities, such as the active digital humanities community on Slack.
    - When designing instructional opportunities, such as workshops and hackathons, pay particular attention to outreach strategies that bring like-minded learners together, as well as diverse voices. For example, invite the scholar whose project was completed last year to add a more experienced voice to the conversation. By encouraging the formation of learning communities on your campus, you create safe spaces that help learners navigate the liminal state alongside others who may be on the other side of struggling with specific digital project issues.
    - In designing instructional activities, guide learners through visualization exercises that help identify “stuck” places. Making graphic representations of one’s thoughts (e.g., concept maps) can highlight areas that require clarification.
Todd Suomela

A Guide for Resisting Edtech: the Case against Turnitin - Hybrid Pedagogy

  • At the Digital Pedagogy Lab Institutes where we’ve taught, there’s one exercise in particular we return to again and again. In our “crap detection” exercise (named for Rheingold’s use of the term), participants use a rubric to assess one of a number of digital tools. The tools are pitted, head to head, in a sort of edtech celebrity deathmatch. Participants compare Blackboard and Canvas, for instance, or WordPress and Medium, Twitter and Facebook, Genius and Hypothes.is. We start by seeing what the tools say they do and comparing that to what they actually do. But the work asks educators to do more than simply look at the platform’s own web site, which more often than not says only the very best things (and sometimes directly misleading things) about the company and its tool. We encourage participants to do research — to find forums, articles, and blog posts written about the platform, to read the tool’s terms of service, and even to tweet questions directly to the company’s CEO.
  • Here’s the rubric for the exercise:
    - Who owns the tool? What is the name of the company, the CEO? What are their politics? What does the tool say it does? What does it actually do?
    - What data are we required to provide in order to use the tool (login, e-mail, birthdate, etc.)? What flexibility do we have to be anonymous, or to protect our data? Where is data housed; who owns the data? What are the implications for in-class use? Will others be able to use/copy/own our work there?
    - How does this tool act or not act as a mediator for our pedagogies? Does the tool attempt to dictate our pedagogies? How is its design pedagogical? Or exactly not pedagogical? Does the tool offer a way that “learning can most deeply and intimately begin”?
    Over time, the exercise has evolved as the educators we’ve worked with have developed further questions through their research. Accessibility, for example, has always been an implicit component of the activity, which we’ve now brought more distinctly to the fore, adding these questions:
    - How accessible is the tool? For a blind student? For a hearing-impaired student? For a student with a learning disability? For introverts? For extroverts? Etc.
    - What statements does the company make about accessibility?
    Ultimately, this is a critical thinking exercise aimed at asking critical questions, empowering critical relationships, encouraging new digital literacies.
Todd Suomela

Rejecting Test Surveillance in Higher Education by Lindsey Barrett :: SSRN

  • "The rise of remote proctoring software during the COVID-19 pandemic illustrates the dangers of surveillance-enabled pedagogy built on the belief that students can't be trusted. These services, which deploy a range of identification protocols, computer and internet access limitations, and human or automated observation of students as they take tests remotely, are marketed as necessary to prevent cheating. But the success of these services in their stated goal is ill-supported at best and discredited at worst, particularly given their highly over-inclusive criteria for "suspicious" behavior. Meanwhile, the harms they inflict on students are clear: severe anxiety among test-takers, concerning data collection and use practices, and discriminatory flagging of students of color and students with disabilities have provoked widespread outcry from students, professors, privacy advocates, policymakers, and sometimes universities themselves. To make matters worse, the privacy and civil rights laws most relevant to the use of these services are generally inadequate to protect students from the harms they inflict. Colleges and universities routinely face difficult decisions that require reconciling conflicting interests, but whether to use remote proctoring software isn't one of them. Remote proctoring software is not pedagogically beneficial, institutionally necessary, or remotely unavoidable, and its use further entrenches inequities in higher education that schools should be devoted to rooting out. Colleges and universities should abandon remote proctoring software, and apply the lessons from this failed experiment to their other existing or potential future uses of surveillance technologies and automated decision-making systems that threaten students' privacy, access to important life opportunities, and intellectual freedom."
Todd Suomela

vSTEM.org

  • "The simulations on this site are meant to give students the ability to experiment on traditionally static textbook problems and examples. We believe experimenting with a flexible, dynamic system can give students deeper insights into core engineering concepts than that gained from solving for single snapshots of a system. Tweak variables; solve for unknowns; experiment; see what happens and figure out why. This site is also used to augment hands-on experiments, by tracking student training on lab equipment and comparing lab with simulated data."