
Bucknell Digital Pedagogy & Scholarship: Group items tagged digital-pedagogy


Todd Suomela

Beyond buttonology: Digital humanities, digital pedagogy, and the ACRL Framework | Russ...

  • Here are a few specific examples you can apply to your instructional design process to help learners with metacognition:
    - Model the metacognitive process during instruction (or in one-on-one consultations) by asking and reflecting on big-picture questions such as: “What questions can you answer with this tool?” “What can you not do with this tool?” Keep in mind some answers may be simple (e.g., this tool can only work with data in this way, so it is excluded automatically). Also ask, “Did I get the results I expected? What could I have done differently?”
    - Start with inquiry and build conversations based on the learner’s answers. “Is it the data that does not work? Or is the research question fundamentally wrong to begin with?”
    - Collaborate with faculty to teach together, modelling your practices while demonstrating a specific tool. This could include thinking aloud as you make decisions so learners can self-correct assumptions. Also, be aware of your own expert bias so you can demonstrate how to clear obstacles.
    - Ask learners to specifically define what is difficult for them during the process of instruction. Digital humanities tools are complex and are based on complex methodologies and research questions. By constructing opportunities for learners to self-question as they move from one task to another, they learn to self-assess their progress and adjust accordingly.
    There are several instructional design activities that promote metacognition: think-pair-share, the one-minute paper (“share a key concept learned” or “what comes next?”), and case studies.
  • There are specific strategies we can implement to help learners escape the recursive spiral of the liminal state they experience while managing complex digital projects:
    - One of the most challenging aspects of teaching digital tools is forgetting what it is like to be a novice learner. Sometimes being a near-novice oneself helps you better prepare for the basic problems and frustrations learners are facing. Recognizing liminality is a reminder that the learning process is not smooth; it requires anticipating common difficulties and regularly checking in with learners to make sure you are not leaving them behind.
    - When meeting with learners one-on-one, use your in-depth reference interview skills to engage in methods discussions. A learner in the liminal state is not always able to “see the forest for the trees”; your directed questions will illuminate the problems they are having and the solutions they had not seen.
    - Pay close attention to the digital humanities work and discussions happening on your own campus, as well as across the academic community. Working through the liminal space may require helping learners make connections to others facing similar problems. Also follow online discussions so you can point your learners to a wide variety of group learning opportunities, such as the active digital humanities community on Slack.
    - When designing instructional opportunities, such as workshops and hackathons, pay particular attention to outreach strategies that may bring like-minded learners together, as well as diverse voices. For example, invite the scholar whose project was completed last year to add a more experienced voice to the conversation. By encouraging the formation of learning communities on your campus, you create safe spaces that help learners navigate the liminal state with others who may be on the other side of struggling with specific digital project issues.
    - In designing instructional activities, guide learners through visualization exercises that help identify “stuck” places. Making graphic representations of one’s thoughts (e.g., concept maps) can highlight areas that require clarification.
Todd Suomela

"We do software so that you can do education": The curious case of MOOC platforms - Wor...

  • edX’s case illustrates one mechanism through which this happens: the construction of organizational roles. Consider the separation of software and pedagogy within the edX ecosystem. As edX expanded its slate of partners, its first clients and patrons, MIT and Harvard, saw a decline in their own ability to set the agenda and control the direction of the software. These “users” argue that the software has an implicit theory of pedagogy embedded in it, and that, as experts on pedagogy, they should have more of a say in shaping the software. While acknowledging this, edX’s architects counter that they—and not the Harvard-MIT folks—should have the final say on prioritizing which features to build, not only because they understand the software the best, but also because they see themselves as best placed to understand which features might benefit the whole eco-system rather than just particular players. The standard template in the education technology industry is that the technology experts are only supposed to “implement” what the pedagogy experts ask. What is arguably new about the edX platform framework is that the software is prior to, and thereby more constitutive of, the pedagogy.
Todd Suomela

A Guide for Resisting Edtech: the Case against Turnitin - Hybrid Pedagogy

  • At the Digital Pedagogy Lab Institutes where we’ve taught, there’s one exercise in particular we return to again and again. In our “crap detection” exercise (named for Rheingold’s use of the term), participants use a rubric to assess one of a number of digital tools. The tools are pitted, head to head, in a sort of edtech celebrity deathmatch. Participants compare Blackboard and Canvas, for instance, or WordPress and Medium, Twitter and Facebook, Genius and Hypothes.is. We start by seeing what the tools say they do and comparing that to what they actually do. But the work asks educators to do more than simply look at the platform’s own web site, which more often than not says only the very best things (and sometimes directly misleading things) about the company and its tool. We encourage participants to do research — to find forums, articles, and blog posts written about the platform, to read the tool’s terms of service, and even to tweet questions directly to the company’s CEO.
  • Here’s the rubric for the exercise:
    - Who owns the tool? What is the name of the company, the CEO? What are their politics?
    - What does the tool say it does? What does it actually do?
    - What data are we required to provide in order to use the tool (login, e-mail, birthdate, etc.)? What flexibility do we have to be anonymous, or to protect our data? Where is data housed; who owns the data?
    - What are the implications for in-class use? Will others be able to use/copy/own our work there?
    - How does this tool act or not act as a mediator for our pedagogies? Does the tool attempt to dictate our pedagogies?
    - How is its design pedagogical? Or exactly not pedagogical? Does the tool offer a way that “learning can most deeply and intimately begin”?
    Over time, the exercise has evolved as the educators we’ve worked with have developed further questions through their research. Accessibility, for example, has always been an implicit component of the activity, which we’ve now brought more distinctly to the fore, adding these questions:
    - How accessible is the tool? For a blind student? For a hearing-impaired student? For a student with a learning disability? For introverts? For extroverts? Etc.
    - What statements does the company make about accessibility?
    Ultimately, this is a critical thinking exercise aimed at asking critical questions, empowering critical relationships, encouraging new digital literacies.
Todd Suomela

Rejecting Test Surveillance in Higher Education by Lindsey Barrett :: SSRN

  •  
    "The rise of remote proctoring software during the COVID-19 pandemic illustrates the dangers of surveillance-enabled pedagogy built on the belief that students can't be trusted. These services, which deploy a range of identification protocols, computer and internet access limitations, and human or automated observation of students as they take tests remotely, are marketed as necessary to prevent cheating. But the success of these services in their stated goal is ill-supported at best and discredited at worst, particularly given their highly over-inclusive criteria for "suspicious" behavior. Meanwhile, the harms they inflict on students are clear: severe anxiety among test-takers, concerning data collection and use practices, and discriminatory flagging of students of color and students with disabilities have provoked widespread outcry from students, professors, privacy advocates, policymakers, and sometimes universities themselves. To make matters worse, the privacy and civil rights laws most relevant to the use of these services are generally inadequate to protect students from the harms they inflict. Colleges and universities routinely face difficult decisions that require reconciling conflicting interests, but whether to use remote proctoring software isn't one of them. Remote proctoring software is not pedagogically beneficial, institutionally necessary, or remotely unavoidable, and its use further entrenches inequities in higher education that schools should be devoted to rooting out. Colleges and universities should abandon remote proctoring software, and apply the lessons from this failed experiment to their other existing or potential future uses of surveillance technologies and automated decision-making systems that threaten students' privacy, access to important life opportunities, and intellectual freedom."
Todd Suomela

Fluent in Social Media, Failing in Fake News: Generation Z, Online - Pacific Standard

  • Instead of burrowing into a silo or vertical on a single webpage, as our Gen Z digital natives do, fact checkers tended to read laterally, a strategy that sent them zipping off a site to open new tabs across the horizontal axis of their screens. And their first stop was often the site we tell kids they should avoid: Wikipedia. But checkers used Wikipedia differently than the rest of us often do, skipping the main article to dive straight into the references, where more established sources can be found. They knew that the more controversial the topic, the more likely the entry was to be "protected," through the various locks Wikipedia applies to prevent changes by anyone except high-ranking editors. Further, the fact checkers knew how to use a Wikipedia article's "Talk" page, the tab hiding in plain sight right next to the article—a feature few students even know about, still less consult. It's the "Talk" page where an article's claims are established, disputed, and, when the evidence merits it, altered.
  • In the short term, we can do a few useful things. First, let's make sure that kids (and their teachers) possess some basic skills for evaluating digital claims. Some quick advice: When you land on an unfamiliar website, don't get taken in by official-looking logos or snazzy graphics. Open a new tab (better yet, several) and Google the group that's trying to persuade you. Second, don't click on the first result. Take a tip from fact checkers and practice click restraint: Scan the snippets (the brief sentence accompanying each search result) and make a smart first choice.
  • What if the answer isn't more media literacy, but a different kind of media literacy?
  • ...1 more annotation...
  • We call them "digital natives." Digitally naive might be more accurate. Between January of 2015 and June of 2016, my colleagues and I at the Stanford History Education Group surveyed 7,804 students across 12 states. Our goal was to take the pulse of civic online reasoning: students' ability to judge the information that affects them as citizens. What we found was a stunning and dismaying consistency. Young people's ability to navigate the Internet can be summed up in one word: bleak.
Todd Suomela

Rescuing Student Participation Through Digital Platforms - DML Central

  • One problem is that participation is largely taken for granted and under-theorized in many classrooms. The way we use a term like participation is in need of rescuing: moving away from a limited view of participation as linked to motivation, engagement, or hand-raising, and toward the view that participation as a concept is more generative when connected to the idea of membership in communities of practice (Wenger, 1998). Our limited view of participation is evident in the language of most syllabi: often, syllabi list “participation points” as part of the grade of the course. I find this to be an odd way to think about participation. What we often mean is that we will give students some points for “talking in class” and “raising their hands.” But demonstrating engagement by hand-raising and talk are fairly limited views of participation, and in fact, these ways of being are more connected to performance — acting like a student — than participation. We certainly want students to participate more than 10 percent, or even half, of the time. Are they participating when they are listening and pondering the ideas of their peers? Of course they are, but how do they demonstrate that? In thinking about course design, we should consider how students become members of our classroom community and our disciplines. Social media sites can open up other avenues for participation, and further, connect students to communities of practice outside our classrooms that they hope to enter.
jatolbert

Robin - 2008 - Digital Storytelling A Powerful Technology Tool f.pdf

shared by jatolbert on 07 Apr 17
  •  
    Dated, largely descriptive, but still useful discussion of digital storytelling in the classroom. Touches on issues like digital literacy/multimedia skill development. Implies connections between creative and scholarly output.
Todd Suomela

vSTEM.org

  •  
    "The simulations on this site are meant to give students the ability to experiment on traditionally static textbook problems and examples. We believe experimenting with a flexible, dynamic system can give students deeper insights into core engineering concepts than that gained from solving for single snapshots of a system. Tweak variables; solve for unknowns; experiment; see what happens and figure out why. This site is also used to augment hands-on experiments, by tracking student training on lab equipment and comparing lab with simulated data."