
Bucknell Digital Pedagogy & Scholarship: Group items tagged "software"


Todd Suomela

"We do software so that you can do education": The curious case of MOOC platforms - Wor...

  • edX’s case illustrates one mechanism through which this happens: the construction of organizational roles. Consider the separation of software and pedagogy within the edX ecosystem. As edX expanded its slate of partners, its first clients and patrons, MIT and Harvard, saw a decline in their own ability to set the agenda and control the direction of the software. These “users” argue that the software has an implicit theory of pedagogy embedded in it, and that, as experts on pedagogy, they should have more of a say in shaping the software. While acknowledging this, edX’s architects counter that they—and not the Harvard-MIT folks—should have the final say on prioritizing which features to build, not only because they understand the software best, but also because they see themselves as best placed to understand which features might benefit the whole ecosystem rather than just particular players. The standard template in the education technology industry is that the technology experts are only supposed to “implement” what the pedagogy experts ask. What is arguably new about the edX platform framework is that the software is prior to, and thereby more constitutive of, the pedagogy.
Todd Suomela

Rejecting Test Surveillance in Higher Education by Lindsey Barrett :: SSRN

  • "The rise of remote proctoring software during the COVID-19 pandemic illustrates the dangers of surveillance-enabled pedagogy built on the belief that students can't be trusted. These services, which deploy a range of identification protocols, computer and internet access limitations, and human or automated observation of students as they take tests remotely, are marketed as necessary to prevent cheating. But the success of these services in their stated goal is ill-supported at best and discredited at worst, particularly given their highly over-inclusive criteria for "suspicious" behavior. Meanwhile, the harms they inflict on students are clear: severe anxiety among test-takers, concerning data collection and use practices, and discriminatory flagging of students of color and students with disabilities have provoked widespread outcry from students, professors, privacy advocates, policymakers, and sometimes universities themselves. To make matters worse, the privacy and civil rights laws most relevant to the use of these services are generally inadequate to protect students from the harms they inflict. Colleges and universities routinely face difficult decisions that require reconciling conflicting interests, but whether to use remote proctoring software isn't one of them. Remote proctoring software is not pedagogically beneficial, institutionally necessary, or remotely unavoidable, and its use further entrenches inequities in higher education that schools should be devoted to rooting out. Colleges and universities should abandon remote proctoring software, and apply the lessons from this failed experiment to their other existing or potential future uses of surveillance technologies and automated decision-making systems that threaten students' privacy, access to important life opportunities, and intellectual freedom."
Todd Suomela

The Internet as existential threat « Raph's Website

  • Our medical systems have terrible Internet security… MRI machines you can connect to with USB that still have “admin:password” to gain root access. That’s horrifying, sure, but that’s not an attack at scale. More frightening: we’re busily uploading all our medical records to the cloud. Take down that cloud, and no patients can be treated, because nobody will know what they have or what meds they are on. Software swallows your insulin pumps and your pacemakers. To kill people, all you need is to hack that database, or simply erase it or block access to it. After all, we don’t tend to realize that in an Internet of Things, humans are just Things too.
    As this software monster has encroached on stuff like election systems, the common reaction has been to go back to paper. So let’s consider a less obvious example. We should be going back to paper for our libraries too! We’ve outsourced so much of our knowledge to digital that the amount of knowledge available in analog has dropped notably. There are fewer librarians, in fewer libraries, with smaller collections than there used to be. If the net goes down, how much reference material that was accessible thirty years ago is simply not accessible now? Google Search is “critical cultural infrastructure.” How much redundancy do we actually have? Could a disconnected town actually educate its children?
    How critical is Google as a whole? If Google went down for a month, I am pretty sure we would see worldwide economic collapse. How much of the world economy passes through Google hosting? How much of it is in GMail? How much is dependent on Google Search, Google Images, Google Docs? The answer is a LOT. And because financial systems are now also JIT, ten thousand corporate blips where real estate agencies and local car washes and a huge pile of software companies and a gaggle of universities and so on are suddenly 100% unable to function digitally (no payroll! no insurance verification!) would absolutely have ripple effects into their suppliers and their customers, and thence to the worldwide economic market, because interconnection without redundancy increases the odds of cascades.
  • But just as critically, governments and state actors seem to be the source of so many of the problems precisely because the Internet is now too many forms of critical infrastructure, and therefore too juicy a target. If software eats everything, then the ability to kill software is the ability to kill anything. Net connectivity becomes the single point of failure for every system connected to it. Even if the Net itself is designed to route around damage, that doesn’t help if it is the single vector of attack that can take down any given target. It’s too juicy a target for the military, too juicy a target for terror, too juicy a target for criminal ransom. The old adage goes “when they came for this, I said nothing. When they came for that…” — we all know it. Consider that the more we gleefully hand over to the cloud because we want convenience, big data, personalization, and so on, the more we create a single thing that can be taken from us in an instant. We’ve decided to subscribe to everything, instead of owning it. When they came for your MP3s, your DVDs, fine: not “critical infrastructure.” When they came for your resumes, OK, getting closer.
  • As we rush towards putting more and more things “in the cloud,” as we rush towards an Internet of Things with no governance beyond profit motive and anarchy, what we’re effectively doing is creating a massive single point of failure for every system we put in it.
Todd Suomela

Tools for Scaffolding Students in a Complex Learning Environment: What Have We Gained a...

  • "This article discusses the change in the notion of scaffolding from a description of the interactions between a tutor and a student to the design of tools to support student learning in project-based and design-based classrooms. The notion of scaffolding is now increasingly being used to describe various forms of support provided by software tools, curricula, and other resources designed to help students learn successfully in a classroom. However, some of the critical elements of scaffolding are missing in the current use of the scaffolding construct. Although new curricula and software tools now described as scaffolds have provided us with novel techniques to support student learning, the important theoretical features of scaffolding such as ongoing diagnosis, calibrated support, and fading are being neglected. This article discusses how to implement these critical features of scaffolding in tools, resources, and curricula. It is suggested that if tools are designed based on the multiple levels of student understanding found in a classroom, tools themselves might be removed to achieve fading."
Jennifer Parrott

Writing professors question plagiarism detection software | Inside Higher Ed

  • Discusses problems with requiring students to use Turnitin.
Todd Suomela

Welcome to the GEODE Initiative!

  • "The Geographic Data in Education (GEODE) Initiative at Northwestern University is dedicated to improving public understanding of our world through education about the Earth's physical, biological, and social systems. Toward that end, the GEODE Initiative is engaged in a program of integrated research and development in the areas of learning, teaching and educational reform. The GEODE Initiative develops and studies curriculum, software, and teacher professional development."
Leslie Harris

U Illinois Prof Places Herself into Flipped Courses -- Campus Technology

  • A faculty member uses software called Personify to record her "flipped classroom" lectures. Personify lets her superimpose her head and shoulders (recorded as she delivers the lecture) on the screencast content, and she can choose when to show or hide her image as the lecture proceeds.
Todd Suomela

DSHR's Blog: Ithaka's Perspective on Digital Preservation

  • Second, there is very little coverage of Web archiving, which is clearly by far the largest and most important digital preservation initiative both for current and future readers. The Internet Archive rates only two mentions, in the middle of a list of activities and in a footnote. This is despite the fact that archive.org is currently the 211th most visited site in the US (272nd globally), with over 5.5M registered users, adding over 500 per day, and serving nearly 4M unique IPs per day. For comparison, the Library of Congress currently ranks 1439th in the US (5441st globally). The Internet Archive's Web collection alone probably dwarfs all other digital preservation efforts combined, both in size and in usage, not to mention its vast collections of software, digitized books, audio, video and TV news.
    Rieger writes: There is a lack of understanding about how archived websites are discovered, used, and referenced. “Researchers prefer to cite the original live-web as it is easier and shorter,” pointed out one of the experts. “There is limited awareness of the existence of web archives and lack of community consensus on how to treat them in scholarly work. The problems are not about technology any more, it is about usability, awareness, and scholarly practices.” The interviewee referred to a recent CRL study, based on an analysis of referrals to archived content from papers, which concluded that the citations were mainly to articles about web archiving projects. [See the snapshot-lookup sketch after these excerpts.]
    It is surprising that the report doesn't point out that the responsibility for educating scholars in the use of resources lies with the "experts and thought leaders" from institutions such as the University of California, Michigan State, Cornell, MIT, NYU and Virginia Tech. That these "experts and thought leaders" don't consider the Internet Archive to be a resource worth mentioning might have something to do with the fact that their scholars don't know that they should be using it. A report whose first major section, entitled "What's Working Well", totally fails to acknowledge the single most important digital preservation effort of the last two decades clearly lacks credibility.
  • Finally, there is no acknowledgement that the most serious challenge facing the field is economic. Except for a few corner cases, we know how to do digital preservation; we just don't want to pay enough to have it done. Thus the key challenge is to achieve some mixture of significant increase in funding for, and significant cost reduction in the processes of, digital preservation. Information technology processes naturally have very strong economies of scale, which result in winner-take-all markets (as W. Brian Arthur pointed out in 1985). It is notable that the report doesn't mention the winners we already have, in Web and source code archiving, and in emulation. All are at the point where a competitor is unlikely to be viable. To be affordable, digital preservation needs to be done at scale. The report's orientation is very much "let a thousand flowers bloom", which in IT markets only happens at a very early stage. This is likely the result of talking only to people nurturing a small-scale flower, not to people who have already dominated their market niche. It is certainly a risk that each area will have a single point of failure, but trying to fight against the inherent economics of IT pretty much guarantees ineffectiveness.
  • 1) The big successes in the field haven't come from consensus building around a roadmap; they have come from idiosyncratic individuals such as Brewster Kahle, Roberto di Cosmo and Jason Scott identifying a need and building a system to address it no matter what "the community" thinks. We have a couple of decades of experience showing that "the community" is incapable of coming to a coherent consensus that leads to action on a scale appropriate to the problem. In any case, describing road-mapping as "research" is a stretch.
    2) Under severe funding pressure, almost all libraries have de-emphasized their custodial role of building collections in favor of responding to immediate client needs. Rieger writes: As one interviewee stated, library leaders have "shifted their attention from seeing preservation as a moral imperative to catering to the university's immediate needs." Regrettably, but inevitably given the economics of IT markets, this provides a market opportunity for outsourcing; Ithaka has exploited one such opportunity with Portico. This bullet does describe "research" in the sense of "market research". Success is, however, much more likely to come from the success of an individual effort than from a consensus about what should be done among people who can't actually do it.
    3) In the current climate, increased funding for libraries and archives simply isn't going to happen. These institutions have shown a marked reluctance to divert their shrinking funds from legacy to digital media. Thus the research topic with the greatest leverage in turning funds into preserved digital content is research into increasing the cost-effectiveness of the tools, processes, and infrastructure of digital preservation.
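    The first excerpt notes that researchers cite the live web because finding an archived version takes extra steps. As a minimal sketch of closing that gap, the snippet below queries the Internet Archive's public Wayback Machine Availability API for the snapshot closest to a given URL; the endpoint is real, but the helper function, its name, and the example values are illustrative assumptions, not anything proposed in the report.

        # Sketch: look up the closest archived snapshot of a URL via the public
        # Wayback Machine Availability API (https://archive.org/wayback/available).
        # The function name and defaults are hypothetical, chosen for illustration.
        import json
        import urllib.parse
        import urllib.request

        def closest_snapshot(url, timestamp=None):
            """Return the URL of the closest archived snapshot, or None."""
            query = {"url": url}
            if timestamp:
                query["timestamp"] = timestamp  # optional target, YYYYMMDDhhmmss
            api = "https://archive.org/wayback/available?" + urllib.parse.urlencode(query)
            with urllib.request.urlopen(api, timeout=10) as resp:
                data = json.load(resp)
            closest = data.get("archived_snapshots", {}).get("closest")
            if closest and closest.get("available"):
                return closest["url"]  # e.g. https://web.archive.org/web/<ts>/<url>
            return None

        # Example: print a stable archival URL suitable for citation, if one exists.
        print(closest_snapshot("http://example.com", timestamp="20180101"))

    A citation-manager plugin doing essentially this would remove much of the friction the interviewee describes.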