
Home / Bucknell Digital Pedagogy & Scholarship / Group items matching "Digital Humanities" in title, tags, annotations or url



DLAx | The Digital Liberal Arts Exchange - 0 views

shared by jatolbert on 08 Jun 17
  • Many schools have recently embarked upon initiatives in digital scholarship – those forms of scholarship largely in the humanities and humanistic social sciences that emphasize digital tools and infrastructure, as well as accompanying expertise and support.
    • jatolbert: Why is it "largely in the humanities and humanistic social sciences"? I don't buy this.

Literature and Literary Study in the Digital Age | A course at SUNY Geneseo - 0 views

  • Course site for Literature and Literary Study in the Digital Age, built using WordPress and the Commons In A Box plugin.

Who Framed Augmented Reality? | Johannah King-Slutzky - 0 views

  • The human/drawing interaction trope that Zuckerberg is rebranding as Facebook’s own innovation even predates animated cartoons. One type of scrapbook, the paper dollhouse, played with the appeal of mixing real life and an invented world. It was most popular from 1875 to 1920, and over those years its form remained consistent: a dollhouse unfolded theatrically to create illusions of progress and depth.
  • Winsor McCay’s Gertie the Dinosaur is generally considered the first animated cartoon ever, and it made use of the same trope of mixing reality and man-made art when it premiered all the way back in 1914. McCay was a cartoonist famous for the Freudian, surrealist comic Little Nemo in Slumberland, which was published in weekly instalments in the New York Herald and New York American—though its material is more frequently compared to Bosch than to Garfield. McCay, already two hits deep into his career in the first decade of the twentieth century, purportedly decided to animate a comic strip in 1909 on a dare from friends griping about his daunting productivity. Following a brief stint with an animated Nemo, McCay developed Gertie the Dinosaur, an amiable brontosaurus with a stoner grin, and took her on a vaudeville roadshow across America.
  • LAST MONTH Facebook premiered its vision for the future at its development conference, F8. The camera-app technology Mark Zuckerberg calls augmented reality (or AR) borrows heavily from the social network Snapchat, which enables users to layer animated digital content onto photos on the fly. On stage, Zuckerberg promoted this collaging as social media’s first steps toward modish virtual screen manipulations. “This will allow us to create all kinds of things that were only available in the digital world,” Zuckerberg bubbled effusively. “We’re going to interact with them and explore them together.” Taken in, USA Today repeated this claim to innovation, elaborating on the digital mogul’s Jules Verne-like promise: “We will wander not one, but two worlds—the physical and the digital.” For my part, I was particularly delighted by Facebook’s proposal to animate bowls of cereal with marauding cartoon sharks, savoring, perhaps, the insouciant violence I associate with childhood adventure.

Parsing Ronald Reagan's Words for Early Signs of Alzheimer's - NYTimes.com - 0 views

  • Interesting article about an analysis of Ronald Reagan's news conferences in an attempt to detect early signs of dementia. The "digital humanities" aspect is that the same linguistic analysis has been used to study word-use patterns by novelists.
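The kind of word-use analysis described here can be sketched with a simple lexical-diversity measure such as the type-token ratio. This is a hypothetical illustration, not the study's actual method; the snippets below are invented stand-ins, not transcript data.

```python
def type_token_ratio(text: str) -> float:
    """Lexical diversity: distinct words divided by total words.
    Declining diversity over time is one signal such studies track."""
    words = text.lower().split()
    if not words:
        return 0.0
    return len(set(words)) / len(words)

# Invented snippets standing in for transcripts from two eras
early = "the economy is growing and the nation is strong and confident"
late = "well the thing is the thing we did was the thing"

print(round(type_token_ratio(early), 2))  # more varied vocabulary
print(round(type_token_ratio(late), 2))   # more repetition, lower ratio
```

Real studies normalize for text length and use richer features, but the underlying idea is this kind of per-document vocabulary statistic.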

Digital Map of the Roman Empire - 0 views

  • This map of the Roman Empire lets you zoom in and out and search for cities and other sites of the Roman world.

Rejecting Test Surveillance in Higher Education by Lindsey Barrett :: SSRN - 0 views

  • "The rise of remote proctoring software during the COVID-19 pandemic illustrates the dangers of surveillance-enabled pedagogy built on the belief that students can't be trusted. These services, which deploy a range of identification protocols, computer and internet access limitations, and human or automated observation of students as they take tests remotely, are marketed as necessary to prevent cheating. But the success of these services in their stated goal is ill-supported at best and discredited at worst, particularly given their highly over-inclusive criteria for "suspicious" behavior. Meanwhile, the harms they inflict on students are clear: severe anxiety among test-takers, concerning data collection and use practices, and discriminatory flagging of students of color and students with disabilities have provoked widespread outcry from students, professors, privacy advocates, policymakers, and sometimes universities themselves. To make matters worse, the privacy and civil rights laws most relevant to the use of these services are generally inadequate to protect students from the harms they inflict. Colleges and universities routinely face difficult decisions that require reconciling conflicting interests, but whether to use remote proctoring software isn't one of them. Remote proctoring software is not pedagogically beneficial, institutionally necessary, or remotely unavoidable, and its use further entrenches inequities in higher education that schools should be devoted to rooting out. Colleges and universities should abandon remote proctoring software, and apply the lessons from this failed experiment to their other existing or potential future uses of surveillance technologies and automated decision-making systems that threaten students' privacy, access to important life opportunities, and intellectual freedom."

Computing Crime and Punishment - NYTimes.com - 0 views

  • The article discusses a computer-based analysis of word use in the court reports of trials at the Old Bailey from 1674 through 1913.
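A minimal sketch of this sort of word-use counting, assuming invented sentences in place of the actual Old Bailey reports:

```python
from collections import Counter

def top_words(text: str, n: int = 3):
    """Return the n most common words, ignoring case and trailing punctuation."""
    words = [w.strip(".,;:").lower() for w in text.split()]
    return Counter(words).most_common(n)

# Invented stand-ins for trial reports from two periods
report_a = "the prisoner was whipped and the prisoner was branded"
report_b = "the prisoner was sentenced to transportation, transportation for theft"

print(top_words(report_a))
print(top_words(report_b))
```

Comparing such frequency tables across decades (after filtering out common function words like "the" and "was") is the basic move behind tracking shifts in, say, punishment vocabulary over time.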

The Internet as existential threat « Raph's Website - 1 views

  • Our medical systems have terrible Internet security… MRI machines you can connect to with USB that still have “admin:password” to gain root access. That’s horrifying, sure, but that’s not an attack at scale. More frightening: we’re busily uploading all our medical records to the cloud. Take down that cloud, and no patients can be treated, because nobody will know what they have, what meds they are on. Software swallows your insulin pumps and your pacemakers. To kill people, all you need is to hack that database, or simply erase it or block access to it. After all, we don’t tend to realize that in an Internet of Things, humans are just Things too. As this software monster has encroached on stuff like election systems, the common reaction has been to go back to paper. So let’s consider a less obvious example. We should be going back to paper for our libraries too! We’ve outsourced so much of our knowledge to digital that the amount of knowledge available in analog has dropped notably. There are fewer librarians in fewer libraries with smaller collections than there used to be. If the net goes down, how much reference material is simply not accessible that was thirty years ago? Google Search is “critical cultural infrastructure.” How much redundancy do we actually have? Could a disconnected town actually educate its children? How critical is Google as a whole? If Google went down for a month, I am pretty sure we would see worldwide economic collapse. How much of the world economy passes through Google hosting? How much of it is in GMail? How much is dependent on Google Search, Google Images, Google Docs? The answer is a LOT. And because financial systems are now also JIT, ten thousand corporate blips where real estate agencies and local car washes and a huge pile of software companies and a gaggle of universities and so on are suddenly 100% unable to function digitally (no payroll! no insurance verification!) would absolutely have ripple effects into their suppliers and their customers, and thence to the worldwide economic market. Because interconnection without redundancy increases odds of cascades.
  • But just as critically, governments and state actors seem to be the source of so many of the problems precisely because the Internet is now too many forms of critical infrastructure, and therefore too juicy a target. If software eats everything, then the ability to kill software is the ability to kill anything. Net connectivity becomes the single point of failure for every system connected to it. Even if the Net itself is designed to route around damage, that doesn’t help if it is the single vector of attack that can take down any given target. It’s too juicy a target for the military, too juicy a target for terror, too juicy a target for criminal ransom. The old adage goes “when they came for this, I said nothing. When they came for that…” — we all know it. Consider that the more we hand gleefully over to the cloud because we want convenience, big data, personalization, and so on, we’re creating a single thing that can be taken from us in an instant. We’ve decided to subscribe to everything, instead of owning it. When they came for your MP3s, your DVDs, fine, not “critical infrastructure.” When they came for your resumes, OK, getting closer.
  • As we rush towards putting more and more things “in the cloud,” as we rush towards an Internet of Things with no governance beyond profit motive and anarchy, what we’re effectively doing is creating a massive single point of failure for every system we put in it.