
Bucknell Digital Pedagogy & Scholarship / Group items tagged: innovation


jatolbert

Does Digital Scholarship Have a Future? | EDUCAUSE - 1 views

  • Although the phrase sometimes refers to issues surrounding copyright and open access and sometimes to scholarship analyzing the online world, digital scholarship—emanating, perhaps, from digital humanities—most frequently describes discipline-based scholarship produced with digital tools and presented in digital form.
    • jatolbert
       
      A couple of points. First, there's no reason to assume that DS comes from DH. "Digital" was a term and concept before DH claimed it. Second, I would suggest that DS can be produced with digital tools OR presented digitally OR both. It isn't necessarily always both. I did digital scholarship that was both printed in a conventional journal and published online. Semantic difference, but still important.
  • Though the recent popularity of the phrase digital scholarship reflects impressive interdisciplinary ambition and coherence, two crucial elements remain in short supply in the emerging field. First, the number of scholars willing to commit themselves and their careers to digital scholarship has not kept pace with institutional opportunities. Second, today few scholars are trying, as they did earlier in the web's history, to reimagine the form as well as the substance of scholarship. In some ways, scholarly innovation has been domesticated, with the very ubiquity of the web bringing a lowered sense of excitement, possibility, and urgency. These two deficiencies form a reinforcing cycle: the diminished sense of possibility weakens the incentive for scholars to take risks, and the unwillingness to take risks limits the impact and excitement generated by boldly innovative projects.
    • jatolbert
       
      I'm not sure about any of this. There's plenty of innovation happening. Also, galloping towards innovation for its own sake, without considering the specific needs of scholars, seems like a mistake.
  • Digital scholarship, reimagined in bolder ways, is cost-effective, a smart return on investment. By radically extending the audience for a work of scholarship, by reaching students of many ages and backgrounds, by building the identity of the host institution, by attracting and keeping excellent faculty and students, by creating bonds between faculty and the library, and by advancing knowledge across many otherwise disparate disciplines, innovative digital scholarship makes sense.
  • ...5 more annotations...
  • Yet, other aspects of the changing digital environment may not be encouraging digital scholarship. The large and highly visible investments being made in MOOCs, for example, lead some faculty to equate technology with the diminution of hard-won traditions of teaching and scholarship. Using new capacities in bandwidth, MOOCs extend well-established patterns of large lectures to audiences otherwise out of the hearing range of those lectures. Unlike digital scholarship, however, MOOCs make no claim to creating new disciplinary knowledge, to advancing the scholarly conversation, to unifying research and teaching.
    • jatolbert
       
I don't see why any of this is necessarily a problem--unless you reject the notion of lectures as useful pedagogical forms entirely.
  • In other words, digital scholarship may have greater impact if it takes fuller advantage of the digital medium and innovates more aggressively. Digital books and digital articles that mimic their print counterparts may be efficient, but they do not expand our imagination of what scholarship could be in an era of boundlessness, an era of ubiquity. They do not imagine other forms in which scholarship might live in a time when our audiences can be far more vast and varied than in previous generations. They do not challenge us to think about keeping alive the best traditions of the academy by adapting those traditions to the possibilities of our own time. They do not encourage new kinds of writing, of seeing, of explaining. And we need all those things.
    • jatolbert
       
      Somewhat melodramatic. What kind of innovation does he want, exactly? And what doesn't he like about the formats he mentions here? He lists things that scholars do, suggests they need to change, but makes no compelling case re: WHY they need to change.
  • Interpretation must be an integral and explicit part of the fundamental architecture of new efforts. Insisting that colleges and universities broaden their standards and definitions of scholarship to make room for digital scholarship is necessary, but it is only a partial answer. To be recognized and rewarded as scholarship in the traditional sense, digital scholarship must do the work we have long expected scholarship to do: contribute, in a meaningful and enduring way, to an identifiable collective and cumulative enterprise.
  • By way of example, the Digital Scholarship Lab at the University of Richmond is attempting to build one model of what this new scholarship might look like. The lab combines various elements of proven strategies while also breaking new ground. With the support of the Andrew W. Mellon Foundation, the historians Robert K. Nelson and Scott Nesbit and their colleagues are creating a digital atlas of American history. The first instantiation of the atlas, Visualizing Emancipation, will soon be followed by an amplified, annotated, and animated digital edition of The Atlas of the Historical Geography of the United States, first published in 1932. Over the next three years, chapters of original and dynamic maps and interpretations will focus on key aspects of the American experience since the nation's founding. The digital atlas will allow scholars to see patterns we have never been able to envision before while at the same time it will make available to teachers of all levels visualizations of crucial processes in American history.
    • jatolbert
       
      This one example doesn't seem all that innovative--story maps, etc. have been around a long time. Also, what he's doing is still basically a repackaging of print scholarship. It could be useful, but it's not nearly as radical as he seems to think.
  • Does Digital Scholarship Have a Future?
    • jatolbert
       
      A problematic think piece about digital scholarship in general. Has some useful definitions. Unfortunately Ayers is doing a lot of hand-wringing over what he sees as the lack of meaningful innovation in digital scholarship. It's not at all clear, though, what he means by this. He argues that what innovation has happened isn't sufficient, then gives an example of a project--a digital atlas of American history--that he seems to think is radically different, but isn't in any way I can discern from his description.
Todd Suomela

Making Culture - Expressive & Creative Interaction Technologies Center - 0 views

  • "Making Culture is the first in-depth examination of K-12 education makerspaces nationwide and was created as part of the ExCITe Center's Learning Innovation initiative. This report reveals the significance of cultural aspects of making (student interests, real world relevance, and community collaboration) that enable learning."
Todd Suomela

Guest Post: Is the Research Article Immune to Innovation? - The Scholarly Kitchen - 0 views

  • The real argument, then, is not about when and how traditional formats like the PDF will be replaced; it’s about accepting that the familiar (and perhaps boring) research article still has its purpose, while at the same time thinking ambitiously and creatively about how the humble document can be supplemented with the modern features and functions that the digital environment offers.
Todd Suomela

Who Framed Augmented Reality? | Johannah King-Slutzky - 0 views

  • The human/drawing interaction trope that Zuckerberg is rebranding as Facebook’s own innovation even predates animated cartoons. One type of scrapbook, the paper dollhouse, played with the appeal of mixing real-life and an invented world. It was most popular from 1875-1920, and over forty years its form remained consistent: A dollhouse unfolded theatrically to create illusions of progress and depth.
  • Winsor McCay’s Gertie the Dinosaur is generally considered the first animated cartoon ever, and it made use of the same trope of mixing reality and man-made art when it premiered all the way back in 1914. McCay was a cartoonist famous for the Freudian, surrealist comic Little Nemo in Slumberland, which was published in weekly instalments in the New York Herald and New York American—though its material is more frequently compared to Bosch than to Garfield. McCay, already two hits deep into his career in the first decade of the twentieth century, purportedly decided to animate a comic strip in 1909 on a dare from friends griping about his daunting productivity. Following a brief stint with an animated Nemo, McCay developed Gertie the Dinosaur, an amiable brontosaurus with a stoner grin, and took her on a vaudeville roadshow across America.
  • LAST MONTH Facebook premiered its vision for the future at its development conference, F8. The camera-app technology Mark Zuckerberg calls augmented reality (or AR) borrows heavily from the social network Snapchat, which enables users to layer animated digital content onto photos on the fly. On stage, Zuckerberg promoted this collaging as social media’s first steps toward modish virtual screen manipulations. “This will allow us to create all kinds of things that were only available in the digital world,” Zuckerberg bubbled effusively. “We’re going to interact with them and explore them together.” Taken in, USA Today repeated this claim to innovation, elaborating on the digital mogul’s Jules Verne-like promise: “We will wander not one, but two worlds—the physical and the digital.” For my part, I was particularly delighted by Facebook’s proposal to animate bowls of cereal with marauding cartoon sharks, savoring, perhaps, the insouciant violence I associate with childhood adventure.
Todd Suomela

Young Men Are Playing Video Games Instead of Getting Jobs. That's OK. (For Now.) - Reas... - 0 views

  • Video games, like work, are basically a series of quests comprised of mundane and repetitive tasks: Receive an assignment, travel to a location, overcome some obstacles, perform some sort of search, pick up an item, and then deliver it in exchange for a reward—and, usually, another quest, which starts the cycle all over again. You are not playing the game so much as following its orders. The game is your boss; to succeed, you have to do what it says.
  • Instead of working, they are playing video games. About three quarters of the increase in leisure time among men since 2000 has gone to gaming. Total time spent on computers, including game consoles, has nearly doubled. You might think that this would be demoralizing. A life spent unemployed, living at home, without romantic prospects, playing digital time wasters does not sound particularly appealing on its face. Yet this group reports far higher levels of overall happiness than low-skilled young men from the turn of the 21st century. In contrast, self-reported happiness for older workers without college degrees fell during the same period. For low-skilled young women and men with college degrees, it stayed basically the same. A significant part of the difference comes down to what Hurst has called "innovations in leisure computer activities for young men."

    The problems come later. A young life spent playing video games can lead to a middle age without marketable skills or connections. "There is some evidence," Hurst pointed out, "that these young, lower-skilled men who are happy in their 20s become much less happy in their 30s or 40s."

    So are these guys just wasting their lives, frittering away their time on anti-social activities? Hurst describes his figures as "staggering" and "shocking"—a seismic shift in the relationship of young men to work. "Men in their 20s historically are a group with a strong attachment to the labor force," he writes. "The decline in employment rates for low-skilled men in their 20s was larger than it was for all other sex, age, and skill groups during this same time period."

    But there's another way to think about the change: as a shift in their relationship to unemployment. Research has consistently found that long-term unemployment is one of the most dispiriting things that can happen to a person. Happiness levels tank and never recover. One 2010 study by a group of German researchers suggests that it's worse, over time, for life satisfaction than even the death of a spouse.

    What video games appear to do is ease the psychic pain of joblessness—and to do it in a way that is, if not permanent, at least long-lasting. For low-skilled young men, what is the alternative to playing games? We might like to imagine that they would all become sociable and highly productive members of society, but that is not necessarily the case.
  • A military shooter might offer a simulation of being a crack special forces soldier. A racing game might simulate learning to handle a performance sports car. A sci-fi role-playing game might simulate becoming an effective leader of a massive space colonization effort. But what you're really doing is training yourself to effectively identify on-screen visual cues and twitch your thumb at the right moment. You're learning to handle a controller, not a gun or a race car. You're learning to manage a game's hidden stats system, not a space station. A game provides the sensation of mastery without the actual ability. "It's a simulation of being an expert," Wolpaw says. "It's a way to fulfill a fantasy." That fantasy, ultimately, is one of work, purpose, and social and professional success.