
Bucknell Digital Pedagogy & Scholarship: Group items tagged "writing"


Jennifer Parrott

Writing professors question plagiarism detection software | Inside Higher Ed - 0 views

  • Discusses problems with requiring students to use Turnitin.
jatolbert

Open Stacks: Making DH Labor Visible ← dh+lib - 1 view

  • When infrastructure is understood as an irrational social formation, emotional labor tends to compensate for a perceived lack of resources. Scholars who are used to the invisibility of traditional library services, for instance, find that digital projects expose hierarchies and bureaucracies that they don’t want to negotiate or even think about, and the DH librarian or one of her colleagues steps in to run interference. Why can’t the dean of libraries just tell that department to create the metadata for my project? After all, they already create metadata for the library’s systems. Why can’t web programming be a service you provide to me like interlibrary loan? I thought the library was here to support my scholarship. Why can’t you maintain my website after I retire–exactly the way it looks and feels today, plus update it as technology changes? In some conversations, these questions may be rhetorical; it may take emotional labor to answer them, but doing so exposes the workings of the library’s infrastructure–its social stack.
    • jatolbert: More conflation of DH with all digital scholarship
  • How does DH fit within this megastructure? According to some critics, DH is part of the problem of the neoliberal university because it privileges networked, collaborative scholarship over individual production. If creating a tool (hacking) or using computational methods has the same scholarly significance as writing a monograph, then individualized knowledge pursued for its own sake, the struggle at the heart of humanistic inquiry, is devalued. Yet writing a book always depended on invisible (gendered) labor in the academy. Word processing, library automation, and widespread digitization are just three examples of the support labor for traditional scholarly work that Bratton’s globalized technology Stack has absorbed. (And we know that the fruits of that labor are in no way distributed equitably.) What has changed in the neoliberal university is that the humanities scholar becomes one more node in a knowledge-producing system. Does it matter, then, whether DH work produces ideas or things, critics say, if all are absorbed into a totalizing system that elides the individual scholar’s privileged position? This is of course a vision of scholarship that is traditionally specific to the humanities; lab science and the performing arts, for example, have always been deeply collaborative (but with their own systems of privilege and credit).
  • DH librarians, whose highly collaborative work is dedicated to social justice and public engagement, may be one particularly vital community of practice for exposing the changing conditions that create knowledge.
  • ...7 more annotations...
  • like the fish who asks “what is water?”–most scholars are unaware of the extent to which their work, professional interactions, and finances are imbricated with the global technology Stack.
    • jatolbert: Also not sure that this is true.
  • Many DH programs, initiatives, and teams have arisen organically out of social connections rather than centralized planning.
  • the myth of scarcity
  • Scholars often presume that because libraries acquire, shelve, and preserve the print books that they write, the same libraries will acquire, shelve (or host), and preserve digital projects.
    • jatolbert: This is a natural assumption, and in fact is true in many cases.
  • digital scholarship
  • DH
Todd Suomela

DSHR's Blog: Ithaka's Perspective on Digital Preservation - 0 views

  • Second, there is very little coverage of Web archiving, which is clearly by far the largest and most important digital preservation initiative both for current and future readers. The Internet Archive rates only two mentions, in the middle of a list of activities and in a footnote. This is despite the fact that archive.org is currently the 211th most visited site in the US (272nd globally) with over 5.5M registered users, adding over 500 per day, and serving nearly 4M unique IPs per day. For comparison, the Library of Congress currently ranks 1439th in the US (5441st globally). The Internet Archive's Web collection alone probably dwarfs all other digital preservation efforts combined both in size and in usage. Not to mention their vast collections of software, digitized books, audio, video and TV news.
    Rieger writes: There is a lack of understanding about how archived websites are discovered, used, and referenced. "Researchers prefer to cite the original live-web as it is easier and shorter," pointed out one of the experts. "There is limited awareness of the existence of web archives and lack of community consensus on how to treat them in scholarly work. The problems are not about technology any more, it is about usability, awareness, and scholarly practices." The interviewee referred to a recent CRL study based on an analysis of referrals to archived content from papers that concluded that the citations were mainly to articles about web archiving projects.
    It is surprising that the report doesn't point out that the responsibility for educating scholars in the use of resources lies with the "experts and thought leaders" from institutions such as the University of California, Michigan State, Cornell, MIT, NYU and Virginia Tech. That these "experts and thought leaders" don't consider the Internet Archive to be a resource worth mentioning might have something to do with the fact that their scholars don't know that they should be using it. A report whose first major section, entitled "What's Working Well", totally fails to acknowledge the single most important digital preservation effort of the last two decades clearly lacks credibility.
  • Finally, there is no acknowledgement that the most serious challenge facing the field is economic. Except for a few corner cases, we know how to do digital preservation; we just don't want to pay enough to have it done. Thus the key challenge is to achieve some mixture of significant increase in funding for, and significant cost reduction in the processes of, digital preservation. Information technology processes naturally have very strong economies of scale, which result in winner-take-all markets (as W. Brian Arthur pointed out in 1985).
    It is notable that the report doesn't mention the winners we already have, in Web and source code archiving, and in emulation. All are at the point where a competitor is unlikely to be viable. To be affordable, digital preservation needs to be done at scale. The report's orientation is very much "let a thousand flowers bloom", which in IT markets only happens at a very early stage. This is likely the result of talking only to people nurturing a small-scale flower, not to people who have already dominated their market niche. It is certainly a risk that each area will have a single point of failure, but trying to fight against the inherent economics of IT pretty much guarantees ineffectiveness.
  • 1) The big successes in the field haven't come from consensus building around a roadmap; they have come from idiosyncratic individuals such as Brewster Kahle, Roberto di Cosmo and Jason Scott identifying a need and building a system to address it no matter what "the community" thinks. We have a couple of decades of experience showing that "the community" is incapable of coming to a coherent consensus that leads to action on a scale appropriate to the problem. In any case, describing road-mapping as "research" is a stretch.
    2) Under severe funding pressure, almost all libraries have de-emphasized their custodial role of building collections in favor of responding to immediate client needs. Rieger writes: As one interviewee stated, library leaders have "shifted their attention from seeing preservation as a moral imperative to catering to the university's immediate needs." Regrettably, but inevitably given the economics of IT markets, this provides a market opportunity for outsourcing. Ithaka has exploited one such opportunity with Portico. This bullet does describe "research" in the sense of "market research". Success is, however, much more likely to come from the success of an individual effort than from a consensus about what should be done among people who can't actually do it.
    3) In the current climate, increased funding for libraries and archives simply isn't going to happen. These institutions have shown a marked reluctance to divert their shrinking funds from legacy to digital media. Thus the research topic with the greatest leverage in turning funds into preserved digital content is increasing the cost-effectiveness of the tools, processes and infrastructure of digital preservation.
Todd Suomela

The Scholar's Stage: Teaching the Humanities as Terribly as Possible - 0 views

  • Dive into the past and you will see this theme emerge time and again: the purpose of studying history, philosophy, and poetry is to help us lead better lives and be better people. The humanities are an education for the soul. Placed next to these paeans to education, the aims of the "Theology of Dostoevsky" course are crippling. Reading Dostoevsky will help students learn how to "contextualize literature within its anthropological milieu." Dostoevsky will teach them to see "the unique interpretive problems inherent in studying creative genres" and discussing his works will help them "communicate more effectively, verbally and in writing, about theological literature." That is the purpose of reading a man regularly called the best novelist in human history! We read him to "meet academic standards for writing and notation!" How painfully limited.
jatolbert

The Digital-Humanities Bust - The Chronicle of Higher Education - 0 views

  • To ask about the field is really to ask how or what DH knows, and what it allows us to know. The answer, it turns out, is not much. Let's begin with the tension between promise and product. Any neophyte to digital-humanities literature notices its extravagant rhetoric of exuberance. The field may be "transforming long-established disciplines like history or literary criticism," according to a Stanford Literary Lab email likely unread or disregarded by a majority in those disciplines. Laura Mandell, director of the Initiative for Digital Humanities, Media, and Culture at Texas A&M University, promises to break "the book format" without explaining why one might want to — even as books, against all predictions, doggedly persist, filling the airplane-hangar-sized warehouses of Amazon.com.
  • A similar shortfall is evident when digital humanists turn to straight literary criticism. "Distant reading," a method of studying novels without reading them, uses computer scanning to search for "units that are much smaller or much larger than the text" (in Franco Moretti’s words) — tropes, at one end, genres or systems, at the other. One of the most intelligent examples of the technique is Richard Jean So and Andrew Piper’s 2016 Atlantic article, "How Has the MFA Changed the American Novel?" (based on their research for articles published in academic journals). The authors set out to quantify "how similar authors were across a range of literary aspects, including diction, style, theme, setting." But they never cite exactly what the computers were asked to quantify. In the real world of novels, after all, style, theme, and character are often achieved relationally — that is, without leaving a trace in words or phrases recognizable as patterns by a program.
  • Perhaps toward that end, So, an assistant professor of English at the University of Chicago, wrote an elaborate article in Critical Inquiry with Hoyt Long (also of Chicago) on the uses of machine learning and "literary pattern recognition" in the study of modernist haiku poetry. Here they actually do specify what they instructed programmers to look for, and what computers actually counted. But the explanation introduces new problems that somehow escape the authors. By their own admission, some of their interpretations derive from what they knew "in advance"; hence the findings do not need the data and, as a result, are somewhat pointless. After 30 pages of highly technical discussion, the payoff is to tell us that haikus have formal features different from other short poems. We already knew that.
  • ...2 more annotations...
  • The outsized promises of big-data mining (which have been a fixture in big-figure grant proposals) seem curiously stuck at the level of confident assertion. In a 2011 New Left Review article, "Network Theory, Plot Analysis," Moretti gives us a promissory note that characterizes a lot of DH writing: "One day, after we add to these skeletons the layers of direction, weight and semantics, those richer images will perhaps make us see different genres — tragedies and comedies; picaresque, gothic, Bildungsroman … — as different shapes; ideally, they may even make visible the micro-patterns out of which these larger network shapes emerge." But what are the semantics of a shape when measured against the tragedy to which it corresponds? If "shape" is only a place-holder meant to allow for more-complex calculations of literary meaning (disburdened of their annoyingly human baggage), by what synesthetic principle do we reconvert it into its original, now reconfigured, genre-form? It is not simply that no answers are provided; it is that DH never asks the questions. And without them, how can Moretti’s "one day" ever arrive?
  • For all its resources, the digital humanities makes a rookie mistake: It confuses more information for more knowledge. DH doesn’t know why it thinks it knows what it does not know. And that is an odd place for a science to be.
Todd Suomela

The Scholar's Stage: How to Save the (Institutional) Humanities - 0 views

  • A few years after I graduated, my alma mater decided to overhaul their generals program. After much contentious wrangling over what students should or should not be forced to study, the faculty tasked with developing the general curriculum settled on an elegant compromise: there would be no generals. Except for a basic primer course in mathematics and writing, general credit requirements were jettisoned entirely. Instead, faculty made a list of all majors, minors, and certificates offered at the university, and placed each into one of three categories: science and mathematics, the humanities, and professional skills. From this point forward all students would be required to gain a separate qualification in each of the three categories.
Todd Suomela

Fanning the Flames While the Humanities Burn - The Chronicle of Higher Education - 0 views

  • Yet before we have even had time to digest this criticism, the authors change their mind: Perhaps diversification is responsible for shrinkage, and it is for the best! "Some of what Kay figures as disciplinary attrition," they write, "looks from our vantage point like the very necessary unsettling of white male dominance." It is not entirely clear whether they mean that the tremendous drop in enrollment and jobs can be accounted for by the attrition of white males (it cannot), or rather, more likely, that the shrinkage of the profession is a necessary and therefore justified consequence of the moral housecleaning it was forced to endure. On the latter reading, the problem with Kay’s essay is not one of diagnosis. It is rather that he fails to appreciate the extent to which both he and the discipline he eulogizes deserve whatever misfortune happens to befall them. (Indeed, these four horsewomen of the apocalypse promise that "a cleansing flame will allow us to build a better structure.")
  • Their lengthy description, displaying most fully the confusions of snark for wit and of hyperbole for exactitude that pervade the piece, is not, strictly speaking, false. Choosing what to wear for conferences and interviews is not always easy. And the attempt to meet contradictory standards (formal but not too formal, etc.) leads both men and women to come to resemble one another, as they all jointly reach toward an elusive ideal of professional suitability. But this awkward process of convergence is surely just what Kay means to convey with his description of the participants as sharing a "self-conscious aesthetic." As for misogyny, it is not obvious to me who comes off worse, the men with their "mummified" hair and pairs of identical try-hard-casual sneakers, or those women in their suits. But then again, who cares? The structuring idea of the essay, remember, is that English professors are dithering while their profession dies. It is hard to imagine a better illustration of this point than four tenure-track professors spending five paragraphs of their response criticizing a line about Ann Taylor dresses.
  • Kay says very clearly what he misses about his life at the university: talking to and reading poetry with an adviser he admires, doing work he cares about, and being part of a community that could provide him with the opportunity to talk about literature with those who share his love for it. This fantasy is the fantasy of those who wish to dedicate themselves to a life of the mind. It is mine. Here, apparently, lies Kay’s real sin. It is not his unwitting bigotry. Ultimately, the scandal of his piece has little to do with his adoring descriptions of his academic adviser or his sartorial observational satire. His sin is that he fails to embrace his own sacrifice as well justified, fails to see his own loss as the "very necessary unsettling of white male dominance," fails to welcome the "cleansing flame." The problem is not what Kay says but that he dares to speak of his own predicament — that he dares to want publicly anything at all. After all, according to the authors, Kay, despite having had to abandon his vocation, possesses a power and freedom that they can only dream of. "Our point is this: It’s not that no woman would have written an essay like Kay’s. It’s that no woman could have done so, because no woman is permitted to navigate the MLA — let alone the world — in this fashion." Kay, that is, betrays women not only by failing to portray them as sufficiently capable and accomplished but also, and without contradiction, by failing to portray the degree to which they are, compared to him, utterly powerless. What woman could go in and out of conference rooms? What woman could sidle up to a couple of octogenarians in a conference-hotel lobby?
jatolbert

Does Digital Scholarship Have a Future? | EDUCAUSE - 1 view

  • Although the phrase sometimes refers to issues surrounding copyright and open access and sometimes to scholarship analyzing the online world, digital scholarship—emanating, perhaps, from digital humanities—most frequently describes discipline-based scholarship produced with digital tools and presented in digital form.
    • jatolbert: A couple of points. First, there's no reason to assume that DS comes from DH. "Digital" was a term and concept before DH claimed it. Second, I would suggest that DS can be produced with digital tools OR presented digitally OR both. It isn't necessarily always both. I did digital scholarship that was both printed in a conventional journal and published online. Semantic difference, but still important.
  • Though the recent popularity of the phrase digital scholarship reflects impressive interdisciplinary ambition and coherence, two crucial elements remain in short supply in the emerging field. First, the number of scholars willing to commit themselves and their careers to digital scholarship has not kept pace with institutional opportunities. Second, today few scholars are trying, as they did earlier in the web's history, to reimagine the form as well as the substance of scholarship. In some ways, scholarly innovation has been domesticated, with the very ubiquity of the web bringing a lowered sense of excitement, possibility, and urgency. These two deficiencies form a reinforcing cycle: the diminished sense of possibility weakens the incentive for scholars to take risks, and the unwillingness to take risks limits the impact and excitement generated by boldly innovative projects.
    • jatolbert: I'm not sure about any of this. There's plenty of innovation happening. Also, galloping towards innovation for its own sake, without considering the specific needs of scholars, seems like a mistake.
  • Digital scholarship, reimagined in bolder ways, is cost-effective, a smart return on investment. By radically extending the audience for a work of scholarship, by reaching students of many ages and backgrounds, by building the identity of the host institution, by attracting and keeping excellent faculty and students, by creating bonds between faculty and the library, and by advancing knowledge across many otherwise disparate disciplines, innovative digital scholarship makes sense.
  • ...5 more annotations...
  • Yet, other aspects of the changing digital environment may not be encouraging digital scholarship. The large and highly visible investments being made in MOOCs, for example, lead some faculty to equate technology with the diminution of hard-won traditions of teaching and scholarship. Using new capacities in bandwidth, MOOCs extend well-established patterns of large lectures to audiences otherwise out of the hearing range of those lectures. Unlike digital scholarship, however, MOOCs make no claim to creating new disciplinary knowledge, to advancing the scholarly conversation, to unifying research and teaching.
    • jatolbert: I don't see why any of this is necessarily a problem--unless you reject the notion of lectures as useful pedagogical forms entirely.
  • In other words, digital scholarship may have greater impact if it takes fuller advantage of the digital medium and innovates more aggressively. Digital books and digital articles that mimic their print counterparts may be efficient, but they do not expand our imagination of what scholarship could be in an era of boundlessness, an era of ubiquity. They do not imagine other forms in which scholarship might live in a time when our audiences can be far more vast and varied than in previous generations. They do not challenge us to think about keeping alive the best traditions of the academy by adapting those traditions to the possibilities of our own time. They do not encourage new kinds of writing, of seeing, of explaining. And we need all those things.
    • jatolbert: Somewhat melodramatic. What kind of innovation does he want, exactly? And what doesn't he like about the formats he mentions here? He lists things that scholars do, suggests they need to change, but makes no compelling case re: WHY they need to change.
  • Interpretation must be an integral and explicit part of the fundamental architecture of new efforts. Insisting that colleges and universities broaden their standards and definitions of scholarship to make room for digital scholarship is necessary, but it is only a partial answer. To be recognized and rewarded as scholarship in the traditional sense, digital scholarship must do the work we have long expected scholarship to do: contribute, in a meaningful and enduring way, to an identifiable collective and cumulative enterprise.
  • By way of example, the Digital Scholarship Lab at the University of Richmond is attempting to build one model of what this new scholarship might look like. The lab combines various elements of proven strategies while also breaking new ground. With the support of the Andrew W. Mellon Foundation, the historians Robert K. Nelson and Scott Nesbit and their colleagues are creating a digital atlas of American history. The first instantiation of the atlas, Visualizing Emancipation, will soon be followed by an amplified, annotated, and animated digital edition of The Atlas of the Historical Geography of the United States, first published in 1932. Over the next three years, chapters of original and dynamic maps and interpretations will focus on key aspects of the American experience since the nation's founding. The digital atlas will allow scholars to see patterns we have never been able to envision before while at the same time it will make available to teachers of all levels visualizations of crucial processes in American history.
    • jatolbert: This one example doesn't seem all that innovative--story maps, etc. have been around a long time. Also, what he's doing is still basically a repackaging of print scholarship. It could be useful, but it's not nearly as radical as he seems to think.
  • Does Digital Scholarship Have a Future?
    • jatolbert: A problematic think piece about digital scholarship in general. Has some useful definitions. Unfortunately Ayers is doing a lot of hand-wringing over what he sees as the lack of meaningful innovation in digital scholarship. It's not at all clear, though, what he means by this. He argues that what innovation has happened isn't sufficient, then gives an example of a project--a digital atlas of American history--that he seems to think is radically different, but isn't in any way I can discern from his description.
Todd Suomela

Young Men Are Playing Video Games Instead of Getting Jobs. That's OK. (For Now.) - Reas... - 0 views

  • Video games, like work, are basically a series of quests comprised of mundane and repetitive tasks: Receive an assignment, travel to a location, overcome some obstacles, perform some sort of search, pick up an item, and then deliver it in exchange for a reward—and, usually, another quest, which starts the cycle all over again. You are not playing the game so much as following its orders. The game is your boss; to succeed, you have to do what it says.
  • Instead of working, they are playing video games. About three quarters of the increase in leisure time among men since 2000 has gone to gaming. Total time spent on computers, including game consoles, has nearly doubled.
    You might think that this would be demoralizing. A life spent unemployed, living at home, without romantic prospects, playing digital time wasters does not sound particularly appealing on its face. Yet this group reports far higher levels of overall happiness than low-skilled young men from the turn of the 21st century. In contrast, self-reported happiness for older workers without college degrees fell during the same period. For low-skilled young women and men with college degrees, it stayed basically the same. A significant part of the difference comes down to what Hurst has called "innovations in leisure computer activities for young men."
    The problems come later. A young life spent playing video games can lead to a middle age without marketable skills or connections. "There is some evidence," Hurst pointed out, "that these young, lower-skilled men who are happy in their 20s become much less happy in their 30s or 40s."
    So are these guys just wasting their lives, frittering away their time on anti-social activities? Hurst describes his figures as "staggering" and "shocking"—a seismic shift in the relationship of young men to work. "Men in their 20s historically are a group with a strong attachment to the labor force," he writes. "The decline in employment rates for low-skilled men in their 20s was larger than it was for all other sex, age, and skill groups during this same time period."
    But there's another way to think about the change: as a shift in their relationship to unemployment. Research has consistently found that long-term unemployment is one of the most dispiriting things that can happen to a person. Happiness levels tank and never recover. One 2010 study by a group of German researchers suggests that it's worse, over time, for life satisfaction than even the death of a spouse. What video games appear to do is ease the psychic pain of joblessness—and to do it in a way that is, if not permanent, at least long-lasting. For low-skilled young men, what is the alternative to playing games? We might like to imagine that they would all become sociable and highly productive members of society, but that is not necessarily the case.
  • A military shooter might offer a simulation of being a crack special forces soldier. A racing game might simulate learning to handle a performance sports car. A sci-fi role-playing game might simulate becoming an effective leader of a massive space colonization effort. But what you're really doing is training yourself to effectively identify on-screen visual cues and twitch your thumb at the right moment. You're learning to handle a controller, not a gun or a race car. You're learning to manage a game's hidden stats system, not a space station. A game provides the sensation of mastery without the actual ability. "It's a simulation of being an expert," Wolpaw says. "It's a way to fulfill a fantasy." That fantasy, ultimately, is one of work, purpose, and social and professional success.