The Scholar's Stage: How to Save the (Institutional) Humanities
-
A few years after I graduated, my alma mater decided to overhaul its generals program. After much contentious wrangling over what students should or should not be forced to study, the faculty tasked with developing the general curriculum settled on an elegant compromise: there would be no generals. Except for a basic primer course in mathematics and writing, general credit requirements were jettisoned entirely. Instead, faculty made a list of all majors, minors, and certificates offered at the university, and placed each into one of three categories: science and mathematics, the humanities, and professional skills. From that point forward all students would be required to gain a separate qualification in each of the three categories.
Jaron Lanier Interview on What Went Wrong With the Internet
-
The theory of markets and capitalism is that when we compete, what we’re competing for is to get better at something that’s actually a benefit to people, so that everybody wins. So if you’re building a better mousetrap, or a better machine-learning algorithm, then that competition should generate improvement for everybody. But if it’s a purely abstract competition set up between insiders to the exclusion of outsiders, it might feel like a competition, it might feel very challenging and stressful and hard to the people doing it, but it doesn’t actually do anything for anybody else. It’s no longer genuinely productive for anybody, it’s a fake. And I’m a little concerned that a lot of what we’ve been doing in Silicon Valley has started to take on that quality. I think that’s been a problem in Wall Street for a while, but the way it’s been a problem in Wall Street has been aided by Silicon Valley. Everything becomes a little more abstract and a little more computer-based. You have this very complex style of competition that might not actually have much substance to it.
-
I think the fundamental mistake we made is that we set up the wrong financial incentives, and that’s caused us to turn into jerks and screw around with people too much. Way back in the ’80s, we wanted everything to be free because we were hippie socialists. But we also loved entrepreneurs because we loved Steve Jobs. So you wanna be both a socialist and a libertarian at the same time, and it’s absurd. But that’s the kind of absurdity that Silicon Valley culture has to grapple with. And there’s only one way to merge the two things, which is what we call the advertising model, where everything’s free but you pay for it by selling ads. But then because the technology gets better and better, the computers get bigger and cheaper, there’s more and more data — what started out as advertising morphed into continuous behavior modification on a mass basis, with everyone under surveillance by their devices and receiving calculated stimulus to modify them. So you end up with this mass behavior-modification empire, which is straight out of Philip K. Dick, or from earlier generations, from 1984. It’s this thing that we were warned about. It’s this thing that we knew could happen. Norbert Wiener, who coined the term cybernetics, warned about it as a possibility. And despite all the warnings, and despite all of the cautions, we just walked right into it, and we created mass behavior-modification regimes out of our digital networks. We did it out of this desire to be both cool socialists and cool libertarians at the same time.
-
But at the end, I have one that’s a spiritual one. The argument is that social media hates your soul. And it suggests that there’s a whole spiritual, religious belief system along with social media like Facebook that I think people don’t like. And it’s also fucking phony and false. It suggests that life is some kind of optimization, like you’re supposed to be struggling to get more followers and friends. Zuckerberg even talked about how the new goal of Facebook would be to give everybody a meaningful life, as if something about Facebook is where the meaning of life is. It suggests that you’re just a cog in a giant global brain or something like that. The rhetoric from the companies is often about AI, that what they’re really doing — like YouTube’s parent company, Google, says what they really are is building the giant global brain that’ll inherit the earth and they’ll upload you to that brain and then you won’t have to die. It’s very, very religious in the rhetoric. And so it’s turning into this new religion, and it’s a religion that doesn’t care about you. It’s a religion that’s completely lacking in empathy or any kind of personal acknowledgment. And it’s a bad religion. It’s a nerdy, empty, sterile, ugly, useless religion that’s based on false ideas. And I think that of all of the things, that’s the worst thing about it. I mean, it’s sort of like a cult of personality. It’s like in North Korea or some regime where the religion is your purpose to serve this one guy. And your purpose is to serve this one system, which happens to be controlled by one guy, in the case of Facebook. It’s not as blunt and out there, but that is the underlying message of it and it’s ugly and bad. I loathe it, and I think a lot of people have that feeling, but they might not have articulated it or gotten it to the surface because it’s just such a weird and new situation.
The Necessity of Looking Stupid | Just Visiting
-
I’ve found students to be very insightful when it comes to understanding and assessing their own learning, and very forgiving of my “mistakes.” Just about 100% of what I now do in the classroom has been “authorized” by student feedback, given not through end-of-semester evaluations but through collaborative discussion. Ask students if something worked, and they will tell you. The best part of moving the professorial pedestal out of the room is that all of us get to be a little less fearful, and a little more brave.
Editorial: Digital Engagements; Or, the Virtual Gets Real | Public
Historian Ed Ayers forges ahead with new digital humanities projects
Author discusses new book about how American higher education has always been 'a perfec...
-
The typical university is in constant tension between autonomous academic departments, which control curriculum and faculty hiring and promotion, and a strong president, who controls funding and is responsible only to the lay board of directors who own the place. Also thrown into the mix are a jumble of independent institutes, research centers and academic programs that have emerged in response to a variety of funding opportunities and faculty initiatives. The resulting institution is a hustler’s paradise, driven by a wide array of entrepreneurial actors: faculty trying to pursue intellectual interests and forge a career; administrators trying to protect and enrich the larger enterprise; and donors and students who want to draw on the university’s rich resources and capitalize on association with its stellar brand. These actors are feverishly pursuing their own interests within the framework of the university, which lures them with incentives, draws strength from their complex interactions and then passes these benefits on to society.
-
The biggest problem facing the American system of higher education today is how to deal with its own success. In the 19th century, very few people attended college, so the system was not much in the public spotlight. Burgeoning enrollments in the 20th century put the system center stage, especially when it became the expectation that most people should graduate from some sort of college. As higher education moved from being an option to becoming a necessity, it increasingly found itself under the kind of intense scrutiny that has long been directed at American schools.
-
The danger posed by this accountability pressure is that colleges, like the K-12 schools before them, will come under pressure to narrow their mission to a small number of easily measurable outcomes. Most often the purpose boils down to the efficient delivery of instructional services to students, which will provide them with good jobs and provide society with an expanding economy. This ignores the wide array of social functions that the university serves. It’s a laboratory for working on pressing social problems; a playpen for intellectuals to pursue whatever questions seem interesting; a repository for the knowledge needed to address problems that haven’t yet emerged; a zone of creativity and exploration partially buffered from the realm of necessity; and, yes, a classroom for training future workers. The system’s organizational messiness is central to its social value.
Is AI Riding a One-Trick Pony? - MIT Technology Review
The Dark Secret at the Heart of AI - MIT Technology Review
The Digital-Humanities Bust - The Chronicle of Higher Education
-
To ask about the field is really to ask how or what DH knows, and what it allows us to know. The answer, it turns out, is not much. Let’s begin with the tension between promise and product. Any neophyte to digital-humanities literature notices its extravagant rhetoric of exuberance. The field may be "transforming long-established disciplines like history or literary criticism," according to a Stanford Literary Lab email likely unread or disregarded by a majority in those disciplines. Laura Mandell, director of the Initiative for Digital Humanities, Media, and Culture at Texas A&M University, promises to break "the book format" without explaining why one might want to — even as books, against all predictions, doggedly persist, filling the airplane-hangar-sized warehouses of Amazon.com.
-
A similar shortfall is evident when digital humanists turn to straight literary criticism. "Distant reading," a method of studying novels without reading them, uses computer scanning to search for "units that are much smaller or much larger than the text" (in Franco Moretti’s words) — tropes at one end, genres or systems at the other. One of the most intelligent examples of the technique is Richard Jean So and Andrew Piper’s 2016 Atlantic article, "How Has the MFA Changed the American Novel?" (based on their research for articles published in academic journals). The authors set out to quantify "how similar authors were across a range of literary aspects, including diction, style, theme, setting." But they never state exactly what the computers were asked to quantify. In the real world of novels, after all, style, theme, and character are often achieved relationally — that is, without leaving a trace in words or phrases recognizable as patterns by a program.
-
Perhaps toward that end, So, an assistant professor of English at the University of Chicago, wrote an elaborate article in Critical Inquiry with Hoyt Long (also of Chicago) on the uses of machine learning and "literary pattern recognition" in the study of modernist haiku poetry. Here they actually do specify what they instructed programmers to look for, and what computers actually counted. But the explanation introduces new problems that somehow escape the authors. By their own admission, some of their interpretations derive from what they knew "in advance"; hence the findings do not need the data and, as a result, are somewhat pointless. After 30 pages of highly technical discussion, the payoff is to tell us that haikus have formal features different from other short poems. We already knew that.
Diminishing duplication in evaluating accessibility of education technology