
Bucknell Digital Pedagogy & Scholarship: Group items tagged "html"



Jaron Lanier Interview on What Went Wrong With the Internet

  • The theory of markets and capitalism is that when we compete, what we’re competing for is to get better at something that’s actually a benefit to people, so that everybody wins. So if you’re building a better mousetrap, or a better machine-learning algorithm, then that competition should generate improvement for everybody. But if it’s a purely abstract competition set up between insiders to the exclusion of outsiders, it might feel like a competition, it might feel very challenging and stressful and hard to the people doing it, but it doesn’t actually do anything for anybody else. It’s no longer genuinely productive for anybody, it’s a fake. And I’m a little concerned that a lot of what we’ve been doing in Silicon Valley has started to take on that quality. I think that’s been a problem in Wall Street for a while, but the way it’s been a problem in Wall Street has been aided by Silicon Valley. Everything becomes a little more abstract and a little more computer-based. You have this very complex style of competition that might not actually have much substance to it.
  • I think the fundamental mistake we made is that we set up the wrong financial incentives, and that’s caused us to turn into jerks and screw around with people too much. Way back in the ’80s, we wanted everything to be free because we were hippie socialists. But we also loved entrepreneurs because we loved Steve Jobs. So you wanna be both a socialist and a libertarian at the same time, and it’s absurd. But that’s the kind of absurdity that Silicon Valley culture has to grapple with. And there’s only one way to merge the two things, which is what we call the advertising model, where everything’s free but you pay for it by selling ads. But then because the technology gets better and better, the computers get bigger and cheaper, there’s more and more data — what started out as advertising morphed into continuous behavior modification on a mass basis, with everyone under surveillance by their devices and receiving calculated stimulus to modify them. So you end up with this mass behavior-modification empire, which is straight out of Philip K. Dick, or from earlier generations, from 1984. It’s this thing that we were warned about. It’s this thing that we knew could happen. Norbert Wiener, who coined the term cybernetics, warned about it as a possibility. And despite all the warnings, and despite all of the cautions, we just walked right into it, and we created mass behavior-modification regimes out of our digital networks. We did it out of this desire to be both cool socialists and cool libertarians at the same time.
  • But at the end, I have one that’s a spiritual one. The argument is that social media hates your soul. And it suggests that there’s a whole spiritual, religious belief system along with social media like Facebook that I think people don’t like. And it’s also fucking phony and false. It suggests that life is some kind of optimization, like you’re supposed to be struggling to get more followers and friends. Zuckerberg even talked about how the new goal of Facebook would be to give everybody a meaningful life, as if something about Facebook is where the meaning of life is. It suggests that you’re just a cog in a giant global brain or something like that. The rhetoric from the companies is often about AI, that what they’re really doing — like YouTube’s parent company, Google, says what they really are is building the giant global brain that’ll inherit the earth and they’ll upload you to that brain and then you won’t have to die. It’s very, very religious in the rhetoric. And so it’s turning into this new religion, and it’s a religion that doesn’t care about you. It’s a religion that’s completely lacking in empathy or any kind of personal acknowledgment. And it’s a bad religion. It’s a nerdy, empty, sterile, ugly, useless religion that’s based on false ideas. And I think that of all of the things, that’s the worst thing about it. I mean, it’s sort of like a cult of personality. It’s like in North Korea or some regime where the religion is your purpose to serve this one guy. And your purpose is to serve this one system, which happens to be controlled by one guy, in the case of Facebook. It’s not as blunt and out there, but that is the underlying message of it and it’s ugly and bad. I loathe it, and I think a lot of people have that feeling, but they might not have articulated it or gotten it to the surface because it’s just such a weird and new situation.

New World Notes: Palmer Luckey's Support for Pro-Trump Group Reminds Me of His Support ...

  • Palmer Luckey is the founder of Oculus.

the social-rhetorical challenges of information technology - digital digs

  • This is why a survey coming from IT asking me about the usefulness of the technology in the classroom seems tone-deaf to me. The problem isn’t the technology, or if there are problems with the technology, then they are obscured by the limits of the physical space. I would like for students to have enough space to bring their laptops, move around, work in groups, share their screens (even if only by all moving around in front of a laptop), and have conversations without getting in each other’s way. I’d also like to be able to move among those groups without worrying about pulling a muscle.
  • For Matt to make his points about the interaction between physical space and technology.

DSHR's Blog: The Orphans of Scholarship

  • Reference across scholarly artifacts.

DHQ: Digital Humanities Quarterly: A Genealogy of Distant Reading

  • Because Radway’s voice is candid and engaging, the book may not always sound like social science.
    • jatolbert: I wonder what social science he's been reading.
  • In calling this approach minimally "scientific," I don’t mean to imply that we must suddenly adopt all the mores of chemists, or even psychologists
    • jatolbert: And yet the effect is the same: scientizing processes and products which, by their very natures as human works, resist scientific analysis.
  • social science
    • jatolbert: Again, this is a very different social science from that in which I received my own training, which has long held to the notion that objectivity is not only unobtainable, but undesirable.
  • ...4 more annotations...
  • But computational methods now matter deeply for literary history, because they can be applied to large digital libraries, guided by a theoretical framework that tells us how to pose meaningful questions on a social scale.
    • jatolbert: I wonder about this. Is he suggesting that examining a large corpus of published works is the same as examining an entire society? This would seem to ignore issues of access and audience, literacy, representation, etc.
  • The term digital humanities stages intellectual life as a dialogue between humanists and machines. Instead of explicitly foregrounding experimental methods, it underlines a boundary between the humanities and social science.
  • Conflations of that kind could begin to create an unproductive debate, where parties to the debate fail to grasp the reason for disagreement, because they misunderstand each other’s real positions and commitments.
    • jatolbert: Similar to the conflation of sociology with all of the social sciences.
  • the past
    • jatolbert: Is it appropriate to conflate the -literary- past with -the past-? That is, can any study based wholly on texts claim to be in any way representative of things outside the sphere of what we call "literature"?