
Contents contributed and discussions participated by Todd Suomela

Todd Suomela

Jaron Lanier Interview on What Went Wrong With the Internet

  • The theory of markets and capitalism is that when we compete, what we’re competing for is to get better at something that’s actually a benefit to people, so that everybody wins. So if you’re building a better mousetrap, or a better machine-learning algorithm, then that competition should generate improvement for everybody.

    But if it’s a purely abstract competition set up between insiders to the exclusion of outsiders, it might feel like a competition, it might feel very challenging and stressful and hard to the people doing it, but it doesn’t actually do anything for anybody else. It’s no longer genuinely productive for anybody, it’s a fake. And I’m a little concerned that a lot of what we’ve been doing in Silicon Valley has started to take on that quality. I think that’s been a problem in Wall Street for a while, but the way it’s been a problem in Wall Street has been aided by Silicon Valley. Everything becomes a little more abstract and a little more computer-based. You have this very complex style of competition that might not actually have much substance to it.

  • I think the fundamental mistake we made is that we set up the wrong financial incentives, and that’s caused us to turn into jerks and screw around with people too much. Way back in the ’80s, we wanted everything to be free because we were hippie socialists. But we also loved entrepreneurs because we loved Steve Jobs. So you wanna be both a socialist and a libertarian at the same time, and it’s absurd. But that’s the kind of absurdity that Silicon Valley culture has to grapple with.

    And there’s only one way to merge the two things, which is what we call the advertising model, where everything’s free but you pay for it by selling ads. But then because the technology gets better and better, the computers get bigger and cheaper, there’s more and more data — what started out as advertising morphed into continuous behavior modification on a mass basis, with everyone under surveillance by their devices and receiving calculated stimulus to modify them. So you end up with this mass behavior-modification empire, which is straight out of Philip K. Dick, or from earlier generations, from 1984.

    It’s this thing that we were warned about. It’s this thing that we knew could happen. Norbert Wiener, who coined the term cybernetics, warned about it as a possibility. And despite all the warnings, and despite all of the cautions, we just walked right into it, and we created mass behavior-modification regimes out of our digital networks. We did it out of this desire to be both cool socialists and cool libertarians at the same time.

  • But at the end, I have one that’s a spiritual one. The argument is that social media hates your soul. And it suggests that there’s a whole spiritual, religious belief system along with social media like Facebook that I think people don’t like. And it’s also fucking phony and false. It suggests that life is some kind of optimization, like you’re supposed to be struggling to get more followers and friends. Zuckerberg even talked about how the new goal of Facebook would be to give everybody a meaningful life, as if something about Facebook is where the meaning of life is.

    It suggests that you’re just a cog in a giant global brain or something like that. The rhetoric from the companies is often about AI, that what they’re really doing — like YouTube’s parent company, Google, says what they really are is building the giant global brain that’ll inherit the earth and they’ll upload you to that brain and then you won’t have to die. It’s very, very religious in the rhetoric. And so it’s turning into this new religion, and it’s a religion that doesn’t care about you. It’s a religion that’s completely lacking in empathy or any kind of personal acknowledgment. And it’s a bad religion. It’s a nerdy, empty, sterile, ugly, useless religion that’s based on false ideas. And I think that of all of the things, that’s the worst thing about it.

    I mean, it’s sort of like a cult of personality. It’s like in North Korea or some regime where the religion is your purpose to serve this one guy. And your purpose is to serve this one system, which happens to be controlled by one guy, in the case of Facebook.

    It’s not as blunt and out there, but that is the underlying message of it and it’s ugly and bad. I loathe it, and I think a lot of people have that feeling, but they might not have articulated it or gotten it to the surface because it’s just such a weird and new situation.


Open Pedagogy Notebook | Sharing Practices, Building Community

    "This website is designed to serve as a resource for educators interested in learning more about Open Pedagogy.

    We invite you to browse through the examples, which include both classroom-tested practices and budding ideas, and to consider contributing examples of your own experiments with open pedagogy"

Topic Modeling in Python with NLTK and Gensim | DataScience+

    "In this post, we will learn how to identify which topic is discussed in a document, called topic modeling. In particular, we will cover Latent Dirichlet Allocation (LDA): a widely used topic modeling technique. And we will apply LDA to convert a set of research papers to a set of topics."

The Art of Unlearning

  • I see two main views of learning. The first is like stamp collecting. The person wants to collect more and more knowledge, mostly for the purposes of showing it off to people they want to impress. The knowledge here is largely inert and unimportant for their lives—it’s just a collecting hobby accruing more facts and ideas.

    There’s nothing wrong with stamp collecting. Knowing facts and ideas, even if they aren’t particularly useful or central to our lives, isn’t a bad thing. It’s probably a superior hobby to many other pursuits, since knowledge can, at least some of the time, spill over into more practical consequences.

    The other view of learning, however, is centered around unlearning. This is the view that what we think we know about the world is a veneer of sense-making atop a much deeper strangeness. The things we think we know, we often don’t. The ideas, philosophies and truths that guide our lives may be convenient approximations, but often the more accurate picture is a lot stranger and more interesting.

  • A good meta-belief to this whole unlearning endeavor is to be comfortable with the idea that everything you know is provisional, and that underneath what you know is likely a more complex and stranger picture.

    Human beings seem to be naturally afraid of this groundless view of things. I’m not quite sure why that is. It may be that this kind of epistemic flexibility might lead people to question societal norms and rules of conduct, so that people who think too much about things may seem to have an amoral character. That’s certainly the perspective of many traditional religious viewpoints, which discourage open-ended inquiry in favor of professing allegiance to dogma.


"We do software so that you can do education": The curious case of MOOC platforms - Wor...

  • edX’s case illustrates one mechanism through which this happens: the construction of organizational roles. Consider the separation of software and pedagogy within the edX ecosystem. As edX expanded its slate of partners, its first clients and patrons, MIT and Harvard, saw a decline in their own ability to set the agenda and control the direction of the software. These “users” argue that the software has an implicit theory of pedagogy embedded in it, and that, as experts on pedagogy, they should have more of a say in shaping the software. While acknowledging this, edX’s architects counter that they—and not the Harvard-MIT folks—should have the final say on prioritizing which features to build, not only because they understand the software the best, but also because they see themselves as best placed to understand which features might benefit the whole eco-system rather than just particular players.

    The standard template in the education technology industry is that the technology experts are only supposed to “implement” what the pedagogy experts ask. What is arguably new about the edX platform framework is that the software is prior to, and thereby more constitutive of, the pedagogy.
