Home/ Instructional & Media Services at Dickinson College/ Group items tagged everything


Ed Webb

What Bruce Sterling Actually Said About Web 2.0 at Webstock 09 | Beyond the Beyond from... - 0 views

  • things in it that pretended to be ideas, but were not ideas at all: they were attitudes
    • Ed Webb
       
      Like Edupunk
  • A sentence is a verbal construction meant to express a complete thought. This congelation that Tim O'Reilly constructed, that is not a complete thought. It's a network in permanent beta.
  • This chart is five years old now, which is 35 years old in Internet years, but intellectually speaking, it's still new in the world. It's alarming how hard it is to say anything constructive about this from any previous cultural framework.
  • ...20 more annotations...
  • "The cloud as platform." That is insanely great. Right? You can't build a "platform" on a "cloud!" That is a wildly mixed metaphor! A cloud is insubstantial, while a platform is a solid foundation! The platform falls through the cloud and is smashed to earth like a plummeting stock price!
  • luckily, we have computers in banking now. That means Moore's law is gonna save us! Instead of it being really obvious who owes what to whom, we can have a fluid, formless ownership structure that's always in permanent beta. As long as we keep moving forward, adding attractive new features, the situation is booming!
  • Web 2.0 is supposed to be business. This isn't a public utility or a public service, like the old model of an Information Superhighway established for the public good.
  • it's turtles all the way down
  • "Tagging not taxonomy." Okay, I love folksonomy, but I don't think it's gone very far. There have been books written about how ambient searchability through folksonomy destroys the need for any solid taxonomy. Not really. The reality is that we don't have a choice, because we have no conceivable taxonomy that can catalog the avalanche of stuff on the Web.
  • JavaScript is the duct tape of the Web. Why? Because you can do anything with it. It's not the steel girders of the web, it's not the laws of physics of the web. Javascript is beloved of web hackers because it's an ultimate kludge material that can stick anything to anything. It's a cloud, a web, a highway, a platform and a floor wax. Guys with attitude use JavaScript.
  • Before the 1990s, nobody had any "business revolutions." People in trade are supposed to be very into long-term contracts, a stable regulatory environment, risk management, and predictable returns to stockholders. Revolutions don't advance those things. Revolutions annihilate those things. Is that "businesslike"? By whose standards?
  • I just wonder what kind of rattletrap duct-taped mayhem is disguised under a smooth oxymoron like "collective intelligence."
  • the people whose granular bits of input are aggregated by Google are not a "collective." They're not a community. They never talk to each other. They've got basically zero influence on what Google chooses to do with their mouseclicks. What's "collective" about that?
  • I really think it's the original sin of geekdom, a kind of geek thought-crime, to think that just because you yourself can think algorithmically, and impose some of that on a machine, that this is "intelligence." That is not intelligence. That is rules-based machine behavior. It's code being executed. It's a powerful thing, it's a beautiful thing, but to call that "intelligence" is dehumanizing. You should stop that. It does not make you look high-tech, advanced, and cool. It makes you look delusionary.
  • I'd definitely like some better term for "collective intelligence," something a little less streamlined and metaphysical. Maybe something like "primeval meme ooze" or "semi-autonomous data propagation." Even some Kevin Kelly style "neobiological out of control emergent architectures." Because those weird new structures are here, they're growing fast, we depend on them for mission-critical acts, and we're not gonna get rid of them any more than we can get rid of termite mounds.
  • Web 2.0 guys: they've got their laptops with whimsical stickers, the tattoos, the startup T-shirts, the brainy-glasses -- you can tell them from the general population at a glance. They're a true creative subculture, not a counterculture exactly -- but in their number, their relationship to the population, quite like the Arts and Crafts people from a hundred years ago. Arts and Crafts people, they had a lot of bad ideas -- much worse ideas than Tim O'Reilly's ideas. It wouldn't bother me any if Tim O'Reilly was Governor of California -- he couldn't be any weirder than that guy they've got already. Arts and Crafts people gave it their best shot, they were in earnest -- but everything they thought they knew about reality was blown to pieces by the First World War. After that misfortune, there were still plenty of creative people surviving. Futurists, Surrealists, Dadaists -- and man, they all despised Arts and Crafts. Everything about Art Nouveau that was sexy and sensual and liberating and flower-like, man, that stank in their nostrils. They thought that Art Nouveau people were like moronic children.
  • in the past eighteen months, 24 months, we've seen ubiquity initiatives from Nokia, Cisco, General Electric, IBM... Microsoft even, Jesus, Microsoft, the place where innovative ideas go to die.
  • what comes next is a web with big holes blown in it. A spiderweb in a storm. The turtles get knocked out from under it, the platform sinks through the cloud. A lot of the inherent contradictions of the web get revealed, the contradictions in the oxymorons smash into each other. The web has to stop being a meringue frosting on the top of business, this make-do melange of mashups and abstraction layers. Web 2.0 goes away. Its work is done. The thing I always loved best about Web 2.0 was its implicit expiration date. It really took guts to say that: well, we've got a bunch of cool initiatives here, and we know they're not gonna last very long. It's not Utopia, it's not a New World Order, it's just a brave attempt to sweep up the ashes of the burst Internet Bubble and build something big and fast with the small burnt-up bits that were loosely joined. That showed more maturity than Web 1.0. It was visionary, it was inspiring, but there were fewer moon rockets flying out of its head. "Gosh, we're really sorry that we accidentally ruined the NASDAQ." We're Internet business people, but maybe we should spend less of our time stock-kiting. The Web's a communications medium -- how 'bout working on the computer interface, so that people can really communicate? That effort was time well spent. Really.
  • The poorest people in the world love cellphones.
  • Digital culture, I knew it well. It died -- young, fast and pretty. It's all about network culture now.
  • There's gonna be a Transition Web. Your economic system collapses: Eastern Europe, Russia, the Transition Economy, that bracing experience is for everybody now. Except it's not Communism transitioning toward capitalism. It's the whole world into transition toward something we don't even have proper words for.
  • The Transition Web is a culture model. If it's gonna work, it's got to replace things that we used to pay for with things that we just plain use.
  • Not every Internet address was a dotcom. In fact, dotcoms showed up pretty late in the day, and they were not exactly welcome. There were dot-orgs, dot edus, dot nets, dot govs, and dot localities. Once upon a time there were lots of social enterprises that lived outside the market; social movements, political parties, mutual aid societies, philanthropies. Churches, criminal organizations -- you're bound to see plenty of both of those in a transition... Labor unions... not little ones, but big ones like Solidarity in Poland; dissident organizations, not hobby activists, big dissent, like Charter 77 in Czechoslovakia. Armies, national guards. Rescue operations. Global non-governmental organizations. Davos Forums, Bilderberg guys. Retired people. The old people can't hold down jobs in the market. Man, there's a lot of 'em. Billions. What are our old people supposed to do with themselves? Websurf, I'm thinking. They're wise, they're knowledgeable, they're generous by nature; the 21st century is destined to be an old people's century. Even the Chinese, Mexicans, Brazilians will be old. Can't the web make some use of them, all that wisdom and talent, outside the market?
  • I've never seen so much panic around me, but panic is the last thing on my mind. My mood is eager impatience. I want to see our best, most creative, best-intentioned people in world society directly attacking our worst problems. I'm bored with the deceit. I'm tired of obscurantism and cover-ups. I'm disgusted with cynical spin and the culture war for profit. I'm up to here with phony baloney market fundamentalism. I despise a prostituted society where we put a dollar sign in front of our eyes so we could run straight into the ditch. The cure for panic is action. Coherent action is great; for a scatterbrained web society, that may be a bit much to ask. Well, any action is better than whining. We can do better.
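Sterling's "tagging not taxonomy" point above can be made concrete: a folksonomy is nothing more than a flat, user-generated inverted index from free-form tags to items, with no predefined hierarchy. The sketch below illustrates that structure; the bookmarks and tags are hypothetical examples, not data from this group.

```python
from collections import defaultdict

def build_tag_index(tagged_items):
    """Invert {item: [tags]} into {tag: {items}} -- a flat folksonomy,
    not a hierarchical taxonomy."""
    index = defaultdict(set)
    for item, tags in tagged_items.items():
        for tag in tags:
            index[tag.lower()].add(item)
    return index

# Hypothetical bookmarks with user-applied tags
bookmarks = {
    "sterling-webstock": ["web2.0", "folksonomy", "talks"],
    "bender-interview": ["ai", "llm", "linguistics"],
    "study-habits": ["learning", "psychology"],
}

index = build_tag_index(bookmarks)
print(sorted(index["folksonomy"]))  # ['sterling-webstock']
```

Nothing in the index says where "folksonomy" sits relative to "web2.0" — which is exactly Sterling's point: it catalogs the avalanche without ever settling a taxonomy.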
Ed Webb

The Internet Intellectual - 0 views

  • Even Thomas Friedman would be aghast at some of Jarvis’s cheesy sound-bites
  • What does that actually mean?
  • In Jarvis’s universe, all the good things are technologically determined and all the bad things are socially determined
  • ...7 more annotations...
  • Jarvis never broaches such subtleties. His is a simple world:
  • why not consider the possibility that the incumbents may be using the same tools, Jarvis’s revered technologies, to tell us what to think, and far more effectively than before? Internet shelf space may be infinite, but human attention is not. Cheap self-publishing marginally improves one’s chances of being heard, but nothing about this new decentralized public sphere suggests that old power structures—provided they are smart and willing to survive—will not be able to use it to their benefit
  • Jarvis 1.0 was all about celebrating Google, but Jarvis 2.0 has new friends in Facebook and Twitter. (An Internet intellectual always keeps up.) Jarvis 1.0 wrote that “Google’s moral of universal empowerment is the sometimes-forgotten ideal of democracy,” and argued that the company “provides the infrastructure for a culture of choice,” while its “algorithms and its business model work because Google trusts us.” Jarvis 2.0 claims that “by sharing publicly, we people challenge Google’s machines and reclaim our authority on the internet from algorithms.”
  • Jarvis has another reference point, another sacred telos: the equally grand and equally inexorable march of the Internet, which in his view is a technology that generates its own norms, its own laws, its own people. (He likes to speak of “us, people of the Net.”) For the Technology Man, the Internet is the glue that holds our globalized world together and the divine numen that fills it with meaning. If you thought that ethnocentrism was bad, brace yourself for Internet-centrism
  • Why worry about the growing dominance of such digitalism? The reason should be obvious. As Internet-driven explanations crowd out everything else, our entire vocabulary is being re-defined. Collaboration is re-interpreted through the prism of Wikipedia; communication, through the prism of social networking; democratic participation, through the prism of crowd-sourcing; cosmopolitanism, through the prism of reading the blogs of exotic “others”; political upheaval, through the prism of the so-called Twitter revolutions. Even the persecution of dissidents is now seen as an extension of online censorship (rather than the other way around). A recent headline on the blog of the Harvard-based Herdict project—it tracks Internet censorship worldwide—announces that, in Mexico and Morocco, “Online Censorship Goes Offline.” Were activists and dissidents never harassed before Twitter and Facebook?
  • Most Internet intellectuals simply choose a random point in the distant past—the honor almost invariably goes to the invention of the printing press—and proceed to draw a straight line from Gutenberg to Zuckerberg, as if the Counter-Reformation, the Thirty Years’ War, the Reign of Terror, two world wars—and everything else—never happened.
  • even their iPad is of interest to them only as a “platform”—another buzzword of the incurious—and not as an artifact that is assembled in dubious conditions somewhere in East Asian workshops so as to produce cultic devotion in its more fortunate owners. This lack of elementary intellectual curiosity is the defining feature of the Internet intellectual. History, after all, is about details, but no Internet intellectual wants to be accused of thinking small. And so they think big—sloppily, ignorantly, pretentiously, and without the slightest appreciation of the difference between critical thought and market propaganda.
  •  
    In which Evgeny rips Jeff a new one
Ed Webb

Kathy Schrock's - Google Blooms Taxonomy - 1 views

  •  
    Googlification of everything
Ed Webb

ChatGPT Is Nothing Like a Human, Says Linguist Emily Bender - 0 views

  • Please do not conflate word form and meaning. Mind your own credulity.
  • We’ve learned to make “machines that can mindlessly generate text,” Bender told me when we met this winter. “But we haven’t learned how to stop imagining the mind behind it.”
  • A handful of companies control what PricewaterhouseCoopers called a “$15.7 trillion game changer of an industry.” Those companies employ or finance the work of a huge chunk of the academics who understand how to make LLMs. This leaves few people with the expertise and authority to say, “Wait, why are these companies blurring the distinction between what is human and what’s a language model? Is this what we want?”
  • ...16 more annotations...
  • “We call on the field to recognize that applications that aim to believably mimic humans bring risk of extreme harms,” she co-wrote in 2021. “Work on synthetic human behavior is a bright line in ethical AI development, where downstream effects need to be understood and modeled in order to block foreseeable harm to society and different social groups.”
  • chatbots that we easily confuse with humans are not just cute or unnerving. They sit on a bright line. Obscuring that line and blurring — bullshitting — what’s human and what’s not has the power to unravel society
  • She began learning from, then amplifying, Black women’s voices critiquing AI, including those of Joy Buolamwini (she founded the Algorithmic Justice League while at MIT) and Meredith Broussard (the author of Artificial Unintelligence: How Computers Misunderstand the World). She also started publicly challenging the term artificial intelligence, a sure way, as a middle-aged woman in a male field, to get yourself branded as a scold. The idea of intelligence has a white-supremacist history. And besides, “intelligent” according to what definition? The three-stratum definition? Howard Gardner’s theory of multiple intelligences? The Stanford-Binet Intelligence Scale? Bender remains particularly fond of an alternative name for AI proposed by a former member of the Italian Parliament: “Systematic Approaches to Learning Algorithms and Machine Inferences.” Then people would be out here asking, “Is this SALAMI intelligent? Can this SALAMI write a novel? Does this SALAMI deserve human rights?”
  • Tech-makers assuming their reality accurately represents the world create many different kinds of problems. The training data for ChatGPT is believed to include most or all of Wikipedia, pages linked from Reddit, a billion words grabbed off the internet. (It can’t include, say, e-book copies of everything in the Stanford library, as books are protected by copyright law.) The humans who wrote all those words online overrepresent white people. They overrepresent men. They overrepresent wealth. What’s more, we all know what’s out there on the internet: vast swamps of racism, sexism, homophobia, Islamophobia, neo-Nazism.
  • One fired Google employee told me succeeding in tech depends on “keeping your mouth shut to everything that’s disturbing.” Otherwise, you’re a problem. “Almost every senior woman in computer science has that rep. Now when I hear, ‘Oh, she’s a problem,’ I’m like, Oh, so you’re saying she’s a senior woman?”
  • “We haven’t learned to stop imagining the mind behind it.”
  • In March 2021, Bender published “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” with three co-authors. After the paper came out, two of the co-authors, both women, lost their jobs as co-leads of Google’s Ethical AI team.
  • “On the Dangers of Stochastic Parrots” is not a write-up of original research. It’s a synthesis of LLM critiques that Bender and others have made: of the biases encoded in the models; the near impossibility of studying what’s in the training data, given the fact they can contain billions of words; the costs to the climate; the problems with building technology that freezes language in time and thus locks in the problems of the past. Google initially approved the paper, a requirement for publications by staff. Then it rescinded approval and told the Google co-authors to take their names off it. Several did, but Google AI ethicist Timnit Gebru refused. Her colleague (and Bender’s former student) Margaret Mitchell changed her name on the paper to Shmargaret Shmitchell, a move intended, she said, to “index an event and a group of authors who got erased.” Gebru lost her job in December 2020, Mitchell in February 2021. Both women believe this was retaliation and brought their stories to the press. The stochastic-parrot paper went viral, at least by academic standards. The phrase stochastic parrot entered the tech lexicon.
  • Tech execs loved it. Programmers related to it. OpenAI CEO Sam Altman was in many ways the perfect audience: a self-identified hyperrationalist so acculturated to the tech bubble that he seemed to have lost perspective on the world beyond. “I think the nuclear mutually assured destruction rollout was bad for a bunch of reasons,” he said on AngelList Confidential in November. He’s also a believer in the so-called singularity, the tech fantasy that, at some point soon, the distinction between human and machine will collapse. “We are a few years in,” Altman wrote of the cyborg merge in 2017. “It’s probably going to happen sooner than most people think. Hardware is improving at an exponential rate … and the number of smart people working on AI is increasing exponentially as well. Double exponential functions get away from you fast.” On December 4, four days after ChatGPT was released, Altman tweeted, “i am a stochastic parrot, and so r u.”
  • “This is one of the moves that turn up ridiculously frequently. People saying, ‘Well, people are just stochastic parrots,’” she said. “People want to believe so badly that these language models are actually intelligent that they’re willing to take themselves as a point of reference and devalue that to match what the language model can do.”
  • The membrane between academia and industry is permeable almost everywhere; the membrane is practically nonexistent at Stanford, a school so entangled with tech that it can be hard to tell where the university ends and the businesses begin.
  • “No wonder that men who live day in and day out with machines to which they believe themselves to have become slaves begin to believe that men are machines.”
  • what’s tenure for, after all?
  • LLMs are tools made by specific people — people who stand to accumulate huge amounts of money and power, people enamored with the idea of the singularity. The project threatens to blow up what is human in a species sense. But it’s not about humility. It’s not about all of us. It’s not about becoming a humble creation among the world’s others. It’s about some of us — let’s be honest — becoming a superspecies. This is the darkness that awaits when we lose a firm boundary around the idea that humans, all of us, are equally worthy as is.
  • The AI dream is “governed by the perfectibility thesis, and that’s where we see a fascist form of the human.”
  • “Why are you trying to trick people into thinking that it really feels sad that you lost your phone?”
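Bender's distinction between word form and meaning — "machines that can mindlessly generate text" — can be illustrated with a toy bigram chain: it emits plausible-looking word sequences purely from co-occurrence counts, with no representation of meaning at all. This is a deliberately crude sketch for illustration, nothing like a real LLM; the corpus is made up.

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Count which word follows which -- pure form, no meaning."""
    words = text.split()
    follows = defaultdict(list)
    for a, b in zip(words, words[1:]):
        follows[a].append(b)
    return follows

def generate(follows, start, length=8, seed=0):
    """Emit words by repeatedly sampling a recorded successor."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = follows.get(out[-1])
        if not nxt:
            break  # dead end: no successor ever observed
        out.append(rng.choice(nxt))
    return " ".join(out)

corpus = "the parrot repeats the words the parrot hears"
model = train_bigrams(corpus)
print(generate(model, "the"))
```

The output is fluent-looking string manipulation and nothing more — which is the "stochastic parrot" point in miniature: mistaking such output for a mind behind it is the credulity Bender warns against.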
Pat Pehlman

iTextEditors - iPhone and iPad text/code editors and writing tools compared - 0 views

  •  
    The iOS Text Editor roundup: This is a feature comparison of text editors on iOS. The information was compiled by the web community on an open Google spreadsheet. I cannot vouch for its current accuracy, but will be verifying everything as I'm able.
Ed Webb

Mind - Research Upends Traditional Thinking on Study Habits - NYTimes.com - 1 views

  • instead of sticking to one study location, simply alternating the room where a person studies improves retention. So does studying distinct but related skills or concepts in one sitting, rather than focusing intensely on a single thing. “We have known these principles for some time, and it’s intriguing that schools don’t pick them up, or that people don’t learn them by trial and error,” said Robert A. Bjork, a psychologist at the University of California, Los Angeles. “Instead, we walk around with all sorts of unexamined beliefs about what works that are mistaken.”
  • The brain makes subtle associations between what it is studying and the background sensations it has at the time, the authors say, regardless of whether those perceptions are conscious. It colors the terms of the Versailles Treaty with the wasted fluorescent glow of the dorm study room, say; or the elements of the Marshall Plan with the jade-curtain shade of the willow tree in the backyard. Forcing the brain to make multiple associations with the same material may, in effect, give that information more neural scaffolding.
  • Cognitive scientists do not deny that honest-to-goodness cramming can lead to a better grade on a given exam. But hurriedly jam-packing a brain is akin to speed-packing a cheap suitcase, as most students quickly learn — it holds its new load for a while, then most everything falls out. “With many students, it’s not like they can’t remember the material” when they move to a more advanced class, said Henry L. Roediger III, a psychologist at Washington University in St. Louis. “It’s like they’ve never seen it before.”
  • ...6 more annotations...
  • An hour of study tonight, an hour on the weekend, another session a week from now: such so-called spacing improves later recall, without requiring students to put in more overall study effort or pay more attention, dozens of studies have found.
  • “The idea is that forgetting is the friend of learning,” said Dr. Kornell. “When you forget something, it allows you to relearn, and do so effectively, the next time you see it.”
  • cognitive scientists see testing itself — or practice tests and quizzes — as a powerful tool of learning, rather than merely assessment. The process of retrieving an idea is not like pulling a book from a shelf; it seems to fundamentally alter the way the information is subsequently stored, making it far more accessible in the future.
  • “Testing not only measures knowledge but changes it,” he says — and, happily, in the direction of more certainty, not less.
  • “Testing has such bad connotation; people think of standardized testing or teaching to the test,” Dr. Roediger said. “Maybe we need to call it something else, but this is one of the most powerful learning tools we have.”
  • The harder it is to remember something, the harder it is to later forget. This effect, which researchers call “desirable difficulty,”
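The spacing principle described in these highlights — an hour tonight, an hour on the weekend, another session a week out — amounts to reviewing at expanding intervals rather than cramming. A minimal sketch of such a schedule follows; the interval sequence (1, 3, 7, 14, 30 days) is an illustrative choice, not a prescription from the research itself.

```python
from datetime import date, timedelta

def spaced_schedule(first_study, intervals=(1, 3, 7, 14, 30)):
    """Return review dates at expanding gaps after the first session."""
    sessions = [first_study]
    for days in intervals:
        sessions.append(first_study + timedelta(days=days))
    return sessions

# Example: material first studied on a hypothetical start date
for when in spaced_schedule(date(2024, 9, 1)):
    print(when.isoformat())
```

Each gap is long enough for some forgetting — the "desirable difficulty" that makes the next retrieval stick.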
Ed Webb

The Future of WPMu at bavatuesdays - 1 views

  • I grab feeds from external blogs all the time that are related to UMW and pull them into our sitewide “tags” blog (the name tags here is confusing, it is simply a republishing of everything in the entire WPMu install) with FeedWordPress. For example, I stumbled across this post in the tags blog on UMW Blogs tonight, which was actually being pulled in from a WordPress.com blog of a student who graduated years ago, but regularly blogs about her work in historic preservation.  This particular post was all about a book she read as an undergraduate in Historic Preservation, and how great a resource it is.  A valuable post, especially since the professor who recommended that book, W. Brown Morton, retired last year. There is a kind of eternal echo in a system like this that students, faculty, and staff can continue to feed into a community of teaching and learning well beyond their matriculation period, or even their career.
  • what we are doing as instructional technologists, scholars and students in higher ed right now is much bigger than a particular blogging system or software, I see my job as working with people to imagine the implications and possibilities of managing and maintaining their digital identity in a moment when we are truly in a deep transformation of information, identity, and scholarship.
  • we’ll host domains that professors purchase and, ideally, map all their domains onto one WP install that can manage many multi-blogging solutions from one install.  The whole Russian Doll thing that WPMu can do with the Multi-Site Manager plugin. So you offer a Bluehost like setup for faculty, and if that is too much, allow them to map a domain, take control of their own course work, and encourage an aggregated course management model that pushes students to take control of their digital identity and spaces by extension.  Giving students a space and voice on your domain or application is not the same as asking them to create, manage and maintain their own space.  Moreover, it doesn’t feed into the idea of a digital trajectory that starts well before they come to college and will end well after they leave.  This model extends the community, and brings in key resources like a recent graduate discussing an out-of-print historic preservation text book a retired professor assigned to be one of the best resources for an aspiring Preservation graduate student. This is what it is all about, right there, and it’s not gonna happen in silos and on someone else’s space, we need to provision, empower, and imagine the merge as a full powered move to many, many domains of one’s own.
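The syndication model described above — FeedWordPress pulling items from many external blogs into one sitewide stream — can be sketched in a few lines. The sketch below parses inline RSS strings with the standard library; a real aggregator would fetch the feeds over HTTP on a schedule, and the feed contents here are hypothetical.

```python
import xml.etree.ElementTree as ET

def parse_rss_items(rss_xml):
    """Extract title/link pairs from one RSS document."""
    root = ET.fromstring(rss_xml)
    return [
        {"title": item.findtext("title"), "link": item.findtext("link")}
        for item in root.iter("item")
    ]

def aggregate(feeds):
    """Merge items from several feeds into one sitewide stream."""
    stream = []
    for xml_doc in feeds:
        stream.extend(parse_rss_items(xml_doc))
    return stream

# Hypothetical external blogs being republished into the "tags" blog
feed_a = """<rss><channel>
  <item><title>Historic preservation reading</title><link>http://example.org/a</link></item>
</channel></rss>"""
feed_b = """<rss><channel>
  <item><title>Digital identity</title><link>http://example.org/b</link></item>
</channel></rss>"""

for entry in aggregate([feed_a, feed_b]):
    print(entry["title"], "->", entry["link"])
```

The aggregated stream keeps publishing whether or not its authors are still enrolled — the "eternal echo" the post describes.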
Ed Webb

The Greatest and Most Flawed Experiment Ever in Online Learning - CogDogBlog - 1 views

  • I don’t think we should at all be talking about “putting courses online.” What we are really faced with is coming up with some quick alternative modes for students to complete course work without showing up on campus. This does not call for apps and vendor solutions, but what the best teachers always do- improvise, change up on the fly when things change.
  • my suggestion and strategy would be… do as little as possible online. Use online for communicating, caring, attending to people’s needs, but not really for being the “course”. Flip that stuff outside.
  • This is why I cringe when what I seem to hear is “Zoom! Zoom! Can we have 30 students in zoom?” Everything you try to do online is going to call for jumping unfair levels of barriers: access, technology, experience. I’d say recast your activities in ways students can do as much without going online- reading, writing, thinking, practicing, doing stuff away from the screen.
  • ...4 more annotations...
  • The most important things to me are quickly establishing, and having backup modes, for students to be in touch with you, and you with them. As individuals. It might be direct messaging, email, texting. It could be but need not be something Slack-like. I’d really go simplest (email)
  • Get going with web annotation tools
  • We need not have just talking sessions for use of video. Think about drop in hours with Whereby (the new appear.in) – it lacks a need for logins and downloads, and works on mobile.
  • This experiment is going to.. well I bet, go bad in a lot of ways. I don’t know what we can expect of un-experienced teachers and unprepared students, who on top of all the concerns they carry and we rarely see, now have to ponder where they might live and sustain income to live on. It will be interesting… but it need not be awful nor a disaster, if we go about as sharing in the situation.
Ed Webb

Please do a bad job of putting your courses online - Rebecca Barrett-Fox - 0 views

  • Please do a bad job of putting your courses online
  • For my colleagues who are now being instructed to put some or all of the remainder of their semester online, now is a time to do a poor job of it. You are NOT building an online class. You are NOT teaching students who can be expected to be ready to learn online. And, most importantly, your class is NOT the highest priority of their OR your life right now. Release yourself from high expectations right now, because that’s the best way to help your students learn.
  • Remember the following as you move online: Your students know less about technology than you think. Many of them know less than you. Yes, even if they are digital natives and younger than you. They will be accessing the internet on their phones. They have limited data. They need to reserve it for things more important than online lectures. Students who did not sign up for an online course have no obligation to have a computer, high speed wifi, a printer/scanner, or a camera. Do not even survey them to ask if they have it. Even if they do, they are not required to tell you this. And if they do now, that doesn’t mean that they will when something breaks and they can’t afford to fix it because they just lost their job at the ski resort or off-campus bookstore. Students will be sharing their technology with other household members. They may have LESS time to do their schoolwork, not more.
  • ...14 more annotations...
  • Social isolation contributes to mental health problems. Social isolation contributes to domestic violence.
  • Do not require synchronous work. Students should not need to show up at a specific time for anything. REFUSE to do any synchronous work.
  • Do not record lectures unless you need to. (This is fundamentally different from designing an online course, where recorded information is, I think, really important.) They will be a low priority for students, and they take up a lot of resources on your end and on theirs. You have already built a rapport with them, and they don’t need to hear your voice to remember that.
  • Do record lectures if you need to. When information cannot be learned otherwise, include a lecture. Your university already has some kind of tech to record lectures. DO NOT simply record in PowerPoint as the audio quality is low. While many people recommend lectures of only 5 minutes, I find that my students really do listen to longer lectures. Still, remember that your students will be frequently interrupted in their listening, so a good rule is 1 concept per lecture. So, rather than a lecture on ALL of, say, gender inequality in your Intro to Soc course, deliver 5 minutes on pay inequity (or 15 minutes or 20 minutes, if that’s what you need) and then a separate lecture on #MeToo and yet another on domestic violence. Closed caption them using the video recording software your university provides. Note that YouTube also generates closed captions [edited to add: they are not ADA compliant, though]. If you don’t have to include images, skip the video recording and do a podcast instead.
  • Editing is a waste of your time right now.
  • Make all work due on the same day and time for the rest of the semester. I recommend Sunday night at 11:59 pm.
  • Allow every exam or quiz to be taken at least twice, and tell students that this means that if there is a tech problem on the first attempt, the second attempt is their chance to correct it. This will save you from the work of resetting tests or quizzes when the internet fails or some other tech problem happens. And since it can be very hard to discern when such failures are really failures or students trying to win a second attempt at a quiz or test, you avoid having to deal with cheaters.
  • Do NOT require students to use online proctoring or force them to have themselves recorded during exams or quizzes. This is a fundamental violation of their privacy, and they did NOT sign up for that when they enrolled in your course.
  • Circumvent the need for proctoring by making every exam open-notes, open-book, and open-internet. The best way to avoid them taking tests together or sharing answers is to use a large test bank.
  • Remind them of due dates. It might feel like handholding, but be honest: Don’t you appreciate the text reminder from your dentist that you have an appointment tomorrow? Your LMS has an announcement system that allows you to write an announcement now and post it later.
  • Make everything self-grading if you can (yes, multiple choice and T/F on quizzes and tests) or low-stakes (completed/not completed).
  • Don’t do too much. Right now, your students don’t need it. They need time to do the other things they need to do.
  • Listen for them asking for help. They may be anxious. They may be tired. Many students are returning to their parents’ home where they may not be welcome. Others will be at home with partners who are violent. School has been a safe place for them, and now it’s not available to them. Your class may matter to them a lot when they are able to focus on it, but it may not matter much now, in contrast to all the other things they have to deal with. Don’t let that hurt your feelings, and don’t hold it against them in future semesters or when they come back to ask for a letter of recommendation.
  • This advice is very different from that which I would share if you were designing an online course. I hope it’s helpful, and for those of you moving your courses online, I hope it helps you understand the labor that is required in building an online course a bit better.
Ed Webb

Guest Post: The Complexities of Certainty | Just Visiting - 0 views

  • Privileges abound in academia, but so do experiences of loss, instability and fear. And into this situation we were called to respond to a pandemic.
  • It is tempting to reach for certainties when everything around us is in chaos, and for a vast swath of higher ed instructors, the rapid shift from face-to-face teaching to emergency distance learning has been chaos. Small wonder, then, that people have offered -- and clung to -- advice that seeks to bring order to disorder. Many people have advised instructors to prioritize professionalism, ditching the sweatpants and putting away the visible clutter in our homes before making a Zoom call, upholding concepts like "rigor" so that our standards do not slip. To some, these appeals to universal principles are right-minded and heartening, a bulwark against confusion and disarray. But to others they have felt oppressive, even dangerously out of touch with the world in which we and our students live.
  • certainties can be dangerous; their very power is based upon reifying well-worn inequities dressed up as tradition
  • there is no objective standard of success that we reach when we insist on rigor, which is too often deployed in defense of practices that are ableist and unkind
  • We are not just teachers, or scholars, or professionals. We are individuals thrown back in varying degrees on our own resources, worried about ourselves and our families and friends as we navigate the effects of COVID-19. Many of us are deeply anxious and afraid. Our pre-existing frailties have been magnified; we feel vulnerable, distracted and at sea. Our loved ones are sick, even dying. This is trauma. Few of us have faced such world-changing circumstances before, and as our minds absorb the impact of that reality, our brains cannot perform as capably as they usually would.
  • The most professional people I know right now are those who show up, day after day, to teach under extraordinary circumstances. Perhaps they do it with their laundry waiting to be folded, while their children interrupt, thinking constantly of their loved ones, weathering loneliness, wearing sweatpants and potentially in need of a haircut. But I know they do it while acknowledging this is not the world in which we taught two months before, and that every student is facing disruption, uncertainty and distraction. They do it creatively, making room for the unexpected, challenging their students, with the world a participant in the conversation.
Ed Webb

A Few Responses to Criticism of My SXSW-Edu Keynote on Media Literacy - 0 views

  • Can you give me examples of programs that are rooted in, speaking to, and resonant with conservative and religious communities in this country? In particular, I’d love to know about programs that work in conservative white Evangelical and religious black and LatinX communities? I’d love to hear how educators integrate progressive social justice values into conservative cultural logics. Context: To the best that I can tell, every program I’ve seen is rooted in progressive (predominantly white) ways of thinking. I know that communities who define “fake news” as CNN (as well as black communities who see mainstream media as rooted in the history of slavery and white supremacy) have little patience for the logics of progressive white educators. So what does media literacy look like when it starts with religious and/or conservative frameworks? What examples exist?
  • Can you tell me how you teach across gaslighting? How do you stabilize students’ trust in information, particularly among those whose families are wary of institutions and information intermediaries? Context: Foreign adversaries (and some domestic groups) are primarily focused on destabilizing people’s trust in information intermediaries. They want people to doubt everything and turn their backs on institutions. We are seeing the impact of this agenda. I’m not finding that teaching someone the source of a piece of content helps build up trust. Instead, it seems to further undermine it. So how do you approach media literacy to build up confidence in institutions and information intermediaries?
  • For what it’s worth, when I try to untangle the threads to actually address the so-called “fake news” problem, I always end in two places: 1) dismantle financialized capitalism (which is also the root cause of some of the most challenging dynamics of tech companies); 2) reknit the social fabric of society by strategically connecting people. But neither of those are recommendations for educators.
Ed Webb

Business as Unusual: The New Normal for Online Learning - BCcampus - 0 views

  • One of the most interesting changes that I saw in terms of online learning was the use of WhatsApp, a text and voice messaging app that is very popular in South Africa. Through the app’s group chat feature, instructors can moderate the discussion and students can leave voice notes, which gives them the ability to have their voices heard asynchronously
  • I’ve imagined a north–south dialogue. Now, due to COVID-19, it’s happening organically, and I’m in the process of reimagining the course I would have been teaching in Vancouver this summer as an online course. I need to factor in which apps to use, how to prepare for students who only have cellphones, and the reality that many students come from other countries to study at Emily Carr, and now they’ll be learning remotely. It’s fascinating that the forced global aspect of the classroom will influence the way I design the educational technology for my program
  • In the past, some educators might have been excited to tear everything apart and build it back up with a goal of helping students learn in a better way, but the institutions wouldn’t be able to support it. Not because they didn’t want to, but because it was difficult for them to do it. Now there’s an opportunity for institutions to let the reins go and encourage creative and new approaches. It’s scary, but it’s also inspiring for educators to have that freedom. The research is available, the interest is there, and the resources are open, so now is the time to make it happen
  • “What surprised me was the resurgence of many of the zombie ideas about online learning creeping into the discussions, such as the idea that online learning isn’t as personal, or that you can’t have interactivity, or that it just doesn’t work. And while it is true you need to change how you think about your course — you can’t just replicate what you used to do in the classroom — there’s an opportunity to evolve your teaching practices and create a better learning experience for your students.”
  • What’s happening now is going to reshape education for years, if not decades.
  • People want the old normal, not the new normal. We will, to some degree, get back to what we know and love, but it won’t ever look like it did before
  • “Like your physical buildings on campus, you also have a somewhat invisible set of resources called your educational technology. If you don’t understand it well and don’t treat it as important infrastructure, your ability to move online sustainably will be challenged. Sometimes institutions see eLearning as a project, not a strategy. Online learning isn’t a fly-by-the-seat-of-your-pants project; it has to be integrated into your academic plan and institutional strategy. I hope that COVID-19 underlined that for institutions.”
  • “We’ve known for over 30 years now that one-hour lectures are not a great way to teach: you can have a good one-hour session, but can you have 13 over a semester? It’s about cognitive load, and students can’t focus for more than 15 to 20 minutes at a time without being distracted. There’s room for synchronous discussion, but we can do it better. There’s a huge amount of research into online learning and what happens when students have access to online learning whenever they want it. And just like in real life, you have to know how to do both synchronous and asynchronous interactions well.”
  • We need to make space for the voices of communities who haven’t traditionally been heard: non-traditional learners, students who are food or housing insecure, students who are neurodivergent, students of colour, and Indigenous students. We must think of all these populations and the degree to which our educational system — our technology, our platforms — has not been built for them. We do a lot of work to make our methods accessible, but at the core, our systems, institutions, and platforms aren’t really built for — or by — those students
  • As challenging as it is, I’m seeing online pedagogy’s focus on equity and care resonating with many of those new to the medium
  • I’ve used really experimental styles over the past few years, but I won’t be doing that as much over the coming year because I shouldn’t. My classes are traditionally where students get to work with tools and platforms outside of the norm. If everyone moving online treats it that way, the cognitive load on the students will be absolutely overwhelming. My right to flex my academic freedom regarding platforms should be superseded by care and consideration for my students’ cognitive loads across a program. Navigating different platforms and tools is hard and distracting.
  • “One of the most vital tools and resources that I’ve seen people using is their human capacities for compassion and patience — the degree to which faculty are stepping up and approaching their students from a place of care, and a place of genuine desire for students to feel a sense of hope, safety, and flexibility.”
Ed Webb

The Myth Of AI | Edge.org - 0 views

  • The distinction between a corporation and an algorithm is fading. Does that make an algorithm a person? Here we have this interesting confluence between two totally different worlds. We have the world of money and politics and the so-called conservative Supreme Court, with this other world of what we can call artificial intelligence, which is a movement within the technical culture to find an equivalence between computers and people. In both cases, there's an intellectual tradition that goes back many decades. Previously they'd been separated; they'd been worlds apart. Now, suddenly they've been intertwined.
  • Since our economy has shifted to what I call a surveillance economy, but let's say an economy where algorithms guide people a lot, we have this very odd situation where you have these algorithms that rely on big data in order to figure out who you should date, who you should sleep with, what music you should listen to, what books you should read, and on and on and on. And people often accept that because there's no empirical alternative to compare it to, there's no baseline. It's bad personal science. It's bad self-understanding.
  • there's no way to tell where the border is between measurement and manipulation in these systems
  • It's not so much a rise of evil as a rise of nonsense. It's a mass incompetence, as opposed to Skynet from the Terminator movies. That's what this type of AI turns into.
  • What's happened here is that translators haven't been made obsolete. What's happened instead is that the structure through which we receive the efforts of real people in order to make translations happen has been optimized, but those people are still needed.
  • In order to create this illusion of a freestanding autonomous artificial intelligent creature, we have to ignore the contributions from all the people whose data we're grabbing in order to make it work. That has a negative economic consequence.
  • If you talk to translators, they're facing a predicament, which is very similar to some of the other early victim populations, due to the particular way we digitize things. It's similar to what's happened with recording musicians, or investigative journalists—which is the one that bothers me the most—or photographers. What they're seeing is a severe decline in how much they're paid, what opportunities they have, their long-term prospects.
  • because of the mythology about AI, the services are presented as though they are these mystical, magical personas. IBM makes a dramatic case that they've created this entity that they call different things at different times—Deep Blue and so forth. The consumer tech companies, we tend to put a face in front of them, like a Cortana or a Siri
  • If you talk about AI as a set of techniques, as a field of study in mathematics or engineering, it brings benefits. If we talk about AI as a mythology of creating a post-human species, it creates a series of problems that I've just gone over, which include acceptance of bad user interfaces, where you can't tell if you're being manipulated or not, and everything is ambiguous. It creates incompetence, because you don't know whether recommendations are coming from anything real or just self-fulfilling prophecies from a manipulative system that spun off on its own, and economic negativity, because you're gradually pulling formal economic benefits away from the people who supply the data that makes the scheme work.
  • This idea that some lab somewhere is making these autonomous algorithms that can take over the world is a way of avoiding the profoundly uncomfortable political problem, which is that if there's some actuator that can do harm, we have to figure out some way that people don't do harm with it. There are about to be a whole bunch of those. And that'll involve some kind of new societal structure that isn't perfect anarchy. Nobody in the tech world wants to face that, so we lose ourselves in these fantasies of AI. But if you could somehow prevent AI from ever happening, it would have nothing to do with the actual problem that we fear, and that's the sad thing, the difficult thing we have to face.
  • To reject your own ignorance just casts you into a silly state where you're a lesser scientist. I don't see that so much in the neuroscience field, but it comes from the computer world so much, and the computer world is so influential because it has so much money and influence that it does start to bleed over into all kinds of other things.
Ed Webb

William Davies · How many words does it take to make a mistake? Education, Ed... - 0 views

  • The problem waiting round the corner for universities is essays generated by AI, which will leave a textual pattern-spotter like Turnitin in the dust. (Earlier this year, I came across one essay that felt deeply odd in some not quite human way, but I had no tangible evidence that anything untoward had occurred, so that was that.)
  • To accuse someone of plagiarism is to make a moral charge regarding intentions. But establishing intent isn’t straightforward. More often than not, the hearings bleed into discussions of issues that could be gathered under the heading of student ‘wellbeing’, which all universities have been struggling to come to terms with in recent years.
  • I have heard plenty of dubious excuses for acts of plagiarism during these hearings. But there is one recurring explanation which, it seems to me, deserves more thoughtful consideration: ‘I took too many notes.’ It isn’t just students who are familiar with information overload, one of whose effects is to morph authorship into a desperate form of curatorial management, organising chunks of text on a screen. The discerning scholarly self on which the humanities depend was conceived as the product of transitions between spaces – library, lecture hall, seminar room, study – linked together by work with pen and paper. When all this is replaced by the interface with screen and keyboard, and everything dissolves into a unitary flow of ‘content’, the identity of the author – as distinct from the texts they have read – becomes harder to delineate.
  • This generation, the first not to have known life before the internet, has acquired a battery of skills in navigating digital environments, but it isn’t clear how well those skills line up with the ones traditionally accredited by universities.
  • From the perspective of students raised in a digital culture, the anti-plagiarism taboo no doubt seems to be just one more academic hang-up, a weird injunction to take perfectly adequate information, break it into pieces and refashion it. Students who pay for essays know what they are doing; others seem conscientious yet intimidated by secondary texts: presumably they won’t be able to improve on them, so why bother trying? For some years now, it’s been noticeable how many students arrive at university feeling that every interaction is a test they might fail. They are anxious. Writing seems fraught with risk, a highly complicated task that can be executed correctly or not.
  • Many students may like the flexibility recorded lectures give them, but the conversion of lectures into yet more digital ‘content’ further destabilises traditional conceptions of learning and writing
  • the evaluation forms which are now such a standard feature of campus life suggest that many students set a lot of store by the enthusiasm and care that are features of a good live lecture
  • the drift of universities towards a platform model, which makes it possible for students to pick up learning materials as and when it suits them. Until now, academics have resisted the push for ‘lecture capture’. It causes in-person attendance at lectures to fall dramatically, and it makes many lecturers feel like mediocre television presenters. Unions fear that extracting and storing teaching for posterity threatens lecturers’ job security and weakens the power of strikes. Thanks to Covid, this may already have happened.
  • In the utopia sold by the EdTech industry (the companies that provide platforms and software for online learning), pupils are guided and assessed continuously. When one task is completed correctly, the next begins, as in a computer game; meanwhile the platform providers are scraping and analysing data from the actions of millions of children. In this behaviourist set-up, teachers become more like coaches: they assist and motivate individual ‘learners’, but are no longer so important to the provision of education. And since it is no longer the sole responsibility of teachers or schools to deliver the curriculum, it becomes more centralised – the latest front in a forty-year battle to wrest control from the hands of teachers and local authorities.
  • an injunction against creative interpretation and writing, a deprivation that working-class children will feel at least as deeply as anyone else.
  • There may be very good reasons for delivering online teaching in segments, punctuated by tasks and feedback, but as Yandell observes, other ways of reading and writing are marginalised in the process. Without wishing to romanticise the lonely reader (or, for that matter, the lonely writer), something is lost when alternating periods of passivity and activity are compressed into interactivity, until eventually education becomes a continuous cybernetic loop of information and feedback. How many keystrokes or mouse-clicks before a student is told they’ve gone wrong? How many words does it take to make a mistake?
  • This vision of language as code may already have been a significant feature of the curriculum, but it appears to have been exacerbated by the switch to online teaching. In a journal article from August 2020, ‘Learning under Lockdown: English Teaching in the Time of Covid-19’, John Yandell notes that online classes create wholly closed worlds, where context and intertextuality disappear in favour of constant instruction. In these online environments, reading is informed not by prior reading experiences but by the toolkit that the teacher has provided, and ... is presented as occurring along a tramline of linear development. Different readings are reducible to better or worse readings: the more closely the student’s reading approximates to the already finalised teacher’s reading, the better it is. That, it would appear, is what reading with precision looks like.
  • Constant interaction across an interface may be a good basis for forms of learning that involve information-processing and problem-solving, where there is a right and a wrong answer. The cognitive skills that can be trained in this way are the ones computers themselves excel at: pattern recognition and computation. The worry, for anyone who cares about the humanities in particular, is about the oversimplifications required to conduct other forms of education in these ways.
  • Blanket surveillance replaces the need for formal assessment.
  • Confirming Adorno’s worst fears of the ‘primacy of practical reason’, reading is no longer dissociable from the execution of tasks. And, crucially, the ‘goals’ to be achieved through the ability to read, the ‘potential’ and ‘participation’ to be realised, are economic in nature.
  • since 2019, with the Treasury increasingly unhappy about the amount of student debt still sitting on the government’s balance sheet and the government resorting to ‘culture war’ at every opportunity, there has been an effort to single out degree programmes that represent ‘poor value for money’, measured in terms of graduate earnings. (For reasons best known to itself, the usually independent Institute for Fiscal Studies has been leading the way in finding correlations between degree programmes and future earnings.) Many of these programmes are in the arts and humanities, and are now habitually referred to by Tory politicians and their supporters in the media as ‘low-value degrees’.
  • studying the humanities may become a luxury reserved for those who can fall back on the cultural and financial advantages of their class position. (This effect has already been noticed among young people going into acting, where the results are more visible to the public than they are in academia or heritage organisations.)
  • given the changing class composition of the UK over the past thirty years, it’s not clear that contemporary elites have any more sympathy for the humanities than the Conservative Party does. A friend of mine recently attended an open day at a well-known London private school, and noticed that while there was a long queue to speak to the maths and science teachers, nobody was waiting to speak to the English teacher. When she asked what was going on, she was told: ‘I’m afraid parents here are very ambitious.’ Parents at such schools, where fees have tripled in real terms since the early 1980s, tend to work in financial and business services themselves, and spend their own days profitably manipulating and analysing numbers on screens. When it comes to the transmission of elite status from one generation to the next, Shakespeare or Plato no longer has the same cachet as economics or physics.
  • Leaving aside the strategic political use of terms such as ‘woke’ and ‘cancel culture’, it would be hard to deny that we live in an age of heightened anxiety over the words we use, in particular the labels we apply to people. This has benefits: it can help to bring discriminatory practices to light, potentially leading to institutional reform. It can also lead to fruitless, distracting public arguments, such as the one that rumbled on for weeks over Angela Rayner’s description of Conservatives as ‘scum’. More and more, words are dredged up, edited or rearranged for the purpose of harming someone. Isolated words have acquired a weightiness in contemporary politics and public argument, while on digital media snippets of text circulate without context, as if the meaning of a single sentence were perfectly contained within it, walled off from the surrounding text. The exemplary textual form in this regard is the newspaper headline or corporate slogan: a carefully curated series of words, designed to cut through the blizzard of competing information.
  • Visit any actual school or university today (as opposed to the imaginary ones described in the Daily Mail or the speeches of Conservative ministers) and you will find highly disciplined, hierarchical institutions, focused on metrics, performance evaluations, ‘behaviour’ and quantifiable ‘learning outcomes’.
  • If young people today worry about using the ‘wrong’ words, it isn’t because of the persistence of the leftist cultural power of forty years ago, but – on the contrary – because of the barrage of initiatives and technologies dedicated to reversing that power. The ideology of measurable literacy, combined with a digital net that has captured social and educational life, leaves young people ill at ease with the language they use and fearful of what might happen should they trip up.
  • It has become clear, as we witness the advance of Panopto, Class Dojo and the rest of the EdTech industry, that one of the great things about an old-fashioned classroom is the facilitation of unrecorded, unaudited speech, and of uninterrupted reading and writing.