
Instructional & Media Services at Dickinson College / Group items tagged games


Ed Webb

BBC NEWS | Technology | Battling swine flu in cyberspace - 0 views

  • Dubbed "The Great Flu", the game is based on the threat that the emergence of a new flu virus and its rapid spread across the globe would pose to humanity. "The game is based on the need to increase public awareness to the threat posed by a pandemic and the measures in place to contain it," said Albert Osterhaus, head of virology at the Erasmus Medical Centre and one of the experts involved in creating the game.
Ryan Burke

What are we trying to do here? - 6 views

I've been adding things that in many cases have broader potential use, so have not added specific department tags - where they apply, I add them. Looks like I'm the only one adding bookmarks at th...

welcome

Ed Webb

Op-Ed Contributor - Lost in the Cloud - NYTimes.com - 0 views

  • the most difficult challenge — both to grasp and to solve — of the cloud is its effect on our freedom to innovate.
  • Apple can decide who gets to write code for your phone and which of those offerings will be allowed to run. The company has used this power in ways that Bill Gates never dreamed of when he was the king of Windows: Apple is reported to have censored e-book apps that contain controversial content, eliminated games with political overtones, and blocked uses for the phone that compete with the company’s products. The market is churning through these issues. Amazon is offering a generic cloud-computing infrastructure so anyone can set up new software on a new Web site without gatekeeping by the likes of Facebook. Google’s Android platform is being used in a new generation of mobile phones with fewer restrictions on outside code. But the dynamics here are complicated. When we vest our activities and identities in one place in the cloud, it takes a lot of dissatisfaction for us to move. And many software developers who once would have been writing whatever they wanted for PCs are simply developing less adventurous, less subversive, less game-changing code under the watchful eyes of Facebook and Apple.
Ed Webb

Dark Social: We Have the Whole History of the Web Wrong - Alexis C. Madrigal - The Atla... - 0 views

  • this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed in www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you
  • the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic
  • direct social
  • Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent
  • if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There's no power users you can contact. There's no algorithms to understand. This is pure social, uncut
  • the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself
  • the tradeoff we make on social networks is not the one that we're told we're making. We're not giving our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're exchanging our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
  • "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage,"
  • only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app
  •  
    Heh. Social is really social, not 'social' - who knew?
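
    A note on the mechanics behind the annotations above: analytics tools can only label a visit by its HTTP referrer, so a link shared over email or chat arrives with no referrer and gets lumped in with "direct" traffic. The short Python sketch below illustrates that heuristic. It is illustrative only, not the methodology The Atlantic or its analytics vendor actually used; the classify_visit function, the SOCIAL_DOMAINS set, and the homepage-versus-deep-link rule are all assumptions made for the example.

    ```python
    # Minimal sketch of a "dark social" heuristic: a visit with no referrer that
    # lands on a deep link was probably shared by email/IM rather than typed or
    # bookmarked. Illustrative assumptions throughout; not a real analytics tool.
    from urllib.parse import urlparse

    # Hypothetical list of referrers counted as visible social traffic.
    SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "reddit.com", "stumbleupon.com"}

    def classify_visit(referrer: str, landing_path: str) -> str:
        """Classify one pageview for a social-traffic report (toy example)."""
        if referrer:
            host = urlparse(referrer).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host in SOCIAL_DOMAINS:
                return "social (visible)"   # shared on a known social network
            return "other referral"         # search engines, other sites, etc.
        # No referrer: either typed/bookmarked or shared via email, IM, chat.
        if landing_path in ("", "/"):
            return "direct (homepage)"      # plausibly typed or bookmarked
        return "dark social"                # deep link with no referrer

    # A long article URL arriving with no referrer is almost certainly a pasted
    # link, which is why it would be counted as dark social here.
    print(classify_visit("", "/technology/archive/2012/10/dark-social/263523/"))
    print(classify_visit("https://www.facebook.com/", "/technology/archive/2012/10/dark-social/263523/"))
    ```
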
Ed Webb

"1945-1998" by Isao Hashimoto: CTBTO Preparatory Commission - 0 views

    • Ed Webb
       
      The retro computer game aesthetic really works for this atompunk artwork
Ed Webb

Bryan Alexander at the 2009 Baylor Educational Technology Showcase « Gardner ... - 0 views

  • Web 2.0. Social Networking. Gaming. Mobile Computing. Above all: teaching and learning.
  •  
    Color me envious
Ed Webb

The powerful and mysterious brain circuitry that makes us love Google, Twitter, and tex... - 0 views

  • For humans, this desire to search is not just about fulfilling our physical needs. Panksepp says that humans can get just as excited about abstract rewards as tangible ones. He says that when we get thrilled about the world of ideas, about making intellectual connections, about divining meaning, it is the seeking circuits that are firing.
  • Our internal sense of time is believed to be controlled by the dopamine system. People with hyperactivity disorder have a shortage of dopamine in their brains, which a recent study suggests may be at the root of the problem. For them even small stretches of time seem to drag.
  • When we get the object of our desire (be it a Twinkie or a sexual partner), we engage in consummatory acts that Panksepp says reduce arousal in the brain and temporarily, at least, inhibit our urge to seek.
  • But our brains are designed to more easily be stimulated than satisfied. "The brain seems to be more stingy with mechanisms for pleasure than for desire," Berridge has said. This makes evolutionary sense. Creatures that lack motivation, that find it easy to slip into oblivious rapture, are likely to lead short (if happy) lives. So nature imbued us with an unquenchable drive to discover, to explore. Stanford University neuroscientist Brian Knutson has been putting people in MRI scanners and looking inside their brains as they play an investing game. He has consistently found that the pictures inside our skulls show that the possibility of a payoff is much more stimulating than actually getting one.
  • all our electronic communication devices—e-mail, Facebook feeds, texts, Twitter—are feeding the same drive as our searches. Since we're restless, easily bored creatures, our gadgets give us in abundance qualities the seeking/wanting system finds particularly exciting. Novelty is one. Panksepp says the dopamine system is activated by finding something unexpected or by the anticipation of something new. If the rewards come unpredictably—as e-mail, texts, updates do—we get even more carried away. No wonder we call it a "CrackBerry."
  • If humans are seeking machines, we've now created the perfect machines to allow us to seek endlessly. This perhaps should make us cautious. In Animals in Translation, Temple Grandin writes of driving two indoor cats crazy by flicking a laser pointer around the room. They wouldn't stop stalking and pouncing on this ungraspable dot of light—their dopamine system pumping. She writes that no wild cat would indulge in such useless behavior: "A cat wants to catch the mouse, not chase it in circles forever." She says "mindless chasing" makes an animal less likely to meet its real needs "because it short-circuits intelligent stalking behavior." As we chase after flickering bits of information, it's a salutary warning.
Ed Webb

YouTube - Understanding the Millennial Student - 0 views

  •  
    The takeaway for me: reinforces my belief in the importance of play. Engaged learning is playful, versus passive learning which is the old entertainment model - Web2.0 versus television
Ed Webb

The Ed-Tech Imaginary - 0 views

  • We can say "Black lives matter," but we must also demonstrate through our actions that Black lives matter, and that means we must radically alter many of our institutions and practices, recognizing their inhumanity and carcerality. And that includes, no doubt, ed-tech. How much of ed-tech is, to use Ruha Benjamin's phrase, "the new Jim Code"? How much of ed-tech is designed by those who imagine students as cheats or criminals, as deficient or negligent?
  • "Reimagining" is a verb that education reformers are quite fond of. And "reimagining" seems too often to mean simply defunding, privatizing, union-busting, dismantling, outsourcing.
  • if Betsy DeVos is out there "reimagining," then we best be resisting
  • I think we can view the promotion of ed-tech as a similar sort of process — the stories designed to convince us that the future of teaching and learning will be a technological wonder. The "jobs of the future that don't exist yet." The push for everyone to "learn to code."
  • The Matrix is, after all, a dystopia. So why would Matrix-style learning be desirable? Maybe that's the wrong question. Perhaps it's not so much that it's desirable, but it's just how our imaginations have been constructed, constricted even. We can't imagine any other ideal but speed and efficiency.
  • The first science fiction novel, published over 200 years ago, was in fact an ed-tech story: Mary Shelley's Frankenstein. While the book is commonly interpreted as a tale of bad science, it is also the story of bad education — something we tend to forget if we only know the story through the 1931 film version
  • Teaching machines and robot teachers were part of the Sixties' cultural imaginary — perhaps that's the problem with so many Boomer ed-reform leaders today. But that imaginary — certainly in the case of The Jetsons — was, upon close inspection, not always particularly radical or transformative. The students at Little Dipper Elementary still sat in desks in rows. The teacher still stood at the front of the class, punishing students who weren't paying attention.
  • we must also decolonize the ed-tech imaginary
  • Zuckerberg gave everyone at Facebook a copy of the Ernest Cline novel Ready Player One, for example, to get them excited about building technology for the future — a book that is really just a string of nostalgic references to Eighties white boy culture. And I always think about that New York Times interview with Sal Khan, where he said that "The science fiction books I like tend to relate to what we're doing at Khan Academy, like Orson Scott Card's 'Ender's Game' series." You mean, online math lectures are like a novel that justifies imperialism and genocide?! Wow.
  • This ed-tech imaginary is segregated. There are no Black students at the push-button school. There are no Black people in The Jetsons — no Black people living the American dream of the mid-twenty-first century
  • Part of the argument I make in my book is that much of education technology has been profoundly shaped by Skinner, even though I'd say that most practitioners today would say that they reject his theories; that cognitive science has supplanted behaviorism; and that after Ayn Rand and Noam Chomsky trashed Beyond Freedom and Dignity, no one paid attention to Skinner any more — which is odd considering there are whole academic programs devoted to "behavioral design," bestselling books devoted to the "nudge," and so on.
  • so much of the ed-tech imaginary is wrapped up in narratives about the Hero, the Weapon, the Machine, the Behavior, the Action, the Disruption. And it's so striking because education should be a practice of care, not conquest
Ed Webb

Google and Meta moved cautiously on AI. Then came OpenAI's ChatGPT. - The Washington Post - 0 views

  • The surge of attention around ChatGPT is prompting pressure inside tech giants including Meta and Google to move faster, potentially sweeping safety concerns aside
  • Tech giants have been skittish since public debacles like Microsoft’s Tay, which it took down in less than a day in 2016 after trolls prompted the bot to call for a race war, suggest Hitler was right and tweet “Jews did 9/11.”
  • Some AI ethicists fear that Big Tech’s rush to market could expose billions of people to potential harms — such as sharing inaccurate information, generating fake photos or giving students the ability to cheat on school tests — before trust and safety experts have been able to study the risks. Others in the field share OpenAI’s philosophy that releasing the tools to the public, often nominally in a “beta” phase after mitigating some predictable risks, is the only way to assess real world harms.
  • Silicon Valley’s sudden willingness to consider taking more reputational risk arrives as tech stocks are tumbling
  • A chatbot that pointed to one answer directly from Google could increase its liability if the response was found to be harmful or plagiarized.
  • AI has been through several hype cycles over the past decade, but the furor over DALL-E and ChatGPT has reached new heights.
  • Soon after OpenAI released ChatGPT, tech influencers on Twitter began to predict that generative AI would spell the demise of Google search. ChatGPT delivered simple answers in an accessible way and didn’t ask users to rifle through blue links. Besides, after a quarter of a century, Google’s search interface had grown bloated with ads and marketers trying to game the system.
  • Inside big tech companies, the system of checks and balances for vetting the ethical implications of cutting-edge AI isn’t as established as privacy or data security. Typically teams of AI researchers and engineers publish papers on their findings, incorporate their technology into the company’s existing infrastructure or develop new products, a process that can sometimes clash with other teams working on responsible AI over pressure to see innovation reach the public sooner.
  • Chatbots like OpenAI routinely make factual errors and often switch their answers depending on how a question is asked
  • To Timnit Gebru, executive director of the nonprofit Distributed AI Research Institute, the prospect of Google sidelining its responsible AI team doesn’t necessarily signal a shift in power or safety concerns, because those warning of the potential harms were never empowered to begin with. “If we were lucky, we’d get invited to a meeting,” said Gebru, who helped lead Google’s Ethical AI team until she was fired for a paper criticizing large language models.
  • Rumman Chowdhury, who led Twitter’s machine-learning ethics team until Elon Musk disbanded it in November, said she expects companies like Google to increasingly sideline internal critics and ethicists as they scramble to catch up with OpenAI. “We thought it was going to be China pushing the U.S., but looks like it’s start-ups,” she said.
Ed Webb

William Davies · How many words does it take to make a mistake? Education, Ed... - 0 views

  • The problem waiting round the corner for universities is essays generated by AI, which will leave a textual pattern-spotter like Turnitin in the dust. (Earlier this year, I came across one essay that felt deeply odd in some not quite human way, but I had no tangible evidence that anything untoward had occurred, so that was that.)
  • To accuse someone of plagiarism is to make a moral charge regarding intentions. But establishing intent isn’t straightforward. More often than not, the hearings bleed into discussions of issues that could be gathered under the heading of student ‘wellbeing’, which all universities have been struggling to come to terms with in recent years.
  • I have heard plenty of dubious excuses for acts of plagiarism during these hearings. But there is one recurring explanation which, it seems to me, deserves more thoughtful consideration: ‘I took too many notes.’ It isn’t just students who are familiar with information overload, one of whose effects is to morph authorship into a desperate form of curatorial management, organising chunks of text on a screen. The discerning scholarly self on which the humanities depend was conceived as the product of transitions between spaces – library, lecture hall, seminar room, study – linked together by work with pen and paper. When all this is replaced by the interface with screen and keyboard, and everything dissolves into a unitary flow of ‘content’, the identity of the author – as distinct from the texts they have read – becomes harder to delineate.
  • This generation, the first not to have known life before the internet, has acquired a battery of skills in navigating digital environments, but it isn’t clear how well those skills line up with the ones traditionally accredited by universities.
  • From the perspective of students raised in a digital culture, the anti-plagiarism taboo no doubt seems to be just one more academic hang-up, a weird injunction to take perfectly adequate information, break it into pieces and refashion it. Students who pay for essays know what they are doing; others seem conscientious yet intimidated by secondary texts: presumably they won’t be able to improve on them, so why bother trying? For some years now, it’s been noticeable how many students arrive at university feeling that every interaction is a test they might fail. They are anxious. Writing seems fraught with risk, a highly complicated task that can be executed correctly or not.
  • Many students may like the flexibility recorded lectures give them, but the conversion of lectures into yet more digital ‘content’ further destabilises traditional conceptions of learning and writing
  • the evaluation forms which are now such a standard feature of campus life suggest that many students set a lot of store by the enthusiasm and care that are features of a good live lecture
  • the drift of universities towards a platform model, which makes it possible for students to pick up learning materials as and when it suits them. Until now, academics have resisted the push for ‘lecture capture’. It causes in-person attendance at lectures to fall dramatically, and it makes many lecturers feel like mediocre television presenters. Unions fear that extracting and storing teaching for posterity threatens lecturers’ job security and weakens the power of strikes. Thanks to Covid, this may already have happened.
  • In the utopia sold by the EdTech industry (the companies that provide platforms and software for online learning), pupils are guided and assessed continuously. When one task is completed correctly, the next begins, as in a computer game; meanwhile the platform providers are scraping and analysing data from the actions of millions of children. In this behaviourist set-up, teachers become more like coaches: they assist and motivate individual ‘learners’, but are no longer so important to the provision of education. And since it is no longer the sole responsibility of teachers or schools to deliver the curriculum, it becomes more centralised – the latest front in a forty-year battle to wrest control from the hands of teachers and local authorities.
  • an injunction against creative interpretation and writing, a deprivation that working-class children will feel at least as deeply as anyone else.
  • There may be very good reasons for delivering online teaching in segments, punctuated by tasks and feedback, but as Yandell observes, other ways of reading and writing are marginalised in the process. Without wishing to romanticise the lonely reader (or, for that matter, the lonely writer), something is lost when alternating periods of passivity and activity are compressed into interactivity, until eventually education becomes a continuous cybernetic loop of information and feedback. How many keystrokes or mouse-clicks before a student is told they’ve gone wrong? How many words does it take to make a mistake?
  • This vision of language as code may already have been a significant feature of the curriculum, but it appears to have been exacerbated by the switch to online teaching. In a journal article from August 2020, ‘Learning under Lockdown: English Teaching in the Time of Covid-19’, John Yandell notes that online classes create wholly closed worlds, where context and intertextuality disappear in favour of constant instruction. In these online environments, reading is informed not by prior reading experiences but by the toolkit that the teacher has provided, and ... is presented as occurring along a tramline of linear development. Different readings are reducible to better or worse readings: the more closely the student’s reading approximates to the already finalised teacher’s reading, the better it is. That, it would appear, is what reading with precision looks like.
  • Constant interaction across an interface may be a good basis for forms of learning that involve information-processing and problem-solving, where there is a right and a wrong answer. The cognitive skills that can be trained in this way are the ones computers themselves excel at: pattern recognition and computation. The worry, for anyone who cares about the humanities in particular, is about the oversimplifications required to conduct other forms of education in these ways.
  • Blanket surveillance replaces the need for formal assessment.
  • Confirming Adorno’s worst fears of the ‘primacy of practical reason’, reading is no longer dissociable from the execution of tasks. And, crucially, the ‘goals’ to be achieved through the ability to read, the ‘potential’ and ‘participation’ to be realised, are economic in nature.
  • since 2019, with the Treasury increasingly unhappy about the amount of student debt still sitting on the government’s balance sheet and the government resorting to ‘culture war’ at every opportunity, there has been an effort to single out degree programmes that represent ‘poor value for money’, measured in terms of graduate earnings. (For reasons best known to itself, the usually independent Institute for Fiscal Studies has been leading the way in finding correlations between degree programmes and future earnings.) Many of these programmes are in the arts and humanities, and are now habitually referred to by Tory politicians and their supporters in the media as ‘low-value degrees’.
  • studying the humanities may become a luxury reserved for those who can fall back on the cultural and financial advantages of their class position. (This effect has already been noticed among young people going into acting, where the results are more visible to the public than they are in academia or heritage organisations.)
  • given the changing class composition of the UK over the past thirty years, it’s not clear that contemporary elites have any more sympathy for the humanities than the Conservative Party does. A friend of mine recently attended an open day at a well-known London private school, and noticed that while there was a long queue to speak to the maths and science teachers, nobody was waiting to speak to the English teacher. When she asked what was going on, she was told: ‘I’m afraid parents here are very ambitious.’ Parents at such schools, where fees have tripled in real terms since the early 1980s, tend to work in financial and business services themselves, and spend their own days profitably manipulating and analysing numbers on screens. When it comes to the transmission of elite status from one generation to the next, Shakespeare or Plato no longer has the same cachet as economics or physics.
  • Leaving aside the strategic political use of terms such as ‘woke’ and ‘cancel culture’, it would be hard to deny that we live in an age of heightened anxiety over the words we use, in particular the labels we apply to people. This has benefits: it can help to bring discriminatory practices to light, potentially leading to institutional reform. It can also lead to fruitless, distracting public arguments, such as the one that rumbled on for weeks over Angela Rayner’s description of Conservatives as ‘scum’. More and more, words are dredged up, edited or rearranged for the purpose of harming someone. Isolated words have acquired a weightiness in contemporary politics and public argument, while on digital media snippets of text circulate without context, as if the meaning of a single sentence were perfectly contained within it, walled off from the surrounding text. The exemplary textual form in this regard is the newspaper headline or corporate slogan: a carefully curated series of words, designed to cut through the blizzard of competing information.
  • Visit any actual school or university today (as opposed to the imaginary ones described in the Daily Mail or the speeches of Conservative ministers) and you will find highly disciplined, hierarchical institutions, focused on metrics, performance evaluations, ‘behaviour’ and quantifiable ‘learning outcomes’.
  • If young people today worry about using the ‘wrong’ words, it isn’t because of the persistence of the leftist cultural power of forty years ago, but – on the contrary – because of the barrage of initiatives and technologies dedicated to reversing that power. The ideology of measurable literacy, combined with a digital net that has captured social and educational life, leaves young people ill at ease with the language they use and fearful of what might happen should they trip up.
  • It has become clear, as we witness the advance of Panopto, Class Dojo and the rest of the EdTech industry, that one of the great things about an old-fashioned classroom is the facilitation of unrecorded, unaudited speech, and of uninterrupted reading and writing.