
Home/ Instructional & Media Services at Dickinson College/ Group items matching "production" in title, tags, annotations or url

Ed Webb

9 Ways Online Teaching Should be Different from Face-to-Face | Cult of Pedagogy - 0 views

  • Resist the temptation to dive right into curriculum at the start of the school year. Things will go more smoothly if you devote the early weeks to building community so students feel connected. Social emotional skills can be woven in during this time. On top of that, students need practice with whatever digital tools you’ll be using. So focus your lessons on those things, intertwining the two when possible. 
  • Online instruction is made up largely of asynchronous instruction, which students can access at any time. This is ideal, because requiring attendance for synchronous instruction puts some students at an immediate disadvantage if they don’t have the same access to technology, reliable internet, or a flexible home schedule. 
  • you’re likely to offer “face-to-face” or synchronous opportunities at some point, and one way to make them happen more easily is to have students meet in small groups. While it’s nearly impossible to arrange for 30 students to attend a meeting at once, assigning four students to meet is much more manageable.
  • ...9 more annotations...
  • What works best, Kitchen says, is to keep direct instruction—things like brief video lectures and readings—in asynchronous form, using checks for understanding like embedded questions or exit slips.  You can then use synchronous meetings for more interactive, engaging work. “If we want students showing up, if we want them to know that this is worth their time,” Kitchen explains, “it really needs to be something active and engaging for them. Any time they can work with the material, categorize it, organize it, share further thoughts on it, have a discussion, all of those are great things to do in small groups.” 
  • The Jigsaw method, where students form expert groups on a particular chunk of content, then teach that content to other students. Discussion strategies adapted for virtual settings. Using best practices for cooperative learning. Visible Thinking routines. Gamestorming and other business-related protocols adapted for education, where students take on the role of customers/stakeholders.
  • Online instruction is not conducive to covering large amounts of content, so you have to choose wisely, teaching the most important things at a slower pace.
  • What really holds leverage for the students? What has endurance? What knowledge is essential? What knowledge and skills do students need to have before they move to the next grade level or the next class? What practices can be emphasized that transfer across many content areas? Skills like analyzing, constructing arguments, building a strong knowledge base through texts, and speaking can all be taught through many different subjects. What tools can serve multiple purposes? Teaching students to use something like Padlet gives them opportunities to use audio, drawing, writing, and video. Non-digital tools can also work: Students can use things they find around the house, like toilet paper rolls, to fulfill other assignments, and then submit their work with a photo.
  • Provide instructions in a consistent location and at a consistent time. This advice was already given for parents, but it’s worth repeating here through the lens of instructional design: Set up lessons so that students know where to find instructions every time. Make instructions explicit. Read and re-read to make sure these are as clear as possible. Make dogfooding your lessons a regular practice to root out problem areas. Offer multimodal instructions. If possible, provide both written and video instructions for assignments, so students can choose the format that works best for them. You might also offer a synchronous weekly or daily meeting; what’s great about doing these online is that even if you teach several sections of the same class per day, students are no longer restricted to class times and can attend whatever meeting works best for them.
  • put the emphasis on formative feedback as students work through assignments and tasks, rather than simply grading them at the end. 
  • In online learning, Kitchen says, “There are so many ways that students can cheat, so if we’re giving them just the traditional quiz or test, it’s really easy for them to be able to just look up that information.” A great solution to this problem is to have students create things.
  • For assessment, use a detailed rubric that highlights the learning goals the end product will demonstrate. A single-point rubric works well for this. To help students discover tools to work with, this list of tools is organized by the type of product each one creates. Another great source of ideas is the Teacher’s Guide to Tech. When developing the assignment, rather than focusing on the end product, start by getting clear on what you want students to DO with that product.
  • Clear and consistent communication. Creating explicit and consistent rituals and routines. Using research-based instructional strategies. Determining whether to use digital or non-digital tools for an assignment. A focus on authentic learning, where authentic products are created and students have voice and choice in assignments.
Ed Webb

Elgan: Why goofing off boosts productivity - 0 views

  • I believe that not only are office slackers more productive than work-only employees, but that people who work from home are more productive than the office crowd -- and for many of the same reasons
  • 2. It gets personal things off your mind.
  • 1. The subconscious mind keeps working.
  • ...2 more annotations...
  • 3. It builds work relationships.
  • 4. It converts real-time interactions into asynchronous ones.
  • That's my story and I'm sticking to it.
Ed Webb

Social Media is Killing the LMS Star - A Bootleg of Bryan Alexander's Lost Presentation - Open Education Conference - 0 views

  • Note that this isn’t just a technological alternate history. It also describes a different set of social and cultural practices.
  • CMSes lumber along like radio, still playing into the air as they continue to gradually shift ever farther away on the margins. In comparison, Web 2.0 is like movies and TV combined, plus printed books and magazines. That’s where the sheer scale, creative ferment, and wide-ranging influence reside. This is the necessary background for discussing how to integrate learning and the digital world.
  • These virtual classes are like musical practice rooms, small chambers where one may try out the instrument in silent isolation. It is not connectivism but disconnectivism.
  • ...11 more annotations...
  • CMSes shift from being merely retrograde to being actively regressive if we consider the broader, subtler changes in the digital teaching landscape. Web 2.0 has rapidly grown an enormous amount of content through what Yochai Benkler calls “peer-based commons production.” One effect of this has been to grow a large area for informal learning, which students (and staff) access without our benign interference. Students (and staff) also contribute to this peering world; more on this later. For now, we can observe that as teachers we grapple with this mechanism of change through many means, but the CMS in its silo’d isolation is not a useful tool.
  • those curious about teaching with social media have easy access to a growing, accessible community of experienced staff by means of those very media. A meta-community of Web 2.0 academic practitioners is now too vast to catalogue. Academics in every discipline blog about their work. Wikis record their efforts and thoughts, as do podcasts. The reverse is true of the CMS, the very architecture of which forbids such peer-to-peer information sharing. For example, the Resource Center for Cyberculture Studies (RCCS) has for many years maintained a descriptive listing of courses about digital culture across the disciplines. During the 1990s that number grew with each semester. But after the explosive growth of CMSes that number dwindled. Not the number of classes taught, but the number of classes which could even be described. According to the RCCS’ founder, David Silver (University of San Francisco), this is due to the isolation of class content in CMS containers.
  • unless we consider the CMS environment to be a sort of corporate intranet simulation, the CMS set of community skills is unusual, rarely applicable to post-graduation examples. In other words, while a CMS might help privacy concerns, it is at best a partial, not sufficient solution, and can even be inappropriate for already online students.
  • That experiential, teachable moment of selecting one’s copyright stance is eliminated by the CMS.
  • Another argument in favor of CMSes over Web 2.0 concerns the latter’s open nature. It is too open, goes the thought, constituting a “Wild West” experience of unfettered information flow and unpleasant forms of access. Campuses should run CMSes to create shielded environments, iPhone-style walled gardens that protect the learning process from the Lovecraftian chaos without.
  • social sifting, information literacy, using the wisdom of crowds, and others. Such strategies are widely discussed, easily accessed, and continually revised and honed.
  • at present, radio CMS is the Clear Channel of online learning.
  • For now, the CMS landscape is a multi-institutional dark Web, an invisible, unsearchable, un-mash-up-able archipelago of hidden learning content.
  • Can the practice of using a CMS prepare either teacher or student to think critically about this new shape for information literacy? Moreover, can we use the traditional CMS to share thoughts and practices about this topic?
  • The internet of things refers to a vastly more challenging concept, the association of digital information with the physical world. It covers such diverse instances as RFID chips attached to books or shipping pallets, connecting a product’s scanned UPC code to a Web-based database, assigning unique digital identifiers to physical locations, and the broader enterprise of augmented reality. It includes problems as varied as building search that covers both the World Wide Web and one’s mobile device, revising copyright to include digital content associated with private locations, and trying to salvage what’s left of privacy. How does this connect with our topic? Consider a recent article by Tim O’Reilly and John Battelle, where they argue that the internet of things is actually growing knowledge about itself. The combination of people, networks, and objects is building descriptions about objects, largely in folksonomic form. That is, people are tagging the world, and sharing those tags. It’s worth quoting a passage in full: “It’s also possible to give structure to what appears to be unstructured data by teaching an application how to recognize the connection between the two. For example, You R Here, an iPhone app, neatly combines these two approaches. You use your iPhone camera to take a photo of a map that contains details not found on generic mapping applications such as Google maps – say a trailhead map in a park, or another hiking map. Use the phone’s GPS to set your current location on the map. Walk a distance away, and set a second point. Now your iPhone can track your position on that custom map image as easily as it can on Google maps.” (http://www.web2summit.com/web2009/public/schedule/detail/10194) What world is better placed to connect academia productively with such projects, the open social Web or the CMS?
  • imagine the CMS function of every class much like class email, a necessary feature, but not by any means the broadest technological element. Similarly the e-reserves function is of immense practical value. There may be no better way to share copyrighted academic materials with a class, at this point. These logistical functions could well play on.
Ed Webb

CRITICAL AI: Adapting College Writing for the Age of Large Language Models such as ChatGPT: Some Next Steps for Educators - Critical AI - 1 views

  • In the long run, we believe, teachers need to help students develop a critical awareness of generative machine models: how they work; why their content is often biased, false, or simplistic; and what their social, intellectual, and environmental implications might be. But that kind of preparation takes time, not least because journalism on this topic is often clickbait-driven, and “AI” discourse tends to be jargony, hype-laden, and conflated with science fiction.
  • Make explicit that the goal of writing is neither a product nor a grade but, rather, a process that empowers critical thinking
  • Students are more likely to misuse text generators if they trust them too much. The term “Artificial Intelligence” (“AI”) has become a marketing tool for hyping products. For all their impressiveness, these systems are not intelligent in the conventional sense of that term. They are elaborate statistical models that rely on mass troves of data—which has often been scraped indiscriminately from the web and used without knowledge or consent.
  • ...9 more annotations...
  • LLMs usually cannot do a good job of explaining how a particular passage from a longer text illuminates the whole of that longer text. Moreover, ChatGPT’s outputs on comparison and contrast are often superficial. Typically the system breaks down a task of logical comparison into bite-size pieces, conveys shallow information about each of those pieces, and then formulaically “compares” and “contrasts” in a noticeably superficial or repetitive way. 
  • In-class writing, whether digital or handwritten, may have downsides for students with anxiety and disabilities
  • ChatGPT can produce outputs that take the form of  “brainstorms,” outlines, and drafts. It can also provide commentary in the style of peer review or self-analysis. Nonetheless, students would need to coordinate multiple submissions of automated work in order to complete this type of assignment with a text generator.  
  • No one should present auto-generated writing as their own on the expectation that this deception is undiscoverable. 
  • LLMs often mimic the harmful prejudices, misconceptions, and biases found in data scraped from the internet
  • Show students examples of inaccuracy, bias, logical, and stylistic problems in automated outputs. We can build students’ cognitive abilities by modeling and encouraging this kind of critique. Given that social media and the internet are full of bogus accounts using synthetic text, alerting students to the intrinsic problems of such writing could be beneficial. (See the “ChatGPT/LLM Errors Tracker,” maintained by Gary Marcus and Ernest Davis.)
  • Since ChatGPT is good at grammar and syntax but suffers from formulaic, derivative, or inaccurate content, it seems like a poor foundation for building students’ skills and may circumvent their independent thinking.
  • Good journalism on language models is surprisingly hard to find since the technology is so new and the hype is ubiquitous. Here are a few reliable short pieces: “ChatGPT Advice Academics Can Use Now,” edited by Susan Dagostino, Inside Higher Ed, January 12, 2023; “University students recruit AI to write essays for them. Now what?” by Katyanna Quach, The Register, December 27, 2022; “How to spot AI-generated text” by Melissa Heikkilä, MIT Technology Review, December 19, 2022; and The Road to AI We Can Trust, a Substack by Gary Marcus, a cognitive scientist and AI researcher who writes frequently and lucidly about the topic. See also Gary Marcus and Ernest Davis, “GPT-3, Bloviator: OpenAI’s Language Generator Has No Idea What It’s Talking About” (2020).
  • “On the Dangers of Stochastic Parrots” by Emily M. Bender, Timnit Gebru, et al, FAccT ’21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, March 2021. Association for Computing Machinery, doi: 10.1145/3442188. A blog post summarizing and discussing the above essay derived from a Critical AI @ Rutgers workshop on the essay: summarizes key arguments, reprises discussion, and includes links to video-recorded presentations by digital humanist Katherine Bode (ANU) and computer scientist and NLP researcher Matthew Stone (Rutgers).
Ed Webb

It's just not working out the way we thought it would « Lisa's (Online) Teaching Blog - 0 views

  • Gradually, closed spaces (Facebook, Ning, even Google if you understand what they’re up to) have become the norm, as have monetized sites. The spaces that were free are no longer free, although many of us freely contributed our own work to these sites, providing the basis of their popularity in the first place. Crowdsourcing, celebrated in story and song, has become the exploitation of the work of others in order to make money or provide cheap customer service. The use of personal information for marketing purposes is widespread, and creative people are leaving the platforms that brought everyone into the agora in the first place. Scholars at first enthusiastic about the future now see it as a lonely place. And I see conversations where people who care deeply about the web, education for the 21st century, and learning theories are beginning to back away from proselytizing about academic openness.
  • it’s about users becoming the products in the marketplace and the amusements in the panopticon
  • Where before it might have made sense to say we should make sure everyone is web literate, now such literacy extends beyond critical thinking about websites into a deeper understanding of what the using the web means for individual privacy and independence. This time, the enemies of openness and freedom won’t need to argue their philosophical reasons – they’ll argue that they’re protecting people. And the trouble is, they may be right.
  • ...1 more annotation...
  • We need to be the antidote for blind adoption
test and tagging

Be Safe With [e]Safe - 1 views

The welfare of my employees is my number one priority so that I can ensure that they will work productively. That is why when I established my company, I made sure that the equipment to be used are...

test and tagging

started by test and tagging on 15 Dec 11 no follow-up yet
Ed Webb

Google pushes journalists to create G+ profiles · kegill · Storify - 0 views

  • linking search results with Google+ was like Microsoft bundling Internet Explorer with Windows
  • Market strength in one place being used to leverage suboptimal products in another.
  • It's time to tell both Google and Bing that we want to decide for ourselves, thank you very much, if content is credible, instead of their making those decisions for us, decisions made behind hidden -- and suspicious -- algorithms.
Ed Webb

Reflections on open courses « Connectivism - 0 views

  • There is value in blending traditional with emergent knowledge spaces (online conferences and traditional journals). Learners will create and innovate if they can express ideas and concepts in their own spaces and through their own expertise (i.e. hosting events in Second Life). Courses are platforms for innovation. Too rigid a structure puts the educator in full control. Using a course as a platform fosters creativity…and creativity generates a bit of chaos and can be unsettling to individuals who prefer a structure with which they are familiar. Letting go of control is a bit stressful, but surprisingly rewarding in the new doors it opens and liberating in how it brings others in to assist in running a course and advancing the discussion. People want to participate…but they will only do so once they have “permission” and a forum in which to utilize existing communication/technological skills.
  • The internet is a barrier-reducing system. In theory, everyone has a voice online (the reality of technology ownership, digital skills, and internet access add an unpleasant dimension). Costs of duplication are reduced. Technology (technique) is primarily a duplicationary process, as evidenced by the printing press, assembly line, and now the content duplication ability of digital technologies. As a result, MOOCs embody, rather than reflect, practices within the digital economy. MOOCs reduce barriers to information access and to the dialogue that permits individuals (and society) to grow knowledge. Much of the technical innovation in the last several centuries has permitted humanity to extend itself physically (cars, planes, trains, telescopes). The internet, especially in recent developments of connective and collaborative applications, is a cognitive extension for humanity. Put another way, the internet offers a model where the reproduction of knowledge is not confined to the production of physical objects.
  • Knowledge is a mashup. Many people contribute. Many different forums are used. Multiple media permit varied and nuanced expressions of knowledge. And, because the information base (which is required for knowledge formation) changes so rapidly, being properly connected to the right people and information is vitally important. The need for proper connectedness to the right people and information is readily evident in intelligence communities. Consider the Christmas day bomber. Or 9/11. The information was being collected. But not connected.
  • ...11 more annotations...
  • The open model of participation calls into question where value is created in the education system. Gutenberg created a means to duplicate content. The social web creates the opportunity for many-to-many interactions and to add a global social layer on content creation and knowledge growth.
  • Whatever can be easily duplicated cannot serve as the foundation for economic value. Integration and connectedness are economic value points.
  • In education, content can easily be produced (it’s important but has limited economic value). Lectures also have limited value (easy to record and to duplicate). Teaching – as done in most universities – can be duplicated. Learning, on the other hand, can’t be duplicated. Learning is personal, it has to occur one learner at a time. The support needed for learners to learn is a critical value point.
  • Learning, however, requires a human, social element: both peer-based and through interaction with subject area experts
  • Content is readily duplicated, reducing its value economically. It is still critical for learning – all fields have core elements that learners must master before they can advance (research in expertise supports this notion). Teaching can be duplicated (lectures can be recorded; Elluminate or a similar webconferencing system can bring people from around the world into a class). Assisting learners in the learning process, correcting misconceptions (see Private Universe), and providing social support and brokering introductions to other people and ideas in the discipline is critical. Accreditation is a value statement – it is required when people don’t know each other. Content was the first area of focus in open education. Teaching (i.e. MOOCs) is the second. Accreditation will be next, but, before progress can be made, profile, identity, and peer-rating systems will need to improve dramatically. The underlying trust mechanism on which accreditation is based cannot yet be duplicated in open spaces (at least, it can’t be duplicated to such a degree that people who do not know each other will trust the mediating agent of open accreditation).
  • The skills that are privileged and rewarded in a MOOC are similar to those that are needed to be effective in communicating with others and interacting with information online (specifically, social media and information sources like journals, databases, videos, lectures, etc.). Creative skills are the most critical. Facilitators and learners need something to “point to”. When a participant creates an insightful blog post, a video, a concept map, or other resource/artifact it generally gets attention.
  • Intentional diversity – not necessarily a digital skill, but the ability to self-evaluate ones network and ensure diversity of ideologies is critical when information is fragmented and is at risk of being sorted by single perspectives/ideologies.
  • The volume of information is very disorienting in a MOOC. For example, in CCK08, the initial flow of postings in Moodle, three weekly live sessions, Daily newsletter, and weekly readings and assignments proved to be overwhelming for many participants. Stephen and I somewhat intentionally structured the course for this disorienting experience. Deciding who to follow, which course concepts are important, and how to form sub-networks and sub-systems to assist in sensemaking are required to respond to information abundance. The process of coping and wayfinding (ontology) is as much a lesson in the learning process as mastering the content (epistemology). Learners often find it difficult to let go of the urge to master all content, read all the comments and blog posts.
  • Learning is a social trust-based process.
  • Patience, tolerance, suspension of judgment, and openness to other cultures and ideas are required to form social connections and to negotiate misunderstandings.
  • An effective digital citizenry needs the skills to participate in important conversations. The growth of digital content and social networks raises the need for citizens to have the technical and conceptual skills to express their ideas and engage with others in those spaces. MOOCs are a first-generation testing ground for knowledge growth in a distributed, global, digital world. Their role in developing a digital citizenry is still unclear, but democratic societies require a populace with the skills to participate in growing a society’s knowledge. As such, MOOCs, or similar open, transparent learning experiences that foster the development of citizens’ confidence to engage and create collaboratively, are important for the future of society.
Ed Webb

Op-Ed Contributor - Lost in the Cloud - NYTimes.com - 0 views

  • the most difficult challenge — both to grasp and to solve — of the cloud is its effect on our freedom to innovate.
  • Apple can decide who gets to write code for your phone and which of those offerings will be allowed to run. The company has used this power in ways that Bill Gates never dreamed of when he was the king of Windows: Apple is reported to have censored e-book apps that contain controversial content, eliminated games with political overtones, and blocked uses for the phone that compete with the company’s products. The market is churning through these issues. Amazon is offering a generic cloud-computing infrastructure so anyone can set up new software on a new Web site without gatekeeping by the likes of Facebook. Google’s Android platform is being used in a new generation of mobile phones with fewer restrictions on outside code. But the dynamics here are complicated. When we vest our activities and identities in one place in the cloud, it takes a lot of dissatisfaction for us to move. And many software developers who once would have been writing whatever they wanted for PCs are simply developing less adventurous, less subversive, less game-changing code under the watchful eyes of Facebook and Apple.
Ed Webb

Ian Bogost - Beyond Blogs - 0 views

  • I wish these were the sorts of questions so-called digital humanists considered, rather than figuring out how to pay homage to the latest received web app or to build new tools to do the same old work. But as I recently argued, a real digital humanism isn't one that's digital, but one that's concerned with the present and the future. A part of that concern involves considering the way we want to interact with one another and the world as scholars, and to intervene in that process by making it happen. Such a question is far more interesting and productive than debating the relative merits of blogs or online journals, acts that amount to celebrations of how little has really changed.
  • Perhaps a blog isn't a great tool for (philosophical; videogame) discussion or even for knowledge retention, etc... but a whole *blogosphere*...? If individuals (and individual memory in particular) are included within the scope of "the blogosphere" then surely someone remembers the "important" posts, like you seemed to be asking for...?
Ed Webb

News: A Gripe Session at Blackboard - Inside Higher Ed - 0 views

  • At an open "listening session" with top executives of Blackboard here Wednesday at the company's annual conference, college officials expressed frustration with many of the system's fundamental characteristics. At times, the meeting seemed to turn into a communal gripe session, with complaints ranging from the system's discussion forum application, to the improved -- but still lacking -- user support, to the training materials for faculty members. Participants' concerns were often greeted with nods of agreement and outright applause from their peers as they spoke of their frustrations with the system.
  • "We recognize there are still some shortcomings in our products," responded Michael Chasen, president and CEO of Blackboard.
Ed Webb

Elgan: Why goofing off boosts productivity - 0 views

  • The human mind is a curiosity engine.
    • Ed Webb: Very quotable
Ed Webb

Open Monologue » Creativity is the new technology - 0 views

  • technology has been losing our attention lately. It hasn’t become unimportant - developments in medicine, transportation and energy production are still critical to our well being. But we’ve got such a surfeit of technology available to us that it’s just become part of the environment. It’s just there. I think that the 21st century will be a century of creativity in the same way that the 20th was of technology. Much of the creativity, interestingly enough, will be based on the tools provided by technology, especially tools that allow us to create, collaborate and communicate.
  • video literacy, including comprehension and creation, is one of those 21st century skills that get discussed so often. I’m in total agreement and I think that the foundational 21st century skill underlying many of the others will be creativity. If that’s the case, what will schools look like when they are designed to nurture creativity the way they nurtured technology skills like science and math in the 20th?
Ed Webb

The powerful and mysterious brain circuitry that makes us love Google, Twitter, and texting. - By Emily Yoffe - Slate Magazine - 0 views

  • For humans, this desire to search is not just about fulfilling our physical needs. Panksepp says that humans can get just as excited about abstract rewards as tangible ones. He says that when we get thrilled about the world of ideas, about making intellectual connections, about divining meaning, it is the seeking circuits that are firing.
  • Our internal sense of time is believed to be controlled by the dopamine system. People with hyperactivity disorder have a shortage of dopamine in their brains, which a recent study suggests may be at the root of the problem. For them even small stretches of time seem to drag.
  • When we get the object of our desire (be it a Twinkie or a sexual partner), we engage in consummatory acts that Panksepp says reduce arousal in the brain and temporarily, at least, inhibit our urge to seek.
  • ...3 more annotations...
  • But our brains are designed to more easily be stimulated than satisfied. "The brain seems to be more stingy with mechanisms for pleasure than for desire," Berridge has said. This makes evolutionary sense. Creatures that lack motivation, that find it easy to slip into oblivious rapture, are likely to lead short (if happy) lives. So nature imbued us with an unquenchable drive to discover, to explore. Stanford University neuroscientist Brian Knutson has been putting people in MRI scanners and looking inside their brains as they play an investing game. He has consistently found that the pictures inside our skulls show that the possibility of a payoff is much more stimulating than actually getting one.
  • all our electronic communication devices—e-mail, Facebook feeds, texts, Twitter—are feeding the same drive as our searches. Since we're restless, easily bored creatures, our gadgets give us in abundance qualities the seeking/wanting system finds particularly exciting. Novelty is one. Panksepp says the dopamine system is activated by finding something unexpected or by the anticipation of something new. If the rewards come unpredictably—as e-mail, texts, updates do—we get even more carried away. No wonder we call it a "CrackBerry."
  • If humans are seeking machines, we've now created the perfect machines to allow us to seek endlessly. This perhaps should make us cautious. In Animals in Translation, Temple Grandin writes of driving two indoor cats crazy by flicking a laser pointer around the room. They wouldn't stop stalking and pouncing on this ungraspable dot of light—their dopamine system pumping. She writes that no wild cat would indulge in such useless behavior: "A cat wants to catch the mouse, not chase it in circles forever." She says "mindless chasing" makes an animal less likely to meet its real needs "because it short-circuits intelligent stalking behavior." As we chase after flickering bits of information, it's a salutary warning.
Ed Webb

The Wired Campus - ProfHacker Blog Highlights Widespread Interest in Teaching With Technology - The Chronicle of Higher Education - 0 views

  • a site that wants to look at the intersection of productivity, technology, and pedagogy in higher education
  • showing that the barrier of entry to this new stuff is lower than it seems
  • One hundred percent of this stuff we bring into our own classroom. If the slogan for software development is to eat your own dog food, we are always eating our own dog food. These are our assignments, our best practices.
  • I've been following via Google Reader for a few weeks - it's a VERY busy site, which I guess is what you get from 10 people producing a group blog.
Ed Webb

Google and Meta moved cautiously on AI. Then came OpenAI's ChatGPT. - The Washington Post - 0 views

  • The surge of attention around ChatGPT is prompting pressure inside tech giants including Meta and Google to move faster, potentially sweeping safety concerns aside
  • Tech giants have been skittish since public debacles like Microsoft’s Tay, which it took down in less than a day in 2016 after trolls prompted the bot to call for a race war, suggest Hitler was right and tweet “Jews did 9/11.”
  • Some AI ethicists fear that Big Tech’s rush to market could expose billions of people to potential harms — such as sharing inaccurate information, generating fake photos or giving students the ability to cheat on school tests — before trust and safety experts have been able to study the risks. Others in the field share OpenAI’s philosophy that releasing the tools to the public, often nominally in a “beta” phase after mitigating some predictable risks, is the only way to assess real world harms.
  • ...8 more annotations...
  • Silicon Valley’s sudden willingness to consider taking more reputational risk arrives as tech stocks are tumbling
  • A chatbot that pointed to one answer directly from Google could increase its liability if the response was found to be harmful or plagiarized.
  • AI has been through several hype cycles over the past decade, but the furor over DALL-E and ChatGPT has reached new heights.
  • Soon after OpenAI released ChatGPT, tech influencers on Twitter began to predict that generative AI would spell the demise of Google search. ChatGPT delivered simple answers in an accessible way and didn’t ask users to rifle through blue links. Besides, after a quarter of a century, Google’s search interface had grown bloated with ads and marketers trying to game the system.
  • Inside big tech companies, the system of checks and balances for vetting the ethical implications of cutting-edge AI isn’t as established as privacy or data security. Typically teams of AI researchers and engineers publish papers on their findings, incorporate their technology into the company’s existing infrastructure or develop new products, a process that can sometimes clash with other teams working on responsible AI over pressure to see innovation reach the public sooner.
  • Chatbots like OpenAI's ChatGPT routinely make factual errors and often switch their answers depending on how a question is asked
  • To Timnit Gebru, executive director of the nonprofit Distributed AI Research Institute, the prospect of Google sidelining its responsible AI team doesn’t necessarily signal a shift in power or safety concerns, because those warning of the potential harms were never empowered to begin with. “If we were lucky, we’d get invited to a meeting,” said Gebru, who helped lead Google’s Ethical AI team until she was fired for a paper criticizing large language models.
  • Rumman Chowdhury, who led Twitter's machine-learning ethics team until Elon Musk disbanded it in November, said she expects companies like Google to increasingly sideline internal critics and ethicists as they scramble to catch up with OpenAI. "We thought it was going to be China pushing the U.S., but looks like it's start-ups," she said.