
Instructional & Media Services at Dickinson College: Group items tagged "learning"


Ed Webb

The Wired Campus - A Year Later, a Texas University Says Giving Students iPhones Is an ... - 1 views

  • Abilene Christian University says handing out iPhones to its entire first-year class in 2008 has improved interaction between students and faculty members.
  • Does positive feeling mean better teaching and learning? Mr. Schubert adds that it's too early to collect enough data to understand how giving out iPhones improves education. Student testimonials in the report, however, highlight easier access to professors. One savvy student says having an iPhone means he's less confused in class. "My professor will ask a question about something and I don't know what it is, but right here on my phone, with just one touch, I have Dictionary.com, I have a Wikipedia app—I can look it up," said Tyler Sutphen, a marketing major. "I know what they're talking about, because it's right there."
Ed Webb

The Future of WPMu at bavatuesdays - 1 views

  • I grab feeds from external blogs all the time that are related to UMW and pull them into our sitewide “tags” blog (the name “tags” here is confusing; it is simply a republishing of everything in the entire WPMu install) with FeedWordPress. For example, I stumbled across this post in the tags blog on UMW Blogs tonight, which was actually being pulled in from a WordPress.com blog of a student who graduated years ago, but regularly blogs about her work in historic preservation. This particular post was all about a book she read as an undergraduate in Historic Preservation, and how great a resource it is. A valuable post, especially since the professor who recommended that book, W. Brown Morton, retired last year. There is a kind of eternal echo in a system like this: students, faculty, and staff can continue to feed into a community of teaching and learning well beyond their matriculation period, or even their career.
  • what we are doing as instructional technologists, scholars and students in higher ed right now is much bigger than a particular blogging system or software, I see my job as working with people to imagine the implications and possibilities of managing and maintaining their digital identity in a moment when we are truly in a deep transformation of information, identity, and scholarship.
  • we’ll host domains that professors purchase and, ideally, map all their domains onto one WP install that can manage many multi-blogging solutions from one install. The whole Russian Doll thing that WPMu can do with the Multi-Site Manager plugin. So you offer a Bluehost-like setup for faculty, and if that is too much, allow them to map a domain, take control of their own course work, and encourage an aggregated course management model that pushes students to take control of their digital identity and spaces by extension. Giving students a space and voice on your domain or application is not the same as asking them to create, manage and maintain their own space. Moreover, it doesn’t feed into the idea of a digital trajectory that starts well before they come to college and will end well after they leave. This model extends the community, and brings in key resources like a recent graduate discussing an out-of-print historic preservation textbook, assigned by a now-retired professor, as one of the best resources for an aspiring Preservation graduate student. This is what it is all about, right there, and it’s not gonna happen in silos and on someone else’s space; we need to provision, empower, and imagine the merge as a full-powered move to many, many domains of one’s own.
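The aggregation model described above, where FeedWordPress pulls posts from external RSS feeds into one sitewide "tags" blog, can be sketched outside WordPress. This is a minimal stdlib-only Python sketch, with hypothetical feed contents standing in for external student and faculty blogs:

```python
# Minimal sketch of the feed-aggregation ("tags blog") model: pull items
# from many external RSS feeds and republish them as one sitewide stream.
# FeedWordPress does this inside WPMu; this stands alone on the stdlib.
import xml.etree.ElementTree as ET

def parse_rss_items(rss_xml):
    """Extract title/link dictionaries from an RSS 2.0 document string."""
    root = ET.fromstring(rss_xml)
    return [
        {"title": item.findtext("title", ""), "link": item.findtext("link", "")}
        for item in root.iter("item")
    ]

def aggregate(feeds):
    """Merge items from many feeds into one republishable stream."""
    stream = []
    for xml_doc in feeds:
        stream.extend(parse_rss_items(xml_doc))
    return stream

# Hypothetical feeds standing in for external blogs.
feed_a = """<rss><channel>
  <item><title>Historic preservation notes</title><link>http://example.com/a</link></item>
</channel></rss>"""
feed_b = """<rss><channel>
  <item><title>Course reflections</title><link>http://example.com/b</link></item>
</channel></rss>"""

for post in aggregate([feed_a, feed_b]):
    print(post["title"], "->", post["link"])
```

In a real deployment the feeds would be fetched over HTTP on a schedule and deduplicated by GUID; the point is only that republishing is a simple merge over syndicated sources.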
Ed Webb

The Greatest and Most Flawed Experiment Ever in Online Learning - CogDogBlog - 1 views

  • I don’t think we should at all be talking about “putting courses online.” What we are really faced with is coming up with some quick alternative modes for students to complete course work without showing up on campus. This does not call for apps and vendor solutions, but what the best teachers always do- improvise, change up on the fly when things change.
  • my suggestion and strategy would be… do as little as possible online. Use online for communicating, caring, attending to people’s needs, but not really for being the “course”. Flip that stuff outside.
  • This is why I cringe when what I seem to hear is “Zoom! Zoom! Can we have 30 students in zoom?” Everything you try to do online is going to call for jumping unfair levels of barriers: access, technology, experience. I’d say recast your activities in ways students can do as much without going online- reading, writing, thinking, practicing, doing stuff away from the screen.
  • The most important things to me are quickly establishing, and having backup modes, for students to be in touch with you, and you with them. As individuals. It might be direct messaging, email, texting. It could be but need not be something Slack-like. I’d really go simplest (email)
  • Get going with web annotation tools
  • We need not have just talking sessions for use of video. Think about drop in hours with Whereby (the new appear.in) – it lacks a need for logins and downloads, and works on mobile.
  • This experiment is going to… well, I bet, go bad in a lot of ways. I don’t know what we can expect of inexperienced teachers and unprepared students, who on top of all the concerns they carry and we rarely see, now have to ponder where they might live and sustain income to live on. It will be interesting… but it need not be awful nor a disaster, if we go about it as sharing in the situation.
Ed Webb

Guest Post: The Complexities of Certainty | Just Visiting - 0 views

  • Privileges abound in academia, but so do experiences of loss, instability and fear. And into this situation we were called to respond to a pandemic.
  • It is tempting to reach for certainties when everything around us is in chaos, and for a vast swath of higher ed instructors, the rapid shift from face-to-face teaching to emergency distance learning has been chaos. Small wonder, then, that people have offered -- and clung to -- advice that seeks to bring order to disorder. Many people have advised instructors to prioritize professionalism, ditching the sweatpants and putting away the visible clutter in our homes before making a Zoom call, upholding concepts like "rigor" so that our standards do not slip. To some, these appeals to universal principles are right-minded and heartening, a bulwark against confusion and disarray. But to others they have felt oppressive, even dangerously out of touch with the world in which we and our students live.
  • certainties can be dangerous; their very power is based upon reifying well-worn inequities dressed up as tradition
  • there is no objective standard of success that we reach when we insist on rigor, which is too often deployed in defense of practices that are ableist and unkind
  • We are not just teachers, or scholars, or professionals. We are individuals thrown back in varying degrees on our own resources, worried about ourselves and our families and friends as we navigate the effects of COVID-19. Many of us are deeply anxious and afraid. Our pre-existing frailties have been magnified; we feel vulnerable, distracted and at sea. Our loved ones are sick, even dying. This is trauma. Few of us have faced such world-changing circumstances before, and as our minds absorb the impact of that reality, our brains cannot perform as capably as they usually would.
  • The most professional people I know right now are those who show up, day after day, to teach under extraordinary circumstances. Perhaps they do it with their laundry waiting to be folded, while their children interrupt, thinking constantly of their loved ones, weathering loneliness, wearing sweatpants and potentially in need of a haircut. But I know they do it while acknowledging this is not the world in which we taught two months before, and that every student is facing disruption, uncertainty and distraction. They do it creatively, making room for the unexpected, challenging their students, with the world a participant in the conversation.
Ed Webb

Clear backpacks, monitored emails: life for US students under constant surveillance | E... - 0 views

  • This level of surveillance is “not too over-the-top”, Ingrid said, and she feels her classmates are generally “accepting” of it.
  • One leading student privacy expert estimated that as many as a third of America’s roughly 15,000 school districts may already be using technology that monitors students’ emails and documents for phrases that might flag suicidal thoughts, plans for a school shooting, or a range of other offenses.
  • Some parents said they were alarmed and frightened by schools’ new monitoring technologies. Others said they were conflicted, seeing some benefits to schools watching over what kids are doing online, but uncertain if their schools were striking the right balance with privacy concerns. Many said they were not even sure what kind of surveillance technology their schools might be using, and that the permission slips they had signed when their kids brought home school devices had told them almost nothing.
  • When Dapier talks with other teen librarians about the issue of school surveillance, “we’re very alarmed,” he said. “It sort of trains the next generation that [surveillance] is normal, that it’s not an issue. What is the next generation’s Mark Zuckerberg going to think is normal?”
  • “It’s the school as panopticon, and the sweeping searchlight beams into homes, now, and to me, that’s just disastrous to intellectual risk-taking and creativity.”
  • “They’re so unclear that I’ve just decided to cut off the research completely, to not do any of it.”
  • “They are all mandatory, and the accounts have been created before we’ve even been consulted,” he said. Parents are given almost no information about how their children’s data is being used, or the business models of the companies involved. Any time his kids complete school work through a digital platform, they are generating huge amounts of very personal, and potentially very valuable, data. The platforms know what time his kids do their homework, and whether it’s done early or at the last minute. They know what kinds of mistakes his kids make on math problems.
  • Felix, now 12, said he is frustrated that the school “doesn’t really [educate] students on what is OK and what is not OK. They don’t make it clear when they are tracking you, or not, or what platforms they track you on. “They don’t really give you a list of things not to do,” he said. “Once you’re in trouble, they act like you knew.”
  • As of 2018, at least 60 American school districts had also spent more than $1m on separate monitoring technology to track what their students were saying on public social media accounts, an amount that spiked sharply in the wake of the 2018 Parkland school shooting, according to the Brennan Center for Justice, a progressive advocacy group that compiled and analyzed school contracts with a subset of surveillance companies.
  • Many parents also said that they wanted more transparency and more parental control over surveillance. A few years ago, Ben, a tech professional from Maryland, got a call from his son’s principal to set up an urgent meeting. His son, then about nine or 10 years old, had opened up a school Google document and typed “I want to kill myself.” It was not until he and his son were in a serious meeting with school officials that Ben found out what happened: his son had typed the words on purpose, curious about what would happen. “The smile on his face gave away that he was testing boundaries, and not considering harming himself,” Ben said. (He asked that his last name and his son’s school district not be published, to preserve his son’s privacy.) The incident was resolved easily, he said, in part because Ben’s family already had close relationships with the school administrators.
  • there is still no independent evaluation of whether this kind of surveillance technology actually works to reduce violence and suicide.
  • Certain groups of students could easily be targeted by the monitoring more intensely than others, she said. Would Muslim students face additional surveillance? What about black students? Her daughter, who is 11, loves hip-hop music. “Maybe some of that language could be misconstrued, by the wrong ears or the wrong eyes, as potentially violent or threatening,” she said.
  • The Parent Coalition for Student Privacy was founded in 2014, in the wake of parental outrage over the attempt to create a standardized national database that would track hundreds of data points about public school students, from their names and social security numbers to their attendance, academic performance, and disciplinary and behavior records, and share the data with education tech companies. The effort, which had been funded by the Gates Foundation, collapsed in 2014 after fierce opposition from parents and privacy activists.
  • “More and more parents are organizing against the onslaught of ed tech and the loss of privacy that it entails. But at the same time, there’s so much money and power and political influence behind these groups,”
  • some privacy experts – and students – said they are concerned that surveillance at school might actually be undermining students’ wellbeing
  • “I do think the constant screen surveillance has affected our anxiety levels and our levels of depression.” “It’s over-guarding kids,” she said. “You need to let them make mistakes, you know? That’s kind of how we learn.”
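The phrase-flagging approach these monitoring tools apparently use can be approximated by a naive matcher. This hypothetical sketch (the phrase list is illustrative, not from any real product) also makes concrete the false-positive worry raised above, that song lyrics or casual speech can look identical to a threat:

```python
# Naive sketch of phrase-based student monitoring (hypothetical phrase list).
# Real products are opaque; the point is that a flagged phrase inside an
# innocuous sentence is indistinguishable from one in a genuine crisis.
FLAGGED_PHRASES = ["kill myself", "shoot up"]  # illustrative only

def flag(text):
    """Return the flagged phrases found in a piece of student writing."""
    lowered = text.lower()
    return [phrase for phrase in FLAGGED_PHRASES if phrase in lowered]

# A genuine cry for help and everyday figurative speech trip the same filter:
print(flag("I want to kill myself"))                  # flagged, appropriately
print(flag("these beats kill, myself I trust"))       # punctuation saves this one
print(flag("gonna shoot up the court with threes"))   # flagged: false positive
```

Anything smarter than substring matching (context, intent, slang) is exactly where vendors' claims outrun the independent evidence the article says is missing.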
Ed Webb

I unintentionally created a biased AI algorithm 25 years ago - tech companies are still... - 0 views

  • How and why do well-educated, well-intentioned scientists produce biased AI systems? Sociological theories of privilege provide one useful lens.
  • Scientists also face a nasty subconscious dilemma when incorporating diversity into machine learning models: Diverse, inclusive models perform worse than narrow models.
  • fairness can still be the victim of competitive pressures in academia and industry. The flawed Bard and Bing chatbots from Google and Microsoft are recent evidence of this grim reality. The commercial necessity of building market share led to the premature release of these systems.
  • Their training data is biased. They are designed by an unrepresentative group. They face the mathematical impossibility of treating all categories equally. They must somehow trade accuracy for fairness. And their biases are hiding behind millions of inscrutable numerical parameters.
  • biased AI systems can still be created unintentionally and easily. It’s also clear that the bias in these systems can be harmful, hard to detect and even harder to eliminate.
  • with North American computer science doctoral programs graduating only about 23% female, and 3% Black and Latino students, there will continue to be many rooms and many algorithms in which underrepresented groups are not represented at all.
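The trade-offs the excerpts describe, scores that are noisier for underrepresented groups and the impossibility of equalizing every metric at once, can be made concrete with a toy example. All numbers here are hypothetical; the sketch only shows that one shared decision threshold can yield lower accuracy and a higher false-positive rate for the group the model knows less well:

```python
# Toy illustration (hypothetical scores): one threshold, unequal outcomes.

def rates(scores_and_labels, threshold):
    """Return (accuracy, false_positive_rate) for a score threshold."""
    tp = fp = tn = fn = 0
    for score, label in scores_and_labels:
        predicted_positive = score >= threshold
        if predicted_positive and label:
            tp += 1
        elif predicted_positive and not label:
            fp += 1
        elif not predicted_positive and not label:
            tn += 1
        else:
            fn += 1
    accuracy = (tp + tn) / len(scores_and_labels)
    fpr = fp / (fp + tn) if (fp + tn) else 0.0
    return accuracy, fpr

# Group A: well represented in training data -> scores separate cleanly.
group_a = [(0.9, True), (0.8, True), (0.7, True),
           (0.2, False), (0.15, False), (0.1, False)]
# Group B: underrepresented -> noisier scores for the same true labels.
group_b = [(0.6, True), (0.55, True), (0.4, True),
           (0.5, False), (0.45, False), (0.3, False)]

threshold = 0.5
acc_a, fpr_a = rates(group_a, threshold)
acc_b, fpr_b = rates(group_b, threshold)
print(f"Group A: accuracy={acc_a:.2f}, FPR={fpr_a:.2f}")
print(f"Group B: accuracy={acc_b:.2f}, FPR={fpr_b:.2f}")
```

Moving the threshold per group can equalize one rate only by unsettling another, which is the accuracy-for-fairness trade the author names, hidden in real systems behind millions of parameters rather than a dozen toy scores.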
Ed Webb

Google and Meta moved cautiously on AI. Then came OpenAI's ChatGPT. - The Washington Post - 0 views

  • The surge of attention around ChatGPT is prompting pressure inside tech giants including Meta and Google to move faster, potentially sweeping safety concerns aside
  • Tech giants have been skittish since public debacles like Microsoft’s Tay, which it took down in less than a day in 2016 after trolls prompted the bot to call for a race war, suggest Hitler was right and tweet “Jews did 9/11.”
  • Some AI ethicists fear that Big Tech’s rush to market could expose billions of people to potential harms — such as sharing inaccurate information, generating fake photos or giving students the ability to cheat on school tests — before trust and safety experts have been able to study the risks. Others in the field share OpenAI’s philosophy that releasing the tools to the public, often nominally in a “beta” phase after mitigating some predictable risks, is the only way to assess real world harms.
  • Silicon Valley’s sudden willingness to consider taking more reputational risk arrives as tech stocks are tumbling
  • A chatbot that pointed to one answer directly from Google could increase its liability if the response was found to be harmful or plagiarized.
  • AI has been through several hype cycles over the past decade, but the furor over DALL-E and ChatGPT has reached new heights.
  • Soon after OpenAI released ChatGPT, tech influencers on Twitter began to predict that generative AI would spell the demise of Google search. ChatGPT delivered simple answers in an accessible way and didn’t ask users to rifle through blue links. Besides, after a quarter of a century, Google’s search interface had grown bloated with ads and marketers trying to game the system.
  • Inside big tech companies, the system of checks and balances for vetting the ethical implications of cutting-edge AI isn’t as established as privacy or data security. Typically teams of AI researchers and engineers publish papers on their findings, incorporate their technology into the company’s existing infrastructure or develop new products, a process that can sometimes clash with other teams working on responsible AI over pressure to see innovation reach the public sooner.
  • Chatbots like OpenAI’s ChatGPT routinely make factual errors and often switch their answers depending on how a question is asked
  • To Timnit Gebru, executive director of the nonprofit Distributed AI Research Institute, the prospect of Google sidelining its responsible AI team doesn’t necessarily signal a shift in power or safety concerns, because those warning of the potential harms were never empowered to begin with. “If we were lucky, we’d get invited to a meeting,” said Gebru, who helped lead Google’s Ethical AI team until she was fired for a paper criticizing large language models.
  • Rumman Chowdhury, who led Twitter’s machine-learning ethics team until Elon Musk disbanded it in November, said she expects companies like Google to increasingly sideline internal critics and ethicists as they scramble to catch up with OpenAI. “We thought it was going to be China pushing the U.S., but looks like it’s start-ups,” she said.