
Home/ Instructional & Media Services at Dickinson College/ Group items tagged training


Ed Webb

Google Researchers' Attack Prompts ChatGPT to Reveal Its Training Data - 0 views

  • researchers showed that there are large amounts of personally identifiable information (PII) in OpenAI’s large language models. They also showed that, on a public version of ChatGPT, the chatbot spit out large passages of text scraped verbatim from other places on the internet
  • ChatGPT’s “alignment techniques do not eliminate memorization,” meaning that it sometimes spits out training data verbatim. This included PII, entire poems, “cryptographically-random identifiers” like Bitcoin addresses, passages from copyrighted scientific research papers, website addresses, and much more.
  • The researchers wrote that they spent $200 to create “over 10,000 unique examples” of training data, which they say is a total of “several megabytes” of training data. The researchers suggest that using this attack, with enough money, they could have extracted gigabytes of training data. The entirety of OpenAI’s training data is unknown, but GPT-3 was trained on anywhere from many hundreds of GB to a few dozen terabytes of text data.
  • the world’s most important and most valuable AI company has been built on the backs of the collective work of humanity, often without permission, and without compensation to those who created it
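The extraction economics reported above can be sanity-checked with a rough back-of-envelope sketch. The dollar amount and example count come from the article; "several megabytes" is assumed here to mean roughly 5 MB, which is a guess, not a figure from the source:

```python
# Back-of-envelope extrapolation of the attack cost reported above.
# Known from the article: $200 yielded ~10,000 unique examples,
# totalling "several megabytes". We assume ~5 MB for "several".
cost_usd = 200
examples = 10_000
extracted_mb = 5  # assumption, not stated precisely in the article

cost_per_example = cost_usd / examples        # dollars per extracted example
cost_per_gb = cost_usd / extracted_mb * 1024  # dollars per GB at this rate
print(f"${cost_per_example:.2f}/example, ~${cost_per_gb:,.0f}/GB")
```

Under that assumption, scaling to the gigabytes the researchers say is achievable would cost on the order of tens of thousands of dollars per gigabyte — cheap relative to the sensitivity of what the attack recovers.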
R DAVIS

Delhi School of Communication offers Advertising Courses in Delhi - 0 views

  • DSC has been a first in many domains. It pioneered the concept of Advertising Industry training, also bridging the theory-practice gap through its Apprenticeships and Internships.
Ed Webb

Bad News : CJR - 0 views

  • Students in Howard Rheingold’s journalism class at Stanford recently teamed up with NewsTrust, a nonprofit Web site that enables people to review and rate news articles for their level of quality, in a search for lousy journalism.
  • the News Hunt is a way of getting young journalists to critically examine the work of professionals. For Rheingold, an influential writer and thinker about the online world and the man credited with coining the phrase “virtual community,” it’s all about teaching them “crap detection.”
  • last year Rheingold wrote an important essay about the topic for the San Francisco Chronicle’s Web site
  • What’s at stake is no less than the quality of the information available in our society, and our collective ability to evaluate its accuracy and value. “Are we going to have a world filled with people who pass along urban legends and hoaxes?” Rheingold said, “or are people going to educate themselves about these tools [for crap detection] so we will have collective intelligence instead of misinformation, spam, urban legends, and hoaxes?”
  • I previously called fact-checking “one of the great American pastimes of the Internet age.” But, as Rheingold noted, the opposite is also true: the manufacture and promotion of bullshit is endemic. One couldn’t exist without the other. That makes Rheingold’s essay, his recent experiment with NewsTrust, and his wiki of online critical-thinking tools essential reading for journalists. (He’s also writing a book about this topic.)
  • I believe if we want kids to succeed online, the biggest danger is not porn or predators—the biggest danger is them not being able to distinguish truth from carefully manufactured misinformation or bullshit
  • As relevant to general education as to journalism training
Ed Webb

ChatGPT Is Nothing Like a Human, Says Linguist Emily Bender - 0 views

  • Please do not conflate word form and meaning. Mind your own credulity.
  • We’ve learned to make “machines that can mindlessly generate text,” Bender told me when we met this winter. “But we haven’t learned how to stop imagining the mind behind it.”
  • A handful of companies control what PricewaterhouseCoopers called a “$15.7 trillion game changer of an industry.” Those companies employ or finance the work of a huge chunk of the academics who understand how to make LLMs. This leaves few people with the expertise and authority to say, “Wait, why are these companies blurring the distinction between what is human and what’s a language model? Is this what we want?”
  • “We call on the field to recognize that applications that aim to believably mimic humans bring risk of extreme harms,” she co-wrote in 2021. “Work on synthetic human behavior is a bright line in ethical AI development, where downstream effects need to be understood and modeled in order to block foreseeable harm to society and different social groups.”
  • chatbots that we easily confuse with humans are not just cute or unnerving. They sit on a bright line. Obscuring that line and blurring — bullshitting — what’s human and what’s not has the power to unravel society
  • She began learning from, then amplifying, Black women’s voices critiquing AI, including those of Joy Buolamwini (she founded the Algorithmic Justice League while at MIT) and Meredith Broussard (the author of Artificial Unintelligence: How Computers Misunderstand the World). She also started publicly challenging the term artificial intelligence, a sure way, as a middle-aged woman in a male field, to get yourself branded as a scold. The idea of intelligence has a white-supremacist history. And besides, “intelligent” according to what definition? The three-stratum definition? Howard Gardner’s theory of multiple intelligences? The Stanford-Binet Intelligence Scale? Bender remains particularly fond of an alternative name for AI proposed by a former member of the Italian Parliament: “Systematic Approaches to Learning Algorithms and Machine Inferences.” Then people would be out here asking, “Is this SALAMI intelligent? Can this SALAMI write a novel? Does this SALAMI deserve human rights?”
  • Tech-makers assuming their reality accurately represents the world create many different kinds of problems. The training data for ChatGPT is believed to include most or all of Wikipedia, pages linked from Reddit, a billion words grabbed off the internet. (It can’t include, say, e-book copies of everything in the Stanford library, as books are protected by copyright law.) The humans who wrote all those words online overrepresent white people. They overrepresent men. They overrepresent wealth. What’s more, we all know what’s out there on the internet: vast swamps of racism, sexism, homophobia, Islamophobia, neo-Nazism.
  • One fired Google employee told me succeeding in tech depends on “keeping your mouth shut to everything that’s disturbing.” Otherwise, you’re a problem. “Almost every senior woman in computer science has that rep. Now when I hear, ‘Oh, she’s a problem,’ I’m like, Oh, so you’re saying she’s a senior woman?”
  • “We haven’t learned to stop imagining the mind behind it.”
  • In March 2021, Bender published “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” with three co-authors. After the paper came out, two of the co-authors, both women, lost their jobs as co-leads of Google’s Ethical AI team.
  • “On the Dangers of Stochastic Parrots” is not a write-up of original research. It’s a synthesis of LLM critiques that Bender and others have made: of the biases encoded in the models; the near impossibility of studying what’s in the training data, given the fact they can contain billions of words; the costs to the climate; the problems with building technology that freezes language in time and thus locks in the problems of the past. Google initially approved the paper, a requirement for publications by staff. Then it rescinded approval and told the Google co-authors to take their names off it. Several did, but Google AI ethicist Timnit Gebru refused. Her colleague (and Bender’s former student) Margaret Mitchell changed her name on the paper to Shmargaret Shmitchell, a move intended, she said, to “index an event and a group of authors who got erased.” Gebru lost her job in December 2020, Mitchell in February 2021. Both women believe this was retaliation and brought their stories to the press. The stochastic-parrot paper went viral, at least by academic standards. The phrase stochastic parrot entered the tech lexicon.
  • Tech execs loved it. Programmers related to it. OpenAI CEO Sam Altman was in many ways the perfect audience: a self-identified hyperrationalist so acculturated to the tech bubble that he seemed to have lost perspective on the world beyond. “I think the nuclear mutually assured destruction rollout was bad for a bunch of reasons,” he said on AngelList Confidential in November. He’s also a believer in the so-called singularity, the tech fantasy that, at some point soon, the distinction between human and machine will collapse. “We are a few years in,” Altman wrote of the cyborg merge in 2017. “It’s probably going to happen sooner than most people think. Hardware is improving at an exponential rate … and the number of smart people working on AI is increasing exponentially as well. Double exponential functions get away from you fast.” On December 4, four days after ChatGPT was released, Altman tweeted, “i am a stochastic parrot, and so r u.”
  • “This is one of the moves that turn up ridiculously frequently. People saying, ‘Well, people are just stochastic parrots,’” she said. “People want to believe so badly that these language models are actually intelligent that they’re willing to take themselves as a point of reference and devalue that to match what the language model can do.”
  • The membrane between academia and industry is permeable almost everywhere; the membrane is practically nonexistent at Stanford, a school so entangled with tech that it can be hard to tell where the university ends and the businesses begin.
  • “No wonder that men who live day in and day out with machines to which they believe themselves to have become slaves begin to believe that men are machines.”
  • what’s tenure for, after all?
  • LLMs are tools made by specific people — people who stand to accumulate huge amounts of money and power, people enamored with the idea of the singularity. The project threatens to blow up what is human in a species sense. But it’s not about humility. It’s not about all of us. It’s not about becoming a humble creation among the world’s others. It’s about some of us — let’s be honest — becoming a superspecies. This is the darkness that awaits when we lose a firm boundary around the idea that humans, all of us, are equally worthy as is.
  • The AI dream is “governed by the perfectibility thesis, and that’s where we see a fascist form of the human.”
  • “Why are you trying to trick people into thinking that it really feels sad that you lost your phone?”
Ed Webb

A Conversation With Bill Gates - Technology - The Chronicle of Higher Education - 2 views

  • argues for radical reform of college teaching, advocating a move toward a "flipped" classroom, where students watch videos from superstar professors as homework and use class time for group projects and other interactive activities
  • it's much harder to then take it for the broad set of students in the institutional framework and decide, OK, where is technology the best and where is the face-to-face the best. And they don't have very good metrics of what is their value-added. If you try and compare two universities, you'll find out a lot more about the inputs—this university has high SAT scores compared to this one. And it's sort of the opposite of what you'd think. You'd think people would say, "We take people with low SATs and make them really good lawyers." Instead they say, "We take people with very high SATs and we don't really know what we create, but at least they're smart when they show up here so maybe they still are when we're done with them."
  • The various rankings have focused on the input side of the equation, not the output
  • Something that's not purely digital but also that the efficiency of the face-to-face time is much greater
  • Can we transform this credentialing process? And in fact the ideal would be to separate out the idea of proving your knowledge from the way you acquire that knowledge
  • Employers have decided that having the breadth of knowledge that's associated with a four-year degree is often something they want to see in the people they give that job to. So instead of testing for that different profession, they'll be testing that you have that broader exposure
  • that failing student is a disaster for everyone
  • What is it that we need to do to strengthen this fundamental part of our country that both in a broad sort of economic level and an individual-rights level is the key enabler. And it's amazing how little effort's been put into this. Of saying, OK, why are some teachers at any different level way better than others? You've got universities in this country with a 7-percent completion rate. Why is it that they don't come under pressure to change what they're doing to come up with a better way of doing things?
  • We bet on the change agents within the universities. And so, various universities come to us and say, We have some ideas about completion rates, here are some things we want to try out, it's actually budget that holds us back from being able to do that. People come to us and say, We want to try a hybrid course where some piece is online, some piece is not, and we're aiming this at the students that are in the most need, not just the most elite. So that's who we're giving grants to, people who are trying out new things in universities. Now the idea that if you have a few universities that figure out how to do things well, how do you spread these best practices, that's a tough challenge. It's not quite the same way as in the private sector, where if somebody's doing something better, the price signals force that to be adopted broadly. Here, things move very slowly even if they are an improvement.
  • Q. Some of what you've been talking about is getting people to completion by weeding out extraneous courses. There's a concern by some that that might create pressure to make universities into a kind of job-training area without the citizenship focus of that broad liberal-arts degree.
  • it is important to distinguish when people are taking extra courses that broaden them as a citizen and that would be considered a plus, versus they're just marking time because they're being held up because the capacity doesn't exist in the system to let them do what they want to do. As you go through the student survey data, it's mostly the latter. But I'm the biggest believer in taking a lot of different things. And hopefully, if these courses are appealing enough, we can get people even after they've finished a college degree to want to go online and take these courses.
  • Other countries are sending more kids to college. They're getting higher completion rates. They've moved ahead of us
  • There's nothing that was more important to me in terms of the kind of opportunity I had personally. I went to a great high school. I went to a great university. I only went three years, but it doesn't matter; it was still extremely valuable to me to be in that environment. And I had fantastic professors throughout that whole thing. And so, if every kid could have that kind of education, we'd achieve a lot of goals both at the individual and country level
  • One of the strengths of higher ed is the variety. But the variety has also meant that if somebody is doing something particularly well, it's hard to map that across a lot of different institutions. There aren't very many good metrics. At least in high schools we can talk about dropout rates. Completion rate was really opaque, and not talked about a lot. The quality-measure things are equally different. We don't have a gold standard like SAT scores or No Child Left Behind up at the collegiate level. And of course, kids are more dispersed in terms of what their career goals are at that point. So it's got some things that make it particularly challenging, but it has a lot in common, and I'd say it's equally important to get it right
Ed Webb

Reflections on open courses « Connectivism - 0 views

  • There is value of blending traditional with emergent knowledge spaces (online conferences and traditional journals)
    - Learners will create and innovate if they can express ideas and concepts in their own spaces and through their own expertise (i.e. hosting events in Second Life)
    - Courses are platforms for innovation. Too rigid a structure puts the educator in full control. Using a course as a platform fosters creativity…and creativity generates a bit of chaos and can be unsettling to individuals who prefer a structure with which they are familiar.
    - (cliche) Letting go of control is a bit stressful, but surprisingly rewarding in the new doors it opens and liberating in how it brings others in to assist in running a course and advancing the discussion.
    - People want to participate…but they will only do so once they have “permission” and a forum in which to utilize existing communication/technological skills.
  • The internet is a barrier-reducing system. In theory, everyone has a voice online (the reality of technology ownership, digital skills, and internet access add an unpleasant dimension). Costs of duplication are reduced. Technology (technique) is primarily a duplicationary process, as evidenced by the printing press, assembly line, and now the content duplication ability of digital technologies. As a result, MOOCs embody, rather than reflect, practices within the digital economy. MOOCs reduce barriers to information access and to the dialogue that permits individuals (and society) to grow knowledge. Much of the technical innovation in the last several centuries has permitted humanity to extend itself physically (cars, planes, trains, telescopes). The internet, especially in recent developments of connective and collaborative applications, is a cognitive extension for humanity. Put another way, the internet offers a model where the reproduction of knowledge is not confined to the production of physical objects.
  • Knowledge is a mashup. Many people contribute. Many different forums are used. Multiple media permit varied and nuanced expressions of knowledge. And, because the information base (which is required for knowledge formation) changes so rapidly, being properly connected to the right people and information is vitally important. The need for proper connectedness to the right people and information is readily evident in intelligence communities. Consider the Christmas day bomber. Or 9/11. The information was being collected. But not connected.
  • The open model of participation calls into question where value is created in the education system. Gutenberg created a means to duplicate content. The social web creates the opportunity for many-to-many interactions and to add a global social layer on content creation and knowledge growth.
  • Whatever can be easily duplicated cannot serve as the foundation for economic value. Integration and connectedness are economic value points.
  • In education, content can easily be produced (it’s important but has limited economic value). Lectures also have limited value (easy to record and to duplicate). Teaching – as done in most universities – can be duplicated. Learning, on the other hand, can’t be duplicated. Learning is personal, it has to occur one learner at a time. The support needed for learners to learn is a critical value point.
  • Learning, however, requires a human, social element: both peer-based and through interaction with subject area experts
  • Content is readily duplicated, reducing its value economically. It is still critical for learning – all fields have core elements that learners must master before they can advance (research in expertise supports this notion).
    - Teaching can be duplicated (lectures can be recorded, Elluminate or similar webconferencing system can bring people from around the world into a class). Assisting learners in the learning process, correcting misconceptions (see Private Universe), and providing social support and brokering introductions to other people and ideas in the discipline is critical.
    - Accreditation is a value statement – it is required when people don’t know each other. Content was the first area of focus in open education. Teaching (i.e. MOOCs) are the second. Accreditation will be next, but, before progress can be made, profile, identity, and peer-rating systems will need to improve dramatically. The underlying trust mechanism on which accreditation is based cannot yet be duplicated in open spaces (at least, it can’t be duplicated to such a degree that people who do not know each other will trust the mediating agent of open accreditation)
  • The skills that are privileged and rewarded in a MOOC are similar to those that are needed to be effective in communicating with others and interacting with information online (specifically, social media and information sources like journals, databases, videos, lectures, etc.). Creative skills are the most critical. Facilitators and learners need something to “point to”. When a participant creates an insightful blog post, a video, a concept map, or other resource/artifact it generally gets attention.
  • Intentional diversity – not necessarily a digital skill, but the ability to self-evaluate ones network and ensure diversity of ideologies is critical when information is fragmented and is at risk of being sorted by single perspectives/ideologies.
  • The volume of information is very disorienting in a MOOC. For example, in CCK08, the initial flow of postings in Moodle, three weekly live sessions, Daily newsletter, and weekly readings and assignments proved to be overwhelming for many participants. Stephen and I somewhat intentionally structured the course for this disorienting experience. Deciding who to follow, which course concepts are important, and how to form sub-networks and sub-systems to assist in sensemaking are required to respond to information abundance. The process of coping and wayfinding (ontology) is as much a lesson in the learning process as mastering the content (epistemology). Learners often find it difficult to let go of the urge to master all content, read all the comments and blog posts.
  • e. Learning is a social trust-based process.
  • Patience, tolerance, suspension of judgment, and openness to other cultures and ideas are required to form social connections and negotiate misunderstandings.
  • An effective digital citizenry needs the skills to participate in important conversations. The growth of digital content and social networks raises the need for citizens to have the technical and conceptual skills to express their ideas and engage with others in those spaces. MOOCs are a first-generation testing ground for knowledge growth in a distributed, global, digital world. Their role in developing a digital citizenry is still unclear, but democratic societies require a populace with the skills to participate in growing a society’s knowledge. As such, MOOCs, or similar open transparent learning experiences that foster the development of citizens’ confidence to engage and create collaboratively, are important for the future of society.
Ed Webb

News: A Gripe Session at Blackboard - Inside Higher Ed - 0 views

  • At an open "listening session" with top executives of Blackboard here Wednesday at the company's annual conference, college officials expressed frustration with many of the system's fundamental characteristics. At times, the meeting seemed to turn into a communal gripe session, with complaints ranging from the system's discussion forum application, to the improved -- but still lacking -- user support, to the training materials for faculty members. Participants' concerns were often greeted with nods of agreement and outright applause from their peers as they spoke of their frustrations with the system.
  • "We recognize there are still some shortcomings in our products," responded Michael Chasen, president and CEO of Blackboard.
Ed Webb

K-12 Media Literacy No Panacea for Fake News, Report Argues - Digital Education - Educa... - 0 views

  • "Media literacy has long focused on personal responsibility, which can not only imbue individuals with a false sense of confidence in their skills, but also put the onus of monitoring media effects on the audience, rather than media creators, social media platforms, or regulators,"
  • the need to better understand the modern media environment, which is heavily driven by algorithm-based personalization on social-media platforms, and the need to be more systematic about evaluating the impact of various media-literacy strategies and interventions
  • In response, bills to promote media literacy in schools have been introduced or passed in more than a dozen states. A range of nonprofit, corporate, and media organizations have stepped up efforts to promote related curricula and programs. Such efforts should be applauded—but not viewed as a "panacea," the Data & Society researchers argue.
  • existing efforts "focus on the interpretive responsibilities of the individual,"
  • "if bad actors intentionally dump disinformation online with an aim to distract and overwhelm, is it possible to safeguard against media manipulation?"
  • A 2012 meta-analysis by academic researchers found that media literacy efforts could help boost students' critical awareness of messaging, bias, and representation in the media they consumed. There have been small studies suggesting that media-literacy efforts can change students' behaviors—for example, by making them less likely to seek out violent media for their own consumption. And more recently, a pair of researchers found that media-literacy training was more important than prior political knowledge when it comes to adopting a critical stance to partisan media content.
  • the roles of institutions, technology companies, and governments
Ed Webb

Clear backpacks, monitored emails: life for US students under constant surveillance | E... - 0 views

  • This level of surveillance is “not too over-the-top”, Ingrid said, and she feels her classmates are generally “accepting” of it.
  • One leading student privacy expert estimated that as many as a third of America’s roughly 15,000 school districts may already be using technology that monitors students’ emails and documents for phrases that might flag suicidal thoughts, plans for a school shooting, or a range of other offenses.
  • Some parents said they were alarmed and frightened by schools’ new monitoring technologies. Others said they were conflicted, seeing some benefits to schools watching over what kids are doing online, but uncertain if their schools were striking the right balance with privacy concerns. Many said they were not even sure what kind of surveillance technology their schools might be using, and that the permission slips they had signed when their kids brought home school devices had told them almost nothing
  • When Dapier talks with other teen librarians about the issue of school surveillance, “we’re very alarmed,” he said. “It sort of trains the next generation that [surveillance] is normal, that it’s not an issue. What is the next generation’s Mark Zuckerberg going to think is normal?
  • “It’s the school as panopticon, and the sweeping searchlight beams into homes, now, and to me, that’s just disastrous to intellectual risk-taking and creativity.”
  • “They’re so unclear that I’ve just decided to cut off the research completely, to not do any of it.”
  • “They are all mandatory, and the accounts have been created before we’ve even been consulted,” he said. Parents are given almost no information about how their children’s data is being used, or the business models of the companies involved. Any time his kids complete school work through a digital platform, they are generating huge amounts of very personal, and potentially very valuable, data. The platforms know what time his kids do their homework, and whether it’s done early or at the last minute. They know what kinds of mistakes his kids make on math problems.
  • Felix, now 12, said he is frustrated that the school “doesn’t really [educate] students on what is OK and what is not OK. They don’t make it clear when they are tracking you, or not, or what platforms they track you on. “They don’t really give you a list of things not to do,” he said. “Once you’re in trouble, they act like you knew.”
  • As of 2018, at least 60 American school districts had also spent more than $1m on separate monitoring technology to track what their students were saying on public social media accounts, an amount that spiked sharply in the wake of the 2018 Parkland school shooting, according to the Brennan Center for Justice, a progressive advocacy group that compiled and analyzed school contracts with a subset of surveillance companies.
  • Many parents also said that they wanted more transparency and more parental control over surveillance. A few years ago, Ben, a tech professional from Maryland, got a call from his son’s principal to set up an urgent meeting. His son, then about nine or ten years old, had opened up a school Google document and typed “I want to kill myself.” It was not until he and his son were in a serious meeting with school officials that Ben found out what happened: his son had typed the words on purpose, curious about what would happen. “The smile on his face gave away that he was testing boundaries, and not considering harming himself,” Ben said. (He asked that his last name and his son’s school district not be published, to preserve his son’s privacy.) The incident was resolved easily, he said, in part because Ben’s family already had close relationships with the school administrators.
  • there is still no independent evaluation of whether this kind of surveillance technology actually works to reduce violence and suicide.
  • Certain groups of students could easily be targeted by the monitoring more intensely than others, she said. Would Muslim students face additional surveillance? What about black students? Her daughter, who is 11, loves hip-hop music. “Maybe some of that language could be misconstrued, by the wrong ears or the wrong eyes, as potentially violent or threatening,” she said.
  • The Parent Coalition for Student Privacy was founded in 2014, in the wake of parental outrage over the attempt to create a standardized national database that would track hundreds of data points about public school students, from their names and social security numbers to their attendance, academic performance, and disciplinary and behavior records, and share the data with education tech companies. The effort, which had been funded by the Gates Foundation, collapsed in 2014 after fierce opposition from parents and privacy activists.
  • “More and more parents are organizing against the onslaught of ed tech and the loss of privacy that it entails. But at the same time, there’s so much money and power and political influence behind these groups,”
  • some privacy experts – and students – said they are concerned that surveillance at school might actually be undermining students’ wellbeing
  • “I do think the constant screen surveillance has affected our anxiety levels and our levels of depression.” “It’s over-guarding kids,” she said. “You need to let them make mistakes, you know? That’s kind of how we learn.”
Ed Webb

How much 'work' should my online course be for me and my students? - Dave's Educational... - 0 views

  • My recommendation for people planning their courses is to stop thinking about ‘contact hours’. A contact hour is a constraint that is applied to the learning process because of the organizational need to have people share a space in a building. Also called a credit hour (particularly in American universities), this has meant, from a workload perspective, that for every in-class hour a student is meant to do at least 2 (in some cases 3) hours of study outside of class. Even Cliff Notes agrees with me. So… for a full load, that’s 30 to 45 total work hours for students per course that you are designing.
  • Simple breakdown (not quite 90, yes I know): Watch 3 hours of video* – 5 hours; Read stuff – 20 hours; Listen to me talk – 15 hours; Talk with other students in a group – 15 hours; Write reflections about group chat – 7.5 hours; Respond to other people’s reflections – 7.5 hours; Work on a term paper – 10 hours; Do weekly quiz – 3 hours; Write take-home mid-term – 3 hours; Write take-home final – 3 hours
  • A thousand variations of this might be imagined
  • a possible structure recommended by one of the faculty we were talking to was – read/watch, quiz, lecture, student group discussion, reflection. The reasoning here is that if you give learners (particularly new learners) a reading without some form of accountability (a quiz) they are much less likely to do it. I know that for me, when I’ve done the readings, I’m far more likely to attend class. Putting the student group discussion after the lecture gives students who can’t attend a synchronous session a chance to review the recording
  • The standardization police have been telling us for years that each student must learn the same things. Poppycock. Scaffolding doesn’t mean taking away student choice. There are numerous approaches to allowing a little or a lot of choice into your classes (learner contracts come to mind). Just remember, most students don’t want choice – at first. 12-16 years of training has told them that you, the faculty member, have something you want them to do and they need to find the trick of it. It will take a while until those students actually believe you want their actual opinion.
  • You can have a goal like – get them acculturated to the field – and work through your activities to get there. It’s harder, they will need your patience, but once they get their minds around it, it makes things much more interesting.
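The hour breakdown quoted above can be checked with a quick sketch (the activity names and figures are taken directly from the quoted post; the arithmetic is the point):

```python
# Workload items quoted from the post: (activity, hours of student work)
workload = [
    ("Watch 3 hours of video", 5),
    ("Read stuff", 20),
    ("Listen to me talk", 15),
    ("Talk with other students in a group", 15),
    ("Write reflections about group chat", 7.5),
    ("Respond to other people's reflections", 7.5),
    ("Work on a term paper", 10),
    ("Do weekly quiz", 3),
    ("Write take-home mid-term", 3),
    ("Write take-home final", 3),
]

# Sum the hours column
total = sum(hours for _, hours in workload)
print(f"Total: {total} hours")  # Total: 89.0 hours
```

The sum comes to 89 hours, which is the "not quite 90" the post admits to; the point of planning this way is that the budget is visible and adjustable, rather than hidden inside a count of contact hours.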
Ed Webb

I unintentionally created a biased AI algorithm 25 years ago - tech companies are still... - 0 views

  • How and why do well-educated, well-intentioned scientists produce biased AI systems? Sociological theories of privilege provide one useful lens.
  • Scientists also face a nasty subconscious dilemma when incorporating diversity into machine learning models: Diverse, inclusive models perform worse than narrow models.
  • fairness can still be the victim of competitive pressures in academia and industry. The flawed Bard and Bing chatbots from Google and Microsoft are recent evidence of this grim reality. The commercial necessity of building market share led to the premature release of these systems.
  • Their training data is biased. They are designed by an unrepresentative group. They face the mathematical impossibility of treating all categories equally. They must somehow trade accuracy for fairness. And their biases are hiding behind millions of inscrutable numerical parameters.
  • biased AI systems can still be created unintentionally and easily. It’s also clear that the bias in these systems can be harmful, hard to detect and even harder to eliminate.
  • with North American computer science doctoral programs graduating only about 23% female, and 3% Black and Latino students, there will continue to be many rooms and many algorithms in which underrepresented groups are not represented at all.
Ed Webb

William Davies · How many words does it take to make a mistake? Education, Ed... - 0 views

  • The problem waiting round the corner for universities is essays generated by AI, which will leave a textual pattern-spotter like Turnitin in the dust. (Earlier this year, I came across one essay that felt deeply odd in some not quite human way, but I had no tangible evidence that anything untoward had occurred, so that was that.)
  • To accuse someone of plagiarism is to make a moral charge regarding intentions. But establishing intent isn’t straightforward. More often than not, the hearings bleed into discussions of issues that could be gathered under the heading of student ‘wellbeing’, which all universities have been struggling to come to terms with in recent years.
  • I have heard plenty of dubious excuses for acts of plagiarism during these hearings. But there is one recurring explanation which, it seems to me, deserves more thoughtful consideration: ‘I took too many notes.’ It isn’t just students who are familiar with information overload, one of whose effects is to morph authorship into a desperate form of curatorial management, organising chunks of text on a screen. The discerning scholarly self on which the humanities depend was conceived as the product of transitions between spaces – library, lecture hall, seminar room, study – linked together by work with pen and paper. When all this is replaced by the interface with screen and keyboard, and everything dissolves into a unitary flow of ‘content’, the identity of the author – as distinct from the texts they have read – becomes harder to delineate.
  • This generation, the first not to have known life before the internet, has acquired a battery of skills in navigating digital environments, but it isn’t clear how well those skills line up with the ones traditionally accredited by universities.
  • From the perspective of students raised in a digital culture, the anti-plagiarism taboo no doubt seems to be just one more academic hang-up, a weird injunction to take perfectly adequate information, break it into pieces and refashion it. Students who pay for essays know what they are doing; others seem conscientious yet intimidated by secondary texts: presumably they won’t be able to improve on them, so why bother trying? For some years now, it’s been noticeable how many students arrive at university feeling that every interaction is a test they might fail. They are anxious. Writing seems fraught with risk, a highly complicated task that can be executed correctly or not.
  • Many students may like the flexibility recorded lectures give them, but the conversion of lectures into yet more digital ‘content’ further destabilises traditional conceptions of learning and writing
  • the evaluation forms which are now such a standard feature of campus life suggest that many students set a lot of store by the enthusiasm and care that are features of a good live lecture
  • the drift of universities towards a platform model, which makes it possible for students to pick up learning materials as and when it suits them. Until now, academics have resisted the push for ‘lecture capture’. It causes in-person attendance at lectures to fall dramatically, and it makes many lecturers feel like mediocre television presenters. Unions fear that extracting and storing teaching for posterity threatens lecturers’ job security and weakens the power of strikes. Thanks to Covid, this may already have happened.
  • In the utopia sold by the EdTech industry (the companies that provide platforms and software for online learning), pupils are guided and assessed continuously. When one task is completed correctly, the next begins, as in a computer game; meanwhile the platform providers are scraping and analysing data from the actions of millions of children. In this behaviourist set-up, teachers become more like coaches: they assist and motivate individual ‘learners’, but are no longer so important to the provision of education. And since it is no longer the sole responsibility of teachers or schools to deliver the curriculum, it becomes more centralised – the latest front in a forty-year battle to wrest control from the hands of teachers and local authorities.
  • an injunction against creative interpretation and writing, a deprivation that working-class children will feel at least as deeply as anyone else.
  • There may be very good reasons for delivering online teaching in segments, punctuated by tasks and feedback, but as Yandell observes, other ways of reading and writing are marginalised in the process. Without wishing to romanticise the lonely reader (or, for that matter, the lonely writer), something is lost when alternating periods of passivity and activity are compressed into interactivity, until eventually education becomes a continuous cybernetic loop of information and feedback. How many keystrokes or mouse-clicks before a student is told they’ve gone wrong? How many words does it take to make a mistake?
  • This vision of language as code may already have been a significant feature of the curriculum, but it appears to have been exacerbated by the switch to online teaching. In a journal article from August 2020, ‘Learning under Lockdown: English Teaching in the Time of Covid-19’, John Yandell notes that online classes create wholly closed worlds, where context and intertextuality disappear in favour of constant instruction. In these online environments, reading is informed not by prior reading experiences but by the toolkit that the teacher has provided, and ... is presented as occurring along a tramline of linear development. Different readings are reducible to better or worse readings: the more closely the student’s reading approximates to the already finalised teacher’s reading, the better it is. That, it would appear, is what reading with precision looks like.
  • Constant interaction across an interface may be a good basis for forms of learning that involve information-processing and problem-solving, where there is a right and a wrong answer. The cognitive skills that can be trained in this way are the ones computers themselves excel at: pattern recognition and computation. The worry, for anyone who cares about the humanities in particular, is about the oversimplifications required to conduct other forms of education in these ways.
  • Blanket surveillance replaces the need for formal assessment.
  • Confirming Adorno’s worst fears of the ‘primacy of practical reason’, reading is no longer dissociable from the execution of tasks. And, crucially, the ‘goals’ to be achieved through the ability to read, the ‘potential’ and ‘participation’ to be realised, are economic in nature.
  • since 2019, with the Treasury increasingly unhappy about the amount of student debt still sitting on the government’s balance sheet and the government resorting to ‘culture war’ at every opportunity, there has been an effort to single out degree programmes that represent ‘poor value for money’, measured in terms of graduate earnings. (For reasons best known to itself, the usually independent Institute for Fiscal Studies has been leading the way in finding correlations between degree programmes and future earnings.) Many of these programmes are in the arts and humanities, and are now habitually referred to by Tory politicians and their supporters in the media as ‘low-value degrees’.
  • studying the humanities may become a luxury reserved for those who can fall back on the cultural and financial advantages of their class position. (This effect has already been noticed among young people going into acting, where the results are more visible to the public than they are in academia or heritage organisations.)
  • given the changing class composition of the UK over the past thirty years, it’s not clear that contemporary elites have any more sympathy for the humanities than the Conservative Party does. A friend of mine recently attended an open day at a well-known London private school, and noticed that while there was a long queue to speak to the maths and science teachers, nobody was waiting to speak to the English teacher. When she asked what was going on, she was told: ‘I’m afraid parents here are very ambitious.’ Parents at such schools, where fees have tripled in real terms since the early 1980s, tend to work in financial and business services themselves, and spend their own days profitably manipulating and analysing numbers on screens. When it comes to the transmission of elite status from one generation to the next, Shakespeare or Plato no longer has the same cachet as economics or physics.
  • Leaving aside the strategic political use of terms such as ‘woke’ and ‘cancel culture’, it would be hard to deny that we live in an age of heightened anxiety over the words we use, in particular the labels we apply to people. This has benefits: it can help to bring discriminatory practices to light, potentially leading to institutional reform. It can also lead to fruitless, distracting public arguments, such as the one that rumbled on for weeks over Angela Rayner’s description of Conservatives as ‘scum’. More and more, words are dredged up, edited or rearranged for the purpose of harming someone. Isolated words have acquired a weightiness in contemporary politics and public argument, while on digital media snippets of text circulate without context, as if the meaning of a single sentence were perfectly contained within it, walled off from the surrounding text. The exemplary textual form in this regard is the newspaper headline or corporate slogan: a carefully curated series of words, designed to cut through the blizzard of competing information.
  • Visit any actual school or university today (as opposed to the imaginary ones described in the Daily Mail or the speeches of Conservative ministers) and you will find highly disciplined, hierarchical institutions, focused on metrics, performance evaluations, ‘behaviour’ and quantifiable ‘learning outcomes’.
  • If young people today worry about using the ‘wrong’ words, it isn’t because of the persistence of the leftist cultural power of forty years ago, but – on the contrary – because of the barrage of initiatives and technologies dedicated to reversing that power. The ideology of measurable literacy, combined with a digital net that has captured social and educational life, leaves young people ill at ease with the language they use and fearful of what might happen should they trip up.
  • It has become clear, as we witness the advance of Panopto, Class Dojo and the rest of the EdTech industry, that one of the great things about an old-fashioned classroom is the facilitation of unrecorded, unaudited speech, and of uninterrupted reading and writing.
Ed Webb

The Generative AI Race Has a Dirty Secret | WIRED - 0 views

  • The race to build high-performance, AI-powered search engines is likely to require a dramatic rise in computing power, and with it a massive increase in the amount of energy that tech companies require and the amount of carbon they emit.
  • Every time we see a step change in online processing, we see significant increases in the power and cooling resources required by large processing centres
  • third-party analysis by researchers estimates that the training of GPT-3, which ChatGPT is partly based on, consumed 1,287 MWh, and led to emissions of more than 550 tons of carbon dioxide equivalent—the same amount as a single person taking 550 roundtrips between New York and San Francisco
  • There’s also a big difference between utilizing ChatGPT—which investment bank UBS estimates has 13 million users a day—as a standalone product, and integrating it into Bing, which handles half a billion searches every day.
  • Data centers already account for around one percent of the world’s greenhouse gas emissions, according to the International Energy Agency. That is expected to rise as demand for cloud computing increases, but the companies running search have promised to reduce their net contribution to global heating. “It’s definitely not as bad as transportation or the textile industry,” Gómez-Rodríguez says. “But [AI] can be a significant contributor to emissions.”
  • The environmental footprint and energy cost of integrating AI into search could be reduced by moving data centers onto cleaner energy sources, and by redesigning neural networks to become more efficient, reducing the so-called “inference time”—the amount of computing power required for an algorithm to work on new data.
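The training figures quoted above imply a carbon intensity that is easy to check with back-of-envelope arithmetic (the energy and emissions numbers come from the third-party estimate cited in the article; the grid comparison in the comment is my own rough framing, not a claim from the source):

```python
# Third-party estimates for GPT-3 training, as quoted in the article
energy_mwh = 1287     # megawatt-hours of electricity consumed
emissions_t = 550     # tonnes of CO2-equivalent emitted

# Implied carbon intensity of the electricity used
intensity = emissions_t / energy_mwh  # tonnes CO2e per MWh
print(f"Implied carbon intensity: {intensity:.2f} t CO2e/MWh")
```

This works out to roughly 0.43 t CO2e per MWh (about 430 g/kWh), in the range of a fossil-heavy grid, which is why the article's suggestion of moving data centers onto cleaner energy sources attacks the problem directly: it lowers this factor even if the MWh figure keeps growing.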