"The European Commission has asked X to hand over internal documents about its algorithms, as it steps up its investigation into whether Elon Musk's social media platform has breached EU rules on content moderation.
The EU's executive branch told the company it wanted to see internal documentation about its "recommender system", which makes content suggestions to users, and any recent changes made to it, by 15 February.
X has been under investigation since December 2023 under the EU's content law - known as the Digital Services Act (DSA) - over how it tackles the spread of illegal content and information manipulation. The company has been accused of manipulating the platform's systems to give far-right posts and politicians greater visibility over other political groups."
"We've been told that social media feeds can either be engagement-maximizing or chronological-and that those are the only two options. But this is a false choice. A new report by the Knight-Georgetown Institute, Better Feeds: Algorithms That Put People First, makes it clear: platforms could offer far better feeds-ones that serve users' interests without the distortions of engagement-driven design."
"The announcement that Meta would be changing their approach to political content and discussions of gender is concerning, though it is unclear exactly what those changes are. Given that many product changes regarding those content areas were used in high-risk settings, a change intended to allay US free speech concerns could lead to violence incitement elsewhere. For example, per this post from Meta, reducing "content that has been shared by a chain of two or more people" was a content-neutral product change done to protect people in Ethiopia, where algorithms have been implicated in the spread of ethnic violence. A similar change - removing optimizations for reshared content - was discussed in this post concerning reductions in political content. Will those changes be undone? Globally? Such changes could also lead to increased amplification of attention getting discussions of gender. Per this report from Equimundo and Futures Without Violence, 40% of young men trust at least one "manosphere" influencer - who often exploit algorithmic incentives by posting increasingly extreme, attention-getting mixes of ideas about self-improvement, aggression, and traditional gender roles."
"A second major academic institution has accused Uber of using opaque computer code to dramatically increase its profits at the expense of the ride-hailing app's drivers and passengers.
Research by academics at New York's Columbia Business School concluded that the Silicon Valley company had implemented "algorithmic price discrimination" that had raised "rider fares and cut driver pay on billions of … trips, systematically, selectively, and opaquely"."
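The mechanism the researchers describe is that rider fares and driver pay are computed independently, letting the platform's take widen trip by trip. A purely hypothetical illustration - every signal name and coefficient here is invented, not drawn from Uber's systems:

```python
# Fare and pay are set by separate functions of separate signals,
# so the gap between them can grow without either side seeing it.

def rider_fare(base: float, willingness_signal: float) -> float:
    # signal might be inferred from trip history, destination, time of day
    return base * (1.0 + 0.4 * willingness_signal)

def driver_pay(base: float, acceptance_signal: float) -> float:
    # drivers likelier to accept low offers can be offered less
    return base * (0.8 - 0.2 * acceptance_signal)

base = 10.0
fare = rider_fare(base, willingness_signal=0.9)  # 13.60
pay = driver_pay(base, acceptance_signal=0.7)    # 6.60
print(f"fare={fare:.2f}, pay={pay:.2f}, platform take={fare - pay:.2f}")
```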
"But the documents show that the company relies heavily on the intervention of a small editorial team to determine what makes its "trending module" headlines - the list of news topics that shows up on the side of the browser window on Facebook's desktop version. The company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users' feeds."
"In a survey of more than 4,200 people conducted by CAA, travelers most frequently cited being split from their party while traveling on Ryanair, but the airline insists that it doesn't employ a family-splitting algorithm. Ryanair says if a person doesn't pay for their seat assignment, they are "randomly" assigned, which may result in them not sitting with their party."
"Specifically, the question is whether that algorithm is sorting suggestions based on the race of the creator - something TikTok denies it's doing intentionally. But it's another example of the need for more scrutiny into how the app and other social media platforms promote particular creators or content."
"We have become symbiotic with these machines. We feed them with energy and data, and they reward us with a host of services. But our relationship with them goes deeper. There are multiple layers of feedback loops as we shape algorithms and they shape us, at the individual and collective levels. What framework can we turn to to analyze this complex ecosystem?"
"Researchers from MIT and Google recently showed off a machine learning algorithm capable of automatically retouching photos just like a professional photographer. Snap a photo and the neural network identifies exactly how to make it look better-increase contrast a smidge, tone down brightness, whatever-and apply the changes in less than 20 milliseconds."
"Bitcoin mining - the process in which a bitcoin is awarded to a computer that solves a complex series of algorithms - is a deeply energy-intensive process.
"Mining" bitcoin involves solving complex math problems in order to create new bitcoins. Miners are rewarded in bitcoin.
Earlier in bitcoin's relatively short history - the currency was created in 2009 - one could mine bitcoin on an average computer. But the way bitcoin mining has been set up by its creator (or creators - no one really knows for sure who created it) is that there is a finite number of bitcoins that can be mined: 21m. The more bitcoin that is mined, the harder the algorithms that must be solved to get a bitcoin become."
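The puzzle itself is simple to sketch: vary a nonce until the block's hash clears a difficulty threshold. Real Bitcoin double-hashes an 80-byte block header against a far harder target; treating "difficulty" as a count of leading zero hex digits is a simplification:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Try nonces until the SHA-256 hash starts with `difficulty` zeros."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("example block")
print(f"nonce={nonce}, hash={digest}")
```

Each extra zero digit multiplies the expected number of hashes by 16, which is why rising difficulty translates so directly into energy use.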
""I consider 'bias' a euphemism," says Brandeis Marshall, PhD, data scientist and CEO of DataedX, an edtech and data science firm. "The words that are used are varied: There's fairness, there's responsibility, there's algorithmic bias, there's a number of terms… but really, it's dancing around the real topic… A dataset is inherently entrenched in systemic racism and sexism.""
"Crucial to his success, he says, was YouTube's recommendation system, the feature that promotes videos for you to watch on the homepage or in the "Up Next" column to the right of whatever you're watching. "We were recommended constantly," he tells me. YouTube's algorithms, he says, figured out that "people getting into flat earth apparently go down this rabbit hole, and so we're just gonna keep recommending.""
"Universities don't merely face essays or assignments entirely generated by algorithms: they must also adjudicate a myriad of more subtle problems. For instance, AI-powered word processors habitually suggest alternatives to our ungrammatical phrases. But if software can algorithmically rewrite a student's sentence, why shouldn't it do the same with a paragraph - and if a paragraph, why not a page?
At what point does the intrusion of AI constitute cheating?"
"WHEN 14-YEAR-OLD MOLLY Russell died in 2017, her cell phone contained graphic images of self-harm, an email roundup of "depression pins you might like," and advice on concealing mental illness from loved ones. Investigators initially ruled the British teen's death a suicide. But almost five years later, a British coroner's court has reversed the findings. Now, they claim that Russell died "from an act of self-harm while suffering from depression and the negative effects of online content"-and the algorithms themselves are on notice."
"To arrive at a recommended rent, the software deploys an algorithm - a set of mathematical rules - to analyze a trove of data RealPage gathers from clients, including private information on what nearby competitors charge.
For tenants, the system upends the practice of negotiating with apartment building staff. RealPage discourages bargaining with renters and has even recommended that landlords in some cases accept a lower occupancy rate in order to raise rents and make more money.
One of the algorithm's developers told ProPublica that leasing agents had "too much empathy" compared to computer-generated pricing."
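A toy version of such a recommendation - not RealPage's model; every number and field below is hypothetical - makes the reported logic concrete: anchor to what nearby competitors charge, then push above it even at the cost of occupancy:

```python
comps = [  # nearby competitors' rents, including non-public rates
    {"rent": 1900, "occupancy": 0.97},
    {"rent": 2050, "occupancy": 0.95},
    {"rent": 2100, "occupancy": 0.92},
]

def recommend_rent(comps: list[dict], markup: float = 1.03) -> float:
    """Anchor to the competitor average, then push slightly above it.
    Such a rule can trade occupancy for revenue: fewer tenants at
    higher rents may still gross more than a full building."""
    avg = sum(c["rent"] for c in comps) / len(comps)
    return round(avg * markup, 2)

print(recommend_rent(comps))  # 2077.17
```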
""An emotionally intelligent human does not usually claim they can accurately put a label on everything everyone says and tell you this person is currently feeling 80% angry, 18% fearful, and 2% sad," says Edward B Kang, an assistant professor at New York University writing about the intersection of AI and sound. "In fact, that sounds to me like the opposite of what an emotionally intelligent person would say."
Adding to this is the notorious problem of AI bias. "Your algorithms are only as good as the training material," Barrett says. "And if your training material is biased in some way, then you are enshrining that bias in code.""
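The "80% angry, 18% fearful, 2% sad" style of output Kang describes is typically a softmax layer at work: per-label scores normalized into percentages. A sketch with made-up scores, showing that the precision of the numbers is an artifact of the math rather than evidence of validity:

```python
import math

def softmax(scores: dict[str, float]) -> dict[str, float]:
    """Normalize per-label scores into a probability distribution."""
    exp = {label: math.exp(s) for label, s in scores.items()}
    total = sum(exp.values())
    return {label: v / total for label, v in exp.items()}

raw = {"angry": 2.1, "fearful": 0.6, "sad": -1.6}  # arbitrary made-up logits
for label, p in softmax(raw).items():
    print(f"{label}: {p:.0%}")  # angry: 80%, fearful: 18%, sad: 2%
```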
"A UK-based startup has used an AI algorithm to identify a previously unknown kind of rare-earth free magnet, in a potential breakthrough for how we discover and create new materials.
Materials Nexus, headquartered in London, used its machine learning algorithm to identify and analyse over 100 million combinations of materials that could produce a viable rare-earth free magnet."
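A schematic of that kind of ML-driven screening - with a placeholder surrogate model and a toy element set, not Materials Nexus's actual system - looks like this: score a large space of candidate compositions with a cheap learned predictor and keep only the rare-earth-free ones worth testing further.

```python
from itertools import product

RARE_EARTHS = {"Nd", "Sm", "Dy"}
ELEMENTS = ["Fe", "Co", "Ni", "Mn", "Al", "Nd"]

def surrogate_score(composition: tuple[str, ...]) -> float:
    """Stand-in for a trained property predictor (e.g. magnetization)."""
    weights = {"Fe": 2.2, "Co": 1.7, "Ni": 0.6}
    return sum(weights.get(e, 0.1) for e in composition)

# Enumerate candidate three-element combinations, keep only the
# rare-earth-free ones, and rank them with the cheap surrogate.
candidates = [c for c in product(ELEMENTS, repeat=3) if not RARE_EARTHS & set(c)]
best = sorted(candidates, key=surrogate_score, reverse=True)[:5]
print(best)  # top candidates would go on to simulation or synthesis
```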
"
Meta apologises over flood of gore, violence and dead bodies on Instagram
Users of Reels report feeds dominated by violent and graphic footage after apparent algorithm malfunction
Mark Zuckerberg's Meta has apologised after Instagram users were subjected to a flood of violence, gore, animal abuse and dead bodies on their Reels feeds.
Users reported the footage after an apparent malfunction in Instagram's algorithm, which curates what people see on the app.
Reels is a feature on the social media platform that allows users to share short videos, similar to TikTok."