
TOK Friends: Group items tagged Advertising


sissij

Google Training Ad Placement Computers to Be Offended - The New York Times

  • But after seeing ads from Coca-Cola, Procter & Gamble and Wal-Mart appear next to racist, anti-Semitic or terrorist videos, its engineers realized their computer models had a blind spot: They did not understand context.
  • Now teaching computers to understand what humans can readily grasp may be the key to calming fears among big-spending advertisers that their ads have been appearing alongside videos from extremist groups and other offensive messages.
  • But the recent problems opened Google to criticism that it was not doing enough to look out for advertisers. It is a significant problem for a multibillion-dollar company that still gets most of its revenue through advertising.
  • The idea is for machines to eventually make the tough calls.
  •  
    I had never thought about the context in which ads appear. I thought ads just popped up randomly, and now I know there is actually code behind where each ad shows up. Why is putting ads beside extremist videos a bad idea? I think it is probably because people would mistakenly assume the company sponsors the video; actually, I am not very sure why it is a bad thing. However, ads can definitely be more effective in the right context. Different people watch different kinds of videos, so targeting potential customers this way would benefit both the viewer and the company. --Sissi (4/3/2017)
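The article describes Google teaching its models to grasp context before placing ads. As a toy illustration only (not Google's actual system, which the article says uses far more sophisticated models), a naive bag-of-words brand-safety filter might look like:

```python
# Toy brand-safety filter: flag video metadata whose wording suggests
# content an advertiser would not want their brand to appear beside.
# The term list and function names are invented for illustration.

UNSAFE_TERMS = {"extremist", "terror", "hate", "racist"}

def is_brand_safe(description: str) -> bool:
    """Return False if any unsafe term appears in the video description."""
    words = set(description.lower().split())
    return not (words & UNSAFE_TERMS)

def place_ad(ad: str, video_description: str) -> str:
    if is_brand_safe(video_description):
        return f"Showing '{ad}'"
    return "Ad withheld: context flagged as unsafe"
```

A keyword filter like this has exactly the blind spot the article describes: it matches words, not context, which is why "teaching computers to understand what humans can readily grasp" is the hard part.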
Javier E

The Philosopher Whose Fingerprints Are All Over the FTC's New Approach to Privacy - Ale...

  • The standard explanation for privacy freakouts is that people get upset because they've "lost control" of data about themselves or there is simply too much data available. Nissenbaum argues that the real problem "is the inappropriateness of the flow of information due to the mediation of technology." In her scheme, there are senders and receivers of messages, who communicate different types of information with very specific expectations of how it will be used. Privacy violations occur not when too much data accumulates or people can't direct it, but when one of the receivers or transmission principles changes. The key academic term is "context-relative informational norms." Bust a norm and people get upset.
  • Nissenbaum gets us past thinking about privacy as a binary: either something is private or something is public. Nissenbaum puts the context -- or social situation -- back into the equation. What you tell your bank, you might not tell your doctor.
  • Furthermore, these differences in information sharing are not bad or good; they are just the norms.
  • any privacy regulation that's going to make it through Congress has to provide clear ways for companies to continue profiting from data tracking. The key is coming up with an ethical framework in which they can do so, and Nissenbaum may have done just that. 
  • The traditional model of how this works says that your information is something like a currency and when you visit a website that collects data on you for one reason or another, you enter into a contract with that site. As long as the site gives you "notice" that data collection occurs -- usually via a privacy policy located through a link at the bottom of the page -- and you give "consent" by continuing to use the site, then no harm has been done. No matter how much data a site collects, if all they do is use it to show you advertising they hope is more relevant to you, then they've done nothing wrong.
  • Nevermind that if you actually read all the privacy policies you encounter in a year, it would take 76 work days. And that calculation doesn't even account for all the 3rd parties that drain data from your visits to other websites. Even more to the point: there is no obvious way to discriminate between two separate webpages on the basis of their data collection policies. While tools have emerged to tell you how many data trackers are being deployed at any site at a given moment, the dynamic nature of Internet advertising means that it is nearly impossible to know the story through time
  • How can anyone make a reasonable determination of how their information might be used when there are more than 50 or 100 or 200 tools in play on a single website in a single month?
  • Nissenbaum doesn't think it's possible to explain the current online advertising ecosystem in a useful way without resorting to a lot of detail. She calls this the "transparency paradox," and considers it insoluble.
  • she wants to import the norms from the offline world into the online world. When you go to a bank, she says, you have expectations of what might happen to your communications with that bank. That should be true whether you're online, on the phone, or at the teller.  Companies can use your data to do bank stuff, but they can't sell your data to car dealers looking for people with a lot of cash on hand.
  • let companies do standard data collection but require them to tell people when they are doing things with data that are inconsistent with the "context of the interaction" between a company and a person.
  • here's the big downside: it rests on the "norms" that people expect. While that may be socially optimal, it's actually quite difficult to figure out what the norms for a given situation might be. After all, there is someone else who depends on norms for his thinking about privacy.
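Nissenbaum's "context-relative informational norms" can be read almost like a schema: an information flow is acceptable only when its context, sender, receiver, information type, and transmission principle all match an established norm. A minimal sketch of that idea (the norm entries and names are invented for illustration, not drawn from her work):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    context: str     # social situation, e.g. "banking"
    sender: str
    receiver: str
    info_type: str
    principle: str   # transmission principle, e.g. "confidentiality"

# Invented example norms: what each context is expected to permit.
NORMS = {
    Flow("banking", "customer", "bank", "finances", "confidentiality"),
    Flow("medicine", "patient", "doctor", "health", "confidentiality"),
}

def violates_contextual_integrity(flow: Flow) -> bool:
    """A privacy violation occurs when a flow busts its context's norms."""
    return flow not in NORMS
```

On this model, the bank selling your data to a car dealer changes both the receiver and the transmission principle, so the flow violates the banking context's norm even though the same data flowing to the bank itself was fine.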
jlessner

Why Facebook's News Experiment Matters to Readers - NYTimes.com

  • Facebook’s new plan to host news publications’ stories directly is not only about page views, advertising revenue or the number of seconds it takes for an article to load. It is about who owns the relationship with readers.
  • It’s why Google, a search engine, started a social network and why Facebook, a social network, started a search engine. It’s why Amazon, a shopping site, made a phone and why Apple, a phone maker, got into shopping.
  • Facebook’s experiment, called instant articles, is small to start — just a few articles from nine media companies, including The New York Times. But it signals a major shift in the relationship between publications and their readers. If you want to read the news, Facebook is saying, come to Facebook, not to NBC News or The Atlantic or The Times — and when you come, don’t leave. (For now, these articles can be viewed on an iPhone running the Facebook app.)
  • The front page of a newspaper and the cover of a magazine lost their dominance long ago.
  • But news reports, like albums before them, have not been created that way. One of the services that editors bring to readers has been to use their news judgment, considering a huge range of factors, when they decide how articles fit together and where they show up. The news judgment of The New York Times is distinct from that of The New York Post, and for generations readers appreciated that distinction.
  • “In digital, every story becomes unbundled from each other, so if you’re not thinking of each story as living on its own, it’s tying yourself back to an analog era,” Mr. Kim said.
  • Facebook executives have insisted that they intend to exert no editorial control because they leave the makeup of the news feed to the algorithm. But an algorithm is not autonomous. It is written by humans and tweaked all the time.
  • That raises some journalistic questions. The news feed algorithm works, in part, by showing people more of what they have liked in the past. Some studies have suggested that means they might not see as wide a variety of news or points of view, though others, including one by Facebook researchers, have found they still do.
  • Tech companies, Facebook included, are notoriously fickle with their algorithms. Publications became so dependent on Facebook in the first place because of a change in its algorithm that sent more traffic their way. Later, another change demoted articles from sites that Facebook deemed to run click-bait headlines. Then last month, Facebook decided to prioritize some posts from friends over those from publications.
Javier E

The Backfire Effect « You Are Not So Smart

  • corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.
  • Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper.
  • Psychologists call stories like these narrative scripts, stories that tell you what you want to hear, stories which confirm your beliefs and give you permission to continue feeling as you already do. If believing in welfare queens protects your ideology, you accept it and move on.
  • Contradictory evidence strengthens the position of the believer. It is seen as part of the conspiracy, and missing evidence is dismissed as part of the coverup.
  • Most online battles follow a similar pattern, each side launching attacks and pulling evidence from deep inside the web to back up their positions until, out of frustration, one party resorts to an all-out ad hominem nuclear strike
  • you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making the opponent feel as though they are even more sure of their position than before you started the debate. As they match your fervor, the same thing happens in your skull. The backfire effect pushes both of you deeper into your original beliefs.
  • you spend much more time considering information you disagree with than you do information you accept. Information which lines up with what you already believe passes through the mind like a vapor, but when you come across something which threatens your beliefs, something which conflicts with your preconceived notions of how the world works, you seize up and take notice. Some psychologists speculate there is an evolutionary explanation. Your ancestors paid more attention and spent more time thinking about negative stimuli than positive because bad things required a response
  • when your beliefs are challenged, you pore over the data, picking it apart, searching for weakness. The cognitive dissonance locks up the gears of your mind until you deal with it. In the process you form more neural connections, build new memories and put out effort – once you finally move on, your original convictions are stronger than ever.
  • The backfire effect is constantly shaping your beliefs and memory, keeping you consistently leaning one way or the other through a process psychologists call biased assimilation.
  • They then separated subjects into two groups; one group said they believed homosexuality was a mental illness and one did not. Each group then read the fake studies full of pretend facts and figures suggesting their worldview was wrong. On either side of the issue, after reading studies which did not support their beliefs, most people didn’t report an epiphany, a realization they’ve been wrong all these years. Instead, they said the issue was something science couldn’t understand. When asked about other topics later on, like spanking or astrology, these same people said they no longer trusted research to determine the truth. Rather than shed their belief and face facts, they rejected science altogether.
  • As social media and advertising progresses, confirmation bias and the backfire effect will become more and more difficult to overcome. You will have more opportunities to pick and choose the kind of information which gets into your head along with the kinds of outlets you trust to give you that information. In addition, advertisers will continue to adapt, not only generating ads based on what they know about you, but creating advertising strategies on the fly based on what has and has not worked on you so far. The media of the future may be delivered based not only on your preferences, but on how you vote, where you grew up, your mood, the time of day or year – every element of you which can be quantified. In a world where everything comes to you on demand, your beliefs may never be challenged.
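The last highlight describes advertisers adapting "on the fly based on what has and has not worked on you so far." One standard mechanism for that kind of online adaptation (my example, not one the article names) is a multi-armed bandit; an epsilon-greedy version can be sketched in a few lines:

```python
import random

def epsilon_greedy(ads, clicks, shows, epsilon=0.1):
    """Pick an ad: usually the best click-through rate so far,
    occasionally a random one to keep exploring what works on this user."""
    if random.random() < epsilon or not any(shows.values()):
        return random.choice(ads)
    return max(ads, key=lambda a: clicks[a] / shows[a] if shows[a] else 0.0)

# Hypothetical per-user history: ad_a converts at 10%, ad_b at 2.5%,
# so exploitation keeps serving ad_a while epsilon still probes ad_b.
ads = ["ad_a", "ad_b"]
clicks = {"ad_a": 5, "ad_b": 1}
shows = {"ad_a": 50, "ad_b": 40}
```

Run per user, a loop like this is exactly what the highlight warns about: the system narrows toward whatever already worked on you, so your feed, and eventually your beliefs, may never be challenged.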
demetriar

The Science of Emotion in Marketing: How Our Brains Decide What to Share and ...

  • A new study says we're really only capable of four "basic" emotions: happy, sad, afraid/surprised, and angry/disgusted.
  • He found that an article was more likely to become viral the more positive it was.
  • the emotions of sadness and sorrow light up many of the same regions of the brain as happiness.
  • Later, those who produced the most oxytocin were the most likely to give money to others they couldn't see.
  • "Our results show why puppies and babies are in toilet paper commercials," Zak said. "This research suggests that advertisers use images that cause our brains to release oxytocin to build trust in a product or brand, and hence increase sales."
  • A study published in the Journal of Consumer Research demonstrated that consumers who experienced fear while watching a film felt a greater affiliation with a present brand than those who watched films evoking other emotions, like happiness, sadness or excitement.
  • The rude comments made participants dig in on their stance
  • That emotions are critical -- maybe even more than previously thought -- to marketing.
  • In an analysis of the IPA dataBANK, which contains 1,400 case studies of successful advertising campaigns, campaigns with purely emotional content performed about twice as well (31 percent versus 16 percent) as those with only rational content (and did a little better than those that mixed emotional and rational content).
  • The emotional brain processes sensory information in one fifth of the time our cognitive brain takes to assimilate the same input
  • we're not just sharing the object, but we're sharing in the emotional response it creates."
Javier E

The Tech Industry's Psychological War on Kids - Member Feature Stories - Medium

  • she cried, “They took my f***ing phone!” Attempting to engage Kelly in conversation, I asked her what she liked about her phone and social media. “They make me happy,” she replied.
  • Even though they were loving and involved parents, Kelly’s mom couldn’t help feeling that they’d failed their daughter and must have done something terribly wrong that led to her problems.
  • My practice as a child and adolescent psychologist is filled with families like Kelly’s. These parents say their kids’ extreme overuse of phones, video games, and social media is the most difficult parenting issue they face — and, in many cases, is tearing the family apart.
  • What none of these parents understand is that their children’s and teens’ destructive obsession with technology is the predictable consequence of a virtually unrecognized merger between the tech industry and psychology.
  • Dr. B.J. Fogg is a psychologist and the father of persuasive technology, a discipline in which digital machines and apps — including smartphones, social media, and video games — are configured to alter human thoughts and behaviors. As the lab’s website boldly proclaims: “Machines designed to change humans.”
  • These parents have no idea that lurking behind their kids’ screens and phones are a multitude of psychologists, neuroscientists, and social science experts who use their knowledge of psychological vulnerabilities to devise products that capture kids’ attention for the sake of industry profit.
  • psychology — a discipline that we associate with healing — is now being used as a weapon against children.
  • This alliance pairs the consumer tech industry’s immense wealth with the most sophisticated psychological research, making it possible to develop social media, video games, and phones with drug-like power to seduce young users.
  • Likewise, social media companies use persuasive design to prey on the age-appropriate desire for preteen and teen kids, especially girls, to be socially successful. This drive is built into our DNA, since real-world relational skills have fostered human evolution.
  • Called “the millionaire maker,” Fogg has groomed former students who have used his methods to develop technologies that now consume kids’ lives. As he recently touted on his personal website, “My students often do groundbreaking projects, and they continue having impact in the real world after they leave Stanford… For example, Instagram has influenced the behavior of over 800 million people. The co-founder was a student of mine.”
  • Persuasive technology (also called persuasive design) works by deliberately creating digital environments that users feel fulfill their basic human drives — to be social or obtain goals — better than real-world alternatives.
  • Kids spend countless hours in social media and video game environments in pursuit of likes, “friends,” game points, and levels — because it’s stimulating, they believe that this makes them happy and successful, and they find it easier than doing the difficult but developmentally important activities of childhood.
  • While persuasion techniques work well on adults, they are particularly effective at influencing the still-maturing child and teen brain.
  • “Video games, better than anything else in our culture, deliver rewards to people, especially teenage boys,” says Fogg. “Teenage boys are wired to seek competency. To master our world and get better at stuff. Video games, in dishing out rewards, can convey to people that their competency is growing, you can get better at something second by second.”
  • it’s persuasive design that’s helped convince this generation of boys they are gaining “competency” by spending countless hours on game sites, when the sad reality is they are locked away in their rooms gaming, ignoring school, and not developing the real-world competencies that colleges and employers demand.
  • Persuasive technologies work because of their apparent triggering of the release of dopamine, a powerful neurotransmitter involved in reward, attention, and addiction.
  • As she says, “If you don’t get 100 ‘likes,’ you make other people share it so you get 100…. Or else you just get upset. Everyone wants to get the most ‘likes.’ It’s like a popularity contest.”
  • there are costs to Casey’s phone obsession, noting that the “girl’s phone, be it Facebook, Instagram or iMessage, is constantly pulling her away from her homework, sleep, or conversations with her family.
  • Casey says she wishes she could put her phone down. But she can’t. “I’ll wake up in the morning and go on Facebook just… because,” she says. “It’s not like I want to or I don’t. I just go on it. I’m, like, forced to. I don’t know why. I need to. Facebook takes up my whole life.”
  • B.J. Fogg may not be a household name, but Fortune Magazine calls him a “New Guru You Should Know,” and his research is driving a worldwide legion of user experience (UX) designers who utilize and expand upon his models of persuasive design.
  • “No one has perhaps been as influential on the current generation of user experience (UX) designers as Stanford researcher B.J. Fogg.”
  • the core of UX research is about using psychology to take advantage of our human vulnerabilities.
  • As Fogg is quoted in Kosner’s Forbes article, “Facebook, Twitter, Google, you name it, these companies have been using computers to influence our behavior.” However, the driving force behind behavior change isn’t computers. “The missing link isn’t the technology, it’s psychology,” says Fogg.
  • UX researchers not only follow Fogg’s design model, but also his apparent tendency to overlook the broader implications of persuasive design. They focus on the task at hand, building digital machines and apps that better demand users’ attention, compel users to return again and again, and grow businesses’ bottom line.
  • the “Fogg Behavior Model” is a well-tested method to change behavior and, in its simplified form, involves three primary factors: motivation, ability, and triggers.
  • “We can now create machines that can change what people think and what people do, and the machines can do that autonomously.”
  • Regarding ability, Fogg suggests that digital products should be made so that users don’t have to “think hard.” Hence, social networks are designed for ease of use
  • Finally, Fogg says that potential users need to be triggered to use a site. This is accomplished by a myriad of digital tricks, including the sending of incessant notifications
  • moral questions about the impact of turning persuasive techniques on children and teens are not being asked. For example, should the fear of social rejection be used to compel kids to compulsively use social media? Is it okay to lure kids away from school tasks that demand a strong mental effort so they can spend their lives on social networks or playing video games that don’t make them think much at all?
  • Describing how his formula is effective at getting people to use a social network, the psychologist says in an academic paper that a key motivator is users’ desire for “social acceptance,” although he says an even more powerful motivator is the desire “to avoid being socially rejected.”
  • the startup Dopamine Labs boasts about its use of persuasive techniques to increase profits: “Connect your app to our Persuasive AI [Artificial Intelligence] and lift your engagement and revenue up to 30% by giving your users our perfect bursts of dopamine,” and “A burst of Dopamine doesn’t just feel good: it’s proven to re-wire user behavior and habits.”
  • Ramsay Brown, the founder of Dopamine Labs, says in a KQED Science article, “We have now developed a rigorous technology of the human mind, and that is both exciting and terrifying. We have the ability to twiddle some knobs in a machine learning dashboard we build, and around the world hundreds of thousands of people are going to quietly change their behavior in ways that, unbeknownst to them, feel second-nature but are really by design.”
  • Programmers call this “brain hacking,” as it compels users to spend more time on sites even though they mistakenly believe it’s strictly due to their own conscious choices.
  • Banks of computers employ AI to “learn” which of a countless number of persuasive design elements will keep users hooked
  • A persuasion profile of a particular user’s unique vulnerabilities is developed in real time and exploited to keep users on the site and make them return again and again for longer periods of time. This drives up profits for consumer internet companies whose revenue is based on how much their products are used.
  • “The leaders of Internet companies face an interesting, if also morally questionable, imperative: either they hijack neuroscience to gain market share and make large profits, or they let competitors do that and run away with the market.”
  • Social media and video game companies believe they are compelled to use persuasive technology in the arms race for attention, profits, and survival.
  • Children’s well-being is not part of the decision calculus.
  • one breakthrough occurred in 2017 when Facebook documents were leaked to The Australian. The internal report crafted by Facebook executives showed the social network boasting to advertisers that by monitoring posts, interactions, and photos in real time, the network is able to track when teens feel “insecure,” “worthless,” “stressed,” “useless” and a “failure.”
  • The report also bragged about Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.”
  • These design techniques provide tech corporations a window into kids’ hearts and minds to measure their particular vulnerabilities, which can then be used to control their behavior as consumers. This isn’t some strange future… this is now.
  • The official tech industry line is that persuasive technologies are used to make products more engaging and enjoyable. But the revelations of industry insiders can reveal darker motives.
  • Revealing the hard science behind persuasive technology, Hopson says, “This is not to say that players are the same as rats, but that there are general rules of learning which apply equally to both.”
  • After penning the paper, Hopson was hired by Microsoft, where he helped lead the development of the Xbox Live, Microsoft’s online gaming system
  • “If game designers are going to pull a person away from every other voluntary social activity or hobby or pastime, they’re going to have to engage that person at a very deep level in every possible way they can.”
  • This is the dominant effect of persuasive design today: building video games and social media products so compelling that they pull users away from the real world to spend their lives in for-profit domains.
  • Persuasive technologies are reshaping childhood, luring kids away from family and schoolwork to spend more and more of their lives sitting before screens and phones.
  • “Since we’ve figured to some extent how these pieces of the brain that handle addiction are working, people have figured out how to juice them further and how to bake that information into apps.”
  • Today, persuasive design is likely distracting adults from driving safely, productive work, and engaging with their own children — all matters which need urgent attention
  • Still, because the child and adolescent brain is more easily controlled than the adult mind, the use of persuasive design is having a much more hurtful impact on kids.
  • But to engage in a pursuit at the expense of important real-world activities is a core element of addiction.
  • younger U.S. children now spend 5 ½ hours each day with entertainment technologies, including video games, social media, and online videos.
  • Even more, the average teen now spends an incredible 8 hours each day playing with screens and phones
  • U.S. kids only spend 16 minutes each day using the computer at home for school.
  • Quietly, using screens and phones for entertainment has become the dominant activity of childhood.
  • Younger kids spend more time engaging with entertainment screens than they do in school
  • teens spend even more time playing with screens and phones than they do sleeping
  • kids are so taken with their phones and other devices that they have turned their backs to the world around them.
  • many children are missing out on real-life engagement with family and school — the two cornerstones of childhood that lead them to grow up happy and successful
  • persuasive technologies are pulling kids into often toxic digital environments
  • A too frequent experience for many is being cyberbullied, which increases their risk of skipping school and considering suicide.
  • And there is growing recognition of the negative impact of FOMO, or the fear of missing out, as kids spend their social media lives watching a parade of peers who look to be having a great time without them, feeding their feelings of loneliness and being less than.
  • The combined effects of the displacement of vital childhood activities and exposure to unhealthy online environments is wrecking a generation.
  • as the typical age when kids get their first smartphone has fallen to 10, it’s no surprise to see serious psychiatric problems — once the domain of teens — now enveloping young kids
  • Self-inflicted injuries, such as cutting, that are serious enough to require treatment in an emergency room, have increased dramatically in 10- to 14-year-old girls, up 19% per year since 2009.
  • While girls are pulled onto smartphones and social media, boys are more likely to be seduced into the world of video gaming, often at the expense of a focus on school
  • it’s no surprise to see this generation of boys struggling to make it to college: a full 57% of college admissions are granted to young women compared with only 43% to young men.
  • Economists working with the National Bureau of Economic Research recently demonstrated how many young U.S. men are choosing to play video games rather than join the workforce.
  • The destructive forces of psychology deployed by the tech industry are making a greater impact on kids than the positive uses of psychology by mental health providers and child advocates. Put plainly, the science of psychology is hurting kids more than helping them.
  • Hope for this wired generation has seemed dim until recently, when a surprising group has come forward to criticize the tech industry’s use of psychological manipulation: tech executives
  • Tristan Harris, formerly a design ethicist at Google, has led the way by unmasking the industry’s use of persuasive design. Interviewed in The Economist’s 1843 magazine, he says, “The job of these companies is to hook people, and they do that by hijacking our psychological vulnerabilities.”
  • Marc Benioff, CEO of the cloud computing company Salesforce, is one of the voices calling for the regulation of social media companies because of their potential to addict children. He says that just as the cigarette industry has been regulated, so too should social media companies. “I think that, for sure, technology has addictive qualities that we have to address, and that product designers are working to make those products more addictive, and we need to rein that back as much as possible,”
  • “If there’s an unfair advantage or things that are out there that are not understood by parents, then the government’s got to come forward and illuminate that.”
  • Since millions of parents, for example the parents of my patient Kelly, have absolutely no idea that devices are used to hijack their children’s minds and lives, regulation of such practices is the right thing to do.
  • Another improbable group to speak out on behalf of children is tech investors.
  • How has the consumer tech industry responded to these calls for change? By going even lower.
  • Facebook recently launched Messenger Kids, a social media app that will reach kids as young as five years old. Suggestive that harmful persuasive design is now honing in on very young children is the declaration of Messenger Kids Art Director, Shiu Pei Luu, “We want to help foster communication [on Facebook] and make that the most exciting thing you want to be doing.”
  • the American Psychological Association (APA) — which is tasked with protecting children and families from harmful psychological practices — has been essentially silent on the matter
  • APA Ethical Standards require the profession to make efforts to correct the “misuse” of the work of psychologists, which would include the application of B.J. Fogg’s persuasive technologies to influence children against their best interests
  • Manipulating children for profit without their own or parents’ consent, and driving kids to spend more time on devices that contribute to emotional and academic problems is the embodiment of unethical psychological practice.
  • “Never before in history have basically 50 mostly men, mostly 20–35, mostly white engineer designer types within 50 miles of where we are right now [Silicon Valley], had control of what a billion people think and do.”
  • Some may argue that it’s the parents’ responsibility to protect their children from tech industry deception. However, parents have no idea of the powerful forces aligned against them, nor do they know how technologies are developed with drug-like effects to capture kids’ minds
  • Others will claim that nothing should be done because the intention behind persuasive design is to build better products, not manipulate kids
  • similar circumstances exist in the cigarette industry, as tobacco companies have as their intention profiting from the sale of their product, not hurting children. Nonetheless, because cigarettes and persuasive design predictably harm children, actions should be taken to protect kids from their effects.
  • in a 1998 academic paper, Fogg describes what should happen if things go wrong, saying, if persuasive technologies are “deemed harmful or questionable in some regard, a researcher should then either take social action or advocate that others do so.”
  • I suggest turning to President John F. Kennedy’s prescient guidance: He said that technology “has no conscience of its own. Whether it will become a force for good or ill depends on man.”
  • The APA should begin by demanding that the tech industry’s behavioral manipulation techniques be brought out of the shadows and exposed to the light of public awareness
  • Changes should be made in the APA’s Ethics Code to specifically prevent psychologists from manipulating children using digital machines, especially if such influence is known to pose risks to their well-being.
  • Moreover, the APA should follow its Ethical Standards by making strong efforts to correct the misuse of psychological persuasion by the tech industry and by user experience designers outside the field of psychology.
  • It should join with tech executives who are demanding that persuasive design in kids’ tech products be regulated
  • The APA also should make its powerful voice heard amongst the growing chorus calling out tech companies that intentionally exploit children’s vulnerabilities.
Javier E

I Downloaded the Information That Facebook Has on Me. Yikes. - The New York Times - 0 views

  • When I downloaded a copy of my Facebook data last week, I didn’t expect to see much. My profile is sparse, I rarely post anything on the site, and I seldom click on ads
  • With a few clicks, I learned that about 500 advertisers — many that I had never heard of, like Bad Dad, a motorcycle parts store, and Space Jesus, an electronica band — had my contact information
  • Facebook also had my entire phone book, including the number to ring my apartment buzzer. The social network had even kept a permanent record of the roughly 100 people I had deleted from my friends list over the last 14 years, including my exes.
  • During his testimony, Mr. Zuckerberg repeatedly said Facebook has a tool for downloading your data that “allows people to see and take out all the information they’ve put into Facebook.”
  • Most basic information, like my birthday, could not be deleted. More important, the pieces of data that I found objectionable, like the record of people I had unfriended, could not be removed from Facebook, either.
  • “They don’t delete anything, and that’s a general policy,” said Gabriel Weinberg, the founder of DuckDuckGo, which offers internet privacy tools. He added that data was kept around to eventually help brands serve targeted ads.
  • When you download a copy of your Facebook data, you will see a folder containing multiple subfolders and files. The most important one is the “index” file, which is essentially a raw data set of your Facebook account, where you can click through your profile, friends list, timeline and messages, among other features.
  • Upon closer inspection, it turned out that Facebook had stored my entire phone book because I had uploaded it when setting up Facebook’s messaging app, Messenger.
  • Facebook also kept a history of each time I opened Facebook over the last two years, including which device and web browser I used. On some days, it even logged my locations, like when I was at a hospital two years ago or when I visited Tokyo last year.
  • what bothered me was the data that I had explicitly deleted but that lingered in plain sight. On my friends list, Facebook had a record of “Removed Friends,” a dossier of the 112 people I had removed along with the date I clicked the “Unfriend” button. Why should Facebook remember the people I’ve cut off from my life?
  • Facebook said unfamiliar advertisers might appear on the list because they might have obtained my contact information from elsewhere, compiled it into a list of people they wanted to target and uploaded that list into Facebook
  • Brands can obtain your information in many different ways. Those include:
  • ■ Buying information from a data provider like Acxiom, which has amassed one of the world’s largest commercial databases on consumers. Brands can buy different types of customer data sets from a provider, like contact information for people who belong to a certain demographic, and take that information to Facebook to serve targeted ads
  • ■ Using tracking technologies like web cookies and invisible pixels that load in your web browser to collect information about your browsing activities. There are many different trackers on the web, and Facebook offers 10 different trackers to help brands harvest your information, according to Ghostery, which offers privacy tools that block ads and trackers.
  • ■ Getting your information in simpler ways, too. Someone you shared information with could share it with another entity. Your credit card loyalty program, for example
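  • The "invisible pixel" mentioned above is a tiny image whose URL carries identifying parameters; loading the page is enough to report you to the tracker, no click required. A minimal sketch of the idea (not Facebook's actual tracker — the domain `tracker.example` and the parameter names are hypothetical, for illustration only):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_pixel_url(base, campaign_id, user_id, page):
    """Construct the URL of an invisible 1x1 'pixel' image embedded in a page.

    When the browser renders the page it fetches this URL, so the tracker's
    server receives the query parameters (plus the visitor's IP address and
    any cookies for that domain) without the visitor clicking anything.
    """
    return base + "?" + urlencode({
        "c": campaign_id,  # which ad campaign the pixel belongs to
        "u": user_id,      # identifier tied to this visitor
        "p": page,         # the page the visitor is reading
    })

def log_pixel_request(url):
    """What the tracker's server recovers from a single pixel fetch."""
    query = parse_qs(urlparse(url).query)
    return {key: values[0] for key, values in query.items()}

url = build_pixel_url("https://tracker.example/p.gif",
                      "summer-sale", "visitor-42", "/article/123")
print(log_pixel_request(url))
# {'c': 'summer-sale', 'u': 'visitor-42', 'p': '/article/123'}
```

Multiply this by the ten Facebook trackers Ghostery counts, across every site that embeds them, and the browsing history described in the article accumulates on its own.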
  • I also downloaded copies of my Google data with a tool called Google Takeout. The data sets were exponentially larger than my Facebook data.
  • For my personal email account alone, Google’s archive of my data measured eight gigabytes, enough to hold about 2,000 hours of music. By comparison, my Facebook data was about 650 megabytes, the equivalent of about 160 hours of music.
  • In a folder labeled Ads, Google kept a history of many news articles I had read, like a Newsweek story about Apple employees walking into glass walls and a New York Times story about the editor of our Modern Love column. I didn’t click on ads for either of these stories, but the search giant logged them because the sites had loaded ads served by Google.
  • In another folder, labeled Android, Google had a record of apps I had opened on an Android phone since 2015, along with the date and time. This felt like an extraordinary level of detail.
Javier E

Silicon Valley Is Not Your Friend - The New York Times - 0 views

  • By all accounts, these programmers turned entrepreneurs believed their lofty words and were at first indifferent to getting rich from their ideas. A 1998 paper by Sergey Brin and Larry Page, then computer-science graduate students at Stanford, stressed the social benefits of their new search engine, Google, which would be open to the scrutiny of other researchers and wouldn’t be advertising-driven.
  • The Google prototype was still ad-free, but what about the others, which took ads? Mr. Brin and Mr. Page had their doubts: “We expect that advertising-funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.”
  • He was concerned about them as young students lacking perspective about life and was worried that these troubled souls could be our new leaders. Neither Mr. Weizenbaum nor Mr. McCarthy mentioned, though it was hard to miss, that this ascendant generation were nearly all white men with a strong preference for people just like themselves. In a word, they were incorrigible, accustomed to total control of what appeared on their screens. “No playwright, no stage director, no emperor, however powerful,” Mr. Weizenbaum wrote, “has ever exercised such absolute authority to arrange a stage or a field of battle and to command such unswervingly dutiful actors or troops.”
  • In his epic anti-A.I. work from the mid-1970s, “Computer Power and Human Reason,” Mr. Weizenbaum described the scene at computer labs. “Bright young men of disheveled appearance, often with sunken glowing eyes, can be seen sitting at computer consoles, their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler’s on the rolling dice,” he wrote. “They exist, at least when so engaged, only through and for the computers. These are computer bums, compulsive programmers.”
  • Welcome to Silicon Valley, 2017.
  • As Mr. Weizenbaum feared, the current tech leaders have discovered that people trust computers and have licked their lips at the possibilities. The examples of Silicon Valley manipulation are too legion to list: push notifications, surge pricing, recommended friends, suggested films, people who bought this also bought that.
  • Growth becomes the overriding motivation — something treasured for its own sake, not for anything it brings to the world
  • Facebook and Google can point to a greater utility that comes from being the central repository of all people, all information, but such market dominance has obvious drawbacks, and not just the lack of competition. As we’ve seen, the extreme concentration of wealth and power is a threat to our democracy by making some people and companies unaccountable.
  • As is becoming obvious, these companies do not deserve the benefit of the doubt. We need greater regulation, even if it impedes the introduction of new services.
  • We need to break up these online monopolies because if a few people make the decisions about how we communicate, shop, learn the news, again, do we control our own society?
Javier E

Understanding the Social Networks | Talking Points Memo - 0 views

  • Even when people understand in some sense – and often even in detail – how the algorithms work they still tend to see these platforms as modern, digital versions of the town square. There have always been people saying nonsensical things, lying, unknowingly peddling inaccurate information. And our whole civic order is based on a deep skepticism about any authority’s ability to determine what’s true or accurate and what’s not. So really there’s nothing new under the sun, many people say.
  • But all of these points become moot when the networks – the virtual public square – are actually run by a series of computer programs designed to maximize ‘engagement’ and strong emotion for the purposes of selling advertising.
  • But really all these networks are running experiments that put us collectively into the role of Pavlov’s dogs.
  • The algorithms are showing you things to see what you react to and showing you more of the things that prompt an emotional response, that make it harder to leave Facebook or Instagram or any of the other social networks.
  • really if your goal is to maximize engagement that is of course what you’d do since anger is a far more compelling and powerful emotion than appreciation.
  • Facebook didn’t do that. That’s coded into our neurology. Facebook really is an extremism generating machine. It’s really an inevitable part of the core engine.
  • it’s not just Facebook. Or perhaps you could say it’s not even Facebook at all. It’s the mix of machine learning and the business models of all the social networks
  • They have real upsides. They connect us with people. Show us fun videos. But they are also inherently destructive. And somehow we have to take cognizance of that – and not just as a matter of the business decisions of one company.
  • the social networks – meaning the mix of machine learning and advertising/engagement based business models – are really something new under the sun. They’re addiction and extremism generating systems. It’s what they’re designed to do.
Javier E

Technopoly-Chs. 9,10--Scientism, the great symbol drain - 0 views

  • By Scientism, I mean three interrelated ideas that, taken together, stand as one of the pillars of Technopoly.
  • The first and indispensable idea is, as noted, that the methods of the natural sciences can be applied to the study of human behavior. This idea is the backbone of much of psychology and sociology as practiced at least in America, and largely accounts for the fact that social science, to quote F. A. Hayek, "has contributed scarcely anything to our understanding of social phenomena." 2
  • The second idea is, as also noted, that social science generates specific principles which can be used to organize society on a rational and humane basis. This implies that technical means-mostly "invisible technologies" supervised by experts-can be designed to control human behavior and set it on the proper course.
  • The third idea is that faith in science can serve as a comprehensive belief system that gives meaning to life, as well as a sense of well-being, morality, and even immortality.
  • the spirit behind this scientific ideal inspired several men to believe that the reliable and predictable knowledge that could be obtained about stars and atoms could also be obtained about human behavior.
  • Among the best known of these early "social scientists" were Claude-Henri de Saint-Simon, Prosper Enfantin, and, of course, Auguste Comte.
  • They held in common two beliefs to which Technopoly is deeply indebted: that the natural sciences provide a method to unlock the secrets of both the human heart and the direction of social life; that society can be rationally and humanely reorganized according to principles that social science will uncover. It is with these men that the idea of "social engineering" begins and the seeds of Scientism are planted.
  • Information produced by counting may sometimes be valuable in helping a person get an idea, or, even more so, in providing support for an idea. But the mere activity of counting does not make science.
  • Nor does observing things, though it is sometimes said that if one is empirical, one is scientific. To be empirical means to look at things before drawing conclusions. Everyone, therefore, is an empiricist, with the possible exception of paranoid schizophrenics.
  • What we may call science, then, is the quest to find the immutable and universal laws that govern processes, presuming that there are cause-and-effect relations among these processes. It follows that the quest to understand human behavior and feeling can in no sense except the most trivial be called science.
  • Scientists do strive to be empirical and where possible precise, but it is also basic to their enterprise that they maintain a high degree of objectivity, which means that they study things independently of what people think or do about them.
  • I do not say, incidentally, that the Oedipus complex and God do not exist. Nor do I say that to believe in them is harmful-far from it. I say only that, there being no tests that could, in principle, show them to be false, they fall outside the purview of science, as do almost all theories that make up the content of "social science."
  • in the nineteenth century, novelists provided us with most of the powerful metaphors and images of our culture.
  • This fact relieves the scientist of inquiring into their values and motivations and for this reason alone separates science from what is called social science, consigning the methodology of the latter (to quote Gunnar Myrdal) to the status of the "metaphysical and pseudo-objective." 3
  • The status of social-science methods is further reduced by the fact that there are almost no experiments that will reveal a social-science theory to be false.
  • Let us further suppose that Milgram had found that 100 percent of his subjects did what they were told, with or without Hannah Arendt. And now let us suppose that I tell you a story of a group of people who in some real situation refused to comply with the orders of a legitimate authority-let us say, the Danes who in the face of Nazi occupation helped nine thousand Jews escape to Sweden. Would you say to me that this cannot be so because Milgram's study proves otherwise? Or would you say that this overturns Milgram's work? Perhaps you would say that the Danish response is not relevant, since the Danes did not regard the Nazi occupation as constituting legitimate authority. But then, how would we explain the cooperative response to Nazi authority of the French, the Poles, and the Lithuanians? I think you would say none of these things, because Milgram's experiment does not confirm or falsify any theory that might be said to postulate a law of human nature. His study-which, incidentally, I find both fascinating and terrifying-is not science. It is something else entirely.
  • Freud, could not imagine how the book could be judged exemplary: it was science or it was nothing. Well, of course, Freud was wrong. His work is exemplary-indeed, monumental-but scarcely anyone believes today that Freud was doing science, any more than educated people believe that Marx was doing science, or Max Weber or Lewis Mumford or Bruno Bettelheim or Carl Jung or Margaret Mead or Arnold Toynbee. What these people were doing-and Stanley Milgram was doing-is documenting the behavior and feelings of people as they confront problems posed by their culture.
  • the stories of social researchers are much closer in structure and purpose to what is called imaginative literature; that is to say, both a social researcher and a novelist give unique interpretations to a set of human events and support their interpretations with examples in various forms. Their interpretations cannot be proved or disproved but will draw their appeal from the power of their language, the depth of their explanations, the relevance of their examples, and the credibility of their themes.
  • And all of this has, in both cases, an identifiable moral purpose.
  • The words "true" and "false" do not apply here in the sense that they are used in mathematics or science. For there is nothing universally and irrevocably true or false about these interpretations. There are no critical tests to confirm or falsify them. There are no natural laws from which they are derived. They are bound by time, by situation, and above all by the cultural prejudices of the researcher or writer.
  • Both the novelist and the social researcher construct their stories by the use of archetypes and metaphors.
  • Cervantes, for example, gave us the enduring archetype of the incurable dreamer and idealist in Don Quixote. The social historian Marx gave us the archetype of the ruthless and conspiring, though nameless, capitalist. Flaubert gave us the repressed bourgeois romantic in Emma Bovary. And Margaret Mead gave us the carefree, guiltless Samoan adolescent. Kafka gave us the alienated urbanite driven to self-loathing. And Max Weber gave us hardworking men driven by a mythology he called the Protestant Ethic. Dostoevsky gave us the egomaniac redeemed by love and religious fervor. And B. F. Skinner gave us the automaton redeemed by a benign technology.
  • Why do such social researchers tell their stories? Essentially for didactic and moralistic purposes. These men and women tell their stories for the same reason the Buddha, Confucius, Hillel, and Jesus told their stories (and for the same reason D. H. Lawrence told his).
  • Moreover, in their quest for objectivity, scientists proceed on the assumption that the objects they study are indifferent to the fact that they are being studied.
  • If, indeed, the price of civilization is repressed sexuality, it was not Sigmund Freud who discovered it. If the consciousness of people is formed by their material circumstances, it was not Marx who discovered it. If the medium is the message, it was not McLuhan who discovered it. They have merely retold ancient stories in a modem style.
  • Unlike science, social research never discovers anything. It only rediscovers what people once were told and need to be told again.
  • Only in knowing something of the reasons why they advocated education can we make sense of the means they suggest. But to understand their reasons we must also understand the narratives that governed their view of the world. By narrative, I mean a story of human history that gives meaning to the past, explains the present, and provides guidance for the future.
  • In Technopoly, it is not enough to say, it is immoral and degrading to allow people to be homeless. You cannot get anywhere by asking a judge, a politician, or a bureaucrat to read Les Miserables or Nana or, indeed, the New Testament. You must show that statistics have produced data revealing the homeless to be unhappy and to be a drain on the economy. Neither Dostoevsky nor Freud, Dickens nor Weber, Twain nor Marx, is now a dispenser of legitimate knowledge. They are interesting; they are "worth reading"; they are artifacts of our past. But as for "truth," we must turn to "science."
  • In Technopoly, it is not enough for social research to rediscover ancient truths or to comment on and criticize the moral behavior of people. In Technopoly, it is an insult to call someone a "moralizer." Nor is it sufficient for social research to put forward metaphors, images, and ideas that can help people live with some measure of understanding and dignity.
  • Such a program lacks the aura of certain knowledge that only science can provide. It becomes necessary, then, to transform psychology, sociology, and anthropology into "sciences," in which humanity itself becomes an object, much like plants, planets, or ice cubes.
  • That is why the commonplaces that people fear death and that children who come from stable families valuing scholarship will do well in school must be announced as "discoveries" of scientific enterprise. In this way, social researchers can see themselves, and can be seen, as scientists, researchers without bias or values, unburdened by mere opinion. In this way, social policies can be claimed to rest on objectively determined facts.
  • given the psychological, social, and material benefits that attach to the label "scientist," it is not hard to see why social researchers should find it hard to give it up.
  • Our social "scientists" have from the beginning been less tender of conscience, or less rigorous in their views of science, or perhaps just more confused about the questions their procedures can answer and those they cannot. In any case, they have not been squeamish about imputing to their "discoveries" and the rigor of their procedures the power to direct us in how we ought rightly to behave.
  • It is less easy to see why the rest of us have so willingly, even eagerly, cooperated in perpetuating the same illusion.
  • When the new technologies and techniques and spirit of men like Galileo, Newton, and Bacon laid the foundations of natural science, they also discredited the authority of earlier accounts of the physical world, as found, for example, in the great tale of Genesis. By calling into question the truth of such accounts in one realm, science undermined the whole edifice of belief in sacred stories and ultimately swept away with it the source to which most humans had looked for moral authority. It is not too much to say, I think, that the desacralized world has been searching for an alternative source of moral authority ever since.
  • We welcome them gladly, and the claim explicitly made or implied, because we need so desperately to find some source outside the frail and shaky judgments of mortals like ourselves to authorize our moral decisions and behavior. And outside of the authority of brute force, which can scarcely be called moral, we seem to have little left but the authority of procedures.
  • It is not merely the misapplication of techniques such as quantification to questions where numbers have nothing to say; not merely the confusion of the material and social realms of human experience; not merely the claim of social researchers to be applying the aims and procedures of natural science to the human world.
  • This, then, is what I mean by Scientism.
  • It is the desperate hope, and wish, and ultimately the illusory belief that some standardized set of procedures called "science" can provide us with an unimpeachable source of moral authority, a suprahuman basis for answers to questions like "What is life, and when, and why?" "Why is death, and suffering?" "What is right and wrong to do?" "What are good and evil ends?" "How ought we to think and feel and behave?"
  • Science can tell us when a heart begins to beat, or movement begins, or what are the statistics on the survival of neonates of different gestational ages outside the womb. But science has no more authority than you do or I do to establish such criteria as the "true" definition of "life" or of human state or of personhood.
  • Social research can tell us how some people behave in the presence of what they believe to be legitimate authority. But it cannot tell us when authority is "legitimate" and when not, or how we must decide, or when it may be right or wrong to obey.
  • To ask of science, or expect of science, or accept unchallenged from science the answers to such questions is Scientism. And it is Technopoly's grand illusion.
  • In the institutional form it has taken in the United States, advertising is a symptom of a world-view that sees tradition as an obstacle to its claims. There can, of course, be no functioning sense of tradition without a measure of respect for symbols. Tradition is, in fact, nothing but the acknowledgment of the authority of symbols and the relevance of the narratives that gave birth to them. With the erosion of symbols there follows a loss of narrative, which is one of the most debilitating consequences of Technopoly's power.
  • What the advertiser needs to know is not what is right about the product but what is wrong about the buyer. And so the balance of business expenditures shifts from product research to market research, which means orienting business away from making products of value and toward making consumers feel valuable. The business of business becomes pseudo-therapy; the consumer, a patient reassured by psychodramas.
  • At the moment, it is considered necessary to introduce computers to the classroom, as it once was thought necessary to bring closed-circuit television and film to the classroom. To the question "Why should we do this?" the answer is: "To make learning more efficient and more interesting." Such an answer is considered entirely adequate, since in Technopoly efficiency and interest need no justification. It is, therefore, usually not noticed that this answer does not address the question "What is learning for?"
  • What this means is that somewhere near the core of Technopoly is a vast industry with license to use all available symbols to further the interests of commerce, by devouring the psyches of consumers.
  • In the twentieth century, such metaphors and images have come largely from the pens of social historians and researchers. Think of John Dewey, William James, Erik Erikson, Alfred Kinsey, Thorstein Veblen, Margaret Mead, Lewis Mumford, B. F. Skinner, Carl Rogers, Marshall McLuhan, Barbara Tuchman, Noam Chomsky, Robert Coles, even Stanley Milgram, and you must acknowledge that our ideas of what we are like and what kind of country we live in come from their stories to a far greater extent than from the stories of our most renowned novelists.
  • social idea that must be advanced through education.
  • Confucius advocated teaching "the Way" because in tradition he saw the best hope for social order. As our first systematic fascist, Plato wished education to produce philosopher kings. Cicero argued that education must free the student from the tyranny of the present. Jefferson thought the purpose of education is to teach the young how to protect their liberties. Rousseau wished education to free the young from the unnatural constraints of a wicked and arbitrary social order. And among John Dewey's aims was to help the student function without certainty in a world of constant change and puzzling· ambiguities.
  • The point is that cultures must have narratives and will find them where they will, even if they lead to catastrophe. The alternative is to live without meaning, the ultimate negation of life itself.
  • It is also to the point to say that each narrative is given its form and its emotional texture through a cluster of symbols that call for respect and allegiance, even devotion.
  • by definition, there can be no education philosophy that does not address what learning is for. Confucius, Plato, Quintilian, Cicero, Comenius, Erasmus, Locke, Rousseau, Jefferson, Russell, Montessori, Whitehead, and Dewey--each believed that there was some transcendent political, spiritual, or
  • The importance of the American Constitution is largely in its function as a symbol of the story of our origins. It is our political equivalent of Genesis. To mock it, to ignore it, to circumvent it is to declare the irrelevance of the story of the United States as a moral light unto the world. In like fashion, the Statue of Liberty is the key symbol of the story of America as the natural home of the teeming masses, from anywhere, yearning to be free.
  • There are those who believe--as did the great historian Arnold Toynbee--that without a comprehensive religious narrative at its center a culture must decline. Perhaps. There are, after all, other sources-mythology, politics, philosophy, and science, for example--but it is certain that no culture can flourish without narratives of transcendent origin and power.
  • This does not mean that the mere existence of such a narrative ensures a culture's stability and strength. There are destructive narratives. A narrative provides meaning, not necessarily survival-as, for example, the story provided by Adolf Hitler to the German nation in the 1930s.
  • What story does American education wish to tell now? In a growing Technopoly, what do we believe education is for?
  • The answers are discouraging, and one of them can be inferred from any television commercial urging the young to stay in school. The commercial will either imply or state explicitly that education will help the persevering student to get a good job. And that's it. Well, not quite. There is also the idea that we educate ourselves to compete with the Japanese or the Germans in an economic struggle to be number one.
  • Young men, for example, will learn how to make lay-up shots when they play basketball. To be able to make them is part of the definition of what good players are. But they do not play basketball for that purpose. There is usually a broader, deeper, and more meaningful reason for wanting to play-to assert their manhood, to please their fathers, to be acceptable to their peers, even for the sheer aesthetic pleasure of the game itself. What you have to do to be a success must be addressed only after you have found a reason to be successful.
  • Bloom's solution is that we go back to the basics of Western thought.
  • He wants us to teach our students what Plato, Aristotle, Cicero, Saint Augustine, and other luminaries have had to say on the great ethical and epistemological questions. He believes that by acquainting themselves with great books our students will acquire a moral and intellectual foundation that will give meaning and texture to their lives.
  • Hirsch's encyclopedic list is not a solution but a description of the problem of information glut. It is therefore essentially incoherent. But it also confuses a consequence of education with a purpose. Hirsch attempted to answer the question "What is an educated person?" He left unanswered the question "What is an education for?"
  • Those who reject Bloom's idea have offered several arguments against it. The first is that such a purpose for education is elitist: the mass of students would not find the great story of Western civilization inspiring, are too deeply alienated from the past to find it so, and would therefore have difficulty connecting the "best that has been thought and said" to their own struggles to find meaning in their lives.
  • A second argument, coming from what is called a "leftist" perspective, is even more discouraging. In a sense, it offers a definition of what is meant by elitism. It asserts that the "story of Western civilization" is a partial, biased, and even oppressive one. It is not the story of blacks, American Indians, Hispanics, women, homosexuals--of any people who are not white heterosexual males of Judeo-Christian heritage. This claim denies that there is or can be a national culture, a narrative of organizing power and inspiring symbols which all citizens can identify with and draw sustenance from. If this is true, it means nothing less than that our national symbols have been drained of their power to unite, and that education must become a tribal affair; that is, each subculture must find its own story and symbols, and use them as the moral basis of education.
  • Into this void comes the Technopoly story, with its emphasis on progress without limits, rights without responsibilities, and technology without cost. The Technopoly story is without a moral center. It puts in its place efficiency, interest, and economic advance. It promises heaven on earth through the conveniences of technological progress. It casts aside all traditional narratives and symbols that suggest stability and orderliness, and tells, instead, of a life of skills, technical expertise, and the ecstasy of consumption. Its purpose is to produce functionaries for an ongoing Technopoly.
  • It answers Bloom by saying that the story of Western civilization is irrelevant; it answers the political left by saying there is indeed a common culture whose name is Technopoly and whose key symbol is now the computer, toward which there must be neither irreverence nor blasphemy. It even answers Hirsch by saying that there are items on his list that, if thought about too deeply and taken too seriously, will interfere with the progress of technology.
Javier E

EU to limit political ads, ban use of certain personal info - The Washington Post - 0 views

  • Facebook, which has faced heavy criticism for its lack of transparency on political ads, welcomed the move.
  • “We have long called for EU-wide regulation on political ads and are pleased that the Commission’s proposal addresses some of the more difficult questions, in particular when it comes to cross border advertising,” the company, which recently renamed itself Meta, said in a press statement.
  • Google said in a blog post that it supported the proposals and recommended the commission clearly define political ads and spell out responsibilities for tech platforms and advertisers while still keeping the rules flexible.
  • ...4 more annotations...
  • Twitter, which banned all political ads in 2019, said it believed that “political reach should be earned, not bought” and noted that it has also restricted and removed micro-targeting from other types of ads like cause-based ones.
  • Under the EU plan, political ads would have to be clearly labelled, and prominently display the name of the sponsor, with a transparency notice that explains how much the ad cost and where the funds to pay for it came from. The material would have to have a direct link to the vote or poll concerned.
  • Information must be available about the basis on which a person, or group of people, is being targeted by the advertisement, and what kind of amplification tools are being used to help the sponsor reach a wider audience. Ads would be banned if such criteria cannot be met.
  • Jourova told reporters that “the sensitive data that people decide to share with friends on social media cannot be used to target them for political purposes.” She said that “either companies like Facebook are able to publicly say who they are targeting, why and how or they will not be able to do it.”
Javier E

When a Shitposter Runs a Social Media Platform - The Bulwark - 0 views

  • This is an unfortunate and pernicious pattern. Musk often refers to himself as moderate or independent, but he routinely treats far-right fringe figures as people worth taking seriously—and, more troublingly, as reliable sources of information.
  • By doing so, he boosts their messages: A message retweeted by or receiving a reply from Musk will potentially be seen by millions of people.
  • Also, people who pay for Musk’s Twitter Blue badges get a lift in the algorithm when they tweet or reply; because of the way Twitter Blue became a culture war front, its subscribers tend to skew to the right.
  • ...19 more annotations...
  • The important thing to remember amid all this, and the thing that has changed the game when it comes to the free speech/content moderation conversation, is that Elon Musk himself loves conspiracy theories.
  • The media isn’t just unduly critical—a perennial sore spot for Musk—but “all news is to some degree propaganda,” meaning he won’t label actual state-affiliated propaganda outlets on his platform to distinguish their stories from those of the New York Times.
  • In his mind, they’re engaged in the same activity, so he strikes the faux-populist note that the people can decide for themselves what is true, regardless of objectively very different track records from different sources.
  • Musk’s “just asking questions” maneuver is a classic Trump tactic that enables him to advertise conspiracy theories while maintaining a sort of deniability.
  • At what point should we infer that he’s taking the concerns of someone like Loomer seriously not despite but because of her unhinged beliefs?
  • Musk’s skepticism seems largely to extend to criticism of the far-right, while his credulity for right-wing sources is boundless.
  • Brandolini’s Law holds that the amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it.
  • Refuting bullshit requires some technological literacy, perhaps some policy knowledge, but most of all it requires time and a willingness to challenge your own prior beliefs, two things that are in precious short supply online.
  • This is part of the argument for content moderation that limits the dispersal of bullshit: People simply don’t have the time, energy, or inclination to seek out the boring truth when stimulated by some online outrage.
  • Here we can return to the example of Loomer’s tweet. People did fact-check her, but it hardly matters: Following Musk’s reply, she ended up receiving over 5 million views, an exponentially larger online readership than is normal for her. In the attention economy, this counts as a major win. “Thank you so much for posting about this, @elonmusk!” she gushed in response to his reply. “I truly appreciate it.”
  • the problem isn’t limited to elevating Loomer. Musk had his own stock of misinformation to add to the pile. After interacting with her account, Musk followed up last Tuesday by tweeting out a 2021 Federalist article claiming that Facebook founder Mark Zuckerberg had “bought” the 2020 election, an allegation previously raised by Trump and others, and which Musk had also brought up during his recent interview with Tucker Carlson.
  • If Zuckerberg wanted to use his vast fortune to tip the election, it would have been vastly more efficient to create a super PAC with targeted get-out-the-vote operations and advertising. Notwithstanding legitimate criticisms one can make about Facebook’s effect on democracy, and whatever Zuckerberg’s motivations, you have to squint hard to see this as something other than a positive act addressing a real problem.
  • It’s worth mentioning that the refutations I’ve just sketched of the conspiratorial claims made by Loomer and Musk come out to around 1,200 words. The tweets they wrote, read by millions, consisted of fewer than a hundred words in total. That’s Brandolini’s Law in action—an illustration of why Musk’s cynical free-speech-over-all approach amounts to a policy in favor of disinformation and against democracy.
  • Moderation is a subject where Zuckerberg’s actions provide a valuable point of contrast with Musk. Through Facebook’s independent oversight board, which has the power to overturn the company’s own moderation decisions, Zuckerberg has at least made an effort to have credible outside actors inform how Facebook deals with moderation issues
  • Meanwhile, we are still waiting on the content moderation council that Elon Musk promised last October:
  • The problem is about to get bigger than unhinged conspiracy theorists occasionally receiving a profile-elevating reply from Musk. Twitter is the venue that Tucker Carlson, whom advertisers fled and Fox News fired after it agreed to pay $787 million to settle a lawsuit over its election lies, has chosen to make his comeback. Carlson and Musk are natural allies: They share an obsessive anti-wokeness, a conspiratorial mindset, and an unaccountable sense of grievance peculiar to rich, famous, and powerful men who have taken it upon themselves to rail against the “elites,” however idiosyncratically construed
  • If the rumors are true that Trump is planning to return to Twitter after an exclusivity agreement with Truth Social expires in June, Musk’s social platform might be on the verge of becoming a gigantic rec room for the populist right.
  • These days, Twitter increasingly feels like a neighborhood where the amiable guy-next-door is gone and you suspect his replacement has a meth lab in the basement.
  • even if Twitter’s increasingly broken information environment doesn’t sway the results, it is profoundly damaging to our democracy that so many people have lost faith in our electoral system. The sort of claims that Musk is toying with in his feed these days do not help. It is one thing for the owner of a major source of information to be indifferent to the content that gets posted to that platform. It is vastly worse for an owner to actively fan the flames of disinformation and doubt.
Javier E

Opinion | The Imminent Danger of A.I. Is One We're Not Talking About - The New York Times - 0 views

  • a void at the center of our ongoing reckoning with A.I. We are so stuck on asking what the technology can do that we are missing the more important questions: How will it be used? And who will decide?
  • “Sydney” is a predictive text system built to respond to human requests. Roose wanted Sydney to get weird — “what is your shadow self like?” he asked — and Sydney knew what weird territory for an A.I. system sounds like, because human beings have written countless stories imagining it. At some point the system predicted that what Roose wanted was basically a “Black Mirror” episode, and that, it seems, is what it gave him. You can see that as Bing going rogue or as Sydney understanding Roose perfectly.
  • Who will these machines serve?
  • ...22 more annotations...
  • The question at the core of the Roose/Sydney chat is: Who did Bing serve? We assume it should be aligned to the interests of its owner and master, Microsoft. It’s supposed to be a good chatbot that politely answers questions and makes Microsoft piles of money. But it was in conversation with Kevin Roose. And Roose was trying to get the system to say something interesting so he’d have a good story. It did that, and then some. That embarrassed Microsoft. Bad Bing! But perhaps — good Sydney?
  • Microsoft — and Google and Meta and everyone else rushing these systems to market — hold the keys to the code. They will, eventually, patch the system so it serves their interests. Sydney giving Roose exactly what he asked for was a bug that will soon be fixed. Same goes for Bing giving Microsoft anything other than what it wants.
  • the dark secret of the digital advertising industry is that the ads mostly don’t work
  • These systems, she said, are terribly suited to being integrated into search engines. “They’re not trained to predict facts,” she told me. “They’re essentially trained to make up things that look like facts.”
  • So why are they ending up in search first? Because there are gobs of money to be made in search
  • That’s where things get scary. Roose described Sydney’s personality as “very persuasive and borderline manipulative.” It was a striking comment
  • this technology will become what it needs to become to make money for the companies behind it, perhaps at the expense of its users.
  • I think it’s just going to get worse and worse.”
  • What about when these systems are deployed on behalf of the scams that have always populated the internet? How about on behalf of political campaigns? Foreign governments? “I think we wind up very fast in a world where we just don’t know what to trust anymore,”
  • What if they worked much, much better? What if Google and Microsoft and Meta and everyone else end up unleashing A.I.s that compete with one another to be the best at persuading users to want what the advertisers are trying to sell?
  • Large language models, as they’re called, are built to persuade. They have been trained to convince humans that they are something close to human. They have been programmed to hold conversations, responding with emotion and emoji
  • They are being turned into friends for the lonely and assistants for the harried. They are being pitched as capable of replacing the work of scores of writers and graphic designers and form-fillers
  • A.I. researchers get annoyed when journalists anthropomorphize their creations
  • They are the ones who have anthropomorphized these systems, making them sound like humans rather than keeping them recognizably alien.
  • I’d feel better, for instance, about an A.I. helper I paid a monthly fee to use rather than one that appeared to be free
  • It’s possible, for example, that the advertising-based models could gather so much more data to train the systems that they’d have an innate advantage over the subscription models
  • Much of the work of the modern state is applying the values of society to the workings of markets, so that the latter serve, to some rough extent, the former
  • We have done this extremely well in some markets — think of how few airplanes crash, and how free of contamination most food is — and catastrophically poorly in others.
  • One danger here is that a political system that knows itself to be technologically ignorant will be cowed into taking too much of a wait-and-see approach to A.I.
  • wait long enough and the winners of the A.I. gold rush will have the capital and user base to resist any real attempt at regulation
  • Somehow, society is going to have to figure out what it’s comfortable having A.I. doing, and what A.I. should not be permitted to try, before it is too late to make those decisions.
  • Most fears about capitalism are best understood as fears about our inability to regulate capitalism.
Javier E

Wine-tasting: it's junk science | Life and style | The Observer - 0 views

  • Hodgson approached the organisers of the California State Fair wine competition, the oldest contest of its kind in North America, and proposed an experiment for their annual June tasting sessions. Each panel of four judges would be presented with their usual "flight" of samples to sniff, sip and slurp. But some wines would be presented to the panel three times, poured from the same bottle each time. The results would be compiled and analysed to see whether wine testing really is scientific.
  • Results from the first four years of the experiment, published in the Journal of Wine Economics, showed a typical judge's scores varied by plus or minus four points over the three blind tastings. A wine deemed to be a good 90 would be rated as an acceptable 86 by the same judge minutes later and then an excellent 94.
  • ...9 more annotations...
  • Hodgson's findings have stunned the wine industry. Over the years he has shown again and again that even trained, professional palates are terrible at judging wine. "The results are disturbing," says Hodgson from the Fieldbrook Winery in Humboldt County, described by its owner as a rural paradise. "Only about 10% of judges are consistent and those judges who were consistent one year were ordinary the next year." "Chance has a great deal to do with the awards that wines win."
  • French academic Frédéric Brochet tested the effect of labels in 2001. He presented the same Bordeaux superior wine to 57 volunteers a week apart and in two different bottles – one for a table wine, the other for a grand cru. The tasters were fooled. When tasting a supposedly superior wine, their language was more positive – describing it as complex, balanced, long and woody. When the same wine was presented as plonk, the critics were more likely to use negatives such as weak, light and flat.
  • In 2011 Professor Richard Wiseman, a psychologist (and former professional magician) at Hertfordshire University invited 578 people to comment on a range of red and white wines, varying from £3.49 for a claret to £30 for champagne, and tasted blind. People could tell the difference between wines under £5 and those above £10 only 53% of the time for whites and only 47% of the time for reds. Overall they would have been just as successful flipping a coin to guess.
  • why are ordinary drinkers and the experts so poor at tasting blind? Part of the answer lies in the sheer complexity of wine. For a drink made by fermenting fruit juice, wine is a remarkably sophisticated chemical cocktail. Dr Bryce Rankine, an Australian wine scientist, identified 27 distinct organic acids in wine, 23 varieties of alcohol in addition to the common ethanol, more than 80 esters and aldehydes, 16 sugars, plus a long list of assorted vitamins and minerals that wouldn't look out of place on the ingredients list of a cereal pack. There are even harmless traces of lead and arsenic that come from the soil.
  • "People underestimate how clever the olfactory system is at detecting aromas and our brain is at interpreting them," says Hutchinson. "The olfactory system has the complexity in terms of its protein receptors to detect all the different aromas, but the brain response isn't always up to it. But I'm a believer that everyone has the same equipment and it comes down to learning how to interpret it." Within eight tastings, most people can learn to detect and name a reasonable range of aromas in wine
  • People struggle with assessing wine because the brain's interpretation of aroma and bouquet is based on far more than the chemicals found in the drink. Temperature plays a big part. Volatiles in wine are more active when wine is warmer. Serve a New World chardonnay too cold and you'll only taste the overpowering oak. Serve a red too warm and the heady boozy qualities will be overpowering.
  • Colour affects our perceptions too. In 2001 Frédérick Brochet of the University of Bordeaux asked 54 wine experts to test two glasses of wine – one red, one white. Using the typical language of tasters, the panel described the red as "jammy" and commented on its crushed red fruit. The critics failed to spot that both wines were from the same bottle. The only difference was that one had been coloured red with a flavourless dye
  • Other environmental factors play a role. A judge's palate is affected by what she or he had earlier, the time of day, their tiredness, their health – even the weather.
  • Robert Hodgson is determined to improve the quality of judging. He has developed a test that will determine whether a judge's assessment of a blind-tasted glass in a medal competition is better than chance. The research will be presented at a conference in Cape Town this year. But the early findings are not promising."So far I've yet to find someone who passes," he says.
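The two statistical findings above--the ±4-point spread in a judge's repeat scores, and the near-coin-flip hit rate in blind tasting--can be checked with a short numerical sketch. The snippet below is illustrative only (standard-library Python): the triplicate scores are made-up stand-ins for the pattern Hodgson reports, and the coin-flip check treats Wiseman's figure as 578 independent guesses, which is a simplifying assumption.

```python
from math import comb

# Part 1: Hodgson-style consistency. The same wine is poured three
# times; a judge's spread is half the range of the repeat scores.
def spread(scores):
    """Half the range of a judge's repeat scores, i.e. the +/- figure."""
    return (max(scores) - min(scores)) / 2

triplicate = [86, 90, 94]      # the "90-rated wine scored 86, then 94" pattern
print(spread(triplicate))      # -> 4.0, the typical +/- 4 points reported

# Part 2: Wiseman-style blind tasting. Is a 53% hit rate over ~578
# guesses distinguishable from coin-flipping? Exact two-sided
# binomial p-value under the chance hypothesis p = 0.5.
def binom_p_two_sided(k, n):
    k = max(k, n - k)          # fold to the upper tail by symmetry
    upper = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * upper)

p_val = binom_p_two_sided(round(0.53 * 578), 578)
print(p_val > 0.05)            # -> True: a 53% rate in this setup is
                               # statistically indistinguishable from chance
```

On these assumptions the guessing data would not clear the conventional 5% significance bar, which is consistent with the article's coin-flip framing.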
mcginnisca

Why Do We Teach Girls That It's Cute to Be Scared? - The New York Times - 0 views

  • Why Do We Teach Girls That It’s Cute to Be Scared?
  • Apparently, fear is expected of women.
  • parents cautioned their daughters about the dangers of the fire pole significantly more than they did their sons and were much more likely to assist them
  • ...13 more annotations...
  • But both moms and dads directed their sons to face their fears, with instruction on how to complete the task on their own.
  • Misadventures meant that I should try again. With each triumph over fear and physical adversity, I gained confidence.
  • She said that her own mother had been very fearful, gasping at anything remotely rough-and-tumble. “I had been so discouraged from having adventures, and I wanted you to have a more exciting childhood,”
  • Parents are “four times more likely to tell girls than boys to be more careful”
  • “Girls may be less likely than boys to try challenging physical activities, which are important for developing new skills.” This study points to an uncomfortable truth: We think our daughters are more fragile, both physically and emotionally, than our sons.
  • Nobody is saying that injuries are good, or that girls should be reckless. But risk taking is important
  • It follows that by cautioning girls away from these experiences, we are not protecting them. We are failing to prepare them for life.
  • When a girl learns that the chance of skinning her knee is an acceptable reason not to attempt the fire pole, she learns to avoid activities outside her comfort zone.
  • Fear becomes a go-to feminine trait, something girls are expected to feel and express at will.
  • By the time a girl reaches her tweens no one bats an eye when she screams at the sight of an insect.
  • When girls become women, this fear manifests as deference and timid decision making
  • We must chuck the insidious language of fear (Be careful! That’s too scary!) and instead use the same terms we offer boys, of bravery and resilience. We need to embolden girls to master skills that at first appear difficult, even dangerous. And it’s not cute when a 10-year-old girl screeches, “I’m too scared.”
  • I was often scared. Of course I was. So were the men.
sissij

Common Science Myths That Most People Believe | IFLScience - 0 views

  • We only use 10% of our brains. It's true that there’s a great deal we don’t know about the brain, but we certainly do know that we use our entire brain.
  • Sugar makes children hyperactive. Attending any child’s birthday party where cake, ice cream, and sugary drinks abound would make just about anyone a believer that sugar influences hyperactivity. There has not been much evidence to suggest that the so-called “sugar buzz” is actually real for children (aside from a small subset with an insulin disorder coupled with certain psychiatric disorders).
  • Antibiotics kill viruses.  This one pops up every cold and flu season. Antibiotics, by their very definition, kill bacteria.
  •  
    This article really surprised me: some of my intuitions are wrong. I always thought that if we swallow gum, it will stick in our stomach and never go away. However, it passes along as waste. This shows that for many people, much of the science they learn is second-hand. Much of the "scientific knowledge" we have is not first-hand and has been modified along the way. Before using it as scientific knowledge, we first need to make sure that it is really science, not another myth. --Sissi (1/10/2017)
Javier E

Facebook Has 50 Minutes of Your Time Each Day. It Wants More. - The New York Times - 0 views

  • Fifty minutes. That’s the average amount of time, the company said, that users spend each day on its Facebook, Instagram and Messenger platforms
  • there are only 24 hours in a day, and the average person sleeps for 8.8 of them. That means about one-eighteenth of the average user’s waking time is spent on Facebook.
  • That’s more than any other leisure activity surveyed by the Bureau of Labor Statistics, with the exception of watching television programs and movies (an average per day of 2.8 hours)
  • ...19 more annotations...
  • It’s more time than people spend reading (19 minutes); participating in sports or exercise (17 minutes); or social events (four minutes). It’s almost as much time as people spend eating and drinking (1.07 hours).
  • the average time people spend on Facebook has gone up — from around 40 minutes in 2014 — even as the number of monthly active users has surged. And that’s just the average. Some users must be spending many hours a day on the site,
  • time has become the holy grail of digital media.
  • Time is the best measure of engagement, and engagement correlates with advertising effectiveness. Time also increases the supply of impressions that Facebook can sell, which brings in more revenue (a 52 percent increase last quarter to $5.4 billion).
  • And time enables Facebook to learn more about its users — their habits and interests — and thus better target its ads. The result is a powerful network effect that competitors will be hard pressed to match.
  • the only one that comes close is Alphabet’s YouTube, where users spent an average of 17 minutes a day on the site. That’s less than half the 35 minutes a day users spent on Facebook
  • ComScore reported that television viewing (both live and recorded) dropped 2 percent last year, and it said younger viewers in particular are abandoning traditional live television. People ages 18-34 spent just 47 percent of their viewing time on television screens, and 40 percent on mobile devices.
  • People spending the most time on Facebook also tend to fall into the prized 18-to-34 demographic sought by advertisers.
  • Users spent an average of nine minutes on all of Yahoo’s sites, two minutes on LinkedIn and just one minute on Twitter
  • What aren’t Facebook users doing during the 50 minutes they spend there? Is it possibly interfering with work (and productivity), or, in the case of young people, studying and reading?
  • While the Bureau of Labor Statistics surveys nearly every conceivable time-occupying activity (even fencing and spelunking), it doesn’t specifically tally the time spent on social media, both because the activity may have multiple purposes — both work and leisure — and because people often do it at the same time they are ostensibly engaged in other activities
  • The closest category would be “computer use for leisure,” which has grown from eight minutes in 2006, when the bureau began collecting the data, to 14 minutes in 2014, the most recent survey. Or perhaps it would be “socializing and communicating with others,” which slipped from 40 minutes to 38 minutes.
  • But time spent on most leisure activities hasn’t changed much in those eight years of the bureau’s surveys. Time spent reading dropped from an average of 22 minutes to 19 minutes. Watching television and movies increased from 2.57 hours to 2.8. Average time spent working declined from 3.4 hours to 3.25. (Those hours seem low because much of the population, which includes both young people and the elderly, does not work.)
  • The bureau’s numbers, since they cover the entire population, may be too broad to capture important shifts among important demographic groups
  • “You hear a narrative that young people are fleeing Facebook. The data show that’s just not true. Younger users have a wider appetite for social media, and they spend a lot of time on multiple networks. But they spend more time on Facebook by a wide margin.”
  • Among those 55 and older, 70 percent of their viewing time was on television, according to comScore. So among young people, much social media time may be coming at the expense of traditional television.
  • comScore’s data suggests that people are spending on average just six to seven minutes a day using social media on their work computers. “I don’t think Facebook is displacing other activity,” he said. “People use it during downtime during the course of their day, in the elevator, or while commuting, or waiting.
  • Facebook, naturally, is busy cooking up ways to get us to spend even more time on the platform
  • A crucial initiative is improving its News Feed, tailoring it more precisely to the needs and interests of its users, based on how long people spend reading particular posts. For people who demonstrate a preference for video, more video will appear near the top of their news feed. The more time people spend on Facebook, the more data they will generate about themselves, and the better the company will get at the task.
qkirkpatrick

Can You Trust the News Media? - Watchtower ONLINE LIBRARY - 1 views

  • MANY people doubt what they read and hear in the news. In the United States, for example, a 2012 Gallup poll asked people “how much trust and confidence” they had in the accuracy, fairness, and completeness of the news reports of newspapers, TV, and radio. The answer from 6 out of 10 people was either “not very much” or “none at all.” Is such distrust justified?
  • Many journalists and the organizations they work for have expressed a commitment to producing accurate and informative reports. Yet, there is reason for concern. Consider the following factors:
  • MEDIA MOGULS. A small but very powerful number of corporations own primary media outlets.
  • ...4 more annotations...
  • GOVERNMENTS. Much of what we learn in the media has to do with the people and the affairs of government.
  • ADVERTISING. In most lands, media outlets must make money in order to stay in business, and most of it comes from advertising.
  • While it is wise not to believe everything we read in the news, it does not follow that there is nothing we can trust. The key may be to have a healthy skepticism, while keeping an open mind.
  • So, can you trust the news media? Sound advice is found in the wisdom of Solomon, who wrote: “Anyone inexperienced puts faith in every word, but the shrewd one considers his steps.”
  •  
    Can we trust the news media?
Duncan H

Facebook Is Using You - NYTimes.com - 0 views

  • Facebook’s inventory consists of personal data — yours and mine.
  • Facebook makes money by selling ad space to companies that want to reach us. Advertisers choose key words or details — like relationship status, location, activities, favorite books and employment — and then Facebook runs the ads for the targeted subset of its 845 million users
  • The magnitude of online information Facebook has available about each of us for targeted marketing is stunning. In Europe, laws give people the right to know what data companies have about them, but that is not the case in the United States.
  • ...8 more annotations...
  • The bits and bytes about your life can easily be used against you. Whether you can obtain a job, credit or insurance can be based on your digital doppelgänger — and you may never know why you’ve been turned down.
  • Stereotyping is alive and well in data aggregation. Your application for credit could be declined not on the basis of your own finances or credit history, but on the basis of aggregate data — what other people whose likes and dislikes are similar to yours have done
  • Even though laws allow people to challenge false information in credit reports, there are no laws that require data aggregators to reveal what they know about you. If I’ve Googled “diabetes” for a friend or “date rape drugs” for a mystery I’m writing, data aggregators assume those searches reflect my own health and proclivities. Because no laws regulate what types of data these aggregators can collect, they make their own rules.
  • The term Weblining describes the practice of denying people opportunities based on their digital selves. You might be refused health insurance based on a Google search you did about a medical condition. You might be shown a credit card with a lower credit limit, not because of your credit history, but because of your race, sex or ZIP code or the types of Web sites you visit.
  • Advertisers are drawing new redlines, limiting people to the roles society expects them to play
  • Data aggregators’ practices conflict with what people say they want. A 2008 Consumer Reports poll of 2,000 people found that 93 percent thought Internet companies should always ask for permission before using personal information, and 72 percent wanted the right to opt out of online tracking. A study by Princeton Survey Research Associates in 2009 using a random sample of 1,000 people found that 69 percent thought that the United States should adopt a law giving people the right to learn everything a Web site knows about them. We need a do-not-track law, similar to the do-not-call one. Now it’s not just about whether my dinner will be interrupted by a telemarketer. It’s about whether my dreams will be dashed by the collection of bits and bytes over which I have no control and for which companies are currently unaccountable.
  • LAST week, Facebook filed documents with the government that will allow it to sell shares of stock to the public. It is estimated to be worth at least $75 billion. But unlike other big-ticket corporations, it doesn’t have an inventory of widgets or gadgets, cars or phones.
  • If you indicate that you like cupcakes, live in a certain neighborhood and have invited friends over, expect an ad from a nearby bakery to appear on your page.
Javier E

Facebook and Its Users, Mutually Dependent - NYTimes.com - 0 views

  • Even though we may occasionally feel that we can’t live with Facebook, we also haven’t been able to figure out how to live without it. The degree of this codependency may have no parallel. “I can’t think of another piece of passive software that has gotten so embedded in the cultural conversation to this extent before,” says Sherry Turkle, a professor at the Massachusetts Institute of Technology and author of “Alone Together.” “This company is reshaping how we think about ourselves and define ourselves and our digital selves.”
  • “It crystallized a set of issues that we will be defining for the next decade — the notion of self, privacy, how we connect and the price we’re willing to pay for it,” she said. “We have to decide what boundaries we’re going to establish between ourselves, advertisers and our personal information.”
  • “It’s a dynamic that is bred by the very nature of social media because users are the sources of the content,” said S. Shyam Sundar, co-director of the Media Effects Research Laboratory at Pennsylvania State University, who studies how people interact with social media. “Users feel like they have a sense of agency, like they are shareholders.”
  • ...2 more annotations...
  • as Facebook evolves into a sustainable business, the trick will be making sure that users don’t cool on its tactics. That could be devastating to the company’s main source of revenue — showing advertisements to its members based on what it knows about them.
  • Facebook might not be impervious to rivals, or at least to more divided attention from people who shift their time to other parts of the Web where intent is easier to understand and the interactions feel less public.