Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation
-
The problems of free speech in any era are shaped by the communications technology available for people to use and by the ways that people actually use that technology.
-
The First Amendment, I argued, would prove increasingly inadequate to this task;5 moreover, if courts interpreted the Constitution in a short-sighted manner, judge-made doctrines of the First Amendment would actually hinder the protection and development of a truly democratic culture.6
-
To be sure, digital companies would often find themselves on the side of the values of a democratic culture. But just as often they would seek constitutional protection for novel forms of surveillance and control of individuals and groups.9
-
The Algorithmic Society features the collection of vast amounts of data about individuals and facilitates new forms of surveillance, control, discrimination and manipulation, both by governments and by private companies. Call this the problem of Big Data.10
-
In the digital age, individuals do not face the familiar dyadic model of speech regulation. In the dyadic model there are two central actors, the state and the speaker: the power of the state threatens the individual's right to speak.
-
In the pluralist model, individuals may be controlled, censored, and surveilled both by the nation state and by the owners of many different kinds of private infrastructure.
-
In this world, the judge-made doctrines of the First Amendment, although still necessary, are inadequate to provide sufficient guarantees of free expression.
-
The Algorithmic Society depends on huge databases that can cheaply and easily be collected, collated, and analyzed.
-
New forms of wealth emerge in the Digital Age just as they did in the Industrial Revolution. Four especially important forms of wealth in the Information Age are intellectual property, fame, information security, and Big Data.
-
We should make a key distinction between distributed and democratic power. A form of power is democratic if many people participate in it and participate in decisionmaking about how to employ it. A form of power is distributed if it operates in many different places and affects many different people and situations. In some ways the Internet and its associated digital technologies have made power more democratic. But in other ways the Internet has made it possible for power to be widely distributed but not democratic.
-
We tend to associate power with the effects of technology itself. But technology is actually a way of exemplifying and constituting relationships of power between one set of human beings and another set of human beings. This was true even of the technology of writing, which, Claude Lévi-Strauss famously asserted, was used to organize the labor of slaves.20 It is true today in the development of decisionmaking by algorithms and AI agents.
-
the Algorithmic Age is a struggle over the collection, transmission, use, and analysis of data. For this reason, the central constitutional questions do not concern freedom of contract. They concern freedom of expression.
-
The most important question is not whether robots have First Amendment rights; it is whether companies will be able to shield themselves from regulation by claiming that their uses of AI agents, robots, and algorithms are First Amendment protected activities.
-
Two key ideas help us understand when the First Amendment permits legal regulation of the people and organizations that use Big Data, algorithms, and artificial intelligence. The first is the concept of information fiduciaries. The second is the concept of algorithmic nuisance.
-
Governments can impose reasonable regulations on how information fiduciaries collect, use, distribute, and sell information derived from their fiduciary relationships with end-users.
-
Although these businesses use data and share data, the First Amendment does not prevent regulation of how they make and implement their decisions. That is because permissible regulation aims at the outputs of algorithmic decisionmaking: discrimination and manipulation.41
-
This means that many of the digital organizations that people deal with every day - including Internet service providers ("ISPs"), search engines, and social media platforms - should be treated as information fiduciaries with respect to their clients and end-users. Therefore, consistent with the First Amendment, governments can subject the information fiduciary to reasonable restrictions on collection, collation, analysis, use, sale, and distribution of personal information.
-
This is the idea of algorithmic nuisance. The concept of algorithmic nuisance applies when companies use Big Data and algorithms to make judgments that construct people's identities, traits, and associations that affect their opportunities and vulnerabilities.
-
The classic examples of information fiduciaries are doctors and lawyers.29 Both collect large amounts of personal information about their clients; their operations are not transparent to relatively untrained clients; and clients' ability to monitor these professionals is limited by their lack of training.
-
Businesses use algorithms and ratings systems derived from algorithms to make decisions about who gets what opportunity - credit, a job, or entrance to and exclusion from any number of different benefits. In order to make these decisions, businesses increasingly rely on Big Data and algorithms, because so many decisions have to be made and it is too costly to engage in individualized decisionmaking.47
-
The idea behind algorithmic nuisance is that algorithmic decisionmaking has cumulative side effects on populations as more and more public and private businesses adopt it.49 Algorithms construct people's identities and reputations by classifying them as risky.
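To make the cumulative-effect point concrete, here is a minimal, purely illustrative sketch (in Python; the names, scores, and thresholds are invented for illustration and are not drawn from Balkin's article) of how a single shared risk score, reused by several independent decisionmakers, can simultaneously close off credit, employment, and insurance for one person:

```python
# Purely illustrative sketch (not Balkin's method): invented names, scores,
# and thresholds showing how one shared risk score, reused by many
# independent decisionmakers, produces cumulative harms.

risk_scores = {"alice": 0.82, "bob": 0.31}  # hypothetical broker-style scores

# Several unrelated businesses reuse the same score with their own cutoffs.
decisions = {
    "credit":    lambda score: score < 0.70,  # approve only below this cutoff
    "job":       lambda score: score < 0.60,
    "insurance": lambda score: score < 0.80,
}

for person, score in risk_scores.items():
    outcomes = {name: rule(score) for name, rule in decisions.items()}
    print(person, outcomes)

# alice is scored "risky" once, yet is denied credit, a job, and insurance
# at the same time; bob passes everywhere. The harm accumulates across
# decisionmakers even though no single business sees itself as the cause.
```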
-
To deal with this new organization of consumer products and services, we need the concepts of information fiduciary and algorithmic nuisance. Home robots and smart appliances collect an enormous amount of information about us which, in theory, can be collated with information about many other people that is stored in the cloud. Home robots and smart appliances are always-on, interconnected cloud entities that rely on and contribute to huge databases.
-
The second set of issues is symbolized by the ideas of "the right to forget" and "fake news." These two issues may seem unrelated. In fact, they are about the same issue: a fundamental change in how freedom of speech is regulated in the digital era. This alteration in governance has two key elements. The first is a change in how governments regulate - or attempt to regulate - speech in the digital era, from "old school" to "new school" speech regulation. The second is that privately owned online platforms engage in private governance of speech.
-
Both the creation of a right to forget and recent calls for a solution to the problem of fake news are examples of a larger phenomenon: the emergence of a new form of government speech regulation.
-
Nation states have not abandoned old school speech regulation. But they have increasingly moved to new school speech regulation because online speech is hard to govern. Speakers may be judgment proof, anonymous, and located outside the country, and they may not be human at all, but an army of bots. By contrast, owners of infrastructure are usually large for-profit enterprises; they are readily identifiable; and they have assets and do business within nation states.
-
The first key feature of new school speech regulation is collateral censorship. Collateral censorship occurs when the state aims at one party, A (typically an intermediary), in order to control the speech of another party, B.6
-
Problems of collateral censorship occur whenever governments adopt intermediary liability rules.70
-
A key problem of administrative prior restraint is that it involves informal or bureaucratic censorship.72
-
In a system of subsequent punishment, speech may flow freely unless and until a court imposes liability; in a system of prior restraints, by contrast, the burden of action is flipped. The speaker may not speak unless he or she gets prior permission; until the bureaucrat or employee gets around to giving permission, the speech is forbidden.
-
Because of the dangers of collateral censorship, some governments, like the United States, provide for varying degrees of intermediary immunity.77 Intermediary immunity rules relieve collateral censorship by holding infrastructure owners harmless for content that is stored on their sites, or moves through their channels, when certain conditions are met.
-
A second key feature of new school speech regulation is public/private cooperation and cooptation.81 Governments aim at infrastructure providers in order to get them to censor or regulate the speech of people that governments cannot easily otherwise control. New school speech regulation seeks to coax the infrastructure provider into helping the state in various ways.
-
The relationship between nation states and infrastructure providers varies along a spectrum. It ranges from direct regulation, to threats, to suggestions that things will go better for infrastructure operators if they cooperate, to negotiations over the terms of cooperation.
-
A research paper by Jack Balkin on the rise of algorithms within society, the repercussions of these algorithms being used by large businesses, and the relationships among Big Data, private consumers, and national governments. Primarily, the paper looks at the increasing interconnection of these relationships, how they have changed in the years since the internet and algorithms were introduced, and how the First Amendment may no longer be enough in this new online space.