
Digital Society / Group items tagged: algorithm discrimination


dr tech

The Bias Embedded in Algorithms | Pocket

  •  
    "Algorithms and the data that drive them are designed and created by people, which means those systems can carry biases based on who builds them and how they're ultimately deployed. Safiya Umoja Noble, author of Algorithms of Oppression: How Search Engines Reinforce Racism, offers a curated reading list exploring how technology can replicate and reinforce racist and sexist beliefs, how that bias can affect everything from health outcomes to financial credit to criminal justice, and why data discrimination is a major 21st century challenge."
dr tech

I Know Some Algorithms Are Biased--because I Created One - Scientific American Blog Net...

  •  
    "Creating an algorithm that discriminates or shows bias isn't as hard as it might seem, however. As a first-year graduate student, my advisor asked me to create a machine-learning algorithm to analyze a survey sent to United States physics instructors about teaching computer programming in their courses."
dr tech

Discrimination by algorithm: scientists devise test to detect AI bias | Technology | Th...

  •  
    "Concerns have been growing about AI's so-called "white guy problem" and now scientists have devised a way to test whether an algorithm is introducing gender or racial biases into decision-making."
dr tech

Ethics committee raises alarm over 'predictive policing' tool | UK news | The Guardian

  •  
    "Amid mounting financial pressure, at least a dozen police forces are using or considering predictive analytics, despite warnings from campaigners that use of algorithms and "predictive policing" models risks locking discrimination into the criminal justice system."
dr tech

Facebook isn't looking out for your privacy. It wants your data for itself | Technology...

  •  
    "If Facebook cared about unfair profiling and privacy abuse, for instance, it would probably not have started grouping its users together based on their "Ethnic Affinity". It wouldn't then allow that ethnic affinity to be used as a basis for excluding users from advertisements, and it certainly wouldn't allow that ethnic affinity to be used as a basis for potentially illegal discrimination in real estate advertising."
dr tech

Still flattening the curve?: Increased risk of digital authoritarianism after...

  •  
    "The main rationale for increasing state surveillance was to tackle the pandemic effectively to save people's lives. Yet, states are not enthusiastic about abandoning these digital tools, even though the pandemic is winding down. Instead, they are determined to preserve their surveillance capacities under the pretext of national security or preparation for future pandemics. In the face of increasing state surveillance, however, we should thoroughly discuss the risk of digital authoritarianism and the possible use of surveillance technologies to violate privacy, silence political opposition, and oppress minorities. For example, South Korea's sophisticated contact tracing technology that involves surveillance camera footage, cell-phone location data, and credit card purchases has disclosed patients' personal information, such as nationality. It raised privacy concerns, particularly for ethnic minorities, and underlined the risk of technology-enabled ethnic mapping and discrimination."