The court ruled in favor of the Dynamex drivers, agreeing that they had been misclassified as independent contractors and were, in fact, employees. The ruling also concluded that employers may classify as independent contractors only those workers who meet the conditions of the "ABC standard" established in other states: (a) the worker is free from control and direction over performance of the work, both under the contract and in fact; (b) the work provided is outside the usual course of the business for which the work is performed; and (c) the worker is customarily engaged in an independently established trade, occupation, or business (hence the ABC standard).
A California Court Just Ruled That Gig Workers Are Bona Fide Employees. Will Courts in ...
While some of these workers may be independent contractors by choice, others, like the Dynamex drivers, were forced into the classification by employers looking to save money. The National Employment Law Project estimates that employers can reduce payroll and other taxes by up to 30 percent by reclassifying employees as independent contractors. State-level studies on the issue, meanwhile, have uncovered extremely high misclassification rates: a series of audits in Ohio found that 47 percent of workers were misclassified. (This misclassification, not surprisingly, costs federal, state, and local governments hundreds of millions of dollars in lost tax revenues.)
Why Mathematicians Like to Classify Things | Quanta Magazine
How We Made AI As Racist and Sexist As Humans
Artificial intelligence may have cracked the code on certain tasks that typically require human smarts, but in order to learn, these algorithms need vast quantities of data that humans have produced. They hoover up that information, rummage around in search of commonalities and correlations, and then offer a classification or prediction (whether that lesion is cancerous, whether you’ll default on your loan) based on the patterns they detect. Yet they’re only as clever as the data they’re trained on, which means that our limitations—our biases, our blind spots, our inattention—become theirs as well.
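The "only as clever as the data" point can be made concrete with a deliberately tiny sketch. The data and the majority-vote "model" below are invented for illustration, not drawn from any real system: a learner that simply memorizes the most common historical outcome per applicant group will faithfully reproduce whatever skew the historical record contains.

```python
from collections import Counter

# Hypothetical training data: (applicant_group, approved) pairs from past
# decisions. Group "B" was historically approved far less often than "A".
history = ([("A", True)] * 80 + [("A", False)] * 20
           + [("B", True)] * 20 + [("B", False)] * 80)

def train_majority_by_group(examples):
    """Learn, per group, the most common historical outcome."""
    outcomes = {}
    for group, approved in examples:
        outcomes.setdefault(group, Counter())[approved] += 1
    return {g: c.most_common(1)[0][0] for g, c in outcomes.items()}

model = train_majority_by_group(history)
print(model)  # the skew in the data becomes the decision rule
```

Real systems are vastly more sophisticated, but the failure mode is the same in kind: patterns in, patterns out.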
The majority of AI systems used in commercial applications—the ones that mediate our access to services like jobs, credit, and loans—are proprietary, their algorithms and training data kept hidden from public view. That makes it exceptionally difficult for an individual to interrogate the decisions of a machine or to know when an algorithm, trained on historical examples checkered by human bias, is stacked against them. And forget about trying to prove that AI systems may be violating human rights legislation.
Data is essential to the operation of an AI system. And the more complicated the system—the more layers in the neural nets, to translate speech or identify faces or calculate the likelihood that someone will default on a loan—the more data must be collected to train it.
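One way to see the layers-to-data relationship is to count trainable parameters: a fully connected network with more or wider layers has more weights to fit, and as a rule of thumb, fitting more weights reliably requires more training examples. The layer widths below are arbitrary, chosen only for illustration.

```python
def mlp_param_count(layer_sizes):
    """Weights plus biases for a fully connected net with the given layer widths."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

shallow = mlp_param_count([100, 10])            # one linear layer
deep = mlp_param_count([100, 256, 256, 10])     # two hidden layers
print(shallow, deep)  # 1010 94218
```

Adding just two hidden layers multiplies the parameter count nearly a hundredfold here, which is one reason deep models are so hungry for data.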