The software can help developers constrain their creations so they don't make bad decisions.

wired.com/story/these-startups

@Qwant it's ultimately the designer who makes the decision. Building a face recognition algorithm is a human choice, and nothing else.
An algorithm that misclassifies black people was trained on biased data, collected in a society with biases and annotated by biased humans. The wording of this article and your headline seem to suggest the opposite.
