MIT removes huge dataset that teaches AI systems to use racist, misogynistic slurs

MIT has taken offline a massive and highly cited dataset that trained AI systems to use racist and misogynistic terms to describe people, The Register reports. The training set, called 80 Million Tiny Images because that's how many labeled images it scraped from Google Images, was created in 2008 to develop advanced object detection techniques. It has been used to teach machine-learning models to identify the people and objects in still images. As The Register's Katyanna Quach wrote: "Thanks to MIT's cavalier approach when assembling its training set, though, these systems may also label women as whores or bitches,…

This story continues at The Next Web
