Notes
- African immigrant, experienced racism early on
- Got into deep learning, didn’t worry about surveillance at first
first warning signs
- COMPAS (recidivism risk algorithm)
    - higher rate of false “will reoffend” predictions for Black defendants than for white defendants
- Joy Buolamwini: face recognition worked worse on Black faces
- only 6 Black attendees at a conference with 8,500 participants (NIPS)
- “not worried about machines..” → because few will benefit and many will be harmed
- “Gender Shades” project: classifiers recognized white men reliably but were highly inaccurate for Black women
    - → trained on internet data; internet access was not equally distributed, so the data skews
- Gmail smart replies: suggested responses made gender assumptions
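The COMPAS finding above is about a disparity in the false positive rate: the share of people who did *not* reoffend but were still flagged as high risk, compared across groups. A minimal sketch with made-up toy records (not real COMPAS data; group labels and numbers are purely illustrative):

```python
# Toy records: (group, predicted_to_reoffend, actually_reoffended).
# The data here is invented to show the metric, not the real disparity.
records = [
    ("A", True, False), ("A", True, True), ("A", True, False), ("A", False, False),
    ("B", True, True), ("B", False, False), ("B", False, False), ("B", True, False),
]

def false_positive_rate(rows):
    """FP / (FP + TN): share of non-reoffenders wrongly flagged as high risk."""
    negatives = [r for r in rows if not r[2]]   # people who did not reoffend
    fp = sum(1 for r in negatives if r[1])      # ...but were predicted to
    return fp / len(negatives) if negatives else 0.0

for group in ("A", "B"):
    rows = [r for r in records if r[0] == group]
    print(group, round(false_positive_rate(rows), 2))
# → A 0.67
# → B 0.33
```

Equal overall accuracy can hide exactly this kind of per-group gap, which is why the audits mentioned in these notes disaggregate error rates by group.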
problems
- data sets are too large to be sanitized or audited by hand
- BERT: trained on 3.3B words; GPT-3 on nearly half a trillion
google next ai
- none of the
suggestions
- “datasheets for datasets”: researchers document the provenance and contents of their datasets
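As a rough illustration of what such a datasheet records, here is a minimal sketch; the field names below are my own paraphrase of the proposal’s question categories, not its exact schema:

```python
from dataclasses import dataclass, field

@dataclass
class Datasheet:
    """Hypothetical, simplified record a dataset creator would fill in."""
    name: str
    motivation: str           # why was the dataset created?
    composition: str          # what do the instances represent? known gaps?
    collection_process: str   # how, when, and by whom was it gathered?
    known_biases: list = field(default_factory=list)

sheet = Datasheet(
    name="example-faces",
    motivation="benchmark face recognition",
    composition="portrait images scraped from the web",
    collection_process="automated crawl, single year, one region",
    known_biases=["overrepresents lighter-skinned subjects"],
)
print(sheet.name, len(sheet.known_biases))
# → example-faces 1
```

The point is that the documentation travels with the data, so downstream users can see the skews before training on it.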