• I can imagine that in the future there will be gridlock in front of the police station, with AVs full of black passengers, whenever the cops send out an APB with the description of a black suspect.

    We’ve seen plenty of racist AI programs in the past because the programmers, intentionally or not, added their own bias into the training data.

    • The AIs are not racist themselves; it’s a side effect of the full technology stack. Cameras capture less tonal detail in darker areas, and images are then encoded with a gamma curve that allocates fewer code values per stop of exposure in the shadows. An AI that works fine on images of light-skinned faces simply doesn’t get the same amount of information from images of dark-skinned faces, leading to higher uncertainty and more false positives.

      The bias starts with the cameras themselves: security cameras in particular should have an even higher dynamic range than the human eye, but instead they’re often a cheap afterthought, and then good luck making sense of what they’ve recorded.
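
      A rough back-of-the-envelope sketch of the gamma point above. It counts how many 8-bit sRGB code values span one photographic stop of exposure around a darker versus a lighter midtone; the two reflectance values are illustrative assumptions, not measured skin data.

```python
def srgb_encode(linear):
    """Standard sRGB transfer function: linear light -> encoded value in 0..1."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def codes_per_stop(linear_mid):
    """8-bit code values covering one stop [mid/sqrt(2), mid*sqrt(2)]."""
    lo = round(srgb_encode(linear_mid / 2 ** 0.5) * 255)
    hi = round(srgb_encode(linear_mid * 2 ** 0.5) * 255)
    return hi - lo

print(codes_per_stop(0.05))  # darker midtone: 22 code values per stop
print(codes_per_stop(0.40))  # lighter midtone: 53 code values per stop
```

      So even after the gamma curve’s boost to the shadows, a face a few stops darker spans far fewer distinct tonal steps in the final 8-bit image, which is less raw information for the recognizer to work with.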