• The AIs are not racist themselves; it's a side effect of the full technology stack: camera sensors resolve fewer distinct levels in darker tones, the images are then encoded with a gamma curve and quantized so that shadow detail is further degraded, and an AI that works fine on images of light-skinned faces simply receives less information from images of dark-skinned faces, leading to higher uncertainty and more false positives.
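    A minimal sketch of one piece of this (the quantization part, under the simplifying assumption of a linear 8-bit sensor encoding with no noise): count how many distinct codes fall within one photographic stop, i.e. a doubling of light, at different brightness levels. Shadows get only a handful of codes per stop, highlights get over a hundred, so dark tones carry far less recoverable detail before any AI ever sees the image.

    ```python
    # Distinct 8-bit codes available within one photographic stop (a
    # doubling of light) under a linear encoding. The darker the stop,
    # the fewer codes it spans, hence the coarser the tonal detail.

    def codes_per_stop(low):
        """Number of distinct linear 8-bit codes in the stop [low, 2*low)."""
        return len(range(int(low), min(int(2 * low), 256)))

    for low in (4, 16, 64, 128):
        print(f"stop starting at code {low:3d}: {codes_per_stop(low):3d} codes")
    # stop starting at code 4 spans 4 codes; the stop starting at 128 spans 128
    ```

    Gamma encoding partly compensates for this by spending more output codes on shadows, but it can't recover detail the sensor never captured, and cheap sensors with high read noise drown what little shadow signal there is.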

    The bias starts with the cameras themselves; security cameras in particular ought to have an even higher dynamic range than the human eye, but instead they're often a cheap afterthought, and then good luck figuring out what they actually recorded.