Was this AI trained on an unbalanced data set? (Only black folks?) Or has it only been used to identify photos of black people? I have so many questions: some technical, some about media sensationalism.
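To make the "unbalanced data set" question concrete: here's a toy sketch (invented numbers, nothing to do with the actual system) of how a model whose single decision threshold is fit to majority-heavy training data can end up far less accurate on the underrepresented group, with no ill intent anywhere in the pipeline.

```python
import random
import statistics

random.seed(0)

# Toy illustration, NOT real face-recognition code: a 1-D "sensor reading"
# feature whose distribution differs between two hypothetical groups.
# The "model" is a single threshold fit to the pooled training data,
# so the majority group dominates where that threshold lands.

def sample(group, label, n):
    # Hypothetical feature offsets per group; all values are invented.
    base = 5.0 if group == "A" else 2.0
    shift = 1.5 if label == 1 else -1.5
    return [(random.gauss(base + shift, 1.0), label) for _ in range(n)]

# Unbalanced training set: 900 examples from group A, only 100 from group B.
train = (sample("A", 1, 450) + sample("A", 0, 450)
         + sample("B", 1, 50) + sample("B", 0, 50))

# "Training": put the threshold midway between the pooled class means.
# Group A's base value dominates this estimate.
pos = [x for x, y in train if y == 1]
neg = [x for x, y in train if y == 0]
threshold = (statistics.mean(pos) + statistics.mean(neg)) / 2

def accuracy(group):
    test = sample(group, 1, 1000) + sample(group, 0, 1000)
    correct = sum((x > threshold) == bool(y) for x, y in test)
    return correct / len(test)

print(f"group A accuracy: {accuracy('A'):.2f}")
print(f"group B accuracy: {accuracy('B'):.2f}")
```

The pooled threshold sits close to where it should for group A but above both of group B's class means, so group B's positives are mostly missed. That's the kind of failure that looks like bias in outcomes while the proximate cause is a skewed training sample.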

    • Not talking about systemic racism in general. I know there’s a lot of that. I’m talking about systemic racism causing this particular issue. I bring it up because there have been cases of motion sensors failing to detect black hands because of technical deficiencies. I’m not apologizing for anyone, just pointing out that this has happened before for purely technical reasons.

        • To put it the other way around: the fact that there have been issues with systemic racism (which is true) does not rule out technical malfunction (which exists). That’s like saying that because the lemon juice is sour, it must contain vinegar. It doesn’t follow: lemon juice can be sour simply because it has lemons in it, with no vinegar at all.

            • Again, I’m not disagreeing about systemic racism in the police at all. That is a big issue that needs to be solved. I’m just saying that this doesn’t have to be related to it, because the technology itself has issues like this. Yes, there is vinegar in the food, but lemon is naturally sour; even with no vinegar at all, it’s going to be sour. Attributing all the sourness to vinegar wouldn’t make the food any better. It would just make it harder to identify the issues with the individual ingredients.