Here’s a great example of dystopian tech being rolled out without guardrails. Brought to you by Axon, which you may know as the company that rebranded after Taser became a liability as a name.

  • the federal government will refuse to pay Medicare/Medicaid bills if the associated narrative is ‘too templated’, i.e. if you sound like a robot using identical phrases all the time, you get busted.

    clearly cops need to be held to similar standards, and also face consequences for outright lying.

    this product should be made illegal in this context. there’s too much riding on it.

  • So, all those videos where the cops are screaming “Stop resisting!” at some person who is laid out on the ground, not moving and not resisting, with eight cops piled on top holding them down - I’m sure the AI chatbot notices and notes down all that nuance, right?

    “Relying exclusively on body camera audio—not video”

    Oops, guess not … :(

  • So, AI that is strictly incapable of generating new ideas is going to be fed decades of police reports as its database, and use that data to discern what makes a good police report?

    Surely this won’t replicate decades-old systemic problems with racial profiling. I mean, all these police reports are certainly objective, with no hint of bias to be found in the officers’ writing.