• I am also not a fan. What you describe as having “resulted in people teaching to the test, rather than teaching to the material” was already a problem before AI, and I agree it is about to get worse with this. Another point is that the psychological and emotional changes that occur during childhood and adolescence differ from child to child. Will there be some ‘gentle pressure’ to ‘meet’ some data target? And what does it do to children if the teacher looks at the data instead of just talking to them?

    I’m not an expert on this, but I feel very uneasy. It seems like an education system that prepares you for a job at Amazon or the like.

    • To be fair, I don’t think AI is involved here; it’s a simple Likert scale whose data is aggregated. I could see the company attempting to layer AI on top of this data to sell talking points or more complicated analysis, such as ‘proactively identifying’ students who might need more help, but that wasn’t mentioned in the article. Either way, I agree it creates perverse incentives centered on the data points rather than the children.