• Good luck creating an interface that doesn’t require calibration, a step that can easily be thwarted simply by being non-compliant.

    To even use a BCI with an incredibly simple UI, such as a screen with only text characters, takes tens to hundreds of hours of training, and message transmission is quite slow (about 12 characters per minute in this study). This is because brains aren’t all built the same, and the very idea that you could decode brainwaves into digestible thought that another human would understand, without the willful participation of both parties, is a long, long way out.

    You might be able to create a generalized AI/ML model trained on tens of thousands or millions of BCI participants (data that won’t exist until BCI becomes mainstream and large datasets are for sale), but even that would be quite easy to defeat with simple noncompliance for a solid 10+ years of further research. Eventually we’d probably be able to decode even that, but simple techniques such as meditation could still prevent a proper calibration if the subject refuses to cooperate (see the sketch below).
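
    A minimal sketch of that point, using purely synthetic data and a toy linear-discriminant decoder (the channel count, effect sizes, and "subjects" are all invented for illustration, not taken from any study): a decoder calibrated on one compliant user works for that user, drops toward chance on a different brain, and collapses to chance when the user simply doesn't attend to the stimuli.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        N_CHANNELS, N_TRIALS = 8, 400

        def simulate_subject(pattern, attend=True):
            # Single-trial "ERP" features: background noise plus a subject-specific
            # spatial pattern on target trials, but only if the user actually attends.
            y = rng.integers(0, 2, N_TRIALS)                  # 1 = target flash
            X = rng.normal(0.0, 1.0, (N_TRIALS, N_CHANNELS))  # background noise
            if attend:
                X[y == 1] += pattern                          # evoked response requires compliance
            return X, y

        pattern_a = rng.normal(0.0, 1.0, N_CHANNELS)          # subject A's spatial pattern
        pattern_b = rng.normal(0.0, 1.0, N_CHANNELS)          # subject B's differs

        X_cal, y_cal = simulate_subject(pattern_a)
        clf = LinearDiscriminantAnalysis().fit(X_cal, y_cal)  # calibrated on subject A

        X_same, y_same = simulate_subject(pattern_a)            # same brain, later session
        X_other, y_other = simulate_subject(pattern_b)          # different brain, no recalibration
        X_nc, y_nc = simulate_subject(pattern_b, attend=False)  # non-compliant user

        print("within-subject accuracy:", clf.score(X_same, y_same))    # high
        print("cross-subject accuracy :", clf.score(X_other, y_other))  # roughly chance
        print("non-compliant accuracy :", clf.score(X_nc, y_nc))        # chance

    This doesn't touch the harder problem of involuntary "thought reading"; it only illustrates why even the cooperative, per-subject-calibrated case is the easy one.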

    • Naturally they require calibration, which is currently still quite complicated, but in a few years this will surely become unnecessary with the advancement of AI and quantum computers. Not so long ago a 1 MB hard drive weighed half a ton. And apart from all this calibration, it really isn’t that big of a deal if you use this in interrogations; a current lie detector also requires about half an hour of calibration. Current technological advances are exponential.

      • I understand your concern, but I think you are underestimating just how different one human brain is from another. There’s a reason brain connectome research has moved so slowly, and that’s not even trying to compare one human to another; it’s just trying to understand the basics of an incredibly complex and incredibly varied set of systems that only nominally look similar.

        • I know that we have yet to fully understand the human brain, but in view of the exponential scientific progress of recent decades, I prefer to point to Clarke’s first law.

            • I think that, on the contrary, we are closer to the future we fear. Governments are becoming more and more radical, taking away one basic right after another; we are on the verge of a world war, with huge sums invested in weapons and in science for products put to military use. Regarding climate change, yes, we are almost at the point of no return, and that means future catastrophes that will cost a great many lives: a warming of more than 2 °C in average temperature in the coming decades. In other words, if we continue down this path, we won’t even live to experience the consequences of global warming; we will have Mad Max much sooner.