Technology has blurred all sorts of boundaries we used to take for granted—between work and leisure, between being alone and being with others, between private and public spaces. One boundary we still generally treat as sacrosanct, though, is the one around our own minds, which allows us to think for ourselves and to keep those thoughts private, whether they are rebellious, impolite or simply irrelevant. After all, the power to make up our own minds is an essential part of what makes us individuals.
Technology may now be challenging this mental independence, too, and some of its applications could threaten our autonomy if they were to fall into the wrong hands.
Take emotion recognition technology, for example, which you can try on the Emojify website. It is now being deployed widely in China, including in schools to measure pupils’ attentiveness, and in detention and social care settings. The software claims to recognize different emotions, as well as personal characteristics such as age and gender. But there are serious concerns about this technology. The science behind it, and therefore its accuracy, is problematic. It carries serious risks of perpetuating bias and discrimination, as its responses may vary with personal characteristics. And the raison d’être of emotion recognition technology is to deprive people of privacy in their thoughts and feelings. Chilling analogies with the telescreens of George Orwell’s “1984” are difficult to avoid.