“While mind reading sounds like everyone’s worst nightmare, with the leaps and bounds being made in AI and ChatGPT, this scenario could soon become a reality”, says Nigel Pereira.


While it’s hard to control the thoughts that run through our heads on a daily basis, most people have a tight grip on their actions, and not every crazy idea has to be carried out. For example, it’s fairly common for people leaning over an edge or standing atop a tall building to think, “What would happen if I jumped?”, though it’s not a thought most people follow through on. That being said, having a machine read and interpret your thoughts is unsettling to say the least, especially since thoughts are heavily influenced by the world around us and aren’t easy to control. A person who has violent thoughts isn’t necessarily a violent person, just as a person who has charitable thoughts isn’t necessarily a saint.

Mind reading by AI isn’t a new development by any means; a 2016 post on ScienceAlert claimed scientists had invented a machine that could visualize your thoughts. Imagine having your private thoughts aired on a screen for everyone to see. While it sounds like everyone’s worst nightmare, with the leaps and bounds being made in AI, and ChatGPT in particular, this scenario could soon become a reality.

Similarly, a 2018 post on Computerworld explains how brain-wave pattern detection has been around for decades and that AI was the missing link needed to actually interpret those patterns. The article goes on to describe how scientists used a neural network to match neuromuscular signals with specific words, achieving 92% accuracy after just 15 minutes of training.
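
For a sense of what that kind of training loop looks like, here is a minimal Python sketch: a small feed-forward network is trained to map signal feature vectors to a handful of words. The data is synthetic and the vocabulary, network size, and resulting accuracy are purely illustrative assumptions, not the system Computerworld describes.

    # Illustrative only: a toy classifier mapping synthetic "neuromuscular
    # signal" feature vectors to a tiny vocabulary. Real signals would come
    # from electrodes, not random noise around a template.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    vocabulary = ["yes", "no", "call", "stop"]   # assumed target words
    n_per_word, n_features = 200, 32             # assumed samples and feature size

    # Fabricate one noisy signal "template" per word, then sample around it.
    templates = rng.normal(size=(len(vocabulary), n_features))
    X = np.vstack([t + 0.5 * rng.normal(size=(n_per_word, n_features)) for t in templates])
    y = np.repeat(np.arange(len(vocabulary)), n_per_word)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    # A small feed-forward network stands in for whatever architecture was actually used.
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    clf.fit(X_train, y_train)

    print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
    print("decoded word:", vocabulary[clf.predict(X_test[:1])[0]])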

Image Credit: Shutterstock

Emotional surveillance technologies

China has been using emotional surveillance technology for nearly a decade, ever since it was introduced at State Grid Zhejiang Electric Power in 2014. Since then, the technology has spread to everything from public transport to factories, state-owned enterprises, and military installations across the country.

Employees or soldiers are made to wear wireless sensors inside their helmets or hard hats, which transmit their brain activity to an AI algorithm built to detect signs of anxiety or stress. If such signs do arise, the employees are either given a day off or reassigned to less critical tasks, according to the Deccan Chronicle. As reported by Business Insider, this “emotional surveillance technology” is used to tweak workflows and breaks to increase productivity.

Similarly, companies like Huawei, China Mobile, China Unicom, and PetroChina are among those that use Taigusys, another AI emotion-recognition system that can apparently judge how stressed, confrontational, or nervous a person is based on their facial features.

That’s right: if you work for one of these organizations, faking a smile at the end of a 9- or 10-hour shift isn’t going to fool middle management.

The AI that powers Taigusys works by reading footage from security cameras and analyzing it for facial muscle movements, body movements, and other biometric data. The algorithm then evaluates these signals on several scales and classifies the underlying emotions as good, bad, or neutral. If your face doesn’t show the right mood for whatever reason, it’s probably going to be a pretty long day, every day.
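
As a rough illustration of that final classification step, the snippet below averages per-frame emotion scores, the kind of output a facial-analysis model might emit, and buckets the result into good, bad, or neutral. The labels, weights, and thresholds are assumptions made up for the example; Taigusys’s actual categories and scales aren’t public in this form.

    # Illustrative only: aggregate per-frame emotion scores into a coarse
    # good / bad / neutral verdict. The labels and thresholds below are
    # assumptions for the example, not Taigusys's real scales.

    FRAME_SCORES = [
        {"happy": 0.7, "neutral": 0.2, "angry": 0.05, "stressed": 0.05},
        {"happy": 0.1, "neutral": 0.3, "angry": 0.40, "stressed": 0.20},
        {"happy": 0.2, "neutral": 0.6, "angry": 0.10, "stressed": 0.10},
    ]

    GOOD = {"happy"}
    BAD = {"angry", "stressed"}

    def classify(frames, threshold=0.15):
        """Average scores across frames and bucket the net sentiment."""
        n = len(frames)
        avg = {k: sum(f.get(k, 0.0) for f in frames) / n for k in frames[0]}
        net = sum(avg[k] for k in GOOD) - sum(avg[k] for k in BAD)
        if net > threshold:
            return "good", net
        if net < -threshold:
            return "bad", net
        return "neutral", net

    print(classify(FRAME_SCORES))   # prints the bucket and the net score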

Image Credit: Shutterstock

Decoding the human mind

With all the above-mentioned developments in AI mind reading, it was only a matter of time until someone threw something like ChatGPT into the mix, and that time has come. Earlier this month, a research team led by Jerry Tang, a doctoral student in computer science, and Alex Huth, a professor of neuroscience and computer science at the University of Texas at Austin, developed an AI model that can read your thoughts. It does this with the help of what is called a semantic decoder, powered by a transformer model not unlike the ones behind OpenAI’s ChatGPT or Google’s Bard. The study was conducted by having three people each spend 15 hours inside an fMRI scanner, listening intently to stories while the decoder interpreted their thoughts.
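
In broad strokes, the published approach works by guess-and-check: a language model proposes candidate word sequences, an encoding model predicts the brain response each candidate should produce, and the candidates whose predicted responses best match the measured fMRI signal are kept. The sketch below is a heavily simplified, hypothetical rendering of that loop; propose_continuations and predict_brain_response are toy stand-ins, not the models used in the study.

    # Highly simplified sketch of a "guess and check" semantic decoder.
    # propose_continuations() and predict_brain_response() are hypothetical
    # stand-ins for the language model and the fMRI encoding model.
    import numpy as np

    rng = np.random.default_rng(1)

    def propose_continuations(prefix):
        # Stand-in for a language model suggesting possible next words.
        return [prefix + [w] for w in ["car", "drive", "license", "home"]]

    def predict_brain_response(words):
        # Stand-in for an encoding model: words -> predicted fMRI features.
        return rng.normal(size=64)

    def similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def decode(measured_response, steps=3, beam_width=2):
        beams = [[]]
        for _ in range(steps):
            candidates = [c for b in beams for c in propose_continuations(b)]
            scored = sorted(
                candidates,
                key=lambda c: similarity(predict_brain_response(c), measured_response),
                reverse=True,
            )
            beams = scored[:beam_width]    # keep the best-matching guesses
        return " ".join(beams[0])

    measured = rng.normal(size=64)         # pretend fMRI measurement
    print(decode(measured))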

Now, while the research results are positive, it should be noted that the AI doesn’t interpret your thoughts word for word but instead captures the “gist” of them, which might be even scarier. For example, when a participant listened to a story that read “I don’t have my driver’s license yet,” the decoder rendered it as “she has not even started to learn to drive yet,” as reported by The Telegraph. That said, a person could be thinking a million different things while listening to a story, so as far as mind reading goes, that sounds pretty accurate. And while there’s no danger of someone reading your thoughts unless you agree to have your brain scanned by noninvasive technology for 15 hours, the fact that the day might come when we are judged by our thoughts should be enough to give most of the population nightmares.


With a background in Linux system administration, Nigel Pereira began his career with Symantec Antivirus Tech Support. He has now been a technology journalist for over 6 years and his interests lie in Cloud Computing, DevOps, AI, and enterprise technologies.
