The COVID-19 pandemic boosted the use of AI-based chatbots that detect human emotions. Researchers are developing other applications that use similar technology.
Emotion-sensing robots have long been the stuff of science fiction. Lately, though, a real need has emerged for machines that can interpret human emotion, perhaps even taking the place of a psychologist or therapist.
The use of AI-driven chatbots has been booming since the COVID-19 crisis took hold last year, reports Laken Brooks in a recent post on the subject. “Millions of isolated people have found comfort by chatting with an AI bot. Psychiatrists are studying how these AI companions can improve mental wellness during the pandemic and beyond.”
Pushing the boundaries of using AI to detect human emotions even further, researchers at Incheon National University have developed a 5G-assisted emotion detection system that uses wireless signals and body movement. As reported in ScienceDaily, the system goes beyond therapy, with potential life-saving applications, such as sensing the mental state of the driver of a vehicle, or of an agitated person in a public space.
“Some emotions can also disrupt the normal functioning of a society and put people’s lives in danger, such as those of an unstable driver,” explains Professor Hyunbum Kim from Incheon National University, leader of the project. “Emotion-detection technology thus has great potential for recognizing any disruptive emotion and in tandem with 5G and beyond-5G communication, warning others of potential dangers. For instance, in the case of the unstable driver, the AI-enabled driver system of the car can inform the nearest network towers, from where nearby pedestrians can be informed via their personal smart devices.”
Outside of vehicles, if the system detects an extreme amount of anger or fear in a person in a public area, the information can be “rapidly conveyed to the nearest police department or relevant entities who can then take steps to prevent any potential crime or terrorism threats.” At the same time, the system still needs work — security is weak, and there is always the danger of false positives. Still, these developments reflect an increasing capability for AI-based systems to interact, in real-time, with human counterparts on an emotional level.
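To make that detect-and-warn flow concrete, here is a minimal sketch of how such a pipeline might be wired together. It is illustrative only: the emotion labels, the confidence threshold, and the broadcast_alert hook are hypothetical stand-ins, not part of the Incheon team's published system.

```python
from dataclasses import dataclass
from enum import Enum

class Emotion(Enum):
    CALM = "calm"
    ANGER = "anger"
    FEAR = "fear"

# Hypothetical cutoff: only high-confidence detections trigger an alert,
# which helps rein in the false positives the researchers acknowledge.
ALERT_THRESHOLD = 0.8

@dataclass
class Reading:
    subject_id: str           # e.g., a vehicle or camera-zone identifier
    emotion: Emotion          # output of an upstream classifier (not shown)
    confidence: float         # classifier confidence in [0, 1]
    location: tuple           # (latitude, longitude) of the sensor

def broadcast_alert(reading: Reading) -> None:
    """Stand-in for pushing a warning to the nearest network tower,
    which would relay it to nearby smart devices or a police dispatch."""
    print(f"ALERT near {reading.location}: {reading.emotion.value} "
          f"(confidence {reading.confidence:.2f}) from {reading.subject_id}")

def process_reading(reading: Reading) -> None:
    # Only disruptive emotions above the threshold are escalated.
    if reading.emotion in (Emotion.ANGER, Emotion.FEAR) and \
            reading.confidence >= ALERT_THRESHOLD:
        broadcast_alert(reading)

# Example: an agitated driver flagged by an in-car sensor suite.
process_reading(Reading("vehicle-42", Emotion.ANGER, 0.91, (37.45, 126.70)))
```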
Reading body language is an art that machines are beginning to learn. At Carnegie Mellon University, researchers have been developing a system, called OpenPose, that can track body movements, including those of the hands and face, in real time. As reported by Apoorva Komarraju in Analytics Insight, the OpenPose system employs computer vision and machine learning to process video frames. “The OpenPose system can also track individual fingers and their movement,” Komarraju relates. “To make this happen, researchers used Panoptic Studio, a dome lined with 500 cameras that were used to capture body postures from a variety of angles. These images were used to build the data set for the system. All those images were then passed through a keypoint detector, which helped identify and label the body parts. OpenPose learns to associate the body parts with individuals, which makes tracking multiple people possible without creating chaos regarding whose hand is where.”
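The “whose hand is where” problem is one of association: deciding which detected keypoints belong to which person. OpenPose solves it with learned part-affinity fields; the toy sketch below, which simply assigns each detected hand to the nearest wrist, is a hypothetical stand-in meant only to show the shape of the problem. The coordinates are made up.

```python
import math

# Hypothetical detections in pixel coordinates for a single video frame.
# In a real pipeline, a keypoint detector would produce these from the image.
wrists = {"person_A": (120, 340), "person_B": (480, 355)}
hands = [(131, 352), (469, 348)]  # hand keypoints, owner initially unknown

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def assign_hands(hands, wrists):
    """Greedy nearest-wrist assignment: a toy stand-in for OpenPose's
    learned part-affinity-field association."""
    assignments = {}
    for hand in hands:
        owner = min(wrists, key=lambda person: distance(hand, wrists[person]))
        assignments[hand] = owner
    return assignments

for hand, owner in assign_hands(hands, wrists).items():
    print(f"hand at {hand} -> {owner}")
```

In a crowded, multi-camera setup like Panoptic Studio, this pairing must also hold up across overlapping bodies and viewpoints, which is why OpenPose learns the association from data rather than relying on raw distance.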