Researchers say that while a method of wirelessly determining emotions in study participants works, they would be cautious to offer wider use of it because of the potential ethical challenges it may pose. Photo by StockSnap/Pixabay
Feb. 3 (UPI) -- Scientists have developed a new artificial intelligence system that uses radio wave signals and a deep-learning neural network to remotely detect a subject's emotions.
The novel system -- described in a new paper published Wednesday in the journal PLOS ONE -- can identify heart rate and breathing patterns associated with anger, sadness, joy and pleasure.
To build their system, researchers had study volunteers watch videos designed to evoke one of the four primary emotions mentioned above.
While the volunteers watched, researchers bounced radio waves off them and fed the returning signals into an artificial intelligence system programmed for deep learning.
"The low power radio signal is transmitted from an antenna and it reflects from the body," corresponding author Yang Hao told UPI in an email.
"During breathing an individual's chest moves when they inhale and exhale, which modulates the reflected signal. The internal heartbeat movements also modulate the reflected signal," said Yang, a professor of antennas and electromagnetics at Queen Mary University of London.
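The modulation Yang describes can be illustrated with a simple simulation. The sketch below is not the researchers' actual pipeline; it assumes an idealized continuous-wave radar with a 5-centimeter wavelength and made-up breathing (0.25 Hz) and heartbeat (1.2 Hz) rates, and shows how chest displacement shifts the phase of the reflected signal -- and how both rates can be read back out of that phase.

```python
import numpy as np

# Hypothetical parameters, for illustration only.
fs = 100.0                    # sample rate, Hz
t = np.arange(0, 60, 1 / fs)  # 60 seconds of samples
wavelength = 0.05             # assumed 5 cm carrier wavelength

# Chest-surface motion: breathing dominates, heartbeat is a small ripple.
displacement = (0.004 * np.sin(2 * np.pi * 0.25 * t)      # breathing, ~4 mm
                + 0.0003 * np.sin(2 * np.pi * 1.2 * t))   # heartbeat, ~0.3 mm

# A displacement d changes the round-trip path by 2d, shifting the
# reflected signal's phase by 4*pi*d / wavelength.
phase = 4 * np.pi * displacement / wavelength
baseband = np.exp(1j * phase)  # idealized received baseband signal

# Recover the motion from the phase and inspect its spectrum.
recovered = np.unwrap(np.angle(baseband))
spectrum = np.abs(np.fft.rfft(recovered - recovered.mean()))
freqs = np.fft.rfftfreq(recovered.size, 1 / fs)

# Strongest component in each physiological band.
breath_band = (freqs > 0.1) & (freqs < 0.6)
heart_band = (freqs > 0.8) & (freqs < 2.0)
breathing_hz = freqs[breath_band][np.argmax(spectrum[breath_band])]
heartbeat_hz = freqs[heart_band][np.argmax(spectrum[heart_band])]
```

In this toy setup the spectral peaks land back on the simulated 0.25 Hz breathing rate and 1.2 Hz heart rate, which is the information the researchers' neural network mines for emotion-linked patterns.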
The artificial neural network deployed by Yang and his colleagues was able to pick out predictive patterns in the "hidden" data.
Unlike traditional machine learning algorithms, which require humans to curate data and feed it to an algorithm, the deep learning network analyzes raw data in real time.
"Traditional machine learning approaches necessitate manual extraction of hand-crafted features that generally requires domain expertise and can even be subject to human bias," Yang said.
"For example, a human would decide what descriptors would carry the important information inherent in raw data. This tedious step is no longer needed with a deep neural network where it can self-capture even the slightest details from raw data," Yang said.
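The distinction Yang draws can be sketched in a few lines. This is a hypothetical illustration, not the study's code: the hand-crafted route reduces a raw signal to a few statistics a human picked in advance, while a deep network's first layer applies filters over every raw sample -- filters that training, rather than a human, would shape (here a single untrained filter stands in for that).

```python
import numpy as np

rng = np.random.default_rng(0)
raw_signal = rng.normal(size=500)  # stand-in for a raw reflected-signal trace

# Classical pipeline: a human decides which summary statistics ("features")
# to hand the learner -- the expertise-dependent step Yang describes.
def hand_crafted_features(x):
    return np.array([x.mean(), x.std(), np.abs(np.diff(x)).mean()])

features = hand_crafted_features(raw_signal)  # 3 numbers chosen by a human

# Deep-learning pipeline: the network ingests every raw sample; its first
# layer slides learned filters over the signal (one untrained filter here
# as a placeholder for what training would discover).
learned_filter = rng.normal(size=16)
first_layer_out = np.convolve(raw_signal, learned_filter, mode="valid")
```

The hand-crafted path discards all but three human-chosen numbers; the convolutional path keeps a response for every position in the raw trace, leaving it to training to decide what matters.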
Previous efforts to train computers to recognize emotional states have mostly relied on facial recognition software -- a subject-dependent technique.
This deep learning network, however, provided subject-independent analysis. In other words, the neural network was able to identify hidden data patterns that anticipated emotional states across a diverse range of test subjects.
Most emotion-sensing technologies require bulky sensors, but the latest research showed emotions can be detected wirelessly using radio signals.
Traditionally, automated emotional detection systems are limited to psychological or neuroscientific studies, but the latest study suggests wireless emotion-detection could be used in more public places -- like an office.
Of course, the deployment of such a system outside scientific settings raises significant ethical considerations.
"Emotions are someone's personal privacy matter, and should not be monitored in public places unless strict legislation of data protection is widely accepted for its effective utilization," Yang said. "Moreover, use of this technology should only be considered in specific areas that are acceptable to society."
"For instance, emotions detected using this method may not provide an accurate representation of someone's true feelings so the results should not be used directly in decision making or healthcare," Yang said. "For this reason, to develop this technology for wider use more work is required around ethical concerns and its social impact."
Yang and his colleagues are currently recruiting healthcare professionals and social scientists to help them address ethical concerns as they develop publicly acceptable uses for the new technology.