Use Cases of Artificial Intelligence driven by biosignals

Introduction

Imagine harnessing the signals generated by our muscles, hearts, and more to fuel the engines of Artificial Intelligence (AI). A few years ago that might have sounded like science fiction, but over the last couple of months AI solutions have become more and more present in our everyday lives, through research and through tools like ChatGPT. And biosignals are no exception to this trend, feeding AI solutions in both clinical applications and consumer products.

Biosignal sensors, those ingenious devices capable of capturing our body's electrical and physiological signals, have established themselves as a valuable source of data in many AI projects. These sensors have found their way into the laboratories and minds of researchers, propelling innovation to new heights. From healthcare to gaming, from understanding emotions to predicting user behavior, the marriage of biosignals and AI is sparking some incredible use cases.

In this blog post, we'll take you on a journey through the remarkable ways biosignal sensors are supercharging AI research. Whether you're an AI enthusiast or simply curious about the synergy between technology and the human body, this blog post is for you.


Getting Started: Finding the right algorithms

Biosignals are complex signals that do not often reveal their patterns to the naked eye. Advanced Machine Learning techniques, a subset of AI, are capable of learning from provided datasets and extracting patterns that would likely go unnoticed without a significant amount of biosignal processing engineering.

Finding the best approach to your new AI challenge is not always obvious. Fortunately, researchers like M. Harman et al. provide helpful guidance by analyzing common pattern recognition and data analysis methods for their applicability to biosignal data. This study in particular explores six common methods and tests them on public data sets containing Electrocardiography recordings (heart signals) acquired using a PLUX Electrocardiography (ECG) sensor. The approaches examine both raw signals and visual representations of heartbeats, with the goal of classifying heartbeats as normal or abnormal.
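
To make this concrete, here is a minimal sketch of how such a method comparison can look in practice. It does not reproduce the six methods from the paper; the synthetic data, segment length, and choice of classifiers below are illustrative assumptions only, using Python and scikit-learn:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical data: each row stands in for one heartbeat segment (e.g., 180
# samples around the R-peak of an ECG recording); labels are 0 = normal,
# 1 = abnormal.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 180))   # placeholder for real heartbeat segments
y = rng.integers(0, 2, size=500)  # placeholder for real annotations

# Compare a few common classifiers with 5-fold cross-validation.
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "svm_rbf": SVC(),
    "random_forest": RandomForestClassifier(n_estimators=100),
    "knn": KNeighborsClassifier(),
}

for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```

In a real project, the placeholder arrays would be replaced by segmented heartbeats and their expert annotations from the public data sets.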

This information may help you kick off your new AI challenge and point you in the right direction. Find out more about this research project in the published research paper.


Biosignals-powered video game difficulty levels

Ever wondered if your body can tell how challenging a video game is? It turns out it can, as demonstrated by the research conducted by P. Rodrigues et al., who introduce an innovative framework that measures your physiological responses to classify the difficulty level of a Virtual Reality (VR) video game.

To put this to the test, a puzzle-based VR game was developed, featuring three distinct difficulty levels, each targeting specific emotional regions. Thirty-two participants took part in the study, playing the game while their physiological responses were recorded using Electrocardiography (ECG), Electrodermal Activity (EDA), and Respiration sensors. The participants also reported their emotions during gameplay, allowing physiological responses to be matched with perceived emotions and difficulty levels.

Analysis of these self-reports confirmed that the different game levels did indeed trigger varying emotions and perceived difficulty. And using the collected biosignals alone, the researchers were able to process the data and, with Machine Learning methods, predict the game's difficulty level correctly 3 out of 4 times.
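
As an illustration of the general idea (and not the authors' actual pipeline), a difficulty classifier of this kind typically reduces each gameplay window to a handful of physiological features and trains a standard model on them. The window features and model below are assumptions made for the sketch:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def window_features(ecg_rr_intervals, eda, resp_rate):
    """Reduce one gameplay window to a small physiological feature vector."""
    return [
        60.0 / np.mean(ecg_rr_intervals),   # mean heart rate (bpm)
        np.std(ecg_rr_intervals),           # crude heart rate variability
        np.mean(eda),                       # skin conductance level
        np.max(eda) - np.min(eda),          # skin conductance range
        resp_rate,                          # breaths per minute
    ]

# Placeholder dataset: one feature vector per gameplay window,
# labels 0/1/2 for the three difficulty levels.
rng = np.random.default_rng(1)
X = np.array([
    window_features(rng.uniform(0.6, 1.0, 60),   # fake RR intervals (s)
                    rng.normal(5, 1, 1000),      # fake EDA trace (µS)
                    rng.uniform(12, 20))         # fake respiration rate
    for _ in range(300)
])
y = rng.integers(0, 3, size=300)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```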

This opens up a whole new world of video game interaction and invites further exploration of this approach. Perhaps the difficulty level of your next video game is already powered by biosignals (we just hope you’re not into horror games then).

Find out more about this creative research project in the published research paper.


Let your emotions find the perfect running playlist

Wearable technology is reshaping user-focused applications, and its influence on sports has not gone unnoticed. This technology is being strategically employed to amplify athletes' performance, curtail injury risks, and regulate fatigue. Notably, emotions have emerged as pivotal components in these advancements.

So why not use your own emotions to, let’s say, control a motivating Spotify running playlist for yourself?

The researchers from the DJ-Running Project thought exactly that and introduced a trail-blazing (pun intended) methodology that leverages wearables and machine learning models to infer runners' emotions during their training sessions.

The study centers on an Electrodermal Activity (EDA) sensor, which records a physiological signal closely linked with emotion recognition. The derived emotional insights are integrated into a mobile application which, within the project’s framework, selects and plays tailored music, fostering runners' motivation in the moment.
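
To give a feel for what such an emotion pipeline consumes, here is a small Python sketch (using NumPy/SciPy, not the project's actual code) of one common way to summarize EDA: separating the slow tonic level from the phasic skin conductance responses that are widely used as arousal markers. The cutoff frequency and peak threshold are assumptions for the example:

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def eda_arousal_features(eda, fs=10.0):
    """Split an EDA signal into tonic and phasic parts and summarize arousal.

    The tonic (slowly varying) level is estimated with a low-pass filter;
    what remains is the phasic activity, whose peaks correspond to
    skin conductance responses (SCRs), a common arousal marker.
    """
    b, a = butter(2, 0.05 / (fs / 2), btype="low")
    tonic = filtfilt(b, a, eda)
    phasic = eda - tonic
    # Count SCR-like peaks above a small amplitude threshold.
    peaks, _ = find_peaks(phasic, height=0.01, distance=fs)
    return {
        "mean_tonic_level": float(np.mean(tonic)),
        "scr_rate_per_min": len(peaks) / (len(eda) / fs / 60.0),
    }

# Example on a synthetic 2-minute recording sampled at 10 Hz.
t = np.arange(0, 120, 0.1)
eda = 5 + 0.01 * t + 0.05 * np.random.default_rng(2).normal(size=t.size)
print(eda_arousal_features(eda))
```

In the DJ-Running setting, summaries like these would feed the emotion model that, in turn, drives the music selection.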


In essence, this research exemplifies the synergy of wearable technology and emotional analysis in the domain of sports training, a research field often dominated by performance monitoring and injury prevention based on mechanical and muscular activity.

Find out more about this project on the project homepage or on the published research paper.


Detecting sleep stages with Machine Learning and wearables

The pivotal role of sleep in well-being is widely acknowledged. In a study conducted by R. Brage et al., the research group focused on developing machine learning algorithms for automated sleep stage detection.

Using features aligned with the Sleep Scoring Manual provided by the American Academy of Sleep Medicine, the models were trained on a substantial database of 2056 polysomnographies: sleep studies that monitor brain, cardiac, and respiratory activity, among others, during a patient’s sleep to provide insights into their sleep patterns and sleep quality.

The study aimed to create an algorithm that predicts sleep stages and sleep quality solely from a Photoplethysmography (PPG) device, a simple optical sensor placed on the subject's index finger during sleep. In a practical test, the algorithm's classifications proved robust, matching the results of a popular sleep stage classification app in 90% of cases across all four sleep stages.
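
As a rough illustration of how PPG-based sleep staging generally works (this is not the study's method), pulse-to-pulse intervals from the optical signal are summarized into heart rate and heart-rate-variability features per scoring epoch, and a classifier assigns each epoch a stage. The features, stage labels, and model below are assumptions for the sketch:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

STAGES = ["wake", "light", "deep", "rem"]  # four-stage scheme used here

def epoch_features(pulse_intervals):
    """Features for one 30-second epoch from PPG pulse-to-pulse intervals (s)."""
    diffs = np.diff(pulse_intervals)
    return [
        60.0 / np.mean(pulse_intervals),   # mean heart rate (bpm)
        np.std(pulse_intervals),           # overall variability (SDNN-like)
        np.sqrt(np.mean(diffs ** 2)),      # short-term variability (RMSSD-like)
    ]

# Placeholder training data: one feature vector per epoch plus a stage label,
# standing in for epochs scored against polysomnography.
rng = np.random.default_rng(3)
X = np.array([epoch_features(rng.uniform(0.7, 1.2, 30)) for _ in range(1000)])
y = rng.integers(0, 4, size=1000)

clf = GradientBoostingClassifier().fit(X, y)
print(STAGES[int(clf.predict(X[:1])[0])])  # predicted stage for the first epoch
```

In practice, each epoch's predicted stage would be validated against polysomnography-based scoring rather than the random placeholders used here.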


This study highlights the reliability of simplified, wearable biosignal sensors as inexpensive and easy-to-use tools for sleep studies at home, all through the combination of wearable biosignals and AI solutions, bridging research with real-world applications.

Find out more about this project in the published research paper.


Conclusion

Long gone are the times when the combination of Artificial Intelligence (AI) and biosignals was only part of science fiction. Instead, we can find biosignals everywhere, and their use with AI shows potential for interesting and motivating applications, as shown by the use cases presented in this blog post.

From heart rate analysis for clinical purposes to AI-curated playlists driven by a runner's emotional state, joining biosignals and AI reveals a landscape of innovation. Predicting video game difficulty and identifying sleep phases offer not only entertainment enhancements but also potential insights into user experiences.

These use cases collectively underline the practical value of merging biosignals and AI, inviting us to explore and harness this synergy in our ever-evolving technological landscape. And perhaps this blog post has inspired you to learn more about biosignals and start your own AI challenge.