A Feasibility Study of Data Driven Ecological Momentary Assessment Surveys Using Smart Wearables to Capture Emotions

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review


Abstract

Using physiological signals to determine emotional states is a concept that has evolved from the observed physical bodily reactions that different emotions can cause; however, to date, few studies have applied this to ecological momentary assessments outside the lab. In addition, many studies lack clearly defined reasoning as to why classifying emotions may be beneficial, or what applications it may serve. This paper presents a data collection tool designed for field studies, along with a standardised methodology, combining real-time data processing and smart wearables to monitor physiological signals for emotional state changes. It highlights a defined target group of Autistic adults who may encounter difficulties perceiving and regulating their own emotions, a condition also known as Alexithymia, with a use case of providing a tool that helps identify their emotions in real time and assess whether emotion regulation intervention may be needed, without relying on carers or outside support. We offer results and feedback learnt from the first feasibility study completed with this tool and the methodology used to collect data. This paper features commentary on Autistic accessibility design for assistive tools, drawn from our experience of co-designing the tool with our target group and from direct feedback gathered through various patient and public involvement activities.
Original language: English
Title of host publication: IEEE Sensors Applications Symposium 2024
DOIs
Publication status: E-pub ahead of print - 23 Aug 2024

Keywords

  • emotion
  • recognition
  • wearable sensors
  • field study
  • autism
  • data-driven modeling
