
User-adaptive models for activity and emotion recognition using deep transfer learning and data augmentation

Category
Journal publication
Abstract
Building predictive models for human-interactive systems is a challenging task. Every individual has unique characteristics and behaviors. A generic human–machine system will not perform equally well for each user, given these between-user differences. Alternatively, a system built specifically for each particular user will perform closer to the optimum. However, such a system would require more training data for every specific user, thus hindering its applicability in real-world scenarios. Collecting training data can be time-consuming and expensive. For example, in clinical applications it can take weeks or months until enough data is collected to start training machine learning models. End users expect to start receiving quality feedback from a given system as soon as possible, without having to rely on time-consuming calibration and training procedures. In this work, we build and test user-adaptive models (UAM), which are predictive models that adapt to each user's characteristics and behaviors with reduced training data. Our UAM are trained using deep transfer learning and data augmentation and were tested on two public datasets. The first is an activity recognition dataset based on accelerometer data; the second is an emotion recognition dataset based on speech recordings. Our results show that the UAM achieve a significant increase in recognition performance with reduced training data relative to a general model. Furthermore, we show that individual characteristics such as gender can influence the models' performance.
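This page does not include the paper's code. As a minimal sketch of the idea summarized in the abstract (a generic model trained on pooled data from many users, then adapted to a new user with little data via transfer learning and data augmentation), the example below assumes a Keras 1D-CNN over accelerometer windows. The framework, architecture, window shape, and the Gaussian-jitter augmentation are all illustrative assumptions, not the authors' implementation.

import numpy as np
from tensorflow.keras import layers, models

# Hypothetical data shapes (not taken from the paper).
WINDOW_LEN, N_CHANNELS, N_CLASSES = 128, 3, 6  # e.g. tri-axial accelerometer windows

def build_general_model():
    """A small 1D-CNN intended to be trained on data pooled from many users."""
    return models.Sequential([
        layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
        layers.Conv1D(32, 5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Conv1D(64, 5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu", name="feature_layer"),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])

def augment_jitter(x, y, copies=5, sigma=0.05, rng=None):
    """Illustrative data augmentation: add Gaussian noise to each window."""
    rng = np.random.default_rng(0) if rng is None else rng
    xs = [x] + [x + rng.normal(0.0, sigma, x.shape) for _ in range(copies)]
    return np.concatenate(xs), np.tile(y, copies + 1)

def adapt_to_user(general_model, x_user, y_user, epochs=20):
    """Freeze the convolutional feature extractor and fine-tune the head
    on the (augmented) data of a single target user."""
    user_model = models.clone_model(general_model)
    user_model.set_weights(general_model.get_weights())
    for layer in user_model.layers[:-2]:  # keep early layers as generic features
        layer.trainable = False
    x_aug, y_aug = augment_jitter(x_user, y_user)
    user_model.compile(optimizer="adam",
                       loss="sparse_categorical_crossentropy",
                       metrics=["accuracy"])
    user_model.fit(x_aug, y_aug, epochs=epochs, batch_size=32, verbose=0)
    return user_model

# Hypothetical usage: `general` has been pre-trained on many users' pooled data,
# and (x_u, y_u) is the small labelled set from one new user:
#   user_model = adapt_to_user(general, x_u, y_u)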
Language
English
Author(s)
Affiliation
  • SINTEF Digital / Software and Service Innovation
  • Kristiania University College
  • Simula Metropolitan Center for Digital Engineering
  • University of Oslo
  • University of Oslo
Year
2019
Published in
User Modeling and User-Adapted Interaction
ISSN
0924-1868
Page(s)
1 - 29