Deep Learning–Based Multimodal Data Fusion: Case Study in Food Intake Episodes Detection Using Wearable Sensors
BACKGROUND: Multimodal wearable technologies have brought forward wide possibilities in human activity recognition, and more specifically personalized monitoring of eating habits. The emerging challenge now is the selection of most discriminative information from high-dimensional data collected from...
Main Authors: | Bahador, Nooshin; Ferreira, Denzil; Tamminen, Satu; Kortelainen, Jukka |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | JMIR Publications, 2021 |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7878112/ https://www.ncbi.nlm.nih.gov/pubmed/33507156 http://dx.doi.org/10.2196/21926 |
Similar Items
- Predicting Emotion with Biosignals: A Comparison of Classification and Regression Models for Estimating Valence and Arousal Level Using Wearable Sensors
  by: Siirtola, Pekka, et al.
  Published: (2023)
- Sensor Data Acquisition and Multimodal Sensor Fusion for Human Activity Recognition Using Deep Learning
  by: Chung, Seungeun, et al.
  Published: (2019)
- Feature Fusion of a Deep-Learning Algorithm into Wearable Sensor Devices for Human Activity Recognition
  by: Yen, Chih-Ta, et al.
  Published: (2021)
- Sensor Type, Axis, and Position-Based Fusion and Feature Selection for Multimodal Human Daily Activity Recognition in Wearable Body Sensor Networks
  by: Badawi, Abeer A., et al.
  Published: (2020)
- Sensor-Fusion for Smartphone Location Tracking Using Hybrid Multimodal Deep Neural Networks
  by: Wei, Xijia, et al.
  Published: (2021)