Clothing Insulation Rate and Metabolic Rate Estimation for Individual Thermal Comfort Assessment in Real Life


Bibliographic Details
Main Authors: Liu, Jinsong; Foged, Isak Worre; Moeslund, Thomas B.
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8779511/
https://www.ncbi.nlm.nih.gov/pubmed/35062580
http://dx.doi.org/10.3390/s22020619
Description
Summary: Satisfactory indoor thermal environments can improve the working efficiency of office staff. To build such satisfactory indoor microclimates, individual thermal comfort assessment is important, for which the personal clothing insulation rate (I_cl) and metabolic rate (M) need to be estimated dynamically. Therefore, this paper proposes a vision-based method. Specifically, a human tracking-by-detection framework is implemented to acquire each person's clothing status (short-sleeved, long-sleeved), key posture (sitting, standing), and bounding-box information simultaneously. The clothing status, together with a key body points detector, locates the person's skin region and clothes region, allowing the measurement of skin temperature (T_skin) and clothes temperature (T_cloth) and thus the calculation of I_cl from T_skin and T_cloth. The key posture and the change of the bounding box over time categorize the person's activity intensity into a corresponding level, from which the M value is estimated. Moreover, we have collected a multi-person thermal dataset to evaluate the method. The tracking-by-detection framework achieves a mAP(50) (Mean Average Precision) of 89.1% and a MOTA (Multiple Object Tracking Accuracy) of 99.5%. The I_cl estimation module achieves an accuracy of 96.2% in locating skin and clothes, and the M estimation module obtains a classification rate of 95.6% in categorizing activity level. These results demonstrate the usefulness of the proposed method in multi-person scenarios in real-life applications.
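
The abstract outlines a per-person pipeline: detections yield clothing status, posture, and bounding-box motion, from which activity level and M are derived, while thermal measurements of T_skin and T_cloth feed the I_cl estimate. The following is a minimal sketch of that structure, not the authors' implementation: the class names, the met values per activity level, the motion threshold, and the linear I_cl relation are all illustrative assumptions.

    # Hypothetical sketch of the per-person estimation pipeline described in the
    # abstract. Names, thresholds, met values, and the I_cl relation are
    # illustrative assumptions, not the paper's model.
    from dataclasses import dataclass

    # Representative metabolic rates (met) for coarse activity levels;
    # the figures follow common ASHRAE 55 reference values, not the paper's calibration.
    MET_BY_ACTIVITY = {
        "sitting": 1.0,   # seated, quiet
        "standing": 1.2,  # standing, relaxed
        "moving": 1.7,    # light walking / moving around the office
    }

    @dataclass
    class PersonObservation:
        clothing_status: str      # "short-sleeved" or "long-sleeved"
        key_posture: str          # "sitting" or "standing"
        bbox_displacement: float  # bounding-box change across frames (pixels)
        t_skin: float             # skin temperature from the thermal image (deg C)
        t_cloth: float            # clothes temperature from the thermal image (deg C)

    def estimate_activity_level(obs: PersonObservation, move_thresh: float = 30.0) -> str:
        """Map posture plus bounding-box motion to a coarse activity level."""
        if obs.bbox_displacement > move_thresh:
            return "moving"
        return obs.key_posture  # "sitting" or "standing"

    def estimate_metabolic_rate(obs: PersonObservation) -> float:
        """Metabolic rate M (met) looked up from the coarse activity level."""
        return MET_BY_ACTIVITY[estimate_activity_level(obs)]

    def estimate_clothing_insulation(obs: PersonObservation) -> float:
        """Clothing insulation I_cl (clo) from T_skin and T_cloth.

        Placeholder linear relation: a larger skin-to-clothes temperature drop
        is taken to indicate more insulating clothing. The coefficient is made
        up for illustration; the paper derives I_cl from its own thermal model.
        """
        return max(0.0, 0.1 * (obs.t_skin - obs.t_cloth))

    if __name__ == "__main__":
        person = PersonObservation("long-sleeved", "sitting", 5.0, 33.5, 28.0)
        print(estimate_metabolic_rate(person))       # 1.0 met
        print(estimate_clothing_insulation(person))  # 0.55 clo (illustrative)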