Dynamic Data Streams for Time-Critical IoT Systems in Energy-Aware IoT Devices Using Reinforcement Learning
Main Authors:
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects:
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8949606/
https://www.ncbi.nlm.nih.gov/pubmed/35336544
http://dx.doi.org/10.3390/s22062375
Summary: Thousands of energy-aware sensors have been placed for monitoring in a variety of scenarios, such as manufacturing, control systems, disaster management, flood control and so on, requiring time-critical, energy-efficient solutions to extend their lifetime. This paper proposes reinforcement learning (RL)-based dynamic data streams for time-critical IoT systems in energy-aware IoT devices. The designed solution employs the Q-Learning algorithm. The proposed mechanism can adjust the data transport rate based on the amount of available renewable energy, ensuring reliable data collection while also taking the sensor battery lifetime into account. The solution was evaluated using historical data on solar radiation levels, which shows that it can increase the amount of transmitted data by up to 23% while ensuring the continuous operation of the device.
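
The summary describes a Q-Learning agent that adapts a sensor's transmission rate to the harvested solar energy and the remaining battery charge. The following Python sketch illustrates that general idea; the state discretization, action set, reward shaping, and environment dynamics are simplifying assumptions made here for illustration and are not taken from the paper.

```python
# Minimal, illustrative Q-Learning sketch for adapting an IoT sensor's
# transmission rate to harvested solar energy and battery level.
# States, actions, rewards, and dynamics are assumed for illustration.
import random

BATTERY_LEVELS = 5          # discretized battery charge buckets 0..4
SOLAR_LEVELS = 5            # discretized solar radiation buckets 0..4
ACTIONS = [1, 2, 4, 8]      # assumed candidate transmission rates (packets/slot)

ALPHA = 0.1                 # learning rate
GAMMA = 0.9                 # discount factor
EPSILON = 0.1               # exploration probability

# Q-table over (battery, solar) states and transmission-rate actions
Q = {((b, s), a): 0.0
     for b in range(BATTERY_LEVELS)
     for s in range(SOLAR_LEVELS)
     for a in ACTIONS}


def choose_action(state):
    """Epsilon-greedy action selection."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])


def step(state, action):
    """Toy environment: higher rates send more data but drain the battery;
    solar input recharges it. Returns (next_state, reward)."""
    battery, solar = state
    harvested = solar                          # assumed recharge ~ solar level
    consumed = action // 2 + 1                 # assumed drain grows with rate
    next_battery = max(0, min(BATTERY_LEVELS - 1, battery + harvested - consumed))
    next_solar = random.randrange(SOLAR_LEVELS)  # solar level varies over time

    if next_battery == 0:
        reward = -10.0                         # penalty: device would shut down
    else:
        reward = float(action)                 # reward ~ transmitted data
    return (next_battery, next_solar), reward


def train(episodes=500, slots_per_episode=100):
    for _ in range(episodes):
        state = (BATTERY_LEVELS - 1, random.randrange(SOLAR_LEVELS))
        for _ in range(slots_per_episode):
            action = choose_action(state)
            next_state, reward = step(state, action)
            best_next = max(Q[(next_state, a)] for a in ACTIONS)
            # Standard Q-Learning update rule
            Q[(state, action)] += ALPHA * (reward + GAMMA * best_next
                                           - Q[(state, action)])
            state = next_state


if __name__ == "__main__":
    train()
    # Inspect the learned transmission rate for a mid-battery, high-solar state
    print(max(ACTIONS, key=lambda a: Q[((2, 4), a)]))
```

Under these assumptions, the learned policy tends to pick higher transmission rates when solar input and battery charge are high and to throttle back when the battery approaches depletion, which mirrors the trade-off between transmitted data volume and device lifetime described in the summary.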