An Analysis of Double Q-learning Based Energy Management Strategies for TEG-powered IoT Devices
Description
The study presents a self-learning controller for managing energy in an Internet-of-Things (IoT) device powered by a thermoelectric generator (TEG). The controller is based on double Q-learning (DQL); the hardware comprises a TEG energy-harvesting subsystem with a DC/DC converter, a load module with a microcontroller, and a LoRaWAN communications interface. The controller adaptively adjusts the device's measurement and transmission periods, and its reward policy evaluates the level of charge available to the device. Several learning parameters are applied and evaluated, with the learning rate reduced over time. In an experimental simulation of several controller configurations driven by four years of historical soil-temperature data, the DQL controller demonstrated correct operation, a low learning rate, and high cumulative rewards. The best energy-management controller operated with a completed-cycle to missed-cycle ratio of 98.5 %. The novelty of the presented approach is discussed in relation to state-of-the-art methods in terms of adaptive ability, the learning process, and practical applications of the device.
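The description does not include an implementation, but the update rule it refers to is standard double Q-learning. The sketch below illustrates that update with the two features the abstract names: a reward derived from the available charge level and a learning rate that decays over time. The state discretisation, action set (candidate measurement/transmission periods), and all constants are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a double Q-learning energy-management update.
# States, actions, reward, and constants are assumptions for illustration.
import random

N_STATES = 10              # assumed: discretised battery state-of-charge bins
ACTIONS = [60, 300, 900]   # assumed: candidate measurement/transmission periods (s)
GAMMA = 0.9                # discount factor (assumed value)

Q_A = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]
Q_B = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]

def reward(soc_bin: int) -> float:
    """Assumed reward policy: more stored charge yields a higher reward."""
    return soc_bin / (N_STATES - 1)

def select_action(state: int, epsilon: float) -> int:
    """Epsilon-greedy over the sum of both Q-tables."""
    if random.random() < epsilon:
        return random.randrange(len(ACTIONS))
    sums = [Q_A[state][a] + Q_B[state][a] for a in range(len(ACTIONS))]
    return sums.index(max(sums))

def update(state: int, action: int, r: float, next_state: int, alpha: float) -> None:
    """Double Q-learning: one table selects the argmax, the other evaluates it."""
    if random.random() < 0.5:
        best = Q_A[next_state].index(max(Q_A[next_state]))
        Q_A[state][action] += alpha * (r + GAMMA * Q_B[next_state][best] - Q_A[state][action])
    else:
        best = Q_B[next_state].index(max(Q_B[next_state]))
        Q_B[state][action] += alpha * (r + GAMMA * Q_A[next_state][best] - Q_B[state][action])

# Learning rate decayed over time, as the description mentions.
alpha0, decay = 0.5, 1e-3
for step in range(10_000):
    alpha = alpha0 / (1.0 + decay * step)
    s = random.randrange(N_STATES)       # stand-in for a measured state-of-charge bin
    a = select_action(s, epsilon=0.1)
    s_next = random.randrange(N_STATES)  # stand-in for the harvested-energy dynamics
    update(s, a, reward(s_next), s_next, alpha)
```

In an actual deployment, `s_next` would follow from the TEG harvesting dynamics and the energy consumed by the chosen measurement/transmission period, rather than being sampled at random as in this toy loop.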
Files
An_Analysis_of_Double_Q-learning_Based_Energy_Management_Strategies_for_TEG-powered_IoT_Devices.pdf (3.2 MB, md5:0c9037b3490ef4f3032789745f8579ee)