Autonomous Power Management With Double-Q Reinforcement Learning Method
- 18 November 2019
- journal article
- research article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Industrial Informatics
- Vol. 16 (3), 1938-1946
- https://doi.org/10.1109/tii.2019.2953932
Abstract
Energy efficiency and autonomous power management are extremely important for mobile edge computing. Reducing the energy consumption of multiple applications running concurrently on mobile devices while maintaining performance is challenging due to the limited capacity of the embedded battery. To extend battery life, dynamic voltage and frequency scaling (DVFS) has been widely used in mobile devices to minimize energy consumption. However, most conventional DVFS techniques scale the operating frequency based on static policies, and are therefore difficult to adapt to systems with varying conditions. To improve adaptivity, we propose a Double-Q power management approach that scales the operating frequency based on learning. The Double-Q method stores two Q-tables and two corresponding update functions. At each decision point, one of the Q-tables is randomly chosen and updated, while the other is used to evaluate the selected action. This mechanism reduces the overestimation of Q-values and consequently improves the accuracy of frequency predictions. To evaluate the effectiveness of the proposed approach, a Double-Q governor is implemented in the Linux kernel. Our approach is computationally lightweight, and experimental results indicate that it achieves 5%-18% total energy savings compared to the ondemand and conservative governors as well as a Q-learning-based method.
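The Double-Q update described in the abstract can be sketched as follows. This is a minimal illustration of the general Double Q-learning update rule (two tables, one randomly chosen for the update while the other evaluates the greedy next action), not the paper's implementation; the learning rate, discount factor, frequency levels, and state/action encoding here are assumptions for the sketch.

```python
import random
from collections import defaultdict

ALPHA, GAMMA = 0.1, 0.9          # learning rate and discount (assumed values)
FREQS = [0.6, 1.0, 1.4, 1.8]     # hypothetical candidate CPU frequencies (GHz)

def double_q_update(qa, qb, s, a, reward, s_next):
    """One Double-Q update: randomly pick which table to update,
    and use the *other* table to value the greedy next action."""
    if random.random() < 0.5:
        update, evaluate = qa, qb
    else:
        update, evaluate = qb, qa
    # greedy next action according to the table being updated ...
    best = max(range(len(FREQS)), key=lambda i: update[(s_next, i)])
    # ... but valued by the other table, which damps the overestimation
    # bias of standard Q-learning
    target = reward + GAMMA * evaluate[(s_next, best)]
    update[(s, a)] += ALPHA * (target - update[(s, a)])

# Example: one update from a zero-initialized pair of Q-tables
qa = defaultdict(float)
qb = defaultdict(float)
double_q_update(qa, qb, "s0", 1, 1.0, "s1")
```

Because only one table moves per step while the other supplies the value estimate, the two tables' errors are less correlated, which is the mechanism the abstract credits for more accurate frequency predictions.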
Funding Information
- Natural Sciences and Engineering Research Council of Canada