
Energy-Efficient Scheduling for Real-Time Systems Based on Deep Q-Learning Model


Indexed by: Journal Papers

Date of Publication: 2019-01-01

Journal: IEEE Transactions on Sustainable Computing

Included Journals: EI

Volume: 4

Issue: 1

Page Number: 132-141

Abstract: Energy saving is a critical and challenging issue for real-time systems in embedded devices because of their limited energy supply. To reduce energy consumption, a hybrid dynamic voltage and frequency scaling (DVFS) scheduling scheme based on Q-learning (QL-HDS) was previously proposed, combining several energy-efficient DVFS techniques. However, QL-HDS discretizes the system state parameters with a fixed step size, which distinguishes system states poorly. More importantly, it is difficult for QL-HDS to learn a system across diverse task sets with only a Q-table and limited training data. In this paper, an energy-efficient scheduling scheme based on a deep Q-learning model (DQL-EES) is proposed for periodic tasks in real-time systems. Specifically, the deep Q-learning model combines a stacked auto-encoder with a Q-learning model: the stacked auto-encoder replaces the Q-function, learning the Q-value of each DVFS technique for any system state. Furthermore, a training strategy based on experience replay is devised to learn the parameters of the deep Q-learning model. Finally, the performance of the proposed scheme is evaluated against QL-HDS on different simulated task sets. Results demonstrate that the proposed algorithm saves 4.2% more energy on average than QL-HDS. © 2016 IEEE.
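The core idea sketched in the abstract — a learned function approximator replacing the Q-table, mapping a continuous system state to a Q-value per DVFS level, trained from an experience-replay buffer — can be illustrated with a minimal sketch. Everything here is an illustrative assumption: the state features, the reward, the number of frequency levels, and the linear approximator (the paper itself uses a stacked auto-encoder, not a linear model).

```python
# Hypothetical sketch of the DQL-EES idea: a function approximator (here a
# simple linear model, NOT the paper's stacked auto-encoder) estimates
# Q(state, DVFS level), and is trained by sampling from an experience-replay
# buffer instead of updating a discretized Q-table.
import random
import numpy as np

N_LEVELS = 4          # candidate DVFS frequency levels (assumed)
STATE_DIM = 3         # e.g. utilization, deadline slack, queue length (assumed)
GAMMA, ALPHA, EPS = 0.9, 0.01, 0.1

# Linear Q-approximator: one weight row per DVFS level.
W = np.zeros((N_LEVELS, STATE_DIM))

def q_values(state):
    """Q(s, a) for every DVFS level, given a continuous state vector."""
    return W @ state

def select_level(state):
    """Epsilon-greedy choice of a DVFS frequency level."""
    if random.random() < EPS:
        return random.randrange(N_LEVELS)
    return int(np.argmax(q_values(state)))

replay = []           # experience-replay buffer of (s, a, r, s') tuples

def store(s, a, r, s2):
    replay.append((s, a, r, s2))
    if len(replay) > 1000:       # bounded buffer, drop the oldest experience
        replay.pop(0)

def train_step(batch_size=8):
    """One TD update over a random minibatch drawn from the replay buffer."""
    if len(replay) < batch_size:
        return
    for s, a, r, s2 in random.sample(replay, batch_size):
        target = r + GAMMA * np.max(q_values(s2))   # TD target
        td_err = target - q_values(s)[a]
        W[a] += ALPHA * td_err * s                  # gradient step on row a
```

Because the approximator generalizes across nearby states, no step-size discretization of the state parameters is needed, which is the limitation of QL-HDS the abstract highlights.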
