Energy-Efficiency Oriented Traffic Offloading in Wireless Networks: A Brief Survey and a Learning Approach for Heterogeneous Cellular Networks
Top Cited Papers
- 16 January 2015
- Research article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Journal on Selected Areas in Communications
- Vol. 33 (4), 627-640
- https://doi.org/10.1109/jsac.2015.2393496
Abstract
This paper first provides a brief survey of existing traffic offloading techniques in wireless networks. Then, as a case study, we put forward an online reinforcement learning framework for the problem of traffic offloading in a stochastic heterogeneous cellular network (HCN), where the time-varying traffic in the network can be offloaded to nearby small cells. Our aim is to minimize the total discounted energy consumption of the HCN while maintaining the quality of service (QoS) experienced by mobile users. For each cell (i.e., a macro cell or a small cell), the energy consumption is determined by its system load, which is coupled with the system loads in other cells because all cells share a common frequency band. We model the energy-aware traffic offloading problem in such HCNs as a discrete-time Markov decision process (DTMDP). Based on the traffic observations and the traffic offloading operations, the network controller gradually optimizes the traffic offloading strategy with no prior knowledge of the DTMDP statistics. Such a model-free learning framework is important, particularly when the state space is huge. To mitigate the curse of dimensionality, we design a centralized Q-learning algorithm with a compact state representation, named QC-learning. Moreover, a decentralized version of QC-learning is developed based on the fact that the macro base stations (BSs) can independently manage the operations of local small-cell BSs by making use of the global network state information obtained from the network controller. Simulations are conducted to show the effectiveness of the derived centralized and decentralized QC-learning algorithms in balancing the tradeoff between energy saving and QoS satisfaction.
Funding Information
- National Basic Research Program of China
- Key Technologies R&D Program of China (2012BAH75F01)