A Markov Decision Process-based service migration procedure for follow me cloud

Abstract
The Follow-Me Cloud (FMC) concept enables service mobility across federated data centers (DCs). As the mobile user moves, the service hosted in a given DC is migrated whenever an optimal DC is detected. The precise optimality criterion is defined by operator policy and is typically derived from geographical proximity or DC load. Service migration can be an expensive operation given the cost it incurs in signaling messages and data transferred between DCs. The migration decision therefore embodies a tradeoff between cost and user-perceived quality. In this paper, we address this tradeoff by modeling the service migration procedure as a Markov Decision Process (MDP). The aim is to formulate a decision policy that determines whether or not to migrate a service when the concerned User Equipment (UE) is at a given distance from the source DC. We derive the decision policies numerically and compare the proposed approach against a baseline counterpart.
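
The distance-based migrate-or-stay decision described above can be illustrated with a small MDP solved by value iteration. The sketch below is not the paper's model: the state space size, transition probabilities, migration and QoS cost values, and discount factor are all assumed here purely for illustration, and the function names are hypothetical.

# Illustrative sketch only: a toy distance-based MDP for the migrate/stay decision.
# All numeric parameters below are assumptions, not values from the paper.
import numpy as np

N = 10            # assumed: distance states 0..N-1 between UE and the DC hosting its service
GAMMA = 0.95      # assumed discount factor
C_MIGRATE = 5.0   # assumed one-time migration cost (signaling + inter-DC data transfer)
C_QOS = 1.0       # assumed per-step quality cost proportional to UE-to-DC distance
P_AWAY = 0.6      # assumed probability the UE drifts one hop farther from the source DC

def step_cost(d, migrate):
    """Immediate cost: migration cost when migrating, distance-dependent QoS cost otherwise."""
    return C_MIGRATE if migrate else C_QOS * d

def next_state_dist(d, migrate):
    """Transition distribution over the next distance state."""
    probs = np.zeros(N)
    if migrate:
        probs[0] = 1.0                            # service now hosted in the DC closest to the UE
    else:
        probs[min(d + 1, N - 1)] += P_AWAY        # UE moves away from the source DC
        probs[max(d - 1, 0)] += 1.0 - P_AWAY      # UE moves back toward it
    return probs

def value_iteration(tol=1e-6):
    """Compute the optimal cost-to-go V and the stationary migrate/stay policy."""
    V = np.zeros(N)
    while True:
        Q = np.empty((N, 2))                      # action 0 = stay, action 1 = migrate
        for d in range(N):
            for a in (0, 1):
                Q[d, a] = step_cost(d, a) + GAMMA * next_state_dist(d, a) @ V
        V_new = Q.min(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmin(axis=1)        # policy[d] == 1 means "migrate at distance d"
        V = V_new

if __name__ == "__main__":
    V, policy = value_iteration()
    print("migrate at distances:", [d for d in range(N) if policy[d] == 1])

Under these assumed parameters the resulting policy is a distance threshold: the service stays put while the UE is close to the source DC and is migrated once the distance (and hence the accumulated QoS cost) outweighs the one-time migration cost, which is the kind of tradeoff the paper's MDP formulation captures.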
