Assessing Visual Quality of Omnidirectional Videos

Abstract
In contrast to traditional videos, omnidirectional videos allow viewers to freely change their viewing direction on a sphere, typically through head-mounted displays, providing an interactive and immersive experience. Unfortunately, to the best of our knowledge, there are only a few visual quality assessment (VQA) methods, either subjective or objective, for omnidirectional video coding. This paper proposes both subjective and objective methods for assessing the quality loss incurred in encoding an omnidirectional video. Specifically, we first present a new database that includes viewing-direction data from several subjects watching omnidirectional video sequences. From this database, we find high consistency in viewing directions across subjects: the viewing directions are normally distributed around the center of the front regions, but they occasionally fall into other regions, depending on the video content. Given this finding, we present a subjective VQA method that measures the difference mean opinion score (DMOS) of the whole omnidirectional video and of its regions, in terms of overall DMOS and vectorized DMOS, respectively. Moreover, we propose two objective VQA methods for encoded omnidirectional video, in light of the human perception characteristics of omnidirectional video. The first weighs the distortion of each pixel by its distance to the center of the front regions, reflecting the human preference for the front of a panorama. The second predicts viewing directions from the video content and then uses the predicted directions to weight the distortion of each pixel. Finally, our experimental results verify that both the proposed subjective and objective methods advance the state of the art in VQA for omnidirectional videos.
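The first objective method described above weights per-pixel distortion by distance to the front-region center. A minimal sketch of this idea, assuming an equirectangular frame, a Gaussian falloff, and the parameter name `sigma` (all of which are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def front_center_weights(height, width, sigma=0.2):
    """Hypothetical weight map: pixels nearer the front-region center
    (the middle of an equirectangular frame) receive larger weights.
    sigma controls the Gaussian spread in normalized coordinates."""
    ys = (np.arange(height) / (height - 1)) - 0.5   # normalized latitude, -0.5..0.5
    xs = (np.arange(width) / (width - 1)) - 0.5     # normalized longitude, -0.5..0.5
    yy, xx = np.meshgrid(ys, xs, indexing="ij")
    d2 = xx ** 2 + yy ** 2                          # squared distance to frame center
    w = np.exp(-d2 / (2.0 * sigma ** 2))            # Gaussian falloff with distance
    return w / w.sum()                              # normalize so weights sum to 1

def weighted_mse(reference, distorted, weights):
    """Pixel-wise squared error, weighted by the map above."""
    err = (reference.astype(np.float64) - distorted.astype(np.float64)) ** 2
    return float((weights * err).sum())
```

In this sketch the weighted MSE could then be converted to a weighted PSNR in the usual way; the second proposed method would replace `front_center_weights` with weights derived from content-predicted viewing directions.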

Funding Information
  • National Natural Science Foundation of China (61876013, 61573037, 61471022)
  • Fok Ying Tung Education Foundation (151061)
