Abstract:
In this work, we propose and investigate a user-centric framework for the delivery of omnidirectional video (ODV) on VR systems that exploits visual attention (saliency) models in the bitrate allocation module. To this end, we formulate a new bitrate allocation algorithm that takes the saliency map and the nonlinear sphere-to-plane mapping into account for each ODV, and we solve the formulated problem using integer linear programming. For the visual attention models, we use both image- and video-based saliency prediction results. Moreover, we explore two types of attention modeling approaches: (i) salient object detection with transfer learning using pre-trained networks, and (ii) saliency prediction with supervised networks trained on an eye-fixation dataset. Experimental evaluations of the saliency integration of these models are discussed, with interesting findings on the transfer learning and supervised saliency approaches.

Keywords: Video streaming; Attention-based bitrate allocation; Saliency maps with transfer learning and supervision

Author Affiliation:
(1) V-SENSE, Trinity College Dublin, Dublin, Ireland
(2) Artificial Intelligence Research Center, National Institute of Advanced Industrial Science and Technology, Tokyo, Japan
(a) c.ozcinar@tcd.ie

Article History:
Received Date: 05/02/2020
Accepted Date: 08/25/2020
Registration Date: 08/25/2020
Online Date: 09/09/2020
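To make the kind of formulation summarized in the abstract concrete, the sketch below casts saliency-weighted bitrate allocation as a 0/1 integer linear program: each tile of the ODV picks exactly one representation from a bitrate ladder, maximizing saliency-weighted quality under a total bitrate budget. This is a minimal illustration, not the authors' exact formulation; the tile count, bitrate ladder, quality scores, saliency weights, and budget are hypothetical placeholders, and SciPy's generic MILP solver stands in for whatever solver the paper uses.

```python
# Minimal sketch: saliency-weighted tile bitrate selection as a 0/1 ILP.
# All numbers below are illustrative, not values from the paper.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

n_tiles, n_rates = 4, 3                     # hypothetical tiling and ladder size
bitrates = np.array([1.0, 2.5, 5.0])        # Mbps per representation (placeholder)
quality = np.array([0.6, 0.8, 1.0])         # relative quality per representation
saliency = np.array([0.1, 0.4, 0.3, 0.2])   # per-tile attention weights
budget = 10.0                               # total bitrate budget in Mbps

# Decision variable x[n, k] = 1 if tile n streams representation k.
# Objective: maximize sum_{n,k} saliency[n] * quality[k] * x[n, k]
# (milp minimizes, so the weights are negated).
c = -(saliency[:, None] * quality[None, :]).ravel()

# Each tile must select exactly one representation.
one_per_tile = np.kron(np.eye(n_tiles), np.ones(n_rates))
# The summed bitrate of the selected representations must respect the budget.
budget_row = np.tile(bitrates, n_tiles)[None, :]

constraints = [
    LinearConstraint(one_per_tile, 1, 1),
    LinearConstraint(budget_row, 0, budget),
]

res = milp(c, constraints=constraints,
           integrality=np.ones(n_tiles * n_rates),
           bounds=Bounds(0, 1))
choice = res.x.reshape(n_tiles, n_rates).argmax(axis=1)
print("selected representation per tile:", choice)
```

In this toy instance the more salient tiles receive the higher-bitrate representations while the overall allocation stays within the budget, which is the qualitative behavior the attention-driven allocation described in the abstract is designed to achieve.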