Block Background Subtraction Method Based on ViBe
Lianfen Huang 1,a, Qingyue Chen 1,b, Jinfeng Lin 1,c and Hezhi Lin 1,d,*
1 College of Information Science and Technology, Xiamen University, Xiamen 361005, China
* Corresponding Author
a lfhuang@xmu.edu.cn, b 339740923@qq.com, c 494972599@qq.com, d linhezhi@xmu.edu.cn
Keywords: Background subtraction; ViBe; Block processing; Background model
Abstract. The key to background subtraction, which is widely used in moving object detection, is building and updating the background model. This paper presents a block background subtraction method based on ViBe that exploits the spatial correlation and temporal continuity of the video sequence. First, the background model of the video sequence is initialized. The model is then updated through block processing. Finally, moving objects are extracted from the difference between the current frame and the background model.
Introduction
With the development of society and the advance of science and technology, video surveillance occupies an increasingly important position in people's daily lives. Moving object detection is a key step in extracting objects of interest in video surveillance.
At present, there are three main types of moving object detection algorithms: the optical flow method, the frame-difference method, and the background subtraction method.
The optical flow method is an important tool for analyzing moving objects in an image sequence. However, the computed optical flow field is not very reliable or accurate in the presence of noise, multiple light sources, transparency, and similar conditions. The computation of the optical flow field is also quite complex and has poor noise immunity. Its processing speed is slow, so it cannot meet real-time requirements without specialized hardware.
The frame-difference method subtracts two adjacent frames of the image sequence and extracts information about the moving object, such as its location and shape, from the difference image. This method adapts well to environmental changes, especially changes in illumination. However, it cannot detect the moving object completely: it recovers only part of the object, because regions of the object with similar pixel texture and grayscale produce little difference between frames.
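As a hedged illustration of this idea (a minimal sketch, not the authors' implementation), the following code differences two grayscale frames, stored as nested lists, and thresholds the result; the threshold value T is an arbitrary assumption:

```python
# Minimal frame-difference sketch on grayscale frames stored as 2-D lists.
# The threshold T is an illustrative choice, not a value from the paper.

def frame_difference(prev_frame, curr_frame, T=25):
    """Return a binary mask: 1 where the two frames differ by more than T."""
    return [
        [1 if abs(c - p) > T else 0 for p, c in zip(prev_row, curr_row)]
        for prev_row, curr_row in zip(prev_frame, curr_frame)
    ]

# Example: a small bright "object" appears in the second frame.
prev_frame = [[10, 10, 10],
              [10, 10, 10]]
curr_frame = [[10, 200, 10],
              [10, 200, 10]]

mask = frame_difference(prev_frame, curr_frame)
# mask marks only the pixels that changed between the two frames
```

Note how pixels inside a uniform object region would yield near-zero differences, which is exactly the incomplete-detection problem described above.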
The background subtraction method provides more reliable and comprehensive information about the moving objects, but it is very sensitive to changes in the scene, such as illumination, which makes the process more complex. The main idea of this method is to estimate a background model from statistical data and to subtract the background model from the current frame. If the difference in pixel value between the two images is greater than a certain threshold, the pixel is assigned to the moving object; otherwise, it is assigned to the background. The result of this thresholding operation directly yields information such as the location and shape of the moving object. One downside of this method is its sensitivity to changes in lighting conditions, such as illumination and weather. The shadow of the moving object also reduces the accuracy of detection and motion tracking.
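The thresholding step described above can be sketched as follows. The running-average background update and its learning rate alpha are illustrative assumptions (one common choice), not the method proposed in this paper:

```python
# Background subtraction sketch: compare each pixel of the current frame
# with a background model and threshold the absolute difference.
# alpha (learning rate) and T (threshold) are illustrative values.

def subtract_background(frame, background, T=30):
    """Classify each pixel: 1 = moving object, 0 = background."""
    return [
        [1 if abs(f - b) > T else 0 for f, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

def update_background(frame, background, alpha=0.05):
    """Simple running-average background update (one common choice)."""
    return [
        [(1 - alpha) * b + alpha * f for f, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

background = [[100, 100], [100, 100]]
frame      = [[100, 180], [100, 100]]

mask = subtract_background(frame, background)      # only the bright pixel is flagged
background = update_background(frame, background)  # model drifts slowly toward the frame
```

The slow update illustrates why such models struggle with sudden illumination changes: the whole frame briefly exceeds the threshold until the model catches up.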
The ViBe (visual background extractor) algorithm, a visual background extraction algorithm with good robustness and efficiency, was proposed by Olivier Barnich et al. in 2009 for fast background extraction and moving object detection.
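To make the ViBe idea concrete, the following sketch classifies a single pixel against a per-pixel sample model: each pixel keeps N background samples, and a pixel is classified as background if at least a minimum number of samples lie within a radius of its current value. The parameter values shown (N=20, radius 20, 2 matches, subsampling factor 16) follow the values reported by Barnich et al.; the update step here is a simplified version of the randomized update:

```python
import random

# ViBe-style pixel classification sketch. Each pixel keeps N background
# samples; it is background if at least MIN_MATCHES samples lie within
# RADIUS of the current value. Parameter values follow those reported by
# Barnich et al.; the model update below is simplified.

N, RADIUS, MIN_MATCHES = 20, 20, 2

def classify_pixel(value, samples):
    """Return True if the pixel value matches the background model."""
    matches = sum(1 for s in samples if abs(value - s) < RADIUS)
    return matches >= MIN_MATCHES

def update_model(value, samples, subsampling=16):
    """With probability 1/subsampling, replace a random sample in place."""
    if random.randrange(subsampling) == 0:
        samples[random.randrange(len(samples))] = value

samples = [100 + random.randint(-5, 5) for _ in range(N)]  # a stable background pixel
print(classify_pixel(102, samples))  # close to the model -> background (True)
print(classify_pixel(240, samples))  # far from all samples -> foreground (False)
```

Because classification needs no probability density estimate, only sample comparisons, ViBe is both fast and memory-light, which is what makes it attractive as a base for the block-processing extension proposed here.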
This paper puts forward a block background subtraction detection algorithm based on ViBe. The algorithm makes use of the spatial correlation and temporal continuity of the video sequence. Block processing can reduce the interference due to dynamic changes of the background such as
Applied Mechanics and Materials Vols. 556-562 (2014) pp 3549-3552
© (2014) Trans Tech Publications, Switzerland
doi:10.4028/www.scientific.net/AMM.556-562.3549