VisDrone2018-SOT Toolkit for Single-Object Tracking
Introduction
This is the documentation of the VisDrone2018 competition development kit for the single-object tracking (SOT) challenge.
This code library is for research purposes only. It is modified from the visual tracking benchmark platform of Wu et al. [1].
The code has been tested on Windows 10 and macOS Sierra 10.12.6, with Matlab 2013a/2014b/2016b/2017b.
If you have any questions, please contact us (email: tju.drone.vision@gmail.com).
Citation
If you use our toolkit or dataset, please cite our paper as follows:
@article{zhuvisdrone2018,
  title={Vision Meets Drones: A Challenge},
  author={Zhu, Pengfei and Wen, Longyin and Bian, Xiao and Ling, Haibin and Hu, Qinghua},
  journal={arXiv preprint arXiv:1804.07437},
  year={2018}
}
Dataset
For the SOT competition, there are three sets of data and labels: training data, validation data,
and test-challenge data. There is no overlap between the three sets.
Number of snippets
----------------------------------------------------------------------------------------------
  Dataset                       Training             Validation          Test-Challenge
----------------------------------------------------------------------------------------------
  Single-object tracking        86 clips             11 clips            35 clips
                                69,941 frames        7,046 frames        29,367 frames
----------------------------------------------------------------------------------------------
Given an input video sequence and the initial bounding box of the target object in the first frame, the challenge requires a participating algorithm to locate the target bounding box in each subsequent video frame. The objects to be tracked are of various types, including pedestrians, cars, and animals. We manually annotated the bounding boxes of the different objects in each video frame. Annotations on the training and validation sets are publicly available.
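The task interface above (first-frame box in, one box per subsequent frame out) can be sketched with a trivial constant-position baseline. This is only an illustration of the required input/output shape, not a method from the toolkit; the function name and box tuples are hypothetical.

```python
# Minimal sketch of the SOT task interface: given the first-frame box,
# produce one (x, y, w, h) prediction per frame. This "tracker" just
# repeats the initial box (a constant-position baseline); real trackers
# update the box from image content.

def constant_box_tracker(init_box, num_frames):
    """Return one predicted box per frame, starting from init_box."""
    x, y, w, h = init_box
    return [(x, y, w, h) for _ in range(num_frames)]

# Example: a 5-frame clip with the target initialized at (100, 50, 40, 80).
predictions = constant_box_tracker((100, 50, 40, 80), 5)
```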
The link for downloading the data can be obtained by registering for the challenge at
http://www.aiskyeye.com/
Evaluation Routines
The notes for the folders:
* The tracking results will be stored in the folder './results'.
* The folder './trackers' contains the source code of the trackers (e.g., Staple).
* The folder './util' contains scripts used by the main functions.
* main functions
* main_running.m is the main function to run your tracker
 -put your source code in ./trackers/ following the layout of the Staple tracker
 -modify the dataset path in ./main_running.m
 -set the tracker name in ./util/configTrackers.m
 -the results in MAT format are saved in ./results/results_OPE/
* perfPlot.m is the main function to evaluate your tracker based on results in MAT or TXT format.
Note that the visual attributes are defined the same as those in [2].
 -modify the dataset path in ./perfPlot.m (re-evaluate the results by setting the flag "reEvalFlag = 1")
 -select a tracker named in ./util/configTrackers.m
 -select the rankingType (e.g., AUC or threshold)
-check the tracking results in ./results/results_OPE/
-the figures are saved in ./figs/overall/
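The success plots produced by perfPlot.m are based on the bounding-box overlap between predictions and ground truth, following [1]. The toolkit computes this in Matlab; the following is a language-agnostic sketch of the overlap (IoU) measure and the per-threshold success rate, with hypothetical function names.

```python
# Sketch of the overlap measure behind the success plot. Boxes are
# (x, y, w, h); "success" at threshold t is the fraction of frames whose
# intersection-over-union (IoU) with the ground truth exceeds t. The AUC
# ranking averages this success rate over thresholds in [0, 1].

def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))  # overlap width
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))  # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def success_rate(preds, gts, threshold):
    """Fraction of frames whose IoU with the ground truth exceeds threshold."""
    scores = [iou(p, g) for p, g in zip(preds, gts)]
    return sum(s > threshold for s in scores) / len(scores)
```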
* drawResultBB.m is the main function to visualize the tracking results
-modify the dataset path in ./drawResultBB.m
-select a tracker named in ./util/configTrackers.m
-check the tracking results in ./results/results_OPE/
-the visual results are saved in ./tmp/OPE/
SOT submission format
Submission of the results will consist of TXT files with one line per predicted object, or MAT files in the same format as in [1].
A TXT submission looks as follows:
<bbox_left>,<bbox_top>,<bbox_width>,<bbox_height>
Name Description
--------------------------------------------------------------------------------------------------
<bbox_left>       The x coordinate of the top-left corner of the predicted bounding box
<bbox_top>        The y coordinate of the top-left corner of the predicted bounding box
<bbox_width>      The width in pixels of the predicted bounding box
<bbox_height>     The height in pixels of the predicted bounding box
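Writing the TXT format above is a one-liner per frame. A minimal sketch, assuming the comma-separated field order given in the table; the filename "uav0001_s.txt" and the helper name are placeholders, not names prescribed by the challenge.

```python
# Minimal sketch of writing a tracking result in the TXT submission
# format: one comma-separated line per frame with the predicted box.

def write_sot_txt(path, boxes):
    """boxes: iterable of (bbox_left, bbox_top, bbox_width, bbox_height)."""
    with open(path, "w") as f:
        for left, top, width, height in boxes:
            f.write(f"{left},{top},{width},{height}\n")

# Example: a 2-frame result (placeholder filename).
write_sot_txt("uav0001_s.txt", [(100, 50, 40, 80), (102, 51, 40, 80)])
```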
A MAT submission looks as follows:
< results: {type = 'rect', res, fps, len, annoBegin = 1, startFrame = 1} >
Variable Description
---------------------------------------------------------------------------------------------------------
<type>            The representation type of the predicted bounding box.
                  It should be set to 'rect'.
<res> The tracking results in the video clip. Notably, each row includes the frame index,
the x and y coordinates of the top-left corner of the predicted bounding box,
and the width and height in pixels of the predicted bounding box.
<fps>             The running speed of the evaluated tracker, in frames per second.
<len> The length of the evaluated sequence.
<annoBegin> The start frame index for tracking. The default value is 1.
<startFrame> The start frame index of the video. The default value is 1.
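The MAT results structure above can be mirrored as a plain dictionary to check its internal consistency before saving (the toolkit itself saves this structure from Matlab). Field names follow the table above; the sample values and the checker function are illustrative only.

```python
# Sketch of the results structure from the table above, with a small
# consistency check: 'len' must match the number of rows in 'res', and
# each row carries frame index, x, y, width, height of the predicted box.

results = {
    "type": "rect",
    "res": [
        [1, 100, 50, 40, 80],  # frame 1: x, y, w, h
        [2, 102, 51, 40, 80],  # frame 2
    ],
    "fps": 25.0,
    "len": 2,
    "annoBegin": 1,   # default start frame index for tracking
    "startFrame": 1,  # default start frame index of the video
}

def check_results(r):
    """Validate the structure against the documented format."""
    assert r["type"] == "rect"
    assert r["len"] == len(r["res"])
    assert all(len(row) == 5 for row in r["res"])
    return True
```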
A sample submission of the tracker can be found on our website.
References
[1] Y. Wu, J. Lim, and M.-H. Yang, "Online Object Tracking: A Benchmark", in CVPR 2013.
[2] M. Mueller, N. Smith, B. Ghanem, "A Benchmark and Simulator for UAV Tracking", in ECCV 2016.
-----------------------------------------------------------------
Version History
1.0.2 - May 7, 2018
- Fix the bugs in the main_running and perfPlot functions.
1.0.1 - May 3, 2018
- Fix the bug in the genPerfMat function.
1.0.0 - Apr 19, 2018
- Initial release.