# CVRL FORTH HandTracker
## Description
This script uses the Model Based Vision (MBV) libraries created by the Computer Vision and Robotics Lab at ICS/FORTH. The libraries are free for academic and non-profit use under this [license](license.txt).
It implements a hand tracker pipeline described first in [Oikonomidis et al: Efficient model-based 3D tracking of hand articulations using Kinect](http://users.ics.forth.gr/~argyros/mypapers/2011_09_bmvc_kinect_hand_tracking.pdf).
The software tracks the 3D position, orientation and full articulation of a human hand from markerless visual observations. The developed method:
* estimates the full articulation of a hand (26 DoFs redundantly encoded in 27 parameters) involved in unconstrained motion
* operates on input acquired by easy-to-install and widely used/supported RGB-D cameras (e.g. Kinect, Xtion)
* does not require markers or special gloves
* performs at a rate of 30fps on modern architectures (GPU accelerated)
* does not require calibration
* does not rely on any proprietary built-in tracking technologies (Nite, OpenNI, Kinect SDK)
<a href="http://www.youtube.com/watch?feature=player_embedded&v=Fxa43qcm1C4" target="_blank"><img src="http://img.youtube.com/vi/Fxa43qcm1C4/0.jpg" alt="Single hand tracking" width="320" height="240" border="10"/></a>
## Citation
If you use any part of this work please cite the following:
Oikonomidis, Iason, Nikolaos Kyriazis, and Antonis A. Argyros. "Efficient model-based 3D tracking of hand articulations using Kinect." BMVC. Vol. 1. No. 2. 2011.
```
@inproceedings{oikonomidis2011efficient,
title={Efficient model-based 3D tracking of hand articulations using Kinect.},
author={Oikonomidis, Iason and Kyriazis, Nikolaos and Argyros, Antonis A},
booktitle={BMVC},
volume={1},
number={2},
pages={3},
year={2011}
}
```
**Notice**: The citation targets are subject to change. Please make sure to use the latest information provided.
## Hardware Requirements
System requirements:
- Hardware
- Multi-core Intel CPU
- 1 GB of RAM or more
- CUDA-enabled GPU
- 512MB GPU RAM or more
- CUDA compute capability > 1.0
- OpenGL 3.3
- Software
- OS
- 64bit Windows 8 or newer
- 64bit Ubuntu 14.04 Linux
- Environment
- **Python 2.7 64bit**
- Drivers
- [Latest CUDA driver](https://developer.nvidia.com/cuda-downloads)
- OpenNI driver
- Kinect 2 driver
## Download links
<a name="download"></a>
- [Ubuntu 3D hand tracking](http://cvrlcode.ics.forth.gr/files/mbv/v1.1/MBV_PythonAPI_Linux_1.1.zip)
- [Windows 3D hand tracking](http://cvrlcode.ics.forth.gr/files/mbv/v1.1/MBV_PythonAPI_Win_1.1.zip)
### Windows Dependencies
- [Visual C++ **64bit** Redistributable Packages for Visual Studio 2013](https://www.microsoft.com/en-us/download/details.aspx?id=40784)
- [OpenNI 1.x SDK for Windows 8 **64bit** and newer](http://cvrlcode.ics.forth.gr/web_share/OpenNI/OpenNI_SDK/OpenNI_1.x/OpenNI-Win64-1.5.7.10-Dev.zip) (install prior to sensor driver)
- [OpenNI 1.x sensor driver for Windows 8 **64bit** and newer](http://cvrlcode.ics.forth.gr/web_share/OpenNI/OpenNI_SDK/OpenNI_1.x/Sensor_Driver/Sensor-Win64-5.1.6.6-Redist.zip)
- [Kinect 2 SDK for Windows 8 **64bit** and newer](http://www.microsoft.com/en-us/download/details.aspx?id=44561)
## Installation and usage
As a first step, download the package that matches your OS from [the download section](#download). Extract the downloaded package to a location and set an environment variable named <tt>MBV_LIBS</tt> to point to this location. For example, if the package is extracted to the path <tt>c:\Users\User\Documents\FORTH\HANDTRACKER</tt> (Windows) or <tt>/home/user/FORTH/HANDTRACKER</tt> (Ubuntu), do the following from the command line:
Ubuntu:
```
export MBV_LIBS=/home/user/FORTH/HANDTRACKER
```
Windows:
```
set MBV_LIBS=c:\Users\User\Documents\FORTH\HANDTRACKER
```
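Since the tracker's Python scripts depend on <tt>MBV_LIBS</tt> being set correctly, a small startup check can save debugging time. The sketch below is illustrative (the helper name `check_mbv_libs` is not part of the package); it verifies the variable is set, points at an existing directory, and makes that directory importable:

```python
import os
import sys

def check_mbv_libs():
    """Return the MBV_LIBS path if it is set and exists, else None."""
    path = os.environ.get("MBV_LIBS")
    if path is None:
        print("MBV_LIBS is not set; export it before running the tracker.")
        return None
    if not os.path.isdir(path):
        print("MBV_LIBS points to a missing directory: %s" % path)
        return None
    # Make the extracted package importable from any working directory.
    if path not in sys.path:
        sys.path.insert(0, path)
    return path
```

Calling this at the top of a script turns a cryptic import failure into an explicit message about the missing environment variable.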
The provided package has some external dependencies, listed below. One such dependency is a working Python 2.7 environment.
**Notice:** Make sure the Python version is 2.7 64bit.
**Notice:** Binaries were built against CUDA 7.5. This might require the user to update the GPU driver to the latest version. In the absence of a supported driver, an error message of the form "*CUDA driver version is insufficient for CUDA runtime version*" is issued.
### Ubuntu
Install OpenCV, Threading Building Blocks (TBB), and the Python bindings for NumPy and OpenCV by executing the following in the command line:
```
sudo apt-get install libopencv-dev libtbb2 python-numpy python-opencv
```
If you plan to use OpenNI 1.x (required for running some of the example scripts), also execute:
```
sudo apt-get install libopenni0 libopenni-sensor-primesense0
```
Make sure that NVIDIA driver 352 or newer is installed. Use the "Additional Drivers" tool to select the correct driver version.
### Windows
OpenCV is statically linked into the provided binaries, and Threading Building Blocks is bundled with the downloadable package. The rest of the dependencies should be downloaded from the download section. For Python support it is suggested to use [Anaconda](https://www.continuum.io/downloads). After installing Anaconda, installing numpy is as simple as executing the following in the command line:
```
conda install numpy
```
**Notice:** Binaries were built against numpy 1.10.1. If a numpy-related error (import or other) is issued, update numpy to at least this version. In Anaconda it suffices to execute the following from the command line:
```
conda update numpy
```
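To check the installed numpy version against the 1.10.1 baseline before updating, a simple comparison of dotted version strings works. This is a minimal sketch (the helper `version_at_least` is illustrative and assumes plain numeric version components):

```python
def version_at_least(installed, required=(1, 10, 1)):
    """True if the dotted version string `installed` is >= `required`."""
    parts = tuple(int(p) for p in installed.split(".")[:3])
    # Pad short versions like "1.10" so the tuple comparison is fair.
    parts += (0,) * (len(required) - len(parts))
    return parts >= required

# Typical use, after `import numpy`:
#   version_at_least(numpy.__version__)
```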
### Usage
Make sure the current working directory is the root of HandTracker and that <tt>MBV_LIBS</tt> is set.
Run the `runme.sh` (Ubuntu) or `runme.bat` (Windows) script to test the hand tracker. Press `s` to stop/start 3D hand tracking.
**Notice:** Be aware that the first execution will take a significant amount of time, CPU, and memory. This is due to the intermediate CUDA code being compiled. It only happens once, as the compilation result is cached. On Ubuntu the cache size limit might be too restrictive, in which case caching fails and recompilation occurs at every execution. To remedy this, increase the cache size as follows (command line):
```
export CUDA_CACHE_MAXSIZE=2147483648
```
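Equivalently, the enlarged cache limit can be set per launch rather than exported globally. The wrapper below is a minimal sketch (the helper `run_with_cuda_cache` and its 2 GiB default are illustrative, not part of the package):

```python
import os
import subprocess

def run_with_cuda_cache(cmd, cache_bytes=2 * 1024 ** 3):
    """Run `cmd` with an enlarged CUDA JIT cache (2 GiB by default).

    Only the child process sees the modified environment, so the
    global shell environment is left untouched.
    """
    env = dict(os.environ)
    env["CUDA_CACHE_MAXSIZE"] = str(cache_bytes)
    return subprocess.call(cmd, env=env)
```

For example, `run_with_cuda_cache(["./runme.sh"])` would launch the tracker with the larger cache on Ubuntu.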
## Contact
For questions, comments, and any kind of feedback, please use the GitHub issues and the wiki.