# hand-gesture-recognition-using-mediapipe
Estimate hand pose using MediaPipe (Python version).<br> This is a sample
program that recognizes hand signs and finger gestures with a simple MLP using the detected key points.
<br> ❗ _️**This is the English-translated version of the [original repo](https://github.com/Kazuhito00/hand-gesture-recognition-using-mediapipe). All content has been translated to English, including the comments and notebooks.**_ ❗
<br>
![mqlrf-s6x16](https://user-images.githubusercontent.com/37477845/102222442-c452cd00-3f26-11eb-93ec-c387c98231be.gif)
This repository contains the following contents.
* Sample program
* Hand sign recognition model (TFLite)
* Finger gesture recognition model (TFLite)
* Training data for hand sign recognition and a notebook for training
* Training data for finger gesture recognition and a notebook for training
# Requirements
* mediapipe 0.8.1
* OpenCV 3.4.2 or Later
* TensorFlow 2.3.0 or Later<br>tf-nightly 2.5.0.dev or later (Only when creating a TFLite for an LSTM model)
* scikit-learn 0.23.2 or Later (Only if you want to display the confusion matrix)
* matplotlib 3.3.2 or Later (Only if you want to display the confusion matrix)
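For example, the dependencies can be installed with pip (assuming the usual PyPI package names; pin versions as needed):
```bash
# Core dependencies (versions per the list above)
pip install mediapipe==0.8.1 opencv-python tensorflow
# Optional: only needed to display the confusion matrix in the notebooks
pip install scikit-learn matplotlib
```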
# Demo
Here's how to run the demo using your webcam.
```bash
python app.py
```
The following options can be specified when running the demo.
* --device<br>Camera device number (Default: 0)
* --width<br>Width of the camera capture (Default: 960)
* --height<br>Height of the camera capture (Default: 540)
* --use_static_image_mode<br>Whether to use MediaPipe's static_image_mode option (Default: unspecified)
* --min_detection_confidence<br>Detection confidence threshold (Default: 0.5)
* --min_tracking_confidence<br>Tracking confidence threshold (Default: 0.5)
# Directory
<pre>
│ app.py
│ keypoint_classification.ipynb
│ point_history_classification.ipynb
│
├─model
│ ├─keypoint_classifier
│ │ │ keypoint.csv
│ │ │ keypoint_classifier.hdf5
│ │ │ keypoint_classifier.py
│ │ │ keypoint_classifier.tflite
│ │ └─ keypoint_classifier_label.csv
│ │
│ └─point_history_classifier
│ │ point_history.csv
│ │ point_history_classifier.hdf5
│ │ point_history_classifier.py
│ │ point_history_classifier.tflite
│ └─ point_history_classifier_label.csv
│
└─utils
└─cvfpscalc.py
</pre>
### app.py
This is a sample program for inference.<br>
It can also collect training data (key points) for hand sign recognition<br>
and training data (index finger coordinate history) for finger gesture recognition.
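In app.py, keyboard input switches between inference and the two logging modes. As a rough illustration of how such mode switching can be handled with cv2.waitKey codes (a sketch, not the exact code in app.py; the "k" and "h" bindings follow the Training section below, while "n" for returning to normal mode is an assumption):
```python
def select_mode(key, mode):
    """Map a cv2.waitKey code to a class number and logging mode (illustrative)."""
    number = -1
    if 48 <= key <= 57:   # digits '0'-'9' choose the class ID to record
        number = key - 48
    if key == ord('n'):   # 'n': normal inference mode (assumed binding)
        mode = 0
    if key == ord('k'):   # 'k': log key points (hand sign training data)
        mode = 1
    if key == ord('h'):   # 'h': log point history (finger gesture training data)
        mode = 2
    return number, mode
```
While in a logging mode, pressing a digit selects the class ID under which the current sample is recorded, as described in the Training section below.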
### keypoint_classification.ipynb
This is a model training script for hand sign recognition.
### point_history_classification.ipynb
This is a model training script for finger gesture recognition.
### model/keypoint_classifier
This directory stores files related to hand sign recognition.<br>
The following files are stored.
* Training data (keypoint.csv)
* Trained model (keypoint_classifier.tflite)
* Label data (keypoint_classifier_label.csv)
* Inference module (keypoint_classifier.py)
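The inference module is a thin wrapper around the TensorFlow Lite interpreter. A minimal sketch of what such a wrapper can look like (the class name and default path follow the files above; the actual implementation may differ in detail):
```python
import numpy as np
import tensorflow as tf

class KeyPointClassifier:
    def __init__(self, model_path='model/keypoint_classifier/keypoint_classifier.tflite'):
        # Load the TFLite model and allocate its tensors once, up front
        self.interpreter = tf.lite.Interpreter(model_path=model_path)
        self.interpreter.allocate_tensors()
        self.input_details = self.interpreter.get_input_details()
        self.output_details = self.interpreter.get_output_details()

    def __call__(self, landmark_list):
        # Feed the preprocessed key points and return the most likely class ID
        input_index = self.input_details[0]['index']
        self.interpreter.set_tensor(
            input_index, np.array([landmark_list], dtype=np.float32))
        self.interpreter.invoke()
        result = self.interpreter.get_tensor(self.output_details[0]['index'])
        return int(np.argmax(np.squeeze(result)))
```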
### model/point_history_classifier
This directory stores files related to finger gesture recognition.<br>
The following files are stored.
* Training data (point_history.csv)
* Trained model (point_history_classifier.tflite)
* Label data (point_history_classifier_label.csv)
* Inference module (point_history_classifier.py)
### utils/cvfpscalc.py
This is a module for FPS measurement.
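A minimal sketch of how such an FPS counter can be built on OpenCV's tick counter, averaging the frame time over a small buffer (the buffer length is illustrative):
```python
from collections import deque
import cv2

class CvFpsCalc:
    def __init__(self, buffer_len=10):
        self._start_tick = cv2.getTickCount()
        self._freq = 1000.0 / cv2.getTickFrequency()  # ticks -> milliseconds
        self._diff_times = deque(maxlen=buffer_len)

    def get(self):
        # Record the time since the previous call, then average over the buffer
        current_tick = cv2.getTickCount()
        self._diff_times.append((current_tick - self._start_tick) * self._freq)
        self._start_tick = current_tick
        avg_ms = sum(self._diff_times) / len(self._diff_times)
        return round(1000.0 / avg_ms, 2) if avg_ms > 0 else 0.0
```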
# Training
For both hand sign recognition and finger gesture recognition, you can add or change the training data and retrain the model.
### Hand sign recognition training
#### 1.Training data collection
Press "k" to enter the mode to save key points(displayed as 「MODE:Logging Key Point」)<br>
<img src="https://user-images.githubusercontent.com/37477845/102235423-aa6cb680-3f35-11eb-8ebd-5d823e211447.jpg" width="60%"><br><br>
If you press "0" to "9", the key points will be added to "model/keypoint_classifier/keypoint.csv" as shown below.<br>
1st column: Pressed number (used as class ID), 2nd and subsequent columns: Key point coordinates<br>
<img src="https://user-images.githubusercontent.com/37477845/102345725-28d26280-3fe1-11eb-9eeb-8c938e3f625b.png" width="80%"><br><br>
The saved key point coordinates have undergone the preprocessing steps shown below, up to step ④.<br>
<img src="https://user-images.githubusercontent.com/37477845/102242918-ed328c80-3f3d-11eb-907c-61ba05678d54.png" width="80%">
<img src="https://user-images.githubusercontent.com/37477845/102244114-418a3c00-3f3f-11eb-8eef-f658e5aa2d0d.png" width="80%"><br><br>
In the initial state, three types of training data are included: open hand (class ID: 0), closed hand (class ID: 1), and pointing (class ID: 2).<br>
If necessary, add data for class IDs 3 and above, or delete the existing rows of the CSV, to prepare your own training data.<br>
<img src="https://user-images.githubusercontent.com/37477845/102348846-d0519400-3fe5-11eb-8789-2e7daec65751.jpg" width="25%"> <img src="https://user-images.githubusercontent.com/37477845/102348855-d2b3ee00-3fe5-11eb-9c6d-b8924092a6d8.jpg" width="25%"> <img src="https://user-images.githubusercontent.com/37477845/102348861-d3e51b00-3fe5-11eb-8b07-adc08a48a760.jpg" width="25%">
#### 2.Model training
Open "[keypoint_classification.ipynb](keypoint_classification.ipynb)" in Jupyter Notebook and execute from top to bottom.<br>
To change the number of training data classes, change the value of "NUM_CLASSES = 3"<br>and modify the labels in "model/keypoint_classifier/keypoint_classifier_label.csv" as appropriate.<br><br>
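For reference, the label file contains one class name per line, in class ID order; with the three default classes it might look like this (the names are illustrative):
```csv
Open
Close
Pointer
```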
#### X.Model structure
The image of the model prepared in "[keypoint_classification.ipynb](keypoint_classification.ipynb)" is as follows.
<img src="https://user-images.githubusercontent.com/37477845/102246723-69c76a00-3f42-11eb-8a4b-7c6b032b7e71.png" width="50%"><br><br>
### Finger gesture recognition training
#### 1.Training data collection
Press "h" to enter the mode to save the history of fingertip coordinates (displayed as "MODE:Logging Point History").<br>
<img src="https://user-images.githubusercontent.com/37477845/102249074-4d78fc80-3f45-11eb-9c1b-3eb975798871.jpg" width="60%"><br><br>
If you press "0" to "9", the key points will be added to "model/point_history_classifier/point_history.csv" as shown below.<br>
1st column: Pressed number (used as class ID), 2nd and subsequent columns: Coordinate history<br>
<img src="https://user-images.githubusercontent.com/37477845/102345850-54ede380-3fe1-11eb-8d04-88e351445898.png" width="80%"><br><br>
The saved coordinate history has undergone the preprocessing steps shown below, up to step ④.<br>
<img src="https://user-images.githubusercontent.com/37477845/102244148-49e27700-3f3f-11eb-82e2-fc7de42b30fc.png" width="80%"><br><br>
In the initial state, four types of training data are included: stationary (class ID: 0), clockwise (class ID: 1), counterclockwise (class ID: 2), and moving (class ID: 4).<br>
If necessary, add data for class IDs 5 and above, or delete the existing rows of the CSV, to prepare your own training data.<br>
<img src="https://user-images.githubusercontent.com/37477845/102350939-02b0c080-3fe9-11eb-94d8-54a3decdeebc.jpg" width="20%"> <img src="https://user-images.githubusercontent.com/37477845/102350945-05131a80-3fe9-11eb-904c-a1ec573a5c7d.jpg" width="20%"> <img src="https://user-images.githubusercontent.com/37477845/102350951-06444780-3fe9-11eb-98cc-91e352edc23c.jpg" width="20%"> <img src="https://user-images.githubusercontent.com/37477845/102350942-047a8400-3fe9-11eb-9103-dbf383e67bf5.jpg" width="20%">
#### 2.Model training
Open "[point_history_classification.ipynb](point_history_classification.ipynb)" in Jupyter Notebook and execute from top to bottom.<br>
To change the number of training data classes, change the value of "NUM_CLASSES = 4" and<br>modify the labels in "model/point_history_classifier/point_history_classifier_label.csv" as appropriate.