Commit f546986a authored by Anh Tran Viet

Update README.md

parent 619a41cb
# mvi-sp
## Assignment: Real-time gesture detection
* Create a dataset with at least five static gestures (fist, open fist, pinch, pointing and peace sign).
* Create a Python library for gesture detection that uses a neural network to detect the gestures from the previous point, with the library designed so that the gesture set can be easily expanded with other static and dynamic gestures.
* Create a simple application to visualize the output of the detector.
* Describe the MultiLeap library and its algorithms for fusing data from multiple Leap Motion sensors and integrate it into your library. Compare the precision of the gesture detector when more than one Leap Motion sensor is used.
## Directory structure
This directory contains the PDF version of the "work in progress" conference article, the `src` directory, and the conda environment configuration file.
* text directory - *LaTeX source files for the bachelor thesis*
* src directory - *source code and experimental Jupyter notebooks*
* executables directory - *executable files, Python script, dataset, pre-trained model, application's configuration file*
```
.
├── README.md
├── MultiLeap__Gestures.pdf
├── environment.yml
└── src
    ├── Dataset
    │   └── ...
    ├── TrainedModel
    │   └── ...
    ├── model_training.py
    └── model_training.ipynb
```
## Run and setup
The Python script serves for model training on custom datasets. Run `python model_training.py` to start it.
The script loads its configuration from `gestureLeapConfig.json` in the execution folder. The user can change the `dataset_directory`, `model_directory`, `epoch`, `dynamic_timestep`, and `gpu` values, or use the defaults. If the configuration file is not found, it is created with default values. The dataset directory must follow the structure defined by the DataSampler application: `<dataset_directory>/<gesture type>/<index>.txt`. Names of gesture types must be numeric values. The `model_directory` is the location where the trained model will be saved; it must already exist.
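The load-or-create behaviour described above can be sketched as follows. The default values shown are placeholders for illustration, not the actual defaults used by `model_training.py`:

```python
import json
import os

# Hypothetical defaults -- the real values in model_training.py may differ.
DEFAULT_CONFIG = {
    "dataset_directory": "Dataset",
    "model_directory": "TrainedModel",
    "epoch": 100,
    "dynamic_timestep": False,
    "gpu": False,
}

CONFIG_PATH = "gestureLeapConfig.json"

def load_config(path=CONFIG_PATH):
    """Load the configuration file, creating it with defaults if missing."""
    if not os.path.exists(path):
        with open(path, "w") as f:
            json.dump(DEFAULT_CONFIG, f, indent=4)
        return dict(DEFAULT_CONFIG)
    with open(path) as f:
        config = json.load(f)
    # Fall back to defaults for any keys the user removed.
    return {**DEFAULT_CONFIG, **config}
```

Merging the loaded file over the defaults keeps the script working even if the user deletes individual keys from the JSON file.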
Set `gpu: True` to run the script with GPU support; more on installation [here](https://www.tensorflow.org/install/gpu).
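A minimal sketch of walking the dataset layout described above (`<dataset_directory>/<gesture type>/<index>.txt`, with numeric folder names as labels); `list_samples` is a hypothetical helper, not part of the library's API:

```python
from pathlib import Path

def list_samples(dataset_directory):
    """Collect sample files grouped by gesture label.

    Assumes the DataSampler layout:
    <dataset_directory>/<gesture type>/<index>.txt,
    where each gesture-type folder name is a numeric label.
    """
    samples = {}
    for gesture_dir in sorted(Path(dataset_directory).iterdir()):
        if gesture_dir.is_dir() and gesture_dir.name.isdigit():
            samples[int(gesture_dir.name)] = sorted(gesture_dir.glob("*.txt"))
    return samples
```

Skipping non-numeric folder names mirrors the requirement that gesture types be numeric, so stray files in the dataset directory are ignored.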