Introduction
The Gesture Recognition Toolkit (GRT) is a cross-platform, open-source, C++ machine learning library that has been specifically designed for real-time gesture recognition.
In addition to a comprehensive C++ API, the GRT now also includes an easy-to-use graphical user interface (GUI) which enables users to stream real-time data into the GUI via the Open Sound Control network protocol. Using the GUI you can:
(1) Set up and configure a gesture recognition pipeline that can be used for classification, regression, or timeseries analysis (a C++ API sketch of these pipeline steps follows this list).
(2) Stream real-time data into the GUI via Open Sound Control (OSC) from another application (such as Processing, Max, Pure Data, openFrameworks, etc.); a minimal OSC sender sketch also follows this list.
(3) Record, label, save and load your training data.
(4) Train a model for classification or regression.
(5) Test the generalization ability of the model (using a separate test dataset or cross-validation).
(6) Perform real-time prediction on new data streamed into the GUI via OSC.
(7) Stream the real-time prediction results out of the GUI to another application via OSC.
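
If you are working with the C++ API directly rather than the GUI, the same workflow (steps 1 and 3-6 above) maps onto a GestureRecognitionPipeline. The sketch below is a minimal example rather than a complete application: it assumes the GRT headers are installed as <GRT/GRT.h>, picks the ANBC classifier purely as an illustration, and uses hard-coded dummy values in place of real recorded sensor data (older GRT releases use VectorDouble instead of VectorFloat).

    #include <GRT/GRT.h>
    #include <cstdlib>
    #include <iostream>
    using namespace GRT;

    int main(){

        // (1) Set up a pipeline and pick a classifier (ANBC is just one option)
        GestureRecognitionPipeline pipeline;
        pipeline.setClassifier( ANBC() );

        // (3) Build a labelled training dataset of 3-dimensional samples
        ClassificationData trainingData;
        trainingData.setNumDimensions( 3 );

        // Dummy data standing in for recorded sensor samples: 20 samples per class
        for( UINT i = 0; i < 20; i++ ){
            VectorFloat sample( 3 );
            sample[0] = 0.1 + 0.001 * i;  sample[1] = 0.2 + 0.001 * i;  sample[2] = 0.3 + 0.001 * i;
            trainingData.addSample( 1, sample );   // class label 1

            sample[0] = 0.9 - 0.001 * i;  sample[1] = 0.8 - 0.001 * i;  sample[2] = 0.7 - 0.001 * i;
            trainingData.addSample( 2, sample );   // class label 2
        }

        // Hold back 20% of the data for testing (step 5)
        ClassificationData testData = trainingData.partition( 80 );

        // (4) Train the model
        if( !pipeline.train( trainingData ) ){
            std::cout << "Failed to train the pipeline!" << std::endl;
            return EXIT_FAILURE;
        }

        // (5) Test the generalization ability of the model on the held-out data
        if( pipeline.test( testData ) ){
            std::cout << "Test accuracy: " << pipeline.getTestAccuracy() << "%" << std::endl;
        }

        // (6) Predict the class of a new, unseen sample
        VectorFloat newSample( 3 );
        newSample[0] = 0.15;  newSample[1] = 0.25;  newSample[2] = 0.35;
        if( pipeline.predict( newSample ) ){
            std::cout << "Predicted class label: " << pipeline.getPredictedClassLabel() << std::endl;
        }

        return EXIT_SUCCESS;
    }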
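
To stream data into the GUI (step 2), another application simply sends OSC messages to the port the GUI is listening on. The sketch below uses the third-party oscpack library to send one 3-dimensional sample per loop iteration; the port (5000) and message address ("/Data") are assumptions based on the GUI's defaults and should be checked against the Setup view of your GUI version.

    #include "osc/OscOutboundPacketStream.h"
    #include "ip/UdpSocket.h"
    #include <chrono>
    #include <thread>

    int main(){
        // NOTE: port 5000 and address "/Data" are assumed GUI defaults;
        // verify both in the GUI's Setup view before streaming.
        UdpTransmitSocket socket( IpEndpointName( "127.0.0.1", 5000 ) );
        char buffer[1024];

        while( true ){
            // Replace these constants with real sensor readings
            float x = 0.1f, y = 0.2f, z = 0.3f;

            osc::OutboundPacketStream packet( buffer, 1024 );
            packet << osc::BeginMessage( "/Data" ) << x << y << z << osc::EndMessage;
            socket.Send( packet.Data(), packet.Size() );

            // Stream at roughly 100Hz
            std::this_thread::sleep_for( std::chrono::milliseconds( 10 ) );
        }

        return 0;
    }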