This demonstrator is designed to be run from a desktop or a laptop computer.

If you are using a phone or a tablet, some features may not be displayed or may not work properly.

Scientific mediation of the GREYC laboratory
[Illustration of the Hand Rhythm demonstrator]

Hand Rhythm

Team IMAGE

This demonstrator presents a use of deep learning in computer vision.

From the webcam video stream, the user's dominant hand (right or left) is detected using pre-trained neural networks.

Hand orientation recognition is performed in real time in order to detect the user's interactions with the game.

To do this, the player makes gestures (up👆, down👇, left👈, right👉) with their hand (left or right) to activate a column of musical notes.
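One way this kind of gesture recognition can work is sketched below: given two hand landmarks from a detection model (here hypothetically named `wrist` and `tip`, as `{x, y}` points in image coordinates where y grows downward), the dominant axis of the wrist-to-fingertip vector picks one of the four directions. This is an illustrative assumption, not the demonstrator's actual code.

```javascript
// Hypothetical sketch: classify a pointing gesture from two hand landmarks.
// `wrist` and `tip` are {x, y} points in image coordinates (y grows downward),
// in the style of the keypoints returned by hand-landmark models.
function classifyGesture(wrist, tip) {
  const dx = tip.x - wrist.x;
  const dy = tip.y - wrist.y;
  // The dominant axis of the wrist-to-fingertip vector decides the direction.
  if (Math.abs(dx) > Math.abs(dy)) {
    return dx > 0 ? "right" : "left";
  }
  return dy > 0 ? "down" : "up";
}
```

With this rule, a fingertip far to the right of the wrist reads as "right"; a fingertip well above it (smaller y) reads as "up".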

During the game, the goal is to activate the correct column when a note passes through the hit zone in order to score points.
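The scoring rule described above can be sketched as a small check: a point is awarded when the activated column matches a note currently inside the hit zone. The function name, note fields, and zone bounds below are assumptions for illustration, not the demonstrator's actual implementation.

```javascript
// Hypothetical sketch of the scoring rule: a hit occurs when the activated
// column matches a falling note that is currently inside the hit zone.
// `notes` is a list of {column, y} notes; the hit zone spans
// [hitZoneTop, hitZoneBottom] in screen coordinates (y grows downward).
function scoreHit(notes, activatedColumn, hitZoneTop, hitZoneBottom) {
  return notes.some(
    (n) =>
      n.column === activatedColumn &&
      n.y >= hitZoneTop &&
      n.y <= hitZoneBottom
  );
}
```

A game loop could call this once per activated gesture and increment the score whenever it returns `true`.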

This demonstrator uses the TensorFlow.js library.
