This demonstrator is designed to run on a desktop or laptop computer.

If you are using a phone or a tablet, some features may not display or work properly.

Science outreach by the GREYC laboratory

Hand Rhythm

Team IMAGE

This demonstrator presents a use of deep learning in computer vision.

From a live video stream of the user, the hand (right or left) is detected using pre-trained neural networks.

Hand orientation is recognized in real time in order to detect the user's interactions with the game.

To do this, the player makes gestures (up👆, down👇, left👈, right👉) with their hand (left or right) to activate a column of musical notes.

During the game, the goal is to activate the correct column when a note passes through the hit zone in order to score points.

This demonstrator is built with TensorFlow.js.
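The demonstrator's own code is not published on this page, but the gesture recognition it describes can be sketched as follows. This is a minimal, hypothetical example assuming hand keypoints come from a model such as `@tensorflow-models/handpose`, which returns 21 landmarks per hand as `[x, y, z]` arrays in image coordinates, with the wrist at index 0 and the middle fingertip at index 12:

```javascript
// Classify a hand gesture (up/down/left/right) from two keypoints:
// the wrist and the middle fingertip. In image coordinates, y grows
// downward, so "up" means the fingertip is above the wrist.
function classifyGesture(wrist, tip) {
  const dx = tip[0] - wrist[0];
  const dy = tip[1] - wrist[1];
  // Whichever axis dominates decides the gesture.
  if (Math.abs(dx) > Math.abs(dy)) {
    return dx > 0 ? "right" : "left";
  }
  return dy < 0 ? "up" : "down";
}

// Example: a hand pointing straight up (fingertip above the wrist).
console.log(classifyGesture([100, 200, 0], [100, 80, 0])); // → "up"
```

In the actual game loop, the landmarks would be re-estimated on each video frame and the resulting gesture mapped to one of the four note columns; note that with a mirrored webcam preview, the left/right labels may need to be swapped.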

You may also like: UDGVNS, AnnoTag, Face Emotion, LeNet-5, IVOIRe, Norns, GMICol, Tapisserie de Bayeux, Gitbistex, GREYC Star, GREYC Escape, Human Sense, Uni MS-PS, Shifumi, Elevate, Hurmony, Snake Face.