We all know the scene: a small gesture, and the fictional computer knows what to do. In pretty much every film or series that depicts "the future", the display or mouse no longer needs to be touched to carry out simple actions. Instead of clicking or swiping across the screen, a quick hand movement tells the computer what you want to see next, and a new file opens or the page changes. In the area of accessibility, some options that go in this direction actually exist today, and Siri, Alexa and Co. can do things that, until a few years ago, were known only from films. But accessibility tools are not necessarily easy to use for people accustomed to other input methods, and asking Siri during a slide show to switch to the next slide is more annoying than helpful in many contexts. Who wants to hear "I did not understand you" in the middle of a presentation? That is not what science fiction promised us! Gesture control also exists here and there, but the concept has not really caught on so far.
Hand.js: Presenting slides controlled by gestures
So far, however, Jurić has asked that he be contacted before the library is used in your own projects; for this reason, the library is blocked from use on the localhost domain. The Chrome extension, which can also be used in Edge, is likewise not yet available in the official Chrome Web Store; it has to be downloaded from GitHub and installed manually. Hand.js has only been available on GitHub for a few days and, as mentioned, is still in beta, so the project is very young and under active development. The library is under the GPL-2.0 license, while the Chrome extension carries an MIT license. It is therefore to be expected that the restrictions on possible uses will be lifted over time.
Gesture Control: The Siri Problem 2.0?
In the context of camera-controlled interaction, the question of the security of the recorded data naturally arises as well. According to the Hand.js website, no data is transferred; all processing happens locally on the device being controlled by the library. How this plays out in practice remains to be seen. The past few years have shown time and again that data security is a major hurdle for smart assistance systems. A small gesture-control library should not play in the same league as the well-known voice assistants when it comes to data collection; nevertheless, this aspect must be considered.
One of the challenges in developing the library will certainly be the need to distinguish between gestures directed at a human counterpart and those meant to control the slides. Just as Siri and Alexa sometimes burst into conversations without actually being addressed, such an input path could otherwise cause difficulties of its own. But the option is always exciting if it brings us a little closer to the sci-fi vision.
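One common way to tackle this ambiguity, borrowed from the "wake word" pattern of voice assistants, is to require an explicit arming gesture before any command gesture is accepted. The following is a minimal sketch of that idea; all names here (`GestureGate`, `"open_palm"`, `"swipe_left"`) are illustrative assumptions and not part of the Hand.js API.

```typescript
// Hypothetical sketch: gate slide-control gestures behind an explicit
// "arming" gesture, analogous to a wake word for voice assistants.
// None of these identifiers come from Hand.js; they are assumptions.

type Gesture = "open_palm" | "swipe_left" | "swipe_right";

class GestureGate {
  private armedUntil = 0; // timestamp (ms) until which commands are accepted

  constructor(
    private onNext: () => void,
    private onPrev: () => void,
    private windowMs = 3000, // how long the gate stays open after arming
  ) {}

  // Feed every detected gesture in here; only gestures that arrive
  // while the gate is armed actually change slides.
  handle(gesture: Gesture, now: number): void {
    if (gesture === "open_palm") {
      this.armedUntil = now + this.windowMs; // arm the gate
      return;
    }
    if (now >= this.armedUntil) return; // ignore conversational gestures
    if (gesture === "swipe_left") this.onNext();
    if (gesture === "swipe_right") this.onPrev();
    this.armedUntil = 0; // one command per arming gesture
  }
}

// Minimal demonstration with a fake clock:
let slide = 0;
const gate = new GestureGate(() => slide++, () => slide--);

gate.handle("swipe_left", 0);    // ignored: gate not armed
gate.handle("open_palm", 1000);  // arms the gate for 3 seconds
gate.handle("swipe_left", 2000); // advances the slide
gate.handle("swipe_left", 2500); // ignored: gate already consumed
```

The "one command per arming gesture" choice is deliberately strict; a real implementation might instead keep the gate open for the full window, trading fewer false negatives for more accidental slide changes.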
All information about Hand.js can be found on the project website.