In recent years we have become accustomed to interacting with systems through touch screens, both for input and for output. The emerging virtual and augmented reality industry seems to lead the next generation of output devices, where output is typically projected directly into the user's eyes using dedicated glasses. However, the same industry has not yet converged on a single standard for input/control devices. We believe that smart-bands are in an excellent position to be part of this input/control standard due to their built-in sensors and their expected adoption rates.
As a first step towards demonstrating the potential of smart-bands as the next generation of input/control devices, we propose a novel system for textual input based on air-writing recognition using smart-bands. The proposed system enables the user to hand-write in the air in an intuitive way; text is recognized by analyzing the motion signals generated by an off-the-shelf smart-band worn by the user. In order to evaluate our system, we collected 15 sets of the alphabet letters (written in the air and recorded using a smart-band) from 57 different subjects. The results of our evaluation demonstrate the feasibility of the proposed system, which obtained an accuracy of 90.9% for a personalized model and 78.3% for a (non-personalized) global model.
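To make the personalized vs. global evaluation protocol concrete, the following is a minimal sketch in Python. The feature extraction (per-axis statistics over accelerometer/gyroscope recordings), the random-forest classifier, the synthetic data, and the toy sizes are all illustrative assumptions, not the recognition pipeline used in the paper; the sketch only shows how a per-subject model differs from a leave-one-subject-out global model.

```python
# Sketch: personalized vs. global evaluation for air-written letters (illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
N_SUBJECTS, N_SETS, N_LETTERS = 5, 15, 26   # toy sizes; the study collected 15 sets from 57 subjects

def extract_features(signal):
    """Summarize a (timesteps, 6) accel+gyro recording with simple per-axis statistics."""
    return np.concatenate([signal.mean(0), signal.std(0), signal.min(0), signal.max(0)])

def synthetic_recording(letter):
    """Stand-in for a smart-band recording of one air-written letter."""
    t = rng.integers(80, 120)
    return rng.normal(loc=letter * 0.1, scale=1.0, size=(t, 6))

# Toy dataset: subject -> list of (feature vector, letter label) samples.
data = {s: [(extract_features(synthetic_recording(l)), l)
            for _ in range(N_SETS) for l in range(N_LETTERS)]
        for s in range(N_SUBJECTS)}

def split(samples, test_frac=0.2):
    """Random train/test split of one subject's samples."""
    idx = rng.permutation(len(samples))
    cut = int(len(samples) * (1 - test_frac))
    train = [samples[i] for i in idx[:cut]]
    test = [samples[i] for i in idx[cut:]]
    return map(np.array, zip(*train)), map(np.array, zip(*test))

# Personalized model: train and test on the same subject's recordings.
personalized = []
for s, samples in data.items():
    (Xtr, ytr), (Xte, yte) = split(samples)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
    personalized.append(accuracy_score(yte, clf.predict(Xte)))

# Global model: train on all other subjects, test on the held-out subject.
global_acc = []
for s in data:
    train = [x for o, samples in data.items() if o != s for x in samples]
    Xtr, ytr = map(np.array, zip(*train))
    Xte, yte = map(np.array, zip(*data[s]))
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
    global_acc.append(accuracy_score(yte, clf.predict(Xte)))

print(f"personalized: {np.mean(personalized):.3f}, global: {np.mean(global_acc):.3f}")
```

The key design point the sketch captures is that a personalized model only ever sees one wearer's writing style, whereas the global model must generalize across subjects it has never seen, which is why the global accuracy is typically lower.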