2022-067 DiscoBand: Multiview Depth-Sensing Smartwatch Strap
Abstract
Real-time tracking of a user’s hands, arms, and environment is valuable in a wide variety of human-computer interaction applications, from context awareness to virtual reality. Rather than relying on fixed, external tracking infrastructure, the most flexible and consumer-friendly approaches are mobile, self-contained, and compatible with popular device form factors (e.g., smartwatches).
This invention enables imaging and estimating the pose of the hand, body, and nearby environment through multi-view depth sensing via a sensor band worn on the wrist. The sensing band is thin and self-contained, in a form factor similar to a smartwatch strap. Arranging the position and orientation of depth sensors along the strap allows different features of interest to be imaged, such as the hand, torso, or environment. The device avoids the bulky form factor and imaging-occlusion problems present in similar devices. Beyond the hardware and physical construction, firmware has been developed to increase the bandwidth of sensor data that can be collected from the strap, along with software to process that sensor data into actionable output for specific applications. Example applications for this device include static hand gesture recognition, continuous hand pose estimation, arm pose estimation, object recognition, and environmental mapping.
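To illustrate the multi-view processing step described above, the sketch below fuses low-resolution depth frames from several strap-mounted sensors into a single point cloud in the wrist's coordinate frame, the kind of intermediate representation a pose estimator or gesture classifier could consume. The sensor resolution, intrinsics, and mounting extrinsics here are illustrative assumptions, not the device's actual parameters.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project an HxW depth image (meters) into Nx3 camera-frame points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    x = (u.ravel() - cx) * z / fx
    y = (v.ravel() - cy) * z / fy
    pts = np.stack([x, y, z], axis=1)
    return pts[z > 0]  # drop invalid (zero-depth) pixels

def fuse_views(frames, extrinsics, intrinsics):
    """Rotate/translate each sensor's points by its strap-mounting pose, then concatenate."""
    clouds = []
    for depth, (R, t) in zip(frames, extrinsics):
        pts = depth_to_points(depth, *intrinsics)
        clouds.append(pts @ R.T + t)  # camera frame -> shared wrist frame
    return np.concatenate(clouds, axis=0)

# Example: two 8x8 depth sensors on opposite sides of the strap (assumed layout).
intrinsics = (6.0, 6.0, 3.5, 3.5)                 # fx, fy, cx, cy (assumed)
frames = [np.full((8, 8), 0.30), np.full((8, 8), 0.25)]
R180 = np.diag([-1.0, 1.0, -1.0])                 # 180-degree rotation about y
extrinsics = [(np.eye(3), np.zeros(3)),
              (R180, np.array([0.0, 0.0, 0.05]))]
cloud = fuse_views(frames, extrinsics, intrinsics)
print(cloud.shape)  # (128, 3): 64 points from each sensor
```

Because each sensor contributes points from a different vantage around the wrist, regions occluded in one view are typically covered by another, which is the basis of the multi-view robustness claimed for the band.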
Benefit
This technology offers a unique combination of features and properties. The band is thin and could be integrated into future smartwatches. In addition, the multi-view approach is more robust to occlusion than single-view methods, and the low-resolution depth data is more privacy-preserving than conventional camera-based wrist systems. Lastly, the band's unique design and data open entirely new capabilities not previously demonstrated with wrist-worn setups, including the ability to estimate the user's upper-body pose, detect held objects, and scan the environment for obstacles and contextual clues.
Market Application
Virtual Reality (VR) and Augmented Reality (AR): Hand tracking is crucial for providing immersive experiences in VR and AR applications. With the ability to accurately track hand movements and gestures, users can interact with virtual objects and interfaces naturally. The market for hand position sensing in VR and AR is expected to grow as these technologies become more mainstream.*
Gaming: Hand position sensing can enhance the gaming experience by allowing players to control and interact with games using natural hand movements. This can range from simple gestures to more complex actions. The gaming industry is constantly looking for new ways to improve user engagement and interactivity, making hand position sensing a potentially valuable market.*
Robotics and Industrial Automation: In robotics and industrial automation, hand position sensing can be used for human-robot interaction, collaborative robots, and precise control of robotic arms. These applications are relevant in manufacturing, healthcare, logistics, and other industries where human-robot collaboration is increasing. The market for hand position sensing in robotics and automation is expected to grow as these technologies continue to advance.*
Rehabilitation and Healthcare: Hand position sensing can play a vital role in rehabilitation and healthcare applications. It can be used for monitoring and assessing hand movements in patients with neurological disorders, tracking progress during therapy, and designing personalized rehabilitation programs. The market potential for hand position sensing in healthcare and rehabilitation is driven by the increasing demand for advanced technology solutions in these areas.*
Human-Computer Interaction (HCI): Hand position sensing can enable new modes of interaction with computers and devices. It can be used for touchless interfaces, gesture-based controls, and natural user interfaces. As HCI continues to evolve and adapt to user needs, hand position sensing technologies can find applications in various domains, including smart homes, automotive interfaces, and public displays.*
Other Information
*Some aspects of this description were generated using ChatGPT and modified to fit the objectives of the description.
Publications
https://dl.acm.org/doi/pdf/10.1145/3526113.3545634
https://dl.acm.org/doi/fullHtml/10.1145/3526113.3545634