The uSens AR Hand Tracking SDK, a new AR/VR development tool, lets developers create AI-driven skeletal hand-tracking content for smartphones on both iOS and Android, using only the phone's camera.
“While Pokémon Go and Snapchat brought augmented reality mainstream and into the lives of hundreds of millions of smartphone users, these interactions have so far been limited to their touchscreens,” said Anli He, co-founder and CEO of uSens. “uSens is proud to push AR to the next frontier by enabling developers to create engaging, enjoyable and entertaining augmented reality experiences made more intuitive for the smartphone user—by simply moving their hands and fingers in the air.”
“This opens a whole new world of possibilities for developers, enabling them to create a truly one-of-a-kind experience for a mainstream audience. Similar to how touchscreens enabled even the most technologically challenged to embrace smartphones, providing an easy and natural way for users to engage with AR/VR objects and environments will play a major role in boosting consumer adoption,” said He.
Unlike other technologies that track only fingertips, the uSensAR Hand Tracking SDK uses computer vision and deep learning with a mobile phone’s RGB camera to provide full hand-skeleton tracking and 3D motion recognition.
uSens’ hand skeleton and 3D-motion recognition software:
- Lets users place themselves in the virtual world simply by moving their hands in the air.
- Provides 3D skeletal hand tracking, allowing for precise, controller-less interaction on an AR/VR device.
- Greatly reduces reliance on external input devices such as game pads or external cameras, providing more intuitive control on mobile devices.
- Is compatible with both Android and iOS systems.
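To make the idea of controller-less interaction concrete, here is a minimal sketch of how an application might consume skeletal hand-tracking output. The uSens SDK's actual API and data format are not documented here, so every name below (the `Joint` type, the 21-joint layout, the `is_pinch` helper, the distance threshold) is a hypothetical illustration of a common hand-skeleton convention, not the real interface.

```python
import math
from dataclasses import dataclass

# Hypothetical joint layout: 21 joints per hand (1 wrist + 4 per finger),
# a convention used by several hand-tracking systems. Indices are assumed.
THUMB_TIP = 4
INDEX_TIP = 8


@dataclass
class Joint:
    """One tracked joint, in metres; depth (z) is inferred from the RGB feed."""
    x: float
    y: float
    z: float


def distance(a: Joint, b: Joint) -> float:
    """Euclidean distance between two joints in 3D space."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)


def is_pinch(skeleton: list[Joint], threshold: float = 0.03) -> bool:
    """Detect a pinch gesture: thumb tip and index tip within `threshold` metres.

    A gesture like this is how an app would turn raw skeleton data into a
    'grab' or 'select' action on a virtual object, replacing a controller.
    """
    return distance(skeleton[THUMB_TIP], skeleton[INDEX_TIP]) < threshold
```

In practice the SDK would deliver a skeleton like this once per camera frame, and the app would run gesture checks such as `is_pinch` to drive interaction with AR/VR objects.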
uSens’ hand-skeleton and 3D-motion recognition technology opens up potential target markets such as virtual training and education, healthcare and medical applications, and entertainment, as well as new emotional connections through shared virtual experiences.