Leap Motion connects its hand motion API to virtual reality. Expect Labs gives voice to its MindMeld shopping API. Plus: Disrupt SF judges and workshops announced, and the Mendeley API gets a new Android kit.
Leap Motion Provides VR Developers with New Hand Controls and Environment
Leap Motion announced a hand-tracking API in late spring. The company translates hand gestures so that they appear on a screen, which makes it possible to manipulate objects seen on the computer. Now it is jumping into virtual reality with a new API and a controller that mounts on a VR headset. Leap Motion just released its Image API, which exposes "a grayscale stereo image of the near-infrared light spectrum, separated into the left and right cameras." The blended results are impressive: you see your hands as they move through real space, but the rest of reality disappears because the near-infrared illumination doesn't reach very far. With the Leap Motion platform, you can then manipulate objects projected into the area where your hands are visible. In the video image below, reality beyond the hands is replaced by computer-generated balls. The image is shown twice (left- and right-eye views, essential to the 3D effect), and it is sharper in the video than in this single captured frame. Leap Motion is also releasing a software update to its SDK, along with a VR Developer Mount, selling for $19.99, that makes it easy to attach and remove the Leap Motion Controller from a VR headset.
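The pass-through effect described above can be sketched in a few lines. This is an illustrative sketch only: the function names, the brightness threshold, and the per-pixel compositing step are assumptions for demonstration, not the actual Leap Motion SDK, which delivers the left and right camera images and leaves blending to the application.

```python
# Hedged sketch: Leap Motion's Image API exposes a grayscale stereo pair
# (left/right near-infrared cameras). Because near-IR illumination falls
# off quickly, only nearby objects -- your hands -- stay bright, so a VR
# app can keep bright camera pixels and fill the rest with rendered
# content. HAND_THRESHOLD and these helpers are illustrative assumptions.

HAND_THRESHOLD = 120  # assumed brightness cutoff for "near" (hand) pixels

def composite_eye(camera_pixels, rendered_pixels, threshold=HAND_THRESHOLD):
    """Blend one eye's view: keep bright (near) camera pixels, i.e. the
    hands, and replace everything else with the virtual scene."""
    return [cam if cam >= threshold else scene
            for cam, scene in zip(camera_pixels, rendered_pixels)]

def composite_stereo(left_cam, right_cam, left_scene, right_scene):
    """Produce the two per-eye views needed for the 3D effect."""
    return (composite_eye(left_cam, left_scene),
            composite_eye(right_cam, right_scene))

# Tiny 4-pixel demo: values >= 120 are treated as a nearby hand.
left_eye = composite_eye([200, 30, 180, 10], [7, 7, 7, 7])
# → [200, 7, 180, 7]
```

Compositing per eye, rather than once, is what preserves the stereo disparity between the two cameras and gives the hands their 3D presence in the headset.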
In a blog post, co-founder and Chief Technology Officer David Holz was a bit more forthcoming than super-secret firms like Apple about what lies ahead:
I’d like to also give a hint of what we’re working on for the future. One prototype sensor that we’re beginning to show today (and will be giving out more information on in the future) is codenamed “Dragonfly.” Designed to be embedded by VR OEMs, Dragonfly possesses greater-than-HD image resolution, color and infrared imagery, and a significantly larger field of view.
This steady push toward a more realistic experience mirrors Leap Motion's hand controls, which are on a trajectory to become ever more fine-grained. In V2, for example, the strength of a grab is indicated by the number of fingers curled toward making a fist.
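The finger-curl detail above can be illustrated with a small sketch. The function below is an assumption for demonstration, not the Leap Motion SDK itself; the real V2 skeletal tracking API exposes grab strength as a normalized 0..1 value on the hand object.

```python
# Illustrative sketch (not the actual SDK): map the number of curled
# fingers to a 0..1 grab strength, as described for V2 tracking --
# an open hand reads 0.0 and a full fist reads 1.0.

def grab_strength(curled_fingers, total_fingers=5):
    """Return a 0..1 grab strength from the count of curled fingers."""
    if not 0 <= curled_fingers <= total_fingers:
        raise ValueError("curled_fingers out of range")
    return curled_fingers / total_fingers

grab_strength(0)  # open hand → 0.0
grab_strength(5)  # closed fist → 1.0
```

A continuous value like this lets applications distinguish a loose pinch from a firm grip instead of treating a grab as a binary on/off gesture.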
Expect Labs Adds Voice to its MindMeld Shopping APIs
We reported in February on Expect Labs' shopping-focused MindMeld APIs. Now the company aims to add voice. But why employ such cutting-edge technology as voice-powered discovery for something as mundane as shopping? According to the company's blog post, the addition of voice is a game changer:
In a world with large product catalogs and small smartphone screens, however, navigating to the right product can often be a cumbersome task — especially while scouring the aisles. Even with the best mobile shopping apps, the smart shopper must often drop their shopping bags to tap through numerous menus or type a search with two hands. For any serious shopper on a mission, this dystopian reality is entirely insufferable.
The mundane aspect of this picture might be the secret to its power: everybody shops, so the market for voice-powered discovery could be massive. The API documentation includes demos that spell out how it works.
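The hands-free flow the blog post describes can be sketched end to end. Everything here is an illustrative assumption, not the MindMeld API: the real service handles speech recognition and intent understanding server-side, while this toy ranks catalog items by how well they match an already-transcribed voice query.

```python
# Hedged sketch of voice-driven product lookup. The catalog, the scoring
# scheme, and rank_products are illustrative assumptions, not MindMeld's
# actual API. It models the step after speech-to-text: turning a spoken
# query into a ranked product list with no tapping or typing.

def rank_products(transcript, catalog):
    """Rank catalog items by how many query words their name contains,
    dropping items that match nothing."""
    words = set(transcript.lower().split())
    scored = [(sum(w in name.lower() for w in words), name)
              for name in catalog]
    return [name for score, name in
            sorted(scored, key=lambda pair: -pair[0]) if score > 0]

catalog = ["red running shoes", "blue rain jacket", "running socks"]
rank_products("show me running shoes", catalog)
# → ['red running shoes', 'running socks']
```

Even this crude token-overlap ranking shows the appeal: a shopper with full hands speaks one sentence and gets a shortlist, rather than drilling through menus on a small screen.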
API News You Shouldn't Miss
- New Android Kit Released for Mendeley API
- Expect Labs Offers MindMeld API Voice-Powered Search Results
- Announcing The Disrupt SF Hackathon Judges And API Workshops
- Why telecom network APIs are a catalyst for growth
- Leap Motion Gets Your Hands in the Game
- How to accelerate partnerships and generate revenue with API management