
Gesture based interaction arrives with Leap Motion

This post was written by Gavin Beddow and first appeared on the Epic blog on 29th July 2013.

We had eagerly awaited the FedEx van delivering our pre-ordered Leap Motion device to the tools/innovation team here at Epic, and the day finally arrived for us to see what it can do.

For those who may not have heard about this little box of wonder, Leap Motion is a gestural input device that enables you to interact with your PC or Mac using hand gestures. Kinect (for Xbox and Windows) is also actively establishing itself in this space, but it offers a full-body visual and audio tracking solution, whereas Leap Motion is a tiny USB-key-sized device that sits on your desktop and tracks just your hands.

Gesture-based interaction is of great interest to us, because the new device allows us to explore how we can enhance HTML-based learning with gesture input. This could range from new interactions (e.g. gesture-based drag and drop) through to exploring and engaging with content in a whole new way (e.g. exploring a product in 360º).

As we unboxed the device, I couldn’t help but think of the film Minority Report, which is a great example of how a gesture-based interface might work in the not-too-distant future.

Setting up your Leap Motion device

Setting up the device was easy. A simple instruction sheet directed me to get started.

Once the device drivers and software were installed, I had access to the demonstrations and the Airspace app, which is essentially a launch pad for any Leap Motion-compatible apps you have installed on your computer.

The Airspace desktop app

There is also an online Airspace app store, similar to Apple’s App Store, which conveniently lists the Leap Motion-compatible apps currently available.

Coming to grips with using gestures for navigation

When you first use Leap Motion, an ‘Orientation’ app is launched which guides you through the key features of the device. After that you can begin exploring the different apps that are available through the Airspace app store.

The Cyber Science – Motion app allows you to explore the human skull in 3D and deconstruct its individual parts.

After using a number of these, what struck me most was the lack of consistent interaction patterns for gesture control. Each app operated differently, and you had to learn its individual gestures before you could interact with it successfully.

What’s next for Leap Motion?

As the technology establishes itself (and I hope it does), a default set of interactions needs to be agreed so that software developers can build on it and make the most of the possibilities this new technology provides.

The Leap Motion Visualizer displays a 3D grid that you can rotate, scale and translate by moving your hands and fingers within the device’s field of view.

There are some really interesting technical demos included with the Leap Motion SDK showing the device’s tracking capabilities. In addition, there are examples of what people have been creating with the Leap Motion JavaScript library, driving interactions directly in a web browser.

A simple hand and finger recognition HTML/JavaScript app
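To give a flavour of what that browser-based tracking looks like, here is a minimal sketch using the leap.js library. `Leap.loop`, `frame.hands`, `hand.fingers` and `hand.palmPosition` are the library calls as we understand them from the SDK; the `summariseFrame` helper and its output shape are our own illustration, not part of the library.

```javascript
// Turn one Leap Motion tracking frame into a plain summary object.
// summariseFrame is an illustrative helper, not part of the leap.js API.
function summariseFrame(frame) {
  return frame.hands.map(function (hand) {
    return {
      fingers: hand.fingers.length,  // number of tracked fingers on this hand
      palm: hand.palmPosition        // [x, y, z] position relative to the device, in mm
    };
  });
}

// In the browser, Leap.loop delivers a stream of tracking frames from the
// device (guarded here so the snippet is safe to load without the library).
if (typeof Leap !== 'undefined') {
  Leap.loop(function (frame) {
    console.log(summariseFrame(frame));
  });
}
```

Because the frame-handling logic is kept in a plain function, the same code could just as easily drive a drag-and-drop interaction as log positions to the console.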

It may be early days for the Leap Motion device, but we are keen to explore its capabilities, potential uses and accuracy.

Have you tried Leap Motion out yet? We’re interested in your thoughts. Share them in the comments below.