Iris Classon - In Love with Code

Stupid Question 120: What is Leap Motion and The Leap?

[To celebrate my first year of programming I will ask a ‘stupid’ question daily on my blog for a year, to make sure I learn at least 365 new things during my second year as a developer]

The Leap by Leap Motion

Leap Motion is a San Francisco startup that recently got a lot of attention for its not-yet-launched product, The Leap. The Leap is a motion-detection device that captures multi-touch and finger gestures such as pinch and zoom, all the gestures we are used to on touch devices. The thing is that you don’t actually touch the device.

If you remember the discussion I had about Hick’s and Fitts’s laws, according to Fitts’s law this will basically make us more efficient, as we have less distance to travel and the target area will be larger. As a lover of Kinect, I must say I’m head over heels for this product and all the things I would be able to create for it. While I’ve been patiently waiting for touchless alternatives, and eye-controlled computers are slowly, with varying results, creeping into the market, I still think that motion-sensing software will be an even larger part of our future, integrated into every screen and laptop. As a matter of fact, Asus is already cooperating with Leap Motion, with the first computer scheduled for release this year.
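If you want to see the numbers behind that claim, here is a minimal sketch in Python using the Shannon formulation of Fitts’s law. The constants a and b are made-up illustrative values (real ones have to be measured for a given device), but it shows how a shorter distance and a larger target lower the index of difficulty and therefore the predicted movement time:

```python
import math

def fitts_index_of_difficulty(distance, width):
    """Shannon formulation of Fitts's law: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance, width, a=0.2, b=0.1):
    """MT = a + b * ID, with illustrative constants a (seconds) and b (seconds per bit)."""
    return a + b * fitts_index_of_difficulty(distance, width)

# A small, distant target (classic mouse pointing) versus a large, close
# target: the second has a lower index of difficulty, so the predicted
# movement time drops.
print(predicted_movement_time(distance=400, width=20))  # ~0.64 s
print(predicted_movement_time(distance=100, width=80))  # ~0.32 s
```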

While eye-tracking software integration with computers helps those with limited motor skills, 3D motion sensing might have a wider market and usage. My best bet is that a combination of both would be best, with the user being allowed to switch between the two (or three) modes, as I think that keyboard and mouse input will need to stay around for a little longer.

The Tobii PCEye, a product of Tobii, is an eye-tracking device and software that is already being integrated into computers and sold; it can also be bought as a stand-alone controller.

If you are a developer you can sign up at Leap Motion and tell them about your idea for an app, and if you are lucky (and so far 12,000 developers have been) you will get hold of The Leap prior to launch. I have a super-duper idea, so I hope mine gets accepted. We will see :)

If not, The Leap is planned for release next month, and it’s not expensive: 69 dollars. Yes, the price of The Leap by Leap Motion is 69 dollars. I’ll keep you posted on whether I get a device sent to me or not, non-coding fingers crossed!

What is your take on the future of this kind of technology, and what would be a great usage?

Comments

Leave a comment below, or by email.
Fredrik Jönsson
1/7/2013 2:53:43 AM
I doubt non-tactile interfaces will be huge outside gaming and some other very specific applications. It's hard enough to get the necessary precision with a mouse, trackpad and such, and waving in the air to manipulate objects on a screen is about as non-intuitive as it gets.

So, no, I don't think this will fly as much as they try to make me believe. I rather expect continuous improvements to touch interfaces, particularly improvements to tactile feedback. 
Iris Classon
1/7/2013 3:03:06 AM
Reply to: Fredrik Jönsson
Really? Fair enough :) I guess it will depend on how easy to use it will be. As soon as I get my hands (pun intended ;) ) on this thing I'll write up a review. 
Joe
1/7/2013 3:30:11 AM
Leap seems to be some pretty cool hardware tech. I think the biggest challenge now is getting 'outside the box' on usage. First instincts with input devices are to emulate what we know, for instance the Kinect 'cursor' or the Leap 'mouse demos' in the videos. I think first-gen software will largely be emulating touch, or doing mouse replacements with this kind of tech.


What interests me is the next-gen stuff...where the devs start looking at it as a totally new paradigm...a 3D visual input device. Can we model as if we're using virtual clay? Can we manipulate drawings with different layers by simply moving our hands/brushes in 3D space? If this ended up being another way to emulate touch, and the result is playing Angry Birds, I'd feel really sad :) Let's hope that all of us devs can really innovate as these visual-style interfaces become more commonplace!
Iris Classon
1/7/2013 3:36:13 AM
Reply to: Joe
Yeah, couldn't agree more! It is really up to us as devs to show the possibilities! 
Hasen Ahmad
1/7/2013 3:55:29 AM
A cool idea is to just implement the motion control from the Iron Man movies on Tony Stark's computer!
Fredrik Jönsson
1/7/2013 4:15:26 AM
The point with touch interfaces is that they are the most intuitive interfaces we've come up with to date. If you watch children interact with devices using mice, trackpads and touch interfaces, you'll see an obvious increase in usability with touch; the tactile feedback as well as the direct interaction with the manipulated object is fundamental.

My suspicion is that when we've passed the "ooh-it's-magic" sensation this will prove not so useful.

However, when we get to the point where we have true 3D interfaces and are able to use our hands to directly manipulate objects that are rendered "in the air" in some manner or other, it's a different story. Still, the lack of tactile feedback is a definite drawback.
Robin Osborne
1/7/2013 4:34:22 AM
Reply to: Fredrik Jönsson
No-touch gesture interaction will probably be welcomed in the medical world where hygiene is so important; currently a surgeon may need a nurse to change the x-rays and other documents being displayed at any one time during an operation, but what if this was all on a large screen and a few gestures could manipulate the information displayed?

I doubt there will be any alienation of users by taking away a physical interface; even young children can use a Kinect these days! 
Andy Dent
1/10/2013 7:20:09 AM
Expect a lot of people to have to start paying more attention to how they design in the Undo features of their app!

I'm a real "alternate-UI" gadget nut, collecting all sorts of one-handed keyboards and alternate touch or other input devices. I've been fighting the temptation to get a Kinect controller - this may push me over the edge! Thanks.

In SF worlds, alternate UIs abound. Steve Perry's Matador series has a "betydelse space" communicator where people do multi-modal gestures: "Carlos waved his right hand in a series of quick, short gestures. Programming mode signals, she knew, though she couldn't read them. Until recently, she hadn't known that much. At the same time, Carlos fluttered his left hand back and forth, wiggling his fingers in a precise pattern. Mathematical code. And, while both hands spoke separate languages to the transmitter, he sub-vocalized yet a third set of instructions to the machine."

One of the patterns I've seen discussed years ago is gestural interfaces for "steering" and voice or some other input for commands, e.g. Glove-TalkII from 1995.


Last modified on 2013-01-05
