Name: Leap Motion Controller
Summary: Introduce support for input mappings using the Leap Motion Controller
Scope: Optional / pluggable Engine piece? Can it be a module?
Current Goal: Re-implement the initial Leap branch prototype using the shiny new Jitter lib (which also needs more work)
Phase: Implementation
Curator: Cervator + begla + potentially one more if somebody is interested?
Related: Giant mess of structure overhaul fun - not sure if this would go best in an alternative input directory separate from the main engine (if it can't be a straight module, anyway), Jitter library repo
Updates:
- Feb 8th - Prototype is functional, but pretty awful and rough to control
- Feb 16th - @Begla's first-round overhaul got us to two-handed full movement control, including camera view. Fly like Superman!
- Feb 18th - We technically have a video now, but it isn't very presentable yet
- Feb 23rd - New Leap SDK out, integrated along with a couple native gestures for attack & god mode
- March 9th - Leap SDK 0.7.5 out, upgraded our setup, stability is a little better. Polished video hopefully just about ready
- March 16th - Video published
- April 11th - Started extracting Marcel's LeapMotionP5 lib to a pure Java version to base Jitter on (with him on board). Structure looking good
- April 21st - First version of Jitter lib published!
So here's my little hinted-at secret: begla was contacted by Leap Motion last week, and we met with one of their representatives from San Francisco over Skype on Friday. Very interesting and impressive stuff, both the hardware (minimal motion-detection lag, high accuracy, many data points) and their product maturity, with 10k dev units prepared and many more pre-orders than that. In a nutshell it is a super Kinect for PC & Mac, maybe Android in the future (and Linux soonish (tm)).
We signed up as interested developers and they mailed us each a dev unit, free of charge; I got mine earlier today (Tuesday). begla might get his tomorrow? Unsure. Neat to be able to get hold of stuff like this, especially now that hardware startups are contacting us instead of the other way around via funding Kickstarters (OUYA, Oculus Rift).
So again, this is another one of those "would kick ass if we could do this, but no promises!" options. A bit cheaper and simpler this time vs the OUYA or the Rift, and within a few hours I had Terasology reading the motion of my hand in-game and triggering movement events - although admittedly said events aren't working yet, and my rush-code is full of dirty hacks just trying to get it going fast. An actual good developer could probably get it working from scratch in an hour.
Not to sound like a broken record, but this has a lot of potential for nifty controller alternatives. The Leap can pull lots of data from multiple hands, down to where individual fingers are pointing, the roll/yaw/pitch of your palm, distance from the controller, etc. I did some thinking and imagine you could do just about everything in-game without even touching the mouse or keyboard (though you might need voice chat to go with it in multiplayer). In particular, creature commands could be entertaining as hand gestures, and I miss casting spells via mouse gestures in Black & White.
If somebody is super curious to play with this, it is possible we can get another controller sent out (doesn't hurt to ask, I figure!) since the company is very eager to get interesting games implementing their stuff, especially by launch in about two months (in Best Buy, for starters). I'm still not coding enough to be in shape where it'll be worth my time, so if push comes to shove I can even send my unit onwards.
I'll share some more thoughts later on what kinds of motions could be behind which movement/action in-game, but I spent the whole evening trying to get input events working and have given up for now, very tired. I figured this might be a good consideration to go with the mouse button behavior thread going on (which I also haven't managed to crunch through fully due to this ...).
Immortius - could I ask for a quick bit of review to figure out what I'm doing wrong? I'm thinking it is probably something simple, and me not having messed with input in ages (if ever) is just leaving me missing it. I was also unsure what to initialize, and when.
17 is what the "key" variable holds for 'w' - not sure how to just say "forwards button" instead
- We need to instantiate a Controller class from the Leap SDK - it seems to work if I put it in the CoreRegistry, but doing so in StateSinglePlayer doesn't seem right ...
- I first created a LeapListener based on their SDK's listener class, but when hooked up as a listener (straight in the engine class, for testing) it would either exit after 59 frames if I didn't keep the Controller as an instance variable (the default max history is 59 frames), or crash Java outright if I did keep one around. Huh. It has native hooks just like lwjgl, btw.
- I cloned InputSystem to LeapSystem to see if I could pull frame updates manually in the main loop instead, which avoided the crashes but doesn't seem to get event communication done right. InputSystem is registered somewhat differently - not sure if that matters or if it is something else.
- I'm trying to simply fake a key input event to move the player forward (as if 'w' were pressed), but it seems I'm not faking it well enough. The LeapSystem initialize doesn't seem to persist its instance variables, and while I can just fetch stuff from the CoreRegistry each iteration, that doesn't seem right either - maybe that is part of the problem?
Code:
KeyEvent event = KeyDownEvent.create(17, 0f); // 17 is LWJGL's Keyboard.KEY_W, i.e. the 'w' key
CoreRegistry.get(LocalPlayer.class).getEntity().send(event); // fire the fake key press at the local player entity
I can make that trigger but the event just doesn't seem to go anywhere. Or rather, it goes through several registered event handlers none of which seem to really care about it. Any ideas?
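For what it's worth, the manual-polling workaround from the third bullet boils down to something like the sketch below. FrameSource here is just a stand-in for the real Leap SDK Controller (which buffers a short history of frames); the names and structure are illustrative only, not from the actual branch. The point is the duplicate-frame guard: the game loop can tick faster than the Leap produces frames, so we track the last frame id seen and skip repeats.

```java
import java.util.OptionalLong;

public class LeapPoller {
    /** Minimal stand-in for the Leap SDK's Controller.frame() call. */
    public interface FrameSource {
        long currentFrameId();
    }

    private long lastProcessedId = -1;

    /** Call once per game-loop tick; an empty result means nothing new to process. */
    public OptionalLong poll(FrameSource source) {
        long id = source.currentFrameId();
        if (id == lastProcessedId) {
            return OptionalLong.empty(); // same frame as last tick, skip it
        }
        lastProcessedId = id;
        return OptionalLong.of(id);
    }
}
```

With this shape, LeapSystem would own one LeapPoller instance and only translate a frame into input events when poll() returns a new id.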
Here's the actual logging for me raising my hand (palm flat, fingers together) from a low position above the device to a high position
Frame id: 2096284, timestamp: 18217952018, hands: 1, fingers: 0, tools: 0, framesTotal: 649
Hand sphere radius: 85.49996 mm, palm position: (145.418, 707.702, -183.015)
Previous hand y was 109.12005 while new is 707.70154 so delta is 598.5815
Detecting RAISED hand, triggering move forward
Hand pitch: 6.814376855197349 degrees, roll: 11.944047424249678 degrees, yaw: -20.013106551203098 degrees
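The "Detecting RAISED hand" line in that log amounts to a simple per-frame delta threshold on the palm's y position (in mm above the device). A minimal sketch of that check - the threshold value and names here are mine, not from the actual branch:

```java
public class HandRaiseDetector {
    /** Minimum upward palm movement (in mm between frames) treated as a deliberate raise. */
    public static final float RAISE_THRESHOLD_MM = 100f;

    private float previousY = Float.NaN;

    /**
     * Feed in the palm y position from the latest frame; returns true when
     * the hand moved up far enough since the last frame to count as RAISED.
     */
    public boolean update(float palmY) {
        boolean raised = !Float.isNaN(previousY)
                && (palmY - previousY) > RAISE_THRESHOLD_MM;
        previousY = palmY;
        return raised;
    }

    public static void main(String[] args) {
        HandRaiseDetector detector = new HandRaiseDetector();
        detector.update(109.12005f);                  // low hand: just records the position
        boolean raised = detector.update(707.70154f); // delta ~598.58 mm, as in the log
        System.out.println(raised ? "Detecting RAISED hand, triggering move forward" : "no gesture");
    }
}
```

The numbers in main() match the logged frames above, where the y delta of 598.58 mm triggers the move-forward event.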