Thanks to the Intel Galileo board, I can remove the wired part of the system. No more serial port, no more special buffering functions, a huge load off the implementation.
The Galileo board now sits on the same network as the client PC. It runs a simple web server on port 80 and reads requests from incoming client connections. The client PC, connected to the Leap Motion, continuously polls the Galileo board by sending a GET request with the calculated PWM values. The Galileo board receives the request, parses out the PWM values and sends that data to the assigned PWM pins that control the feedback to each finger.
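As a rough sketch of the parsing step, here is how the PWM values could be pulled out of the incoming request on the Galileo side. The route `/update` and the `f0`–`f4` parameter names (one per finger) are my own invention for illustration, not the actual wire format:

```javascript
// Hypothetical request format: GET /update?f0=120&f1=80&f2=0&f3=255&f4=60
// (one value per finger; the route and parameter names are assumptions)
function parsePwmRequest(path) {
  const query = path.split('?')[1] || '';
  const values = [0, 0, 0, 0, 0]; // default: all motors off
  for (const pair of query.split('&')) {
    const [key, value] = pair.split('=');
    const match = /^f([0-4])$/.exec(key);
    if (match) {
      // Clamp to the 0-255 range an 8-bit PWM pin expects
      values[Number(match[1])] = Math.min(255, Math.max(0, Number(value) | 0));
    }
  }
  return values;
}
```

Each entry of the returned array would then be written to its assigned PWM pin.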
I also put in a lot of special feedback functions. Instead of simple strength values, I can now code it to accept specific patterns of vibration.
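One simple way to represent such a pattern is a list of (PWM strength, duration) steps; this encoding is my own sketch, not the format the board actually uses:

```javascript
// Build a repeating buzz pattern: `count` pulses at `pwm` strength,
// each lasting `pulseMs`, with `gapMs` of rest between them.
// (The step encoding here is an assumption for illustration.)
function pulsePattern(pwm, pulseMs, gapMs, count) {
  const steps = [];
  for (let i = 0; i < count; i++) {
    steps.push({ pwm, ms: pulseMs }); // buzz
    if (i < count - 1) {
      steps.push({ pwm: 0, ms: gapMs }); // rest between pulses
    }
  }
  return steps;
}
```

The board would then walk the steps in order, writing each `pwm` value for `ms` milliseconds.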
There is no more serial driver, so the NodeJS app I wrote that accepted GET requests and translated them into serial data is no more.
Reading the Stabilized Z position of the fingers and hands is very laggy. Watch the video; it may be an issue with the current SDK.
For some reason the Stabilized Z data is very slow to calculate. If you watch the video and compare the Stabilized Z against the raw Z data, it's miles behind.
In fact I think this issue has been ongoing, as most of the stuff I have tried to code with the Z axis has run into lag issues! Looking back, I have abandoned heaps of ideas because of this, as I thought it was just my bad coding! Let's hope this is a bug!
I spent a few hours trying to get some kind of smooth operation using both hands on the Leap Motion, but I was not able to get reliable results. My left hand would get tired of holding still and move off the spot. So I decided to focus on detailed control and manipulation with one hand, and have the left hand use the keyboard. For example, in the video above I'm using the keys:
A = Width, S = Height, D = Depth, F = Location
While one of these keys is held down, I can use the right hand to select an object by hovering over it, and adjust its value by moving the thumb and index finger closer together or further apart.
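The pinch part of that gesture boils down to the distance between the thumb and index fingertips, mapped onto a value range. A minimal sketch, assuming Leap-style `[x, y, z]` tip positions in millimetres (the 20–120 mm pinch span is a guess, not a measured value):

```javascript
// Map the thumb-index distance onto a 0..1 value.
// Positions are [x, y, z] tip coordinates; min/max are assumed pinch limits.
function pinchValue(thumbTip, indexTip, min = 20, max = 120) {
  const d = Math.hypot(
    thumbTip[0] - indexTip[0],
    thumbTip[1] - indexTip[1],
    thumbTip[2] - indexTip[2]
  );
  // Normalise over the assumed span and clamp to 0..1
  return Math.min(1, Math.max(0, (d - min) / (max - min)));
}
```

Whichever property the held key selects (width, height, depth or location) would then be driven by this value.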
I have started working with dual-hand interactions: the left hand is for control selection, the right hand is for manipulation. The biggest difficulty is precise tracking of both hands; hopefully that will change as the Leap team puts more work into the SDK, as they have been doing recently. Light interference is also a huge issue.
The X coordinates of the index and thumb control the width, their Y coordinates control the height, and the X coordinates of the ring and middle fingers control the depth. I need to test the large version of this on a machine that's not 5 years old; I'm running into serious performance issues when rendering more than 10 SVGs, but that is totally expected.
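That finger-to-dimension mapping can be sketched in a few lines; here each argument is a fingertip position `[x, y, z]`, and using the raw coordinate spread as the dimension (with no scaling) is my simplification:

```javascript
// One possible mapping of fingertip coordinates onto an object's dimensions,
// following the scheme above: index-thumb X -> width, index-thumb Y -> height,
// middle-ring X -> depth. Scale factors are omitted for illustration.
function mapFingersToObject(thumb, index, middle, ring) {
  return {
    width: Math.abs(index[0] - thumb[0]),   // X spread of index and thumb
    height: Math.abs(index[1] - thumb[1]),  // Y spread of index and thumb
    depth: Math.abs(middle[0] - ring[0])    // X spread of middle and ring
  };
}
```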
This is a small example of the SVG code I'm working on. The screen has two sets of bands that centre in the middle. Left and right are controlled by the Y coordinates of each finger, and top and bottom are controlled by the X coordinate. I am working on manipulating the centre of all the polygons to simulate the Z coordinate. Here is a short video example:
The video is a tad laggy because my current dev machine is an old ThinkPad… because I like it. I'll do a few more videos on a newer machine later on.
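To give a feel for the band geometry described above, here is one horizontal band expressed as an SVG polygon `points` string, with its inner edge pushed toward the screen centre by a finger coordinate. This is a rough sketch under my own assumptions about the layout, not the actual code:

```javascript
// One top band as an SVG polygon: a rectangle from the top edge of the
// screen down to `fingerY`, clamped so it never crosses the centre line.
// (Treating the finger coordinate as a direct pixel offset is an assumption.)
function bandPoints(screenW, screenH, fingerY) {
  const inner = Math.min(screenH / 2, Math.max(0, fingerY));
  return `0,0 ${screenW},0 ${screenW},${inner} 0,${inner}`;
}
```

The result plugs straight into `<polygon points="…">`; the opposite band would mirror the same idea from the bottom edge.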
I'm testing out some stuff and have created this nifty little app. When you touch a blob, it grows (and at the same time makes a charge-up sound). The finger you touch with has its PWM value increased up to maximum until the finger has been touching the blob for a set timeout. Then the blob is deleted, a pew sound is played, and two new blobs are created at random spots close to where the original was. If you hold all five fingers over a set of blobs, or one large blob, they are deleted and a pew..pew…..pew……pew echo is played…
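The charge-up behaviour can be sketched as a small per-frame update: while a finger stays on a blob, its PWM ramps linearly toward 255, and once contact has lasted the full timeout the blob pops. The state shape and the 1000 ms default are my own assumptions for illustration:

```javascript
// Advance one blob by one frame. `blob` is { heldMs, pwm, popped },
// `touching` says whether a finger is on it, `dtMs` is the frame time.
// (State shape and timeout are assumptions, not the app's actual code.)
function tickBlob(blob, touching, dtMs, timeoutMs = 1000) {
  if (!touching) {
    // Finger lifted: reset the charge and switch the motor off
    return { ...blob, heldMs: 0, pwm: 0, popped: false };
  }
  const heldMs = blob.heldMs + dtMs;
  // Ramp PWM linearly from 0 to 255 over the timeout window
  const pwm = Math.min(255, Math.round(255 * heldMs / timeoutMs));
  return { ...blob, heldMs, pwm, popped: heldMs >= timeoutMs };
}
```

When `popped` comes back true, the app would delete the blob, play the pew sound and spawn the two replacements nearby.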