In my original prototype plan, I said the second prototype would be photographs of hand motions during crocheting. However, I was given a Leap Motion Controller the week before and decided to give that a go instead.
I attempted to set up the Leap Motion Software Development Kit (SDK) on my laptop, but unfortunately, in class last week, I couldn't open the program or get my computer's USB port to recognize that the Leap controller was plugged in.
This week, though, I finally managed to get the program working. I discovered that the Leap Motion software senses the user's hand (or hands), displays it on the computer screen, and automatically creates a rig for the hand(s). It's really neat!
I've been trying to figure out how to get it to sense an object the user is holding (in my case, the crochet hook and yarn). I believe the term for this is "tool tracking." I wanted to work that out for the next prototype phase, but sadly, I haven't had any luck finding a setting or feature within the program that rigs objects.

So my next prototype is going to be a mock-up of how the interactive stitcher will look on screen. I'm going to take a screenshot of my rigged hand in the Leap Motion app and edit in the guide dialogue, the crochet hook, and the pattern. I'll be showing those prototypes and letting people test out the Leap Controller at Art After Dark.