Here’s another example of why an open source environment free of lawyers encourages innovation. A developer took a projector, an Android phone, and a Kinect and turned them into a futuristic UI (in alpha form, of course). It’s actually pretty cool and reminds me a bit of the movie Minority Report.
Developer DDRBoxman created a touchless, gesture-controlled Android UI for use when projecting an Android device onto a large screen. Using a Kinect hooked up to a Windows PC running Simple Kinect Touch, the developer’s movements are processed by the software, which then sends TUIO commands to his Galaxy Nexus running TuioForAndroid. These events are injected into the Android OS as touch events, which can control any application. Everything is displayed through a projector hooked up to the Galaxy Nexus via MHL.
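For a rough idea of what the last hop in that chain does: TUIO (the protocol Simple Kinect Touch emits and TuioForAndroid consumes) reports cursor positions normalized to the range 0.0–1.0, so the receiver has to scale them to the target display before synthesizing a touch event. Here’s a minimal sketch of that scaling step; the function name and the 1280×720 projection size are just illustrative assumptions, not taken from the developer’s code.

```python
# Sketch only: scale a normalized TUIO 2D-cursor position (as carried in a
# /tuio/2Dcur "set" message) to pixel coordinates on the projected screen.
# The 1280x720 resolution below is an assumed example, not DDRBoxman's setup.

def tuio_to_touch(norm_x, norm_y, screen_w, screen_h):
    """Map normalized [0.0, 1.0] TUIO coordinates to screen pixels."""
    x = min(int(norm_x * screen_w), screen_w - 1)  # clamp to last column
    y = min(int(norm_y * screen_h), screen_h - 1)  # clamp to last row
    return x, y

# A cursor at the horizontal center, a quarter of the way down:
print(tuio_to_touch(0.5, 0.25, 1280, 720))  # (640, 180)
```

The resulting pixel pair is what a receiver like TuioForAndroid would hand to the OS as the touch-down coordinate.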
I’m not going to pretend I know everything that’s going on here, but according to the developer, he had to download the AOSP source and compile a custom ROM in order to give TuioForAndroid system access. That access was needed to run under a system PID so it could inject the touch events into the OS. A lot of work has clearly gone into this project, and I’m looking forward to tracking its progress. This is what open source and innovation are all about. Kudos to the developer. Keep up the good work!
Source: Recursive Penguin