I'm working on something odd but fun...
I've been learning my way around the Leap Motion integration in Unreal, and have started putting together a little game to test it out. With the new drivers, the hand tracking is very good, and I'm pretty excited about potential applications for it in the future.
I've seen the Vive and Oculus motion controllers, and while they seem very cool, there's something very tactile about being able to use your own hands. Leap Motion tracks your hands with an infrared camera, and as such it's the most intuitive VR interaction system I've seen to date.
There are a couple of downsides, of course. The lack of tactile feedback can make some actions, particularly fine manipulation, quite difficult or unsatisfying. You can get around this with some really solid audiovisual feedback, but it's still only meeting the problem halfway.
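To give a rough idea of what I mean by "solid feedback": the SDK reports a pinch strength per hand, and you can fire a sound or a highlight the instant it crosses a threshold. This is only a sketch against the classic Leap C++ SDK as I remember it (Leap.h, Leap::Controller, pinchStrength(); the Unreal plugin exposes the same data differently), and the thresholds are just numbers I've made up. In the actual game this would drive a sound cue and a material change rather than a print.

#include <iostream>
#include <thread>
#include <chrono>
#include "Leap.h"  // classic Leap Motion C++ SDK; the Unreal plugin wraps the same data differently

// Fires a "click" cue when pinch strength crosses a threshold.
// Separate press/release thresholds (hysteresis) stop the cue from
// chattering while the hand hovers right at the trigger point.
class PinchCue {
public:
    void update(const Leap::Hand& hand) {
        float strength = hand.pinchStrength();   // 0.0 (open hand) .. 1.0 (full pinch)
        if (!pinched && strength > 0.8f) {
            pinched = true;
            std::cout << "pinch start - play sound / highlight object here\n";
        } else if (pinched && strength < 0.5f) {
            pinched = false;
            std::cout << "pinch end - play release cue here\n";
        }
    }
private:
    bool pinched = false;
};

int main() {
    Leap::Controller controller;
    PinchCue cue;
    while (true) {
        Leap::Frame frame = controller.frame();   // latest tracking frame (polling, not the Listener API)
        Leap::HandList hands = frame.hands();
        for (int i = 0; i < hands.count(); ++i) {
            if (hands[i].isRight())               // right hand only, to keep the sketch short
                cue.update(hands[i]);
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
    }
}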
Because of the nature of the device, it can only see what you see. Move your hand out of its field of view and, as far as the software is concerned, your hand no longer exists. That has obvious issues on its own, but the same thing happens whenever your hand is occluded by anything, including your other hand, which makes complex gesture recognition really difficult.
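In practice that means any interaction code has to assume a hand can vanish mid-gesture. One way to soften it (again sketched against the classic Leap C++ SDK, with the 200 ms grace period and the confidence cutoff being numbers I've picked arbitrarily) is to hold the last good pose for a moment before treating the hand as gone, so a brief occlusion doesn't instantly drop whatever you were holding:

#include <chrono>
#include "Leap.h"  // classic Leap Motion C++ SDK

// Smooths over brief tracking dropouts (e.g. one hand occluding the other)
// by holding the last good palm position for a short grace period instead
// of reporting the hand as lost the instant tracking fails.
class OcclusionTolerantHand {
public:
    // Pass in the Hand looked up by id each frame; returns true while we
    // still consider the hand present.
    bool update(const Leap::Hand& hand) {
        auto now = std::chrono::steady_clock::now();
        if (hand.isValid() && hand.confidence() > 0.3f) {    // 0.3 is an arbitrary cutoff
            lastPalm = hand.palmPosition();                  // millimetres, Leap coordinate space
            lastSeen = now;
            return true;
        }
        return (now - lastSeen) < std::chrono::milliseconds(200);  // grace period
    }

    Leap::Vector palm() const { return lastPalm; }  // last known palm position

private:
    Leap::Vector lastPalm;
    std::chrono::steady_clock::time_point lastSeen{};
};

The calling code would latch onto a hand id from frame.hands(), then ask for that hand each frame with frame.hand(id), which returns an invalid Hand while it's out of view; only when update() finally returns false do you properly release whatever was being held.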
Finally, you have the double-edged sword that comes with accessibility. A system that relies on complete hand control will not be usable by people who do not have full use of both hands, effectively locking them out of that software. Whereas with other input devices you could remap controls, that's simply not viable here; your controller is your hand.
That said, I still think it's a neat little device, and pretty good value for what it is. Now it's time to see what I can do with it.