New Wiimote motion recognition software: LiveMove and LiveCombat
Having some difficulty getting your Wiimote to understand your movements? At the Artificial Intelligence and Interactive Digital Entertainment (AIIDE) Conference, held last June 6 to 8 at Stanford University, AiLive‘s John Funge and Wolff Dobson introduced two new software tools aimed at greatly improving Wiimote motion recognition and combat AI.
AiLive, a Palo Alto-based artificial intelligence firm, developed Context Learning, a learning-based AI technology designed to speed up how an AI learns and recognizes behavior.
LiveMove resulted from using Context Learning to address the motion recognition problem that plagues the Wiimote. Funge claimed:
We’ve had cases where people have been working for months on motion recognizers for the Wii remote and they’ve been struggling with it and they’ve used our LiveMove software and in an afternoon they’ve had better motion recognition than they’ve had in months trying to hand code solutions.
Simply put, LiveMove lets developers “teach” the Wiimote both basic and complicated gestures, so that it can recognize those patterns without requiring them to be replicated exactly.
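To make the idea concrete, here is a minimal Python sketch of how example-based gesture recognition can work: a few recorded demonstration traces per gesture, with a tolerance for matching live input. The GestureRecognizer class, its slack threshold, and the accelerometer-trace format are illustrative assumptions, not LiveMove's actual API.

```python
import numpy as np

def resample(trace, length=32):
    """Resample a (samples x 3) accelerometer trace to a fixed length."""
    trace = np.asarray(trace, dtype=float)
    idx = np.linspace(0, len(trace) - 1, length)
    return np.array([np.interp(idx, np.arange(len(trace)), trace[:, axis])
                     for axis in range(trace.shape[1])]).T

class GestureRecognizer:
    def __init__(self, slack=2.0):
        self.templates = {}   # gesture name -> list of resampled example traces
        self.slack = slack    # tolerance: how far live input may drift from a template

    def teach(self, name, examples):
        """Store a handful of demonstration traces for one gesture."""
        self.templates[name] = [resample(e) for e in examples]

    def recognize(self, trace):
        """Return the best-matching gesture name, or None if nothing is close enough."""
        query = resample(trace)
        best_name, best_dist = None, float("inf")
        for name, examples in self.templates.items():
            for template in examples:
                dist = np.linalg.norm(query - template) / len(query)
                if dist < best_dist:
                    best_name, best_dist = name, dist
        return best_name if best_dist <= self.slack else None
```

In this sketch, a game would call teach() with a few recorded demonstrations of, say, a frying or shaking motion, then feed live controller data to recognize(); raising the slack value makes matching more forgiving, which is the kind of tolerance Dobson describes below.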
Dobson and Funge demonstrated LiveMove with a test cooking game, first teaching the Wiimote basic movements such as frying, pounding, and shaking. To show the accuracy of LiveMove’s motion detection, Dobson then performed the same movements again, with the LiveMove-enabled Wii successfully identifying each action.
“The concept of slack is very important,” Dobson said, referring to how people unconsciously vary their gestures when they get excited. “Because as people get excited they begin to make very strange gestures and we still have to recognize it.”
AiLive also applied Context Learning to real-time combat AI, creating LiveCombat. If LiveMove enables people to teach the Wiimote how to properly identify gestures, LiveCombat enables users to teach combat AI how to fight and react to its surroundings.
To demonstrate the depth of LiveCombat’s learning program, Dobson used a simple hand-to-hand combat game called Tale of the Monkey King. At first, the in-game character did not respond to any virtual stimuli, but after Dobson taught it how to react and fight with a few simple controller inputs, the AI knew how to attack, round up, and pursue enemies.
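As a rough illustration of the learn-by-demonstration idea, here is a minimal Python sketch that records (observation, action) pairs while the player controls the character, then imitates by choosing the action from the most similar recorded situation. The ImitationCombatAI class, the nearest-neighbor matching, and the toy observations are assumptions for illustration, not AiLive's actual technique.

```python
import math

class ImitationCombatAI:
    def __init__(self):
        self.demonstrations = []   # list of (observation, action) pairs

    def observe_player(self, observation, action):
        """Record one moment of the player's demonstration."""
        self.demonstrations.append((observation, action))

    def act(self, observation):
        """Pick the demonstrated action from the most similar recorded situation."""
        if not self.demonstrations:
            return "idle"
        best_action, best_dist = "idle", float("inf")
        for recorded_obs, action in self.demonstrations:
            dist = math.dist(recorded_obs, observation)
            if dist < best_dist:
                best_action, best_dist = action, dist
        return best_action

# Hypothetical usage, with observations as (distance to enemy, own health):
# ai = ImitationCombatAI()
# ai.observe_player((1.0, 0.9), "attack")   # player attacked when an enemy was close
# ai.observe_player((8.0, 0.9), "pursue")   # player chased a distant enemy
# ai.observe_player((1.5, 0.1), "retreat")  # player backed off at low health
# print(ai.act((1.2, 0.8)))                 # -> "attack"
```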
The introduction of both LiveMove and LiveCombat should open up a lot of gameplay possibilities, not to mention make Wii gameplay a lot less stressful and frustrating. Being able to teach AI how to react and move will likely bring a new meaning to real-time battles as well.