During his talk at the Bradford Animation Festival 2011, Blitz Games' Nick Adams revealed that his initial vision for the Puss in Boots game was to have Shrek's furry friend mimic the player's movements one-to-one. That approach turned out to be less fun than expected, so the team had to develop its own intelligent gesture system which, he believes, is more fun.
"We had to make the player feel connected, we had to make the player feel like a hero," he explained. "This is where we came up against a problem, because Puss in Boots is Zorro in cat-form: he's dynamic, he's got flair and every one of his poses has been lovingly crafted by Dreamworks animators, so he always looks awesome."
"Most players don't exhibit that same degree of flair."
The problem is that most players perceive their performance as more elegant than it actually is, so they don't feel that the clumsy movements reflected by Puss on screen are actually mirroring their own. Adams described this phenomenon as "egocentric bias."
As an example of egocentric bias, Adams referred to the famous Star Wars Kid. "I can't speak for this guy - I've never met him and I don't know what was going on in his head, but he probably thought he looked quite cool," he pondered. "Other people didn't see him in the same way."
Adams then went on to explain how they evolved the game's control system after discovering the shortcomings of one-to-one motion control. "We really wanted to create this one-to-one bond with the character, but after trying it, it didn't really work, and it didn't look cool," he said. "So somehow we had to go beyond one-to-one, but we had to do it in such a way that we didn't break the bond."
Adams and his team then decided to preconfigure a set of animations for Puss, then use gesture recognition to decide which of them most closely resembled the player's motion. "Rather than make the player look worse, we were making them look better - taking what they were doing and exaggerating it," said Adams.
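The idea of mapping a sloppy player motion onto the closest pre-authored animation can be sketched as a nearest-template match. The template names, trajectories, and distance metric below are illustrative assumptions, not Blitz Games' actual implementation:

```python
import math

# Each "gesture" is a short list of 2D hand positions sampled over time.
# These templates are hypothetical stand-ins for the hand-crafted animations.
TEMPLATES = {
    "overhead_slash": [(0.0, 1.0), (0.0, 0.5), (0.0, 0.0)],
    "side_swipe":     [(0.0, 0.5), (0.5, 0.5), (1.0, 0.5)],
    "thrust":         [(0.0, 0.5), (0.0, 0.5), (0.1, 0.5)],
}

def distance(a, b):
    """Mean Euclidean distance between two equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def match_gesture(player_motion):
    """Return the name of the template animation closest to the player's motion."""
    return min(TEMPLATES, key=lambda name: distance(TEMPLATES[name], player_motion))

# A wobbly downward swing still maps onto the crisp overhead slash,
# so the on-screen character looks better than the player did.
print(match_gesture([(0.1, 0.9), (0.05, 0.4), (0.15, 0.05)]))  # → overhead_slash
```

Because the output is always one of the authored animations, Puss never reproduces the player's clumsiness; the player's motion only selects which polished move plays.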
The team called this solution "semi-steering," but it had its problems as well. "We were reading the gesture, and then when it finished we'd play the animation," Adams explained. "But you end up with lag. It looked good, but it just didn't feel right - you were in this uncanny valley of animation. Sword fighting is such a fast dynamic action that any lag is going to feel horrible."
To address the lag, the team evolved their gesture recognition system into a gesture "precognition" system, which tries to predict the player's gesture halfway through it so the matching animation appears to trigger in real time.
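Extending the matching idea above, precognition can be sketched as classifying against only the prefix of each template that corresponds to the samples seen so far. Again, the templates and the prefix-matching heuristic are assumptions for illustration:

```python
import math

# Hypothetical full-gesture templates, sampled as 2D hand positions over time.
TEMPLATES = {
    "overhead_slash": [(0.0, 1.0), (0.0, 0.75), (0.0, 0.5), (0.0, 0.25), (0.0, 0.0)],
    "side_swipe":     [(0.0, 0.5), (0.25, 0.5), (0.5, 0.5), (0.75, 0.5), (1.0, 0.5)],
}

def prefix_distance(template, partial):
    """Compare only as many template samples as the player has produced so far."""
    prefix = template[:len(partial)]
    return sum(math.dist(p, q) for p, q in zip(prefix, partial)) / len(partial)

def predict_gesture(partial_motion):
    """Pick the most likely full gesture from a half-finished motion,
    so the matching animation can start before the swing completes."""
    return min(TEMPLATES, key=lambda n: prefix_distance(TEMPLATES[n], partial_motion))

# Two samples in - roughly the first half of a swing - the system can
# already commit to the overhead slash, hiding the recognition lag.
print(predict_gesture([(0.05, 0.95), (0.0, 0.7)]))  # → overhead_slash
```

The trade-off is that an early commitment can be wrong; a real system would presumably weigh prediction confidence against how much lag it can hide.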
"This was probably our greatest success in the game," said Adams. "It's the core mechanic that people are going to judge it on."