How The Hague Tech offered a helping hand to bring game characters to life through motion capture
For the past six months, I have been hard at work developing my first game: a first-person neo-noir detective adventure, set in the grisly underbelly of a futuristic city where aliens and humans coexist. The sheer number of animations required to bring the different characters and species of this world to life is staggering, so traditionally keyframing every animation simply wasn't an option.
My adult life has been spent working on visual effects for film franchises such as Planet of the Apes and The Hobbit, where motion capture plays a huge role in bringing iconic creatures like Gollum and Caesar to the big screen. The motion capture technology utilised on those projects and in large AAA game studios is well beyond the reach of my humble project, though, with full motion capture setups starting in excess of US$10,000 and running to well above $100,000.
So how does one leverage this technology when the project is self-funded? In years past such an undertaking may have been a pipe dream, but in the last decade more software and hardware options have been put in the hands of the masses. Anyone with a fast enough computer and access to the internet has a whole world of creative opportunities, given the patience and passion to build something from nothing.
Settling on a motion capture solution was an easy decision: I used ipiSoft, a motion capture program that puts the power of motion capture into the hands of indies. The process is similar in many ways to larger, more mainstream options: an actor's movements are tracked via an array of cameras capturing multiple angles of the performance, and those movements then drive the bones of a digital character.
The difference is in the implementation and hardware. Rather than an array of infrared cameras that track reflective markers attached to the actor's body, we use an array of six PlayStation Eye cameras that track RGB (colour) data. The great thing about these cameras is that although the resolution is low, you get a locked 60 frames per second and a wide viewing angle, perfect for capturing fast motion. And the greatest thing of all: they are cheap. We picked up our cameras for 8 euros apiece!
Once we had the software and hardware sorted, we noticed we weren't getting correct head or hand tracking of the actor. This is one of the drawbacks of this low-end method of motion capture: rather than tracking points, the software recognises the colour of the actor's torso, head, arms and legs from the RGB data. When the base pose is set up correctly, the software tracks each frame, trying to match the silhouette of these colours from all angles throughout the frame range. It's a novel concept that I have only seen ipiSoft employ, but the drawback is a loss in fidelity for end-bone rotations such as the hands and head.
The solution was to get three PlayStation Move controllers, take them out of their casing, shedding excess weight such as the rumble motor, and attach them to two gloves and a helmet. The result is that we can record the gyro data from these controllers and apply the rotations directly to the head and hand bones after the software has already tracked their positions in 3D space.
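To illustrate the idea, the controller's gyro orientation can be composed with the body rotation from the camera solve and attached to the bone the solver has already positioned. This is only a minimal sketch under my own assumptions, not ipiSoft's actual code; the quaternion helper and the bone structure are invented for illustration:

```python
def quat_multiply(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def apply_gyro_to_bone(tracked_position, body_rotation, gyro_rotation):
    """Hypothetical combine step: the bone's position comes from the
    camera solve, while its rotation is the solved body rotation
    composed with the Move controller's gyro reading (the controller
    is mounted on the glove or helmet, so its reading is relative to
    the parent)."""
    rotation = quat_multiply(body_rotation, gyro_rotation)
    return {"position": tracked_position, "rotation": rotation}
```

With an identity body rotation, the bone simply inherits the gyro orientation; in practice a calibration offset between controller and bone would also be baked in.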
The result? Full body motion capture at a fraction of the cost of a traditional motion capture setup. So now we have software? Check. Hardware? Check. How about the space to capture the performance?

This is where our friends at The Hague Tech stepped in. Esther Pronker, the game's producer and my partner in crime, had been tasked with finding a shooting location that fit the requirements for motion capture: a large space, well lit, private and, of course, cheap. She told me about The Hague Tech, a mere stroll down the road from where the game is being developed in The Hague. She spoke to Olga, the community manager there, who was just awesome in allowing us to come and check out the space. It was perfect. So a fortnight later we were back at The Hague Tech and used one of the spaces there to set up for the shoot. It was a relatively short shoot, two full days, so we worked hard to get as much usable data for as many characters as possible.
The final result was more than 250 gigabytes of raw footage to be processed: around 500 separate animations, including walk and run cycles, combat animations and cutscene performances, all done by one actor, one coordinator, six PS Eye cameras and a laptop. From here the data is processed and exported to an external 3D program, the skeleton with animation applied is retargeted to a character rig, the animation is tweaked (looping walk cycles, fixing jitters in the raw data, etc.) and finally exported to Unreal Engine.
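To give a flavour of the cleanup stage, here is a minimal sketch of two common fixes applied to a single animation channel (one bone value per frame): damping jitter with a moving average, and cross-fading a clip's tail into its head so a walk cycle loops cleanly. These are assumptions about the general technique, not our actual tooling, which does this inside a 3D package:

```python
def smooth_channel(samples, window=5):
    """Moving-average filter over one animation channel (e.g. a
    bone's X position per frame) to damp capture jitter. `window`
    is a tweakable assumption; real cleanup would use curve filters."""
    half = window // 2
    smoothed = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        smoothed.append(sum(samples[lo:hi]) / (hi - lo))
    return smoothed

def make_loop(samples, blend=4):
    """Cross-fade the last `blend` frames toward the first frame so
    the clip ends where it starts and the cycle loops seamlessly."""
    out = list(samples)
    n = len(out)
    for i in range(blend):
        t = (i + 1) / blend   # ramps 0 -> 1 across the blend region
        j = n - blend + i
        out[j] = out[j] * (1 - t) + out[0] * t
    return out
```

After `make_loop`, the final frame equals the first, so the playback wraps without a pop; the same idea applies per bone channel across the whole cycle.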
The time and money saved versus conventional motion capture really is a testament to how even the smallest teams can bring big ideas to life, and I couldn't be more thankful to The Hague Tech for helping us toward our goal. We will be more than happy to shoot there again when the project is further along. The project is currently in prototype, and we aim to launch a website and crowdfunding campaign in 2019.
Benjamin Swinbanks, founder, creative director - Simmersive Digital