Wednesday, 12 December 2012

Bucks Students visit Pinewood Studios and Centroid - home of Europe's Largest Motion Capture Studio

Yesterday a group of students from Bucks went to visit the Motion Capture studio Centroid, based at Pinewood Studios. Motion Capture, sometimes called "Performance Capture", is an animation technique which records an actor's performance in real time and turns it into digital data that can be used to drive animated characters - like "Gollum", played by Andy Serkis in The Lord of the Rings.

Bucks and Centroid are pioneering a partnership that will offer Centroid opportunities for research and development, and also give the students a chance to use a leading industry facility for their student films. It's just the kind of thing that universities should be doing - partnering with industry to create a mutually beneficial relationship.
Pinewood Studios - home of 007

As for me, I couldn't wait to visit Pinewood Studios, home to Centroid and one of Britain's largest and most historic movie studios. And, of course, the home of James Bond himself. I wasn't disappointed. Just as at Disney in Burbank you get to stroll down Dopey Drive and Goofy Lane, at Pinewood I was delighted to find that I had to walk along Goldfinger Avenue to get to work.
Bond villains live on at Pinewood
Centroid says it is the largest Motion Capture Studio in Europe - which could easily be true. They have a large sound stage hosting 150 separate cameras which create real-time 3D visualisations of virtual characters, including full body and facial capture. Among the film projects they have worked on are Harry Potter, Prometheus and Watchmen, as well as countless TV commercials and games.

I was particularly delighted to learn that the room in which the motion performance would be captured was the very same sound stage in which Ripley, the character played by Sigourney Weaver, shot up the alien Queen in James Cameron's 1986 classic Aliens. I tried to imagine the room filled with giant plastic eggs and latex insects.
Ripley -v- Aliens. Shootout in the MoCap corral
We were warmly greeted at Centroid by Stuart, Phil and Cadell, who helped us to get started with what would end up being a full day's work. MoCap gets a much faster result than keyframe animation, but it still takes time to set up - even getting into the special suits, with all the markers properly placed, took the best part of an hour.

Two students, Fiona and Umar, were the first to get suited up for a martial arts performance - acting out scenes from Fiona's final year film project. But were we properly prepared? asked the guys from Centroid. Had we brought storyboards? Had we planned exactly what we wanted in advance? Had we brought some dirty old trainers like we were told to? The MoCap markers get attached to your shoes with gaffer tape - and this is not kind to expensive shoes. I got the feeling we were not the most organised clients they had ever had.
Fiona and Umar get suited


While the actors got into their suits, I wandered around Centroid to see what other goodies could be found. That's when I came across the Gun Room, where a whole range of weaponry was available, from automatic rifles to shotguns and pistols - useful for capturing data for shoot-'em-up games. I started to envy the students in the suits. This could obviously be fun.
The Gun Room - a boy's paradise
Once the performers were suited up, Stuart led Fiona and Umar individually through a technical exercise designed to capture a range of different physical motions, all of which looked a bit like a calisthenics warm-up. This exercise apparently helps the cameras and the computer system to recognise the markers on each individual MoCap performer, creating a unique digital map for each actor. That way, when the two performers acted together, the system would recognise each individual and not get confused.
The "T Pose" - Stuart and Umar calibrate the system
Once the system was calibrated, we were ready to shoot. Just like on a normal set, the director calls "action" and the cameras roll - all 150 of them. In this case the director was Fiona, who is making a short martial arts training film for her final year, and she directed Umar on set (it was a surprise to me that the director can actually be on the MoCap floor directing the action - the cameras only look for the markers, so they simply ignore her). At the beginning and end of each action Stuart shouted "T Pose please!"; the T Pose helps the cameras pick up the correct markers at the start and end of every shot.

The software used is Cortex, which comes with the Motion Analysis camera system. From Cortex the data can be exported as an "HTR" (Hierarchical Translation-Rotation) file into MotionBuilder, which is what we will use to animate with.
Umar gets turned into Cortex data
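To make sense of what that export actually contains, here is a toy sketch of the hierarchical translation-rotation idea: every joint is stored relative to its parent, and the world-space pose falls out by composing the transforms down the skeleton. The joint names, the rotation helper and the numbers are all made up for illustration - this is the concept, not the HTR file format itself.

```python
# A minimal sketch of hierarchical translation-rotation data: each joint stores
# a translation and rotation relative to its parent, and the world pose is
# recovered by walking down the skeleton. All names and values are hypothetical.
import numpy as np

def rot_z(degrees: float) -> np.ndarray:
    """Rotation about the Z axis, as a 3x3 matrix."""
    r = np.radians(degrees)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# (joint name, parent, local translation in cm, local rotation)
skeleton = [
    ("Hips",     None,      np.array([0.0, 100.0, 0.0]),  rot_z(0)),
    ("Spine",    "Hips",    np.array([0.0,  30.0, 0.0]),  rot_z(5)),
    ("LeftArm",  "Spine",   np.array([-20.0, 20.0, 0.0]), rot_z(45)),
    ("LeftHand", "LeftArm", np.array([-25.0,  0.0, 0.0]), rot_z(10)),
]

def world_pose(skeleton):
    """Accumulate each joint's transform down the hierarchy."""
    world = {}
    for name, parent, t, r in skeleton:
        if parent is None:
            world[name] = (r, t)
        else:
            pr, pt = world[parent]
            world[name] = (pr @ r, pr @ t + pt)   # compose rotation, then offset
    return {name: pos for name, (rot, pos) in world.items()}

for joint, pos in world_pose(skeleton).items():
    print(f"{joint:9s} world position: {np.round(pos, 1)}")
```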
The Cortex data drives a live performance by Plastic Man, a CG character helpfully coloured blue for the boys and pink for the girls. These two virtual characters reproduced, in real time, each and every movement made by the performers - sort of like having your own live avatar to mimic your actions.
Umar and Fiona get turned into Plastic Man and Plastic Girl
It quickly became clear that the accuracy of the cameras is of primary importance and needs constant tuning. The data is sometimes choppy, as it is all based on "line of sight" calculations - that is to say, the cameras get confused if they can't see the markers properly. For example, if a performer crosses his arms, hiding some of the markers, the cameras have difficulty remembering which marker was where. At one point during the day Plastic Man's bottom got re-attached to his front, to great hilarity.
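As an aside, here is a very rough sketch of what that choppiness looks like in the raw numbers - my own illustration, not how Cortex flags it. An occluded marker simply vanishes from the data for a run of frames, and a swapped label shows up as an impossibly large jump from one frame to the next. The marker name and threshold below are hypothetical.

```python
# Toy checks (not Cortex's method) for the two classic line-of-sight problems:
# gaps where a marker was occluded, and jumps where two labels got swapped.
import numpy as np

def find_gaps(traj: np.ndarray):
    """traj: (n_frames, 3) marker positions, NaN where the marker was occluded.
    Returns a list of (start_frame, end_frame) gaps."""
    missing = np.isnan(traj).any(axis=1)
    gaps, start = [], None
    for i, m in enumerate(missing):
        if m and start is None:
            start = i
        elif not m and start is not None:
            gaps.append((start, i - 1))
            start = None
    if start is not None:
        gaps.append((start, len(missing) - 1))
    return gaps

def find_jumps(traj: np.ndarray, max_step_mm: float = 50.0):
    """Frames where the marker moves implausibly far in a single step -
    often a sign that two markers have swapped labels."""
    steps = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    return np.where(steps > max_step_mm)[0] + 1

# Hypothetical usage: traj = load_marker("LeftHip")  # (n_frames, 3), in mm
# print(find_gaps(traj), find_jumps(traj))
```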

Atmospheric change matters too. It was a freezing cold day and, as the room warmed up, the cameras drifted slightly out of alignment. Every now and again Stuart would jog around the room with what looked like a three-pointed magic wand, recalibrating the cameras to keep them perfectly aligned.

In the afternoon we recorded some dance moves. Laura and Naomi, two students from the Dance Department, performed a two-minute piece that we will use as a teaching exercise, training animators to use MoCap data to create a performance. Raw MoCap data tends to be a bit soft and floaty, and a good animator can do a lot to make it crisper, snappier and more compelling.
Laura and Naomi from the Dance Department
And, finally, the data has to be cleaned up. "Sculpting trajectories", as Phil put it, is a time-consuming process - it's all about making sure that the cameras have picked up the correct markers, and making adjustments where necessary. As a result the typical client gets just 60 seconds of cleaned-up data from a day's shoot. As Stuart explained, MoCap is best suited to creating complex material; there is no real need to bother with the simple stuff. Clients tend to be selective about what they shoot - they don't want a huge bill.
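For what it's worth, here is a crude sketch of two of the most basic operations in that clean-up - interpolating across short occlusion gaps and lightly smoothing the result. This is my assumption of the general idea, not Centroid's actual tools; the real work of sculpting trajectories is done by hand in far more detail, which is why 60 seconds is a good day's yield.

```python
# A minimal sketch (my own assumption, not Centroid's workflow) of basic
# trajectory clean-up: fill short occlusion gaps by linear interpolation,
# then knock the worst jitter off with a small moving average.
import numpy as np

def fill_gaps(traj: np.ndarray) -> np.ndarray:
    """Linearly interpolate across NaN gaps in a (n_frames, 3) trajectory."""
    out = traj.copy()
    frames = np.arange(len(traj))
    for axis in range(3):
        col = out[:, axis]
        ok = ~np.isnan(col)
        out[:, axis] = np.interp(frames, frames[ok], col[ok])
    return out

def smooth(traj: np.ndarray, window: int = 5) -> np.ndarray:
    """Moving-average smoothing along the time axis."""
    kernel = np.ones(window) / window
    return np.column_stack(
        [np.convolve(traj[:, a], kernel, mode="same") for a in range(3)]
    )

# Hypothetical usage: cleaned = smooth(fill_gaps(raw_marker_trajectory))
```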

At the end of the day, I could see why directors love the process of shooting MoCap. If you can capture 60 seconds of usable action a day, then a 90-minute feature - 5,400 seconds - is roughly 90 shooting days, so you can shoot an entire movie in a few months: much, much faster than a traditionally keyframed animated film. Motion Capture is a process which is getting better all the time, and as animators we need to embrace it and make it work for us.

All of this of course is today's technology. But what really gets Centroid excited is what tomorrow will bring - live motion capture. Think Jessica Rabbit, on the red carpet, as a live projection, performing in front of a live crowd. Now that would be worth seeing.

----Alex



