
War for the Planet of the Apes: An interview with Weta Digital


RENDERU.COM spoke to Weta Digital, a digital visual effects company based in New Zealand. VFX Supervisor Luke Millar told us about the main challenges the artists faced and, of course, about the incredible motion capture work done on War for the Planet of the Apes.

RENDERU.COM: Could you tell us in detail about the most unique and interesting aspects of the motion-capture technology?

Luke Millar: Probably the biggest advancement in the motion capture work done on War for the Planet of the Apes was that almost all of the capture was carried out on the actual shoot location rather than on a stage within a motion capture volume, as has been traditional in the past. The main advantage this gave us was that the film could be shot and performed simultaneously, enabling the ape actors to interact with the environment and the other human characters. The result is incredible realism in the quality of the motion and the ability to capture the true emotional intent of the performances. It also allows the director to shoot the film and direct the performance as if the apes themselves were sitting there on set!



Images: Twentieth Century Fox CIS


RENDERU.COM: How has the technology been enhanced since Dawn of the Planet of the Apes?


L.M.: Since Dawn of the Planet of the Apes, we have made a number of technological advancements, most notably around lighting and rendering. We have developed a proprietary path tracing render engine, Manuka, allowing us to trace light paths from sources within a scene and build a very accurate picture of how objects would react if lit in reality. This has gone hand in hand with changes to our lighting pipeline, where we are now able to capture and honestly recreate the lighting that was present on set. We can also use these 'verified' lighting scenarios within the look development process, so we can be very accurate in dialling in the level of realism. I remember being blown away by the level of realism we could achieve on Dawn, but with War these technological advancements have taken it to the next level.
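For readers less familiar with path tracing, the short sketch below illustrates the general idea Luke describes, in plain Python rather than anything resembling Manuka itself: rays are bounced around a toy scene of spheres, and light is accumulated only when a path reaches an emissive surface, so averaging many random paths per pixel converges towards physically plausible lighting. The scene, camera and sample count are invented purely for illustration.

```python
# Minimal Monte Carlo path tracing sketch (illustrative only, not Manuka).
# Scene: a diffuse ball and a floor lit by one emissive sphere.
import math, random

SPHERES = [
    # (center, radius, diffuse albedo, emission)
    ((0.0, -1000.0, 5.0), 999.0, (0.8, 0.8, 0.8), (0.0, 0.0, 0.0)),  # floor
    ((0.0, 0.0, 5.0), 1.0, (0.7, 0.3, 0.2), (0.0, 0.0, 0.0)),        # matte ball
    ((2.0, 3.0, 4.0), 1.0, (0.0, 0.0, 0.0), (12.0, 12.0, 12.0)),     # area light
]

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):
    length = math.sqrt(dot(a, a))
    return mul(a, 1.0 / length)

def hit(origin, direction):
    """Return (distance, sphere) for the closest intersection, or None."""
    best = None
    for s in SPHERES:
        oc = sub(origin, s[0])
        b = dot(oc, direction)
        disc = b * b - (dot(oc, oc) - s[1] * s[1])
        if disc > 0.0:
            t = -b - math.sqrt(disc)
            if 1e-4 < t and (best is None or t < best[0]):
                best = (t, s)
    return best

def random_bounce(normal):
    """Pick a random bounce direction biased around the surface normal."""
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if dot(d, d) < 1.0:
            return norm(add(normal, norm(d)))

def trace(origin, direction, depth=0):
    """Follow one light path and return the radiance it carries back."""
    if depth > 4:
        return (0.0, 0.0, 0.0)
    h = hit(origin, direction)
    if h is None:
        return (0.0, 0.0, 0.0)                        # path escaped the scene
    t, (center, radius, albedo, emission) = h
    p = add(origin, mul(direction, t))
    n = norm(sub(p, center))
    bounced = trace(p, random_bounce(n), depth + 1)
    return add(emission, tuple(albedo[i] * bounced[i] for i in range(3)))

# Average many random paths through one pixel; more samples means less noise.
samples = 256
pixel = (0.0, 0.0, 0.0)
for _ in range(samples):
    pixel = add(pixel, trace((0.0, 1.0, -3.0), norm((0.0, -0.1, 1.0))))
print(mul(pixel, 1.0 / samples))
```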




RENDERU.COM: The apes’ proportions are slightly different from humans. How did you solve this problem?


L.M.: There are a number of areas where the differences between ape anatomy and human anatomy throw up some challenges. We have the apes and their human actor counterparts well mapped out so that we can translate motion from human to ape. However, the ape muzzle in particular is very different from the anatomy of a human face, so translating the facial performances required a fair bit of tuning and keyframe animation from the animation team.
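As a purely illustrative aside (this is not Weta's pipeline), the sketch below shows the basic idea behind transferring motion between skeletons of different proportions: the joint rotations solved from the human performance are reused, while forward kinematics rebuilds the joint positions using the ape's own bone lengths. The bone lengths and the sample pose are made-up numbers.

```python
# Hypothetical retargeting sketch: copy joint rotations from a "human" arm onto
# an "ape" arm with different bone lengths, then rebuild joint positions with
# forward kinematics. Real production retargeting handles full 3D skeletons,
# offsets and per-joint tuning; this only illustrates the rotation-transfer idea.
import math

# Bone lengths (shoulder->elbow, elbow->wrist), in arbitrary units.
HUMAN_ARM = [0.30, 0.28]
APE_ARM = [0.38, 0.40]                     # proportionally longer arms

# A captured human pose: one rotation per joint, in radians.
human_pose = [math.radians(40.0), math.radians(25.0)]

def forward_kinematics(bone_lengths, joint_angles, root=(0.0, 0.0)):
    """Accumulate joint rotations down the chain and return joint positions."""
    positions = [root]
    total_angle = 0.0
    x, y = root
    for length, angle in zip(bone_lengths, joint_angles):
        total_angle += angle
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
        positions.append((x, y))
    return positions

# Retarget: reuse the human joint angles, but drive the ape's proportions.
human_joints = forward_kinematics(HUMAN_ARM, human_pose)
ape_joints = forward_kinematics(APE_ARM, human_pose)

print("human wrist:", human_joints[-1])
print("ape wrist:  ", ape_joints[-1])      # same pose, different reach
```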


Another challenge was that, because the actors were always present on set, we would often need to paint them out before integrating the digital apes. The ideal situation in this scenario is that the digital asset replacing the actor is larger and so covers up most or all of the actor. Unfortunately, in War, this was not the case! There was a lot of complex paintwork to remove the actors. This was particularly challenging where you had a child actor (Devon) playing Cornelius who was over double the size of the digital ape!




RENDERU.COM: What was the main challenge with the facial animation?


L.M.: I think the biggest challenge with the facial animation is capturing the performance of the actor and being able to translate that performance to the digital ape so that the same emotional intent is conveyed. There is an incredible amount of subtlety in facial motion, and it isn't as simple as pressing a button and transplanting the motion capture data. We have a team of incredibly skilled motion editors and animators who can look at the original actor's performance, dial and tune the data, and keyframe animate where necessary so that when you see the final render, you recognise and identify the actor's performance. Matt Reeves was incredibly sensitive to the performance that the actors gave, and being able to 'see' that performance within the final image is key to achieving the level of emotion that we have.
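To give a rough sense of what 'dialling and tuning the data' can mean in practice, here is a hypothetical sketch of remapping solved facial weights from an actor onto a creature rig, with animator keyframes layered on top. The shape names, gains and offsets are invented; Weta's facial pipeline is far more sophisticated than this.

```python
# Hypothetical sketch of remapping solved facial weights from an actor's face
# onto a creature rig. All names and numbers are invented for illustration.

# Per-frame weights solved from the actor's facial capture (0..1).
actor_frame = {"browRaise": 0.8, "lipCornerPull": 0.6, "jawOpen": 0.35}

# Artist-tuned mapping: creature shape name, gain, and additive offset.
REMAP = {
    "browRaise":     ("ape_browRaise",  1.10, 0.00),
    "lipCornerPull": ("ape_muzzlePull", 0.65, 0.05),  # muzzle needs damping
    "jawOpen":       ("ape_jawOpen",    1.30, 0.00),  # longer jaw, wider open
}

def retarget_face(frame, animator_overrides=None):
    """Map actor weights to creature weights, then let hand-keyed values win."""
    creature = {}
    for shape, weight in frame.items():
        target, gain, offset = REMAP[shape]
        creature[target] = min(max(weight * gain + offset, 0.0), 1.0)
    # Keyframe animation layered on top of the solved data.
    if animator_overrides:
        creature.update(animator_overrides)
    return creature

print(retarget_face(actor_frame, {"ape_muzzlePull": 0.5}))
```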



RENDERU.COM: Could you tell us about the fur simulation? Did you use new tools?


L.M.: We did quite a bit of development on our fur-to-fur simulation toolsets for War. There were a number of challenges throughout the film; in the past, we would have relied on the underlying skin to collide with another ape's fur, but that wasn't giving us the level of realism desired. One scene worth noting is when Caesar strangles Winter. We went through a number of iterations to get the right sense of struggle and tension between Caesar's arm and Winter's head to make it believable that he was being choked. The problem with hair-to-hair simulation is stability, so the creature team put in some incredible effort to deliver sims that not only looked convincing but were smooth and stable.
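As an illustration of why stability is the hard part, the sketch below simulates a single strand as a chain of particles with Verlet integration and constraint projection, colliding against a sphere standing in for another character. It is not Weta's solver; the point is simply that substeps and constraint iterations are what keep a pressed-in strand from jittering or exploding, and every extra one costs simulation time.

```python
# Hypothetical strand-simulation sketch: a chain of particles with Verlet
# integration, length constraints, and collision against a sphere.
import math

SEGMENTS = 12
REST_LEN = 0.05
GRAVITY = (0.0, -9.8, 0.0)
SPHERE_C, SPHERE_R = (0.15, -0.4, 0.0), 0.3     # stand-in for the other character

# Particle state: current and previous positions (Verlet integration).
pos = [(0.0, -i * REST_LEN, 0.0) for i in range(SEGMENTS + 1)]
prev = [p for p in pos]

def step(dt, substeps=8, iterations=10):
    """Advance the strand; substeps and iterations buy stability at a cost."""
    global pos, prev
    h = dt / substeps
    for _ in range(substeps):
        # Integrate every particle except the root, which stays pinned to the skin.
        new = [pos[0]]
        for i in range(1, len(pos)):
            x, y, z = pos[i]
            px, py, pz = prev[i]
            new.append((2 * x - px + GRAVITY[0] * h * h,
                        2 * y - py + GRAVITY[1] * h * h,
                        2 * z - pz + GRAVITY[2] * h * h))
        prev, pos = pos, new
        # Constraint projection: keep segment lengths, push particles out of the sphere.
        for _ in range(iterations):
            for i in range(1, len(pos)):
                ax, ay, az = pos[i - 1]
                bx, by, bz = pos[i]
                dx, dy, dz = bx - ax, by - ay, bz - az
                d = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
                corr = (d - REST_LEN) / d
                if i > 1:       # share the correction between both particles
                    pos[i - 1] = (ax + dx * corr * 0.5, ay + dy * corr * 0.5, az + dz * corr * 0.5)
                    pos[i] = (bx - dx * corr * 0.5, by - dy * corr * 0.5, bz - dz * corr * 0.5)
                else:           # the root is pinned, so the child takes it all
                    pos[i] = (bx - dx * corr, by - dy * corr, bz - dz * corr)
            for i in range(1, len(pos)):
                cx, cy, cz = pos[i]
                dx, dy, dz = cx - SPHERE_C[0], cy - SPHERE_C[1], cz - SPHERE_C[2]
                d = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
                if d < SPHERE_R:                        # inside: project back to the surface
                    s = SPHERE_R / d
                    pos[i] = (SPHERE_C[0] + dx * s, SPHERE_C[1] + dy * s, SPHERE_C[2] + dz * s)

for frame in range(24):                                 # one second at 24 fps
    step(1.0 / 24.0)
print("strand tip after one second:", pos[-1])
```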


Another area that we focussed on was the effect of wind upon fur. We had blizzards, snowfalls and horseback chases to contend with, and part of the challenge was to get complexity within the fur sim so it didn't move en masse. Sometimes we would even go to the extent of running two or three simulations acting on various fur groups on a single ape in a shot to achieve that level of motion.
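One simple way to picture 'complexity within the fur sim' is to drive different fur groups with decorrelated wind signals, as in the hypothetical sketch below. The group names, gust frequencies and phases are invented, and a production wind field would be full 3D turbulence rather than a sine wave, but the decorrelation is what stops the fur moving en masse.

```python
# Hypothetical sketch of layering wind onto fur groups so they don't move en
# masse: each group gets its own gust frequency and phase, so shoulder, back
# and forearm fur respond to the "same" blizzard slightly differently.
import math

BASE_WIND = 6.0          # metres per second, nominal blizzard strength

# Per-group variation: (gust frequency in Hz, phase offset, local multiplier)
FUR_GROUPS = {
    "shoulder": (0.7, 0.0, 1.00),
    "back":     (0.9, 1.3, 0.85),
    "forearm":  (1.4, 2.6, 1.20),   # shorter fur, whips around more
}

def wind_strength(group, t):
    """Wind magnitude felt by one fur group at time t (seconds)."""
    freq, phase, gain = FUR_GROUPS[group]
    gust = 0.5 + 0.5 * math.sin(2.0 * math.pi * freq * t + phase)
    return BASE_WIND * gain * gust

# Sample one second of animation at 24 fps: the groups stay decorrelated.
for frame in range(0, 25, 6):
    t = frame / 24.0
    print(f"frame {frame:2d}: " + ", ".join(
        f"{g}={wind_strength(g, t):4.1f}" for g in FUR_GROUPS))
```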


