FMX 2014: Conference on Animation, Effects, Games and Transmedia
April 29, 2014
Without a doubt, this festival was worth every penny spent and more. As a second year student, I took away so much valuable information and was presented with so many incredible opportunities for networking and getting feedback on my own work. I definitely feel more focused and determined to make the most of my third year whilst working towards a show reel that can now be targeted at specific companies, having spoken to recruiters and discovered what each company is looking for. There's no doubt in my mind that I'll be attending again next year, when I'll be looking to break into industry, as I can't imagine a better atmosphere or more sociable place for doing so.
Being a major geek, I was one of the few people who sat with a notepad scribbling away, sometimes blindly, in the darkest of lecture rooms in an attempt to remember everything I was learning from the experts in industry. Below are some of the notes I made in an attempt to encapsulate my experience.
Tuesday 22nd April - Day One
The Art of Animation
Advancing Visual Effects in "How to Train Your Dragon 2"
Scott Peterson, DreamWorks
The very first talk I went to set the standard high for the rest of the festival; it was extremely interesting to find out that the lead animator for one of the dragons in the last film had actually been recruited at a previous FMX festival. We were lucky enough to be shown the first five minutes of the new film in 3D and I couldn't help but be blown away by the animation quality and, as always, how pretty it looked. One of the things that stood out was Hiccup's new dragon blade, and it was fascinating to learn the ways in which they conquered the challenges behind it. Alongside the dragon blade, we also see a new type of fire that we haven't seen dragons make before; the glow in their mouths, and how they achieved it, was broken down for us step by step.
One of the questions, about the software the company uses for animating, actually led to an extremely evasive answer and a comment telling us to keep our eyes and ears open in regards to something to do with character animation. I'll have to keep a look out for more information on this!
Kenneth McDonald, Teppei Takehana, Quantic Dream
Games aren't something we get taught about specifically on our course, so it was really interesting to learn about the technicalities behind Beyond: Two Souls. Portable digital scanners are used to create models for the games, and the raw scans don't require much clean up. They can be used for all kinds of things, such as capturing expressions to create the many blend shapes needed for animating. A specific company is hired in to do the scanning for them, but the portability is essential as actors and actresses aren't always willing to travel to studios.
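To make the blend shape idea concrete, here's a minimal sketch of how blend-shape deformation works in general (my own illustration of the standard technique, not Quantic Dream's actual pipeline): each scanned expression is stored as per-vertex offsets from the neutral mesh, and the animator mixes them with weights.

```python
# Minimal blend-shape sketch: a scanned expression becomes a set of
# per-vertex deltas from the neutral mesh; blending sums weighted deltas.

def blend_shapes(neutral, targets, weights):
    """neutral: list of (x, y, z) vertices; targets: {name: list of deltas,
    one per vertex}; weights: {name: 0..1}. Returns the deformed vertices."""
    result = [list(v) for v in neutral]
    for name, w in weights.items():
        for i, delta in enumerate(targets[name]):
            for axis in range(3):
                result[i][axis] += w * delta[axis]
    return [tuple(v) for v in result]

# Toy example: one vertex, a "smile" target that lifts it by one unit.
neutral = [(0.0, 0.0, 0.0)]
targets = {"smile": [(0.0, 1.0, 0.0)]}
print(blend_shapes(neutral, targets, {"smile": 0.5}))  # → [(0.0, 0.5, 0.0)]
```

In practice this runs over thousands of vertices and dozens of targets at once, which is why clean scan data matters so much: noisy deltas blend into noisy faces.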
One of the key things I took away was that limitations are key when it comes to making games. Memory is everything! Rigging is kept very light - no toe rolls, for example, as you would usually have for full feature animations. Using motion capture is difficult for this particular game because there are so many options that could be played out, all of which have to be captured, which makes it hard to keep everything linked whilst capturing. All of the motion capture work is targeted in Maya and handed over to animators. They developed software that can map one character's animation onto another, a retargeting system that really helped with background characters and secondary animation.
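The retargeting idea can be sketched in a few lines. This is a hypothetical simplification of the general technique (not Quantic Dream's software): joint rotations are copied across via a name mapping, and root translation is scaled by the ratio of the characters' hip heights so stride lengths match leg lengths.

```python
# Hypothetical animation-retargeting sketch: transfer rotations joint-for-
# joint, scale translations by the target/source hip-height ratio.

def retarget(frames, joint_map, src_hip_height, dst_hip_height):
    """frames: list of {joint: {'rotate': (rx, ry, rz),
    'translate': (tx, ty, tz)}}; joint_map: {source_joint: target_joint}.
    Returns the frames re-expressed on the target skeleton."""
    scale = dst_hip_height / src_hip_height
    out = []
    for frame in frames:
        new_frame = {}
        for src, pose in frame.items():
            dst = joint_map.get(src)
            if dst is None:
                continue  # source joint has no counterpart on the target rig
            new_frame[dst] = {
                'rotate': pose['rotate'],  # rotations transfer directly
                'translate': tuple(scale * t for t in pose['translate']),
            }
        out.append(new_frame)
    return out

# One captured frame mapped onto a character half the height.
frames = [{'hip': {'rotate': (0.0, 0.0, 90.0), 'translate': (0.0, 10.0, 0.0)}}]
print(retarget(frames, {'hip': 'Hips'}, src_hip_height=10.0, dst_hip_height=5.0))
```

Real systems also handle differing joint counts, bone roll and foot contact, but even this crude mapping shows why retargeting makes crowds of background characters cheap: one capture session can drive many rigs.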
Due to the high quality of the motion capture and the skill of the animators, only five people worked on all of the facial animation whilst just sixteen covered all of the body animation. This talk was thoroughly enjoyable and interesting, and actually led me to see Kenn on another panel later in the day to learn more about the uncanny valley and how his previous work on The Polar Express affected his current work.
How We See
Kevin Glasier, TACTICA, Kenneth McDonald, Quantic Dream, Ken Perlin, NYU Media Research Lab
Four people from industry gathered to discuss the importance of communication between disciplines; getting the idea out of the director's head and onto paper is the main focus. The pipeline has changed dramatically: it is now organic and circular rather than linear. Technology is changing too - a trip to the cinema was once a big event, but with the development of tablets and smartphones people now watch on much smaller screens, and more often. The essence of story, however, remains; it is always going to be present and always must be, otherwise there's nothing to see.
Cultural differences affect how people see, and what you get back will differ around the world. David Lynch's movies are supposedly slightly green because, when asked, he always perceives a slightly pink hue. This is just one example the panel gave of people's different takes on what they see.
What We See
Nuno Bernardo, beActive, Tore Blystad, Doug Cooper, DreamWorks, Nonny de la Pena, USC, Immersive Journalism Lab
Next up, the panel session continued with new members of industry. It began with the point that for every medium you design for, you need to understand how your audience is going to view it. Doug Cooper discussed the new experience from DreamWorks: Dragon Flight is a first-person journey through the world of film making, the story of how they tell the story. We were shown a sneak peek video of how it will work. He described it as presenting in a new format - visual, gentle, emotional - and that's why it will be successful.
Another topic covered was using environments to tell story - objects as an opportunity rather than just background fillers. Genres can be used to play to expectations and create diversity. This allows games especially, said Tore Blystad, to have different layers and ways for players to experience them at their own pace. Whether it's films, games or other kinds of media, people find different meanings in what they're viewing or interacting with. Hitman's pacing is slow, so they have to fill it with content and story on every level so that people don't get bored.
Wednesday 23rd April - Day Two
Dennis Lenart, Telltale Games
Definitely one of my favourite talks from the festival; being taken step by step through what the company learnt from each game they made in order to reach the success that they've had with The Walking Dead was absolutely fascinating and thoroughly educational. It took Telltale a decade to reach the point where they could produce something like The Walking Dead: a serialized episodic game, although that wasn't the intention originally. The original formula consisted of: comedy, one-off stories, a single experience, characters that remained the same, a neatly wrapped narrative and puzzles as the main form of gameplay.
After creating Sam & Max, the formula was edited slightly with most points remaining the same, but comedy became a bit of drama and, rather than one-off stories, the episodes became part of an arc. Next, they altered it again so that stories weren't neatly wrapped up but instead each episode led into the next one. After this, difficult puzzles were replaced with simpler ones alongside dramatically engaging stories and action-oriented gameplay. Soon after, characters were changed from having no development to evolving through the episodes. Finally they arrived at a formula consisting of a custom-tailored experience where the story is the main gameplay; this meant it had an emotional impact on the player, and ambiguity was the key to success because it mirrored real life situations.
"Toward the next Avatar" with Jon Landau
Jon Landau, Lightstorm Entertainment
Virtual production was described by Landau as actors' intimacy with cameras and their environment. The thing that makes Avatar stand out is that their process finds ways to use technology to tell a story, rather than finding a story for the technology. Motion capture, in his opinion, is a purer form of acting that concentrates on the essence of the performance. Reference cameras and blocking cameras are used in large quantities on set so that there are no changes in physicality in post-production. Between 2007 and 2014 there have been vast improvements to the real-time render test; the process is now much easier and produces a much more finished product.
Thanks to the much improved quality of test renders, edited sequences can be delivered to digital companies rather than individual shots, allowing Weta to work only on the material they need to. "Templates" is the technical term used, meaning parts of a sequence that are then made photographic - not photo-real, because instead it's whatever real is.
Landau said that the biggest impact on industry as a whole was the real-time CG element in a live action environment. This consisted of a hand-held camera that allows quick, organic camera movements without really awkward set-ups and static cameras, but has the technical capability to play back CG elements in a real-time live action setting. There will definitely be more CG and human interactive scenes in the next movie, taking advantage of the tools available and telling stories accordingly.
Early Look at Dawn of the Planet of the Apes
Obviously one of the big highlights of FMX was Andy Serkis, the face of motion capture performance. Unfortunately I didn't take too many notes on this particular presentation because, due to the exclusive nature of the things we were shown, the security measures were extremely high and my bag was taken off me before I went in (I definitely own too many gadgets). We were taken through five exclusive clips of the film, with Andy talking us through the characters' evolution and how they'd approached the newer film in comparison to the last. It was extremely interesting to see some of the shots that were still in progress - especially, as an animator, to see a shot in its final stages of animation without any rendering or effects.
Another thing that I took away from the experience was listening to Andy talk about the work he does at his Imaginarium workshop for digital characters. They're applying all aspects of performance capture technology to the creation of new-generation storytelling in film, television and games.
Simulating Monsters University
Samantha Raja, Pixar
After being absorbed in visual effects for a large proportion of the festival it was nice to sit back and stare at the pretty colours in Monsters University. Simulation is something I've always been curious about but never had the time or reason to experiment with. It was really cool to see breakdowns of animation, simulation and then lighting, and the difference - or more importantly, the professional outcome - that simulation has on the final product. Hairs are curves in Maya, or at least the main guide hairs used to control the rest of the hairs are; it continues to shock me that the industry is using the same software that I'm being trained in, as silly as that might sound.
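The guide-curve idea from the talk can be sketched very simply. This is my own illustration of the general technique (not Pixar's code): a handful of animated guide curves drive thousands of child hairs, each child being a weighted blend of its nearest guides.

```python
# Guide-hair interpolation sketch: a child hair is a weighted average of
# guide curves, point by point along the curve.

def interpolate_hair(guides, weights):
    """guides: list of curves, each a list of (x, y, z) points (all curves
    the same length); weights: one weight per guide, summing to 1.
    Returns the interpolated child curve."""
    num_points = len(guides[0])
    child = []
    for i in range(num_points):
        x = sum(w * g[i][0] for g, w in zip(guides, weights))
        y = sum(w * g[i][1] for g, w in zip(guides, weights))
        z = sum(w * g[i][2] for g, w in zip(guides, weights))
        child.append((x, y, z))
    return child

# A child hair exactly halfway between two straight guide curves.
g1 = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
g2 = [(2.0, 0.0, 0.0), (2.0, 1.0, 0.0)]
print(interpolate_hair([g1, g2], [0.5, 0.5]))  # → [(1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
```

Because only the guides are simulated and animated, the expensive work stays small while the fur still looks dense - which is presumably why the Maya curves shown in the breakdown were just the "main ones".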
The Application of Robotic Technology to the World of Film-Making
Tobias Kinnebrew, Bot & Dolly
When I turned up at this presentation I had no idea that they were actually the company who made the technicalities of Gravity possible, which was definitely a pleasant surprise. Tobias described the company as creators of art in the context of technology. When approached about Gravity, the concept was rather unusual: whereas usually many cameras surround an actor or actress whilst they move, they wanted to swap this around, moving the world around the actor. One of the main questions asked was why robots; the answer being simply the precision and, of course, repeatability that they bring. It was interesting to learn that the technology integrates with Maya's workflow, enabling animators to become robot programmers and therefore opening these tools up to millions of creatives.
The machine they designed and built was the only thirteen-axis robot in the world able to move cameras around actors. What was particularly cool was that it was programmed so that the director had complete control, allowing them to change things even after the pre-visualisation stage.
Thursday 24th April - Day Three
Autodesk: Concept Art and Storyboards
An extremely last-minute talk, as the majority of Thursday was spent either looking after the University of South Wales stall or networking in the recruitment room. However, I'm glad I stumbled upon this small presentation, which managed to fit a surprising number of people in the room. Ulrich Zeidler had worked as a concept artist on 300: Rise of an Empire and The Grand Budapest Hotel, both films that are very aesthetically pleasing. First we were talked through the sort of jobs concept artists do on big production films, which was actually surprising: whereas people on our course focus on a lot of character work, his role was to plan out the environments and how some of the visual effects shots would work.
As well as this, he introduced the majority of the room to a piece of software called Sketchbook Pro. I worked with this a lot last year, as I found it much more interactive and easier for the thumbnailing and storyboarding that is required of me as an animator. I transferred to using Photoshop because it seemed much more recognized by industry; however, now that he's talked us through how he applies the software to his working process, it's definitely something that I'll be looking to use more.
Blue Sky Studios
I was slightly disappointed that I didn't take more advantage of the recruiting presentations this year, having not realised until the Thursday that they weren't actually trying to hire people within them. However, I was lucky enough to catch the Blue Sky one, and I'm very glad I did as it proved extremely insightful. As it turns out, animation is their largest department, with around seventy to eighty animators working there at the moment. Both interns and juniors are given little responsibility, which was a relief to hear as a second year student. Both have mentors to guide them, but junior animators are given just cycles and extremely small shots, whereas interns use character rigs from the films for tests, with feedback from the different character leads.
Most importantly, it was nice to hear yet another company give feedback on what they look for in a show reel. Generalist reels aren't what they're looking for at all; we were told to focus on just one discipline, with at least four or five strong examples of it.
Friday 25th April - Day Four
The Art of Animation
The LEGO Movie
Damien Gray, Animal Logic
Of course, just because they could, there was a LEGO Movie montage backed by Everything is Awesome in the first few minutes, and so this presentation was immediately amazing. When making the LEGO Movie there were several things they felt they had to stick to: the photo-real/hand-made aesthetic; staying true to the LEGO medium; feeling epic and absurd but not like a toy commercial; and, most importantly, being a definite tribute to the fans. To make sure everything was incorporated, the team produced a test animation of Emmet interviewing for his role in the film. We were lucky enough to be shown this in the presentation and yet again the room filled with laughter and happiness. A production hub test was also carried out, in which they took one shot to the final stages of production in order to expose problems early in every department.
Another important aspect, alongside maintaining the initial set of guidelines, was the influences. The main source of inspiration came from the AFOL community (adult fans of LEGO), who build incredibly detailed sculptures and environments. We were shown a lot of amazing sculpture work and given web addresses to access more online. Macro photography and stop motion animation also had a big impact - both of which bring some aspect of story or narration to the LEGO world.
The Art of Animation
The Art Direction of Frozen
Michael Giaimo, Disney Animation
The art director described his role in the production and the general aesthetic ideas that he had for the film. Heavily influenced by Norway in terms of geometry, Michael was keen to push the horizontal and vertical boundaries of the landscapes and buildings as far as realistically possible. I didn't take too many notes in this presentation either, as the majority of it was imagery and, in one of the darkest possible rooms in the venue, it was impossible to see my paper! I'm definitely glad to have experienced a presentation on Frozen, however, due to the success of the film and its impact on the animation industry.
Deconstructing the Visual Effects of "Pacific Rim"
John Knoll, ILM
Once again, there weren't too many notes to make for this particular presentation due to the sheer quantity of visuals presented to us, the majority of which were breakdowns of the visual effects and animation. I had no idea of the incredible number of render passes for every individual aspect of each shot when it comes to making photo-real shots. I was absolutely overwhelmed by the amount of work that went into every shot on this film, and I actually ended up purchasing the art book shortly afterwards. Despite pipelines becoming much more circular, animation has to be at the very start of the pipeline for the FX to work. This was a cool insight into where I might fit in industry, and a way to get a better understanding of visual effects studios.
The Amazing Spider-Man 2 VFX
David A. Smith, SPI
One of the presentations I was most excited for became a brilliant end to FMX. David described the film as anamorphic - the way in which light reacts and the style of shots make for a beautifully visual film. With the second film came a looser suit, creating a much more organic look and also having the added bonus of allowing the actor to go to the bathroom much more easily. Because of this looser fit, however, it requires simulation - Maya nCloth is used to add the extra animation that makes it so convincing. There are also CG reflections added to the eyes; the actor just had mesh when filming so as not to distort his vision, so the digital team had to edit this to match the CG model and add more sparkle.
Animation was obviously described as tons of fun, but because of its dynamic nature they had to be careful about the laws of physics. David Schaub taught the animators a class on physics, and a stunt guy was brought in to record video reference to explore how weight works on webs. Cut length was much more varied in this film, giving the animators much more freedom; one of the examples shown turned an initial twelve-shot sequence into just one cut.
The electric eels sequence was done by MPC. However, other sequences involving the bolts that Electro produces were done in house, in what seems like a complex way but one that generated a good quality outcome. Animators had complete control over the size and speed of the bolts, which were made up of mesh lights, to determine their interaction with the characters.
The mesh lights mentioned previously were reused in other areas of the pipeline once they had been created for the bolts. Times Square had to be fully modelled in order to pull off some of the CG shots; every detail was referenced exactly, including lights, signs and plants. Over 36,000 still photos were taken for reference.