Commercial: China Southern Air


I worked on this with Dominic Taylor, who set up the comps and cameras and worked with the clients on the direction. I mainly did the flipboard effect.

This was quite a challenging effect to do in Maya. The main driver of the rotations was expressions; the expressions took their values from samples of textures generated in AE. The main difficulty was that it was slow, and it needed to be baked out before it could be sent to the farm, because setAttr was the only viable way of applying the motions.
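The bake step could be sketched roughly like this (all names are hypothetical; a plain-Python stand-in for the Maya expression/setAttr workflow, with the texture sampling reduced to a simple per-frame luminance lookup):

```python
# Sketch of the bake: per frame, take a driver value for each flipboard tile
# (in production this came from an AE-rendered texture) and convert it into
# a rotation key. The 180-degree flip mapping is illustrative.

def bake_flip_rotations(samples_per_frame, flip_angle=180.0):
    """samples_per_frame: {frame: {tile_id: luminance 0..1}} -> baked keys."""
    baked = {}
    for frame, samples in sorted(samples_per_frame.items()):
        for tile, value in samples.items():
            # Luminance 0 means un-flipped, 1 means fully flipped.
            baked.setdefault(tile, {})[frame] = value * flip_angle
    return baked

# Example: tile "A" flips over three frames while tile "B" stays put.
keys = bake_flip_rotations({
    1: {"A": 0.0, "B": 0.0},
    2: {"A": 0.5, "B": 0.0},
    3: {"A": 1.0, "B": 0.0},
})
print(keys["A"][2])  # 90.0
```

In Maya the per-tile values would then be written out as keyframes (the setAttr step), so the farm never has to evaluate the slow expressions.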

The flipboard effect was not simply flipping from image A to image B; it goes through a series of photographs before it resolves into the final image, and designing the mechanics of the scene took a few tries to get right.

In retrospect, LW’s nodal displacement in conjunction with Denis Pontonnier’s Part Move nodes is a superior method. Where it took me about a week to get all the shots set up in Maya, I think I could have done the same in LW in less than half the time.

 

Commercial: NSW Communities (Cogs)


This was a relatively quick job for the New South Wales government. They had, in fact, recently come back to us for a ‘phase 2’ ad along the same lines, only the deadline was a bit shorter.

The ad was in two parts: the cogs and the town hall scene. I’ll just talk about the cogs scene because that was the only part I was involved in.

I was asked to create the cogs that formed the NSW state lines. We bought a few gears off Turbosquid to get started, but other bits and pieces had to be modelled along the way. I first set up the layout of the gears and rigged sections of them to follow different controllers. Then individual gear component combinations were rigged and placed into the main scene.
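The core of a gear linkage like this boils down to one relationship, which is what the rig expressions amounted to (a minimal sketch; tooth counts and names are illustrative, not taken from the actual scene):

```python
# Hypothetical sketch of a gear-train expression: each driven gear's rotation
# is the driver's rotation scaled by the tooth ratio, reversed in direction
# because meshed gears counter-rotate.

def driven_rotation(driver_rotation, driver_teeth, driven_teeth):
    return -driver_rotation * (driver_teeth / driven_teeth)

# A 24-tooth controller gear turning 90 degrees drives a 12-tooth gear
# twice as far, in the opposite direction.
print(driven_rotation(90.0, 24, 12))  # -180.0
```

Chaining this per meshed pair lets one controller drive a whole section of gears with consistent speeds.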

The rendering was also done in LightWave, but the final look was meant to be hand-drawn. The process had three parts.

The base render was from LW: a clean, multi-toned render. LW lets me colour individual gears/items based on whether they are instanced or separated, and I used this to quickly shade variations of the colour theme. I like that I can get a lot of shading control across whatever shader channel (e.g. diffuse, specularity) I’m using.

After the base render, Richard post-processed it in AE using his own concoction of adjustment layers and textures.

For the third step, Richard’s AE renders were passed on to Alexandre Belbari and Thomas Buiron, who gave them a sketchier, more organic look.

When it came back to us, Richard added further effects such as the smoke and particles.

 

Commercial: Spark Light Box

This was a joint effort between Dan McKay and me, with him driving the project from After Effects. My part lay mostly in populating the screens with footage. But there was a particular problem: the clients did not/could not sign off on all the footage in advance. Knowing that, I thought of making the footage generation procedural, so that Dan could render his bits while I created the screen footage separately, letting us work in parallel.

The main technique was UV mapping (STMapping in Nuke) plus floating-point ID mattes. UV mapping alone wasn’t enough: even though I could easily map footage onto a screen, there were hundreds of screens that needed semi-random footage running on them. Using RGB mattes was out of the question, since I would still end up with too many mattes to manage. I decided I needed to mark these screens by ID, and so approached it as a numbers-based AOV render.

This was done by first UV mapping all the screens into one UV map, then creating two ramps (one for U, the other for V) with gradients and precise multiply calculations that pushed colour values well past 1.0. The idea is that the first screen would have a surface value of 0, the next 1.0, the next 2.0, and so on. When rendered from Maya as an .exr, every screen looks white, but when colour-picked in Nuke, the floating-point values are recognised.

In Nuke, I created a setup which took any number of footage variations and randomly assigned them to ID mattes, which were subsequently piped into STMaps. The result was a ‘rig’ in which I could swap any footage for another, replace any one screen with a particular piece of footage if I wanted to, and/or change the randomisation of the generic screens at any time.
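The assignment logic might look like this in plain Python (a hedged sketch of the Nuke-side randomisation, not the actual node graph; clip names and the override mechanism are illustrative):

```python
import random

# Sketch of the footage 'rig': every screen ID gets one of the footage
# variations, seeded so the assignment is repeatable between renders,
# with per-screen overrides for hero screens.

def assign_footage(screen_ids, footage_clips, overrides=None, seed=7):
    rng = random.Random(seed)  # fixed seed = stable result across renders
    overrides = overrides or {}
    assignment = {}
    for sid in screen_ids:
        # An override pins a specific clip to a specific screen ID.
        assignment[sid] = overrides.get(sid) or rng.choice(footage_clips)
    return assignment

plan = assign_footage(range(5), ["cityA.mov", "cityB.mov", "cityC.mov"],
                      overrides={2: "hero_shot.mov"})
print(plan[2])  # hero_shot.mov
```

Changing the seed reshuffles every generic screen at once, while overrides survive the reshuffle untouched.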

It was a technical challenge that I found satisfying, all the more so because the client did the predictable thing and started changing stuff around. But we were ready.

Watching the video, one would never guess the lengths to which artists go to account for things that seem out of scope for a commercial. Most people just think about colours, sound, motion, effects, and all the stuff that’s in front of them; but as CG artists, we have to think about the framework behind all that in order to accommodate known eventualities like client feedback.

Commercial: Macquarie’s Bank (Tools/Otter)


In most jobs, I’m glad to be working in the shadows. In some jobs, I’m really glad to be in the shadows. This is one of them.

Let me be clear: when I say I’m glad to be in the shadows, I don’t mean not being associated with the ad; I’m perfectly fine with the final product (not to say I particularly like it — I’m just fine with it). What I mean by shadows is this: it’s like being inside a tank while looking at a huge industrial fan and, above it, tonnes of horse shit ready to drop onto it. When I see trouble, I try to warn people off. If they don’t listen, I take cover myself.

And remember that this work thread is not just about what you see, but about the invisible things behind the work you see. And in this case, what was behind it was quite incredible from a production point of view. And I don’t mean incredible in a good way. But let me say just one thing among many:

We didn’t have the render hardware and software to render Yeti fur, so we had to sink a lot of money into subcontracting our rendering: first into the messianic illusion that is cloud rendering, and second, when that failed to save us, into an old-school outsourced rendering service. The good news is that after two years of waiting (the job was two years ago) we have finally, incrementally, upgraded our software and hardware. There, I wasn’t being that negative, was I? Less is definitely more: when you have shit hardware, you have to rely on your smarts to get things done. So I’m not choked up about hardware, because it helps me shine!

Speaking of shine, there were those who brought the goods on this project. Louis Desrochers groomed the Yeti fur look, and we devised a way to generate wet maps from Maya, since our RealFlow op couldn’t get wet maps from Hybrido sims at the time. We used a combination of ambient occlusion maps with a Time Echo effect applied in After Effects; that image sequence drove the Yeti maps. I was the one who wrote the script to bake the animated ambient occlusion maps to be plugged into AE.
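A rough stand-in for what the Time Echo pass contributes is a per-pixel decaying maximum over time: a contact that lights up once leaves a fading trail, which is what makes the fur read as staying wet. This is a hypothetical single-pixel simplification, and the decay rate is illustrative:

```python
# Approximation of a Time Echo-style trail on the baked AO frames: each
# pixel keeps a decaying memory of its brightest recent value, so one
# frame of contact produces a lingering "wet" falloff.

def wet_trail(frames, decay=0.8):
    trail, out = 0.0, []
    for value in frames:
        # A new contact refreshes the trail; otherwise it fades each frame.
        trail = max(value, trail * decay)
        out.append(round(trail, 3))
    return out

# One frame of full contact at frame 2, fading over the frames after it.
print(wet_trail([0.0, 1.0, 0.0, 0.0]))  # [0.0, 1.0, 0.8, 0.64]
```

Fed per-pixel across the baked AO sequence, a trail like this is what the Yeti maps would then read as wetness.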

There were so many other people involved in this project who contributed their part to it: I didn’t even get to rig the otter! That’s a first! Of course, as usual, I was there to clean up the scenes and troubleshoot the most stubborn of the lot. But when the dust finally settled, all I wanted to do was forget about it.

 

 

Commercial: Orcon’s Daisty and Gav (Go Unlimited)

I consider myself a competent but frustrated character animator. I say frustrated because, on account of people’s bias towards seeing me as a technical person, I’ve too often been asked to rig characters instead. In this case, the characters of this commercial were completely 2D, illustrated by Gareth Jensen, who also did a lot of the animation along with a few other animators.

The main difference was that the job was going to be done completely in After Effects, including the rigging. I’m not an AE rigger, nor had I animated a character in AE before. While I’ve since seen what mind-boggling AE rigging tools can do, at the time of this commercial I hadn’t seen any of them. So I basically had to do everything from scratch: learn Puppet Tools, create the workflow for swapping assets, write expressions to switch drawings, etc.

Frankly, I don’t know if I want to do that again, given the powerful rigging tools available today. I think I’d prefer to animate.