Tuesday, July 30, 2013

VFX of The Last of Us: Bills Town

Subtlety is a difficult thing to convey when working with visual effects. You want to create movement and visual interest, but it also needs to feel appropriate given the context of the environment. In The Last of Us this was particularly difficult because the weather wasn't always in some extreme state. For the most part things needed to feel calm and peaceful, juxtaposed with the horrific realities happening around you. I think it's this setting and tone that fool you into thinking you're safe, even when you know better. This brings me to Lincoln, or as we referred to it, Bill's Town.
Life Goes On
In Bill's Town there's only Bill, and of course the infected wandering around. No electricity, no human life. This was a constant theme throughout the game, and it required us to explore other ways to create movement and life. We did this through things that weren't as obvious as animals: small swarms of flies and gnats that you end up walking through, grasshoppers that jump back and forth, and loose pieces of paper trash that fly around in the wind, to name a few.

I personally had to really think about how to create that movement without it being intrusive or out of place. I borrowed a technique from one of our lead effects artists, Eben Cook, where I would take a still from the level with no effects in it yet and do some paint-overs to imagine what effects would look good in each area. This helped a ton to see what worked and what didn't. I wanted to describe this because I think it's important to explain that visual effects in a game are not something that just gets plopped down without any forethought. There are some effects you have on hand that you know will work well in many areas, but before we lay them down we put a good deal of thought into how it all fits together.

This game was a great exercise in restraint as well. As effects artists we could put any number of things into a scene to convey movement and so forth, but you also need to be able to look at something and say, "It only needs this one thing," and be alright with that. Just because we can doesn't mean we should.

Wind Effects
We have a wind system in the game that controls how things like trees and other geo behave to simulate the appearance of wind. In previous games we created effects that best matched what this looked like to us, but we were always just eyeballing it. We had to orient our emitters in the direction of what we perceived the wind to be, and while this was fine, it could be a little slow at times and the results didn't always remain consistent. In this game we decided to ask for the ability to fetch the wind direction and strength from our global wind system so that we could use those values to drive some of the behavior in our effects. I created a simple script that grabbed the wind direction and intensity and passed it to one of our global wind fields we exported from Maya. This allowed us to put things like leaves falling from trees, and any other particles we wanted flowing in the wind, wherever we wanted without having to worry about emitter orientation. So in Bill's Town I created a number of effects to take advantage of this, like grass and dusty dirt that gets kicked up randomly.
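The script really boiled down to copying two values from the engine into the wind field every frame. Here's a minimal sketch of the idea in Python; every name (get_global_wind, WindField) is invented for illustration and isn't from our actual tools:

```python
def get_global_wind():
    """Stand-in for the engine query: returns a normalized XZ direction
    and a 0..1 strength. Values here are hard-coded for the example."""
    return (0.7, 0.7), 0.5

class WindField:
    """A simple force field applied to particles, standing in for one of
    the global wind fields exported from Maya."""
    def __init__(self):
        self.direction = (1.0, 0.0)
        self.strength = 0.0

    def apply(self, velocity, dt):
        """Push a particle velocity along the wind direction."""
        dx, dz = self.direction
        return (velocity[0] + dx * self.strength * dt,
                velocity[1],
                velocity[2] + dz * self.strength * dt)

def update_wind_field(field):
    """The whole job of the script: copy the engine's wind into the field
    each frame, so emitters never need to be oriented by hand."""
    direction, intensity = get_global_wind()
    field.direction = direction
    field.strength = intensity
```

Because the field reads the same global values the trees do, a leaf effect dropped anywhere in the level automatically drifts the same way the environment is bending.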

I created the video below to demonstrate some of these concepts at play, ranging from subtle wind effects to in-your-face decapitation effects for the infected. There are spoilers, so be warned if you haven't played the game yet.

Tuesday, July 9, 2013

Deep Snow Prints

Spoiler Warning!!!!!!!!!!!!!!!!!!
Be aware that the video below shows some footage that would be a spoiler for someone who hasn't played the game yet. I apologize for that.

Long ago, in the before time, we didn't have a way to artistically control the footprints characters could leave on surfaces. The footprints in Uncharted 2 were smaller in scope, and artists were limited in what they could do with them. Since then we have gotten considerable support in the form of a system that lets us spawn effects based on collision data from objects in the game. We cast rays from joints on characters to probe surfaces, then use that information to test some conditions and spawn effects once those conditions are satisfied. We call this the Splasher System. Originally it was just for creating splash effects for characters in water, but it's a lot more intricate now. The splashers are something I'd like to go into more detail on at a later point, if I'm allowed to. It's a great system, but on the PS3 it became quite expensive and we had to scale back how many characters and objects used it.
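The probe-then-test loop can be sketched very simply. This is just an illustration of the idea, not our runtime; the function names and thresholds are all made up:

```python
def probe_surface(joint_pos, scene):
    """Stand-in raycast straight down from a joint.
    scene is a toy (ground_height, surface_type) pair."""
    ground_y, surface_type = scene
    dist = joint_pos[1] - ground_y
    return dist >= 0.0, surface_type, dist

def splasher_update(joint_pos, joint_speed, scene, spawn):
    """Per-frame check for one joint: probe the surface below it, and if
    the conditions pass, spawn an effect matched to the surface type."""
    hit, surface, dist = probe_surface(joint_pos, scene)
    # Example conditions: the joint is close enough to the surface and
    # moving fast enough to warrant an effect (thresholds are invented).
    if hit and dist < 0.1 and joint_speed > 1.0:
        spawn(surface, joint_pos)
```

In practice a character would run this kind of check for many joints at once, which is exactly why it got expensive on the PS3.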

Projected Particles
One of the greatest things we have is the ability to project materials onto objects, using our particle systems to generate the projections. This sounds like a decal system, and in its most basic description, it is. Our dedicated effects programmer, Marshall Robin, developed for us during Uncharted 3 the ability to spawn projected particle cubes that use a number of parameters to project materials we author onto surfaces. For example, we can use this system to write directly to our destination normal buffer, which stores the normal value of every pixel in screen space; our projected particles directly perturb that buffer. Our lead effects artist, Keith Guerrette, used this technique when creating the sand footprints in Uncharted 3. They can animate and do all sorts of great stuff. We can also draw these projected particles to the same buffers we draw all of our other particles to, so we can project color-based materials onto surfaces, which is very much like the decals you are used to seeing. There is a lot of overdraw cost in these projected cubes, so we don't use them all over the place, just where they will have the greatest visual impact and where surfaces tend to have a lot of deformation. In the video I put together you'll see the outline of some cubes; those are the projected particles.
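A common way a projector cube like this turns into texture coordinates (not necessarily exactly what our runtime does) is to transform each surface pixel into the cube's local space and use two of the local coordinates as UVs. A hypothetical sketch:

```python
def decal_uv(world_pos, to_cube_local):
    """Map a surface point into a unit projector cube. If the point falls
    inside the cube, its local x/y become the decal UVs; otherwise the
    pixel is outside the projection and receives no material."""
    lx, ly, lz = to_cube_local(world_pos)
    if all(-0.5 <= c <= 0.5 for c in (lx, ly, lz)):
        return (lx + 0.5, ly + 0.5)  # remap -0.5..0.5 to 0..1
    return None
```

The inside-the-cube test is also why overdraw hurts: every pixel the cube covers on screen pays for the check, even the ones that end up rejected.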

The Materials
We are lucky enough to have our own node-based shader authoring tool. If you've ever used UDK's material editor, it's a lot like that. Knowing we could only project a flat image onto the surface, I used a technique that in UDK is referred to as bump offsetting. The idea is that you use the camera direction to bias your original UV values, modulated by some constants, to create the illusion of parallax. The camera direction is a unit vector whose components range from -1 to 1. This is convenient because tangent-space values typically range from -1 to 1 as well. Of course you can push the values to whatever ranges you want, but you would soon find the problem with that. The camera direction is a vec3, and I really only need two of its components: x and y. So I separate all three and recombine x and y into a new two-component vector.
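The bump-offset node network amounts to a couple of multiplies and an add. A sketch of the math in Python, with the constants (scale, bias) named by me rather than taken from the tool:

```python
def bump_offset(uv, camera_dir, height, scale=0.05, bias=0.0):
    """Nudge the UVs along the camera's x/y direction, scaled by a height
    sample, so the flat projection appears to have depth.
    camera_dir is a unit vec3; only its x and y components are used."""
    cx, cy = camera_dir[0], camera_dir[1]
    offset = height * scale + bias
    return (uv[0] + cx * offset, uv[1] + cy * offset)
```

Because the offset follows the camera, the sampled point slides across the surface as you move around the print, which is what sells the illusion of a recess.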

Using this vector I multiplied it by a texture of a simple deep impression I made in Maya. The texture is the equivalent of a displacement map. By multiplying the camera direction components by the texture, I create a more complicated and interesting per-pixel parallax shape. Now, what to use these crazy UVs on? I used the displacement map texture to create the illusion of shadowing inside the pit of the print.

One of the unique things about our shader authoring tool is that we can define how our alpha is applied. This is similar to Photoshop layer styles: things like multiply, additive and so on. The difference is that our options don't have those kinds of presets, so we need to know how the alpha blending actually works. The simple version is that we have the source pixel (the pixel we are trying to draw) and the destination pixel (the pixel that is already there). We define some arithmetic on each of those blend terms, and by default the results of the two get added together. The majority of our materials multiply the source pixel by 1, so the color we are trying to draw remains as is, and then multiply the destination pixel by one minus the source alpha: the inverse of the alpha we are drawing.

I didn't want to create any color in the footprints beyond what was already being drawn there, so the only color in my material is the shadow for the deeper parts of the print, to make it look darker. For the alpha blending, I set my source term to be multiplied by the destination pixel's color, so now my source carries within it the color value of the pixel that is already there. Then my destination term is multiplied by 1 minus the source alpha; since my source already carries the destination pixel's color, I don't want twice that color value when the two terms are added together. That's a little complicated, I know, but totally worth it.
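Written out as arithmetic, the two blends compare like this. This is a toy sketch of my description above with made-up RGB values, not our actual shader code:

```python
def default_blend(src, src_a, dst):
    """The usual case: result = src * 1 + dst * (1 - src_alpha)."""
    return tuple(s + d * (1.0 - src_a) for s, d in zip(src, dst))

def print_blend(src, src_a, dst):
    """The footprint case: the source term is multiplied by the destination
    color (so it carries the on-screen color with it), while the destination
    term is still faded by (1 - src_alpha) so that color isn't added twice."""
    return tuple(s * d + d * (1.0 - src_a) for s, d in zip(src, dst))
```

With a dark gray shadow source over a bright snow destination, the footprint blend lands darker than the default blend would, without introducing any color that wasn't already on screen.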

In Conclusion
There are some other things I did to create the prints, but the bulk is being done in that one material with the fake parallax. The video quickly demonstrates how the material is modulated by a float to control how deep the parallax goes. These prints were important because we couldn't create any dynamic displacement of the geo, and the player needed something to clearly track the deer with. I'm proud of how these prints turned out. We tried to create prints like this for Ellie, and I spent a good deal of time on them, but it was difficult to make it look like she was leaving long trails and tracks in the snow, so I opted for prints that write to the destination normal buffer to make it look like long streaks in the snow. Anyway, enjoy the video.