By dream-maker1, 17 November 2005, 00:25
Sky Captain and the World of Tomorrow was launched a decade ago as a question in a film student's mind: how can one make a film with little money, a small crew and out-of-this-world ideas?
Using his knowledge of traditional cel-animation techniques, Kerry Conran looked at the stories he had loved as a kid (film noir thrillers, classic Universal horror movies, German Expressionist films, comic books and pulp-fiction novels) and decided to blend them together into one homage-packed, visually exciting piece. The resultant film is a nod to the serial cliffhangers of the 1930s and '40s, with a little film noir and Buck Rogers science fiction thrown in for good measure.
Sky Captain centers on ace pilot Joe "Sky Captain" Sullivan (Jude Law), who leads a squad of aerial defenders of all things good. When New York City is attacked by scores of giant, lumbering robots, Sky Captain is reunited with former flame Polly Perkins (Gwyneth Paltrow), an intrepid reporter who suspects that the attack is linked to the recent disappearances of several renowned German scientists. Polly is in hot pursuit of the story, and Sky Captain is called in to save the day and unravel the mystery.
When Conran conceived the idea for Sky Captain in the early 1990s, he didn't want to create something ultramodern and slick. He explains, "The films I love have a somewhat crude image quality that I embrace wholeheartedly. I started examining them closely and saw a way to mimic their styles and looks within my limited means. When I broke down a frame into its components, I realized they were roughly equivalent to a background plate and a foreground element, and my idea was to marry those elements together using computers that were just becoming affordable on the consumer level at that time. I thought I could create those elements any way I wanted to fairly successfully with a simple approach rooted in cel animation; in this case, the 'cel animation' would be live actors photographed against bluescreen and then placed against simple 2-D photographic background plates." Even back in the early 1990s, the results looked surprisingly convincing, especially when the final image was composited in black-and-white.
At the time, Conran's idea seemed fairly radical. He explains, "I was proposing using a consumer-grade computer to compensate for the fact that I didn't have an optical printer or an animation stand to composite those various 'foreground' and 'background' elements. It seemed to be an emerging way to combine live-action footage with different backgrounds while utilizing the comfortable conventions of traditional 2-D animation. When I started experimenting with piecing this stuff together, the material that lent itself to a more realistic-looking finish was the slightly cruder, more stylistic imagery, something like [Fritz Lang's] Metropolis."
Determined to see his vision through, Conran enlisted director of photography Eric Adkins, whom he had met at the California Institute of the Arts while he was studying animation and film; Adkins was working as a teaching assistant for cinematography instructor Kris Malkiewicz. Together, the duo began to work out how they would film the ambitious project. "Kerry and his brother, Kevin, an illustrator [and production designer on Sky Captain], spent months working on conceptual sketches and designing this world that Kerry had in his head," recalls Adkins. "They created hundreds of backgrounds that were composed of archival photographs, matte paintings and animated CG environments. Slowly, layer upon layer, they built the look of the project."
"We then began to dissect the elements that would go into a given frame of the film," Adkins continues. "We built a bluescreen studio in Kerry's apartment by blacking out the windows with aluminum foil and using a PVC-tubing frame to hold a chroma-key blue material. We also started replacing Kerry's temporary CG 'poser model' references with live-action bluescreen footage that we shot on Sony's then-new 6mm tape format [now known as MiniDV] with a rented DCR-VX1000."
Four years later, the first six minutes of Conran's project, dubbed The World of Tomorrow, were complete. During that time, technology had continued to catch up with Conran's ideas, and Adkins had forged a career specializing in visual-effects cinematography on commercials, features (Mars Attacks!) and television shows (The PJs). Then, a chance meeting landed their demo reel in the lap of producer Jon Avnet. Avnet immediately saw the project's potential, but he thought the film should be developed outside of the studio system so that Conran could maintain autonomy while completing his singular vision.
Bringing Conran's project to the big screen was initially a modest proposal. Working with a budget of $5 million, the filmmakers planned to continue using the method Conran and Adkins had devised for their six-minute demo, only instead of capturing live action on MiniDV, they would use high-definition (HD) video; and instead of Conran's old PowerPC clunking away at renders, an in-house team of animators and compositors would use a network of up-to-date effects workstations.
This low-budget plan was short-lived, however. Once Avnet attracted Law and Paltrow to play the leads, more financing fell into place, and almost a year of extensive preproduction and testing commenced. "As the budget grew and actors of note started signing on, we quickly determined that we needed to do an animatic for every shot in the film," says Adkins. "Kerry wanted to have the shots all figured out before we began to shoot. He didn't want to roll on anything unless he had an approved animatic for it. It was a way of both planning and controlling what we were going to shoot. We didn't want to be in 'wing-it' mode on a vast, empty blue stage."
"The animatics also served as a reference both during and after filming," the cinematographer continues. "Whenever possible, we decided to lock off the shots to save time on compositing; if we set the camera up as close as possible to the specs of the animatic, they wouldn't need to tweak it that much. Of course, moving shots still had to be tracked, but again, we had the animatics as a reference."
An additional benefit of the animatics was that they could even be used in the final compositing work. "Most productions that do previsualizations shoot their film and then throw away what was in the computer," says Adkins. "On this picture, however, all of the previz animatics could essentially be up-rezzed, textured and lit to be used in the final film. In fact, quite often we did utilize all of that work we set up in the beginning."
During the lengthy process of storyboarding and creating animatics, Adkins participated in 10 months of planning and testing. Chief among the decisions that needed to be made was which image-capture medium would be best for the live-action bluescreen shoot. After evaluating the HD cameras available, Adkins decided to shoot the entire picture with Sony's 24p CineAlta HDW-F900/3, and the company erected a small bluescreen set within its production office in Van Nuys, California, to see how the camera performed with bluescreen and determine whether the effects team could pull clean mattes from the HD material. "We chose bluescreen over greenscreen because Jude and Gwyneth both have blonde hair," explains Adkins. "Gwyneth's hair was down to the middle of her back, and with any moving shot or fast-paced action, if we shot a slightly overexposed greenscreen we wouldn't be able to pull a key very well. Our in-house tests also confirmed that the bluescreen handled under- and overexposure better than the green. In fact, with today's software, you can underexpose a bluescreen and still get a good key without giving the actors a 'digital haircut.'"
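The article doesn't detail the keying math the effects team actually used, but the standard starting point for pulling a matte from a bluescreen plate is a color-difference key: a pixel counts as "screen" to the degree its blue channel exceeds its strongest other channel. A minimal sketch (function names are illustrative, not from the production):

```python
import numpy as np

def bluescreen_matte(rgb):
    """Pull a simple color-difference matte from a bluescreen frame.

    rgb: float array (H, W, 3), values in [0, 1].
    Returns foreground alpha in [0, 1] (1 = keep the actor).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # How "blue" each pixel is relative to its other channels.
    screen = np.clip(b - np.maximum(r, g), 0.0, 1.0)
    return 1.0 - screen  # blue areas become transparent

def composite(fg_rgb, bg_rgb, alpha):
    """Place the keyed foreground over a background plate."""
    return fg_rgb * alpha[..., None] + bg_rgb * (1.0 - alpha[..., None])
```

This also shows why an overexposed greenscreen hurts blonde hair: fine strands lift the green channel toward the hair color, collapsing the channel difference the key depends on, whereas a saturated blue screen keeps that separation.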
"We decided early on to have a camera set up full-time [during prep] so we could test any complexities or questions we had; we wanted to troubleshoot the entire film beforehand," Adkins continues. "We tested all of the camera's internal color matrices. If you're modifying the blue in camera to make the bluescreen a nice, pretty blue, it might look good on the monitor, but what does that do to the edges you're keying? It doesn't matter what the blue looks like behind the [actors]; it's how the edges react to the blue. We also tested turning the detail on, which gives you more focused, sharper edges, and found that it actually hurt the keying because there wasn't a natural falloff to the edges; it started to look more like a cutout."
"Based on those results, we committed ourselves to turning off the matrices and detail-enhancement options entirely," he continues. "Upon closer examination, we found that turning off matrices and detail revealed a little more noise in the blue channel; in the design of the CineAlta, the blue channel is buried last in the prism, so they ended up having to amp up the blue signal a bit. When we looked at a blowup of the blue channel, we saw this almost anamorphosized, noise-like grain. We therefore had to figure out a way to tone down the blue-channel noise, and we discovered the gain switch had an immediate effect. When we switched the camera to -3dB, the blue noise suddenly went way down and the grains took on a normal shape again."
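For readers unused to video gain figures: assuming gain follows the usual voltage-style convention (20·log10), so that +6 dB doubles the signal and equals one photographic stop, the -3 dB setting Adkins describes works out to roughly a 0.71x signal level, about half a stop down:

```python
def gain_db_to_linear(db):
    """Convert a camera gain setting in dB to a linear signal multiplier.
    Video gain is a voltage-style ratio: 6 dB doubles the signal."""
    return 10 ** (db / 20)

def gain_db_to_stops(db):
    """Express a dB gain change in photographic stops (6 dB = 1 stop)."""
    return db / 6
```

So `gain_db_to_linear(-3)` gives about 0.708, and `gain_db_to_stops(-3)` gives -0.5: a modest exposure penalty traded for quieter blue-channel noise.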
Tapping his visual-effects experience, Adkins also proposed affixing a polarizer filter to all of the cameras for the entire shoot. "Given the risk of blowing out highlights in HD, and knowing that we were going to back- and sidelight everything to create a film-noir look, we needed to devise a way to avoid blowing out the highlights without resorting to using intense makeup," he says. "We couldn't use diffusion on the camera because of our need to pull keys, so during all of our tests I put a Pola on the camera. I'd used Polas a lot on bluescreen shoots to help control spill on a floor in wide shots, or to eliminate a light skip-up or a sheen on a face, and they enabled us to kind of 'dial a flare' in our film-out. However, in committing to using Polas and setting the camera at -3dB gain, we were also committing ourselves to losing almost three stops of light right off the bat!"
Once the real-time animatics were complete, the filmmakers created a first cut of the entire picture, and this moving storyboard, complete with dialogue and some sound effects, became their bible for the duration of the production. The filmmakers secured two stages at Elstree Studios just outside of London, England, for the main bluescreen shoot. Their main stage, the George Lucas Stage, measured 135' long, 116' wide and 50' high, and housed three separate bluescreens. "I wanted a larger-than-usual radius base for all of the bluescreens, because if the ramp is too tight, light gets amplified like it does in a cylinder and puts a horizontal highlight [behind the actors' legs] on your bluescreen," explains Adkins.
"When you're lighting a bluescreen, that light level determines your exposure for the entire shoot, so we had to commit ourselves to that as well," he adds. "And when you're shooting close-ups against bluescreen, you have to make sure there's enough depth of field, because if you don't give enough stop, the hair on the back of the actors' heads can turn into a cloud of color and doesn't look good when keyed."
Despite the filmmakers' careful planning, one unforeseen problem arose when they arrived at Elstree. Adkins recalls, "On our small stage in Van Nuys, we used several 6K space lights to light the bluescreen, and we based our upsizing calculations on those units. But when we got to England, where the standard is 220 volts, the 1,000 watts became about 800 watts, so suddenly we had a 4.8K instead of a 6K. And less light was the opposite of what we needed! So we had to cram more space lights up in the grid to compensate for that loss."
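The derating Adkins describes follows from basic lamp physics. Treating a filament as a fixed resistance, output power scales with the square of the voltage ratio; the globe ratings below are hypothetical, since the article doesn't give them, and a real tungsten filament's resistance drops as it cools, so this is only a first-order sketch:

```python
def derated_power(p_rated, v_rated, v_supply):
    """Approximate lamp power when run below its rated voltage.

    Fixed-resistance model: P = V^2 / R, so power scales as (V/V_rated)^2.
    """
    return p_rated * (v_supply / v_rated) ** 2

# Hypothetical: a 1,000 W globe rated at 240 V fed from a 220 V supply.
print(derated_power(1000, 240, 220))  # ~840 W per globe
```

Six such globes in a nominal 6K space light would deliver roughly 5K, in the same ballpark as the "4.8K instead of a 6K" Adkins cites, which is why the grid needed extra units to hold the bluescreen exposure.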
To plot out the shoot, Adkins and visual-effects supervisor Darin Hollings devised a grid map for each bluescreen set, both on the practical stage and on the virtual stage in the computer. In this way, they could place any shot from the animatic on the virtual Elstree set, slide and rotate it on the grid to fit the lighting/practical needs of the scene, and then generate a shot-specific printout (complete with all lens and camera/actor positional data) for approval.
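The article doesn't describe how the grid map was implemented; a minimal sketch of the idea, with entirely hypothetical field names, is a per-shot record that can be slid and rotated on the stage grid while preserving the camera-to-actor geometry (and hence the framing from the animatic):

```python
from dataclasses import dataclass
import math

def _rot(p, deg):
    """Rotate a 2-D stage coordinate about the grid origin."""
    a = math.radians(deg)
    x, y = p
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

@dataclass
class StagePlacement:
    """One animatic shot placed on the (hypothetical) stage grid."""
    shot_id: str
    lens_mm: float
    camera_xy: tuple  # metres on the grid
    actor_xy: tuple

    def rotated(self, deg):
        """Rotate the whole setup; camera and actor move together,
        so the shot's internal geometry is unchanged."""
        return StagePlacement(self.shot_id, self.lens_mm,
                              _rot(self.camera_xy, deg),
                              _rot(self.actor_xy, deg))
```

A printout for approval would then just serialize one of these records: shot ID, lens, and positional data, matching the workflow the article describes.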
With each shot plotted and staged, the production adopted a three-camera, tag-team filming strategy. The idea was to have two cameras tackle two of the planned animatic shots for a given scene at once (shooting a medium shot and a corresponding close-up at the same time, for example) while the third camera set up an entirely different shot on one of the adjacent bluescreens. This way, the production could leapfrog sets and keep rolling to maximize the six-week shooting schedule with Law and Paltrow. Integral to this plan was the use of some extraordinarily skilled stand-ins, Stephen Morphew and Colette Appleby, to facilitate the setting up and translation of the actors' blocking from each animatic shot.
However, as Conran and Adkins can attest, envisioning the physical details of a scene depicted in an animatic while standing in a vast, blue space that holds only a desk, a lamp, and a chair was a daunting matter. "People always ask actors what it's like to have no walls to act around," says Adkins, "but imagine what that's like for the director! Darin and I were the only ones on set who really knew the physical structure of the [virtual] set, because we had placed the setup on our grid layout the night before."
"When we started working in London," acknowledges Conran, "I was afraid enough to not want to veer away from our animatic game plan. So we took a very rigid approach, taking measurements to make sure we were exactly where we wanted to be in relation to the animatics. However, we came back to the States and ended up tracking a lot more of our shots than we'd anticipated, so locking things down to the degree that we did was probably unnecessary. Moreover, that rigidity may also have limited the camera operators, who might otherwise have been able to bring their intuition to bear on framing or movement; it would have freed up everyone to be a little more experimental or inventive. Had I known that it would still guarantee the look we were going for, that would have been a nice thing to incorporate."
"However, I was squeamish about deviating from the animatics," Conran continues, "because if we moved the camera this way or that way, we were still only framing against a blank, bluescreen wall, and we had lined the shots up so that we knew what we'd be seeing in the [virtual backgrounds]. If we put the camera exactly where we'd planned to, we could predict what would be behind the characters, but if we weren't in that position, the backgrounds would be slightly off. And if we did something different on set and later discovered we'd completely screwed ourselves, there might be no way to recover. So my reluctance to loosen up a bit was driven by uncertainty. From experience, I knew we could always rely upon the animatics."
In creating the look of Sky Captain, the filmmakers referenced Mark A. Vieira's black-and-white photography book Sin in Soft Focus, the films of F.W. Murnau and the noir classic The Third Man, as well as a number of 3-strip and 2-strip Technicolor pictures. "Sky Captain borrows its visual sense from the Thirties and Forties filmmaking style; we strove to maintain that pieced-together feeling," says Conran. "In that regard, it's not too different from what I originally set out to do on my home computer, but I got to work with a lot of extremely talented people who helped make it better. We fully embraced the idea of shooting everything against bluescreen, and the limitations created therein, so the visuals would have consistency."
"We originally planned to release the film in black-and-white," adds the director. "There's a quality to black-and-white that would be interesting and strange for a movie like this, and the idea that someone would go to all of this effort for a black-and-white movie perversely appeals to me. But obviously, color was a concern for distribution, so we tried to embrace color and use it in such a way that it would add to the project."
"Eric and I looked at the old 3-strip and 2-strip Technicolor processes, and there are elements of both of those techniques in Sky Captain," Conran continues. "But when we looked at one of the crowning achievements of 3-strip Technicolor, Black Narcissus [shot by Jack Cardiff, BSC], we both loved the way the skin tones behaved and how the colors responded. To create that look for our picture, we spent many months developing the right technique for mixing color into the image. Stephen Lawes, the compositing supervisor, really spearheaded two departments: black-and-white compositing, which did the initial compositing, and the color department, which laid color over the composited black-and-white images. We always used the black-and-white composite as the master image and then added color."
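The article doesn't spell out the math Lawes' color department used, but one common way to keep a monochrome composite as the tonal master is to take only hue and saturation from a rough color layer and replace its luminance with the black-and-white master's. A minimal sketch, with Rec. 601 luma weights assumed:

```python
import numpy as np

def colorize_over_bw(bw_master, color_layer):
    """Lay color over a black-and-white master composite.

    bw_master:   (H, W) float luminance in [0, 1], the 'master image'.
    color_layer: (H, W, 3) float RGB carrying only rough hue/saturation.

    The color layer's own luminance is divided out and replaced by the
    master's, so tonality always comes from the monochrome composite.
    """
    luma = (0.299 * color_layer[..., 0] +
            0.587 * color_layer[..., 1] +
            0.114 * color_layer[..., 2])  # Rec. 601 weights
    ratio = bw_master / np.maximum(luma, 1e-6)
    return np.clip(color_layer * ratio[..., None], 0.0, 1.0)
```

The appeal of this ordering is that the carefully graded black-and-white image survives intact: changing the color layer shifts hue without disturbing the noir contrast.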
Lighting a feature film shot entirely against bluescreen presented both logistical and creative challenges. "When you're lighting a scene in an empty, blue environment, you have to imagine what the actors are going to encounter in that scene, be it a tree or a building shadow," says Adkins. "We shot one sequence set in a jungle, and we only had practical trees and bushes where we knew the actors were going to touch them. We didn't know what the background was going to be, because the storyboards for the latter half of the film were somewhat rough, but it was abstract enough that we decided to keep all of our key lights on one side of the actors and just create some dappled light from above, like it was filtering in through the trees."
Adkins maintains that lighting actors for bluescreen work is far trickier than most filmmakers believe. "Everyone talks about lighting things flat and says they'll adjust it [in post] to make it look good, but that is completely factitious!" he emphasizes. "That might be a producer's dream, but in the long run that approach is more than a little costly. You can't re-create the feel of back- and sidelight in post off a flat-lit image. Sure, you can add a slash of shadow across an actor's face, but if you think you can make it look as though actors are interacting with lighting created entirely in post, you're in for a lot of work. And on this film, the effects department was already tasked with a lot of work."
In lighting every shot, Adkins' first consideration was the bluescreen. "Of course, you have to light the bluescreen cyc and floor, but we didn't want all of the light on the actors to be toplight," he says. "However, that's not to say you couldn't use some of that toplight if you wanted to. For example, there's a scene in a scientist's laboratory where our set consisted of a complete floor, lab equipment, stairs and a door. It's a pretty complex shot and fairly wide, because it starts with Sky Captain and Polly walking in the door and then pans to reveal the entire lab. We placed that set on our grid so that one of the space lights was directly above Jude and Gwyneth at the door; that way, we could dim it to add ambience if we wanted to." (See diagram on page 39.)
Although Sky Captain's lighting was inspired by film noir, Adkins used large sources to light the actors. He explains, "Video likes soft sources, but we wanted to avoid flat lighting, so we used large, soft sources and then cut that light down so it looked more directional. I was always trying to create a more stimulating look."
"In fact, the week before we were to start shooting, we decided to shoot a test of the scene where Sky Captain goes into his office and is reunited with Polly for the first time," Adkins continues. "We wanted to get the actors accustomed to our shooting style and the bluescreen environment. On set, there was a door frame with a frosted-glass window, a desk, a couch and a filing cabinet, all set against our wraparound bluescreen. The scene has very moody lighting, and it was a great opportunity to play out our noir look and see what the actors would do with it."
"In the preceding scene, we'd seen Sky Captain for the first time, and this scene begins with him coming into his office from a dark hallway. It seemed like a great time to play Jude's silhouette against the frosted glass as he enters, so we backlit him with a softened Nine-light. Inside the room, I had a couple of 5Ks cross-lighting the doorway from his desk, and we cut the light off of him down to his stomach. You just see the highlights down below, but as he walks up to his desk, the light from his desk lamp, the only known light source, fills in from below. He then walks around the desk and sits in the chair, moving fully into the light from the desk lamp, which on this angle was a low 2K bounced off of beadboard. There were areas of blue framing the door that we let go darker than we normally would so we'd just be able to key it. But still, it was a high-contrast silhouette of Jude, so if they couldn't pull a key, they could pull a difference matte or a luminance key. For a test, it was pretty bold lighting, but I wanted to establish early on that I didn't want to sacrifice lighting the actors [creatively] in favor of the bluescreen." [Ed. Note: This test ended up in the final film.]
To light a large-scale scene, Adkins needed to know exactly what would be in the virtual environment so that he could not only light the actors, but also create a tonal interaction with those non-existent elements on set. "In preproduction, we had to figure out what structures were going to be 'there' in the virtual environments, but on set we had to imagine the rest," he says. "You're in a big blue room, so you have to imagine the details of the environment that will surround the actors [in the final image] in order to give your lighting a sensibility."
"In one shot, Polly comes out of an elevator into the lobby of the building where she works," Adkins continues. "When we shot that, I had visual-effects stage coordinator Jim Tharp bring in a few big blue blocks to act as [lighting] barriers to provide a transition from the 'elevator' to the 'hallway,' even though Gwyneth was basically walking in a sea of blue. We had toplight from one of our space lights on her in the 'elevator,' and she then walked out into a black zone, becoming a full silhouette, and then finally stepped into heavy sidelight, which represented light in the 'lobby.' We had to envision that lighting transition when we laid out the shot, and using those blue blocks not only helped with flagging some of the 'hallway' light from spilling into the 'elevator' area, it also helped Gwyneth have a frame of reference for where she was in the virtual set."
With almost every shot an effects composite (except for a few small scenes and a handful of inserts that were shot practically), Adkins made himself available as the 2-D and 3-D background environments were integrated and lit in the computer domain. "Right after the shoot, I spent about six weeks helping the compositors and CG lighters understand what I was going for," he recalls. "Then, when they brought in CG lighting director Michael Sean Foley, I talked him through my work because he was so involved with how the images were being created and processed. Still, it felt a little odd to let go after being so involved in the development of the film, the extensive prep and then the physical shoot. All I could do at the post stage was try to transfer my knowledge [to the effects team]." Fortunately, Adkins was able to spend four weeks supervising the digital intermediate at EFilm, where, with colorist Steve Bowen, he made one last pass at the look of this unusual project.
"After all those years of planning and plotting, to see what Kerry's project became is just incredible," he remarks. "When we finally had the actors on set for the first time, we were standing there wondering how it was all going to come together, and Gwyneth suddenly stood up and turned into the light, and it picked up her new Veronica Lake hairdo. Our jaws dropped open, and we knew it had all been worth it."
Christopher Probst was a camera operator and second-unit cinematographer on this project.
1.85:1 (16x9 native capture)
Sony CineAlta HDW-F900/3
Digital Intermediate by EFilm
Kodak Vision Premier 2393 and Vision 2383
From American Cinematographer.