AQUAMAN 2

So many lessons learned! So many twists! And turns!

The production of Aquaman 2 was a trial by fire for me: there were so many unique and newly released technologies in the mix, all of it coupled with the media attention whirling around the legal battle between Amber Heard and Johnny Depp, which culminated in a sizable chunk of her scenes being removed from the final cut.

All in all, the production of Aquaman and the Lost Kingdom was a transformative era of my life.

When work began on the film, I was pulling double duty across two departments: the ScanlineVFX Unreal Pipeline Team, led by Tim Catalano, and Eyeline Studios’ Virtual Art Department, where I served as the resident ‘Unreal guy’. At the time, Unreal was a relatively new addition to the pipeline, and a fair amount of discourse and debate had raged back and forth over Unreal Engine’s suitability for the work being done.

Why?

Well, for one thing, the studio was heavily invested in volume capture, and Aquaman 2 was selected to demonstrate what could be achieved with the technology. For those unfamiliar with volume capture, it’s a method for capturing actor performances that produces three-dimensional footage. Rather than recording an actor from a single camera, volume capture places the actor at the center of a circular stage, surrounded by hundreds of cameras mounted along the perimeter and spaced evenly from floor to ceiling. The footage from each camera is parsed via its timestamp, and the frames sharing a timestamp are reconstructed into a single 3D recording of the actor’s performance, complete with ever-changing geometry and textures.
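Conceptually, the reconstruction boils down to bucketing every camera’s frames by timestamp and handing each bucket to a solver that turns a few hundred viewpoints into one mesh. Here’s a rough sketch of that bookkeeping (the CameraFrame layout and the reconstruct_mesh solver are hypothetical stand-ins; the real reconstruction is a whole discipline unto itself):

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class CameraFrame:
    camera_id: int    # which of the hundreds of perimeter cameras shot this frame
    timestamp: float  # shared capture clock used to align all cameras
    image_path: str   # where the raw frame lives on disk (hypothetical layout)

def group_by_timestamp(frames):
    """Bucket every camera's frames by their shared timestamp."""
    buckets = defaultdict(list)
    for frame in frames:
        buckets[frame.timestamp].append(frame)
    return buckets

def reconstruct_performance(frames, reconstruct_mesh):
    """Produce one 3D frame per captured timestamp.

    `reconstruct_mesh` stands in for the actual multi-view solver; it takes
    every camera's view of a single instant and returns geometry + textures.
    """
    meshes = []
    for timestamp, views in sorted(group_by_timestamp(frames).items()):
        meshes.append(reconstruct_mesh(views))  # one mesh per timestamp
    return meshes  # ordered sequence, ready for Alembic export
```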

The performance itself is output as an Alembic file, and once it’s imported into Unreal it’s converted into a geometry cache.
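For the curious, that import step can be scripted through Unreal’s Python editor scripting rather than clicked through by hand. The snippet below is a minimal sketch from memory of the 4.27 API (the file and folder paths are placeholders): set the Alembic import type to Geometry Cache and kick off an automated import task.

```python
import unreal

# Placeholder paths: point these at a real .abc export and a Content folder.
ABC_FILE = "D:/captures/performance_take_03.abc"
DEST_PATH = "/Game/VolumeCapture/Performances"

# Tell the Alembic importer to bring the file in as a Geometry Cache
# rather than a static or skeletal mesh.
settings = unreal.AbcImportSettings()
settings.import_type = unreal.AlembicImportType.GEOMETRY_CACHE

task = unreal.AssetImportTask()
task.filename = ABC_FILE
task.destination_path = DEST_PATH
task.options = settings  # the Alembic factory picks these up for automated imports
task.automated = True    # skip the import dialog
task.save = True

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```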

For those who’ve never heard of a geometry cache before, it’s a specially packaged dataset of geometry: essentially a frame-by-frame recording of the mesh that plays back like a video. Later years would see the introduction of more sophisticated methods of working with volume capture, and the advent of Nanite also proved to be a game-changer, since poly count and mesh complexity quickly became a near-trivial consideration thanks to Nanite’s ultra-optimized algorithm for shuttling polygons in and out of memory. But with Aquaman 2 and Unreal Engine 4.27, the constraints of geometry caches resulted in fairly bloated Unreal scenes, which were challenging for even the best workstation GPUs. At the time, Quadro RTX 8000 GPUs reigned supreme, but even they struggled mightily with all those geometry caches being juggled around within the project. Averaging 2-4 frames per second in the editor was common during the final stages of production, especially since many of the provided assets had never been optimized for real-time engine constraints.
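As for how we kept tabs on those frame rates: the stock stat overlays, toggled a dozen times a day. The commands themselves are standard Unreal console fare; from the editor’s Python console, flipping them on looks something like this:

```python
import unreal

# Toggle the usual overlays: frame rate, frame-time breakdown, and GPU timings.
world = unreal.EditorLevelLibrary.get_editor_world()
for cmd in ("stat fps", "stat unit", "stat gpu"):
    unreal.SystemLibrary.execute_console_command(world, cmd)
```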

I spent a considerable number of quiet evenings in front of my monitors. And each evening, as the sun went down, the many shades of blue from my monitors lit up my face– until I, too, seemed to be deep undersea.

Fond memories fade quickly, and are replaced, more often than not, by vague yet warm recollections. I remember, however, the shared sense of accomplishment we felt when the work was handed to the COMP team for further revisions and polish, as well as color grading. What RELIEF! The only way to preserve such memories is to stop and dedicate the time to jot them down somewhere, in some form. And really, that’s my intent here and now: to leave a memento, while most of the memories remain, of what went into the creation of a DC movie about a man, his trident, and the kingdom of fish he ruled.

My thanks and admiration go out to the many talented artists I worked with throughout the project, and especially to Anne Kim, Eyeline Studios’ jack-of-all-trades, who weathered a thousand and one different bugs and issues at my side.