Digital Technology Revolution in Film and Television: How Virtual Production and Generative AI Are Paving the Way for Engaging Content
Throughout the development of human culture, film and television have served not only as forms of leisure but also as records of shifting societies, artistic trends, and cultural and emotional values. They have become crucial mediums for cultural exchange worldwide.
From Hollywood’s influence on global pop culture to the cultural exports of Japanese anime and Korean dramas and variety shows, these forms of media have profoundly impacted audiences around the world. The production of film and television has also driven technological advancements—from the black-and-white films of the 1940s to today’s digital and virtual production techniques—reflecting humanity's dedication to both software and hardware innovation.
This article will explore several key advancements in modern film production technologies, examining how these innovations affect the production process and, in turn, reshape the ways creators tell their stories.
The Evolution of Film Production: From "The Lord of the Rings" to "Avatar: The Way of Water"
Looking back at the 21st century, technological breakthroughs have led to numerous landmark achievements in visual effects films. Beginning with "The Lord of the Rings" trilogy in the early 2000s, director Peter Jackson and Wētā FX employed live-action motion capture to bring the CG character Gollum to life with remarkable realism. In 2008, "The Curious Case of Benjamin Button" introduced innovative digital techniques that convincingly aged its lead actor through CG.
By 2010, Christopher Nolan’s "Inception" utilized previsualization (Previs) technology, enabling the effects team to design shots before filming. Fast forward to 2022, "Avatar: The Way of Water" marked another significant advancement, featuring innovations such as computer-generated water, underwater performance capture, SimulCam (a synchronized live camera system), and real-time depth compositing, all of which redefined production workflows.
These films each feature distinct storylines and groundbreaking achievements in visual effects. Most combine live-action footage with computer-generated (CG) elements to create visuals that are at once fantastical and realistic, using motion capture to animate fully CG characters and facial scanning to build digital actors.
A prime example is "Avatar" (2009). Despite its heavy use of digital elements, the film's foundation is still rooted in motion capture: the movements and expressions of the Na'vi were captured as data from live actors on set and later retargeted onto the 3D characters.
Director James Cameron also employed a virtual camera system, allowing him to directly view 3D characters running through Pandora's forest on his monitor. This virtual production workflow, which integrates motion capture, real-time rendering, and physical cameras, enabled the director to frame shots from a tangible, real-time perspective, resulting in visuals that are both imaginative and breathtaking.
Breakthroughs in digital technology have addressed the limitations of traditional makeup, props, and physical effects. Though practical effects continue to play a significant role in current productions, we cannot deny that digital advancements allow creators and performers to operate in safer and more efficient environments while delivering more realistic visual effects. This immersion enhances the audience's experience, enabling them to engage with the deeper themes conveyed by the narrative.
Whether depicting an epic adventure against malevolent forces or showcasing a grand setting that reflects humanity's impact on the environment, these films highlight technological achievements and resonate profoundly across diverse global cultures.
Innovations in Motion Capture Technology Lead to More Realistic CG Characters
Motion capture technology has become a standard practice in today’s film and television productions. Iconic characters like Gollum in "The Lord of the Rings" trilogy, the Na’vi in "Avatar," Caesar in "Planet of the Apes," and Smaug in "The Hobbit: The Desolation of Smaug," owe their lifelike presence to real actors wearing motion capture suits and performing on set. The captured movements are transformed into digital data, which animators subsequently refine to create final animations that align seamlessly with the characters' unique requirements. This process brings these fantastical beings to life.
When it comes to motion capture technology, Wētā FX is recognized as a leader among visual effects studios, specializing in the creation of high-quality CG creatures and characters while pioneering innovative processes for both body and facial capture. Take "The Lord of the Rings" (2001) as an example: Wētā FX collaborated with actor Andy Serkis to bring the character Gollum to life. Serkis wore a motion capture suit with optical markers and was filmed in a dedicated motion capture area on set. The final depiction of Gollum on the big screen was a seamless integration of Serkis's motion data and the animators' keyframe adjustments.
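The blend of captured performance data and animator keyframe adjustments described above can be sketched conceptually: the capture stage produces dense per-frame samples, and animators layer sparse corrections on top to push the performance toward the character's needs. The following is a deliberately minimal illustration in Python, with hypothetical data and a hypothetical `blend_layers` function; it is not any studio's actual pipeline.

```python
# Minimal sketch of a mocap cleanup step: captured joint angles arrive
# as dense per-frame samples, and a sparse animator keyframe layer is
# applied additively on top of them.

def blend_layers(mocap, overrides, weight=1.0):
    """Return mocap samples with an additive animator layer applied.

    mocap:     list of per-frame joint angles (degrees) from capture
    overrides: {frame_index: offset} sparse animator keyframes
    weight:    global strength of the animator layer
    """
    out = list(mocap)
    for frame, offset in overrides.items():
        out[frame] += weight * offset
    return out

raw = [10.0, 12.5, 15.0, 14.0]          # captured elbow angles
fixed = blend_layers(raw, {2: -3.0})    # animator pulls frame 2 back
```

In a real pipeline the animator layer would be a full curve blended over time rather than per-frame offsets, but the division of labor is the same: capture supplies the base performance, and hand-keyed adjustments refine it.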
Another significant technological breakthrough occurred during the production of "Avatar"(2009), when Wētā FX successfully achieved simultaneous motion capture of multiple actors within the same capture area. They also introduced head-mounted cameras to record both body movements and facial expressions, providing animators with more authentic performance references. This advancement became Wētā FX's proprietary "FACETS" system, which utilizes facial data-driven solvers in conjunction with animator adjustments to control facial expressions. Today, Wētā FX continues to innovate, enabling real-time viewing of motion capture data and wireless transmission—milestones in their ongoing technological development.
In Taiwan, motion capture technology is primarily utilized in game development, particularly for creating character movements. However, it is also employed in film production, especially when dealing with unique subjects that cannot rely on existing motion libraries. This technology allows for the extensive recording of actors' movements.
A notable example is the popular 2023 Taiwanese series "Wave Makers," which features numerous election campaign scenes with crowds waving flags and cheering. The visual effects studio Film Tailor Studio utilized motion capture technology to record the movements of real actors, then replicated those movements throughout the crowd scenes to enhance the authenticity of the visual experience.
With the advancement of AI technology, tools such as Wonder Studio have emerged, offering an innovative approach compared to traditional motion capture processes. In this setup, actors do not need to wear any motion capture equipment; instead, they simply prepare a 3D character and record a video of their live performance.
This allows for the replacement of live actors with 3D characters, incorporating lighting and shadow effects that can seamlessly match the video environment. Although this technology still requires post-production adjustments, it can reduce the workload for special effects artists by nearly 80% compared to the conventional method of using motion capture data to animate 3D characters.
The Rise of Virtual Production: Transforming the Production Workflow in Film and Television
In recent years, a new production tool and process has gradually transformed the creative approach within the entertainment content industry, impacting everything from live-action films and series to broadcasting and live events. This approach, known as "Virtual Production," integrates traditional and virtual filming techniques, utilizing technologies such as green screens, motion capture, visualization, game engines, and LED volumes. This innovation has not only affected the roles of producers, directors, VFX leads, lighting artists, and actors but has also revolutionized the entire production workflow.
Compared to traditional filmmaking workflow, a key difference in virtual production is that the creation of animation and visual effects occurs during the pre-production phase. In conventional green screen filming, backgrounds and visual effects are typically added after shooting, which often results in challenges such as inconsistent on-set lighting and the crew's inability to see composite effects in real-time, potentially affecting actors' performances.
In contrast, virtual production utilizes LED walls instead of green screens, allowing animation and effects teams to create virtual environments and lighting effects directly within game engines in advance. This approach enables directors and the crew to view composite effects in real-time on monitoring screens on set.
Taking the 2019 series "The Mandalorian" as an example, the visual effects production team at Industrial Light & Magic (ILM) noted that over 50% of the first season was filmed using a virtual production process. The technique replaces location shooting with an LED volume, allowing actors to perform within immersive 3D environments displayed on large LED walls rather than having to imagine their surroundings in front of a green screen.
The success of the virtual production workflow relies heavily on the capabilities of real-time rendering engines such as Unreal Engine and Unity, which can generate high-quality 3D scenes instantaneously. This also allows "real-time" adjustments to lighting, camera angles, and scenes during filming, without the need to wait for post-production. Consequently, production efficiency is significantly enhanced, the necessity for reshoots is diminished, and overall production costs are reduced.
Moreover, virtual cameras and motion capture play crucial roles in this process. Virtual cameras can detect the movement trajectory of physical cameras, enabling the 3D scenes displayed on the LED walls to follow the camera's movement, thereby presenting accurate parallax and perspective effects. Additionally, motion capture combined with head-mounted cameras facilitates real-time interaction between real and virtual elements, allowing directors to view the final results directly.
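The parallax effect described above comes down to projective geometry: as the tracked physical camera moves, every virtual point must be re-projected onto the wall plane from the new camera position, so that nearby and distant objects shift by different amounts on screen. The sketch below is a deliberately simplified illustration (a pinhole projection onto a flat wall at z = 0, with hypothetical coordinates), not the frustum math of Unreal Engine or any particular tracking system.

```python
# Why an LED wall must re-render as the camera moves: a virtual point
# behind the wall is projected onto the wall plane along the ray from
# the tracked camera position. Shifting the camera shifts the projected
# image -- that shift is the parallax the tracking system preserves.

def project_to_wall(camera, point, wall_z=0.0):
    """Project a 3D point onto the plane z = wall_z, as seen from camera."""
    cx, cy, cz = camera
    px, py, pz = point
    t = (wall_z - cz) / (pz - cz)   # parameter where camera->point ray hits wall
    return (cx + t * (px - cx), cy + t * (py - cy))

# A virtual tree 10 m behind the wall; camera 4 m in front of the wall.
tree = (0.0, 2.0, 10.0)
shot_a = project_to_wall((0.0, 1.5, -4.0), tree)   # camera centered
shot_b = project_to_wall((1.0, 1.5, -4.0), tree)   # camera dollies 1 m right
```

Because the tree's image on the wall moves less than the camera does, it appears to lag behind the foreground, exactly as a real distant object would; render the wall from a fixed viewpoint instead, and that depth cue collapses.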
In Taiwan, Moonshine Studio introduced virtual production technology in 2019, with its initial application in the TV series "Gold Leaf," presented by PTS in 2021. The historical street scenes visible through the windows of vehicles in the show were created using LED walls. Recently, Reno Studios has also utilized virtual production techniques to film vehicle scenes in the films "Old Fox" (2023) and "Weekend in Taipei" (2024).
Currently, companies offering LED volume services in Taiwan include Moonshine Studio's MOONSHINE XR STUDIO, the LED stage operated by Reno Studios and Illusion Studios at Central Pictures Corporation, and the CYANS LED Virtual Studio managed by Formosa Television.
Generative AI Tools Introduce New Breakthroughs in Film and Television Production
In this rapidly changing era, generative AI has become an essential tool for content creators. Whether in text, voice, visuals, or sound, it is unlocking new possibilities at an astonishing pace.
Since the launch of Midjourney and ChatGPT in 2022, generative AI has quickly integrated into people's daily lives thanks to its low barrier to entry and its ability to generate content rapidly at little or no cost. In just two years, a multitude of generative AI products and services have emerged, from text-to-image and voice generation to text-to-video creation, springing up like mushrooms after rain.
Currently, creators are extensively utilizing generative AI. For instance, they employ tools such as ChatGPT and Midjourney for tasks ranging from data organization, email communication, and proposal strategy development to requirement formulation, script generation, and concept design. Artificial Intelligence Generated Content (AIGC) is evident in nearly every aspect of this process. In an interview in the 56th issue of InCG Magazine, Chia Chi Lin, founder of Moonshine Studio, stated that they use Stable Diffusion together with tools like ControlNet and LoRA during the initial design phase to generate atmospheric or character concept art, effectively visualizing the creative concepts in their minds.
In the early stages of film and television production, creative teams must attract investors through effective proposal planning. The rapid generation of still images, initial posters, and mood boards using AI tools not only facilitates financing but also aids clients in making decisions by quickly producing multiple image variations. This approach reduces the time spent on revisions and lowers communication costs. Peter Huang, co-founder of Reno Studios, noted that they have integrated AI tools into the film production process, enhancing work efficiency and optimizing workflows. For instance, AI can be employed in concept design and storyboarding during pre-production, and in facial replacement during post-production to recreate deceased actors or to modify an actor's appearance and age.
Speaking of facial replacement technology, the Hollywood visual effects company Rising Sun Pictures (RSP) launched its "REVIZE™" toolkit at the beginning of 2024. Combining machine learning with established visual effects processes, REVIZE™ addresses common needs in film production, including facial and body replacement, facial expression modification, and wrinkle removal. RSP recently applied the technology in "Furiosa: A Mad Max Saga," seamlessly transferring the facial features of lead actress Anya Taylor-Joy onto a younger actor's face to portray the character convincingly at an earlier age.
The Future Path of Technology and Empathy
In the entertainment industry, the impact of virtual production and generative AI undoubtedly compels creators and operators to reconsider their future careers and working methods. However, we can adopt a positive perspective toward this transformation and uncover limitless opportunities within it. Much like the transition from film to digital photography, the role of the photographer has endured; only a photographer's vision can effectively employ technology to capture beautiful moments.
The advancement of technology has significantly improved the convenience of digital content production. High-quality and realistic visuals, real-time rendering, and rapid generation capabilities save creators substantial manpower and time. Meanwhile, directors and producers can allocate more resources to conceptualizing "empathetic" projects.
The phenomenally successful works from around the world in recent years make it evident that the key to success lies not only in cutting-edge production technologies but even more in narratives that transcend cultures, well-adapted intellectual properties (IPs), and robust marketing strategies. These factors are crucial for creators to consider during early development. While technology improves production efficiency and the quality of results, what truly resonates with audiences is the content itself. Only when technology and creativity merge seamlessly can works profoundly connect with audiences.