This project is driven by the pursuit of a technological breakthrough, centered on creating a scene that deeply integrates Unreal Engine 5 with Houdini. Its creative inspiration is drawn directly from a specific Chinese urban fantasy novel.
Theme
The story revolves around a pair of dragon brothers from the novel’s narrative. In this world, the Dragon Kings are always born as twins, resurrecting every few millennia. They begin their lives in human form, initially unaware of their true divine nature.
Within the story’s framework, the dragon race is portrayed as ancient adversaries of humanity. The elder brother, Norton, grows up believing himself to be human. The pivotal moment occurs when his younger brother regains his memories and is killed while protecting him. This tragedy triggers Norton’s own awakening, compelling him to remember his true identity and destiny. Consumed by grief and a thirst for vengeance, he journeys to a sacred lakeside, where he summons his ancient kin to unleash their wrath upon humanity.
Week 1
This week, I made significant progress on the project by completing the storyboard and character modeling, with a primary focus on developing and refining the materials.
Storyboard


Given the significant number and complexity of the visual effects, I made a strategic decision to limit the number of storyboards. While this streamlined the initial phase, the reduced level of planning went on to make it considerably harder to guide the subsequent production stages.
Skeleton character scene production
Reference


Modeling
This week, I primarily used ZBrush to sculpt the skull of the younger brother character, Constantine. To improve workflow efficiency, I worked from physical asset references throughout the sculpting process.


I made the deliberate decision to build this scene in Unreal Engine. Given the rainy setting, this required me to specifically study and implement techniques for rendering a convincing rainy atmosphere within the engine.

A key takeaway from this process was learning to use the Rainy Material function within the EasyRain plugin to generate realistic wet and waterlogged materials for both my character model and the terrain. This experience showed that creating a convincing rainy environment in Unreal Engine can be a remarkably straightforward and effective workflow.



Reflection
In producing this particular shot, I encountered challenges in achieving the right balance between the atmospheric rain effects and the environmental lighting. The precipitation appears somewhat disconnected from the scene rather than feeling like an integrated natural element. This issue primarily stems from insufficient interaction between the rain particles and the scene’s primary light sources, resulting in a lack of believable absorption and reflection on surfaces. Additionally, the mist and humidity effects that typically accompany rainfall need further development to enhance atmospheric cohesion.
Week 2
This week, my primary focus is on creating the animated effect of a skull dissipating into particles. I intend to use this scene as the narrative conclusion, symbolizing that despite the protagonist’s burning desire for revenge, they ultimately share the same fate as their brother’s skull, moving irrevocably toward death and dissolution.
First, I used Houdini to create this effect, starting with the shapes of the individual fragments.

These fragments were then used to generate a volumetric source, which was emitted as a particle simulation with wind forces applied to achieve a natural, smoke-like dissipation. The final effect visually echoes the thematic motif of dissolution and fading memory.

Following the effect setup, I progressed to the material authoring phase, where I utilized the Redshift renderer to develop and refine all material assignments. The scene was then rendered in multiple dedicated passes (AOVs), which were systematically composited and integrated in Nuke to achieve full creative control and a polished final look.
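To illustrate the idea of recombining additive AOVs, here is a minimal Nuke Python sketch; the pass names, frame range and file paths are placeholders rather than the actual ones from my Redshift setup.

```python
import nuke

# Minimal sketch of rebuilding a beauty from additive Redshift AOVs in Nuke.
# Layer names and paths are placeholders; they depend on how the AOVs were configured.
read = nuke.nodes.Read(file="renders/skull_dissolve.####.exr", first=1001, last=1120)

def aov_layer(source, layer):
    """Shuffle a single AOV layer into the rgba channels."""
    return nuke.nodes.Shuffle(inputs=[source], **{"in": layer})

diffuse  = aov_layer(read, "diffuse")
specular = aov_layer(read, "specular")
emission = aov_layer(read, "emission")

# Additive AOVs are summed back together with 'plus' merges.
beauty = nuke.nodes.Merge2(inputs=[diffuse, specular], operation="plus")
beauty = nuke.nodes.Merge2(inputs=[beauty, emission], operation="plus")

nuke.nodes.Write(inputs=[beauty], file="comp/skull_dissolve_beauty.####.exr")
```

The same structure extends to however many additive passes the render produces, which keeps grading control per pass before the final sum.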







Reflection
This shot serves as a metaphorical sequence and can be viewed as having minimal direct connection to the main storyline. It functions more as a self-contained visual piece designed for the film’s conclusion. During production, my focus remained primarily on realizing the effect itself, rather than ensuring visual or narrative continuity with other shots — an oversight in hindsight, though I still believe it effectively conveys its intended symbolic meaning.
Additionally, I encountered a certain stylistic ambivalence while authoring the material. I intentionally emphasized its metallic qualities, deliberately reducing the reflectivity that a realistic skull would typically possess. Ultimately, I chose to proceed with this approach, as I believe the resulting material better accentuates the symbolic weight and visual impact of the effect.
Week 3
Dragon Animation Production
My plan involves creating a dynamic animation sequence featuring a giant dragon emerging from and submerging back into the water. The full animation will be developed in Maya, utilizing a combination of custom modeling and pre-existing assets to construct the dragon model and bring the creature to life.
Reference

Maya animation production


During the animation process, I conducted a study of Western dragon animation references to refine the wing mechanics. By carefully controlling the amplitude and deformation of the wing membranes, I ensured that the wings maintain an aerodynamically plausible shape throughout their rotation in the air, enhancing the physical believability of the creature’s flight.
Following the completion of the core animation, I introduced an additional narrative sequence depicting the dragon ascending into the sky and unleashing a torrent of flames, for which I created a dedicated animation cycle to enhance the dramatic conclusion.
Reflection
The limitations in this segment are quite apparent. Primarily, animation is not my core area of expertise, and the complexity involved in creating believable creature animation significantly surpasses that of standard character animation. Consequently, I made a strategic decision to allocate only limited resources to this animation sequence, with the intent of focusing my primary efforts on effects production where I can deliver stronger results.
Week 4
Simulation of seawater collision
This week, I officially began developing the dynamic effect of a dragon emerging from the water’s surface. I will leverage Houdini’s FLIP solver to achieve this complex fluid simulation, ensuring a realistic and visually compelling interaction between the creature and the aquatic environment.

First, I generated a VDB collision volume for the subsequent simulations.
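As a rough illustration of this step, the following hou Python sketch builds a VDB From Polygons node from an assumed animated dragon mesh; the node names, paths and voxel size are placeholders, and parameter names can differ between Houdini versions.

```python
import hou

# Sketch: building a collision VDB from the animated dragon geometry.
obj = hou.node("/obj")
geo = obj.node("dragon_collision") or obj.createNode("geo", "dragon_collision")

# The animated dragon mesh is assumed to already live in this container.
dragon = geo.node("IN_dragon")

vdb = geo.createNode("vdbfrompolygons", "collision_vdb")
vdb.setFirstInput(dragon)
vdb.parm("voxelsize").set(0.05)   # smaller voxels = more accurate (and heavier) collisions

out = geo.createNode("null", "OUT_collision")
out.setFirstInput(vdb)
out.setDisplayFlag(True)
```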

Next, I ran the seawater collision simulation, testing multiple versions. I had to adjust the inflow and outflow speeds of this section to control the size of the water spray, and finding a suitable spray took a great deal of time. This was then followed by a lengthy full-range simulation.
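The sketch below shows one simple way this kind of control can be exposed: a Python SOP that scales the velocity attribute on the source points before they enter the FLIP solver. The multiplier value is purely illustrative.

```python
# Python SOP sketch (hou is available implicitly inside a Python SOP).
# The spray size is largely driven by how fast fluid enters the tank, so a single
# multiplier on the v attribute gives a quick way to dial the splash up or down.
node = hou.pwd()
geo = node.geometry()

VEL_SCALE = 0.6   # illustrative value; <1 calms the spray, >1 exaggerates it

if geo.findPointAttrib("v") is not None:
    for point in geo.points():
        v = hou.Vector3(point.attribValue("v"))
        scaled = v * VEL_SCALE
        point.setAttribValue("v", (scaled[0], scaled[1], scaled[2]))
```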


Then came the production of the whitewater and the fluid mesh, which also took considerable time to adjust.
Water sim
Whitewater
Reflection
This week’s focus remained on developing the core visual effects. I had to limit time spent refining the whitewater simulation due to its significant computational cost, and the simulation accuracy fell short of my target due to hardware constraints. Additionally, the relatively small surface area defined for the water simulation may present challenges for upcoming shot design and final rendering.
Week 5
This week, I began constructing the core environment in Unreal Engine and developed a camera animation sequence in Houdini. This cinematic camera path was then imported into the UE scene to establish the visual framing and rendering setup for the final sequence.
This shot was designed to fully showcase the intended visual effect while optimizing my workflow efficiency. Although the original plan called for a sequence of three distinct shots, I ultimately consolidated their narrative and visual goals into this single, comprehensive shot.
I then exported the camera as an FBX file, preparing to move into the construction and lighting phase of the Unreal Engine scene.


I began by utilizing UE5’s built-in water system to generate a vast ocean surface as the foundational environment for the scene.

I then enriched the environment by strategically integrating cliff assets, which helped define the spatial boundaries and enhance the overall visual composition of the scene.


A critical technical issue emerged due to the 1000:1 scale ratio between Houdini and Unreal Engine cameras. After importing, the camera was positioned much farther from the water surface than intended. This excessive distance directly resulted in a severe loss of visible water ripple detail, making the surface appear unnaturally flat and compromising the scene’s realism.
Through systematic troubleshooting, I identified that the ocean surface texture is governed by the Water Static Mesh Material. After extensive parameter analysis, I pinpointed the Default Distant Water Scale parameter as the primary control. Following standard UE material practices, I created a new material instance based on the original, then adjusted this key parameter to successfully restore a realistic ocean surface appearance.
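The same adjustment can also be scripted with Unreal's editor Python API; the sketch below is only illustrative, with the parent material path, instance name and scalar value as assumptions, and the parameter name should be checked against the actual water material.

```python
import unreal

# Hypothetical asset paths; the real parent material ships with the Water plugin content.
PARENT_PATH   = "/Game/Water/Materials/M_WaterSurface"
INSTANCE_DIR  = "/Game/Env/Materials"

asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
factory = unreal.MaterialInstanceConstantFactoryNew()
mi = asset_tools.create_asset("MI_OceanSurface_Far", INSTANCE_DIR,
                              unreal.MaterialInstanceConstant, factory)

parent = unreal.EditorAssetLibrary.load_asset(PARENT_PATH)
unreal.MaterialEditingLibrary.set_material_instance_parent(mi, parent)

# Parameter name as it appears in the water material; verify it in the instance UI.
NEW_SCALE = 0.1   # illustrative; tune until ripple detail reads correctly at the camera distance
unreal.MaterialEditingLibrary.set_material_instance_scalar_parameter_value(
    mi, "Default Distant Water Scale", NEW_SCALE)

unreal.EditorAssetLibrary.save_loaded_asset(mi)
```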


Following the parameter adjustments, the ocean surface now renders correctly from my camera’s perspective, achieving the intended visual balance and scale.

The scene is set on a brightly illuminated night. I strategically adjusted the directional light to sculpt the atmosphere and redefine the visual mood of the environment.

Reflection
During this phase of production, I acquired proficiency in utilizing the water plugin to efficiently create ocean environments and mastered techniques for adjusting marine textures. While this stage of production was relatively straightforward, it still required a significant investment of time to resolve various ocean texture issues. That said, I recognize that I did not dedicate sufficient effort to the broader environmental design—an aspect I admittedly overlooked.
Week 6
This week, I primarily focused on creating the dragon’s fire-breathing effect. Although this element was not part of the original design plan, I recognized that adding a dynamic fire effect would significantly enhance the visual richness and narrative impact of the scene. I will be developing this effect using Houdini to ensure a high level of procedural control and visual quality.
This marks my first venture into Houdini’s Pyro system. I began by tackling the fundamental challenge of emitting particles directly from the dragon’s mouth as the core source for the fire effect.

The method I employed involves selecting a specific point on both the upper and lower jaws of the dragon model to compute a central origin point. I then repeated this process for two points located at the front of the mouth to define a target. By subtracting the former central point from the latter, I successfully derived a direction vector that accurately guides the emission and propagation of the flame effect.
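A minimal Python SOP version of this calculation might look like the following; the point numbers are hypothetical stand-ins for the jaw and mouth-front points picked on the actual model.

```python
# Python SOP sketch of the emission-direction setup described above.
node = hou.pwd()
geo = node.geometry()

UPPER_JAW, LOWER_JAW = 1203, 1871      # hypothetical point numbers on the jaws
FRONT_A, FRONT_B     = 2410, 2433      # hypothetical points at the front of the mouth

jaw_center   = (geo.point(UPPER_JAW).position() + geo.point(LOWER_JAW).position()) * 0.5
front_center = (geo.point(FRONT_A).position()   + geo.point(FRONT_B).position())   * 0.5

# Direction from the middle of the mouth toward its front, i.e. where the fire should travel.
emit_dir = (front_center - jaw_center).normalized()

# Store origin and direction as detail attributes so the emitter network can read them each frame.
for name, value in (("emit_origin", jaw_center), ("emit_dir", emit_dir)):
    if geo.findGlobalAttrib(name) is None:
        geo.addAttrib(hou.attribType.Global, name, (0.0, 0.0, 0.0))
    geo.setGlobalAttribValue(name, (value[0], value[1], value[2]))
```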

However, a significant issue arose when introducing the POP simulation—a problem rooted in an oversight during the initial modeling and animation phase. The dragon’s animation was crafted as a slow-motion shot, but I found that the POP simulation could not run at a correspondingly slow speed without compromising the system. Running the simulation too slowly caused particles to fail to emit correctly from the dragon’s mouth aperture and resulted in highly unnatural particle velocity, a problem that persisted and puzzled me for quite some time.

A fundamental misalignment issue emerged from my technical approach. My process involved first emitting a stream of particles, which I later converted into a continuous trail. However, it became visually evident that the entire particle stream was being generated to the left side of the dragon’s head, rather than being locked to its mouth. The root cause was clear: the dragon model was continuously moving in world space, while the emitted particles, once spawned, failed to inherit this subsequent motion, causing them to lag behind and break the illusion.

After integrating the Pyro system, I managed to address the spatial alignment through keyframe adjustments. However, the fundamental issue of temporal mismatch persisted—the particle flow velocity remained excessively fast, creating a severe disconnect with the dragon’s slow-motion animation.

This compelled me to adopt an alternative particle emission method. I copied a small sphere to the starting point of my calculated vector, scattered points within this volume, and applied noise distortion to them. Using this processed volume as the new emission source effectively solved the issue of uniform emission inherent in the single-line trail approach.

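A simplified Python SOP sketch of this kind of replacement source is shown below; it assumes the emit_origin and emit_dir detail attributes from the earlier step exist upstream, and the radius, point count and speed values are purely illustrative.

```python
import random

# Python SOP sketch: a small cloud of jittered points around the start of the
# computed vector, each drifting along the mouth direction at a controlled speed.
node = hou.pwd()
geo = node.geometry()

origin    = hou.Vector3(geo.attribValue("emit_origin"))
direction = hou.Vector3(geo.attribValue("emit_dir"))

geo.clear()                                   # keep only the new source points
v_attrib = geo.addAttrib(hou.attribType.Point, "v", (0.0, 0.0, 0.0))

RADIUS, COUNT, SPEED = 0.15, 500, 2.0         # illustrative values
random.seed(1)

for _ in range(COUNT):
    jitter = hou.Vector3(random.uniform(-1, 1),
                         random.uniform(-1, 1),
                         random.uniform(-1, 1)) * RADIUS
    pt = geo.createPoint()
    pt.setPosition(origin + jitter)
    # every point moves along the mouth direction at a speed the Pyro sim can keep up with
    vel = direction * SPEED
    pt.setAttribValue(v_attrib, (vel[0], vel[1], vel[2]))
```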
While the final result was achieved by saving the simulation with slower particle speeds and then manually aligning it to the dragon’s mouth via keyframing in the timeline, this method is ultimately a compensatory workflow rather than a fundamental solution. It bypasses the core issue—the mismatch between simulation time scale and artistic direction—and incurs significant costs in iteration time and creative flexibility.
As shown in the image, I tested a vast number of flame variations, which consumed a significant amount of my time.
Fire scene 2
Reflection
As my first attempt at creating an effect with Houdini Pyro, I have delivered a functional result, yet it still falls significantly short of my initial vision. I am aware that the effect has substantial room for optimization—particularly in how the slow-motion cinematography was designed without fully considering its impact on the simulation workflow. This oversight in pre-production planning stands as a key learning from this project.
Week 7
This week, my focus will shift primarily to compositing. I will integrate and refine all the rendered elements to achieve the final visual look.


I began by applying a CC Toner to adjust the overall color palette of the scene. Subsequently, I performed camera tracking to solve the scene’s camera, a crucial step that enables the seamless integration and replacement of the sky background in subsequent compositing operations.

As shown in the image, I used a starry sky image as the new background. By duplicating the original scene layer and applying Cryptomatte to extract the foreground elements, I successfully replaced the sky while preserving all foreground details.
Following this, I extracted the core dragon element and began compositing it into the scene through a multi-step integration process. This involved carefully blending the creature with the new environment while maintaining visual consistency.






Finally, I introduced a bright moon into the sky composition, which serves as both a focal point and a source of atmospheric lighting, enhancing the overall visual hierarchy of the scene.

Reflection
I am relatively satisfied with the compositing outcome, as I believe I have successfully integrated the scene with my visual effects. However, I must acknowledge a significant limitation: the VFX elements were not rendered using the lighting data from this composite scene. This discrepancy undoubtedly impacted the final cohesion, representing a key area for improvement in my compositing pipeline.
Week 8
This week, I will focus on expanding the Unreal Engine scene and refining the characters to strengthen the narrative cohesion and storytelling impact of my film.


As shown in the figure, I utilized MetaHuman characters for the production and modified their iris texture maps to accurately recreate the protagonist’s distinctive golden pupils.

During this production phase, I primarily focused on mastering how to control Niagara particle size across the frames of the sequence, specifically learning how to expose and manipulate user parameters within that workflow.
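Assuming this refers to Niagara user parameters that are then keyframed from a Level Sequence, a minimal editor Python sketch of driving such a parameter directly might look like this; the actor label, parameter name and value are hypothetical, and the setter shown is the Python binding of UE's "Set Niagara Variable (Float)" call, so the exact name should be verified in your engine version.

```python
import unreal

# Sketch: finding a Niagara actor in the level and setting a user parameter on it.
actors = unreal.EditorLevelLibrary.get_all_level_actors()
fx_actor = next(a for a in actors if a.get_actor_label() == "NS_Embers")  # hypothetical label

niagara_comp = fx_actor.get_component_by_class(unreal.NiagaraComponent)
# "User.ParticleSize" is a placeholder for whichever user parameter drives the size.
niagara_comp.set_niagara_variable_float("User.ParticleSize", 0.5)
```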
During this phase, I actually produced multiple shot iterations, though ultimately only two were selected for the final cut. The remainder were omitted due to not meeting the desired quality standards.





Reflection
This part warrants deep reflection. The majority of my energy was dedicated to VFX production, without establishing detailed storyboards for these narrative segments. Consequently, despite attempting multiple shots, I struggled to assemble them into a coherent sequence. Furthermore, as character animation is not my core competency, the results appeared noticeably rigid, forcing me to minimize their use wherever possible.
Conclusion
This VFX project represents a bold attempt at integrating Houdini simulations with Unreal Engine environments for film production. Through this process, I deeply appreciated UE’s efficiency in scene assembly for animation pipelines, as well as Houdini’s powerful capabilities in producing fluid and fire effects. While this graduation project has been an immense learning journey, the final outcome falls short of my initial vision in terms of polish and aesthetic refinement—though time constraints prevent further iterations.
Upon reflection, I recognize that undertaking so many production modules alone was overly ambitious. I should have focused more deeply on specific aspects rather than spreading efforts thin across all areas. Another key lesson was the lack of a clearly defined visual target from the outset, which significantly impacted both workload efficiency and final quality.
Nevertheless, I dedicated my full effort to this film, and I believe it genuinely reflects the progress I’ve made throughout my one-year master’s program.