This week I learned how to make a diffusion effect in UE4, and along the way became familiar with UE's Niagara, material, and Blueprint systems.
This is my documentation:
I referenced an effect from Spider-Man: Into the Spider-Verse:
This is the final effect.
Through these animation experiments in real-time engines, I learned about the actual workflow of a real-time rendering engine. With this experience I will be more efficient when I start producing my own animation work, and I will have a deeper understanding of the engines' advantages and disadvantages.
This week I have been testing a scene with changeable weather in Unity.
From importing model assets and creating materials through to post-processing, I gained a basic understanding of the rendering side of a real-time engine.
The effect of using displacement mapping:
Shader Code:
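Since the idea is simply to push vertices outward along their normals according to a height map, a minimal sketch in a Unity surface shader looks roughly like this (the property names _DispTex and _Displacement are placeholders, not the exact ones from the project):

Shader "Custom/SimpleDisplacement"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
        _DispTex ("Displacement Map", 2D) = "black" {}
        _Displacement ("Displacement Strength", Range(0, 1)) = 0.3
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }

        CGPROGRAM
        // A custom vertex function lets us move vertices before lighting.
        #pragma surface surf Standard vertex:vert addshadow
        #pragma target 3.0

        sampler2D _MainTex;
        sampler2D _DispTex;
        float _Displacement;

        struct Input
        {
            float2 uv_MainTex;
        };

        void vert(inout appdata_full v)
        {
            // Sample the height map in the vertex stage (explicit mip level 0)
            // and push the vertex out along its normal.
            float height = tex2Dlod(_DispTex, float4(v.texcoord.xy, 0, 0)).r;
            v.vertex.xyz += v.normal * height * _Displacement;
        }

        void surf(Input IN, inout SurfaceOutputStandard o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}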
I mainly produced rainy and snowy environments.
The core idea of the snow: blend textures to create the snow colour, and offset vertices to create the snow thickness.
The snow colour is added in the fragment program, which mainly calculates the dot product between the object's normal and the snow direction; snow appears where the angle between them is less than 90 degrees. Code:
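A rough sketch of this idea in a Unity surface shader; the property names (_SnowColor, _SnowDirection, _SnowAmount) and the exact blending are assumptions rather than the project's actual code:

Shader "Custom/SnowColour"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
        _SnowColor ("Snow Colour", Color) = (1, 1, 1, 1)
        _SnowDirection ("Snow Direction", Vector) = (0, 1, 0, 0)
        _SnowAmount ("Snow Amount", Range(0, 1)) = 0.5
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }

        CGPROGRAM
        #pragma surface surf Standard
        #pragma target 3.0

        sampler2D _MainTex;
        fixed4 _SnowColor;
        float4 _SnowDirection;
        float _SnowAmount;

        struct Input
        {
            float2 uv_MainTex;
            float3 worldNormal; // world-space normal (o.Normal is not written here)
        };

        void surf(Input IN, inout SurfaceOutputStandard o)
        {
            fixed3 baseCol = tex2D(_MainTex, IN.uv_MainTex).rgb;

            // Dot product between the world-space normal and the snow
            // direction: positive means the angle between them is less
            // than 90 degrees, i.e. the face is exposed to the snow.
            float facing = dot(normalize(IN.worldNormal),
                               normalize(_SnowDirection.xyz));

            // _SnowAmount lowers the threshold so snow spreads from
            // upward-facing surfaces to side-facing ones.
            float snowMask = step(lerp(1.0, -1.0, _SnowAmount), facing);

            o.Albedo = lerp(baseCol, _SnowColor.rgb, snowMask);
        }
        ENDCG
    }
    FallBack "Diffuse"
}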
The snow thickness is created with a vertex offset: the vertex normal is combined with the snow direction, then multiplied by the snow thickness and the amount of snowfall. Code:
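And a matching sketch of the vertex offset, meant to be added to the shader above (the pragma becomes #pragma surface surf Standard vertex:vert addshadow, and _SnowDepth is an assumed extra property):

    // Extra property: _SnowDepth ("Snow Thickness", Range(0, 0.3)) = 0.1
    float _SnowDepth;

    void vert(inout appdata_full v)
    {
        // Bring the world-space snow direction into object space so it can
        // be compared with the object-space vertex normal.
        float3 snowDirObj = normalize(mul((float3x3)unity_WorldToObject,
                                          _SnowDirection.xyz));

        // 1 where the vertex faces the falling snow, 0 on sheltered faces.
        float facing = saturate(dot(normalize(v.normal), snowDirObj));

        // Push the vertex out along its normal, scaled by the snow
        // thickness and the current amount of snowfall.
        v.vertex.xyz += v.normal * facing * _SnowDepth * _SnowAmount;
    }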
The snow particle effect needs attention to how the particles move and how they melt after landing. Because snow is light and its flight path is erratic, the velocity can be given several directions and the fall speed can be lowered so the snow drifts down more slowly. A noise module could also be enabled, but since the noise is always active the snowflakes would keep moving after landing, so it is not suitable here. The landing effect requires collision: add Mesh Colliders to the objects in the scene so the particles can collide with the models, and adjust the parameters so the particles stay on the model surfaces. Because the scene is made of static meshes, the collision quality can be set to medium or low, which uses a set of voxels to cache previous collisions so they can be reused quickly in later frames. The disappearing and melting effect is similar to that of the screen snowflakes.
The core idea of the rain: a dynamic texture, plus a mask based on direction in world space.
Far away from the main scene I created a separate orthographic camera and a particle system against a black background, together with a RenderTexture. The orthographic camera captures the particles into the RenderTexture, producing a dynamic black-and-white texture whose motion can be modified and extended arbitrarily.
Then I had to work out how to calculate the mask. Rain generally collects on top of the model, so I decided to use the world normal direction, float3 wNormal = WorldNormalVector(IN, o.Normal); With it I can colour the model's faces according to the direction they point (the positive y direction is shown as green), and the mask follows the model as it rotates.
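A small sketch of this mask in a Unity surface shader (writing o.Normal from a normal map is what makes WorldNormalVector and INTERNAL_DATA necessary; the green output is only there to visualise the mask):

Shader "Custom/RainMaskDebug"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
        _BumpMap ("Normal Map", 2D) = "bump" {}
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }

        CGPROGRAM
        #pragma surface surf Standard
        #pragma target 3.0

        sampler2D _MainTex;
        sampler2D _BumpMap;

        struct Input
        {
            float2 uv_MainTex;
            float2 uv_BumpMap;
            float3 worldNormal;
            INTERNAL_DATA
        };

        void surf(Input IN, inout SurfaceOutputStandard o)
        {
            // Write the tangent-space normal first, then recover the
            // world-space normal from it.
            o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));
            float3 wNormal = WorldNormalVector(IN, o.Normal);

            // Mask: 1 where the surface faces up (+y in world space),
            // 0 on vertical or downward-facing surfaces. Because it is
            // computed in world space it follows the model as it rotates.
            float rainMask = saturate(wNormal.y);

            // Show the mask as green while checking it.
            o.Albedo = lerp(tex2D(_MainTex, IN.uv_MainTex).rgb,
                            float3(0, 1, 0), rainMask);
        }
        ENDCG
    }
    FallBack "Diffuse"
}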
With the dynamic texture and the black-and-white mask in hand, the next step is clear: use the mask to display the dynamic texture in the correct places, and then adjust the roughness, normal, colour and other attributes. I referred to photographs of objects in the rain and summarised some characteristic properties, such as water absorption: the more absorbent an object is, the darker its colour becomes when it gets wet.
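Extending the mask sketch above, a rough example of how the mask and the dynamic RenderTexture can drive the wet look; _RippleTex, _Wetness and _Absorption are assumed names and the constants are placeholder values:

    sampler2D _RippleTex;  // the dynamic RenderTexture from the orthographic camera
    float _Wetness;        // overall rain intensity, 0..1
    float _Absorption;     // how strongly the material darkens when wet, 0..1

    void surf(Input IN, inout SurfaceOutputStandard o)
    {
        o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));
        float3 wNormal = WorldNormalVector(IN, o.Normal);
        float rainMask = saturate(wNormal.y);

        // Dynamic black-and-white pattern, shown only on rain-exposed
        // (upward-facing) surfaces. Reusing the albedo UVs is a simplification.
        float ripple = tex2D(_RippleTex, IN.uv_MainTex).r;
        float wet = rainMask * _Wetness;

        fixed3 baseCol = tex2D(_MainTex, IN.uv_MainTex).rgb;

        // Wet surfaces: darker albedo (more so for absorbent materials)
        // and higher smoothness (lower roughness).
        o.Albedo = baseCol * lerp(1.0, 1.0 - 0.6 * _Absorption, wet);
        o.Smoothness = lerp(0.2, 0.9, saturate(wet + ripple * rainMask));
    }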
Later I also made moss growth based on the snow effect. This is the final result of the weather system:
This week I have been looking at animations made with real-time rendering engines. In recent years Unity Technologies and Epic Games have officially released a number of short animations made with their engines to demonstrate what real-time rendering can do, and I have been studying these works.
Animation created by UE4:
Animation created by Unity:
Unreal Engine-powered animation production pipeline:
Of course, what can be learned from these works alone is somewhat biased, since they are all official showcases of the engines' strengths. Next I will use the UE and Unity engines myself and run some practical project tests: on the one hand to learn about real-time rendering engines, and on the other, through actual project practice, perhaps to find some shortcomings of current real-time engines for animation creation.
This week I have been looking at how real-time engines affect narrative structure in animation. A number of interactive films have been created using the features of the real-time engine, such as Detroit: Become Human, Heavy Rain, Beyond: Two Souls and others, and I looked at how they tell their stories.
The first and most obvious point is that the quality of the graphics keeps getting better, thanks to new rendering techniques and the development of graphics cards.
All of these productions, with the exception of Beyond: Two Souls, have a multi-threaded narrative, with as many as four storylines in Heavy Rain. In all of them, the viewer can control a character and explore the current scene freely; the people and objects in the scene provide additional plot and enhance the viewer's immersion.
Animation that draws on the interactive nature of the real-time engine still largely borrows its forms from games: RPG-style dialogue choices, QTEs, third-person action-adventure mechanics and so on.
My overall impression was that the format needs to match the story, and that the feel of the film and the feel of the game sometimes get in each other's way. Some habits formed by watching traditional films do not carry over to this kind of work; for example, being pulled from watching a movie into playing a game right at a tense moment felt sudden and abrupt.
This week I have been collecting references for the topic I am researching: the use of real-time engines in animation production.
Through this research I found that there has already been quite a lot of in-depth discussion in this area, and that some companies have already experimented with it in commercial projects. Next I can continue with an analysis of some specific cases.