This is my final result!
This is the breakdown.
This week I imported all the resources into UE4 and created a project to bring all the elements together.
For the animation resources, I used Alembic (.abc) files to bring the cloth-simulation results into UE4.
In addition, I obtained a special-effects resource pack from the Unreal Marketplace to achieve some cartoon effects.
Then I used the Sequencer in UE4 to create the camera animation and placed the various elements in their appropriate positions.
When rendering and exporting, I tried the Movie Render Queue in UE4. Compared with exporting directly from Sequencer, the Movie Render Queue can sample each frame multiple times to obtain better motion blur and smoother lines. Real-time rendering is very fast: I have more than 1,400 frames in total, and rendering completed in about an hour.
Finally, I imported the rendered image sequence into Premiere Pro and used Adobe Audition's free sound-effect library. I also purchased some sound effects online, then edited and mixed everything in Premiere Pro. The title of the film was the last thing I thought about. In fact, I think the story is rather unclear, but it was already in post-production and I could not change it any more. In short, I am satisfied with the visual effects.
This week I learned to make a diffusion effect in UE4 and learned about the Niagara system, the material system, and the Blueprint system in UE.
This is my documentation:
I referred to an effect in Spider-Man: Into the Spider-Verse:
This is the final effect.
Through these animation experiments in real-time engines, I have learned the actual workflow of a real-time rendering engine. With this experience, I will be more efficient when I start producing my personal animation work, and I have a deeper understanding of the engines' advantages and disadvantages.
This week I have been testing a variable-weather scene in Unity.
From model and asset import, through material creation, to post-processing, I gained a basic understanding of the rendering side of a real-time engine.
The effect of using displacement mapping:
Shader Code:
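The original code screenshot is not reproduced here, but the idea can be sketched as a minimal Unity surface shader that pushes each vertex out along its normal by a height-map value. All property names and values below are illustrative assumptions, not the project's actual code.

```hlsl
// Hypothetical sketch of displacement mapping in a Unity surface shader.
// The vertex program samples a height map and offsets vertices along
// their normals; the fragment program just applies the base texture.
Shader "Custom/DisplacementSketch"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
        _DispTex ("Displacement Map", 2D) = "gray" {}
        _Amount  ("Displacement Amount", Range(0, 1)) = 0.1
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Standard vertex:vert addshadow
        #pragma target 3.0   // needed for tex2Dlod in the vertex program

        sampler2D _MainTex;
        sampler2D _DispTex;
        float _Amount;

        struct Input { float2 uv_MainTex; };

        void vert(inout appdata_full v)
        {
            // Sample the height map (explicit LOD, since the vertex
            // stage has no derivatives) and push the vertex outward.
            float h = tex2Dlod(_DispTex, float4(v.texcoord.xy, 0, 0)).r;
            v.vertex.xyz += v.normal * h * _Amount;
        }

        void surf(Input IN, inout SurfaceOutputStandard o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```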
I mainly produced rainy and snowy environments.
The core idea of the snow: texture blending creates the snow colour, and vertex offset creates the snow thickness.
The snow colour is added in the fragment program, mainly by calculating the dot product between the object's normal and the snow direction: snow accumulates where the angle between them is less than 90 degrees, i.e. where the dot product is positive. Code:
The snow thickness is created by vertex offset: vertices whose normals face the snow direction are pushed outward, scaled by the snow thickness and the snowfall amount; code:
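The two ideas above can be sketched together in one minimal Unity surface shader: the fragment program blends in the snow texture where the world normal faces the snow direction, and the vertex program thickens the mesh in that direction. All property names and constants are illustrative assumptions, not the project's actual code.

```hlsl
// Hypothetical snow-cover sketch.
// surf(): dot(world normal, snow direction) > 0 marks snowy areas.
// vert(): vertices facing the snow direction are offset to fake thickness.
Shader "Custom/SnowCoverSketch"
{
    Properties
    {
        _MainTex   ("Base Texture", 2D) = "white" {}
        _SnowTex   ("Snow Texture", 2D) = "white" {}
        _SnowDir   ("Snow Direction (world)", Vector) = (0, 1, 0, 0)
        _Snowfall  ("Snowfall Amount", Range(0, 1)) = 0.5
        _SnowDepth ("Snow Thickness", Range(0, 0.3)) = 0.05
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Standard vertex:vert addshadow

        sampler2D _MainTex, _SnowTex;
        float4 _SnowDir;
        float _Snowfall, _SnowDepth;

        struct Input { float2 uv_MainTex; float3 worldNormal; };

        void vert(inout appdata_full v)
        {
            // Bring the world-space snow direction into object space,
            // then offset vertices that face it.
            float3 snowDir =
                normalize(mul((float3x3)unity_WorldToObject, _SnowDir.xyz));
            float facing = dot(normalize(v.normal), snowDir);
            if (facing > 0)
                v.vertex.xyz += snowDir * facing * _SnowDepth * _Snowfall;
        }

        void surf(Input IN, inout SurfaceOutputStandard o)
        {
            // Positive dot product (angle < 90 degrees) = surface holds snow.
            float facing = dot(normalize(IN.worldNormal),
                               normalize(_SnowDir.xyz));
            float mask = saturate(facing) * _Snowfall;
            float3 baseCol = tex2D(_MainTex, IN.uv_MainTex).rgb;
            float3 snowCol = tex2D(_SnowTex, IN.uv_MainTex).rgb;
            o.Albedo = lerp(baseCol, snowCol, mask);
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```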
The snow particle effect needs attention both to the movement of the particles and to their melting after landing. Because snow is light and its flight path is erratic, the velocity can be given multiple directions, and the fall speed can be lowered so the snow falls more slowly. A noise effect can also be enabled, but since the noise is always active, the snowflakes would keep moving after landing, so it is not applicable here. The landing effect requires collision to be turned on: add a Mesh Collider to the objects in the scene so that the particles can collide with the models, and adjust the parameters so the particles stay on the model surfaces. Because the scene is a static mesh, the collision quality can be set to medium or low, which uses a set of voxels to cache previous collisions so they can be quickly reused in future frames. The disappearing and melting effect is similar to that of the screen snowflakes.
The core idea of the rain: a dynamic texture, plus a mask based on direction in world space.
I created an orthographic camera and a particle system with a black background in a very distant part of the scene, and created a RenderTexture. By capturing the particle image onto the RenderTexture through this orthographic camera, I obtained a dynamic black-and-white texture, and the dynamics of the texture can be modified and extended arbitrarily.
Then I had to consider how to calculate the mask. Rain generally lands on top of the model, so I decided to use the world normal direction, float3 wNormal = WorldNormalVector(IN, o.Normal); with it, I can give faces pointing in different directions different colours: faces pointing along the positive y-axis are green, and the mask changes as the model rotates.
After obtaining the dynamic texture and the black-and-white mask, the production idea is very clear: combine the mask to display the dynamic texture in the correct position, and then adjust the roughness, normal, colour and other attributes. Here I referred to some photographs of objects in the rain and summarized some characteristic attributes, such as water absorption: the stronger an object's water absorption, the darker its colour becomes when it is wet.
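The wet-surface idea above can be sketched as a minimal surface shader: a world-normal mask selects upward-facing surfaces, the dynamic ripple RenderTexture is projected onto them from above, and a "water absorption" parameter darkens the albedo. All property names and constants are illustrative assumptions, not the project's actual code.

```hlsl
// Hypothetical rain-wetness sketch.
// upMask: world normal's +y component marks surfaces that catch rain.
// The ripple RenderTexture is sampled with world-space xz as UVs.
Shader "Custom/RainWetnessSketch"
{
    Properties
    {
        _MainTex    ("Base Texture", 2D) = "white" {}
        _RippleTex  ("Ripple RenderTexture", 2D) = "black" {}
        _Wetness    ("Wetness", Range(0, 1)) = 0.5
        _Absorption ("Water Absorption", Range(0, 1)) = 0.5
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Standard

        sampler2D _MainTex, _RippleTex;
        float _Wetness, _Absorption;

        struct Input
        {
            float2 uv_MainTex;
            float3 worldNormal;
            float3 worldPos;
        };

        void surf(Input IN, inout SurfaceOutputStandard o)
        {
            // Only surfaces whose world normal points up (+y) get rained on.
            float upMask = saturate(IN.worldNormal.y);

            // Project the dynamic ripple texture top-down (0.1 is an
            // arbitrary world-to-UV tiling factor).
            float ripple = tex2D(_RippleTex, IN.worldPos.xz * 0.1).r;

            float wet = upMask * _Wetness;
            float3 baseCol = tex2D(_MainTex, IN.uv_MainTex).rgb;

            // Stronger absorption -> darker when wet; wet areas are smoother.
            o.Albedo = baseCol * lerp(1.0, 1.0 - 0.5 * _Absorption, wet);
            o.Smoothness = saturate(wet + ripple * wet);
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```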
Later, I also made a moss-growth effect based on the snow effect. This is the final effect of the weather system:
This week I have been looking at animations made with real-time rendering engines. The official teams behind Unity and Unreal have released a number of animated shorts in recent years to demonstrate what real-time rendering can do, and I have been studying the works from these two engines.
Animation created by UE4:
Animation created by Unity:
Unreal Engine-powered animation production pipeline:
Of course, what can be learned only from these works may be somewhat biased; after all, they are all officially presented showcases. Next, I will use the UE and Unity engines in some practical project tests. On the one hand, I will learn about real-time rendering engines; on the other, through actual project practice, I may find some shortcomings of current real-time engines for animation creation.
This week I have been looking at how real-time engines affect narrative structure in animation. A number of interactive films have been created using the features of real-time engines, such as Detroit: Become Human, Heavy Rain, Beyond: Two Souls and more. I looked at how they tell their stories.
The first, quite intuitive observation is that the quality of the graphics keeps getting better, thanks to new rendering techniques and the development of graphics cards.
All of these productions, with the exception of Beyond: Two Souls, have a multi-line narrative, with as many as four threads in Heavy Rain. In all of them, the viewer can control a character to explore the current scene freely; the people and objects in the scene provide more of the plot and enhance the viewer's immersion.
Animation that draws on the nature of the real-time engine still largely borrows its forms from games: RPG-style dialogue choices, QTEs, third-person action-adventure mechanics, and so on.
My overall impression was that the format needs to match the story, and that sometimes the feel of the film and the feel of the game interfere with each other. Some habits developed from watching traditional films do not work in this type of production; for example, the sudden shift from watching a movie to playing a game at a tense moment felt abrupt.
This week I have been collecting references for the topic I am researching, the use of real-time engines in animation production.
Through my research, I found that there has already been a lot of in-depth discussion in this area, and that some companies have experimented with it in commercial projects. Next, I can continue with the analysis of some specific cases.
This week I created the special effects in the film. In the story, the little prince uses magic to disappear in order to trick people; here is the disappearing magic, which I wanted to show with a particle effect.
UE4 has a new particle system called Niagara. The Niagara VFX System is one of two tools you can use to create and adjust visual effects (VFX) inside Unreal Engine 4 (UE4).
First, I created a blank system. Since the character disappears as a whole, the particles should be generated from the character, so I used a mesh emitter: I exported a separate static pose from Maya to use as the emitter for the particles.
The generated particles need some movement and variation; I control the movement mainly through a Curl Noise Force, and use a colour curve to control the colour change of the particles.
This is the final result.
This week I mainly worked on the first animation. The character sits on the roof of the car throughout, and the animation focuses mainly on the character's facial expression and yawning movement. What I want to express in this passage is the character's boredom at the beginning and the excitement of seeing the police car later, conveyed mainly through the changing speed of the rhythm.
Thanks to the earlier rigging, the scarf animation could be produced easily. However, there was a problem with the necklace animation: because it was rigged with an IK Spine, the bending direction of the necklace also needed to be adjusted. Since the early rig lacked some controllers, manual fixes took a lot of time here. Fortunately, the necklace animation is not complicated, but it is hard to make look natural.
This is the final animation:
After that, I unbound the jacket and shorts for this animation and simulated them with nCloth, mainly referring to Maya's official fabric presets and using the Heavy Denim settings.
In addition, constraints are used to fix the clothes to the body. Note that if the scene is in centimetres, you need to change the nucleus Space Scale to 0.01 or an even smaller value to achieve a more realistic effect.
After settling on this production process, I made the other animations following the same steps. These are the final results.
I started making the animation this week! During this period, I solved the scene problem: I purchased an environment project from the Unreal Marketplace, rebuilt the environment in UE, and finally had a usable environment.
Once I had a basic environment, I could make a layout.
I imported the rigged character model from Maya into the environment and made two layouts, mainly to determine the shots.