Let’s take a look at some of the features and highlights of Unreal Engine 5.4. This release brings significant updates to the engine’s built-in animation toolset, enabling you to rig characters and author animations quickly and easily, directly within the editor. With the Modular Control Rig, you can build intuitive animation rigs in just a few steps, without switching to complex external software.
Unreal Engine 5.4 Feature Highlights
Character rigging and animation authoring
The new experimental Modular Control Rig feature enables automatic rig retargeting, making it easy to reuse biped character animations on other characters and to quickly achieve good results across different models. Bone editing performance has been significantly improved, and many new features have been added, all streamlined for deformation workflows.
These include new experimental Gizmos, a reorganized Anim Details panel, updates and improvements to the Constraints system, and a new Layered Control Rigs feature that drastically simplifies layering additional animation on top of existing animation clips.
Meanwhile, Sequencer, Unreal Engine’s nonlinear animation editor, has undergone a significant redesign that improves the readability and usability of the Sequencer tree. Among the other new features in this version is Keyframe Scriptability, which opens up further potential for creating custom animation tools.
Why Unreal Engine 5.4 is a Game Changer
Animation gameplay
On the animation gameplay front, Motion Matching, previously introduced as an experimental feature, is now production-ready. It has been battle-tested in Fortnite Battle Royale and deployed on every platform, from mobile to console, covering all 100 characters plus NPCs.
Motion Matching is an expandable, next-generation framework for animation features. It works by searching a relatively large database of captured animations, using the character’s current in-game motion information as the search key.
In this version, we’ve focused on delivering a robust, high-performing, and memory-scalable set of tools, along with adding a suite of debugging tools that provide developers with visibility into its internal workings.
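At its core, the search described above is a nearest-neighbor lookup: each database pose is summarized as a feature vector, and the engine picks the pose whose features best match the character’s current motion. The sketch below illustrates the principle with a brute-force search; the types and names are hypothetical and not the actual Motion Matching API, which uses far richer features and accelerated search structures.

```cpp
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// Hypothetical feature vector: a few floats summarizing trajectory and pose
// (positions, velocities). Real systems use many more dimensions.
using FeatureVector = std::vector<float>;

// Squared Euclidean distance between two feature vectors of equal length.
inline float SquaredDistance(const FeatureVector& a, const FeatureVector& b) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < a.size(); ++i) {
        const float d = a[i] - b[i];
        sum += d * d;
    }
    return sum;
}

// Brute-force search: return the index of the database pose whose features
// best match the query. Production implementations prune and index the
// database for performance, but the principle is the same.
std::size_t FindBestMatch(const std::vector<FeatureVector>& database,
                          const FeatureVector& query) {
    std::size_t best = 0;
    float bestDist = std::numeric_limits<float>::max();
    for (std::size_t i = 0; i < database.size(); ++i) {
        const float d = SquaredDistance(database[i], query);
        if (d < bestDist) {
            bestDist = d;
            best = i;
        }
    }
    return best;
}
```

Run every frame, this selects the animation frame to play next, which is why memory scalability and search speed matter so much in practice.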
Additionally, on the gameplay front, we’ve introduced Choosers, a highly requested tool that leverages the game context to drive animation selection. The system can use variables to inform selections and set variables based on those selections to inform gameplay logic.
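Conceptually, a chooser is a table of rules evaluated against the game context: each entry reads context variables to decide whether it applies, and the winning selection can be written back for gameplay logic to consume. The sketch below illustrates that read-and-write-back pattern with hypothetical names; it is not the actual Choosers API.

```cpp
#include <functional>
#include <string>
#include <vector>

// Hypothetical game context; these fields are illustrative only.
struct GameContext {
    float Speed = 0.0f;
    bool bIsCrouching = false;
    std::string LastSelection;  // written back so gameplay logic can react
};

// One chooser entry: a predicate over the context plus the asset it selects.
struct ChooserEntry {
    std::function<bool(const GameContext&)> Predicate;
    std::string AnimationAsset;
};

// Evaluate entries in order; the first match wins, and the selection is
// written back into the context, mirroring how a chooser can both read
// variables to inform selection and set variables based on it.
std::string ChooseAnimation(const std::vector<ChooserEntry>& entries,
                            GameContext& ctx, const std::string& fallback) {
    for (const ChooserEntry& entry : entries) {
        if (entry.Predicate(ctx)) {
            ctx.LastSelection = entry.AnimationAsset;
            return entry.AnimationAsset;
        }
    }
    ctx.LastSelection = fallback;
    return fallback;
}
```

A crouching check placed before a speed check, for example, guarantees crouch animations win even while the character is moving fast.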
We’ve also improved overall usability, making the toolset more intuitive and stable, with numerous enhancements to Sequencer. New character animation control features allow the character’s current in-game motion information to be captured as keyframes at runtime.
Rendering
On the rendering side, Nanite, UE5’s virtualized micropolygon geometry system, continues to receive enhancements, starting with a new experimental tessellation feature that adds fine details such as cracks and bumps at render time without altering the original mesh.
Furthermore, the addition of variable rate shading (VRS) to Nanite compute materials brings significant performance improvements, substantially reducing render times. Stability and speed have also been enhanced to produce predictable output on any target platform.
Lastly, the Movie Render Queue has received a major update that lets you configure renders through a node-based architecture. Additionally, Unreal Engine 5.4 introduces a new Motion Design mode, developed in collaboration with industry leaders, providing specialized tools for creating 2D and 3D motion graphics, including cloners, effectors, modifiers, animation controllers, and more.
Virtual production
In this version, the Virtual Camera tool supports Android as well as iOS, while on macOS it fully supports the virtual camera workflow, including VR scouting. A new, fully customizable toolkit built on the XR Creative Framework has also been launched, supporting OpenXR HMDs such as Oculus and Valve Index.
On the ICVFX front, depth of field compensation has been added to nDisplay. Using the real camera’s position and lens parameters, Unreal renders depth of field that correctly accounts for the extra blur the physical lens produces when filming the LED wall. This feature, along with other depth of field enhancements, significantly improves the realism and credibility of LED wall images. Additionally, Unreal Cloud DDC, a new self-hosted cloud storage system for managing Unreal Engine’s derived data, has been released.
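The physical blur being compensated for can be reasoned about with the standard thin-lens circle-of-confusion model: given the lens’s focal length, f-stop, and focus distance, an object at another distance (such as the LED wall) images as a blur disc of predictable diameter. The sketch below is that textbook approximation, not the actual nDisplay implementation.

```cpp
#include <cmath>

// Thin-lens circle-of-confusion diameter, in the same units as the inputs.
// focalLength: lens focal length; fNumber: aperture f-stop;
// focusDist: distance the lens is focused at; subjectDist: distance to the
// imaged surface (e.g. the LED wall). All distances measured from the lens.
double CircleOfConfusion(double focalLength, double fNumber, double focusDist,
                         double subjectDist) {
    const double aperture = focalLength / fNumber;  // aperture diameter
    const double magnification = focalLength / (focusDist - focalLength);
    return aperture * std::fabs(subjectDist - focusDist) / subjectDist *
           magnification;
}
```

When the subject sits exactly at the focus distance the blur diameter is zero, and it grows as the wall moves away from the focal plane, which is exactly the defocus the compensation must account for.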