Virtual reality and augmented reality games and experiences are a fast-approaching horizon. Several parties are involved, including content delivery networks and game developers. The bandwidth needs are enormous, and to keep ping low, game engines must be hosted in multiple edge locations. To understand why, let’s dive into the world of game engines and how they relate to latency.
The Rise of Game Engines
Today’s rich, interactive video experience is made possible by advanced game engines – software environments that provide the architecture with which developers create and run games, rendering images on the fly in response to the movements of the user. One of the major players in the gaming sphere is Epic Games, thanks in large part to Unreal Engine 4, a pervasive game engine renowned for its accessible tools and impressive graphical capabilities. Epic plans to release Unreal Engine 5 later this year, which will surely push it to the forefront of video game development for years to come. In a preview demonstration released last year, the fine detail and photorealistic lighting are breathtaking.
Unreal Engine 5 and others like it are ushering in a new era of virtual reality and augmented reality games that will serve as a cornerstone of gaming’s evolution and XR developments in the tech world at large. XR (a term that encompasses augmented, mixed, and virtual reality technology) in video gaming is on the rise and driving demand for game engines with advanced capabilities. Currently a $2 billion industry, the game engine market is predicted to balloon to almost $6 billion in the next six years.
The emergence of cloud gaming means games are no longer limited by the power of local gamer devices. Compute power has been decentralized, opening the door to a new kind of multiplayer experience. A perfect example of a game engine like Unreal Engine being used to stretch the constraints of an addressable audience is the wildly popular Fortnite. Fortnite boasts a staggering 350 million registered users worldwide, with hundreds of players able to play live on a shared game map. Games now exist as ambient digital spaces to explore: interactive environments rather than linear episodes. As such, elements within the game must be rendered constantly, to the tune of a stunning 54 billion events every day. Unreal Engine is used for more than just game creation. It also processes inputs, updates the game world, and generates outputs – the brain behind gameplay. The engine constantly aids Epic Games in analyzing the flood of new data and making responsive in-game adjustments. Some game engines even include artificial intelligence modules that predict and learn alongside players.
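The input-process-output cycle described above is the classic game loop. A minimal sketch of the idea in Python, with hypothetical function and state names (this is not Unreal Engine's actual API, just an illustration of the pattern):

```python
def process_inputs(events):
    # Input stage: translate raw input events into game actions.
    return [e.upper() for e in events]

def update_world(state, actions):
    # Simulation stage: advance the game world one tick based on actions.
    state["tick"] += 1
    state["log"].extend(actions)
    return state

def render(state):
    # Output stage: produce this frame's output (here, a status string).
    return f"tick={state['tick']} events={len(state['log'])}"

def game_loop(event_stream):
    # Each iteration is one frame: inputs in, world updated, frame out.
    state = {"tick": 0, "log": []}
    frames = []
    for events in event_stream:
        actions = process_inputs(events)
        state = update_world(state, actions)
        frames.append(render(state))
    return frames

frames = game_loop([["jump"], ["move", "fire"]])
print(frames[-1])  # tick=2 events=3
```

A real engine runs this loop dozens of times per second, which is why every stage sits on the latency-critical path discussed below.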
Game Engines and Latency
The hulking power of these new engines to support so many simultaneous users, along with the vast and intricate virtual worlds they create, brings a new kind of latency discussion to the forefront. With more functionality comes more data to process quickly, and the risk of delay experienced by players becomes the most prominent challenge in gaming. Server selection is one area to focus on, but so are video rendering, encoding, and reducing delays in the game engine itself.
When a player presses a button and the screen registers the action, the elapsed time extends beyond display latency alone: there is the controller’s transmission time, the console’s processing, and the game engine’s work. When playing on dedicated hardware connected to a monitor, latency can be kept to a minimum. However, the next generation of gaming will soon feature game engine processing that takes place on a remote server, with the video stream sent to the monitor using the network’s muscle instead of the console’s. In this configuration, the game engine rendering images and handling game logic depends on signal processing time to and from the cloud, where the game is actually hosted.
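The stages in that input-to-display chain are additive, which is why cloud hosting changes the picture. A rough latency budget makes this concrete; the stage values below are illustrative assumptions for the sketch, not measured figures:

```python
# Illustrative end-to-end latency budget (milliseconds).
# Stage values are assumptions for this sketch, not measurements.
LOCAL_STAGES_MS = {
    "controller_transmission": 8,
    "console_processing": 16,
    "engine_frame": 16,
    "display": 10,
}

# Extra stages introduced when the engine runs in the cloud.
CLOUD_EXTRA_MS = {
    "uplink_to_server": 15,   # input travels to the remote engine
    "encode_video": 5,        # rendered frame is compressed
    "downlink_stream": 15,    # video stream returns over the network
    "decode_video": 5,
}

local_total = sum(LOCAL_STAGES_MS.values())
cloud_total = local_total + sum(CLOUD_EXTRA_MS.values())
print(f"local: {local_total} ms, cloud: {cloud_total} ms")
# local: 50 ms, cloud: 90 ms
```

The point of the arithmetic: every network hop added by remote hosting lands on top of the existing budget, which is why moving the engine closer to the player, rather than just speeding it up, matters.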
Latency is the number one reason users abandon a game, and Epic has been working with console makers to increase processing speeds and bandwidth. Unreal Engine 5 will focus on streaming data, leaving the majority of memory free on next-gen consoles like PlayStation 5 and Xbox Series X – meaning game environments can load faster without being constrained by local storage. The importance of this should not be overlooked. Petabytes of data can be stored and delivered from data centers, so the gaming device needs only a small amount of RAM and a quality GPU for image processing to play in a massive game environment, one that would otherwise require more data than a typical gaming rig could hold.
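Streaming a massive world while keeping only a small working set in local memory is essentially a caching problem. A minimal sketch, assuming a simple LRU (least-recently-used) policy and a hypothetical remote fetch; real engines use far more sophisticated asset streaming:

```python
from collections import OrderedDict

class AssetStream:
    """Keep a small working set of streamed assets in local RAM (LRU)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()  # asset_id -> asset data
        self.fetches = 0            # count of remote fetches

    def _fetch_remote(self, asset_id):
        # Stand-in for pulling the asset from a data center.
        self.fetches += 1
        return f"data:{asset_id}"

    def get(self, asset_id):
        if asset_id in self.cache:
            self.cache.move_to_end(asset_id)  # mark recently used
        else:
            if len(self.cache) >= self.capacity:
                self.cache.popitem(last=False)  # evict least recently used
            self.cache[asset_id] = self._fetch_remote(asset_id)
        return self.cache[asset_id]

stream = AssetStream(capacity=2)
for aid in ["terrain", "sky", "terrain", "props"]:
    stream.get(aid)
print(len(stream.cache), stream.fetches)  # 2 3
```

Only two assets ever occupy local memory, while the full world stays remote: the same trade the article describes between a small RAM footprint and petabytes in the data center.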
Increasing virtual machine efficiency and transmission speed, though, will not be enough on their own. Next-generation engines need all the help they can get. When Fortnite issued its Chapter 2 update in 2019, Epic Games used several content delivery networks, including Akamai. Akamai’s traffic exploded, recording 106 terabits per second at peak, the highest in the company’s 20-year history. Gaming’s popularity is only growing, and with a new plane of high-quality visual and audio effects, updates and releases like these will only get bulkier. With DevOps and video game development teams needing to complete increasingly complex tasks quickly and reliably, more distributed architecture is on the horizon. Providing superior experiences to players will inevitably mean housing applications, data, and compute resources closer to where gamers are – including game engine technology. Taking game engines to the edge eases several of the obstacles developers face in orchestrating resources across providers and geographical areas.
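Placing engines at the edge ultimately comes down to matching each player with the lowest-latency location available. A toy sketch of that selection step, with made-up region names and ping values:

```python
# Hypothetical measured round-trip times from one player (milliseconds).
EDGE_PINGS_MS = {"us-east": 42, "us-west": 18, "eu-west": 95}

def nearest_edge(pings):
    # Select the edge location with the smallest round-trip time.
    return min(pings, key=pings.get)

print(nearest_edge(EDGE_PINGS_MS))  # us-west
```

In practice this decision also weighs capacity, cost, and session stickiness, but proximity-driven selection is the core reason multiple edge locations beat one central data center.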
Unreal Engine and others of its ilk are the platforms that will deliver a new world of XR and gaming. As these experiences and demand for them rise, providing responsiveness and storage will become the fulcrum for success. If engines will provide the driving force, are networks prepared to provide the wheels?