Computer wargames cannot be fully analyzed without scrutinizing the video game systems that power them. The technology that drives these video game systems has transformed dramatically over the past 10-15 years. Initially, leaps in computational power allowed players to control and manipulate hundreds of units and perform an array of functions, as demonstrated in the earliest versions of the Harpoon computer simulation. Subsequently, the graphics behind these games experienced multiple breakthroughs, ranging from three-dimensional rendering to advanced motion capture systems capable of detecting even the slightest facial animations. Eventually, game consoles and PCs reached the point where they could combine this computational complexity with stunning visuals in a single, effective simulation. Simply put, these systems have evolved at a rapid rate.
Yet, as we near the midpoint of the second decade of the 21st century, it is important to ask, “What’s next?” What future technologies will impact the design of military simulations? After reaching out to a variety of gamers, I have identified two technologies that CIMSEC readers should look forward to: 1) virtual reality (VR) headsets, and 2) comprehensive scenario design tools with better artificial intelligence (AI).
Virtual Reality Headsets—A Gamer’s Toy or Useful Tool?
VR headsets are by far one of the most anticipated innovations of the next few years. Gamers are not the only individuals excited for this development; Facebook’s $2 billion purchase of VR developer Oculus VR and Sony’s Project Morpheus demonstrate VR’s revolutionary potential. For those unfamiliar with VR headsets, each is a head-mounted device that features a high-definition display and positional tracking (if you turn your head right, your in-game character simultaneously turns his head right). When worn with headphones, users claim that these headsets give them an immersive, virtual reality experience. One user describes the integration of a space dogfighting game with an Oculus Rift VR headset below:
The imagery is photorealistic to a point that is difficult to describe in text, as VR is a sensory experience beyond just the visual. Being able to lean forward and look up and under your cockpit dashboard due to the new DK2 technology tracking your head movements adds yet another layer of immersion…I often found myself wheeling right while scanning up and down with my head to search for targets like a World War II pilot scanning the sky…The level of detail in the cockpit, the weave of the insulation on the pipes, the frost on the cockpit windows, the gut-punch sound of the autocannons firing, every aspect has been developed with an attention to detail and an intentionality which is often missing in other titles.
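Mechanically, the head tracking described above amounts to reading the headset’s orientation every frame and applying it to the in-game camera so the view mirrors the user’s head motion one-to-one. A minimal sketch of the idea follows; the `Camera` class and `apply_head_pose` function are hypothetical stand-ins, not part of any real VR SDK such as the Oculus Rift’s.

```python
import math

# Toy sketch of VR head tracking: each frame, a headset pose reading
# (yaw/pitch, in radians) is mapped directly onto the in-game camera.
# All names here are illustrative, not taken from a real SDK.

class Camera:
    def __init__(self):
        self.yaw = 0.0    # left/right rotation, radians
        self.pitch = 0.0  # up/down rotation, radians

    def look_direction(self):
        # Unit vector the camera is currently facing.
        return (
            math.cos(self.pitch) * math.sin(self.yaw),
            math.sin(self.pitch),
            math.cos(self.pitch) * math.cos(self.yaw),
        )

def apply_head_pose(camera, head_yaw, head_pitch):
    # Mirror the user's head motion one-to-one: turn right, look right.
    # Pitch is clamped so the view cannot flip past straight up or down.
    camera.yaw = head_yaw
    camera.pitch = max(-math.pi / 2, min(math.pi / 2, head_pitch))
```

Calling `apply_head_pose` once per rendered frame with fresh sensor readings is what produces the layered immersion the quote describes: leaning and looking around the cockpit simply feeds new poses into the camera.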
Even though VR headsets strictly provide a first-person experience, they can still play a serious role in military simulations and wargames. At the tactical level, VR headsets can supplement training by simulating different environments custom built from the ground up. For example, imagine a Visit, Board, Search, and Seizure (VBSS) team training for a situation on an oil rig. Developers can create and render a digital model of an oil rig that members of the VBSS team could explore with the assistance of VR headsets in order to better understand the environment. In addition to supplementing training, VR headset technology could potentially be adapted to enhance battlefield helmets. Although this concept is many years away (at least 15), readers should think of the F-35’s Distributed Aperture System for pilot helmets; even though this helmet currently faces development challenges, it demonstrates how a VR system can track and synthesize information for the operator. Essentially, the first-person nature of VR headsets restricts their application to the technical and tactical levels.
Better Tools: Enabling the Construction of Realistic Simulations
Although not as visually impressive as VR headsets, the ability to design complex military scenarios that will run on even the simplest laptops is an exciting capability that many spectators disregard. Wargames are often judged by their complexity. When crafting scenarios, designers ask “Does the simulation take into account _______?”, “What would ________ action trigger?”, and other similar questions that try to factor in as many variables as possible. Their answers to these questions are programmed into the simulation with the assistance of a variety of development tools. Within the next decade, the capabilities of these tools will increase significantly and ultimately provide developers the ability to craft more comprehensive military simulations.
Since these technical tools can be confusing, I am going to use a personal example to demonstrate their abilities. In a game called Arma 2, a retail version built off the Virtual Battlespace 2 engine, I designed a scenario inspired by Frederick Forsyth’s famous novel, The Dogs of War. Human players would assault an African dictator’s palace defended by units commanded by AI. Using the game’s mission editor, I inserted multiple layers of defense, each programmed to respond differently. The AI had multiple contingency plans for different scenarios. If the attacking force was observed in the open, aircraft would be mobilized. If certain defending units did not report in every 15 minutes, then the AI would dispatch a quick reaction force (QRF) to investigate. If the dictator’s palace was assaulted, his nearby loyal armor company would immediately mobilize to rescue him. These are just a few examples, but they illustrate how I was able to script multiple different contingencies for the AI. Yet, the mission was not completely scripted. When the AI came into contact, it would respond differently based on the attacking force’s actions; during testing, I witnessed the dictator’s armor company conduct a variety of actions ranging from simply surrounding the city to conducting a full assault on the palace using multiple avenues of approach.
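Under the hood, the contingency plans described above boil down to condition-response pairs that the simulation evaluates every tick, which is exactly the kind of logic a mission editor lets a non-programmer assemble visually. The sketch below models that idea in generic Python; the class and function names are illustrative and do not correspond to Arma 2’s actual scripting system.

```python
# Hypothetical sketch of trigger-based AI scripting: each contingency
# pairs a condition with a response, and the scenario evaluates all of
# them once per simulation tick. Names are illustrative only.

class Contingency:
    def __init__(self, name, condition, response):
        self.name = name
        self.condition = condition  # callable: state -> bool
        self.response = response    # callable: state -> None
        self.fired = False          # each contingency triggers once

    def evaluate(self, state):
        if not self.fired and self.condition(state):
            self.response(state)
            self.fired = True

# Condition: the attacking force has been spotted in the open.
def attackers_spotted(state):
    return state["attackers_in_open"]

def scramble_aircraft(state):
    state["events"].append("aircraft mobilized")

# Condition: a defending unit missed its 15-minute check-in.
def checkin_overdue(state):
    return any(state["now"] - t > 15 * 60
               for t in state["last_checkin"].values())

def dispatch_qrf(state):
    state["events"].append("QRF dispatched")

contingencies = [
    Contingency("air response", attackers_spotted, scramble_aircraft),
    Contingency("QRF investigation", checkin_overdue, dispatch_qrf),
]

def tick(state):
    # Called once per simulation step with the current world state.
    for c in contingencies:
        c.evaluate(state)
```

The emergent behavior described in the paragraph above comes from layering many such triggers with non-scripted unit AI underneath: the triggers decide *when* a force reacts, while pathfinding and combat routines decide *how*.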
When considering the complexity of the above scenario, it may appear that extensive programming knowledge and experience were required. The astounding fact is that this was not the case: I do not know how to program, yet after spending one weekend with the system’s mission editor, I was able to craft this comprehensive scenario. In the future, we will witness the development of tools and AI systems that allow for the construction of even more detailed military simulations.
We have identified two technologies—VR headsets and more comprehensive simulation design tools—that will rapidly evolve throughout the next several years. Yet, the challenge is not the development of these technologies, but determining how to effectively harness their power and integrate them into meaningful military simulations that go beyond ‘pilot programs.’ Even as these two technologies improve, they will not substitute for real-world experience; for instance, VR headset users cannot feel the sweat after a long hike, and scenarios cannot be customized to fully depict the active populations in counterinsurgency simulations. Nevertheless, as technology improves and is better leveraged, the utility of military simulations will only increase.
Bret Perry is a student at the Walsh School of Foreign Service at Georgetown University. The views expressed are solely those of the author.