1. Introduction
Remotely Piloted Aircraft Systems (RPASs), commonly known as drones, have rapidly expanded into diverse industries such as infrastructure inspection, agriculture, delivery, and public safety. This proliferation has increased the need for skilled drone pilots, yet training on real hardware can be costly, risky, and constrained by weather conditions or airspace regulations [
1,
2]. In response, extended reality (XR) simulation has emerged as a cornerstone of drone education, providing immersive and risk-free environments in which pilots can hone their skills [
3]. XR-based drone simulators leverage Virtual Reality (VR) and Augmented Reality (AR) technologies to recreate realistic flight conditions that do not risk real-world consequences [
4]. Studies have demonstrated that simulator-based training can significantly improve piloting performance and confidence among both novice and expert drone operators [
3,
5]. For example, VR flight training has been shown to reduce accident risk by allowing beginners to practice challenging maneuvers safely, thereby increasing overall training efficiency. Meanwhile, the demand for effective training solutions continues to rise as commercial and civil drone applications increase.
XR technologies not only make training safer but are also potentially more effective than traditional methods. A recent scoping review of 18 studies concluded that XR-based simulators often produce learning outcomes equal to or better than conventional training, while also being cost-efficient and environmentally friendly by reducing the need for fuel-burning flight hours [
6]. The military and aerospace sectors have likewise taken notice: the U.S. Department of Defense is investing in VR/AR simulators to better prepare personnel for complex, multi-domain operations, recognizing that these tools can recreate the complexities of modern battlespaces with fewer resources and at reduced risk to personnel, assets, and the environment [
7]. Surveys of licensed pilots also reflect growing acceptance of immersive training, with general aviation pilots rating the integration of simulation and AR technology highly (above 4 out of 5) and expressing readiness for XR-based training solutions. These trends underscore that XR-driven drone simulation is poised to become a key element in pilot training pipelines, from hobbyist drone clubs to professional flight schools and defense agencies.
This survey provides a comprehensive overview of XR-based drone simulation technologies, their current applications, and emerging directions for their development and use. We first clarify the technological landscape of XR for drone simulation, then examine major application domains and highlight case studies. We discuss technical challenges—such as achieving high levels of realism and interaction fidelity—and consider how ongoing advancements aim to address them. Finally, we outline future directions for XR drone simulator development, particularly focusing on improvements in realism, interaction, and training effectiveness. By synthesizing findings from academia, industry, and government, in this survey we aim to map the state of the art in XR-based drone simulation, and guide future research and development in this evolving intersection of unmanned aerial systems and immersive technology. The methodology used to select and classify the studies reviewed in this paper is described in Appendix A.
2. Technology Overview
2.1. XR Technologies
XR is an umbrella term encompassing VR, AR, and Mixed Reality (MR). It is situated along Milgram’s “virtuality continuum” [
8] (Figure 1). In a fully immersive VR environment, users interact exclusively within a synthetic space that replaces the real world. By contrast, AR overlays the user’s real-world view with digital elements, typically via see-through displays or camera pass-through, so that virtual objects appear within the physical environment. Importantly, AR systems can enable dynamic interactions between virtual and physical elements, such as a virtual drone colliding with a real wall and responding accordingly.
MR represents a more advanced point on the AR spectrum, where virtual and real elements are not only co-present but also interact with each other in real time and in a geometrically coherent manner. In MR, users can simultaneously perceive and manipulate both physical and virtual objects. For instance, an MR drone training system might allow a trainee to operate a physical flight controller and see their own hands, while a virtual drone and wind effects are rendered seamlessly into the scene. Achieving this blending requires robust spatial mapping and tracking so that virtual drones respect physical surfaces (e.g., becoming occluded when passing behind real buildings) and remain correctly anchored in space. As a concrete example of MR in a drone context, Figure 2 reproduces the first-person view of a drone in a mixed-reality environment where real-world video is blended with a virtual forest [
9].
It is important to note that MR’s definition varies across the literature: some researchers treat it as a distinct category, while others regard it as part of a spectrum between AR and VR. Regardless of these definitional nuances, MR highlights a form of interaction situated on the continuum—closer to AR but extending toward a richer integration of physical and virtual domains.
Compared to VR, which offers complete control over the synthetic environment, AR and MR face unique technical challenges such as dynamic lighting, occlusion handling, and registration accuracy. Nevertheless, MR offers distinct benefits by anchoring training in real-world contexts, thus preserving situational awareness while introducing interactive simulation elements. Each modality along the reality–virtuality continuum serves different training needs, from risk-free full immersion in VR to context-rich MR scenarios. Modern XR drone simulators increasingly combine VR, AR, and MR to maximize realism, adaptability, and training value.
Figure 1. The reality–virtuality continuum. XR encompasses VR (full simulation), AR (virtual overlays on the real world), and MR (interactive blending of virtual and real environments). Arrows indicate the continuous transition between reality and virtuality. Adapted from reference [8].
Figure 2. First-person view of a drone in mixed reality combining a real-world view with a virtual forest (adapted from [9]; CC BY 4.0). Panel (a) shows the real-world environment used as a safe baseline for novice users; (b–d) illustrate the synthesized MR forest scenes with a virtual flight path and head-up display elements, demonstrating how physical surroundings are blended with interactive virtual landscapes.
2.2. System Components: Hardware, Interfaces, and Simulation Modes
XR-based drone simulators integrate multiple technologies to create immersive and responsive training environments. A typical system comprises three core components: a Head-Mounted Display (HMD) for visual immersion; input devices for control; and simulation software for flight dynamics and environmental modeling.
Common HMDs used in XR drone simulation include standalone or tethered devices such as the
Meta Quest series,
Varjo XR headsets, and
HTC Vive. These devices provide stereoscopic rendering, head tracking, and (depending on the model) AR capabilities via pass-through cameras or transparent displays. Control inputs vary based on the intended realism and training goals. Some simulators support the standard radio controllers used in real drone piloting, while others employ gamepads or VR hand controllers mapped to emulate drone flight behavior. High-fidelity simulators allow stick-and-throttle control with yaw, pitch, and roll fidelity, accommodating different UAV types such as multirotors and fixed-wing aircraft. XR simulators can operate in either a VR or an AR display mode. In VR mode, the pilot is placed in a fully virtual 3D environment, often with realistic terrain, obstacles, and weather effects. In AR mode, virtual elements such as drones and waypoints are superimposed onto the user’s real-world view, using the HMD’s cameras or optical see-through displays.
For example, Foxtrot
SimFlight XR supports both VR and AR configurations [
10]. In VR mode, it presents a complete digital flight environment with variable conditions; in AR mode, it overlays a simulated drone onto the user’s physical surroundings, enabling pilots to train as if a drone were flying in their actual space. In both modes, the system aims to replicate drone behavior and physics with high fidelity, ensuring that virtual UAVs respond realistically to user input and environmental variables such as wind and collisions.
2.3. Simulation Engines and Spatial Computing Capabilities
Modern drone simulators—whether XR-based or not—rely heavily on simulation engines to deliver realistic and immersive training experiences. These systems are built on game and physics engines that model aerodynamics, weather, sensor behavior, and collision dynamics. Platforms such as
Flightmare [
11] and
AirSim [
12] exemplify research-oriented simulators, offering high-fidelity physics modeling, modular scenario generation, and extensible Application Programming Interfaces (APIs) suitable for both autonomous flight algorithm development and XR integration. In contrast,
Real Drone Simulator [
13] adopts a more game-centric approach, emphasizing customizable terrains, weather conditions, and skill-based progression for manual pilot training. While it offers less extensibility than research-oriented platforms, it serves as an accessible platform for entry-level skill-building and familiarization with diverse flight scenarios in a safe virtual environment.
Among the most widely used platforms for building XR-enabled drone simulators are Unity and Unreal Engine. Unity is favored for its flexible cross-platform support and ease of integration with XR Software Development Kits (SDKs), making it well-suited for standalone HMDs like the
Meta Quest series. Its component-based design streamlines the development of interactive training interfaces and real-time feedback systems. On the other hand, Unreal Engine, with its high-fidelity rendering and built-in physics systems, is often the foundation for simulation environments requiring photorealism and complex dynamics—particularly in research-oriented projects like
AirSim, which uses Unreal Engine as its rendering backend. Both engines support integration with external physics libraries and robotics middleware, enabling the development of highly customizable and scalable training scenarios.
In addition, spatial computing capabilities—especially relevant in AR and MR settings—are transforming how simulators interact with the real world. Advanced headsets like the
Meta Quest and
Varjo XR series feature outward-facing cameras and depth sensors for environment scanning. This allows the virtual drone to recognize and interact with physical objects in the user’s space. For example,
SimFlight XR [
10] uses passthrough AR and spatial mapping to detect real obstacles such as walls or furniture. Trainees can thus practice avoiding real-world hazards, enhancing realism and situational awareness. This mixed-reality blending ensures that training scenarios feel authentic and responsive, even within confined and indoor environments.
2.4. Physics Engines, Flight Dynamics Modeling, and Realism Validation
Accurate flight dynamics modeling is a cornerstone of drone simulation, particularly in XR environments where real-time responsiveness and physical plausibility are essential for pilot immersion. The modeling ensures that virtual UAVs replicate the inertia, thrust, torque, and aerodynamic forces experienced by real-world aircraft, whether multirotors or fixed-wing drones. Modern XR simulators typically rely on integrated physics engines to handle these calculations. For example, Unity and Unreal Engine both incorporate robust physics systems: Unity uses NVIDIA’s PhysX while Unreal integrates its own high-performance physics layer. These engines offer native components such as rigidbodies, colliders, and force application methods, which can be customized to simulate drone-specific dynamics. In the case of
SimFlight XR [
10], the simulator models flight behavior using Unity’s built-in Rigidbody component in combination with the PhysX engine. The simulator computes lift, gravity, and rotational torque by applying physically accurate forces derived from real drone specifications. Key parameters such as weight, frame size, and maximum velocity are tuned to match those of actual UAV models, resulting in a simulation that reacts to user input with realistic feedback.
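To make this force-based approach concrete, the following sketch (written in Python rather than Unity C#, with purely illustrative mass, thrust, and drag values rather than any product’s actual specifications) shows how thrust, gravity, drag, and yaw torque might be applied to a simple rigid body at each physics step.

```python
import numpy as np

# Illustrative quadrotor parameters (assumptions, not taken from any specific product).
MASS = 1.2          # kg
MAX_THRUST = 28.0   # N, total across four rotors
DRAG_COEFF = 0.08   # simple linear drag coefficient
YAW_INERTIA = 0.02  # kg*m^2 about the vertical axis
G = np.array([0.0, 0.0, -9.81])

class DroneBody:
    """Minimal rigid-body state integrated with semi-implicit Euler."""
    def __init__(self):
        self.pos = np.zeros(3)
        self.vel = np.zeros(3)
        self.yaw = 0.0
        self.yaw_rate = 0.0

    def step(self, throttle, yaw_torque, dt):
        """throttle in [0, 1]; yaw_torque in N*m; dt in seconds."""
        up = np.array([0.0, 0.0, 1.0])        # body "up" (attitude omitted for brevity)
        thrust = throttle * MAX_THRUST * up    # rotor thrust
        drag = -DRAG_COEFF * self.vel          # crude aerodynamic drag
        accel = (thrust + drag) / MASS + G     # Newton's second law
        self.vel += accel * dt                 # semi-implicit Euler integration
        self.pos += self.vel * dt
        self.yaw_rate += (yaw_torque / YAW_INERTIA) * dt
        self.yaw += self.yaw_rate * dt
        return self.pos.copy()
```

In an engine such as Unity, the equivalent forces would typically be applied through the Rigidbody component rather than integrated by hand as in this sketch.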
The physical model also incorporates external disturbances. Wind effects are represented in two primary forms: constant wind, modeled as a steady directional force across the drone’s body with Perlin noise added to replicate natural variability; and gusts, modeled as brief randomized bursts of force generated using functions such as Perlin noise to create motion patterns that are both smooth and unpredictable. These gusts are designed to test a trainee’s control precision and adaptability under dynamically changing conditions. By simulating such disturbances and nuanced aerodynamic interactions, XR simulators not only enhance manual control proficiency but also prepare pilots for real-world challenges such as sudden wind shifts or proximity to obstacles. Although they are simplified when compared to full computational fluid dynamics (CFD) modeling, these physics-based approximations provide a practical compromise between computational efficiency and physical realism.
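The following minimal sketch illustrates one way such a two-layer wind model could be implemented; it substitutes a smoothed random walk for Perlin noise to remain dependency-free, and all magnitudes and time constants are illustrative assumptions rather than values from any specific simulator.

```python
import numpy as np

class WindModel:
    """Steady wind plus gusts, a rough stand-in for the Perlin-noise approach described above."""
    def __init__(self, base_wind=(3.0, 0.0, 0.0), gust_strength=4.0, seed=0):
        self.base = np.array(base_wind)      # steady directional wind, m/s
        self.gust_strength = gust_strength   # typical gust magnitude, m/s
        self.rng = np.random.default_rng(seed)
        self.gust = np.zeros(3)

    def sample(self, dt):
        # Low-pass-filtered random walk: smooth but unpredictable, playing the
        # role that Perlin noise plays in the description above.
        target = self.rng.normal(0.0, self.gust_strength, size=3)
        alpha = min(1.0, dt / 0.5)           # ~0.5 s gust correlation time (assumed)
        self.gust += alpha * (target - self.gust)
        return self.base + self.gust

def wind_force(wind_velocity, drone_velocity, drag_coeff=0.08):
    """Convert relative airflow into a disturbance force on the drone body."""
    relative = wind_velocity - drone_velocity
    return drag_coeff * relative
```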
Achieving high physical realism, however, requires rigorous validation and tuning of these flight dynamics models. Developers employ multiple benchmarks to ensure that simulator behavior aligns with empirical UAV performance. One key metric is hover and drift behavior: for example, when a virtual drone hovers or decelerates from forward flight, its tendency to drift or overshoot should mirror that of a real drone under similar conditions. If a simulator’s drone stops instantly with no settling time, it indicates that the model has been oversimplified, which may cause it to teach incorrect muscle memory. To prevent this, simulators are tuned so that dynamics like momentum and inertia (e.g., how far a UAV coasts when throttle is cut) closely match empirical flight data.
Another critical criterion is trajectory fidelity. Simulators often replay recorded flight paths from real drones, adjusting the aerodynamic parameters until the virtual trajectory overlaps the real one within acceptable error margins (e.g., position errors within a few percent). Discrepancies in turn rate, climb rate, or response times signal that elements such as thrust curves or drag coefficients require refinement. Wind and gust responses are likewise validated: for instance, a 5 m/s crosswind gust should induce a drift angle and stabilization time in the virtual drone that compare with real flight observations under controlled conditions. Meeting these benchmarks typically requires iterative physics tuning, often involving real-world flight logs and sensor data for calibration.
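As an illustration of how trajectory fidelity might be scored, the sketch below compares a logged real flight path against its simulated replay, sampled at the same timestamps, and reports root-mean-square and maximum position errors; the array format and the idea of a fixed acceptance threshold are assumptions for illustration only.

```python
import numpy as np

def trajectory_error(real_xyz, sim_xyz):
    """Error metrics between a logged real flight and its simulated replay.

    Both inputs are assumed to be N x 3 position arrays sampled at the same times.
    """
    real_xyz = np.asarray(real_xyz, dtype=float)
    sim_xyz = np.asarray(sim_xyz, dtype=float)
    diffs = np.linalg.norm(real_xyz - sim_xyz, axis=1)   # per-sample position error
    path_span = np.linalg.norm(real_xyz[-1] - real_xyz[0]) + 1e-9
    return {
        "rmse_m": float(np.sqrt(np.mean(diffs ** 2))),
        "max_error_m": float(np.max(diffs)),
        "relative_error": float(np.mean(diffs) / path_span),
    }

# A tuning loop would re-run the simulator with adjusted drag or thrust parameters
# until rmse_m (or relative_error) falls below the chosen acceptance threshold.
```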
Simulators apply techniques such as system identification: a drone may perform controlled test inputs (e.g., step throttle changes, rapid yaw rotations), and designers adjust the simulator’s responses until they match logged accelerations and velocities from real flights. This process extends to sensor simulation as well. For example, VR camera feeds can be degraded with realistic noise, motion blur, or lens artifacts so that trainees do not develop habits based on a “perfect” visual feed unavailable in real-world operations. Ultimately, a validated physics model is one that has been stress-tested against real flight data across a wide range of scenarios, from stable hover to aggressive maneuvers, ensuring that the XR environment provides both high fidelity and appropriate variability. By defining benchmarks such as drift distance, trajectory error, and gust tolerance, researchers iteratively refine simulators until the virtual drone behaves in ways indistinguishable from its physical counterpart within the regime of interest. This rigorous cycle of modeling, validation, and refinement is essential not only for building pilot trust but also for maximizing skill transfer from XR training to live UAV operations.
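As a hedged example of the sensor-degradation step described above, the sketch below adds Gaussian noise and a crude horizontal motion blur to a rendered camera frame so that trainees do not adapt to an unrealistically clean feed; the noise level and blur width are illustrative values, not parameters from any specific simulator.

```python
import numpy as np

def degrade_camera_frame(frame, noise_std=6.0, blur_kernel=3, rng=None):
    """Add sensor noise and a simple horizontal motion blur to an RGB frame.

    `frame` is an H x W x 3 uint8 array; returns a degraded copy.
    """
    rng = rng or np.random.default_rng()
    img = frame.astype(np.float32)
    # Horizontal box blur approximating motion blur during fast yaw.
    kernel = np.ones(blur_kernel) / blur_kernel
    for c in range(img.shape[2]):
        img[..., c] = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), 1, img[..., c])
    img += rng.normal(0.0, noise_std, size=img.shape)   # per-pixel sensor noise
    return np.clip(img, 0, 255).astype(np.uint8)
```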
2.5. Interaction Fidelity
A critical aspect of any flight simulator is how closely the user’s interactions match real-life operations. XR-based systems employ various strategies to maximize interaction fidelity. Many simulators support real drone controllers or use replicas to provide familiar hand-feel and control layouts. For example, the Varjo-Inzpire mixed reality UAV trainer integrates an actual Ground Control Station interface and physical controls with the virtual environment [
14]. Trainees can manipulate real sticks, switches, or tablets, which they see rendered in their MR headset exactly where they are in reality, blending tactile feedback with virtual visuals. In less elaborate setups, consumer VR drone apps often allow the use of USB gamepads or dedicated RC transmitter hookups so that muscle memory from simulator practice will transfer directly to real drone transmitters [
15]. Haptic feedback is another channel for user instruction: while full-motion platforms are rare for drone simulators (given that drones impart less kinesthetic feedback than manned aircraft), some systems simulate cues like vibrations or resistance. For instance, a hand controller might buzz when the virtual drone crashes or experiences turbulence, alerting the pilot through touch.
SimFlight XR demonstrates a scalable interaction approach by utilizing the standard Quest 3 controllers, mapping their left and right joysticks to the corresponding sticks of a traditional RC transmitter (e.g., Mode 2). This eliminates the need for specialized hardware, making the simulator highly accessible and deployable on standalone headsets. To prevent interference between vertical (altitude) and horizontal (yaw) inputs on the left joystick, the system dynamically interprets input based on directional intent. If the joystick is pushed along a clear axis or diagonal, the corresponding input is passed through; otherwise, the dominant axis is prioritized and the lesser axis is suppressed, ensuring clean, intuitive control for new and experienced users alike.
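The sketch below re-implements the dominant-axis disambiguation logic described above in Python; the dead-zone and diagonal thresholds are illustrative assumptions, and the function is not taken from SimFlight XR’s source code.

```python
def interpret_left_stick(x, y, axis_threshold=0.35, diagonal_ratio=0.5):
    """Disambiguate yaw (x) and throttle (y) on a single stick.

    If the deflection is clearly along one axis or a clear diagonal, pass both
    components through; otherwise keep only the dominant axis and suppress the
    other, mirroring the behavior described in the text. Threshold values are
    illustrative assumptions.
    """
    ax, ay = abs(x), abs(y)
    if ax < axis_threshold and ay < axis_threshold:
        return 0.0, 0.0                      # inside the dead zone
    smaller, larger = sorted((ax, ay))
    if larger > 0 and smaller / larger >= diagonal_ratio:
        return x, y                          # clear diagonal: keep both inputs
    if ax > ay:
        return x, 0.0                        # yaw dominates, suppress throttle
    return 0.0, y                            # throttle dominates, suppress yaw
```

Keeping this logic in a single pure function also makes it straightforward to unit-test against recorded stick inputs.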
XR’s ability to merge the real and virtual also aids interaction fidelity. In AR, trainees can reference physical instruments and maps during simulated missions. For example, Inzpire’s system allows drone operators to see their actual hands and notebooks while flying in the virtual battlespace, thus preserving the workflow of checking a map or jotting coordinates as they would on a real mission [
14]. Such seamless integration of real tools addresses a common drawback of pure VR (
i.e., the isolation from one’s physical surroundings) and has been noted to improve trainees’ sense of realism and situational awareness. James Clarkson, Engineering Lead at Inzpire, emphasized that using a high-fidelity XR headset achieved “seamless integration between the live and the virtual” and increased trainees’ belief in the scenario: “If you’re immersed in an environment, you forget that you’re wearing a headset. You believe you’re in the field and what you’re doing is part of your environment”. This level of immersion can make simulation training more operationally valid, meaning that pilots treat the exercises seriously and develop habits as if in real operations.
2.6. System Deployment
The hardware requirements for XR drone simulators are becoming increasingly accessible. High-end military or aviation setups might use top-tier
Varjo XR headsets for ultra-realistic visuals and pass-through (Varjo’s XR-3 can even allow fine text to be read in cockpit instruments via focal plane technology). However, many training solutions today use standalone VR headsets like
Meta Quest 2/3, which have built-in inside-out tracking and color pass-through cameras for AR.
SimFlight XR, for example, runs on
Meta Quest devices and can be deployed wirelessly via an app, making it highly portable (
i.e., no external PC or cameras are required). This portability is crucial for scaling training programs, as an instructor can bring a set of headsets to a classroom or a field site and start a training session within minutes. The software is typically distributed through enterprise app stores or sideloaded, with cloud connectivity used for multi-user scenarios or logging. Some B2B solutions (like Foxtrot’s) include cloud-based flight log analytics and progress tracking: the simulator records telemetry and controller inputs during each session, enabling post-training debriefs and analysis of pilot performance trends. Such data can highlight, for example, if a trainee consistently struggles with nose-in orientations or tends to fly too low. This information is invaluable for targeted coaching.
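As an illustration of the kind of post-session analytics described above, the following sketch computes a few simple debrief metrics from per-frame telemetry; the log field names and the safe-altitude threshold are hypothetical rather than any vendor’s actual log schema.

```python
import numpy as np

def summarize_session(telemetry, min_safe_alt=3.0):
    """Compute basic debrief metrics from per-frame telemetry.

    `telemetry` is assumed to be a list of dicts with keys 't' (s), 'alt' (m),
    and 'yaw_rate' (deg/s); these field names and the threshold are illustrative.
    """
    t = np.array([s["t"] for s in telemetry])
    alt = np.array([s["alt"] for s in telemetry])
    yaw_rate = np.array([s["yaw_rate"] for s in telemetry])
    dt = np.diff(t, prepend=t[0])
    return {
        "duration_s": float(t[-1] - t[0]),
        "mean_altitude_m": float(alt.mean()),
        "time_below_safe_alt_s": float(dt[alt < min_safe_alt].sum()),
        "yaw_rate_std_deg_s": float(yaw_rate.std()),  # rough stability proxy
    }
```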
Edge computing has also emerged as a complementary deployment option. Whereas cloud services enable centralized logging and multi-user connectivity, edge nodes—for example, a local mini-PC or an on-site server—can offload parts of the simulation workload and reduce latency for interactive XR tasks. This is particularly useful in field training scenarios where network bandwidth is limited or unreliable.
In summary, XR-based drone simulators fuse advanced 3D simulation software with immersive display and interaction devices to create convincing flight training experiences. VR provides total immersion enabling practice in any imaginable environment (from basic fields to warzones), whereas AR/MR makes it possible to blend the practice drone into real settings and equipment.
Table 1 compares several notable XR drone simulation platforms and prototypes, illustrating the range of technologies and features in use.
Table 1. Examples of XR-Based Drone Simulation Platforms (VR = Virtual Reality; AR = Augmented Reality).
| Simulator | XR Mode(s) | Notable Features and Use Cases |
| --- | --- | --- |
| DroneSim (Albeaino et al., 2022) [3] | VR | PC-based VR simulator developed as a training tool for building inspection via drones. Offers a realistic virtual environment mimicking building façades and structural features; trainees practice maneuvers like close-proximity inspection and camera control. In a study, VR training with DroneSim improved participants’ navigation skills and confidence in conducting drone inspections and surveillance tasks. Focus: industrial inspection training (construction and maintenance sector). |
| xOperator C-UAS Simulator (DroneShield & XRG, 2022) [16] | VR | Tethered VR system integrated with DroneShield’s DroneGun (counter-drone jammer device); provides immersive training for counter-UAS tactics in a variety of virtual scenarios. Users engage incoming virtual drone threats of different types, practicing detection and engagement procedures in a safe environment. Includes an After-Action Review module to replay training sessions from multiple angles for debriefing and performance feedback. Focus: security and defense (protecting airports, infrastructure, or battlefields from rogue drones). |
| Inzpire “CASE” UAV Trainer (UK, 2023) | AR | Uses a Varjo XR headset with pass-through MR; integrates a physical UAV Ground Control Station and real mission planning tools with a virtual battlespace simulation; trainees see and use real hands, controllers, and maps while virtual targets and terrain are rendered in the headset. Can link multiple units for networked multi-operator training; deployable system (compact for field use). Focus: military UAV operator training (surveillance, reconnaissance, and defense scenarios). |
| ARDroneSim (Muqri et al., 2024) [4] | AR (Mobile) | An augmented reality drone training app using a mobile device or tablet. Employs marker-based AR: users place fiducial markers in a real room or outdoor area, which the app recognizes to anchor virtual obstacles and flight paths [17]. Provides an affordable, accessible way to practice basic drone controls in AR without a physical drone. Emphasizes key simulation-based learning principles (gradual skill development, feedback). Usability evaluations reported a System Usability Scale (SUS) score of 72 (“Good”), indicating a positive training experience. Focus: entry-level pilot training and education, using common devices. |
| SimFlight XR (Foxtrot Inc., 2025) | VR & AR (Hybrid) | Standalone HMD-based; switchable VR mode (fully virtual environment) and AR mode (overlay of a virtual drone onto the real world via passthrough); realistic physics mirroring real drone behavior; environment scanning to include real obstacles in AR; supports GPS and ATTI (attitude) flight modes; customizable scenarios (e.g., simulated national licensing test courses, construction sites, crop fields, disaster scenes); flight log recording for post-training analysis. Focus: general pilot training for enterprise, drone schools, and individuals. |
| SafeSpect “ADAPT-AR” (Xu et al., 2025) [18] | VR for AR UI prototyping; AR HUD interface | A research prototype of an adaptive AR heads-up display (HUD) for drone pilots, designed to improve safety during tasks like high-rise inspections [19]. The AR interface runs on an optical see-through HMD, but was developed and tested within a VR drone simulator (using a virtual city and drone) to allow safe experimentation with hazardous scenarios (e.g., collisions, GPS loss). The adaptive HUD intelligently shows or hides information based on context (mission vs. safety-critical events). In user studies, the adaptive AR interface significantly reduced cognitive load and enhanced situational awareness for pilots compared to a traditional tablet interface. Focus: human factors research for UAV operations (improving interface design for real-world drone flying via AR). |
2.7. Simulator Realism and Fidelity
XR-based simulators vary in their level of realism, but many strive for high fidelity in both visuals and physics. Graphically, modern simulators use detailed 3D models of drones and environments; some incorporate photogrammetry-based maps or geospatial data for authentic landscapes [
15]. For example, industrial training sims may recreate specific work sites—like oil refineries or farms—so that pilots can practice virtually in a twin of their actual operating environment. Lighting and weather effects (sun glare, fog, wind gusts,
etc.) add to visual realism and can be toggled for training under different conditions. On the physics side, realistic flight dynamics are crucial: accurate aerodynamics (lift, drag), battery/performance models, and even sensor noise models (for GPS or altimeter) must all mimic real drone behavior. Some simulators allow users to switch between flight modes like GPS-stabilized
vs. ATTI (manual attitude), training pilots to handle loss of GPS or to fly manually with high precision.
SimFlight XR, for instance, supports both modes: in GPS mode, the virtual drone self-stabilizes, whereas in ATTI mode it drifts in wind. The simulator thereby teaches students to manage different control scenarios [
10]. Similarly, simulators often include multiple drone models (quadcopter, fixed-wing,
etc.), each possessing distinct physics.
Zephyr, a training simulator used in schools and agencies, meticulously models a range of popular drone types so that each virtual aircraft “feels” like the real one in terms of mass and responsiveness [
15]. The ultimate goal is that when a trainee transitions from the XR simulator to a real drone, the experience is familiar, with control sensitivities, vehicle dynamics, and even visual cues all matching reality closely, shortening the learning curve and improving safety in initial real flights.
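To illustrate the behavioral difference between GPS-stabilized and ATTI modes in a single horizontal axis, the sketch below contrasts a velocity-hold response (GPS) with an attitude-only response that leaves wind-induced drift uncorrected (ATTI); the gains and wind-coupling term are illustrative assumptions, not a reconstruction of any particular simulator’s controller.

```python
def velocity_command(stick, current_vel, wind_vel, mode, dt,
                     brake_gain=1.5, stick_gain=5.0):
    """One-axis illustration of GPS vs. ATTI behavior (gains are assumptions).

    In 'GPS' mode the controller brakes toward zero velocity when the stick is
    centered (self-stabilizing hover); in 'ATTI' mode only attitude is held, so
    the drone keeps drifting with wind and its own momentum.
    """
    commanded = stick * stick_gain                  # m/s requested by the pilot
    if mode == "GPS":
        # Velocity hold: pull the actual velocity toward the commanded value.
        accel = brake_gain * (commanded - current_vel)
        return current_vel + accel * dt
    # ATTI: stick deflection tilts the drone, but wind-induced drift is not cancelled,
    # so with a centered stick the velocity relaxes toward the wind velocity.
    accel = brake_gain * commanded
    return current_vel + (accel + 0.3 * (wind_vel - current_vel)) * dt
```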
2.8. Open Datasets and Tools for XR Drone Simulation
To accelerate research and development, the community has made several datasets, simulation frameworks, and scenario libraries publicly available. One foundational tool is Microsoft
AirSim (open-sourced in 2017) [
12], a photorealistic drone and vehicle simulator built on Unreal Engine.
AirSim provides several ready-to-use environments (e.g., urban city blocks, forests,
etc.) and an API for retrieving sensor data, enabling researchers to generate custom flight scenarios and collect synthetic training data for computer vision or control algorithms (Figure 3).
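For readers unfamiliar with AirSim’s scripting interface, the short example below uses its Python client to fly a brief scripted maneuver and capture a synthetic camera frame for dataset generation; it follows the API published in the AirSim repository, although exact call signatures may vary between releases.

```python
import airsim  # pip install airsim; requires a running AirSim/Unreal instance
import numpy as np

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

# Fly a short scripted scenario.
client.takeoffAsync().join()
client.moveToPositionAsync(10, 0, -5, 3).join()   # NED coordinates, 3 m/s

# Retrieve an uncompressed synthetic camera frame for dataset generation.
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)])
frame = np.frombuffer(responses[0].image_data_uint8, dtype=np.uint8)
frame = frame.reshape(responses[0].height, responses[0].width, 3)

client.armDisarm(False)
client.enableApiControl(False)
```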
Another influential platform is
Flightmare (UZH, 2020), an open-source quadrotor simulator that couples the Unity graphics engine with a high-fidelity physics engine [
11]. Frameworks like
Flightmare,
RotorS (for ROS/Gazebo), and MIT’s
FlightGoggles allow investigators to test new control strategies or reinforcement learning policies in simulation with relative ease, often including example scenarios or baseline models.
For example, RotorS offers a library of multirotor models and environments, while the gym-pybullet-drones toolkit provides lightweight simulation and benchmark tasks for autonomous drone control.
In addition to simulators, there are curated datasets useful for XR drone research. The EuRoC Micro Aerial Vehicle dataset (ETH Zurich) [
20] and the Blackbird UAV dataset (MIT) offer real-world flight recordings (including stereo camera feeds, Inertial Measurement Unit (IMU) readings, and ground-truth trajectories) that serve as benchmarks for validating simulation realism and developing AR tracking or Simultaneous Localization and Mapping (SLAM) algorithms. These datasets (with permanent DOIs and documentation) allow XR simulations to be configured under conditions identical to real flights, facilitating direct comparisons. Some repositories host libraries of 3D assets and drone-specific scenarios; for example,
DroneRaceX contains standardized race gate models and course layouts for training or competition. Industry and government initiatives have also released reference models: for example, the U.S. Federal Aviation Administration (FAA) has published sample XR Flight Training Device configurations and scenarios for alignment with training standards.
In summary, the XR drone simulation community benefits from a growing ecosystem of open tools: from high-fidelity simulators and physics engines, through public datasets of flight data, to shareable scenario and asset libraries. These resources, which are often accompanied by persistent identifiers or GitHub links for reproducibility, greatly aid researchers and practitioners in building upon each other’s work, ensuring that new advances in XR drone training can be compared against common benchmarks and rapidly integrated into improved simulation systems.
Figure 3. Example of a high-fidelity drone simulation environment using the AirSim simulator (© Microsoft Corporation, released under the MIT License).
3. Application Domains of XR-Based Drone Simulation
XR-based drone simulation has found a broad range of applications across industries and use cases. Below, we survey the primary domains where this technology is being applied.
3.1. Pilot Training and Certification
The predominant use of XR drone simulators is in training pilots to operate drones safely and proficiently. Drone flight schools and certification programs are increasingly incorporating VR/AR simulators to prepare students for licensing exams and real-world flying. For example, Japan’s national drone license training now leverages XR simulation to allow students to undertake unlimited practice of exam maneuvers (such as figure-of-eight flights or hover accuracy) in a virtual environment that mirrors the test conditions.
SimFlight XR was explicitly designed to support foundational training for national certification. It can project the official test course layout onto the virtual or AR environment, letting trainees rehearse each task as many times as needed. A major advantage is the simulator’s ability to facilitate all-weather, any-location training: because VR is not limited by weather or daylight, trainees can practice even when outdoor flights are grounded due to rain or darkness. Trainees can also experience difficult conditions (like high winds or low visibility) in simulation, allowing them to build skills that would be risky to attempt first in reality. XR simulators are particularly beneficial for introducing complex drone types that might be expensive or difficult to access, such as heavy-lift drones or fixed-wing UAVs. A simulator can emulate these aircraft, giving pilots exposure to their handling characteristics without requiring access to the physical unit. Research confirms that starting training in simulators can boost students’ confidence and competence when they transition to real flights [
21]. Nasir et al. (2023) demonstrated that structured teaching using VR drone simulators significantly improved students’ operational control skills and reduced their anxiety before their first actual flight, thereby boosting their overall confidence in transitioning to real-world scenarios [
5]. Recent work by Cardona-Reyes et al. (2024) has further explored how task design in VR environments affects training effectiveness, highlighting the role of scenario complexity and feedback strategies in shaping pilot performance [
22]. Consequently, many training organizations now mandate a certain number of simulator hours before soloing a real drone.
3.2. Industrial and Commercial Operations
Beyond basic training, XR simulation is being tailored to industry-specific drone applications. In sectors like construction, infrastructure inspection, and agriculture, drone pilots must learn specialized maneuvers and data collection techniques. XR simulators provide safe sandboxes in which to learn and practice these complex tasks. A notable example is drone-mediated building inspection. Pilots inspecting structures (bridges, high-rises, cell towers,
etc.) need to fly close to surfaces and navigate tight gaps. VR simulators such as
DroneSim create realistic 3D models of buildings and allow pilots to practice inspection routines (e.g., circling a tower while keeping a camera trained on it) without risking crashes [
1]. Albeaino et al. (2022) reported that trainees who practiced building inspections in a VR simulator showed improved flight stability and better identification of structural defects when compared to those trained only in the field [
3]. Similarly, in precision agriculture, XR simulators are used to train pilots in crop-spraying missions: some systems can simulate flying over crop fields and managing a spray payload, complete with virtual trees, power lines, or no-fly zones to avoid. The pilot learns to maintain proper altitude and speed for even coverage, and to execute automated flight plans, all above a risk-free virtual farm.
Another burgeoning use area is emergency response and public safety. Drones are increasingly used by firefighters, search-and-rescue (SAR) teams, and police, often in high-pressure scenarios. XR simulations enable these operators to rehearse missions that would be impossible or dangerous to stage in reality. For instance, an AR-enabled disaster response trainer might overlay a virtual smoke plume or fire onto a real training ground, challenging a pilot to navigate a drone in low-visibility, high-stress conditions. In Foxtrot’s AR mode, the user can simulate elements like smoke, rubble, or even moving people/vehicles on the live camera view, turning an empty field into a disaster scene for training purposes. This capability is invaluable for preparing operators to undertake hazardous missions (finding victims in collapsed buildings or monitoring wildfires) without risking any actual harm. In Japan, where emergency services are training drone units, XR simulation has been used to practice night operations and formation flights for large-area searches. Public safety agencies also use simulators to maintain pilot proficiency during downtime; for example, a police drone pilot can practice pursuit or crowd monitoring scenarios in VR when live training is not feasible.
3.3. Defense and Security
The defense sector has adopted XR drone simulation both for training UAV pilots and for developing counter-drone tactics. Military drone operators (for reconnaissance or strike UAVs) traditionally trained on expensive computer-based simulators, but are now transitioning to more immersive XR systems. As detailed earlier, Inzpire’s mixed reality UAV simulator allows military crews to train in an interactive virtual battlespace while operating actual ground control stations [
14]. This MR approach means a crew can rehearse a complex mission (such as surveilling a target and coordinating with ground units) in a realistic setting: they can see virtual terrain, enemy vehicles, and threat indicators through the headset, yet still physically interact with the authentic controls and communication tools they use on deployment. The result is highly transferable training: the cognitive and procedural skills map directly to real operations. Military XR training systems often emphasize networked scenarios; multiple trainees in VR can collaborate to simulate a swarm of drones controlled by a team, or engage in a manned-unmanned teaming scenario with helicopter pilots and drone operators training together. The immersion and realism provided by XR are seen as a breakthrough for effective mission rehearsal: trainees “believe they’re in the field” and thus develop decision-making and situational awareness akin to live exercises. It is noteworthy that AR enables the integration of real-world mission intelligence (for example, by inserting up-to-date satellite imagery of a conflict zone as the virtual terrain) combined with dynamic virtual entities representing adversaries. This approach helps to ensure that training scenarios remain highly relevant and adaptable.
XR simulators are also being used to tackle the growing threat of rogue drones (unmanned aircraft used for malicious purposes). Companies like DroneShield have partnered with simulation firms to create Counter-UAS VR training platforms [
16]. These simulators place law enforcement or soldiers in realistic scenarios (such as a drone incursion near an airport or forward operating base) where they must detect, track, and neutralize hostile drones. The
xOperator VR system, for example, lets users operate virtual versions of DroneShield’s counter-drone equipment, like RF detectors and the DroneGun jammer, against incoming simulated drones. The environment can be customized to urban, desert, and forest scenarios (among others), and drone behaviors (single intruder, swarm attack,
etc.) adjusted to train appropriate responses. Crucially, these scenarios can be repeated and varied endlessly, which is impossible with live-flying adversary drones. Trainees gain experience in identifying drone types, assessing threats, and deploying countermeasures under realistic conditions, improving their readiness for actual incidents. Post-mission replay tools (after-action reviews) further enhance learning by allowing teams to debrief what went right or wrong in the virtual engagement. Given the increasing number of drone incursion incidents, such XR-based rehearsal is becoming standard for many security agencies.
3.4. Research and Development (R&D)
XR drone simulators are also invaluable in research, both for human factors and for autonomous systems development. Human-factors researchers use XR environments to study how pilots interact with drones and to prototype new interfaces. The
SafeSpect AR interface case is a prime example: researchers built a VR simulation of a complex inspection task (complete with a virtual city, wind gusts, and moving obstacles) to safely test an adaptive AR interface with real drone pilots [
18]. Through this simulated approach, they could evaluate pilot performance and workload with different user interface designs without risking a drone or waiting for ideal field conditions. Their study yielded insights (e.g., that an adaptive AR HUD can improve situational awareness significantly) that directly inform how future AR tools for drone operation should be designed [
18]. In general, XR enables the rapid iteration of drone interfaces (like new controller layouts or AR overlays for data) by testing them in high-fidelity scenarios with human users and gathering feedback, all inside a lab.
For autonomous drone Research and Development, simulators provide a virtual proving ground. Many robotics labs develop drone autopilots and Artificial Intelligence (AI) algorithms (for navigation, obstacle avoidance, swarming,
etc.) and use simulators as their initial testbed. XR does not necessarily play a role in autonomy (since the “pilot” is an AI), but developers are increasingly adding VR visualizations to better understand and debug autonomous drone behavior. Some frameworks enable a human operator to “step into” a running drone simulation via VR and watch the drone’s sensors and decisions in real-time, which can accelerate development of reliable autonomous systems. Moreover, educational programs in engineering and computer science use drone simulators for teaching: students can program a virtual drone to perform tasks in a simulated environment (such as delivering a package across a virtual city) and learn programming and robotics concepts without needing physical drones. Some simulators explicitly list support for education and development among their attributes, allowing integration of custom flight algorithms and programmatic control for academic exercises. This feature means the simulator can serve as a platform for coding competitions or algorithm testing, where each team’s logic flies the same virtual course, ensuring fairness and avoiding hardware discrepancies.
3.5. Entertainment and Drone Sports
Though training and professional use dominate, XR-based drone simulation has also entered the realm of entertainment and sports. Drone racing, a fast-growing e-sport, relies heavily on simulators to train and qualify pilots. PC-based FPV (first-person view) racing simulators like DRL Simulator and LiftOff are popular, but some developers are experimenting with VR to enhance the immersion—letting pilots feel like they’re truly “inside” the drone as they speed through courses [
15]. A few drone racing simulators support VR goggles for a more lifelike cockpit experience, though latency and nausea are challenges at extreme speeds. On the AR side, games have been developed that overlay live drone flights with race gates or virtual obstacles. For example, the
Drone Prix AR game (by EdgyBees) projects a virtual obstacle course into the live video feed of a DJI drone [
23]. Pilots flying a real drone see virtual rings and barriers in their goggles, turning any open field into an XR racing arena. This concept blurs the line between simulation and reality, effectively creating an augmented training exercise: pilots must maneuver a real drone as if the virtual obstacles were real, which sharpens their skills while keeping the course flexible and visually engaging.
A recent example demonstrating this blend of training and gameplay is
Drone Simulator VR, a Quest-based application developed by 0Space. The simulator features realistic drone physics and supports multiple flight modes, including Tripod, Position, and Sport. Its unique design combines GPS/ATTI toggling with challenge elements such as exploration missions, collectible unlocks, and time-trial racing. With its affordable price and focus on both realism and gamification, Drone Simulator VR demonstrates how XR platforms can reach casual users while maintaining training relevance. Its structure offers a strong benchmark for future XR drone games seeking to balance skill development with player engagement [
24].
Beyond competitive performance, recent research also explores how drones can serve as expressive agents within XR entertainment. Bevins and Duncan (2021) examined how people perceive and respond to drone flight paths as non-verbal communicative gestures, opening up possibilities for richer interaction design and performance-based drone experiences [
25]. Such AR-enhanced flying is still mainly a leisure activity, but it demonstrates the possibilities of XR: eventually, formal drone sport events might include AR elements (e.g., spectators seeing virtual effects through AR as drones race, or pilots dodging virtual hazards for extra points). The entertainment use cases are beyond the focus of this survey, but they underscore XR’s versatility and its potential to attract and train the next generation of drone enthusiasts.
In all these domains, XR-based simulation provides a safe, cost-effective, and flexible means to train and innovate in drone operations. Whether for a student mastering basic controls, an inspector practicing a difficult flight, or a soldier preparing for drone warfare, XR delivers realistic scenarios on demand. The following case studies spotlight some specific XR drone simulation solutions in action, illustrating how the technology is implemented to meet real-world needs.
4. Case Studies of XR Drone Simulation
4.1. Fixed-Wing and FPV Drone Simulation
While most XR drone simulators focus on multirotor drones due to their prevalence in commercial and consumer sectors, fixed-wing UAVs require a distinct training approach. The control schema for fixed-wing aircraft more closely resembles that of traditional Remote Control (RC) planes, employing inputs such as rudder, elevator (yoke), and throttle, in stark contrast to the joystick-based control schemes (e.g., Mode 1 and Mode 2) common in multirotors. A notable example of XR-based fixed-wing simulation is
RC Pilot Trainer, a commercially available application on the
Meta Quest platform [
26]. Designed for immersive training, it provides stick-and-rudder control using virtual reality controllers mapped to physical RC aircraft behavior. Despite being less feature-rich than professional-grade flight simulators, it showcases the increasing feasibility of portable, headset-based simulation for fixed-wing platforms. In addition to XR-based systems, PC-based simulators like
PicaSim [
27] have long served as valuable tools in fixed-wing pilot education. Offering detailed physics, wind dynamics, and custom training scenarios, PicaSim is widely used among hobbyists and professionals for pre-flight practice. These simulators are often the first step in training workflows, enabling pilots to internalize responses to aerodynamics and inertia without risking hardware.
While many modern fixed-wing UAVs, such as Aerosense’s
Aerobo Wing [
28], are designed for automated flight, manual control remains essential for certification, safety, and calibration. In many countries, type certification requires pilots to demonstrate manual operation skills, underscoring the continued need for simulator-based training.
FPV Drone Simulation
Although not strictly fixed-wing, FPV (First Person View) drones share key control characteristics with RC airplanes. Unlike GPS-stabilized drones, FPV drones require direct and continuous manipulation of the throttle, yaw (rudder), pitch, and roll, emulating the manual flight experience of fixed-wing models. In FPV drone operations, pilots wear an HMD that streams real-time footage from an onboard camera, providing a highly immersive cockpit-like experience. This perspective enables precise control and maneuvering, particularly in high-speed and obstacle-dense environments such as drone racing or inspection scenarios. Given the complexity of such control and the need for rapid reflexes and spatial awareness, simulators play a vital role in training. Notable FPV drone simulators like
VelociDrone [
29] and
Liftoff [
30] offer realistic flight dynamics, customizable environments, and compatibility with physical RC transmitters, making them effective tools for pilots, regardless of their initial level of competence.
In summary, XR and desktop-based simulators for fixed-wing and FPV platforms contribute significantly to flight readiness, muscle memory acquisition, and cognitive load management. These tools extend the scope of drone training beyond quadcopters, and serve niche sectors such as mapping, long-range inspection, and aerobatics.
4.2. Inzpire & Varjo Mixed Reality UAV Training (Military Application)
A particularly compelling case study comes from the defense sector: Inzpire Limited’s mixed reality UAV training system, developed in the UK in partnership with Varjo. Inzpire, a defense training provider, created a deployable Compact Agile Simulation Environment (CASE) for drone operators. It integrates a
Varjo XR-3 mixed reality headset with an actual UAV ground control station (GCS) [
14]. This system is among the first to blend a real control interface and live operational context with a virtual battlefield environment for drone training. In practice, an operator wearing the XR headset sees a composite view: the real-world elements (their hands, the physical control console, maps, and teammates in the room) are visible through the high-resolution pass-through, while a 3D virtual battlespace is overlaid, filling their field of view with the mission scenario. Essentially, the trainee sits at a real GCS (exactly as they would in a command post or vehicle), but instead of looking at physical screens or out a window, they perceive the mission through the MR headset. A virtual landscape, enemy targets, friendly units, and a UAV camera feed are all rendered in the immersive view.
This approach yields extremely high realism for mission training. Because the trainee interacts with real hardware (the same control software, keyboard, or joystick they use on actual operations), any gap in interface familiarity is eliminated. They develop muscle memory and workflow exactly as in real missions. Meanwhile, the MR environment provides visual immersion that traditional 2D screen simulators lack. In one scenario described by Inzpire, a drone pilot can practice flying an unmanned aircraft over a virtual village, identifying insurgents and coordinating with a virtual ground convoy, all while communicating with others and manipulating real controls. The mixed reality aspect is key: the operator might glance down to use a physical map or checklist on the desk, or see a colleague next to them giving instructions, preserving the situational context of a real operation. According to Inzpire, users have noted that the seamless blending of real and virtual makes the training feel convincing: “Using Varjo’s technology has allowed us to achieve seamless integration between the live and the virtual. it immerses them and enhances the believability”, says James Clarkson of Inzpire. Trainees reportedly forget they are in a simulation tent and instead feel like they are on a mission, which leads them to treat the exercise with appropriate seriousness. This emotional and cognitive engagement is crucial in military training, where the goal is to induce the same stress and decision-making pressure as are found in live operations, but without the cost and risk associated with live exercises.
The portability of Inzpire’s CASE system is another highlight. It is built to be taken to the front lines or deployed locations. It consists of a ruggedized case containing the headset, a laptop running the simulation (networked if necessary), and the control station equipment. This means that drone operators in the field can, if given sufficient warning, run quick mission rehearsals in MR before executing real missions. For example, prior to a complex urban operation, a team could program a virtual replica of the target area (using satellite data) and practice their surveillance flight plan in MR, identifying potential challenges, all from within a tent or vehicle. The system also supports networked multi-player training: multiple MR setups can be linked so that distributed teams (pilots, sensor operators, and possibly other units) share the same virtual scenario and can communicate and coordinate within it. This enables collective training events, like an entire drone unit or even multi-crew scenarios, without the logistical overhead of gathering many aircraft and personnel in one physical location.
The MR trainer supports a range of UAV tasks: basic flight operations, payload management (camera and sensor use), and emergency procedures. Trainees can practice everything from normal reconnaissance patterns to handling simulated emergencies like datalink loss or aircraft failures in the virtual environment. Moreover, because it is software-defined, the scenarios can be changed as necessary, from (for example) small quadcopter reconnaissance in a peacekeeping scenario, to a large military UAV in a high-threat combat airspace. The
Varjo XR-3’s superior visual fidelity (with a human-eye-resolution focus area) means that even small text (like map labels or instrument readouts) can be read in MR, and targets can be seen at realistic detection ranges. This fidelity addresses a common shortfall of older VR headsets: limited resolution and field of view. The XR-3 also has autofocus, allowing the user to shift easily between near objects (cockpit instruments) and distant ones (the outside world), mirroring human eye behavior. In training, this lets a pilot quickly refocus from monitoring a video feed to looking “outside” at the terrain in MR.
The Inzpire case study demonstrates the potential of XR at the high end of training: replacing multi-million-dollar dome simulators (or augmenting them) with lightweight XR gear that provides equal or better immersion. In a traditional simulator, you might have a physical mock-up of a cockpit inside a 360° projected dome, but this is an extremely expensive (and stationary) training solution. Here, a single XR headset achieves a similar effect for a fraction of the cost and with far more flexibility. It is notable that Inzpire’s MR solution was delivered not just as a tech demo but directly into service training; it reflects real user needs. The military context also reveals XR’s current limitations: issues like trust and wide field of view (for peripheral awareness) are actively being addressed. Inzpire’s James Clarkson noted that “Mixed reality is now the forefront of immersive technology because it accurately matches the virtual and real environment,” implying that only recently have XR devices become good enough for this level of training fidelity. The success of this program likely foreshadows broader adoption of MR in both military and civilian aviation training, where the benefits of seeing one’s real environment/tools while still being immersed in a scenario are immense.
4.3. Foxtrot SimFlight XR (Hybrid VR/AR Training)
SimFlight XR is a commercial training solution developed by Foxtrot Inc. in Yokohama, Japan (Figure 4). This system is a hybrid XR application for drone pilot training that runs on standalone VR headsets (Meta Quest 2/3) [10]. It offers two modes: a fully immersive VR mode and an AR passthrough mode. In VR mode, the user is placed in a completely virtual environment (such as a practice field or a digital twin of a real location) for training. In AR mode, the headset’s cameras pass through the real-world view, onto which the software overlays a lifelike virtual drone, allowing the user to walk in a real space and see/operate the drone as if it were actually there. This AR mode is particularly useful for on-site training. For example, an instructor can take trainees to a construction site and have them “fly” a virtual drone around the site using the headset, blending the real terrain with the simulated drone.
Foxtrot’s simulator is specifically geared toward professional training scenarios. As shown in its use-case taxonomy, it supports drone flight schools, various industries, and even insurance and public safety applications. In drone schools, the system is used to prepare students for Japan’s national drone license exam, simulating the official test course and maneuvers like figure-8s, precision hovering, and emergency procedures. A student can practice these repeatedly in VR without fear of crashing. In commercial contexts,
SimFlight XR provides scenario packages: for construction and surveying, it can load a virtual construction site with obstacles (e.g., cranes, buildings,
etc.) to practice safe navigation and mapping flights; for agriculture, it simulates crop dusting missions for which pilots must maintain stable altitude and speed over crop fields; for disaster response, it offers scenarios with virtual smoke, low visibility, and obstacles, in order to train pilots for SAR missions in hazardous conditions. Uniquely, the AR mode can be used to overlay those virtual scenarios on the real world, for example by projecting the layout of a disaster training course onto an actual open space, or placing virtual obstacles like flying debris into the user’s view of a real training field. This AR training gives pilots the experience of flying in their real operational environment but with added (virtual) challenges and without real risk.
Figure. Comparison of VR and AR passthrough modes in SimFlight XR. Source: Foxtrot official site. © 2025 Foxtrot Inc.
The simulator also emphasizes analytics and customization. All flight telemetry and control inputs are recorded, allowing instructors to review pilots’ performance. The system can generate replay visualizations or even quantify metrics like stability, reaction time, and mission completion time. This data-driven approach aligns with how manned aviation simulators are used (with post-flight debriefings). Additionally, Foxtrot Inc. offers custom development to adapt the simulator to an organization’s needs. For example, a power utility company could ask to incorporate 3D models of their actual inspection sites (power lines or substations,
etc.) into the simulator, or a drone manufacturer could have their new drone model’s flight characteristics uploaded so that pilots can train on that specific platform. The simulator’s flexibility extends to language support (Japanese and English) and enterprise deployment options (managed via a B2B contract and distributed through the Meta App Lab for easy installation).
From a technical standpoint,
SimFlight XR leverages the
Meta Quest’s inside-out tracking and color pass-through cameras to achieve its AR features. The headset scans the environment to allow the virtual drone to interact with real objects. For example, if a real wall is present, the AR drone will appear to collide with it or be occluded by it, enforcing realistic line-of-sight rules. This is achieved by using the Quest’s depth mapping ability to register physical surfaces in the virtual space. Users operate the drone via the Quest hand controllers, which are mapped to throttle/joystick inputs; notably, trainees can hold the two hand controllers in the same positions as the gimbal sticks of an actual RC transmitter for a more natural feel. Through this interface, pilots can switch between GPS and ATTI modes as necessary and in real time, as well as trigger return-to-home or other functions, mimicking real drone operations. The simulator provides immediate feedback for mistakes (visual and audio cues on crashes,
etc.) and can introduce failures (like GPS signal loss) to test the pilot’s response.
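To make the controller mapping concrete, the sketch below (in Python, purely illustrative; the function and parameter names are our own and do not reflect Foxtrot's implementation) shows how two thumbstick axes might be translated into Mode-2 RC channels, with a mild exponential curve similar to what many transmitters apply near stick center.

```python
# Hypothetical sketch: mapping two VR controller thumbsticks to Mode-2 RC channels.
# All names and values are illustrative assumptions.

def expo(x: float, k: float = 0.3) -> float:
    """Soften the stick response near center, as many RC transmitters do."""
    return (1 - k) * x + k * x ** 3

def sticks_to_channels(left_x, left_y, right_x, right_y):
    """Mode-2 convention: left stick = throttle/yaw, right stick = pitch/roll."""
    return {
        "throttle": max(0.0, (left_y + 1.0) / 2.0),  # map [-1, 1] -> [0, 1]
        "yaw":      expo(left_x),
        "pitch":    expo(right_y),
        "roll":     expo(right_x),
    }

# Example: near-hover throttle with a slight right roll input
print(sticks_to_channels(0.0, 0.1, 0.2, 0.0))
```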
This case illustrates how a single XR platform can cater to a broad spectrum of training needs by toggling between VR and AR and by offering scenario customizability. Early deployments of
SimFlight XR have been met with interest in Japan’s infrastructure and training communities. It was showcased at the Japan Drone 2025 expo, where it was highlighted as a next-generation tool to maintain and enhance field skills amid cost and safety constraints. By overcoming physical limitations (like needing a large open space or specific weather for training) through XR, this simulator enables more frequent, varied, and safe practice for drone pilots. It brings the training to wherever the pilot is, and in whatever context is needed, whether that is an indoor classroom or directly at a worksite, thereby greatly expanding the reach and efficacy of drone education.
4.4. XR for Counter-Drone Training: DroneShield & Operator’s xOperator
The threat of malicious drones (for espionage, disruption, or attacks) has driven the development of specialized Counter-UAS (C-UAS) tactics and technologies. XR simulation is playing a role in training personnel for these scenarios. A notable example is the partnership between DroneShield (a C-UAS product company) and the Australian firm Operator (XRG) to create
xOperator, an XR-based counter-drone training system [
16] (see also the accompanying figure for a representative example of counter-UAS training by the U.S. Army). Introduced in 2022,
xOperator is a VR simulator specializing in counter-drone engagements: it immerses security personnel in realistic environments where they must detect, track, and neutralize simulated rogue drones using virtual avatars of DroneShield’s equipment.
In
xOperator, the trainee wears a VR headset and is placed in a lifelike scenario (e.g., a virtual airport, a forward operating base, or the roof of a building) appropriate to the needs of the client. The scenario will include one or multiple hostile drones intruding, which may behave in various ways (e.g., a fast kamikaze drone diving toward a target, or a slower quadcopter spying). The trainee is equipped with a VR model of the
DroneGun Tactical (a rifle-shaped jammer designed to disrupt drone signals), together with virtual radar screens or detection readouts that replicate the interfaces of DroneShield’s operational detection systems. Using these tools, the trainee must perform the full sequence of counter-UAS response: identifying the incoming drone on sensors, visually locating it in the sky (in VR), aiming the DroneGun, and “disabling” the drone by activating the jammer within the simulation. The physics and effects are modeled so that if the jammer is used effectively and within range, the virtual drone will behave as a real one would when jammed (typically, it would lose control or descend). If the drone gets too close or is not stopped, the scenario can play out the consequences (e.g., a simulated explosion or data leak), giving feedback on failure.
Figure. Counter-UAS training. U.S. Army photo by SFC Tanisha Karn, via DVIDS (Public Domain).
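The exact engagement model used in xOperator is not publicly documented; the following minimal sketch only illustrates the kind of range-and-cone check a simulator might apply when deciding whether a jamming attempt succeeds (all values are assumptions, not DroneShield specifications).

```python
import math

def jam_successful(gun_pos, gun_dir, drone_pos, max_range=500.0, beam_half_angle_deg=15.0):
    """Illustrative rule: jamming succeeds if the drone is within range and inside
    the jammer's beam cone. gun_dir is assumed to be a unit aim vector."""
    dx = [d - g for d, g in zip(drone_pos, gun_pos)]
    dist = math.sqrt(sum(c * c for c in dx))
    if dist > max_range or dist == 0.0:
        return False
    # angle between the aim direction and the line of sight to the drone
    dot = sum(a * b for a, b in zip(gun_dir, dx)) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= beam_half_angle_deg

# Drone 200 m out and about 5 degrees off the aim axis -> jammed
print(jam_successful((0, 0, 0), (1, 0, 0), (200, 17.5, 0)))
```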
One of the key benefits of using VR here is scenario diversity and repeatability. In reality, practicing counter-drone tactics is extremely difficult, requiring friendly drones to act as aggressor targets and a safe area in which to operate jammers (which are regulated devices). Even then, it is hard to simulate multiple simultaneous drones or dangerous behavior. In VR,
xOperator can present the trainee with a variety of drones, from small quadcopters to fast winged drones, and can simulate complex attack patterns and even swarms without risk. Trainees can experience tense situations like multiple drones approaching from different directions, forcing them to prioritize and react under pressure. The virtual environments can also be tailored: one session might be in daytime conditions, another set at night with poor visibility; one in an open desert base, another in a cluttered urban environment with many false targets (birds, kites,
etc.). This breadth of training is invaluable as it exposes security teams to far more possible threat scenarios than they could realistically stage live.
xOperator includes an After-Action Review (AAR) system. After a scenario ends, the software can replay the entire event, allowing the trainee and instructors to observe it from various camera angles or from a top-down view. They can see the path taken by the drone, how quickly the trainee responded to events, whether they aimed accurately, and at what distance they engaged. This replay feature is crucial for feedback, operating in much the same way that sports teams review game footage. In training, the instructor might point out that the trainee fixated on one drone and missed another, or that they took too long to acquire the target on sensors. The AAR can also provide quantitative metrics (time to detection, time to neutralization, number of attempts,
etc.), which can track a trainee’s improvement over multiple sessions.
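As a simple illustration of how such AAR metrics could be derived, the sketch below computes detection and neutralization times from a hypothetical timestamped event log; the event names are invented for the example.

```python
# Hypothetical after-action review metrics from a timestamped event log (seconds).
events = [
    ("drone_spawned", 0.0),
    ("detected_on_sensor", 4.2),
    ("visually_acquired", 9.8),
    ("jammer_activated", 12.5),
    ("drone_neutralized", 14.1),
]

def first_time(log, name):
    """Return the timestamp of the first event with the given name."""
    return next(t for n, t in log if n == name)

time_to_detection = first_time(events, "detected_on_sensor") - first_time(events, "drone_spawned")
time_to_neutralization = first_time(events, "drone_neutralized") - first_time(events, "drone_spawned")
attempts = sum(1 for n, _ in events if n == "jammer_activated")

print(f"Detection: {time_to_detection:.1f} s, "
      f"neutralization: {time_to_neutralization:.1f} s, attempts: {attempts}")
```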
From an implementation perspective,
xOperator combines commercial VR hardware with custom-developed simulation software. The system provides high-fidelity graphics to enable recognition of small drones and integrates a virtual model of DroneShield’s DroneGun controller. Through repeated training, operators can gain familiarity with detection, aiming, and engagement procedures in a safe virtual environment.
DroneShield’s partnership with Operator (XRG) shows how industry and XR specialists are combining their expertise. DroneShield provides the subject matter knowledge of drone threats and mitigation, while XRG brings XR simulation know-how. The result is a training solution that was not possible a few years ago, as XR hardware had to mature to render distant flying objects well and to track rapid head movements without latency (critical when “shooting” at drones in the sky). Now, with current VR technology, it is not only possible but practical to provide on-demand C-UAS drills to security forces. Since its introduction, this system has been marketed to military, police, and critical infrastructure security teams, particularly those that have invested in counter-drone technology and need to ensure that their personnel can use it effectively while under stress.
This case study underscores how XR simulations are moving into niche but important domains. Counter-UAS is a very recent challenge, and traditional training methods cannot address it easily: you cannot fire jammers around commercial airports for practice, for example. VR fills that gap by creating a safe stand-in environment. As drone threats evolve (e.g., swarms or AI-guided drones), the simulator can be updated to reflect new tactics, keeping training relevant. It also exemplifies a general pattern: XR training modules are developed as turnkey products for specialized skills, analogous to flight simulators existing for specific aircraft. We can expect more of these targeted XR trainers (for firefighting drones, medical delivery drones,
etc.) to emerge, each combining domain-specific expertise with XR immersion to solve a training need.
5. Technical Challenges in XR-Based Drone Simulation
While XR-based drone simulators show immense promise, they also face several technical and practical challenges. Achieving a high degree of realism, interaction fidelity, and training effectiveness in XR is not trivial. Here we discuss the key challenges identified in current XR implementations and research:
5.1. Realism vs. Performance
One fundamental challenge is delivering realistic visuals and physics without overloading the hardware or inducing user discomfort. The high-fidelity simulation of large outdoor environments (with detailed terrain, buildings, trees,
etc.) can be computationally intensive. In VR, maintaining a high frame rate (typically 72–90 frames per second [FPS] or more) is essential to prevent motion sickness. Thus, developers must balance detail with performance. For instance, rendering a dense urban city for a drone inspection scenario in VR might require the use of level-of-detail techniques and the culling of unseen areas to keep frame rates smooth. Similarly, realistic physics (e.g., accurate wind turbulence affecting the drone) can be hard to simulate in real time. Simplifications are often made, which can slightly reduce realism. If the physics are oversimplified (e.g., the drone stops instantly with no drift), pilots might develop incorrect muscle memory. Achieving a validated physics model that runs in real-time remains an ongoing effort; many simulators continuously refine their flight dynamics by comparing their virtual drones’ performance against real flight data. Sensor realism is also a concern: training a pilot to rely on camera feeds or FPV perspectives means that the VR/AR system must simulate camera behavior (limited field of view, some latency, exposure changes in different lighting) to be faithful. Overly perfect simulation visuals (with no noise or camera quirks) might make real flights feel strangely difficult by comparison. Achieving the right level of “imperfection” is challenging.
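As one concrete example of deliberately imperfect sensor simulation, the following sketch delays and degrades a simulated FPV feed so that the virtual camera does not behave unrealistically well; the latency and noise parameters are assumed values, not those of any specific simulator.

```python
import collections, random

class ImperfectCameraFeed:
    """Illustrative sketch: add fixed latency and sensor noise to simulated FPV frames,
    so the virtual feed is not 'too perfect'. Parameters are assumptions."""
    def __init__(self, latency_frames=3, noise_sigma=0.02):
        self.buffer = collections.deque(maxlen=latency_frames + 1)
        self.noise_sigma = noise_sigma

    def push(self, frame):           # frame: list of pixel intensities in [0, 1]
        self.buffer.append(frame)

    def read(self):
        if len(self.buffer) < self.buffer.maxlen:
            return None              # feed not "warmed up" yet -> simulated link delay
        delayed = self.buffer[0]     # oldest buffered frame, i.e., a delayed view
        return [min(1.0, max(0.0, p + random.gauss(0, self.noise_sigma))) for p in delayed]

feed = ImperfectCameraFeed()
for _ in range(5):
    feed.push([0.5] * 4)
print(feed.read())                   # noisy, delayed version of an earlier frame
```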
5.2. Field of View and Visual Acuity in HMDs
Although XR headsets have improved, their field of view (FOV) is still narrower than human vision. Many VR/MR headsets offer around 90–100 degrees horizontal FOV, whereas human peripheral vision extends to around 180 degrees. This can affect training, especially for tasks like scanning the sky for drones or maintaining situational awareness of obstacles. Trainees might need to develop scanning techniques that account for limited FOV, or else risk missing virtual objects at the periphery of their available vision. Some users have reported that certain spatial tasks are harder in VR simply because you do not catch movement “out of the corner of your eye” as in real life. Additionally, visual acuity and resolution in headsets, while good, can make distant objects or fine details harder to see. In a drone simulator, a small quadcopter 100 meters away in VR might be rendered in just a few pixels, making it potentially more challenging to spot than in real life where the eye might pick it out against the sky (although this depends on resolution and contrast). Advanced headsets like those developed by Varjo attempt to solve this with foveated rendering (
i.e., the picture is sharp where you are looking) [
14]. However, not all systems have that facility. These limitations mean that trainers must design scenarios appropriately (e.g., by ensuring that targets are visible enough, or by providing visual/auditory cues) to avoid negative training. The
SafeSpect AR study found that some pilots felt discomfort or difficulty when AR interface elements were placed awkwardly in their view or when the headset’s limited FOV cut off parts of the interface when they moved their eyes [
18]. Headset ergonomics (such as weight and comfort) are also factors that must be considered. For example, a heavy headset can cause fatigue during long training sessions, which is not something real drone piloting usually entails.
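A back-of-the-envelope calculation illustrates the acuity problem. Assuming a headset resolving roughly 22 pixels per degree (a plausible figure for current consumer devices, used here only as an assumption), a 35 cm quadcopter at 100 m subtends only a handful of pixels:

```python
import math

def pixels_subtended(object_size_m, distance_m, pixels_per_degree):
    """Approximate on-screen size of an object in an HMD, given its angular resolution."""
    angle_deg = math.degrees(2 * math.atan(object_size_m / (2 * distance_m)))
    return angle_deg * pixels_per_degree

# A 0.35 m quadcopter seen at 100 m with an assumed ~22 pixels/degree headset:
print(f"{pixels_subtended(0.35, 100, 22):.1f} px")   # roughly 4-5 pixels
```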
5.3. Motion and Simulator Sickness
Related to the above, simulator sickness (a form of motion sickness that occurs in VR environments) is a concern. Drone simulations can induce contradictory sensory cues, such as the visual scene indicating motion (
i.e., the ground moving as the drone flies forward) but the user’s body remaining stationary. If not handled carefully, this discontinuity of experience can lead to nausea and disorientation in some users. Rapid accelerations, rolls, or sudden viewpoint changes in VR are especially problematic. Many drone simulators thus limit extreme maneuvers or provide comfort options (like reducing motion blur or offering a fixed horizon reference in the HUD). The situation is particularly acute for FPV drone racing simulations in VR because in them, acrobatic moves are the norm; as a result, most FPV simulators still recommend using a standard monitor or FPV goggles (which show the view but without head tracking) to avoid motion sickness. Over time, users can adapt, but it remains a barrier for broad use, as a trainee who gets sick in VR obviously cannot train effectively. MR/AR can mitigate this because the real world is still visible (the user has a stable frame of reference), but MR is only applicable to some training modes. Active research on reducing VR sickness is ongoing, including techniques such as dynamically blurring peripheral vision during rapid motion, or using three degrees of freedom (3DoF) modes in which the viewpoint is partially decoupled from head movement. However, all of these have to be tuned so as not to degrade training realism too much.
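The dynamic peripheral-blur technique mentioned above can be approximated very simply: the rendered field of view is narrowed as the camera's angular speed rises. The sketch below uses invented thresholds purely for illustration; real implementations tune these values against user comfort studies.

```python
def vignette_fov(angular_speed_deg_s, full_fov_deg=100.0, min_fov_deg=60.0,
                 onset_deg_s=30.0, max_deg_s=180.0):
    """Illustrative comfort vignette: shrink the rendered FOV linearly with camera
    angular speed. Thresholds are assumptions, not values from any product."""
    if angular_speed_deg_s <= onset_deg_s:
        return full_fov_deg
    t = min(1.0, (angular_speed_deg_s - onset_deg_s) / (max_deg_s - onset_deg_s))
    return full_fov_deg - t * (full_fov_deg - min_fov_deg)

for speed in (10, 60, 200):            # deg/s of rotation
    print(speed, "->", vignette_fov(speed), "deg visible")
```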
5.4. Interaction and Control Fidelity
Ensuring that the user interacts with the simulation in a way that faithfully replicates real drone operation can be difficult. As discussed, many simulators use game controllers or VR controllers to emulate an RC transmitter. However, subtle differences in feel or latency can affect training. A physical RC transmitter has sticks that have their own particular tension, and many pilots develop a feel for throttle management based on that. A VR controller might not replicate the exact tension or range of motion, potentially leading to slightly different control inputs. Some simulators allow the direct use of real transmitters via USB dongles (as used for PC drone sims) which can improve fidelity. Another aspect is haptic feedback: when a drone is about to hit something or when the battery is low, real drones might beep or vibrate (through the controller or the drone itself). Simulators have to provide analogous feedback, usually via on-screen indicators or sounds, but if they fail to emulate these cues, a trainee might not learn to rely on them. Moreover, MR systems like Inzpire’s that incorporate actual controls face integration challenges (
i.e., ensuring the virtual environment and the real hardware inputs stay perfectly in sync, with no lag or misalignment). If there is any mis-registration (e.g., the virtual map appears offset from the real map on the table), it can cause confusion or error. Aligning virtual and physical coordinate systems precisely remains a technical challenge for MR simulators.
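Registration of the virtual and physical frames is usually computed from a few shared reference points. The following simplified two-dimensional sketch estimates a rigid rotation and translation by least squares (a Kabsch-style fit); real MR systems solve the same problem in three dimensions with additional calibration and drift-correction steps.

```python
import numpy as np

def fit_rigid_2d(physical_pts, virtual_pts):
    """Least-squares rotation + translation aligning virtual anchor points to their
    measured physical locations (2D Kabsch fit); a simplified registration sketch."""
    P = np.asarray(physical_pts, float)
    V = np.asarray(virtual_pts, float)
    Pc, Vc = P - P.mean(0), V - V.mean(0)
    U, _, Wt = np.linalg.svd(Vc.T @ Pc)
    R = (U @ Wt).T
    if np.linalg.det(R) < 0:          # avoid reflections
        U[:, -1] *= -1
        R = (U @ Wt).T
    t = P.mean(0) - R @ V.mean(0)
    return R, t

# Three markers measured in the room vs. their positions in the virtual map
R, t = fit_rigid_2d([(1.0, 0.2), (2.0, 0.2), (1.5, 1.2)],
                    [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)])
print(np.round(R, 3), np.round(t, 3))  # rotation ~ identity, translation ~ (1.0, 0.2)
```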
5.5. Communication and Team Training
Drone operations often involve team communication, such as a drone pilot talking to a camera payload operator, or a pilot communicating with a command center or teammates on the ground. Simulating the communications environment is important but can be challenging. In a networked VR scenario, voice chat can connect trainees, but inserting realistic radio effects or simulating comm delays might be necessary to generate authenticity. Ensuring that multi-user XR training remains synchronized (
i.e., all participants seeing the scenario consistently) also requires robust networking. If one person’s simulation lags or diverges, it could disrupt group training. The bandwidth needed for rich shared VR environments can be high, so special networking code or local network setups might be needed for multi-person XR drills. This consideration is particularly relevant in military settings.
5.6. Content Creation and Scenario Authoring
Developing high-quality training scenarios in XR can be resource-intensive. It often requires 3D artists to build virtual environments or import GIS data; subject matter experts to script realistic events; and possibly AI to control virtual entities (like a virtual “intruder drone” that behaves intelligently). For a training program to cover a wide range of scenarios, a large library of virtual environments and event scripts is needed. Moreover, it is challenging to create authoring tools so that instructors (who may not be VR experts) can easily set up scenarios or modify parameters. Some simulators address this with user-friendly scenario editors (e.g., drag-and-drop a drone here, define its path, add wind from this direction,
etc.), but this capability is not universal. Without easy content creation, the risk exists that the simulator will be used with only a limited set of canned scenarios, potentially reducing its training value over time. Automation and AI may be able to help by automatically generating a variety of obstacle courses or dynamic weather to keep scenarios fresh.
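One common approach is to make scenarios purely data-driven, so that an editor (or an instructor) manipulates a structured description rather than code. The sketch below shows what such a description might look like; every field name is hypothetical and chosen only for illustration.

```python
# Hypothetical data-driven scenario description; field names are illustrative only.
scenario = {
    "environment": "construction_site_A",
    "weather": {"wind_speed_mps": 6.0, "wind_direction_deg": 270, "visibility_m": 800},
    "drone": {"model": "generic_quadcopter", "battery_pct": 80},
    "events": [
        {"t": 30.0, "type": "gps_loss", "duration_s": 15.0},
        {"t": 90.0, "type": "spawn_obstacle", "kind": "crane_swing", "position": [40, 10, 25]},
    ],
    "pass_criteria": {"max_altitude_error_m": 2.0, "collisions": 0},
}

def validate(sc):
    """Minimal sanity checks an authoring tool could run before a session starts."""
    assert sc["weather"]["wind_speed_mps"] >= 0
    assert all(e["t"] >= 0 for e in sc["events"])
    return True

print(validate(scenario))
```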
5.7. Hardware Cost and Accessibility
While XR hardware costs have come down for consumers, professional-grade setups (like
Varjo XR headsets or multiple systems for a class of students) can be expensive. Organizations must weigh these costs against training benefits. Additionally, accessibility is an issue, as not everyone is comfortable with or capable of using VR (people with certain vision problems or motion sensitivity might struggle). Traditional simulators (like a PC with a monitor) are more universally accessible even if they offer a less immersive experience. Maintenance and support remain significant considerations: XR gear can be maintenance-intensive (needing tracking calibration, software updates,
etc.), which might require an IT or support staff, especially when deployed at scale in an academy or enterprise. One interesting accessibility solution comes from research: a WebAR-based simulator that runs on a standard web browser and uses simple printed markers for AR [
17]. Ribeiro et al. (2021) showed that using WebAR could lower barriers to use, since trainees only need a smartphone or webcam-equipped PC to see a virtual drone fly around markers in their real space [
4]. This reduces the need for specialized hardware, although it does so at the expense of immersion and fidelity.
5.8. User Acceptance and Training Efficacy
Even if the technology works perfectly, an important challenge is convincing users and stakeholders of its effectiveness. Some seasoned pilots or instructors might be skeptical of VR training, especially if they grew up with hands-on experience. They may question whether skills learned in “gaming” environments truly transfer to real flights. Building trust in the simulator is crucial. One AR drone interface study found that while an adaptive AR HUD helped pilots by removing clutter, some did not trust the system’s judgment of what to hide: they felt uneasy not seeing everything even if it was extraneous [
18]. This indicates a need to design XR training tools with transparency and perhaps user control in mind, so that users feel comfortable. In training, if a student does not take the simulation seriously (perhaps because the graphics are cartoonish or they feel that it is just a game), the learning value decreases. Thus, achieving a level of realism that induces the appropriate mindset is important. As noted, MR tends to help because seeing real elements better anchors the experience in reality.
Empirical studies on training transfer (
i.e., measuring performance improvement in real flights after VR training) are still relatively limited, especially for drones. Early results are positive (with many studies citing improved or equal performance [
6]), but more data will help to address any lingering doubts. Recent work by Somerville et al. (2024) reinforces these findings, providing systematic evidence that structured simulator training can significantly improve real-world drone performance, including a 32% boost in flight accuracy on a return-to-home task compared to a control group [
1].
Ruiz-Medina et al. (2025) further highlight the importance of validating simulator-trained motor skills by directly comparing operator performance in simulated and real environments, showing measurable differences that indicate how well training transfers [
31]. Cardona-Reyes et al. (2024) also provide empirical evidence that task structure in VR significantly influences the quality of learning outcomes in drone pilot training [
22]. Beyond pure performance, Maeng et al. (2024) analyze how geographic information availability affects decision-making during Beyond Visual Line of Sight (BVLOS) drone operations, underscoring that cognitive context also shapes training effectiveness [
32]. Standards bodies and regulators also pose a challenge: for instance, will aviation authorities credit simulator hours toward drone pilot licensing requirements? Acceptance of XR sims in official curricula will depend on demonstrating their efficacy and reliability.
Training Outcomes: XR vs. Traditional Methods: A Meta-Analytic View
How effective is XR-based drone training compared to conventional training? Recent research indicates that immersive simulators can yield performance outcomes on par with, or even exceeding, those from real-world practice. For example, a 2021 meta-analysis by Kaplan et al. examined transfer-of-training across numerous domains and found no significant difference between XR-trained groups and those trained via traditional methods, with an average effect size of essentially zero [
33]. In other words, pilots training in VR or MR performed just as well on post-training evaluations as those who trained on real equipment, validating XR as a viable substitute in many cases.
More recent studies focusing specifically on aviation echo these findings with some positive trends. Somerville et al. (2024) report that student drone pilots who underwent structured simulator training achieved a 32% improvement in flight accuracy during a real-world test maneuver (return-to-home positioning), compared to a control group without simulator practice [
1]. Ruiz-Medina et al. (2025) extend this by directly evaluating operator motor skills across simulated and real environments, strengthening the evidence that XR training outcomes generalize to real-world flight tasks [
31].
Such gains suggest not only the effective transfer of skills from XR to reality, but also a quantifiable boost in performance efficiency. Furthermore, the results point to the overall benefits of XR training for pilots: improved test scores, maneuver proficiency, and evidence of better skill retention over time. This latter point is crucial: early data indicate that skills learned in XR may decay more slowly, perhaps due to the high engagement and feedback richness of immersive training.
The table below summarizes select comparative findings from the literature on learning outcomes, highlighting effect sizes (where available), retention metrics, and transfer test results for XR-trained versus traditionally-trained cohorts.
Table. Comparison of XR-based and traditional drone training outcomes.
| Study | Performance Effect | Skill Retention/Transfer |
| --- | --- | --- |
| Kaplan et al. [33] (2020) | XR training produced performance levels comparable to traditional methods. | Long-term retention not significantly different; XR feasible as a substitute for conventional training. |
| Somerville et al. [1] (2024) | Structured simulator training improved real-world return-to-home accuracy by 32% compared to control. | Evidence suggests enhanced transfer to real flight tasks and indications of better long-term skill retention. |
| Ruiz-Medina et al. [31] (2025) | Direct comparison of simulated vs. real environments highlighted measurable differences in operator motor skills. | Demonstrates the extent to which simulator-acquired skills generalize to real-world UAV operation. |
5.9. Safety and Ethical Considerations
While XR removes physical risk, some safety considerations exist within VR. Users can trip or collide with real objects if they move around wearing a VR headset. This is a particular problem with room-scale AR scenarios; Foxtrot’s AR mode, for example, expects you to physically walk in an area, meaning that you must have a safe, clear space in which to do so. Clear boundaries and guardian systems (like Quest’s chaperone) are essential to prevent accidents. Ethically, scenario design in defense or public safety contexts might include distressing content (like simulated combat or disaster scenes). As with live training, ensuring that trainees are psychologically prepared and debriefed is important. Moreover, data recorded by simulators (pilot performance, errors,
etc.) should be handled with care and used for improvement, not punishment, to encourage trainees to use the system without fear of being judged harshly for virtual mistakes.
In summary, XR-based drone simulation must navigate myriad challenges, from the technical limits of the available hardware to human factors and acceptance issues. Each challenge is being actively addressed by the community. For example, hardware is improving every year (e.g., wider FOV, lighter headsets,
etc.), software techniques for foveated rendering and physics optimization are advancing realism, and interface research (like
SafeSpect) is tackling how to present information without overloading the recipient. The next section of this paper explores how future developments aim to overcome these challenges and enhance XR drone training further.
6. Future Directions
The field of XR-based drone simulation is evolving rapidly. As technology advances and adoption grows, we anticipate several key future directions that will shape the next generation of simulators.
6.1. Enhanced Realism through Advanced Graphics and Physics
Future simulators will leverage improvements in graphics hardware and rendering techniques to achieve near-photo-realistic environments. We can expect high-resolution textures, dynamic lighting, and weather effects that are virtually indistinguishable from reality. Real-world geographic data (from satellite imagery and 3D scans) will be more seamlessly integrated, enabling the instant creation of virtual copies of any location (e.g., a construction site, a downtown city block, or a dense forest) for mission-specific training. Physics engines will also advance; for example, more sophisticated fluid dynamics could allow realistic simulation of drone rotor wash interacting with the environment (which is useful for training in close quarters or around loose debris). Aerodynamic modeling might be taken further to include propeller turbulence, battery performance under various loads, and even acoustic signatures (so that trainees hear the drone’s sound change under strain). The goal is a “digital twin” level of realism, where every aspect of the drone and its environment in XR behaves as it does in real life. An additional benefit of this high fidelity is that it will allow simulators to be used not just for pilot training but also for system testing. For example, engineers could test new autopilot algorithms in an XR environment with confidence that it represents the real world accurately.
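To illustrate how a physics step might inject gust disturbances into the flight model, the following one-dimensional toy example combines a slowly varying gust with random turbulence and a simple position-hold controller. It is a didactic sketch with assumed constants, not the model used by any particular engine.

```python
import math, random

def gust(t, base=3.0, amplitude=2.0, period=7.0, turbulence=0.5):
    """Toy wind model: slowly varying gust plus random turbulence (m/s). Assumed values."""
    return base + amplitude * math.sin(2 * math.pi * t / period) + random.gauss(0, turbulence)

# 1D point-mass drone holding position against wind with a simple P-controller
mass, drag_coeff, kp = 1.2, 0.35, 2.0      # illustrative constants
x, v, dt = 0.0, 0.0, 0.02
for step in range(500):                     # 10 seconds of simulated flight
    t = step * dt
    wind = gust(t)
    thrust = -kp * x                        # controller pushes back toward x = 0
    accel = (thrust + drag_coeff * (wind - v)) / mass   # drag couples the drone to the wind
    v += accel * dt
    x += v * dt
print(f"position error after 10 s: {x:.2f} m")
```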
6.2. Wider Adoption of AR and Pass-through AR
Current evidence suggests that AR offers unique training advantages by combining the intuitive interaction of the real world with the immersion of simulation [
6]. In the future, we are likely to see many VR-based simulators adding MR capabilities as hardware permits. The trend in headsets (like Meta’s and Apple’s upcoming devices) is toward incorporating high-quality color pass-through cameras, enabling more MR experiences. In drone simulation, this could manifest as augmented training scenarios in which real environments are annotated or populated with virtual elements. For instance, an instructor in the field could put out physical markers, and trainees wearing MR headsets would see a virtual obstacle course or target indicators aligned with those markers. This blends the benefits of practicing in a real location (authentic ground textures, true depth perception) with the flexibility of simulation. AR also allows multi-participant experiences, where some participants might be in VR and others in AR in the same space, which could be useful for mixed training (e.g., a drone pilot in VR coordinating with a field team that sees AR overlays). Pilots have shown a preference for MR in some studies because it feels more natural (e.g., you can glance at your hands, maintain peripheral awareness,
etc.) which suggests that MR training may yield better engagement and skill retention. As hardware FOV and resolution improve, MR could become the default mode for many training tasks that involve equipment handling or team coordination, while pure VR remains used for scenarios in which the environment itself must be entirely fictional (e.g., flying on an alien planet for a research simulation).
6.3. Integration of AI and Intelligent Tutoring
Future XR simulators will increasingly incorporate AI-driven agents and tutoring systems to enhance training effectiveness. Intelligent agents can control virtual entities (such as other air traffic, birds, or dynamic obstacles) to create richer scenarios, without requiring an instructor to micromanage every event or manually adjust the behavior of these elements. For example, an AI adversary drone could be programmed to “react” to the trainee’s actions, providing more challenging dogfight or evasion training that adapts to the pilot’s skill level. Beyond scenario control, AI can serve as a virtual instructor/coach, monitoring the trainee’s performance in real time and providing guidance or feedback. In an XR simulator, this might appear as contextual prompts (e.g., “Altitude low—pull up!” if the trainee is about to crash) or after-action tips (“You tended to drift left when hovering: focus on correcting that next time”). Adaptive training algorithms could adjust scenario difficulty in real time: if a trainee is excelling, the AI could introduce new stressors (like wind gusts or an additional task), and if the trainee is struggling, it might slow things down or highlight essential information to prevent overload. Such intelligent tutoring systems (ITSs) have been used in other simulation domains and could be transformative for drone training, providing a personalized curriculum for each pilot. The AI could also analyze long-term progress data to suggest when a trainee is ready for certain real-life tasks or where more practice is needed, thereby optimizing the training program and reducing risks to real assets and infrastructure.
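A minimal version of such an adaptive-difficulty loop can be expressed in a few lines; the target success rate and thresholds below are invented for illustration rather than drawn from any deployed tutoring system.

```python
def adjust_difficulty(level, recent_scores, target=0.75, band=0.10):
    """Illustrative adaptive trainer: raise difficulty when the trainee consistently
    exceeds the target success rate, lower it when they fall well below it."""
    rate = sum(recent_scores) / len(recent_scores)
    if rate > target + band:
        level = min(10, level + 1)      # add stressors: wind, extra tasks, failures
    elif rate < target - band:
        level = max(1, level - 1)       # ease off to avoid cognitive overload
    return level

level = 4
for session in ([1, 1, 1, 1, 0], [1, 0, 0, 0, 0], [1, 1, 0, 1, 1]):  # 1 = task passed
    level = adjust_difficulty(level, session)
    print("next difficulty level:", level)
```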
6.4. Digital Twins for Real-Time Interaction
An important emerging concept in XR drone simulation is the use of digital twins. A digital twin is a highly accurate virtual model of a physical asset (e.g., a specific drone and its operating environment) that is kept in sync with its real-world counterpart in real time [
34]. In the context of drone training, a digital twin can mirror the state of an actual drone, its subsystems, and even the surrounding environment (terrain, obstacles, weather,
etc.), ensuring that any change in the real system is reflected in the simulation, and vice versa.
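In its simplest one-way form, such mirroring amounts to updating the twin whenever fresh telemetry arrives and flagging it as stale when the link drops, as in the following sketch (the field names and staleness threshold are assumptions, independent of any particular telemetry protocol).

```python
import time

class DroneTwin:
    """Sketch of a one-way digital twin: mirror the live drone's state from telemetry
    and flag the twin as stale if the link drops. Field names are illustrative."""
    def __init__(self, stale_after_s=1.0):
        self.state = {"position": None, "battery_pct": None, "timestamp": None}
        self.stale_after_s = stale_after_s

    def on_telemetry(self, position, battery_pct, timestamp=None):
        """Called whenever a telemetry packet arrives from the real drone."""
        self.state.update(position=position, battery_pct=battery_pct,
                          timestamp=timestamp or time.time())

    def is_stale(self):
        ts = self.state["timestamp"]
        return ts is None or (time.time() - ts) > self.stale_after_s

twin = DroneTwin()
twin.on_telemetry(position=(35.44, 139.64, 52.0), battery_pct=71)
print(twin.state, "stale:", twin.is_stale())
```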
This synchronization enables scenarios in which operators interact with a live-updating replica of ongoing missions. For instance, researchers have demonstrated that integrating digital twins with XR allows operators to “step into” an ongoing drone operation virtually: they can examine a live UAV’s telemetry and environment in an immersive 3D scene, trial adjustments or what-if maneuvers on the twin, and evaluate outcomes without putting the real drone at risk [
35]. High-fidelity twins thus facilitate real-time interaction and decision-making by allowing operators to visualize complex data (battery status, payload sensor feeds,
etc.) that can be laid over the XR environment, improving interpretation and responsiveness to evolving conditions.
Moreover, digital twins help to bridge the longstanding simulation-to-reality gap. They provide a safe testbed for new control strategies or algorithms under true-to-life conditions before deployment to physical drones. Early studies indicate that the fusion of digital twins and XR could transform UAV operations by enhancing situational awareness, collaboration, and predictive capabilities [
36]. For example, maintenance crews could interact with a drone fleet’s twin to practice troubleshooting in real time, or distributed teams might jointly monitor a virtual battlefield populated by twins of drones engaged in actual missions.
This real-time interplay between physical drones and their virtual surrogates greatly enhances the fidelity and applicability of XR-based training and mission rehearsal, laying the groundwork for broader institutional adoption.
6.5. Multi-User MR and Distributed Simulation Environments
As networking technology and XR collaboration tools advance, multi-user XR drone simulations are becoming increasingly feasible. These environments allow multiple trainees and instructors to share the same virtual or mixed-reality airspace, either co-located or connected remotely across facilities. In such scenarios, maintaining a consistent, synchronized environment for all participants is paramount: every trainee must see the same drone positions, telemetry, and events at the same time. This imposes strict requirements on networking infrastructure, often involving low-latency, high-bandwidth connections via dedicated local networks, edge servers, or 5G links. To avoid divergence (where one user’s simulation runs ahead or desynchronizes), systems employ state reconciliation, time synchronization protocols, and techniques like local prediction and dead-reckoning to hide network delays, while high-refresh-rate headsets maintain the illusion of a unified shared space.
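Dead reckoning, one of the latency-hiding techniques mentioned above, can be sketched as a clamped extrapolation of the last received state; the extrapolation horizon used below is an assumed value.

```python
def dead_reckon(last_pos, last_vel, last_time, now, max_extrapolation_s=0.3):
    """Extrapolate a remote drone's position from its last received state to mask
    network latency; clamp the horizon so stale data does not drift far. Values assumed."""
    dt = min(now - last_time, max_extrapolation_s)
    return tuple(p + v * dt for p, v in zip(last_pos, last_vel))

# Last update 120 ms ago: drone at (10, 5, 30) m moving at (4, 0, -1) m/s
print(dead_reckon((10.0, 5.0, 30.0), (4.0, 0.0, -1.0), last_time=0.00, now=0.12))
# ~ (10.48, 5.0, 29.88): the pose other participants render until the next update arrives
```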
These capabilities make distributed training exercises at scale possible. For example, two drone pilots might practice flying in the same virtual airspace, coordinating to avoid collisions; entire inter-agency teams (fire, police, medical,
etc.) could rehearse emergency response drills across different cities within a shared VR disaster scenario. Human-in-the-loop extensions are also conceivable: a trainee flying a physical drone in a test field could appear as a synchronized virtual object in another participant’s MR headset, blending live and virtual training elements. Early steps in this direction align with the notion of augmented virtuality, where real-world flight data feeds directly into XR simulations. Beyond professional training, such infrastructures could support competitive formats resembling e-sports, in which drone pilots collaborate or compete in virtual missions with scoring systems and leaderboards.
Another essential component is the After-Action Review (AAR). Advanced XR training platforms increasingly integrate AAR modules that record telemetry, user actions, and high-level events during the session. These logs allow instructors and teams to replay exercises, analyze decision-making and coordination, and automatically flag key performance indicators such as reaction time, communication clarity, or adherence to procedures. To enable this functionality, MR implementations must provide precise data logging and timestamping against a common clock, ensuring the fidelity of the replay when compared with the original session. Moreover, virtual content must be spatially aligned across users: co-located trainees need to see the same virtual drone in the exact same physical location (an attribute typically achieved through shared reference points or origin calibration). Networking solutions range from short-range wireless synchronization for co-located headsets to cloud-based relay servers for geographically distributed participants.
In summary, effective multi-user MR and distributed simulation requires a synergy of high-performance networking, precise synchronization, and robust session recording. When these components come together, XR enables safe yet realistic training environments that support real-time collaboration, inter-agency drills, live–virtual integration, and detailed after-action review. These developments point toward a future in which team-based drone operations can be practiced at scale, combining the flexibility of simulation with the authenticity of shared field exercises.
6.6. Toward Institutional Integration and Occupational Legitimacy
Currently, XR drone training is often a supplemental or experimental tool. In the future, we anticipate it becoming a formalized, integral part of certification and proficiency requirements. Aviation regulators and standards bodies are already examining how XR simulators can be qualified as Flight Simulation Training Devices (FSTDs) for drones [
37]. This evolution reflects a broader conceptual shift. As the boundaries between manual, cognitive, and digitally mediated labor continue to blur, VR and AR are expected to reshape occupational skill requirements across industries. Tanaka et al. (2022) [
38] argue that immersive technologies will transform not only training methods but also the qualifications and competencies demanded in various professions. XR simulation, by enhancing skills acquisition while eliminating real-world risk, may help to legitimize and professionalize drone piloting in both public and private sectors.
If guidelines and standards are established (in a similar way to how airplane simulators are certified), drone pilots might be allowed to log a certain number of simulator hours toward license currency, or even to take portions of practical tests in a simulator. This would greatly accelerate adoption in academies and industry. The military’s increasing reliance on XR—with evidence that it saves time and money while maintaining training outcomes—will likely push civilian adoption as well, as the technologies filter down. We may see XR training modules built into drone manufacturer offerings; for example, a high-end commercial drone might come with VR simulator software so that before any pilot flies the expensive aircraft, they must achieve proficiency with the drone’s digital twin via VR. Insurance companies could also drive this, by offering better rates to organizations that use XR training to maintain pilot skills (in the same way that flight simulators currently reduce incidents in aviation). Indeed, Foxtrot’s materials even cite insurance use cases, like using simulators to evaluate pilots’ skill levels for underwriting or to retrain pilots after an incident [
10].
6.7. Human Factors and the Safety Imperative
Recent findings suggest that pilots’ willingness to embrace AR and simulation technologies is a key determinant of successful institutional adoption. Schranz et al. (2025) [
39] found that XR systems’ perceived usefulness and ease of use significantly influenced acceptance among general aviation pilots, highlighting the importance of human factors in standardization efforts. In addition to adoption factors, XR-based simulation offers a promising avenue for addressing well-known human-factor risks, such as tunnel vision and narrowing attention during drone operations. Borowik et al. (2022) [
40] used mobile eye tracking with TV-drone pilots and found substantial gaze allocation to the controller display during visual line of sight (VLOS) shots, implying reduced peripheral awareness and potential safety risks. This emphasis on training is especially relevant given the high proportion of UAV incidents attributed to human factors. Rahmani and Weckman (2023) [
41] analyzed drone accidents and emphasized that improving pilot training—including simulation-based methods—is critical to mitigating errors linked to attention lapses, poor decision-making, and situational misjudgments.
6.8. Improved User Experience and Trust
Future XR simulators will place greater emphasis on the usability and comfort of the training experience. This includes lighter, more ergonomic headsets (some are eventually likely to resemble regular glasses), and features like eye tracking to allow more natural interactions (e.g., menus or interface elements that respond to where the pilot is looking). To address the trust issues noted with adaptive systems [
18], designers will incorporate transparent modes in which the trainee can understand or override the AI’s decisions in training scenarios. For instance, an adaptive AR interface might include a quick way to reveal hidden information if the user feels they need it, or a brief explanation of why it is hiding something (“hiding waypoint markers to reduce clutter. Press X to show”). Building trainee confidence in the XR system is key. We may even see the integration of biometrics: the simulator could monitor a pilot’s heart rate or stress levels and adjust difficulty or provide calming cues (useful in high-pressure scenario training to avoid cognitive overload). Another user-experience focus will be minimizing simulator sickness through higher display refresh rates (e.g., 120 Hz+), wider FOV, and software techniques, so that anyone can use the simulator comfortably for extended periods.
6.9. Integrating XR Simulation with Real Drones
A critical emerging challenge in the field of drone training lies in linking XR simulators with physical drone systems in real time. When a simulator interfaces with actual drone hardware or live flight data, synchronization, latency, and calibration become central concerns [
42]. For instance, mixed-reality setups may overlay virtual elements onto real drone operations (or vice versa), requiring precise alignment of coordinate frames and minimal delay between real and virtual sensor feeds. Any mis-registration (e.g., a virtual obstacle appearing spatially offset) or control lag can lead to operator confusion and undermine training validity.
Ensuring that the virtual environment responds to hardware inputs or telemetry with high fidelity demands rigorous hardware-in-the-loop testing and the careful tuning of update rates and network protocols. Safety protocols are indispensable when trainees directly control physical drones through XR interfaces. Previous research emphasizes that bridging this “simulation-to-reality” gap requires continuous validation—for example, comparing simulator outputs against real flight logs under identical inputs and iteratively refining the model until performance metrics (e.g., position drift, response to commands,
etc.) converge with real-world behavior.
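A basic instance of such validation is to replay identical control inputs in the simulator and compare the resulting trajectory against the real flight log, for example via a root-mean-square position error, as in this sketch:

```python
import math

def position_rmse(sim_traj, real_traj):
    """Root-mean-square position error between simulated and logged real trajectories
    sampled at the same timestamps; a simple convergence metric, shown as a sketch."""
    assert len(sim_traj) == len(real_traj)
    sq = [sum((s - r) ** 2 for s, r in zip(ps, pr)) for ps, pr in zip(sim_traj, real_traj)]
    return math.sqrt(sum(sq) / len(sq))

# Illustrative 3-sample trajectories (x, y, z in meters) under identical inputs
sim  = [(0, 0, 10), (1.0, 0.0, 10.1), (2.1, 0.0, 10.1)]
real = [(0, 0, 10), (1.1, 0.1, 10.0), (2.3, 0.1, 10.2)]
print(f"position drift: {position_rmse(sim, real):.2f} m")  # refine the model until this converges
```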
This integration remains an ongoing technical effort, but it lays the foundation for hybrid training paradigms in which live systems and XR environments converge.
6.10. AR-Enhanced Live Flight Training
Building on these integration challenges, one promising application is the use of AR in live drone operations. In this paradigm, a pilot flying a real drone wears AR glasses that overlay mission-relevant information directly onto their view. While not a simulation per se, this approach can be regarded as an extension of XR training: the same technology used to simulate obstacles in virtual settings can be applied to assist or challenge pilots during actual flights.
For example, during live training sessions, instructors may insert virtual obstacles that are visible only through the pilot’s AR display. The trainee must then maneuver the physical drone to avoid the “virtual” obstacle, creating a safe yet complex training challenge. Such an “AR obstacle course” enables authentic flight practice with both an added safety net and enhanced cognitive load. Early prototypes already exist — such as EdgyBees’ AR drone games and startup-driven experiments like DronOSS [
43]—and future AR headsets may make it commonplace for training schools to deploy holographic gates or hazards in real-world airspaces.
Research by Kim et al. (2020) [
9] supports this trajectory, showing that mixed-reality training environments for FPV drone flying—in which virtual gates and obstacles are overlaid onto the pilot’s real-world view—can improve spatial awareness, depth perception, and control accuracy. In addition, AR can provide heads-up data such as telemetry, navigation waypoints, or enhanced situational awareness cues, for example by highlighting power lines or designating no-fly zones in real time. Products such as Heliguy’s AirHUD [
44] and findings from a systematic review by Buchner et al. (2022) [
45] underscore that well-designed AR interfaces can reduce cognitive load and support effective information processing, demonstrating the broader feasibility of such overlays in operational contexts.
Ultimately, training for these AR tools should be embedded within XR simulation curricula. Projects like
SafeSpect [
19] illustrate this pipeline: AR heads-up display (HUD) concepts are first refined in immersive VR before being transitioned into live, safety-critical drone operations. By merging XR integration efforts with AR-based live training, the field moves closer to a cohesive ecosystem in which simulation and reality mutually reinforce pilot performance and safety.
6.11. Cross-Domain Simulation and Interoperability
Drone simulation may not remain independent of broader contexts. Future XR simulations could interoperate with other simulators. For example, an XR drone simulator could link with a virtual air traffic control simulator or a manned aircraft simulator for joint exercises. One could envision a comprehensive emergency response drill in which a drone pilot in VR is surveying a disaster site, while elsewhere a helicopter pilot in a simulator is in the same airspace virtually, and a command center team is managing both—all via interconnected simulation platforms. Achieving this requires common standards and protocols (possibly building on architectures like Distributed Interactive Simulation (DIS) or High Level Architecture (HLA), as used in military simulators). As interest in urban air mobility grows (with drones, air taxis,
etc. all potentially sharing skies), such integrated simulations could be important for developing traffic management procedures: XR-trained drone pilots might practice operating in virtual skies populated by many other craft. Interoperability also extends to using standard content: for instance, an airport model created for a manned aviation simulator could be re-used in a drone simulator, so that training environments are consistent across different pilot communities.
7. Conclusions
XR-based drone simulators represent a growing convergence of immersive media, robotics simulation, and pilot training. Our survey highlights the diversity of current systems, from modular research platforms like Flightmare and AirSim to commercial-grade simulators like SimFlight XR, with each tailored to specific control paradigms such as multirotor, fixed-wing, or FPV flight. A key trend is the widespread use of game engines (Unity, Unreal Engine) and physics libraries (e.g., PhysX) to enable realistic flight dynamics, wind effects, and sensor behaviors. Notably, AR-capable simulators now employ spatial mapping to allow virtual drones to interact with real-world objects, significantly expanding the possibilities for safe, on-site training.
The future of XR-based drone simulation appears extremely vibrant. Realism will continue to increase, closing the gap between simulation and real-world operations so that the difference becomes one of consequence, not of form. AR is likely to emerge as a dominant mode for many training tasks, offering superior engagement and situational relevance. Simultaneously, advances in AI and connectivity will make simulators smarter, more adaptive, and more collaborative—paving the way for on-demand, personalized training experiences across a range of mission types.
Nevertheless, several challenges remain to be addressed, including gaining regulatory acceptance, creating high-fidelity haptic feedback, and lowering the barrier to entry for novice users. However, as these technological and pedagogical fronts evolve, XR simulators are poised to become indispensable tools across the entire pilot journey—from initial orientation flights to advanced mission rehearsals, and even for lifelong skill maintenance.
The innovations pioneered in this field may also influence adjacent domains such as ground robotics, manned aviation, and industrial training. XR is not merely a visual enhancement, but a bridge between simulation and embodied cognition. It is a foundation upon which the next generation of drone training will be built.
Appendix A. Survey Methodology and Classification of Studies
In this survey, we adopted a systematic approach to identify and analyze relevant literature on XR-based drone simulation. First, we employed a broad search strategy across scholarly databases (including IEEE Xplore, ACM Digital Library, Scopus, and Google Scholar) using keywords such as “drone OR UAV”, “simulation OR training”, and “VR OR AR OR MR OR XR”. We also included specific platform names (e.g., “AirSim”, “Flight Simulator”, “Augmented Reality drone”) to ensure coverage of well-known systems.
We then applied inclusion and exclusion criteria: we included studies focusing on unmanned aircraft simulation with a significant XR component (virtual, augmented, or mixed reality) and oriented toward training or human–UAV interaction. We excluded papers that dealt solely with robotics simulation (with no human-in-the-loop XR aspect) or purely gaming applications without training relevance. After title and abstract screening, 47 papers remained; we reviewed these in full. Of those, we further excluded works that were purely visionary or had no technical evaluation, resulting in
N = 27 core publications that form the basis of this survey. The selection spans journal articles, conference papers, and technical reports published up to mid-2025.
Our process aligns with PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) guidelines for scoping reviews. We also cross-checked references of key papers and recent review articles [
2] to avoid missing influential works. This led to the addition of several preprints and very recent studies not captured in the database search.
Each included study was then classified by type: (a)
Platform/System Descriptions–papers introducing XR drone simulation platforms or tools, (b)
User Studies/Experiments–papers evaluating training outcomes or user performance in XR vs. non-XR settings, and (c)
Reviews/Analyses–papers providing higher-level insights (e.g., systematic reviews, meta-analyses, or trend analyses). This classification allowed us to organize the survey findings by thematic categories.
By clearly documenting this literature selection procedure, we aim to ensure transparency and reproducibility. Future researchers can trace how sources were chosen and can update the survey by applying the same criteria as the field evolves. Our methodical approach also gives confidence that the survey provides a comprehensive overview of the state of the art, capturing the important developments in XR-based drone simulation up to the present time.
Acknowledgments
The authors would like to thank Makoto Itoh (University of Tsukuba, ReAMo Project) for his valuable advice on references regarding the efficacy of drone simulators, Yuto Ikeda for insightful discussions on drone simulators in general, Wataru Ken Tanaka for valuable discussions on simulator development and market perspectives, and Tuong Cao and Hoan Van for providing practical implementation details and other technical contributions.
Ethics Statement
Not applicable. This article is a survey of existing literature on XR-based drone simulation and does not involve studies with human participants or animals.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Funding
This research received no external funding.
Declaration of Competing Interest
The author, Kenji Tanaka, is the CEO of Foxtrot Inc., a company active in the development and deployment of drone simulation systems. This relationship may be perceived as a potential conflict of interest.
References
1.
Somerville A, Lynar T, Joiner KF, Wild G. Use of Simulation for Pre-Training of Drone Pilots.
Drones 2024,
8, 640. doi:10.3390/drones8110640.
2.
Ross G, Gilbey A. Extended Reality (XR) Flight Simulators as an Adjunct to Traditional Flight Training Methods: A Scoping Review.
CEAS Aeronaut. J. 2023,
14, 799–815. doi:10.1007/s13272-023-00688-5.
3.
Albeaino G, Eiris R, Gheisari M, Issa RR. DroneSim: A VR-based Flight Training Simulator for Drone-Mediated Building Inspections.
Constr. Innov. 2022,
22, 831–848. doi:10.1108/CI-03-2021-0049.
4.
Mazlan MSM, Moketar NA, Kamalrudin M, Yusop N, Abdul Aziz A, Mat Zain NH, et al. The usability evaluation of the ARDroneSim application for augmented reality-based drone flight training simulator.
J. Electr. Syst. 2024,
20, 5825–5838.
5.
Nasir ANM, Arsat M, Noordin MK, Muhamad Sidek MA, Tarmidi MZ. Structured Teaching Using Drone Simulators for Students’ Confidence in Real Flight. In Methods and Applications for Modeling and Simulation of Complex Systems: Proc. AsiaSim 2023; Springer: Singapore, 2024; pp. 125–136. doi:10.1007/978-981-99-7240-1_10.
6.
Storz V. Why XR Will be the Future of Pilot Training? LinkedIn Commentary. 2025. Available online: https://www.linkedin.com/pulse/why-xr-future-pilot-training-valentin-storz-lsoyf/ (accessed on 20 September 2025).
7.
Shephard Media. I/ITSEC 2023: How VR and AR are reshaping the training capabilities for the Pentagon. News Article. 2023. Available online: https://shephardmedia.com (accessed on 20 September 2025).
8.
Milgram P, Takemura H, Utsumi A, Kishino F. Augmented reality: A class of displays on the reality-virtuality continuum. In Proceedings of Telemanipulator and Telepresence Technologies; SPIE: Bellingham, WA, USA, 1994; Volume 2351, pp. 282–292. doi:10.1117/12.197321.
9.
Kim D-H, Go Y-G, Choi S-M. An aerial mixed-reality environment for first-person-view drone flying.
Appl. Sci. 2020,
10, 5436. doi:10.3390/app10165436.
10.
Foxtrot Inc. SimFlight XR—Drone Flight Training with Head-Mounted Display. Product Page. 2025. Available online: https://foxtrot.biz/xr-drone-en/ (accessed on 20 July 2025).
11.
Song Y, Naji S, Kaufmann E, Loquercio A, Scaramuzza D. Flightmare: A flexible quadrotor simulator. In Proceedings of the Conference on Robot Learning (CoRL), virtual, 16–18 November 2021; pp. 1147–1157. doi:10.5555/3495724.3495738.
12.
Shah S, Dey D, Lovett C, Kapoor A. AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles. In Field and Service Robotics; Springer Proceedings in Advanced Robotics; Hutter M, Siegwart R, Eds.; Springer: Cham, Switzerland, 2018; Volume 5; pp. 621–635. doi:10.1007/978-3-319-67361-5_40.
13.
Real Drone Simulator. Real Drone Simulator—Professional Drone Flight Simulation Software. 2025. Available online: https://www.realdronesimulator.com (accessed on 21 June 2025).
14.
Inzpire Ltd. Operating Drones in Mixed Reality: How Inzpire Leverages Varjo for Next-Gen UAV Training. Varjo Case Study, n.d. Available online: https://varjo.com/case-studies/operating-drones-in-mixed-reality-how-inzpire-leverages-varjo-for-next-gen-uav-training/ (accessed on 20 September 2025).
15. Mairaj A, Baba AI, Javaid AY. Application specific drone simulators: Recent advances and challenges. Simul. Model. Pract. Theory 2019, 94, 100–117. doi:10.1016/j.simpat.2019.01.004.
16. DroneShield; Operator (XRG). DroneShield and Operator Team Up for Extended Reality Counter-UAS Training. C-UAS Hub Article, 26 November 2022. Available online: https://cuashub.com/en/content/droneshield-and-operator-team-up-for-extended-reality-counter-uas-training/ (accessed on 20 September 2025).
17. Ribeiro R, Ramos J, Safadinho D, Reis A, Rabadão C, Barroso J, et al. Web AR solution for UAV pilot training and usability testing. Sensors 2021, 21, 1456. doi:10.3390/s21041456.
18. Tsang OW, Peisen X, Garcia J, Jouffrais C. Seeing Safety: How Augmented Reality Could Transform Drone Inspections Forever. NUS Computing Feature, 4 June 2025. Available online: https://www.comp.nus.edu.sg/features/augmented-reality-transform-drone-inspections/ (accessed on 20 June 2025).
19. Xu P, Garcia J, Ooi WT, Jouffrais C. SafeSpect: Safety-first augmented reality heads-up display for drone inspections. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’25), Yokohama, Japan, 26 April–1 May 2025; pp. 1–15. doi:10.1145/3613904.3642899.
20. Burri M, Nikolic J, Gohl P, Schneider T, Rehder J, Omari S, et al. The EuRoC micro aerial vehicle datasets. Int. J. Robot. Res. 2016, 35, 1157–1163. doi:10.1177/0278364915620033.
21. Hassan F, Sunar N, Basri MA, Mahmud MS, Ishak MH, Ali MS. Modeling and simulation of complex systems. In Communications in Computer and Information Science; Springer Nature: Singapore, 2023; pp. 492–503. doi:10.1007/978-981-99-6995-1_38.
22. Cardona-Reyes H, Parra-Gonzalez E, Trujillo-Espinoza C, Villalba-Condori K. Task design in virtual reality environments for drone pilot training. In New Perspectives in Software Engineering; Springer: Cham, Switzerland, 2024; pp. 261–274. doi:10.1007/978-3-031-62739-7_19.
23. EdgyBees Ltd. Drone Prix AR—The first augmented reality game for DJI drones. DJI Developers Blog. 2017. Available online: https://developers.dji.com/newsroom/news/drone-prix-ar/ (accessed on 20 September 2025).
24. 0Space Inc. Drone Simulator VR. Scheduled for release on Meta Quest Store, June 2025. Available online: https://www.meta.com/quest/store/products/drone-simulator-vr/ (accessed on 20 September 2025).
25. Bevins A, Duncan B. Aerial Flight Paths for Communication: How Participants Perceive and Intend to Respond to Drone Movements. In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’21), Boulder, CO, USA, 8–11 March 2021; ACM: New York, NY, USA, 2021; pp. 16–23. doi:10.1145/3434073.3444652.
26. RC Pilot Trainer. RC Pilot Trainer on Meta Quest. 2025. Available online: https://www.meta.com/experiences/rc-pilot-trainer/8142055429181651/ (accessed on 21 June 2025).
27. Rowland D. PicaSim—Free flight simulator for R/C planes. 2020. Available online: https://www.rowlhouse.co.uk/PicaSim/ (accessed on 21 June 2025).
28. Aerosense Inc. Aerobo Wing: Fixed-Wing Autonomous Drone (AS-VT01). 2025. Available online: https://aerosense.co.jp/en/products/drone/as-vt01.html (accessed on 21 June 2025).
29. Bat Cave Games Ltd. VelociDrone FPV Simulator. Available online: https://www.velocidrone.com/ (accessed on 21 June 2025).
30. LuGus Studios. Liftoff: FPV Drone Racing Simulator. 2023. Available online: https://www.lugus-studios.be/portfolio/liftoff-drone-racing (accessed on 21 June 2025).
31. Ruiz-Medina JS, Maeng S, Tu N, Itoh M. Evaluating drone operator motor skills: A comparison between simulated and real environments. In Proceedings of the 23rd International Symposium on Aviation Psychology (ISAP), Corvallis, OR, USA, 27–30 May 2025.
32. Maeng S, Tu N, Nishida H, Itoh M. Analysis of effect of geographic information on decision-making under emergency conditions during BVLOS drone operation. Tech. J. Adv. Mobil. 2024, 5, 89–108. doi:10.34590/tjam.5.10_89.
33. Kaplan AD, Cruit J, Endsley M, Beers SM, Sawyer BD, Hancock PA. The effects of virtual reality, augmented reality, and mixed reality as training enhancement methods: A meta-analysis. Hum. Factors 2021, 63, 706–726. doi:10.1177/0018720820904229.
34. Grieves M, Vickers J. Digital Twin: Mitigating Unpredictable, Undesirable Emergent Behavior in Complex Systems. In Transdisciplinary Perspectives on Complex Systems; Kahlen F-J, Flumerfelt S, Alves A, Eds.; Springer: Cham, Switzerland, 2017; pp. 85–113. doi:10.1007/978-3-319-38756-7_4.
35. Costa C, Gomes E, Rodrigues N, Gonçalves A, Ribeiro R, Costa P, et al. Augmented reality mobile digital twin for unmanned aerial vehicle wildfire prevention. Virtual Real. 2025, 29, 71. doi:10.1007/s10055-025-01145-w.
36. Hossen MS, Gurses A, Sichitiu M, Guvenc I. Accelerating development in UAV network digital twins with a flexible simulation framework. arXiv 2025, arXiv:2503.07935.
37. U.S. Helicopter Safety Team (USHST). Flight Simulation Training Device (FSTD) with Extended Reality (XR) Systems—Guidance. 2023. Available online: https://ushst.org/flight-simulation-training-device-fstd-with-extended-reality-xr-systems/ (accessed on 20 September 2025).
38. Tanaka K, Tambe P. How Will VR and AR Impact Occupations? SSRN Electron. J. 2022. doi:10.2139/ssrn.4022827.
39. Rizvi SAQ, Rehman U, Cao S, Moncion B. Exploring technology acceptance of flight simulation training devices and augmented reality in general aviation pilot training. Sci. Rep. 2025, 15, 2302. doi:10.1038/s41598-025-85448-7.
40. Borowik G, Kożdoń-Dębecka M, Strzelecki S. Mutable Observation Used by Television Drone Pilots: Efficiency of Aerial Filming Regarding the Quality of Completed Shots. Electronics 2022, 11, 3881. doi:10.3390/electronics11233881.
41. Rahmani H, Weckman GR. Working under the Shadow of Drones: Investigating Occupational Safety Hazards among Commercial Drone Pilots. IISE Trans. Occup. Ergon. Hum. Factors 2024, 12, 55–67. doi:10.1080/24725838.2023.2251009.
42. Emami Y, Li K, Almeida L, Zou S, Ni W. On the Use of Immersive Digital Technologies for Designing and Operating UAVs. arXiv 2024, arXiv:2407.16288.
43. Brooks C. Startup DronOSS Sets Its Sights on Fewer Drone Crashes with Augmented Reality Pilot Training. Next Reality, 20 September 2019. Available online: https://next.reality.news/news/startup-dronoss-sets-its-sights-fewer-drone-crashes-with-augmented-reality-pilot-training-0206369/ (accessed on 21 June 2025).
44. Heliguy. AirHUD: Augmented Reality Drone Software (AR + VR). Heliguy Blog, 8 August 2023. Available online: https://www.heliguy.com/blogs/posts/heliguy-offers-airhud-augmented-reality-drone-software/ (accessed on 20 September 2025).
45. Buchner J, Buntins K, Kerres M. The impact of augmented reality on cognitive load and performance: A systematic review. J. Comput. Assist. Learn. 2022, 38, 285–303. doi:10.1111/jcal.12617.