
Anduril, the defense tech company co-founded by Palmer Luckey, has given a first public look at its EagleEye headset system, developed alongside Meta for U.S. military applications, according to Road to VR. The goal is to deliver real-time, fused visual and sensor overlays that help soldiers see and act more decisively.
In a video shared on social media, Anduril shows EagleEye in action, stitching multiple data streams into a soldier’s HUD. The display includes a mini-map, toggles between thermal and low-light vision, overhead drone perspectives, and AI tracking that highlights friendly units even when they’re partially obscured by terrain or obstacles. The system uses Anduril’s existing Lattice battlefield software to ingest inputs, including RF signature detection, environmental sensors, rearview cameras, and alerts, all of which feed the augmented view.
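To make the fusion idea concrete, here is a minimal sketch of how heterogeneous sensor readings might be collapsed into a prioritized overlay list for a HUD. All names (`Overlay`, `fuse`, the `kind` categories) are illustrative assumptions, not Anduril's actual Lattice API.

```python
# Hypothetical sketch: prioritizing mixed sensor readings for a HUD.
# Names and categories are assumptions, not the real Lattice interface.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Overlay:
    priority: int                        # lower = drawn more prominently
    label: str = field(compare=False)
    source: str = field(compare=False)   # e.g. "alert", "rf", "drone", "env"

def fuse(readings: list[dict]) -> list[Overlay]:
    """Collapse raw sensor readings into a priority-ordered overlay list."""
    # Alert-class readings outrank passive annotations; unknown kinds sink.
    rank = {"alert": 0, "rf": 1, "drone": 2, "env": 3}
    heap = [Overlay(rank.get(r["kind"], 9), r["label"], r["kind"])
            for r in readings]
    heapq.heapify(heap)
    return [heapq.heappop(heap) for _ in range(len(heap))]
```

The design choice here mirrors what the video implies: time-critical alerts surface first, while ambient context (environment, drone overwatch) stays lower in the stack.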
The hardware consists of AR glasses that can be fitted with a removable shroud for bright conditions, improving overlay visibility when ambient light is strong. Compared with past military XR efforts, such as Microsoft’s IVAS project, Anduril is aiming for a more seamless, soldier-centric interface. EagleEye is reportedly competing for U.S. Army contracts aimed at overhauling AR systems for combat use.
One particularly striking design choice: EagleEye’s HUD borrows conventions from modern video games, such as mini-maps, waypoints, and toggles, because younger soldiers are already familiar with these interaction models. That familiarity reduces the learning curve and may speed adoption. The article suggests that such XR systems might eventually power not just on-site soldiers but remote telepresence, autonomously piloted systems, or robotic units operated from afar.
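The mini-map convention mentioned above has a well-understood geometry behind it: rotate a world-relative offset by the wearer's heading so "up" means "forward," then scale it into map pixels. The sketch below shows that projection under assumed parameters (a 200 m radius, 128 px map); nothing here is taken from the EagleEye system itself.

```python
# Illustrative game-style mini-map projection. Parameters (radius, map size)
# and the rotation convention are assumptions, not EagleEye specifics.
import math

def to_minimap(dx: float, dy: float, heading_deg: float,
               radius_m: float = 200.0, size_px: int = 128) -> tuple[int, int]:
    """Map a world-relative offset (meters east, meters north) to mini-map
    pixel coordinates, with the wearer at the center facing 'up'."""
    h = math.radians(heading_deg)
    # Rotate so the wearer's facing direction points up on the map.
    rx = dx * math.cos(h) - dy * math.sin(h)
    ry = dx * math.sin(h) + dy * math.cos(h)
    # Clamp to the map radius, then scale to pixels (y axis points down).
    scale = (size_px / 2) / radius_m
    px = int(size_px / 2 + max(-radius_m, min(radius_m, rx)) * scale)
    py = int(size_px / 2 - max(-radius_m, min(radius_m, ry)) * scale)
    return px, py
```

A target 100 m directly ahead of a north-facing wearer lands halfway up the map; the same math handles waypoints, friendly markers, and drone positions alike, which is why the convention transfers so cleanly from games.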
EagleEye marks a significant step: not just wearable AR, but battlefield-aware XR that fuses sensor data, AI, and spatial context, integrating the digital and physical in real time for the soldier in the field.