Can AI improve dog-handler teamwork in search and rescue?
KHAIT: K-9 Handler Artificial Intelligence Teaming for Collaborative Sensemaking
KHAIT (K-9 Handler Artificial Intelligence Teaming) uses AI and augmented reality (AR) to improve communication between search and rescue dogs and their handlers. A camera on the dog's harness, powered by an NVIDIA Jetson Orin Nano running YOLOv8l for object detection, streams video to the handler's AR headset (Microsoft HoloLens 2). This allows the handler to see what the dog sees, including potential survivors or hazards identified by the AI. The system also tracks the dog's location and shares it with the handler.
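To make the pipeline concrete, here is a minimal sketch of the harness-side detection loop, assuming the Ultralytics YOLOv8 Python API and an OpenCV-readable camera. The `send_to_headset` transport is a hypothetical stub; the summary above does not specify how frames reach the HoloLens 2.

```python
# Harness-side loop: capture a frame, run YOLOv8l on the Jetson, and
# forward the annotated frame toward the handler's AR headset.
import cv2
from ultralytics import YOLO

model = YOLO("yolov8l.pt")   # the YOLOv8 variant named in the summary
cap = cv2.VideoCapture(0)    # harness-mounted camera

def send_to_headset(frame) -> None:
    """Hypothetical transport stub: push a frame to the HoloLens 2,
    e.g. JPEG-encoded over a low-latency socket (an assumption)."""
    ...

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)  # on-device object detection
    annotated = results[0].plot()          # draw boxes around detected objects
    send_to_headset(annotated)

cap.release()
```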
While not a multi-agent system in the traditional sense, KHAIT demonstrates a human-agent teaming approach relevant to LLM-based multi-agent systems. It shows how real-time sensor data, AI processing (here object detection, which could be replaced with richer LLM-based scene understanding), and AR interfaces can be combined for human-agent collaboration. The study also demonstrates the value of edge computing for real-time performance in the field and examines the usability challenges of such systems, offering insights for future LLM-powered multi-agent applications. The canine's enhanced perception, interpreted by the AI, is analogous to an agent reporting information to a human user, a common pattern in multi-agent systems.
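As an illustration of that agent-to-human pattern, the sketch below turns detection output into a structured update for the handler, with a stand-in for the LLM-based scene understanding the paragraph mentions. Everything here (`HandlerUpdate`, `summarize_scene`, `build_update`) is hypothetical and not part of KHAIT.

```python
# Hypothetical agent-to-human reporting pattern: detections plus the dog's
# position become one structured message for the handler.
from dataclasses import dataclass

@dataclass
class HandlerUpdate:
    labels: list[str]              # object classes detected in the frame
    location: tuple[float, float]  # dog's position as (lat, lon)
    summary: str                   # natural-language line for the handler

def summarize_scene(labels: list[str]) -> str:
    """Stand-in for an LLM scene description (an assumption, not in KHAIT)."""
    return f"Detected: {', '.join(labels)}." if labels else "Nothing notable detected."

def build_update(result, location: tuple[float, float]) -> HandlerUpdate:
    # Map YOLO class indices to human-readable names via the model's label map.
    labels = [result.names[int(c)] for c in result.boxes.cls]
    return HandlerUpdate(labels, location, summarize_scene(labels))
```

In an LLM-based variant, `summarize_scene` would be replaced by a model call that reasons over the detections, the dog's trajectory, and mission context before composing the handler's update.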