How can I assess agent importance in my MAS?
Understanding Individual Agent Importance in Multi-Agent System via Counterfactual Reasoning
This paper introduces EMAI, a method for explaining the importance of individual agents within a multi-agent reinforcement learning (MARL) system. It relies on counterfactual reasoning: an agent's importance is measured by how much the team reward changes when that agent's actions are replaced with random ones. The importance estimation is itself formulated as a MARL problem, in which "masking agents" are trained to select which target agents to randomize at each timestep.
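EMAI learns masking agents to decide when and which agents to randomize; as a rough illustration of the underlying counterfactual signal only, the sketch below estimates per-agent importance by randomizing one agent at a time over whole episodes and measuring the drop in average return. The `env`, `policies`, and the parallel multi-agent interface (`reset`, `step`, `action_space`) are assumed placeholders, not EMAI's actual implementation.

```python
import random
from statistics import mean

def rollout_return(env, policies, randomize_agent=None, episodes=10):
    """Average team return over several episodes.

    Assumes a simplified parallel multi-agent env:
      env.reset() -> {agent: obs}, env.step({agent: action}) -> (obs, rewards, done, info),
      env.action_space(agent).sample() for a random action.
    If randomize_agent is given, that agent's actions are replaced with random ones.
    """
    returns = []
    for _ in range(episodes):
        obs = env.reset()
        done, total = False, 0.0
        while not done:
            actions = {
                agent: (env.action_space(agent).sample()
                        if agent == randomize_agent
                        else policies[agent](obs[agent]))
                for agent in policies
            }
            obs, rewards, done, _ = env.step(actions)
            total += sum(rewards.values())  # shared team reward, summed over agents
        returns.append(total)
    return mean(returns)

def agent_importance(env, policies, episodes=10):
    """Importance of each agent = drop in average return when its actions are randomized."""
    baseline = rollout_return(env, policies, episodes=episodes)
    return {
        agent: baseline - rollout_return(env, policies, randomize_agent=agent, episodes=episodes)
        for agent in policies
    }
```

Randomizing one agent per rollout like this scales linearly with the number of agents and only yields episode-level scores; EMAI instead trains masking agents so that importance can be attributed at individual timesteps.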
For LLM-based multi-agent systems, EMAI offers a way to quantify the contribution of each LLM agent in collaborative tasks. By identifying the critical agents, developers can surface emergent strategies, debug unexpected behavior, and improve overall system performance through targeted interventions such as patching or retraining specific agents. Because EMAI is black-box, it requires no access to the LLMs' internal workings, which makes it especially practical in this setting.