Intuitive Command: An EMG-Based Approach for Pilot-Drone Interaction

Image: A pilot flying a virtual aircraft in a flight simulator.

Background: Fighter jet pilots operate under extreme conditions, managing complex tasks that demand intense concentration and precision. As defence technologies evolve, there are initiatives to deploy unmanned drone swarms that assist pilots with tasks such as electromagnetic warfare, target identification, and terrain surveying. While this integration offers potential advantages, it also introduces a communication challenge between the pilot and the swarm, especially given the pilot's existing workload. A traditional communication system would add to that burden, which motivates the exploration of hands-free, intuitive communication methods. Although initially designed for high-stakes aerial operations, such interfaces hold promise for a wide range of applications, from industrial machine operators to emergency response teams, wherever professionals need to communicate seamlessly with advanced systems through non-traditional interfaces.

Objective and Methodology: In collaboration with the Norwegian Defence Research Establishment (FFI), this project will pioneer the use of electromyography (EMG), recorded via an armband, as a communication interface between pilots and simulated drone swarms. By interpreting muscle activity, the project aims to allow pilots to command the swarm, for example directing it to a specific sector or objective, without physically interacting with any device. The primary challenge lies in distinguishing the muscle activity inherent to flying from intentional commands directed at the swarm. With access to a flight simulator and state-of-the-art myoelectric sensors, the work will focus on real-time data collection, training of machine learning models, and validation of the approach.
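The project description does not prescribe a specific sensor or processing pipeline, but a common starting point for armband EMG is sliding-window feature extraction ahead of classification. The sketch below assumes an 8-channel armband sampled at 200 Hz and uses per-channel root-mean-square amplitude as the feature; the constants and the placeholder signal are illustrative assumptions, not part of the project specification.

```python
import numpy as np

# Assumed setup (not given in the project description): an 8-channel EMG
# armband sampled at 200 Hz, segmented into 250 ms windows with 50% overlap.
SAMPLE_RATE_HZ = 200
WINDOW_SAMPLES = 50   # 250 ms at 200 Hz
STEP_SAMPLES = 25     # 50% overlap
N_CHANNELS = 8


def sliding_windows(emg: np.ndarray) -> np.ndarray:
    """Split a (samples, channels) EMG recording into overlapping windows."""
    starts = range(0, emg.shape[0] - WINDOW_SAMPLES + 1, STEP_SAMPLES)
    return np.stack([emg[s:s + WINDOW_SAMPLES] for s in starts])


def rms_features(windows: np.ndarray) -> np.ndarray:
    """Per-channel root-mean-square amplitude, a common baseline EMG feature."""
    return np.sqrt(np.mean(windows ** 2, axis=1))


# Example on a placeholder signal: one minute of noise standing in for real data.
recording = np.random.randn(60 * SAMPLE_RATE_HZ, N_CHANNELS)
features = rms_features(sliding_windows(recording))  # shape: (n_windows, 8)
```

Each window then becomes one sample for the classifier, so commands can be decoded continuously while the pilot flies.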

Data Validation and Significance: Ensuring the accuracy and reliability of the data is paramount. For this project, EMG will be recorded during a range of standard flight maneuvers, providing a baseline for differentiating normal flight muscle activity from intentional commands. Recording will take place in a controlled flight simulator environment at an FFI facility, allowing rigorous testing and iterative refinement of the system.
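One way such baseline recordings could be used, assuming each simulator session is segmented into windows labelled as either normal flight activity or one of the intentional commands, is to hold out entire sessions for validation so the classifier is never evaluated on windows from a maneuver it was trained on. The session names, label scheme, and scikit-learn split below are illustrative assumptions, not requirements stated by the project.

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

# Hypothetical layout: each simulator session yields a windowed feature matrix
# with per-window labels (0 = normal flight activity, 1 = command A, 2 = command B).
rng = np.random.default_rng(0)
sessions = {
    name: (rng.normal(size=(400, 8)), rng.integers(0, 3, size=400))
    for name in ("session_01", "session_02", "session_03")
}

X = np.concatenate([feats for feats, _ in sessions.values()])
y = np.concatenate([labels for _, labels in sessions.values()])
groups = np.concatenate([[name] * 400 for name in sessions])

# Hold out one whole session, so validation windows never come from a maneuver
# the classifier was trained on.
splitter = GroupShuffleSplit(n_splits=1, test_size=0.33, random_state=0)
train_idx, val_idx = next(splitter.split(X, y, groups=groups))
```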

Expected Outcomes: By the end of the project, the aim is to demonstrate a proof-of-concept EMG-based interface for rudimentary communication between a pilot and a drone swarm. The primary focus will be on distinguishing two specific EMG-activated inputs from the standard EMG signals generated during flight. This differentiation is crucial for feasibility: the interface must recognize intentional commands amidst regular muscle activity. The project is a foundational step; successful implementation could lead to richer interaction between pilots and auxiliary defence technologies, improving operational efficiency, safety, and mission success. Beyond aviation, such an intuitive communication method could change how operators in other sectors, from industrial settings to emergency response, interact with advanced technologies.
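For the two-command proof of concept, one plausible formulation (not prescribed by the project text) is a three-class classifier in which ordinary flight activity is its own class, combined with a confidence threshold so that ambiguous windows are treated as "no command". The logistic-regression model, threshold value, and placeholder data below are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder training data in the feature space sketched earlier:
# class 0 = ordinary flight muscle activity, 1 and 2 = the two intentional commands.
rng = np.random.default_rng(1)
X_train = rng.normal(size=(600, 8))
y_train = rng.integers(0, 3, size=600)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)


def decode_command(window_features: np.ndarray, threshold: float = 0.9) -> int:
    """Return 1 or 2 only when the classifier is confident; otherwise 0 (no command)."""
    probs = clf.predict_proba(window_features.reshape(1, -1))[0]
    best = int(np.argmax(probs))
    # A high threshold makes the interface ignore ambiguous windows, which is
    # preferable to issuing a spurious swarm command during normal flight.
    return best if best != 0 and probs[best] >= threshold else 0


print(decode_command(rng.normal(size=8)))
```

Biasing the decision toward "no command" reflects the operational constraint that a missed command is cheaper than a false trigger during flight.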

Tags: Flight simulator, Human-Computer Interaction, Machine Learning & Deep learning
Published Oct. 31, 2023 10:25 AM - Last modified Oct. 31, 2023 11:14 AM

Supervisor(s)

Scope (credits): 60