Tuesday, July 23, 2024

Bird-Like Drone For Unseen Environment Navigation


Researchers at the Massachusetts Institute of Technology have developed brain-inspired flight navigation agents for vision-based tasks in new and complex settings.

Makram Chahine, a PhD student in electrical engineering and computer science and an MIT CSAIL affiliate, leads a drone used to test liquid neural networks. Credit: Mike Grimmett/MIT CSAIL

Amidst the boundless, sprawling skies that were once ruled by feathered creatures, a novel generation of aviators has emerged. These airborne pioneers are drones. However, the drones of today are ordinary flying machines that buzz around like mechanical bees.

Researchers at Massachusetts Institute of Technology’s (MIT’s) Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed robust flight navigation agents for vision-based tasks in new and complex settings, inspired by organic brain adaptability. Liquid neural networks excel in decision-making for navigation in diverse domains, surpassing state-of-the-art models, and could enhance drone applications such as search and rescue, delivery, and wildlife monitoring in unknown environments.


The researchers see potential for learning-based control to tackle the problem of training in one environment and deploying in another without additional training. Thanks to this causality-based adaptability, a drone that learns to locate an object in a summer forest can transfer that skill to winter or urban settings, and to varied tasks such as seeking and following. Deep learning models typically struggle to capture such causal structure, which hinders their ability to adapt to new environments and is especially problematic for resource-limited embedded systems like drones. Liquid networks offer a promising solution. The team trained their system on data from a human pilot to test how navigation skills transfer to new environments under changing conditions. Unlike those of traditional neural networks, the parameters of liquid neural networks can adapt over time, making them more robust to unexpected or noisy data.
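The adaptability described above comes from liquid time-constant (LTC) neurons, whose effective time constants change with the input rather than staying fixed after training. The sketch below is a minimal, illustrative Euler-integration toy of that idea, not the paper's implementation; all variable names, dimensions, and parameter values are assumptions chosen for clarity.

```python
import numpy as np

def ltc_step(x, I, dt, W_in, W_rec, b, tau, A):
    """One Euler step of a toy liquid time-constant (LTC) cell.

    The sigmoid gate f depends on the current input I, so the
    cell's effective time constant varies with the data stream --
    the 'liquid' behaviour that lets the dynamics adapt over time.
    """
    f = 1.0 / (1.0 + np.exp(-(W_in @ I + W_rec @ x + b)))  # input-dependent gate
    dx = -x / tau + f * (A - x)  # leak toward 0, gated pull toward A
    return x + dt * dx

# Toy usage: 4 hidden units driven by a 2-dimensional input signal.
rng = np.random.default_rng(0)
x = np.zeros(4)
W_in = rng.normal(size=(4, 2))
W_rec = rng.normal(size=(4, 4))
b = np.zeros(4)
tau = np.full(4, 0.5)
A = np.ones(4)
for t in range(100):
    I = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])
    x = ltc_step(x, I, dt=0.05, W_in=W_in, W_rec=W_rec, b=b, tau=tau, A=A)
print(x.shape)
```

Because the gate `f` multiplies the drive toward `A`, an input change reshapes the state dynamics immediately, which is what distinguishes this family from networks whose behaviour is frozen once training ends.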

The drones underwent various closed-loop control experiments, including range and stress tests, target rotation and occlusion, and dynamic tracking of moving targets. They also executed multi-step loops between objects in new environments, outperforming other advanced models. The team believes that the ability to learn from limited expert data and generalize to new environments can make autonomous drone deployment more efficient, cost-effective, and reliable. Liquid neural networks could empower autonomous air mobility drones for applications such as environmental monitoring, package delivery, autonomous vehicles, and robotic assistants. The experimental setup also serves as a controlled testbed for probing how deep learning systems reason, since robust learning and performance on out-of-distribution tasks are crucial for deploying machine learning and autonomous robots in critical societal applications.

The liquid neural networks, a brain-inspired paradigm developed by the researchers, show remarkable performance and could potentially enhance reliability, robustness, and efficiency of AI and robotic systems if the findings are replicated.

Reference: Makram Chahine et al., "Robust flight navigation out of distribution with liquid neural networks," Science Robotics (2023). DOI: 10.1126/scirobotics.adc8892

Nidhi Agarwal
Nidhi Agarwal is a journalist at EFY. She is an Electronics and Communication Engineer with over five years of academic experience. Her expertise lies in working with development boards and IoT cloud. She enjoys writing as it enables her to share her knowledge and insights related to electronics, with like-minded techies.

