Purdue University’s Somali Chaterji has developed AGILE3D, a groundbreaking 3D detection system that enhances real-time perception for self-driving vehicles and other autonomous technologies.
A team at Purdue University, led by Indian American researcher Somali Chaterji, has unveiled a 3D detection system that could significantly impact autonomous vehicles, industrial robotics, delivery robots, and drones. The system, known as AGILE3D, is currently patent-pending and is designed to outperform traditional 3D lidar perception pipelines, particularly under resource contention.
“AGILE3D is the first adaptive, contention- and content-aware 3D object detection system specifically tailored for embedded GPUs, or graphics processing units,” explained Chaterji, who serves as an associate professor of agricultural and biological engineering in Purdue’s College of Agriculture and College of Engineering. She also holds a courtesy appointment in the Elmore Family School of Electrical and Computer Engineering.
The AGILE3D system can dynamically adjust its detection strategies based on real-time hardware constraints and varying input data. This adaptability is crucial for applications that require rapid 3D perception while operating within the limited computational resources of onboard systems.
Research findings presented at prestigious conferences, including the Conference on Neural Information Processing Systems (NeurIPS), the European Conference on Computer Systems (EuroSys), and the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), indicate that AGILE3D meets stringent latency objectives. It delivers an accuracy improvement of over 3% compared to adaptive controllers and up to 7% over commonly used static 3D detectors.
Chaterji emphasized the broad applicability of AGILE3D, stating that it is particularly well-suited for autonomous driving, where real-time processing of lidar frames is essential for safety. “Beyond cars, AGILE3D can enhance the performance of delivery robots, drones, industrial and mobile robotics, as well as augmented reality and virtual reality applications,” she noted. “This is especially important in fields like digital agriculture and forestry, where platforms rely on embedded GPUs and require predictable latency for smoother and safer operations.”
As multiple onboard workloads—such as perception, tracking, planning, and in-cabin infotainment—compete for GPU resources, maintaining performance becomes increasingly challenging. Chaterji explained that resource contention arises when these various processes share the same embedded GPU and memory system simultaneously. An example of this is a ride-hailing robotaxi, where camera perception, lidar processing, tracking, mapping, and planning must all function concurrently.
One of the primary limitations of 3D lidar technology is its update rate, which dictates how frequently the sensor can provide a new point cloud frame, essentially a fresh 3D snapshot of the surrounding environment; the detector must finish processing each frame before the next one arrives. AGILE3D addresses this challenge by employing two coordinated layers: a multibranch execution framework (MEF) and a contention- and content-aware reinforcement learning (CARL) controller. Together, these components maintain high accuracy under varying levels of hardware contention and latency budgets ranging from 100 to 500 milliseconds.
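The interplay between the two layers can be pictured as a budget-aware branch selector: a set of detector configurations with different speed-accuracy trade-offs, and a controller that picks the most accurate one that still fits the current latency budget. The sketch below is a hypothetical illustration only; the branch names, latency figures, accuracy scores, and the linear contention model are invented for this example and are not taken from the actual AGILE3D system.

```python
# Hypothetical sketch of a contention- and content-aware branch selector,
# loosely in the spirit of AGILE3D's CARL controller over MEF branches.
# All names and numbers here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Branch:
    name: str
    base_latency_ms: float  # measured latency on an idle GPU
    accuracy: float         # relative accuracy score in [0, 1]


# Candidate detector configurations: coarser settings run faster
# but detect less accurately (values invented for illustration).
BRANCHES = [
    Branch("dense_voxel",  220.0, 0.95),
    Branch("medium_voxel", 140.0, 0.90),
    Branch("coarse_voxel",  80.0, 0.84),
    Branch("pillar_lite",   45.0, 0.76),
]


def estimate_latency(branch: Branch, contention: float) -> float:
    """Scale idle latency by a simple contention factor, where 0 means
    an idle GPU and 1 means a heavily shared one. A real controller
    would learn this mapping rather than assume it is linear."""
    return branch.base_latency_ms * (1.0 + 2.0 * contention)


def select_branch(latency_budget_ms: float, contention: float) -> Branch:
    """Pick the most accurate branch whose predicted latency fits the
    budget; fall back to the fastest branch if none fits."""
    feasible = [b for b in BRANCHES
                if estimate_latency(b, contention) <= latency_budget_ms]
    if feasible:
        return max(feasible, key=lambda b: b.accuracy)
    return min(BRANCHES, key=lambda b: b.base_latency_ms)


# Example: a 200 ms budget on a moderately contended GPU.
# dense and medium branches are predicted to miss the deadline,
# so the controller falls back to the coarse-voxel branch.
choice = select_branch(200.0, contention=0.5)
print(choice.name)  # -> coarse_voxel
```

The key design point this sketch captures is that the decision depends on both the latency budget and the measured contention: the same 200 ms budget would admit a more accurate branch on an idle GPU.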
Chaterji and her team are continuing to develop AGILE3D to facilitate dense scene understanding on onboard computers, ensuring that 3D semantic segmentation can operate reliably within tight compute and memory constraints. Funding for this project has been provided through Chaterji’s National Science Foundation CAREER grant, as well as a separate NSF grant for their CHORUS center.
Chaterji holds a PhD in biomedical engineering from Purdue University, where she received several accolades, including the Chorafas International Award and the College of Engineering Best Dissertation Award in 2010. She completed a postdoctoral fellowship in the Department of Biomedical Engineering at the University of Texas at Austin and has been a scientific advisor to the university's IC2 Institute since 2014. In 2016, she was honored with Purdue's Seed-for-Success Award for securing a research grant exceeding $1 million.
According to a media release from Purdue University, the development of AGILE3D marks a significant advancement in autonomous technology, promising to enhance the safety and efficiency of applications that rely on real-time 3D perception.

