
Our Projects

Blazing Trails in Industrial Innovation:
Digital Twins Powering Swarm Robotics, Autonomous Navigation, Material Handling, and Beyond


Scalable Ecosystem for Automated Integration (SEAL) 

SEAL integrates a heterogeneous network of robots, including Fanuc and Yamaha systems, so they can collaborate on complex tasks in industrial settings. The framework provides interoperability among these disparate platforms, ensuring efficient collective operation. Robots learn high-precision tasks from a single human demonstration, with no prerequisite object-specific knowledge. Before deployment, task protocols are validated in a custom virtual simulation environment and then integrated into CIIR. The framework supports seamless communication and data exchange among the robots, improving operational efficiency and overall performance. Deep learning gives the network the capacity to adapt to evolving operational demands and environmental conditions; this adaptability, together with the system's scalable architecture, makes it a versatile solution for a broad spectrum of industrial requirements.


SLAM Swarm

Our project harnesses ROS 2 Humble to orchestrate a network of robots for comprehensive environmental mapping and localization. The main robot, equipped with a lidar and an Intel RealSense camera, uses the SLAM Toolbox for environment mapping and self-localization. AprilTags mounted on the peripheral robots allow them to be localized through the RealSense camera, and their poses are integrated into the generated map. This synergy enables precise task execution by the peripheral robots. Additionally, we employ the RTAB-Map library with the RealSense camera for 3D environmental scanning, further enriching our digital twin simulation with detailed spatial data.
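Conceptually, localizing a tagged peripheral robot in the map frame amounts to composing homogeneous transforms: the main robot's SLAM pose, the camera's mounting offset, and the AprilTag detection. The sketch below illustrates this composition in plain Python; all numeric values are illustrative, and in the real system they would come from SLAM Toolbox and the tag detector.

```python
# Sketch: peripheral-robot localization as a chain of 4x4 homogeneous
# transforms. Values are hypothetical; rotations are omitted for brevity.

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Homogeneous transform that is a pure translation."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

map_T_base = translation(2.0, 1.0, 0.0)   # main robot pose from SLAM
base_T_cam = translation(0.1, 0.0, 0.3)   # camera mount offset
cam_T_tag  = translation(0.5, -0.2, 0.0)  # AprilTag detection

# Compose: pose of the tagged peripheral robot in the map frame.
map_T_tag = mat_mul(mat_mul(map_T_base, base_T_cam), cam_T_tag)
position = tuple(round(map_T_tag[i][3], 6) for i in range(3))
print(position)  # (2.6, 0.8, 0.3)
```

In a ROS 2 system this composition is normally handled by tf2 rather than by hand; the sketch only shows the underlying arithmetic.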


Computer Vision-Aided Automated Mid-Flight Aerial Docking for Quadrotors

In this innovative endeavor, a secondary drone autonomously flies in close proximity to the primary drone, facilitating the seamless replacement of batteries mid-flight.

This project encompasses two key stages:

(a) Detection and Tracking of the Drone: Utilizing advanced sensing and tracking technologies, the secondary drone accurately detects and tracks the primary drone, ensuring precise coordination throughout the battery replacement process.

(b) Mid-Flight Aerial Docking: A pivotal aspect of the project, this stage involves the development of a sophisticated docking mechanism capable of handling the downwash exerted by the primary drone on the secondary drone during aerial docking. This ensures stable and secure connection between the two drones, enabling successful battery replacement without compromising flight stability or safety.
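A common way to realize the tracking stage is a proportional controller that drives the relative position error between the two drones toward zero, with velocity saturation so the approach stays stable under disturbances such as downwash. The gains and limits below are purely illustrative, not the project's actual controller.

```python
# Sketch: proportional tracking toward the docking point, with velocity
# saturation. kp and v_max are hypothetical tuning values.

def track(error_xyz, kp=0.8, v_max=1.5):
    """Return a clamped velocity command (m/s) from a position error (m)."""
    cmd = []
    for e in error_xyz:
        v = kp * e
        v = max(-v_max, min(v_max, v))  # saturate for a stable approach
        cmd.append(round(v, 6))
    return tuple(cmd)

# Example: primary drone is 1 m ahead, 0.5 m left, 3 m above.
print(track((1.0, 0.5, 3.0)))  # (0.8, 0.4, 1.5) -- vertical axis saturated
```

In practice the docking stage would layer additional compensation (e.g., feed-forward terms against downwash) on top of such a baseline controller.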

By tackling the challenges of mid-flight battery replacement head-on, our project paves the way for enhanced drone capabilities, extending their operational endurance and unlocking new possibilities for aerial missions requiring extended flight durations.


Real-World Robot Overlaid with a Digital Robot

The Augmented Reality Robot Control Interface merges physical robotics with digital augmentation, enabling seamless control and programming of robots in real time. Users can visualize a real-world robot overlaid with digital elements through augmented reality, utilizing sliders to manually adjust joint values and select specific joints for manipulation. With the option to activate a pre-programmed pick-and-place script and program the robot directly within HoloLens 2, this interface offers both flexibility and efficiency in controlling robotic actions. Whether for robotics enthusiasts, researchers, or educational purposes, this project provides an intuitive and immersive platform for enhancing productivity and innovation in robotics applications.
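Translating a slider value into a joint command typically involves clamping the request to the joint's limits before it is sent to the robot. The sketch below illustrates this step; the limit table is hypothetical, since real values would come from the robot's specification or URDF.

```python
# Sketch: mapping an AR slider value to a safe joint command by clamping
# to per-joint limits. The limits here are illustrative only.

JOINT_LIMITS = {  # radians, (lower, upper)
    "joint_1": (-3.14, 3.14),
    "joint_2": (-1.57, 1.57),
}

def slider_to_command(joint, slider_value):
    """Clamp a requested joint value to that joint's limits."""
    lo, hi = JOINT_LIMITS[joint]
    return max(lo, min(hi, slider_value))

print(slider_to_command("joint_2", 2.0))   # 1.57 (clamped to upper limit)
print(slider_to_command("joint_1", -1.0))  # -1.0 (within limits)
```

Clamping at the interface layer keeps manual slider input and pre-programmed scripts within the same safety envelope.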


By combining augmented reality visualization with manual joint adjustment and pre-programmed scripts, the interface streamlines robot control and programming: users gain precise control over robotic movements while still leveraging the efficiency of automation. In research, industry, and education alike, it serves as a versatile platform for real-time control and programming of robots.
