Developed a real-time object deformation measurement system for continuum robots as part of the Mitacs Globalink Research Internship in the Robotics, Mechatronics, and Automation Laboratory (RMAL) at Toronto Metropolitan University.
Synthetic Data Generation
Developed a realistic lab environment in Blender that mirrors the physical setup at RMAL, Toronto Metropolitan University, and accurately simulates the continuum arm's movements. Over 5,000 synthetic images of the continuum robot were rendered in this photorealistic scene, forming the training dataset for the detection algorithms. The scene was designed for realistic rendering conditions and varied to cover a diverse set of scenarios. Communication between MATLAB and Blender was established through socket programming, enabling real-time data exchange and keeping the simulated environment synchronized with the control algorithms.
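As a rough illustration of how such a MATLAB–Blender bridge can be wired up, the sketch below shows a Blender-side Python script that listens on a TCP socket, receives bending angles from an external client (MATLAB's tcpclient could play that role), poses a hypothetical continuum-arm armature, and renders a frame. The host, port, object and bone names, and output path are illustrative assumptions, not the actual RMAL configuration.

```python
# Blender-side bridge sketch: receives bending angles over TCP and renders a frame.
# Run inside Blender. Object, bone, host, port, and path names are placeholders,
# not the actual RMAL setup.
import math
import socket

import bpy

HOST, PORT = "127.0.0.1", 5005          # assumed local endpoint for the MATLAB client
ARM_OBJECT = "ContinuumArm"             # hypothetical armature name in the .blend file
OUTPUT_DIR = "//renders/"               # Blender-relative output folder


def apply_pose(angles_deg):
    """Bend each segment bone of the (hypothetical) continuum-arm armature."""
    arm = bpy.data.objects[ARM_OBJECT]
    for i, angle in enumerate(angles_deg, start=1):
        bone = arm.pose.bones[f"segment_{i}"]
        bone.rotation_mode = "XYZ"
        bone.rotation_euler = (math.radians(angle), 0.0, 0.0)


def render_frame(index):
    """Render the current scene to a numbered still image."""
    scene = bpy.context.scene
    scene.render.filepath = f"{OUTPUT_DIR}frame_{index:05d}.png"
    bpy.ops.render.render(write_still=True)


def serve():
    """Accept one client (e.g. MATLAB tcpclient) sending 'a1,a2,a3\n' per pose."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as stream:
            for index, line in enumerate(stream):
                angles = [float(v) for v in line.strip().split(",") if v]
                if not angles:
                    break                # an empty line ends the session
                apply_pose(angles)
                render_frame(index)
                conn.sendall(b"done\n")  # simple acknowledgement back to the client


if __name__ == "__main__":
    serve()
```

In a setup like this, MATLAB drives the loop: it sends a pose, waits for the acknowledgement, and then reads back or processes the rendered frame before issuing the next command.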
Detection & Segmentation
To achieve precise tracking of the continuum arm’s shape and movements, a YOLOv8 model was trained on the synthetic dataset. The synthetic images were annotated with labels for the continuum arm’s segments, allowing the model to learn the arm’s distinct shapes and orientations. Using YOLOv8’s detection and instance segmentation capabilities, the trained model performed real-time segmentation of the arm, accurately identifying its shape and position across a range of simulated scenarios.
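A minimal sketch of this training and inference flow with the Ultralytics API is shown below; the dataset YAML name, model variant, image path, and confidence threshold are placeholders rather than the exact configuration used in the project.

```python
# Minimal YOLOv8 segmentation workflow sketch (Ultralytics API).
# Dataset YAML, model size, and image paths are illustrative placeholders.
from ultralytics import YOLO

# Start from a pretrained segmentation checkpoint and fine-tune it on the
# annotated synthetic images of the continuum arm.
model = YOLO("yolov8n-seg.pt")
model.train(
    data="continuum_arm.yaml",  # assumed dataset config: image paths + class names
    epochs=100,
    imgsz=640,
)

# Run segmentation on a new frame; each result carries boxes and masks for
# the detected arm segments.
results = model.predict(source="renders/frame_00001.png", conf=0.5)
for result in results:
    if result.masks is not None:
        print(result.boxes.cls.tolist(), result.masks.xy[0].shape)
```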
This accuracy was achieved through iterative training, hyperparameter tuning, and data augmentation. Precise segmentation was essential for accurate control of the arm's movements: it enabled detailed analysis of each pose and directly informed the fine-tuning of the control algorithms.
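For reference, the Ultralytics trainer exposes augmentation and optimizer hyperparameters directly as training arguments, which is one way such iterative adjustments can be made. The specific values below are illustrative assumptions, not the settings used in the project.

```python
# Illustrative hyperparameter and augmentation overrides for a YOLOv8-seg run;
# the values are assumptions for demonstration, not the project's settings.
from ultralytics import YOLO

model = YOLO("yolov8n-seg.pt")
model.train(
    data="continuum_arm.yaml",  # assumed dataset config
    epochs=150,
    imgsz=640,
    lr0=0.005,       # initial learning rate
    degrees=10.0,    # random rotation range (degrees)
    translate=0.1,   # random translation fraction
    scale=0.3,       # random scaling gain
    fliplr=0.5,      # horizontal flip probability
    hsv_v=0.4,       # brightness jitter
)
```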
The combination of synthetic data and deep learning led to a 20% improvement in shape control accuracy, with error margins reduced to less than 5 degrees, enabling finer control adjustments and improving overall reliability. The automated data generation process in Blender also streamlined testing and made the pipeline straightforward to scale. This structured approach to synthetic data generation offers a reusable framework for robotics, showing how simulated environments combined with deep learning can improve precision and responsiveness in real-world applications.