January 1, 2021

INDUSTRIAL ROBOTICS

Frontier Research Overview

How can we enable different robots to work together and understand our needs in a practical way, such that users can transfer human skills to them or teach them to perform multiple complex industrial tasks?

To address this question, we focus on developing adaptive industrial robot control with learning and adaptation. This allows different industrial robots (such as manipulators, mobile robots, and drones) to perform the complex action sequences of industrial tasks and to collaborate intelligently on tasks in industrial factories that a single robot cannot accomplish, such as object transportation and logistics.


Research Highlights

2024

An Integrated Vision-based Robotic Arm Control Framework for Imitation Learning and Online Adaptive Robot Manipulation

Abstract:
This video demonstration paper presents an integrated vision-based control framework for imitation learning and adaptive manipulation control of robot arms. It addresses the challenges of programming robot skills for industrial applications. The results illustrate that the framework enables our robotic arm system to passively observe user demonstrations and imitate the demonstrated (complex) trajectories in pick-and-place tasks. Furthermore, the robot can automatically adapt its motion on the fly to a change in the target position (placement) and avoid an obstacle.

A. Harnkhamen, T. Rassameecharoenchai, K. Rothomphiwat and P. Manoonpong, “An Integrated Vision-Based Robotic Arm Control Framework for Imitation Learning and Online Adaptive Robot Manipulation,” 2024 IEEE/SICE International Symposium on System Integration (SII), Ha Long, Vietnam, 2024, pp. 1185-1186, doi: 10.1109/SII58957.2024.10417706.
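
The following Python sketch is a hedged illustration of the "adapt on the fly" behaviour described in the abstract, not the controller from the paper: it combines a goal-attraction term with a simple repulsive term around a detected obstacle to produce a Cartesian end-effector velocity command. The function name, gains, and influence radius are illustrative assumptions.

```python
import numpy as np

def adaptive_velocity(ee_pos, goal_pos, obstacle_pos,
                      k_goal=1.5, k_obs=0.8, influence_radius=0.15):
    """Cartesian velocity command that tracks a (possibly moving) goal while
    being pushed away from a detected obstacle (illustrative sketch only)."""
    ee_pos, goal_pos, obstacle_pos = map(np.asarray, (ee_pos, goal_pos, obstacle_pos))
    v = k_goal * (goal_pos - ee_pos)                 # attraction toward the current target
    d = np.linalg.norm(ee_pos - obstacle_pos)
    if 0.0 < d < influence_radius:                   # repulsion only inside the influence zone
        v += k_obs * (1.0 / d - 1.0 / influence_radius) * (ee_pos - obstacle_pos) / d
    return v

# In a vision-based framework like the one above, goal_pos and obstacle_pos
# would be updated every control cycle from camera observations; here they
# are fixed example values.
print(adaptive_velocity([0.3, 0.0, 0.2], [0.5, 0.1, 0.2], [0.4, 0.05, 0.2]))
```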


2023

Imitation Learning with Dynamic Movement Primitives and Temporal and Kernel Adaptation for Practical Robot Programming and Online Adaptive Robot Manipulation

Abstract:
Robot programming by demonstration (PbD) has become a popular technique for implementing robot skills in recent years. A typical PbD technique relies on kinesthetic guidance, a teach pendant, or teleoperation to record and replay demonstrated trajectories. Although this facilitates demonstrations, such a technique can be time-consuming and inconvenient when applied to industrial tasks that require complex robot actions or a sequence of manipulation actions. An alternative PbD technique is based on optically capturing motions demonstrated by the user with his or her own body. During robot execution, if a record-and-replay strategy is applied without adaptability, the robot may fail to accomplish the task when the real situation changes.

To illustrate this PbD issue and demonstrate our solution, the video presents the typical PbD approach, followed by our approach (imitation learning with dynamic movement primitives and temporal and kernel adaptation). Our approach can automatically program a robot by optically capturing user motions. It also enables the robot to adapt its motion online to follow the captured motions and to deal with target position changes in tasks like pick-and-place. The performance of our approach is illustrated by teaching a robot a complex trajectory for picking and placing a cup on a tray. During execution, the robot can adapt its motion online to place the cup while the tray is moving. In a nutshell, the video shows a practical way to program robot skills with adaptability.

Harnkhamen, A.; Rothomphiwat, K.; Manoonpong, P. (2023) Imitation Learning with Dynamic Movement Primitives and Temporal and Kernel Adaptation for Practical Robot Programming and Online Adaptive Robot Manipulation, Stand-alone video, 2023 IEEE International Conference on Robotics and Automation (ICRA)
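
To make the dynamic-movement-primitive (DMP) part of the approach concrete, below is a minimal, self-contained 1-D DMP sketch in the standard Ijspeert-style formulation: it learns forcing-term weights from one demonstrated trajectory and can re-target the goal online during rollout (e.g. toward a moving tray). This is an assumed, simplified re-implementation for illustration; it does not include the paper's temporal and kernel adaptation, and all names and parameters are illustrative.

```python
import numpy as np

class DMP1D:
    """Minimal discrete dynamic movement primitive for one Cartesian dimension."""

    def __init__(self, n_basis=30, alpha_z=25.0, alpha_x=4.0, dt=0.01):
        self.n_basis, self.dt = n_basis, dt
        self.alpha_z, self.beta_z, self.alpha_x = alpha_z, alpha_z / 4.0, alpha_x
        # Gaussian basis-function centres spread along the canonical phase x in (0, 1].
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        self.h = 1.0 / (np.diff(self.c, append=self.c[-1]) ** 2 + 1e-8)
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo):
        """Learn forcing-term weights from one demonstration via locally weighted regression."""
        y_demo = np.asarray(y_demo, dtype=float)
        T = len(y_demo)
        self.tau = T * self.dt
        self.y0, self.g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, self.dt)
        ydd = np.gradient(yd, self.dt)
        x = np.exp(-self.alpha_x * np.linspace(0.0, 1.0, T))  # canonical phase over the demo
        # Forcing term that would exactly reproduce the demonstration.
        f_target = (self.tau ** 2) * ydd - self.alpha_z * (
            self.beta_z * (self.g - y_demo) - self.tau * yd)
        scale = x * (self.g - self.y0)
        for i in range(self.n_basis):
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = np.sum(psi * scale * f_target) / (np.sum(psi * scale ** 2) + 1e-8)

    def rollout(self, goal_fn=None):
        """Integrate the DMP. goal_fn(t), if given, returns the current goal,
        e.g. a tray position tracked by a vision system."""
        y, v, x, t, traj = self.y0, 0.0, 1.0, 0.0, []
        while x > 1e-3:
            g = goal_fn(t) if goal_fn is not None else self.g  # online goal update
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-8) * x * (g - self.y0)
            v += (self.alpha_z * (self.beta_z * (g - y) - v) + f) / self.tau * self.dt
            y += v / self.tau * self.dt
            x += -self.alpha_x * x / self.tau * self.dt        # canonical phase decays to 0
            t += self.dt
            traj.append(y)
        return np.array(traj)

# Usage: learn from a demonstrated 1-D trajectory, then replay toward a goal
# that drifts during execution (standing in for a moving tray).
demo = np.sin(np.linspace(0.0, np.pi / 2.0, 200))       # synthetic demonstration
dmp = DMP1D()
dmp.fit(demo)
adapted = dmp.rollout(goal_fn=lambda t: 1.0 + 0.1 * t)  # goal moves while the robot executes
```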

2020

Advanced Collaborative Robots for the Factory of the Future

We develop an integrated robotic platform for advanced collaborative robots and demonstrate an application in which multiple robots collaboratively transport an object to different positions in a factory environment. The proposed platform integrates a drone, a mobile manipulator robot, and a dual-arm robot that work autonomously while also collaborating with a human worker. The platform also demonstrates the potential of a novel manufacturing process, which incorporates adaptive and collaborative intelligence to improve the efficiency of mass customization for the factory of the future.

For more details, see Rothomphiwat et al., IEEE/SICE International Symposium on System Integration (SII), 2021.

Video links:

Robot-robot collaboration video 1, video 2

Presentation