The goal of the project is to design and implement new perception methods based on 3D sensors that enable the robot to operate efficiently in flexible manufacturing systems. The robot, consisting of a collaborative arm mounted on a mobile platform, will use 3D sensors such as RGB-D cameras and laser scanners to move autonomously between defined workstations. The new perception system will allow the robot to localize itself with respect to each workstation, manipulate objects, and avoid collisions with machines. Finally, we are going to design a new interface based on voice commands that enables flexible robot programming by end users.
Date: 2018 – 2020
Budget: 1 198 705 PLN
Project Manager: Dominik Belter
Funder: NCBR