KI.FABRIK


ROBOT LEARNING

Collective Learning

©Technical University of Munich. TUM School of Computation, Information and Technology. Chair of Robotics and Systems Intelligence. Munich Institute of Robotics and Machine Intelligence (MIRMI).

In the collective (German: Zusammenschluss) of robots, all robots can exchange their knowledge about their respective tasks. This exchange enables the robots to learn much faster than unconnected robots. An example is the insertion skill, with which robots can insert keys into keyholes, plugs into sockets, or bolts into boreholes. As this is a highly tactile problem, it can only be solved by a sensitive robot. Connecting robots in such a collective not only speeds up the overall learning success but also enables the robots to converge to more general solutions: because the robots can draw on a diverse set of solutions within the collective, general solutions are promoted and optimized, while individual solutions remain possible. A single robot would take roughly 1.5 hours to learn on its own, and the result would only cover inserting this particular key into the learned hole. When the whole collective learns similar tasks, the solution is more general and the learning is much faster. This is the fully connected, AI-powered production system of the future, in which robots work closely together with humans to make production more ergonomic and efficient.
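The text does not describe the learning method itself, but the core idea, robots refining a skill locally and then merging what they have learned, can be sketched as simple parameter averaging in the spirit of federated learning. Everything below (the toy linear policy, the update rule, the episode counts) is illustrative and not the actual KI.FABRIK implementation.

```python
import numpy as np

# Minimal sketch of collective skill learning via parameter averaging.
# All names and numbers are illustrative assumptions, not the real system.

class InsertionPolicy:
    """Toy linear policy mapping force/torque readings to corrective motions."""
    def __init__(self, n_inputs=6, n_outputs=3, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = rng.normal(scale=0.1, size=(n_outputs, n_inputs))

    def local_update(self, sensed, target, lr=0.05):
        """One gradient-style step on this robot's own insertion attempts."""
        prediction = self.weights @ sensed
        error = prediction - target
        self.weights -= lr * np.outer(error, sensed)

def collective_round(policies):
    """Share knowledge: average all robots' parameters and broadcast them back."""
    mean_weights = np.mean([p.weights for p in policies], axis=0)
    for p in policies:
        p.weights = mean_weights.copy()

# Each robot practices on its own key/hole pair; the collective then merges
# what was learned, so every robot benefits from all attempts.
robots = [InsertionPolicy(seed=i) for i in range(4)]
for episode in range(100):
    for robot in robots:
        sensed = np.random.normal(size=6)   # placeholder force/torque sample
        target = np.tanh(sensed[:3])        # placeholder corrective motion
        robot.local_update(sensed, target)
    if episode % 10 == 0:
        collective_round(robots)
```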


Working with intelligent cobots to inspect car quality

Human workers can operate in close proximity to the robot and remain the experts who set up the robot and teach the AI new inspections and the desired positions easily and without occlusions. As a result, the human worker can focus on richer and more ergonomic tasks.

BMW’s developments at the AI.Society exhibition area of munich_i. ©Messe München

Networked AI-based quality inspection

On the front of the vehicle, a human worker intuitively teaches the robot to measure a specific gap using AR glasses. Here, the inspection system is highly flexible and can be taught or set up in the most intuitive way, by a simple gesture.

BMW’s developments at the AI.Society exhibition area of munich_i. ©TUM-MIRMI

Augmented Reality (AR)-based intelligent robot for gap measurement

On the back of the car, we show collaborative AI-based quality inspection, where the robot can communicate directly with the car. Both systems are fully connected, and the robot is built to be set up without any prior robotics knowledge. Quality inspections can be performed highly flexibly, on the fly, for each car individually.
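As a rough illustration of what direct robot-to-car communication could look like, the sketch below exchanges a JSON inspection request and report. All field names, values, and the measurement stub are assumptions made for the example, not the interface used in the demo.

```python
import json

# Purely illustrative message format: the car announces which checks it
# needs and the robot replies with measurements. Everything here is made up.

inspection_request = {
    "vehicle_id": "WBA-0001",          # hypothetical identifier sent by the car
    "checks": [
        {"feature": "tailgate_gap", "nominal_mm": 3.5, "tolerance_mm": 0.5},
    ],
}

def run_inspection(request):
    """Pretend to measure each requested feature and report pass/fail."""
    report = {"vehicle_id": request["vehicle_id"], "results": []}
    for check in request["checks"]:
        measured = 3.7                                  # placeholder sensor value
        ok = abs(measured - check["nominal_mm"]) <= check["tolerance_mm"]
        report["results"].append(
            {"feature": check["feature"], "measured_mm": measured, "ok": ok}
        )
    return report

print(json.dumps(run_inspection(inspection_request), indent=2))
```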

Human Motion Tracking

Real-time contactless tracking of human movements makes it possible to control robots and to teach them how to move.


©Technical University of Munich. TUM School of Computation, Information and Technology. Chair of Robotics, Artificial Intelligence and Real-time Systems; Machine Vision and Perception Group. Munich Institute of Robotics and Machine Intelligence (MIRMI).

Telepresence by Hand Movement

Teleoperation using visual perception and control to detect hand movement.

©Technical University of Munich. TUM School of Computation, Information and Technology. Chair of Robotics, Artificial Intelligence and Real-time Systems; Machine Vision and Perception Group. Munich Institute of Robotics and Machine Intelligence (MIRMI).
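How the hand tracking is implemented is not specified in the text; as one possible sketch, an off-the-shelf hand tracker (MediaPipe Hands is used here purely as a stand-in for the demo's perception stack) can turn camera images into a hand position that drives a robot end-effector setpoint. The send_setpoint function is a hypothetical placeholder for the real robot interface.

```python
import cv2
import mediapipe as mp

# Illustrative sketch: visual hand tracking driving a robot end-effector
# setpoint. MediaPipe Hands stands in for the actual perception system.

mp_hands = mp.solutions.hands

def send_setpoint(x, y, z):
    # Hypothetical placeholder for the real robot command interface.
    print(f"end-effector setpoint: x={x:.3f} y={y:.3f} z={z:.3f}")

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.6) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images; OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # Use the wrist landmark (index 0) as a coarse hand position and
            # map its normalized image coordinates to a workspace setpoint.
            wrist = results.multi_hand_landmarks[0].landmark[0]
            send_setpoint(wrist.x, wrist.y, wrist.z)
cap.release()
```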

Learning and Detecting Movement

Teaching skills to the robot by demonstration is possible via vision-based tracking of human motion. The system can analyze human actions in different everyday environments, which makes it possible, for instance, to assist a person with difficult tasks.

New Movements

Teaching a new movement by simply showing the robot the movement and labeling it afterwards. In the demo, we show the robot how to write the letter “h”.

Known Movements

The robot can detect already known movements and repeat them, scaled in time and amplitude. In the demo, the letter “a” is correctly detected and repeated by the robot.
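The demo's recognition and replay mechanism is not described in the text; one common way to detect a known movement and repeat it scaled in time and amplitude is to match the demonstrated trajectory against stored, labeled templates with dynamic time warping (DTW) and then resample the best match. The sketch below follows that assumption; the letter trajectories and scaling factors are made up for illustration.

```python
import numpy as np

# Sketch only: labeled templates, DTW-based recognition, and scaled replay.

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-time-warping distance between 2-D paths."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def classify(demo, templates):
    """Return the label of the closest stored movement."""
    return min(templates, key=lambda label: dtw_distance(demo, templates[label]))

def replay(template, time_scale=1.0, amplitude_scale=1.0):
    """Resample the template in time and scale its amplitude before execution."""
    n_out = max(2, int(len(template) * time_scale))
    t_old = np.linspace(0.0, 1.0, len(template))
    t_new = np.linspace(0.0, 1.0, n_out)
    resampled = np.column_stack(
        [np.interp(t_new, t_old, template[:, k]) for k in range(template.shape[1])]
    )
    return amplitude_scale * resampled

# Toy templates: letters stored as short 2-D pen trajectories.
templates = {
    "a": np.column_stack([np.cos(np.linspace(0, 2 * np.pi, 50)),
                          np.sin(np.linspace(0, 2 * np.pi, 50))]),
    "h": np.column_stack([np.zeros(50), np.linspace(0, 1, 50)]),
}
demo = templates["a"] + np.random.normal(scale=0.05, size=templates["a"].shape)
label = classify(demo, templates)
trajectory = replay(templates[label], time_scale=2.0, amplitude_scale=0.5)
print(label, trajectory.shape)
```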
