It is not only in the automotive industry that autonomous driving represents the Holy Grail of progress. The idea of having transportation vehicles drive without human operators has been explored and successfully implemented in warehouses and production environments since as early as the 1950s. Navigation at that time still depended on permanently installed guide rails, but the advanced sensor technology now used in grid, laser, and contour navigation has significantly increased the trucks' freedom of movement. Nevertheless, critical features are still missing that would allow the trucks to truly move around freely and detect obstacles without human intervention and, above all, without the risk of collisions. The KION Group is closely involved in several research initiatives to determine precisely how this can be achieved in the smart factories and warehouses of the future.
IMOCO4.E: Making Autonomous Transportation Vehicles More Intelligent
Semi-autonomous vehicles are already being deployed in many production environments. While it is standard for a forklift to recognize a static obstacle and brake, fully autonomous driving is still a vision of the future. It is in this context that the project IMOCO4.E (short for “Intelligent Motion Control under Industry4.E”) was recently launched, with the KION subsidiary STILL playing a significant role: Using AI, advanced sensors, and communications technology, the project aims to enable intelligent trucks to move around production halls and warehouses completely autonomously, avoiding obstacles and intelligently navigating their way around them. IMOCO4.E is funded by the German Federal Ministry of Education and Research as well as by the European Union through ECSEL (Electronic Components and Systems for European Leadership). Many project partners are involved, including Fraunhofer IML.
The technical challenges are immense: To begin with, the truck must be able to perceive its surroundings using a wide range of sensors (cameras, laser scanners, and radar). This includes not only spatial objects such as shelves but also signs, markings, and displays. In the second stage, the truck must understand what it perceives and learn to classify objects: Are they static (such as shelves), movable (such as pallets), or even dynamic (such as other trucks and people)? Self-localization skills (where am I?) are enhanced, and an understanding of assigned tasks (what should I do?) is added. In the final stage, the truck is expected to perform its tasks autonomously: navigating to the destination, detecting and handling loads, and driving through the warehouse with automated decision-making, such as avoiding obstacles and finding a logical place to set down a pallet. These are typical processes that autonomous transport fleets could handle in the future.
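To make the stages described above a little more concrete, the following Python sketch shows, in heavily simplified form, how perceived objects might be classified as static, movable, or dynamic and how a basic driving decision could be derived from that classification. It is purely illustrative and not taken from the IMOCO4.E project; the names (ObjectClass, DetectedObject, plan_action) and the safety margin are hypothetical assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ObjectClass(Enum):
    """Coarse object categories from the second (understanding) stage."""
    STATIC = auto()    # e.g. shelves: fixed, can be baked into the map
    MOVABLE = auto()   # e.g. pallets: may have been relocated, re-check before planning
    DYNAMIC = auto()   # e.g. other trucks, people: track and treat with extra caution


@dataclass
class DetectedObject:
    label: str                  # semantic label from the perception stage
    object_class: ObjectClass   # result of the classification stage
    distance_m: float           # distance from the truck, in meters


def plan_action(objects: list[DetectedObject], safety_margin_m: float = 1.5) -> str:
    """Very simplified decision step: stop for nearby dynamic objects,
    replan around other nearby obstacles, otherwise keep driving."""
    for obj in sorted(objects, key=lambda o: o.distance_m):
        if obj.distance_m > safety_margin_m:
            continue
        if obj.object_class is ObjectClass.DYNAMIC:
            return f"stop: dynamic object '{obj.label}' within safety margin"
        return f"replan: route around '{obj.label}'"
    return "continue: path is clear"


if __name__ == "__main__":
    perceived = [
        DetectedObject("shelf", ObjectClass.STATIC, 4.0),
        DetectedObject("pallet", ObjectClass.MOVABLE, 1.2),
        DetectedObject("person", ObjectClass.DYNAMIC, 3.0),
    ]
    print(plan_action(perceived))  # -> "replan: route around 'pallet'"
```

A real system would of course fuse camera, laser, and radar data, track dynamic objects over time, and plan full trajectories rather than returning a single action string, but the same perceive, classify, decide loop sits at the core of the behavior the article describes.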