The award-winning start-up trying to link up the autonomous landscape

We talk with the winner of the Accenture Freight & Logistics Europe Innovator Award, Unmanned Life, about their innovative solution and how they plan to change the supply chain space

Reuters Events Supply Chain recently had the opportunity to judge the Accenture Freight & Logistics Europe Innovator Award. After a close contest, Unmanned Life, a company working to link up the back-end of robotics systems into a coherent whole, took home the prize. We talk to the company’s Chief Technical Officer, Kim Clement, about the vision and future of this exciting start-up.

His remarks have been edited for length and clarity.

Alex Hadwick, Editor-in-Chief: First of all, congratulations on your win. Can you talk me through the different applications that you envisage for your solution in the supply chain space?

Kim Clement: Robotics is being adopted by different industries, including freight and logistics. You see a lot of companies using robotics for their warehousing, their last-mile deliveries, their dock-to-dock deliveries, inventory management, and so on.

A lot of companies on the market are providing what we call vertical stack solutions, each with a robotic piece of hardware that can solve a specific problem. That comes with a specific software set that allows you to control and monitor that robotic device, sometimes [operating] as a single device, sometimes as a swarm, but always within one vendor stack or within one piece of software. However, there are many solutions out there and there are many problem sets out there.

What we are solving is to create that software layer. It's a platform, a software layer that can enable different types of robotic devices to collaborate and work together as one autonomous force.

You have different robotics with different capabilities and skills, driven by different types of software on the backend, but you need to put them to work as a team. That's what we enable with our software

You need to compare it with people working in a factory or people working in freight and logistics: you always have a combination of different people working together with different skills and different capabilities. It is only when you bring these people together as one workforce, properly managed, that you have a functional team.

It's exactly the same. You have different robotics with different capabilities and skills, driven by different types of software on the backend, but you need to put them to work as a team. That's what we enable with our software.

Today, a lot of the back end of supply chains is being automated with more conventional technologies, including conveyor belts, sorting systems, and picking and placing systems, but these require quite a high level of investment to install in the first place. Secondly, they also require a lot of time to be engineered and installed in a facility, and they are always designed for a specific facility, so they are not mobile.

We use robotic devices that can replace conventional solutions such as conveyor belts, driven by a software platform that is fully integrated with your business information system and your management system

Our approach, by contrast, is to use robotic devices that can replace conventional solutions such as conveyor belts, driven by a software platform that is fully integrated with your business information system and your management systems.

That solution is something we call a ‘fold your tent’ solution; it can be set up ad hoc. A sorting solution of ours takes about half a day to set up and get up and running, and you can decommission it afterwards as fast as you set it up.

It doesn't have a very invasive impact on your facility, which means that you can have temporary facilities where you set it up for a temporary increase of capacity, and you can take that down again once the capacity is no longer required. That also allows you to shift your geographical footprint almost on demand. 

Alex: How do you plug in to other robotics systems and what is your approach to data?

Kim: It can be either an active or a passive integration, which allows us to create a full context of the environment and to enable efficient control, management and orchestration of the robotic devices.

It's a full ecosystem; robots and humans, we all have to work together.

So, how do we do the integration part? The first point is that, a couple of years ago, when we started building the platform, some companies very much had a focus on creating that hardware-software stack. That is their solution, and it works in specific industries. There's nothing wrong with that; they will have a spot and they will fit into it. But you cannot ignore the fact that more and more robotic devices are being installed.

Technologies are being produced at a much larger scale, which reduces the cost of the robotics and, in the meantime, the technology is more advanced, so you will see more adoption of these robotics, and more complexity in orchestrating them all. That's where we come in as a software platform

However, as a software developer, we noticed that a lot of the companies approaching us to extend the capabilities of their warehouses and logistics chains had technology solutions that often could not work together in one warehouse.

Technologies are being produced at a much larger scale, which reduces the cost of the robotics and, in the meantime, the technology is more advanced, so you will see more adoption of these robotics, and more complexity in orchestrating them all. That's where we come in as a software platform.

In the beginning, some of these robotics companies were not open to integration, and they said, “No, we have a solution and that works”. However, now they have noticed that there are more and more requests to do full integration and it is not their core business to take care of those integrations, to work together with other platforms, to orchestrate all these platforms.

We install on top of the device: either on the existing compute unit, if it has one, or on a companion PC, a small computer that we mount on the device. Then we integrate with the existing architecture or with the existing motor drivers on the device. Once that is done, that compute unit takes care of the integration with the device, taking over its control and management, but also collecting all of its sensor data.

We share that sensor data wirelessly with our central control platform. That is another piece of software, which we call the ULCCP (the Unmanned Life Central Control Platform), and it is normally installed on premises on a dedicated server, or runs at the edge or in the cloud. There are three options, depending on the customer's requirements and, sometimes, restrictions.

So, you have a central part, which manages the whole fleet, and then you have the decentralised part, which is the integration of each single device into the swarm.
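The central/decentralised split described above can be sketched roughly as follows. This is an illustrative sketch, not Unmanned Life's actual software: all class and field names here are hypothetical, since the company has not published its API.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceAgent:
    """Decentralised part (hypothetical): one agent per robot, installed on
    its existing compute unit or a companion PC, wrapping the vendor's
    interface and collecting the device's sensor data."""
    device_id: str

    def read_sensors(self) -> dict:
        # A real agent would query the vendor's drivers; hard-coded here.
        return {"device_id": self.device_id,
                "battery": 0.87,
                "pose": (3.2, 1.5, 0.0)}

@dataclass
class CentralControlPlatform:
    """Central part (hypothetical): manages the whole fleet from one place,
    whether on-premises, at the edge, or in the cloud."""
    fleet: dict = field(default_factory=dict)

    def ingest(self, report: dict) -> None:
        # Keep the latest report per device as the fleet-wide view.
        self.fleet[report["device_id"]] = report

agent = DeviceAgent("amr-01")
ccp = CentralControlPlatform()
ccp.ingest(agent.read_sensors())
print(ccp.fleet["amr-01"]["battery"])  # 0.87
```

The key design point the interview describes is that the vendor-specific integration stays on the device side, so the central platform only ever sees a uniform report format.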

For the central control platform, we focus on four main areas. The first level is device management. That is everything related to the device and its characteristics and capabilities, e.g. battery management, the status of the device, and any alarms or conditions that might be present: all of it centralised and monitored by our platform.

On top of that you have a data management layer, which is quite complex. Within it we create one single context from all the sensor data from all the devices in the deployment, so every device can leverage the insights. We also do the data filtering and composition at this stage, to enable higher accuracy in navigation and dynamic path planning.

On top of that, we have a decision-making layer. The decision-making layer is where we run all our in-house algorithms. For example, in a sorting environment we train the system to detect where an output gate is exactly located, so that all the other Automated Guided Vehicles (AGVs) and Autonomous Mobile Robots (AMRs) also know where the gates are, because the layout can be very flexible.

Each device is being managed, but they're all being managed with one common goal

This is also where we integrate with third-party platforms for object recognition, facial recognition, the detection and counting of humans, and throughput measurement; for example, tracking and tracing the location of containers and assets in a harbour. For all these purposes we work together with experts in the field that already have algorithms available, and use them in our decision-making layer.

Then, most importantly, on top, is the mission management and orchestration layer. This is the layer that actually takes care of managing and orchestrating the full squad. Each device is being managed, but they're all being managed with one common goal. That common goal is driven by business information systems, operational insights and legacy systems, which we integrate with, and that data drives the solution: Which assets need to be tracked? What is the expected location of the assets? Which packages need to be sorted? What are the destinations of these packages? Which packages might have priority?

Based on these four pillars, you can drive almost every single use case.
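The four pillars just described form a bottom-to-top pipeline: device state feeds a shared context, algorithms turn that context into decisions, and the mission layer turns decisions plus business data into a fleet plan. A minimal, purely illustrative sketch of that flow (every function and data shape here is a hypothetical stand-in, not Unmanned Life's implementation):

```python
def device_management(raw_reports):
    # Pillar 1: centralise device status (battery, alarms, pose, ...).
    return {r["id"]: r for r in raw_reports}

def data_management(devices):
    # Pillar 2: compose all devices' sensor data into one shared context.
    return {"positions": {i: d["pose"] for i, d in devices.items()}}

def decision_making(context, gate_map):
    # Pillar 3: in-house algorithms. A real layer would, e.g., infer gate
    # locations from the shared context; here we simply pass known ones on.
    return dict(gate_map)

def mission_management(decisions, orders):
    # Pillar 4: orchestrate the fleet toward the common business goal,
    # which is driven by the business information systems (the orders).
    return [(o["package"], decisions[o["destination"]]) for o in orders]

reports = [{"id": "amr-01", "pose": (0.0, 0.0)}]
orders = [{"package": "pkg-7", "destination": "gate-A"}]
plan = mission_management(
    decision_making(data_management(device_management(reports)),
                    {"gate-A": (5.0, 2.0)}),
    orders,
)
print(plan)  # [('pkg-7', (5.0, 2.0))]
```

The point of the layering is that each pillar only consumes the output of the one below it, which is what lets the same stack drive "almost every single use case" by swapping the decision-making algorithms.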

Alex: What examples exist today of your solution being utilised in supply chains?

Kim: Where we see a lot of traction is in postal and logistics, where AMRs and AGVs are being used for the autonomous sorting of packages. We are active in that field with various players, sorting up to about 2,500 packages an hour. In some of these layouts, around 16 AMRs are sorting over an area of about 200 square metres, and they really take up those flexible, elastic moments when throughput is peaking in certain areas, to extend the capacity of their conventional sorting solutions.

On a product level, if you look into other industries, let's take an example from the automotive industry. There you have a logistics chain of goods coming from the warehouse, which is often in the same facility as the assembly line. Then you have the assembly line itself, where the operators are assembling the car parts on a continuously moving line. What they notice from time to time are parts that are defective or incorrect and need to be replaced, but they don't have those parts available on demand at the assembly line. What they do today is raise a request, go and pick up the parts that are missing or need to be added, ship them over to the assembly line, and then revisit that car afterwards to finish it up, which is quite labour-intensive. It disrupts the process a lot.

We are taking a drone pilot and replacing it with our software platform. So, the drone is autonomously fulfilling the on-demand, ad hoc supply chain from warehouse to assembly line

A company such as Seat in Spain is already doing, or has done, tests and trials there, using pilots and drones to ship those parts immediately over to the part of the factory where they do the assembly. However, there are still humans picking up the package and delivering it to the assembly line, so it remains quite a time-consuming process.

We are running a pilot setup with them, where we are taking the drone pilot and replacing them with our software platform. So, the drone autonomously fulfils the on-demand, ad hoc supply chain from warehouse to assembly line. Then there is also the link from the warehouse, outside the building, to the worker at the assembly line inside, which is handled by an automated guided vehicle; these ground vehicles are already in place at Seat and being used for other logistics purposes in their warehouse supply chain.

If you are innovating, you are always walking on the edge of what the technologies are capable of today, and that is evolving very fast

These robots are sent outside to pick up the package from the drone, which is dropped down on top of the AMR, and the AMR brings it to the worker at the assembly line, all in a matter of 10 to 15 minutes. That's really disruptive, and immediately gives them a return on investment.

Alex: What kind of major challenges do you need to overcome in the medium term? And where do you see the company in 10 years' time?

Kim: So, one challenge is to keep up with all these new evolutions, to be able to integrate them in an efficient way, and to leverage those technologies.

If you are innovating, you are always walking on the edge of what the technologies are capable of today, and that is evolving very fast.

Therefore, sometimes you bump into the limitations of those technologies. For example, each robotic device is wireless, so you need a proper, reliable network. What we saw in the past, when we started, is that 4G has its limitations, especially if you want to share a lot of data. So, networking is one of the bottlenecks we identified. We are working very hard, in collaboration with telco partners, to integrate with 5G networks and to run on private and public infrastructure, to ensure that we can offer our customers solutions with the lowest latency and maximum bandwidth, so we can offload as much computation as possible to the IT infrastructure.

We also see a lot of challenges in localising these devices, especially in logistics, where you can have quite a dense area with a lot of robots moving around

On the other side, we also see a lot of challenges in localising these devices, especially in logistics, where you can have quite a dense area with a lot of robots moving around. A typical approach to localisation is Simultaneous Localisation And Mapping (SLAM), or sensors on top of the device such as stereo cameras and LIDAR, but if you have a lot of moving objects, localisation becomes very complex. You get a lot of false positives, which can affect the precision and accuracy of your positioning.

Our sensor fusion plays an important role by combining different sets of technologies, putting them together, and leveraging the different outputs to make sure that we have a reliable and robust localisation

We have had to overcome that complexity, and that is where our sensor fusion plays an important role by combining different sets of technologies, putting them together, and leveraging the different outputs to make sure that we have a reliable and robust localisation.
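One of the simplest textbook forms of the sensor fusion described here is inverse-variance weighting: combine several noisy position estimates so that the steadier sensor counts for more. This is a generic illustration of the principle, not Unmanned Life's actual fusion method, and the sensor figures below are invented:

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of 1-D position estimates.
    Each estimate is a (value, variance) pair; lower-variance (less
    noisy) sensors receive proportionally more weight."""
    weights = [1.0 / var for _, var in estimates]
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    variance = 1.0 / sum(weights)  # fused estimate is tighter than either input
    return value, variance

# Hypothetical inputs: a SLAM fix (noisy in a dense scene with many
# moving robots) and a steadier external beacon fix.
slam = (4.0, 0.5)
beacon = (4.4, 0.1)
pos, var = fuse([slam, beacon])
print(round(pos, 3), round(var, 3))  # 4.333 0.083
```

Note that the fused variance (0.083) is smaller than either input's, which is the "reliable and robust" payoff of combining technologies rather than trusting any single sensor.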

We also see a lot of evolution happening in localisation. For example, 5G will give us the possibility to localise the device based on its communication.

If you look at where we are going to stand 10 years from now: robotics is being adopted very, very fast these days, and, due to the current situation with COVID-19, this is only going to accelerate in the coming years, because we have seen a lot of disruption.

When people had to remain at home due to lockdowns, or had to stay in isolation, it had an impact on supply chains and production. To minimise that, companies are going to invest more and more in automation and autonomous solutions to ensure continuity. This adoption of robotics will also exponentially accelerate the demand for control and orchestration platforms to enable these robots. So, 10 years from now, I see Unmanned Life as one of the go-to platforms for the orchestration of autonomous robotics for business operations.
