
Following Veera – adding social features to an MiR autonomous mobile robot

We enhanced an autonomous MiR 200 mobile robot with social features. In addition to logistics, it can now serve library customers by guiding them to particular books and categories – a task that normally takes up a lot of employee time. The prototype was tested successfully, and received a positive response from the library staff and Oodi visitors. The project continues.

If you prefer videos to walls of text, here's a promotional video about this project:

Oodi is Helsinki's new Central Library, an artwork of a building with wooden walls and floor-to-ceiling windows that flow and curve organically like a living tree. It also houses a lot of books. We Finns read a lot and appreciate our libraries. Oodi's collection is over a hundred thousand volumes strong.

Oodi Helsinki, third floor reading room. Photo by Tuomas Uusheimo, https://www.oodihelsinki.fi/en/for-media/

When borrowed books are returned, they're not taken by hand to the shelves. Instead, a conveyor belt takes the books down to the cellar, where an automated system sorts them into boxes. When a box fills up, the system calls an autonomous robot wagon to pick it up, and carry it upstairs to be eventually shelved by the employees.

The robots are MiR 200s by Mobile Industrial Robots A/S. They have lift modules custom built by Mikro-Väylä Oy to pick up and drop off book boxes. The robots control an elevator to travel between the cellar and the other floors in the library. The MiRs navigate well among people, constantly scanning their environment with LiDARs, depth cameras, and ultrasonic sensors.

The MiR 200. Photo by Mobile Industrial Robots ApS, https://www.mobile-industrial-robots.com/en/products/mir200/

We at Futurice focus on social robotics. As part of her master's thesis, robotics designer Minja Axelsson created a design framework for this purpose. The framework grew out of our earlier social robotics project, in which we built Momo the Robot and had it meet with children. It is a set of canvases that lets you approach the various aspects of designing a social robot in a structured way.

Minja Axelsson facilitating a workshop that resulted in our robot concept. Photo by Futurice, CC BY 4.0

At Oodi, the social robotics design framework was used in a workshop with library employees and customers to facilitate the design process. From the many ideas generated, one was selected for a proof-of-concept technical implementation.

The selected concept is based on one of the most time-consuming chores the employees face – guiding customers to find certain books or book categories. There are a lot of shelves at Oodi. Finding a particular book or a category is not a trivial matter for a customer, and often requires employee assistance. It may also, in fact, be tricky for the employee.

Romeo Pulli, project manager at Oodi, suggested that we make use of their MiR robots and extend their abilities towards social interaction. This made a lot of sense, as the MiR wagons move reliably and had already proven able to navigate the public space. He could dedicate one of them to us for the duration of the project, and help us get started with configuring and programming it.

We gladly accepted the robot and Romeo's help. Programming autonomous movement would be a daunting task. Having that mostly sorted out would allow us to better concentrate on the social features.

Oodi let children choose names for their robots. The robots were all named after a popular children's book series, Tatu and Patu. We got Veera. Veera is the Finnish version of the more globally recognised Vera, an excellent name for a friendly robot guide.

So we assembled our forces:

  • Minja Axelsson for the overall service design, the graphical design of the user interface, building the electronics, and programming the robot's social capabilities, including a library for its simulated emotions.
  • Olli Ohls to look into the physical form, handle communications, help with testing, and also to briefly study another promising use case: automatic inventory of the books using an RFID reader riding on the robot.
  • Niki Ulmanen to implement the user interface for our robot and help with testing.
  • Teemu Turunen – that'd be me – to program the control logic for the robot, integrate it with the cloud-based library system, and do whatever else is needed to make this work.

... and started working.

While the robot wagon moves, it constantly scans its surroundings with lasers and depth cameras to detect obstacles like meandering humans. Detecting something in its immediate path will cause it to alter its route, or stop moving entirely if the object is too close. This works impressively well. No humans were harmed in the making of this prototype.

Screenshot of the MiR control dashboard. The robot detects obstacles around it, shown in bluish purple.

Minja wanted mechanical googly eyes on the robot, to allow it to better express its feelings and give it some extra character. I was a bit hesitant to include a DIY electronics project in the already slightly frightening project scope, but this turned out to be a very important feature. Kudos to bikerglen for providing us with instructions on how to build these. Bikerglen, you are truly a prince among bikers and glens.

Prototype of Veera with bikerglen's mechanical googly eyes. Photo by Futurice, CC BY 4.0
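
The eyes are driven by stepper motors behind an Arduino, with our laptop-side Python telling them what to do. Here is a minimal sketch of what that laptop side could look like, assuming a simple serial link to the Arduino; the device path, baud rate, and single-character gesture commands are illustrative assumptions rather than the exact protocol we used.

    # Minimal sketch: driving the Arduino-based googly eyes from the laptop.
    # Requires the pyserial package. The serial port, baud rate, and command
    # bytes are illustrative assumptions, not the project's exact protocol.
    import time

    import serial

    EYE_PORT = "/dev/ttyACM0"    # typical Arduino device path on Linux (assumption)
    BAUD_RATE = 115200

    # Hypothetical single-character commands that the Arduino sketch maps
    # to stepper movements of the eyes.
    GESTURES = {
        "look_left": b"L",
        "look_right": b"R",
        "roll": b"O",
        "center": b"C",
    }

    def send_gesture(link, gesture):
        """Write one gesture command and give the steppers time to move."""
        link.write(GESTURES[gesture])
        link.flush()
        time.sleep(0.5)

    if __name__ == "__main__":
        with serial.Serial(EYE_PORT, BAUD_RATE, timeout=1) as link:
            time.sleep(2)    # the Arduino resets when the port opens; wait it out
            for gesture in ("look_left", "look_right", "roll", "center"):
                send_gesture(link, gesture)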

A library is a very challenging environment for autonomous moving robots. There may be a lot of people. Baby transports. Large groups. Narrow spaces. Moving furniture. Huge houseplants.

There are also a lot of children. Children are attracted to moving robots. Children of a certain age – I would empirically put this at 18 to 36 months old – are very attracted to big red safety kill switches on top of moving robots. Fortunately their parents are usually faster than them. At least half a dozen kids approached the robot in a swaying trot with a small arm raised to engage the robot's kill switch, only to get airlifted seconds before making contact.

The first incarnation of Veera we had during the project was just the base module, a flat suitcase-looking thing with six wheels. It could only see its surroundings at that very low level. A chair with thin legs might go undetected, as the bulk of its volume is too high for the robot to detect. This would result in moving chairs, as the powerful robot would try to plow right through them.

Later in the project Veera was equipped with a lift module that doubled its size. It also got extra depth cameras angled upwards. We noticed a shift in the general attitude towards the robot: the sharper edges and larger bulk clearly made it a bit less approachable to some people. The new side placement of the kill switches fortunately deterred the toddler attacks.

With the lift module Veera didn't look like a toy anymore. Photo by Futurice, CC BY 4.0

The prototyping meant several weeks of piling Python code upon Python code, extensive use of curl to study the robot and library system APIs, soldering wires and assembling the stepper motor setup for the eyes with its driver boards, writing Arduino sketch code for the eyes, configuring a Linux laptop to run it all, sleeping way too little, miscalculating the battery power needed to run it all, throwing together a Flask app to show the user interface, moments of despair integrating with the complex cloud-based library system, testing, testing, testing, and most of all commanding Veera around the library, studying and adjusting its path choices.
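
Much of that plumbing boils down to plain HTTP. As a rough illustration, this is what talking to the robot over MiR's REST interface can look like from Python with the requests library; the API version, endpoint paths, headers, and mission id below are assumptions based on the publicly documented MiR REST API and may differ between software versions, so treat this as a sketch rather than our exact code.

    # Rough sketch of talking to the robot over its REST interface.
    # Endpoint paths, headers, and the mission id are assumptions; the MiR
    # REST API is versioned and details vary between robot software releases.
    import requests

    ROBOT_IP = "192.168.1.10"                        # placeholder address
    BASE_URL = f"http://{ROBOT_IP}/api/v2.0.0"       # API version is an assumption
    HEADERS = {
        "Authorization": "Basic <base64-credentials>",   # placeholder credentials
        "Accept-Language": "en_US",
    }

    def robot_status():
        """Fetch the robot's current state (position, battery, mission state)."""
        response = requests.get(f"{BASE_URL}/status", headers=HEADERS, timeout=5)
        response.raise_for_status()
        return response.json()

    def queue_mission(mission_id):
        """Ask the robot to run a pre-defined mission, e.g. drive to the travel shelves."""
        response = requests.post(
            f"{BASE_URL}/mission_queue",
            headers=HEADERS,
            json={"mission_id": mission_id},
            timeout=5,
        )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        print(robot_status())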

We made a simple emotions library for the robot. The idea is that a social robot could have, and should show, some emotions. This introduces some variation to the coded base behaviour. For instance, Veera communicates a bit differently when it is excited or frustrated.
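
This is not the actual library we wrote, but a minimal sketch of the idea, assuming an emotion state that shifts with events and a bit of randomness and then biases which sounds the robot plays; the emotion names, events, and audio file names are all illustrative.

    # Minimal sketch of the simulated-emotions idea (not the actual library):
    # a mood that drifts with events and biases the robot's outward behaviour.
    import random

    class EmotionState:
        """Toy model of a robot mood that drifts with events and time."""

        EMOTIONS = ("happy", "excited", "bored", "frustrated")

        def __init__(self):
            self.current = "happy"

        def update(self, event=None):
            """Shift the mood based on what just happened, with a little randomness."""
            if event == "guidance_succeeded":
                self.current = "excited"
            elif event == "path_blocked":
                self.current = "frustrated"
            elif event is None and random.random() < 0.1:
                self.current = "bored"    # idling long enough gets dull

        def sound_for_state(self):
            """Pick an audio clip matching the mood (file names are illustrative)."""
            return {
                "happy": "chirp_soft.wav",
                "excited": "chirp_fast.wav",
                "bored": "sigh.wav",
                "frustrated": "grumble.wav",
            }[self.current]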

Veera's final form for the prototype. Photo by Futurice, CC BY 4.0

Our use case for this proof of concept, with a rough code sketch of the loop after the list:

  1. Veera idles at its home position, in front of the escalators taking people to the third floor of Oodi.
  2. Veera catches the eye of an Oodi customer with its form, by moving its eyes, making some sounds, moving slightly, or running patterns on its lights. As it has feelings, it can get bored, and its behaviour reflects this.
  3. The visitor accesses the user interface using a tablet on a stand on top of the robot.
  4. The customer can look for a science book, or choose a category like languages, history, travel, etc.
  5. When a book is searched for, the UI lists all the matching books (by title or author) that are available at Oodi at that precise moment. This requires calling the library system.
  6. When a category or a book is selected, the customer is prompted whether she would like to be guided there.
  7. If the customer confirms, Veera starts to move and the tablet shows an arrow forward and a "follow me" text. Veera plays some robotic audio on the way: happy chirping sounds, clicks and whirrs.
  8. If people are in its path (this almost always happens in Oodi), it evades them with the MiR's autonomous movement logic, trying to reach its target position and calculating new routes as necessary.
  9. When the destination is reached, Veera stops and the tablet indicates in which direction the category or the book should be located.
  10. Finally, Veera heads back to its home base and re-enters the customer attraction loop.
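
In code, the loop above is essentially a small state machine gluing together the robot's missions, the tablet UI, and the library system. Here is a simplified, self-contained sketch of that control loop; the stub functions stand in for the real MiR, UI, and library-system integrations, and the mission names and timings are illustrative.

    # Simplified sketch of the guidance loop as a small state machine.
    # The stubs stand in for the real MiR, tablet UI, and library-system
    # integrations; mission names and timings are illustrative.
    import random
    import time

    HOME_MISSION = "return_to_home"                  # illustrative mission names
    CATEGORY_MISSIONS = {
        "travel": "goto_travel_shelves",
        "history": "goto_history_shelves",
    }

    def attract_customers():
        print("idling: blinking lights, rolling the eyes, chirping")

    def wait_for_selection():
        # Real version: block until the tablet UI reports a category or book choice.
        return random.choice([None, "travel", "history"])

    def queue_mission(mission):
        # Real version: POST to the MiR mission queue (see the earlier REST sketch).
        print(f"queued mission: {mission}")

    def wait_until_idle():
        # Real version: poll the robot's status until the mission queue is empty.
        time.sleep(1)

    def show_arrival_screen(category):
        print(f"tablet: the {category} shelves should be right over there")

    def guidance_loop(runs=3):
        for _ in range(runs):
            attract_customers()
            selection = wait_for_selection()
            if selection is None:
                continue                                  # nobody interacted, keep idling
            queue_mission(CATEGORY_MISSIONS[selection])   # drive towards the shelf
            wait_until_idle()                             # the MiR handles obstacle avoidance itself
            show_arrival_screen(selection)
            queue_mission(HOME_MISSION)                   # head back to the escalators
            wait_until_idle()

    if __name__ == "__main__":
        guidance_loop()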

During the three days of live testing a total of 270 of these guidance runs were done with actual customers. This count excludes the hundreds we did ourselves.

The service was autonomous; the robot required no human to operate it. A few of the runs failed because Veera could not reach the target position, with curious people blocking its route. A few more failed because of bugs in my code. However, over 90% succeeded. Not bad at all, considering the project's tight schedule and limited resources.

We talked with many customers, asking about how they felt getting help from a robot for this purpose. We also observed Veera in operation, and followed the robot's logging in real-time, allowing us to consider optimisations and improvements to the prototype.

Overall, it seems people react very positively to a robot in a role like this. Children absolutely loved it. This can be a bit of a challenge. A trio of young girls liked to follow Veera around so much they kept going around and around the library many times, making a game out of it, much to the dismay of their waiting parents, who had places to be.

Volunteer testers triggered every warning state of the MiR. Photo by Futurice, CC BY 4.0

This being a public library, the ad hoc game was just fine, but it is still something to consider in the design if you want to avoid your fancy new service being gamified by surprise.

The negative reactions were very few during the whole three-week testing phase. Some adults were not amused by the robot. It scared a few people a bit, taking them by surprise. Veera sneaked up on a young lady browsing books and she called it a creepy-ass Creepy McCreepface. She apologised to the robot afterwards, when they met again.

These encounters could mostly be avoided by audio design, having the robot emit enough sound to alert people to its presence.

The library staff were happy with how these efforts turned out. Combined with the primary function of carrying books, and possibly doing automatic inventory during the slow hours of the night, a lot of valuable work could be performed by a few MiR robots. Seeing the value in this, we agreed to continue the project and are now designing and building for an extended pilot, where the robot will operate autonomously for several months.

This continues to be a fascinating research project for us, and furthers our understanding of how people and robots can and should interact in social situations.

A similar setup could work well in large department stores, airports, hotels, or other locations where people need to be guided around to find items and places. As with Oodi, a PoC could be up and running, showing value to a business, within weeks. If you'd like to discuss these kinds of opportunities, let us know! You can reach me directly at teemu.turunen@futurice.com.

Author

  • Teemu Turunen
    Corporate Hippie