
An artist contacted me with a wild idea: an art installation of a wild boar searching for food, except the boar would be a robot. The artist was Federico Diaz, and we had long meetings about how this concept could come to life. Federico has grandiose ideas where the sky is the limit, while my skill set is always constrained by what I know. Luckily, I enjoy learning new things and taking on challenges.
To visualize the scent that attracted the boar, we came up with an AR mobile application that presents the smells as a cloud of tiny particles.

Of course, our deadline was very tight; that's common for ambitious projects. We also wanted each performance to be truly unique, so the behaviours and scenarios weren't scripted. It was a live performance every time.
We didn’t have access to the robot most of the time. We basically had to build a very crude prototype using a longboard with a mounted phone. We knew the data we’d get from the real robot would be extremely limited due to restrictions from the owner company.
There was no tracking information and no LiDAR, so we had to create our own tracking. Based on that, I built an ARKit application, which worked decently.
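The core of marker-based tracking is a coordinate-frame handoff: the detector reports the marker's pose relative to the camera, and since the marker's world position is known from our map, the phone's world pose falls out of one matrix inversion. A minimal sketch of that math (the names, the 4x4-transform convention, and the toy numbers are illustrative, not the project's actual code):

```python
# Hedged sketch: localizing the phone in world space from a detected
# fiducial marker. All names and values here are illustrative assumptions.
import numpy as np

def make_pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def camera_world_pose(T_world_marker: np.ndarray,
                      T_camera_marker: np.ndarray) -> np.ndarray:
    """Camera pose in the world frame, given the marker's known world pose
    and the marker pose the detector reports in the camera frame."""
    return T_world_marker @ np.linalg.inv(T_camera_marker)

# Toy example: the detector sees the marker 2 m straight ahead, axes aligned.
T_camera_marker = make_pose(np.eye(3), np.array([0.0, 0.0, 2.0]))
# The marker map says this marker lives at world position (5, 0, 1).
T_world_marker = make_pose(np.eye(3), np.array([5.0, 0.0, 1.0]))

T_world_camera = camera_world_pose(T_world_marker, T_camera_marker)
print(T_world_camera[:3, 3])  # camera position in world coordinates
```

Every AR object can then be placed in world coordinates once and rendered consistently from wherever the phone happens to be.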
Prototype of the “Spot robot”, attracting a large cloud of particles. The box on the ground is our marker.
Prusa Research provided the Spot robot by Boston Dynamics for testing. That was when we realized just how poor our tracking solution was, mainly due to the robot’s natural vibrations while walking.
Video recorded directly from the phone mounted on the Spot Robot during movement.
Jakub Petr Sklář came up with an anti-shock mount system for the robot, along with a practical solution to improve the accuracy of physical marker placement.
Because the physical ArUco markers were moved every night by crows and actual wild boars on site, we had to recalibrate them every morning. I used a 3D scanning app on a phone with LiDAR, along with a custom map editor for the markers.
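The morning routine boils down to diffing two marker maps: yesterday's positions against a fresh scan, updating anything that drifted beyond scan noise. A minimal sketch of that step, where the map format, tolerance, and numbers are assumptions rather than the real editor's internals:

```python
# Hedged sketch of the morning recalibration: compare marker positions from a
# fresh LiDAR scan against yesterday's map and update the ones that moved.
import math

TOLERANCE_M = 0.05  # ignore sub-5 cm jitter from the scan itself (assumed value)

def recalibrate(marker_map: dict, scanned: dict) -> list:
    """Update marker_map in place with scanned positions; return the ids of
    markers that moved more than TOLERANCE_M since the last calibration."""
    moved = []
    for marker_id, new_pos in scanned.items():
        old_pos = marker_map.get(marker_id)
        if old_pos is not None and math.dist(old_pos, new_pos) > TOLERANCE_M:
            moved.append(marker_id)
        marker_map[marker_id] = new_pos
    return moved

# Yesterday's map vs. this morning's scan: something has shifted marker 7.
yesterday = {7: (1.0, 0.0, 3.0), 8: (4.0, 0.0, 2.0)}
this_morning = {7: (1.4, 0.0, 3.1), 8: (4.01, 0.0, 2.0)}
print(recalibrate(yesterday, this_morning))  # [7]
```

Reporting which markers moved, rather than silently overwriting everything, makes it easy to spot a marker that was knocked far enough to need physical repositioning instead of a map update.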
I also had to create a live editor where I could “DJ” the particles throughout the entire exhibition, and everyone using the app could see them in real time.
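One way such a live editor can stay real-time over a weak network is to broadcast only each cloud's parameters and a shared random seed, letting every phone simulate its own identical-looking particles locally. A minimal sketch of that idea; the field names and JSON wire format are my assumptions, not the project's actual protocol:

```python
# Hedged sketch of a particle-cloud state message a live editor might
# broadcast to connected phones. Fields and format are illustrative.
import json

def encode_cloud_state(cloud_id: int, center, radius: float,
                       density: float, seed: int) -> bytes:
    """Serialize one particle cloud's parameters for network broadcast."""
    return json.dumps({
        "id": cloud_id,
        "center": list(center),  # world-space position of the scent source
        "radius": radius,        # cloud extent in metres
        "density": density,      # particles per cubic metre
        "seed": seed,            # shared RNG seed so all clients render alike
    }).encode("utf-8")

def decode_cloud_state(payload: bytes) -> dict:
    """Parse a broadcast message back into a state dict on the client."""
    return json.loads(payload.decode("utf-8"))

msg = encode_cloud_state(3, (5.0, 0.0, -1.0), radius=2.5, density=40.0, seed=1234)
state = decode_cloud_state(msg)
print(state["id"], state["radius"])  # 3 2.5
```

Because a message is just a handful of numbers, the editor can push updates many times per second, and clients that briefly drop out resynchronize from the next message they receive.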
The video shows a rough 3D scan of the location surrounded by white boxes representing markers. Large circles with white dots represent clouds of particles. A moving white sphere shows the robot’s position.
Outcome
The performance was a hit. Tracking was surprisingly good at that scale. Viewers were amazed by the robot’s movements and its interaction with the augmented reality. Each performance was unique, and real-time networking worked flawlessly.
I was relieved and extremely happy with the result. The whole team did an amazing job, and I’m truly grateful I had the chance to be part of it.
Project website | Federico Diaz



