The tasks that the participants in the Cybathlon competition for vision assistive technology had to complete were varied: setting a table and pouring a glass of water, picking the right variety of tea from a shelf, avoiding various obstacles on a sidewalk, and sorting items of clothing by color. The teams had to solve a total of 10 tasks in no more than eight minutes. The event brings together teams from all over the world to develop assistive technologies for people with disabilities.
Lukas Hendry competed for the Sight Guide team, a collaboration between UZH, Zurich University of Applied Sciences and ETH Zurich. He is almost completely blind and can only faintly distinguish between light and dark. Hendry successfully solved the most tasks in the qualification and achieved the highest score. The team did particularly well in the two tasks involving spatial orientation: avoiding obstacles on a sidewalk and navigating through a forest.
Apart from Sight Guide, only one other team was able to complete any of these tasks. “We did well because we were the only team from the field of robotics,” explains Giovanni Cioffi from the Robotics and Perception Group (RPG) at UZH. Spatial orientation and calculating paths past obstacles are typical tasks for robots that move about. In the RPG, Cioffi normally researches these issues for autonomous drones.
Although Sight Guide had to settle for third place in the final, they still didn’t come home empty-handed. For the first time, an award for innovation and user-friendliness was presented alongside the medals, and Sight Guide won this award in its discipline.
Compared to the other teams, Sight Guide’s assistance system was relatively complex and consisted of various hardware and software components: three cameras, a portable computer in a rucksack, and a belt that can provide feedback through vibrations.
To navigate between obstacles on a sidewalk or through a forest, the team used the same technology that the RPG uses for its drones. Cioffi was in charge of programming the software that analyzes the camera data and uses it to calculate a collision-free path for the pilot. The software identifies distinctive points in the camera images, such as the corner of an object. It calculates the camera’s position in space from how these points shift between successive images. From this, it can then determine the path that will lead the person carrying the camera past the obstacles.
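The core idea, tracking distinctive points across frames and inferring the camera’s own motion from their apparent shift, can be illustrated with a deliberately simplified sketch. This is not the team’s software: real visual odometry, such as the RPG’s drone pipelines, solves a full 3D pose estimation, whereas the toy function below only averages 2D point displacements.

```python
# Simplified, hypothetical sketch of the visual-odometry idea described above.
# Distinctive points are matched between two frames; the camera's apparent
# motion is the negative of the average point displacement (when the camera
# moves right, tracked points appear to drift left).

def estimate_camera_shift(points_prev, points_curr):
    """Estimate 2D camera motion from matched feature points.

    points_prev, points_curr: lists of (x, y) image coordinates of the
    same distinctive points in two successive frames.
    """
    n = len(points_prev)
    dx = sum(c[0] - p[0] for p, c in zip(points_prev, points_curr)) / n
    dy = sum(c[1] - p[1] for p, c in zip(points_prev, points_curr)) / n
    return (-dx, -dy)

# Example: all tracked points drift 5 px left and 2 px up,
# so the camera itself moved 5 px right and 2 px down.
prev = [(100, 80), (200, 120), (150, 200)]
curr = [(95, 78), (195, 118), (145, 198)]
print(estimate_camera_shift(prev, curr))  # (5.0, 2.0)
```

A production system would additionally triangulate the points’ depth and feed the resulting pose into a path planner that steers around the detected obstacles.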
The pilot at the Cybathlon received directional information via vibrations in his belt. “During training, it turned out that these signals were faster and easier to understand than spoken instructions,” explains Cioffi.
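One way such directional cues could work is to map the desired walking direction onto the vibration motor closest to that bearing. The details of the Sight Guide belt are not public, so the motor count and layout below are assumptions for illustration only.

```python
# Hypothetical illustration of turning a desired walking direction into a
# vibration cue on a belt. The actual Sight Guide belt interface is not
# public; the motor count (8) and even spacing are assumed here.

def motor_for_heading(heading_deg, num_motors=8):
    """Pick the belt motor closest to the desired heading.

    heading_deg: direction relative to the pilot, in degrees
    (0 = straight ahead, 90 = right, -90 = left). Motors are assumed
    to be evenly spaced, with motor 0 pointing straight ahead.
    """
    step = 360 / num_motors
    return round((heading_deg % 360) / step) % num_motors

print(motor_for_heading(0))    # 0 (straight ahead)
print(motor_for_heading(90))   # 2 (right side)
print(motor_for_heading(-90))  # 6 (left side)
```

Encoding direction spatially like this avoids the serial bottleneck of spoken instructions, which matches the team’s observation that vibration signals were faster to interpret.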
The team spent around two years preparing for the event. For Cioffi, the Cybathlon was an excellent opportunity to test research in a practical setting. “The biggest challenge in the project was to adapt our technology, which we normally develop for robots, to humans.” Together with pilot Lukas Hendry, the team was able to find out which impulses and information would best support him. “With robots, we can program how they react to signals. This is different for humans,” says Cioffi.
The findings from the Cybathlon will now be evaluated in scientific papers, says Cioffi. “We hope that we can inspire others who are working on assistive technology for blind people.”