A new smartphone app could soon make it possible to build a near-real version of yourself into a computer game. All you’ll have to do is film yourself with the phone’s camera, and the app will create a 3D reconstruction of your head. You’ll be able to use this 3D selfie as a game avatar. A prototype of the new app was presented this week at the Mobile World Congress in Barcelona.
Mapping out space
The imminent market launch of the app is particularly exciting for Davide Scaramuzza, Assistant Professor of Human-Oriented Robotics at the UZH Department of Informatics since 2012. The software for the 3D app was developed by his Robotics and Perception Group.
Scaramuzza is convinced that this technology will also spark interest in architecture, engineering and medicine. “Easy-to-create, quickly available 3D reconstructions could be really useful in these fields.” An app for capturing smaller objects in 3D is already on the market.
Scaramuzza and his team specialize in using camera images to calculate spatial models and determine the position of objects in space.
There are other applications of this technology besides the 3D app, which was developed by Zurich start-up Dacuda on the basis of software licensed from Scaramuzza. The professor's main interest lies in flying robots known as quadrotors, which he wants to teach to fly autonomously. To do so, he fastens a camera and a mini-computer to the drone. The camera delivers a constant stream of images of the environment, focusing on high-contrast fixed points in the landscape. When the drone moves, the positions of these fixed points in the camera image shift accordingly. The software developed by Scaramuzza uses these shifts to calculate the drone's new position.
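The idea of recovering motion from shifting fixed points can be illustrated with a toy calculation (all numbers and the simplified geometry are assumptions for illustration, not Scaramuzza's actual algorithm, which estimates full 3D pose): if the scene is roughly planar and the drone translates parallel to the image plane, every tracked point shifts by about the same amount in the image, and that shift, scaled by depth and focal length, gives the metric motion.

```python
import numpy as np

np.random.seed(0)  # make the toy example reproducible

# Hypothetical image positions (pixels) of high-contrast fixed points in frame t.
points_t = np.array([[120.0, 80.0],
                     [300.0, 150.0],
                     [210.0, 240.0]])

# After the drone moves, each tracked point shifts in the image.
# Here: a uniform shift plus a little measurement noise, as would occur
# for pure sideways translation over a distant, roughly planar scene.
points_t1 = points_t + np.array([5.0, -3.0]) \
            + np.random.normal(0.0, 0.1, points_t.shape)

# Estimate the image-space motion as the mean displacement of the points.
shift = (points_t1 - points_t).mean(axis=0)

# With an (assumed) scene depth and focal length, pixel shift maps to
# metric camera translation: translation = shift * depth / focal_length.
depth, focal_length = 10.0, 500.0  # meters, pixels (illustrative values)
translation = shift * depth / focal_length
print(translation)  # approximately [0.1, -0.06] meters
```

In the real system the points are not all at the same depth and the drone also rotates, so the software must solve a full pose-estimation problem rather than average a single shift; this sketch only shows why moving fixed points encode the camera's motion.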
Scaramuzza wants to equip the drones for use in disaster areas, for example following earthquakes or nuclear accidents like Fukushima. The idea is that drones will record damage, measure radioactivity, and search the rubble for survivors.
Improving camera and software
Davide Scaramuzza’s next goal is to make drones fly faster so they can save more lives in the wake of catastrophes. This means improving both the camera and the software.
To this end, Scaramuzza is working with Tobi Delbruck at the UZH Institute of Neuroinformatics. Delbruck has developed a new type of camera system called the Dynamic Vision Sensor.
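Unlike a conventional camera, which outputs full frames at a fixed rate, a Dynamic Vision Sensor pixel fires an asynchronous "event" whenever its log-brightness changes by more than a threshold. The sketch below (a minimal single-pixel simulation; the threshold and brightness values are illustrative assumptions) shows that principle:

```python
import math

def events_for_pixel(brightness_samples, threshold=0.2):
    """Simulate one DVS pixel: emit (time, polarity) events whenever the
    log-brightness changes by more than `threshold` since the last event."""
    events = []
    ref = math.log(brightness_samples[0])  # log-brightness at last event
    for t, b in enumerate(brightness_samples[1:], start=1):
        diff = math.log(b) - ref
        if abs(diff) > threshold:
            events.append((t, 1 if diff > 0 else -1))  # +1 = brighter, -1 = darker
            ref = math.log(b)
    return events

# A pixel watching an edge brighten, then darken again:
print(events_for_pixel([1.0, 1.0, 1.5, 2.0, 1.0]))
# → [(2, 1), (3, 1), (4, -1)]: no event while brightness is constant,
#   events only when something changes.
```

Because only changes are reported, such a sensor reacts within microseconds and produces far less data than full frames, which is why it is attractive for fast-flying drones.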
Scaramuzza’s research group also wants to improve the positioning systems used by flying and land-based robots. This is the task of robotics experts working for the Zurich Eye project at the newly established Wyss Translational Center Zurich (WTZ) run by UZH and ETH Zurich.