Interactive Laser Projector

This project accompanied me for some time while I was working in the Innovationlab @ Milla & Partner. Bosch Sensortec had developed an optical microsystem with MEMS mirrors that makes it possible to build a miniature laser projector. With an additional sensor and clever electronics, it was even possible to obtain a digital image from the reflected laser light. By analyzing this image, one could, for example, recognize fingers and the intention to activate a button. Our task was to show the world the capabilities and possibilities of this interactive laser projector, so I programmed a framework to help us do just that. The framework consists of two parts:

Interactive part: Image analysis with computer vision algorithms detects when a user activates a button. Fundamentally, the algorithm takes advantage of the fact that the reflected light changes slightly when a finger is held in the beam. With a basic implementation in place, we could start to address interesting questions, such as when exactly an activation should be registered. This way of detecting interactions differs from a touchscreen, because the finger is also »seen« in mid-air, which incidentally could be a significant hygienic advantage, but it also makes it harder to detect the user's intent. The self-developed framework gave us the chance to explore this part even further with gesture and marker recognition.
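
To make the idea more concrete, here is a minimal sketch of such a detection loop, assuming OpenCV and grayscale frames derived from the reflected laser light; the thresholds, the button region, and the hold time are illustrative placeholders, not values from the actual framework.

```python
# Sketch: a button counts as activated when enough pixels inside its region
# differ from a reference frame of the empty surface, and the change
# persists for several consecutive frames (to separate hovering from intent).
import cv2

DIFF_THRESHOLD = 25   # per-pixel intensity change treated as significant
AREA_RATIO = 0.15     # fraction of the button area that must change
HOLD_FRAMES = 5       # consecutive frames required before activation fires

def make_button_detector(reference, button_rect):
    """reference: grayscale frame of the empty projection surface.
    button_rect: (x, y, w, h) of the button in image coordinates."""
    x, y, w, h = button_rect
    ref_roi = reference[y:y + h, x:x + w]
    state = {"hold": 0}

    def update(frame):
        roi = frame[y:y + h, x:x + w]
        diff = cv2.absdiff(roi, ref_roi)
        changed = cv2.threshold(diff, DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)[1]
        ratio = cv2.countNonZero(changed) / float(w * h)
        state["hold"] = state["hold"] + 1 if ratio > AREA_RATIO else 0
        return state["hold"] >= HOLD_FRAMES  # True once the press is confirmed

    return update
```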

Content part: For classical 2D interfaces, a web view was used, into which interactions from the interactive part were injected; a web view enables rapid development and research. The interaction events could also be transmitted to other software, which allowed us to present more elaborate live-rendered 3D sequences in Unity while still using the interactive part.
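
As an illustration of how such events might be forwarded, here is a hypothetical bridge that publishes activations as small JSON messages over a local UDP socket, which a web view (via a local bridge) or a Unity scene could subscribe to; the port and the message fields are assumptions for the sketch, not the original protocol.

```python
# Hypothetical event bridge: the interactive part publishes touch events as
# small JSON datagrams; any consumer (web view bridge, Unity scene, remote
# software) listening on the port receives the same stream.
import json
import socket
import time

EVENT_PORT = 9000  # assumed local port for event consumers

def send_touch_event(sock, button_id, position):
    event = {
        "type": "button_activated",
        "button": button_id,
        "x": position[0],          # normalized coordinates in the projection
        "y": position[1],
        "timestamp": time.time(),
    }
    sock.sendto(json.dumps(event).encode("utf-8"), ("127.0.0.1", EVENT_PORT))

# Example: report an activation of a (hypothetical) "start_demo" button.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_touch_event(sock, "start_demo", (0.42, 0.17))
```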

An interesting property for displaying content is that this laser projector doesn't use a lens, so the projected image remains sharp regardless of the distance. Also, blacks are as black as possible, because the laser simply turns off in the black areas of the image.

Among others, these were the features presented at the first public demonstration of the project at Hannover Fair 2016. Pairing the projector with another Bosch development, a robotic arm that can be used in close proximity to humans, demonstrated not only these features but also an application: robots and humans working together, with this technology as the interface between the two.

Robot using the interactive laser projector to demonstrate an interface at Hannover Fair 2016.

I also used this framework in my bachelor thesis, which dealt with the control problem of artificial superintelligence.

Bachelor Thesis with Robot and Interactive Laser Projector.

The framework was then used again as the interface for a coffee-serving robot in the Bosch Coffee Bar at CES 2017 in Las Vegas. The Innovationlab also helped Bosch, in collaboration with Nokia, demonstrate how the robot and laser projector combination could be remotely controlled over 5G. The framework was steadily improved and adapted for new applications. For Mobile World Congress 2017 in Barcelona, for example, I added rudimentary marker recognition to distinguish different electronic components placed in the projector beam, as well as a new mode optimized for fast finger detection for a simple reaction game.
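
As a rough illustration of what such rudimentary marker recognition could look like, here is a sketch in the same OpenCV style as above: objects placed in the beam are segmented against a reference frame and classified by simple contour properties. The threshold, minimum area, and shape labels are placeholders, not the classification logic actually used at the fair.

```python
# Sketch: segment objects in the beam against an empty reference frame and
# label them by coarse shape (corner count of the approximated contour).
import cv2

def classify_markers(reference, frame):
    diff = cv2.absdiff(frame, reference)
    mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)[1]
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    labels = []
    for contour in contours:
        if cv2.contourArea(contour) < 500:   # ignore noise and fingertips
            continue
        approx = cv2.approxPolyDP(contour, 0.04 * cv2.arcLength(contour, True), True)
        labels.append("rectangular component" if len(approx) == 4 else "other component")
    return labels
```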

Interactive Laser Projector at Mobile World Congress 2017.

At CES 2018, the interactive laser projector was integrated into a lamp, showcasing how this technology could blend discreetly into everyday life by displaying the information and options you need at a specific time and place.

Interactive laser projector integrated into a lamp at CES 2018.

Text and media used with permission from Milla & Partner.