Seeing the world through the eyes of an autonomous vehicle - the new moovel lab project "Who Wants to Be a Self-Driving Car" makes this possible. A wide variety of information from a 3D depth camera, a LIDAR scanner and sensors attached to an unconventional vehicle is pooled and visualized in a virtual reality headset. Visitors will be able to try out the project at the Push UX Conference in Munich and the KIKK Festival in Belgium.
How will autonomous vehicles change our mobility? How do autonomous vehicles see their surroundings? How are other road users, such as vehicles, cyclists and pedestrians, perceived? And what information does the vehicle use when moving about in traffic? A specially constructed, unconventional vehicle, which the moovel lab team fitted with sensors and cameras, enables these questions to be addressed. The vehicle is steered by the user wearing a VR headset, which displays three-dimensional mapping and object recognition to aid the driver in navigation. The person takes on, so to speak, the role of the vehicle's control unit, much like the computers in self-driving cars.
The project is aimed at being a platform for all those who want to experience and share their impressions, feelings and thoughts on the future subject of autonomous driving. By allowing people to experience it for themselves, we want to make the topic less complex, create empathy for the technology and trigger a discussion about the future of mobility.
Via the VR headset, the driver is provided with the data required to steer the vehicle
The prototype replaces the human sense of sight with sensors like those which may be used in a self-driving vehicle. It is a non-autonomous vehicle in which the decision-making and controlling computer is replaced by a human. The vehicle consists of a platform with electronic hub motors and sensors. A control device is used to steer, accelerate and stop the vehicle. The driver lies on the vehicle with his head facing forwards. Together with the VR headset, this lying position enhances the driver's feeling of movement and allows full immersion in the perception.
The VR experience is created by two types of sensors. In the main view, the data of a 3D depth camera is used to present the landscape in real time. The image information of the 3D camera is supplemented by automatic object recognition of the vehicle's surroundings. This provides the driver with information about objects which are recognized by the computer. Finally, a LIDAR (light detection and ranging) sensor provides an additional dimension with distance measurement. Here, the sensor sends light pulses towards nearby objects and uses the time until their reflections return to determine each object's distance from the sensor. These various components are all combined in the VR headset and provide the driver with the data required to steer the vehicle.
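The time-of-flight principle behind the LIDAR sensor can be sketched in a few lines: a light pulse travels to an object and back, so the object's distance is the speed of light times the round-trip time, divided by two. This is a minimal illustrative sketch of the general principle, not code from the project; the function name and example timing value are hypothetical.

```python
# Illustrative sketch of LIDAR time-of-flight distance measurement.
# A light pulse is emitted, reflects off an object, and returns to the
# sensor; the measured round-trip time yields the distance.
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # speed of light in vacuum

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """Distance (in metres) to an object from the round-trip time
    of a light pulse. The pulse covers the path twice (out and back),
    so the one-way distance is half the total path length."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# Example: a pulse returning after about 66.7 nanoseconds corresponds
# to an object roughly 10 metres away.
print(round(distance_from_round_trip(66.7e-9), 2))
```

The tiny time scales involved are why LIDAR units need very fast timing electronics: at a range of a few metres, the round-trip time is only tens of nanoseconds.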
The moovel lab is presenting the project at two conferences in October and November 2017: at the Push UX Conference in Munich on October 20th and 21st, and at the KIKK Festival from November 2nd to 4th in Namur, Belgium.