
Towards The Internet of Senses

The Internet of Senses (IoS) aims to provide comprehensive multisensory experiences that are nearly inseparable from reality and improve intelligent human-machine interaction.

This joint project between Aalto University and the Technology Innovation Institute (TII) in Abu Dhabi, UAE, presents a human-machine interaction testbed for real-time immersion in six degrees of freedom (6 DoF) with haptic feedback, a step towards realizing the Internet of Senses. In this project, we designed a real-life system that allows remote control of a UAV with maximum immersion across all six degrees of freedom while keeping the control of the UAV optimal and reliable.

The testbed included the following components:

Remote UAV

The remote UAV is equipped with an embedded computer, IoT sensors, a 360° camera and a 5G modem.

VR users

Upon receiving the 360° video stream, the user views the real-time stream through a web platform and controls the UAV remotely using their body movements on the VR treadmill platform.

Edge Server

The edge server comprises the streaming module, the control and monitoring module, and the web server.

Streaming module

This module transmits the real-time video stream to the web application with the lowest latency possible and provides 360° video to any device able to access the web.

Web server

The web server hosts the WebVR application. It is the user's interface for viewing the UAV status information and the 360° video stream through an HTML5 video player adapted to 360° playback. It manages the video stream from the streaming module and synchronizes the video inputs (UAV video streams) with the outputs (the video players requesting a given stream). The 360° video can be viewed on any device with a web browser. WebVR was chosen mainly because it offers an immersive view on any device with access to a web browser, from a simple cardboard viewer to an HMD.
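
To make the streaming path concrete, here is a minimal, hypothetical sketch (in Python, not the project's actual code) of how a relay could synchronize the UAV video inputs with the players requesting them while keeping latency low: each player gets a one-slot queue and stale frames are dropped rather than buffered. The class and method names are illustrative.

```python
import asyncio
from collections import defaultdict


class StreamRelay:
    """Toy relay: each UAV stream has a name, every player that requests that
    name gets a one-slot queue, and stale frames are dropped so a slow player
    always receives the newest frame instead of a growing backlog."""

    def __init__(self) -> None:
        # stream name -> queues of the players subscribed to that stream
        self.players: dict[str, set[asyncio.Queue]] = defaultdict(set)

    def subscribe(self, stream: str) -> asyncio.Queue:
        q: asyncio.Queue = asyncio.Queue(maxsize=1)
        self.players[stream].add(q)
        return q

    def publish(self, stream: str, frame: bytes) -> None:
        for q in self.players[stream]:
            if q.full():
                q.get_nowait()      # drop the stale frame for a lagging player
            q.put_nowait(frame)     # always deliver the freshest frame
```

In the actual testbed the delivery end is the HTML5/WebVR player described above; the sketch only illustrates the fan-out and frame-dropping logic.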

Control and monitoring module

This module is in charge of two functions: i) forwarding the user's control commands from the web application to the flight controller module, and ii) updating the user with the UAV's sensor information, such as altitude, latitude, longitude and speed, as well as LTE- and 5G-related information from the modem connected to the UAV. This information is integrated into the 360° immersive view of the HMD and is visualized by clicking on virtual elements within the view.
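
As an illustration of the two message directions this module handles, the hedged sketch below defines hypothetical message shapes and serialization helpers in Python; the field names and units are assumptions, not the project's actual protocol.

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class ControlCommand:
    """Hypothetical body-movement command forwarded from the web application
    to the flight controller."""
    roll: float       # rad
    pitch: float      # rad
    yaw_rate: float   # rad/s
    thrust: float     # normalized 0..1


@dataclass
class TelemetryUpdate:
    """Hypothetical status update pushed back into the immersive view."""
    altitude_m: float
    latitude: float
    longitude: float
    speed_mps: float
    rsrp_dbm: float   # LTE/5G signal strength reported by the modem


def encode(msg) -> bytes:
    """Serialize a message for transport between web app, edge server and UAV."""
    return json.dumps(asdict(msg)).encode()


def decode_command(raw: bytes) -> ControlCommand:
    """Parse a command received from the web application."""
    return ControlCommand(**json.loads(raw))
```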

VR Treadmill

The treadmill tracks the user's movements (walking speed, heading, height) and translates them into drone maneuvers. We implemented an algorithm that reads the sensor measurements from the treadmill platform and converts them into drone commands. We also exploited the drone's sensor feedback to implement the so-called haptic feedback: the model takes the drone's movements into account and translates them into vibration frequencies that the VR user senses while flying the drone. Different maneuvers produce different torques at the copter's motors, which the algorithm translates into different vibration amplitudes at the treadmill. The haptic feedback algorithm maps the motor vibration frequencies one-to-one to distinct vibration frequencies and amplitudes, so the user feels a different sensation for each maneuver.
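
The sketch below illustrates, under assumed gains and constants, the kind of mapping described here: treadmill measurements to drone velocity commands, and motor load to a vibration frequency and amplitude. All names and numbers are illustrative, not the project's actual parameters.

```python
from dataclasses import dataclass


@dataclass
class TreadmillSample:
    speed_mps: float     # walking speed
    heading_rad: float   # user orientation
    height_m: float      # user crouch/stand height


@dataclass
class DroneCommand:
    forward_mps: float
    yaw_rad: float
    climb_mps: float


def treadmill_to_drone(s: TreadmillSample, ref_height_m: float = 1.7,
                       gain_fwd: float = 2.0, gain_climb: float = 1.5) -> DroneCommand:
    """Map treadmill measurements to drone maneuvers: walking speed drives
    forward velocity, body heading drives yaw, and crouching/standing relative
    to a reference height drives descent/climb. Gains are illustrative."""
    return DroneCommand(
        forward_mps=gain_fwd * s.speed_mps,
        yaw_rad=s.heading_rad,
        climb_mps=gain_climb * (s.height_m - ref_height_m),
    )


def haptic_feedback(motor_torques_nm: list[float],
                    min_amp: float = 0.1, max_amp: float = 1.0) -> tuple[float, float]:
    """Map the copter's motor torques to a treadmill vibration
    (frequency in Hz, amplitude 0..1): stronger maneuvers feel stronger."""
    load = sum(abs(t) for t in motor_torques_nm) / max(len(motor_torques_nm), 1)
    amplitude = min(max_amp, min_amp + 0.2 * load)
    frequency_hz = 40.0 + 20.0 * load   # illustrative frequency mapping
    return frequency_hz, amplitude
```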

A demo was presented at GITEX, the world's largest tech event, in Dubai. A user controlled the drone with their body movements based on 4K 360° video feedback while the drone was in Abu Dhabi, roughly 100 km away, with an end-to-end video delay below 500 ms.

We also tested the setup from Aalto University while the drone was at TII in Abu Dhabi, UAE; there the user controlled the drone using VR joysticks, and we measured an end-to-end delay of 900 ms.

For more information, please contact Nassim Sehad.
