Modular Robotic Conference Cam

Our challenge, your solutions


Our solution aims to make the Telepresence experience more dynamic: for example, it allows users to move around a room and show what they are working on without having to carry their entire PC.

The solution consists of various modules built around a core: the robotic arm.


Team: StarHTML Crusaders

Team members

Alex Prosdocimo, Andrea Sinibaldi, Luca Barban

Members roles and background

Andrea Sinibaldi, Developer, Telecommunications @ ITIS Rossi (4th year)
Luca Barban, Designer, Information Technology @ ITIS Rossi (4th year)
Alex Prosdocimo, Writer and Video Editor, Information Technology @ ITIS Rossi (4th year)

Contact details

Solution description

Our solution is a modular robotic arm built around a core: Comau's e.DO, a 6-axis robotic arm on which various modules can be mounted. The core also includes a Raspberry Pi that runs the body-tracking script, written in Python on top of the open-source OpenCV library and the e.DO library. The robotic arm can be controlled manually or with automatic tracking through an Android/Windows/WearOS application.

We have planned 3 main modules that could be useful in improving the Telepresence experience:

  • Simple Webcam: This is the simplest module. It consists of a 1080p webcam connected via USB to the Raspberry Pi. The webcam provides both the video output that is sent to the user's laptop or cellphone and the video feed that is analyzed by the tracking software.
  • UGV (Unmanned Ground Vehicle): This module allows even more freedom of movement. It consists of a base with four wheel-motor assemblies, a LiFePO4 battery with its BMS (Battery Management System), and 5 LiDAR sensors: one per side to avoid collisions with walls and one on the robotic arm to check the distance from the user. The UGV follows the user, using the tracking script and the LiDAR to determine the user's position in space.
  • 360° Camera: This additional camera lets users look all around the robotic arm. It is best used in conjunction with the UGV module.

How it works
While in automatic tracking, the video output is processed through OpenCV, which can identify the face of every person in frame. For simplicity, we only track the first face that is recognized. OpenCV gives us the coordinates of the face, and the script converts those Cartesian coordinates into angles using trigonometry. We then send this rotation to the robotic arm. This operation happens once per second, both to filter out OpenCV's occasional false detections and to reduce the noise of the motors.
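As an illustration, the Cartesian-to-angle conversion can be sketched as below. The frame size and field-of-view values are assumptions for a generic 1080p webcam, and the (x, y, w, h) face box is the kind of result OpenCV's `detectMultiScale` returns; this is a minimal sketch, not the actual tracking script.

```python
import math

# Assumed camera parameters (hypothetical values for illustration).
FRAME_W, FRAME_H = 1920, 1080        # 1080p webcam resolution
HFOV_DEG, VFOV_DEG = 70.0, 43.0      # horizontal/vertical field of view

def face_to_angles(x, y, w, h):
    """Convert a face bounding box (pixels) into pan/tilt offsets (degrees).

    The offsets tell the arm how far to rotate so that the face center
    ends up in the middle of the frame.
    """
    cx = x + w / 2                       # face center, pixels
    cy = y + h / 2
    # Normalized offset from the frame center, in [-0.5, 0.5].
    nx = (cx - FRAME_W / 2) / FRAME_W
    ny = (cy - FRAME_H / 2) / FRAME_H
    # Project the normalized offset onto the camera's field of view;
    # the tan-based mapping accounts for perspective better than a linear one.
    pan = math.degrees(math.atan(2 * nx * math.tan(math.radians(HFOV_DEG / 2))))
    tilt = math.degrees(math.atan(2 * ny * math.tan(math.radians(VFOV_DEG / 2))))
    return pan, tilt
```

A face centered in the frame yields (0, 0), i.e. no rotation needed; a face at the right edge yields a pan of half the horizontal field of view.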
While working with the UGV module, tracking works the same as before, but if the user gets far enough away from the webcam, the script tries to get back in range by moving the robot itself. In general, the objective of the script is to keep the user's face in the center of the frame at all times.
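This follow behaviour can be sketched as a simple threshold band on the arm-mounted LiDAR's distance reading. The thresholds and the `follow_command` helper are hypothetical, chosen only to show the idea:

```python
# Hypothetical distance band for the follow behaviour (metres).
MIN_DIST, MAX_DIST = 1.0, 2.5

def follow_command(lidar_dist):
    """Decide how the UGV base should move, given the arm-mounted
    LiDAR's measured distance to the user (in metres)."""
    if lidar_dist > MAX_DIST:
        return "forward"    # user too far: drive toward them
    if lidar_dist < MIN_DIST:
        return "backward"   # user too close: back off
    return "stop"           # in range: arm-only tracking is enough
```

While the base is stopped, the arm alone keeps the face centered; the base only moves when the user leaves the comfortable distance band.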
When controlled manually, the robot arm can be moved in Cartesian mode through a joystick in the application.

One more thing: Virtual Conference Room
We are planning to build a custom program in Unity (a 3D game engine) to create a better experience for the users. The Virtual Conference is a 3D room where people use an avatar to move around. The host's video is projected on a screen, as it would be on a projector in a real room. This program gives users more freedom: they can move around the room and talk to each other in a realistic way, since each voice is played from that user's avatar, or move closer to the host's video if something in it isn't clear.

Solution context

With our solution, we are trying to solve 2 different problems:

  • First of all, Telepresence requires a static position: if a user wants to show objects in the room, he has to move his laptop or, even worse, his tower PC. An example is a teacher doing Distance Teaching: to use the chalkboard, he would have to move his laptop to show it to his students.
  • We are also trying to solve problems related to the COVID-19 pandemic: under the new laws and rules, it has become quite difficult to show, for example, an industrial site, because people might be in quarantine or unable to leave their houses due to a lockdown in their country.

Solution target group

The solution is not targeted at a single group, but at everyone who needs to use Telepresence in a dynamic context, such as in a classroom or during a museum tour.

Solution impact

The solution would greatly improve the Telepresence experience for everyone, from the host of a virtual conference to its spectators. It is usable by anyone, even the most inexperienced user, thanks to its autonomous mode and accessible interface.
The UGV module would also help businesses that are closed due to the current COVID-19 pandemic, such as museums and art galleries, allowing them to run virtual guided tours.

Solution tweet text

Tired of the classic, boring, static video conference? Try our new product! It lets you move around while you host a videoconference, so you can show things all around you. No more boring conferences!

Solution innovativeness

We are not the first to think about a product to improve virtual conferences, but our product is the first to use autonomous tracking software and a robotic arm to achieve it.
The Virtual Conference Room is also quite innovative: nothing quite like it exists on the market yet, and it would greatly improve the experience.

Solution transferability

Given its modular nature, this solution is transferable and adaptable to a range of situations. The base modules (the ones described in the Solution Description) can be used in classrooms, conference rooms, industrial plants, museums and other venues, and with additional modules this list can be extended.

Solution sustainability

Our solution isn't something that would easily become obsolete.
It would remain useful even after the COVID-19 pandemic, for example allowing industries whose administration and production sites are in different locations to carry out virtual inspections.
The UGV is the only delicate module: it has to be charged with its specific power supply unit to avoid damaging the battery.
The battery is also the only part that has to be disposed of as hazardous waste; the rest can be disposed of as normal electronic equipment.

Solution team work

This isn't the first time we have worked together on a project like this and, as always, we enjoyed working as a team.
We would really enjoy working together again.

* Climate-KIC publishes the proposed solutions developed during the DigiEduHack event solely for the purposes of facilitating public access to the information concerning ideas and shall not be liable regarding any intellectual property or other rights that might be claimed to pertain to the implementation or use any of the proposed solutions shared on its website neither does it represent that it has made any effort to identify any such rights. Climate-KIC cannot guarantee that the text of the proposed solution is an exact reproduction of the proposed solution. This database is general in character and where you want to use and develop a proposed solution further, this is permitted provided that you acknowledge the source and the team which worked on the solution by using the team’s name indicated on the website.

DigiEduHack 2021 partners & supporters

DigiEduHack is an EIT initiative under the European Commission's Digital Education Action Plan, led by EIT Climate-KIC and coordinated by Aalto University. In 2021, the main stage event is hosted by the Slovenian Presidency of the Council of the European Union in cooperation with the International Research Center on Artificial Intelligence (IRCAI) under the auspices of UNESCO.

EIT Climate-KIC

Aalto University

European Commission

Slovenian Ministry of Education, Science and Sport

International Research Center on Artificial Intelligence

EIT Community: Human Capital