Available Projects

Preliminary remarks

If you are interested in one of the projects listed below, please contact the first assistant mentioned at the bottom of the project description by email, by phone, or in person. It is sometimes possible to have two students work on the same project; please discuss the formalities with your first assistant.

The language used in our project descriptions (often English) has no direct bearing on the language used when interacting with the assistants or when writing project reports.



Advanced Roll Control for Avian-Inspired Drone

We aim to improve the roll effectiveness of a highly maneuverable, avian-inspired feathered drone, which has thus far relied on wing-tip folding for roll control. To guarantee controllability at low angles of attack as well as in the post-stall regime, several avian-inspired roll strategies are combined. The goals of this semester project are therefore to (i) improve and manufacture the current mechanism, (ii) manufacture the wing architecture and mount it on the drone, (iii) perform comprehensive wind tunnel tests followed by data analysis, and finally (iv) implement a suitable strategy in the control architecture of the autopilot. This multifaceted project gives the student comprehensive insight into applied flight mechanics and aerodynamics, combined with the development of a state-of-the-art aeronautical device. A suitable candidate should be a do-it-yourself enthusiast.
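
As a flavor of step (iv), the sketch below shows a minimal roll-rate PID loop of the kind an autopilot's inner loop might use. This is a generic illustration only; the gains and the interface it assumes are hypothetical, not the project's actual control architecture.

```python
# Minimal sketch of a roll-rate PID loop -- the kind of inner-loop strategy
# step (iv) could plug into an autopilot. Generic illustration only; gains
# and interfaces are hypothetical, not the project's actual architecture.

class RollRatePID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_rate: float, measured_rate: float) -> float:
        """Return a normalized roll command in [-1, 1] from rates in rad/s."""
        error = target_rate - measured_rate
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        command = (self.kp * error + self.ki * self.integral
                   + self.kd * derivative)
        return max(-1.0, min(1.0, command))

# Example: one control step at 100 Hz with placeholder gains.
pid = RollRatePID(kp=0.8, ki=0.1, kd=0.02, dt=0.01)
print(pid.update(target_rate=0.5, measured_rate=0.1))
```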

Type: Master project
Period: to be defined
Section(s): Robotics, Microengineering, Mechanical Engineering, Materials Science and Engineering, Energy Science and Technology
Type of work: 20% theory, 50% hardware, 20% experiments, 10% coding
Requirements: 3D printing, carbon fiber reinforcing, do-it-yourself attitude
Subject(s): Aerodynamics, flight mechanics, mechanics, control
Responsible(s): Enrico Ajanic, Mir Feroskhan

Wearable Motion Capture System for telerobotics

Wearable sensors are a relatively young technology that has attracted great interest in industrial, clinical, and research applications in recent years. One of the main limitations in this field is the lack of a complete and versatile environment for interfacing such devices. At the LIS, we are planning to take a first step towards a framework for wearable technology, to allow its fast and proficient use for robotics research. In this project, an upper-body motion capture (MoCap) system based on inexpensive inertial measurement units (IMUs) will be designed and validated against a professional infrared (IR) facility (OptiTrack). Data acquisition, calibration, and communication will be handled by the student. The sensors will be compared with high-end IMUs and the IR system in terms of error and bandwidth. Anchoring systems and textile design will also be covered, in order to optimize the system on the user's body. Finally, you will test the developed MoCap system for the teleoperation of a simple drone. Students interested in embedded software, data processing, and wearable technology are encouraged to apply.
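
As an illustration of the kind of processing involved, the sketch below shows a standard complementary filter that fuses gyroscope and accelerometer readings into roll/pitch estimates. This is a generic textbook building block, shown under assumed units and sampling, not the project's prescribed pipeline.

```python
import numpy as np

# Generic complementary filter fusing gyro and accelerometer data into
# roll/pitch estimates -- a textbook building block for IMU-based motion
# capture, shown here only as an illustration of the processing involved.

def complementary_filter(gyro, accel, dt, alpha=0.98):
    """gyro: (N, 3) rad/s, accel: (N, 3) m/s^2. Returns (N, 2) roll/pitch in rad."""
    angles = np.zeros((len(gyro), 2))
    roll = pitch = 0.0
    for i in range(len(gyro)):
        # Integrate angular rates (high-frequency component).
        roll += gyro[i, 0] * dt
        pitch += gyro[i, 1] * dt
        # Tilt from gravity direction (low-frequency component).
        acc_roll = np.arctan2(accel[i, 1], accel[i, 2])
        acc_pitch = np.arctan2(-accel[i, 0], np.hypot(accel[i, 1], accel[i, 2]))
        # Blend the two estimates.
        roll = alpha * roll + (1 - alpha) * acc_roll
        pitch = alpha * pitch + (1 - alpha) * acc_pitch
        angles[i] = (roll, pitch)
    return angles

# Dummy 1 s of still data at 100 Hz, just to make the sketch executable.
gyro = np.zeros((100, 3))
accel = np.tile([0.0, 0.0, 9.81], (100, 1))
print(complementary_filter(gyro, accel, dt=0.01)[-1])  # approx. [0, 0]
```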

Type: Master project
Period: to be defined
Section(s): Robotics, Microengineering
Type of work: 20% theory, 50% software/firmware, 30% hardware
Requirements: Firmware development (C++), basics of electronics
Subject(s): Motion Capture, Firmware Development, System Integration
Responsible(s): Matteo Macchini, Fabrizio Schiano

Wearable haptic interface to control a drone

Learning a teleoperation task normally requires high cognitive effort as well as extensive training. In the frame of the Symbiotic Drone project, we are designing a new Human-Robot Interface (HRI) based on wearable technology, in order to ease the process of learning both the interface and the dynamics of the controlled machine. In this project, we want to test how effectively a human can learn to use their own body instead of a predefined hardware interface (e.g., a joystick). In particular, a controller based on hand motion was implemented and used to control the position of a drone following a path through obstacles in a simulated environment. At the same time, we implemented a haptic device (a glove) capable of transmitting relevant information about the robot's navigation and the surrounding environment during flight. The glove shows promising capabilities for the teleoperation of a flying robot when limited or no visual feedback is available from the machine. A hardware platform was implemented for further testing on a real quadrotor. The goal of this project is to extensively test the developed wearable interface and assess its potential for the teleoperation of single and multiple quadrotors, comparing learning time, efficiency, and the required cognitive effort against standard hardware devices such as joysticks. Students interested in flying robotics, haptics, and wearable technology are encouraged to apply.
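
As a concrete example of such a body-machine mapping, the sketch below scales hand displacement from a calibrated rest pose into a drone velocity setpoint. The proportional scheme, dead zone, and all constants are illustrative assumptions, not the interface actually implemented in the project.

```python
import numpy as np

# Illustrative body-to-drone mapping: hand displacement from a calibrated
# rest pose is scaled into a velocity setpoint for the drone. The gain,
# dead zone, and saturation values are arbitrary placeholders.

REST_POSE = np.array([0.0, 0.0, 1.2])  # calibrated hand rest position (m)
GAIN = 2.0          # m/s commanded per m of hand displacement
DEAD_ZONE = 0.05    # ignore displacements below 5 cm
V_MAX = 1.5         # saturate commands at 1.5 m/s

def hand_to_velocity(hand_pos: np.ndarray) -> np.ndarray:
    """Map a tracked hand position (m) to a drone velocity command (m/s)."""
    offset = hand_pos - REST_POSE
    if np.linalg.norm(offset) < DEAD_ZONE:
        return np.zeros(3)
    vel = GAIN * offset
    speed = np.linalg.norm(vel)
    if speed > V_MAX:
        vel *= V_MAX / speed
    return vel

# Example: hand raised 10 cm above and 20 cm in front of the rest pose.
print(hand_to_velocity(np.array([0.2, 0.0, 1.3])))
```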

Type: Master project
Period: to be defined
Section(s): Robotics Microengineering
Type of work: 20% hardware, 40% software, 40% experimental
Requirements: Python, ROS, C++
Subject(s): Wearable, telerobotics, haptics
Responsible(s): Matteo Macchini, Fabrizio Schiano

Vision-aware modelling and control for drone swarms

At the Laboratory of Intelligent Systems, we are interested in the collective behavior of drone swarms. Among the examples that nature offers, birds show extraordinary coordination abilities based only on local visual information, and we use them as a source of inspiration for our work.
Building on our recent results, which show the importance of lateral vision for coordinating the movements of the group, we want to refine our flocking performance metrics and study the impact of occlusions on the behavior of a drone swarm.
The first phase of the project will involve 3D modeling of limited vision and occlusions in an existing Matlab simulator. Through extensive simulations and systematic analysis, you will evaluate the collective behavior of the drones for different visual configurations and derive requirements for the visual sensors. Examples of relevant metrics are the correlation of the drones' movements and the number of collisions between agents.
In the second part of the project, you will focus on more realistic simulations of drone swarms involving ROS and the Gazebo simulator. You will implement a swarm controller that, given the positions and uncertainties of simulated sensors, generates collision-free commands for the swarm.
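
To make the occlusion modelling of the first phase concrete, the sketch below shows one common geometric formulation: an agent, modelled as a sphere, occludes another if it lies close enough to the line of sight between observer and target. This is a generic illustration in Python (the project's simulator itself is in Matlab), and the radius value is an assumption.

```python
import numpy as np

# Generic line-of-sight test: agent k (modelled as a sphere of radius r)
# occludes agent j from observer i if k's centre lies close enough to the
# segment between them. Illustrative only; not the simulator's actual code.

def is_occluded(p_i, p_j, p_k, radius=0.15):
    """True if the sphere at p_k (m) blocks the view from p_i to p_j."""
    d = p_j - p_i
    length = np.linalg.norm(d)
    if length == 0.0:
        return False
    d_unit = d / length
    # Project p_k onto the viewing segment and clamp to its extent.
    t = np.clip(np.dot(p_k - p_i, d_unit), 0.0, length)
    closest = p_i + t * d_unit
    # Occluded only if the occluder lies strictly between observer and target.
    return 0.0 < t < length and np.linalg.norm(p_k - closest) < radius

# Example: an agent halfway along the line of sight blocks the view.
print(is_occluded(np.zeros(3), np.array([2.0, 0, 0]), np.array([1.0, 0.1, 0])))
```
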
This project requires modelling and programming skills and previous knowledge of ROS. Quantitative and multimedia outcomes (figures, videos) are necessary to support the conclusions drawn from the project.

Type: Master project
Period: to be defined
Section(s): Robotics, Microengineering, Electrical and Electronic Engineering, School of Computer and Communication Sciences
Type of work: 30% theory, 40% software, 30% analysis
Requirements: Modelling and programming skills (Matlab, ROS); previous knowledge of Python and PX4 is preferred.
Subject(s): Swarm intelligence, multi-agent control
Responsible(s): Enrica Soria, Fabian Maximilian Schilling

Detection and relative position estimation of drones

At the Laboratory of Intelligent Systems, we are developing algorithms for decentralized, vision-based drone swarms that resemble swarms of animals found in nature. Rather than relying on communication among the drones, we are taking a biologically plausible approach to solve the problem of swarm coordination.

In the first part of the project, you will train a state-of-the-art convolutional object detector to recognize and localize drones. Challenges include the drones' relatively small size, especially when they fly fast and against complex backgrounds. With the help of an object tracker, you will then estimate the relative positions of the agents.
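
For orientation, the snippet below shows the standard torchvision recipe for fine-tuning a pre-trained Faster R-CNN on a two-class (background + drone) problem. The single dummy training step stands in for a real annotated drone dataset, and none of this prescribes the detector the project will actually use.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Standard torchvision fine-tuning recipe: load a pre-trained detector and
# replace its box predictor with a 2-class head (background + drone).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
model.train()

# One dummy training step; a real dataset of annotated drone images would
# replace these placeholder tensors.
images = [torch.rand(3, 480, 640)]
targets = [{"boxes": torch.tensor([[100.0, 120.0, 180.0, 200.0]]),
            "labels": torch.tensor([1])}]
loss_dict = model(images, targets)   # dict of classification/regression losses
loss = sum(loss_dict.values())
optimizer.zero_grad()
loss.backward()
optimizer.step()
print({k: float(v) for k, v in loss_dict.items()})
```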

The second part of the project will involve validating the relative position estimates obtained from the detector/tracker against ground-truth positions from our motion tracking system. To this end, you will implement and evaluate your system onboard a physical quadcopter.
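
One plausible way to run this validation is sketched below: the ground-truth track is interpolated to the estimator's timestamps and scored with an RMSE. Sampling rates, reference frames, and the dummy data are assumptions made purely for illustration.

```python
import numpy as np

# Illustrative evaluation: align mocap ground truth to the estimator's
# timestamps by interpolation, then compute an overall position RMSE.
# Timestamps in seconds, positions in metres; the data below are dummies.

def position_rmse(t_est, p_est, t_gt, p_gt):
    """p_est: (N, 3) estimates at t_est; p_gt: (M, 3) ground truth at t_gt."""
    gt_interp = np.stack(
        [np.interp(t_est, t_gt, p_gt[:, k]) for k in range(3)], axis=1)
    err = p_est - gt_interp
    return np.sqrt(np.mean(np.sum(err**2, axis=1)))

# Dummy circular track at 100 Hz (ground truth) and 10 Hz (estimates).
t_gt = np.linspace(0, 10, 1000)
p_gt = np.stack([np.sin(t_gt), np.cos(t_gt), np.ones_like(t_gt)], axis=1)
t_est = np.linspace(0, 10, 100)
p_est = np.stack([np.sin(t_est), np.cos(t_est), np.ones_like(t_est)], axis=1)
p_est += 0.02 * np.random.randn(*p_est.shape)  # simulated estimation noise
print(f"RMSE: {position_rmse(t_est, p_est, t_gt, p_gt):.3f} m")
```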

This project requires previous knowledge of machine learning and computer vision, specifically object detection (some relevant courses: EE-559 Deep Learning, MICRO-455 Applied Machine Learning, and CS-442 Computer Vision). Quantitative and multimedia outcomes (figures, videos, etc.) are necessary to support the conclusions drawn from the project.

Type: Master project
Period: to be defined
Section(s): Robotics, Microengineering, Electrical and Electronic Engineering, School of Computer and Communication Sciences
Type of work: 50% software, 30% experiments, 20% theory
Requirements: Courses in (deep) machine learning, excellent programming skills (specifically Python and the SciPy stack), familiarity with deep learning frameworks (Pytorch or TensorFlow)
Subject(s): Deep/machine learning, object detection, state estimation, computer vision
Responsible(s): Fabian Maximilian Schilling, Enrica Soria

Analysis of the limitations of quadrotor swarms in the real world with Crazyflies

At the Laboratory of Intelligent Systems, we develop swarming algorithms for quadcopters. These algorithms are extensively tested in a dynamics simulator developed internally. The goal of this project is to interface the current simulation setup with hardware to allow experimental testing (https://crazyswarm.readthedocs.io/en/latest/). The first phase of the project will involve the development of a Matlab/Simulink (or Python) program able to send velocity commands to a Crazyflie through ROS. The second step involves testing on hardware in an indoor room equipped with a motion tracking system. The robot should be able to accomplish a navigation mission using the commands generated through Matlab/Simulink and the measurements coming from the OptiTrack. The integration of a second drone will make it possible to evaluate the swarming behavior of the robots in the established framework. A final analysis step is necessary to assess the limitations of the tested algorithm. Previous experience with the cited software and hardware is required.
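
For reference, the sketch below streams velocity setpoints to a single Crazyflie using the Crazyswarm Python API documented at the URL above. The circular reference pattern and rate are illustrative, and details such as the setpoint handover before landing may differ across Crazyswarm versions.

```python
import numpy as np
from pycrazyswarm import Crazyswarm

# Illustrative Crazyswarm script: take off, stream world-frame velocity
# setpoints along a circle for 10 s, then land. The trajectory and rate
# are placeholders; consult the Crazyswarm docs for version specifics.

swarm = Crazyswarm()
timeHelper = swarm.timeHelper
cf = swarm.allcfs.crazyflies[0]

cf.takeoff(targetHeight=0.5, duration=2.5)
timeHelper.sleep(3.0)

RATE = 20.0  # setpoint rate in Hz
for step in range(int(10 * RATE)):
    t = step / RATE
    vel = np.array([0.3 * np.cos(t), 0.3 * np.sin(t), 0.0])  # m/s
    cf.cmdVelocityWorld(vel, yawRate=0.0)
    timeHelper.sleepForRate(RATE)

cf.notifySetpointsStop()  # hand control back to the high-level commander
cf.land(targetHeight=0.04, duration=2.5)
timeHelper.sleep(3.0)
```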

Type: Semester project
Period: to be defined
Section(s):
Type of work: 20% theory, 40% software, 40% testing
Requirements: Modelling and programming skills (Matlab, Simulink, Python), previous familiarity with ROS and hardware
Subject(s): Swarm robotics, drone formations, experimental testing
Responsible(s): Enrica Soria, Fabrizio Schiano

Drone Log Analyser

At the Laboratory of Intelligent Systems (LIS) at École Polytechnique Fédérale de Lausanne (EPFL), we are developing drones for last-cm delivery. These delivery drones operate fully autonomously with the help of the Dronistics web-application framework.

The first goal of this project is to develop software that can receive logs from a drone's autopilot after every delivery. The implementation should be compatible with the DroneCode SDK, retrieve the logs from the drone, and store them in the server's database (MongoDB). A REST API should then be developed to provide specific data from the logs based on user queries; it will be used in the next step.

The second goal is to implement a web-based Log Analyser on top of this REST API. It should analyze the corresponding logs and present the meaningful data to the user. The Log Analyser should be as generic as possible and capable of decoding logs from (at least) the PX4 and ArduPilot autopilots. At the end of the implementation, the user should be able to visualize the logs as interactive graphs of battery level, altitude, velocity, and other sensor readings over time.

The third goal is to implement a machine learning or deep learning algorithm that analyzes the logs automatically and reports/notifies the user if an anomaly has been detected. The implementation should be supported by a strong state-of-the-art study and feasibility analysis.

All of the above features should be well documented, unit tested, and made available through a web-based user interface that is part of the Dronistics software framework.
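
As a sketch of the server side of the second goal, the snippet below exposes one log time series through a small Flask/MongoDB REST endpoint. The database name, collection, document schema, and route are all assumptions for illustration; the actual implementation would need to fit the Dronistics framework and the Java stack listed in the requirements.

```python
from flask import Flask, jsonify
from pymongo import MongoClient

# Hypothetical sketch of the REST layer: one endpoint that returns a single
# time series (e.g. battery voltage) from a stored flight log. Database
# name, collection name, and document schema are all assumptions.

app = Flask(__name__)
db = MongoClient("mongodb://localhost:27017")["dronistics"]

@app.route("/api/logs/<flight_id>/series/<field>")
def get_series(flight_id: str, field: str):
    # Assumed schema: one document per flight, with a "series" sub-document
    # mapping field names (battery, altitude, ...) to [timestamp, value] pairs.
    doc = db.flight_logs.find_one({"flight_id": flight_id})
    if doc is None or field not in doc.get("series", {}):
        return jsonify({"error": "not found"}), 404
    return jsonify({"flight_id": flight_id, "field": field,
                    "points": doc["series"][field]})

if __name__ == "__main__":
    app.run(port=5000)
```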

Type: Semester project
Period: to be defined
Section(s): Robotics, Microengineering
Type of work: 70% software, 10% hardware, 20% testing
Requirements: Java, full-stack development, frontend design; knowledge of machine learning or deep learning is a plus
Subject(s): Software architecture, IoT
Responsible(s): Anand Bhaskaran, Przemyslaw Kornatowski