Introduction

The flying action cam is a challenging project in which we want to build a semi-autonomous drone that can track and fly behind a person during action sport scenarios. To this end, we propose a radically new type of human-robot interaction that brings together elements from robotics, Augmented Reality (AR), and ubiquitous computing. While Micro Aerial Vehicles (MAVs) have been explored in the context of autonomous flight, this project will for the first time explore how they can be made fully interactive, working in synergy with users in real-world environments. To be usable in the proposed way, quadrotors need a certain degree of autonomy as well as fast and robust self-localization in 3D. In addition, the MAV should be able to handle dynamic changes in the environment and fly safely when humans are present.
Because we want the quadrotor to track and follow the person precisely, even during fast changes of motion, we want to predict the human's motion using body-worn sensors. In particular, we want to instantaneously estimate the person's velocity and acceleration as well as the direction of movement.
The ideal candidate will have a background in machine learning and estimation. Solid programming skills, especially on embedded platforms, and an interest in hands-on development and experimentation are also required.
Programming languages: C/C++ and MATLAB/Simulink
Supervisor: Prof. Otmar Hilliges, Tobias Naegeli (firstname.lastname@example.org)
Place: ETH Institute for Pervasive Computing
CLS Student Project (MPG ETH CLS)
Information, Computing and Communication Sciences