
Interactive Multi-Robot Aerial Cinematography

This project proposes an interactive multi-robot aerial cinematography framework that provides high-level position instructions to unmanned aerial vehicles (UAVs) in a distributed fashion by leveraging multi-robot system (MRS) coverage control. The control strategy is designed around the following principles:

1) the UAVs should maintain a prescribed distance from the target;

2) the MRS can dynamically observe the target from different points of view;

3) the pilot should be able to command the UAVs through high-level specifications; that is, the system should be user-friendly and expressive enough to capture different artistic styles, so that the pilot does not have to learn complex case-by-case commands.

To meet these requirements, we assume a virtual hemisphere centered at the real or virtual target, and the UAVs maintain a prescribed distance to the hemisphere’s center. The UAVs are deployed by defining their distribution over the hemisphere to provide different points of view. Furthermore, the state of the MRS can be abstracted into two exogenous inputs, namely the predicted motion of the hemisphere (target) and the density over the hemispherical surface, which are used to manipulate the MRS’s behavior. These two inputs can be determined in either a decentralized or a centralized manner, e.g., obtained from decentralized autonomous sensor-based techniques, or provided by a pilot using a human-computer interface such as a joystick or a tablet. The proposed control strategy allows the MRS to distribute itself over the hemisphere and track the exogenous inputs efficiently.
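To illustrate the idea, the sketch below runs a generic Lloyd-style coverage iteration on a hemisphere: agents are assigned the sampled surface points nearest to them, move to the density-weighted centroid of their region, and are projected back onto the surface so the prescribed distance to the target is preserved. This is a minimal, hypothetical illustration of coverage control with an exogenous density input, not the project's actual controller; the radius `R`, the Gaussian `density`, and the pilot's preferred viewpoint `view` are all assumed for the example.

```python
import numpy as np

R = 1.0  # assumed hemisphere radius (prescribed distance to the target)

def sample_hemisphere(n, seed=0):
    """Sample n points approximately uniformly on the upper hemisphere of radius R."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=(n, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    v[:, 2] = np.abs(v[:, 2])  # fold onto the upper hemisphere
    return R * v

def density(pts, mean):
    """Exogenous density: a Gaussian bump around a preferred viewpoint (assumed form)."""
    return np.exp(-4.0 * np.linalg.norm(pts - mean, axis=1) ** 2)

def lloyd_step(agents, pts, phi):
    """One Lloyd iteration: assign each sample to its nearest agent, move each
    agent to the density-weighted centroid of its region, then project the
    result back onto the hemisphere surface."""
    d = np.linalg.norm(pts[:, None, :] - agents[None, :, :], axis=2)
    owner = np.argmin(d, axis=1)               # Voronoi assignment (Euclidean)
    new = agents.copy()
    for i in range(len(agents)):
        mask = owner == i
        if mask.any():
            w = phi[mask]
            c = (w[:, None] * pts[mask]).sum(axis=0) / w.sum()
            new[i] = R * c / np.linalg.norm(c)  # keep prescribed distance to target
    return new

pts = sample_hemisphere(4000)
view = np.array([0.0, 0.0, R])  # pilot-specified preferred viewpoint (assumed input)
phi = density(pts, view)
agents = sample_hemisphere(4, seed=1)
for _ in range(20):
    agents = lloyd_step(agents, pts, phi)
# agents spread over the hemisphere, biased toward the high-density viewpoint
```

In the actual framework, the density and the hemisphere's motion would arrive as the two exogenous inputs described above, updated online by the pilot or by sensor-based autonomy, rather than being fixed as they are in this sketch.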