Figure: Our autonomous blimp, developed in our ongoing project 'WildCap', for animal MoCap.
In this context, we design and develop novel autonomous robotic systems to address these challenges, such as motion capture of animals in the wild and active visual SLAM. Monitoring animals in their natural habitat requires aerial platforms that are relatively silent and can operate for long durations without needing to recharge. To this end, we have developed small autonomous airships (see figure) that satisfy these needs while carrying payloads of up to 1-2 kg, sufficient to deploy an on-board camera and a computer with a GPU.
In the conservation of wild animal species, collecting data about animal behavior is of utmost importance. Such data can be broken down into several categories, of which pose, shape, and behavior are key components. Existing efforts to collect such data typically involve line-of-sight observations from a stationary point of view on the ground or installing tracking devices on the animals. There are, however, a number of challenges and drawbacks associated with both approaches. Ground-based observation suffers because wild animals move over vast landscapes, so stationary devices provide only limited insight into animal behavior; landscape features such as trees, bushes, rocks, and hills occlude the animals and make it difficult to maintain visual contact; and a single-view observation carries limited information, which may not suffice to reliably estimate pose and shape, especially when an animal is unfavorably oriented with respect to the observer. Invasive approaches, on the other hand, such as anesthetizing an animal to install a tracking device, are stressful for the animal, expensive, and interfere with its normal behavior.
To address these challenges, we have developed a system to monitor animals using self-designed and self-built autonomous unmanned aerial vehicles (UAVs). These include multi-rotor drones (see inset figure) and airships (see first figure on page). Such vehicles can move relatively freely around the subjects of interest, allowing us to choose convenient points of view and estimate animal poses reliably. They can follow animals for long periods of time even in difficult landscapes, allowing for long stretches of uninterrupted observation. Placed at a sufficient distance, they are minimally invasive, which ensures that normal animal behavior patterns remain undisturbed. Overall, such a system can enable the collection of vast amounts of high-quality behavioral data in a more time-efficient and environmentally friendly way than current approaches.
Figure: Our aerial robots, developed in our ongoing project 'WildCap', tracking Przewalski's horses in the Hungarian Steppe and performing animal MoCap.
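To illustrate why multiple viewpoints resolve the ambiguity of single-view observation, the following minimal sketch triangulates a single animal keypoint from two camera views via linear (DLT) triangulation. The intrinsics, poses, and keypoint location here are illustrative assumptions, not our actual pipeline.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel observations."""
    A = np.stack([x1[0] * P1[2] - P1[0],
                  x1[1] * P1[2] - P1[1],
                  x2[0] * P2[2] - P2[0],
                  x2[1] * P2[2] - P2[1]])
    # Solution is the right singular vector of A with the smallest singular value.
    X = np.linalg.svd(A)[2][-1]
    return X[:3] / X[3]  # de-homogenize

def proj(P, X):
    """Project a 3D point into pixel coordinates."""
    u = P @ np.append(X, 1.0)
    return u[:2] / u[2]

# Hypothetical shared intrinsics and two UAV viewpoints 4 m apart.
K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
c, s = np.cos(0.2), np.sin(0.2)
R2 = np.array([[c, 0., s], [0., 1., 0.], [-s, 0., c]])  # yawed toward the subject
C2 = np.array([4., 0., 0.])                             # second camera center
P2 = K @ np.hstack([R2, (-R2 @ C2).reshape(3, 1)])

X_true = np.array([1.0, 0.5, 10.0])                     # keypoint on the animal
print(triangulate(P1, P2, proj(P1, X_true), proj(P2, X_true)))  # ~[1. 0.5 10.]
```

With noise-free observations the point is recovered exactly; in practice each extra viewpoint adds rows to the same linear system, which is why freely repositionable aerial cameras make the estimate more reliable.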
Here we approach active SLAM from a systems perspective. We have developed a robotic platform with an independently moving camera that generalizes our previously developed active SLAM method to all kinds of robotic chassis (holonomic or not) and further reduces energy consumption. As part of this solution, we developed a combined state estimation method for the robot and the independently moving camera.
Figure: Front view of our robot with the independently moving camera (in real life and in sketch)
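As a rough illustration of the idea of combined state estimation, and not the deployed system, the sketch below runs an EKF over a joint state of a planar unicycle base and a single-axis camera pan angle, fusing odometry with bearings to a known landmark measured in the camera frame. The motion model, measurement model, and noise values are all assumptions for this sketch.

```python
import numpy as np

def wrap(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

class CombinedEKF:
    """Joint EKF over base pose (x, y, yaw) and camera pan angle phi."""

    def __init__(self):
        self.s = np.zeros(4)       # state: [x, y, yaw, phi]
        self.P = np.eye(4) * 0.1   # state covariance

    def predict(self, v, omega, phi_dot, dt, Q):
        """Propagate with base odometry (v, omega) and camera pan rate."""
        x, y, th, phi = self.s
        self.s = np.array([x + v * np.cos(th) * dt,
                           y + v * np.sin(th) * dt,
                           wrap(th + omega * dt),
                           wrap(phi + phi_dot * dt)])
        F = np.eye(4)              # Jacobian of the motion model
        F[0, 2] = -v * np.sin(th) * dt
        F[1, 2] =  v * np.cos(th) * dt
        self.P = F @ self.P @ F.T + Q

    def update_bearing(self, z, landmark, r_var):
        """Fuse a bearing to a known landmark, measured in the camera frame:
        the camera observation constrains base pose and pan angle jointly."""
        x, y, th, phi = self.s
        dx, dy = landmark[0] - x, landmark[1] - y
        q = dx * dx + dy * dy
        h = wrap(np.arctan2(dy, dx) - th - phi)          # predicted bearing
        H = np.array([[dy / q, -dx / q, -1.0, -1.0]])    # measurement Jacobian
        S = H @ self.P @ H.T + r_var                     # innovation covariance
        K = self.P @ H.T / S                             # Kalman gain
        self.s += (K * wrap(z - h)).ravel()
        self.s[2], self.s[3] = wrap(self.s[2]), wrap(self.s[3])
        self.P = (np.eye(4) - K @ H) @ self.P

# Illustrative single step with made-up inputs.
ekf = CombinedEKF()
ekf.predict(v=0.5, omega=0.1, phi_dot=-0.1, dt=0.1, Q=np.eye(4) * 1e-4)
ekf.update_bearing(z=0.35, landmark=(5.0, 2.0), r_var=0.01)
print(ekf.s)  # updated [x, y, yaw, phi]
```

The key design point this sketch captures is that a single camera measurement updates both the base pose and the camera angle through one joint covariance, so the estimator works regardless of how the chassis underneath actually moves.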