Robust and efficient target-tracking algorithms embedded on moving platforms are a requirement for many computer vision and robotic applications. However, deployment of a real-time system is challenging, even with the computational power of modern hardware. As inspiration, we look to biological solutions: lightweight and low-powered flying insects. For example, dragonflies pursue prey and mates within cluttered, natural environments, deftly selecting their target amidst swarms. In our laboratory, we study the physiology and morphology of dragonfly 'small target motion detector' neurons likely to underlie this pursuit behaviour. Here we describe our insect-inspired tracking model derived from these data and compare its efficacy and efficiency with state-of-the-art engineering models. For model inputs, we use both publicly available video sequences and our own task-specific dataset (small targets embedded within natural scenes). In the context of the tracking problem, we describe differences in object statistics across the video sequences. For the general dataset, our model often locks on to small components of larger objects, tracking these moving features. When input imagery includes small moving targets, to which our highly nonlinear filtering is matched, our model outperforms state-of-the-art trackers in robustness. In all scenarios, our insect-inspired tracker runs at least twice as fast as the comparison algorithms.
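The abstract does not specify the model's filtering stages, but trackers inspired by 'small target motion detector' neurons are commonly built around an elementary small target motion detector (ESTMD) style pipeline: temporal high-pass filtering, spatial centre-surround antagonism, half-wave rectification into ON and OFF channels, and a delay-and-correlate stage. The sketch below illustrates that generic pipeline under stated assumptions; it is not the paper's published implementation, and the class name, stage ordering, kernel size, and time constants (`ESTMDSketch`, `tau_bg`, `tau_delay`, `surround_size`) are hypothetical choices for illustration.

```python
# Minimal sketch of an ESTMD-style small-target detector.
# ASSUMPTION: this is a generic illustration of the technique, NOT the
# model published in the paper; all parameter values are hypothetical.
import numpy as np
from scipy.ndimage import uniform_filter


class ESTMDSketch:
    """Generic ESTMD-style pipeline for detecting small dark moving targets."""

    def __init__(self, tau_bg=0.9, tau_delay=0.7, surround_size=7):
        self.tau_bg = tau_bg                # background low-pass coefficient (assumed)
        self.tau_delay = tau_delay          # OFF-channel delay coefficient (assumed)
        self.surround_size = surround_size  # surround kernel width in pixels (assumed)
        self.background = None              # running low-pass estimate of the scene
        self.delayed_off = None             # delayed OFF-channel state

    def step(self, frame):
        """Process one grayscale frame (2-D float array); return (row, col) of peak response."""
        if self.background is None:
            self.background = frame.astype(float).copy()
            self.delayed_off = np.zeros_like(self.background)

        # 1. Temporal high-pass: subtract a slowly adapting background
        #    estimate so only changing parts of the scene survive.
        self.background = self.tau_bg * self.background + (1 - self.tau_bg) * frame
        hp = frame - self.background

        # 2. Spatial centre-surround antagonism: centre minus local mean
        #    suppresses features larger than the surround, keeping small ones.
        cs = hp - uniform_filter(hp, size=self.surround_size)

        # 3. Half-wave rectification into independent ON (brightening) and
        #    OFF (dimming) channels.
        on = np.maximum(cs, 0.0)
        off = np.maximum(-cs, 0.0)

        # 4. Delay the OFF channel and correlate with the undelayed ON
        #    channel: a dark target passing a pixel dims it first (OFF) and
        #    re-brightens it as it leaves (ON), so the product is large only
        #    for small, dark, moving features.
        self.delayed_off = (self.tau_delay * self.delayed_off
                            + (1 - self.tau_delay) * off)
        response = self.delayed_off * on

        return np.unravel_index(np.argmax(response), response.shape)


# Demo on synthetic input: a one-pixel dark target drifting over a noisy
# bright background (hypothetical data, for illustration only).
rng = np.random.default_rng(0)
frames = [np.clip(rng.normal(0.8, 0.02, (64, 64)), 0.0, 1.0) for _ in range(20)]
for t, f in enumerate(frames):
    f[30, 5 + 2 * t] = 0.1  # dark target moving rightward at 2 px/frame

tracker = ESTMDSketch()
for f in frames:
    row, col = tracker.step(f)  # peak response trails just behind the target
print(row, col)
```

The delay-and-correlate stage behaves like a matched filter for the spatiotemporal signature of a small, dark, moving target, which is consistent with the abstract's observation that robustness is highest when the input contains the small moving targets the nonlinear filtering is matched to.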

Original publication

DOI: 10.1088/1748-3190/aa5b48
Type: Journal article
Journal: Bioinspiration & Biomimetics
Publication Date: 16/02/2017
Volume: 12