Recent advances in remote-sensing technology (e.g., miniaturization, longer battery life) enable the large-scale, reliable, and accurate recording of animal movements. Sensor-based tracking is further promoted by wide-ranging projects such as the Movebank data repository and the ICARUS Initiative, whose antenna on the ISS gathers animal locations on a global scale. With the resulting movement trajectories, researchers aim to better understand and address some of the most urgent problems of our time: climate change, species decline, natural disasters, and disease transmission, to name only a few. However, sensor-based trajectories commonly lack population-level coverage and do not provide a thorough view of the highly complex spatiotemporal context of movement.
In this project, we address these two issues: we use human observations of animals to enrich sensor-based trajectories and to improve their interpretation. For this enrichment, we can draw on observation descriptions, photos, and even audio and video recordings from several VGI web portals. Initially, the project focuses on data integration, uncertainty assessment, interactive matching, and trajectory annotation. Once suitable VGI is identified, we will enrich existing movement-prediction models with VGI and enable a bidirectional verification process.
Throughout this project, we realize multiple visual-interactive applications that support uncertainty-aware, semi-automated enrichment and analysis by movement ecologists. While VGI portals already provide several uncertainty and quality measures (e.g., positional accuracy, community-based rankings), we will also derive our own measures, for example from text-derived features or from an automated comparison with species distribution maps. Moreover, matching quality is of particular interest, for instance with respect to spatiotemporal distances or mismatched taxon epithets.
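The matching step above can be sketched in code. The following is a minimal, hypothetical sketch (the names `Fix`, `Observation`, and `match_observation`, as well as the distance and time thresholds, are illustrative assumptions, not the project's actual pipeline): a VGI observation is matched to the closest trajectory fix within spatiotemporal bounds, and the returned quality record keeps the raw distances and a taxon-mismatch flag so that uncertain matches remain inspectable rather than silently accepted.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))


@dataclass
class Fix:
    """One sensor-based trajectory point (hypothetical record layout)."""
    taxon: str
    lat: float
    lon: float
    time: datetime


@dataclass
class Observation:
    """One VGI record, e.g., a sighting from a web portal."""
    taxon: str
    lat: float
    lon: float
    time: datetime


def match_observation(obs, fixes, max_km=5.0, max_dt=timedelta(hours=6)):
    """Match a VGI observation to the nearest trajectory fix.

    Returns (best_fix, quality) or (None, None) if no fix lies within the
    spatiotemporal thresholds. The quality dict exposes the distances and a
    taxon-mismatch flag as simple matching-quality measures.
    """
    best, best_km = None, float("inf")
    for fix in fixes:
        if abs(fix.time - obs.time) > max_dt:
            continue  # outside the temporal matching window
        km = haversine_km(obs.lat, obs.lon, fix.lat, fix.lon)
        if km <= max_km and km < best_km:
            best, best_km = fix, km
    if best is None:
        return None, None
    quality = {
        "dist_km": round(best_km, 3),
        "dt_h": abs(best.time - obs.time).total_seconds() / 3600,
        "taxon_mismatch": obs.taxon.strip().lower() != best.taxon.strip().lower(),
    }
    return best, quality


# Usage: a stork sighting near Lake Constance matched against two fixes.
fixes = [
    Fix("Ciconia ciconia", 47.69, 9.19, datetime(2022, 5, 1, 12)),
    Fix("Ciconia ciconia", 48.00, 9.50, datetime(2022, 5, 1, 18)),
]
obs = Observation("Ciconia ciconia", 47.70, 9.20, datetime(2022, 5, 1, 13))
fix, quality = match_observation(obs, fixes)
```

In a real pipeline, the hard thresholds would be replaced by continuous uncertainty scores, but even this sketch shows why the quality record matters: a match with a large `dist_km` or a `taxon_mismatch` flag should be routed to interactive, visual verification rather than annotated automatically.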
With this project, we aim to fill distinct research gaps and to deepen the understanding of animal movements by combining the benefits of sensor recordings and human observations. The project benefits from a direct linkage to Movebank, as well as from close collaborations with the Max Planck Institute of Animal Behavior and well-established VGI portals.