MARINE: A Computer Vision Model for Detecting Rare Predator-Prey Interactions in Animal Videos

Open Access
Publication date 2025
Host editors
  • Anirban Dasgupta
  • Rage Uday Kiran
  • Radwa El Shawi
  • Satish Srirama
  • Mainak Adhikari
Book title Big Data and Artificial Intelligence
Book subtitle 12th International Conference, BDA 2024, Hyderabad, India, December 17–20, 2024, Proceedings
ISBN
  • 9783031818202
ISBN (electronic)
  • 9783031818219
Series Lecture Notes in Computer Science
Event 12th International Conference on Big Data and Artificial Intelligence, BDA 2024
Pages (from-to) 183–199
Publisher Cham: Springer
Organisations
  • Faculty of Science (FNWI) - Institute for Biodiversity and Ecosystem Dynamics (IBED)
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract

Encounters between predators and prey play an essential role in ecosystems, but their rarity makes them difficult to detect in video recordings. Although advances in action recognition (AR) and temporal action detection (AD), especially transformer-based models and vision foundation models, have achieved high performance on human action datasets, animal videos remain relatively under-researched. This paper addresses this gap by proposing MARINE, a model that combines motion-based frame selection designed for fast animal actions with DINOv2 feature extraction and a trainable classification head for action recognition. MARINE outperforms VideoMAE in identifying predator attacks in videos of fish, both on a small, domain-specific coral reef dataset (81.53% vs. 52.64% accuracy) and on a subset of the larger Animal Kingdom dataset (94.86% vs. 83.14% accuracy). In a multi-label setting on a representative sample of Animal Kingdom, MARINE achieves 23.79% mAP, placing it mid-field among existing benchmarks. Furthermore, on an AD task on the coral reef dataset, MARINE achieves 80.78% AP (vs. VideoMAE's 34.89%), albeit at a lowered t-IoU threshold of 25%. Despite room for improvement, MARINE thus offers an effective starting framework for AR and AD tasks on animal recordings, contributing to the study of natural ecosystems.
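The abstract's pipeline — select the highest-motion frames, then classify features extracted from them — can be illustrated with a minimal sketch. This is not the authors' implementation: the frame-scoring here assumes a simple mean-absolute-difference motion proxy between consecutive grayscale frames, and the function name `select_motion_frames` is hypothetical. The DINOv2 backbone and trainable head are only indicated in a comment, since the paper does not specify their wiring beyond "frozen features plus classification head".

```python
import numpy as np

def select_motion_frames(frames: np.ndarray, k: int) -> np.ndarray:
    """Pick the k frames with the largest inter-frame motion.

    frames: (T, H, W) grayscale video as a NumPy array.
    Motion for frame t is approximated by the mean absolute
    pixel difference to frame t-1; frame 0 scores 0.
    """
    # Per-frame motion score from consecutive differences.
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0)).mean(axis=(1, 2))
    scores = np.concatenate([[0.0], diffs])
    # Indices of the k highest-motion frames, restored to temporal order.
    top = np.sort(np.argsort(scores)[-k:])
    return frames[top]

# Hypothetical downstream step (not runnable here): the selected frames
# would be embedded with a frozen DINOv2 backbone, and only a small
# classification head would be trained on those embeddings.
```

A toy video with one bright burst would have its highest-motion frames fall at the burst, so those are the frames the selector keeps.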

Document type Conference contribution
Language English
Published at https://doi.org/10.1007/978-3-031-81821-9_11