Biodiversity Information Science and Standards: Conference Abstract
Corresponding author: Jonas Grieb (jonas.grieb@senckenberg.de), Claus Weiland (cweiland@senckenberg.de)
Received: 14 Aug 2024 | Published: 15 Aug 2024
© 2024 Jonas Grieb, Claus Weiland, Alexander Wolodkin, Leyla Jael Castro, Stian Soiland-Reyes, Maya Beukes, Martin Jansen
This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Citation:
Grieb J, Weiland C, Wolodkin A, Castro LJ, Soiland-Reyes S, Beukes M, Jansen M (2024) Using Webby FDOs to Integrate AI Taxon Identification and Citizen Science. Biodiversity Information Science and Standards 8: e134757. https://doi.org/10.3897/biss.8.134757
Camera traps and passive acoustic devices provide particularly useful non-invasive methods to document wildlife diversity, ecology, behavior, and conservation. The application of autonomous Internet of Things (IoT) sensors is developing rapidly and opens up new possibilities for research and nature conservation, such as taxon identification based on real-time audio processing in the field.
Machine learning, by contrast, can generate baseline annotations at scale on high-throughput data, but may not capture the level of detail achieved by the complex contextual understanding of human annotators.
We developed the WildLIVE platform, which combines both approaches.
Data flow in the WildLIVE portal: A machine-learning service provides automatic high throughput annotations as baseline (a). Annotations are then subject to review and refinement by citizen scientists (b). Reviewed data are subsequently compiled into new training data (c). The data model (d), based on lightweight packaging with RO-Crate, captures contextual (provenance) information from all human and machine-based operations.
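The loop in steps (a)–(c) can be sketched in a few lines of Python. All names below (image filenames, taxon labels, confidence values, review outcomes) are invented for illustration and are not drawn from the WildLIVE codebase:

```python
# Illustrative sketch of the WildLIVE annotation loop (a)-(c).

def machine_annotate(images):
    """(a) A machine-learning service produces baseline annotations.
    Stubbed here: every image gets the same (taxon, confidence) guess."""
    return {img: ("Panthera onca", 0.62) for img in images}

def citizen_review(baseline, corrections):
    """(b) Citizen scientists review each baseline annotation;
    a human correction, when present, overrides the machine label."""
    return {img: corrections.get(img, taxon)
            for img, (taxon, _confidence) in baseline.items()}

def compile_training_data(reviewed):
    """(c) Reviewed annotations are compiled into new (image, label)
    training pairs for the next model iteration."""
    return sorted(reviewed.items())

images = ["cam01_0001.jpg", "cam01_0002.jpg"]
baseline = machine_annotate(images)
reviewed = citizen_review(baseline, {"cam01_0002.jpg": "Leopardus pardalis"})
training = compile_training_data(reviewed)
```

The key design point is that the machine output is only a baseline: human review always takes precedence before anything is fed back into training.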
To this effect, the data model of WildLIVE features a “webby” FDO approach that leverages web-based components, namely Research Object Crate (RO-Crate) and FAIR Signposting, to package an observing process’s contextual information (e.g., sensor metadata, geolocation, and links to the content stream) together with operational semantics that give machines the information needed to process the data autonomously (e.g., to detect regions of interest in images or to identify taxa in audio streams).
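As an illustration, a minimal ro-crate-metadata.json for a single camera-trap image might look as follows, with FAIR Signposting advertising it via an HTTP Link header. The identifiers, coordinates, and URL are hypothetical; only the RO-Crate/schema.org terms and the `describedby` link relation come from the respective specifications:

```python
import json

# A minimal RO-Crate metadata descriptor for one camera-trap observation.
# Property names follow RO-Crate 1.1 / schema.org; the specific IDs,
# filenames, and coordinates are invented for this example.
crate = {
    "@context": "https://w3id.org/ro/crate/1.1/context",
    "@graph": [
        {"@id": "ro-crate-metadata.json",
         "@type": "CreativeWork",
         "conformsTo": {"@id": "https://w3id.org/ro/crate/1.1"},
         "about": {"@id": "./"}},
        {"@id": "./",
         "@type": "Dataset",
         "hasPart": [{"@id": "cam01_0001.jpg"}]},
        {"@id": "cam01_0001.jpg",
         "@type": "File",
         "encodingFormat": "image/jpeg",
         "contentLocation": {"@id": "#camera-site-1"}},
        {"@id": "#camera-site-1",
         "@type": "Place",
         "geo": {"@id": "#geo-1"}},
        {"@id": "#geo-1",
         "@type": "GeoCoordinates",
         "latitude": -16.52,
         "longitude": -60.81},
    ],
}
metadata = json.dumps(crate, indent=2)

# FAIR Signposting: the observation's landing page advertises the crate
# in an HTTP Link header using the "describedby" relation.
link_header = ('<https://example.org/obs/1/ro-crate-metadata.json>; '
               'rel="describedby"; type="application/ld+json"')
```

A client that receives such a Link header can follow the `describedby` target to retrieve the crate and act on its machine-readable packaging without any out-of-band knowledge of the portal.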
The talk will provide an overview of the platform’s development status and the technology stack employed (combining RO-Crate and FAIR Signposting with AI plus “Humans-in-the-Loop”) for data exchange with emerging data infrastructures such as the Common European Data Spaces.
Human-in-the-Loop, deep learning, FAIR Digital Object, RO-Crate, machine-actionability, Wildlife Monitoring Ontology, Common European Data Spaces, camera trap
Claus Weiland
SPNHC-TDWG 2024
CAMTRAPPER – CAMera TRap Archive and Public Portal for Exploration & Research, Deutsche Forschungsgemeinschaft (DFG) no. 437771903
AI4WildLIVE - Citizen Science Portal, Archive and Analysis Tools for Multimodal Monitoring Data, funding line: BiodivKI - Methods of Artificial Intelligence as an Instrument of Biodiversity Research, German Federal Ministry of Education and Research (BMBF)