P.-Y. Hsing
Economical crowdsourcing for camera trap image classification
Hsing, P.-Y.; Bradley, S.P.; Kent, V.T.; Hill, R.A.; Smith, G.C.; Whittingham, M.J.; Cokill, J.; Crawley, D.; Volunteers, MammalWeb; Stephens, P.A.
Authors
Professor Steven Bradley s.p.bradley@durham.ac.uk
V.T. Kent
Professor Russell Hill r.a.hill@durham.ac.uk
G.C. Smith
M.J. Whittingham
J. Cokill
D. Crawley
MammalWeb Volunteers
Professor Philip Stephens philip.stephens@durham.ac.uk
Abstract
Camera trapping is widely used to monitor mammalian wildlife but creates large image datasets that must be classified. In response, there is a trend towards crowdsourcing image classification. For high‐profile studies of charismatic faunas, many classifications can be obtained per image, enabling consensus assessments of the image contents. For more local‐scale or less charismatic communities, however, demand may outstrip the supply of crowdsourced classifications. Here, we consider MammalWeb, a local‐scale project in North East England, which involves citizen scientists in both the capture and classification of sequences of camera trap images. We show that, for our global pool of image sequences, the probability of correct classification exceeds 99% with about nine concordant crowdsourced classifications per sequence. However, there is high variation among species. For highly recognizable species, species‐specific consensus algorithms could be even more efficient; for difficult to spot or easily confused taxa, expert classifications might be preferable. We show that two types of incorrect classifications – misidentification of species and overlooking the presence of animals – have different impacts on the confidence of consensus classifications, depending on the true species pictured. Our results have implications for data capture and classification in increasingly numerous, local‐scale citizen science projects. The species‐specific nature of our findings suggests that the performance of crowdsourcing projects is likely to be highly sensitive to the local fauna and context. The generality of consensus algorithms will, thus, be an important consideration for ecologists interested in harnessing the power of the crowd to assist with camera trapping studies.
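The abstract's headline result rests on a concordance rule: crowdsourced classifications accumulate for each image sequence until enough of them agree, at which point the modal label is accepted as the consensus. The snippet below is a minimal, hypothetical Python sketch of that idea; the function name, input format and fixed threshold of nine are illustrative assumptions echoing the abstract's "about nine concordant classifications" figure, not the species-specific consensus algorithms evaluated in the paper.

```python
from collections import Counter
from typing import Optional


def consensus_label(classifications: list[str], threshold: int = 9) -> Optional[str]:
    """Return a consensus species label for one image sequence once the most
    common crowdsourced label has at least `threshold` concordant votes;
    otherwise return None (i.e. keep collecting classifications).

    The default threshold of 9 mirrors the abstract's global-pool result
    (>99% probability of correct classification); it is an illustrative
    choice, not the paper's algorithm.
    """
    if not classifications:
        return None
    label, count = Counter(classifications).most_common(1)[0]
    return label if count >= threshold else None


# Example: ten classifications for one sequence, nine of them concordant.
votes = ["roe deer"] * 9 + ["red fox"]
print(consensus_label(votes))  # -> "roe deer"
```

In practice the paper finds that a single threshold is a blunt instrument: easily recognized species could reach reliable consensus with fewer classifications, while easily confused or easily overlooked taxa may warrant expert review instead.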
Citation
Hsing, P., Bradley, S., Kent, V., Hill, R., Smith, G., Whittingham, M., …Stephens, P. (2018). Economical crowdsourcing for camera trap image classification. Remote Sensing in Ecology and Conservation, 4(4), 361-374. https://doi.org/10.1002/rse2.84
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Apr 13, 2018 |
| Online Publication Date | Jul 4, 2018 |
| Publication Date | Dec 31, 2018 |
| Deposit Date | Apr 18, 2018 |
| Publicly Available Date | Apr 19, 2018 |
| Journal | Remote Sensing in Ecology and Conservation |
| Publisher | Wiley Open Access |
| Peer Reviewed | Peer Reviewed |
| Volume | 4 |
| Issue | 4 |
| Pages | 361-374 |
| DOI | https://doi.org/10.1002/rse2.84 |
| Public URL | https://durham-repository.worktribe.com/output/1334313 |
Files
Published Journal Article
(862 Kb)
PDF
Publisher Licence URL
http://creativecommons.org/licenses/by-nc/4.0/
Published Journal Article (Advance online version)
(859 Kb)
PDF
Publisher Licence URL
http://creativecommons.org/licenses/by-nc/4.0/
Copyright Statement
Advance online version
Accepted Journal Article
(1.2 Mb)
PDF
Publisher Licence URL
http://creativecommons.org/licenses/by-nc/4.0/
Copyright Statement
© 2018 The Authors. Remote Sensing in Ecology and Conservation published by John Wiley & Sons Ltd on behalf of Zoological Society of London. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes.