Insects are declining in abundance and diversity, but their population trends remain uncertain because insects are difficult to monitor. Manual methods require substantial time investment in trapping and subsequent species identification. Camera trapping can alleviate some of the manual fieldwork, but the large quantities of image data are challenging to analyse. By embedding the image analysis in the recording process using computer vision techniques, it is possible to focus effort on the most ecologically relevant image data. Here, we present an intelligent camera system capable of detecting, tracking, and identifying individual insects in situ. We constructed the system from commercial off-the-shelf components and used open-source deep learning software to perform species detection and classification. We present the Insect Classification and Tracking (ICT) algorithm, which performs real-time classification and tracking at 0.33 frames per second. The system can upload summary data on the identity and movement track of insects to a server via the internet on a daily basis. We tested our system during the summer of 2020 and detected 2994 insect tracks across 98 days. We achieved an average precision of 89% for correctly classified insect tracks of eight different species. This result was based on 504 manually verified tracks observed in videos during 10 days with varying levels of insect activity. Using the track data, we could estimate the mean residence time of individual flower-visiting insects within the field of view of the camera and show substantial variation in residence time among insect taxa. For honeybees, which were the most abundant, residence time also varied through the season in relation to the plant species in bloom. Our proposed automated system showed promising results for non-destructive, real-time monitoring of insects and provides novel information about the phenology, abundance, foraging behaviour, and movement ecology of flower-visiting insects.
Remote Sensing in Ecology and Conservation – Wiley
Published: Jun 1, 2022
Keywords: Computer vision; deep learning; insects; pollinators; real‐time; tracking
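To make the abstract's tracking and residence-time ideas concrete, the sketch below shows one way per-frame insect detections could be linked into tracks and residence times derived at the reported rate of 0.33 frames per second. It is an illustrative sketch only, not the authors' ICT implementation: the matching step (SciPy's linear_sum_assignment), the distance threshold, and all names are assumptions introduced here, since the abstract does not specify them.

# Minimal sketch: link per-frame insect detections (x, y centre points) into
# tracks and derive residence time in the field of view. The real ICT
# algorithm is more elaborate; thresholds and structure here are illustrative.
import numpy as np
from scipy.optimize import linear_sum_assignment

FRAME_INTERVAL_S = 1 / 0.33   # ~3 s between frames at 0.33 fps (from the abstract)
MAX_MATCH_DIST = 100.0        # assumed pixel threshold for accepting a match

def link_frame(tracks, detections, frame_idx):
    """Assign this frame's detections to open tracks by minimising total centre distance."""
    if not detections:
        return
    if not tracks:
        tracks.extend({"points": [d], "frames": [frame_idx]} for d in detections)
        return
    cost = np.array([[np.hypot(t["points"][-1][0] - d[0], t["points"][-1][1] - d[1])
                      for d in detections] for t in tracks])
    rows, cols = linear_sum_assignment(cost)   # optimal one-to-one assignment
    matched = set()
    for r, c in zip(rows, cols):
        if cost[r, c] <= MAX_MATCH_DIST:       # accept only plausible matches
            tracks[r]["points"].append(detections[c])
            tracks[r]["frames"].append(frame_idx)
            matched.add(c)
    for c, d in enumerate(detections):
        if c not in matched:                   # unmatched detections start new tracks
            tracks.append({"points": [d], "frames": [frame_idx]})

def residence_time_s(track):
    """Residence time: span of frames over which an individual stayed in view."""
    return (track["frames"][-1] - track["frames"][0]) * FRAME_INTERVAL_S

# Example: three consecutive frames of (x, y) detections
tracks = []
for i, frame in enumerate([[(100, 200)], [(110, 205), (400, 300)], [(120, 210)]]):
    link_frame(tracks, frame, i)
print([residence_time_s(t) for t in tracks])   # e.g. [~6.1 s, 0.0 s]

Per-track summaries like these (class label, first and last frame, residence time) are the kind of compact record the abstract describes uploading daily, rather than raw images.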