Bring your own camera to the trap: An inexpensive, versatile, and portable triggering system tested on wild hummingbirds

INTRODUCTION

Studying animals in their natural habitat is advantageous, and preferable to working with them in captivity, for a variety of research questions. Natural habitats allow for the study of behavior in situ, offer opportunities to measure the energetic performance of free-living animals, and open doors to understanding intra- and interspecific interactions. Researchers face three main challenges while working with wild animals: (1) neutralizing the observer effect (Baker & McGuffin, 2007; Wade, Zalucki, & Franzmann, 2005), (2) dealing with long waiting times (review in Cutler & Swann, 1999), and (3) compensating for human sensory limitations (Weale, 1961). These challenges are addressed with camera traps (reviews in Cutler & Swann, 1999; O'Connell, Nichols, & Karanth, 2011; Rowcliffe & Carbone, 2008), which are used to detect animals in a given area (Karanth, 1995; Silveira, Jácomo, & Diniz-Filho, 2003) and to study behavior (Bischof, Ali, Kabir, Hameed, & Nawaz, 2014; Gula, Theuerkauf, Rouys, & Legault, 2010; Ohashi, D'Souza, & Thomson, 2010). Nevertheless, commercial camera traps are effective with relatively few taxa and can be prohibitively expensive (Meek & Pittet, 2012).
Our goal was to design an alternative with the flexibility to study a larger variety of taxa at the lowest possible cost.

We developed a system with many advantages over available camera traps: (1) it functions mechanically and can be coupled with cameras that do not support remote triggering; (2) it can incorporate most sensors, allowing modes of detection other than movement (e.g., light, color, sound); (3) it offers the versatility to position multiple sensors separately from the camera, adapting to the subject of interest; (4) it is inexpensive enough to be affordable in field projects requiring multiple camera traps; (5) because it is not married to any particular camera, it can be updated when a given technology becomes obsolete; (6) it is powered by standard AA batteries for long durations, facilitating recharging, easy replacement, and accessibility in remote locations; (7) it is weatherproof, light, and portable, allowing the deployment of several units with few personnel; and (8) the triggering and sensor systems are easy to customize in the field to adapt to changing conditions or objectives.

Most camera traps have been designed to capture large animals using passive infrared (PIR) motion sensors (Meek & Pittet, 2012; Rovero, Zimmermann, Berzi, & Meek, 2013; Welbourne, Claridge, Paull, & Lambert, 2016), although some applications have been aimed at small mammals (Pearson, 1960; Soininen & Jensvoll, 2015; Villette, Krebs, Jung, & Boonstra, 2016), birds (Bolton, Butcher, Sharpe, Stevens, & Fisher, 2007; Kross & Nelson, 2011), and arthropods (review in Steen & Ski, 2014). PIR sensors detect a change in surface temperatures, such as when an animal with a surface temperature different from the background enters the scene (Welbourne et al., 2016).
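The PIR principle can be caricatured in a few lines of Python. This is a toy model, not actual sensor firmware, and the threshold value is an arbitrary illustration: the point is that a PIR element fires on a *change* in apparent surface temperature across its field of view.

```python
def pir_triggered(readings_c, threshold_c=0.5):
    """Toy PIR model: fire when the apparent surface temperature in
    the field of view changes by more than threshold_c between
    consecutive readings (e.g., a warm body entering a cooler scene)."""
    return any(abs(b - a) > threshold_c
               for a, b in zip(readings_c, readings_c[1:]))

# A warm bird entering a 20 degC scene trips the sensor:
pir_triggered([20.0, 20.1, 33.0])   # True
# A subject thermally similar to the background does not (a false negative):
pir_triggered([20.0, 20.1, 20.4])   # False
```

The second case is exactly the failure mode that makes small or poikilothermic subjects hard to detect with PIR.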
Despite broad use, PIR sensors present various problems, such as false negatives when the subject's surface temperature differs minimally from the background (Welbourne et al., 2016), and false positives from vegetation blowing in the wind (Rovero et al., 2013; Welbourne et al., 2016). Innovations for improving detection of smaller (or poikilothermic) taxa include active infrared (AIR) sensors that trigger when an infrared beam is crossed (Hernandez, Rollins, & Cantu, 1997; Rovero et al., 2013; Swann, Hass, Dalton, & Wolf, 2004), cameras with video motion detection that trigger when there is movement in the selected field of view (Bolton et al., 2007; Kross & Nelson, 2011), and multiple sensors to improve detectability (Meek & Pittet, 2012). Unlike PIR, the former two innovations are not affected by the surface temperature of the background or subject. In addition to sensor limitations, available camera traps generally lack specialized features such as tele-macro and high-speed video. External sensors and triggering systems (e.g., Trailmaster®, Cognisys®) provide control over the sensors used and allow the use of some specialized cameras (Brooks, 1996; Hernandez et al., 1997; Kucera & Barrett, 1993), but all require cameras that can be remotely triggered.

A review of the market found no suitable camera trap for capturing high-speed video of hummingbird floral visits: limitations included a combination of slow trigger speed (latency to start recording >0.5 s, cf. Meek & Pittet, 2012), low video frame rate (<60 frames/s), and no remote triggering. Our system overcomes these limitations by pairing with nearly any camera or sensor setup, allowing a researcher to choose optimal camera and sensor configurations separately to match their organism and application.
We tested the system by tailoring it specifically to hummingbirds, but, as presented here, it can be coupled to any kind of sensor and camera to study a wide variety of subjects (e.g., AIR sensors and night-vision cameras to study nocturnal poikilotherms).

We present a test of this novel system using hummingbirds (Figure ). These small, superb fliers visit flowers quickly without perching, making them difficult subjects for camera traps to detect and record. We have studied hummingbirds drinking nectar from artificial feeders (Rico-Guevara & Rubega, 2011) and have developed predictions from biomechanical principles (Rico-Guevara, Fan, & Rubega, 2015) that necessitate testing in the wild. Hummingbirds may visit a flower at intervals ranging from 10 min to a few hours (Araujo & Sazima, 2003; Rodrigues & Rodrigues, 2014), so camera trapping becomes imperative to collect data without researcher-intensive monitoring. A camera trap that can capture hummingbirds visiting wild flowers requires: (1) detection of small animals, (2) high-speed video, and (3) tele-macro functionality (close-up videos from a distance).

A hummingbird photographed using one of the triggering systems. A male Glowing Puffleg (Eriocnemis vestita) captured by setting a camera in burst mode, triggered automatically by one of the systems described in the present study. A complete list of the hummingbirds studied is available in Table S3.

We aimed to quantify a hummingbird's net energy gain during a floral visit, for which we needed to measure wing beat frequency to estimate energy expenditure in addition to the energy acquired from the nectar (cf. Anderson, 1991). A hummingbird may completely deplete a flower in less than a second; capturing this process requires fast triggering.
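The net-energy bookkeeping behind this goal is simple; the sketch below (with made-up placeholder numbers, not measured values) shows which quantities the videos must capture: nectar extracted, its energy density, and the hovering time paid for it.

```python
def net_energy_gain_cal(nectar_ul, energy_cal_per_ul,
                        handling_s, hover_cost_cal_per_s):
    """Net energy gain (cal) from one floral visit: energy acquired
    from the nectar minus the cost of hovering for the duration of
    the visit."""
    acquired = nectar_ul * energy_cal_per_ul       # cal taken in
    spent = handling_s * hover_cost_cal_per_s      # cal spent hovering
    return acquired - spent

# Placeholder values for illustration only (not field data):
gain = net_energy_gain_cal(nectar_ul=5.0, energy_cal_per_ul=1.0,
                           handling_s=0.8, hover_cost_cal_per_s=0.5)
# gain = 5.0 - 0.4 = 4.6 cal
```

Handling time and wing beats come from the high-speed video; nectar volume and concentration come from the emulated visits described in the Methods.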
With nectar licking rates up to 20 Hz (Rico-Guevara, 2014) and wingbeat frequencies usually around 20–50 Hz (Altshuler & Dudley, 2003; Hedrick, Tobalske, Ros, Warrick, & Biewener, 2012), high-speed video (>200 frames/s) is required to count the number of licks and wing beats during a single visit to a flower, information of primary importance for understanding their energetics and consequent decision-making behaviors in nature.

MATERIALS AND METHODS

Sensor selection and testing

We chose PIR motion sensors over the alternatives listed above because they did not need to be in close proximity to the flower, were cheap and easy to deploy, and successfully detected hummingbirds. We assessed their ability to detect hummingbirds by filming each sensor connected to a light-emitting diode, at different distances (measured with a laser range finder: Simmons 600) from hummingbird feeders and flowers at the Finca Colibrí Gorriazul, a private field station near Fusagasugá, Colombia. PIR sensors successfully detected over 15 species of hummingbird, varying in size and behavior, at both feeders and flowers. Individuals of one of the smallest species (3.5–4 g), Chaetocercus mulsant (Table S3), were reliably detected at distances of 50 cm, and individuals of one of the largest species (6–9 g), Colibri coruscans (Table S3), were reliably detected at distances of 100 cm.

Triggering mechanism

Our triggering mechanism consisted of two PIR sensors connected to a triggering circuit (Figures , S2). When either of the PIR sensors activated, the triggering circuit briefly turned on a mechanical actuator, which pressed the shutter button of the camera (Figure ). The PIR sensors' retrigger delay was set to 45 s to prevent retriggering before the camera was ready to record again (see supplemental methods); this delay may be unnecessary depending on the study subject and camera selected. Our application also generated a wireless signal to a control box (Fig.
S1) some distance away, so that a researcher was notified when any trap had triggered, allowing for nectar measurements (see below). PIR sensors generally have a range of <7 m for larger-bodied animals; with smaller-bodied animals such as hummingbirds, their effective range is <1 m. Using multiple external sensors, we were able to position them optimally with respect to the camera, decreasing the likelihood of false negatives. For example, if background vegetation movement was triggering a sensor, we could reposition that sensor (pointing it away from the moving vegetation) without compromising detectability, because of the redundancy achieved by having more than one sensor. Furthermore, our standalone sensors (Figure ) included an option to adjust sensitivity, allowing fine-tuning of the tradeoff between increased sensitivity and false positives. Our final cost per triggering mechanism was <$50 (Table S2) plus the cost of the camera (Table S1; we used pre-owned cameras of ~$200), less expensive than other homemade systems (generally $500–$1,500; Pierce & Pobprasert, 2007; Gula et al., 2010; Steen, 2014) or commercial solutions (which can cost upwards of $1,000–$5,000; Meek & Pittet, 2012; Rovero et al., 2013). Triggering mechanisms weighed ~500 g including batteries and the two sensors, much smaller and lighter than many of the alternatives. Additionally, a number of cost-saving innovations accompany our system and are broadly applicable to any ecological research using homemade electronics or sensors (see supplemental methods).

General configuration of the triggering system. (a) Diagram showing, on the right, two passive infrared (PIR) sensors able to detect changes in surface temperature in the scene caused by an animal and to signal the control circuit. Upon receiving a signal from either sensor, the control circuit sends a brief pulse to the shutter button actuator, which mechanically presses the shutter, activating the camera.
(b) Close-up photograph of the control circuit built on a 400 tie-point breadboard. Breadboards have addresses for rows and columns; a parts list, along with the parts' R × C addresses on the breadboard, is provided in Table S2, as well as additional photos and diagrams (Figures S1, S2; supplementary circuit diagrams).

Photos of the sensor and trigger box (four views). (a) Lateral view of a PIR sensor weatherproof module, with the sensor pointing up (white dome) and a phone cord port for connection to the camera trap trigger box labeled in red. On the right side of the sensor, the tripod adapter (1/4-inch female screw) is visible. (b) Dorsal view of the trigger box with the lid of the weatherproof container removed; on the left is the control circuit (cf. Figure b). (c) Lateral view of the trigger box, where the weatherproof ports for the sensor phone cord (labeled in red) and the actuator connections (on the right side) are visible. (d) Frontal view of the trigger box, in which the power switch (labeled in orange) and the connections for the power wires of the actuator (in blue) can be observed.

Camera selection and specifications

More than two frames per wing beat cycle are needed to estimate hummingbird wing beat frequency (cf. Altshuler & Dudley, 2003). Consequently, we needed cameras with high-speed video and a prerecord mode (recording a brief amount of video before the shutter is pressed) to compensate for the camera's shutter lag. We compared available high-speed video cameras to balance the affordability of a multicamera setup with the required features (Table S1). We opted to use consumer-grade high-speed video cameras (cf. Steen, 2014), and after experimenting with different models with our triggering system (first four rows in Table S1; recording length section in Supplement), we chose the Casio EX-FH20/5, which has already been used for biological research (e.g., Ryerson & Schwenk, 2011).
These cameras featured video recording at 210 fps and 480 × 360-pixel resolution, along with a prerecord mode. We mounted the cameras on light tripods (with the triggering system attached) and shielded them with reflective-layered foam covers for rain and sun protection. Cameras were powered externally by two 4xAA battery packs wired in parallel and plugged into the camera's AC adapter port.

System tests

We studied hummingbird feeding at Peña del Aserradero Natural Reserve (cloud forest, ~2,400 m.a.s.l.) in the Northern Andes of Colombia. We tested the cameras for 3 days (pilot fieldwork), then stopped filming to review the videos and make adjustments (see Section 3), and finished with seven more days of filming. We deployed camera traps (Figures , S3, S4) at focal flowers (May–June 2015), experiencing copious rain, cold (lowest temperatures under 5°C), and intense sun (peak temperatures above 30°C). We collected data simultaneously from four high-speed cameras situated at focal flowers in different feeding territories. To assess the reliability of the system by documenting missed visits and false positives, backup cameras at each site continuously filmed both the focal flowers and the camera traps at 30 fps (Video S1). We camouflaged all the cameras and systems at the field site (Video S1, Fig. S3), with two researchers alternating between waiting for signals at the base camp and measuring nectar volumes and concentrations of the flowers adjacent to the focal ones.

Photographs of the system deployed in the field. The left photograph shows the mounting of the actuator positioned to press the camera's shutter. Our mount used Meccano™ pieces (Meccano S.N., Calais, France), although simple hardware or an articulating arm would suffice. On the right, the camera is shown in prerecord mode with two PIR sensors in the background. Camera standby time was extended using an external AA battery pack connected to the camera's power socket (blue and red wires).
The trigger box (not visible) is below the camera.

To study energetics, we needed an estimate of the nectar energy available to a hummingbird at the time of the visit. Therefore, at each camera trap, we bagged a flower next to the focal flower and, immediately after a bird visit, emulated a "visit" to the bagged flower, measuring its nectar volume and concentration. Our triggering system only started recordings at the beginning of a visit; therefore, recordings were stopped manually while researchers measured nectar. We reached each camera location within 1–2 min and did not record hummingbirds re-visiting the focal flower during these intervals. As the triggering systems sent a wireless signal to a base camp when activated, we were able to monitor all four traps simultaneously and immediately emulate flower visits. From the videos, we measured licking rate, bill insertion distance, handling time, amount of nectar collected, nectar properties, and aerodynamic parameters (e.g., wingbeats/s). These allowed us to obtain extraction efficiency (μl/s), the energy content of the nectar consumed (cal/μl), and net energy gain (by subtracting the costs of hovering).

RESULTS

We manually reviewed the pilot fieldwork videos from the backup cameras at 3× speed and compared the number of visits with those captured by our systems. There were 35 hummingbird visits to the focal flowers (e.g., Video S2), and the cameras were triggered 60 times; of these 60 triggers, 34 were recordings of actual visits and 26 were videos of the focal flower without a visiting hummingbird (false positives). One hummingbird visit did not trigger our systems (a false negative; Table ). This false negative occurred when a hummingbird arrived during the 45 s trigger delay immediately following a false positive triggered by wind. False positives occurred in only one location and only during the afternoons; studying the backup videos, we conjectured that all were caused by strong wind moving vegetation.
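This failure mode follows directly from the retrigger lockout, and can be reproduced with a toy simulation of the trigger timing (event times below are invented for illustration): either sensor fires the camera, after which both sensors are ignored for 45 s.

```python
RETRIGGER_DELAY_S = 45.0

def fired_triggers(event_times_s, delay_s=RETRIGGER_DELAY_S):
    """Return the detection-event times (s) that actually fire the
    camera, given that each firing locks the trap out for delay_s
    seconds. Events may come from either PIR sensor; only timing
    matters here."""
    fired, armed_at = [], 0.0
    for t in sorted(event_times_s):
        if t >= armed_at:
            fired.append(t)
            armed_at = t + delay_s
    return fired

# Wind trips a sensor at t=100 s; a real visit at t=120 s falls
# inside the lockout window and is missed (the observed false negative):
fired_triggers([100.0, 120.0])   # [100.0]
```

Shortening the delay would close this window, at the cost of triggering before the camera has finished writing the previous clip.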
This location was particularly exposed and windy in the afternoons compared to the other locations. Review of the backup videos showed that, despite the camouflage, the hummingbirds inspected the cameras and sensors. Nevertheless, after an initial inspection (Video S1), all hummingbirds visited the focal flower, and they did not inspect the equipment on the second and third days. Actuator motion and sound were minimal and occurred away from the flower, provoking no observable behavioral changes in the hummingbirds.

Following pilot fieldwork (Figure ), we performed a series of fixes that minimized the false positives, through trial and error, at the problematic windy location. We greatly reduced the false positives by repositioning the sensors (pointing them away from the vegetation previously triggering them) and decreasing their sensitivity (to ignore background vegetation movement), minimizing triggering by vegetation moving in the wind. In addition, we enhanced the quality of the data collected through the triggering systems by improving the zoom and framing (to capture both hummingbird hovering and feeding) and by accounting for lighting changes throughout the day (avoiding dark recordings). We also discarded videos from the first day to minimize observer effects.

Performance of the systems. The number of correctly captured visits, false positives, and false negatives is shown for both our initial pilot fieldwork and after adjusting locations and PIR sensitivity to minimize false positives.
Percentages in parentheses are shown relative to total triggers or total visits, denoting rates of true positives, false positives, and false negatives.

            Initial pilot                          After adjustment
            Triggered   Not triggered   Total      Triggered   Not triggered   Total
Visit       34 (57%)    1 (3%)          35         107 (87%)   0 (0%)          107
No visit    26 (43%)    —               26         16 (13%)    —               16
Total       60          1               61         123         0               123

In the extended fieldwork, we documented 107 floral visits by hummingbirds in high-speed video: there were no false negatives, and we recorded only 16 false positives (Table ). We collected data on visits to eight plant species by 11 species of hummingbirds (Table S3). The lack of false negatives is a testament to the usefulness of multiple external PIR sensors for capturing hummingbirds, and our final false-positive rate of <15% is trivial in comparison to the benefits of our system. It took less than a minute to review each false-positive video (about 15 min in total for the extended fieldwork phase), but about 180 hr to visually review the videos from the continuously filming backup cameras at 3× speed for the same phase. By using macro, backlit-filming techniques (cf. Rico-Guevara, 2014), we visualized and measured the amount of nectar inside flowers and tracked the bill and tongue inside the corolla. Through this combination of automated macro, high-speed, and backlit videography, we were able to observe what was previously unobservable: wild hummingbirds depleting nectar inside flowers.

Our triggering system drew 15 mA of current, with a one-second 350 mA pulse when triggered. We were able to run each trigger on one set of batteries (8xAA) for the entire study (~100 hr). Battery life depends somewhat on the rate of triggering, but 140–160 hr is a reasonable expectation for 2,500 mAh AA batteries, greatly exceeding the battery life of the cameras, even when using external battery packs. The external battery packs for the cameras were changed every 12–15 hr of monitoring and never fully drained.
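The trigger's battery-life figures above are consistent with a simple current-budget estimate (capacity and currents as reported above; the trigger rate is an assumed value):

```python
def trigger_runtime_h(capacity_mah=2500.0, idle_ma=15.0,
                      pulse_ma=350.0, pulse_s=1.0, triggers_per_h=10.0):
    """Estimated runtime (hr) of the triggering circuit on one set of
    AA cells: idle draw plus the average extra draw contributed by
    brief actuator pulses."""
    # A 1 s, 350 mA pulse adds ~0.1 mAh per trigger to the average draw.
    avg_pulse_ma = pulse_ma * (pulse_s / 3600.0) * triggers_per_h
    return capacity_mah / (idle_ma + avg_pulse_ma)

# At an assumed ~10 triggers/hr, the estimate lands in the
# reported 140-160 hr range:
trigger_runtime_h()   # ~156.5 hr
```

As the calculation shows, the pulse contributes under 1 mA to the average draw, so the 15 mA idle current dominates battery life almost regardless of trigger rate.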
However, their run time was considerably less than that of the triggering mechanisms. Therefore, the battery life of the entire system generally depends on the chosen camera, not the triggering mechanism.

DISCUSSION

Filming animal behaviors in their natural environment, while minimizing observer disruption, is costly and time-consuming, and camera traps help solve this problem. However, the flexibility of available camera traps is limited, and no options exist for filming high-speed video of small-bodied animals such as hummingbirds. We solved this logistical challenge by splitting the camera trap into two parts: the camera and the triggering mechanism. We designed a system that could mechanically trigger specialized cameras and receive input from multiple external sensors positioned separately from the camera, all while being cheap, portable, weatherproof, battery-efficient, and easy to upgrade. We were able to independently pick the ideal camera and sensor configuration for our application, giving us more control over critical camera features such as video frame rate and prerecord mode, and more flexibility in designing sensor configurations to optimize sensitivity. Our system simply presses the camera's shutter button; therefore, cameras can be upgraded and recoupled. The system can also be adapted for cameras with remote triggering by closing the camera's remote trigger switch instead of operating an actuator.

While our application is unique in using high-speed video camera traps, the main novelty of our design is the decoupling of camera and triggering system, which increases camera trap flexibility. Although we used high-speed video cameras, the system could be adapted to other specialized cameras, such as starlight or thermal cameras to study nocturnal animals, including cameras that researchers already own or those incorporating future technological advances.
While we used PIR sensors to detect movement, the design is not limited to PIR; other sensors for light, color, or sound could be employed to trigger the camera instead of, or in combination with, PIR sensors.

Alternatives to our approach include several technologies already used for studying small animals. Camera traps triggered by AIR sensors (Hernandez et al., 1997) are appropriate, as are mini-DVR video recorders and cameras with video motion detection that start recording when motion in the video is detected (e.g., the auto-record mode on the JVC GC-PX100, or surveillance software such as Scene Analyzer™, i-PRO SmartHD™, and iSpy) (Bolton et al., 2007; Kross & Nelson, 2011). So are computer vision algorithms (e.g., Anandan, Bergen, Hanna, & Hingorani, 1993; Joshi & Thakore, 2012; Nordlund & Uhlin, 1996; Zeljkovic, 2013) that have recently been applied in biological studies to filter video for animal activity after recording (e.g., Dell et al., 2014; Weinstein, 2014). However, none of these systems currently supports a wide array of cameras, including any with high-speed video; most cost in excess of $500, and many use heavy 12 V batteries. We found the auto-record mode of many video motion detection solutions too slow to capture the start of a hummingbird visit. Computer vision algorithms require continuously prerecorded high-speed video, and acquiring this under field conditions faces significant drawbacks such as short camera battery life and storage limits (a 32 GB memory card lasts 2 hr). Therefore, our solution allows for cost-effective camera trapping with more functionality than previously possible.

One limitation of our system was the high maintenance level of the cameras we used: cameras needed to be protected with rainproof covers, were turned off at night, and were not deployed for long periods of time. One advantage of commercial camera traps is that they are completely weatherproof and designed for deployments of weeks or months.
These features are not essential when filming hummingbirds, due to the high turnover rates of inflorescences. Nevertheless, if researchers require longer filming periods, we recommend weatherproof cameras (e.g., GoPro®) that can maintain standby mode for long durations.

CONCLUSION

We are unaware of a recent camera trap application for ecological research in which the triggering system was separated from the camera itself. This approach leads to fewer design compromises and has the potential to minimize the limitations of many extant camera traps (Meek & Pittet, 2012; Rovero et al., 2013). In our application, we were able to use a specialized camera with much better video features than standard camera traps, situate the camera separately from the triggering sensors, modify the sensors, and add wireless capabilities, all at a much lower cost. Our system used simple integrated circuits, although the Arduino® platform could provide future innovations in camera trapping above and beyond those of our system because of its immense flexibility. Arduinos (and similar microcontrollers) can set video recording duration, trigger flashes, and weigh the input of multiple sensors in complex ways. They can also collect and store ancillary data with sensors for ambient temperature, humidity, and many other aspects of the environment, while being cheap and power-efficient.

ACKNOWLEDGMENTS

We are deeply indebted to Kristiina Hurme, Margaret Rubega, the UConn Electrical Engineering Department, and Austin Warner, Daniel Lees, Jacob Moulton, and Rajeev Bansal for assisting in the early stages of this project. We thank Jesse Joy, Briana Lechkun, Sebastian Aragon, Catalina Peña, and Miguel Ángel Muñoz for field assistance, and Kelsey O'Connor and Jessica Attalah for help organizing data and measuring videos. Carl Schlichting, Timothy Moore, and two anonymous reviewers kindly read the manuscript and provided valuable comments.
We thank Diego Sustaita, Jessica Lodwick, Kevin Burgio, Holly Brown, and the UConn Ornithological Research Group for feedback. A. R-G. thanks the Miller Institute for funding. This work was supported by the UConn EEB Department and NSF IOS-DDIG 1311443.

CONFLICT OF INTEREST

None declared.

AUTHOR CONTRIBUTIONS

AG conceived the camera trap idea. JM built the traps and the control box. AG and JM performed initial tests, and AG conducted field tests in Colombia and analyzed videos. AG and JM co-wrote the manuscript. Both authors contributed to all drafts and gave final approval for publication.

REFERENCES

Altshuler, D. L., & Dudley, R. (2003). Kinematics of hovering hummingbird flight along simulated and natural elevational gradients. The Journal of Experimental Biology, 206, 3139–3147.
Anandan, P., Bergen, J. R., Hanna, K. J., & Hingorani, R. (1993). Hierarchical model-based motion estimation. In M. I. Sezan & R. L. Lagendijk (Eds.), Motion analysis and image sequence processing (pp. 1–22). New York, NY: Springer.
Anderson, J. D. (1991). Fundamentals of aerodynamics. New York, NY: McGraw Hill.
Araujo, A. C., & Sazima, M. (2003). The assemblage of flowers visited by hummingbirds in the 'capões' of Southern Pantanal, Mato Grosso do Sul, Brazil. Flora—Morphology, Distribution, Functional Ecology of Plants, 198, 427–435.
Baker, R. L., & McGuffin, M. A. (2007). Technique and observer presence affect reporting of behavior of damselfly larvae. Journal of the North American Benthological Society, 26, 145–151.
Bischof, R., Ali, H., Kabir, M., Hameed, S., & Nawaz, M. A. (2014). Being the underdog: An elusive small carnivore uses space with prey and time without enemies. Journal of Zoology, 293, 40–48.
Bolton, M., Butcher, N., Sharpe, F., Stevens, D., & Fisher, G. (2007). Remote monitoring of nests using digital camera technology. Journal of Field Ornithology, 78, 213–220.
Brooks, R. T. (1996).
Assessment of two camera-based systems for monitoring arboreal wildlife. Wildlife Society Bulletin (1973-2006), 24, 298–300.
Cutler, T. L., & Swann, D. E. (1999). Using remote photography in wildlife ecology: A review. Wildlife Society Bulletin (1973-2006), 27, 571–581.
Dell, A. I., Bender, J. A., Branson, K., Couzin, I. D., de Polavieja, G. G., Noldus, L. P. J. J., … Brose, U. (2014). Automated image-based tracking and its application in ecology. Trends in Ecology and Evolution, 29, 417–428.
Gula, R., Theuerkauf, J., Rouys, S., & Legault, A. (2010). An audio/video surveillance system for wildlife. European Journal of Wildlife Research, 56, 803–807.
Hedrick, T. L., Tobalske, B. W., Ros, I. G., Warrick, D. R., & Biewener, A. A. (2012). Morphological and kinematic basis of the hummingbird flight stroke: Scaling of flight muscle transmission ratio. Proceedings of the Royal Society B: Biological Sciences, 279, 1986–1992.
Hernandez, F., Rollins, D., & Cantu, R. (1997). An evaluation of Trailmaster® camera systems for identifying ground-nest predators. Wildlife Society Bulletin (1973-2006), 25, 848–853.
Joshi, K. A., & Thakore, D. G. (2012). A survey on moving object detection and tracking in video surveillance system. International Journal of Soft Computing and Engineering (IJSCE), 2, 44–48.
Karanth, K. U. (1995). Estimating tiger Panthera tigris populations from camera-trap data using capture–recapture models. Biological Conservation, 71, 333–338.
Kross, S. M., & Nelson, X. J. (2011). A portable low-cost remote videography system for monitoring wildlife. Methods in Ecology and Evolution, 2, 191–196.
Kucera, T. E., & Barrett, R. H. (1993). In my experience: The Trailmaster® camera system for detecting wildlife. Wildlife Society Bulletin (1973-2006), 21, 505–508.
Meek, P. D., & Pittet, A. (2012). User-based design specifications for the ultimate camera trap for wildlife research. Wildlife Research, 39, 649–660.
Nordlund, P., & Uhlin, T. (1996).
Closing the loop: Detection and pursuit of a moving object by a moving observer. Image and Vision Computing, 14, 265–275.
O'Connell, A. F., Nichols, J. D., & Karanth, K. U. (2011). Camera traps in animal ecology. Tokyo, Japan: Springer.
Ohashi, K., D'Souza, D., & Thomson, J. D. (2010). An automated system for tracking and identifying individual nectar foragers at multiple feeders. Behavioral Ecology and Sociobiology, 64, 891–897.
Pearson, O. P. (1960). Habits of harvest mice revealed by automatic photographic recorders. Journal of Mammalogy, 41, 58–74.
Pierce, A. J., & Pobprasert, K. (2007). A portable system for continuous monitoring of bird nests using digital video recorders. Journal of Field Ornithology, 78, 322–328.
Rico-Guevara, A. (2014). Morphology and function of the drinking apparatus in hummingbirds. Storrs, CT: University of Connecticut.
Rico-Guevara, A., Fan, T.-H., & Rubega, M. A. (2015). Hummingbird tongues are elastic micropumps. Proceedings of the Royal Society B: Biological Sciences, 282, 1–8.
Rico-Guevara, A., & Rubega, M. A. (2011). The hummingbird tongue is a fluid trap, not a capillary tube. Proceedings of the National Academy of Sciences of the United States of America, 108, 9356–9360.
Rodrigues, L. C., & Rodrigues, M. (2014). Flowers visited by hummingbirds in the open habitats of the southeastern Brazilian mountaintops: Species composition and seasonality. Brazilian Journal of Biology, 74, 659–676.
Rovero, F., Zimmermann, F., Berzi, D., & Meek, P. D. (2013). 'Which camera trap type and how many do I need?' A review of camera features and study designs for a range of wildlife research applications. Hystrix, the Italian Journal of Mammalogy, 24, 148–156.
Rowcliffe, J. M., & Carbone, C. (2008). Surveys using camera traps: Are we looking to a brighter future? Animal Conservation, 11, 185–186.
Ryerson, W. G., & Schwenk, K. (2011). A simple, inexpensive system for digital particle image velocimetry (DPIV) in biomechanics.
Journal of Experimental Zoology Part B: Molecular and Developmental Evolution, 317, 127–140.
Silveira, L., Jácomo, A. T. A., & Diniz-Filho, J. A. F. (2003). Camera trap, line transect census and track surveys: A comparative evaluation. Biological Conservation, 114, 351–355.
Soininen, E. M., & Jensvoll, I. (2015). Under the snow: A new camera trap opens the white box of subnivean ecology. Remote Sensing in Ecology and Conservation, 1, 29–38.
Steen, R. (2014). The use of a low cost high speed camera to monitor wingbeat frequency in hummingbirds (Trochilidae). Ardeola, 61, 111–120.
Steen, R., & Ski, S. (2014). Video-surveillance system for remote long-term in situ observations: Recording diel cavity use and behaviour of wild European lobsters (Homarus gammarus). Marine and Freshwater Research, 65, 1094–1101.
Swann, D. E., Hass, C. C., Dalton, D. C., & Wolf, S. A. (2004). Infrared-triggered cameras for detecting wildlife: An evaluation and review. Wildlife Society Bulletin (1973-2006), 32, 357–365.
Villette, P., Krebs, C. J., Jung, T. S., & Boonstra, R. (2016). Can camera trapping provide accurate estimates of small mammal (Myodes rutilus and Peromyscus maniculatus) density in the boreal forest? Journal of Mammalogy, 97, 32–40.
Wade, M. R., Zalucki, M. P., & Franzmann, B. A. (2005). Influence of observer presence on Pacific damsel bug behavior: Who is watching whom? Journal of Insect Behavior, 18, 651–667.
Weale, R. A. (1961). Limits of human vision. Nature, 191, 471–473.
Weinstein, B. G. (2014). MotionMeerkat: Integrating motion video detection and ecological monitoring. Methods in Ecology and Evolution, 6, 357–362.
Welbourne, D. J., Claridge, A. W., Paull, D. J., & Lambert, A. (2016). How do passive infrared triggered camera traps operate and why does it matter? Breaking down common misconceptions. Remote Sensing in Ecology and Conservation, 2, 77–83.
Zeljkovic, V. (2013). Video surveillance techniques and technologies. Hershey, PA: IGI Global.

Bring your own camera to the trap: An inexpensive, versatile, and portable triggering system tested on wild hummingbirds



Publisher
Wiley
Copyright
© 2017 Published by John Wiley & Sons Ltd.
ISSN
2045-7758
eISSN
2045-7758
DOI
10.1002/ece3.3040


INTRODUCTION

Studying animals in their natural habitat is advantageous, and preferable to working with them in captivity, for a variety of research questions. Natural habitats allow for the study of behavior in situ, offer opportunities to measure the energetic performance of free-living animals, and open doors to understanding intra- and interspecific interactions. Researchers face three main challenges while working with wild animals: (1) neutralizing the observer effect (Baker & McGuffin, 2007; Wade, Zalucki, & Franzmann, 2005), (2) dealing with long waiting times (review in Cutler & Swann, 1999), and (3) compensating for human sensory limitations (Weale, 1961). These challenges are addressed with camera traps (reviews in Cutler & Swann, 1999; O'Connell, Nichols, & Karanth, 2011; Rowcliffe & Carbone, 2008), which are used to detect animals in a given area (Karanth, 1995; Silveira, Jácomo, & Diniz-Filho, 2003) and to study behavior (Bischof, Ali, Kabir, Hameed, & Nawaz, 2014; Gula, Theuerkauf, Rouys, & Legault, 2010; Ohashi, D'Souza, & Thomson, 2010). Nevertheless, commercial camera traps are effective with relatively few taxa and can be prohibitively expensive (Meek & Pittet, 2012).
Our goal was to design an alternative with the flexibility to study a larger variety of taxa at the lowest cost possible. We developed a system with many advantages over available camera traps: (1) it functions mechanically and can be coupled with cameras that do not support remote triggering; (2) it can incorporate most sensors, allowing modes of detection besides movement (e.g., light, color, sound); (3) it offers the versatility to position multiple sensors separately from the camera, adapting to the subject of interest; (4) it is inexpensive enough to be affordable for field projects requiring multiple camera traps; (5) because it is not married to any particular camera, it can be updated when a given technology becomes obsolete; (6) it is powered by standard AA batteries for long durations, facilitating recharging, easy replacement, and accessibility in remote locations; (7) it is weatherproof, light, and portable, allowing the deployment of several units with few personnel; and (8) the triggering and sensor systems are easy to customize in the field to adapt to changing conditions or objectives.

Most camera traps have been designed to capture large animals using passive infrared (PIR) motion sensors (Meek & Pittet, 2012; Rovero, Zimmermann, Berzi, & Meek, 2013; Welbourne, Claridge, Paull, & Lambert, 2016), although some applications have been aimed at small mammals (Pearson, 1960; Soininen & Jensvoll, 2015; Villette, Krebs, Jung, & Boonstra, 2016), birds (Bolton, Butcher, Sharpe, Stevens, & Fisher, 2007; Kross & Nelson, 2011), and arthropods (review in Steen & Ski, 2014). PIR sensors detect a change in surface temperatures, such as when an animal with a different surface temperature than the background enters the scene (Welbourne et al., 2016).
Despite broad use, PIR sensors present various problems, such as false negatives when the subject's surface temperature differs minimally from the background (Welbourne et al., 2016), or false positives from vegetation blowing in the wind (Rovero et al., 2013; Welbourne et al., 2016). Innovations for improving detection of smaller (or poikilothermic) taxa include active infrared (AIR) sensors that trigger when an infrared beam is crossed (Hernandez, Rollins, & Cantu, 1997; Rovero et al., 2013; Swann, Hass, Dalton, & Wolf, 2004), cameras with video motion detection that trigger when there is movement in the selected field of view (Bolton et al., 2007; Kross & Nelson, 2011), and multiple sensors to improve detectability (Meek & Pittet, 2012). Unlike PIR, the former two innovations are not affected by the surface temperature of the background or subject. In addition to sensor limitations, available camera traps generally lack specialized features such as tele-macro and high-speed video. External sensors and triggering systems (e.g., Trailmaster®, Cognisys®) provide control over the sensors used and allow the use of some specialized cameras (Brooks, 1996; Hernandez et al., 1997; Kucera & Barrett, 1993), but all require cameras that can be remotely triggered.

A review of the market found no camera trap suitable for capturing high-speed video of hummingbird floral visits: limitations included a combination of slow trigger speed (latency to start recording 0.5 s, cf. Meek & Pittet, 2012), low video frame rate (<60 frames/s), and no remote triggering. Our system overcomes these limitations by pairing with nearly any camera or sensor setup, allowing a researcher to choose optimal camera and sensor configurations separately to match their organism and application.
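The frame-rate requirement behind these limitations follows from a simple sampling argument: resolving a periodic motion needs more than two frames per cycle, so the frame rate must exceed twice the highest frequency of interest. A minimal sketch of that calculation (the function name and the four-frames-per-cycle margin are illustrative assumptions, not from the article):

```python
def min_frame_rate(max_freq_hz: float, frames_per_cycle: float = 2.0) -> float:
    """Minimum camera frame rate (fps) needed to sample a periodic
    motion of max_freq_hz with frames_per_cycle frames per cycle."""
    return max_freq_hz * frames_per_cycle

# Hummingbird wingbeat frequencies reach ~50 Hz, so a standard
# <60 fps camera cannot resolve individual wing beats:
print(min_frame_rate(50))     # 100.0 fps bare minimum (2 frames/beat)
print(min_frame_rate(50, 4))  # 200.0 fps for ~4 frames per beat
```

This is why the study required cameras recording at well over 200 frames/s.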
We tested the system by tailoring it specifically to hummingbirds, but as presented here, it can be coupled to any kind of sensor and camera to study a wide variety of subjects (e.g., AIR sensors and night-vision cameras to study nocturnal poikilotherms). We present a test of this novel system using hummingbirds (Figure 1). These small, superb fliers visit flowers quickly without perching, making them difficult subjects for camera traps to detect and record. We have studied hummingbirds drinking nectar from artificial feeders (Rico-Guevara & Rubega, 2011) and have developed predictions from biomechanical principles (Rico-Guevara, Fan, & Rubega, 2015) that necessitate testing in the wild. Hummingbirds may visit a flower at intervals from 10 minutes to a few hours (Araujo & Sazima, 2003; Rodrigues & Rodrigues, 2014), so camera trapping becomes imperative to collect data without researcher-intensive monitoring. A camera trap that can capture hummingbirds visiting wild flowers requires: (1) detection of small animals, (2) high-speed video, and (3) tele-macro functionality (close-up videos from a distance).

Figure 1. A hummingbird photographed using one of the triggering systems. A male Glowing Puffleg (Eriocnemis vestita) captured by setting a camera in burst mode, triggered automatically by one of the systems described in the present study. A complete list of the hummingbirds studied is available in Table S3.

We aimed to quantify a hummingbird's net energy gain during a floral visit, for which we needed to measure wing beat frequency to estimate energy expenditure in addition to the energy acquired from the nectar (cf. Anderson, 1991). A hummingbird may completely deplete a flower in less than a second; capturing this process requires fast triggering.
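The net-energy-gain bookkeeping just described (energy acquired from nectar minus the cost of hovering during the visit) can be sketched as a short calculation. This is an illustrative sketch, not the authors' code; the function name and the example values (nectar volume, energy density, hovering cost) are hypothetical:

```python
def net_energy_gain(nectar_ul: float, energy_cal_per_ul: float,
                    handling_time_s: float, hover_cost_cal_per_s: float) -> float:
    """Net energy gain of a floral visit (cal): energy in the nectar
    consumed minus the cost of hovering for the handling time."""
    gained = nectar_ul * energy_cal_per_ul          # energy acquired from nectar
    spent = handling_time_s * hover_cost_cal_per_s  # hovering expenditure
    return gained - spent

# Hypothetical visit: 5 µl of nectar at 1 cal/µl, a 1.5-s visit,
# hovering costing 0.2 cal/s:
print(net_energy_gain(5.0, 1.0, 1.5, 0.2))  # 4.7
```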
With nectar licking rates up to 20 Hz (Rico-Guevara, 2014) and wingbeat frequencies usually around 20–50 Hz (Altshuler & Dudley, 2003; Hedrick, Tobalske, Ros, Warrick, & Biewener, 2012), high-speed video (>200 frames/s) is required to count the number of licks and wing beats during a single visit to a flower, information of primary importance for understanding hummingbird energetics and the consequent decision-making behaviors in nature.

MATERIALS AND METHODS

Sensor selection and testing

We chose PIR motion sensors over the alternatives listed above because they did not need to be in close proximity to the flower, were cheap and easy to deploy, and successfully detected hummingbirds. We assessed their ability to detect hummingbirds by filming each sensor connected to a light-emitting diode, at different distances (measured with a laser range finder: Simmons 600) from hummingbird feeders and flowers at the Finca Colibrí Gorriazul, a private field station near Fusagasugá, Colombia. PIR sensors successfully detected over 15 species of hummingbird, varying in size and behavior, at both feeders and flowers. Individuals of one of the smallest species (3.5–4 g), Chaetocercus mulsant (Table S3), were reliably detected at distances of 50 cm, and individuals of one of the largest species (6–9 g), Colibri coruscans (Table S3), were reliably detected at distances of 100 cm.

Triggering mechanism

Our triggering mechanism consisted of two PIR sensors connected to a triggering circuit (Figures 2, S2). When either of the PIR sensors activated, the triggering circuit briefly turned on a mechanical actuator, which physically pushed the shutter button of the camera (Figure 2). The PIR sensors' retrigger delay was set to 45 s to prevent retriggering before the camera was ready to record again (see supplemental methods); this delay may be unnecessary depending on the study subject and camera selected. Our application also generated a wireless signal to a control box (Fig. S1) some distance away, so that a researcher was notified when any trap had triggered, allowing for nectar measurements (see below). PIR sensors generally have a range of <7 m for larger-bodied animals, although with smaller-bodied animals, such as hummingbirds, their effective range is <1 m. Using multiple external sensors, we were able to position them optimally with respect to the camera, decreasing the likelihood of false negatives. For example, if background vegetation movement was triggering a sensor, we could reposition that sensor (pointing it away from the moving vegetation) without compromising detectability, because of the redundancy achieved by having more than one sensor. Furthermore, our standalone sensors (Figure 3) included an option to adjust sensitivity, allowing for fine-tuning by optimizing the tradeoff between increased sensitivity and false positives. Our final cost per triggering mechanism was <$50 (Table S2) plus the cost of the camera (Table S1; we used pre-owned cameras of ~$200), less expensive than other proposed homemade systems (which generally cost $500–$1,500; Pierce & Pobprasert, 2007; Gula et al., 2010; Steen, 2014) or commercial solutions (which can cost upwards of $1,000–5,000; Meek & Pittet, 2012; Rovero et al., 2013). Triggering mechanisms weighed ~500 g including batteries and the two sensors, much smaller and lighter than many of the alternatives. Additionally, a number of cost-saving innovations accompany our system and are broadly applicable to any ecological research using homemade electronics or sensors (see supplemental methods).

Figure 2. General configuration of the triggering system. (a) Diagram showing, on the right, two passive infrared (PIR) sensors able to detect changes in surface temperature in the scene caused by an animal and to signal the control circuit. Upon receiving a signal from either sensor, the control circuit sends a brief pulse to the shutter button actuator, which mechanically presses the shutter, activating the camera.
(b) Close-up photograph of the control circuit built on a 400 tie-point breadboard. Breadboards have addresses for rows and columns; a parts list, along with R × C addresses on the breadboard, is provided in Table S2, as well as additional photos and diagrams (Figures S1, S2, supplementary circuit diagrams).

Figure 3. Photos of the sensor and trigger box (three views). (a) Lateral view of a PIR sensor weatherproof module, with the sensor pointing up (white dome) and a phone-cord port for connection to the camera trap trigger box labeled in red. On the right side of the sensor, the tripod adapter (1/4-inch female screw) is visible. (b) Dorsal view of the trigger box with the lid of the weatherproof container removed. On the left is the control circuit (cf. Figure 2b). (c) Lateral view of the trigger box, where the weatherproof ports for the sensor phone cord (labeled in red) and the actuator connections (on the right side) are visible. (d) Frontal view of the trigger box, in which the power switch (labeled in orange) and the connections for the power wires of the actuator (in blue) can be observed.

Camera selection and specifications

More than two frames per wing beat cycle are needed to estimate hummingbird wing beat frequency (cf. Altshuler & Dudley, 2003). Consequently, we needed cameras with high-speed video and a prerecord mode (recording a brief amount of video before the shutter is pressed) to compensate for the camera's shutter lag. We compared available high-speed video cameras to balance the affordability of a multicamera setup with the required features (Table S1). We opted to use consumer-grade high-speed video cameras (cf. Steen, 2014), and after experimenting with different models with our triggering system (first four rows in Table S1; recording length section in the Supplement), we chose the Casio EX-FH20/5, which has already been used for biological research (e.g., Ryerson & Schwenk, 2011).
These cameras featured video recording at 210 fps and 480 × 360-pixel resolution, along with a prerecord mode. We mounted the cameras on light tripods (with the triggering system attached) and shielded them with reflective-layered foam covers, offering rain and sun protection. Cameras were powered externally by two 4xAA battery packs wired in parallel and plugged into the camera's AC adapter port.

System tests

We studied hummingbird feeding at Peña del Aserradero Natural Reserve (cloud forest, ~2,400 m a.s.l.) in the Northern Andes of Colombia. We tested the cameras during 3 days (pilot fieldwork), then stopped filming to review the videos and make adjustments (see Results), and finished with seven more days of filming. We deployed camera traps (Figures 4, S3, S4) at focal flowers (May–June 2015), experiencing copious rain, cold (lowest temperatures under 5°C), and intense sun (peak temperatures above 30°C). We collected data simultaneously from four high-speed cameras situated at focal flowers in different feeding territories. To assess the reliability of the system by documenting missed visits or false positives, backup cameras at each site continuously filmed both the focal flowers and the camera traps at 30 fps (Video S1). We camouflaged all the cameras and systems at the field site (Video S1, Fig. S3), with two researchers alternating between waiting for the signals at the base camp and measuring nectar volume and concentration in the flowers adjacent to the focal ones.

Figure 4. Photographs of the system deployed in the field. The left photograph shows the mounting of the actuator positioned to press the camera's shutter. Our mount used Meccano™ pieces (Meccano S.N., Calais, France), although simple hardware or an articulating arm would suffice. On the right, the camera is shown in prerecord mode with two PIR sensors in the background. Camera standby time was extended using an external AA battery pack connected to the camera's power socket (blue and red wires).
The trigger box (not visible) is below the camera.

To study energetics, we needed an estimate of the nectar energy available to a hummingbird at the time of the visit. Therefore, at each camera trap, we bagged a flower next to the focal flower and, immediately after a bird visit, emulated a "visit" to the bagged flower, measuring nectar volume and concentration. Our triggering system only started recordings at the beginning of a visit; therefore, recordings were stopped manually while researchers measured nectar. We reached each camera location within 1–2 min and did not record hummingbirds revisiting the focal flower during these intervals. As triggering systems sent a wireless signal to the base camp when activated, we were able to monitor all four traps simultaneously and immediately emulate flower visits. From the videos, we measured licking rate, bill insertion distance, handling time, amount of nectar collected, nectar properties, and aerodynamic parameters (e.g., wingbeats/s). These allowed us to obtain extraction efficiency (μl/s), energy content of the nectar consumed (cal/μl), and net energy gain (by subtracting the costs of hovering).

RESULTS

We manually reviewed the pilot fieldwork videos from the backup cameras at 3× speed and compared the number of visits with those captured by our systems. There were 35 hummingbird visits to the focal flowers (e.g., Video S2), and the cameras were triggered 60 times; of these 60 triggers, 34 were recordings of actual visits, and 26 were videos of the focal flower without a visiting hummingbird (false positives). One hummingbird visit did not trigger our systems (a false negative) (Table 1). This false negative occurred when a hummingbird arrived during the 45-s trigger delay immediately following a false positive triggered by wind. False positives occurred in only one location and only during the afternoons; studying the backup videos, we conjectured that all were caused by strong wind moving vegetation.
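The false-negative mechanism (a real visit arriving inside the 45-s retrigger window opened by a wind-induced false positive) can be illustrated with a minimal simulation of the trigger logic. This is a sketch under our own assumptions, not the actual circuit behavior:

```python
RETRIGGER_DELAY_S = 45.0  # retrigger delay used in the study

def triggered_events(detection_times_s):
    """Return the subset of sensor detection times that actually fire
    the trigger: detections arriving within RETRIGGER_DELAY_S of the
    previous firing are silently ignored."""
    fired = []
    for t in sorted(detection_times_s):
        if not fired or t - fired[-1] >= RETRIGGER_DELAY_S:
            fired.append(t)
    return fired

# A wind-caused false positive at t=0 s opens the retrigger window;
# a real visit at t=30 s is missed, while one at t=50 s would fire:
print(triggered_events([0.0, 30.0, 50.0]))  # [0.0, 50.0]
```

This is why shortening or omitting the delay may be preferable for cameras that are ready to record again sooner.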
This location was particularly exposed and windy in the afternoons compared to other locations. Review of backup videos showed that, despite the camouflage, the hummingbirds inspected the cameras and sensors. Nevertheless, after an initial inspection (Video S1), all hummingbirds visited the focal flower, and they did not inspect the equipment on the second and third days. Actuator motion and sound were minimal and occurred away from the flower, provoking no observable behavioral changes in the hummingbirds. Following the pilot fieldwork (Figure ), we performed a series of fixes that minimized false positives through trial and error at the problematic windy location. We greatly reduced false positives by repositioning the sensors (away from the vegetation previously triggering them) and decreasing their sensitivity (to ignore background vegetation movement), minimizing triggering by vegetation moving in the wind. In addition, we enhanced the quality of the data collected through the triggering systems by improving the zoom and framing (to capture both hummingbird hovering and feeding) and accounting for lighting changes throughout the day (avoiding dark recordings). We also discarded videos from the first day to minimize observer effects.

Table 1. Performance of the systems. The number of correctly captured visits, false positives, and false negatives is shown for both our initial pilot fieldwork and after adjusting locations and PIR sensitivity to minimize false positives.
Percentages in parentheses are shown relative to total triggers or total visits, denoting rates of true positives, false positives, and false negatives.

                Initial pilot                          After adjustment
                Triggered    Not triggered   Total     Triggered    Not triggered   Total
  Visit         34 (57%)     1 (3%)          35        107 (87%)    0 (0%)          107
  No visit      26 (43%)     —               26        16 (13%)     —               16
  Total         60           1               61        123          0               123

In the extended fieldwork, we documented 107 floral visits by hummingbirds in high-speed video: there were no false negatives, and we recorded only 16 false positives (Table 1). We collected data on visits to eight plant species by 11 species of hummingbirds (Table S3). The lack of false negatives is a testament to the usefulness of multiple external PIR sensors for capturing hummingbirds, and our final false-positive rate of <15% is trivial in comparison to the benefits of our system. It took less than a minute to review each false-positive video (about 15 min total for the extended fieldwork phase), but about 180 hr to visually review the videos from the continuously filming backup cameras at 3× speed for the same fieldwork phase. By using macro, backlit-filming techniques (cf. Rico-Guevara, 2014), we visualized and measured the amount of nectar inside flowers and tracked the bill and tongue inside the corolla. Through this combination of automated macro, high-speed, and backlit videography, we were able to observe what was previously unobservable: wild hummingbirds depleting nectar inside flowers.

Our triggering system drew 15 mA of current, with a one-second 350 mA pulse when triggered. We were able to run each trigger on one set of batteries (8xAA) for the entire study (~100 hr). Battery life depends somewhat on the rate of triggering, but 140–160 hr is reasonable for 2,500 mAh AA batteries and greatly exceeds that of the cameras, even when using external battery packs. The external battery packs for the cameras were changed every 12–15 hr of monitoring and never fully drained.
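These battery-life figures can be sanity-checked with a simple current-budget calculation; the trigger rate below is an illustrative assumption, not a value from the study:

```python
def battery_life_hours(capacity_mah: float, idle_ma: float,
                       pulse_ma: float, pulse_s: float,
                       triggers_per_hour: float) -> float:
    """Estimated runtime (hr) given a constant idle draw plus brief
    high-current trigger pulses (average current = idle + duty-cycled
    pulse current)."""
    avg_pulse_ma = pulse_ma * (pulse_s * triggers_per_hour) / 3600.0
    return capacity_mah / (idle_ma + avg_pulse_ma)

# 2,500 mAh batteries, 15 mA idle draw, one-second 350 mA pulses,
# assuming ~20 triggers per hour:
print(round(battery_life_hours(2500, 15, 350, 1.0, 20)))  # ~148 hr
```

The result falls within the 140–160 hr range reported above, confirming that the pulse contributes little to the average draw.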
However, their run time was considerably less than that of the triggering mechanisms. Therefore, the battery life of the entire system generally depends on the chosen camera, not the triggering mechanism.

DISCUSSION

Filming animal behaviors in their natural environment while minimizing observer disruption is costly and time-consuming, and camera traps help solve this problem. However, the flexibility of available camera traps is limited, and no options exist for filming high-speed video of small-bodied animals such as hummingbirds. We solved this logistical challenge by splitting the camera trap into two parts: the camera and the triggering mechanism. We designed a system that could mechanically trigger specialized cameras and receive input from multiple external sensors positioned separately from the camera, all while being cheap, portable, weatherproof, battery-efficient, and easy to upgrade. We were able to independently pick the ideal camera and sensor configuration for our application, gaining more control over critical camera features such as video frame rate and prerecord mode, and more flexibility in designing sensor configurations to optimize sensitivity. Our system simply presses the camera's shutter button; therefore, cameras can be upgraded and recoupled. The system can also be adapted for cameras with remote triggering by closing the camera's remote trigger switch instead of operating an actuator.

While our application is unique in using high-speed video camera traps, the main novelty of our design is the decoupling of camera and triggering system, increasing camera trap flexibility. Although we used high-speed video cameras, the system could be adapted to other specialized cameras, such as starlight or thermal cameras to study nocturnal animals, including cameras that researchers already own or those incorporating future technological advances.
While we used PIR sensors to detect movement, the design is not limited to PIR; other sensors for light, color, or sound could be employed to trigger the camera instead of, or in combination with, PIR sensors.

Alternatives to our approach include several technologies already used for studying small animals. Camera traps triggered by AIR sensors (Hernandez et al., 1997) are appropriate, as are mini-DVR video recorders and cameras with video motion detection that start recording when motion in the video is detected (e.g., the auto-record mode on the JVC GC-PX100, or surveillance software such as Scene Analyzer™, i-PRO SmartHD™, and iSpy) (Bolton et al., 2007; Kross & Nelson, 2011). Computer vision algorithms (e.g., Anandan, Bergen, Hanna, & Hingorani, 1993; Joshi & Thakore, 2012; Nordlund & Uhlin, 1996; Zeljkovic, 2013) have also recently been applied to biological studies to filter video for animal activity after recording (e.g., Dell et al., 2014; Weinstein, 2014). However, none of these systems currently supports a wide array of cameras, including any with high-speed video. Most cost in excess of $500, and many use heavy 12 V batteries. We found the auto-record mode of many video motion detection solutions too slow to capture the start of a visit by hummingbirds. Computer vision algorithms require continuous prerecorded high-speed video, and acquiring this under field conditions faces significant drawbacks such as short camera battery life and storage problems (a 32 GB memory card lasts 2 hr). Therefore, our solution allows for cost-effective camera trapping with more functionality than previously possible.

One limitation of our system was the high maintenance level of the cameras we used. Cameras needed to be protected with rainproof covers, were turned off at night, and were not deployed for long periods of time. One advantage of commercial camera traps is that they are completely weatherproof and designed for deployments of weeks or months.
These features are not useful when filming hummingbirds, due to the high turnover rate of inflorescences. Nevertheless, if researchers require longer filming periods, we recommend weatherproof cameras (e.g., GoPro®) that can maintain standby mode for long durations.

CONCLUSION

We are unaware of a recent camera trap application for ecological research in which the triggering system was separated from the camera itself. This approach leads to fewer design compromises and has the potential to minimize the limitations of many extant camera traps (Meek & Pittet, 2012; Rovero et al., 2013). In our application, we were able to use a specialized camera with much-improved video features relative to standard camera traps, situate the camera separately from the triggering sensors, modify the sensors, and add wireless capabilities, all at a much lower cost. Our system used simple integrated circuits, although the Arduino® platform could provide future innovations in camera trapping above and beyond those of our system because of its immense flexibility. Arduinos (and similar microcontrollers) can set video recording duration, trigger flashes, and weigh the input of multiple sensors in complex ways. They can also collect and store ancillary data with sensors for ambient temperature, humidity, and many other aspects of the environment, while being cheap and power efficient.

ACKNOWLEDGMENTS

We are deeply indebted to Kristiina Hurme, Margaret Rubega, the UConn Electrical Engineering Department and Austin Warner, Daniel Lees, Jacob Moulton, and Rajeev Bansal for assisting in the early stages of this project. We thank Jesse Joy, Briana Lechkun, Sebastian Aragon, Catalina Peña, and Miguel Ángel Muñoz for field assistance, and Kelsey O'Connor and Jessica Attalah for help organizing data and measuring videos. Carl Schlichting, Timothy Moore, and two anonymous reviewers kindly read the manuscript and provided valuable comments.
We thank Diego Sustaita, Jessica Lodwick, Kevin Burgio, Holly Brown, and the UConn Ornithological Research Group for feedback. A. R-G. thanks the Miller Institute for funding. This work was supported by the UConn EEB Department and NSF IOS-DDIG 1311443.

CONFLICT OF INTEREST

None declared.

AUTHOR CONTRIBUTIONS

AG conceived the camera trap idea. JM built the traps and the control box. AG and JM performed initial tests, and AG conducted field tests in Colombia and analyzed videos. AG and JM co-wrote the manuscript. Both authors contributed to all drafts and gave final approval for publication.

REFERENCES

Altshuler, D. L., & Dudley, R. (2003). Kinematics of hovering hummingbird flight along simulated and natural elevational gradients. The Journal of Experimental Biology, 206, 3139–3147.
Anandan, P., Bergen, J. R., Hanna, K. J., & Hingorani, R. (1993). Hierarchical model-based motion estimation. In M. I. Sezan & R. L. Lagendijk (Eds.), Motion analysis and image sequence processing (pp. 1–22). New York, NY: Springer.
Anderson, J. D. (1991). Fundamentals of aerodynamics. New York, NY: McGraw Hill.
Araujo, A. C., & Sazima, M. (2003). The assemblage of flowers visited by hummingbirds in the 'capões' of Southern Pantanal, Mato Grosso do Sul, Brazil. Flora—Morphology, Distribution, Functional Ecology of Plants, 198, 427–435.
Baker, R. L., & McGuffin, M. A. (2007). Technique and observer presence affect reporting of behavior of damselfly larvae. Journal of the North American Benthological Society, 26, 145–151.
Bischof, R., Ali, H., Kabir, M., Hameed, S., & Nawaz, M. A. (2014). Being the underdog: An elusive small carnivore uses space with prey and time without enemies. Journal of Zoology, 293, 40–48.
Bolton, M., Butcher, N., Sharpe, F., Stevens, D., & Fisher, G. (2007). Remote monitoring of nests using digital camera technology. Journal of Field Ornithology, 78, 213–220.
Brooks, R. T. (1996). Assessment of two camera-based systems for monitoring arboreal wildlife. Wildlife Society Bulletin (1973–2006), 24, 298–300.
Cutler, T. L., & Swann, D. E. (1999). Using remote photography in wildlife ecology: A review. Wildlife Society Bulletin (1973–2006), 27, 571–581.
Dell, A. I., Bender, J. A., Branson, K., Couzin, I. D., de Polavieja, G. G., Noldus, L. P. J. J., … Brose, U. (2014). Automated image-based tracking and its application in ecology. Trends in Ecology and Evolution, 29, 417–428.
Gula, R., Theuerkauf, J., Rouys, S., & Legault, A. (2010). An audio/video surveillance system for wildlife. European Journal of Wildlife Research, 56, 803–807.
Hedrick, T. L., Tobalske, B. W., Ros, I. G., Warrick, D. R., & Biewener, A. A. (2012). Morphological and kinematic basis of the hummingbird flight stroke: Scaling of flight muscle transmission ratio. Proceedings of the Royal Society B: Biological Sciences, 279, 1986–1992.
Hernandez, F., Rollins, D., & Cantu, R. (1997). An evaluation of Trailmaster® camera systems for identifying ground-nest predators. Wildlife Society Bulletin (1973–2006), 25, 848–853.
Joshi, K. A., & Thakore, D. G. (2012). A survey on moving object detection and tracking in video surveillance system. International Journal of Soft Computing and Engineering (IJSCE), 2, 44–48.
Karanth, K. U. (1995). Estimating tiger Panthera tigris populations from camera-trap data using capture–recapture models. Biological Conservation, 71, 333–338.
Kross, S. M., & Nelson, X. J. (2011). A portable low-cost remote videography system for monitoring wildlife. Methods in Ecology and Evolution, 2, 191–196.
Kucera, T. E., & Barrett, R. H. (1993). In my experience: The Trailmaster® camera system for detecting wildlife. Wildlife Society Bulletin (1973–2006), 21, 505–508.
Meek, P. D., & Pittet, A. (2012). User-based design specifications for the ultimate camera trap for wildlife research. Wildlife Research, 39, 649–660.
Nordlund, P., & Uhlin, T. (1996). Closing the loop: Detection and pursuit of a moving object by a moving observer. Image and Vision Computing, 14, 265–275.
O'Connell, A. F., Nichols, J. D., & Karanth, K. U. (2011). Camera traps in animal ecology. Tokyo, Japan: Springer.
Ohashi, K., D'Souza, D., & Thomson, J. D. (2010). An automated system for tracking and identifying individual nectar foragers at multiple feeders. Behavioral Ecology and Sociobiology, 64, 891–897.
Pearson, O. P. (1960). Habits of harvest mice revealed by automatic photographic recorders. Journal of Mammalogy, 41, 58–74.
Pierce, A. J., & Pobprasert, K. (2007). A portable system for continuous monitoring of bird nests using digital video recorders. Journal of Field Ornithology, 78, 322–328.
Rico-Guevara, A. (2014). Morphology and function of the drinking apparatus in hummingbirds. Storrs, CT: University of Connecticut.
Rico-Guevara, A., Fan, T.-H., & Rubega, M. A. (2015). Hummingbird tongues are elastic micropumps. Proceedings of the Royal Society B: Biological Sciences, 282, 1–8.
Rico-Guevara, A., & Rubega, M. A. (2011). The hummingbird tongue is a fluid trap, not a capillary tube. Proceedings of the National Academy of Sciences of the United States of America, 108, 9356–9360.
Rodrigues, L. C., & Rodrigues, M. (2014). Flowers visited by hummingbirds in the open habitats of the southeastern Brazilian mountaintops: Species composition and seasonality. Brazilian Journal of Biology, 74, 659–676.
Rovero, F., Zimmermann, F., Berzi, D., & Meek, P. D. (2013). 'Which camera trap type and how many do I need?' A review of camera features and study designs for a range of wildlife research applications. Hystrix, the Italian Journal of Mammalogy, 24, 148–156.
Rowcliffe, J. M., & Carbone, C. (2008). Surveys using camera traps: Are we looking to a brighter future? Animal Conservation, 11, 185–186.
Ryerson, W. G., & Schwenk, K. (2011). A simple, inexpensive system for digital particle image velocimetry (DPIV) in biomechanics. Journal of Experimental Zoology Part B: Molecular and Developmental Evolution, 317, 127–140.
Silveira, L., Jácomo, A. T. A., & Diniz-Filho, J. A. F. (2003). Camera trap, line transect census and track surveys: A comparative evaluation. Biological Conservation, 114, 351–355.
Soininen, E. M., & Jensvoll, I. (2015). Under the snow: A new camera trap opens the white box of subnivean ecology. Remote Sensing in Ecology and Conservation, 1, 29–38.
Steen, R. (2014). The use of a low cost high speed camera to monitor wingbeat frequency in hummingbirds (Trochilidae). Ardeola, 61, 111–120.
Steen, R., & Ski, S. (2014). Video-surveillance system for remote long-term in situ observations: Recording diel cavity use and behaviour of wild European lobsters (Homarus gammarus). Marine and Freshwater Research, 65, 1094–1101.
Swann, D. E., Hass, C. C., Dalton, D. C., & Wolf, S. A. (2004). Infrared-triggered cameras for detecting wildlife: An evaluation and review. Wildlife Society Bulletin (1973–2006), 32, 357–365.
Villette, P., Krebs, C. J., Jung, T. S., & Boonstra, R. (2016). Can camera trapping provide accurate estimates of small mammal (Myodes rutilus and Peromyscus maniculatus) density in the boreal forest? Journal of Mammalogy, 97, 32–40.
Wade, M. R., Zalucki, M. P., & Franzmann, B. A. (2005). Influence of observer presence on Pacific damsel bug behavior: Who is watching whom? Journal of Insect Behavior, 18, 651–667.
Weale, R. A. (1961). Limits of human vision. Nature, 191, 471–473.
Weinstein, B. G. (2014). MotionMeerkat: Integrating motion video detection and ecological monitoring (S. Dray, Ed.). Methods in Ecology and Evolution, 6, 357–362.
Welbourne, D. J., Claridge, A. W., Paull, D. J., & Lambert, A. (2016). How do passive infrared triggered camera traps operate and why does it matter? Breaking down common misconceptions. Remote Sensing in Ecology and Conservation, 2, 77–83.
Zeljkovic, V. (2013). Video surveillance techniques and technologies. Hershey, PA: IGI Global.

Journal

Ecology and Evolution (Wiley)

Published: Jul 1, 2017

