Open Science Drone Toolkit: Open source hardware and software for aerial data capture

  • Gustavo Pereyra Irujo ,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    pereyrairujo.gustavo@conicet.gov.ar

    Affiliation Instituto Nacional de Tecnología Agropecuaria, Consejo Nacional de Investigaciones Científicas y Técnicas, Balcarce, Argentina

  • Paz Bernaldo,

    Roles Conceptualization, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Writing – review & editing

    Affiliation Independent Researcher, Melipilla, Chile

  • Luciano Velázquez,

    Roles Conceptualization, Investigation, Methodology, Software, Supervision, Validation, Writing – review & editing

    Affiliation Facultad de Ciencias Agrarias, Universidad Nacional de Mar del Plata, Balcarce, Argentina

  • Antoni Pérez,

    Roles Conceptualization, Investigation, Methodology, Resources, Software, Supervision, Validation, Writing – review & editing

    Affiliation Independent Researcher, Santiago, Chile

  • Celeste Molina Favero,

    Roles Data curation, Investigation, Supervision, Validation, Writing – review & editing

    Affiliation Unidad Integrada Balcarce, Universidad Nacional de Mar del Plata, Instituto Nacional de Tecnología Agropecuaria, Balcarce, Argentina

  • Alejandrina Egozcue

    Roles Conceptualization, Investigation, Methodology, Resources, Software, Supervision, Writing – review & editing

    Affiliation Independent Researcher, Balcarce, Argentina

Abstract

Despite the increased access to scientific publications and data brought about by open science initiatives, access to scientific tools remains limited. Uncrewed aerial vehicles (UAVs, or drones) can be a powerful tool for research in disciplines such as agriculture and environmental sciences, but their use in research is currently dominated by proprietary, closed source tools. The objective of this work was to collect, curate, organize and test a set of open source tools for aerial data capture for research purposes. The Open Science Drone Toolkit was built through a collaborative and iterative process by more than 100 people in five countries, and comprises an open-hardware autonomous drone, off-the-shelf hardware, open-source software, and guides and protocols that enable the user to perform all the tasks necessary to obtain aerial data. Data obtained with this toolkit over a wheat field were compared to data from satellite imagery and a commercial hand-held sensor, and a high correlation was found with both instruments. Our results demonstrate the possibility of capturing research-grade aerial data using affordable, accessible, and customizable open source software and hardware, and open workflows.

Introduction

Openness has been central to modern science since its inception [1], through the publication of theories and the data on which they are based, and the encouragement of replication, scrutiny and challenge [2]. In recent decades, information technologies have enabled the rise of a global "open science" movement that seeks not only to increase the transparency and dissemination of scientific processes and products, but also to enable more widespread collaboration, participation and inclusion in science [3–5]. Although access to scientific publications and raw data has increased significantly in recent years [6–9], access to the scientific tools needed to obtain or analyze data (i.e., scientific instruments, materials and software) remains one of the main barriers to wider participation in science production and to the replication or reproduction of published results [10,11].

Open source software is computer code that is licensed so that the user has the freedom to copy and redistribute it, have access to the source code, and make improvements to it, among other rights [12]. Similarly, open source hardware is any physical object or artifact whose design is available so that anyone can study, modify, distribute, make, and sell the design or hardware based on that design [13]. Open source research software and open scientific instruments and materials are considered to provide a series of advantages over proprietary alternatives: i) being either cost-free (in the case of software) or usually more affordable (in the case of hardware), they allow more people to participate in science, especially non-professional or budget-limited researchers; ii) the reproducibility of published results and replication attempts are less constrained by lack of access to the tools that were originally used; iii) access to the software code or hardware design allows a better understanding of how the tool works and of the methods or algorithms it implements; and iv) the tools can be customized and adapted to new uses or local contexts [11,14–16].

Uncrewed aerial vehicles (UAVs, usually called "drones") can be a powerful tool for research in disciplines such as agriculture and environmental sciences, allowing the capture of high-resolution aerial imagery with great speed and flexibility [17]. Drone use in research is growing rapidly but is dominated by closed source tools: in a recent literature review of applications in agro-environmental monitoring, more than 80% of studies used fully closed-source drones, and more than 90% used proprietary closed-source software for image processing [18]. Proprietary drone solutions usually require a significant initial investment, monthly software subscriptions, or an internet connection for cloud processing, all of which can constitute barriers for many low-resource users [19,20], and they usually function as a "black box" that offers users little insight into its internal workings and limited customization [21]. Moreover, when these solutions are deployed in developing countries, concerns have been raised regarding limited repairability and the risks of technological dependence and extractive practices [22].

For a typical use of drones in environmental or agricultural research, the drone must be reliably and precisely positioned over the studied terrain and capture images that can later be processed into a high-quality image of the surveyed area from which useful data can be extracted [23–25]. Although many open source hardware and software tools already exist for each individual step of this process, our question was whether it was possible to perform all of these steps using open tools. The objective of this work was to address this question by collecting, curating, organizing and testing a comprehensive set of open source tools for aerial data capture for research purposes. The result is the Open Science Drone Toolkit (OSDT), which is presented here in detail and is also available online at https://vuela.cc/en/toolkit.

Toolkit design process

The OSDT was developed as part of "Vuela" [26,27], a research-action project that aimed to address the lack of access to the creation of scientific and technological knowledge by exploring an alternative way of developing scientific tools. The toolkit was built through a collaborative and iterative process involving the work of more than 100 people between 2017 and 2019, in more than 30 local, in-person workshops in five countries (Argentina, Brazil, Chile, Paraguay and Uruguay), as well as permanent online collaboration. Workshop participants included school students, traditional scientists, technicians, hobbyists, journalists, local community members and self-taught software developers; they included both academics and people with no formal academic or technology background, and people with and without experience making or using drones. Work was also carried out in several languages: Spanish, Haitian Créole, French, Portuguese and English.

The Open Science Drone Toolkit is a set of hardware and software tools, guides, and protocols that enable the user to perform all the necessary tasks to obtain aerial data, as detailed in Table 1. These steps represent a ‘typical’ use case, but can be modified according to the specific research question.

Table 1. Tasks that can be performed using the Open Science Drone Toolkit in order to obtain data from aerial images.

https://doi.org/10.1371/journal.pone.0284184.t001

One of the first objectives of the project was to put into practice one of the less commonly exercised freedoms of open source hardware: the freedom to modify an existing design. Instead of developing a drone from scratch, the project started by replicating, testing, and identifying potential improvements to an already available open-source drone called "Flone" [28]. The original design had limited capabilities for performing the tasks listed in Table 1, so a series of changes was needed to make the drone suitable for research purposes: increasing the range that the drone could safely cover, adding satellite navigation capability, increasing the payload capacity, and improving the stability of the camera. This iterative process of hardware development was carried out in conjunction with a careful selection of open source software tools (and the development of new ones) to perform each of the tasks listed in Table 1, and the development of protocols and detailed user guides. The resulting set of tools is described in the following section.

Toolkit components

The components of the OSDT are listed in Table 2. The main component is the open-hardware drone, developed especially for this toolkit. The other hardware components of the toolkit are not open source, but they are mostly off-the-shelf equipment that can be readily replaced. The software components are all open-source projects, either selected for their suitability for each task or developed especially for the toolkit. Finally, the documentation includes assembly and usage guides for the toolkit.

Table 2. Summary of the hardware, software, and documentation components of the Open Science Drone Toolkit.

https://doi.org/10.1371/journal.pone.0284184.t002

The OVLI drone (an acronym for "Objeto Volador Libre", which means "Free Flying Object" in Spanish) is a quadcopter (i.e., a helicopter with four propellers), equipped with an autopilot board with accelerometer, gyroscope, barometer, and GNSS (Global Navigation Satellite System) sensors that allow fully autonomous flight. The autopilot is an open source Pixhawk board [29,30], running the open source ArduPilot/ArduCopter firmware [31,32]. The OVLI has a frame diameter of 395 mm and weighs 0.773 kg without batteries. Its frame is assembled from MDF (Medium Density Fibreboard) cut with a laser cutter according to a design file, which can be easily edited to modify the drone structure. This material was chosen because it is widely available, low cost, and easy to assemble, repair and modify. The final design of the OVLI is shown in Fig 1, alongside the original "Flone" drone on which it was based.

Fig 1. The Flone and OVLI drones.

(A) The original "Flone" design [28], upon which further versions of the open hardware drone were developed. (B) The "OVLI" drone developed as part of the Open Science Drone Toolkit.

https://doi.org/10.1371/journal.pone.0284184.g001

The OVLI drone has a payload capacity of around 500 g, enough for a high-resolution RGB (Red-Green-Blue, i.e., visible spectrum) camera, a multi-spectral camera, or other sensors. Maximum flight time, measured from take-off until the low-battery alarm is activated (with approximately 30% of battery capacity remaining), is 11 minutes when using a 5000 mAh battery and carrying a 141 g camera as payload.

The OVLI drone can be operated manually through a radio controller, but for research purposes it is usually more convenient to fly it autonomously using a pre-programmed flight plan. Planning usually begins by identifying the area to be surveyed (task #1 in Table 1), which can be done by walking around the perimeter of the area carrying a smartphone running the open-source app GPSLogger (Table 2). The resulting file with the coordinates is uploaded to the open-source software Mission Planner (Table 2) to design the flight plan (task #2 in Table 1) according to the desired image resolution, the maximum flight time of the drone, and other constraints; a sketch of the underlying resolution arithmetic is shown below. This flight plan is uploaded to the OVLI drone, which can later fly autonomously while capturing the images (task #4 in Table 1). During flight, the Mission Planner software is also used to view live telemetry data, such as the position of the drone, battery voltage, altitude and speed.
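
As an illustration of the resolution arithmetic behind flight planning, the following minimal sketch computes the ground sample distance (GSD) of a nadir-pointing camera as GSD = sensor width × altitude / (focal length × image width). The camera parameters are illustrative values for a small-sensor compact camera, not measured specifications of the toolkit camera.

```python
# Minimal sketch of the ground sample distance (GSD) arithmetic used
# when planning a flight for a desired image resolution. The camera
# parameters below are illustrative, not measured toolkit specs.

def gsd_cm_per_px(altitude_m, sensor_width_mm, focal_length_mm, image_width_px):
    """Ground sample distance (cm/pixel) for a nadir-pointing camera."""
    return (sensor_width_mm * altitude_m * 100) / (focal_length_mm * image_width_px)

altitude = 50.0       # flight altitude (m), as in the example use case below
sensor_width = 6.17   # 1/2.3" sensor width (mm), illustrative
focal_length = 5.0    # lens focal length (mm), illustrative
image_width = 4000    # image width (px) of a 12-megapixel camera

print(f"GSD = {gsd_cm_per_px(altitude, sensor_width, focal_length, image_width):.1f} cm/px")
```

With these assumed parameters, a 50 m flight yields a GSD of about 1.5 cm/pixel, of the same order as the 2 cm/pixel orthomosaic reported in the example use case below.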

The camera selected for the toolkit is an RGB, 12-megapixel pocket camera (Canon PowerShot ELPH100HS, or equivalent). This kind of camera was selected for two main reasons: 1) "pocket" or "point and shoot" cameras usually have a mechanical shutter, which means that all pixels of an image are captured at the same time, whereas many "action cameras" commonly used on drones have an electronic "rolling shutter", in which the sensor captures the image line by line, potentially introducing distortions [33]; 2) most Canon cameras can be "hacked" by means of the open-source CHDK (Canon Hack Development Kit) software [34], which allows the camera to capture images automatically and camera parameters (e.g., shutter speed and ISO values; task #3 in Table 1) to be set manually, so as to capture sharp, well-exposed images suitable for further processing and data extraction [35]. This type of camera does not record the location of the images (which is necessary for later obtaining a georeferenced mosaic), so this information has to be retrieved from the flight log of the drone. This process is called geotagging (task #5 in Table 1) and can be performed using the Mission Planner ground station software; a conceptual sketch is shown below. This kind of off-the-shelf RGB camera has been shown to be useful for the measurement of vegetation indices [36]. Moreover, the camera filters can be modified by replacing the standard near-infrared (NIR) blocking filter with an appropriate filter, resulting in a low-cost camera capable of detecting two visible bands and one NIR band [37,38].
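
Conceptually, geotagging matches each image's capture timestamp to the nearest position fix in the drone's flight log. The following sketch illustrates the idea; it is not Mission Planner's implementation, and the timestamps and coordinates are hypothetical.

```python
import bisect
from datetime import datetime

# Conceptual sketch of geotagging: assign to each image the GPS fix
# whose timestamp is closest to the image capture time. The flight log
# data below is hypothetical.

gps_fixes = [  # (timestamp, latitude, longitude) from the flight log
    (datetime(2020, 12, 5, 12, 0, 0), -37.7642, -58.5160),
    (datetime(2020, 12, 5, 12, 0, 2), -37.7643, -58.5158),
    (datetime(2020, 12, 5, 12, 0, 4), -37.7644, -58.5156),
]

def geotag(image_time, fixes):
    """Return the GPS fix closest in time to the image capture time."""
    times = [t for t, _, _ in fixes]
    i = bisect.bisect_left(times, image_time)
    candidates = fixes[max(0, i - 1):i + 1]
    return min(candidates, key=lambda fix: abs(fix[0] - image_time))

_, lat, lon = geotag(datetime(2020, 12, 5, 12, 0, 3), gps_fixes)
print(f"Image position: {lat:.4f}, {lon:.4f}")
```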

The captured images then need to be merged to obtain a rectified and georeferenced image of the complete surveyed area, known as an orthomosaic (task #6 in Table 1). A suitable open-source software for this task is currently OpenDroneMap [39], which has been shown to provide high-quality results comparable to those of widely used commercial packages [21]. The next step is to extract information from the orthomosaic image; this step depends largely on the research question being addressed. The open source software QGIS [40] can be used to open the georeferenced orthomosaic, calculate vegetation indices, measure areas, and perform many other data extraction tasks (task #7 in Table 1); the same operations can also be scripted, as sketched below.
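
As an example of how these operations can be scripted, the following hedged sketch computes the VARI vegetation index (the index used later in the example use case) from an RGB orthomosaic using the open-source rasterio library; the file names are illustrative, and a small constant avoids division by zero.

```python
import rasterio

# Hedged sketch: a scripted alternative to the QGIS raster calculator,
# computing VARI from an RGB orthomosaic. File names are illustrative.

with rasterio.open("orthomosaic.tif") as src:
    red, green, blue = (src.read(i).astype("float32") for i in (1, 2, 3))
    profile = src.profile

# VARI = (green - red) / (green + red - blue), Gitelson et al. 2002 [42]
vari = (green - red) / (green + red - blue + 1e-6)

profile.update(count=1, dtype="float32")  # single-band float output
with rasterio.open("vari.tif", "w", **profile) as dst:
    dst.write(vari, 1)
```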

The open-source application "Bitácora" ("logbook" in Spanish) (https://vuela.cc/en/bitacora), developed especially for the OSDT, helps visualize and organize all the files, images and metadata generated during the whole process, for archiving, sharing or further processing. The user only needs to save all the files generated in a flight (survey area polygon, flight plan, captured images, mosaic, elevation model, etc.) in a folder, and the program will automatically generate a map visualizing the files and a table with flight information (flight date and time, location, altitude, speed, and the names of relevant files; Fig 2). This information is also saved in open formats compatible with other software (the flight information table in csv format, the flight map in png and kml formats).
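
As a rough illustration of the kind of automation such a logbook tool provides, this minimal sketch (not Bitácora's actual code) scans one folder per flight and writes a CSV inventory in an open format; the folder layout and file extensions are assumptions.

```python
import csv
from pathlib import Path

# Minimal sketch: inventory the files of each flight folder and write a
# CSV summary. Bitácora extracts richer metadata (date, location,
# altitude); the extensions and layout below are assumptions.

EXTENSIONS = {".jpg": "images", ".tif": "mosaics", ".waypoints": "flight plans",
              ".kml": "survey polygons", ".bin": "flight logs"}

def summarize_flight(folder):
    """Count the files of each known type inside one flight folder."""
    row = {"flight": folder.name}
    for f in folder.iterdir():
        kind = EXTENSIONS.get(f.suffix.lower())
        if kind:
            row[kind] = row.get(kind, 0) + 1
    return row

rows = [summarize_flight(p) for p in Path("flights").iterdir() if p.is_dir()]
fields = ["flight"] + sorted({k for r in rows for k in r} - {"flight"})
with open("logbook.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=fields, restval=0)
    writer.writeheader()
    writer.writerows(rows)
```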

Fig 2. Screenshots from the "Bitácora" software.

(A) The main window, showing the list of logs already registered in the logbook. (B-C) Individual flight windows showing details for selected flights.

https://doi.org/10.1371/journal.pone.0284184.g002

The toolkit documentation includes: 1) an "assembly guide" with step-by-step instructions for building the OVLI drone, setting up and configuring the hardware components, and installing the software, and 2) a "usage guide" with instructions for flying the drone, programming an autonomous mission, programming the camera, and processing the images. Both are available as openly licensed documents (under a Creative Commons license allowing users to redistribute and build upon the material, with attribution), ready for download in PDF and HTML formats, and also as live documents (in Google Docs) open for suggestions. The guides have a simple layout that allows for automatic machine translation into many languages, readily available from the project website.

Example use case

An example use case is presented here in which the OSDT was used to obtain data on the spatial variability in the maturity of a wheat crop, assessed through a vegetation index that quantifies the "greenness" of the crop canopy. A 6000 m² area was surveyed in a wheat field sown in late July 2020 in Balcarce, Buenos Aires Province, Argentina. This particular field was selected for having uneven crop maturity due to variability in soil depth. Following the steps in the toolkit guide, the area of interest was first delimited (Fig 3A), a flight plan was designed (Fig 3B), and the OVLI drone was flown on the 5th of December 2020, when the crop was in the grain filling stage (Zadoks stage 7.7). A total of 151 images were captured at a flight altitude of 50 m (Fig 3C), and 104 of them (discarding those captured during takeoff and landing) were used to obtain an orthomosaic with a resolution of 2 cm/pixel (Fig 3D).

Fig 3. Aerial data capture and analysis steps followed in the presented example use case.

A) identification of the study area, done first in situ with a location recorder app (bright red line) and then manually refined by drawing a rectangular polygon (dark red area); B) design of the flight plan to cover the study area using a "grid" or "lawn-mower" pattern; C) flight and image capture, represented by the actual flight path of the drone (blue line) and the position of each captured image (blue circles); D) orthomosaic obtained by joining the captured images; and E) vegetation index (VARI) calculated from the orthomosaic data. The background image is a 10 x 10 m resolution satellite image captured one day after the drone flight (retrieved from Sentinel Hub EO Browser under a CC-BY 4.0 license [45]).

https://doi.org/10.1371/journal.pone.0284184.g003

Vegetation indices are transformations of data obtained from optical sensors, usually based on plants' increased reflectance in the green and/or infrared wavelengths, that can be used to quantify spatial and temporal variations in vegetation characteristics [41]. A vegetation index that has previously been used to estimate wheat growth and phenology [42,43], the Visible Atmospherically Resistant Index (VARI = (green − red) / (green + red − blue)), was calculated from the red, green and blue channels of the orthomosaic in order to quantify the "greenness" of the crop as an indicator of its degree of maturity (Fig 3E). The data obtained from the drone images were compared to the Normalized Difference Vegetation Index (NDVI = (NIR − red) / (NIR + red)), the most widely used vegetation index [44], obtained from two sources: a handheld sensor and satellite data.

Satellite data from the Sentinel 2 constellation (European Space Agency) were retrieved from the Sentinel Hub EO Browser under a CC-BY 4.0 license [45]. Data from bands B04 (red, central wavelength = 665 nm) and B08 (near infrared, central wavelength = 842 nm) for the 6th of December 2020 (one day after the drone image capture), at 10 m/pixel resolution, were used to calculate NDVI values for the surveyed area (Fig 4A). For the comparison between satellite (NDVI) and drone (VARI) data, 140 areas of 10 x 10 m, equivalent to the satellite image pixels, were delimited in the processed orthomosaic (Fig 4C), and the mean VARI value was measured in each of them (a sketch of this aggregation is shown below). One caveat to this comparison is that the georeferencing of the orthomosaic was based on the drone's GNSS sensor, which usually has an accuracy of about 2 m [46], so the correspondence between these 10 x 10 m areas and the satellite image pixels might not have been complete.
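
The aggregation of the fine-resolution drone raster into satellite-sized pixels can be sketched as a block-mean operation. The following is a minimal illustration, assuming the VARI raster has already been cropped and aligned to the satellite grid; at 2 cm/pixel, a 10 x 10 m block corresponds to 500 x 500 pixels.

```python
import numpy as np

# Minimal sketch: average a fine-resolution raster over n-by-n pixel
# blocks to match coarser satellite pixels. Assumes the array is
# already aligned to the satellite grid; the array here is a stand-in.

def block_means(raster, n):
    """Mean of each n x n block (edges that do not fill a block are trimmed)."""
    h, w = (raster.shape[0] // n) * n, (raster.shape[1] // n) * n
    blocks = raster[:h, :w].reshape(h // n, n, w // n, n)
    return np.nanmean(blocks, axis=(1, 3))

n = int(10 / 0.02)                  # 10 m block at 2 cm/pixel = 500 px
vari = np.random.rand(2000, 3500)   # stand-in for the real VARI raster
print(block_means(vari, n).shape)   # (4, 7): one mean per 10 x 10 m block
```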

Fig 4. Comparison between data obtained with the OSDT and data obtained from satellite imagery and a handheld sensor.

A) NDVI satellite image from Sentinel 2; B) VARI image from drone orthomosaic; C) VARI drone image showing the 10 x 10 m areas equivalent to the satellite image pixels delimited in the processed orthomosaic; D) transects used for measurement with hand-held sensor; E) relationship between NDVI from the satellite image and VARI from the drone; and F) relationship between NDVI from the hand-held sensor and VARI from the drone. Satellite image data was retrieved from Sentinel Hub EO Browser under a CC-BY 4.0 license [45].

https://doi.org/10.1371/journal.pone.0284184.g004

On the 28th of November 2020 (seven days before the drone image capture), a hand-held sensor (Greenseeker, N-tech Industries, USA) was used to measure NDVI in six transects parallel to the crop rows, each 70 meters long and spaced 18 meters apart (Fig 4D). To aid the correspondence between sensor and drone data, a visible mark that could be easily identified in the aerial images was placed at the start of the first transect, and each subsequent transect was started based on ground measurements relative to that reference. The sensor was held one meter above the canopy, resulting in a field of view of about 60 cm. Around 700 data points were recorded in each transect, equivalent to around one point every 10 cm. For the comparison between the handheld NDVI sensor and drone (VARI) data, the NDVI data were averaged every two meters (yielding 35 data points per transect, and 210 in total), as sketched below. Similarly, 35 areas of 2 x 2 m along each transect were delimited in the processed orthomosaic, and the mean VARI value was measured in each of them.
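
The two-meter averaging of the sensor readings amounts to simple binning along the transect. A minimal sketch follows, with synthetic stand-ins for the sensor log:

```python
import numpy as np

# Minimal sketch: average ~700 handheld NDVI readings per 70 m transect
# into 2 m bins (35 values per transect). The distance and NDVI arrays
# are synthetic stand-ins for a real sensor log.

rng = np.random.default_rng(0)
distance = np.linspace(0, 70, 700, endpoint=False)   # ~1 reading per 10 cm
ndvi = 0.6 + 0.1 * np.sin(distance / 10) + rng.normal(0, 0.02, 700)

bins = (distance // 2).astype(int)                   # 2 m bins: 0..34
ndvi_2m = np.array([ndvi[bins == b].mean() for b in range(35)])
print(ndvi_2m.shape)                                 # (35,)
```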

VARI data obtained from the drone orthomosaic were plotted against NDVI data from the satellite image and the handheld sensor, and an equation of the form y = a·x^b − c was fitted to each set of data. The NDVI index usually has a curvilinear relationship with the green leaf area index, while the VARI index has been shown to have a rather linear relationship [42]; a curvilinear relationship between NDVI and VARI can therefore be expected. A high correlation was found in both cases, with R² values of 0.84 and 0.88 when the drone data were compared to the satellite and hand-held sensor data, respectively (Fig 4E and 4F). A single curve could also be fitted to both datasets, with an R² of 0.86 (not shown). A sketch of this fitting procedure is shown below.
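
The fitting procedure described above can be reproduced with standard open-source tools. The following sketch fits y = a·x^b − c with SciPy; the data arrays are synthetic stand-ins for the VARI/NDVI pairs, and the starting parameters are arbitrary.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: fit y = a * x**b - c to paired vegetation index data.
# The x (VARI) and y (NDVI) arrays are synthetic stand-ins.

def power_model(x, a, b, c):
    return a * np.power(x, b) - c

rng = np.random.default_rng(0)
x = np.linspace(0.05, 0.6, 140)                      # VARI, stand-in
y = power_model(x, 1.1, 0.4, 0.15) + rng.normal(0, 0.02, x.size)

params, _ = curve_fit(power_model, x, y, p0=(1.0, 0.5, 0.1))
residuals = y - power_model(x, *params)
r2 = 1 - np.sum(residuals**2) / np.sum((y - np.mean(y))**2)
print(f"a={params[0]:.2f}, b={params[1]:.2f}, c={params[2]:.2f}, R²={r2:.2f}")
```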

Discussion

This paper reports the results of developing a complete toolkit in order to demonstrate the possibility of capturing research-grade aerial data using open source software and hardware. The example use case shows its suitability for tasks that can be useful for many research questions (and also for commercial applications, e.g., farming). The wide range of possible applications of aerial imaging in research, however, cannot be fully covered by any single toolkit. Nevertheless, the open nature of the OSDT allows its components to be used separately or replaced by alternative tools, either open-source or proprietary, as necessary. For example, if the area of interest is significantly larger than the one shown in the example, the flight time of the OVLI drone would not be sufficient; in that case, a "fixed wing" drone such as the open-source "Asa-Branca-I" [47] could cover more than 100 ha in a single flight (albeit with lower image resolution). If the cost of a drone is a limitation, or if drones cannot be used due to regulations, aerial images can also be captured using kites [48]. It is also possible to use the image processing, analysis and management software tools of the toolkit with aerial images obtained with proprietary drones.

Concerns about the quality and reliability of open source tools (especially hardware) can sometimes limit their use [49]. In this work we attempted to overcome this limitation by comparing the results obtained with the OSDT to data from satellite imagery and a commercial hand-held sensor, finding a high correlation with both instruments. The two sensors used for comparison have been among the most popular tools used by farmers for crop monitoring for around 20 years [50], and both make use of the most widely used vegetation index (NDVI) [44]; they can therefore be considered a reliable benchmark. Studies with off-the-shelf RGB cameras such as the one used in the OSDT have shown that they can yield robust measurements of vegetation indices [36], which can be further improved through radiometric calibration [51], with results comparable to those of multispectral cameras that can cost several times more [52–55], and with higher spatial resolution [56]. Although not shown in this paper, the OSDT can also be used to generate digital elevation maps and 3D point clouds of the studied terrain. Studies using similar tools have shown that it is possible to obtain results similar to those of expensive LiDAR (Light Detection And Ranging) systems when appropriate ground control points are used [57,58].

Another concern about open source research tools is their sustainability [59], especially when they lack an associated business model to provide funding, considering that long-term availability is important for reproducibility and replicability [60]. Nevertheless, open source is usually considered a prerequisite for sustainable research software, as it ensures the possibility of continuous validation, reuse and improvement [61,62], while proprietary software has been found to be an obstacle to reproducibility [63]. The sustainability of open hardware has not been studied extensively [64], but it has been argued that open source hardware can provide more long-term security to research projects because of the possibility of in-house repair if the original provider goes out of business [14]. We therefore argue that using the OSDT for aerial data capture for research purposes can be considered a more sustainable option than proprietary commercial systems.

While there have been significant technical improvements in drones and sensors in recent years, little attention has been paid to the management and storage of the increasingly large and complex datasets that result from drone operations [65]. There are few standards for drone data management, sharing or publication, which makes collaboration and reproducibility difficult [66]. Commercial drone packages usually offer complete solutions, but generally at the cost of expensive licenses, reduced interoperability with other tools, and limited customization. Open source tools offer a more flexible but fragmented landscape, and users can be set back by the need to deal with many individual components. In the OSDT, the software "Bitácora" was developed to help overcome this issue by centralizing all the files and data generated in the different steps of aerial data capture, which would otherwise have to be managed and visualized using many different tools. As open source software, it can also be extended to incorporate other data formats and file types (e.g., flight plans for proprietary drones, images from multispectral cameras, etc.). It also helps the user collect all the generated data in a single folder, adding the corresponding metadata, which facilitates openly sharing research data in reusable and interoperable formats. "Bitácora" can automatically extract part of the metadata required by the Minimum Information Framework for drone users (proposed by [66]), thus helping users make their data "FAIR" (findable, accessible, interoperable and reusable). Another open source software with a similar goal is "DroneDB" [67], which aids especially in sharing datasets of images, orthomosaics and other drone products through a cloud interface, but without providing flight metadata. "Bitácora" is therefore a small but key component of the OSDT, since it helps "bundle" the toolkit together and use it to further the goals of open science.

Conclusions

This paper presented the Open Science Drone Toolkit, which enables the user to perform all the tasks necessary to obtain aerial data. Data obtained with this toolkit over a wheat field were compared to data from satellite imagery and a commercial hand-held sensor, and a high correlation was found with both instruments. These results demonstrate the possibility of capturing research-grade aerial data using affordable, accessible and customizable open source software and hardware, and open workflows.

Supporting information

S1 File. Alternative language article.

Complete manuscript in Spanish: “Herramientas de código abierto para la captura de datos aéreos mediante drones”.

https://doi.org/10.1371/journal.pone.0284184.s001

(PDF)

Acknowledgments

We would like to acknowledge the contributions of the participants of the Vuela project workshops and online collaborators that made building this toolkit possible, especially Loulou Jude, Daniela Muñoz, Lot Amorós, Guillermo Pereyra Irujo, Nicolás Narváez, Carla Alvial, Fernando Yamada, John Arancibia, Constanza Alberio, Vicente Dimuro, and Stevens Azima. We want to thank Junta de Vecinos Teniente Merino Alto, Junta de Vecinos Francisco Werchez, Universidad Católica, INIA Rayentué (Chile), INTA Balcarce, Club Social de Innovación Balcarce, R’lyeh Hacklab, INTA Marcos Juárez, and Universidad Nacional de Cuyo (Argentina), IPTA Capitán Miranda (Paraguay), Universidade Federal do Rio Grande do Sul (Brazil), and INIA La Estanzuela (Uruguay) for providing the venues for the workshops. We also want to thank Abril Pereyra Molina and Julián Pereyra Molina for their assistance with field measurements.

References

  1. Hull D. Openness and Secrecy in Science: Their Origins and Limitations. Sci Technol Hum Values. 1985;10(2):4–12.
  2. Boulton G, Campbell P, Collins B, Elias P, Hall W, Laurie G, et al. Science as an open enterprise. London: The Royal Society; 2012.
  3. Bahlai C, Bartlett L, Burgio K, Fournier A, Keiser C, Poisot T, et al. Open Science Isn't Always Open to All Scientists. Am Sci. 2019;107(2):78.
  4. UNESCO. UNESCO Recommendation on Open Science. UNESCO; 2021. Available from: https://unesdoc.unesco.org/ark:/48223/pf0000379949.locale=en.
  5. Wolff B, Schlagwein D. From Open Science to Open Source (and beyond): A Historical Perspective on Open Practices without and with IT. In: 17th International Symposium on Open Collaboration. New York, NY, USA: Association for Computing Machinery; 2021. p. 1–11. (OpenSym 2021). https://doi.org/10.1145/3479986.3479990
  6. Boulton G, Rawlins M, Vallance P, Walport M. Science as a public enterprise: the case for open data. The Lancet. 2011;377(9778):1633–5. pmid:21571134
  7. Forero DA, Curioso WH, Patrinos GP. The importance of adherence to international standards for depositing open data in public repositories. BMC Res Notes. 2021;14(1):405. pmid:34727971
  8. Himmelstein DS, Romero AR, Levernier JG, Munro TA, McLaughlin SR, Greshake Tzovaras B, et al. Sci-Hub provides access to nearly all scholarly literature. eLife. 2018;7:e32822. pmid:29424689
  9. Piwowar H, Priem J, Larivière V, Alperin JP, Matthias L, Norlander B, et al. The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles. PeerJ. 2018;6:e4375. pmid:29456894
  10. Arancio J, Dosemagen S. Bringing Open Source to the Global Lab Bench. Issues in Science and Technology. 2022. Available from: https://issues.org/open-source-science-hardware-gosh-arancio-dosemagen/.
  11. Walters WP. Code Sharing in the Open Science Era. J Chem Inf Model. 2020;60(10):4417–20. pmid:32937075
  12. Open Source Initiative. The Open Source Definition. 2007. Available from: https://opensource.org/osd.
  13. Open Source Hardware Association. Open Source Hardware (OSHW) Statement of Principles 1.0. 2016. Available from: https://www.oshwa.org/definition/.
  14. Chagas AM. Haves and have nots must find a better way: The case for open scientific hardware. PLOS Biol. 2018;16(9):e3000014. pmid:30260950
  15. Diederich B, Müllenbroich C, Vladimirov N, Bowman R, Stirling J, Reynaud EG, et al. CAD we share? Publishing reproducible microscope hardware. Nat Methods. 2022;1–5.
  16. Ravindran S. How DIY technologies are democratizing science. Nature. 2020;587(7834):509–11. pmid:33204016
  17. Chabot D. Trends in drone research and applications as the Journal of Unmanned Vehicle Systems turns five. J Unmanned Veh Syst. 2018;6(1):vi–xv.
  18. Eskandari R, Mahdianpari M, Mohammadimanesh F, Salehi B, Brisco B, Homayouni S. Meta-analysis of Unmanned Aerial Vehicle (UAV) Imagery for Agro-environmental Monitoring Using Machine Learning and Statistical Models. Remote Sens. 2020;12(21):3511.
  19. Paneque-Gálvez J, Vargas-Ramírez N, Napoletano BM, Cummings A. Grassroots Innovation Using Drones for Indigenous Mapping and Monitoring. Land. 2017;6(4):86.
  20. Vargas-Ramírez N, Paneque-Gálvez J. The Global Emergence of Community Drones (2012–2017). Drones. 2019;3(4):76.
  21. Pell T, Li JYQ, Joyce KE. Demystifying the Differences between Structure-from-Motion Software Packages for Pre-Processing Drone Data. Drones. 2022;6(1):24.
  22. Hanrahan BV, Maitland C, Brown T, Chen A, Kagame F, Birir B. Agency and Extraction in Emerging Industrial Drone Applications: Imaginaries of Rwandan Farm Workers and Community Members. Proc ACM Hum-Comput Interact. 2021;4(CSCW3):233:1–233:21.
  23. Assmann JJ, Kerby JT, Cunliffe AM, Myers-Smith IH. Vegetation monitoring using multispectral sensors—best practices and lessons learned from high latitudes. J Unmanned Veh Syst. 2019;7(1):54–75.
  24. Rusnák M, Sládek J, Kidová A, Lehotský M. Template for high-resolution river landscape mapping using UAV technology. Measurement. 2018;115:139–51.
  25. Tmušić G, Manfreda S, Aasen H, James MR, Gonçalves G, Ben-Dor E, et al. Current Practices in UAS-based Environmental Monitoring. Remote Sens. 2020;12(6):1001.
  26. Bernaldo P, Pereyra Irujo G. Proyecto "Vuela." Liinc Em Rev. 2018;14(1).
  27. Proyecto Vuela. Proyecto Vuela: ciencia libre con drones. 2018. Available from: http://vuela.cc/.
  28. Amorós L, Varona N, Boronat R, Rangholia C. Flone: una plataforma para que los smartphones puedan volar. 2015. Available from: https://flone.cc/.
  29. Meier L, Tanskanen P, Heng L, Lee GH, Fraundorfer F, Pollefeys M. PIXHAWK: A micro aerial vehicle design for autonomous flight using onboard computer vision. Auton Robots. 2012;33(1):21–39.
  30. Pixhawk Project. Pixhawk: the hardware standard for open-source autopilots. 2014. Available from: https://pixhawk.org/.
  31. ArduPilot Project. ArduPilot. 2016. Available from: https://ardupilot.org.
  32. Short J, Mackay R, Robustini M. ArduCopter. 2015. Available from: https://github.com/ArduPilot/ardupilot/blob/master/ArduCopter/Copter.cpp.
  33. Vautherin J, Rutishauser S, Schneider-Zapp K, Choi HF, Chovancova V, Glass A, et al. Photogrammetric accuracy and modeling of rolling shutter cameras. In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Copernicus GmbH; 2016. p. 139–46. Available from: https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/III-3/139/2016/.
  34. CHDK Development Team. Canon Hack Development Kit. 2007. Available from: https://chdk.fandom.com/wiki/CHDK.
  35. O'Connor J, Smith MJ, James MR. Cameras and settings for aerial surveys in the geosciences: Optimising image data. Prog Phys Geogr Earth Environ. 2017;41(3):325–44.
  36. Svensgaard J, Jensen SM, Westergaard JC, Nielsen J, Christensen S, Rasmussen J. Can reproducible comparisons of cereal genotypes be generated in field experiments based on UAV imagery using RGB cameras? Eur J Agron. 2019;106:49–57.
  37. Sankaran S, Zhou J, Khot LR, Trapp JJ, Mndolwa E, Miklas PN. High-throughput field phenotyping in dry bean using small unmanned aerial vehicle based multispectral imagery. Comput Electron Agric. 2018;151:84–92.
  38. Jewan SYY, Pagay V, Billa L, Tyerman SD, Gautam D, Sparkes D, et al. The feasibility of using a low-cost near-infrared, sensitive, consumer-grade digital camera mounted on a commercial UAV to assess Bambara groundnut yield. Int J Remote Sens. 2022;43(2):393–423.
  39. OpenDroneMap Authors. OpenDroneMap: A command line toolkit to generate maps, point clouds, 3D models and DEMs from drone, balloon or kite images. 2020. Available from: https://github.com/OpenDroneMap/ODM.
  40. QGIS Development Team. QGIS: a Free and Open Source Geographic Information System. 2002. Available from: https://www.qgis.org/.
  41. Bannari A, Morin D, Bonn F, Huete AR. A review of vegetation indices. Remote Sens Rev. 1995;13(1–2):95–120.
  42. Gitelson AA, Kaufman YJ, Stark R, Rundquist D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens Environ. 2002;80(1):76–87.
  43. Zhou M, Ma X, Wang K, Cheng T, Tian Y, Wang J, et al. Detection of phenology using an improved shape model on time-series vegetation index in wheat. Comput Electron Agric. 2020;173:105398.
  44. Jiang Z, Huete AR, Chen J, Chen Y, Li J, Yan G, et al. Analysis of NDVI and scaled difference vegetation index retrievals of vegetation fraction. Remote Sens Environ. 2006;101(3):366–78.
  45. Sinergise Laboratory. EO Browser. 2022. Available from: https://apps.sentinel-hub.com/eo-browser/?zoom=18&lat=-37.76428&lng=-58.51554&datasetId=S2L2A&fromTime=2020-12-06&toTime=2020-12-07.
  46. Guo W, Carroll ME, Singh A, Swetnam TL, Merchant N, Sarkar S, et al. UAS-Based Plant Phenotyping for Research and Breeding Applications. Plant Phenomics. 2021;2021:9840192. pmid:34195621
  47. Mesquita GP, Rodríguez-Teijeiro JD, Oliveira RR de, Mulero-Pázmány M. Steps to build a DIY low-cost fixed-wing drone for biodiversity conservation. PLOS ONE. 2021;16(8):e0255559. pmid:34388153
  48. Anderson K, Griffiths D, DeBell L, Hancock S, Duffy JP, Shutler JD, et al. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones. PLOS ONE. 2016;11(5):e0151564. pmid:27144310
  49. Parker A, Dosemagen S, Molloy J, Bowser A, Novak A. Open Hardware: An Opportunity to Build Better Science. The Wilson Center; 2021. Available from: https://diplomacy21-adelphi.wilsoncenter.org/sites/default/files/media/uploads/documents/STIP%20Open%20Hardware%20An%20Opportunity%20to%20Build%20Better%20Science_0.pdf.
  50. Mulla D, Khosla R. Historical Evolution and Recent Advances in Precision Farming. In: Lal R, Stewart BA, editors. Soil Specific Farming: Precision Agriculture. Boca Raton, FL: CRC Press; 2016.
  51. Svensgaard J, Jensen SM, Christensen S, Rasmussen J. The importance of spectral correction of UAV-based phenotyping with RGB cameras. Field Crops Res. 2021;269:108177.
  52. Holman FH, Riche AB, Castle M, Wooster MJ, Hawkesford MJ. Radiometric Calibration of 'Commercial off the Shelf' Cameras for UAV-Based High-Resolution Temporal Crop Phenotyping of Reflectance and NDVI. Remote Sens. 2019;11(14):1657.
  53. Ashapure A, Jung J, Chang A, Oh S, Maeda M, Landivar J. A Comparative Study of RGB and Multispectral Sensor-Based Cotton Canopy Cover Modelling Using Multi-Temporal UAS Data. Remote Sens. 2019;11(23):2757.
  54. Costa L, Nunes L, Ampatzidis Y. A new visible band index (vNDVI) for estimating NDVI values on RGB images utilizing genetic algorithms. Comput Electron Agric. 2020;172:105334.
  55. Davidson C, Jaganathan V, Sivakumar AN, Czarnecki JMP, Chowdhary G. NDVI/NDRE prediction from standard RGB aerial imagery using deep learning. Comput Electron Agric. 2022;203:107396.
  56. Herzig P, Borrmann P, Knauer U, Klück HC, Kilias D, Seiffert U, et al. Evaluation of RGB and Multispectral Unmanned Aerial Vehicle (UAV) Imagery for High-Throughput Phenotyping and Yield Prediction in Barley Breeding. Remote Sens. 2021;13(14):2670.
  57. Zahawi RA, Dandois JP, Holl KD, Nadwodny D, Reid JL, Ellis EC. Using lightweight unmanned aerial vehicles to monitor tropical forest recovery. Biol Conserv. 2015;186:287–95.
  58. Zhang F, Hassanzadeh A, Kikkert J, Pethybridge SJ, van Aardt J. Comparison of UAS-Based Structure-from-Motion and LiDAR for Structural Characterization of Short Broadacre Crops. Remote Sens. 2021;13(19):3975.
  59. Carver JC, Weber N, Ram K, Gesing S, Katz DS. A survey of the state of the practice for research software in the United States. PeerJ Comput Sci. 2022;8:e963. pmid:35634111
  60. Hocquet A, Wieber F. Epistemic issues in computational reproducibility: software as the elephant in the room. Eur J Philos Sci. 2021;11(2):38.
  61. Jiménez RC, Kuzak M, Alhamdoosh M, Barker M, Batut B, Borg M, et al. Four simple recommendations to encourage best practices in research software. F1000Research. 2017;6. pmid:28751965
  62. Anzt H, Bach F, Druskat S, Löffler F, Loewe A, Renard BY, et al. An environment for sustainable research software in Germany and beyond: current state, open challenges, and call for action. F1000Research. 2021;9:295.
  63. Konkol M, Kray C, Pfeiffer M. Computational reproducibility in geoscientific papers: Insights from a series of studies with geoscientists and a reproduction study. Int J Geogr Inf Sci. 2019;33(2):408–29.
  64. Li Z, Seering W. Does Open Source Hardware Have a Sustainable Business Model? An Analysis of Value Creation and Capture Mechanisms in Open Source Hardware Companies. Proc Des Soc Int Conf Eng Des. 2019;1(1):2239–48.
  65. Wyngaard J, Barbieri L, Thomer A, Adams J, Sullivan D, Crosby C, et al. Emergent Challenges for Science sUAS Data Management: Fairness through Community Engagement and Best Practices Development. Remote Sens. 2019;11(15):1797.
  66. Thomer A, Barbieri L, Wyngaard J, Swanz S. A Minimum Information Framework for capturing FAIR data with small Uncrewed Aircraft Systems. 2021. Available from: https://eartharxiv.org/repository/view/2593/.
  67. DroneDB Authors. DroneDB—Effortless Aerial Data Management and Sharing. 2020. Available from: https://dronedb.app/.