The world population is projected to exceed 9 billion people by 2050, and food demand could double as a result. In the face of this, there is a need to increase farming production; nonetheless, conventional agriculture applies agrochemical products uniformly over a whole crop without considering its specific variability in climate, topography, soil properties, humidity, weeds, pests and diseases. This type of management leads to the inefficient use of natural resources and agricultural inputs, which in turn generates low profitability and environmental degradation (Leiva, 2008).
It is therefore necessary to optimize agricultural production, which can be achieved by adopting new crop management techniques and practices that, based on the spatial and temporal variations within a field, make it possible to diagnose and apply the appropriate agrochemical doses in each of its sectors. This approach to agricultural production, known as precision agriculture, has been developed in different countries since the 1980s, with remarkable advances in recent years (Gebbers et al., 2010).
Precision agriculture relies on geographic information systems (GIS), remote sensors, digitized maps, databases, global positioning systems (GPS) and robotics, among other technologies. The latter includes ground and aerial robots capable of carrying out operations such as planting, crop management (pest, weed and disease control), phytosanitary product spraying, harvesting and remote sensing (Tom et al., 2018).
The literature reports various developments of unmanned aerial vehicles for precision agriculture. One of these applications is crop monitoring through high-resolution multispectral images, from which different vegetation indexes can be extracted to locate weeds, pests, diseases, nutrient deficiencies and water stress, among others (Bendig et al., 2014; López-Granados, 2011; Shi et al., 2016; Vega et al., 2015; Kumar et al., 2020). With this information it is possible to take localized control actions, for example applying agrochemicals at a specific site and in the adequate amount (Gonzalez et al., 2016; Huang et al., 2009), which leads to savings in agricultural inputs, reduced environmental impact and improved profitability. Such intelligent spraying systems may be complemented by computer vision and artificial intelligence techniques that ensure the effective recognition of weeds or diseases in the crop (Gao et al., 2019).
Unlike manual on-site monitoring, remote sensing allows non-invasive, fast and efficient crop monitoring. This is possible thanks to significant advances in unmanned aerial vehicles, different types of sensors, georeferencing systems and image processing algorithms
(Gogoi et al., 2018). Remote sensing using unmanned aerial vehicles, unlike satellite sensing, allows images of higher spatial and temporal resolution to be taken at a low cost, with less interference from atmospheric conditions.
A bibliographic review on the application of unmanned aerial vehicles in agriculture is presented below, with emphasis on remote sensing for crop monitoring, their main application. It starts by describing the methodology applied to compile and filter the articles, and then examines the most common remote sensing applications, such as pest and disease detection and phenotyping. The main image processing techniques and the corresponding sensors are also described, together with a summary of the crops monitored with UAVs and the types of aerial vehicles used.
To provide an overview of developments in precision agriculture and contribute to this field of food security, a search was carried out in a high-impact academic database using a search equation with relevant keywords. After applying the search equation, the articles, authors and years were analyzed to filter those with the greatest impact and to identify the authors who have published the most on precision agriculture. The resulting set of relevant articles and authors was then processed with a bibliometric analysis program, which made it possible to interpret the search results and to reduce or eliminate data not relevant to this research.
An initial search was carried out in the Scopus database with the following equation, built from the authors' expertise in the topic: (quadrotor OR multirotor OR drone OR uav OR *copter) AND (agriculture OR farming OR crop). In this way 2076 articles were found. To refine the results, the keywords were analyzed and articles whose keywords were not of interest for this research were discarded, leaving 1519 articles. Some of the keywords retained were "remote sensing", "monitoring" and "mapping".
Given the number of articles, only journal articles were selected, excluding conference papers, book chapters and general reviews, among others. In this way, 692 of the 1519 articles were retained, a reduction of about 55%. The next filtering criterion was the number of citations of each paper: those with fewer than 50 citations were rejected, so that only the works with the greatest impact in the academic field were reviewed. This left the 51 articles on which this review is based.
Crop monitoring
Crop monitoring by means of remote sensing has different objectives. One of them, within precision agriculture, is to detect and manage weeds, pests, diseases, nutrient deficiencies and water stress, among others. Another objective is crop phenotyping, where the seeds with the best phenotypic characteristics, such as height and biomass production, are selected. The literature shows that the crops most frequently monitored through remote sensing are cereals such as wheat, corn and barley, as shown in Fig 1. The corresponding references are shown in Table 1.
Crop monitoring for precision agriculture
Precision agriculture is a set of techniques through which, by monitoring the temporal and spatial variation in a crop, a timely treatment can be applied at a specific site. It comprises the following stages: data collection, variability mapping, decision making and application of the management practices. Remote sensing is important in the first three stages and is complemented by other technologies, such as geographic information systems (GIS) and machinery for the appropriate treatment (Vega et al., 2015). Applications of this type include the management of weeds, pests, diseases, nutrients and hydration, among others.
Weed management
Weed management under a precision agriculture approach first requires detecting the areas where weeds are localized, and then taking a control action through spraying or mechanical removal. This can be done in real time by combining geo-referenced image capture, a weed recognition system and a weed management system (López-Granados, 2011). These practices increase profitability and reduce environmental impact (López-Granados, 2011) since, unlike conventional treatments in which herbicides are distributed uniformly (Gómez-Candón et al., 2014), in precision agriculture agrochemical spraying is done at a specific site, optimizing the amount of agrochemical required.
Weed management is a priority in early-stage crops. Recognizing weeds at this stage requires high spatial resolution (less than 5 cm per pixel), since the reflectance of the crop and that of the weeds are quite similar at this phase (Gómez-Candón et al., 2014). This, in turn, requires a low-altitude survey (less than 100 m) so that individual plants can be discriminated at high resolution (Torres-Sánchez et al., 2013; Gómez-Candón et al., 2014; Pérez-Ortiz et al., 2016). This poses a challenge since the lower the altitude, the greater the number of images required and the greater the UAV autonomy needed (Gómez-Candón et al., 2014). Computer vision and machine learning techniques may be used for this task (Pérez-Ortiz et al., 2016).
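As an illustration of this altitude-resolution trade-off, the ground sample distance (the ground footprint of one pixel) can be estimated from the flight height and the camera geometry. The function and camera parameters below are illustrative assumptions, not values taken from the cited studies.

```python
def ground_sample_distance_cm(height_m: float, focal_length_mm: float, pixel_size_um: float) -> float:
    """Ground sample distance (cm per pixel) for a nadir-looking camera.

    GSD = flight height * physical pixel size / focal length.
    """
    gsd_m = height_m * (pixel_size_um * 1e-6) / (focal_length_mm * 1e-3)
    return gsd_m * 100.0

# Hypothetical example: a camera with a 5 mm lens and 3.75 um pixels flown at 60 m
# gives roughly 4.5 cm per pixel, just under the 5 cm threshold mentioned above.
print(ground_sample_distance_cm(60, 5, 3.75))  # ~4.5
```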
In some cases, when it is only necessary to differentiate vegetation from soil, it is sufficient to compute the excess green index (ExG), which is obtained from RGB images (Pérez-Ortiz et al., 2016). In other cases it may be necessary to distinguish the different plant species (weeds and crop) and to identify, for example, the dominant plants and their size (Shi et al., 2016). For this, a multispectral sensor may be useful, or even a hyperspectral one capable of revealing very small reflectance variations (López-Granados, 2011; Torres-Sánchez et al., 2013).
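As a minimal, illustrative sketch (not code from the cited works), the ExG index can be computed per pixel from chromatic-normalized RGB bands, after which a simple threshold separates vegetation from soil:

```python
import numpy as np

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """ExG = 2g - r - b, where r, g, b are the RGB bands normalized so that
    r + g + b = 1 for every pixel. Input: array of shape (H, W, 3)."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2, keepdims=True)
    total[total == 0] = 1.0          # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, 2, 0)
    return 2.0 * g - r - b
```

Pixels with ExG above a chosen threshold are labeled as vegetation; distinguishing crop from weeds then requires additional spatial information, for example the position of each plant relative to the crop rows.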
Management of pests, diseases and crop condition
The localized and timely detection of crop pests and diseases makes it possible to take corrective and preventive decisions and to apply them effectively. For example, a commonly monitored parameter is biomass content, as it is strongly related to crop production. Early detection of problems such as nutrient deficiency allows timely measures to be taken, without having to wait until harvest to detect low-production zones (Vega et al., 2015). Conversely, detecting zones with sufficient nutrient content (nitrogen, for example) makes it possible to avoid over-fertilization, saving agricultural inputs (Bendig et al., 2014). Disease detection may be carried out through vegetation indexes, since some diseases produce changes in reflectance (Garcia-Ruiz et al., 2013).
Once detected, pests, diseases and nutrient deficits can be addressed through agrochemical spraying, including pesticides, fungicides, fertilizers, etc. Using a UAV for spraying is ideal for small plots or areas with difficult access (Huang et al., 2009). In Faiçal et al. (2014), for example, a UAV spraying technique is combined with a ground-based sensor network. As a control strategy, the route of the agrochemical spraying vehicle is adapted according to the wind speed and intensity, using wireless sensors located on the ground that feed back data on the amount of agrochemicals they are receiving.
Crop condition is evaluated not only through pest and disease detection but also by estimating other parameters such as plant density (Jin et al., 2017), biomass content (Vega et al., 2015), crop height (Geipel et al., 2014), canopy area, nitrogen content (Bendig et al., 2014), soil clay content (Shi et al., 2016) and crop water stress (Berni et al., 2009), among others.
Crop monitoring for phenotyping
Remote sensing has shown great potential in crop phenotyping, where a species' response to different environmental conditions is evaluated. Crop phenotyping requires measuring and evaluating observable physical characteristics (Holman et al., 2016) across a species' different phenotypes, different generations and different growth stages (Tattaris et al., 2016a). Traditional phenotyping based on direct field observations demands considerable time and cost (Haghighattalab et al., 2016). Consequently, non-invasive phenotyping by means of unmanned aerial vehicles is now more common, owing to the possibility of taking images of high temporal and spatial resolution that capture the different patterns that make up the crop phenotype (Shi et al., 2016).
Thanks to phenotyping it is possible to select the plant or seed varieties that adapt best, those of highest production and, in general, those with a favorable genotype and phenotype. For example, it can be evaluated how a crop responds to different fertilizers (Holman et al., 2016) or how it performs against pests and diseases (Chapman et al., 2014).
The most common phenotypic traits considered are:
- Canopy height (m), biomass amount (kg/m²) and crop crown volume (m³). These are indicators of good growth, crop vitality, efficient use of light, yield at harvest, carbon reserves and nutrient availability (Bendig et al., 2014; Holman et al., 2016; Li et al., 2016; Shi et al., 2016; Torres-Sánchez et al., 2015).
- Leaf area index (LAI) and canopy cover are good indicators of how much vegetation there is per unit of surface (Liebisch et al., 2015; Shi et al., 2016). They are related to the plant's photosynthetic capacity, respiration, evapotranspiration and efficiency in the use of light (Córcoles et al., 2013; Verger et al., 2014). The leaf area index reflects the crop's growth potential and allows biomass production to be estimated (Córcoles et al., 2013; Hunt et al., 2010; Lelong et al., 2008; Liebisch et al., 2015; Shi et al., 2016; Verger et al., 2014).
Crop phenotyping from remote sensing can be carried out in two complementary ways: through vegetation indexes and through photogrammetry. Each approach allows different phenotypic traits to be determined, as described in the following sections.
Vegetation indexes
Vegetation indexes are quantitative indicators calculated from crop images. These indexes allow the condition of a crop or soil to be monitored, providing information on growth, biomass content, crop health, crop condition, weed presence, etc.
Some indexes, obtained from the red and infrared bands, are related to biomass content (Aasen et al., 2015), canopy structure and the leaf area index (LAI) (Aasen et al., 2015; Lelong et al., 2008; Tattaris et al., 2016). Other indexes depend only on visible bands and are related to pigment concentration in the leaves and nitrogen content (Lelong et al., 2008). Other indicators, such as canopy temperature (Berni et al., 2009), which is calculated from thermal infrared radiation, allow a plant's transpiration rate to be evaluated indirectly; a low canopy temperature indicates a well-transpiring plant under low water stress (Tattaris et al., 2016b).
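A common indicator of this kind, used here only as an illustration (the cited studies may use different formulations), is the crop water stress index (CWSI), which normalizes the canopy temperature between a fully transpiring (wet) baseline and a non-transpiring (dry) baseline:

```python
def cwsi(t_canopy: float, t_wet: float, t_dry: float) -> float:
    """Crop Water Stress Index: about 0 for a cool, fully transpiring canopy
    and about 1 for a hot, non-transpiring one.

    t_wet and t_dry are the lower and upper baseline canopy temperatures for
    the current air temperature and vapour pressure deficit (same units)."""
    return (t_canopy - t_wet) / (t_dry - t_wet)
```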
Henrich et al. (2009) have compiled extensive data on different sensors, vegetation indexes and remote sensing applications, currently available on a free-access website. Fig 2 shows the most common vegetation indexes used in remote sensing. Table 2 summarizes some vegetation indexes, some of their applications found in the literature and the corresponding formulas to estimate them from different spectral bands.
Capturing images from which to extract vegetation indexes is a considerable challenge, as it demands stable lighting conditions. Since these indexes depend on reflectance, the time of day when the images are captured may influence them (Vega et al., 2015), as may the angle at which the images are taken (Rasmussen et al., 2016). It must also be taken into account that plant reflectance varies with growth stage (López-Granados, 2011). Finally, to guarantee good spatial resolution, images should be taken at a height of 30 to 100 m (Rasmussen et al., 2016).
Photogrammetry
Photogrammetry is a technique for the three-dimensional reconstruction of an object from multiple images taken of it. It allows digital models of a 3D surface to be generated, from which distances, areas and volumes can be measured with high accuracy. UAVs have great potential here, since they can take high-resolution aerial images at low altitude.
The photogrammetric process comprises the following phases: flight planning, location and measurement of control points, flight execution to capture the images and processing of the georeferenced images (Córcoles et al., 2013). For georeferencing, several ground control points (GCPs) are commonly used to ensure accuracy in the mosaicking process (Gómez-Candón et al., 2014).
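As an illustrative way of quantifying that accuracy (not drawn from the cited works), the residuals between the GCP coordinates surveyed in the field and the coordinates of the same points in the final mosaic can be summarized as a root-mean-square error:

```python
import numpy as np

def georeferencing_rmse(mosaic_xy: np.ndarray, surveyed_xy: np.ndarray) -> float:
    """Planimetric RMSE (in map units, e.g. metres) between the GCP positions
    measured in the orthomosaic (shape (N, 2)) and their surveyed references."""
    diff = np.asarray(mosaic_xy, dtype=float) - np.asarray(surveyed_xy, dtype=float)
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))
```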
The three-dimensional reconstruction of terrain and crops (crop surface models, CSM) makes it possible to determine crop height, canopy volume and crop area (Bendig et al., 2014; Díaz-Varela et al., 2015; Li et al., 2016; Torres-Sánchez et al., 2015), parameters that cannot be obtained from vegetation indexes (Geipel et al., 2014). It is also possible to determine the crop growth rate if the process is repeated at different growth stages (Bendig et al., 2014; Holman et al., 2016).
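As a minimal sketch of how such parameters are typically derived (an assumption about the general workflow, not the exact procedure of the cited studies), crop height can be obtained per pixel by differencing the crop surface model and a co-registered bare-ground terrain model:

```python
import numpy as np

def crop_height_model(csm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Per-pixel crop height (same units as the inputs) from a crop surface model
    (terrain + canopy) and a co-registered digital terrain model (bare ground).
    Negative differences caused by reconstruction noise are clipped to zero."""
    return np.clip(csm.astype(np.float64) - dtm.astype(np.float64), 0.0, None)

# Repeating the survey at several growth stages and differencing the resulting
# height models yields growth-rate maps, as mentioned above.
```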
For the three-dimensional reconstruction, a computer vision technique known as structure from motion is normally used, which provides high topographic resolution from the different image sequences (Bendig et al., 2014; Geipel et al., 2014; Holman et al., 2016; Lucieer et al., 2014; Zahawi et al., 2015). Generating 3D hyperspectral maps makes it possible to obtain per-pixel hyperspectral data from a camera that has previously been characterized and calibrated using radiometric methods (Aasen et al., 2015). Some software tools used in the literature to carry out this process are Agisoft PhotoScan Professional (Bendig et al., 2014; Geipel et al., 2014; Haghighattalab et al., 2016), Smart3DCapture (Li et al., 2016) and Leica Photogrammetry Suite (Gómez-Candón et al., 2014).
For image georeferencing, UAVs use GPS. Nonetheless, since the GPS receiver provides the UAV position with a wide error margin (of several meters), ground control points (GCPs) are normally used to bring the error below 10 cm (Li et al., 2016; Lucieer et al., 2014; Vega et al., 2015).
Sensors for remote sensing
According to the reviewed articles, crop monitoring can be done through remote sensing, based mainly on RGB, multispectral and hyperspectral sensors, as shown in Fig 3. The corresponding references are shown in Table 3.
RGB sensors provide three bands (red, green and blue) of the visible spectrum, as shown in Fig 4, with wavelengths ranging from 380 nm to 750 nm. These bands allow the calculation of some of the vegetation indexes shown in Table 2, which are mainly intended to discriminate vegetation from soil and estimate biomass amount.
Multispectral sensors are the most widely used in remote sensing with UAVs, as they are more accessible and offer a good number of bands beyond RGB (Verger et al., 2014). These sensors capture between 3 and 12 bands, each around 100 nm wide (Adão et al., 2017; López-Granados, 2011), which allow different vegetation indexes to be calculated, as shown in Table 2. One of the most common is NDVI, which requires the near-infrared band and a visible band and detects vegetation more effectively than the ExG index, which is based only on the RGB bands.
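As a simple illustration (assumed band arithmetic, not code from the cited works), NDVI is computed from co-registered near-infrared and red reflectance bands:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 indicate dense, healthy vegetation; values near 0 bare soil;
    negative values typically correspond to water, clouds or shadow."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)
```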
Hyperspectral sensors, on the other hand, detect up to hundreds of narrow bands, each less than 10 nm wide (López-Granados, 2011; Lucieer et al., 2014). This allows them to analyze variables that manifest in a very narrow band, such as chlorophyll content in the 670-780 nm range, and to obtain high-resolution thermal images (Berni, Zarco-Tejada, Sepulcre-Cantó, et al., 2009). Overall, hyperspectral sensors can discriminate components that would be merged within a single multispectral band. A broad review of hyperspectral sensors is given in Adão et al. (2017).
Both multispectral and hyperspectral sensors may be useful for the detection of weeds, diseases and pests, since these produce small spectral variations that require high spectral resolution to detect.
Aerial vehicles for remote sensing
Remote sensing has traditionally relied on satellite images or images taken from manned aircraft. These present limitations in temporal and spatial resolution and depend on weather conditions such as cloudiness (Torres-Sánchez et al., 2013). In recent years UAVs have shown great potential to carry out this task at lower cost and with better performance, since their maneuverability and ability to fly at low altitude allow them to take images of high spatial resolution. Fig 5 summarizes the types of UAV used in different remote sensing studies. The corresponding references are shown in Table 4.
Multirotors are the most common type of UAV used for remote sensing, since they can take multispectral images at low altitude with high spatial resolution and offer low operating costs and high maneuverability. However, they have lower flight autonomy, lower speed and lower payload capacity than fixed-wing UAVs. The latter are better suited to large areas and flat terrain, whereas multirotors are more suitable for small areas regardless of the terrain topography. Table 5 summarizes the overall characteristics of the different types of UAVs that may be considered for remote sensing applications.