
UAV for automated crop count


Paul Brown and Lee Butler of Fera describe the development of methods to automatically count potato plants during early-stage crop emergence from remotely sensed imagery obtained by unmanned aerial vehicles.

Introduction

Fera’s unmanned aerial vehicle (UAV) is deployed in agriculture, forestry, quarrying, open-cast mining and construction environments. The UAV can be combined with sensor technology and used as a tool for remote sensing, allowing GIS scientists to capture bespoke aerial imagery within and beyond the visible spectrum. The power of this data is in the analysis: by using specialist image processing and interpretation software, scientists can extract valuable information from the imagery. Applications include:

• Counting individual plants within a field during initial emergence, for harvest yield estimates

• Locating tree species in a pest or disease outbreak scenario

• Crop stress mapping – agronomy survey targeting.

The UAV surveillance and inspection technology allows better-informed decisions to be made about future planning and helps to drive down planning and development costs.

Counting crops to high levels of accuracy in the field is extremely time consuming. Even manually counting from imagery takes a very long time and does not provide a reliable result. To address this problem, an algorithm has been developed to automatically count crops using specialist image analysis software. The resulting crop count is provided as an ESRI shapefile that can be imported into any GIS package or viewer and can be viewed over the imagery to add context. The file contains a single point per plant, attributed with the individual plant’s Normalised Difference Vegetation Index (NDVI) as an indicator of crop health. NDVI is a numerical indicator that uses the red visible and near-infrared bands of the electromagnetic spectrum to assess whether the target being observed contains live green vegetation or not. NDVI ranges from -1.0 to 1.0, where increasingly positive values indicate increasing greenness and negative values indicate non-vegetated features [1].
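For illustration, NDVI is computed per pixel as (NIR - Red) / (NIR + Red). The sketch below shows this in Python with NumPy; the function and array names are assumptions for illustration, not part of the study’s actual processing chain.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Compute NDVI = (NIR - Red) / (NIR + Red) per pixel,
    guarding against division by zero over dark pixels."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    result = np.zeros_like(denom)
    np.divide(nir - red, denom, out=result, where=denom != 0)
    return result
```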

Case study: potato crop yields

A study was carried out with a large potato grower to develop an automatic crop-counting algorithm to map potato plants during early-stage crop emergence. The aim was to produce a data product that locates each individual potato plant in a field, so that the user can interrogate whole fields or parts of fields to obtain accurate crop counts. Objectives included:

1. Flying UAV imagery twice during early potato plant emergence, on two dates, to assess when crop counting is most successful.

2. Processing the imagery to produce 4-band multispectral imagery – B, G, R, NIR.

3. Developing a counting algorithm.

4. Assessing the suitability of UAV imagery to identify and count potato plants.

5. Producing an intuitive and easy-to-use web app.

Study site and UAV flights

Two fields were selected where project timings and crop emergence coincided. Two UAV flights were successfully conducted over the fields during initial crop emergence, on 3rd and 9th June 2016. The sensors used for image acquisition were a twin system of standard Lumix LX7 digital cameras, with one modified to detect near-infrared spectral data. These sensors are good for obtaining high spatial resolution data over larger areas for visual assessment and broadband vegetation indices [2,3]. The UAV images were then processed using the software Pix4D and ERDAS IMAGINE to create 4-band multispectral images of the two fields at the two time periods (Figure 1).
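The photogrammetric mosaicking itself was done in Pix4D and ERDAS IMAGINE, but the final band-stacking step can be illustrated with rasterio: co-registered RGB and NIR orthomosaics of the same size are combined into a single 4-band (B, G, R, NIR) GeoTIFF. The file names and source band order below are assumptions.

```python
import rasterio

# Assumed file names for co-registered, equally sized orthomosaics.
with rasterio.open("rgb_ortho.tif") as rgb, rasterio.open("nir_ortho.tif") as nir:
    profile = rgb.profile
    profile.update(count=4)  # output carries four bands: B, G, R, NIR
    with rasterio.open("multispectral_bgrn.tif", "w", **profile) as dst:
        dst.write(rgb.read(3), 1)  # blue  (assuming source band order R, G, B)
        dst.write(rgb.read(2), 2)  # green
        dst.write(rgb.read(1), 3)  # red
        dst.write(nir.read(1), 4)  # near-infrared from the modified camera
```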

Figure 1: Natural colour composites of survey flight 1 (3rd June 2016) and flight 2 (9th June 2016)

Image analysis and rule set algorithm development

With the growing use of UAVs, very high resolution remotely sensed images are becoming more and more widely used in many industries, including agriculture.

The analysis and interpretation of these images is fundamental to unlocking their potential in precision agriculture. Thus, choosing appropriate software and methodology [4] is the most important part of UAV agricultural remote sensing.

eCognition is software based on object-oriented image analysis that extracts both spatial and spectral information, where the smallest unit is not a pixel but an object [4]. An object is a grouped set of pixels with similar spectral response. This methodology works very well for identifying individual plants, as in the imagery they appear as small areas of similar spectral response (green plants) on a background of a clearly different spectral response (brown soil). The rule set (rule logic to evaluate data) has been successful at automatically locating, and therefore counting, individual potato crop plants during early stage crop emergence in the two fields shown in Figure 1. The rule set follows a specific workflow to achieve an individual point per potato plant. Figure 2 shows the basic workflow used for an eCognition project and how to program the software to identify individual potato plants.
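eCognition itself is proprietary, but the core idea of grouping pixels into spectrally similar objects can be illustrated with an open-source analogue. The sketch below uses scikit-image’s SLIC superpixels on the stacked orthomosaic; the input file and parameter values are assumptions, and SLIC is a stand-in, not eCognition’s actual segmentation algorithm.

```python
import numpy as np
import rasterio
from skimage.segmentation import slic

# Assumed input: the 4-band (B, G, R, NIR) orthomosaic from the previous step.
with rasterio.open("multispectral_bgrn.tif") as src:
    image = np.transpose(src.read(), (1, 2, 0)).astype(np.float64)  # H x W x 4

# Group pixels into objects of similar spectral response (superpixels),
# the same basic idea behind eCognition's segmentation stage.
segments = slic(image, n_segments=50000, compactness=10.0, channel_axis=-1)
print(f"{np.unique(segments).size} objects created")
```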


Figure 2: The workflow for rule set algorithm development

Workflow stage 1: Zoom of the area within the fields.

Workflow stage 2: The image is segmented into many small objects (polygons) of similar spectral response; this means pixels of similar colour are grouped together.

Workflow stage 3: The objects identified as bare earth are removed, leaving only objects identified as vegetation (in this case potato plants). There are still many objects per plant.

Workflow stage 4: Each object is then interrogated to identify the objects that most resemble an individual potato plant. To achieve this, the objects of brightest spectral response are used, as well as matching to an identified template for a potato plant.

Workflow stage 5: A point is then added to each individual potato plant and the objects are removed; in this example there are 5,180 potato plants.
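The stages above can be paraphrased as a very simplified, pixel-based sketch: threshold NDVI to remove bare earth, label the remaining vegetation into objects, discard fragments, and place one point per surviving object. The threshold values and the connected-components shortcut are assumptions; the production rule set in eCognition is considerably more sophisticated, using brightness and template matching as described in stage 4.

```python
import numpy as np
from scipy import ndimage as ndi

def count_plants(red, nir, ndvi_thresh=0.3, min_pixels=20):
    """Simplified analogue of workflow stages 2-5: mask vegetation by NDVI,
    label connected objects, drop tiny fragments, and return one centroid
    per remaining object (valid only while plants are not merging).
    Returned points are pixel coordinates, not georeferenced."""
    nirf, redf = nir.astype(float), red.astype(float)
    ndvi = (nirf - redf) / np.maximum(nirf + redf, 1e-9)
    veg = ndvi > ndvi_thresh                        # stage 3: remove bare earth
    labels, n = ndi.label(veg)                      # stages 2/4: pixels -> objects
    sizes = ndi.sum(veg, labels, index=np.arange(1, n + 1))
    keep = np.arange(1, n + 1)[sizes >= min_pixels]
    points = ndi.center_of_mass(veg, labels, keep)  # stage 5: one point per plant
    return points, len(points)
```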

Performance

Figure 3 (a zoomed-in area and the resulting crop identification) shows that the algorithm is very successful at identifying individual potato plants at different stages of emergence. Even though some of the plants were just breaking the surface whilst others were beginning to merge together, the algorithm is still able to identify individual plants in both cases. The algorithm has been developed to identify even small amounts of vegetative material above the surface.

Potential users would not necessarily have access to GIS software, therefore a web application was developed as a method to disseminate the data and results [5]. The web application has a number of tools enabling the user to interrogate the data:

• adding and removing layers

• measuring distance and area

• swiping between data layers

• printing maps

• a crop counting tool, where the user draws a polygon around an area of interest and a count of the number of plants is returned.
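Under the hood, such a count amounts to a point-in-polygon query. A minimal sketch with GeoPandas and Shapely follows; the layer name and coordinates are placeholders, and the real tool runs inside the web application rather than as a script.

```python
import geopandas as gpd
from shapely.geometry import Polygon

plants = gpd.read_file("potato_plants.shp")  # one point per detected plant

# A user-drawn area of interest (placeholder coordinates, same CRS as the layer).
aoi = Polygon([(453200, 461850), (453200, 461950),
               (453300, 461950), (453300, 461850)])

# Count the plant points falling inside the polygon.
count = int(plants.within(aoi).sum())
print(f"{count} plants in the selected area")
```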


Figure 3: Zoomed-in example from the first flight on 3rd June. The red crosses are where the algorithm is identifying individual crops.

Figure 4 shows the web application and areas of less efficient planting. There are in-row gaps between some individuals, indicating that the in-row planting density is not uniform. It would be interesting to investigate this further to understand and identify an optimal planting density. It is also apparent that the planter is not planting as efficiently at the start of rows as at the end. The red polygons in Figure 4 clearly show wider in-row gaps between individual plants at the start of planting rows. Once the merging of the crop becomes more substantial, the algorithm struggles to identify individuals (Figure 5). However, Figure 3 and the less merged areas of Figure 5 show that during the initial stages of crop emergence, individual potato plants are automatically identifiable from UAV-acquired imagery. Merging of the crop does not affect performance when it occurs within rows; it is when the crop begins to merge across rows that the algorithm has difficulty identifying individual plants (Figure 5).

Figure 4: Zoomed-in example from the second flight on 9th June. The blue points are where the algorithm is identifying individual crops. The red polygons show areas where the precision planter is not planting as efficiently at the start of rows as at the end of rows.
Figure 5: Zoomed-in example from the second flight on 9th June, where some areas were planted earlier and the potato plants have merged quite significantly across rows. The red crosses are where the algorithm is identifying individual crops. In areas that are merging in-row, individuals are still identifiable, but where the crop begins to merge across rows the algorithm does not perform successfully.

Conclusions

This initial study has shown that high resolution UAV imagery acquired during the initial two weeks after crop emergence was successful in identifying individual potato plants in the field, allowing future potato yield to be predicted more accurately and planting efficiency to be assessed. The algorithm still performs very well as the crop begins to merge within rows; however, once the crop merges across rows it becomes difficult to distinguish between individual plants.

The data can be exported to an ESRI shapefile and used in any GIS software package for further analysis. Fera has also developed a web application to disseminate the collected UAV imagery and the derived potato crop count. The web application allows the user to interact with the data more effectively than would be possible with paper or PDF maps. It also includes a number of tools that allow users to interrogate the data. These include the Swipe tool, allowing users to compare different sets of imagery, and the Crop Count tool, which allows users to select an area on the map and retrieve the number of potato plants within that area.
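As a sketch of the export step, detected plant points with their per-plant NDVI attribute could be written to a shapefile with GeoPandas. The coordinates, NDVI values and coordinate reference system below are illustrative assumptions.

```python
import geopandas as gpd
from shapely.geometry import Point

# Assumed detections: (easting, northing, mean NDVI) per plant.
detections = [(453210.5, 461870.2, 0.62), (453211.4, 461870.9, 0.58)]

gdf = gpd.GeoDataFrame(
    {"ndvi": [d[2] for d in detections]},
    geometry=[Point(d[0], d[1]) for d in detections],
    crs="EPSG:27700",  # assuming British National Grid for a Yorkshire field
)
gdf.to_file("potato_plants.shp")  # openable in any GIS package or viewer
```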

Future work

This was an initial study into how successfully a crop-locating and counting algorithm could perform on UAV acquired imagery. The study has raised some interesting questions on planting efficiency. It would be valuable to have input from a machinery manufacturer and to investigate how different planters perform.

Further investigation of planting density would be useful in the light of the non-uniform gaps within rows and also the changes in planting efficiency at the beginning and end of rows. Analysis of the yield obtained from different areas within the field where planting density differs could provide valuable information to reinforce the results obtained from UAV imagery analysis.

Paul Brown, GI Remote Sensing Scientist, Lee Butler, GIS Spatial Analyst, Fera Science Ltd, National Agri-Food Innovation Campus, Sand Hutton, York, YO41 1LZ

Email: paul.brown@fera.co.uk, lee.butler@fera.co.uk

Tel: +44 (0)300 100 0323 Web: www.fera.co.uk

References

1. Candiago, S., Remondino, F., De Giglio, M., Dubbini, M. and Gattelli, M. (2015) Evaluating multispectral images and vegetation indices for precision farming applications from UAV images. Remote Sensing. 7, 4026-4047.

2. von Bueren, S. K., Burkart, A., Hueni, A., Rascher, U., Tuohy, M. P. and Yule, I. J. (2015) Deploying four optical UAV-based sensors over grassland: challenges and limitations. Biogeosciences. 12, 163-175.

3. Nebiker, S., Lack, N., Abächerli, M. and Läderach, S. (2016) Light weight multispectral UAV sensors and their capabilities for predicting grain yield and detecting plant diseases. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Volume XLI-B1, 2016. Paper presented to XXIII ISPRS Congress, 12-19 July 2016, Prague, Czech Republic.

4. Liu, X., Xu, J., Zhao, J., Yong, L. and Xin, Z. (2015) The comparison of segmentation results for high-resolution remote sensing image between eCognition and EDISON. Applied Mechanics and Materials. 713-715, 373-376.

5. https://uav.fera.co.uk/PotatoCount

 
