Abstract
Grafting robots have been developed around the world, but some auxiliary tasks, such as gap inspection of grafted seedlings, still have to be done by humans. A machine-vision system for gap inspection of grafted muskmelon seedlings was developed in this study. The image acquisition system consists of a CCD camera, a lens and a white front lighting source. The image of the inspected gap was processed and analyzed with HALCON 12.0 software. The recognition algorithm of the system is based on the principle of deformable template matching. First, a template is created from an image of a qualified grafted-seedling gap. Then the gap image of a grafted seedling is compared with the created template to determine their matching degree, a score between 0 and 1 based on the similarity between the gap image and the template: the less similar the grafted-seedling gap is to the template, the smaller the matching degree. Finally, the gap is output as qualified or unqualified: if the matching degree between the grafted-seedling gap and the template is less than 0.58, or no match is found, the gap is judged as unqualified; otherwise it is judged as qualified. To test the gap inspection system, 100 muskmelon seedlings were grafted and inspected. Results showed that the machine-vision system agreed with human visual inspection for 98% of the seedlings, and its inspection speed can reach 15 seedlings·min-1. With this machine-vision system the gap inspection process in grafting can be fully automated, and the system will be a key component of fully automatic grafting robots.
Citation: Liu S, Xing Z, Wang Z, Tian S, Jahun FR (2017) Development of machine-vision system for gap inspection of muskmelon grafted seedlings. PLoS ONE 12(12): e0189732. https://doi.org/10.1371/journal.pone.0189732
Editor: Yuan Huang, Huazhong Agriculture University, CHINA
Received: June 17, 2017; Accepted: November 30, 2017; Published: December 21, 2017
Copyright: © 2017 Liu et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All relevant data are within the paper and its Supporting Information files.
Funding: This work was supported by the National Natural Science Foundation of China (31101076), whose URL is "http://www.nsfc.gov.cn/", and the Tianzhu Mountain Scholar program of Shenyang Agricultural University (2014), whose URL is "http://www.syau.edu.cn/". The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
Introduction
Grafting is a technique that joins a branch or bud from one plant to an appropriate part of another plant with which it has a strong affinity. The branch or bud to be cultivated, called the scion, is combined with a strong plant called the rootstock, so that the scion can grow, blossom and bear fruit. Grafting can overcome continuous-cropping obstacles and increase yield and fruit quality [1]. Grafting robot technology has recently been developed: in many countries, semi-automatic grafting robots for cucumber, tomato and watermelon have been developed and commercialized [2]. The working efficiency of semi-automatic grafting robots is three times that of traditional manual grafting [3]. So far, most automatic grafting machines can reach a speed of 600 seedlings/h [4–6]. However, since these machines still require manual work, they remain only semi-automatic [7]. The quality of the grafted gap must be inspected so that the transplanting survival rate can be improved, yet present semi-automatic grafting robots still rely on manual gap-quality inspection, so the inspection method needs to be improved.
Machine vision has been applied in many fields, such as breeding, fishery and agriculture. In recent years, pattern recognition with machine vision has been applied in various areas, for example in inspecting the quality of grafted seedlings [8] and in acquiring seedling parameters such as cotyledon-node information, stem diameter and stem bending angle [9]. Image-processing algorithms are the most important part of machine vision [10–14]. The main algorithms include color-space conversion, gray-scale transformation and image binarization [15–18]. Although some algorithms can distinguish the characteristics of the inspected object in detail, their calculation processes are too complex and hence not suitable for inspecting the gap of grafted seedlings such as muskmelon. Furthermore, because of the variety of seedlings, some very simple algorithms often give inaccurate results. For these reasons, it is important to design a simple and practical gap inspection machine-vision system for a muskmelon grafting machine.
The aim of this study was to develop an automatic gap inspection machine-vision system for muskmelon grafted seedlings. The objectives were: to select appropriate hardware (camera, lens and lighting source) for image acquisition, to develop the algorithm of the gap inspection machine-vision system, and to conduct a gap-quality inspection test on muskmelon grafted seedlings.
Materials and methods
Image acquisition
The seedling grafting was carried out using an automated grafting machine developed by our research team [19]. The main structure of the grafting machine is shown in Fig 1. The machine's grafting process is as follows: first, the rootstock (pumpkin) and scion (muskmelon) were held by two respective mechanical arms and sent to the cutting device, where they were cut at the same appropriate angle. The prepared rootstock and scion were then aligned, put together and clamped with a clip. Finally, the grafted seedling was ready for the image acquisition process.
Camera selection
The hardware of the image acquisition system, comprising the camera, lens and light source, was selected component by component. The selected camera is the MER-125-30UC model, which has high resolution and is suitable for acquiring the shapes of very tiny objects such as muskmelon seedlings. Therefore, complete and reliable images of the grafted seedlings can easily be acquired at a short distance.
The colored camera is equipped with 1/3” Sony ICX445 CCD sensor chip. The sensor chip size is 4.8mm×3.6mm with resolution of 1292×964 pixels. Specific performance parameters of the camera are shown in Table 1.
The camera can also connect directly with some image processing software such as HALCON to set appropriate exposure and other parameters to acquire and process images.
Lens selection
In selecting an appropriate lens for grafted-seedling gap inspection, the characteristics of the grafting clip and the clamped stem portion were considered: the lens must render these main characteristics very clearly. Two lenses with focal lengths (f) of 16 mm and 25 mm were tested. With each lens, images of the seedling stem were acquired at the same object distance to obtain the best image, as shown in Fig 2. The images acquired with the 25 mm lens were clearer and more conspicuous, and contained less background information, making them more suitable for inspecting the grafted-seedling gap. Consequently, the lens with focal length f = 25 mm was selected for image acquisition.
Light source selection
The light source plays an important role in a machine-vision system. Images acquired under natural light usually contain too much noise because the illumination is not controlled. White or green light can enhance the contrast between the grafting gap and the background, whereas other colors cannot make the gap clearer. Therefore, two lighting schemes, a white LED ring front light and a green LED ring front light, were selected and tested. The lighting sources, their respective acquired images and the templates created from them are shown in Fig 3: (a1) and (b1) are the white and green lighting sources, (a2) and (b2) are the images acquired with them, and (a3) and (b3) are the templates created from those images. For the same seedling, a much clearer template was created with the white front light, and the intensity of natural light did not affect the acquired images, whereas an incomplete and unclear template was obtained with the green front light. Therefore, the white front light was selected for image acquisition.
Grafted seedlings are generally not vertical, which makes image acquisition difficult. The height of the seedling pot is 85 mm and the grafted gap lies 20 mm to 30 mm above the pot rim, so the camera height was set to 110 mm. The working distance of the selected 25 mm lens is over 50 mm. To acquire a complete image of the grafted gap, the center of the seedling pot and the camera were set on a horizontal line, and the distance between them was set to 175 mm, which testing showed to be optimal. Finally, the hardware of the machine-vision system was set up; its layout and structure are shown in Figs 4 and 5. The ring front light is mounted on the lens, the distance between the center of the seedling pot and the camera (D) is 175 mm, the height of the seedling pot (H1) is 85 mm, and the distance between the center of the camera and the working plane (H2) is 110 mm.
Gap inspection of grafted seedlings
Grafted gap inspection standard.
The grafted seedlings were inspected by physical observation to establish the qualification standard. Two conditions must be satisfied for a grafted seedling to be qualified: (1) the rootstock and scion must be grafted completely, with the gap fitted well; (2) the seedling must be held tightly by the grafting clip.
Three results are usually obtained (Fig 6): (a) the rootstock and scion are grafted completely, the gap is well fitted and the seedling is held tightly by the grafting clip; (b) the rootstock and scion are grafted completely, but the seedling is not held tightly; (c) the rootstock and scion are grafted incompletely and the grafted parts are not fitted well. Case (a) is regarded as qualified, while cases (b), (c) and any other conditions are considered unqualified.
Template matching technique.
HALCON 12.0, a professional image-processing package developed by MVTec, was used. Its integrated development environment and flexible architecture make it convenient for all kinds of automatic detection.
Because grafted gaps vary in shape, traditional image-processing methods cannot directly recognize them. Template matching, a better and more reliable method that has been adopted by many researchers in different fields, is therefore used in this study to recognize the grafted gap [20–23]. Template matching calculates the similarity between the template characteristics and the image to be inspected. Here, matching based on a deformable template was selected to recognize the grafted gap, since it can accommodate deformation. The theory of template matching is as follows: suppose the template T has size M×N and the target image S has size L×W (L>M, W>N). Template matching overlays the template T on the target image, and the similarity between the template and the sub-image S_{i,j} can be measured with the following equations:
$$D(i,j)=\sum_{m=1}^{M}\sum_{n=1}^{N}\left[S_{i,j}(m,n)-T(m,n)\right]^{2} \qquad (1)$$
Expanding formula (1) gives:
$$D(i,j)=\sum_{m=1}^{M}\sum_{n=1}^{N}T(m,n)^{2}+\sum_{m=1}^{M}\sum_{n=1}^{N}S_{i,j}(m,n)^{2}-2\sum_{m=1}^{M}\sum_{n=1}^{N}S_{i,j}(m,n)\,T(m,n) \qquad (2)$$
In Eq (2), the first term is the energy of the template, which does not change with the template position (i,j); the second term is the energy of the sub-image, which changes slightly with the template position; and the last term is the cross-correlation energy between the template and the sub-image, which reaches its maximum value when the template and the sub-image match exactly. The similarity between the template and the sub-image can therefore be calculated with the following equation:
$$R(i,j)=\sum_{m=1}^{M}\sum_{n=1}^{N}S_{i,j}(m,n)\,T(m,n) \qquad (3)$$
Eq (3) can be normalized as Eq (4):
$$R(i,j)=\frac{\displaystyle\sum_{m=1}^{M}\sum_{n=1}^{N}S_{i,j}(m,n)\,T(m,n)}{\sqrt{\displaystyle\sum_{m=1}^{M}\sum_{n=1}^{N}S_{i,j}(m,n)^{2}}\;\sqrt{\displaystyle\sum_{m=1}^{M}\sum_{n=1}^{N}T(m,n)^{2}}} \qquad (4)$$
By the Cauchy–Schwarz inequality, the similarity R(i,j) is a number between 0 and 1 (called the "score" in this study), signifying the similarity between the template characteristics and the found characteristics.
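The normalized similarity of Eq (4) can be sketched in a few lines of NumPy. This is an illustrative re-implementation of the textbook formula, not the HALCON internals, and the function names are our own:

```python
import numpy as np

def ncc_score(template: np.ndarray, sub_image: np.ndarray) -> float:
    """Normalized cross-correlation of Eq (4):
    R = sum(S*T) / (sqrt(sum(S^2)) * sqrt(sum(T^2)))."""
    t = template.astype(np.float64)
    s = sub_image.astype(np.float64)
    num = np.sum(s * t)
    den = np.sqrt(np.sum(s * s)) * np.sqrt(np.sum(t * t))
    return float(num / den) if den > 0 else 0.0

def match_template(target: np.ndarray, template: np.ndarray):
    """Slide the M x N template over the L x W target image and
    return the best score and its position (i, j)."""
    M, N = template.shape
    L, W = target.shape
    best_score, best_pos = -1.0, (0, 0)
    for i in range(L - M + 1):
        for j in range(W - N + 1):
            r = ncc_score(template, target[i:i + M, j:j + N])
            if r > best_score:
                best_score, best_pos = r, (i, j)
    return best_score, best_pos
```

By Cauchy–Schwarz, the score reaches 1.0 exactly when the sub-image is proportional to the template, which is why a perfect match scores 1.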
The technique basically consists of template creation, searching for the template in the target images, and outputting the inspection result. The flowchart of the template matching process is shown in Fig 7.
Template matching steps. 1) Template creation: during template creation, important parameters were first defined, including the pyramid level, the starting angle and rotation range of the template, the angle step of template rotation, and the contrast between the background and the target object. The clearest image was selected as the template image, and a ROI (region of interest) was generated on it. The selected ROI is a region filled with the typical characteristics (the gap and the grafting clip holding it), separated from the background. The template was then created from the characteristics in the ROI with the given parameters; the template contours include the grafting clip and the grafted gap. Fig 8 shows a pictorial flowchart of the template creation process.
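The pyramid-level parameter above controls how many down-sampled copies of the image the matcher searches coarsely before refining at full resolution. A minimal sketch of such an image pyramid (2×2 block averaging; our own illustrative code, not HALCON's):

```python
import numpy as np

def build_pyramid(image: np.ndarray, levels: int):
    """Build a simple image pyramid by 2x2 block averaging. Coarse
    levels let a matcher locate candidates quickly; fine levels
    refine the match position."""
    pyramid = [image.astype(np.float64)]
    for _ in range(1, levels):
        img = pyramid[-1]
        # Crop to even dimensions, then average each 2x2 block.
        h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
        img = img[:h, :w]
        img = (img[0::2, 0::2] + img[1::2, 0::2] +
               img[0::2, 1::2] + img[1::2, 1::2]) / 4.0
        pyramid.append(img)
    return pyramid
```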
2) Searching for the created template: deformable matching can detect an object that is distorted by local deformation. The purpose of the search is to find the best match of the deformable template in the target image. To do so, several parameters were optimized: the starting angle and rotation range of the template, the number of matches to search for, the largest overlap ratio, and the greediness. Before searching for the created template in the target image, the program first rectifies the shape change of the contour characteristics of the target image, then searches for the template according to the given parameters. If a match is found, a number between 0 and 1 (the score) is output, as shown in Fig 9A; if no match is found, there is no score, as shown in Fig 9B.
3) Outputting the inspection result: after searching for the template in many target images, the results obtained for qualified and unqualified grafted gaps are shown in Table 2. The scores of the qualified grafted gaps are greater than those of the unqualified ones, meaning that the contours of qualified gaps are more similar to the template contour. Comparison of the data showed that the minimum score of the qualified gaps (0.58) is always greater than the maximum score of the unqualified gaps. Consequently, a threshold score of 0.58 was established: if the score of a match is less than 0.58, the gap is considered unqualified; otherwise it is qualified.
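The way the cut-off was obtained can be expressed as a small helper: a single threshold exists only when the minimum qualified score exceeds the maximum unqualified score. The function below is an illustrative sketch, not the authors' code; in the usage comment, 0.97, 0.64 and 0.58 come from this study while the unqualified scores are hypothetical:

```python
def derive_threshold(qualified_scores, unqualified_scores):
    """Return the minimum qualified score as the threshold, after
    checking that it actually separates the two classes."""
    lo = max(unqualified_scores)  # highest score among unqualified gaps
    hi = min(qualified_scores)    # lowest score among qualified gaps
    if hi <= lo:
        raise ValueError("score ranges overlap; no single threshold exists")
    return hi

# e.g. derive_threshold([0.97, 0.64, 0.58], [0.31, 0.42])
# returns 0.58 (the unqualified scores here are hypothetical)
```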
Developing the grafted gap inspection algorithm.
The developed algorithm for the template matching process is as follows: (1) read the template image; (2) create the template; (3) search for the template in the target image; (4) inspect the grafted-gap quality from the search result (if the score is equal to or greater than 0.58, the grafted gap is judged as qualified; if the score is less than 0.58 or no match is found in the target image, the grafted gap is judged as unqualified). The output of the program as displayed on the interface is shown in Fig 10.
The main HALCON operators of the grafted-gap inspection algorithm are as follows:
- Read an image using the operator “read_image()”;
- Generate a ROI using the operator “gen_rectangle1()”;
- Create a deformable template using the operator “create_local_deformable_model()”;
- Get the characteristic contour of the template using the operator “get_deformable_model_contours()”;
- Search for the template in the target image using the operator “find_local_deformable_model()”.
Results and discussion
The developed machine-vision system was tested by inspecting the gap quality of 100 muskmelon grafted seedlings. The system reported 8 unqualified seedlings, whereas actual observation found only 6; that is, 2 qualified seedlings were inspected by the system as unqualified. The failure rate of the gap inspection machine-vision system is therefore 2%.
Twelve samples selected from the 100 tested grafted seedlings are shown in Fig 11. Results 1 to 8 depict seedlings output as qualified, in agreement with actual observation. The scores obtained range from 0.64 to 0.97; note that a score does not measure the grafting success of the seedling, but only the degree of similarity between the seedling and the template. Results 9 and 10 show two seedlings inspected as unqualified that were indeed found physically to be unqualified: their rootstocks and scions were not grafted completely. The last two results (11 and 12) show failed inspections: these two seedlings are actually qualified, even though the system output them as unqualified. The likely reason is the large bending angle of the seedlings, which makes the distance between the center of the camera and the gap too far or too near, so an unclear image of the grafted gap is acquired. Moreover, if the bending angle is too large, the template matching algorithm cannot find the template in the target image and the inspection fails. In some conditions the presence of soil and water on the seedling also destroys the contour of the gap image and causes the inspection to fail.
From the experimental results, the average time taken to inspect a seedling with the machine-vision system is less than 4 s (i.e., the system can inspect up to 15 seedlings·min-1), which fulfills the speed demand of the automatic grafting process. In actual production, manual inspection of a grafted seedling takes much longer. The application of a machine-vision system for gap-quality inspection can therefore improve the efficiency and degree of automation of the grafting process, and is significant for the development of grafting robots.
Conclusion
In this study, a machine-vision system for gap inspection of muskmelon grafted seedlings was developed based on template matching. The hardware of the system was selected and set up, and the gap inspection algorithm was developed based on deformable template matching. Tests indicate that the inspection speed of the system can reach 15 seedlings·min-1 with an inspection success rate of 98%, which fulfills the demands of grafting-robot automation and has great application value. This high-success-rate machine-vision gap inspection system provides a basis for further research on automatic grafting robots.
Supporting information
S1 File. Detection results of 100 tested grafted seedlings.
https://doi.org/10.1371/journal.pone.0189732.s001
(XLSX)
Acknowledgments
This study was supported by the National Natural Science Foundation of China (31101076) and the Tianzhu Mountain Scholar program of Shenyang Agricultural University (2014).
References
- 1. Ashraf MA, Kondo N, Tomoo S. Use of machine vision to sort tomato seedlings for grafting robot. Engineering in Agriculture, Environment and Food. 2011; 4(4), 119–125.
- 2. Tian SB, Wang ZF, Yang JF, Huang ZC, Wang RL, Wang LP, et al. Development of an automatic visual grading system for grafting seedlings. Advances in Mechanical Engineering. 2017; 9(1):1–12.
- 3. Lee JM, Kubota C, Tsao SJ, Bie Z, Echevarria PH, Morra L, Oda M. Current status of vegetable grafting: Diffusion, grafting techniques, automation. Scientia Horticulturae. 2010; 127(2), 93–105.
- 4. Gu S, Jiang LB. Development of domestic and foreign vegetable grafting robot. Journal of Northeast Agricultural University. 2007; 38(6), 847–851.
- 5. Luo J, Gao YW, He XB. The research status and development trend of automatic grafting machine. Human Agricultural Machinery. 2010; 37(2), 1–2.
- 6. Tian SB, Yang JF, Wang RL, Xu DL, Li TL. Optimization experiment of operating parameters on vibration sorting-clip device for vegetable grafting machine. Transactions of the Chinese Society of Agricultural Engineering. 2014; 30(6), 9–11.
- 7. Kubota C. Use of grafted seedlings for vegetable production in North America. Acta Horticulturae. 2008; 770, 21–28.
- 8. Ling PP, Ruzhitsky VN. Machine vision techniques for measuring the canopy of tomato seedling. Journal of Agricultural Engineering Research. 1996; 65(2), 85–95.
- 9. Zhang TZ, Wei JT. Study on vision system of vegetable grafting robot: Searching for growth point of cucurbit seedling with morphological method. Journal of China Agricultural University. 1999; 4(4),45–47.
- 10. Jayme MAB. Using digital image processing for counting whiteflies on soybean leaves. Journal of Asia-Pacific Entomology. 2014; 17(4), 685–694.
- 11. Wu Y, Li D, Li Z, Yang W. Fast processing of foreign fiber images by image blocking. Information Processing in Agriculture. 2014;1(1), 2–13.
- 12. Zhang D, Lillywhite KD, Lee DJ, Tippetts BJ. Automatic shrimp shape grading using evolution constructed features. Computers and Electronics in Agriculture. 2014; 100,116–122.
- 13. Sun C, Liu T, Ji C, Jiang M, Tian T, Guo D, et al. Evaluation and analysis the chalkiness of connected rice kernels based on image processing technology and support vector machine. Journal of Cereal Science. 2014; 60(2), 426–432.
- 14. Li MX, Yan ZC. Automated strawberry grading system based on image processing. Computers and Electronics in Agriculture. 2010;71(1), 32–39.
- 15. Ebrahimi E, Mollazade K, Babaei S. Toward an automatic wheat purity measuring device: A machine vision-based neural networks-assisted imperialist competitive algorithm approach. Measurement. 2014; 55, 196–205.
- 16. Kurtulmuş F, Kavdir İ. Detecting corn tassels using computer vision and support vector machines. Expert Systems with Applications. 2014; 41(16), 7390–7397.
- 17. Manickavasagan A, Al-Mezeini NK, Al-Shekaili HN. RGB color imaging technique for grading of dates. Scientia Horticulturae. 2014; 175, 87–94.
- 18. Mizushima A, Lu RF. An image segmentation method for apple sorting and grading using support vector machine and Otsu’s method. Computers and Electronics in Agriculture. 2013; 94, 29–37.
- 19. Tian SB, Dong S, Wang RL, Bai G, Li TL. Design and Experiment on Pushing-clip Device for Automatic Muskmelon Grafting Machine. Journal of Shenyang Agricultural University. 2015; 46(3):329–334.
- 20. Kurtulmuş F, Kavdir İ. Detecting corn tassels using computer vision and support vector machines. Expert Systems with Applications. 2014; 41(16), 7390–7397.
- 21. Reza A, Nader F. Estimating geometrical parameters of cylindrical targets detected by ground-penetrating radar using template matching algorithm. Arabian Journal of Geosciences. 2017; 10(6), 140.
- 22. Kartikey T, Divyang K, Fenil N, Jignesh S. Implementation and analysis of template matching for image registration on DevKit-8500D. Optik. 2016; 130,935–944.
- 23. Mark DJ, Peter B,Tom B, Gordon M. Extended fast compressive tracking with weighted multi-frame template matching for fast motion tracking. Pattern Recognition Letters. 2016; 69, 82–87.