Research & development - Wageningen
Pest management in greenhouses requires automatic analysis of insects on sticky plates. By observing these insects with a hyperspectral camera, rather than using visible light only, we are investigating whether recognition improves. Annotating the insects in hyperspectral images can be challenging: insects are very small and easily missed, and the boundary between an object of interest and other objects is not always clear. Machine learning methods might be used to capture the information a human provides while annotating an image. Such methods can support users in annotating faster, without having to draw accurate contours, and can suggest missed insects to the human annotator. GrabCut is a well-known method in computer vision which could be extended to the hyperspectral domain.
One of the challenges in machine learning is the collection and annotation of data (in our case hyperspectral images). We currently have a dataset of insects on sticky plates which we would like to monitor for pest management in greenhouses. The dataset contains the following four species: Trialeurodes vaporariorum, Frankliniella occidentalis, Myzus persicae and Aphidius. For these we have bounding-box annotations made by human annotators on color images of the same sticky plates. A first issue is that human annotation is not perfect; especially the smallest insects are often missed during the annotation process. A second issue is that the current bounding boxes do not give very precise information, and the size of the bounding boxes may vary between annotators.
Getting a mask of which pixels belong to an insect and which pixels belong to the background would be a better result; however, creating this kind of detailed labeling costs more work. The idea here is to use the strengths of machine learning to learn from the human's annotations on the fly. A famous example of this technology is GrabCut, which was developed for normal color images. We hope machine-learning-driven approaches will make segmentation annotation on hyperspectral images easier. The research task is to improve the segmentation of insects using this technology.
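The core of a GrabCut-style method can be sketched as alternating between fitting a foreground and a background appearance model from the current labels and then re-labeling the pixels inside the user's box. The sketch below is a strong simplification, using a single mean spectrum per class instead of GrabCut's Gaussian mixtures and graph cut; all names are illustrative and not part of any existing code base. The point is that nothing in the loop depends on having exactly three color bands, so the same idea carries over to hyperspectral data.

```python
# Minimal GrabCut-style iteration, simplified to per-class spectral means.
# image: H x W grid (nested lists) of spectra, each spectrum a list of bands.
# box: (row0, col0, row1, col1) with exclusive upper bounds, as a user
# would draw it around an insect. Pixels outside the box stay background.

def box_init_segment(image, box, iterations=5):
    h, w = len(image), len(image[0])
    r0, c0, r1, c1 = box
    # Initial labels: 1 (foreground) inside the box, 0 (background) outside.
    labels = [[1 if r0 <= r < r1 and c0 <= c < c1 else 0
               for c in range(w)] for r in range(h)]

    def mean_spectrum(label):
        # Average spectrum of all pixels currently carrying this label.
        pixels = [image[r][c] for r in range(h) for c in range(w)
                  if labels[r][c] == label]
        bands = len(image[0][0])
        return [sum(p[b] for p in pixels) / len(pixels) for b in range(bands)]

    def dist2(a, b):
        # Squared Euclidean distance over all bands.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    for _ in range(iterations):
        fg, bg = mean_spectrum(1), mean_spectrum(0)
        # Re-label only pixels inside the box; as in GrabCut, the region
        # outside the user box is treated as certain background.
        for r in range(r0, r1):
            for c in range(c0, c1):
                closer_to_fg = dist2(image[r][c], fg) < dist2(image[r][c], bg)
                labels[r][c] = 1 if closer_to_fg else 0
    return labels
```

In the real GrabCut, the per-class model is a Gaussian mixture and the re-labeling step is a min-cut that also enforces spatial smoothness between neighboring pixels; those refinements would slot into the same loop.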
This technology should first be applicable to insect recognition on sticky plates (see a typical example below). Later, the same technology should be able to deal with hyperspectral images of fruit damage/bruises, plant health, etc.
We already have a code base for analyzing both normal/depth and hyperspectral images. You can reuse parts of the existing code and datasets, which will help you make fast progress. You will also create new technology that can be useful in other projects besides recognizing insects on sticky plates. For us it is important that you create new code that we can easily reuse in other projects.
- Literature review of related solutions and technologies.
- Prepare a plan containing a timeline with deliverables.
- Create a machine learning algorithm to predict which pixels belong to an insect.
- Measure the performance of your methods in terms of pixels, annotation actions, and the different species found on the sticky plate.
- Compare your methods with other methods for finding insects.
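For the pixel-level part of the evaluation, a common score is intersection-over-union between a predicted mask and a ground-truth mask. A minimal sketch (the function name and 0/1 nested-list mask format are assumptions, not part of the existing code base):

```python
def iou(pred, truth):
    """Pixel-wise intersection-over-union between two binary masks,
    given as equally sized 2D lists of 0/1 values."""
    inter = sum(p and t for row_p, row_t in zip(pred, truth)
                for p, t in zip(row_p, row_t))
    union = sum(p or t for row_p, row_t in zip(pred, truth)
                for p, t in zip(row_p, row_t))
    # Two empty masks agree perfectly by convention.
    return inter / union if union else 1.0
```

Reporting IoU per species would also expose whether the small species are segmented worse than the large ones.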
Does this project sound like an interesting next step in your career at imec? Don’t hesitate to submit your application by clicking on ‘APPLY NOW’.
Got some questions about the recruitment process? Martijn Kohl of the Talent Acquisition Team will be happy to assist you.