In agriculture, pest prevention typically relies on a detection system that observes signs of pest activity, processes the resulting signals, and then transmits real-time information to the user's mobile device.
Among the threats to crops, pest damage is one of the critical factors that reduces yield and harms consumers. Because pests appear irregularly on various parts of a crop, outbreaks are difficult to predict or prevent, and farmers struggle to respond in time. We therefore hope to use a sensor system to detect environmental conditions and the distribution of pests and diseases, and to provide this information to users in real time so that they can take countermeasures to prevent pest damage and crop loss effectively. We want to target specific crops, understand the current solutions on the market and their pros and cons, and then choose the best measures to deal with the problem.
According to our survey results, what users care about is not a fully automated monitoring-and-resolution system but the most basic detection and monitoring of crop status, reported back to them in a timely manner so that they can decide which countermeasures to take.
To figure out farmers' pain points, we interviewed farmers and asked questions designed around the potential problems we found in the literature. We considered a variety of solutions, from mechanical devices to spraying pesticides. In my user research, I found that most farmers in Taiwan are elderly, so they prefer a simple interface and clear information rather than large amounts of data. This was contrary to my original thinking; as an engineer, I assumed the more information, the better.
(1) Sound detection: Place sound sensors near the pest attractant and process the audio they receive, comparing the captured waveform against target waveforms that characterize pest activity. The system determines whether the sound comes from a pest, and whether one or several pests are present, and then sends the result as a data signal through a wireless transmitter.
(2) Infrared detection: Infrared detection is used in combination with image sensing. When the infrared sensor detects an object, it triggers the camera to shoot, reducing both the electricity consumed by imaging and the amount of data to process.
(3) Image recognition: An image module is added to the device and works with the infrared detector for image capture; MATLAB then performs image processing to determine whether a pest is present.
The results are sent back to the mobile terminal in real time. The returned data includes all of the detection system's readings (sound, infrared, image, etc.), and these values are transmitted back to the mobile terminal for processing.
(1) Image report: Use Bluetooth Terminal/Graphics to visualize the data values so that the user can observe the crop monitoring status. Reports can be sent to users periodically for reference.
(2) Real-time push notification: Under certain circumstances (e.g., when returned data falls below or exceeds a standard value), an instant message notification reminds users to personally check the pest status around the crop.
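A minimal sketch of this trigger logic might look like the following; the numeric thresholds and message wording are illustrative assumptions, not values from the actual system:

```python
# Minimal sketch of the push-notification trigger: compare a returned
# sensor value against a standard range and decide whether to alert.
# The thresholds below are illustrative assumptions, not measured values.

def needs_alert(value, lower=200, upper=800):
    """Return True when the reading leaves the acceptable band."""
    return value < lower or value > upper

def build_notification(sensor, value, lower=200, upper=800):
    """Return a notification message, or None when the value is normal."""
    if not needs_alert(value, lower, upper):
        return None
    direction = "below" if value < lower else "above"
    return f"{sensor} reading {value} is {direction} the standard range; please check the crop."
```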
(3) Data collection to provide the best solution: by collecting past pest experience and analyzing pest characteristics, the system lists the solutions likely to apply under specific conditions, so that users who receive a push notification can find a suitable solution as soon as possible.
For sound sensing, we used an LM386 amplifier connected to the Arduino to read the ambient sound. We tried two methods for reading the signal in:
a. Use the Arduino add-on to call readVoltage() from MATLAB and convert the 0–5 V reading back into the analog value measured at the pin. However, the delay is severe, and the sampling rate hits a hidden ceiling.
b. Use the serial port to read the value directly. Here MATLAB proved extremely unstable, with the corresponding port often failing suddenly, but the same approach works without problems when written in Python.
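On the Python side, the serial read itself would typically go through pyserial (e.g. `serial.Serial("COM3", 9600)`); the sketch below shows only the parsing step, under the assumption that the Arduino prints one 10-bit ADC value (0–1023) per line:

```python
# Sketch of the Python-side serial handling. The actual port read would use
# pyserial; here we parse the newline-terminated ADC values the Arduino is
# assumed to print, and convert them back to volts (5 V reference).

def parse_adc_line(line):
    """Parse one serial line into a 10-bit ADC count, or None if malformed."""
    text = line.strip()
    if not text.isdigit():
        return None
    value = int(text)
    return value if 0 <= value <= 1023 else None

def adc_to_voltage(adc, vref=5.0):
    """Convert a 10-bit ADC count to volts, as in the readVoltage() path."""
    return adc * vref / 1023.0
```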
Regarding pre-processing of the signal, both the source audio being read in and the collected comparison sample need to be processed (the sample only needs to be processed once, whereas the input audio must be processed once per unit).
Because the sample is static, it is relatively simple to handle: first low-pass filter the sample to remove noise, remove the portions whose values are close to 0, and then extract a segment of audio as the formal sample for comparison.
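These two steps can be sketched in pure Python. The report does not specify the filter design, so the moving-average low-pass and the 0.05 silence threshold below are illustrative assumptions:

```python
# Sketch of the sample pre-processing: a crude moving-average low-pass
# (the actual filter design is an assumption) followed by trimming the
# leading/trailing values that are close to zero.

def moving_average(signal, window=5):
    """Simple low-pass: average each point with its neighbors."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def trim_silence(signal, threshold=0.05):
    """Drop leading and trailing points whose magnitude is near zero."""
    start = 0
    while start < len(signal) and abs(signal[start]) < threshold:
        start += 1
    end = len(signal)
    while end > start and abs(signal[end - 1]) < threshold:
        end -= 1
    return signal[start:end]
```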
Since the sample and the read-in signal are on different scales, both are normalized and rescaled to [-1, 1].
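One common way to do this rescaling, dividing by the peak magnitude, can be sketched as:

```python
# Sketch of rescaling a signal to [-1, 1] by dividing by its peak magnitude,
# so the sample and the read-in signal share the same scale.

def normalize(signal):
    """Rescale to [-1, 1]; a silent (all-zero) signal is returned unchanged."""
    peak = max(abs(x) for x in signal)
    if peak == 0:
        return list(signal)
    return [x / peak for x in signal]
```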
To acquire the read-in signal, we initially tried accessing it through a fixed-size array at a fixed sampling rate, rolling the array forward to maintain its size and then comparing it with the sample. During testing, however, we found that this ignores the buffer: only part of the target signal may be captured, leading to misjudgment. To solve this, we switched to a frame-based method for capturing both the read-in and sample signals.
To avoid wasting extra memory, a frame is captured only when the read signal exceeds 30% of the average value, and the capture is cleared if 50 consecutive points fall below 30% of the average. The frame length is preset to 25, and each frame is then multiplied by a Hamming window:
Here the alpha coefficient is MATLAB's default for hamming(), 0.46, i.e. w(n) = 0.54 − 0.46·cos(2πn/(N−1)). After multiplying by the Hamming window, the left and right edges of each frame join more continuously.
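The frame-capture rule and the Hamming weighting can be sketched in Python as follows (an illustrative translation; the project itself used MATLAB's hamming()):

```python
import math

# Sketch of the framing step: capture points while the signal exceeds 30% of
# the average magnitude, discard an incomplete frame after 50 consecutive
# quiet points, and weight each 25-point frame by a Hamming window.

FRAME_LEN = 25

def hamming(n=FRAME_LEN):
    """Symmetric Hamming window: w(k) = 0.54 - 0.46*cos(2*pi*k/(n-1))."""
    return [0.54 - 0.46 * math.cos(2 * math.pi * k / (n - 1)) for k in range(n)]

def capture_frames(signal):
    """Collect 25-point frames where |x| > 30% of the mean magnitude;
    a run of 50 quiet points clears the current (incomplete) frame."""
    mean_mag = sum(abs(x) for x in signal) / len(signal)
    threshold = 0.3 * mean_mag
    frames, current, quiet = [], [], 0
    for x in signal:
        if abs(x) > threshold:
            current.append(x)
            quiet = 0
            if len(current) == FRAME_LEN:
                frames.append(current)
                current = []
        else:
            quiet += 1
            if quiet >= 50:
                current = []   # discard the incomplete frame
    win = hamming()
    return [[v * w for v, w in zip(frame, win)] for frame in frames]
```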
Then MATLAB performs an FFT to convert the signal from the time domain to the frequency domain, and the spectrum is multiplied by a set of triangular bandpass filters to smooth it and reduce the amount of data. The logarithm of the filter outputs is taken, and a DCT converts these log energies into the cepstral domain, yielding the MFCCs (Mel-frequency cepstral coefficients), 13 dimensions in total (1 log energy plus 12 cepstral parameters). MATLAB's built-in cepstral functions or mfcc() can do this; mfcc() returns 13 coefficients per frame by default.
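The final log + DCT step can be sketched in pure Python; the FFT and mel-filterbank stages are omitted here, and this is an illustrative sketch rather than the project's MATLAB code:

```python
import math

# Sketch of the final MFCC step: take the log of the triangular-filterbank
# energies, then apply a DCT-II to obtain the cepstral coefficients.
# (The FFT and mel filterbank stages are omitted for brevity.)

def log_energies(filter_energies):
    """Elementwise log of the filterbank outputs (floored to avoid log(0))."""
    return [math.log(max(e, 1e-10)) for e in filter_energies]

def dct2(x, n_coeffs):
    """DCT-II: c[k] = sum_m x[m] * cos(pi*k*(m+0.5)/M)."""
    m_total = len(x)
    return [sum(x[m] * math.cos(math.pi * k * (m + 0.5) / m_total)
                for m in range(m_total))
            for k in range(n_coeffs)]

def cepstral_coefficients(filter_energies, n_coeffs=12):
    """The 12 cepstral parameters c1..c12; c0 is skipped because the report
    uses a separate log-energy term as the 13th dimension."""
    return dct2(log_energies(filter_energies), n_coeffs + 1)[1:]
```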
Finally, the sound comparison. In my first attempt, I used the processed sample and input directly (scaled to the same range and sampling rate) and searched the input for a segment similar to the sample, computing the similarity with xcorr(); any segment differing by less than 0.05 was regarded as a pest. This version was completed, and it did work in an ideal test environment, but it does not account for the alignment problem.
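The project used MATLAB's xcorr(); a pure-Python sketch of the same idea, a peak normalized cross-correlation checked against the 0.05 difference threshold from the text, might look like:

```python
import math

# Sketch of the similarity check: slide the sample over the input, compute the
# normalized cross-correlation at each lag, and treat a peak whose difference
# from a perfect match (1.0) is below 0.05 as a detected pest.

def norm_xcorr_peak(signal, sample):
    """Peak normalized cross-correlation of sample against signal."""
    n = len(sample)
    s_norm = math.sqrt(sum(v * v for v in sample))
    if s_norm == 0:
        return 0.0
    best = 0.0
    for lag in range(len(signal) - n + 1):
        window = signal[lag:lag + n]
        w_norm = math.sqrt(sum(v * v for v in window))
        if w_norm == 0:
            continue
        corr = sum(a * b for a, b in zip(window, sample)) / (w_norm * s_norm)
        best = max(best, corr)
    return best

def is_pest(signal, sample, tolerance=0.05):
    """A match counts as a pest when the peak correlation is within
    `tolerance` of a perfect match."""
    return 1.0 - norm_xcorr_peak(signal, sample) < tolerance
```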
Because this method needed improvement, I decided to try MFCCs, which is why the pre-processing above includes the extra calculations. Ideally, I hope to compare the obtained MFCCs against a model or algorithm; at present, the plan is to use linear prediction to calculate LPCCs for comparison. The advantage of using MFCCs is that more insect species can be added in the future: the eigenvalues of the various samples can be saved and used for training, which can further identify insect species and increase judgment accuracy.
For image processing, we ultimately chose RGB over HSV. The planned mounting position aims the camera mainly at the crop, so the captured pictures closely resemble our reference pictures, and on those pictures we compared processing directly in RGB against converting to HSV first. Although HSV is more suitable in general, given the planned mounting position we evaluated both RGB processing and RGB-to-HSV conversion on the reference pictures.
In the end, RGB judged the presence of pests more effectively, so RGB was selected. Figure 11 to Figure 14 show the four pieces of code described above.
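An RGB-based pest check of this kind can be sketched as counting pixels in a "pest-like" color range; the color range and fraction threshold below are illustrative assumptions, as the project's actual MATLAB criteria appear in its Figures 11 to 14:

```python
# Sketch of an RGB-based pest check: count pixels whose color falls in a
# "pest-like" range and flag the image when that fraction is large enough.
# The color range and fraction threshold are illustrative assumptions.

def is_pest_pixel(r, g, b):
    """Assumed 'dark insect body' range: dim and roughly gray."""
    return r < 90 and g < 90 and b < 90

def pest_fraction(pixels):
    """Fraction of pest-like pixels; pixels is a list of (r, g, b) tuples."""
    if not pixels:
        return 0.0
    hits = sum(1 for r, g, b in pixels if is_pest_pixel(r, g, b))
    return hits / len(pixels)

def image_has_pest(pixels, threshold=0.02):
    return pest_fraction(pixels) >= threshold
```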
1. Since the sensor signals and data are processed in MATLAB, our initial idea was to run MATLAB in the cloud and access the data there at the same time.
2. Performing the calculations in the cloud is possible, but it is relatively slow and the load is high.
3. Because the virtual machine could not read the input from the port, we switched to ThingSpeak as the cloud platform.
4. Push notification: The part that takes an assumed value from the cloud and sends it back to the user is complete. Once the system data is complete, it can be fed into the transmission platform, and users will receive real-time push notifications.
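Writing a sensor value to a ThingSpeak channel goes through its update endpoint (`https://api.thingspeak.com/update?api_key=...&field1=...`); the sketch below only builds that URL, with a placeholder API key, and leaves out the actual request so it stays offline:

```python
from urllib.parse import urlencode

# Sketch of writing a sensor value to a ThingSpeak channel. ThingSpeak's
# update endpoint accepts requests of the form
#   https://api.thingspeak.com/update?api_key=...&field1=...
# The request itself (e.g. urllib.request.urlopen) is left out; the API key
# used in a test would be a placeholder.

THINGSPEAK_UPDATE = "https://api.thingspeak.com/update"

def build_update_url(api_key, **fields):
    """Build the ThingSpeak update URL for field1..field8 values."""
    params = {"api_key": api_key}
    params.update(fields)
    return THINGSPEAK_UPDATE + "?" + urlencode(params)
```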
Damage to plants from pests, especially to crops, is inevitable, and only long-term monitoring can ensure that crops grow in a good environment without being eroded by pests. We therefore believe that a real-time crop-pest notification system can have a positive impact on small farmers and even on agriculture as a whole: it can improve crop productivity, keep it at a sustainable level, and promote economic development.
Finding the balance between an engineer and a designer: my user research in this project highlighted an interesting point.
We initially thought farmers would appreciate automatic pest repelling. However, they described a strong preference for solving the problem in their own ways. To minimize such conflicts, I learned to apply user research findings at every stage to ensure the product would indeed fit users' needs.
Transparency of the algorithm: when an AI decision-making algorithm is involved in the system, users may have concerns about how it works and whether it is accountable. To address this, it is better to provide an explanation in the application to keep the algorithm transparent.