Proactive Peach Pest Control: Image Analysis and Real-Time Environmental Method

Hyunwook Kim (1), Hyun Sim (2)
(1) Kornerstone Co., Ltd., Suncheon City, Republic of Korea
(2) Department of Smart Agriculture, Sunchon National University, Republic of Korea
How to cite (IJASEIT): H. Kim and H. Sim, "Proactive Peach Pest Control: Image Analysis and Real-Time Environmental Method," Int. J. Adv. Sci. Eng. Inf. Technol., vol. 15, no. 3, pp. 997–1006, Jun. 2025.
This study presents an intelligent peach pest prediction and control system that fuses deep-learning image diagnostics with real-time IoT agro-climatic sensing. A CNN trained on a large, expert-labeled dataset automatically detects key pests and diseases (brown rot, bacterial spot, aphids, and peach moth) with 92% classification accuracy. Concurrently, multi-point sensors stream temperature, humidity, soil-moisture, and sunlight data to an LSTM forecasting model that learns environment-driven outbreak patterns. The two outputs are merged through a rule-based data-fusion algorithm that grades risk and triggers alerts. Field trials in Suncheon and Gwangyang orchards confirmed that the integrated approach raises early-detection rates by 10% over image-only baselines, issues warnings an average of three days before visible symptoms appear, and enables targeted interventions that reduce chemical usage and crop damage. Certification testing by the Korea Institute of Lighting Technology further validated key performance targets, including ≥87% predictive accuracy (94.2% achieved), image analysis within 20 seconds, and sensor-data processing within 1 minute. The modular edge-to-cloud architecture runs on cost-effective hardware, supports real-time dashboards and mobile notifications, and is readily extensible to other crops through transfer learning. By combining computer vision, time-series analytics, and IoT, the proposed system offers a practical, scalable template for proactive, data-driven crop protection that advances sustainable precision agriculture. Future work will extend deployment through drone imagery, lighter edge models, and explainable-AI modules to widen crop coverage and strengthen farmer trust.
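To make the fusion step concrete, the minimal Python sketch below shows one way a rule-based layer could combine a CNN image diagnosis with an LSTM environmental forecast into a graded risk level that triggers alerts, as the abstract describes. The class names, weights, and thresholds are illustrative assumptions, not the authors' published implementation.

    # Hedged sketch of a rule-based risk-fusion step (assumed names and thresholds).
    from dataclasses import dataclass

    @dataclass
    class PestDiagnosis:
        label: str          # e.g., "brown_rot", "bacterial_spot", "aphids", "peach_moth"
        confidence: float   # CNN softmax probability for the predicted class

    @dataclass
    class OutbreakForecast:
        outbreak_prob: float  # LSTM-estimated probability of an environment-driven outbreak

    def fuse_risk(diag: PestDiagnosis, forecast: OutbreakForecast) -> str:
        """Grade combined risk and decide whether to raise an alert."""
        # Simple weighted combination; the paper's actual rules are not spelled out here.
        score = 0.6 * diag.confidence + 0.4 * forecast.outbreak_prob
        if score >= 0.8:
            return "HIGH"      # push an immediate mobile alert
        if score >= 0.5:
            return "MEDIUM"    # flag on the dashboard for closer monitoring
        return "LOW"           # log only

    # Example usage with made-up values:
    level = fuse_risk(PestDiagnosis("brown_rot", 0.91), OutbreakForecast(0.72))
    print(level)  # -> "HIGH"

In a deployment like the one described, such a rule could run at the edge so that a high-risk grade immediately drives the dashboard and mobile notifications mentioned above.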


This work is licensed under a Creative Commons Attribution 4.0 International License.
