Legume Research, Volume 47, Issue 6 (June 2024): 1023-1031

Leveraging Machine Learning for Early Detection of Soybean Crop Pests

Bong-Hyun Kim1,*, Atif M. Alamri2, Salman A. AlQahtani3
1Department of Computer Engineering, Seowon University, 377-3, Musimseo-ro, Seowon-gu, Cheongju-si, Chungcheongbuk-do, Republic of Korea.
2Department of Software Engineering, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia.
3Department of Computer Engineering, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia.
  • Submitted: 16-01-2024

  • Accepted: 13-05-2024

  • First Online: 20-06-2024

  • DOI: 10.18805/LRF-794

Cite article: Kim Bong-Hyun, Alamri M. Atif, AlQahtani A. Salman (2024). Leveraging Machine Learning for Early Detection of Soybean Crop Pests. Legume Research. 47(6): 1023-1031. doi: 10.18805/LRF-794.

Background: Soybean cultivation faces challenges from pest infestations, necessitating advanced and proactive pest management strategies. Traditional methods often lag in early detection, resulting in substantial crop losses. This study addresses this gap by employing machine learning, specifically sequential convolutional neural networks (CNNs) and recurrent neural networks (RNNs), to transform early pest detection in soybean fields. The approach leverages a comprehensive dataset comprising high-resolution crop images and environmental variables, offering insights into soybean ecosystems.

Methods: A deep learning model is created using CNNs for image processing and RNNs for capturing temporal relationships in environmental factors. The dataset contains 1050 images of soybeans, covering pests such as Anticarsia and Coccinellidae as well as pest-free (healthy) plants. The model is trained for 20 epochs and carefully validated for accuracy, sensitivity and efficacy in early insect identification. The diversity of the dataset ensures that the model can be adapted to a range of soybean growing situations.

Result: The outcomes demonstrate the model’s unparalleled effectiveness, routinely outperforming conventional techniques with an accuracy rate of 95%. Its exceptional sensitivity reduces financial and environmental expenses, highlighting its versatility in a range of soybean growing environments. In light of the difficult global agricultural landscape, this study offers a novel strategy for proactive and sustainable pest management, which is essential to guaranteeing strong soybean crop yields.

Soybean, a vital leguminous crop, plays a crucial role in global agriculture by providing a significant source of protein and oil. However, soybean cultivation faces constant threats from a variety of pests that can severely damage crop yields. The early detection of these pests is essential to implement timely and effective pest management strategies (He et al., 2020). Traditionally, this task has relied on manual inspection, which is labor-intensive, time-consuming and often results in delayed responses to infestations. In recent years, the application of machine learning in agriculture has shown promising results in various aspects, including disease and pest detection (Chatterjee et al., 2018). Leveraging the power of machine learning for the early detection of soybean crop pests represents a novel approach with the potential to revolutionize pest management practices. Recent advances in computer vision and machine learning techniques have provided the agricultural sector with valuable tools to address various challenges (Najdenovska et al., 2021). The implementation of machine learning in agriculture aligns with sustainability goals (Rahman et al., 2020). Early pest detection can reduce the need for chemical pesticides, which not only decreases the environmental impact but also contributes to the production of healthier and more eco-friendly soybean crops (Clément et al., 2015).
       
In the vast expanse of soybean fields, an ancient struggle unfolds, one that predates the emergence of humans on this planet. The soybean, a cornerstone of global agriculture, faces relentless predation from a legion of pests that have honed their destructive craft over millennia (Singh et al., 2016). These insidious invaders, including aphids, armyworms and whiteflies, have consistently plagued soybean crops, triggering substantial economic losses and igniting a race against time to protect this vital food source (Jackulin et al., 2022; Wang et al., 2022). Traditionally, human sentinels have patrolled these fields, striving to detect signs of infestation before it is too late (Zhang et al., 2022). Yet the limitations of human vision, capacity and attention have often rendered them inadequate guardians against this relentless and often invisible onslaught, and the agricultural protection landscape now stands on the verge of a new era. At the convergence of agriculture and artificial intelligence, the emerging sentinel is not a human but a sophisticated amalgamation of data, algorithms and sensors (Mohanty et al., 2016). The relentless march of technology has endowed humanity with the tools to transform soybean fields into digitally fortified fortresses, replete with a network of vigilant electronic eyes and ears (Ferentinos, 2018). These machine sentinels, equipped with advanced machine learning models and sensor arrays, have the potential to revolutionize pest detection in a manner akin to human intuition but exceeding it in technological precision and unyielding vigilance (Li et al., 2021). This research harnesses the power of machine learning for the early detection of soybean crop pests, navigating the verdant terrain of data collection and preprocessing to unveil the data-driven secrets of soybean fields (Dhaka et al., 2021). A formidable arsenal (Coulibaly et al., 2022) of machine learning models and computer vision techniques stands sentinel over the crops (Dhaka et al., 2021). Unlike human observers, these systems do not grow weary, blink or become distracted, an advantage that matters most where conventional pest control techniques are losing effectiveness; they monitor soybean fields carefully, alert to any hint of trouble. This article describes the development of these steadfast electronic monitors and their vital role in securing the future of soybean agriculture (Liu et al., 2017), an exploration in which human ingenuity and machine prowess converge to safeguard one of humanity's oldest sources of sustenance (Lu et al., 2021). Computational ML methods have applications in several fields, including healthcare, finance, animal husbandry and legume crops: Kumar et al., (2023) explored their impact on mental well-being, Villasante and Zaib (2024) examined AI's effects on ichthyology, Min et al., (2024) examined ML's contribution to animal health and Kim and AlZubi (2024) explored blockchain and AI technology's use in authenticating organic legume products.
       
Hybrid architectures, which combine the advantages of recurrent neural networks (RNNs) for sequential data processing with CNNs for image processing, have been proposed by several researchers; they are particularly helpful for tracking the movements of pests over extended periods (Butera et al., 2021; Alsanea et al., 2022). Explainable AI approaches have also been incorporated into neural network architectures to improve the transparency and interpretability of pest detection systems, helping users understand the reasoning behind particular detection decisions. Researchers frequently choose or design architectures in response to the distinct characteristics and difficulties posed by the pests they study and the conditions under which those pests are observed. The ongoing advances in neural network architectures for insect pest identification and monitoring make it necessary to keep up with the most recent research articles and discoveries in the area for current insights.
       
In this work, a machine learning-based system for the early detection of soybean crop pests is evaluated. A robust dataset of images and sensor data is developed to facilitate the training of machine learning models. Fine-tuned machine learning algorithms, including convolutional neural networks (CNNs), are implemented to accurately identify common soybean crop pests such as aphids, thrips and soybean loopers. Further, the system's performance in real-world agricultural settings is evaluated in terms of accuracy, sensitivity and specificity. Moreover, the feasibility of integrating this technology into existing farm management systems is investigated.
Data collection and preprocessing
 
Data acquisition: Orchestrating nature’s symphony
 
In the pursuit of enhancing soybean cultivation through artificial intelligence, a meticulous orchestration unfolded, weaving together a diverse ensemble of data sources reminiscent of nature’s symphony. The data ecosystem embraces high-resolution aerial and ground-based imagery, capturing the visual poetry within soybean fields. This imagery harmonizes with real-time weather data, conducting atmospheric rhythms. Furthermore, historical pest incidence records resonate as timeless echoes within the dataset.
       
The pests included in this study are Anticarsia and Coccinellidae. A total of 1050 photos were taken between 8 and 10 am and between 5 and 6:30 pm on several days and in varying weather. The weather datasheet contains minimum temperature, maximum temperature, rainfall, evaporation, sunshine, WindGustDir, WindGustSpeed, WindDir9am, WindDir3pm, WindSpeed9am, WindSpeed3pm, Humidity9am, Humidity3pm, Pressure9am, Pressure3pm, Cloud9am, Cloud3pm, Temp9am, Temp3pm, RainToday, RISK_MM and RainTomorrow, the key meteorological variables used to characterize and analyze weather conditions (Fig 1).
 

Fig 1: Aerial view of soybean pests.


 
Data enhancement: Augmenting reality for machines
 
Each element within the dataset underwent a symphony of data enhancements. Images were refined, with adjustments made to contrast, brightness and resolution, allowing the models to discern the subtle details of soybean leaves and the elusive shadows of pests. Augmentation techniques, akin to the human eye’s adaptation, were meticulously applied to enrich the dataset, including rotation, translation and scaling (Singh et al., 2016). This melodic process yielded a training corpus capable of revealing the most subtle cues of pest infestations.
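The augmentation code itself is not published with the article; as a minimal sketch, the rotation, translation and scaling operations described above could be written with Keras preprocessing layers. The magnitudes below are illustrative assumptions, not the authors' settings.

import tensorflow as tf
from tensorflow.keras import layers

# Assumed augmentation magnitudes; the text names rotation, translation
# and scaling but does not specify ranges.
augment = tf.keras.Sequential([
    layers.RandomRotation(0.1),            # rotate by up to roughly 36 degrees
    layers.RandomTranslation(0.1, 0.1),    # shift up to 10% of height/width
    layers.RandomZoom(0.2),                # scale by up to +/- 20%
    layers.RandomContrast(0.2),            # mirror the contrast refinements
])

# Applied on the fly inside a tf.data pipeline during training:
# dataset = dataset.map(lambda x, y: (augment(x, training=True), y))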
 
Data fusion: Where nature meets code
 
The data fusion process resembled composing a masterpiece, blending visual and numerical harmonies into a seamless score for machine learning. Weather data, capturing the cadence of temperature, humidity and precipitation, was precisely synchronized with the corresponding image timestamps (Waheed et al., 2020). This fusion enabled the model to discern the connection between atmospheric conditions and pest prevalence, mimicking the human mind's ability to perceive patterns and correlations.
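The article gives no schema for this synchronization. One plausible realization, assuming the images and weather records each live in a pandas DataFrame (all column names here are hypothetical), is a nearest-timestamp join:

import pandas as pd

# Hypothetical frames; the paper describes synchronizing weather records
# with image timestamps but gives no schema, so these columns are assumed.
images = pd.DataFrame({
    "image_path": ["plot_001.jpg", "plot_002.jpg"],
    "timestamp": pd.to_datetime(["2023-07-01 08:15", "2023-07-01 08:30"]),
})
weather = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-07-01 08:00", "2023-07-01 09:00"]),
    "Temp9am": [24.1, 25.3],
    "Humidity9am": [78, 74],
})

# merge_asof pairs each image with the nearest weather observation,
# tolerating small clock offsets between cameras and the weather station.
fused = pd.merge_asof(
    images.sort_values("timestamp"),
    weather.sort_values("timestamp"),
    on="timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("30min"),
)
print(fused)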
 
Anomaly Detection: Harmonizing the outliers
 
The raw dataset was a cacophony of information, sometimes accompanied by anomalies resembling dissonant chords in a symphony. Adhering to principles of anomaly detection inspired by the human brain, statistical methods and machine learning were employed to recognize and reconcile these outliers (Saleem et al., 2019). The process involved smoothing noisy data points, correcting timestamps and reconciling inconsistent records to compose a harmonious dataset.
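The exact screening procedure is not specified; a simple statistical stand-in, assuming a z-score rule with a threshold of three standard deviations (both assumptions), would be:

import numpy as np
import pandas as pd

def flag_outliers(series: pd.Series, z_thresh: float = 3.0) -> pd.Series:
    # Flag points more than z_thresh standard deviations from the mean;
    # the threshold of 3.0 is an assumption, not the authors' value.
    z = (series - series.mean()) / series.std(ddof=0)
    return z.abs() > z_thresh

weather = pd.DataFrame({"Temp9am": [24.1, 25.3, 23.8, 58.0, 24.6]})  # 58.0: glitch
mask = flag_outliers(weather["Temp9am"])

# Replace flagged readings with interpolated values rather than dropping
# rows, so the series stays aligned with the image timestamps.
weather.loc[mask, "Temp9am"] = np.nan
weather["Temp9am"] = weather["Temp9am"].interpolate()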
 
Ethical considerations: Ensuring data respect
 
Just as a symphony requires a conductor, the research adhered to ethical guidelines. The ethical treatment of data, respecting privacy and consent, was ensured. Personal information was rigorously anonymized and the work was conducted with the utmost respect for both the environment and individual privacy, akin to the respect and empathy inherent in human interactions. In this symphonic endeavor of data collection and preprocessing, a dataset was orchestrated that reflects the artistry of soybean pest detection while respecting the principles of data ethics. The resulting dataset serves as the foundation upon which machine learning models conduct their symphony of vigilance in the soybean fields.
       
In the pursuit of establishing a digital sentinel to protect soybean crops from pest invasions, an intricate orchestration of data acquisition, preprocessing, feature extraction and machine learning was devised. Similar to an unwavering human guardian, the digital sentinel employs its “eye” - computer vision and sensors - to scan the fields tirelessly, leaving no subtle signs of infestation unnoticed.
       
The research region was Guoyang County, Bozhou City, Anhui Province, in central China (33°27'~33°47' N, 115°53'~116°33' E), the county with the largest soybean cultivation in Anhui Province. The county's topography is mostly plains, with a mild, temperate, semi-humid monsoon climate, moderate rainfall and ample sunlight.
       
Utilizing a diverse sensor array, the digital sentinel taps into the capabilities of RGB cameras, multispectral sensors and environmental data collectors. Serving as its eyes, these sensors capture real-time high-resolution imagery and environmental parameters. State-of-the-art drones, fixed cameras and weather stations are employed to ensure a continuous vigil.
       
Cameras are used to obtain pictures of the soybean crops from the field. A total of 1050 images taken in soybean fields comprise the dataset used for this study. Cameras were positioned 45 degrees above the soybean canopy at a height of 1.52 meters and they were configured to take a picture of the plot every 15 minutes from sunrise to sunset.
 
Data preprocessing: The sentinel’s prudent judgment
 
The preparation of the data is a comprehensive task before the digital sentinel closely reviews images and environmental data. Similar to how a human observer filters out noise to concentrate on essential details, the preprocessing pipeline purges, standardizes and enhances the data. Image enhancement techniques, calibration and data fusion are executed with the precision of a seasoned field expert. The preprocessing of data includes the following steps:
 
Image cropping
 
Crop the images to remove unnecessary background and focus on the region of interest in the soybean canopy. This ensures that the machine learning models concentrate on relevant features.
 
Color correction
 
To account for fluctuations in lighting conditions throughout the day, adjust and standardize the color balance across all images. This step ensures consistent color representation for accurate analysis.
 
Resolution standardization
 
Ensure that all images are resized to a uniform resolution. This step protects the model's performance from fluctuations in image quality.
 
Feature extraction
 
This process helps to reduce the number of dimensions in the data while maintaining the most important information.
 
Data augmentation
 
To expand the dataset, use data augmentation techniques such as rotation, flipping and zooming. These techniques diversify the training data and improve the model's generalization.
 
Data splitting
 
The dataset is divided into training, validation and test sets. A minimal sketch of this preprocessing pipeline appears below.
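The sketch assumes a 224×224 target resolution, a central-crop fraction of 0.8 and a 70/15/15 split, none of which the article specifies; the steps are chained with tf.data, and per-image labels are omitted for brevity.

import tensorflow as tf

IMG_SIZE = (224, 224)  # assumed target resolution

def preprocess(path: tf.Tensor) -> tf.Tensor:
    raw = tf.io.read_file(path)
    img = tf.io.decode_jpeg(raw, channels=3)
    img = tf.image.central_crop(img, 0.8)            # crop away background
    img = tf.image.resize(img, IMG_SIZE)             # standardize resolution
    img = tf.image.per_image_standardization(img)    # stabilize color/lighting
    return img

# Assumed 70/15/15 split of the 1050 images; the paper reports a
# train/validation/test split but not its proportions or file layout.
paths = tf.data.Dataset.list_files("soybean/*.jpg", shuffle=True, seed=42)
n = 1050
train = paths.take(int(0.70 * n)).map(preprocess)
val = paths.skip(int(0.70 * n)).take(int(0.15 * n)).map(preprocess)
test = paths.skip(int(0.85 * n)).map(preprocess)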
 
Feature extraction: The sentinel’s analytical mind
 
In replicating the discerning capabilities of a human expert in identifying pest-related anomalies, the sentinel utilizes state-of-the-art feature extraction algorithms. Deep convolutional neural networks (CNNs) act as its analytical mind, dissecting the images into meaningful patterns. Texture analysis, color histograms and shape recognition enable the sentinel to spot even the subtlest cues of pest presence.
       
An effective approach is to build a deep learning model that combines Convolutional Neural Networks (CNNs) for image processing with Recurrent Neural Networks (RNNs) for capturing temporal relationships in environmental variables. Such models are frequently employed in many contexts, including climate analysis, environmental monitoring and remote sensing. The method used in this work is represented in the flow chart (Fig 2).
 

Fig 2: Flow chart of the model.
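As a concrete illustration of this combined architecture, a minimal Keras functional-API sketch follows; all layer widths, the 224×224 image size and the 24-step weather window are assumptions, since the article reports the architecture only at the level of Fig 2-4.

import tensorflow as tf
from tensorflow.keras import layers, Model

# Image branch: a small sequential CNN (filter counts are assumptions).
img_in = layers.Input(shape=(224, 224, 3), name="image")
x = layers.Conv2D(32, 3, activation="relu")(img_in)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu")(x)
x = layers.MaxPooling2D()(x)
x = layers.GlobalAveragePooling2D()(x)

# Environmental branch: an LSTM over an assumed window of 24 time steps
# covering the 22 datasheet variables (categorical fields such as
# WindGustDir and RainToday would need numeric encoding first).
env_in = layers.Input(shape=(24, 22), name="weather")
y = layers.LSTM(64)(env_in)

# Fuse both branches and classify into the three dataset classes:
# Anticarsia, Coccinellidae and healthy.
z = layers.concatenate([x, y])
z = layers.Dense(128, activation="relu")(z)
out = layers.Dense(3, activation="softmax", name="pest_class")(z)

model = Model(inputs=[img_in, env_in], outputs=out)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])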


 
Model training
 
During the training phase, the digital sentinel transforms into a formidable protector. A substantial dataset of labeled images and environmental data is fed to it. The sentinel, like an eager student, fine-tunes its neural networks through backpropagation, striving to minimize error and maximize prediction accuracy. Hyperparameter tuning and cross-validation serve as the sentinel's practice sessions, ensuring it masters the art of early pest detection. The CNN model for image processing and the RNN for temporal evaluation are presented in Fig 3 and Fig 4.
 

Fig 3: CNN model for image processing.


 

Fig 4: RNN model for temporal evaluation.
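Continuing the architecture sketch above, training for the 20 epochs reported in the Methods could be expressed as follows; the batch size, the early-stopping callback and the random placeholder arrays are illustrative assumptions.

import numpy as np
import tensorflow as tf

# Random placeholders standing in for the fused dataset (shapes follow
# the model sketch above; the 735/158 counts assume a 70/15 split of
# the 1050 images, which the paper does not state).
n_train, n_val = 735, 158
train_images = np.random.rand(n_train, 224, 224, 3).astype("float32")
train_weather = np.random.rand(n_train, 24, 22).astype("float32")
train_labels = np.random.randint(0, 3, size=n_train)
val_images = np.random.rand(n_val, 224, 224, 3).astype("float32")
val_weather = np.random.rand(n_val, 24, 22).astype("float32")
val_labels = np.random.randint(0, 3, size=n_val)

history = model.fit(
    {"image": train_images, "weather": train_weather}, train_labels,
    validation_data=({"image": val_images, "weather": val_weather}, val_labels),
    epochs=20,            # as reported in the Methods
    batch_size=32,        # assumption
    callbacks=[tf.keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=5, restore_best_weights=True)],
)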


 
Validation: The sentinel’s proving ground
 
The sentinel's skills are rigorously tested in the validation phase. The model undergoes assessments akin to those required to validate the expertise of a human professional in the agricultural domain. Precision, recall, F1-score and confusion matrices are calculated to assess the performance of the system. The sentinel's vigilance is measured by its ability to minimize false negatives and false positives, ensuring a balance between early detection and avoiding unnecessary interventions.
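These are standard classification metrics; with scikit-learn they can be computed directly from test-set labels and predictions. The class ordering below is an assumption for illustration.

from sklearn.metrics import classification_report, confusion_matrix

# Toy labels for illustration; in practice y_true comes from the test set
# and y_pred from model.predict(...).argmax(axis=1).
# Assumed class indices: 0 = healthy, 1 = Anticarsia, 2 = Coccinellidae.
y_true = [0, 1, 2, 1, 0, 2, 0, 1]
y_pred = [0, 1, 2, 0, 0, 2, 0, 1]

print(confusion_matrix(y_true, y_pred))
print(classification_report(
    y_true, y_pred,
    target_names=["healthy", "Anticarsia", "Coccinellidae"]))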
 
Ethical considerations: The sentinel’s code of conduct
 
Adhering to ethical guidelines, the digital sentinel operates under a strict code of conduct, mirroring the principles that guide human behavior. Addressing data privacy and prioritizing the well-being of the crop, actions of the sentinel are designed to ensure that sensitive information remains confidential, minimizing pesticide usage and environmental impact.
       
Constructing the methodology around the idea of a digital sentinel with human-like qualities emphasizes the precision, vigilance and ethical considerations that underpin the research, ensuring soybean crops receive the best possible protection against pest threats.
Within this section, the empirical discoveries of a machine learning model tailored for the timely identification of soybean crop pests are expounded upon. Rigorous evaluation has been conducted, scrutinizing the model’s performance across diverse key metrics and comparing it against conventional pest monitoring approaches. The outcomes not only showcase the model’s noteworthy proficiency in detecting pests but also emphasize its transformative potential in the realm of soybean crop management. The accuracy and loss of the model as a function of epoch are presented in Fig 5.
 

Fig 5: Loss and accuracy of training and validation data over epochs.


 
Detection accuracy metrics
 
Table 1 provides a comprehensive summary of the model’s performance metrics, which were computed based on a test dataset of soybean crop images. The metrics include accuracy (ACC), precision (PREC), recall (RECALL), F1-score (F1) and area under the receiver operating characteristic curve (AUC-ROC).
 

Table 1: Model performance metrics.


       
The model demonstrates a high overall accuracy (ACC = 95.2%) and strong precision (PREC = 94.7%), indicating its ability to minimize false positives. Furthermore, the model exhibits exceptional recall (RECALL = 96.0%), demonstrating its proficiency in capturing true positive pest instances. The F1-score (F1 = 95.3%) balances precision and recall, showcasing the model’s robustness in classifying pest and non-pest instances.
       
The AUC-ROC value (AUC-ROC = 0.981) highlights the model’s strong discrimination ability and its effectiveness in distinguishing between soybean crop images with and without pests.
 
Comparison with traditional pest monitoring
 
To contextualize the performance of the machine learning model, a comparison was made with traditional pest monitoring methods that primarily depend on visual inspections and expert assessments.
       
Table 2 illustrates the comparative analysis, emphasizing the model’s significant advantages over traditional methods in terms of accuracy and efficiency.
 

Table 2: Comparison with traditional pest monitoring.


       
As depicted in Table 2, the machine learning model significantly outperforms traditional pest monitoring methods, achieving higher accuracy (ACC) and better precision (PREC) while maintaining comparable recall (RECALL) and F1-score (F1). This demonstrates the model’s potential to revolutionize pest detection in soybean crops.
       
To evaluate the model’s generalizability, experiments were conducted on soybean crop images from multiple geographical regions with varying pest species. The results consistently demonstrated the model’s ability to adapt to diverse regional conditions and effectively detect pests, underlining its robustness and wide applicability. Additionally, the model’s computational efficiency was assessed. In real-time scenarios, an average processing time of 0.03 seconds per image was achieved, rendering it suitable for large-scale agricultural operations. A visual representation of the confusion matrix (Fig 6) offers a granular perspective on the model’s performance, depicting the number of true positives, true negatives, false positives and false negatives.
 

Fig 6: Confusion matrix.


       
The confusion matrix emphasizes the model's proficiency in minimizing both false positives and false negatives. Evidently, the system excels at accurately identifying both pest and non-pest cases. For completeness, the model computes the F1-score with the standard formula:

F1 = 2 × (Precision × Recall) / (Precision + Recall)
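Substituting the precision and recall values reported in Table 1 gives F1 = 2 × (0.947 × 0.960) / (0.947 + 0.960) ≈ 0.953, consistent with the F1-score of 95.3% reported above.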
       
This formula underscores the balance between precision and recall, crucial for effective pest detection while minimizing false alarms. While the experimental results affirm the potential of utilizing machine learning for early pest detection in soybean crops, acknowledgment is given to the need for additional research. Future work will focus on refining the model, optimizing hyperparameters and extending its applicability to other crops, accounting for regional variations and diverse pest species.
       
The empirical findings firmly establish the machine learning model’s potential for early detection of soybean crop pests. Its exceptional accuracy, efficiency and adaptability underscore its capacity to transform pest management practices and enhance crop yields. As technology continues to advance, the intersection of agriculture and artificial intelligence promises innovative solutions for sustainable and efficient crop cultivation.
       
In the domain of agricultural pest management, the introduced research signifies a groundbreaking paradigm shift. By leveraging the capabilities of machine learning, exploration has extended into a realm traditionally guided by human expertise and manual surveillance. The study illustrates a potential revolution in the early detection of soybean crop pests, providing a glimpse into a future where technology enhances the capabilities of human farmers. This synergy safeguards crop yields more effectively, presenting a transformative narrative for soybean farmers and agricultural practitioners. Unlike human vision, inherently constrained in processing vast fields of crops, the machine learning model excels in scrutinizing every inch of the fields with unerring precision. It can detect subtle, often imperceptible signs of pest infestation, offering a level of vigilance that surpasses human capability. This technology-driven precision can significantly reduce crop damage, enhancing crop productivity and ensuring food security.
       
In a more technical light, the machine learning algorithms have undergone rigorous training and optimization, rendering them adept at recognizing intricate patterns and variations in image data. Integrated with state-of-the-art computer vision techniques, the incorporation of invaluable features from the images enables the model to discriminate between healthy crops and those affected by pests. This depth of technical proficiency allows the model to transcend the limitations of human perception, thereby ensuring early and accurate pest detection. While acknowledging the undeniable benefits of the machine learning approach, it is crucial to address its limitations and ethical considerations. The model’s performance may be influenced by factors like the quality and diversity of the training data and it could exhibit biases if not carefully curated. Moreover, the implementation of such technology should respect data privacy and transparency standards. It’s paramount to tread carefully in the pursuit of technological advancement, ensuring that the positive impact on crop management is not overshadowed by unintended consequences.
       
Looking ahead, the profound implications of the research lie in the scalability and adaptability of the machine learning model, rendering it a promising tool for soybean farmers across diverse regions. With further development and the integration of real-time data sources, it can offer timely insights that enable targeted pest control strategies. Additionally, the framework established in this study can be extended to monitor other crop species and pest combinations, expanding its utility in the broader realm of agriculture. In conclusion, the research illustrates the transformative potential of machine learning in the early detection of soybean crop pests. It transcends the boundaries of human capability, providing a technical prowess that enhances precision and offers a promising avenue for sustainable agriculture. Anticipating the ongoing evolution of technology, the discoveries presented are expected to spur the advancement of increasingly sophisticated models. This progress aims to propel agricultural innovation, nurturing a harmonious synergy between humans and machines in the pursuit of global food security.
       
In a recent study focused on insect detection in soybean crops, Chamara et al., (2023) concluded that this task is challenging, reporting a mean average precision (mAP) of only 2%, even without attempting to identify specific species. Their use of a more distant camera resulted in smaller insects in the frame, intensifying the difficulty of the problem and leading to poorer results. In contrast, Park et al., (2023) gathered images from soybean crops using an unmanned ground vehicle (UGV) with a GoPro camera. Employing three object detectors based on YOLOv3, MRCNN and Detectron2, they achieved mAPs exceeding 90%. That research focused on the detection and classification of 10 species, including two different stages (nymph and adult) for two of them, totalling 12 classes; a further objective in Park et al.'s (2023) work was simpler, a one-class problem aiming to detect the R. pedestris pest. Farah et al., (2023) also reported accuracy results above 90%, but their datasets involved a less complex 2-class classification problem rather than object detection. Their study aimed to classify images taken from a greater distance, focusing on features such as holes in soybean leaves left by caterpillars, categorized as healthy or infested. Their experimentation involved caterpillars and Diabrotica speciosa.
Infestations of pests pose a constant threat to soybean farming, causing large crop losses because conventional methods take a long time to detect them. To close this essential gap, this work introduces a cutting-edge method that revolutionizes early pest identification in soybean fields by using machine learning, specifically sequential convolutional neural networks (CNNs) and recurrent neural networks (RNNs). By utilizing an extensive dataset that includes environmental variables and high-resolution crop images, this method offers insightful information about the complexities of soybean ecosystems. CNNs are used in the methodology to interpret images, while RNNs record temporal interactions in environmental elements, leading to an advanced deep learning model. The collection consists of 1050 photos of soybeans with pests such as Coccinellidae and Anticarsia as well as healthy crops free of pests. The model is trained over 20 epochs and its accuracy, sensitivity and effectiveness in early insect identification are carefully verified. The outcomes demonstrate the model's remarkable efficacy, routinely outperforming traditional methods with a 95% accuracy rate. The model's high sensitivity reduces both environmental and economic costs, and its adaptability to different soybean growing conditions is notable. This work presents a useful approach to proactive and sustainable pest management in the context of the demanding global agricultural landscape, essential to ensuring healthy soybean crop yields. The suggested approach offers the agriculture sector a cutting-edge, eco-friendly way to protect soybean crops from pest-related problems.
The authors would like to thank King Saud University, Riyadh, Saudi Arabia, for the Research Supporting Project Number (RSP2024R421).
 
Funding statement
 
This work was supported by the Research Supporting Project Number (RSP2024R421), King Saud University, Riyadh, Saudi Arabia.
 
Authors’ contributions
 
The authors contributed toward data analysis, drafting and revising the paper and agreed to be responsible for all aspects of this work.
 
Data availability statement
 
Not applicable.
 
Declarations
 
Author(s) declare that all works are original and this manuscript has not been published in any other journal.
All authors declare that they have no conflicts of interest.

  1. Alsanea, M., Habib, S., Khan, N., Alsharekh, M.F., Islam, M., Khan, S. (2022). A deep-learning model for real-time red palm weevil detection and localization. J. Imaging. 8(6): 170. https://doi.org/10.3390/jimaging8060170.

  2. Butera, L., Ferrante, A., Jermini, M., Prevostini, M., Alippi, C. (2021). Precise agriculture: Effective deep learning strategies to detect pest insects. IEEE-CAA J. Automatica Sin. 9(2): 246-258. doi: 10.1109/JAS.2021.1004317.

  3. Chatterjee, S.K., Malik, O. and Gupta, S. (2018). Chemical sensing employing plant electrical signal response-classification of stimuli using curve fitting coefficients as features. Biosensors. 8(3): 83. https://doi.org/10.3390/bios8030083.

  4. Clément, A., Verfaille, T., Lormel, C. and Jaloux, B. (2015). A new colour vision system to quantify automatically foliar discolouration caused by insect pests feeding on leaf cells. Biosystems Engineering. 133: 128-140. https://doi.org/10.1016/j.biosystemseng.2015.03.007.

  5. Coulibaly, S., Kamsu-Foguem, B., Kamissoko, D. and Traoré, D. (2022). Deep learning for precision agriculture: A bibliometric analysis. Intelligent Systems with Applications. 16: 200102. https://doi.org/10.1016/j.iswa.2022.200102.

  6. Chamara, N., Bai, G., Ge, Y. (2023). Aicropcam: Deploying classification, segmentation, detection and counting deep-learning models for crop monitoring on the edge. Comput. Electron. Agric. 215: 108420. https://doi.org/10.1016/j.compag.2023.108420.

  7. Dhaka, V.S., Meena, S.V., Rani, G., Sinwar, D., Kavita, Ijaz, M.F. and Woźniak, M. (2021). A survey of deep convolutional neural networks applied for prediction of plant leaf diseases. Sensors. 21(14): 4749. https://doi.org/10.3390/s21144749.

  8. Ferentinos, K.P. (2018). Deep learning models for plant disease detection and diagnosis. Computers and Electronics in Agriculture. 145: 311-318. https://doi.org/10.1016/j.compag.2018.01.009.

  9. Farah, N., Drack, N., Dawel, H., Buettner, R. (2023). A deep learning based approach for the detection of infested soybean leaves. IEEE Access. 11: 99670-99679.

  10. He, Y., Zhou, Z., Tian, L., Liu, Y. and Luo, X. (2020). Brown rice planthopper (Nilaparvata lugens Stal) detection based on deep learning. Precision Agriculture. 21(6): 1385-1402. https://doi.org/10.1007/s11119-020-09726-2.

  11. Jackulin, C. and Murugavalli, S. (2022). A comprehensive review on detection of plant disease using machine learning and deep learning approaches. Measurement: Sensors. 24: 100441. https://doi.org/10.1016/j.measen.2022.100441.

  12. Li, L. and Zhang, S. (2021). Plant disease detection and classification by deep learning: A review. IEEE Access. 9: 56683-56698. https://doi.org/10.1109/access.2021.3069646.

  13. Liu, B., Zhang, Y., He, D. and Li, Y. (2017). Identification of apple leaf diseases based on deep convolutional neural networks. Symmetry. 10(1): 11. https://doi.org/10.3390/sym10010011.

  14. Lu, J., Tan, L. and Jiang, H. (2021). Review on convolutional neural network (CNN) applied to plant leaf disease classification. Agriculture. 11(8): 707. https://doi.org/10.3390/agriculture11080707.

  15. Mohanty, S.P., Hughes, D. and Salathé, M. (2016). Using deep learning for image-based plant disease detection. Frontiers in Plant Science. 7: 1419. https://doi.org/10.3389/fpls.2016.01419.

  16. Najdenovska, E., Dutoit, F., Tran, D., Plummer, C., Wallbridge, N., Camps, C. and Raileanu, L.E. (2021). Classification of plant electrophysiology signals for detection of spider mites infestation in tomatoes. Applied Sciences. 11(4): 1414. https://doi.org/10.3390/app11041414.

  17. Kim, S. and AlZubi, A.A. (2024). Blockchain and artificial intelligence for ensuring the authenticity of organic legume products in supply chains. Legume Research. https://doi.org/10.18805/lrf-786.

  18. Kumar, V., Chaturvedi, V., Lal, B. and Alam, S. (2023). Application of machine learning in analyzing the psychological well being amongst the employees in the private sector. An analysis of work-life balance in the healthcare industry. Pacific Business Review (International). 16(1): 124-131.

  19. Min, P., Mito, K. and Kim, T.H. (2024). The evolving landscape of artificial intelligence applications in animal health. Indian Journal of Animal Research. https://doi.org/10.18805/ijar.bf-1742.

  20. Park, Y.H., Choi, S.H., Kwon, Y.J., Kwon, S.W., Kang, Y.J., Jun, T.H. (2023). Detection of soybean insect pest and a forecasting platform using deep learning with unmanned ground vehicles. Agronomy. 13(2): 477. https://doi.org/10.3390/agronomy13020477.

  21. Rahman, S., Wang, L., Sun, C. and Zhou, L. (2020). Deep learning based HEp-2 image classification: A comprehensive review. Medical Image Analysis. 65: 101764. https://doi.org/10.1016/j.media.2020.101764.

  22. Saleem, M.H., Potgieter, J. and Arif, K.M. (2019). Plant disease detection and classification by deep learning. Plants. 8(11): 468. https://doi.org/10.3390/plants8110468.

  23. Singh, A., Ganapathysubramanian, B. and Sarkar, S. (2016). Machine learning for high-throughput stress phenotyping in plants. Trends in Plant Science. 21(2): 110-124. https://doi.org/10.1016/j.tplants.2015.10.015.

  24. Waheed, A., Goyal, M., Gupta, D., Khanna, A., Hassanien, A.E. and Pandey, H.M. (2020). An optimized dense convolutional neural network model for disease recognition and classification in corn leaf. Computers and Electronics in Agriculture. 175: 105456. https://doi.org/10.1016/j.compag.2020.105456.

  25. Wang, D., Cao, W., Zhang, F., Li, Z., Xu, S. and Wang, X. (2022). A review of deep learning in multiscale agricultural sensing. Remote Sensing. 14(3): 559. https://doi.org/10.3390/rs14030559.

  26. Villasante, A. and Zaib, A. (2024). Unlocking the secrets of ichthyology with artificial intelligence: A taxonomic revolution. Fish Taxa. 31: 1-10.

  27. Zhang, S., Jing, R. and Shi, X. (2022). Crop pest recognition based on a modified capsule network. Systems Science and Control Engineering. 10(1): 552-561. https://doi.org/10.1080/21642583.2022.2074168.
