AI-Empowered Livestock Monitoring in Captive or Free-Range Breeding: A Review

Lili Bai1
Chaopeng Guo1
Zhe Zhang1
Jie Song1,*
1Software College, Northeastern University, Shenyang 110819, China.

Timely health monitoring is crucial for preventing livestock diseases throughout the breeding process. Conventional Livestock Health Monitoring (LHM) relies on manual efforts, achieving acceptable accuracy but suffering from low efficiency. Modern LHM integrates artificial intelligence to enhance both accuracy and efficiency in monitoring practices. This study reviews and evaluates over 200 research papers on intelligent monitoring in livestock, focusing on two breeding scenarios: Captive and Free-range. It examines various species, including pigs, cattle, sheep and horses, and conducts a comparative analysis of three key tasks: livestock identification, growth estimation and behavior/posture detection. By highlighting common challenges and technical requirements across species, the study provides insights for selecting suitable monitoring technologies and designing tailored systems for each breeding scenario. The scenario-based classification enables customized LHM solutions that improve versatility and adaptability across different farming environments. The study reveals common challenges in cross-species monitoring and outlines future research directions to address technical limitations and practical implementation barriers in AI-driven livestock health management.

In livestock breeding, livestock health monitoring (LHM) plays a vital role in disease prevention and production efficiency. As shown in Fig 1, breeders mark livestock offspring for ongoing monitoring. They assess development by measuring growth metrics such as body size and weight and observing posture, behavior and social activities to evaluate overall health. Upon maturity, breeders may sell, slaughter, or continue breeding the livestock. Regular LHM throughout the breeding cycle helps identify potential problems promptly, improving livestock quality and yield, preventing diseases, optimizing breeding strategies and promoting the sustainable development of animal husbandry.

Fig 1: Livestock breeding process.


       
Conventional LHM methods depend on manual measurements and empirical judgments, resulting in high information collection costs, poor real-time performance and low accuracy. In contrast, modern LHM utilizes artificial intelligence technologies, offering automation, efficiency and precision. As research advances, numerous scholars have summarized developments in various dimensions. For instance, Bretas et al., (2024) review precision livestock farming technologies for monitoring grazingland, emphasizing their role in enhancing productivity and sustainability. AlZubi et al., (2023) highlight the transformative role of artificial intelligence in predicting and diagnosing animal diseases, focusing on its potential to improve animal health management, while Antognoli et al., (2025) explore how computer vision improves efficiency and animal welfare in dairy farming. These reviews provide valuable technological insights and lay a methodological foundation for the field. 
       
However, despite the thorough exploration of technical applications in existing research, a critical issue is often overlooked: The effectiveness of these technologies is highly dependent on the specific physical scenarios and environmental constraints in which they are deployed. Most reviews juxtapose technologies validated in diverse environments (varying in background, spatial layout and data collection conditions) without a scene-aware classification framework. This approach fails to address the crucial question of “which technology is best suited for which breeding model,” which is vital for practical applications. The limitations arise from the fundamental differences between Captive breeding (e.g., closed barns) and Free-range breeding (e.g., open pastures), including background complexity, animal density, installable equipment and data acquisition methods. These differences directly impact the feasibility, robustness and economic viability of technological solutions.
       
To address the aforementioned research gap, this study systematically reviews and reassesses existing intelligent monitoring research in livestock, focusing on two primary breeding scenarios: Captive and Free-range. The research encompasses various livestock species, including pigs, cattle, sheep and horses, and conducts a comprehensive comparative analysis centered on three key monitoring tasks: Livestock identification, Growth estimation and Behavior and Posture detection. By summarizing common challenges and technical requirements across different species, this study provides theoretical foundations and practical guidance for selecting monitoring technologies and constructing systems tailored to various breeding scenarios, thereby promoting the intelligent advancement of precision livestock farming. The main contributions of this paper are as follows:
(1)   We have collected and analyzed 207 LHM-related papers in recent years. Through organizing and summarizing these gathered resources, we aim to provide readers with the latest trends and research findings in the field of LHM.
(2)   This paper studies LHM in captive breeding and free-range breeding respectively and comprehensively evaluates the health status of livestock through three key monitoring tasks: livestock identification, growth estimation and behavior and posture detection.
(3)   The study systematically analyzed monitoring tasks and modeling approaches in the LHM domain, revealing challenges such as data heterogeneity and insufficient model generalization ability. Based on current research trends, this paper describes existing technical limitations through three dimensions: architecture optimization, multimodal fusion and algorithm innovation, providing new directions for the LHM field.
 
Research of livestock health monitoring
 
Scenario of breeding
 
Hunter (1996) introduced the concepts of captive and free-range systems. To better analyze and evaluate existing studies, we integrate these concepts into AI-enabled LHM farming environments and establish a classification framework along four dimensions: spatial characteristics, monitoring equipment type, environmental controllability and data collection characteristics. Details are provided in Table 1.

Table 1: Indicators of captive and free-range breeding in AI-enabled LHM.


       
The study prioritized the type of equipment, followed by spatial features, environmental descriptors (keywords: “barn/fence” vs. “pasture/grazing”) and data collection conditions. When a study contained elements of both scenarios, the classification was determined by the dominant features (>60%) across all four dimensions.
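The dominant-feature rule above can be sketched in a few lines. The dimension names follow Table 1, while the function name, vote encoding and example values are hypothetical illustrations, not the authors' implementation:

```python
# Illustrative sketch of the scenario-assignment rule: a study is labeled
# "captive" or "free-range" when one scenario's indicators dominate (>60%)
# across the four classification dimensions from Table 1.

DIMENSIONS = ("spatial", "equipment", "environment", "data_collection")

def classify_scenario(votes: dict, threshold: float = 0.6) -> str:
    """votes maps each dimension to 'captive' or 'free-range'."""
    captive = sum(1 for dim in DIMENSIONS if votes[dim] == "captive")
    share = captive / len(DIMENSIONS)
    if share > threshold:
        return "captive"
    if 1 - share > threshold:
        return "free-range"
    return "mixed"  # no dominant scenario across the four dimensions

# Example: three of four dimensions point to captive breeding (75% > 60%).
label = classify_scenario({
    "spatial": "captive",
    "equipment": "captive",
    "environment": "free-range",
    "data_collection": "captive",
})
```

A two-against-two split falls through to "mixed", which mirrors the need for a dominance threshold rather than a simple majority.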
 
Tasks of livestock health monitoring
 
We discussed LHM based on two scenarios (Fig 2) and analyzed three key monitoring tasks: livestock identification, growth estimation and behavior and posture detection. First, livestock identification ensures each animal receives individualized attention and records, providing essential data for management. Second, growth estimation involves evaluating body condition, body weight and body size. Body condition detection assesses nutritional status to promote optimal growth and production. Weight detection monitors growth rates and trends, allowing adjustments in feeding and management. Body size detection analyzes physical characteristics, helping breeders implement effective breeding programs. Finally, behavior and posture detection provides insights into activity levels and physiological states, enabling the identification of sick or abnormal livestock.

Fig 2: Research of livestock health monitoring.


       
These three parts are interconnected, forming a complete livestock monitoring framework. In both scenarios, combining monitoring technologies with data analysis and intelligent algorithms makes it possible to accurately assess livestock health, identify potential disease risks in time, plan feeding strategies effectively and improve production efficiency.
 
Research method
 
We conducted a structured search across three platforms (Web of Science Core Collection, DBLP and Google Scholar) covering the period from 2017-01-01 to 2025-10-09. Queries were organized into three thematic blocks: (A) identification (identification/re-identification/face recognition/ear tag/muzzle, multi-object tracking, counting); (B) growth estimation (body weight/size, morphometry, body condition score, BCS); and (C) behavior and posture (posture/pose, social behavior, lameness, rumination, estrus, aggression). Each block was combined with species terms (livestock; pig/swine; cattle/cow/bovine; sheep/ovine; goat/caprine; horse/equine) and method terms (deep learning, computer vision, YOLO, transformer, etc.) using the AND operator and further constrained by scenario terms (captive/barn/pen/indoor/slaughterhouse and free range/pasture/grazing/rangeland/paddock/outdoor). The three blocks returned 121, 97 and 88 records in Web of Science; 131, 99 and 80 in Google Scholar; and 103, 81 and 127 in DBLP, respectively.
       
Search results from all platforms were exported and merged, then de-duplicated by DOI and by normalized titles (case-insensitive, punctuation removed). Inclusion criteria required AI/computer-vision–based monitoring of livestock species, publication within the specified time window and accessible full text. After screening, 207 studies were retained for synthesis. Their distribution is shown in Fig 3 and the complete list of all 207 studies is provided in the appendix.

Fig 3: Distribution of literature on AI-enabled livestock health monitoring (2017-2025).
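The merging and de-duplication step described above (DOI matching plus normalized-title matching) can be sketched as follows. The record fields and example entries are illustrative assumptions, not the actual export format of any platform:

```python
# Minimal sketch of the de-duplication step: records from all platforms are
# merged, then collapsed by DOI and by a normalized title (lower-cased,
# punctuation removed, whitespace collapsed).
import string

def normalize_title(title: str) -> str:
    table = str.maketrans("", "", string.punctuation)
    return " ".join(title.lower().translate(table).split())

def deduplicate(records: list) -> list:
    seen_dois, seen_titles, unique = set(), set(), []
    for rec in records:
        doi = (rec.get("doi") or "").lower()
        key = normalize_title(rec["title"])
        if (doi and doi in seen_dois) or key in seen_titles:
            continue  # duplicate of an earlier record
        if doi:
            seen_dois.add(doi)
        seen_titles.add(key)
        unique.append(rec)
    return unique

# Two platforms returning the same paper with differing punctuation/case:
papers = [
    {"title": "Cattle Weight Estimation Model Through Readily Photos",
     "doi": "10.1016/j.engappai.2024.109976"},
    {"title": "cattle weight estimation model, through readily photos!",
     "doi": ""},
]
kept = deduplicate(papers)  # only the first record survives
```

Title normalization catches duplicates whose DOI is missing in one export, which is common when merging bibliographic platforms.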



Livestock identification
 
In livestock breeding, the accurate identification of individual animals is the basis of breeding management and disease monitoring. AI-based identification has become the mainstream approach: it enables 24/7 real-time monitoring and identification of livestock and helps managers detect abnormal conditions in time. Accurate recognition can be achieved by extracting the body features of livestock.
 
Captive breeding
 
Captive breeding provides a structured setting where animal behavior can be closely monitored and analyzed. The implementation of fixed surveillance systems, positioned strategically around enclosures, allows for continuous observation and data collection, enabling researchers to gather insights into individual animal behaviors and interactions.
 
Multi-object tracking algorithms: Lu et al., (2024) and Yang et al., (2025) achieved 99.8% and 98.8% tracking accuracy in complex environments and low-light pig farms respectively through rotated bounding box detectors and spatiotemporal dynamics analysis. Myat Noe et al., (2023) improved black cattle tracking accuracy to 76.2% via algorithm comparisons.
 
Lightweight biometric recognition: Zheng et al., (2023)’s six-layer convolutional model, Ruchay et al., (2024)’s transfer learning framework and Li et al., (2023)’s MobileViTFace achieved embedded deployment for cattle face (98.37%) and sheep face (97.13%) recognition. Yang et al., (2023) enhanced cattle MOTA to 96.88% using deformable convolutions.
 
Unsupervised feature optimization: Lu et al., (2023) developed a key region module for unsupervised fine-grained feature extraction, achieving 98.37% cattle recognition accuracy.
 
Free-range breeding
 
In free-range breeding scenarios, the identification of livestock requires a more dynamic approach due to the inherent mobility and unpredictability of animals. Scholars emphasize the importance of blending technology with behavioral insights to effectively track animals in open environments. 
 
IoT tracking systems: Voulodimos et al., (2010)’s RFID management system and Pretto et al., (2024)’s visual ear mark recognition (89% precision) formed dual-mode individual tracking. Natori et al., (2019) integrated GPS/accelerometers to establish grazing cattle activity models.
 
Drone-based monitoring: Soares (2024) applied the Ford-Fulkerson algorithm to reduce counting errors to 2.34%. Liu et al., (2021) developed a real-time app with intelligent UAV path guidance that compressed transmission latency to below 500 ms.
 
Multimodal recognition: Sun et al., (2024) enhanced MixFormer trackers to handle posture variations (75.79% AUC), while Hassan-Vasquez (2022) combined deep learning with GPS for activity boundary mapping.
 
Summary
 
Table 2 summarizes studies on livestock identification tasks (details in the Appendix). Significant differences exist between captive and free-range breeding in the application and development of identification technologies. In confined environments, identification accuracy is generally stable and exceeds 94%, with Odo et al., (2025) achieving up to 95% accuracy using high-frame-rate CCTV. However, this reliance on controlled conditions raises concerns about the generalizability of findings to real-world scenarios.

Table 2: Livestock identification.


       
Most studies in confined spaces focus on RGB-D cameras and deep learning algorithms. For instance, Lu et al., (2024) reported 97.5% accuracy under stable conditions, showcasing technological advancements. Nonetheless, limitations such as lighting variations and animal behavior must be acknowledged, as these can affect performance in practical applications.
       
In contrast, accuracy in free-range breeding is more variable, heavily influenced by weather and environmental conditions. Zhang et al., (2022) found a significant decrease in drone monitoring accuracy during adverse weather, while Huang et al., (2023) reported 84% accuracy through multi-sensor data fusion, lower than that in captive environments. Addressing the need for robust sensor fusion techniques and adaptive algorithms is essential.
       
In summary, although captive settings foster the development of high-accuracy identification technologies, future research must tackle the limitations in fluctuating conditions. This could involve hybrid approaches that integrate diverse data sources, enhance algorithm adaptability and improve sensor robustness for reliable performance across different livestock species and breeding contexts.
 
Growth estimation
 
The estimation of livestock growth involves detecting body condition, weight and body size, which are key parameters for evaluating their development and health (Machebe et al., 2016). Regular monitoring of these factors enables breeders to understand livestock growth status and adjust feeding and management practices to ensure adequate nutrition and optimal growth conditions, thus enhancing breeding efficiency and production performance. Fig 4 illustrates two typical AI-based approaches for detecting growth indicators.

Fig 4: (a) Growth estimation based on image analysis, with measurements: 1-9 representing various body dimensions. (b) Growth estimation using 3D stereo reconstruction technique, with measurements a-i representing different body dimensions.
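As a purely illustrative bridge from body measurements like those in Fig 4 to a weight estimate, the classical Schaeffer girth formula for cattle converts heart girth and body length into an approximate weight. Modern AI pipelines replace such fixed formulas with learned regressors; the formula is a well-known rule of thumb in imperial units and all numbers below are toy values, not taken from any paper reviewed here:

```python
# Toy illustration: turning two extracted body dimensions into a weight
# estimate via the classical Schaeffer formula for cattle,
#   weight_lb ~= heart_girth_in**2 * body_length_in / 300.

def schaeffer_weight_lb(heart_girth_in: float, body_length_in: float) -> float:
    return heart_girth_in ** 2 * body_length_in / 300.0

def to_kg(pounds: float) -> float:
    return pounds * 0.45359237  # exact lb-to-kg conversion factor

# Example: an animal with a 70-inch heart girth and 60-inch body length.
weight_lb = schaeffer_weight_lb(70, 60)   # 980.0 lb
weight_kg = to_kg(weight_lb)
```

Learned models (e.g., the point-cloud and RGB-D regressors discussed below) effectively replace this single fixed coefficient with parameters fitted per species and breed.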


 
Captive breeding
 
In captive breeding, livestock growth is primarily influenced by human management rather than natural environmental factors. This controlled setting allows breeders to monitor growth closely and adjust feeding strategies based on regular measurements of body shape, weight and size to ensure healthy development.
 
Body condition monitoring: Basak et al., (2023) demonstrated 94.6% body composition prediction in pigs using ultrasound-SVR integration, while He et al., (2023a) developed YOLOX-based posture scoring achieving 0.85 F1-score.
 
Contactless weight estimation: LiDAR-PointNet++ systems attained 3.2% prediction error in cattle (Hou et al., 2023), contrasting with RGB-D LiteHRNet’s 14.6% MAPE for sheep (He et al., 2023b).
 
3D morphometrics: Weng et al., (2023) and Ling et al., (2022) established Kinect V2-based pig measurement pipelines (2.1 cm MAE), enhanced by Lu et al., (2025)’s GCN-mesh refinement (3.58% MAPE).
 
Free-range breeding
 
In free-range breeding, livestock growth is more influenced by natural factors than by human intervention. This environment necessitates a different evaluation approach, considering behavioral characteristics, social relationships and autonomous selection abilities to comprehensively assess the health status of livestock.
 
Vision-based systems: Bai et al., (2025) enabled smartphone-based cattle weight estimation (97% precision) through multi-scale fusion networks. Garcia et al., (2021) identified weight anomalies with a gradient boosting model (R² = 0.92).
 
Sensor integration: Vaughan et al., (2017)’s tomography sensors (<1% error) and Simanungkalit et al., (2020)’s WOW systems demonstrated 100% repeatability in pasture conditions.
 
Predictive analytics: Abdelhady et al., (2019) achieved 98.75% accuracy in sheep weight prediction using K-means clustering. Sultana et al., (2022) compared three nonlinear models for predicting yearly cattle weight, showing that the Brody model offers better fit statistics for various cattle genotypes in Bangladesh.
 
Summary
 
Growth estimation in livestock differs significantly between captive and free-range breeding, each presenting unique challenges for technology application. In captive settings, researchers have achieved over 94% accuracy using advanced imaging and algorithms, thanks to stable monitoring and controlled management (Table 3). Fig 5 highlights that weight measurements account for 36% of growth estimation methodologies in both settings, with body condition assessments contributing 28% in captivity and 20% in free-range environments. The trend indicates an increasing emphasis on optimizing technology for accurate assessments, particularly in captive breeding.

Table 3: Growth estimation of livestock.



Fig 5: Comparison of growth estimation.


       
However, the reliance on controlled environments raises concerns about the transferability of these findings to real-world scenarios, where factors like animal stress and varying behaviors can significantly affect growth outcomes. This emphasizes the need for caution when interpreting results from ideal conditions. Moreover, species-specific differences in growth estimation methods call for tailored approaches. For example, pigs benefit from visual and tactile data, while cattle and sheep typically rely on conventional weight and body condition metrics, indicating that a one-size-fits-all solution may not be effective.
       
Although advancements in sensor technology offer new opportunities for growth estimation, they also introduce complexities that can hinder practical applications. While multiple sensors enhance sensitivity to behavioral changes, they complicate data management, underscoring the preference for simpler monitoring solutions in real-world settings.
 
Behavior and posture detection
 
Monitoring livestock behavior and posture enables breeders to quickly assess health, emotional states and growth, facilitating early detection of diseases or abnormalities. Key posture indicators include standing and lying positions, head posture and leg alignment, which reveal activity levels and comfort, as well as potential musculoskeletal issues. Additionally, behavior patterns serve as vital indicators of health and welfare; timely monitoring can identify abnormal behaviors and early signs of disease or environmental stress, allowing for prompt preventive measures.
 
Captive breeding
 
In captive breeding scenarios, livestock are often affected by spatial limitations and human intervention, and their behavior and posture may exhibit characteristic patterns. Livestock may show more regular and repetitive behaviors in captive environments, such as walking back and forth, standing or lying down. Monitoring the behavior and posture of captive livestock can help evaluate their adaptation to captive conditions, explore their physiological and psychological states and identify potential health issues or behavioral abnormalities.
       
Zia et al., (2023) established the CVB dataset with 502 natural-light videos, achieving recognition of 11 fundamental cattle behaviors through an enhanced SlowFast model (57 fps real-time processing). Zheng et al., (2023) developed a Siam-AM dual-network architecture incorporating attention mechanisms for leg motion tracking, demonstrating 94.7% lameness detection accuracy across three large-scale farms. Gan et al., (2021) and Bati and Ser (2023) addressed social interactions in piglets and ovine stress responses respectively, with the former attaining F1=0.9377 for aggression detection in 8-hour videos using spatiotemporal CNN, while the latter implemented fear behavior classification through optical flow-CNN fusion. Yan et al., (2024) proposed a dual-modality network with adaptive RGB-optical flow fusion, achieving 77.8% recall for porcine aggression detection in 642 annotated clips. Gao et al., (2023) enhanced the UD-YOLOv5s model via jaw skeleton feature extraction, reaching 86.4% mAP for rumination recognition.
 
Free-range breeding
 
In free-range breeding scenarios, livestock have a larger range of activity and a more natural growth environment, and their behavior and posture may be more diverse and variable. Free-range livestock can exhibit more exploratory and active behaviors, such as foraging, playing or communicating. By monitoring the behavior and posture of free-range livestock, we can gain a more comprehensive understanding of their adaptability to the natural environment, as well as the personality traits and social behaviors they exhibit during free activity.
       
Kirsch et al., (2025) integrated Time-Distributed Residual LSTM-CNN with bidirectional LSTM to create a hierarchical framework for equine behavior classification (up to 93% cross-validation accuracy across 15 activities). Gu et al., (2023) developed a two-stage recognition system that preliminarily screens six sheep behaviors (including standing/attacking) before VGG network refinement, maintaining model memory under 120 MB. Schmeling et al., (2021) and Li et al., (2024) devised multi-sensor solutions: the former predicted bovine lying patterns using accelerometer-magnetometer-gyroscope fusion, while the latter achieved 93.64% lameness detection accuracy in 330-cow trials through EfficientNet-B0-based hoof-angle analysis. Riaboff et al., (2020) revealed via accelerometer-GPS correlation that 63% of rumination occurs under tree shade in permanent pastures, with feeding frequency increasing by 28% near automated milking systems in temporary grasslands.
 
Summary
 
In captive breeding, the limited range of animal activity allows for easier and more precise monitoring of behavior and posture. In contrast, free-range breeding is significantly influenced by natural conditions, requiring researchers to develop effective monitoring solutions for dynamic environments. As illustrated in Fig 6, the distribution of research on behavior and posture detection varies, with a greater emphasis on behaviors such as feeding and social interactions in free-range settings compared to controlled captive studies (Table 4 in the Appendix).

Fig 6: Behavior and posture papers from the last decade in captive breeding and free-range breeding.



Table 4: Behavior and posture detection of livestock.


       
Technologically, the diversity of detection methods reflects adaptability to different scenarios. Captive breeding benefits from video analysis and deep learning models like YOLO for real-time posture analysis. In free-range contexts, automated sensors and cameras are increasingly used; for example, Gu et al., (2023) employed RGB-D cameras to monitor cattle behavior under variable conditions. Because natural elements affect data quality in free-range settings, integrating multiple sensors such as accelerometers and GPS is crucial for improving accuracy.
       
Species-specific behavioral traits also influence the choice of posture detection technologies. Different species, such as pigs, cattle and sheep, require tailored approaches; for instance, Eisermann et al., (2022) used jaw-tracking equipment for pigs, while cattle behavior detection relies more on video analysis. This specialization enhances monitoring accuracy and efficiency, facilitating the development of tailored management strategies.
 
Prospects
 
Research on AI-based LHM has become a hot topic in academia. Literature analysis shows that in captive breeding, the focus is on high-precision visual recognition and behavior and posture monitoring, utilizing deep learning for accurate individual identification and anomaly detection. In contrast, free-range breeding emphasizes multi-sensor integration and environmental adaptability, using GPS, sensors and long-term dynamic monitoring to cope with complex natural environments. This contrast highlights the preference for static monitoring technologies in confined settings, whereas free-range contexts rely on integrated multi-source data and emphasize system robustness and adaptability, yielding distinct technological strategies for each breeding scenario. We foresee significant advancements in the LHM domain regarding structure, content and technology, leading to new innovations in intelligent animal breeding.
 
Classification method
 
This study examines scenarios in LHM, distinguishing between free-range breeding and captive breeding. Health assessments of livestock in free-range environments help simulate natural ecosystems, enabling a holistic evaluation of their well-being. In contrast, monitoring in captive environments minimizes external influences, yielding more precise data. Future studies could explore different categorizations, such as classifying LHM research based on livestock husbandry purposes:
(1) Meat-producing livestock: Specifically bred for meat production.
(2) Dairy livestock: Primarily serving for milk and dairy product output, e.g., cows, sheep.
(3) Working livestock: Utilized for tasks like plowing and transportation, e.g., horses, cattle.
(4) Fur-producing livestock: Reared for fur or wool extraction.
       
Through this classification method, personalized improvements can be made to the health status of livestock and targeted management suggestions can be provided for various types of animal husbandry, promoting the development and health enhancement of the entire animal husbandry industry.
 
Enhance refined management
 
Captive breeding offers significant opportunities for improved livestock management, particularly in behavior and posture detection. Future research can focus on:
 
(1)   Diversified behavior detection
 
Integrating multiple sensor technologies, such as video surveillance and accelerometers, to develop algorithms that recognize various postures and behaviors, aiding in early detection of health issues and enhancing animal welfare.
 
(2)   Behavior pattern analysis
 
Utilizing machine learning to identify behavior patterns in livestock, creating databases and alert systems that help herders monitor health and status effectively.
       
In free-range breeding, there is untapped potential for precision management, as current research predominantly emphasizes high-accuracy recognition tasks while neglecting growth indicator detection. Future directions may include:
 
(1) Sensor technology
 
Integrating machine learning with sensors (like neck collars or RFID tags) to monitor body size and enable real-time weight tracking.
 
(2) Computer vision technology
 
Employing portable devices for monitoring livestock growth, using smartphones or cameras to capture images for analysis and weight estimation through deep learning.
 
(3) Intelligent devices
 
Developing tools to automatically record and transmit growth data to the cloud for precise analysis through AI algorithms.
 
Technological improvements
 
Remote monitoring and data-driven decision-making
 
Current AI-driven LHM research faces challenges with fragmented monitoring tasks, such as livestock identification, posture detection and weight measurements, leading to inefficiencies and information silos. Future research should focus on developing an integrated monitoring system to streamline these tasks, enhancing data accuracy and decision-making. Such a system would utilize sensor technologies and internet connectivity for real-time health monitoring, enabling farmers to access comprehensive physiological and environmental data anytime. Intelligent decision support systems powered by machine learning can analyze health trends, predict disease outbreaks and provide targeted advice, thus improving productivity while ensuring livestock health.
 
Multimodal data fusion and comprehensive assessment
 
Future LHM research should prioritize multimodal data fusion by integrating data from various sensors, such as images, videos and sounds, for a comprehensive health assessment. For example, analyzing both audio and visual data can help detect abnormal breathing sounds and infer potential health issues. This approach, combined with machine learning, can enhance diagnostic accuracy in livestock health assessment.
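As a hedged sketch of what decision-level (late) fusion might look like, the snippet below combines per-modality anomaly scores with a confidence-weighted average. The modality names, scores and weights are all illustrative assumptions; a deployed system would learn them from validation data:

```python
# Toy sketch of late multimodal fusion: each modality-specific model emits
# an anomaly score in [0, 1]; a weighted average produces the fused score.
# Weights encode illustrative per-modality confidence, not learned values.

def fuse_scores(scores: dict, weights: dict) -> float:
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

# Example: the audio model hears abnormal breathing (0.9) while the video
# model sees a normal posture (0.2); the fused score still flags the animal.
fused = fuse_scores({"audio": 0.9, "video": 0.2},
                    {"audio": 0.6, "video": 0.4})
# fused == 0.62 with these illustrative weights
```

Even this simple scheme shows the benefit described above: a strong signal in one modality is not drowned out by a normal reading in another, which a single-modality threshold would miss.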
 
Enhancing model performance in low data sample scenarios
 
AI-empowered LHM research typically relies on large datasets for image processing and deep learning. However, the scarcity of publicly available livestock datasets and the absence of standard data formats hinder generalization across studies. Future research should explore enhancing model performance in low-sample scenarios through techniques like transfer learning, meta-learning and active learning to leverage existing data effectively. Furthermore, methods such as Generative Networks and self-supervised learning can augment datasets, improving diversity and robustness. Combining these advanced technologies will help develop more precise models for livestock health monitoring, addressing challenges related to limited sample data and pushing innovation in AI-empowered LHM technology.
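A minimal, purely illustrative sketch of the augmentation idea: even simple geometric and photometric transforms multiply a small labeled image set. Images are nested lists of pixel values here for self-containment; real pipelines would use a library (and, for generative or self-supervised augmentation, far richer transforms):

```python
# Hedged toy sketch of dataset augmentation for low-sample settings:
# each labeled image yields a mirrored copy and a brightness-shifted copy,
# tripling the training set. All names and values are illustrative.
import random

def hflip(img):
    return [row[::-1] for row in img]  # mirror each pixel row

def jitter(img, delta, rng):
    shift = rng.uniform(-delta, delta)  # one brightness offset per image
    return [[min(255, max(0, px + shift)) for px in row] for row in img]

def augment(dataset, rng=None, delta=10):
    rng = rng or random.Random(0)  # seeded for reproducibility
    out = []
    for img, label in dataset:
        out.append((img, label))
        out.append((hflip(img), label))
        out.append((jitter(img, delta, rng), label))
    return out

tiny = [([[10, 20], [30, 40]], "pig")]
aug = augment(tiny)  # 3 labeled samples from 1
```

Transfer learning, meta-learning and generative augmentation pursue the same goal at a much larger scale: extracting more effective supervision from the few labeled livestock images that exist.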
Conclusion
 
This paper analyzes over 200 AI studies on LHM in captive and free-range breeding, focusing on three key tasks: livestock identification, growth estimation and behavior detection. It compares monitoring approaches in both scenarios, outlines current challenges in the LHM domain and suggests future research directions for more effective livestock management solutions.
 
Acknowledgements
 
This paper was supported by Fundamental Research Funds for the Central Universities under grant No. N2317005 and the National Natural Science Foundation of China under grant No. 62302086.
 
Disclaimers
 
The views and conclusions expressed in this article are solely those of the authors and do not necessarily represent the views of their affiliated institutions. The authors are responsible for the accuracy and completeness of the information provided, but do not accept any liability for any direct or indirect losses resulting from the use of this content.
 
 The authors declare that there are no conflicts of interest regarding the publication of this article. No funding or sponsorship influenced the design of the study, data collection, analysis, decision to publish, or preparation of the manuscript.

  1. Abdelhady, A.S., Hassanien, A.E., Awad, Y.M., El-Gayar, M. and Fahmy, A. (2019). Automatic sheep weight estimation based on k-means clustering and multiple linear regression. Proceedings of the International Conference on Advanced Intelligent Systems and Informatics 2018. 845: 546-555. doi: 10.1007/978-3-319-99010-1_50.

  2. AlZubi, A.A. (2023). Artificial intelligence and its application in the prediction and diagnosis of animal diseases: A review. Indian Journal of Animal Research. 57(10): 1265-1271. doi: 10.18805/IJAR.BF-1684.

  3. Antognoli, V., Presutti, L., Bovo, M., Torreggiani, D. and Tassinari, P. (2025). Computer vision in dairy farm management: A literature review of current applications and future perspectives. Animals. 15(17): 2508. doi: 10.3390/ani15172508.

  4. Bai, L., Guo, C. and Song, J. (2025). Cattle weight estimation model through readily photos. Engineering Applications of Artificial Intelligence. 143: 109976. doi: 10.1016/j.engappai.2024.109976.

  5. Basak, J.K., Paudel, B., Deb, N.C., Kang, D.Y., Moon, B.E., Shihab, A.S. and Tae, K.H. (2023). Prediction of body composition in growing-finishing pigs using ultrasound based back-fat depth approach and machine learning algorithms. Computers and Electronics in Agriculture. 213: 108269. doi: 10.1016/j.compag.2023.108269.

  6. Bati, C.T. and Ser, G. (2023). SHEEPFEARNET: Sheep fear test behaviors classification approach from video data based on optical flow and convolutional neural networks. Computers and Electronics in Agriculture. 204: 107540. doi: 10.1016/j.compag.2022.107540.

  7. Bretas, I.L., Dubeux, J.C.B., Cruz, P.J.R., Oduor, K.T., Queiroz, L.D., Valente, D.S.M. and Chizzotti, F.H.M. (2024). Precision livestock farming applied to grazingland monitoring and management: A review. Agronomy Journal. 116(3): 1164-1186. doi: 10.1002/agj2.21346.

  8. Cao, Y., Chen, J. and Zhang, Z. (2023). A sheep dynamic counting scheme based on the fusion between an improved-sparrow-search YOLOv5x-ECA model and few-shot DeepSORT algorithm. Computers and Electronics in Agriculture. 206: 107696. doi: 10.1016/j.compag.2023.107696.

  9. Dong, L., Andrea, P., Eric, P., Robert, F. and Tomas, N. (2023). Where’s your head at? Detecting the orientation and position of pigs with rotated bounding boxes. Computers and Electronics in Agriculture. 212: 108099. doi: 10.1016/j.compag.2023.108099.

  10. Du, A., Guo, H., Lu, J., Su, Y., Ma, Q., Ruchay, A.N., Marinello, F. and Pezzuolo, A. (2022). Automatic livestock body measurement based on keypoint detection with multiple depth cameras. Computers and Electronics in Agriculture. 198: 107059. doi: 10.1016/j.compag.2022.107059.

  11. Eisermann, J., Schomburg, H., Knöll, J., Schrader, L. and Patt, A. (2022). Bite-o-Mat: A device to assess the individual manipulative behaviour of group housed pigs. Computers and Electronics in Agriculture. 193: 106708. doi: 10.1016/j.compag.2022.106708.

  12. Gan, H., Ou, M., Huang, E., Xu, C., Li, S., Li, J., Liu, K. and Xue, Y. (2021). Automated detection and analysis of social behaviors among preweaning piglets using key point-based spatial and temporal features. Computers and Electronics in Agriculture. 188: 106357. doi: 10.1016/j.compag.2021.106357.

  13. Gao, G., Wang, C., Wang, J. and Lv, Y. (2023). UD-YOLOv5s: Recognition of cattle regurgitation behavior based on upper and lower jaw skeleton feature extraction. 2023 NANA. 532-538. doi: 10.1109/NaNA60121.2023.00094.

  14. Garcia, R., Aguilar, J., Toro, M. and Jimenez, M. (2021). Weight-identification model of cattle using machine-learning techniques for anomaly detection. 2021 SSCI. 1-7. doi: 10.1109/SSCI50451.2021.9659840.

  15. Gu, Z., Zhang, H., He, Z. and Niu, K. (2023). A two-stage recognition method based on deep learning for sheep behavior. Computers and Electronics in Agriculture. 212: 108143. doi: 10.1016/j.compag.2023.108143.

  16. Guo, Y., Yu, Z., Hou, Z., Zhang, W. and Qi, G. (2023). Sheep face image dataset and DT-YOLOv5s for sheep breed recognition. Computers and Electronics in Agriculture. 211: 108009. doi: 10.1016/j.compag.2023.108009.

  17. Hassan-Vásquez, J.A., Maroto-Molina, F. and Guerrero-Ginel, J.E. (2022). GPS tracking to monitor the spatiotemporal dynamics of cattle behavior and their relationship with feces distribution. Animals. 12(18): 2383. doi: 10.3390/ani12182383.

  18. He, C., Qiao, Y., Mao, R., Li, M. and Wang, M. (2023). Enhanced LiteHRNet based sheep weight estimation using RGB-D images. Computers and Electronics in Agriculture. 206: 107667. doi: 10.1016/j.compag.2023.107667.

  19. He, H., Chen, C., Zhang, W., Wang, Z. and Zhang, X. (2023). Body condition scoring network based on improved YOLOX. Pattern Analysis and Applications. 26(3): 1071-1087. doi: 10.1007/s10044-023-01171-x.

  20. Hou, Z., Huang, L., Zhang, Q. and Miao, Y. (2023). Body weight estimation of beef cattle with 3D deep learning model: PointNet++. Computers and Electronics in Agriculture. 213: 108184. doi: 10.1016/j.compag.2023.108184.

  21. Huang, X., Hu, Z., Wang, X., Yang, X., Zhang, J. and Shi, D. (2019). An improved single shot multibox detector method applied in body condition score for dairy cows. Animals. 9(7): 470. doi: 10.3390/ani9070470.

  22. Huang, Y., Xiao, D., Liu, J., Tan, Z., Liu, K. and Chen, M. (2023). An improved pig counting algorithm based on YOLOv5 and DeepSORT model. Sensors. 23(14): 6309. doi: 10.3390/s23146309.

  23. Hunter, D.L. (1996). Tuberculosis in free-ranging, semi free-ranging and captive cervids. Rev. Sci. Tech. OIE. 15(1): 171-181. doi: 10.20506/rst.15.1.911.

  24. Kirsch, K., Strutzke, S., Klitzing, L., Pilger, F., Thöne-Reineke, C. and Hoffmann, G. (2025). Validation of a time-distributed residual LSTM-CNN and BiLSTM for equine behavior recognition using collar-worn sensors. Computers and Electronics in Agriculture. 231: 109999. doi: 10.1016/j.compag.2025.109999.

  25. Li, X., Xiang, Y. and Li, S. (2023). Combining convolutional and vision transformer structures for sheep face recognition. Computers and Electronics in Agriculture. 205: 107651. doi: 10.1016/j.compag.2023.107651.

  26. Ling, Y., Jimin, Z., Caixing, L., Xuhong, T. and Sumin, Z. (2022). Point cloud-based pig body size measurement featured by standard and non-standard postures. Computers and Electronics in Agriculture. 199: 107135.

  27. Liu, C., Jian, Z., Xie, M. and Cheng, I. (2021). A real-time mobile application for cattle tracking using video captured from a drone. 2021 ISNCC. 1-6. doi: 10.1109/ISNCC52172.2021.9615648.

  28. Lu, H., Zhang, J., Yuan, X., Lv, J., Zeng, Z., Guo, H. and Ruchay, A. (2025). Automatic coarse-to-fine method for cattle body measurement based on improved GCN and 3D parametric model. Computers and Electronics in Agriculture. 231: 110017. doi: 10.1016/j.compag.2025.110017.

  29. Lu, J., Chen, Z., Li, X., Fu, Y., Xiong, X., Liu, X. and Wang, H. (2024). ORP-Byte: A multi-object tracking method of pigs that combines Oriented RepPoints and improved Byte. Computers and Electronics in Agriculture. 219: 108782. doi: 10.1016/j.compag.2024.108782.

  30. Lu, Y., Weng, Z., Zheng, Z., Zhang, Y. and Gong, C. (2023). Algorithm for cattle identification based on locating key area. Expert Systems with Applications. 228: 120365. doi: 10.1016/j.eswa.2023.120365.

  31. Machebe, N.S., Ezekwe, A.G., Okeke, G.C. and Banik, S. (2016). Path analysis of body weight in grower and finisher pigs. Indian Journal of Animal Research. 50(5): 794-798. doi: 10.18805/ijar.11319.

  32. Myat Noe, S., Zin, T.T., Tin, P. and Kobayashi, I. (2023). Comparing state-of-the-art deep learning algorithms for the automated detection and tracking of black cattle. Sensors. 23(1): 532. doi: 10.3390/s23010532.

  33. Natori, T., Ariyama, N., Tsuichihara, S., Takemura, H. and Aikawa, N. (2019). Study of activity collecting system for grazing cattle. 2019 ITC-CSCC. 1-4. doi: 10.1109/ITC-CSCC.2019.8793451.

  34. Odo, A., McLaughlin, N. and Kyriazakis, I. (2025). Re-identification for long-term tracking and management of health and welfare challenges in pigs. Biosystems Engineering. 251: 89-100. doi: 10.1016/j.biosystemseng.2025.02.001.

  35. Pretto, A., Savio, G., Gottardo, F., Uccheddu, F. and Concheri, G. (2024). A novel low-cost visual ear tag based identification system for precision beef cattle livestock farming. Information Processing in Agriculture. 11(1): 117-126. doi: 10.1016/j.inpa.2022.10.003.

  36. Qian, L., Yongsheng, S., Mengyuan, C., Xi, K. and Gang, L. (2024). Lameness detection of dairy cows based on key frame positioning and posture analysis. Computers and Electronics in Agriculture. 227: 109537. doi: 10.1016/j.compag.2024.109537.

  37. Riaboff, L., Couvreur, S., Madouasse, A. and Roig-Pons. (2020). Use of predicted behavior from accelerometer data combined with GPS data to explore the relationship between dairy cow behavior and pasture characteristics. Sensors. 20(17): 4741. doi: 10.3390/s20174741.

  38. Ruchay, A., Kolpakov, V., Guo, H. and Pezzuolo, A. (2024). On-barn cattle facial recognition using deep transfer learning and data augmentation. Computers and Electronics in Agriculture. 225: 109306. doi: 10.1016/j.compag.2024.109306.

  39. Schmeling, L., Elmamooz, G. and Hoang. (2021). Training and validating a machine learning model for the sensor-based monitoring of lying behavior in dairy cows on pasture and in the barn. Animals. 11(9): 2660. doi: 10.3390/ani11092660.

  40. Shi, J., Chen, X., Zhang, Y., Gong, P., Xiong, Y., Shen, M., Norton, T., Gu, X. and Lu, M. (2025). Detection of estrous ewes’ tail-wagging behavior in group-housed environments using Temporal-Boost 3D convolution. Computers and Electronics in Agriculture. 234: 110283. doi: 10.1016/j.compag.2025.110283.

  41. Simanungkalit, G., Hegarty, R.S., Cowley, F.C. and McPhee, M.J. (2020). Evaluation of remote monitoring units for estimating body weight and supplement intake of grazing cattle. Animal. 14: s332-s340. doi: 10.1017/S1751731120000282.

  42. Soares, V.H.A., Ponti, M.A. and Campello, R.J.G.B. (2024). Multi-attribute, graph-based approach for duplicate cattle removal and counting in large pasture areas from multiple aerial images. Computers and Electronics in Agriculture. 220: 108828. doi: 10.1016/j.compag.2024.108828.

  43. Soares, V.H.A., Ponti, M.A., Gonçalves, R.A. and Campello, R.J.G.B. (2021). Cattle counting in the wild with geolocated aerial images in large pasture areas. Computers and Electronics in Agriculture. 189: 106354. doi: 10.1016/j.compag.2021.106354.

  44. Sultana, N., Khan, M.K.I. and Momin, M.M. (2022). Nonlinear models for the prediction of yearly live weight of cattle. Asian Journal of Dairy and Food Research. 41(2): 168-172. doi: 10.18805/ajdfr.DRF-257.

  45. Sun, Q., Yang, S., Wang, M., Hu, S. and Ning, J. (2024). A real-time dairy goat tracking based on MixFormer with adaptive token elimination and efficient appearance update. Computers and Electronics in Agriculture. 218: 108645. doi: 10.1016/j.compag.2024.108645.

  46. Vaughan, J., Green, P.M., Salter, M., Grieve, B. and Ozanyan, K.B. (2017). Floor sensors of animal weight and gait for precision livestock farming. 2017 IEEE SENSORS. 1-3. doi: 10.1109/ICSENS.2017.8234202.

  47. Voulodimos, A.S., Patrikakis, C.Z., Sideridis, A.B., Ntafis, V.A. and Xylouri, E.M. (2010). A complete farm management system based on animal identification using RFID technology. Computers and Electronics in Agriculture. 70(2): 380-388. doi: 10.1016/j.compag.2009.07.009.

  48. Wang, Z., Zhou, S., Yin, P., Xu, A. and Ye, J. (2023). GANPose: Pose estimation of grouped pigs using a generative adversarial network. Computers and Electronics in Agriculture. 212: 108119. doi: 10.1016/j.compag.2023.108119.

  49. Weng, Z., Li, Z. and Zheng, Z. (2023). Three-dimensional point cloud reconstruction and body ruler measurement of pig body under multi-angle KinectV2. BIC 2023. pp. 39-40. doi: 10.1145/3592686.3592694.

  50. Xu, J., Liu, W., Qin, Y. and Xu, G. (2022). Sheep counting method based on multiscale module deep neural network. IEEE Access. 10: 128293-128303. doi: 10.1109/ACCESS.2022.3221542.

  51. Yan, K., Dai, B., Liu, H., Yin, Y., Li, X., Wu, R. and Shen, W. (2024). Deep neural network with adaptive dual-modality fusion for temporal aggressive behavior detection of group-housed pigs. Computers and Electronics in Agriculture. 224: 109243. doi: 10.1016/j.compag.2024.109243.

  52. Yang, W., Wu, J., Zhang, J., Gao, K., Du, R., Wu, Z., Firkat, E. and Li, D. (2023). Deformable convolution and coordinate attention for fast cattle detection. Computers and Electronics in Agriculture. 211: 108006. doi: 10.1016/j.compag.2023.108006.

  53. Yang, Y., Li, C., Wang, X., Zhou, H., Yang, J. and Xue, Y. (2025). Automatic recognition of isolated piglet outliers based on multi-object tracking. Computers and Electronics in Agriculture. 235: 110377. doi: 10.1016/j.compag.2025.110377.

  54. Zhang, X., Xuan, C., Ma, Y. and Su, H. (2022). An integrated goat head detection and automatic counting method based on deep learning. Animals. 12(14): 1810. doi: 10.3390/ani12141810.

  55. Zheng, Z. and Qin, L. (2023). PrunedYOLO-Tracker: An efficient multi-cows basic behavior recognition and tracking technique. Computers and Electronics in Agriculture. 213: 108172. doi: 10.1016/j.compag.2023.108172.

  56. Zheng, Z., Zhang, X., Qin, L., Yue, S. and Zeng, P. (2023). Cows’ legs tracking and lameness detection in dairy cattle using video analysis and Siamese neural networks. Computers and Electronics in Agriculture. 205: 107618. doi: 10.1016/j.compag.2023.107618.

  57. Zia, A., Sharma, R., Arablouei, R., Bishop-Hurley, G., McNally, J., Bagnall, N., Rolland, V., Kusy, B., Petersson, L. and Ingham, A. (2023). CVB: A video dataset of cattle visual behaviors. arXiv. doi: 10.48550/arXiv.2305.16555.


Fig 1: Livestock breeding process.


       
Conventional LHM methods depend on manual measurements and empirical judgments, resulting in high information collection costs, poor real-time performance and low accuracy. In contrast, modern LHM utilizes artificial intelligence technologies, offering automation, efficiency and precision. As research advances, numerous scholars have summarized developments in various dimensions. For instance, Bretas et al., (2024) review precision livestock farming technologies for monitoring grazingland, emphasizing their role in enhancing productivity and sustainability. AlZubi (2023) highlights the transformative role of artificial intelligence in predicting and diagnosing animal diseases, focusing on its potential to improve animal health management, while Antognoli et al., (2025) explore how computer vision improves efficiency and animal welfare in dairy farming. These reviews provide valuable technological insights and lay a methodological foundation for the field.
       
However, despite the thorough exploration of technical applications in existing research, a critical issue is often overlooked: The effectiveness of these technologies is highly dependent on the specific physical scenarios and environmental constraints in which they are deployed. Most reviews juxtapose technologies validated in diverse environments, varying in background, spatial layout and data collection conditions, without a scene-aware classification framework. This approach fails to address the crucial question of “which technology is best suited for which breeding model,” which is vital for practical applications. The limitations arise from the fundamental differences between Captive breeding (e.g., closed barns) and Free-range breeding (e.g., open pastures), including background complexity, animal density, installable equipment and data acquisition methods. These differences directly impact the feasibility, robustness and economic viability of technological solutions.
       
To address the aforementioned research gap, this study systematically reviews and reassesses existing intelligent monitoring research in livestock, focusing on two primary breeding scenarios: Captive and Free-range. The research encompasses various livestock species, including pigs, cattle, sheep and horses and conducts a comprehensive comparative analysis centered on three key monitoring tasks: Livestock identification, Growth estimation and Behavior and Posture detection. By summarizing common challenges and technical requirements across different species, this study provides theoretical foundations and practical guidance for selecting monitoring technologies and constructing systems tailored to various breeding scenarios, thereby promoting the intelligent advancement of precision livestock farming. The main contributions of this paper are as follows:
(1)   We collected and analyzed 207 LHM-related papers from recent years. By organizing and summarizing these resources, we aim to provide readers with the latest trends and research findings in the field of LHM.
(2)   This paper studies LHM in captive breeding and free-range breeding respectively and comprehensively evaluates livestock health status through three key monitoring tasks: livestock identification, growth estimation and behavior and posture detection.
(3)   The study systematically analyzes monitoring tasks and modeling approaches in LHM, revealing challenges such as data heterogeneity and insufficient model generalization. Based on current research trends, this paper describes existing technical limitations along three dimensions: architecture optimization, multimodal fusion and algorithm innovation, providing new directions for the LHM field.
 
Research of livestock health monitoring
 
Scenario of breeding
 
Hunter (1996) introduced the concepts of captive and free-range systems. To better analyze and evaluate existing studies, we integrate these concepts into AI-enabled LHM farming environments and establish a classification framework along four dimensions: spatial characteristics, monitoring equipment type, environmental controllability and data collection characteristics. Details are provided in Table 1.

Table 1: Indicators of captive and free-range breeding in AI-enabled LHM.


       
The study prioritized the type of equipment, followed by spatial features, environmental descriptors (keywords: “barn/fence” vs. “pasture/grazing”) and data collection conditions. When a study contained elements of both scenarios, the classification was determined by the dominant features (>60%) across all four dimensions.
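The dominant-feature rule above amounts to a majority vote over the four dimensions. A minimal sketch, with hypothetical dimension names and labels, might look like:

```python
# Sketch of the scenario-assignment rule: a study is labeled by the scenario
# indicated by >60% of the four classification dimensions; otherwise "mixed".
# Dimension names and example values are illustrative assumptions.
DIMENSIONS = ["equipment", "spatial", "environment", "data_collection"]

def classify_scenario(study, threshold=0.6):
    """Return 'captive', 'free-range', or 'mixed' for a dict mapping
    each dimension to the scenario it indicates."""
    votes = [study[d] for d in DIMENSIONS]
    for scenario in ("captive", "free-range"):
        if votes.count(scenario) / len(votes) > threshold:
            return scenario
    return "mixed"

example = {"equipment": "captive", "spatial": "captive",
           "environment": "captive", "data_collection": "free-range"}
print(classify_scenario(example))  # 3/4 = 75% captive -> "captive"
```

A 2-2 split falls below the 60% threshold and is classified as mixed, matching the rule stated above.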
 
Research of livestock health monitoring
 
We discussed LHM based on two scenarios (Fig 2) and analyzed three key monitoring tasks: livestock identification, growth estimation and behavior and posture detection. First, livestock identification ensures each animal receives individualized attention and records, providing essential data for management. Second, growth estimation involves evaluating body condition, body weight and body size. Body condition detection assesses nutritional status to promote optimal growth and production. Weight detection monitors growth rates and trends, allowing adjustments in feeding and management. Body size detection analyzes physical characteristics, helping breeders implement effective breeding programs. Finally, behavior and posture detection provides insights into activity levels and physiological states, enabling the identification of sick or abnormal livestock.

Fig 2: Research of livestock health monitoring.


       
These three parts are interconnected to form a complete livestock monitoring framework. In both scenarios, combining monitoring technologies with data analysis and intelligent algorithms makes it possible to accurately assess livestock health, promptly identify potential disease risks, plan feeding strategies effectively and improve production efficiency.
 
Research method
 
We conducted a structured search across three platforms-Web of Science Core Collection, DBLP and Google Scholar-covering the period from 2017-01-01 to 2025-10-09. Queries were organized into three thematic blocks: (A) identification (identification/re-identification/face recognition/ ear tag/muzzle, multi-object tracking, counting); (B) growth estimation (body weight/size, morphometry, body condition score, BCS); and (C) Behavior and posture (posture/pose, social behavior, lameness, rumination, estrus, aggression). Each block was combined with species terms (livestock; pig/swine; cattle/cow/bovine; sheep/ovine; goat/caprine; horse/equine) and method terms (deep learning, computer vision, YOLO, transformer, etc.) using the AND operator and further constrained by scenario terms (captive/barn/pen/indoor/slaughterhouse and free range/pasture/grazing/rangeland/paddock/outdoor). The three blocks returned 121, 97 and 88 records in Web of Science; 131, 99 and 80 in Google Scholar; and 103, 81 and 127 in DBLP, respectively.
       
Search results from all platforms were exported and merged, then de-duplicated by DOI and by normalized titles (case-insensitive, punctuation removed). Inclusion criteria required AI/computer-vision–based monitoring of livestock species, publication within the specified time window and accessible full text. After screening, 207 studies were retained for synthesis. Their distribution is shown in Fig 3 and the complete list of all 207 studies is provided in the appendix.
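The DOI/title de-duplication step can be sketched as follows; the record fields (`doi`, `title`) and example entries are illustrative assumptions, not the actual screening tool.

```python
import string

def normalize_title(title):
    """Lowercase, strip punctuation and collapse whitespace for matching."""
    table = str.maketrans("", "", string.punctuation)
    return " ".join(title.lower().translate(table).split())

def deduplicate(records):
    """Keep the first record per DOI; fall back to the normalized title
    when a DOI is missing. Records are dicts with 'doi' and 'title'."""
    seen, unique = set(), []
    for rec in records:
        key = rec["doi"].lower() if rec.get("doi") else normalize_title(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Hypothetical merged export: two DOI duplicates, two title duplicates.
merged = [
    {"doi": "10.1016/j.compag.2023.107667", "title": "Enhanced LiteHRNet..."},
    {"doi": "10.1016/J.COMPAG.2023.107667", "title": "Enhanced LiteHRNet..."},
    {"doi": None, "title": "Sheep Counting Method!"},
    {"doi": None, "title": "sheep counting method"},
]
print(len(deduplicate(merged)))  # 2
```

Normalizing before comparison catches the common case where the same paper appears with different capitalization or punctuation across platforms.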

Fig 3: Distribution of literature on AI-enabled livestock health monitoring (2017-2025).



Livestock identification
 
In livestock breeding, accurate identification of individual animals is the basis of breeding management and disease monitoring. AI-based identification has become the mainstream approach, enabling 24/7 real-time monitoring of livestock and helping managers detect abnormal conditions in time. Accurate recognition can be achieved by extracting the body features of livestock.
 
Captive breeding
 
Captive breeding provides a structured setting where animal behavior can be closely monitored and analyzed. The implementation of fixed surveillance systems, positioned strategically around enclosures, allows for continuous observation and data collection, enabling researchers to gather insights into individual animal behaviors and interactions.
 
Multi-object tracking algorithms: Lu et al., (2024) and Yang et al., (2025) achieved 99.8% and 98.8% tracking accuracy in complex environments and low-light pig farms, respectively, through rotated bounding box detectors and spatiotemporal dynamics analysis. Myat Noe et al., (2023) improved black cattle tracking accuracy to 76.2% via algorithm comparisons.
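Underlying such pipelines is a detection-to-track association step; DeepSORT-style trackers add Kalman prediction and learned appearance features on top of it. A minimal greedy IoU matcher, with purely illustrative boxes and threshold, might look like:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def associate(tracks, detections, min_iou=0.3):
    """Greedily match existing tracks to new-frame detections by best IoU.
    Returns (track_id -> detection_index) matches and unmatched detections."""
    matches, unmatched = {}, set(range(len(detections)))
    for tid, tbox in tracks.items():
        best, best_iou = None, min_iou
        for di in unmatched:
            score = iou(tbox, detections[di])
            if score > best_iou:
                best, best_iou = di, score
        if best is not None:
            matches[tid] = best
            unmatched.discard(best)
    return matches, unmatched

# Hypothetical frame: track 1 overlaps detection 0; detection 1 is a new animal.
tracks = {1: (0, 0, 10, 10), 2: (20, 20, 30, 30)}
detections = [(1, 1, 11, 11), (40, 40, 50, 50)]
m, u = associate(tracks, detections)
print(m, u)  # {1: 0} {1}
```

Unmatched detections typically spawn new tracks, while tracks left unmatched for several frames are retired, which is how identities persist through brief occlusions.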
 
Lightweight biometric recognition: Zheng et al., (2023)’s six-layer convolutional model, Ruchay et al., (2024)’s transfer learning framework and Li et al., (2023)’s MobileViTFace achieved embedded deployment for cattle face (98.37%) and sheep face (97.13%) recognition. Yang et al., (2023) enhanced cattle MOTA to 96.88% using deformable convolutions.
 
Unsupervised feature optimization: Lu et al., (2023) developed a key region module for unsupervised fine-grained feature extraction, achieving 98.37% cattle recognition accuracy.
 
Free-range breeding
 
In free-range breeding scenarios, the identification of livestock requires a more dynamic approach due to the inherent mobility and unpredictability of animals. Scholars emphasize the importance of blending technology with behavioral insights to effectively track animals in open environments.
 
IoT tracking systems: Voulodimos et al., (2010)’s RFID management system and Pretto et al., (2024)’s visual ear tag recognition (89% precision) formed dual-mode individual tracking. Natori et al., (2019) integrated GPS/accelerometers to establish grazing cattle activity models.
 
Drone-based monitoring: Soares et al., (2024) applied the Ford-Fulkerson algorithm to reduce counting errors to 2.34%. Liu et al., (2021) developed a real-time app that compressed transmission latency below 500 ms with intelligent UAV path guidance.
 
Multimodal recognition: Sun et al., (2024) enhanced MixFormer trackers to handle posture variations (75.79% AUC), while Hassan-Vásquez et al., (2022) combined deep learning with GPS for activity boundary mapping.
 
Summary
 
Table 2 summarizes studies on livestock identification tasks (details in the Appendix). Significant differences exist between captive and free-range breeding in the application and development of identification technologies. In confined environments, identification accuracy is generally stable and exceeds 94%, with Odo et al., (2025) achieving up to 95% accuracy using high-frame-rate CCTV. However, this reliance on controlled conditions raises concerns about the generalizability of findings to real-world scenarios.

Table 2: Livestock identification studies.


       
Most studies in confined spaces focus on RGB-D cameras and deep learning algorithms. For instance, Lu et al., (2024) reported 97.5% accuracy under stable conditions, showcasing technological advancements. Nonetheless, limitations such as lighting variations and animal behavior must be acknowledged, as these can affect performance in practical applications.
       
In contrast, accuracy in free-range breeding is more variable, heavily influenced by weather and environmental conditions. Zhang et al., (2022) found a significant decrease in drone monitoring accuracy during adverse weather, while Huang et al., (2023) reported 84% accuracy through multi-sensor data fusion, lower than in captive environments. Addressing the need for robust sensor fusion techniques and adaptive algorithms is essential.
       
In summary, although captive settings foster the development of high-accuracy identification technologies, future research must tackle the limitations in fluctuating conditions. This could involve hybrid approaches that integrate diverse data sources, enhance algorithm adaptability and improve sensor robustness for reliable performance across different livestock species and breeding contexts.
 
Growth estimation
 
The estimation of livestock growth involves detecting body condition, weight and body size, which are key parameters for evaluating their development and health (Machebe et al.,  2016). Regular monitoring of these factors enables breeders to understand livestock growth status and adjust feeding and management practices to ensure adequate nutrition and optimal growth conditions, thus enhancing breeding efficiency and production performance. Fig 4 illustrates two typical AI-based approaches for detecting growth indicators.

Fig 4: (a) Growth estimation based on image analysis, with measurements: 1-9 representing various body dimensions. (b) Growth estimation using 3D stereo reconstruction technique, with measurements a-i representing different body dimensions.


 
Captive breeding
 
In captive breeding, livestock growth is primarily influenced by human management rather than natural environmental factors. This controlled setting allows breeders to monitor growth closely and adjust feeding strategies based on regular measurements of body shape, weight and size to ensure healthy development.
 
Body condition monitoring: Basak et al., (2023) demonstrated 94.6% body composition prediction in pigs using ultrasound-SVR integration, while He et al., (2023a) developed YOLOX-based posture scoring achieving 0.85 F1-score.
 
Contactless weight estimation: LiDAR-PointNet++ systems attained 3.2% prediction error in cattle (Hou et al., 2023), contrasting with RGB-D LiteHRNet’s 14.6% MAPE for sheep (He et al., 2023b).
 
3D morphometrics: Weng et al., (2023) and Ling et al., (2022) established Kinect V2-based pig measurement pipelines (2.1 cm MAE), enhanced by Lu et al., (2025)’s GCN-mesh refinement (3.58% MAPE).
 
Free-range breeding
 
In free-range breeding, livestock growth is more influenced by natural factors than by human intervention. This environment necessitates a different evaluation approach, considering behavioral characteristics, social relationships and autonomous selection abilities to comprehensively assess the health status of livestock.
 
Vision-based systems: Bai et al., (2025) enabled smartphone-based cattle weight estimation (97% precision) through multi-scale fusion networks. Garcia et al., (2021) identified weight anomalies with a gradient boosting model (R² = 0.92).
 
Sensor integration: Vaughan et al., (2017)’s tomography sensors (<1% error) and Simanungkalit et al., (2020)’s walk-over-weighing systems demonstrated 100% repeatability in pasture conditions.
 
Predictive analytics: Abdelhady et al., (2019) achieved 98.75% accuracy in sheep weight prediction using K-means clustering. Sultana et al., (2022) compared three nonlinear models for predicting yearly cattle weight, showing that the Brody model offers better fit statistics for various cattle genotypes in Bangladesh.
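As a minimal illustration of the regression step in such pipelines (Abdelhady et al. combine k-means clustering with multiple linear regression), a single hypothetical predictor can be fitted by ordinary least squares; the girth/weight values below are invented for demonstration only.

```python
def fit_ols(xs, ys):
    """Ordinary least squares for y = a*x + b (single predictor)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical heart-girth (cm) vs. body-weight (kg) pairs.
girth = [120, 140, 160, 180]
weight = [180, 260, 340, 420]
a, b = fit_ols(girth, weight)
print(round(a, 2), round(b, 2))  # 4.0 -300.0

def predict(x):
    """Predict weight (kg) from girth (cm) with the fitted line."""
    return a * x + b

print(predict(150))  # 300.0
```

Published models extend this to multiple body measurements (multiple linear regression) or nonlinear growth curves such as the Brody model mentioned above.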
 
Summary
 
Growth estimation in livestock differs significantly between captive and free-range breeding, each presenting unique challenges for technology application. In captive settings, researchers have achieved over 94% accuracy using advanced imaging and algorithms, thanks to stable monitoring and controlled management (Table 3). Fig 5 highlights that weight measurements account for 36% of growth estimation methodologies in both settings, with body condition assessments contributing 28% in captivity and 20% in free-range environments. The trend indicates an increasing emphasis on optimizing technology for accurate assessments, particularly in captive breeding.

Table 3: Growth estimation of livestock.



Fig 5: Comparison of growth estimation.


       
However, the reliance on controlled environments raises concerns about the transferability of these findings to real-world scenarios, where factors like animal stress and varying behaviors can significantly affect growth outcomes. This emphasizes the need for caution when interpreting results from ideal conditions. Moreover, species-specific differences in growth estimation methods call for tailored approaches. For example, pigs benefit from visual and tactile data, while cattle and sheep typically rely on conventional weight and body condition metrics, indicating that a one-size-fits-all solution may not be effective.
       
Although advancements in sensor technology offer new opportunities for growth estimation, they also introduce complexities that can hinder practical applications. While multiple sensors enhance sensitivity to behavioral changes, they complicate data management, underscoring the preference for simpler monitoring solutions in real-world settings.
 
Behavior and posture detection
 
Monitoring livestock behavior and posture enables breeders to quickly assess health, emotional states and growth, facilitating early detection of diseases or abnormalities. Key posture indicators include standing and lying positions, head posture and leg alignment, which reveal activity levels and comfort, as well as potential musculoskeletal issues. Additionally, behavior patterns serve as vital indicators of health and welfare; timely monitoring can identify abnormal behaviors and early signs of disease or environmental stress, allowing for prompt preventive measures.
 
Captive breeding
 
In captive breeding scenarios, livestock are often affected by spatial limitations and human intervention, and their behavior and posture may exhibit characteristic patterns. Livestock in captive environments often show more regular and repetitive behaviors, such as pacing, standing or lying down. Monitoring the behavior and posture of captive livestock helps evaluate their adaptation to captive conditions, explore their physiological and psychological states and identify potential health issues or behavioral abnormalities.
       
Zia et al., (2023) established the CVB dataset with 502 natural-light videos, recognizing 11 fundamental cattle behaviors through an enhanced SlowFast model (57 fps real-time processing). Zheng et al., (2023) developed a Siam-AM dual-network architecture incorporating attention mechanisms for leg motion tracking, demonstrating 94.7% lameness detection accuracy across three large-scale farms. Gan et al., (2021) and Bati and Ser (2023) addressed social interactions in piglets and ovine stress responses respectively: the former attained F1 = 0.9377 for aggression detection in 8-hour videos using a spatiotemporal CNN, while the latter implemented fear behavior classification through optical flow-CNN fusion. Yan et al., (2024) proposed a dual-modality network with adaptive RGB-optical flow fusion, achieving 77.8% recall for porcine aggression detection in 642 annotated clips. Gao et al., (2023) enhanced the UD-YOLOv5s model via jaw skeleton feature extraction, reaching 86.4% mAP for rumination recognition.
 
Free-range breeding
 
In free-range breeding scenarios, livestock have a larger range of activity and a more natural growth environment, and their behavior and posture may be more diverse and varied. Free-range livestock can exhibit more exploratory and active behaviors, such as foraging, playing or communicating. By monitoring the behavior and posture of free-range livestock, we can gain a more comprehensive understanding of their adaptability to the natural environment, as well as the personality traits and social behavior they exhibit in free activities.
       
Kirsch et al., (2025) integrated a Time-Distributed Residual LSTM-CNN with a bidirectional LSTM to create a hierarchical framework for equine behavior classification (93% cross-validation accuracy across 15 activities). Gu et al., (2023) developed a two-stage recognition system that preliminarily screens six sheep behaviors (including standing/attacking) before VGG network refinement, keeping the model memory footprint under 120 MB. Schmeling et al., (2021) and Li et al., (2024) devised multi-sensor solutions: the former predicted bovine lying patterns using accelerometer-magnetometer-gyroscope fusion, while the latter achieved 93.64% lameness detection accuracy in 330-cow trials through EfficientNet-B0-based hoof-angle analysis. Riaboff et al., (2020) revealed via accelerometer-GPS correlation that 63% of rumination occurs under tree shade in permanent pastures, with feeding frequency increasing by 28% near automated milking systems in temporary grasslands.
 
Summary
 
In captive breeding, the limited range of animal activity allows for easier and more precise monitoring of behavior and posture. In contrast, free-range breeding is significantly influenced by natural conditions, requiring researchers to develop effective monitoring solutions for dynamic environments. As illustrated in Fig 6, the distribution of research on behavior and posture detection varies, with a greater emphasis on behaviors such as feeding and social interactions in free-range settings compared to controlled captive studies (Table 4 in the Appendix).

Fig 6: Behavior and posture papers from the last decade in captive breeding and free-range breeding.



Table 4: Behavior and posture detection of livestock.


       
Technologically, the diversity of detection methods reflects adaptability to different scenarios. Captive breeding benefits from video analysis and deep learning models like YOLO for real-time posture analysis. In free-range contexts, automated sensors and cameras are increasingly used; for example, Gu et al., (2023) employed camera-based deep learning to monitor sheep behavior under variable conditions. Because natural elements affect data quality in free-range settings, integrating multiple sensors like accelerometers and GPS is crucial for improving accuracy.
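As a rough illustration of accelerometer-GPS fusion, the rule-based sketch below labels fixed time windows by combining movement intensity with travel speed. The behavior classes, thresholds and windowing are invented for the example and are far simpler than the learned models cited above:

```python
import numpy as np

def classify_windows(accel, gps_speed):
    """Label fixed windows as resting / grazing / walking by fusing
    accelerometer intensity with GPS-derived speed.
    One GPS speed fix (m/s) is assumed per window; all thresholds
    are illustrative, not taken from any cited study."""
    n = len(gps_speed)
    win = len(accel) // n
    labels = []
    for i in range(n):
        seg = accel[i * win:(i + 1) * win]
        intensity = float(np.std(seg))   # head/neck movement energy
        speed = gps_speed[i]
        if speed > 0.5:
            labels.append("walking")     # sustained travel dominates
        elif intensity > 0.2:
            labels.append("grazing")     # head movement, little travel
        else:
            labels.append("resting")
    return labels
```

Real systems replace these hand-set thresholds with classifiers trained on annotated windows, but the fusion principle, letting GPS disambiguate what the accelerometer alone cannot, is the same.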
       
Species-specific behavioral traits also influence the choice of posture detection technologies. Different species, such as pigs, cattle and sheep, require tailored approaches; for instance, Eisermann et al., (2022) used a dedicated measuring device (the Bite-o-Mat) to assess manipulative behavior in pigs, while cattle behavior detection relies more on video analysis. This specialization enhances monitoring accuracy and efficiency, facilitating the development of tailored management strategies.
 
Prospects
 
Research on AI-based LHM has become a hot topic in academia. Literature analysis shows that in captive breeding, the focus is on high-precision visual recognition and behavior and posture monitoring, utilizing deep learning for accurate individual identification and anomaly detection. In contrast, free-range breeding emphasizes multi-sensor integration and environmental adaptability, using GPS, sensors and long-term dynamic monitoring to navigate complex natural environments. Confined settings thus favor static monitoring technologies, while free-range contexts rely on integrated multi-source data that prioritize system robustness and adaptability, yielding distinct technological strategies for each breeding scenario. We foresee significant advancements in the LHM domain regarding structure, content and technology, leading to new innovations in intelligent animal breeding.
 
Classification method
 
This study examines scenarios in LHM, distinguishing between free-range breeding and captive breeding. Health assessments of livestock in free-range environments help simulate natural ecosystems, enabling a holistic evaluation of their well-being. In contrast, monitoring in captive environments minimizes external influences, yielding more precise data. Future studies could explore different categorizations, such as classifying LHM research based on livestock husbandry purposes:
(1) Meat-producing livestock: Specifically bred for meat production.
(2) Dairy livestock: Primarily serving for milk and dairy product output, e.g., cows, sheep.
(3) Working livestock: Utilized for tasks like plowing and transportation, e.g., horses, cattle.
(4) Fur-producing livestock: Reared for fur or wool extraction.
       
Through this classification method, personalized improvements can be made to the health status of livestock and targeted management suggestions can be provided for various types of animal husbandry, promoting the development and health enhancement of the entire animal husbandry industry.
 
Enhance refined management
 
Captive breeding offers significant opportunities for improved livestock management, particularly in behavior and posture detection. Future research can focus on:
 
(1)   Diversified behavior detection
 
Integrating multiple sensor technologies, such as video surveillance and accelerometers, to develop algorithms that recognize various postures and behaviors, aiding in early detection of health issues and enhancing animal welfare.
 
(2)   Behavior pattern analysis
 
Utilizing machine learning to identify behavior patterns in livestock, creating databases and alert systems that help herders monitor health and status effectively.
       
In free-range breeding, there is untapped potential for precision management, as current research predominantly emphasizes high-accuracy recognition tasks while neglecting growth indicator detection. Future directions may include:
 
(1) Sensor technology
 
Integrating machine learning with sensors (like neck collars or RFID tags) to monitor body size and enable real-time weight tracking.
 
(2) Computer vision technology
 
Employing portable devices for monitoring livestock growth, using smartphones or cameras to capture images for analysis and weight estimation through deep learning.
 
(3) Intelligent devices
 
Developing tools to automatically record and transmit growth data to the cloud for precise analysis through AI algorithms.
 
Technological improvements
 
Remote monitoring and data-driven decision-making
 
Current AI-driven LHM research faces challenges with fragmented monitoring tasks, such as livestock identification, posture detection and weight measurement, leading to inefficiencies and information silos. Future research should focus on developing an integrated monitoring system to streamline these tasks, enhancing data accuracy and decision-making. Such a system would utilize sensor technologies and internet connectivity for real-time health monitoring, enabling farmers to access comprehensive physiological and environmental data anytime. Intelligent decision support systems powered by machine learning can analyze health trends, predict disease outbreaks and provide targeted advice, thus improving productivity while ensuring livestock health.
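The decision-support layer of such a system can be reduced to a minimal sketch: per-animal daily indicators compared against that animal's own rolling baseline. The record fields and deviation thresholds below are hypothetical, chosen only to make the pattern concrete:

```python
from dataclasses import dataclass

@dataclass
class DailyRecord:
    """Aggregated per-animal indicators for one day (fields illustrative)."""
    animal_id: str
    feeding_min: float   # minutes spent at the feeder
    lying_min: float     # minutes spent lying
    steps: int           # pedometer count

def health_alerts(record, baseline):
    """Flag deviations from an animal's own baseline.
    The 30% drop and 50% lying-increase thresholds are illustrative."""
    alerts = []
    if record.feeding_min < 0.7 * baseline.feeding_min:
        alerts.append("reduced feeding")
    if record.lying_min > 1.5 * baseline.lying_min:
        alerts.append("prolonged lying")
    if record.steps < 0.7 * baseline.steps:
        alerts.append("reduced activity")
    return alerts
```

In a deployed system the baseline would be a learned, seasonally adjusted expectation rather than a single reference day, but comparing each animal to itself rather than to herd averages is what makes such rules robust across breeds and environments.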
 
Multimodal data fusion and comprehensive assessment
 
Future LHM research should prioritize multimodal data fusion by integrating data from various sensors, such as images, videos and sounds, for a comprehensive health assessment. For example, analyzing both audio and visual data can help detect abnormal breathing sounds and infer potential health issues. This approach, combined with machine learning, can enhance diagnostic accuracy in livestock health assessment.
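One simple form of such fusion is late fusion: each modality produces its own anomaly score and the scores are combined. The sketch below uses illustrative weights and threshold (in practice these would be learned from labelled data) and degrades gracefully when one modality is missing:

```python
def fuse_scores(audio_score=None, visual_score=None,
                w_audio=0.4, w_visual=0.6, threshold=0.5):
    """Late fusion of per-modality anomaly scores in [0, 1].
    Weights and threshold are illustrative; missing modalities are
    simply dropped and the remaining weights renormalized."""
    scores, weights = [], []
    if audio_score is not None:
        scores.append(audio_score)
        weights.append(w_audio)
    if visual_score is not None:
        scores.append(visual_score)
        weights.append(w_visual)
    if not scores:
        raise ValueError("at least one modality is required")
    fused = sum(s * w for s, w in zip(scores, weights)) / sum(weights)
    return fused, fused >= threshold
```

Early fusion (concatenating raw features before a joint model) can capture cross-modal cues that late fusion misses, such as a cough sound coinciding with a visible posture change, at the cost of needing synchronized multimodal training data.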
 
Enhancing model performance in low data sample scenarios
 
AI-empowered LHM research typically relies on large datasets for image processing and deep learning. However, the scarcity of publicly available livestock datasets and the absence of standard data formats hinder generalization across studies. Future research should explore enhancing model performance in low-sample scenarios through techniques like transfer learning, meta-learning and active learning to leverage existing data effectively. Furthermore, methods such as generative adversarial networks and self-supervised learning can augment datasets, improving diversity and robustness. Combining these advanced technologies will help develop more precise models for livestock health monitoring, addressing challenges related to limited sample data and pushing innovation in AI-empowered LHM technology.
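Before heavier generative approaches, lightweight label-preserving augmentation is usually the first step in low-sample settings. A minimal numpy sketch, with transforms and parameters chosen purely for illustration:

```python
import numpy as np

def augment(image, rng):
    """Generate simple label-preserving variants of one training image
    (H x W x C array): horizontal flip, brightness jitter, and a random
    crop resized back by nearest-neighbour indexing. A minimal stand-in
    for heavier generative augmentation; parameters are illustrative."""
    out = [image]
    out.append(image[:, ::-1])                             # horizontal flip
    out.append(np.clip(image * rng.uniform(0.8, 1.2), 0, 255))  # brightness
    h, w = image.shape[:2]
    ch, cw = int(h * 0.9), int(w * 0.9)                    # 90% crop
    y0 = rng.integers(0, h - ch + 1)
    x0 = rng.integers(0, w - cw + 1)
    crop = image[y0:y0 + ch, x0:x0 + cw]
    ys = np.arange(h) * ch // h                            # nearest-neighbour
    xs = np.arange(w) * cw // w                            # resize indices
    out.append(crop[ys][:, xs])
    return out
```

For livestock imagery specifically, augmentations should respect the label: a horizontal flip preserves a lameness label, but aggressive rotation can distort the very posture cues a detector depends on.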
Conclusion

This paper analyzes over 200 AI studies on LHM in captive and free-range breeding, focusing on three key tasks: livestock identification, growth estimation and behavior detection. It compares monitoring approaches in both scenarios, outlines current challenges in the LHM domain and suggests future research directions for more effective livestock management solutions.
Acknowledgement

This paper was supported by the Fundamental Research Funds for the Central Universities under grant No. N2317005 and the National Natural Science Foundation of China under grant No. 62302086.
 
Disclaimers
 
The views and conclusions expressed in this article are solely those of the authors and do not necessarily represent the views of their affiliated institutions. The authors are responsible for the accuracy and completeness of the information provided, but do not accept any liability for any direct or indirect losses resulting from the use of this content.
 
 The authors declare that there are no conflicts of interest regarding the publication of this article. No funding or sponsorship influenced the design of the study, data collection, analysis, decision to publish, or preparation of the manuscript.

  1. Abdelhady, A.S., Hassanien, A.E., Awad, Y.M., El-Gayar, M. and Fahmy, A. (2019). Automatic sheep weight estimation based on k-means clustering and multiple linear regression. Proceedings of the International Conference on Advanced Intelligent Systems and Informatics 2018. 845: 546-555. doi: 10.1007/978-3-319-99010-1_50.

  2. AlZubi, Ahmad Ali. (2023). Artificial intelligence and its application in the prediction and diagnosis of animal diseases: A review. Indian Journal of Animal Research. 57(10): 1265-1271.  doi: 10.18805/IJAR.BF-1684.

  3. Antognoli, V., Presutti, L., Bovo, M., Torreggiani, D., Tassinari, P. (2025). Computer vision in dairy farm management: A literature review of current applications and future perspectives. Animals. 15(17): 2508. doi: 10.3390/ani15172508.

  4. Bai, L., Guo, C., Song, J. (2025). Cattle weight estimation model through readily photos. Engineering Applications of Artificial Intelligence. 143: 109976. doi:10.1016/j.engappai. 2024. 109976.

  5. Basak, J.K., Paudel, B., Deb, N.C., Kang, D.Y., Moon, B.E., Shihab, A.S., Tae, K.H.  (2023). Prediction of body composition in growing- finishing pigs using ultrasound based back-fat depth approach and machine learning algorithms. Computers and Electronics in Agriculture. 213: 108269. doi: 10. 1016/j.compag.2023.108269.

  6. Bati, C.T. and Ser, G. (2023). SHEEPFEARNET: Sheep fear test behaviors classification approach from video data based on optical flow and convolutional neural networks. Computers  and Electronics in Agriculture. 204: 107540. doi: 10. 1016/j.compag.2022.107540.

  7. Bretas, I.L., Dubeux, J.C.B., Cruz, P.J.R., Oduor, K.T., Queiroz, L.D., Valente, D.S.M. and Chizzotti, F.H.M. (2024). Precision livestock farming applied to grazingland monitoring and management- A review. Agronomy Journal. 116(3): 1164-1186. doi: 10. 1002/agj2.21346.

  8. Cao, Y., Chen, J., Zhang, Z. (2023). A sheep dynamic counting scheme based on the fusion between an improved-sparrow-search YOLOv5x-ECA model and few-shot deepsort algorithm. Computers and Electronics in Agriculture. 206: 107696. doi: 10.1016/j.compag.2023.107696.

  9. Dong, L., Andrea, P., Eric, P., Robert, F., Tomas, N. (2023). Where’s your head at? Detecting the orientation and position of pigs with rotated bounding boxes. Computers and Electronics in Agriculture. 212: 108099. doi: 10.1016/j.compag.2023.108099.

  10. Du, A.,  Guo, H., Lu, J., Su, Y., Ma, Q., Ruchay, A.N., Marinello, F., Pezzuolo, A. (2022). Automatic livestock body measurement based on keypoint detection with multiple depth cameras. Computers and Electronics in Agriculture. 198: 107059. doi: 10.1016/j.compag.2022.107059.

  11. Eisermann, J., Schomburg, H., Knöll, J., Schrader, L. and Patt, A. (2022). Bite-o-Mat: A device to assess the individual manipulative behaviour of group housed pigs. Computers and Electronics in Agriculture. 193: 106708. doi: 10. 1016/j.compag.2022.106708.

  12. Gan, H., Ou, M., Huang, E.,  Xu, C., Li, S., Li, J., Liu, K., Xue, Y. (2021). Automated detection and analysis of social behaviors among preweaning piglets using key point-based spatial and temporal features. Computers and Electronics in Agriculture. 188: 106357. doi: 10.1016/j.compag.2021.106357.

  13. Gao, G., Wang, C., Wang, J., Lv, Y. (2023). UD-YOLOv5s: Recognition of cattle regurgitation behavior based on upper and lower jaw skeleton feature extraction. 2023 NANA. 532-538. doi: 10.1109/NaNA60121.2023.00094.

  14. Garcia, R., Aguilar, J., Toro, M., Jimenez, M. (2021). Weight-identification model of cattle using machine-learning techniques for anomaly detection. 2021 SSCI. 1-7. doi: 10.1109/SSCI50451. 2021.9659840.

  15. Gu, Z., Zhang, H., He, Z., Niu, K. (2023). A two-stage recognition method based on deep learning for sheep behavior. Computers and Electronics in Agriculture. 212: 108143. doi: 10. 1016/j.compag.2023.108143.

  16. Guo, Y., Yu, Z., Hou, Z., Zhang, W., Qi, G. (2023). Sheep face image dataset and DT-YOLOv5s for sheep breed recognition. Computers and Electronics in Agriculture. 211: 108009. doi: 10.1016/j.compag.2023.108009.

  17. Hassan-Vásquez, J.A., Maroto-Molina, F. and Guerrero-Ginel, J.E. (2022). GPS tracking to monitor the spatiotemporal dynamics of cattle behavior and their relationship with feces distribution. Animals. 12(18): 2383. doi: 10.3390/ani12182383.

  18. He, C., Qiao, Y., Mao, R., Li, M., Wang, M. (2023). Enhanced LiteHRNet based sheep weight estimation using RGB-D images. Computers and Electronics in Agriculture. 206: 107667. doi: 10.1016/j.compag.2023.107667.

  19. He, H., Chen, C., Zhang, W., Wang, Z., Zhang, X. (2023). Body condition scoring network based on improved YOLOX. Pattern Analysis and Applications. 26(3): 1071-1087. doi: 10. 1007/s10044-023-01171-x.

  20. Hou, Z., Huang, L., Zhang, Q.,  Miao, Y. (2023). Body weight estimation of beef cattle with 3D deep learning model: PointNet++. Computers and Electronics in Agriculture. 213: 108184. doi: 10.1016/j.compag.2023.108184.

  21. Huang, X., Hu, Z., Wang, X., Yang, X., Zhang, J., Shi, D. (2019). An improved single shot multibox detector method applied in body condition score for dairy cows. Animals. 9(7): 470. doi: 10.3390/ani9070470.

  22. Huang, Y., Xiao, D., Liu, J., Tan, Z., Liu, K., Chen, M. (2023). An improved pig counting algorithm based on yolov5 and deepsort model. Sensors. 23(14): 6309. doi: 10.3390/s23146309.

  23. Hunter, D.L. (1996). Tuberculosis in free-ranging, semi free-ranging and captive cervids. Rev. Sci. Tech. OIE. 15(1): 171-181. doi: 10.20506/rst.15.1.911.

  24. Kirsch, K., Strutzke, S., Klitzing, L., Pilger, F., Thöne-Reineke, C., Hoffmann, G. (2025). Validation of a time-distributed residual LSTM–CNN and BiLSTM for equine behavior recognition using collar-worn sensors. Computers and Electronics in Agriculture. 231: 109999. doi: 10.1016/ j.compag.  2025.109999.

  25. Li, X., Xiang, Y. and Li, S. (2023). Combining convolutional and vision transformer structures for sheep face recognition. Computers and Electronics in Agriculture. 205: 107651. doi: 10.1016/j.compag.2023.107651.

  26. Ling, Y., Jimin, Z., Caixing, L., Xuhong, T. and Sumin, Z. (2022). Point cloud-based pig body size measurement featured by standard and non-standard postures. Computers and Electronics in Agriculture. 199: 107135.

  27. Liu, C., Jian, Z., Xie, M., Cheng, I. (2021). A real-time mobile application for cattle tracking using video captured from a drone. 2021 ISNCC. 1-6. doi: 10.1109/ISNCC52172. 2021.9615648.

  28. Lu, H.,  Zhang, J., Yuan, X., Lv, J., Zeng, Z., Guo, H. and Ruchay, A. (2025). Automatic coarse-to-fine method for cattle body measurement based on improved GCN and 3D parametric model. Computers and Electronics in Agriculture. 231: 110017. doi:  10.1016/j.compag.2025.110017.

  29. Lu, J., Chen, Z., Li, X., Fu, Y., Xiong, X., Liu, X., Wang, H. (2024). ORP- Byte: A multi-object tracking method of pigs that combines Oriented RepPoints and improved Byte. Computers and Electronics in Agriculture. 219: 108782. doi: 10.1016/j.compag. 2024.108782.

  30. Lu, Y., Weng, Z., Zheng, Z., Zhang, Y.,Gong, C. (2023). Algorithm for cattle identification based on locating keyarea. Expert Systems with Applications. 228: 120365. doi: 10.1016/j. eswa.2023.120365.

  31. Machebe, N.S., Ezekwe, A.G., Okeke, G.C. and Banik, S. (2016). Path analysis of body weight in grower and finisher pigs. Indian Journal of Animal Research. 50(5): 794-798. doi: 10.18805/ijar.11319.

  32. Myat Noe, S.,  Zin, T.T., Tin, P. and Kobayashi, I. (2023). Comparing state-of-the-art deep learning algorithms for the automated detection and tracking of black cattle. Sensors. 23(1): 532. doi: 10.3390/s23010532.

  33. Natori, T., Ariyama, N., Tsuichihara, S., Takemura, H., Aikawa, N. (2019). Study of activity collecting system for grazing cattle. 2019 ITC-CSCC. 1-4. doi: 10.1109/ITC-CSCC.2019. 8793451.

  34. Odo, A., McLaughlin, N., Kyriazakis, I. (2025). Re-identification for long-term tracking and management of health and welfare challenges in pigs. Biosystems Engineering. 251: 89-100. doi: 10.1016/j.biosystemseng.2025.02.001.

  35. Pretto, A., Savio, G., Gottardo, F., Uccheddu, F., Concheri, G. (2024). A novel low-cost visual ear tag based identification system for precision beef cattle livestock farming. Information Processing in Agriculture. 11(1): 117-126. doi: 10.1016/j.inpa. 2022.10.003.

  36. Qian, L., Yongsheng, S., Mengyuan, C., Xi, K., Gang, L. (2024). Lameness detection of dairy cows based on key frame positioning and posture analysis. Computers and Electronics in Agriculture. 227: 109537. doi: 10.1016/j.compag.2024. 109537.

  37. Riaboff, L., Couvreur, S., Madouasse, A., Roig-Pons, M. et al. (2020). Use of predicted behavior from accelerometer data combined with GPS data to explore the relationship between dairy cow behavior and pasture characteristics. Sensors. 20(17): 4741. doi: 10.3390/s20174741.

  38. Ruchay, A., Kolpakov, V., Guo, H. and Pezzuolo, A. (2024). On-barn cattle facial recognition using deep transfer learning and data augmentation. Computers and Electronics in Agriculture. 225: 109306. doi: 10.1016/j.compag.2024.109306.

  39. Schmeling, L., Elmamooz, G. and Hoang. (2021). Training and validating a machine learning model for the sensor-based monitoring of lying behavior in dairy cows on pasture and in the barn. Animals. 11(9): 2660. doi: 10.3390/ani11092660.

  40. Shi, J., Chen, X., Zhang, Y.,  Gong, P., Xiong, Y., Shen, M., Norton, T., Gu, X., Lu, M. (2025). Detection of estrous ewes’ tail-wagging behavior in group-housed environments using Temporal- Boost 3D convolution. Computers and Electronics in Agriculture. 234: 110283. doi: 10.1016/j.compag.2025. 110283.

  41. Simanungkalit, G.,  Hegarty, R.S., Cowley, F.C., McPhee, M.J. (2020). Evaluation of remote monitoring units for estimating body weight and supplement intake of grazing cattle. Animal. 14: s332-s340. doi: 10.1017/S1751731120000282.

  42. Soares, V.H.A., Ponti, M.A. and Campello, R.J.G.B. (2024). Multi-attribute, graph-based approach for duplicate cattle removal and counting in large pasture areas from multiple aerial images. Computers and Electronics in Agriculture. 220: 108828. doi: 10.1016/j.compag.2024.108828.

  43. Soares, V.H.A., Ponti, M.A., Gonçalves, R.A.,  Campello, R.J.G.B.  (2021). Cattle counting in the wild with geolocated aerial images in large pasture areas. Computers and Electronics in Agriculture. 189: 106354. doi: 10.1016/j.compag. 2021.106354.

  44. Sultana, N., Khan, M.K.I., Momin, M.M. (2022). Nonlinear models for the prediction of yearly live weight of cattle. Asian Journal of Dairy and Food Research. 41(2): 168-172. doi: 10. 18805/ajdfr.DRF-257.

  45. Sun, Q., Yang, S., Wang, M., Hu, S., Ning, J. (2024). A real-time dairy goat tracking based on MixFormer with adaptive token elimination and efficient appearance update. Computers and Electronics in Agriculture. 218: 108645. doi: 10. 1016/j.compag.2024.108645.

  46. Vaughan, J.,  Green, P.M. and Salter, M., Grieve, B., Ozanyan, K.B. (2017). Floor sensors of animal weight and gait for precision livestock farming. 2017 IEEE SENSORS. 1-3. doi: 10. 1109/ICSENS.2017.8234202.

  47. Voulodimos, A.S., Patrikakis, C.Z., Sideridis, A.B., Ntafis, V.A. and Xylouri, E.M. (2010). A complete farm management system based on animal identification using RFID technology. Computers and Electronics in Agriculture. 70(2): 380-388. doi: 10.1016/j.compag.2009.07.009.

  48. Wang, Z., Zhou, S., Yin, P., Xu, A., Ye, J. (2023). GANPose: Pose estimation of grouped pigs using a generative adversarial network. Computers and Electronics in Agriculture. 212: 108119. doi: 10.1016/j.compag.2023.108119.

  49. Weng, Z., Li, Z. and Zheng, Z. (2023). Three-dimensional point cloud reconstruction and body ruler measurement of pig body under multi-angle KinectV2. BIC 2023. pp-39-40. doi:  10.1145/ 3592686.3592694.

  50. Xu, J., Liu, W., Qin, Y.,  Xu, G. (2022). Sheep counting method based on multiscale module deep neural network. IEEE Access. 10: 128293-128303. doi: 10.1109/ACCESS.2022.3221542.

  51. Yan, K., Dai, B., Liu, H., Yin, Y., Li, X., Wu, R., Shen, W. (2024). Deep neural network with adaptive dual-modality fusion for temporal aggressive behavior detection of group-housed pigs. Computers and Electronics in Agriculture. 224: 109243. doi:  10.1016/j.compag.2024.109243.

  52. Yang, W., Wu, J., Zhang, J., Gao, K., Du, R., Wu, Z., Firkat, E. and Li, D. (2023). Deformable convolution and coordinate attention for fast cattle detection. Computers and Electronics in Agriculture. 211: 108006. doi: 10.1016/j.compag.2023.108006.

  53. Yang, Y., Li, C., Wang, X.,  Zhou, H., Yang, J., Xue, Y. (2025). Automatic recognition of isolated piglet outliers based on multi- object tracking. Computers and Electronics in Agriculture. 235: 110377. doi:  10.1016/j.compag.2025.110377.

  54. Zhang, X., Xuan, C., Ma, Y., Su, H. (2022). An integrated goat head detection and automatic counting method based on deep learning. Animals. 12(14): 1810. doi: 10.3390/ani12141810.

  55. Zheng, Z. and Qin, L. (2023). PrunedYOLO-Tracker: An efficient multi-cows basic behavior recognition and tracking technique. Computers and Electronics in Agriculture. 213: 108172. doi: 10.1016/j.compag.2023.108172.

  56. Zheng, Z., Zhang, X., Qin, L., Yue, S. and Zeng, P. (2023). Cows’ legs tracking and lameness detection in dairy cattle using video analysis and Siamese neural networks. Computers and Electronics in Agriculture. 205: 107618. doi:  10.1016/j.compag. 2023.107618.

  57. Zia, A., Sharma, R., Arablouei, R., Bishop-Hurley, G., McNally, J., Bagnall, N., Rolland, V., Kusy, B., Petersson, L., Ingham, A. (2023). CVB: A video dataset of cattle visual behaviors. arXiv. doi: 10. 48550/arXiv.2305.16555.
Published In
Indian Journal of Animal Research