Non Technical Summary
The goal of this project is to develop and demonstrate a new co-robot system of autonomous unmanned air vehicles (UAVs) for monitoring the health of cattle herds, thus improving management practices. Every year, over 2.5 million U.S. cattle, valued at $1.5 billion, die from health problems. In contrast, only 220,000 cattle are lost annually to predators. Poor livestock health is the single biggest cause of cattle loss--accounting for over 60% of all losses. Improved health monitoring can reduce herd loss and thus help to secure an essential source of food. Unlike poultry and swine, grazing beef cattle spend a significant amount of time outside of confinement, which makes centralized monitoring difficult. Although neckbands for cattle monitoring exist, these devices are expensive, cumbersome, and rarely used in practice--regardless of the size of the operation.

We propose a new co-robot multi-UAV system equipped with vision-sensing capabilities that will uniquely identify each cow in a herd and provide daily monitoring of each cow's physical location in pasture as well as key health indicators such as facial features, volume, weight, and physical activity. All of these measurements will be obtained by a group of collaborative UAVs that patrol the herd and use non-invasive measurement methods. Significant advances in computer science, cooperative control, agricultural engineering, and livestock systems management are needed to unlock the potential of co-robot multi-UAV systems for cattle health monitoring and precision livestock management.

This project's merits include: 1) new control methods for UAVs to herd and image cattle cooperatively; 2) novel computer vision algorithms to identify, measure, and track cattle in the field; and 3) the first study to quantify the response of livestock to UAVs. We propose a practical two-tier sensing-and-control approach using an observer UAV and multiple worker UAVs.
The higher-altitude observer UAV determines the relative position and orientation of the worker UAVs and cattle. Worker UAVs herd and image cattle using fixed-mount cameras. Computer-vision techniques provide 3-dimensional cow models, facial images, and movement models for each cow. With regard to 3), this study will quantify the response of cattle to UAVs and establish interaction guidelines. One novel component is establishing the cattle flight zone for UAVs, which would allow future studies to more accurately design sensors for livestock applications. Observing changes in the physiological and behavioral states of cattle exposed to UAVs will allow potential stress to be assessed.

This project will make transformational progress on the use of autonomous collaborative UAVs for monitoring cattle welfare and thus improve the security of a critical food resource and the economic outlook for rural beef producers. The significance of beef production for Kentucky cannot be overstated. The proposed co-robot multi-UAV system has the potential to transform beef-production practices for over half a million small-farm producers across the U.S. and 38,000 in Kentucky alone. This technology could reduce the cattle-monitoring burden on these producers, making their lives a little easier and improving animal welfare. The technology that we will develop also has applications in animal science and could be used to study a variety of other livestock (e.g., sheep, goats, and horses) or even wild animals.
Goals / Objectives
The goal of this project is to develop and demonstrate a new co-robot system of autonomous unmanned air vehicles (UAVs) for monitoring the health of small-farm cattle herds, thus improving management practices. We propose a new co-robot multi-UAV system equipped with vision-sensing capabilities that will uniquely identify each cow in a herd and provide daily monitoring of each cow's physical location in pasture as well as key health indicators such as facial features, volume, weight, and physical activity. All of these measurements will be taken by a group of cooperative UAVs that use non-invasive measurement methods. Significant advances in computer science, cooperative control, agricultural engineering, and livestock systems management are needed to unlock the potential of co-robot multi-UAV systems for cattle health monitoring and precision livestock management. Our project team brings expertise in all of these areas. The major objectives of this project are:

Objective 1. Cattle Identification and Facial Imaging. We will develop computer vision techniques that allow a group of autonomous UAVs to uniquely identify each cow in a herd and take daily facial images of each cow. These images can be used to detect illnesses such as pinkeye (infectious bovine keratoconjunctivitis), a painful bacterial infection that can quickly spread through a herd and can reduce weaning weight by at least 18 kg.

Objective 2. Three-Dimensional Scan of the Cattle. We will develop cooperative control and computer vision techniques that allow a group of co-robot UAVs to perform a 3-D scan of each cow in a herd. This 3-D model could be used to track daily changes in the cow's volume and estimated weight.

Objective 3. Induce and Measure Cattle Motion. We will develop computer vision and cooperative control methods that allow a group of co-robot UAVs to herd a single cow, capture images of that cow, and construct a motion model from those images. This model could be used to track daily changes in the cow's motion and thus help identify potential health issues that affect motion, such as lameness. This objective will depend on the findings from Objective 4.

Objective 4. Cattle Response to UAVs. We propose experiments to examine cattle response to UAVs. Cattle flight zones will be assessed for various UAV approach angles, directions, and speeds. Responses to multiple UAVs will also be examined. Potential behavioral and physiological changes will be measured.

Objective 5. Evaluation. We will evaluate the accuracy of our system's performance, first under a controlled setting and then in a longitudinal study with a participating beef producer from the Kentucky Cattlemen's Association. We will also develop the post-processing tools needed to transform the raw data into simple forms and metrics that are easily understandable by beef producers.
Methods for Objective 1

We will use a two-stage approach for cattle identification. In the first stage, we will identify individual cattle using ID tags. In the second stage, we will explore advanced pattern recognition methods to identify cattle using facial images. In the first stage, we plan to read the ID tags using the following steps:

Cattle Detection by the Observer UAV. We will adopt human-pedestrian-detection techniques. These methods can be transferred from human to cattle "pedestrians" by training with cattle data sets. We will apply deep neural network models to cattle video or image data sets spanning a variety of breeds, environments, scales, and lighting conditions.

Flight Control. Once a cow is detected, one worker UAV needs to take a frontal image of its face and ID tag. The worker UAVs must maintain constant attitude and relative position despite wind. Our preliminary experiments suggest that existing rotorcraft inner-loop control methods (e.g., PID) are sufficient. If necessary, we will implement more advanced inner-loop control strategies.

Tag Recognition. Tag recognition is similar to license-plate reading. However, livestock tags come in a variety of shapes, sizes, and colors. If needed, we will develop our own tags that are easy to recognize (e.g., bar-code tags).

In the second stage, we will explore cattle identification using facial images. Inspired by recent advances in human face recognition, we plan to use a deep learning method to perform facial recognition. However, there are challenges: i) cattle facial images will differ in pose, scale, illumination, and occlusion; and ii) unlike the widely varying appearance of human faces, cattle faces look similar to one another.

Methods for Objective 2

We will develop vision-based techniques to estimate body size and weight.

3D Morphable Model. Inspired by the successful application of 3D morphable models in human face and body modeling, we plan to create a 3D morphable model of cattle.
This model is used to constrain the 3D reconstruction space, and it serves as the means to obtain indirect measurements such as weight. We will start with a single cow, assuming no occlusion. Then, we will study reconstruction in the presence of moderate occlusions. Creating the 3D morphable model requires a database of 3D full-body cattle scans. We estimate that at least 30 different cows of various body shapes and sizes must be scanned to train the morphable model. Since cows cannot be asked to stand still, we plan to use an array of real-time depth sensors and techniques that deal with movement during scanning.

Regression to Body Weight. During the scanning process, we will also collect cattle height, length, and weight measurements. If we assume a linear relationship between these measurements and the principal-component-analysis coefficients of the 3D morphable model, then we can use least-squares regression to estimate body weight. This estimation method has been tested on human bodies to estimate height, hip circumference, and weight with reasonable accuracy.

Outer-Loop Discrete-Time Formation Control. Recently, we developed a new discrete-time formation (DTF) control method for coordinated control of co-robot multi-UAV systems. Most existing formation-control approaches are continuous-time algorithms and do not account for sampled-data effects, which can be significant for applications such as formation flying, where communication and sensing constraints limit the rate at which relative position data can be obtained. To implement DTF, each UAV requires a measurement of the relative positions and velocities of nearby vehicles, which is provided by the observer UAV. DTF uses this information to achieve formation cohesion and collision avoidance with other vehicles and nearby obstacles (e.g., cattle).

Methods for Objective 3

An animal's gait provides important cues about its health status and can reveal conditions such as lameness.
Lameness due to foot rot or other minor leg injuries is a production-limiting problem. Studies have shown that producers tend to under-recognize the presence of lameness in their herds when compared to automated detection methods.

Animal Motion Capture. Gait analysis has long been used in both human healthcare and animal science. There are many tools and systems available for human motion capture and gait analysis. Marker-based optical tracking systems are the gold standard for motion capture. In this project, we aim to develop a video-based motion capture system that can capture an animal's full gait in the wild without using markers. We previously developed a state-of-the-art markerless motion capture algorithm that utilizes a single camera. Since our algorithm requires only a surface template and an underlying skeleton, we have successfully applied it to animal tracking. Within the scope of this project, we plan to develop a silhouette-based tracking algorithm.

Induce Motion. When the UAV system passes over a group of cattle, they may not be walking, which would prevent the UAVs from obtaining motion videos. We therefore will use the worker UAVs to herd the cattle and induce movement. Several worker UAVs will move into the cattle's flight zone to induce motion. We will develop a cooperative-herding guidance algorithm based on DTF. However, there is one important challenge--the worker UAVs must react to the cattle's motion. We do not want the UAVs to get too close to the cattle, but we want them close enough to induce motion.

Methods for Objective 4

Characterizing the UAVs' influence on the typical behavior of cattle (grazing, ruminating, resting, and movement) is an important outcome of this project. Upon the introduction of and interaction with the UAVs, the cattle flight zone and potential stress responses will be measured. Evaluating cattle response to UAVs requires a multifaceted approach. The trials will be conducted at the University of Kentucky C. Oran Little Research Center, located in Woodford County, KY. Over the duration of the trial, 48 predominantly Angus steers (220 kg body weight (BW)) and 8 cows (600 kg BW) will be assigned to up to 16 different pastures. All cattle used in the trial will have ad libitum access to fresh water and mineral supplement. The trial period for these cattle will range from May to August, as this is the primary grazing season. Depending upon the availability of forages, fall and winter trials would be conducted with a reduced number of animals. Shrunk BW will be determined prior to the onset and at the conclusion of the trial so that average daily gain over the experiment can be ascertained. Full BW will be measured throughout the trial periods so that a correlation to body volume can be developed.

Methods for Evaluation

We will evaluate the effectiveness of our approaches in two phases. In the first phase, we will evaluate under controlled settings. Localization accuracy will be validated indoors with motion-capture systems. The effectiveness of identification, 3D measurement (in both static and dynamic poses), and derived indirect measurements (such as weight) will be validated against the ground-truth data obtained during our initial data collection. Standard techniques for validating pattern recognition systems, such as leave-one-out cross-validation, will be used. Given the typical herd size for small operations, the population size is expected to be between 20 and 100 cows.

In the second phase, we plan to evaluate our proposed techniques using real data collected on working farms. We will work with the Kentucky Cattlemen's Association and the UK Cooperative Extension Service to identify beef producers who are willing to participate. We plan a longitudinal study in which a small number of cows (e.g., 10) will be tracked over a period of 4 months to assess the effectiveness of our approaches. Producers' feedback will also be solicited.
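As a rough sketch of the leave-one-out cross-validation procedure in the first evaluation phase, the snippet below holds out each sample in turn, trains on the remainder, and scores the held-out prediction. The feature vectors, labels, and nearest-centroid classifier here are hypothetical stand-ins for whichever identification model is ultimately evaluated; they are not part of the proposed system.

```python
import numpy as np

def loocv_accuracy(features, labels):
    """Leave-one-out cross-validation: hold out each sample in turn,
    fit on the remaining samples, and count correct predictions."""
    n = len(features)
    correct = 0
    for i in range(n):
        train_idx = [j for j in range(n) if j != i]
        train_X, train_y = features[train_idx], labels[train_idx]
        # Placeholder model: assign the held-out sample to the nearest
        # class centroid in feature space.
        centroids = {c: train_X[train_y == c].mean(axis=0)
                     for c in np.unique(train_y)}
        pred = min(centroids,
                   key=lambda c: np.linalg.norm(features[i] - centroids[c]))
        correct += int(pred == labels[i])
    return correct / n

# Synthetic example: two well-separated "cows", 10 feature vectors each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (10, 3)),
               rng.normal(5.0, 0.1, (10, 3))])
y = np.array([0] * 10 + [1] * 10)
acc = loocv_accuracy(X, y)  # perfectly separable data, so accuracy is 1.0
```

With a herd of 20 to 100 cows, this exhaustive hold-one-out scheme remains cheap to run and uses every labeled sample for both training and testing.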