Source: UNIVERSITY OF KENTUCKY submitted to
NRI: INT: AUTONOMOUS UNMANNED AERIAL ROBOTS FOR LIVESTOCK HEALTH MONITORING
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
EXTENDED
Funding Source
Reporting Frequency
Annual
Accession No.
1015403
Grant No.
2018-67021-27416
Project No.
KY.W-2017-08130
Proposal No.
2017-08130
Multistate No.
(N/A)
Program Code
A7301
Project Start Date
Feb 15, 2018
Project End Date
Feb 14, 2022
Grant Year
2018
Project Director
Hoagg, J.
Recipient Organization
UNIVERSITY OF KENTUCKY
500 S LIMESTONE 109 KINKEAD HALL
LEXINGTON, KY 40526-0001
Performing Department
Mechanical Engineering
Non Technical Summary
The goal of this project is to develop and demonstrate a new co-robot system of autonomous unmanned air vehicles (UAVs) for monitoring the health of cattle herds, thus improving management practices. Every year, over 2.5 million U.S. cattle, valued at $1.5 billion, die from health problems. In contrast, only 220,000 cattle are lost annually to predators. Poor livestock health is the single biggest cause of cattle loss--accounting for over 60% of all losses. Improved health monitoring can reduce herd loss and thus help to secure an essential source of food. Unlike poultry and swine, grazing beef cattle spend a significant amount of time outside of confinement, which makes centralized monitoring difficult. Although neckbands for cattle monitoring exist, these devices are expensive, cumbersome, and rarely used in practice--regardless of the size of the operation. We propose a new co-robot multi-UAV system equipped with vision-sensing capabilities that will uniquely identify each cow in a herd and provide daily monitoring of each cow's physical location in pasture as well as key health indicators such as facial features, volume, weight, and physical activity. All of these measurements will be obtained using a group of collaborative UAVs that patrol the herd and use non-invasive measurement methods. Significant advances in computer science, cooperative control, agricultural engineering, and livestock systems management are needed to unlock the potential of co-robot multi-UAV systems for cattle health monitoring and precision livestock management. This project's merits include: 1) new control methods for UAVs to herd and image cattle cooperatively; 2) novel computer vision algorithms to identify, measure, and track cattle in the field; and 3) the first study to quantify the response of livestock to UAVs. We propose a practical two-tier sensing-and-control approach using an observer UAV and multiple worker UAVs.
The higher-altitude observer UAV determines the relative position and orientation of the worker UAVs and cattle. Worker UAVs herd and image cattle using fixed-mount cameras. Computer-vision techniques provide 3-dimensional cow models, facial images, and movement models for each cow. With regard to 3), this study will quantify the response of cattle to UAVs and establish interaction guidelines. One of the novel components is establishing the cattle flight zone for UAVs, which would allow future studies to design sensors for livestock applications more accurately. Observing changes in the physiological and behavioral states of cattle exposed to UAVs will allow any resulting stress to be quantified. This project will make transformational progress on the use of autonomous collaborative UAVs for monitoring cattle welfare and thus improve the security of a critical food resource and the economic outlook for rural beef producers. The significance of beef production for Kentucky cannot be overstated. The proposed co-robot multi-UAV system has the potential to transform beef-production practices for over 1/2 million small-farm producers across the U.S. and 38,000 in Kentucky alone. This technology could reduce the cattle-monitoring burden on these producers, thus making the lives of beef producers a little easier and improving animal welfare. The technology that we will develop also has application to animal science and could be used to study a variety of other livestock (e.g., sheep, goats, and horses) or even wild animals.
Animal Health Component
20%
Research Effort Categories
Basic
40%
Applied
30%
Developmental
30%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
402 | 3399 | 2020 | 40%
402 | 3399 | 2080 | 20%
404 | 3399 | 2020 | 20%
306 | 3310 | 1060 | 20%
Goals / Objectives
The goal of this project is to develop and demonstrate a new co-robot system of autonomous unmanned air vehicles (UAVs) for monitoring the health of small-farm cattle herds, thus improving management practices. We propose a new co-robot multi-UAV system equipped with vision-sensing capabilities that will uniquely identify each cow in a herd and provide daily monitoring of each cow's physical location in pasture as well as key health indicators such as facial features, volume, weight, and physical activity. All of these measurements will be taken using a group of cooperative UAVs that use non-invasive measurement methods. Significant advances in computer science, cooperative control, agricultural engineering, and livestock systems management are needed to unlock the potential of co-robot multi-UAV systems for cattle health monitoring and precision livestock management. Our project team brings expertise in all of these areas. The major objectives of this project are:

Objective 1. Cattle Identification and Facial Imaging. We will develop computer vision techniques that allow a group of autonomous UAVs to uniquely identify each cow in a herd and take daily facial images of each cow. These images can be used to detect illnesses such as pinkeye (infectious bovine keratoconjunctivitis), a painful bacterial infection that can spread quickly through a herd and can reduce weaning weight by at least 18 kg.

Objective 2. Three-Dimensional Scan of the Cattle. We will develop cooperative control and computer vision techniques that allow a group of co-robot UAVs to perform a 3-D scan of each cow in a herd. This 3-D image model could be used to track daily changes in the cow's volume and estimated weight.

Objective 3. Induce and Measure Cattle Motion. We will develop computer vision and cooperative control methods that allow a group of co-robot UAVs to herd a single cow, capture images of that cow, and construct a motion model from those images. This model could be used to track daily changes in the cow's motion and thus help identify potential health issues that affect motion, such as lameness. This task will depend on the findings from the following objective.

Objective 4. Cattle Response to UAVs. We propose experiments to examine cattle response to UAVs. Cattle flight zones will be assessed for various UAV approach angles, directions, and speeds. Responses to multiple UAVs will also be examined. Potential behavioral and physiological changes will be measured.

Objective 5. Evaluation. We will evaluate the accuracy of our system's performance, first under a controlled setting and then in a longitudinal study with a participating beef producer from the Kentucky Cattlemen's Association. We will also develop post-processing tools that transform the raw data into simple forms and metrics that are easily understandable by beef producers.
Project Methods
Methods for Objective 1

We will use a two-stage approach for cattle identification. In the first stage, we will identify individual cattle using ID tags. In the second stage, we will explore advanced pattern-recognition methods to identify cattle using facial images. In the first stage, we plan to read the ID tags using the following steps:

Cattle Detection by the Observer UAV. We will adopt human-pedestrian-detection techniques. These methods can be transferred from human to cattle "pedestrians" by training with cattle data sets. We will apply deep-neural-network models to cattle video or image data sets with a variety of breeds, environments, scales, and lighting conditions.

Flight Control. Once a cow is detected, one worker UAV needs to take a frontal image of its face and ID tag. The worker UAVs must maintain constant attitude and relative position despite wind. Our preliminary experiments suggest that existing rotorcraft inner-loop control methods (e.g., PID) are sufficient. If necessary, we will implement more advanced inner-loop control strategies.

Tag Recognition. Tag recognition is similar to license-plate reading. However, livestock tags come in a variety of shapes, sizes, and colors. If needed, we will develop our own tags that are easy to recognize (e.g., bar-code tags).

In the second stage, we will explore cattle identification using facial images. Inspired by recent advances in human face recognition, we plan to use a deep-learning method to perform facial recognition. However, there are challenges: i) cattle facial images will differ in pose, scale, illumination, and occlusion; and ii) unlike human faces, cattle faces vary little and look similar.

Methods for Objective 2

We will develop vision-based techniques to estimate body size and weight.

3D Morphable Model. Inspired by the successful application of 3D morphable models in human face and body modeling, we plan to create a 3D morphable model of cattle. This model is used to constrain the 3D reconstruction space, and it serves as the means to obtain indirect measurements such as weight. We will start with a single cow, assuming no occlusion. Then, we will study reconstruction in the presence of moderate occlusions. Creating the 3D morphable model requires a database of 3D full-body cattle scans. We estimate that at least 30 different cows of various body shapes and sizes must be scanned to train the morphable model. Since cows cannot be asked to stand still, we plan to use an array of real-time depth sensors and techniques that deal with movement during scanning.

Regression to Body Weight. During the scanning process, we will also collect cattle height, length, and weight measurements. If we assume a linear relationship between these measurements and the principal-component-analysis coefficients of the 3D morphable model, then we can use least-squares regression to estimate body weight. This estimation method has been tested on human bodies to estimate height, hip circumference, and weight with reasonable accuracy.

Outer-Loop Discrete-Time Formation Control. Recently, we developed a new discrete-time formation (DTF) control method for coordinated control of co-robot multi-UAV systems. Most existing formation-control approaches are continuous-time algorithms and do not account for sampled-data effects, which can be significant for applications such as formation flying, where communication and sensing constraints limit the speed with which relative-position data can be obtained. To implement DTF, each UAV requires a measurement of the relative positions and velocities of nearby vehicles, which is provided by the observer UAV. DTF uses this information to achieve formation cohesion while avoiding collisions with other UAVs and nearby obstacles (e.g., cattle).

Methods for Objective 3

An animal's gait provides important cues about its health status and can reveal conditions such as lameness.
Lameness due to foot rot or other minor leg injuries is a production-limiting problem. Studies have shown that producers tend to under-recognize the presence of lameness in their herds when compared to automated detection methods.

Animal Motion Capture. Gait analysis has long been used in both human healthcare and animal science. There are many tools and systems available for human motion capture and gait analysis. Marker-based optical tracking systems are the gold standard for motion capture. In this project, we aim to develop a video-based motion-capture system that can capture an animal's full gait in the wild without using markers. We previously developed a state-of-the-art markerless motion-capture algorithm that uses a single camera. Since our algorithm requires only a surface template and underlying skeleton, we have successfully applied it to animal tracking. In the scope of this project, we plan to develop a silhouette-based tracking algorithm.

Induce Motion. When the UAV system passes over a group of cattle, they may not be walking, which would prevent the UAVs from obtaining motion videos. We therefore will use the worker UAVs to herd the cattle and induce movement: several worker UAVs will move into the cattle's flight zone. We will develop a cooperative-herding guidance algorithm based on DTF. However, there is one important challenge--the worker UAVs must react to the cattle's motion. We do not want the UAVs to get too close to the cattle, but we want them close enough to induce motion.

Methods for Objective 4

Characterizing the UAVs' influence on the typical behavior of cattle (grazing, ruminating, resting, and movement) is an important product of this project. From the introduction of and interaction with the UAVs, the cattle flight zone and potential stress response will be measured. Evaluating cattle response to UAVs requires a multifaceted approach. The trials will be conducted at the University of Kentucky C. Oran Little Research Center, located in Woodford County, KY. Over the duration of the trial, 48 predominantly Angus steers (220 kg body weight (BW)) and 8 cows (600 kg BW) will be assigned to up to 16 different pastures. All cattle used in the trial will have ad libitum access to fresh water and mineral supplement. The trial period for these cattle will range from May to August, as this is the primary grazing season. Depending upon the availability of forages, fall and winter trials would be conducted with a reduced number of animals as well. Shrunk BW will be determined prior to the initial onset and at the conclusion of the trial so that the overall average daily gain can be ascertained. Full BW will be measured throughout the trial periods so that a correlation with body volume can be developed.

Methods for Evaluation

We will evaluate the effectiveness of our approaches in two phases. In the first phase, we will evaluate under controlled settings. Localization accuracy will be validated indoors with motion-capture systems. The effectiveness of identification, 3D measurement (in both static and dynamic poses), and derived indirect measurements (such as weight) will be validated against the ground-truth data obtained during our initial data collection. Standard techniques for validating pattern-recognition systems, such as leave-one-out cross-validation, will be used. Given the typical herd size for small operations, the population size is expected to be between 20 and 100 cows.

In the second phase, we plan to evaluate our proposed techniques using real data collected on working farms. We will utilize the Kentucky Cattlemen's Association and the UK Cooperative Extension Service to identify beef producers who are willing to participate. We plan a longitudinal study in which a small number of cows (e.g., 10) will be tracked over a period of 4 months to assess the effectiveness of our approaches. Producers' feedback will also be solicited.
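As a concrete illustration of the evaluation plan, the sketch below pairs the least-squares weight regression from Objective 2 with the leave-one-out cross-validation named above. All numbers and names are hypothetical, and a single volume predictor stands in for the project's actual regressors (the PCA coefficients of the 3D morphable model):

```python
# Sketch only: leave-one-out validation of a weight-from-volume regression.
# The (volume, weight) pairs below are made up for illustration.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def loo_errors(xs, ys):
    """Leave-one-out absolute prediction errors: refit without each
    animal, then predict that animal's weight from its volume."""
    errs = []
    for i in range(len(xs)):
        a, b = fit_linear(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        errs.append(abs(ys[i] - (a * xs[i] + b)))
    return errs

# Hypothetical (volume m^3, weight kg) pairs for a small herd.
vols = [0.55, 0.62, 0.70, 0.78, 0.85, 0.93]
wts = [310.0, 345.0, 390.0, 430.0, 475.0, 515.0]
errors = loo_errors(vols, wts)
print(sum(errors) / len(errors))  # mean held-out error, in kg
```

The same loop generalizes to multivariate regressors; each animal is held out in turn, mirroring the per-cow validation described for the controlled-setting phase.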

Progress 02/15/19 to 02/14/20

Outputs
Target Audience: The target audience for this project during this reporting period includes researchers in the areas of autonomous systems, UAVs, computer vision, control systems, and livestock systems. The target audience also includes cattle producers. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? This project has provided research training for 6 graduate students (5 Ph.D. and 1 M.S. student) and one undergraduate student. This project has provided valuable interdisciplinary research experience for these students. One Ph.D. student from this project graduated in the fall of 2019, and she is currently in a postdoctoral research position. One M.S. student from this project graduated in the spring of 2020, and he is currently in an industry position. The research conducted in this project will constitute a significant portion of the Ph.D. dissertations for the 4 other graduate students. How have the results been disseminated to communities of interest? Results from this project have been published and/or presented at the 2019 IEEE Conference on Decision and Control and the 2019 ASABE Annual International Meeting. Results from this project were also presented at the 2020 National Robotics Initiative PI Meeting in Arlington, VA. Results from this project have been accepted for publication and presentation at the 2020 American Control Conference, and results are under review for publication in the IEEE Transactions on Control Systems Technology. This project has been featured in the following 3 popular press outlets: 1. CNET Documentary and News Article. "Drones on the farm: Using facial recognition to keep cows healthy". By Molly Price. August 22, 2019. https://www.cnet.com/news/drones-and-facial-recognition-could-help-keep-cows-healthy/ 2. WKYT News Report. "UK Researchers Using Drones to Solve Billion Dollar Cattle Industry Problem". By Adam Burniston. January 29, 2020. https://www.wkyt.com/content/news/UK-researchers-working-to-solve-billion-dollar-cattle-industry-problem-with-drones-567397761.html 3. Spectrum News 1 Report. "Drone Research at The University of Kentucky Could Rescue Cattle Industry". By Crystal Sicard. March 8, 2020. https://spectrumnews1.com/ky/lexington/news/2020/03/08/drones-save-cows- In addition, results from this project were presented at a variety of extension and outreach activities. Please see Other Products for details. At these events, we made contact with approximately 400 producers. Finally, this project was featured on a University of Kentucky Cooperative Extension video. Please see Other Products for details. What do you plan to do during the next reporting period to accomplish the goals? During the next reporting period, we plan to make progress against Objectives 1--4. In particular, we plan to conduct additional cattle response studies (Objective 4). We plan to develop the integrated outdoor multi-UAV system and conduct outdoor UAV tests (Objectives 2 and 3). We plan to build a database of cattle facial images (Objective 1). We also plan to further develop the generative three-dimensional cow model.

Impacts
What was accomplished under these goals? During this reporting period, we made 5 major accomplishments, which contribute to Objectives 1--4. These accomplishments are as follows: 1. Facial Imaging (Objective 1). We developed a new facial-recognition algorithm for cows. Compared to existing methods, this new algorithm significantly improves recognition accuracy by adopting a recently developed deep-learning-based framework. A paper on this work has been drafted and submitted; it is currently under review. 2. Three-Dimensional Scan of Cattle from UAVs (Objective 2). During the previous reporting period, we conducted experiments to determine optimal flight paths for 3 UAVs to simultaneously image a cow. We collected 108 images along 9 flight paths. During this reporting period, we processed and analyzed these data. The image sets (which were collected around two life-sized cow statues) were subdivided to represent flight paths. These flight paths varied in radius and in elevation above ground level relative to the cow. The images were processed to generate 3D point clouds, which, in turn, were used to estimate the cow's volume. Analysis revealed an optimal set of flight paths as well as several paths that produce inaccurate 3D models. In general, closer radii and higher elevations produced the most consistent volume estimates. 3. Generative Three-Dimensional Cow Model (Objective 2). We built a 48-camera imaging setup, and we are in the process of collecting a three-dimensional cow image data set under field conditions. The existing data set has been used successfully to create 3D cattle models. However, we plan to capture many more cattle images in the coming months to create a complete 3D cattle database, which can be used to build a regression model for cattle. 4. Relative-to-Target UAV Formation Control Method and Experiments (Objectives 1--3). We continued to develop and improve our new relative-to-target (R2T) formation control algorithm, which positions UAVs in a desired formation around a cow to obtain images simultaneously from different angles. This R2T formation control method allows the formation to rotate as the imaging target (e.g., a cow) changes its orientation. We have conducted extensive indoor and outdoor flight tests to validate this R2T formation control method. 5. Evaluate Cattle Response to UAVs (Objective 4). We conducted experiments to evaluate the potential stress induced by UAVs flying near cattle. The cattle-UAV interaction for two different groups of cattle was evaluated. The first set contained 18 bred beef heifers that were approximately 18 months old, while the second set was composed of 16 weaned beef heifers that were 7 to 8 months old. For both sets of animals, the two UAV flight treatments used circular and grid flight patterns. Circular flight patterns represent direct animal monitoring, while grid flight patterns represent the flights that are conducted for pasture monitoring. UAV flights were conducted at 9.1 m above ground level (AGL), and each treatment lasted for about five (5) minutes. For each set of heifers, a total of 120 flights were conducted over a four-week period. Heifer heart rate and movement rate were measured preflight and during UAV flight, in beats per minute (bpm) and meters per second (mps) at 1 Hz, to measure physiological and behavioral responses, respectively. At the end of the study, the mean heifer preflight heart rate was 57 - 80 bpm, and the UAV-flight heart rate was 58 - 82 bpm. The mean heifer preflight movement rate was 0.00 - 0.14 mps, and the UAV-flight movement rate was 0.00 - 0.07 mps. The studies demonstrated that beef heifers were not stressed by UAV flights, as evidenced by no significant change in heart rate or movement rate during UAV flights.
The use of UAVs as a cattle health monitoring tool does not induce stress in beef cattle raised on pasture.
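The relative-to-target (R2T) formation idea described above can be sketched in a simplified planar form: desired UAV positions are a fixed offset pattern that rotates with the target's heading and translates with its position. The function name, offsets, and numbers below are illustrative only, not the project's actual algorithm:

```python
import math

# Sketch of the R2T formation geometry: body-frame offsets are rotated
# by the target's heading and translated to the target's position, so
# the formation turns as the cow turns. All values are hypothetical.

def r2t_positions(target_xy, heading_rad, offsets):
    """Return desired world-frame UAV positions around the target."""
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    tx, ty = target_xy
    return [(tx + c * ox - s * oy, ty + s * ox + c * oy)
            for ox, oy in offsets]

# Three UAVs spaced 120 degrees apart on a 5 m circle around the cow.
offsets = [(5 * math.cos(a), 5 * math.sin(a))
           for a in (0.0, 2 * math.pi / 3, 4 * math.pi / 3)]

# As the cow's heading changes, the desired formation rotates with it.
print(r2t_positions((10.0, 4.0), math.pi / 2, offsets))
```

The actual controller must also drive each UAV to its slot while avoiding collisions; this fragment shows only the rotating-frame geometry that distinguishes R2T from a fixed formation.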

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Z. S. Lippay and J. B. Hoagg. Formation control in a rotating frame for agents with double integrator dynamics: Theory and rotorcraft experiments, Proc. Conf. Dec. Contr., Nice, France, December 2019. DOI: 10.1109/CDC40024.2019.9029757
  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2020 Citation: Z. S. Lippay and J. B. Hoagg. Leader-following formation control with time-varying formations and bounded controls for agents with double integrator dynamics, Proc. Amer. Contr. Conf., Denver, CO, July 2020.
  • Type: Journal Articles Status: Under Review Year Published: 2020 Citation: Z. S. Lippay and J. B. Hoagg. Formation control with time-varying formations, bounded controls, and collision avoidance, IEEE Transactions on Control Systems Technology, (under review).
  • Type: Theses/Dissertations Status: Published Year Published: 2020 Citation: Pampolini, L.F. 2020. An Assessment of 2D and 3D Spatial Accuracy of Photogrammetry for Livestock Health Monitoring. MS Thesis. University of Kentucky.
  • Type: Conference Papers and Presentations Status: Other Year Published: 2019 Citation: Abdulai, G., Sama, M., Jackson, J.J. 2019. Low cost GPS receiver accuracy for cattle monitoring. ASABE Annual International Meeting, Boston, MA, July 7-10.
  • Type: Conference Papers and Presentations Status: Other Year Published: 2020 Citation: J. B. Hoagg, J. J. Jackson, M. P. Sama, and R. Yang. Autonomous unmanned aerial robots for livestock health monitoring, 2020 National Robotics Initiative Principal Investigators Meeting, Arlington, VA, February 2020.


Progress 02/15/18 to 02/14/19

Outputs
Target Audience: The target audience for this project during this reporting period includes researchers in the areas of autonomous systems, UAVs, computer vision, control systems, and livestock systems. The target audience also includes cattle producers. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? This project has provided research training for 5 graduate students and one undergraduate student. The research conducted in this project will constitute a significant portion of the Ph.D. dissertations for 4 of these graduate students. How have the results been disseminated to communities of interest? Results from this project have been submitted for publication and presentation at the 2019 IEEE Conference on Decision and Control. Results from this project were also presented at the 2018 National Robotics Initiative PI Meeting in Washington, D.C. In addition, results from this project were presented at the 2018 Beef Bash in Kentucky on September 20, 2018. This event was attended by 400 producers, and we made contact with approximately 120 producers. What do you plan to do during the next reporting period to accomplish the goals? During the next reporting period, we plan to make progress against Objectives 1--4. In particular, we plan to conduct additional cattle response studies (Objective 4). We plan to conduct outdoor UAV tests (Objectives 2 and 3). We plan to build a database of cattle facial images (Objective 1). We also plan to further develop the generative three-dimensional cow model.

Impacts
What was accomplished under these goals? During this reporting period, we made 4 major accomplishments, which contribute to Objectives 1--4. These accomplishments are as follows: Evaluate Cattle Response to UAVs (Objective 4). We conducted experiments to evaluate the potential stress induced by UAVs flying near cattle. A total of 20 dairy heifers (2 heifers per 2-acre pasture) were subjected to 3 different treatments of UAV flight: i) 18 m AGL grid over field (pasture monitoring); ii) 7.6-9.1 m AGL grid over field (lower altitude); and iii) 7.6-9.1 m AGL circular flight around a cow. For each trial, we conducted 5-min flights per pasture with an average speed of 2.3 m/s. The animals' behavioral responses were measured with Land Air Sea® trackers, and the animals' heart-rate responses were measured with a Polar® H10 monitor and Polar® Equine electrode set. These experiments demonstrated no significant behavioral changes. Moreover, the heart rate during flight was comparable to the heart rate before the flight. Three-Dimensional Scan of Cattle from UAVs (Objective 2). We conducted experiments to determine optimal flight paths for 3 UAVs to simultaneously image a cow. We collected 108 images along 9 flight paths. Flight paths were programmed into the UAV autopilot, and each flight was repeated 3 times. Three-dimensional models were generated using Pix4Dmapper. Three paths were chosen from the 9 to form an individual treatment. The experiments and analyses demonstrate that three-dimensional models degrade if the number of images is reduced. Certain combinations resulted in entire sets of images being dropped from analysis because of an inadequate number of tie points. Generative Three-Dimensional Cow Model (Objectives 1 and 2). We designed a generative three-dimensional cow model that will be used to estimate cow volume and weight. To obtain data for the generative 3D cow model, we built a multi-camera imaging setup, and we are in the process of collecting a three-dimensional cow image data set. The generative cow model is able to generate a variety of cow shapes and poses with relatively few tuning parameters. Relative-to-Target UAV Formation Control Method and Experiments (Objectives 1--3). We developed a new relative-to-target (R2T) formation control algorithm that positions UAVs in a desired formation around a cow to obtain images simultaneously from different angles. This R2T formation control method allows the formation to rotate as the imaging target (e.g., a cow) changes its orientation. We have conducted indoor flight tests to validate this R2T formation control method.

Publications

  • Type: Conference Papers and Presentations Status: Under Review Year Published: 2019 Citation: Z. S. Lippay and J. B. Hoagg. "Formation Control in a Rotating Coordinate Frame for Agents with Double Integrator Dynamics." Proc. Conf. Dec. Contr., 2019 (under review).