Source: MICHIGAN STATE UNIV submitted to NRP
AUTOMATED SWINE PHENOTYPING FOR MANAGEMENT AND RESEARCH
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1028818
Grant No.
2022-67021-37858
Cumulative Award Amt.
$590,005.00
Proposal No.
2021-11063
Multistate No.
(N/A)
Project Start Date
Aug 1, 2022
Project End Date
Jul 31, 2026
Grant Year
2022
Program Code
[A1521]- Agricultural Engineering
Recipient Organization
MICHIGAN STATE UNIV
(N/A)
EAST LANSING, MI 48824
Performing Department
ELEC COMP ENGR
Non Technical Summary
There is currently a pressing need for inexpensive, high-throughput phenotyping of pigs that can be applied equally to swine research, production and nucleus farms. We are proposing a hardware and software phenotyping tool that will benefit and bring synergy between three areas: production animal management, animal research and genetic selection breeding. With low investment and operational cost, it will estimate individual animal traits useful for livestock management decisions. For animal researchers it will obtain quantitative traits that can be linked to animal health and production. And breeders will be able to use the tool both in nucleus farms and in production farms, where much larger datasets can be acquired that reflect actual swine production conditions.

Our tool consists of two components. (1) Smart hardware: a networked, multi-modal camera system that can be readily installed in farms and operates with minimal oversight. It acquires real-time color and depth imaging of swine in pens, at feeding and drinking stations and along hallways, and is integrated with a tag-based identification system. It automatically detects swine, tracks them and provides low-level phenotypes including shape scans, posture and joint motions. These low-level phenotypes are key enablers for large-scale, individualized swine data collection, and many tools can be built on them that infer swine health and welfare characteristics. (2) Two software modules that build high-level phenotypes from low-level phenotypes: (a) a body condition estimator module that can replicate the performance of current body condition measures and can also estimate a score that more directly reflects swine welfare and productivity; and (b) a swine interaction module that can automatically detect and annotate pairwise interactions such as fighting and intimidation at the feeder. The ability to automatically identify swine interactions enables adjustment of management strategies as well as research into genetic factors influencing swine interactions.

Our broader goal is that our tool will be easily built on and extended in both production and research farms. There is potential to create additional modules that estimate many more high-level phenotypes from the low-level phenotypes. The two modules we develop can be used as templates for modules that estimate other phenotypes.
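To make the split between the two components concrete, the following is a minimal sketch of how per-animal low-level phenotypes could be packaged and passed to a high-level module such as the body condition estimator. The field names, array shapes and the estimate_body_condition function are illustrative assumptions, not interfaces defined by this project.

from dataclasses import dataclass
from typing import List, Optional
import numpy as np

@dataclass
class LowLevelPhenotype:
    """Per-animal, per-frame output of the low-level pipeline (illustrative)."""
    animal_id: str                    # from the tag-based identification system
    timestamp: float                  # seconds since epoch
    bbox: np.ndarray                  # detection box, shape (4,): x, y, w, h
    mask: np.ndarray                  # instance segmentation mask, shape (H, W), bool
    keypoints_3d: np.ndarray          # 3D joint positions, shape (n_joints, 3), meters
    depth_scan: Optional[np.ndarray] = None  # cropped depth map of the animal's back

def estimate_body_condition(track: List[LowLevelPhenotype]) -> float:
    """Hypothetical high-level module: map a sequence of low-level phenotypes
    for one walking sow to a single body condition score."""
    # A real module would regress a score from 3D shape; this only illustrates
    # the aggregation pattern (many low-level records in, one score out).
    depths = [float(p.depth_scan.mean()) for p in track if p.depth_scan is not None]
    return float(np.mean(depths)) if depths else float("nan")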
Animal Health Component
50%
Research Effort Categories
Basic
0%
Applied
50%
Developmental
50%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
402 | 3510 | 2020 | 100%
Knowledge Area
402 - Engineering Systems and Equipment;

Subject Of Investigation
3510 - Swine, live animal;

Field Of Science
2020 - Engineering;
Goals / Objectives
The project goal is to develop and demonstrate an automated phenotyping tool to advance precision livestock farming. The following are our objectives in reaching the goal:
1. Assemble hardware and data collection software for 10 PhenoKits and deploy them in participating farms.
2. Develop low-level phenotyping capabilities and deploy them on PhenoKits. These include detection, tracking, 3D posture estimation, instance segmentation and long-term sow identity within pens.
3. Integrate low-level phenotypes to estimate sow body condition from walking sows.
4. Integrate low-level phenotypes to quantitatively measure swine activity and interactions at feeders.
Project Methods
Hypothesis 1: Low-level phenotypes can be remotely and automatically estimated for swine in pens and hallways. These phenotypes include swine detection, segmentation, 3D posture estimation, tracking and identity maintenance.
Approach: A combined hardware and software solution called PhenoKits will be constructed, and machine learning techniques applied to estimate these phenotypes.
Evaluation: Manual annotation will provide ground truth for each phenotype, and an array of quantitative measures will be used to assess the phenotypes.
Milestones:
Year 1 Q4: Initial detection, segmentation and tracking of swine
Year 2 Q2: Swine ID maintenance complete and evaluated
Year 2 Q4: All low-level phenotypes complete and evaluated

Hypothesis 2: High-level phenotypes can be remotely and automatically estimated by integrating low-level phenotypes for swine in pens and hallways. The two phenotypes we will address are body condition based on 3D shape and feeder interaction behaviors.
Evaluation: Manual annotation will provide ground truth for each phenotype, and an array of quantitative measures will be used to assess the phenotypes.
Milestones:
Year 3 Q4: Body condition complete and evaluated
Year 3 Q4: Feeder behaviors characterized and evaluated
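As one concrete example of the quantitative measures mentioned above, segmentation masks and joint positions can be scored against the manual annotations with intersection-over-union and mean keypoint error. This is a minimal sketch of that comparison, not the project's full evaluation protocol.

import numpy as np

def mask_iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection-over-union between a predicted and a manually annotated
    boolean segmentation mask of the same shape."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as a perfect match
    return float(np.logical_and(pred, truth).sum() / union)

def mean_keypoint_error(pred_kps: np.ndarray, truth_kps: np.ndarray) -> float:
    """Mean Euclidean distance between predicted and annotated joint positions,
    both of shape (n_joints, dim); dim is 2 for image points or 3 for meters."""
    return float(np.linalg.norm(pred_kps - truth_kps, axis=1).mean())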

Progress 08/01/23 to 07/31/24

Outputs
Target Audience: Nothing Reported
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? Two graduate students are working on the project. One undergraduate student worked on the project during the summer of 2024.
How have the results been disseminated to communities of interest? Nothing Reported
What do you plan to do during the next reporting period to accomplish the goals? We plan to publish our results in conference and journal papers. We also anticipate presenting at AI conferences.

Impacts
What was accomplished under these goals? For goal 1, we have selected the cameras, including both RGB-D and color cameras. Both connect via power-over-Ethernet, which gives flexibility in positioning them in the farm. We created a prototype enclosure, tested its operating temperatures, and found that it overheats; to address this we explored cooling mechanisms and found an effective approach combining a thermoelectric plate, heat sink and fan. We expect to complete the enclosure in the next quarter and mount our cameras in the farm. Our devices will generate a large quantity of multimodal data that needs to be stored, ideally in a standard form accessible with open-source tools. With this in mind, we settled on ROS 2 bags for multimodal data storage and have created storage and reading tools. We deployed one prototype device with a combined RGB-D sensor and color camera in a gilt pen, and are implementing and testing data handling with it.

For goal 2, we have developed a detector and tracker and are working toward long-term tracks of swine. We are also developing a low-level scene flow algorithm for body-part motion analysis. Goals 3 and 4 are still in the planning stages.
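Because the recordings are stored as ROS 2 bags, they can be replayed with the standard rosbag2_py reader. The sketch below assumes the sqlite3 storage plugin and an illustrative bag path; the actual storage plugin and topic layout on the PhenoKits may differ.

import rosbag2_py
from rclpy.serialization import deserialize_message
from rosidl_runtime_py.utilities import get_message

def read_bag(bag_path: str):
    """Iterate over every message in a ROS 2 bag (e.g., color and depth images)."""
    storage_options = rosbag2_py.StorageOptions(uri=bag_path, storage_id="sqlite3")
    converter_options = rosbag2_py.ConverterOptions(
        input_serialization_format="cdr", output_serialization_format="cdr")
    reader = rosbag2_py.SequentialReader()
    reader.open(storage_options, converter_options)

    # Map each topic name to its message type so payloads can be deserialized.
    type_map = {t.name: t.type for t in reader.get_all_topics_and_types()}

    while reader.has_next():
        topic, raw, t_ns = reader.read_next()
        yield topic, t_ns, deserialize_message(raw, get_message(type_map[topic]))

# Example use: count frames per topic in one recording (the path is an assumption).
if __name__ == "__main__":
    counts = {}
    for topic, _, _ in read_bag("phenokit_recording"):
        counts[topic] = counts.get(topic, 0) + 1
    print(counts)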

Publications


Progress 08/01/22 to 07/31/23

Outputs
Target Audience: Nothing Reported
Changes/Problems: There has been a delay in recruiting and bringing on board a PhD student to work on this project. A full-time PhD student started in August 2023, so progress on the hardware and software development has been limited to date.
What opportunities for training and professional development has the project provided? Nothing Reported
How have the results been disseminated to communities of interest? Our publication has disseminated the development plans for remote data collection on animals to the scientific community.
What do you plan to do during the next reporting period to accomplish the goals? We plan to complete goal 1 and make progress on goals 2 and 3.

Impacts
What was accomplished under these goals? For goal 1, we have purchased candidate hardware (a Jetson embedded PC and an Orbbec depth sensor) and have implemented initial data collection. To complete this goal, the software must operate remotely, be thoroughly tested, and then be deployed.
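A minimal capture loop for the prototype could look like the following. The grab_color_and_depth helper is a hypothetical stand-in for the depth sensor's SDK frame-grab call, and the timestamped on-disk layout is an assumption, not the data format actually used on the device.

import time
from pathlib import Path
import numpy as np
import cv2

def grab_color_and_depth():
    """Hypothetical frame-grab helper: in the real system this would wrap the
    depth sensor's SDK and return a (color BGR uint8, depth uint16) image pair."""
    raise NotImplementedError

def record(out_dir: str, duration_s: float = 60.0) -> None:
    """Save timestamped color (PNG) and depth (NPY) frames for later analysis."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    t_end = time.time() + duration_s
    while time.time() < t_end:
        color, depth = grab_color_and_depth()
        stamp_ns = time.time_ns()
        cv2.imwrite(str(out / f"{stamp_ns}_color.png"), color)
        np.save(out / f"{stamp_ns}_depth.npy", depth)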

Publications

• Type: Journal Articles Status: Accepted Year Published: 2023 Citation: Siegford, Janice M., Juan P. Steibel, Junjie Han, Madonna Benjamin, Tami Brown-Brandl, Joao RR Dórea, Daniel Morris, Tomas Norton, Eric Psota, and Guilherme JM Rosa. "The quest to develop automated systems for monitoring animal behavior." Applied Animal Behaviour Science (2023): 106000.