Source: UNIVERSITY OF ARKANSAS submitted to NRP
CENTER FOR SCALABLE AND INTELLIGENT AUTOMATION IN POULTRY PROCESSING (CSIAPP)
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1029956
Grant No.
2023-70442-39232
Cumulative Award Amt.
$5,000,000.00
Proposal No.
2022-11486
Multistate No.
(N/A)
Project Start Date
Feb 1, 2023
Project End Date
Jan 31, 2027
Grant Year
2023
Program Code
[A1731]- Meat and Poultry Processing and Food Safety
Recipient Organization
UNIVERSITY OF ARKANSAS
(N/A)
FAYETTEVILLE, AR 72703
Performing Department
(N/A)
Non Technical Summary
Background: The poultry industry is the largest meat industry in the world, and the United States is the world's No. 1 producer of poultry meat. The consumption of chicken products has steadily increased in recent decades, and this demand will likely continue into the foreseeable future. While the current poultry industry is centralized and designed to produce food efficiently, several operations such as meat deboning rely heavily on manual labor. The COVID-19 pandemic demonstrated that this reliance on manual labor makes the system vulnerable to disruptions. Manufacturing tasks in these facilities required many workers to stand side by side, without the ability to telework or operate equipment remotely. During the pandemic, infection spread quickly among meat processing workers, disrupting the supply chain. High human-food contact can also lead to cross-contamination, resulting in food safety recalls. The poultry and meat industry currently faces unprecedented challenges in labor shortages and in food and worker safety.

The meat processing industry stands to benefit by more fully embracing transformative technical principles such as sensing, advanced robotics, and artificial intelligence. That said, the current capabilities of robotics and automation cannot yet compete with the dexterity and flexibility of human workers. Animals are highly variable, requiring intelligent and adaptive automation to handle soft and variable meat tissues. With the U.S. meat manufacturing industry gradually recovering from the COVID-19 pandemic, now is the opportune time for the meat processing industry to reinvent itself and play a major role in addressing global protein needs, increasing processing efficiency, minimizing meat quality loss, alleviating the pressure of labor shortages, protecting worker safety, and improving worker welfare and the work environment.

Overall goals and objectives: The vision of the Center for Scalable and Intelligent Automation in Poultry Processing (CSI-APP) is to incorporate advanced technologies in robotics, artificial intelligence, digital sensing, biosensing, and food safety to provide the U.S. poultry processing industry with scalable and intelligent solutions to meet the rising national and global demand for poultry products. The long-term goal focuses on transforming current mass manufacturing protocols in large, centralized processing plants to "mass customization" protocols suitable for processing plants at different scales, overcoming the inherent variability associated with raw biological materials and humans. Large-scale individualization can be achieved economically through the integration of digital and physical systems ("Industry 4.0" principles). In pursuit of this vision and long-term goal, CSI-APP will strategically target value creation and technological innovation by performing focused engineering research and extension activities under four unifying objectives:

Objective 1: Scalable poultry manufacturing. The team will create a scalable, plant-ready, intelligent robotic deboning system capable of performing at parity with (or even exceeding) human deboners on the most skilled task in the plant: shoulder cutting of front-halves. Artificial intelligence algorithms will be developed to handle the high biological variability of meat.

Objective 2: Virtual reality-based workforce transformation. The labor shortage is a major challenge for the meat industry. It takes considerable time to train an individual to perform dexterous jobs like meat deboning. High line speeds in a cold, humid environment lead to injuries, which worsen labor shortages. During the pandemic, infection spread quickly among meat processing workers, disrupting the supply chain. Virtual reality can transform, diversify, and distribute the workforce in space and time. Using the proposed VR technology, a worker will be able to stay in a comfortable environment and virtually operate a robot to debone meat in a processing plant remotely. This has the potential to reduce the labor shortage and create job opportunities everywhere, including rural areas.

Objective 3: Sensor- and robot-based product evaluation and bio-mapping for enhancing food quality and safety. A mobile robotic platform containing biosensors for rapid estimation of bacteria will be developed. The biosensors will provide initial bio-mapping of bacteria in the processing plant and identify the best areas to collect swab samples from product and environmental surfaces for food safety evaluations. The final bio-map will be used to guide sanitation and management decisions. An imaging system will also be developed for detecting foreign objects, such as small plastics in meat, and for food quality evaluation.

Objective 4: Research and extension integration: create an innovation ecosystem through technology development/transfer and workforce education. Research and extension activities will be integrated, accelerating technology transformation to better meet stakeholders' needs. Planned activities include surveys to identify barriers, workshops for disseminating information about advanced technologies, demonstration exhibits at industry conferences, and one-on-one technical support for industries considering implementation of these technologies.

Expected outcomes: CSI-APP is structured to (1) enhance the robustness and scalability of precision manufacturing in meat processing and chicken deboning; (2) distribute the workforce in space and time using virtual reality systems; (3) improve food quality and safety in processing plants using intelligent automation, real-time vision sensing, biosensing, and bio-mapping; and (4) collect stakeholder feedback on digital transformation in the meat industry and disseminate the technology to stakeholders. This contribution will be significant because it is expected to transform the poultry industry into a more digitized and automated industry with enhanced labor safety and food quality/safety. The scalable and transferable technology is expected to be adaptable to smaller chicken processors, benefiting the economic development of rural areas. A distributed network of smaller producers/processors could also supply chickens to local clients efficiently, protecting the food supply from attacks and the spread of pathogens. On fundamental, applied, and extension levels, the long-term outcomes of CSI-APP can be adapted to allied food industries, benefiting the U.S. and global economy, but the potential impact of CSI-APP goes far beyond this. Making the mass customization of protein manufacturing a reality will contribute to long-term environmental sustainability in food production and to well-being around the world by providing a safe and affordable source of protein.

Project team: CSI-APP connects four core institutions: the University of Arkansas System Division of Agriculture, Georgia Tech Research Institute, University of Nebraska-Lincoln, and Fort Valley State University, along with a key collaborator from the USDA ARS National Poultry Research Center. An interdisciplinary team from the four institutions aims to develop the engineering and technologies that enable scalable, intelligent, efficient, safe, and transformable meat manufacturing systems to enhance worker safety, food safety, and process efficiency. CSI-APP's industrial board consists of 12 representative stakeholders related to the project from (1) poultry companies of large, medium, and small sizes; (2) food manufacturing and automation companies; and (3) industry associations, with backgrounds spanning poultry production, poultry processing, food technologies, and intelligent food system development.
Animal Health Component
40%
Research Effort Categories
Basic
40%
Applied
40%
Developmental
20%
Classification

Knowledge Area (KA)   Subject of Investigation (SOI)   Field of Science (FOS)   Percent
402                   3260                             2020                     28%
402                   3260                             2080                     25%
402                   3620                             2020                     10%
803                   3260                             3080                     10%
903                   3260                             2020                     12%
903                   3260                             3080                     15%
Goals / Objectives
The vision of the Center for Scalable and Intelligent Automation in Poultry Processing (CSI-APP) is to incorporate advanced technologies in robotics, artificial intelligence, digital sensing, biosensing, and food safety to provide the U.S. poultry processing industry with scalable and intelligent solutions to meet the rising national and global demand for poultry products. The long-term goal focuses on transforming current mass manufacturing protocols in large, centralized processing plants to "mass customization," scale-neutral protocols suitable for processing plants at different scales, overcoming the inherent variability associated with raw biological materials and humans; large-scale individualization can be achieved economically through the integration of digital and physical systems (e.g., "Industry 4.0" principles). In pursuit of this vision and long-term goal, in the short term CSI-APP will strategically target value creation and technological innovation by performing focused engineering research and extension activities under four unifying objectives that connect bioproducts, humans, and sensing/robotics technologies:

Objective 1: Scalable poultry manufacturing: lot size of one for robotic processing of chicken carcasses. Specifically, the objective is to create a scalable, plant-ready, intelligent robotic deboning system capable of performing at parity with (or even exceeding) human deboners on the most skilled task in the plant: shoulder cutting of front-halves. The human benchmark is 35 birds/minute with lost yield of 2% of total yield weight.

Objective 2: Virtual reality-based workforce transformation. Objective 2 is to bridge the gap between fully manual and fully autonomous operations by leveraging human intelligence and robotic endurance. Developing and deploying the fully automated system proposed in Objective 1 requires collecting large amounts of human operational data, as well as long-term validation and optimization. To accelerate data collection and robotic deployment, a VR-based human-in-the-loop robotic system will be developed in this objective to facilitate the steps that will allow select manual operations within a poultry processing operation to be performed via robots. This VR-based robotic system will allow a worker to remotely collaborate with robotic devices in order to jointly accomplish processing tasks in a facility. The online human decision-making results collected in the VR system will also be used to optimize the fully automated system in Objective 1.

Objective 3: Robots for robots: sensor- and robot-based product evaluation, bio-mapping, and decision making in the processing plant to address the new food quality and safety challenges raised by robotic manufacturing. The goal of Objective 3 is to create new "robots for robots" protocols: a new set of robotic and sensor solutions that address the emerging food safety and quality challenges brought by automated meat manufacturing. The related challenges and questions include unknown pathogen transmission patterns, new requirements for sanitization protocols, potential product quality degradation, and the introduction of new foreign object (FO) contamination. To fill these gaps, the CSI-APP team proposes a proactive mobile swab-sampling robot platform to collect environmental surface swab samples with on-site pathogen detection and pathogen transmission pattern visualization. The resulting quantitative results will also be used for designing and optimizing current sanitization protocols. Additionally, an all-in-one hyperspectral imaging-based, non-invasive FO contamination detection and food quality evaluation system will be developed and integrated with existing processing lines for online food quality control.

Objective 4: Research and extension integration: create an innovation ecosystem through technology development/transfer and workforce education. Objective 4 is to integrate the research and extension activities and accelerate technology transformation to better meet stakeholders' needs via activities including interviews, workshops, industry conferences, and demonstration exhibits.
Project Methods
Objective 1: Scalable poultry manufacturing.

1) Development of parametric bird physiology models: To achieve accurate and robust lot-size-of-one robotic poultry deboning knife trajectories, it is important to predict the geometric coordinates of non-visible anatomical features of interest (output) from visible external carcass features (input). The internal anatomical features will be partially modeled off-line from CT scans. The visible features will be collected on-line from dual RGB-D cameras. Machine learning models will be built to map the relationships between inputs and outputs.

2) Applying learning from demonstration (LfD) methods to refine knife cutting paths: LfD methods will be incorporated to allow expert human deboners to inform and optimize robot knife paths that achieve maximal yield while avoiding bone chips. Expert users will use an instrumented knife containing a microprocessor and an IMU (inertial measurement unit) for measuring knife angular velocity and acceleration. Neural networks will be used to learn the hidden cost function from the bird physiology models, using the collected expert data as training sets.

3) Research and implementation of feedback control based on knife forces: Closed-loop feedback control based on knife forces will be designed to deal with errors and disturbances. The robot knife tool will be instrumented with a dedicated force/torque sensor to detect 3-axis translational and rotational loadings on the blade as deboning is performed. Methods such as model predictive control will be explored.
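The following is a minimal sketch, in Python, of the kind of closed-loop force feedback described in item 3. The sensor interface, target force, abort threshold, and gain are illustrative assumptions rather than the project's values; a deployed controller (e.g., model predictive control) would be considerably more sophisticated.

```python
# Illustrative sketch (not the project's implementation): a simple
# admittance-style feedback loop that adjusts knife depth from measured
# blade forces. Interface, gains, and thresholds are assumptions.
import numpy as np

class KnifeForceController:
    def __init__(self, target_normal_force=8.0, max_force=25.0, gain=0.0005):
        self.target = target_normal_force  # desired cutting force (N), assumed
        self.max_force = max_force         # abort threshold (N), assumed
        self.gain = gain                   # depth correction per N of error (m/N)

    def depth_correction(self, wrench):
        """Map a 6-axis force/torque reading to a knife depth adjustment.

        wrench: (fx, fy, fz, tx, ty, tz) from the blade-mounted sensor.
        Returns a signed depth correction in meters, or None to abort
        (e.g., probable bone contact).
        """
        normal_force = abs(wrench[2])        # force normal to the cut
        if normal_force > self.max_force:    # likely hit bone: retract
            return None
        error = normal_force - self.target
        return -self.gain * error            # push in if too light, ease off if too heavy

# Usage with synthetic sensor readings:
ctrl = KnifeForceController()
for wrench in [np.array([0.5, 1.2, 6.0, 0.0, 0.0, 0.0]),
               np.array([0.4, 1.0, 9.5, 0.0, 0.0, 0.0])]:
    print(ctrl.depth_correction(wrench))
```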
Objective 2: Virtual reality-based workforce transformation.

2.1 Build human-robot collaborative interfaces with poultry plant operations using virtual reality environments: This step attempts to comprehensively model the interface of robotic devices, human operators, and processing plant environments, which will help translate human operators' decisions into robotic movements (deboning and trimming). In detail, the worker will operate the VR system in a well-conditioned control room, and the robot arms will be mounted on the processing lines. The VR space and real-world space will be well calibrated, and the robot, utilizing all sensor information, will be capable of interpreting human commands and providing automation for a manual labor task. System robustness under different Wi-Fi bandwidth constraints will be evaluated.

2.2 Adaptation of artificial intelligence: In Objective 1, an LfD method is proposed in which human operations are tracked and recorded by external IMU sensors. The VR system proposed here offers a quicker and easier way to record human operations. Combined with robotic perception to localize and characterize the object, a machine learning-based LfD dataset can be readily established. The integration of LfD and machine learning algorithms into robotic poultry tasks proceeds in three stages using active learning ideas: (1) the user performs the task and generates an initial training dataset of input images and user annotations; (2) the machine learning model continuously trains on the acquired data and starts making predictions that are verified by the user; (3) the machine learning approach incorporates uncertainty estimation and requests user assistance or intervention only when it has low confidence in its prediction or encounters an unusual situation.
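A minimal sketch of the three-stage loop above follows, with stub stand-ins for the model and the VR operator; the interfaces, confidence threshold, and data flow are assumptions for illustration, not the center's implementation.

```python
# Illustrative sketch of the three-stage active-learning integration
# described above; interfaces and threshold are assumed for illustration.
CONFIDENCE_THRESHOLD = 0.9  # assumed cutoff for requesting human help

class StubModel:
    """Stand-in for the learned LfD model."""
    def fit(self, dataset):
        self.n = len(dataset)
    def predict(self, image):
        confidence = min(0.5 + 0.01 * self.n, 0.99)  # improves with data
        return "predicted_annotation", confidence

class StubExpert:
    """Stand-in for the VR operator."""
    def initial_demos(self):
        return [f"demo_{i}" for i in range(5)]       # stage 1 demonstrations
    def annotate(self, image):
        return f"label({image})"
    def verify(self, image, prediction):
        return prediction                            # accept or correct

def active_learning_loop(model, expert, image_stream):
    # Stage 1: expert demonstrations seed the training set.
    dataset = [(img, expert.annotate(img)) for img in expert.initial_demos()]
    model.fit(dataset)
    for image in image_stream:
        prediction, confidence = model.predict(image)
        if confidence < CONFIDENCE_THRESHOLD:
            label = expert.annotate(image)           # stage 3: intervention
        else:
            label = expert.verify(image, prediction) # stage 2: verification
        dataset.append((image, label))
        model.fit(dataset)                           # continuous retraining
    return model

active_learning_loop(StubModel(), StubExpert(), [f"frame_{i}" for i in range(100)])
```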
Objective 3: Sensor- and robot-based product evaluation and bio-mapping.

3.1 Development of a mobile robot platform for swab sample collection: A mobile robotic platform will be developed in this objective to automatically collect meat product and environmental surface swab samples for food safety evaluations. The facility map will be preloaded into the system, and the vehicle will follow pre-designated routes to acquire swab samples. The complete mobile robotic platform is expected to run in the processing plant three times a day to generate timely bio-maps that visualize potential pathogen transmission patterns in poultry processing facilities and guide efforts to minimize biosafety risks.

3.2 Biosensor-based in-field pathogen detection: The biosensor will be placed at the wrist of the mobile robot arm. This will allow on-site, rapid, and automated screening of Salmonella and Campylobacter using immunomagnetic separation and quantum dot-based fluorescent sensing, generating preliminary bio-mapping of the facilities.

3.3 Bio-mapping and robotic sanitization: Based on hotspots identified on the preliminary bio-map, the mobile robots will collect swabs for traditional microbial enumeration methods. The bio-mapping tool and its results will also be used for developing, optimizing, and validating sanitization protocols. With the mobile robotic swab platform, the effect of sanitation can be easily and continuously monitored and evaluated.

3.4 Hyperspectral imaging-based foreign object (FO) detection and food quality evaluation: An integrated online hyperspectral imaging system will be established and validated in this subsection for fillet quality evaluation. The system is composed of a detector (900-1700 nm) and a broadband illumination source (tungsten-halogen lamps); it will run in line-scanning reflectance mode and be integrated downstream of the robotic deboning belt. A pixel-level multi-task deep learning model will be integrated into the system for simultaneous food quality evaluation and FO identification. The model will take the pixel spectrum as input, and the outputs will be the FO categories and fillet texture properties (a minimal illustrative sketch appears at the end of this section).

Objective 4: Extension.

4.1 Semi-structured interviews to collect stakeholders' feedback on CSI-APP innovations: Co-PI McQuillan and her team at UNL will conduct surveys of two key sets of stakeholders: (1) current poultry processing workers and (2) communities and community members who could develop small CSI-APP facilities with the help of the new center. Similar social science research and extension activities will also be conducted at FVSU to evaluate the automation in action for potential technology transfer to the red meat industry. In addition to assessing stakeholders' awareness of the newly developed technologies, the team will evaluate, using 3D camera systems, how robotic technologies could relieve front-line workers of long-term physical injuries.

4.2 Organized workshops, conferences, and exhibits: Surveys will be conducted with two key sets of stakeholders: (1) current poultry processing workers and (2) communities with potential for new poultry processing facilities. Educational workshops: new technologies will be introduced into the existing educational workshops Poultry 101 (twice per year) and Poultry 201 (once per year). A new workshop, Poultry 301, is expected to launch in years 3 and 4 and will focus solely on new technologies in processing. GTRI will host the International Food Automation Networking (IFAN) conference, an event specifically designed to engage engineering and technology decision-makers in the food, poultry, and meat manufacturing sector. GTRI will also host an exhibit booth on the floor of the International Production & Processing Expo (IPPE), a large tradeshow that attracts participation from around the world.
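Because the pixel-level multi-task model in 3.4 is central to the food quality work, a minimal sketch is given below: a shared trunk over a pixel spectrum feeding a classification head (FO categories) and a regression head (texture properties). The band count, layer sizes, class counts, and training details are assumptions, not the system described above.

```python
# Minimal sketch of a pixel-level multi-task spectral network (assumed
# architecture for illustration, not the project's model).
import torch
import torch.nn as nn

N_BANDS = 224        # assumed number of spectral bands in 900-1700 nm
N_FO_CLASSES = 5     # assumed foreign-object categories (incl. "none")
N_TEXTURE_PROPS = 3  # assumed texture attributes to regress

class MultiTaskSpectralNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(N_BANDS, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        self.fo_head = nn.Linear(64, N_FO_CLASSES)          # classification
        self.texture_head = nn.Linear(64, N_TEXTURE_PROPS)  # regression

    def forward(self, spectra):
        features = self.trunk(spectra)
        return self.fo_head(features), self.texture_head(features)

# One joint training step on a synthetic batch of pixel spectra:
model = MultiTaskSpectralNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
spectra = torch.rand(32, N_BANDS)
fo_labels = torch.randint(0, N_FO_CLASSES, (32,))
texture_targets = torch.rand(32, N_TEXTURE_PROPS)
fo_logits, texture_pred = model(spectra)
loss = (nn.functional.cross_entropy(fo_logits, fo_labels)
        + nn.functional.mse_loss(texture_pred, texture_targets))
loss.backward()
optimizer.step()
```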

Progress 02/01/23 to 01/31/24

Outputs
Target Audience: Poultry industries and robotic manufacturers

Changes/Problems: Nothing Reported

What opportunities for training and professional development has the project provided? This project has provided excellent opportunities for a number of graduate students to become proficient in interdisciplinary research, experimental methods design, conduct of preliminary experiments, and statistical analysis. Graduate students also have opportunities to visit different poultry processing plants and talk with stakeholders and front-line workers.

How have the results been disseminated to communities of interest? We have attended various events and interacted with various stakeholders to give updates and get feedback. This included presenting at and participating in several different meetings and engagements with academia and industry, listed below.

Harrison Poultry Company, Board of Directors Meeting, Atlanta, Georgia, November 10, 2022. Dr. Britton was invited to present, and part of his presentation covered the plan and desired outcomes for objectives 1 and 2 of the CSI-APP program. The presentation included a feedback session where this company could provide input regarding the approaches and needs of the technology being developed.

International Production and Processing Expo, Atlanta, Georgia, January 27, 2023. Researchers working on objective 2 created a preliminary demo of a virtual reality system that enables a user to provide input to a robot system remotely. This system was demonstrated to several industry partners during the week of this conference. In addition, researchers met with key leadership at Staubli Robotics to discuss the use of their washdown-capable robots in both the intelligent deboning and VR tasks.

Advancing Georgia's Leaders in Agriculture Program, Atlanta, GA, February 8, 2023. This program, run by the University of Georgia, provides leadership training to up-and-coming members of the agricultural community in Georgia. With around 25 participants, it includes members from multiple agricultural sectors. Dr. Britton presented the research in objectives 1 and 2 and shared the vision for how these new technologies and developments could have a significant impact on the future of poultry specifically and agriculture more generally.

Reciprocal Meat Conference, American Meat Science Association, St. Paul, MN, June 6, 2023. As part of his keynote address at this event, Dr. Britton provided an overview of the work being done in objectives 1 and 2 to an audience of industry and academic stakeholders. He presented the general technical approach for both the intelligent deboning and VR-based interfaces and had multiple individual conversations throughout the conference.

Georgia Poultry Federation Summer Leadership Conference, Ponte Vedra Beach, FL, July 20, 2023. Dr. Britton shared a brief overview of the CSI-APP program with roughly 100 members of the Georgia poultry industry. Participants represented producers, processors, genetics companies, equipment manufacturers, and other allied industry members of the poultry industry.

Delaware, Maryland, Virginia Poultry Association Annual Meeting, Ocean City, MD, August 25-27. Dr. Ahlin presented on the applicability of VR technologies and their role in providing advanced automation solutions in poultry processing.

Staubli Robotics, Faverges, France, August 4, 2023. GTRI researchers met again with principals from Staubli Robotics at their manufacturing facility in France to discuss the possibility of collaborating on the CSI-APP program and other related work at GTRI. Dr. Ahlin provided an overview of the research activities associated with objectives 1 and 2 and detailed the opportunities for Staubli to become engaged in the research. This was extremely well received, with Staubli asking for specific needs and opportunities where they could partner with and support the center.

What do you plan to do during the next reporting period to accomplish the goals? We will continue to improve the accuracy of robotic deboning and will test VR systems. Mobile robots will be further developed and tested for automatic swabbing. Biosensors will be developed for rapid microbial analysis. Bio-mapping protocols will be developed to assist manufacturers in improving food safety. We will conduct research with poultry workers to understand resistance to robots in the manufacturing environment and workers' interest in using VR tools in production environments.

Impacts
What was accomplished under these goals?

Objective 1: Scalable Poultry Manufacturing: Intelligent Robotic Deboning. Objective 1's technical approach and accomplishments in year 1 encompass four areas: commissioning new robot hardware, automated deboning method refinements, introducing X-ray sensing, and researching learning from demonstration.

Commissioning new robot hardware: To meet the industry-standard speed of processing (on average) 35 birds/minute and to withstand the wet conditions inside a plant, a new high-speed washdown Staubli robot arm has been fully set up and tested on GTRI's laboratory front-half conveyor line to perform automated front-half shoulder deboning. The verisimilitude between the lab setup and the one found inside a poultry processing plant will aid a planned in-plant trial next year.

Automated deboning method refinements: Drawing on GTRI's years of work on automated poultry deboning, two method refinements were identified as especially impactful. The first is a reformulation of the statistics-based bird shoulder prediction algorithm, which resulted in a better-than-order-of-magnitude improvement in prediction accuracy (to within 3 mm error). The second refinement is a real-time collision detection capability that can determine whether the robot's knife tool will contact the rigid cone, part of the conveyor line upon which the bird is mounted, during a shoulder cut. A unique knife path is custom-computed for each bird front-half, and this capability provides a safety feature (a sketch of such a check appears at the end of this objective's summary).

Introducing X-ray sensing: To date, GTRI has relied on predicting bird front-half shoulder location exclusively from external sensing (using color-plus-depth cameras). In year 1, X-ray imaging was explored as an additional sensing modality. A compact, handheld X-ray emitter and detector manufactured by OXOS Medical, Inc. has been used to obtain preliminary images of bird front-halves, and work has begun on merging X-ray information with color and depth information, with the goal of improving knowledge of the shoulder joint location for use in generating knife cutting paths that result in optimal meat yield.

Researching learning from demonstration: Learning from demonstration (LfD) is a robot control method that uses expert demonstration data to derive a policy for executing a similar task with a robot. This year's activity leveraged prior work at GTRI that created an instrumented knife for collecting geometric and dynamic measurements from human deboners performing bird shoulder cuts.
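Below is a minimal sketch of the kind of real-time knife/cone collision check described above: each waypoint of a candidate knife path is tested against a simple cone model plus a safety margin. The cone geometry and margin are assumed values for illustration, not the parameters of GTRI's system.

```python
# Illustrative knife-path vs. cone collision check; geometry is assumed.
import numpy as np

CONE_APEX = np.array([0.0, 0.0, 0.30])  # apex position (m), assumed
CONE_BASE_Z = 0.0                        # base height (m), assumed
CONE_BASE_RADIUS = 0.05                  # base radius (m), assumed
SAFETY_MARGIN = 0.005                    # required clearance (m), assumed

def cone_radius_at(z):
    """Radius of the cone at height z (zero outside the cone's extent)."""
    if z >= CONE_APEX[2] or z < CONE_BASE_Z:
        return 0.0
    frac = (CONE_APEX[2] - z) / (CONE_APEX[2] - CONE_BASE_Z)
    return CONE_BASE_RADIUS * frac

def path_collides(knife_path):
    """Return True if any waypoint enters the cone-plus-margin volume."""
    for p in knife_path:
        radial = np.hypot(p[0] - CONE_APEX[0], p[1] - CONE_APEX[1])
        if radial < cone_radius_at(p[2]) + SAFETY_MARGIN:
            return True
    return False

# A path skimming the cone surface near its base:
path = [np.array([0.04, 0.0, 0.05]), np.array([0.06, 0.0, 0.05])]
print(path_collides(path))  # True: the first waypoint is inside the margin
```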
Objective 2: Virtual Reality-based Workforce Transformation. A critical component of cooperation, for both interpersonal and human-robot interactions, is a shared understanding. To provide this shared understanding of the chicken's structure, this research is investigating and developing a deep learning technique known as "canonical mapping" to provide a visual representation of the robot's understanding of the chicken. Canonical mapping aims to associate every pixel of a relevant image with a vertex of a 3D model. This association accommodates deformation and flexing of the product, something that classical feature detection methods often struggle to manage. The result of this technology is a rainbow map in which an image of a processed bird is directly mapped to unique locations on the physical model. The benefit of this technology, aside from the robot perceiving a three-dimensional model from a two-dimensional image, is that a person can easily check the robot's understanding. The overlaid color map gives a user insight into how the robot would interact with a product. A critical issue with artificial intelligence is that it is not guaranteed to be perfectly accurate. Sometimes the inaccuracies are minor or irrelevant to the task, but the risk of learned behavior is that the robot will erroneously assess a critical aspect of a task relevant to its goal. If, for example, the robot mistakenly believes that the "left" wing of the bird is the "right" wing, then a cutting path for deboning has no chance of being correct. However, with canonical mapping, a user in a VR space, where the information is readily available, would be able to see that the robot has the wrong understanding and know in what way it is incorrect. From this, a person would be able to intervene to prevent the robot from performing an inappropriate action before it acts. With a shared understanding of the bird, successful cutting paths can be applied to the model, a topic that has been investigated extensively within GTRI. This approach offers a promising path toward achieving deboning as well as applicability to other tasks within poultry processing, all of which rely on a holistic understanding of the product and its structures (a minimal sketch of the canonical mapping idea appears at the end of this objective's discussion).

Another aspect of this research is the methods of communication between a person and a robot within a shared reality. VR headsets and controllers are becoming more easily accessible in the technology industry. However, VR is just one component of the larger field of Extended Reality (XR), which includes Augmented, Virtual, and Mixed Reality (AR, VR, and MR, respectively). Each of these technologies offers different modalities of communication and understanding between a person and a robot. Exploring these modalities is important, as some users find VR uncomfortable due to motion sickness and eye strain. AR applications can be facilitated with modern cell phones, a much more ubiquitous technology than VR headsets. AR allows a projection of the virtual world to be overlaid onto real-world environments. This research is exploring the applicability of this technology to enable communication between a robot in a processing environment and a person in a remote location. If successful, this approach would provide a fast and simple way for a human and robot to accomplish a shared goal in a way that is consistent with the methods people currently use to communicate with technology.
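A minimal sketch of the canonical mapping idea follows: a small fully convolutional network predicts, for every pixel, continuous coordinates on a canonical 3D bird model, and rendering those three coordinates as RGB yields the rainbow map a user can inspect. The architecture, sizes, and the nearest-vertex lookup are illustrative assumptions, not GTRI's implementation.

```python
# Illustrative canonical-mapping sketch (assumed architecture).
import torch
import torch.nn as nn

class CanonicalMapper(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 1),   # 3 channels = canonical (x, y, z)
            nn.Sigmoid(),          # normalize coordinates to [0, 1]
        )

    def forward(self, image):
        return self.net(image)     # (B, 3, H, W) canonical coordinate map

model = CanonicalMapper()
image = torch.rand(1, 3, 128, 128)  # synthetic camera frame
coord_map = model(image)
# Interpreting the three coordinate channels directly as RGB gives the
# rainbow visualization a user can inspect in VR; a nearest-vertex lookup
# against the canonical mesh would convert coordinates to mesh vertices.
rainbow = (coord_map[0].permute(1, 2, 0) * 255).byte()
print(rainbow.shape)  # torch.Size([128, 128, 3])
```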
Objective 3: A preliminary robotic control ROS package has been developed. Integrated with a deep learning swab sticker detection algorithm, the robot can grasp the sticker and perform plane environmental surface swabbing similar to a human swabbing pattern. Evaluated with a pressure/force detection sensor, robot swabbing applies more consistent force than human swabbing (see the consistency-comparison sketch at the end of this section). A novel hyperspectral imaging analysis model, the Network Architecture Search enabled Wide Deep Network (NAS-WD), was developed for foreign material detection. The new model shows improved accuracy compared to many conventional hyperspectral data analysis models. We are working with commercial vendors on studies of rapid detection and quantification of food soil remaining on food contact surfaces after commercial cleaning. We are continuing to assemble the test instruments; we have completed preliminary testing of three food contact surfaces and are modifying the research protocol to further refine our limits of detection (LOD). In addition, a student survey was developed, tested, and administered to students taking a robotics class at a technical institute. We were testing the hypothesis that, after training and education about robotics, students would be less hesitant to work close to a cobot. The initial round of testing has been completed, and we are analyzing the data.

Objective 4: Research and Extension Integration. During this first year, the extension and outreach activities focused on building networks and socializing the general concepts being addressed by the center. Details are provided in the dissemination section.
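A minimal sketch of how the force-consistency comparison mentioned under Objective 3 could be quantified from recorded force traces; the metric (coefficient of variation) and the synthetic values are assumptions for illustration, not the project's analysis.

```python
# Illustrative force-consistency comparison from recorded swab force
# traces; metric and values are assumed for illustration.
import numpy as np

def coefficient_of_variation(force_trace):
    """Std/mean of a force trace: lower means more consistent contact."""
    trace = np.asarray(force_trace, dtype=float)
    return trace.std() / trace.mean()

# Synthetic traces (N): the robot holds force tightly; the human drifts.
rng = np.random.default_rng(0)
robot_trace = 5.0 + 0.2 * rng.standard_normal(200)
human_trace = 5.0 + 1.5 * rng.standard_normal(200)

print(f"robot CV: {coefficient_of_variation(robot_trace):.3f}")
print(f"human CV: {coefficient_of_variation(human_trace):.3f}")
```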

Publications

  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2023 Citation: Payne, K., O'Bryan, C. A., Marcy, J. A., & Crandall, P. G. (2023). Detection and prevention of foreign material in food: A review. Heliyon, 9(9), e19574.