Source: UNIVERSITY OF ARKANSAS submitted to NRP
CENTER FOR SCALABLE AND INTELLIGENT AUTOMATION IN POULTRY PROCESSING (CSIAPP)
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1029956
Grant No.
2023-70442-39232
Cumulative Award Amt.
$5,000,000.00
Proposal No.
2022-11486
Multistate No.
(N/A)
Project Start Date
Feb 1, 2023
Project End Date
Jan 31, 2027
Grant Year
2023
Program Code
[A1731]- Meat and Poultry Processing and Food Safety
Recipient Organization
UNIVERSITY OF ARKANSAS
(N/A)
FAYETTEVILLE, AR 72703
Performing Department
(N/A)
Non Technical Summary
Background: The poultry industry is the largest meat industry in the world, and the United States is the world's No. 1 producer of poultry meat. The consumption of chicken products has steadily increased in recent decades, and this demand will likely continue into the foreseeable future. While the current poultry industry is centralized and designed to produce food efficiently, several operations, such as meat deboning, rely heavily on manual labor. The COVID-19 pandemic demonstrated that this reliance on manual labor makes the system vulnerable to disruptions. Manufacturing tasks in these facilities required many workers to stand side by side, without the ability to telework or operate equipment remotely. During the pandemic, infection spread quickly among meat processing workers, disrupting the supply chain. High human-food contact can also lead to cross-contamination, resulting in food safety recalls. The poultry and meat industry currently faces unprecedented challenges in labor shortages and food and worker safety. The meat processing industry stands to benefit by more fully embracing transformative technical principles such as sensing, advanced robotics, and artificial intelligence. That said, the current capabilities of robotics and automation cannot yet compete with the dexterity and flexibility of human workers. Animals are highly variable, requiring intelligent and adaptive automation to handle soft and variable meat tissues. With the U.S. meat manufacturing industry gradually recovering from the COVID-19 pandemic, now is the opportune time for the meat processing industry to reinvent itself and play a major role in addressing global protein needs, increasing processing efficiency, minimizing meat quality loss, alleviating the labor shortage, protecting worker safety, and improving worker welfare and the work environment.

Overall goals and objectives: The vision of the Center for Scalable and Intelligent Automation in Poultry Processing (CSI-APP) is to incorporate advanced technologies in robotics, artificial intelligence, digital sensing, biosensing, and food safety to provide the U.S. poultry processing industry with scalable and intelligent solutions to meet the rising national and global demand for poultry products. The long-term goal focuses on transforming current mass manufacturing protocols in large, centralized processing plants into "mass customization" protocols suitable for processing plants at different scales, to overcome the inherent variability associated with raw biological materials and humans. Large-scale individualization can be achieved economically through the integration of digital and physical systems ("Industry 4.0" principles). In pursuit of the vision and the long-term goal, CSI-APP will strategically target value creation and technological innovation by performing focused engineering research and extension activities under the four unifying objectives of this proposal:

Objective 1: Scalable poultry manufacturing. The team will create a scalable, plant-ready, intelligent robotic deboning system capable of performing at parity with (or even exceeding) human deboners on the most skilled task in the plant: shoulder cutting of front-halves. Artificial intelligence algorithms will be developed to handle the high biological variability of meat.

Objective 2: Virtual reality-based workforce transformation. The labor shortage is a major challenge for the meat industry.
It takes considerable time to train an individual to perform dexterous jobs like meat deboning, and high line speeds in a cold, humid environment lead to injuries that worsen labor shortages. During the pandemic, infection spread quickly among meat processing workers, disrupting the supply chain. Virtual reality can transform, diversify, and distribute the workforce in space and time. Using the proposed VR technology, a worker will be able to stay in a comfortable environment and remotely operate a robot to debone meat in a processing plant. This has the potential to reduce labor shortages and create job opportunities everywhere, including rural areas.

Objective 3: Sensor- and robot-based product evaluation and bio-mapping for enhancing food quality and safety. A mobile robotic platform containing biosensors for rapid estimation of bacteria will be developed. The biosensors will provide initial biomapping of bacteria in the processing plant and identify the best areas to collect swab samples from product and environmental surfaces for food safety evaluations. The final biomap will be used to guide sanitation and management decisions. An imaging system will also be developed for detecting foreign objects, such as small plastics in meat, and for food quality evaluation.

Objective 4: Research and extension integration: create an innovation ecosystem through technology development/transfer and workforce education. Research and extension activities will be integrated, accelerating technology transformation to better meet stakeholders' needs.
Planned activities include surveys to identify barriers, workshops for disseminating information about advanced technologies, demonstration exhibits at industry conferences, and one-on-one technical support for industries considering implementation of these technologies.

Expected Outcomes: CSI-APP is structured to (1) enhance the robustness and scalability of precision manufacturing in meat processing and chicken deboning; (2) distribute the workforce in space and time using virtual reality systems; (3) improve food quality and safety in processing plants using intelligent automation, real-time vision sensing, biosensing, and biomapping; and (4) collect stakeholder feedback on digital transformation in the meat industry and disseminate the technology to stakeholders. This contribution will be significant because it is expected to transform the poultry industry into a more digitized and automated industry with enhanced labor safety and food quality/safety. The scalable and transferrable technology is expected to be adaptable to smaller chicken processors, which is beneficial for the economic development of rural areas. A distributed network of smaller producers/processors could also supply chickens to local clients efficiently, protecting the food supply from targeted attacks and the spread of pathogens. On fundamental, applied, and extension levels, the long-term outcomes of CSI-APP can be adapted to allied food industries, benefiting the U.S. and global economy, but the potential impact of CSI-APP goes far beyond this.
Making the mass customization of protein manufacturing a reality will contribute to long-term environmental sustainability in food production and to well-being around the world by providing a safe and affordable source of protein.

Project Team: CSI-APP connects four core institutions: University of Arkansas System Division of Agriculture, Georgia Tech Research Institute, University of Nebraska-Lincoln, and Fort Valley State University, along with a key collaborator from the USDA ARS National Poultry Research Center. An interdisciplinary team from the four institutions aims to develop the engineering and technologies that enable scalable, intelligent, efficient, safe, and transformable meat manufacturing systems to enhance worker safety, food safety, and process efficiency. CSI-APP's industrial board consists of 12 representative stakeholders related to the project from (1) poultry companies of large, medium, and small sizes; (2) food manufacturing and automation companies; and (3) industry associations, with backgrounds spanning poultry production, poultry processing, food technologies, and intelligent food system development.
Animal Health Component
40%
Research Effort Categories
Basic
40%
Applied
40%
Developmental
20%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
402 | 3260 | 2020 | 28%
402 | 3260 | 2080 | 25%
402 | 3620 | 2020 | 10%
803 | 3260 | 3080 | 10%
903 | 3260 | 2020 | 12%
903 | 3260 | 3080 | 15%
Goals / Objectives
The vision of the Center for Scalable and Intelligent Automation in Poultry Processing (CSI-APP) is to incorporate advanced technologies in robotics, artificial intelligence, digital sensing, biosensing, and food safety to provide the U.S. poultry processing industry with scalable and intelligent solutions to meet the rising national and global demand for poultry products. The long-term goal focuses on transforming current mass manufacturing protocols in large, centralized processing plants into scale-neutral "mass customization" protocols suitable for processing plants at different scales, overcoming the inherent variability associated with raw biological materials and humans; large-scale individualization can be achieved economically through the integration of digital and physical systems (e.g., "Industry 4.0" principles). In pursuit of the vision and the long-term goal, in the short term CSI-APP will strategically target value creation and technological innovation through focused engineering research and extension activities under four unifying objectives that bind bioproducts, humans, and sensing/robotics technologies:

Objective 1: Scalable poultry manufacturing: lot size of one for robotic processing of chicken carcasses. Specifically, the objective is to create a scalable, plant-ready, intelligent robotic deboning system capable of performing at parity with (or even exceeding) human deboners on the most skilled task in the plant: shoulder cutting of front-halves. The human benchmark is 35 birds/minute with a lost yield of 2% of total yield weight.

Objective 2: Virtual reality-based workforce transformation. Objective 2 is to bridge the gap between fully manual and fully autonomous operations by leveraging human intelligence and robotic endurance. Developing and deploying the fully automated system proposed in Objective 1 requires collecting large amounts of human operational data, along with long-term validation and optimization.
To accelerate data collection and robotic deployment, a VR-based human-in-the-loop robotic system will be developed under this objective to facilitate the steps that will allow select manual operations within a poultry processing operation to be performed via robots. This VR-based robotic system will allow a worker to collaborate remotely with robotic devices to jointly accomplish processing tasks in a facility. The online human decision-making results collected in the VR system will also be used to optimize the fully automated system in Objective 1.

Objective 3: Robots for robots: sensor- and robot-based product evaluation, bio-mapping, and decision making in the processing plant to address the new food quality and safety challenges raised by robotic manufacturing. The goal of Objective 3 is to create new "robots for robots" protocols: a new set of robotic and sensor solutions that address the emerging food safety and quality challenges brought by automated meat manufacturing. The related challenges and questions include unknown pathogen transmission patterns, new requirements for sanitization protocols, potential product quality degradation, and the introduction of new foreign object (FO) contamination. To fill these gaps, the CSI-APP team proposes a new proactive mobile swab-sampling robot platform to collect environmental surface swab samples, with onsite pathogen detection and pathogen transmission pattern visualization. The quantitative outputs will also be used for designing and optimizing current sanitization protocols.
Additionally, an all-in-one hyperspectral imaging-based non-invasive FO contamination detection and food quality evaluation system will be developed and integrated with the processing lines for online food quality control.

Objective 4: Research and extension integration: create an innovation ecosystem through technology development/transfer and workforce education. Objective 4 is to integrate the research and extension activities and accelerate technology transformation to better meet stakeholders' needs via activities including interviews, workshops, industry conferences, and demonstration exhibits.
Project Methods
Objective 1: Scalable poultry manufacturing.

1) Development of parametric bird physiology models: To achieve accurate and robust lot-size-of-one robotic poultry deboning knife trajectories, it is important to predict the geometric coordinates of non-visible anatomical features of interest (output) from the visible external carcass features (input). The internal anatomical features will be partially modeled off-line from CT scans. The visible features will be collected on-line from dual RGB-D cameras. Machine learning models will be built to map the relationships between inputs and outputs.

2) Applying learning from demonstration (LfD) methods to refine knife cutting paths: LfD methods will be incorporated to allow expert human deboners to inform/optimize robot knife paths that achieve maximal yield while avoiding bone chips. Expert users will use an instrumented knife containing a microprocessor and an IMU (inertial measurement unit) for measuring knife angular velocity and acceleration. Neural networks will be used to learn the hidden cost function from bird physiology models, using the collected expert data as training sets.

3) Research and implementation of feedback control based on knife forces: Closed-loop feedback control based on knife forces will be designed to deal with errors and disturbances. The robot knife tool will be instrumented with a dedicated force/torque sensor to detect 3-axis translational and rotational loadings on the blade as deboning is performed. Methods such as model predictive control will be explored.

Objective 2: Virtual reality-based workforce transformation.

2.1 Build human-robot collaborative interfaces with poultry plant operations utilizing virtual reality environments: This step attempts to comprehensively model the interface of robotic devices, human operators, and processing plant environments, which will help translate human operators' decisions into robotic movements (deboning and trimming).
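As a toy illustration of step 1 of Objective 1 above (not the project's actual model), the input-output mapping from visible external carcass features to hidden anatomical coordinates can be sketched with synthetic data, using a plain least-squares fit as a stand-in for the planned machine learning models; all feature counts and dimensions here are hypothetical:

```python
import numpy as np

# Hypothetical sketch: learn a mapping from visible external carcass features
# (e.g., keypoints from dual RGB-D cameras) to coordinates of a non-visible
# anatomical feature (e.g., a shoulder joint, labeled off-line from CT scans).

rng = np.random.default_rng(0)

n_birds, n_external, n_internal = 200, 6, 3
X = rng.normal(size=(n_birds, n_external))          # external measurements
W_true = rng.normal(size=(n_external, n_internal))  # unknown anatomical relation
Y = X @ W_true + 0.01 * rng.normal(size=(n_birds, n_internal))  # CT-derived labels

# Fit: predict internal coordinates from external features (least squares
# standing in for the project's machine learning models)
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Predict the hidden anatomical coordinates for a new carcass
x_new = rng.normal(size=(1, n_external))
y_pred = x_new @ W_hat
print(y_pred.shape)  # (1, 3): predicted 3-D coordinates of one internal feature
```

In the actual system a nonlinear model would replace the linear fit, but the input/output contract — external features in, internal knife-path landmarks out — is the same.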
In detail, the worker will operate the VR system in a well-conditioned control room, and the robot arms will be mounted on the processing lines. The VR space and real-world space will be well-calibrated, and the robot, utilizing all sensor information, will be capable of interpreting human commands and providing automation on a manual labor task. System robustness under different Wi-Fi bandwidth constraints will be evaluated.

2.2 Adaptation of artificial intelligence: In Objective 1, an LfD method has been proposed in which human operations can be tracked and recorded from external IMU sensors. The VR system proposed here offers a quicker and easier way to record human operations. Combined with robotic perception to localize and characterize the object, the machine-learning-based LfD dataset can be easily established. The graceful integration of LfD and machine learning algorithms into robotic poultry tasks includes three stages utilizing active learning ideas: (1) the user performs the task and generates the initial training dataset of input images and user annotations; (2) machine learning models continuously train on acquired data and start making predictions that are verified by the user; (3) the machine learning approach incorporates uncertainty estimation and only requests user assistance or intervention when it has low confidence in its prediction or encounters an unusual situation.

Objective 3: Sensor- and robot-based product evaluation and bio-mapping.

3.1 Development of a mobile robot platform for swab sample collection: A mobile robotic platform will be developed under this objective to automatically collect meat product and environmental surface swab samples for food safety evaluations. The facility map will be preloaded into the system, and the vehicle will move along pre-designated routes and acquire swab samples.
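A minimal sketch of the pre-designated route following described above: a point robot steps through preloaded waypoints, stopping at each to (conceptually) collect a swab. The map, route, and step size are invented for illustration, not taken from the project's navigation stack:

```python
import math

# Hypothetical preloaded facility route (x, y waypoints in meters)
waypoints = [(0.0, 0.0), (5.0, 0.0), (5.0, 3.0), (0.0, 3.0)]

def follow_route(start, route, step=0.25, tol=0.05):
    """Move toward each waypoint in order; return the visited swab stops."""
    x, y = start
    stops = []
    for gx, gy in route:
        while math.hypot(gx - x, gy - y) > tol:
            d = math.hypot(gx - x, gy - y)
            s = min(step, d)           # never overshoot the goal
            x += s * (gx - x) / d      # unit step toward the waypoint
            y += s * (gy - y) / d
        stops.append((round(x, 2), round(y, 2)))  # collect swab sample here
    return stops

stops = follow_route((0.0, 0.0), waypoints)
print(stops)  # [(0.0, 0.0), (5.0, 0.0), (5.0, 3.0), (0.0, 3.0)]
```

A real deployment would use a SLAM-based localizer and obstacle avoidance (as described in the progress section below is out of scope here); this only illustrates the route-following contract.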
The complete mobile robotic platform is expected to run in the processing plant three times a day to generate timely bio-maps that visualize potential pathogen transmission patterns in poultry processing facilities and guide efforts to minimize biosafety risks.

3.2 Biosensor-based in-field pathogen detection: The biosensor will be placed at the wrist of the mobile robot arm. This will allow on-site, rapid, automated screening of Salmonella and Campylobacter using immunomagnetic separation and quantum dot-based fluorescent sensing, generating preliminary biomapping of the facilities.

3.3 Biomapping and robotic sanitization: Based on hotspots identified on the preliminary biomap, the mobile robots will collect swabs for traditional microbial enumeration methods. The biomapping tool and its output will also be used for developing, optimizing, and validating sanitization protocols. With the mobile robotic swab platform, the effect of sanitation can be easily and continuously monitored and evaluated.

3.4 Hyperspectral imaging-based foreign object (FO) detection and food quality evaluation: An integrated online hyperspectral imaging system will be established and validated in this subsection for fillet quality evaluation. The system is composed of a detector (900-1700 nm) and a broadband illumination source (tungsten-halogen lamps); it will run in line-scanning reflective mode and be integrated downstream of the robotic deboning belt. A pixel-level multi-task deep learning model will be integrated into the system for simultaneous food quality evaluation and FO identification.
The model will take the pixel spectrum as the input, and the output will be the FO categories and fillet texture properties.

Objective 4: Extension.

4.1 Semi-structured interviews to collect stakeholders' feedback on CSI-APP innovations: Co-PI McQuillan and her team at UNL will conduct surveys with two key sets of stakeholders: (1) current poultry processing workers and (2) communities and community members who could develop small CSI-APP facilities with the help of the new center. Similar social science research and extension activities will also be conducted at FVSU to evaluate the automation in action for potential technology transfer to the red meat industry. Beyond gauging stakeholders' awareness of the newly developed technologies, the team will also evaluate how robotic technologies could relieve front-line workers of long-term physical injuries, using 3D camera systems.

4.2 Workshops, conferences, and exhibits: New technologies will be introduced into the existing educational workshops Poultry 101 (twice per year) and Poultry 201 (once per year). A new workshop, Poultry 301, is expected to be launched in years 3 and 4 and will focus solely on new technologies in processing. GTRI will host the International Food Automation Networking (IFAN) conference, an event specifically designed to engage engineering and technology decision-makers in the food, poultry, and meat manufacturing sector. GTRI will also host an exhibit booth on the floor of the International Production & Processing Expo (IPPE), a large tradeshow that attracts participation from around the world.
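The pixel-level idea in 3.4 — each pixel's reflectance spectrum mapped simultaneously to an FO category and a texture property — can be sketched with synthetic spectra. The class signatures, band count, and the nearest-mean classifier and band-ratio regressor below are hypothetical stand-ins for the project's trained multi-task deep learning model:

```python
import numpy as np

# Hypothetical sketch of pixel-level multi-task evaluation: each pixel's
# spectrum (900-1700 nm range, here 50 synthetic bands) yields both a
# foreign-object category and a continuous texture proxy.

rng = np.random.default_rng(1)
n_bands = 50
wavelengths = np.linspace(900, 1700, n_bands)

# Invented class signatures: chicken fillet vs. plastic fragment
sig = {"fillet": 0.4 + 0.2 * np.sin(wavelengths / 300),
       "plastic": 0.7 - 0.1 * np.sin(wavelengths / 300)}

def make_pixels(label, n):
    """Simulate n noisy pixel spectra of the given class."""
    return sig[label] + 0.02 * rng.normal(size=(n, n_bands))

train = {k: make_pixels(k, 100) for k in sig}
means = {k: v.mean(axis=0) for k, v in train.items()}

def classify(pixel_spectrum):
    """Task 1: FO identification via nearest class mean."""
    return min(means, key=lambda k: np.linalg.norm(pixel_spectrum - means[k]))

def hardness(pixel_spectrum):
    """Task 2: texture proxy via a simple band ratio (illustrative only)."""
    return float(pixel_spectrum[:10].mean() / pixel_spectrum[-10:].mean())

test_pixel = make_pixels("plastic", 1)[0]
print(classify(test_pixel), round(hardness(test_pixel), 2))
```

In the real system a shared deep network with two output heads would replace the two hand-written functions, but the per-pixel input and the dual classification/regression outputs are the same interface.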

Progress 02/01/24 to 01/31/25

Outputs
Target Audience: The target audiences are the poultry industry, robotics equipment manufacturers, and virtual reality technology providers.

Changes/Problems: Both Objectives 1 and 2 were scheduled to conduct field trials in the spring of 2025. Unfortunately, the outbreak of highly pathogenic avian influenza (HPAI) forced partner companies to postpone access for in-plant testing. As a result, the teams pivoted to conducting in-lab experiments, with the goal of doing more extensive trials in the fall of 2025.

What opportunities for training and professional development has the project provided? Students and postdoctoral fellows were trained in the interdisciplinary fields of robotics, poultry/food science, and engineering. A project assistant learned IRB review application procedures. Several students presented results at conferences. Under Objective 4 (extension and outreach), two undergraduate students oversaw the design and development work of a group of four high school interns tasked with building an interactive robotic display cell. The interns were responsible for taking the design and eventually building the cell, which consists of a UR3 robot arm, a camera system, and controls with a GUI/screen showing what is going on in software. The objective is to use the robot to pick and place chicken tenders in a foam tray. Under the supervision of several research team members, the interns had the opportunity to staff the exhibit at the Georgia National Fair.

How have the results been disseminated to communities of interest? Results have been presented at conferences. We conducted an in-plant field study of VR at a poultry processing plant. Over the course of two days, seventeen volunteer employees tested the virtual reality technology for assisting cone-loading procedures. Management and employees were able to experience the new automation technologies.
During this second year, the extension and outreach activities shifted to presenting and sharing some of the robotics, automation, and VR concepts being developed by the center. This included presenting at and participating in several different meetings and engagements with academia and industry, listed below.

  • STEM in Poultry and Agriculture Presentation and Demonstration for Houston County Schools, Atlanta, Georgia, January 19, 2024. Dr. Britton gave a presentation on the opportunities for STEM in poultry and agriculture to a group of high school teachers from Houston County, Georgia. The research team then provided a tour of the labs and a demonstration of the different technologies.
  • Ag Career Expo Robotics Demonstration, Carroll County, Georgia, January 24, 2024. Dr. Ahlin demonstrated a robotic manipulator that participants could control to grasp unstructured objects. Approximately two hundred students visited the exhibit over four hours.
  • International Production and Processing Expo, Atlanta, Georgia, January 30 - February 1, 2024. Researchers working on Objective 2 demonstrated the virtual reality system in partnership with the Staubli robotics company exhibit. The VR headset was in the company's booth and enabled a user to provide input to a robot system remotely. The system was demonstrated to many industry stakeholders during the week of the tradeshow and provided an opportunity for them to operate the system directly.
  • Technical Presentation and Organizational Discussion with Staubli Robotics, Atlanta, Georgia, February 2, 2024. GTRI researchers met with senior leadership and technical management from Staubli Robotics to discuss a possible donation of washdown robots to support this research effort. In addition to technical approaches and hardware needs, there was significant discussion of some of the novel approaches being pursued.
  • MeatingPlace.com Podcast Interview on Poultry Robotics & Technology, Atlanta, Georgia, February 6, 2024. Dr. Ahlin was interviewed as part of a podcast to discuss AI and automation in poultry processing.
  • MeatingPlace.com Podcast #2 Interview on Poultry Robotics & Technology, Atlanta, Georgia, February 6, 2024. Dr. Ahlin participated in a MeatingPlace.com podcast to discuss VR and automation in poultry processing.
  • Atlanta Science Festival Demonstration and Exhibit, Atlanta, Georgia, March 9, 2024. Dr. Ahlin and Mr. Freidank hosted an interactive exhibit at the Atlanta Science Festival on Georgia Tech's campus. Several hundred members of the public participated in an interactive exhibit demonstrating a co-robotic manipulator grasping unstructured objects.
  • STEM in Poultry and Agriculture Presentation and Demonstration for Students from Warner Robins High School, Atlanta, Georgia, March 21, 2024.
  • Presentation to Congressman Austin Scott from Tifton, Georgia, Atlanta, Georgia, March 25, 2024. Dr. Britton and members of the research team briefed Congressman Scott on the research and projects supporting poultry processing and agriculture. Specific topics included the intelligent cutting research and the VR robotics efforts.
  • STEM in Poultry and Agriculture Presentation and Demonstration for Students from Piney Woods High School in Mississippi, Atlanta, Georgia, April 26, 2024. Dr. Ahlin hosted a group of students from Piney Woods High School, Mississippi, as they visited the Food Processing Building in Atlanta. Approximately thirty students attended presentations and were shown demonstrations on automation and technologies in food processing.
  • Hosted the Firm Foundation STEM Enrichment Camp, Atlanta, Georgia, June 21, 2024. Dr. Ahlin assisted in hosting the Firm Foundation STEM Enrichment Camp in the Food Processing Building. Approximately forty students were shown presentations and demonstrations on automation and technologies in food processing.
  • Hosted a Georgia 4-H Group, Atlanta, Georgia, July 8, 2024. Dr. Ahlin hosted a 4-H group that visited the Food Processing Building. Approximately twenty students from schools around Georgia were shown presentations and demonstrations on automation and technologies in food processing.
  • Georgia Poultry Federation Summer Leadership Conference, Ponte Vedra, Florida, July 18, 2024. Dr. Britton provided an update and overview of both the VR and intelligent cutting research projects to a group of 65 industry and academic leaders in poultry production and processing.
  • RoboGeorgia's "Automating the ATL" Event, Atlanta, Georgia, August 22, 2024. Dr. Ahlin participated in this event, which brought together startups, larger corporations, and academic institutions to discuss the growing industry and economic impact of robotics in the state of Georgia.
  • XR Symposium at Georgia Tech Research Institute, Atlanta, Georgia, August 28, 2024. Dr. Ahlin, Dr. Britton, Mr. Usher, and other team members participated in this internal event on the applications and benefits of extended reality technology in research and industry.
  • International Food and Automation Networking Conference, Atlanta, Georgia, September 8-10, 2024. Dr. Ahlin, Dr. Britton, Dr. Hu, and other team members attended the industry-focused IFAN conference. Dr. Ahlin presented "AI and VR Technologies for Robotic Cooperation in Poultry Processing," and Dr. Ai-Ping Hu presented "Intelligent Cutting Robotics for Poultry Deboning." All participants took part in various networking activities with industry.
  • Hosted the Australian Meat Processing Corporation, Atlanta, Georgia, September 11, 2024. Dr. Ahlin met with Mr. Stuart Shaw, the executive director of the AMPC, to discuss automation in food processing on the international scale.
  • Poultry World Exhibit at the Georgia National Fair, Perry, Georgia, October 4-13, 2024. A major push was made to provide a robotics demonstration related to poultry at the Georgia National Fair, where thousands of people would have an opportunity to see the relevance of robotics and automation to poultry. This outreach and extension effort was one of the primary activities for year two of the center. While led by researchers William Freidank, Walker Byrnes, and Nate Damen, the project was initiated by a group of summer high school interns supported by undergraduate students Jenny Hou and Ander Gay.
  • Hosted a Second Meeting with the Australian Meat Processing Corporation, Atlanta, Georgia, October 29, 2024. Dr. Ahlin hosted Edwina Toohey of AMPC to further discuss specific avenues of collaboration and cooperation in the area of automation in food processing.
  • Article Submitted to MeatingPlace.com, Atlanta, Georgia, October 31, 2024. Dr. Ahlin submitted an article titled "Demystifying AI and Robotics in Meat and Poultry Processing" for publication. The article focused on the relationship between robotics and AI, and how these technologies might be used in food processing.

What do you plan to do during the next reporting period to accomplish the goals? For Objectives 1 and 2, the team plans to: (1) continue the learning-from-demonstration research, (2) add a second robot arm to enable cutting both sides of the chicken front-half, and (3) prepare for an in-plant trial in the fall of 2025. Objective 3 (Automation for Food Safety): The next phase involves integrating data from multiple sensors to develop a robust Simultaneous Localization and Mapping (SLAM) model.
Specifically, we plan to use two 2D LiDAR sensors, wheel odometry, and an IMU to enhance the vehicle's localization and mapping capabilities, allowing it to navigate autonomously in complex environments like poultry processing plants. In addition, we plan to improve our robotic control algorithm to predict force sensor values from joint torques and positions, with the goal of eliminating the need for physical force sensors; this approach will simplify the hardware setup and improve system robustness. We also plan to integrate the robotic swabbing arm onto an unmanned ground vehicle (UGV) for real-world field deployment and sample collection. Biosensors will be integrated into the robotic system for whole-system evaluation. In year 3, the Objective 4 team has planned two main activities: (1) add interviews with people who have NOT worked in poultry processing and who are currently looking for a job; participants will describe perceptions of working with robots before, during, and after doing a task in virtual reality with a remote robot; and (2) hold at least one focus group/futuring session with producers and processors in the Nebraska food system to gain stakeholder insights about the potential of automation/remote co-robotic work. We have had preliminary conversations with AFAN (Alliance for the Future of Agriculture in Nebraska) and the Heartland USDA Food Hub about bringing their members/business-builder applications to the event. The research activities will continue to be highlighted during ATRP industry events and at international trade shows such as IPPE and the International Meat Automation Conference (IMAC). Finally, conference papers are being prepared for presentation at the Poultry Science Association and American Society of Agricultural and Biological Engineers annual meetings, with further presentations to be shared at statewide symposia such as the Integrative Precision Agriculture Conference in Georgia.
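The planned sensorless force estimation rests on a standard manipulator identity: in (quasi-)static contact, joint torques satisfy tau = J(q)^T F, so the end-effector contact force can be solved from measured torques without a physical force sensor. A minimal numeric sketch for a hypothetical 2-link planar arm (link lengths and values are illustrative, not the project's hardware):

```python
import numpy as np

# Hypothetical 2-link planar arm; static contact force at the swab tip is
# recovered from joint torques via tau = J(q)^T F  =>  F = (J^T)^{-1} tau.

l1, l2 = 0.5, 0.4  # assumed link lengths (m)

def jacobian(q1, q2):
    """Geometric Jacobian mapping joint rates to end-effector (x, y) velocity."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

q = (0.3, 0.8)                   # joint positions (rad)
F_true = np.array([2.0, -1.5])   # actual contact force at the tool tip (N)
tau = jacobian(*q).T @ F_true    # torques a joint-torque sensor would read

F_est = np.linalg.solve(jacobian(*q).T, tau)  # recover force from torques
print(F_est)  # ~ [2.0, -1.5]
```

The project's planned approach additionally learns a mapping from joint torques and positions, which handles dynamics and friction that this static identity ignores.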

Impacts
What was accomplished under these goals? Objective 1 (Intelligent Deboning) saw accomplishments in further developing new imitation learning and reinforcement learning algorithms for robotic manipulation. These algorithms were developed, tested, and refined through a series of experiments at the Georgia Tech Food Processing Technology Building in preparation for in-plant trials. This included running the robotics at current processing-line speeds and measuring the yield effectiveness of the output. Objective 2 (Virtual Reality) continued to pursue artificial intelligence research into mapping images of carcasses to a three-dimensional model using canonical mapping. In addition, methods of user interaction and accessibility were tested, and a roadmap for improvements is being developed. The next stage of the research is to combine these technologies to allow a robotic system to interact with poultry products based on a user's input, empowered by machine learning. Objective 3 (Automation for Food Safety): Progress has been achieved in the development of autonomous navigation and mapping using the Husky Unmanned Ground Vehicle (UGV) within a laboratory setting. Using the Robot Operating System (ROS1) and Clearpath's indoor navigation web interface, the UGV has successfully demonstrated its ability to move autonomously through multiple waypoints, comprising start, turn, and end points, and return to its initial location. This path-following task can be repeated reliably, indicating the robustness of the system. Additionally, the vehicle can detect obstacles in its path and dynamically adjust its route to reach its destination. On the robotic swabbing control side, we established a robotic platform for surface swabbing using a novel hybrid force/position control algorithm that enables precise and consistent contact based on swabbing protocols. Multiple force sensors were integrated to monitor both applied force and coverage, maximizing swabbing performance.
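The hybrid force/position idea described above decouples axes: position is commanded along the surface (tangential) directions while a feedback loop regulates contact force along the surface normal. A one-axis toy sketch of the normal-direction force loop, with an assumed linear contact stiffness and illustrative gains (not the project's actual controller or parameters):

```python
# Hypothetical 1-D force-regulation sketch: the tangential axes would follow a
# pure position trajectory over the surface, while this loop servos the
# normal-axis contact force to a swab-protocol setpoint.

k_env = 500.0   # assumed linear surface stiffness (N/m)
f_ref = 2.0     # desired swab contact force (N)
kp_f = 0.001    # proportional force-control gain (m/N per control step)

z = 0.0         # tool position on the normal axis; surface contact at z = 0
history = []
for _ in range(200):
    f = max(0.0, k_env * z)      # measured force from penetration depth
    z += kp_f * (f_ref - f)      # force error drives motion into/off surface
    history.append(f)

print(round(history[-1], 3))  # settles at the 2.0 N setpoint
```

The discrete loop contracts the force error by a factor of (1 - kp_f * k_env) = 0.5 per step under this contact model, so the force settles quickly; a real controller must also handle unknown stiffness, friction, and loss of contact.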
A complete robotic program package was developed for both physical deployment and simulation. Additionally, we designed and conducted a comparative experiment involving 20 untrained human operators, evaluating their swabbing performance against the robot before and after training. For biosensing, we developed a new palm-size wireless piezoelectric immune-biosensing system for rapid bacterial detection.

Objective 4 (Extension and Outreach): Activities focused on building networks and socializing the general concepts being addressed by the center. This included presenting at and participating in several meetings and engagements with academics and industry, in particular the International Food Automation Networking Conference (an industry conference) in September of 2024 and the Poultry World Exhibit at the Georgia National Fair. We conducted interviews with 17 people at a mid-sized poultry processing plant in October of 2024 and learned their perceptions of remote, VR-based, collaborative robotic work through hands-on use. We are coding the interviews for themes and presented preliminary results at a professional conference.
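The waypoint path-following behavior described under Objective 3 (visit start, turn, and end points, then return to the initial location) can be sketched as a simple kinematic loop. This is a hypothetical illustration, not the actual ROS1/Clearpath navigation stack; the step size and tolerance are invented parameters.

```python
import math

# Toy waypoint follower echoing the Husky UGV behavior described above:
# advance toward each waypoint in order, then return to the starting point.
def follow_waypoints(pose, waypoints, step=0.1, tol=0.05):
    """Advance toward each waypoint in turn; return the visited path."""
    path = [pose]
    for wx, wy in waypoints + [waypoints[0]]:  # finish back at the start
        x, y = path[-1]
        while math.hypot(wx - x, wy - y) > tol:
            dx, dy = wx - x, wy - y
            dist = math.hypot(dx, dy)
            # Move one fixed step along the unit vector toward the waypoint.
            x, y = x + step * dx / dist, y + step * dy / dist
            path.append((x, y))
    return path

# Start, turn, and end waypoints (in meters, arbitrary lab frame).
route = follow_waypoints((0.0, 0.0), [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)])
```

A real deployment would instead publish goals to the navigation stack and rely on its costmaps for the obstacle detection and re-routing reported above; this sketch only captures the repeatable waypoint-sequencing behavior.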
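The hybrid force/position control idea described under Objective 3 can be illustrated with a toy simulation: lateral motion is position-controlled while a PI loop on the surface-normal axis regulates contact force against a spring-like surface model. All gains, the stiffness, and the force setpoint below are invented placeholders, not the project's actual parameters.

```python
# Minimal hybrid force/position control sketch for surface swabbing.
SURFACE_STIFFNESS = 500.0   # N/m, assumed contact stiffness of the surface
TARGET_FORCE = 5.0          # N, assumed swabbing force setpoint
KP, KI = 0.0004, 0.0008     # PI gains tuned for this toy model only
DT = 0.01                   # control period, s

def simulate_swab(steps=500):
    depth = 0.0             # tool penetration into the surface, m
    integral = 0.0          # accumulated force error
    samples = []            # (lateral position, measured force) history
    for k in range(steps):
        x = 0.001 * k                                 # position-controlled lateral sweep
        force = max(0.0, SURFACE_STIFFNESS * depth)   # simulated force-sensor reading
        error = TARGET_FORCE - force
        integral += error * DT
        depth += KP * error + KI * integral           # force-controlled normal axis
        samples.append((x, force))
    return samples

samples = simulate_swab()
```

The design point the sketch makes concrete is the one reported above: closing the loop on a force sensor yields a far more consistent applied force than open-loop (or human) swabbing, regardless of small surface-height variations.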

Publications

  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2024 Citation: Pallerla, C., Feng, Y., Owens, C. M., Bist, R. B., Mahmoudi, S., Sohrabipour, P., Davar, A., & Wang, D. (2024). Neural network architecture search enabled wide-deep learning (NAS-WD) for spatially heterogenous property awared chicken woody breast classification and hardness regression. Artificial Intelligence in Agriculture, 14, 73-85. https://doi.org/10.1016/j.aiia.2024.11.003
  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2024 Citation: Mahmoudi, S., Davar, A., Sohrabipour, P., Bist, R. B., Tao, Y., & Wang, D. (2024). Leveraging imitation learning in agricultural robotics: A comprehensive survey and comparative analysis. Frontiers in Robotics and AI, 11, 1441312. https://doi.org/10.3389/frobt.2024.1441312
  • Type: Peer Reviewed Journal Articles Status: Under Review Year Published: 2025 Citation: Tian, Y., Kelso, L., Xiao, Y., Pallerla, C., Bist, R., Mahmoudi, S., Liu, Z., Xiong, H., Subbiah, J., Howell, T., & Wang, D. (2025). Palm-Size Wireless Piezoelectric Immune-Biosensing System for Rapid E. coli O157:H7 Detection. Available at SSRN: https://ssrn.com/abstract=5029846 or http://dx.doi.org/10.2139/ssrn.5029846
  • Type: Peer Reviewed Journal Articles Status: Under Review Year Published: 2025 Citation: Mahmoudi, S., Davar, A., & Wang, D. (2025). Data-Driven Contact-Aware Control Method for Real-Time Deformable Tool Manipulation: A Case Study in the Environmental Swabbing. arXiv preprint arXiv:2503.21491. Under Review of IEEE Transactions on Automation Science and Engineering
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Mahmoudi S., Sohrabipour P., Obe T., Gibson K., Crandall P., Jeyam S., Wang D. (2024), Automated Environmental Swabbing: A Robotic Solution for Enhancing Food Safety in Poultry Processing. In 2024 the Third Annual Artificial Intelligence in Agriculture Conference. College Station, TX [Poster]
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Pallerla, C., & Wang, D. (2024). Hyperspectral imaging and Machine learning algorithms for foreign material detection on the chicken surface. In 2024 Poultry Science Annual International Meeting. Lexington, KY [Oral]
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Tian, Y., & Wang, D. (2024). Palm-size wireless piezoelectric immune-sensor system for E. coli detection. In 2024 ASABE Arkansas Session meeting [Poster]
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Wang, D., Mahmoudi, S., Griscorn, C., & Crandall, P. (2024). Automated Environmental Swabbing: A Robotic Solution for Enhancing Food Safety in Poultry Processing: Human Swabbing Evaluation and Preliminary Robotic Swabbing Setup. In 2024 Institute of Biological Engineering (IBE) Annual Meeting, Atlanta, GA [Oral]
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Mahmoudi S., Wang D., (2024) Automated Solutions for Poultry Processing: Integrating Robotic Swab Sampling and Pathogen Detection Technologies. In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting. Anaheim, CA [Poster]
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Pallerla C., Wang D., (2024) Hyperspectral imaging and Machine learning algorithms for foreign material detection on the chicken surface. In 2024 American Society of Agricultural and Biological Engineers (ASABE) Annual International Meeting. Anaheim, CA [Poster]


Progress 02/01/23 to 01/31/24

Outputs
Target Audience: Poultry industries and robotic manufacturers Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? This project has provided excellent opportunities for a number of graduate students to become proficient in interdisciplinary research, experimental methods design, the conduct of preliminary experiments, and statistical analysis. Graduate students are also given opportunities to visit different poultry processing plants and talk with stakeholders and front-line workers. How have the results been disseminated to communities of interest? We have attended various events and interacted with various stakeholders to give updates and get feedback. This included presenting at and participating in the meetings and engagements with academics and industry listed below. Harrison Poultry Company, Board of Directors Meeting, Atlanta, Georgia, November 10, 2022. Dr. Britton was invited to present, and part of his presentation covered the plan and desired outcomes for objectives 1 & 2 of the CSI-APP program. The presentation included a feedback session in which the company could provide input on the approaches and needs of the technology being developed. International Production and Processing Expo, Atlanta, Georgia, January 27, 2023. Researchers working on objective 2 created a preliminary demo of a virtual reality system that enables a user to provide input to a robot system remotely. This system was demonstrated to several industry partners during the week of the conference. In addition, researchers met with key leadership at Staubli Robotics to discuss the use of their washdown-capable robots in both the intelligent deboning and VR tasks. Advancing Georgia's Leaders in Agriculture Program, Atlanta, GA, February 8, 2023.
This program, run by the University of Georgia, provides leadership training to up-and-coming members of the agricultural community in Georgia. With around 25 participants, it draws from multiple agricultural sectors. Dr. Britton presented the research in objectives 1 & 2 and shared the vision for how these new technologies and developments could have a significant impact on the future of poultry specifically, and agriculture more generally. Reciprocal Meat Conference, American Meat Science Association, St. Paul, MN, June 6, 2023. As part of his keynote address at this event, Dr. Britton provided an overview of the work being done in objectives 1 & 2 to an audience of industry and academic stakeholders. He presented the general technical approach for both the intelligent deboning and VR-based interfaces and had multiple individual conversations throughout the course of the conference. Georgia Poultry Federation Summer Leadership Conference, Ponte Vedra Beach, FL, July 20, 2023. Dr. Britton shared a brief overview of the CSI-APP program with roughly 100 members of the Georgia poultry industry. Participants represented the producers, processors, genetics companies, equipment manufacturers, and other allied industry members of the poultry industry. Delaware, Maryland, Virginia Poultry Association Annual Meeting, Ocean City, MD, August 25-27, 2023. Dr. Ahlin presented on the applicability of VR technologies and their role in providing advanced automation solutions in poultry processing. Staubli Robotics, Faverges, France, August 4, 2023. GTRI researchers met again with principals from Staubli Robotics at their manufacturing facility in France to discuss the possibility of collaborating on the CSI-APP program and other related work at GTRI. Dr. Ahlin provided an overview of the research activities associated with objectives 1 & 2 and detailed the opportunities for Staubli to become engaged in the research.
This was extremely well received, with Staubli asking about specific needs and opportunities where they could partner with and support the center. What do you plan to do during the next reporting period to accomplish the goals? We will continue to improve the accuracy of robotic deboning and will test VR systems. Mobile robots will be further developed and tested for automatic swabbing. Biosensors will be developed for rapid microbial analysis. Biomapping protocols will be developed to assist manufacturers in improving food safety. We will conduct research with poultry workers to understand their resistance to robots in the manufacturing environment and their interest in using VR tools in production environments.

Impacts
What was accomplished under these goals? Objective 1: Scalable Poultry Manufacturing: Intelligent Robotic Deboning. Objective 1's technical approach and accomplishments in year 1 encompass four areas: commissioning new robot hardware, automated deboning method refinements, introducing X-ray sensing, and researching learning from demonstration.

Commissioning New Robot Hardware: To meet the industry-standard processing speed of (on average) 35 birds/minute and to withstand the wet conditions inside a plant, a new high-speed washdown Staubli robot arm has been fully set up and tested on GTRI's laboratory front-half conveyor line to perform automated front-half shoulder deboning. The verisimilitude between the lab setup and the one found inside a poultry processing plant will aid a planned in-plant trial next year.

Automated Deboning Method Refinements: Drawing on GTRI's years of work on automated poultry deboning, two method refinements were identified as especially impactful. The first is a reformulation of the statistics-based bird shoulder prediction algorithm, which resulted in a better-than-order-of-magnitude improvement in prediction accuracy (to within 3 mm error). The second refinement is a real-time collision detection capability that can determine whether the robot's knife tool will contact the rigid cone of the conveyor line upon which the bird is mounted during a shoulder cut. A unique knife path is custom-computed for each bird front-half, and this capability provides a safety feature.

Introducing X-ray Sensing: To date, GTRI has relied on predicting bird front-half shoulder location based exclusively on external sensing (using color-plus-depth cameras). In year 1, X-ray imaging has been explored as an additional sensing modality. A compact, handheld X-ray emitter and detector manufactured by OXOS Medical, Inc.
has been used to obtain preliminary images of bird front-halves (see figure below), and work has begun on merging X-ray information with color and depth information, with the goal of improving knowledge of the shoulder joint location for use in generating knife cutting paths that result in optimal meat yield.

Researching Learning from Demonstration: Learning from demonstration (LfD) is a robot control method that uses expert demonstration data to derive a policy for executing a similar task with a robot. This year's activity leveraged prior work at GTRI that created an instrumented knife for collecting geometric and dynamic measurements from human deboners performing bird shoulder cuts.

Objective 2: Virtual Reality-based Workforce Transformation. A critical component of cooperation, for both interpersonal and human-robot interactions, is a shared understanding. To provide this shared understanding of the chicken's structure, this research is investigating and developing a deep learning technique known as "canonical mapping" to provide a visual representation of the robot's understanding of the chicken. Canonical mapping aims to associate every pixel of a relevant image with a vertex of a 3D model. This association accommodates deformation and flexing of the product, something that classical feature-detection approaches struggle to manage. The result of this technology is a rainbow map in which an image of a processed bird is directly mapped to unique locations on the physical model. The benefit of this technology, aside from enabling the robot to perceive a three-dimensional model from a two-dimensional image, is that a person can easily check the robot's understanding. The overlaid color map gives a user insight into how the robot would interact with a product. A critical issue with artificial intelligence is that it is not guaranteed to be perfectly accurate.
Sometimes the inaccuracies are minor or irrelevant to the task, but the risk of learned behavior is that the robot will erroneously assess a critical aspect of a task relevant to its goal. If, for example, the robot mistakenly believes that the "left" wing of the bird is the "right" wing, then a cutting path for deboning cannot be correct. However, with canonical mapping, a user in a VR space, where the information is readily available, would be able to see that the robot's understanding is wrong and in what way it is wrong. From this, a person could intervene and stop the robot from performing an inappropriate action before it acts. With a shared understanding of the bird, successful cutting paths can be applied to the model, a topic that has been investigated extensively within GTRI. This approach offers a promising path toward achieving deboning as well as applicability to other tasks within poultry processing, all of which rely on a holistic understanding of the product and its structures. Another aspect of this research is the method of communication between a person and a robot within a shared reality. VR headsets and controllers are becoming more accessible in the technology industry. However, VR is just one component of the larger field of Extended Reality (XR), which includes Augmented, Virtual, and Mixed Reality (AR, VR, and MR respectively). Each of these technologies offers different modalities of communication and understanding between a person and a robot. Exploring these modalities is important, as some users find VR uncomfortable due to motion sickness and eye strain. AR applications can run on modern cell phones, a much more ubiquitous technology than VR headsets. AR allows a projection of the virtual world to be overlaid onto real-world environments.
This research is exploring the applicability of this technology to enable communication between a robot in a processing environment and a person in a remote location. If successful, this approach would provide a fast and simple way for a human and a robot to accomplish a shared goal in a manner consistent with the methods people currently use to communicate with technology.

Objective 3: A preliminary robotic control ROS package has been developed. Integrated with a deep-learning swab-sticker detection algorithm, the robot can grasp the sticker and perform planar environmental surface swabbing similar to the human swabbing pattern. Evaluated with a pressure/force detection sensor, robot swabbing applies a more consistent force than human swabbing. A novel hyperspectral imaging analysis model, the Network Architecture Search enabled Wide Deep Network (NAS-WD), was developed for foreign material detection. The new model shows improved accuracy compared to many conventional hyperspectral data analysis models. We are working with commercial vendors to conduct studies on rapid detection and quantification of food soil remaining on food contact surfaces after commercial cleaning. We are continuing to assemble the test instruments; we have completed preliminary testing of three food contact surfaces and are modifying the research protocol to further refine our limits of detection (LOD). In addition, a student survey was developed, tested, and then administered to students taking a robotics class at a technical institute. We were testing the hypothesis that, after training and education about robotics, students would be less hesitant to work close to a cobot. The initial round of testing has been completed, and we are analyzing the data.

Objective 4: Research and Extension Integration. During this first year, the extension and outreach activities focused on building networks and socializing the general concepts being addressed by the center.
Details are provided in the dissemination section.
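The canonical-mapping idea described under Objective 2 above, associating each image pixel with a vertex of a canonical 3D model, can be illustrated with a toy nearest-vertex lookup. The vertex names, descriptors, and rainbow-map colors below are invented for illustration; the actual system learns per-pixel associations with a deep network rather than using hand-written descriptors.

```python
import math

# Canonical model: vertex name -> (descriptor, RGB color for the rainbow map).
# In the real pipeline a network predicts a descriptor per image pixel; here
# we hand-pick 2D descriptors to keep the sketch self-contained.
MODEL_VERTICES = {
    "left_wing":  ((0.0, 0.0), (255, 0, 0)),
    "right_wing": ((1.0, 0.0), (0, 255, 0)),
    "keel":       ((0.5, 1.0), (0, 0, 255)),
}

def map_pixel(descriptor):
    """Return the model vertex whose descriptor is nearest to the pixel's."""
    return min(
        MODEL_VERTICES,
        key=lambda v: math.dist(descriptor, MODEL_VERTICES[v][0]),
    )
```

The lookup makes the safety argument concrete: if a pixel on the bird's right wing were mapped to `left_wing`, the overlaid color would be visibly wrong in VR, and a user could intervene before the robot commits to an incorrect cutting path.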

Publications

  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2023 Citation: Payne, K., O'Bryan, C. A., Marcy, J. A., & Crandall, P. G. (2023). Detection and prevention of foreign material in food: A review. Heliyon, 9(9), e19574.