Progress 02/01/23 to 01/31/24
Outputs
Target Audience: Poultry industries and robotic manufacturers
Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?
This project has provided excellent opportunities for a number of graduate students to become proficient in interdisciplinary research, experimental design, the conduct of preliminary experiments, and statistical analysis. Graduate students also have opportunities to visit different poultry processing plants and to discuss the work with stakeholders and front-line workers.

How have the results been disseminated to communities of interest?
We have attended various events and interacted with various stakeholders to give updates and get feedback. This included presenting at and participating in several meetings and engagements with academic and industry audiences, listed below.

Harrison Poultry Company, Board of Directors Meeting, Atlanta, Georgia, November 10, 2022. Dr. Britton was invited to present, and part of his presentation covered the plan and desired outcomes for objectives 1 & 2 of the CSI-APP program. The presentation included a feedback session in which the company could provide input regarding the approaches and needs of the technology being developed.

International Production and Processing Expo, Atlanta, Georgia, January 27, 2023. Researchers working on objective 2 created a preliminary demo of a virtual reality system that enables a user to provide input to a robot system remotely. This system was demonstrated to several industry partners during the week of the conference. In addition, researchers met with key leadership at Staubli Robotics to discuss the use of their washdown-capable robots in both the intelligent deboning and VR tasks.

Advancing Georgia's Leaders in Agriculture Program, Atlanta, GA, February 8, 2023. This program, run by the University of Georgia, provides leadership training to up-and-coming members of Georgia's agricultural community, with roughly 25 participants drawn from multiple agricultural sectors. Dr. Britton presented on the research in objectives 1 & 2 and shared the vision for how these new technologies and developments could have a significant impact on the future of poultry specifically, and agriculture more generally.

Reciprocal Meat Conference, American Meat Science Association, St. Paul, MN, June 6, 2023. As part of his keynote address at this event, Dr. Britton provided an overview of the work being done in objectives 1 & 2 to an audience of industry and academic stakeholders. He presented the general technical approach for both the intelligent deboning and VR-based interfaces and had multiple individual conversations throughout the conference.

Georgia Poultry Federation Summer Leadership Conference, Ponte Vedra Beach, FL, July 20, 2023. Dr. Britton shared a brief overview of the CSI-APP program with roughly 100 members of the Georgia poultry industry. Participants represented all segments of the industry: producers, processors, genetics companies, equipment manufacturers, and other allied industry members.

Delaware, Maryland, Virginia Poultry Association Annual Meeting, Ocean City, MD, August 25-27, 2023. Dr. Ahlin presented on the applicability of VR technologies and their role in providing advanced automation solutions in poultry processing.

Staubli Robotics, Faverges, France, August 4, 2023.
GTRI researchers met again with principals from Staubli Robotics at their manufacturing facility in France to discuss the possibility of collaborating on the CSI-APP program and other related work at GTRI. Dr. Ahlin provided an overview of the research activities associated with objectives 1 & 2 and detailed the opportunities for Staubli to become engaged in the research. This was extremely well received, with Staubli asking for specific needs and opportunities where they could partner with and support the center.

What do you plan to do during the next reporting period to accomplish the goals?
We will continue to improve the accuracy of robotic deboning and will test VR systems. Mobile robots will be further developed and tested for automatic swabbing. Biosensors will be developed for rapid microbial analysis. Biomapping protocols will be developed to assist manufacturers in improving food safety. We will conduct research with poultry workers to understand resistance to robots in the manufacturing environment and workers' interest in using VR tools in production environments.
Impacts
What was accomplished under these goals?
Objective 1: Scalable Poultry Manufacturing: Intelligent Robotic Deboning

Objective 1's technical approach and accomplishments in year 1 encompass four areas: commissioning new robot hardware, automated deboning method refinements, introducing X-ray sensing, and researching learning from demonstration.

Commissioning New Robot Hardware
To meet the industry-standard processing speed of (on average) 35 birds/minute and to withstand the wet conditions inside a plant, a new high-speed washdown Staubli robot arm has been fully set up and tested on GTRI's laboratory front-half conveyor line to perform automated front-half shoulder deboning. The verisimilitude between the lab setup and the one found inside a poultry processing plant will aid a planned in-plant trial next year.

Automated Deboning Method Refinements
Drawing on GTRI's years of work on automated poultry deboning, two method refinements were identified as especially impactful. The first is a reformulation of the statistics-based bird shoulder prediction algorithm, which resulted in a better-than-order-of-magnitude improvement in prediction accuracy (to within 3 mm error). The second refinement is a real-time collision detection capability that can determine whether the robot's knife tool will contact the rigid cone on which the bird front-half is mounted on the conveyor line during a shoulder cut. A unique knife path is custom-computed for each bird front-half, and this capability provides a safety feature; a simplified sketch of such a check appears at the end of this objective's summary.

Introducing X-ray Sensing
To date, GTRI has relied on predicting bird front-half shoulder location based exclusively on external sensing (using color-plus-depth cameras). In year 1, X-ray imaging has been explored as an additional sensing modality. A compact, handheld X-ray emitter and detector manufactured by OXOS Medical, Inc. has been used to obtain preliminary images of bird front-halves (see figure below), and work has begun on merging X-ray information with color and depth information, with the goal of improving knowledge of the shoulder joint location for use in generating knife cutting paths that result in optimal meat yield.

Researching Learning from Demonstration
Learning from demonstration (LfD) is a robot control method that uses expert demonstration data to derive a policy for executing a similar task with a robot. The activity this year has leveraged prior work at GTRI that created an instrumented knife for collecting geometric and dynamic measurements from human deboners performing bird shoulder cuts.
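As referenced above, the essence of the collision check can be illustrated with simple geometry. The following is a minimal sketch, not GTRI's production implementation: it models the mounting cone as an ideal finite cone (apex, unit axis, half-angle, height) and tests each waypoint of a planned knife-tip path against the cone inflated by a radial safety clearance. All names, dimensions, and the 3 mm clearance are illustrative assumptions; a real implementation would also need to check the segments between waypoints and the full knife geometry, not just the tool tip.

```python
import numpy as np

def point_near_cone(p, apex, axis, half_angle, height, clearance):
    """True if point p is inside the finite cone or within `clearance`
    metres of its surface in the radial direction."""
    v = p - apex
    d = np.dot(v, axis)                     # distance along the cone axis
    if d < 0.0 or d > height:               # behind the apex or past the base
        return False
    radial = np.linalg.norm(v - d * axis)   # distance from the axis
    return radial <= d * np.tan(half_angle) + clearance

def knife_path_collides(waypoints, apex, axis, half_angle, height,
                        clearance=0.003):
    """Check every waypoint of a planned knife-tip path against the
    mounting cone, inflated by a radial safety clearance (3 mm here)."""
    axis = axis / np.linalg.norm(axis)      # ensure a unit axis vector
    return any(point_near_cone(np.asarray(p, dtype=float), apex, axis,
                               half_angle, height, clearance)
               for p in waypoints)

# Illustrative check: a 15-degree half-angle cone, 120 mm tall, apex up.
apex = np.array([0.0, 0.0, 0.12])           # cone apex position (m)
axis = np.array([0.0, 0.0, -1.0])           # axis points from apex to base
path = [[0.05, 0.0, 0.10], [0.01, 0.0, 0.08]]
print(knife_path_collides(path, apex, axis, np.radians(15.0), 0.12))  # True
```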
Objective 2: Virtual Reality-based Workforce Transformation

A critical component of cooperation, for both interpersonal and human-robot interactions, is a shared understanding. To provide this shared understanding of the chicken's structure, this research is investigating and developing a deep learning technique known as "canonical mapping" to provide a visual representation of the robot's understanding of the chicken. Canonical mapping aims to associate every pixel of a relevant image with a vertex of a 3D model. This association allows for deformation and flexing of the product, something that classical feature detection approaches struggle to manage. The result of this technology is a rainbow map in which an image of a processed bird is directly mapped to unique locations on the physical model (a minimal sketch of the idea follows this objective's summary). The benefit of this technology, aside from enabling the robot to perceive a three-dimensional model from a two-dimensional image, is that a person can easily check the robot's understanding. The overlaid color map gives a user insight into how the robot would interact with a product.

A critical issue with artificial intelligence is that it is not guaranteed to be perfectly accurate. Sometimes the inaccuracies are minor or irrelevant to the task, but the risk of learned behavior is that the robot will erroneously assess a critical aspect of a task relevant to its goal. If, for example, the robot mistakenly believes that the "left" wing of the bird is the "right" wing, then a cutting path for deboning has no chance of being correct. However, with canonical mapping, a user in a VR space, where the information is readily available, would be able to see that the robot has the wrong understanding and to know in what way it is incorrect. From this, a person would be able to intervene and prevent the robot from performing an inappropriate action before it acts. With a shared understanding of the bird, successful cutting paths can be applied to the model, a topic that has been investigated extensively within GTRI. This approach offers a promising path toward achieving deboning, as well as applicability to other tasks within poultry processing, all of which rely on a holistic understanding of the product and its structures.

Another aspect of this research is the method of communication between a person and a robot within a shared reality. VR headsets and controllers are becoming more easily accessible in the technology industry. However, VR is just one component of the larger field of Extended Reality (XR), which includes Augmented, Virtual, and Mixed Reality (AR, VR, and MR, respectively). Each of these technologies offers different modalities of communication and understanding between a person and a robot. Exploring these modalities is important, as some users find VR uncomfortable due to motion sickness and eye strain. AR applications can be delivered on modern cell phones, a much more ubiquitous technology than VR headsets. AR allows a projection of the virtual world to be overlaid onto real-world environments. This research is exploring the applicability of this technology to enable communication between a robot in a processing environment and a person in a remote location. If successful, this approach would provide a fast and simple way for a human and a robot to accomplish a shared goal in a way that is consistent with the methods people currently use to communicate with technology.
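To make the canonical mapping idea concrete, below is a deliberately small sketch, assuming PyTorch and a supervised setup in which ground-truth canonical coordinates are available for training; it is not the network used in this project. A tiny fully convolutional network regresses, for each pixel, a continuous 3D coordinate on a canonical template model. Visualizing the three output channels as RGB produces the rainbow map described above, and snapping a pixel's prediction to the nearest template vertex yields the pixel-to-vertex association. All layer sizes and names are illustrative.

```python
import torch
import torch.nn as nn

class CanonicalMapper(nn.Module):
    """Tiny fully convolutional net: for every pixel of an RGB image,
    regress a 3D coordinate on a canonical (template) bird model.
    Rendering the three output channels as RGB gives the rainbow map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 1),   # 3 output channels = (x, y, z)
            nn.Sigmoid(),          # keep coordinates in the unit cube
        )

    def forward(self, image):      # image: (B, 3, H, W)
        return self.net(image)     # (B, 3, H, W) canonical coordinates

def nearest_vertex(coord, template_vertices):
    """Snap one pixel's predicted canonical coordinate to the index of
    the closest vertex of the template mesh (brute-force search)."""
    return int(torch.cdist(coord[None], template_vertices).argmin())

model = CanonicalMapper()
image = torch.rand(1, 3, 128, 128)     # stand-in for a camera frame
canon = model(image)                   # dense per-pixel predictions
vertices = torch.rand(500, 3)          # stand-in template vertex set
print(nearest_vertex(canon[0, :, 64, 64], vertices))
```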
Objective 3: A preliminary robotic control ROS package has been developed. Integrated with a deep learning swab-sticker detection algorithm, the robot can grasp the sticker and perform planar environmental surface swabbing similar to a human swabbing pattern. Evaluated with a pressure/force detection sensor, robot swabbing shows more consistent applied force than human swabbing (a toy illustration of force-regulated swabbing appears at the end of this section). A novel hyperspectral imaging analysis model, named the Network Architecture Search enabled Wide Deep Network (NAS-WD) algorithm, was developed for foreign material detection. The new model shows improved accuracy compared to many conventional hyperspectral data analysis models. We are working with commercial vendors to conduct studies on rapid detection and quantification of food soil remaining on food contact surfaces after commercial cleaning. We are continuing to assemble the test instruments, have completed preliminary testing of three food contact surfaces, and are modifying the research protocol to further refine our limits of detection (LOD).

In addition, a student survey was developed, tested, and then administered to students taking a robotics class at a technical institute. We were testing the hypothesis that, after training and education about robotics, students would be less hesitant to work close to a cobot. The initial round of testing has been completed and we are analyzing the data.

Objective 4: Research and Extension Integration. During this first year, the extension and outreach activities focused on building networks and socializing the general concepts being addressed by the center. Details are provided in the dissemination section.
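As referenced in the Objective 3 summary above, the consistency of robot-applied swab force comes from closed-loop force regulation. The following toy sketch shows the idea with a simple proportional controller pressing a swab against a simulated compliant surface. The setpoint, gain, and stiffness values are illustrative assumptions, and this is not the project's ROS-based controller, only a self-contained demonstration of why a regulated robot holds force more consistently than a human hand.

```python
class SimulatedContact:
    """Crude stand-in for a swab tip pressed against a compliant surface."""
    def __init__(self, stiffness=4000.0):
        self.depth = 0.0              # surface compression depth (m)
        self.stiffness = stiffness    # contact stiffness (N/m)

    def read_force(self):
        """Measured normal contact force, as a force sensor would report."""
        return max(0.0, self.depth * self.stiffness)

    def move_z(self, dz):
        """Command a small vertical tool motion; negative dz presses down."""
        self.depth -= dz              # moving down increases compression

def constant_force_swab(read_force, move_z, setpoint=2.0, gain=1e-4, steps=200):
    """Toy proportional force controller: hold contact force near `setpoint`
    (newtons) while the swab traverses its pattern, so applied pressure
    stays consistent in a way manual swabbing typically does not."""
    for _ in range(steps):
        error = setpoint - read_force()   # positive => pressing too lightly
        move_z(-gain * error)             # press in (or back off) slightly

sim = SimulatedContact()
constant_force_swab(sim.read_force, sim.move_z)
print(round(sim.read_force(), 2))         # settles near the 2.0 N setpoint
```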
Publications
- Type: Peer Reviewed Journal Articles
  Status: Published
  Year Published: 2023
  Citation: Payne, K., O'Bryan, C. A., Marcy, J. A., & Crandall, P. G. (2023). Detection and prevention of foreign material in food: A review. Heliyon, 9(9), e19574.